8 Data Solution Trends for 2023 and Beyond
Data is the crown jewel of every organization, but unlocking its value is becoming increasingly difficult as organizations struggle with ever-growing volumes of poorly classified and poorly integrated data. It's a constant struggle to properly discover, classify, integrate, store and protect data so that it can be best leveraged to drive improved intelligence and outcomes.
We work with customers, partners and industries in all facets of data, and have uncovered patterns and insights into how various organizations are approaching their data journey. The following insights reflect the trends that are most important in 2023 and beyond.
According to IBM, 35% of companies are using AI and 42% are actively exploring AI for future use.
A few years ago, artificial intelligence (AI) seemed far off due to technical complexity, resource availability and cost.
Today, AI is showing up in all sorts of places, ranging from main-stage applications like ChatGPT to backstage applications like drug development. These applications frequently combine multiple analytical techniques, such as deep learning, natural language processing (NLP) and computer vision, a combination known as Composite AI.
We've seen successful applications across industries: optimizing mine operations, enhancing the customer and employee experience in restaurants, accelerating drug development timelines in life sciences, better identifying fraud in banking, and speeding chatbot-driven resolutions for telecoms.
Successful organizations will look to incorporate proven AI concepts into their businesses to accelerate innovation, delight customers and optimize operations.
Data-driven organizations are 23X more likely to acquire new clients and 6X more likely to retain the customers they gain.
Data fabric refers to an architecture that facilitates the end-to-end integration of data across an entire organization. The best data fabrics give organizations and users access to the right data at the right time with a solid end-user experience, all at an optimal cost. They also provide access, control, protection and visibility quickly and securely.
Data mesh builds on the foundation of data fabric and allows data owners to be more efficient with data. Data mesh relies on a comprehensive data strategy built in concert with mature data governance. This ensures organizations have accurate, relevant data available rather than having to sift through stale and irrelevant data. Metadata must be accurate, data must be properly classified, and data pipelines must be established and secured to enable real-time updates and learning.
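To make the governance requirements above concrete, here is a minimal, hypothetical sketch of a data mesh "data product" record carrying the metadata the text calls for (owner, classification, freshness) plus a gate that refuses to serve stale or unclassified data. All names and thresholds are illustrative, not any particular product's API.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class DataProduct:
    name: str
    owner: str                # accountable domain team
    classification: str       # e.g. "public", "internal", "restricted"
    last_updated: datetime

def is_servable(product: DataProduct, max_age_days: int = 30) -> bool:
    """Serve a data product only if it is classified and fresh."""
    if product.classification not in {"public", "internal", "restricted"}:
        return False  # unclassified data never reaches consumers
    age = datetime.now(timezone.utc) - product.last_updated
    return age <= timedelta(days=max_age_days)

fresh = DataProduct("orders", "sales-domain", "internal",
                    datetime.now(timezone.utc))
stale = DataProduct("legacy_leads", "unknown", "unclassified",
                    datetime.now(timezone.utc) - timedelta(days=365))

print(is_servable(fresh))  # True
print(is_servable(stale))  # False
```

In a real mesh, this check would live in the platform's data catalog rather than in application code, but the principle is the same: metadata quality gates what data is discoverable and usable.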
Organizations must align business, data, cloud and IT functions to successfully build a data fabric and mesh that empowers and drives the organization forward.
By 2027, over 40% of large organizations worldwide will be using a combination of Web3, spatial computing and digital twins in metaverse-based projects aimed at increasing revenue.
A digital twin (DT) serves as a synchronized virtual representation of a real-world physical entity. DTs have a wide range of use cases: a digital representation of a college campus that prospective students can virtually visit; allowing people with physical disabilities to ride a virtual roller coaster alongside a friend riding the physical one; simulating how a submarine will pass through shoreline waters off the coast of North America; or aiding in the engineering and design of a new airplane engine.
If those descriptions sound challenging, it may be easier to think about movies like Iron Man or Ready Player One (RP1), which prominently feature digital twins. Tony Stark uses a digital twin to design everything from his Iron Man suit to the arc reactor that powers him, and in RP1, people interact with an entire digital universe.
Digital twins deliver all sorts of benefits, ranging from simplifying and enabling new interactions, as in the college visit and roller coaster examples; to making engineering and design faster and less costly, as in the engine example; to driving massive operating efficiencies, as in the submarine example above.
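The core loop behind every example above is the same: keep a virtual object synchronized with telemetry from its physical counterpart, then query or simulate against the twin instead of the real asset. The following is a toy sketch of that loop; the class, telemetry fields and the linear "overheat" model are all hypothetical stand-ins for the physics or ML models a real twin would use.

```python
class EngineTwin:
    """Illustrative virtual representation of one physical engine."""

    def __init__(self, asset_id: str):
        self.asset_id = asset_id
        self.state = {"rpm": 0, "temp_c": 0.0}

    def sync(self, telemetry: dict) -> None:
        """Apply a telemetry reading from the physical engine."""
        self.state.update(telemetry)

    def predict_overheat(self, extra_load_pct: float) -> bool:
        """Run a what-if on the twin rather than the real engine.

        Toy assumption: temperature rises ~0.5 C per percent of extra
        load, with 95 C as the overheat line.
        """
        projected = self.state["temp_c"] + 0.5 * extra_load_pct
        return projected > 95.0

twin = EngineTwin("engine-42")
twin.sync({"rpm": 2400, "temp_c": 88.0})
print(twin.predict_overheat(extra_load_pct=20))  # True  (88 + 10 = 98)
print(twin.predict_overheat(extra_load_pct=5))   # False (88 + 2.5 = 90.5)
```

The design win is that the risky experiment (adding 20 percent load) runs entirely against the synchronized model, never the physical engine.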
According to Research and Markets, the market for data center accelerators is projected to reach $46.6 billion by 2026, growing at a CAGR of 44.9%.
Most accelerated computing today is done as part of specialized architectures for use cases such as AI model training, real-time inferencing, or high-frequency trading (HFT). These systems tend to be highly custom and require advanced architectural design and specialized operating environments, as well as modern MLOps practices.
In addition to these specialized workloads, more general workloads can take advantage of GPU, DPU or FPGA-based accelerated computing instead of general-purpose CPUs to enhance overall application performance and drive greater efficiencies.
Many IT organizations will look to collaborate with sustainability or environmental teams to further justify and deploy these accelerated computing initiatives.
The threat of cyber-attacks has been escalating rapidly in recent years with a 107% increase in ransomware and extortion attacks in 2022 vs. 2021.
Cyber threats have grown so large that criminal organizations have sprouted up across the world offering ransomware-as-a-service to other criminal enterprises that infect and exploit businesses.
There isn't a silver bullet for ransomware, but there are clear and proven ways to protect your business proactively from threats. Security teams must work hand in hand with IT to anticipate, withstand, recover and adapt to threats.
Simple measures, such as enabling encryption and immutability features for critical production data, data protection and disaster recovery, help mitigate risk. But more extensive cyber resilience and recovery programs must be developed to ensure business survivability in the face of an extensive cyber event.
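One of the simplest controls in that list can be sketched in a few lines: record a cryptographic digest of critical data when a backup is taken, then re-verify it before restore so silent tampering (for example, by ransomware) is detected. This is a minimal illustration using Python's standard library, not any specific backup product's mechanism; in practice the recorded digest would itself be kept in immutable storage.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """SHA-256 digest of a backup payload."""
    return hashlib.sha256(data).hexdigest()

backup = b"critical production records v1"
recorded = fingerprint(backup)   # stored separately, ideally immutably

# Later, before restoring: an intact copy matches the recorded digest.
print(fingerprint(backup) == recorded)    # True

# A copy that ransomware has touched does not.
tampered = backup + b" -- encrypted by attacker"
print(fingerprint(tampered) == recorded)  # False: tampering detected
```

Integrity checking like this does not prevent an attack, but it tells you which copies are safe to restore from, which is the heart of cyber recovery.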
According to Statista, 84% of large organizations have already adopted multicloud, with a mixture of on-premises IT, colocated apps and a public cloud presence spanning multiple providers.
Many organizations have a stated goal of putting more workloads in the public cloud over time, but the transition path is difficult, and the majority will remain in a hybrid multicloud environment for the foreseeable future.
Managing these environments is challenging and requires organizations to either invest in unique data skillsets spanning each individual operating environment or purchase technologies that span traditional IT and public cloud architectures.
Successful organizations will prioritize their data technology investment in those vendors with proven architectures that span on-premises and all major public clouds, allowing for one skillset to manage them no matter where workloads run. Investments like these give organizations options and empower them to run workloads where they make the most sense.
90% of organizations have increased investments in sustainability since the pandemic.
Sustainable IT, or Green IT, is a model for driving waste out of the IT lifecycle to minimize environmental impact. This includes optimizing how IT equipment is designed, manufactured, shipped, installed, operated, decommissioned, reused and/or recycled in order to ultimately drive its environmental impact to zero.
Companies must get serious about considering their overall environmental impact and begin evaluating everything in and out of the data center for energy efficiency, renewable energy sources, waste reduction, water use, carbon footprint and location.
Successful organizations will employ multiple strategies to become more sustainable. These include hunting for non-productive equipment (zombie computers), optimizing workloads, upgrading to more energy-efficient infrastructure, deploying alternative cooling, enhancing recycling programs and adopting new success measurements tied to sustainability and carbon impact.
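The "zombie hunt" strategy above amounts to a simple filter over utilization telemetry: flag servers whose average CPU utilization has stayed below a threshold, making them candidates for consolidation or decommissioning. The hostnames, numbers and 5% threshold below are illustrative assumptions; real programs would also look at network and storage I/O before declaring a machine dead.

```python
# Average CPU utilization over the last 90 days (illustrative data).
avg_cpu_pct = {
    "web-01": 42.0,
    "batch-07": 1.2,    # likely a zombie
    "db-02": 35.5,
    "legacy-app": 0.4,  # likely a zombie
}

def find_zombies(utilization: dict, threshold_pct: float = 5.0) -> list:
    """Return hosts idling below the utilization threshold, sorted by name."""
    return sorted(h for h, pct in utilization.items() if pct < threshold_pct)

print(find_zombies(avg_cpu_pct))  # ['batch-07', 'legacy-app']
```

Even a rough pass like this often surfaces hardware that is drawing power and cooling while doing no useful work.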
By 2025, 40% of newly procured premises-based compute and storage will be consumed as a service, up from less than 10% in 2021.
Some customers are looking for a cloud-like experience but have no interest in putting their data into a public cloud provider, while others have a cloud-first, cloud-native mentality. Whether your organization's crown jewels are stored in an Epic solution for healthcare or a database for an enterprise resource planning (ERP) system, or you have machine learning (ML) initiatives, we are seeing a shift from traditional capital expenditure (CAPEX) spending on IT infrastructure to a "pay by the drip" operational expense (OPEX) model.
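The CAPEX-versus-OPEX decision usually comes down to a break-even comparison: amortize the upfront purchase over its useful life, add ongoing upkeep, and compare the result to the as-a-service monthly rate. The sketch below shows that arithmetic with entirely made-up figures; real comparisons would also factor in utilization, refresh cycles and the cost of capital.

```python
def monthly_capex(purchase_price: float, lifetime_months: int,
                  monthly_upkeep: float) -> float:
    """Effective monthly cost of owned gear: amortization plus upkeep."""
    return purchase_price / lifetime_months + monthly_upkeep

# Illustrative numbers only.
owned = monthly_capex(purchase_price=120_000, lifetime_months=48,
                      monthly_upkeep=800)    # 2500 + 800 = 3300.0
subscription = 3000.0                        # as-a-service monthly rate

print(owned)                 # 3300.0
print(subscription < owned)  # True: the service is cheaper per month here
```

With different inputs the comparison flips, which is exactly why organizations run this math per workload rather than adopting one model everywhere.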
We covered a lot of ground above, and there are other areas we're tracking, including quantum simulation, graph databases, data service automation, bulk data transfer, decision intelligence and more.
Data is increasingly more important in all facets of a modern organization. In many organizations, the entire business has shifted to be oriented around data itself. There are many places where investing in data can improve your organization, but the important thing is to get started.
There are many ways WWT can help you take that first step forward ranging from ideation briefings to hands-on labs, to a full proof of concept (PoC) in our Advanced Technology Center (ATC).
Unsure where to start? Please feel free to schedule a complimentary Data Strategy Briefing today or reach out to your WWT representative for more information.