Effective infrastructure enables universal data intelligence

Data volumes are exploding across organizations of all types. Research firm IDC projects that the amount of global data will more than double between now and 2026, with enterprise data leading that growth, increasing twice as fast as consumer data. Accordingly, it is a business imperative to store, protect, and provide access to this growing volume of data, while finding new ways to derive value from it.

The surge in data volumes is driven by multiple factors: Historical data that companies have been collecting for years continues to pile up. New data types are proliferating, such as IoT (Internet of Things) sensor data, operational technology (OT) data, and customer experience data. And core business functions, such as supply chain, are becoming increasingly data-driven.

As organizations transition to data-driven business models, they become keepers of immense and ever-growing amounts of data, which they must store, protect, and analyze. For many this is a sizable challenge: 80% of respondents to a survey by 451 Research said they work with more than 100 data sources. Their existing systems also struggle to keep pace: 30% of respondents said it takes their organization more than a week to go from data to insights.

Consequently, most organizations face a long list of data-related challenges. They must get their arms around a massive volume of data, ranging from historical to real-time. They need to determine what type of infrastructure modernization is required to process all that data, and then how to integrate that storage infrastructure with data services, workloads, and applications. And the questions continue: How do we apply automation, AI, and machine learning to data sets? How should we think about the cloud? And how do we take advantage of as-a-service models to deliver data-driven value to the business?

And above all, today’s economic conditions call for doing more with less. How can IT teams meet the business need for advanced data analytics when their budget growth is not keeping pace?

First steps toward data intelligence

Bharti Patel, senior vice president of product engineering at Hitachi Vantara, says businesses should aspire to offer seamless access to data and insights, what she calls “universal data intelligence.” That journey starts with getting a handle on the basics of data discovery and data classification and creating policies for how the organization handles different types of data.

“Thoughtfulness at the beginning of the journey is very important,” says Patel. To derive insights with business value, organizations first must get a handle on the data they have and are collecting. “People really don’t fully understand what value lies in their data; they don’t know where it resides or how to make the best use of it,” she says. “Your strategy about what data goes where—what goes to tape, what stays on-prem, what goes to cloud—is very important.”

To create that strategy, organizations need to define the purpose of their data: Is this data we’re storing simply for compliance purposes, with a low probability that we will ever need to retrieve it? Is this confidential data that must be stored on-premises, with multiple backups? Or is this data that needs to be accessed frequently and should be stored on high-performance NVMe (non-volatile memory express) systems?
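The decision logic behind such a tiering strategy can be kept deliberately simple. The sketch below, in Python, is a minimal illustration of routing a dataset to a storage tier based on its declared purpose; the attributes, tier names, and thresholds are hypothetical assumptions for illustration, not part of any vendor's framework.

```python
from dataclasses import dataclass

# Hypothetical attributes for illustration only; real classification schemes
# would come from the organization's own data governance policies.
@dataclass
class Dataset:
    name: str
    retention_only: bool   # kept purely for compliance, rarely retrieved
    confidential: bool     # must remain on-premises, with multiple backups
    reads_per_day: float   # rough proxy for how "hot" the data is

def choose_tier(ds: Dataset) -> str:
    """Map a dataset to a storage tier based on its declared purpose."""
    if ds.retention_only:
        return "archive"             # e.g. tape or cold object storage
    if ds.confidential:
        return "on_prem_replicated"  # on-premises, with multiple backups
    if ds.reads_per_day > 1000:
        return "nvme_hot"            # high-performance NVMe systems
    return "cloud_standard"          # general-purpose cloud storage

# Example classifications (all values are made up):
print(choose_tier(Dataset("audit-logs-2019", True, False, 0.1)))    # archive
print(choose_tier(Dataset("customer-records", False, True, 50)))    # on_prem_replicated
print(choose_tier(Dataset("live-telemetry", False, False, 25000)))  # nvme_hot
```

In practice, the classification inputs would come from data discovery and governance tools rather than hand-coded values, but the principle is the same: the purpose of the data drives where it lives.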

Organizations must also prioritize data analytics initiatives, striking the right balance between quick-hit projects that deliver limited benefit and more ambitious endeavors that could take longer but deliver more value in the long run.

Patel says it’s vitally important that organizations establish clear lines of communication between business and IT leaders. Everyone should be on the same page when it comes to identifying the most pressing business needs, agreeing on a priority list, and making sure that deliverables from the data analytics teams are presented in a way the business can put to use.

Infrastructure modernization

As data growth accelerates and data strategies are refined, organizations are under pressure to modernize their data infrastructure in a way that is cost-effective, secure, scalable, socially responsible, and compliant with regulations.

Organizations with legacy infrastructures often own hardware from multiple vendors, particularly if IoT and OT data is involved. Their challenge, then, is to create a seamless, unified system that takes advantage of automation to optimize routine processes and apply AI and machine learning to that data for further insights.

“That’s one of my focus areas at Hitachi Vantara,” says Patel. “How do we combine the power of the data coming in from OT and IoT? How can we provide insights to people in a heterogeneous environment if they don’t have time to go from one machine to another? That’s what it means to create a seamless data plane.”

Social responsibility includes taking a hard look at the organization’s carbon footprint and finding data infrastructure solutions that support emissions reduction goals. Hitachi Vantara estimates that emissions attributable to data storage infrastructure can be reduced by as much as 96% through a combination of changing energy sources, upgrading infrastructure and hardware, adopting software to manage storage, and automating workflows, while also improving storage performance and cutting costs.

The hybrid cloud approach

While many organizations follow a “cloud-first” approach, a more nuanced strategy is gaining momentum among forward-thinking CEOs. It’s more of a “cloud where it makes sense” or “cloud smart” strategy.

In this scenario, organizations take a strategic approach to where they place applications, data, and workloads, based on security, financial, and operational considerations. There are four basic building blocks of this hybrid approach: seamless management of workloads wherever they are located; a data plane that delivers suitable capacity, cost, performance, and data protection; a simplified, highly resilient infrastructure; and AIOps, which provides an intelligent, automated control plane with observability across IT operations.
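To make the “cloud smart” idea concrete, here is a minimal, hedged sketch of a workload placement decision driven by security, financial, and operational factors. The attributes, thresholds, and platform names below are illustrative assumptions only, not a published decision framework.

```python
# Illustrative only: a toy "cloud smart" placement rule. The attributes,
# thresholds, and platform names are assumptions, not a vendor framework.

def place_workload(workload: dict) -> str:
    """Pick a platform based on security, financial, and operational factors."""
    if workload["data_sensitivity"] == "high":
        return "on_prem"        # keep regulated or confidential data in-house
    if workload["latency_sensitive"]:
        return "edge"           # run close to where the data is produced
    if workload["monthly_egress_gb"] > 5000:
        return "on_prem"        # heavy egress can make public cloud costly
    return "public_cloud"       # elastic capacity wins otherwise

example = {
    "name": "claims-analytics",
    "data_sensitivity": "high",
    "latency_sensitive": False,
    "monthly_egress_gb": 2000,
}
print(place_workload(example))  # -> on_prem
```

A real placement decision would weigh far more inputs, such as a workload inventory, cost models, and compliance rules, but the point of the “cloud smart” approach is that the decision is explicit rather than defaulting to any single destination.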

“I think hybrid is going to stay for enterprises for a long time,” says Patel. “It’s important to be able to do whatever you want with the data, irrespective of where it resides. It could be on-prem, in the cloud, or in a multi-cloud environment.”

Clearing up cloud confusion

The public cloud is often viewed as a location: a go-to place for organizations to unlock speed, agility, scalability, and innovation. That place is then contrasted with legacy on-premises infrastructure environments that don’t provide the same user-friendly, as-a-service features associated with cloud. Some IT leaders assume the public cloud is the only place they can reap the benefits of managed services and automation to reduce the burden of operating their own infrastructure.

As a practical matter, however, most organizations will always have data on-premises, in the cloud, and in edge deployments. Data will constantly move back and forth across all of those platforms.

A better approach is to view the cloud as a concept, an operational principle, and an experience based on the as-a-service consumption model. Once organizations gain this clarity, they can make decisions that enable them to apply the “everything-as-a-service” concept to enterprise resources whether they are located on-prem or in the cloud.

As Patel points out, executives don’t care where or how data analytics occur, as long as they get the insights they need in a format they can use. The key, she says, is that “people who have to make decisions about the data get what they want in a cost-effective manner at the right time.”

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

