Storage of unstructured big data must be part of a company’s strategy


For many IT organizations, data storage is an afterthought, not a strategic concern. When it comes to big data management, however, storage needs to take center stage.

Image: Maxger, Getty Images / iStockphoto

Unstructured data is used to vividly document key events, capture paper documents in free-form digital formats, and report on business operations through sensors and other IoT devices. Yet a 2020 NewVantage survey of C-level executives found that only 37.8% of companies surveyed feel they have created a data-driven culture, and more than half (54.9%) felt they couldn’t compete with other companies on data and analytics.

SEE: Snowflake Data Warehouse Platform: A Quick Reference (Free PDF) (TechRepublic)

“About 43% of all the data organizations capture goes unused, which is huge untapped value when it comes to unstructured data. Understanding, integrating, and leveraging this unstructured data, and putting it to good use, is critical to business efficiency and growth,” said Jeff Fochtman, senior vice president of marketing at Seagate, which offers S3-compatible storage as a service. Fochtman was talking about the challenge of managing unstructured big data, which he said accounted for roughly 90% of all global data in 2020, according to research conducted by IDC.

A major problem is data management. Mastering it requires data architectures, tools, processing, and expertise, but companies must also think through their big data storage strategy.

To do this, unstructured data must first be cataloged and analyzed, but the cost burden often keeps businesses from performing these processing-intensive operations, which require large data centers and cloud architectures deploying very high-capacity, hard-drive-based storage systems. Second, once that data is processed, it must be replicable and reassignable so it can be delivered to the many departments and sites across an organization that need different types of data.

“The need to access unstructured data close to its source and move it, when necessary, to a variety of private and public cloud data centers to be used for different purposes is driving the shift from closed, proprietary, siloed IT architectures to open architectures and hybrid models,” Fochtman said.

SEE: Bridging the gap between data analysts and finance (TechRepublic)

In these hybrid models, data storage must be orchestrated so that different types of data are stored at different points in the business. For example, IoT data that tracks operational efficiency in real time can live on a server at a manufacturing facility on the edge of the enterprise, while data retained for compliance and intellectual property reasons can sit in the company’s on-premises data center.
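To make that orchestration concrete, here is a minimal sketch of how a placement decision might be expressed in code. The record types, site names, and target URIs are illustrative assumptions, not part of any specific product.

```python
# Illustrative placement rule for a hybrid storage model.
# Record types and target locations are assumptions for this sketch.
def storage_target(record_type: str) -> str:
    """Pick where a piece of unstructured data should live."""
    if record_type == "iot-telemetry":
        # Real-time operational data stays on an edge server at the plant.
        return "edge://plant-7/local-store"
    if record_type in ("compliance-record", "intellectual-property"):
        # Sensitive records are retained in the corporate data center.
        return "datacenter://corporate/secure-archive"
    # Everything else goes to shared cloud object storage.
    return "cloud://object-store/general"

print(storage_target("iot-telemetry"))  # edge://plant-7/local-store
```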

Because unstructured data is exactly that, unstructured, it must be labeled for its meaning and purpose before subsets of it can be disseminated to the different points in the business that have varying needs to know.
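As a rough illustration of that labeling step, the sketch below attaches business metadata to one object and checks which departments may receive it. The field names and department names are hypothetical.

```python
# Hypothetical catalog entry that labels one unstructured object with its
# meaning, purpose, and permitted consumers.
catalog_entry = {
    "object_key": "plant-7/vibration/2020-11-03.log",
    "content_type": "text/plain",
    "business_domain": "manufacturing-operations",
    "sensitivity": "internal",
    "allowed_consumers": ["operations-analytics", "maintenance-planning"],
}

def visible_to(entry: dict, department: str) -> bool:
    """Return True if the department has a need to know for this object."""
    return department in entry["allowed_consumers"]

print(visible_to(catalog_entry, "operations-analytics"))  # True
print(visible_to(catalog_entry, "marketing"))             # False
```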

The scale of these data storage, cataloging, security, and dissemination operations is impressive. It is leading more and more companies to turn to cloud-based storage that can be purchased as needed, without the prohibitive cost of upgrading corporate data centers with high-powered storage drives.
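For instance, S3-compatible object storage can be consumed on demand with a few lines of code. The sketch below uses boto3 against a hypothetical endpoint; the endpoint URL, credentials, bucket, key, and metadata values are placeholders, not a specific vendor's configuration.

```python
import boto3

# Hypothetical S3-compatible endpoint and credentials; substitute your provider's.
s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.example-storage-provider.com",
    aws_access_key_id="YOUR_ACCESS_KEY",
    aws_secret_access_key="YOUR_SECRET_KEY",
)

# Upload one unstructured file (a raw sensor log) with descriptive metadata so
# it can be cataloged and routed later without reopening the payload.
s3.upload_file(
    Filename="vibration-2020-11-03.log",
    Bucket="iot-raw-data",
    Key="plant-7/vibration/2020-11-03.log",
    ExtraArgs={
        "Metadata": {
            "source": "vibration-sensor-17",
            "site": "plant-7",
            "sensitivity": "internal",
        }
    },
)
```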

“Every industry dealing with 100TB to multi-petabyte mass data sets faces challenges in data transport and analysis,” Fochtman said. “For example, consider the healthcare industry. The 100TB-plus of data the industry collects is integral to protecting and treating the mental and physical health of communities. Hidden in the raw format of these big data sets can be correlations between diseases that we might not otherwise understand, more precise analysis of cancer data, or other life-saving learnings. But with such amounts of unstructured data, what’s the first step in getting value out of it? Often, it is setting that data in motion.”

SEE: How to Effectively Manage Cold Storage Big Data (TechRepublic)

This makes sense when you want to get the most value out of your big data, which every business wants to do. It also brings the conversation back to storage, which is so often left out of IT strategic planning when it shouldn’t be.

Instead, the strategic focus should be on cost-effective, agile data storage that can be scaled up (or down) as needed. Cloud-based storage is best suited for this task, with a more circumscribed role for on-premises data center storage, which would focus on retaining highly sensitive data for business compliance and intellectual property reasons.
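One concrete way to keep that cloud storage cost-aware is a lifecycle policy that moves aging objects to a cheaper tier and eventually expires them. The sketch below shows the boto3 call for an S3-style bucket; the bucket name, prefix, storage class, and retention periods are assumptions that depend on the provider and on compliance requirements.

```python
import boto3

s3 = boto3.client("s3")  # assumes credentials are configured in the environment

# Hypothetical rule: after 30 days raw IoT logs drop to a cheaper archive tier,
# and after a year they are deleted (adjust to your compliance window).
s3.put_bucket_lifecycle_configuration(
    Bucket="iot-raw-data",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-then-expire-raw-iot",
                "Status": "Enabled",
                "Filter": {"Prefix": "plant-7/vibration/"},
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 365},
            }
        ]
    },
)
```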

Attention should also be paid to how the data under management is distributed.

“We live in a data-driven world,” Fochtman said. “Successful businesses realize that if their mass data sets cannot move in an agile and cost-effective manner, and if the data is not readily accessible, business value suffers.”
