Data management

A decade in review: Challenges in data storage in 2014 and today

A guest post by Federica Monsone


While a decade brings little change in many fields, data storage and management technology undergoes a revolution in that span. How have the challenges and solutions changed since 2014, and what do experts expect for the future?

How will storage media continue to evolve in the future to keep up with exponential data growth? (Image: free licensed / Pixabay)

Federica Monsone is the founder and CEO of A3 Communications.

Einstein knew that how quickly time seems to pass depends on the observer's perspective. To a dog, a year may feel roughly as long as seven years do to a human. In the data storage industry, meanwhile, change happens much faster than in many other areas of human activity.

How do the data storage and management challenges that companies faced ten years ago compare with those of today? And what do experts say about how the current storage landscape and its increasingly complex challenges are shaping technology development?

Then and now: Changing storage requirements

More than one expert notes that the data storage challenges facing IT organizations in 2014 were quite similar to those of today, at least at a high level. "The challenges haven't changed much, even if the technology has. The biggest challenge was probably dealing with the constantly rising demand for storage capacity. The second was protecting the data. Even if the intensity of ransomware attacks wasn't as high as it is today, data protection was a big issue. The third was that there weren't enough staff available to handle the volume of storage. This personnel problem has only gotten worse since then," says Randy Kerns, Senior Strategist and Analyst at the analyst firm Futurum Group.

Brock Mowry, CTO & VP of Products at storage system provider Tintri, agrees, but adds an important caveat. "The challenges are essentially the same as they were ten years ago, but the scale and scope of these challenges have dramatically changed," he says.

Pain point storage capacity: Data growth with no end in sight

Erfane Arwani, CEO of Biomemory, a start-up specializing in DNA storage and synthesis, emphasizes how difficult it was to keep up with data growth in 2014: "Companies were struggling to cope with exponential data growth and with technology solutions that were not yet optimized for large amounts of data." Arwani points out that a decade ago, hard drive capacities ranged only between 1 and 4 TB. Capacities have jumped significantly since then: today, the highest-capacity hard drives hold 30 TB. At the same time, the use of flash storage in data centers has increased markedly, and the largest enterprise flash drives now exceed 60 TB.

In 2014, companies were still focused on on-premises storage and used public cloud storage services to a lesser extent than today. "It was about choosing between NAS and SAN, and cloud solutions were comparable to ice baths: beneficial, but not suitable for everyone," says Ferhat Kaddour, Vice President of Sales and Alliances at Atempo, a data protection and management software provider. Ensuring sufficient overall capacity for a company was a multi-faceted task. "The scalability challenge consisted of predicting future storage needs, optimizing storage use, and implementing effective strategies for storage partitioning," says Drew Wanstall, Vice President of Business Development at Scale Logic, a provider of storage and workflow infrastructures for media production.
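A back-of-the-envelope way to handle the forecasting part of that task is a simple compound-growth projection. The Python sketch below is purely illustrative; the starting capacity, growth rate, and planning horizon are assumed figures, not numbers from the article:

```python
# Illustrative capacity planning: project storage demand from an assumed
# compound annual growth rate (CAGR). All figures are hypothetical.

def project_storage_tb(current_tb: float, annual_growth: float, years: int) -> list[float]:
    """Return the projected capacity demand in TB for each of the next `years`."""
    projections = []
    demand = current_tb
    for _ in range(years):
        demand *= 1 + annual_growth  # compound the growth year over year
        projections.append(round(demand, 1))
    return projections

if __name__ == "__main__":
    # Example: 200 TB today, assumed to grow 35 percent per year, over 5 years.
    for year, demand in enumerate(project_storage_tb(200.0, 0.35, 5), start=1):
        print(f"Year {year}: ~{demand} TB required")
```

With those assumed numbers, demand roughly doubles every two to three years, which is why, as Wanstall notes, optimizing use and partitioning mattered as much as raw forecasting.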

Today, the amount of data is still growing rapidly. "It's interesting to see how data keeps growing at a frantic pace," says Enrico Signoretti, Vice President of Products and Partnerships at Cubbit, a provider of geo-distributed cloud storage systems. Valéry Guilleaume, CEO of Nodeum, a data management software provider, names some of the new data sources that are driving this growth and have already ushered in the era of so-called big data: "Today, it's not just users who generate data, but also the systems being developed in individual industries, such as data-generating cars, electron microscopes, blade scanners, or even seismic sensors. These new sources generate data at a speed that is not comparable to the data-generating sources of ten to fifteen years ago."

The difficulty of scaling physical storage capacity to keep pace with data growth has, however, been somewhat mitigated by the increasing use of public cloud storage and by improvements in data storage technology. Among the technological developments of the past ten years, the enormous price drop in flash storage is particularly noteworthy, as it has led to widespread use of flash in corporate data centers. "The demand for capacity continues, but the capacity and performance of flash allow for greater consolidation and fewer physical systems; less energy, cooling, and space; and simpler means of boosting performance," says Kerns. "The technology to solve problems is available and more effective than ten years ago."


While other experts believe that storage scalability remains a major problem, Kerns' view is shared by other industry analysts. "More data does make management more complex, but less so than in the past. Storage solutions today are much more scalable than before. The data explosion, particularly in the field of artificial intelligence, presents the difficulty of finding the right data, putting it into the right, clean format, and making it quickly available. The challenge today lies not so much in storing data as in using it," says Scott Sinclair, Practice Director at the analyst firm Enterprise Strategy Group (ESG).

Data security, data protection, data mobility: Modern requirements

David Norfolk, Practice Leader at analyst firm Bloor Research, says: "The technical problems of ten years ago have largely disappeared. Storage is now cheap, reliable, and easy to scale. However, storage management, including threat management, is a cost factor today."

The threats Norfolk mentions include cyberattacks, which, according to several experts on our panel, have increased significantly in number and intensity over the past ten years. "Security is clearly today's biggest challenge for data storage. While there have always been security threats from malicious actors and users, today's problems are indeed more difficult and costly to manage, owing to well-organized and well-funded ransomware actors who often belong to state-sponsored groups," says Paul Speciale, Chief Marketing Officer at object storage specialist Scality.

"With the ongoing ransomware boom and the emergence of malicious AI tools and as-a-service cybercrime models, data protection is now at the top of the challenges in the storage area. Not only is the number of security breaches increasing, but their impact is also increasing with improved tactics (multiple extortion), or the dual-strain attacks observed lately," says Sergei Serdyuk, Vice President of Product Management at Nakivo, a provider of backup, ransomware protection and disaster recovery solutions.

This is not the only change in the IT landscape that has driven up storage management costs. Ten years ago, data growth was driven by the general digitization of the economy and the increasing use of analytics. Now it is also being fueled by the need to collect data for training AI and machine learning systems and, as Guilleaume describes, by the growth of the Internet of Things as a data source. Although the term IoT was coined in the 1990s, it has only become a daily reality in the last ten years. At the same time, companies are generating more unstructured data, which now makes up the majority of the data they store. Unlike structured data, unstructured data is not organized according to a predefined database schema, which makes it considerably harder to manage.

High data quality: The key to a future-proof data strategy

"Today, one has to find one's way in a huge ocean of big data. From customer interactions to collected sensor data - even smaller companies process petabytes, larger ones even exabytes. The difficulties lie not only in the sheer amount of data, but also in the strategic tactics required to extract, categorise and protect them," says Kaddour. Norfolk of Bloor Research points out a critical target that is difficult to achieve when using unstructured data: "Quality, because the data comes from a swamp and not from a proper database."

"Efficient management of data at the edge has become critical. Ensuring data availability and fail-safe in distributed environments poses new challenges," says Johan Pellicaan, Vice President and Managing Director at Scale Computing, a provider of edge computing, virtualisation, and hyperconverged solutions.

Companies not only have to secure their data at the edge; they also have to be able to move data between different locations. "Today's challenges are all related to the movement of data in multi- and hybrid-cloud environments. About 50 percent of companies report that they constantly or regularly move data between on- and off-premises environments. These issues are harder to tackle because the environments are so diverse, with data spanning AWS, Azure, GCP, the data center, and the edge," says Sinclair of ESG.
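To make that kind of cross-environment movement concrete, here is a minimal sketch that copies a single object from AWS S3 to Azure Blob Storage using the vendors' official Python SDKs (boto3 and azure-storage-blob). The bucket, container, and object names are placeholders, credentials are assumed to come from the environment, and a real pipeline would stream large objects in chunks rather than buffering them in memory:

```python
# Minimal sketch: copy one object from AWS S3 to Azure Blob Storage.
# Requires `pip install boto3 azure-storage-blob`; all names are placeholders.
import io
import os

import boto3
from azure.storage.blob import BlobServiceClient


def copy_s3_object_to_azure(bucket: str, key: str, container: str) -> None:
    # Download the S3 object into memory (fine for small objects; large
    # objects should be streamed in chunks instead).
    s3 = boto3.client("s3")  # credentials resolved from the environment/IAM
    buffer = io.BytesIO()
    s3.download_fileobj(bucket, key, buffer)
    buffer.seek(0)

    # Upload the same bytes to an Azure Blob Storage container.
    azure = BlobServiceClient.from_connection_string(
        os.environ["AZURE_STORAGE_CONNECTION_STRING"]
    )
    blob = azure.get_blob_client(container=container, blob=key)
    blob.upload_blob(buffer, overwrite=True)


if __name__ == "__main__":
    copy_s3_object_to_azure("example-bucket", "reports/2024.parquet", "example-container")
```

Even this toy example hints at why multi-cloud data movement is costly: each environment brings its own SDK, authentication model, and egress considerations.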