Data storage

A look at the evolution of data storage

By Sahil Chawla, CEO and Co-Founder, Tsecond

Many factors drive the evolution of data storage. The first is digital creation: whether it's media, Industry 4.0, IoT, or corporate digital infrastructure, these are the enablers generating the world's huge volumes of data.

Consider how we use machines to work and play in our households. Before the pandemic, the majority of the workforce was in offices, with all connected devices in one place. Now, as we work from home, in cafes, and in coworking spaces, the video conferencing technologies and social media we use generate more and more data. As a result, a large amount of data is captured at the edge, both static (cafes, etc.) and mobile (planes, cars, ships, etc.).

Turning to Industry 4.0, every company is adopting smarter, more connected machines. IoT is no longer just for the workplace; it has become part of everyday life. Home automation, for example, generates a huge amount of data. Think of the Alexa and Google Home devices used around the world: everything from household appliances to our children's video games generates data.

The evolution of higher-capacity data storage

Over the past two decades, the exponential increase in data usage has driven data centers to impose stringent requirements for greater storage capacity per unit of floor space and faster data transmission, and the industry has continued to evolve to meet them. Innovators have focused on finding ways to achieve greater capacity and faster throughput while using limited space and staying within their energy budgets.

Flash technology has become popular due to its small size and its ability to deliver information faster while consuming significantly less power than hard drive technology. However, while this option solves some size and power issues, it has limitations. For example, most flash devices endure only a limited number of program/erase cycles: you can write data to them only so many times before the cells wear out.
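To make that limit concrete, here is a minimal back-of-the-envelope sketch in Python. All figures in it are illustrative assumptions, not vendor specifications: the capacity, rated program/erase cycles, write-amplification factor, and daily workload are hypothetical values chosen only to show how the endurance arithmetic works.

```python
# Back-of-the-envelope flash endurance estimate.
# All figures below are illustrative assumptions, not vendor specifications.

capacity_tb = 1.0          # drive capacity in terabytes (assumed)
pe_cycles = 3000           # rated program/erase cycles per cell (assumed)
write_amplification = 2.0  # internal writes per host write (assumed)

# Total data the host can write before the cells wear out
# (terabytes written, TBW).
tbw = capacity_tb * pe_cycles / write_amplification
print(f"Estimated endurance: {tbw:,.0f} TB written")

# At an assumed sustained workload of 100 GB of writes per day,
# that endurance translates into a service life in years.
daily_writes_tb = 0.1
print(f"Estimated lifetime: {tbw / daily_writes_tb / 365:.1f} years")
```

Under these assumed numbers the drive could absorb roughly 1,500 TB of writes; a heavier workload or higher write amplification shortens that life proportionally.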

Over the past 90 years, data storage has evolved from drums and tapes to hard drives, then mixed media, flash, and finally cloud storage. That's where we are today, and as our storage needs grow, innovation continues on many fronts.

The Paradigm Shift to Edge Data Storage
Big Data plays a central role in almost everything we do these days, but it's no longer enough simply to have access to data-driven insights, especially if they are stale. As the amount of data generated increases and data capture moves ever closer to edge environments, prompt processing is essential to deliver timely intelligence that reflects real-time circumstances.

Organizations are under increasing pressure to obtain and apply information quickly, before situations change. This fact compels business leaders across all industries to embrace active data and deploy ways to capture and transport it for immediate processing.

Big Data Management Challenges
To optimize AI for the future, we also need high-performance systems. These can be on-premises storage or cloud-based systems that feed modern, data-intensive applications. The more data these applications receive, the richer the insights their algorithms can deliver, whether for analytics or business intelligence tools. This is commonly referred to as data mining, and in the past we did it by putting the data in a warehouse and then running applications to process it.

However, these methods are full of challenges. Devices now generate ever-increasing amounts of information continuously. Whether the source is a self-driving vehicle or a healthcare system, and whether the platform is a drone or another advanced device, all can generate more data than ever before. So far, the data management industry has been unable to capture these quantities, whether over networks such as 5G, in the cloud, or through any other storage method.

These circumstances have led to the loss of an estimated 90% of the data collected, due to insufficient storage capacity and the inability to process it quickly and transmit it to a data center. The same applies to critical data captured at remote sites that lack connectivity or cloud applications running at the edge.

It is essential that we develop solutions to these challenges: timely capture, transport to data centers or the cloud, and immediate processing using a single, lightweight storage platform that allows us to reap the benefits of putting AI at the service of humanity.

Preparing for the future
CIOs are always looking for new ways to operate infrastructure profitably. To prepare your business for high-performance results in the future, every application you run must be highly responsive. But not all data is consulted all the time: some datasets are viewed only for a short period and then go out of date. To optimize your infrastructure for cost, performance, and space, you should:

• Analyze data characteristics and identify datasets and segments that are not constantly needed.
• Design a data architecture that includes slower storage for non-critical data; it could be a hybrid of fast flash technologies and hard drive technologies.
• Layer the architecture so that only active data requires fast storage and the rest can be stored in slower tiers, as in the sketch after this list.
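
As a minimal illustration of that layering idea, the Python sketch below routes datasets to tiers based on how recently they were accessed. The 30- and 180-day cutoffs, the tier names, and the sample datasets are hypothetical assumptions chosen for the example, not a product recommendation or a measured policy.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical tiering policy: route datasets to storage tiers by
# access recency. Thresholds and tier names are illustrative assumptions.

@dataclass
class Dataset:
    name: str
    last_accessed: datetime

def assign_tier(ds: Dataset, now: datetime) -> str:
    """Map a dataset to a storage tier based on how recently it was read."""
    age = now - ds.last_accessed
    if age <= timedelta(days=30):
        return "hot (flash/SSD)"     # active data: fast, expensive storage
    elif age <= timedelta(days=180):
        return "warm (hard drives)"  # occasionally read: slower, cheaper
    else:
        return "cold (archive)"      # rarely read: cheapest, slowest

if __name__ == "__main__":
    now = datetime(2023, 6, 1)
    datasets = [
        Dataset("sensor-feed", now - timedelta(days=2)),
        Dataset("q1-reports", now - timedelta(days=90)),
        Dataset("2019-archive", now - timedelta(days=900)),
    ]
    for ds in datasets:
        print(f"{ds.name:>14} -> {assign_tier(ds, now)}")
```

In practice such a policy would also weigh access frequency and business criticality, but even this simple recency rule shows how only the active slice of data needs to occupy fast storage.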

SSD flash drives already come in a range of product types, and high-density storage systems will follow the same trajectory. The Big Data space is seeing groundbreaking innovations to address storage, migration, and deployment challenges, and one needs to stay current to benefit from the data we have.