We’re constantly told that the ability to analyse and dissect vast sets of data in real time is having a transformational effect on the ICT industry.

A new wave of applications designed to leverage the insights gained from real-time data analysis is entering the market, triggering the arrival of more intelligent enterprises.

These kinds of organisations, as SAP Africa’s Julie Tregurtha, head of database & data management, views it, will need to tackle the growing challenge of handling the complexity of such data sets if they are to come to fruition.

“The escalating volume of data and the organisational complexity required to collect and store the data and then analyse it for insights that can deliver business value is forcing IT leaders to fundamentally rethink their data storage and database management strategies,” points out Tregurtha.

This requirement will only intensify with the exponential rise in the amount of data that more IoT ecosystems will create, she adds.

Limiting agility

Part of the problem is companies’ lack of preparedness for the vast amounts of data they need to tap into and analyse in real time, as well as for storing and accessing that data efficiently.

By 2020, for example, data scientists estimate that the volume of data created by companies will double each year to reach 40 zettabytes (ZB), with the inclusion of IoT systems increasing that figure to around 44ZB.

This massive amount of data is compounded by the fact that many business leaders are simply unable to handle the complexity it represents. A recent survey by SAP found that 74 percent of business leaders feel their data landscape is too complex, severely limiting their agility, with 86 percent believing they are not getting the most out of their data.

This is an issue that better data management strategies can alleviate, according to Tregurtha.

“Considering the sheer volume of data a modern enterprise generates, data storage strategies need to evolve quickly to enable IT leaders to deliver real-time value to the business using accurate insights gained from verified data,” notes Tregurtha.

Plotting the course

An interesting solution for storing and managing data is to classify it according to temperature, says Tregurtha, ranking sets of data from hot to cold.

Hot data is classified as being the most accessed and includes content such as the latest sales data, with this type of information also accounting for less than 20 percent of an organisation’s stored data. As it requires immediate access, this kind of data is best stored in-memory.

Cold data on the other hand would be the other 80 percent in this scenario, and is accessed about 10 percent of the time by employees or analysts. It’s therefore less cost-effective to store this type of data in-memory.

“Enterprises wishing to allocate a temperature to different types of data to more effectively store and retrieve such data need to look at a form of dynamic storage hierarchy. While it’s important to have experienced and skilled database administrators driving the initial process of developing the data storage strategy, it is impossible for humans to determine the heat of data manually,” explains Tregurtha.
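The dynamic, heat-based tiering Tregurtha describes can be sketched as a simple access-frequency heuristic. The example below is purely illustrative and is not SAP HANA’s actual implementation; the record names and ranking logic are assumptions, with only the roughly 20 percent hot-data figure taken from the article.

```python
from dataclasses import dataclass, field
from time import time

@dataclass
class Record:
    """A stored data item with basic access statistics."""
    key: str
    access_count: int = 0
    last_access: float = field(default_factory=time)

def classify_by_heat(records, hot_fraction=0.2):
    """Rank records by access frequency; the most-accessed `hot_fraction`
    are 'hot' (candidates for in-memory storage), the rest are 'cold'
    (candidates for cheaper storage such as Hadoop)."""
    ranked = sorted(records, key=lambda r: r.access_count, reverse=True)
    cutoff = max(1, int(len(ranked) * hot_fraction))
    return ranked[:cutoff], ranked[cutoff:]

# Usage: ten hypothetical records with varying access counts.
records = [Record(f"sales-{i}", access_count=i * 10) for i in range(10)]
hot, cold = classify_by_heat(records)
print([r.key for r in hot])  # the two most-accessed records
```

In a real system the heat score would also weigh recency and business rules, which is why, as Tregurtha notes, the ranking has to be computed continuously by the platform rather than assigned manually.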

This is where data management platforms like SAP HANA can deliver the best value to customers if implemented correctly.

“SAP HANA enables advanced analytics running alongside high-speed transactions in real-time for hot data while also integrating with cold data solutions such as Hadoop as well as other external data storage solutions,” enthuses Tregurtha.

With the right data management platform in place, more intelligent enterprises can come to the fore, she concludes.

[Image – CC0 Pixabay]