Part 1 – History of Data Management
Let’s plot the advance of Data Management through the years; the axes (plural of axis; not my fault) will be Volume (self-explanatory, but think logarithmic scale) and Intricacy, which encapsulates complexity, variability, velocity, veracity and whatever other V-words vendors keep throwing at it…
In the early days of computing, each application tailored its data structures to the specific problem at hand. Volumes were low, data was carefully structured, and multiple data sources were unheard of.
Larger applications used network or hierarchical databases, still tailored to each application.
In the late 1970s, companies started implementing Ted Codd’s ideas: the Relational Database. This broke the tight interdependence between application and data structure. It also standardised data access around SQL. At the same time, spreadsheets became popular, taking over simple data management from tailored systems.
Then, in the 1990s, it became apparent that there is a lot of value in combining data from multiple applications. It also became apparent that this is not an easy task. Data Marts emerged (although the term was coined somewhat later), followed by Data Warehouses.
And then something interesting happened.
A new set of companies emerged with a new type of business problem. Yahoo, Google, Amazon and, later, Facebook faced data challenges like never before: volumes were orders of magnitude higher than anything seen previously, data did not have a simple, coherent structure, and their main users expected all of this for free!
They needed a cheap solution for fast access to large amounts of data, so they developed a distributed file system to facilitate that access. Hadoop was born.
And then the world became jealous.
People started asking: “If they can benefit from this free data management framework, can we do the same?”. Using it for ‘normal’ business advantage is what became known as ‘Big Data’.
So, for 50 years, IT had been telling the business that intricate data is hard; that they should carefully select what data to keep and what data to analyse.
Big Data changed all of that. Now we tell the business: “keep everything; it can be useful for analytics”.
Ben Bor is a Senior Solutions Architect at Teradata ANZ, specialising in maximising the value of enterprise data. He gained international experience on projects in Europe, America, Asia and Australia. Ben has over 30 years’ experience in the IT industry. Prior to joining Teradata, Ben worked for international consultancies for about 15 years, and for international banks before that. Connect with Ben Bor via LinkedIn.