7 Tips on Moving from Big Data to Value

Thursday August 21st, 2014

Teradata ANZ held the second of its 2014 Teradata ANZ Summit sessions this week, and we were entertained and informed by experts in the art of extracting value from information. Here are my top 7 takeaways from the event:

Today’s Big Data will be tomorrow’s small data: Bill Franks, Teradata’s Chief Analytics Officer, told us that whilst there is a lot of hype around Big Data today, there is good evidence that it will eventually deliver value to everyone. “I believe that in 10 years’ time, we will have achieved all the hyped expectations for Big Data, but that it will have been harder to achieve and taken longer than the over-hyped claims of some IT sales people suggest”, he said. He referred back to the Dot Com bubble of the late 1990s and pointed out that today’s internet-enabled environment delivers all of those over-inflated claims and more, but that it took longer and was harder to achieve than originally expected.

This, Bill believes, will also be true of Big Data. As we develop approaches for integrating Big Data into analytics environments, it will become more manageable, deliver increasing business value, and gradually become just “data” that helps inform business decisions, rather than a separate or special case. Integrating data into a single analytics ecosystem remains the best way of delivering on the value and promise of Big Data. Treating it as a separate case leads to siloed approaches that limit the ability to use the data effectively.

Just as spinning e-commerce off into a separate organisation from bricks-and-mortar retailing caused retailers difficulties they are only now starting to unwind, separating Big Data into its own organisational unit or technology platform will leave it isolated and less effectively integrated than a common platform would allow.

People are starting to buy on analytics capability rather than fashion: Analytics is starting to move out of the back office of an organisation and become connected to products and valued by consumers. Examples include heavy equipment (trucks, tractors, mining gear, etc.) that is fitted with tracking and monitoring devices that feed performance, location and status information to its users in real time.

Another example is health-monitoring equipment such as the Fitbit or Nike FuelBand that early adopters are buying to track sleep, exercise and diet patterns. Banks are winning customers over by providing expense analytics, and utilities are providing usage and greenhouse gas emission analytics. This is leading to the monetisation of analytics as a value to the customer, rather than just a cost of operations. More analytics are likely to be monetised, including the sale of aggregated, anonymised data from organisations to their suppliers or customers.

Starting small is valid: Bill Franks believes people assume that because it’s called “Big Data”, you have to start with a large and expensive environment. “Don’t get sucked into the hype of having to go huge just to get started”, he said. Anchoring bias (the tendency to base decisions on an initial reference point) leads people to think that anything called Big needs a big machine to process it.

Big Data can be very small in volume but high in complexity and value. A “start small, grow large” strategy to prove value works just as well in the Big Data world as it does for any other project, and small projects with provable value will reduce management’s fear and doubt about the value of Big Data.

Firm foundations allow for change: The Royal Bank of Canada’s Dr Mohammad Rifaie talked about the long-term journey the bank has undertaken to build a strong data warehouse environment, one that has proven able to adapt to changing business needs, technology and expectations over the last 15 years.

Starting in 1995, RBC consolidated all its information services and assets into a single group with the goal of creating an integrated analytics environment. Using an integrated data model as its central principle, this team has built a robust data and information management process that creates a single integrated data warehouse, spread over various technologies (the Teradata Data Warehouse, Teradata Aster Big Data Discovery and Hadoop stores) as appropriate for each type of analytic or data required.

This solid foundation has, however, also expanded over time to enable an Agile Analytics Factory for rapid user-driven development, a Big Data loading and integration process, text mining, social media analysis and other analytics platforms. All these extensions draw on a single integrated store of data, which ensures single business definitions of metrics and common reporting time windows, and drives business benefit.

Using this approach, the cost of the platform has risen only 9% over the last 10 years, whilst usage and business value have increased by more than 600%.

Know your Customer but don’t Overshare: The point of all this data and analytics is to provide a better service to your customer. Royal Bank of Canada’s data warehouse is used to evaluate each customer’s likely future cash needs and offer them, the next time they are at an ATM, an overdraft option if they are likely to run low before their next pay.

This service, which generates $5 in fees for each customer who accepts, generates sufficient revenue by itself to pay for the entire data warehouse program. RBC is also implementing programs to evaluate the likelihood of ATM fraud by comparing the time and distance between two ATM transactions. If no one person could have travelled that far in the time available, the pair of transactions probably indicates fraud.
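The velocity check described above can be sketched in a few lines. This is a minimal illustration of the idea, not RBC’s actual implementation; the speed threshold, the haversine distance helper and the sample coordinates are all assumptions for the example:

```python
from math import radians, sin, cos, asin, sqrt

MAX_TRAVEL_KMH = 120  # assumed plausible top speed between ATM visits


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))


def looks_fraudulent(txn_a, txn_b, max_speed_kmh=MAX_TRAVEL_KMH):
    """Flag a pair of ATM transactions whose implied travel speed is impossible.

    Each transaction is (timestamp_in_seconds, latitude, longitude).
    """
    t_a, lat_a, lon_a = txn_a
    t_b, lat_b, lon_b = txn_b
    hours = abs(t_b - t_a) / 3600
    distance = haversine_km(lat_a, lon_a, lat_b, lon_b)
    if hours == 0:
        return distance > 0  # simultaneous use at two different locations
    return distance / hours > max_speed_kmh


# Cards used in Sydney and Melbourne (~714 km apart) 30 minutes apart: flagged.
print(looks_fraudulent((0, -33.87, 151.21), (1800, -37.81, 144.96)))  # True
# Two transactions across town an hour apart: not flagged.
print(looks_fraudulent((0, -33.87, 151.21), (3600, -33.88, 151.20)))  # False
```

In a production system the comparison would run over consecutive transactions per card, and the threshold would likely be tuned rather than hard-coded.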

However, as Dr Rifaie asked, “how do you determine the difference between creepy and clever?” when using analytics. There is a fine line here, and his advice is to consider any use of analytics from the customer’s perspective to determine whether it is appropriate. The acid test is “if someone else did it to your data, would you object?”. Dr Rifaie singled out KLM’s recent “Meet and Seat” program (where customers can opt in to share their social media profile so that other passengers can choose whom to sit next to) as a little too far on the creepy side for his taste.

Data is not Information: This issue has raised its head frequently over the last 30 to 40 years in BI and analytics. Data in and of itself has no value. It gains value when someone uses it in an analytic to produce information that informs a business decision. That process converts data to information, and information to intelligence. Dr Rifaie quoted his mother on the difference between information and intelligence: “knowing a tomato is a fruit is information. Knowing not to put a tomato in a fruit salad is intelligence”.

Information has a Value: Robert Hillard, COO of Deloitte Australia, spoke eloquently about the value of information in an economy affected by digital disruption. With 32% of the economy in industries already substantially affected by information-enabled competition, information and its value have high importance in strategy planning. Robert advocated an information economy within organisations, where consumers of information pay their providers for the use of quality data.

This encourages the collection and provision of quality data, and creates a marketplace for information distribution that lets organisations focus on valuable data that will drive business success, and stop wasting time and resources on data that no one values.

I’m now looking forward to the third Teradata ANZ Summit series, to be held in early November.  Let’s see what else we can learn about how to create and maintain value from information.

David Stewardson is a Senior Consultant in the Teradata Solutions Group. He has a very strong technical background and business acumen, with over 23 years’ experience in the Data Warehouse business, specialising in Business Intelligence. During his extensive career, he has worked in 6 countries and across 8 different industries (including Mining, Finance and Insurance, Utilities and Telecoms), and has been responsible for managing teams ranging in size from five up to 150 in previous Business Analysis, Project Manager, Program Manager and Program Director roles. Connect with David Stewardson on LinkedIn.


One thought on “7 Tips on Moving from Big Data to Value”

  1. srinivas

    This is one of the most valuable sessions. Big data without analytics is limited to storage, which is available far cheaper than before.

    Turn the data into information that can contribute significantly to business value.

