Facing the Challenges of Real-Time Analytics

Wednesday July 1st, 2015

Recently I’ve noticed more and more businesses and Government departments wanting to embed real-time analytics into their operational and business processes. There are of course some fantastic uses of real-time analytics, and real benefits to be derived from analysing data as it arrives. Here are a few:

Intelligent transportation systems

One of the most important applications of real-time big data analytics is enabling intelligent transportation systems. We are seeing far more sensors monitoring traffic conditions and crowded streets. Close to home, the ACT Government is trialling parking sensors to improve traffic conditions, and modern cars are now fitted with sensors as well. These sensors can be paired with communication capabilities such as GSM, satellite, WiFi, and Bluetooth to provide real-time monitoring of conditions such as vehicle locations, average speeds and driver behaviour, as well as road conditions.
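To make that concrete, here is a minimal sketch (the class, segment name and window size are all hypothetical) of the kind of rolling computation a traffic system might run as speed readings stream in from roadside sensors:

```python
from collections import defaultdict, deque

class SegmentSpeedMonitor:
    """Hypothetical sketch: keep a rolling window of the last N speed
    readings per road segment and answer average-speed queries."""

    def __init__(self, window=5):
        self.window = window
        # each segment gets a bounded deque, so old readings age out
        self.readings = defaultdict(lambda: deque(maxlen=self.window))

    def ingest(self, segment_id, speed_kmh):
        self.readings[segment_id].append(speed_kmh)

    def average_speed(self, segment_id):
        r = self.readings[segment_id]
        return sum(r) / len(r) if r else None

monitor = SegmentSpeedMonitor(window=3)
for speed in (60, 40, 20):  # traffic slowing on one segment
    monitor.ingest("northbourne_ave", speed)
print(monitor.average_speed("northbourne_ave"))  # → 40.0
```

A falling rolling average like this is exactly the kind of signal a traffic-management system could react to in real time, for example by retiming signals on that segment.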

Financial Markets

Huge amounts of financial data are generated every second for stock and option trades across multiple markets, currency exchange rates, interest rates, and commodity prices. This data is not only big but also highly dynamic. Companies and organisations can use it to detect opportunities and threats and to react to them quickly. One example is predicting an increase or decrease in the price of a security before the change actually occurs.

A timely reaction to such an opportunity might be buying securities before their prices rise, or selling before they drop. In addition, some real-time fraud detection systems have been developed recently to detect and prevent financial threats in a timely manner. Such systems currently rely on smaller, limited data sets to achieve real-time performance. For these systems to be more effective, however, they need to deal with all available data, and that financial data is usually huge and changes dynamically, which introduces several technical challenges.
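As an illustration of reacting to each price tick as it arrives, here is a deliberately naive momentum signal (the function name, lookback and threshold are hypothetical; a real trading system would use far richer models):

```python
def momentum_signal(prices, lookback=3, threshold=0.02):
    """Yield (tick_index, signal) whenever the price has moved more
    than `threshold` (as a fraction) over the last `lookback` ticks."""
    for i in range(lookback, len(prices)):
        change = (prices[i] - prices[i - lookback]) / prices[i - lookback]
        if change > threshold:
            yield i, "BUY"    # price rising faster than the threshold
        elif change < -threshold:
            yield i, "SELL"   # price falling faster than the threshold

ticks = [100.0, 100.5, 101.0, 103.5, 103.0, 99.0]
print(list(momentum_signal(ticks)))  # → [(3, 'BUY'), (4, 'BUY')]
```

The point is the shape of the computation: each new tick is evaluated immediately against recent history, rather than batched for later analysis.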

Military decision making

Modern wars are built on data, as the drone program demonstrates. The key to winning a war is not only strength but also the ability to collect accurate information about the current situation and make the right decisions quickly. Wars involve hundreds of different types of equipment, vehicles, structures and communication systems: tanks, armoured vehicles, transport and logistics vehicles, manned and unmanned aircraft, boats and ships, and underwater vehicles. The faster a military organisation can consume and analyse this data and make decisions on it, the quicker it can respond to threats on all fronts.

So that’s just a taste of a few real-time uses, and I’m sure you could find a use case for your business. But what about the challenges? Historically, data analytics has been about consuming data, processing it and outputting results, all of which takes time. We refer to this style of analytics as the open-loop approach: big data for a specific domain is analysed to obtain new information and knowledge that can then be used to enhance the operations or profitability of that domain. It is a slow process.

Real-time analytics, by contrast, is based on a closed-loop approach. In a closed loop, actions are taken based on the current and previous situations; in other words, the system reacts instantly to what’s happening around it. This presents some challenges:
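The contrast between the two styles can be sketched in a few lines (the function names, the threshold and the event values here are all hypothetical):

```python
def open_loop(events):
    """Open loop: collect everything first, analyse later.
    The insight arrives only after all the data is in."""
    return sum(events) / len(events)

def closed_loop(events, limit=80):
    """Closed loop: act the moment an event crosses a threshold,
    while the stream is still flowing."""
    actions = []
    for e in events:
        if e > limit:
            actions.append(f"throttle at {e}")  # immediate reaction
    return actions

stream = [50, 70, 90, 60, 85]
print(open_loop(stream))    # → 71.0 (one answer, after the fact)
print(closed_loop(stream))  # → ['throttle at 90', 'throttle at 85']
```

Same data, two very different postures: the open loop produces a report; the closed loop produces actions as events arrive.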

  1. Real-time event transfer – Data events need to be transferred in real time to where they can be processed. Events can be transferred from their distributed sources as raw events, or as filtered or aggregated events. Raw, filtered and aggregated events can be sent to a centralised processing point, or to distributed intermediate processing points for pre-processing or further filtering and aggregation before being transferred to the main decision-making unit. Filtering and aggregation should reduce network traffic and processing time without hurting the accuracy and optimality of the decision making in the real-time big-data application. Of course this is easier said than done! What is needed is a systems architecture with multiple platforms performing specific tasks.
  2. Real-time analytics – Real-time analytical processing may involve a single analytical service or multiple integrated ones. These services should predict performance and assess risk. They rely on the real-time event information mentioned in point 1, passed from the situation-discovery process, combined with offline stored big data such as maps and previous transactions, situations and decisions. Dealing with stored big data is usually challenging, and providing abstracted information to cut processing time can be just as hard. Real-time analytical services need fast algorithms that provide alternative options within a bounded time.
  3. Automated vs human-involved decision making – Ok great! We’ve been able to consume, transfer and analyse data quickly. But what about actioning the output of the analysis? Do we build in an automated process that makes a decision based on the result set, or wait for human intervention? Some things just can’t wait for humans to become involved, such as stock buys or sells. By contrast, although a drone can identify a target by itself based on heat signatures, the decision to fire a missile is still made by a human. (We may yet end up like Terminator and let drones make their own decisions.) With automated processes, we have to be entirely sure that we trust the data, the analytics and the decision-making logic.
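The first challenge, filtering and aggregating events before they cross the network, can be sketched like this (the function, source names and noise threshold are hypothetical):

```python
def aggregate_at_edge(raw_events, min_value=0):
    """Hypothetical edge pre-processor: drop noisy readings, then
    collapse the raw events into one compact summary per source, so
    only small records travel to the central decision-making unit."""
    by_source = {}
    for source, value in raw_events:
        if value < min_value:  # filter noise before it hits the wire
            continue
        by_source.setdefault(source, []).append(value)
    return {
        src: {"count": len(vs), "min": min(vs), "max": max(vs),
              "mean": sum(vs) / len(vs)}
        for src, vs in by_source.items()
    }

raw = [("sensor_a", 10), ("sensor_a", 30), ("sensor_b", -5), ("sensor_b", 20)]
print(aggregate_at_edge(raw))
```

Four raw events collapse into two summary records; the trick in practice is choosing aggregations that shrink traffic without discarding the detail the downstream decision needs.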
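For the second challenge, bounded-time analytics, single-pass (online) algorithms are the usual answer: each event is scored in constant time with no re-scan of history. A minimal sketch using Welford’s online mean/variance algorithm (the class name and values are hypothetical):

```python
import math

class OnlineStats:
    """Welford's algorithm: running mean and variance updated in O(1)
    per event, so anomaly scores arrive within a bounded time."""

    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def zscore(self, x):
        """How many standard deviations x sits from the running mean."""
        if self.n < 2:
            return 0.0
        std = math.sqrt(self.m2 / (self.n - 1))
        return (x - self.mean) / std if std else 0.0

stats = OnlineStats()
for v in (10, 11, 9, 10, 12):   # a stream of normal-looking events
    stats.update(v)
print(stats.zscore(25))          # a far-outlying event scores high
```

The alternative, recomputing statistics over all stored history for every event, cannot meet a bounded-time guarantee once the data is big.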
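And the third challenge, deciding when a human must stay in the loop, often comes down to a routing gate like the hypothetical sketch below (the action names, threshold and `HUMAN_ONLY` list are invented for illustration):

```python
def decide(result, auto_threshold=0.95):
    """Route an analytic result either to automated action or to a
    human review queue, based on the model's confidence."""
    action, confidence = result
    if confidence >= auto_threshold:
        return ("auto", action)         # machine acts immediately
    return ("human_review", action)     # a person makes the final call

results = [("sell_stock", 0.99), ("fire_missile", 0.97)]
routed = [decide(r) for r in results]

# Some actions should never be automated, regardless of confidence:
HUMAN_ONLY = {"fire_missile"}
routed = [("human_review", a) if a in HUMAN_ONLY else (gate, a)
          for gate, a in routed]
print(routed)  # → [('auto', 'sell_stock'), ('human_review', 'fire_missile')]
```

Note the two separate controls: a confidence threshold for speed-sensitive decisions, and a hard policy list for decisions that must always go to a human, no matter how confident the analytics are.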

So real-time analytics is a great idea in theory but complicated in its design and delivery. Traditional data collection and analysis takes a long time to complete, so real-time analytics challenges our traditional architectures and methods. To build effective real-time big data applications, several challenges need to be addressed: real-time event (data) transfer, real-time analytics, and real-time decision making. At Teradata we are enabling real-time analytics through the adoption of technologies such as Apache Spark, Shark and Kafka.

So if you’re considering real-time analytics, you’ll have to radically change your thinking and adopt new approaches. Re-invention will be your mantra.

Ben Davis is a Senior Architect for Teradata Australia, based in Canberra. With 18 years of experience in consulting, sales and technical data management roles, he has worked with some of the largest Australian organisations in developing comprehensive data management strategies. He holds a degree in Law and a postgraduate Masters in Business and Technology, and is currently finishing his PhD in Information Technology with a thesis on executing large-scale algorithms within cloud environments.

This blog first appeared on Forbes TeradataVoice on 30/06/2015

Ben Davis

Senior Pre-Sales Consultant, Federal Government at Teradata
Ben Davis is responsible for pre-sales activities in the Federal Government market in Canberra. Ben consults across a broad range of government departments developing strategies to better manage the continual flood of data that these organisations are now facing. Ben is a firm believer that management of data is a continual process not a one off project that if managed correctly will deliver multiple benefits to organisations for strategic decision making. Previously Ben spent 6 years at IBM in a Senior Data Governance role for Software Group Australia & New Zealand and 10 years at Fujitsu in pre-sales and consulting roles. Ben holds a Degree in Law from Southern Cross University, a Masters in Business & Technology from the University of New South Wales and is currently studying for his PhD in Information Technology at Charles Sturt University. His thesis studies focus on data security, cloud computing and database encryption standards.