Data Analytics in 2016: The 6 Trends We Must Be Aware Of

By Ben Davis | Monday, December 21st, 2015

As 2015 draws to a close, a year in which more organisations, both private and public, sought to integrate data analytics into their processes, and in which hundreds of new products and features hit the market to support this space, we start to think about 2016 and where we are heading as an industry. Below I look at six top trends for analytics in 2016 and why we need to be aware of them.

  1. The big question on every executive’s lips next year will be how to “monetize” their data. How will we leverage both the technology and the people of the organisation to derive value from the data we generate and consume? Monetization can take the form of increasing existing revenue streams, finding new revenue streams, or using the data to create efficiencies that reduce cost.

Why do we need to be aware of this? Financial pressures will dictate that, as an analytics practice, we must continue to answer to the business and show that data is the single most important asset the company has today. Data has a value, and we must demonstrate that value to the holders of the purse strings if we are to continue developing new insights.

  2. Whilst on the topic of monetization, I predict we’ll see a big increase in the monetization of the algorithms used to execute analytical strategies. Until now we have focused on developing our own algorithms or leveraging the out-of-the-box algorithms that ship with analytic platforms such as Aster. Increasingly, however, the developers of these algorithms will look to put their IP to broader use in return for royalty payments. Still in its infancy is the concept of the online algorithm marketplace, where any organisation can procure a pre-built algorithm on a per-use charge basis to run against its own data; Algorithmia is one such example. Many organisations simply cannot afford to hire or retain a data scientist to develop new capabilities for their business, and they will instead seek out these functions online.
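
To make this concrete, here is a minimal sketch of what procuring a marketplace algorithm can look like, using Algorithmia’s Python client as the illustration. The API key and the “nlp/SentimentAnalysis/1.0.5” listing are hypothetical placeholders rather than a recommendation, and the exact billing model depends on the marketplace.

```python
# Minimal sketch: paying per call for someone else's algorithm instead of
# building your own. The API key and algorithm listing are hypothetical.
import Algorithmia

client = Algorithmia.client("YOUR_API_KEY")  # calls are metered against your key

# Pipe your own data through a pre-built, third-party algorithm
algo = client.algo("nlp/SentimentAnalysis/1.0.5")
response = algo.pipe("The new service launch exceeded expectations.")
print(response.result)
```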

Why do we need to be aware of this? Look beyond what you can do in-house, and start to evaluate whether it is quicker and cheaper to build an algorithm yourself or to buy one.

  3. Let’s face it: the data analytics industry is currently very tight on skilled, available resources. Hiring and retaining data science skills will become increasingly difficult in 2016 and will, in fact, affect the delivery of major initiatives. On the ground, the “good” data scientist is still a rare commodity. By good I mean someone who doesn’t just have a working understanding of several programming languages, but who also comes equipped with the knowledge of how to seek out the data and apply a rigorous process to it. In the public sector alone there is a dire shortage of people with the skills to identify the data and understand how to apply data science to it to produce results. I foresee the freelancing data scientist becoming more common in 2016. Reputable data scientists with a demonstrable background in delivering analytic products will be in hot demand, but they will differ from the typical contractor: they’ll be a “top gun for hire”, limited to short stints solving specific problems, with engagements of two to four weeks at a time.

Why do we need to be aware of this? Develop a resourcing model that matches your development cycles. Bring in external resources at the points in a project where they are most useful, and avoid keeping them on the payroll when they are not required. You’ll achieve a lot more for a lot less with a good strategic resourcing model.

  4. Of course my pet topic, Apache Spark, will keep growing within the large-scale analytics field. 2015 was definitely a breakthrough year for Spark: we saw it integrated into many architecture plans and roadmaps, and we witnessed some serious investment in the technology. 2016 will be no different, with Spark becoming mainstream in analytic projects. Until now it has been the domain of the wacky few locked away in the basement, but Spark is getting noticed at the executive level, and mini projects exploring how to leverage its speed will begin to see the light of day in 2016.

Why do we need to be aware of this? If you don’t already have a Spark project within your analytics teams, it’s high time to get on board and start exploring its usefulness. Start small and grow from there.
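
A starter project really can be small. Below is a minimal PySpark sketch of the kind of first job I mean, counting web log events by HTTP status code; the HDFS path and the assumption that the status code sits in the ninth space-separated field are illustrative, not prescriptive.

```python
# Minimal "start small" PySpark job: count web log events by status code.
# The file path and log layout are illustrative assumptions.
from pyspark import SparkContext

sc = SparkContext(appName="SparkStarterJob")

lines = sc.textFile("hdfs:///logs/access.log")

# Assumes a combined log format with the HTTP status in the 9th field
status_counts = (lines
                 .map(lambda line: (line.split(" ")[8], 1))
                 .reduceByKey(lambda a, b: a + b))

for status, count in status_counts.collect():
    print("%s: %d" % (status, count))

sc.stop()
```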

  5. Speed to insight is paramount. The days of an extensive ETL process to get data into a platform are not over; rather, ETL has its place within an overall strategy. End users will demand analytics on their data much faster in 2016, and with that comes a need to adapt your architecture to meet the demand. But you don’t need to throw out years of investment in ETL just to achieve faster analytics. What occurred yesterday must be in the hands of a decision maker the next day, or, depending on your line of business, you must be able to synthesize the data in near real time to make a decision. To do this we need a range of platforms and processes. 2016 will see the demands of business users met by delivering outputs to them in the shortest time possible. That may not mean removing the time-expensive ETL process, but we’ll look to avoid it as much as possible.

Why do we need to be aware of this? Analytics in 2016 is all about getting results to end users in the shortest time possible. From the moment the data is consumed to the moment it reaches the end user, the clock is ticking. You’ll need to explore technology, architecture and processes that support high-speed analytics.
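
One illustration of the near-real-time end of that spectrum is the micro-batch model in Spark Streaming, which complements rather than replaces a batch ETL pipeline. In the hedged sketch below, the socket feed, the “customer_id,amount” record layout and the ten-second batch window are all assumptions made for the example.

```python
# Hedged sketch: near-real-time aggregation with Spark Streaming micro-batches.
# The socket source, record layout and batch window are illustrative assumptions.
from pyspark import SparkContext
from pyspark.streaming import StreamingContext

sc = SparkContext(appName="NearRealTimeSketch")
ssc = StreamingContext(sc, batchDuration=10)  # 10-second micro-batches

# Hypothetical feed of "customer_id,amount" lines arriving on a socket
events = ssc.socketTextStream("localhost", 9999)
amounts = events.map(lambda line: float(line.split(",")[1]))

# Total spend per micro-batch, in a decision maker's hands within seconds
amounts.reduce(lambda a, b: a + b).pprint()

ssc.start()
ssc.awaitTermination()
```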

  6. Innovation to integration: I talk about this a lot. Data analytics is still seen by many within the IT department as a side hobby. It is not well understood, nor does it translate readily into identifiable business outcomes. I’ve witnessed many organisations that are very good at setting up a data analytics practice, only for it to be treated as a test laboratory for what could be possible. They fail to describe accurately how what has been developed will benefit the business, and as a result some great ideas never see the light of day. 2016 will go down as the year we take innovative concepts out of the lab and integrate them into business processes. R&D labs within organisations will be required to justify their existence and prove a much stronger link between what they do and the overall goals of the organisation.

Why do we need to be aware of this? It’s all about results. You can innovate all you want, but at the end of the day the group investing the money will want to see those innovations move into mainstream processes and deliver results.

So that’s my view of where the industry is heading in 2016. We are maturing at a faster rate every year, and 2016 will simply be another year in which we go up another gear and pack more into twelve months than we thought possible. I hope everyone has a safe and merry Christmas, and that in 2016 the world becomes a safer place.

Ben Davis is a Senior Architect for Teradata Australia, based in Canberra. With 18 years of experience in consulting, sales and technical data management roles, he has worked with some of the largest Australian organisations on developing comprehensive data management strategies. He holds a degree in Law and a postgraduate Masters in Business and Technology, and is currently finishing his PhD in Information Technology with a thesis on executing large-scale algorithms within cloud environments.
