
Big Data: not unprecedented; but not bullshit, either

Posted on: July 23rd, 2014 by Martin Willcox


Larry Ellison, Oracle’s flamboyant CEO, once remarked that “the computer industry is the only industry that is more fashion-driven than women's apparel”.  The industry’s current favourite buzzword - “Big Data” - is so hyped that it has crossed over from the technology lexicon and entered the public consciousness via mainstream media.  In the process, it has variously been described as both “unprecedented” and “bullshit”.

So is this all just marketing hype, intended to help vendors ship more product?  Or is there something interesting going on here?

To understand why the current Big Data phenomenon is not unprecedented, consider that Retailers, to take just one example, have lived through not one but two step-changes in the amount of information their operations produce in less than three decades: first EPoS systems and later RFID technology transformed their ability to analyse, understand and manage their operations.

As a simple example, Teradata shipped the world’s first commercial Massively Parallel Processing (MPP) system with a Terabyte of storage to Kmart in 1986.  By the standards of the day this was an enormous system (it filled an entire truck when shipped) that enabled Kmart to capture sales data at the store / SKU / day level - and to revolutionise the Retail industry in the process.  Today the laptop that I am writing this blog on has a Terabyte of storage – and store / SKU / transaction level data is table-stakes for a modern Retailer trying to compete with Walmart’s demand-driven supply chain and Amazon’s sophisticated customer behavioural segmentation.  Similar analogies can be drawn for the impact of billing systems and modern network switches in telecommunications, branch automation and online banking systems in retail finance, and so on.

The reality is that we have been living with exponential growth in data volumes since the invention of the modern digital computer, as the inexorable progress of Moore’s law has enabled more and more business processes to be digitised.  And anxiety about how to cope with perceived “information overload” predates even the invention of the modern digital computer.  The eight years that it took hard-pressed human calculators to process the data collected for the 1880 U.S. census was the motivation for Herman Hollerith’s invention of the punched “Hollerith card”; Hollerith founded the Tabulating Machine Company, which later became International Business Machines (IBM).

Equally I would argue that it is a mistake to dismiss Big Data as “bullshit”, because significant forces are currently re-shaping the way organisations think about Information and Analytics. These forces were unleashed, beginning in the late 1990s, by three disruptive technological innovations that have produced seismic shocks in business and society; three new waves of Big Data have been the result.

The first of these shocks was the rise (and rise, and rise) of the World Wide Web, which enabled Internet champions like Amazon, eBay and Google to emerge.  These Internet champions soon began to dominate their respective marketplaces by leveraging low-level “clickstream” data to enable “mass customisation” of their websites, based on sophisticated Analytics that enabled them to understand user preferences and behaviour.  If you were worried that my use of “seismic shock” in the previous paragraph smacked of hyperbole, know that some commentators are already predicting that Amazon – a company that did not exist prior to 1995 – may soon be the largest retailer in the world.

Social Media technologies – amplified and accelerated by the impact of increasingly sophisticated and increasingly ubiquitous mobile technologies – represent the second of these great disruptive forces.  The data they generate is increasingly enabling organisations to understand not just what we do, but where we do it, how we think, and who we share our thoughts with.  LinkedIn’s “People You May Know” feature is a classic example of this second wave of Big Data, but even understanding indirect customer interactions can be a huge source of value to B2C organisations – witness the “collaborative filtering” graph Analytics techniques that power the increasingly sophisticated recommendation engines underpinning much of the success of the next-generation Internet champions, like Netflix.
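To make the collaborative filtering idea concrete, here is a minimal, illustrative sketch in Python – item-based similarity over a tiny, invented ratings matrix. It shows the principle only; it is emphatically not Netflix’s (or anyone else’s) production recommendation engine.

```python
# Minimal item-based collaborative filtering sketch.
# The ratings matrix is invented for illustration; rows are users,
# columns are items, and 0 means "not yet rated".
import numpy as np

ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [1, 0, 5, 4],
    [0, 1, 4, 5],
], dtype=float)

def cosine_sim(a, b):
    """Cosine similarity between two item-rating column vectors."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return a.dot(b) / denom if denom else 0.0

n_items = ratings.shape[1]
item_sim = np.array([[cosine_sim(ratings[:, i], ratings[:, j])
                      for j in range(n_items)] for i in range(n_items)])

def predict(user, item):
    """Score an unrated item as the similarity-weighted mean of the user's ratings."""
    rated = ratings[user] > 0
    weights = item_sim[item, rated]
    return weights.dot(ratings[user, rated]) / weights.sum()

# Recommend item 2 to user 0, who has not rated it yet.
print(round(predict(0, 2), 2))
```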

The “Internet of Things” – networks of interconnected smart devices that are able to communicate with one another and the world around them – is the third disruptive technology-led force to emerge in only the last two decades.  Its ramifications are only now beginning to become apparent.  A consequence of the corollary of Moore’s Law – simple computing devices are now incredibly inexpensive and fast becoming more so – the Internet of Things is leading to the instrumentation of more and more everyday objects and processes. The old saw that “what gets measured gets managed” is increasingly redundant as we enter an era in which rugged, smart, - and above all, cheap – sensors will effectively make it possible to measure anything and everything.

We can crudely characterize the three “new waves” of Big Data that have accompanied these seismic shocks as enabling us to understand, respectively: how people interact with things; how people interact with people; and how complex systems of things interact with one another.  Collectively, the three new waves make it possible for Analytics to evolve from the study of transactions to the study of interactions and observations; where once we collected and integrated data that described transactions and events and then inferred behaviour indirectly, we can increasingly measure and analyse the behaviour – of systems as well as of people – directly. In an era of hyper-competition – itself a product of both globalisation and digitisation – effectively analysing these new sources of data and then taking action on the resulting insight to change the way we do business can provide organisations with an important competitive advantage, as the current enthusiasm for Data Science also testifies.

Contrary to some of the more breathless industry hype, much of what we have learnt about Information Management and Analytics during the last three decades is still relevant - but effectively exploiting the three “new waves” of Big Data also requires that we master some new challenges.  And these are the subject of part 2 of this blog, coming soon.


The buzzword “Big Data” is no longer limited to the IT sector. Big Data is a business topic, and in Marketing especially, decision makers are working on data-driven strategies to stay ahead of the competition. However, in many cases there is more talking about Big Data than working with it. Martin Willcox, Director of the International Big Data CoE at Teradata, thinks that it is time to stop debating what Big Data means and start facing the real challenges and benefits it entails. In a mini-series on the Teradata International blog, Martin will share his vision of Big Data and give us an idea of Teradata’s strategy. Stay tuned for his posts…


Among all the customer satisfaction measures, Net Promoter Score (NPS) stands out as the Real McCoy. We can see why: NPS is considered to contribute directly to the bottom line of the corporate financial books.  That is the theory, anyway! As we know, in theory, theory is the same as practice, but in practice it is not! NPS suffers from some of these theory-practice gaps. Let’s see why, and find out how discovery analytics can enhance customer satisfaction and in many cases even help predict NPS!

The secret sauce of the NPS metric is the simplicity with which it extracts an individual customer’s opinion about a product or service directly after an interaction, by asking one simple survey question: ‘How likely are you to recommend…?’ On the 0-to-10 scale, scores of 0 to 6 count as Detractors, 7 to 8 as Neutral (Passive), and 9 to 10 as Promoters. The NPS score is computed as % Promoters minus % Detractors. The process for a bottom-up NPS survey would look something like the chart shown below.
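To make the arithmetic concrete, here is a tiny, illustrative calculation in Python over invented survey responses:

```python
# NPS from 0-10 survey scores: % Promoters (9-10) minus % Detractors (0-6).
# The responses below are invented for illustration.
def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

responses = [10, 9, 8, 7, 6, 10, 3, 9, 5, 8]
print(f"NPS: {nps(responses):+.0f}")  # 4 promoters, 3 detractors -> +10
```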

While simple, NPS is not without its shortcomings. As noted in the above chart, NPS is measured at a point in time, sampled randomly after a customer contact event. Because it is not integral to an operating model, NPS does not drive action in itself. Time also elapses between the contact event and the survey, which can mean missed opportunities at touch points to rectify a poor perception. Moreover, a customer’s stated intention often does not translate into future behaviour: in a recent project, the fragments of the path analysis below showed customers becoming Detractors in subsequent surveys after having been Promoters or Passives (Neutral).

The NPS score itself can be problematic, because the same score can be produced by very different mixes of customers that do not mean the same thing from a customer-loyalty point of view; for example, 50% Promoters with 50% Detractors and 0% Promoters with 0% Detractors both yield an NPS of zero. NPS is also often measured at company level, whereas many purchases and interactions happen at product level. Moreover, NPS surveys measure events after the customer interacts at touch points; they do not measure the customer’s experience of actually using the service – such as network quality in the telecom industry – unless the customer complains about poor service.

Rather than waiting for an NPS survey, customers’ sentiment about new product launches or issues with a current product or service can be analysed proactively from social media, well ahead of a customer’s intention to contact the Call Centre, as illustrated in the chart below. Discovery analytics on Big Data sources also enables new insights to be gained about competitors’ products and services, as well as your own, as they are perceived in the market. Some Teradata customers (i.e. service providers) in the European and Asia Pacific regions have been able to correlate customers’ experience of product and service usage with NPS, and are thus able to predict customers’ intentions and perceptions well ahead of customer contact at touch points or subsequent NPS surveys. Many service providers leverage customer contact notes and logs to perform text analytics, and use the insights gained to extend the lifetime of their customers.
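As an illustrative sketch of that predictive idea – and only a sketch, with invented experience metrics and scores standing in for the much richer features real deployments use – one could fit a simple model on surveyed customers and score the rest:

```python
# Toy sketch: relate usage-experience metrics to observed 0-10 survey scores,
# then score customers who have not yet been surveyed. All data is invented.
import numpy as np
from sklearn.linear_model import LinearRegression

# Columns: dropped-call rate (%), average download Mbps, complaints in last 90 days.
X_surveyed = np.array([[0.5, 20, 0], [2.0, 5, 3], [0.8, 15, 1],
                       [3.5, 2, 5], [0.2, 25, 0]])
y_scores = np.array([9, 3, 7, 1, 10])

model = LinearRegression().fit(X_surveyed, y_scores)

# Predict the likely score for an un-surveyed customer, well before the next survey wave.
print(model.predict(np.array([[1.5, 8, 2]])))
```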

Here is an example of sentiment analysis for the telecom industry that I was able to perform easily using the Teradata Aster discovery platform on publicly available social media data (http://forums.whirlpool.net.au/). This kind of analysis allows product and service quality to be compared against competitive offerings to gauge market perception throughout the product cycle. [Note: specific mention of competitors has been excluded from the chart below to avoid bias, but it has been interesting to observe many forum members’ sentiment about one service provider’s 4G/LTE network quality and quality of customer service compared to the others. Some even mention specific locations where one service provider’s quality is better.]
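For a flavour of what such sentiment scoring involves, here is a deliberately crude, lexicon-based sketch in Python, with toy word lists and invented posts; the Aster platform’s actual text-analytic functions are far more sophisticated:

```python
# Naive lexicon-based sentiment scoring of forum posts. Word lists and
# posts are invented; real platforms use much richer text analytics.
POSITIVE = {"great", "fast", "reliable", "excellent"}
NEGATIVE = {"slow", "dropout", "terrible", "outage"}

def sentiment(post):
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

posts = ["4G here is great and fast",
         "constant dropout and slow speeds, terrible coverage"]
for p in posts:
    print(sentiment(p), "|", p)
```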

Similarly, analysis of Call Centre logs can provide a wealth of information about individual customers’ opinions on products and customer service. See below for examples of words captured in call centre dialogue that signify both negative and positive sentiment from individual customers. One service provider in North America, a tier-1 Teradata customer, has analysed call centre logs containing words and phrases such as ‘save; contract termination; …’ to predict customer churn, and has taken preventive action to retain a large number of high-value customers.
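A minimal sketch of that kind of phrase-spotting over contact notes might look like the following; the phrase list and notes are invented for illustration, and the provider’s actual text-mining models are, of course, not public:

```python
# Flag churn risk by spotting indicative phrases in call-centre notes.
# Phrases and notes are invented for illustration.
CHURN_SIGNALS = ["contract termination", "cancel", "switch provider", "save"]

def churn_signals(note):
    note = note.lower()
    return [p for p in CHURN_SIGNALS if p in note]

notes = {
    "C1001": "Customer asked about contract termination fees and competitor offers",
    "C1002": "Billing query resolved; customer happy with the service",
}
for customer, note in notes.items():
    hits = churn_signals(note)
    if hits:
        print(customer, "flagged for retention:", hits)
```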


In summary, NPS scores measured in isolation provide little value to the organisation. When NPS is integrated with other enterprise and Big Data sources, customer satisfaction and lifetime value can be enriched by holistic analysis of customers’ experience across all key stages of product and service usage, as well as customer interaction at touch points. Teradata customers who have implemented the Unified Data Architecture (UDA) have been able to achieve bottom-line improvements by doing just that.

Sundara Raman is a Senior Communications Industry Consultant at Teradata. He has 30 years of experience in the telecommunications industry, spanning the fixed-line, mobile, broadband and Pay TV sectors. He specialises in Business Value Consulting, business intelligence, Big Data and Customer Experience Management solutions for communication service providers. Connect with Sundara on LinkedIn.

One-to-one Personalisation will boost conversion rates

Posted on: July 11th, 2014 by Ruth Gordon


Blog Series by Teradata and Celebrus Technologies

Part 4 – One-to-one personalisation

In the world of big data and the quest to achieve a Single Customer View (SCV), the quality of data available to digital marketers is a concern. As the Digital Marketing Insights Report 2014 highlights, a lack of accurate data could have major implications for the analytics and personalisation efforts with which digital marketers are struggling to get to grips.

The digital marketer of 2014 is required to understand, identify and adapt to opportunities in a rapidly shifting world which generates data on an unprecedented scale. However, with the level of sophisticated analytical technology available today, digital marketers are better placed than ever before to deliver individual personalisation. There is no doubt that having the correct analytical tools in place, in combination with the right data, will enable digital marketers to develop a greater understanding of big data and provide a stable platform from which to create personalised customer campaigns.

Email continues to be the most prevalent method of customer personalisation with 92% of the survey respondents claiming to offer a level of personalisation in their email communications. This is no surprise as email marketing tools have been around for decades; but what is surprising is that a third of respondents (33%) reported that they undertake no website personalisation efforts - leaving a large gap of unfulfilled opportunity to communicate with customers in a more personalised fashion.

This is not because digital marketers don’t recognise the importance of personalisation – 51% responded that personalisation is currently either very important or critical to their digital marketing efforts – so what is the problem?

The answer: access to real-time data to drive that personalisation. The report showed that, at present, 50% of marketers do not have access to real-time data. Encouragingly, a full 78% reported that by 2016 they will have implemented a solution that utilises real-time data in one of three forms: website-specific personalisation (21%), individual-level interaction data (26%) and multichannel real-time decision technology (31%).

Understanding how the modern consumer interacts with a brand across both offline and online platforms is evolving as brands recognise the need to behave in a truly integrated, omni-channel way. Digital marketers are taking significant strides towards developing their understanding of how customers interact with their brands in real-time across multiple online and offline channels and devices. Investment in the right data and analytical tools to extract actionable insight from detailed customer data will ensure that digital marketers of the future are well on their way to achieving not only a SCV, but also ensuring that they treat “always on” customers in a coherent multi-channel way.

In practical terms this omni-channel understanding and personalisation empowers, for example, a call centre representative to offer a customer credit for a high-value item they have browsed online. It also enables the company to avoid customer experience mistakes - such as offering a cheap insurance quote via email when the customer has just purchased via the website. In industries with complex sales or registration processes, having real-time synergy between the online and offline environments allows brand representatives to provide customer service based on individual preferences, actions and online behaviour. This personalised customer engagement can lead to increased trust in a brand, an improved experience and/or increased sales conversions.

When asked to look forward to 2016, an impressive 80% believe that personalisation will be key to their digital marketing success. To achieve those goals, investment is planned in both the analytical expertise and the technologies required to transform customer data into actionable insight that improves customer engagement, enhances the customer experience, increases business process efficiencies and Marketing ROI.

The omni-channel consumers of 2014 expect to have highly relevant and personalised interactions with a brand and the results from the Digital Marketing Insights Report 2014 demonstrate that digital marketers have the data available and are now getting to grips with the tools to deliver true one-to-one personalised marketing campaigns.


Who Needs Data Scientists? Get Business Value Now

Posted on: July 10th, 2014 by Gareth Clayton


I would like to make a proposition: Businesses can unlock the value in their big data today without the need for data scientists.

A great deal has been made of the cost of data scientists and how hard it is to find the skills needed to leverage big data. I don’t think this should hold you back. I am not suggesting there is no place for data scientists or highly specialised skills; I am merely stating that a lot can be achieved before having to bring them in.

There is value locked up in big data

I don’t think anyone disagrees that there is value locked up within our organisations’ data, as well as in the data that is publicly available across the Internet. But how best to unlock this value?

Most organisations have people who can unlock the value

The best people to unlock the value are people who know your business. Organisations already have an array of analysts who know their business well: business analysts, finance analysts, process analysts, engineers (depending on your industry) and IT analysts. I would like to propose that you are better off up-skilling your existing people before investing in expensive data scientist resources.

Business analysts tend to focus on the business context and a business outcome.  Many analysts have already studied many of the data science concepts in their undergraduate degrees, e.g. statistics, economics, engineering. They already know the fundamentals. Once you give them the tools, they are more than capable of augmenting their existing knowledge to get value out of the data.

As a consultant who advises clients on analytic best practices, I have come across many in-house analysts who have risen to the occasion. I recently worked with a process engineer from a manufacturing background. After five days of formal training on a popular data-mining tool he was uncovering insight that had been missed by professional analytic consultants. A few years on he is now a professional data scientist. But that is another topic altogether.

Obviously, having a data scientist or analytic consultant available to advise and support them would be even better. But this shouldn’t stop you from starting with the people you already have on your team.

Modern tools enable you to focus on the business outcome rather than the math.

Modern toolsets have significantly improved the ability of business analysts to get value from big data, let's take two examples:

Teradata Aster – Designed for loosely structured as well as structured data, it has a simple SQL interface (a skill most analysts already have). This focuses the analysis on defining the outcome in SQL rather than on complex code. You can take weblogs, identify user interaction sessions within the logs and then run an nPath query across them in a single SQL statement, as sketched below. Within minutes you can find the most common paths to shopping cart abandonment, or what customers are doing prior to calling your call centre.
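Aster expresses this in SQL; purely as an illustration of what a sessionize-plus-nPath pipeline does, here is a rough Python (pandas) equivalent over invented weblog data, assuming a 30-minute inactivity rule for session breaks – this is not Aster’s actual nPath syntax:

```python
# Rough pandas equivalent of sessionizing weblogs and finding common paths
# that end in cart abandonment. Clickstream data and the 30-minute session
# rule are invented for illustration; Aster does this in one SQL statement.
import pandas as pd

clicks = pd.DataFrame({
    "user": ["u1", "u1", "u1", "u2", "u2", "u2"],
    "ts": pd.to_datetime(["2014-07-01 10:00", "2014-07-01 10:05",
                          "2014-07-01 10:06", "2014-07-01 11:00",
                          "2014-07-01 11:02", "2014-07-01 11:03"]),
    "page": ["home", "product", "cart", "home", "search", "cart"],
}).sort_values(["user", "ts"])

# Start a new session whenever more than 30 minutes pass between a user's clicks.
new_session = clicks.groupby("user")["ts"].diff() > pd.Timedelta(minutes=30)
clicks["session"] = new_session.astype(int).groupby(clicks["user"]).cumsum()

# Concatenate each session's pages into a path and keep those ending at the
# cart with no purchase step, i.e. likely abandonment.
paths = clicks.groupby(["user", "session"])["page"].agg(" > ".join)
abandoned = paths[paths.str.endswith("cart")]
print(abandoned.value_counts())
```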

SAS Rapid Predictive Miner – An Excel GUI with a simple wizard interface that unleashes the power of SAS in the background. A marketing analyst can create a targeted mailing list in minutes. The beauty of this is that the model generated in the background can be opened up by a data scientist and audited or refined using SAS’s more advanced Enterprise Miner tool.

There is still a place for data scientists but it should not limit you.

Now, having just upset all the data scientists I know: yes, there is definitely a place for data scientists. Data scientists can squeeze more value out of each model and can apply many different approaches that may see results when simpler methods do not work.

Get on with it.

I am not saying you don’t need data scientists – what I am saying is that you can get started with what you have and, more importantly, start driving value immediately. Focus on faster time to market and relevance to your business. If you do manage to hire a data scientist, it is only going to add to the value you are already generating.

Gareth Clayton is a Senior Industry Consultant at Teradata with over 16 years’ experience in business analytics and information management. He has a diverse background in many industries, primarily Telecommunications and Banking. Gareth is also passionate about educating the next generation and has been a guest lecturer at La Trobe and Victoria Universities on the practical application of predictive analytic theory in a business context. Follow Gareth via twitter @AnalyticsROI or connect via LinkedIn.


Blog Series by Teradata and Celebrus Technologies

Part 3 – Personalised Digital Marketing

Capturing the innumerable actions made online every minute of every day is a big challenge. Buyer expectations are on the rise and the modern consumer expects interactions with brands that are personal to them. Bombarding consumers with impersonal content just doesn’t work anymore. Put simply, consumers find this approach disengaging. A result of this is the growing requirement for ‘real-time marketing’, which gives marketers the power to shape one-to-one personalised messages during live website visits or call centre interactions.

The Digital Marketing Insights Report 2014 shows that many digital marketers understand the importance of real-time one-to-one personalisation and are committed to exploiting accurate customer analytics to help them achieve it.

Digital marketers with access to insightful customer analytics are reaping the rewards, with 71% reporting better customer targeting and 59% citing increased conversion rates. Unsurprisingly, web analytics is the most common form of analytics undertaken, with almost three-quarters (72%) reporting they actively use it to support their digital marketing efforts. While web analytics still has a critical place in the digital marketer’s armoury, when it comes to achieving the goal of a Single Customer View (SCV) it appears that many marketers have taken to using web analytics tools and data to try to obtain the level of detailed data analysis needed.

Delving further into the Digital Marketing Insights Report 2014, one reason for marketers’ reliance on the web analytics tools becomes clearer: over 70% of respondents said they collect web analytics or aggregated web data, making it the primary method of data collection across the survey - whereas only 37% are collecting the individual level data needed for SCV development. Perhaps this reliance on tools and data designed for another purpose - i.e. to understand the web channel not individual customers - is a contributing factor to why so few (21%) have so far achieved a true omni-channel SCV.

Understanding how an individual customer interacts across multiple touchpoints is becoming more and more critical to digital marketers as they increasingly develop more sophisticated methods of engaging with customers. These customer interactions represent vast quantities of data that are highly diverse and dynamic, which makes any analysis a challenge. This new world of data-driven marketing has different requirements and needs a new approach - the ability to perform advanced analytics, such as pattern analysis, to enable the discovery of new insight.

The power to interpret these signals and take an appropriate action - increasingly in real time, across multiple devices, channels or locations to deliver a true one-to-one experience - is the key to success. This capability is transforming CRM from where it has been languishing for many years, to eCRM activities like personalisation of websites and emails based on customer behaviour. These activities are just the beginning of an engagement led customer contact strategy drawing upon product affinity, path analysis for customer journey optimisation and process re-engineering, fraud analytics and behavioural based pricing – all of which combine to provide an informed level of customer analytics only recently available to digital marketers.

Spend attribution is also a very hot topic. The ability to look at customer journeys and understand the importance of every interaction leading up to and including the final purchase highlights the inadequacies of first- or last-click analysis. Some brands are achieving up to 30% budget savings by optimising paid search bidding and affiliate payments.
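As a toy illustration of why first- or last-click analysis falls short, compare last-click credit with a simple linear multi-touch split over an invented four-touch journey (real attribution models weight position, time decay and channel interplay far more carefully):

```python
# Toy attribution sketch: contrast last-click with linear multi-touch credit.
# The journey and sale value are invented; channels here happen to be unique.
journey = ["paid_search", "email", "affiliate", "direct"]  # ends in a purchase
sale_value = 100.0

last_click = {journey[-1]: sale_value}                      # all credit to the final touch
linear = {ch: sale_value / len(journey) for ch in journey}  # equal share to every touch

print("last-click:", last_click)
print("linear:    ", linear)
```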

Through integrating online with offline data, digital marketing activity is more likely to be a success. Digital marketers are missing a trick by not coupling great online behavioural data with equally-great purchase history and other types of offline data. The integrity of a brand’s SCV and personalised approach is bolstered by the depth of the data it contains and is crucial to driving a successful personalised communications campaign.

Hadoop is like your Mobile Phone

Posted on: July 7th, 2014 by Ben Bor


Following my blog series on "Big Data is Not Hadoop", it seems reasonable to discuss WHY people think that Hadoop is Big Data.

I will use the humble Mobile Phone to illustrate.

Mobile phones had a humble beginning; to (mis)quote John 1:1: “In the beginning was the Word, and the Word was ‘Hello…’”.  In other words, mobile phones, at the beginning, had exactly two advantages:

  • You could use them to talk to other people.
  • They were mobile: lightweight, requiring no accessories, you could carry one in your pocket.

Since they used an LCD display, power consumption was minimal, so you didn’t need to carry a charger with you.

And then we started piling on new features.  A big hit was the built-in camera.

Manufacturers persuaded consumers that the number of pixels is the important factor, and that therefore the camera in your mobile phone is good enough – you don’t really need a separate camera.

This is absolutely wrong!

Far more important than the number of pixels is the quality of the lens system.  Look at the following picture and try to imagine taking it with a mobile phone.

To cut a short story shorter, by trying to get your mobile phone to function over and above its ‘comfort zone’ you get a lot of benefits, but three BIG side effects:

  • Your mobile phone is less mobile; you need a bag and a charger.
  • Your mobile phone is no longer a phone; talking takes probably 10% of the usage time.
  • Your pictures are of low quality (good enough for a selfie, but not good enough for a decent-quality picture).

What does this have to do with Hadoop?

Hadoop is great.  It is a fantastic (and very cost-effective) solution to a large set of business problems. 

A simplistic approach is to call the total set of these problems “Big Data”.

However, some organisations are trying to do too much with Hadoop:  this is the equivalent of taking ALL your pictures with your mobile phone.  They want to implement every system and every data warehouse on Hadoop. 

Like the mobile phone camera, this will be good enough for some.  But don’t be mistaken: Hadoop is not the answer to all your problems.  A transaction-processing system is still better on existing technology.  A Data Warehouse is still better (and, when cost-of-ownership is taken into account, also cheaper) with existing, not-only-Hadoop, technology.

And this is where architecture enters the picture.  An active Enterprise Information Architecture will pave the way for your organisation to select the best set of implementation choices for your business problems.  In the modern era, this is likely to include TP systems, data warehouses, discovery platforms and a Data Lake.

Ben Bor is a Senior Solutions Architect at Teradata ANZ, specialising in maximising the value of enterprise data. He gained international experience on projects in Europe, America, Asia and Australia. Ben has over 30 years’ experience in the IT industry. Prior to joining Teradata, Ben worked for international consultancies for about 15 years and for international banks before that. Connect with Ben Bor via LinkedIn.


The challenges of Customer Data Collection and Storage

Posted on: July 2nd, 2014 by Ruth Gordon


Blog Series by Teradata and Celebrus Technologies

Part 2 – Data Collection and Storage

Even with the global economy in the early stages of recovery, marketing budgets are under tight control and competition remains fierce across many marketplaces. The emphasis is now on brands to know both their marketplace and customers much better than their competitors, but with the rapid change in consumer behaviour it is tough to gain that essential customer knowledge. The birth of the omni-channel customer has seen the generation of an extraordinary amount of behavioural data and marketers are viewing this data as an opportunity to gain in-depth customer insight - but it is a challenge.

Research from the Digital Marketing Insights Report 2014 has shown that there is an overwhelming acceptance amongst digital marketers that harnessing a Single Customer View (SCV) is vital in order to remain competitive, with 70% citing better customer insight and 60% improved marketing personalisation. However, only 21% of those surveyed told researchers they feel they have achieved the SCV goal.

This shows a contrast between the recognition that a huge amount of actionable, potentially profitable data exists, and the ability to leverage that data by implementing new technologies and enhancing data and analytics skills.

Digital marketers understand the need to create the SCV. However, precious few have yet achieved a true SCV, let alone used this omni-channel insight to improve their customer engagement and enhance customer experiences. Encouragingly, 57% of respondents anticipate that they will have harnessed a SCV by 2016. So what’s stopping them from getting it now?

Like any marketing shift with a technological implementation requirement, there can be challenges. Part of the research in the Digital Marketing Insights Report 2014 examined the barriers blocking the way for the SCV. The three most common barriers were: ‘too time consuming’ (51%), ‘structural issues’ (46%) and ‘technology challenges’ (44%). Yet a majority (70%) believed that a SCV would generate improved customer insight.

An important insight from the report is that 36% of respondents cited storage and integration of customer data into a single customer database as a challenge for their marketing teams, preventing their digital marketing efforts from capitalising on their customer data. Encouragingly, nearly half of respondents (49%) expect to have deployed single centralised databases by 2016.

As the situation stands today, over a third of respondents (38%) have actually managed to create a single, centralised database that is helping to deliver a SCV. However, almost half of respondents recognised the need to move customer data into appropriate storage systems over the next two years - such as integrated data warehouses (49%) and/or Big Data technologies such as Hadoop (27%) - rather than keeping it in separate data marts or with third parties. Organisations must be able to bring their multiple sources and types of customer data – transactional, loyalty, Web behaviour, social and mobile, to name just a few – together in order to create that critical omni-channel SCV.

What does this mean for digital marketers wanting to improve their one-to-one personalised communications? The research shows that, as it stands, a small number of digital marketers are likely to be currently benefitting from having a SCV – these are the ones that have collected robust individual level data from across their online and offline channels and store this in a single, centralised database (or other appropriate storage) to make the analysis of this data swift and accurate.

Whilst the advice would be for the remaining marketers to act fast and implement these systems quickly, some effort is required to overcome barriers such as data quality, database storage and technical in-house problems.

To the credit of digital marketers surveyed, when analysing the responses regarding expectations for 2016, the landscape is positive with 57% expecting to have a SCV by 2016.

This ambition demonstrates that whilst only a few digital marketers have this technology at their disposal currently, the chasing pack includes a large number of digital marketers working very hard to integrate these systems into their organisations. Yes it can be a time consuming process fraught with challenges, but the vast majority of digital marketers recognise it as an extremely worthwhile undertaking.

World Cup 2014: the analysts press ahead

Posted on: July 1st, 2014 by Hermann Wimmer


“How shall we play the game? As though we are making love or catching a bus?” the French journalist Jean Eskenazi once asked. For a football coach, this truly is an essential question. What are the mechanisms of the game? What makes the difference between winning and losing? People tend to underestimate the complexity of football. In fact, there are few sports that leave so many options for action. And no one will ever fully understand the game in all its dimensions. Watching the World Cup, however, I find one key dimension to be a source of growing interest among fans. I have never heard or read so much about football tactics as during the current spectacle in Brazil.

Have you ever wondered about the subtle differences among formations such as 4-3-3, 3-5-2, 4-4-2 and 4-2-3-1, the variations within those formations, and why those differences matter? Take a look at the British website zonalmarking or the German equivalent spielverlagerung and you will find bloggers talking about the analytical geometry of the game, using technocratic language and drawing arrows on tactical boards. Their in-depth analyses of football games not only impress a rising number of readers but also influence mainstream media. Have you noticed that more and more TV commentators now ponder tactical questions every so often?

Still, of course, hardly anyone doubts that football lives by emotion. Then why on earth are some people now discussing football as if they were watching chess? Over the past decades, football has developed into a science. First, the data revolution changed professional sports, as Mike Forde, then Chelsea’s Director of Football Operations, explained at the Teradata Universe conference last year. Football managers use data analytics to improve their team’s performance, dissecting every single route, every pass and every corner. Secondly, there have been some great advances in the evolution of football tactics, particularly since the 1980s, as Jonathan Wilson described in his book “Inverting the Pyramid”. And so today you have a generation of football coaches obsessed with details – committed tacticians such as Pep Guardiola or Jürgen Klopp, who manage the German Bundesliga’s most successful teams at present. And this is probably the main reason why more fans want to dig deeper into the analysis. Those who have watched the games live or on TV want to read more than summaries. So does the intellectualisation of the sport mean that it is not about emotion anymore? Of course not, as Guardiola and especially Klopp prove by showing their passion for the game at the touchline.

“Football is not mathematics,” Bayern Munich’s Karl-Heinz Rummenigge once famously stated. Well, calculation is certainly not the only aspect, although it is an increasingly important one. Just recently I stumbled across an analysis of Germany’s “national team of the century”. The journalists put together Germany’s best players since the 1966 World Cup in a 4-4-2 system, relying solely on the data of their performances. The results partly surprised me. For example, the team did not include Lothar Matthäus, because he never assisted a goal in the World Cups from 1982 to 1994, which ruined his overall index. This article reminds us that it takes much more than numbers to analyse a player or a game. However, I do welcome the trend towards more objective information. Maybe we will then not hear as many phrases like “the players had the wrong attitude” – assumptions that cannot be verified.

There is no doubt that football is and always will be about emotion. But when journalists write about a game, they should consider focusing more on tactics, and especially on using the available data. For example, only a few journalists analysed the decisive moment in the 2012 Champions League Final that led to the equaliser and, ultimately, Bayern Munich’s defeat. This is a pity, since we are capable of breaking down literally every second of a sports event as the data becomes more and more comprehensive. For the remainder of this year’s World Cup, keep an eye on the use of data and questions of tactics in mainstream coverage. Go to one of the blogs mentioned above if you want to know more than the mass media will tell you. Or, even better, conduct your own analyses: many websites (including FIFA’s) provide several interesting stats for every match. Needless to say, this should not prevent you from cheering on your national team.


Blog Series by Teradata and Celebrus Technologies

Part 1 – Introduction

As data-driven marketing becomes an ever more critical discipline within a well-conceived marketing plan, the Digital Marketing Insights Report 2014 from Teradata and Celebrus Technologies examines trends in customer data collection and storage, analytics, and personalisation. The report investigates what the future of digital marketing holds for an industry attempting to meet the demands of the ‘always on – always connected’ digital consumer. It provides detailed insight into what it takes to manage the transition from digital marketing to ‘digital data-driven marketing’, in order to meet today’s consumer’s expectation of real-time engagement and the personalisation demands of the future.

Analysing the key challenges with data, big or otherwise, the research endeavours to explore:

  • The different data collection and storage options
  • The goal of achieving a Single Customer View (SCV)
  • The importance of individual level data
  • The informed customer analytics that can be extracted from data
  • One-to-one personalisation techniques
  • Digital marketers’ priorities over the next two years.

Harnessing the power of omni-channel data-driven marketing to achieve a SCV is the challenge currently facing many digital marketers, with 31% having an eye firmly on implementing a real-time personalisation capability within the next two years. However, some digital marketers are finding the process difficult and struggling to locate the right tools to realise this objective. Currently, 51% of respondents consider personalisation critical or very important to business success, with a further 32% rating it as important. However, as it stands, only 21% of respondents indicated that they are currently able to utilise their data to form a SCV – an essential building block of effective personalisation.

The research uncovers that an overwhelming number of digital marketers are stalling on implementing customer data infrastructure and analytical resources and focusing instead on social media and mobile; but why is this the case?

Digital marketers are facing a huge challenge, and they are discovering that their current customer data and marketing toolkits are simply not up to the job. Data storage (36%) and data quality (23%) are the most cited problems for those surveyed, with only 10% citing the actual data collection as a pain point. Given the concrete benefits reported by those with SCV-level analytics and personalisation tools (70% of respondents agreed that a SCV provides better customer insight), those allocating resources to support data-driven marketing techniques are more likely to unlock the potential held within their data and transform their digital marketing strategies. Encouragingly, investment is anticipated in both the analytical expertise and the technologies required in order to turn insight into action that will improve loyalty, conversion and marketing ROI.

Our series of upcoming blogs will cover all aspects of the research; exploring the intricacies of the findings and the differing angles of data sources, customer analytics and personalisation. The next blog will consider the challenges in realising the ultimate goal of achieving the SCV and the importance of having the right data to do so. Following on from this, we will explore the use of informed consumer analytics and the integration of online and offline data to achieve digital marketing success.

Personalisation is a key theme across the Digital Marketing Insights Report 2014 research and the final blog will demonstrate the role and importance of personalisation in enabling the digital marketer to create timely and respectful personalised communication.