
Qualcomm: Using the Power of the Cloud to Enable Mobile Ecosystems Worldwide

January 10, 2018

How indispensable is your smartphone in your everyday life? Be honest.  Could you go more than a day or two without it?  How about an hour or two?  Whom do you thank for that? Probably a name you know, Qualcomm. Qualcomm connected the smartphone to the internet and that changed everything.

It’s impossible to imagine a smartphone without the features Qualcomm made possible. Every time you snap, shop, navigate, stream, download, store something or even just talk, you’ve got the power of Qualcomm technology to thank. Examples include backlit selfies that showcase image stabilization technology, finding the hottest restaurant on a crowded street even if you have no sense of direction, gorgeous graphics, lightning-fast video streaming, and immersive 3D experiences.
Qualcomm works closely with the world’s leading network operators – such as China Mobile, Vodafone, Telefónica, AT&T, and Verizon – to help connect new industries, services, and experiences that are changing every day. They also work with the manufacturers who design and sell smartphones with Qualcomm chips to better understand how those chips can be designed more efficiently. This, in turn, allows the service providers to add more phones with newer features! A win-win-win for Qualcomm, the network operators, and the smartphone manufacturers. We could add a fourth ‘win’ here: we, the consumers.

Qualcomm partners with the network operators to refine those networks, with a renewed culture that relies on:

  1. Sharing and allowing full access to analytics and data;
  2. Sophisticated analytics; and
  3. An innovative ecosystem that relies on data NOT moving.

“I have data scientists in our group, and a lot of people with machine learning expertise and everything else, but the biggest impact we’ve had is just sharing information.  Changing from a need-to-know company where people would share data because, ‘I’ll give it to you if you need to know it’, to almost like a need-not-to-know, like, ‘Why can’t I share it?’  And so that was really the driver…to share data across the company. Then the more advanced insights you can get from the machine learning, the data scientists, and everything.  But number one, get people access to as much as possible.” Craig Brown, Senior Director of Technology

 The key to number two above is actually number three.  Okay, we’ll clarify. For Qualcomm, sophisticated analytics depend on the simplicity of the ecosystem.  Eliminating movement and leaving data in its place ensures efficiency.

“The Teradata system used by Qualcomm is currently on AWS. It gave us the benefit of instant access to it. We can just turn it on, right? It gives us the thing where, in our development systems, we can use it in various environments, in various places. It’s not just in our Las Vegas data center…Building a great end-to-end ecosystem that best meets our needs, best meets our existing momentum, or best suits our existing momentum as well as giving us the best results in the future. And it, again, comes down to simplicity. Don’t move data around. And a much bigger thing is that Teradata is more than a Teradata box. Teradata is huge.” – Craig Brown, Senior Director of Technology

But we have buried the lede. The real goal is innovation in the chips and networks that our smartphones rely on, which Qualcomm achieves when it adds all of this together.

Partnering with network operators and manufacturers around the globe, Qualcomm strives for innovation. Together, they test networks in existing and emerging markets to understand the “total network profile.” Software running on the phones reports how they are operating. With geo-location knowledge, Qualcomm and the network operator collect data off cell phone towers to analyze multi-structured data about handoffs from cell tower to tower. For example, with 100M phones/subscribers making 2B phone calls in a month, you can expect to collect 30–40 different types of data, including cell phone year, manufacturer, format, and service, to reveal (see the sketch after this list):

  • Where does the network misbehave?
  • Is there an area of the network running much better than expected?
  • Where is the switching station?
  • Do certain phones and technologies perform better with certain cell towers and technologies?
  • How old is the cell tower?  What parts are being used in the cell tower?
  • Where are the anomalies?
  • Can we exploit those cases?
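To make that concrete, here is a minimal sketch of the kind of per-tower handoff analysis described above. The schema and thresholds are invented for illustration, not Qualcomm’s actual pipeline:

```python
import pandas as pd

# Hypothetical handoff records: one row per attempted tower-to-tower handoff.
handoffs = pd.DataFrame({
    "tower_id":  ["T1", "T1", "T2", "T2", "T2", "T3", "T3", "T3", "T3"],
    "succeeded": [1, 1, 1, 0, 0, 1, 1, 1, 1],
})

# Per-tower handoff success rate, compared against the network-wide norm.
rate = handoffs.groupby("tower_id")["succeeded"].mean()
mu, sigma = rate.mean(), rate.std()

# Where does the network misbehave? Flag towers well below the norm;
# towers well above it may be running better than expected and worth studying.
misbehaving = rate[rate < mu - sigma]
outperforming = rate[rate > mu + sigma]
print(misbehaving)
print(outperforming)
```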

Working alongside consultants from Think Big Analytics, Qualcomm data scientists found that the key lesson from this engagement was that simple things often work best. A manually created decision tree or a simple probability model would solve many cases. When more advanced techniques were needed, they tended to stick with supervised machine learning approaches such as support vector machines (SVMs) or random forests, where the model indicates the criteria it used to make a decision. In this particular engagement, deep learning was a last resort, since it is often difficult to show why a decision was made, and thus harder to get adoption. For example, Qualcomm previously had the goal of clustering cell towers into groups, which lent itself to an unsupervised approach and the use of k-means clustering.

“Now what we can do is we can say, ‘Ah, let’s go and measure what we found here, and let’s go and measure the entire network and see.’ One of the machine learning techniques is where you do unsupervised learning, where you cluster things. And so I’m not teaching the machine what to do; it’s just looking for patterns, and it says, ‘Those things are acting the same way.’ And so the thing I just found, the thing that was playing up, if I can now, across an entire country, say, ‘Where else is it playing up the same way? Go out and try and fix them the same way.’” – Craig Brown, Senior Director of Technology
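As a rough illustration of that unsupervised step, here is a minimal k-means sketch over invented per-tower features; the feature set and cluster count are our assumptions, not Qualcomm’s actual model:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-tower features: dropped-call rate, handoff failure rate,
# average throughput (Mbps), and tower age (years).
X = np.array([
    [0.01, 0.02, 45.0,  3],
    [0.09, 0.12, 12.0, 15],
    [0.02, 0.03, 40.0,  5],
    [0.08, 0.11, 14.0, 12],
    [0.01, 0.01, 50.0,  2],
])

# Scale features so no single unit dominates the distance metric.
X_scaled = StandardScaler().fit_transform(X)

# Group towers that "act the same way"; a fix that works for one tower is a
# candidate for every other tower in its cluster.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_scaled)
print(labels)  # e.g., [0 1 0 1 0]: healthy towers vs. towers "playing up"
```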

Congratulations to Qualcomm! Your equation of culture + analytics + ecosystem = business outcomes is your success!

 

Monsanto: Looking to the Cloud to Help Farmers Globally Grow Crops More Efficiently

August 1, 2017

Cloud, cloud, cloud.  If you’re in the analytics and data space, that’s probably been on your radar (or right in your face) for the past few years. Is cloud the real thing or just the CIO’s bright and shiny new object? Global agriculture giant Monsanto knows it is the real thing. With a broader strategy to move more applications to the cloud, they started with disaster recovery and will now start to move engineering and product development initiatives.

“For a lot of our research scientists, they see the Cloud as an opportunity to move data into a place where they can use different tools and different capabilities very quickly. The administration time is very short. They can spin up a new data platform and start getting right to the analytics very quickly without a lot of administration in the way. At the same time, our finance team wanted to redo and improve the way that we consolidate and close our financials on a monthly basis (see the previous video on their finance transformation). This larger capital effort was already leveraging the enterprise data warehouse for collecting and consuming all of that data and centralizing it in one place. They wanted to connect the close reporting system on top of that warehouse, but we couldn’t do it unless we were able to offer SOX compliance and disaster recovery.” – Troy Crites, Global BI Architecture Lead

With a DR data center that was running out of tile space, and a desire to exit the DR business altogether and concentrate on their core business, the elastic and scalable architecture of the cloud was the perfect option. And it’s just the start.

The ultimate goal is to take advantage of the high-powered analytics no matter where the data is – meaning anywhere and everywhere.

“Teradata Everywhere™, I think that gives us new opportunities to look at Teradata as not just an enterprise data warehouse platform, but as a vendor and a partner that can help us in a variety of different areas… the digitization of our business, and how we can enable real-time solutions…” – Troy Crites, Global BI Architecture Lead

With portable licensing, Monsanto can implement Teradata across flexible deployment options. That is critical as Monsanto moves to Teradata IntelliCloud™ on AWS in the near future, co-locating Teradata in the same cloud as product development and engineering.

One of the biggest concerns for anyone deploying in the cloud is security.  Monsanto is no different and security was paramount. Multiple teams (network, firewall, and routing) worked together with Teradata teams to produce all of the certifications and security protocols.

“Teradata was top notch.  They were able to produce all of the certifications and the security protocols that they follow.  There are a particular number of reports that our security team was looking for, along with the audits.  And Teradata had just recently gone through a facility audit around their cloud facility.  Our team and their team worked hand in hand to feel comfortable.” – Troy Crites, Global BI Architecture Lead

All of this gives Monsanto the opportunity to concentrate on the business they are truly in.

“Monsanto’s mission is to build sustainable agricultural products in order to help our farmers get as much yield as possible while conserving energy and water. In the future as we move into more of a data science space, we’re starting to look at how we can digitize the farmer’s experience and be able to leverage that information in order to increase that yield and offer more sustainable opportunities.” – Troy Crites, Global BI Architecture Lead

Congratulations to Monsanto who is looking to Teradata Everywhere™ to help farmers across the globe grow crops more efficiently, working together for a brighter future.

 

 

Danske Bank: Innovating in Artificial Intelligence and Deep Learning to Detect Sophisticated Fraud

July 25, 2017

What is it about Artificial Intelligence (AI) that excites us data geeks? Is it the delivery of a promise made to us decades ago? Or the endless possibilities that AI can give us now? Probably a little of both. For Danske Bank, AI is delivering exciting business outcomes and inspiring everyone on the team!

“Hands down, this is a high point in my career!  I’m a science guy, so I love these methods and talking about some of the more complex stuff.  It’s definitely worth it;  hands down.” – Nadeem Gulzar, Head of Global Analytics

Danske Bank is using Artificial Intelligence and Deep Learning to detect and then prevent sophisticated fraud in multiple areas and it’s working better and better every day because the models are learning.

Danske Bank distinguishes between two types of fraud – customer fraud and “fraudsters.” In customer fraud, the customer is at the center of the scam. For example, the customer receives an email from a citizen in a remote country asking the customer to send money to help alleviate hardships or to arrange a visit because there is a courtship in the making. And then there is true professional fraud, done when a “fraudster” tracks the perfect time to do serious damage. This can include malware injected into a bank’s systems, or stolen personal IDs with malware added to devices.

“Sometimes in some scam cases the fraudsters attack us for ten minutes and then they never return. Their goal is actually to get the maximum value out of those 10–15 minutes of fame, as we call it, and then stop.” – Nadeem Gulzar, Head of Global Analytics

Danske Bank is using LIME (Local Interpretable Model-agnostic Explanations). LIME is a method used to explain what a deep learning model decided and which features mattered. It is open-source software that helps the team using the model explain the factors that make them believe the model is solid. During this step, in one of the test projects, the team had to explain why they wanted to block a credit card transaction. In one example, a bank customer usually buys from eBay, and the payment goes to China. But today, the customer is using Alibaba. Is that fraud? We don’t know. And in this case, what do we tell the model to do?

In another example, the customer lives in Brazil, but today they are having lunch at a restaurant in Copenhagen. Is that credit card transaction fraudulent or not? This is where behavior data is important. Most customers have a preference, and when that preference isn’t chosen, what is occurring? Shall we execute the credit card transaction or not?
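To illustrate the role LIME plays, here is a minimal sketch using the open-source lime package with a toy transaction model. The feature names, data, and decision rule are invented, not Danske Bank’s:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from lime.lime_tabular import LimeTabularExplainer

feature_names = ["amount_eur", "merchant_matches_history", "country_matches_home"]
rng = np.random.default_rng(0)
X = rng.random((500, 3))
# Toy ground truth: large amounts from an unusual country are fraud.
y = ((X[:, 0] > 0.8) & (X[:, 2] < 0.2)).astype(int)
model = RandomForestClassifier(random_state=0).fit(X, y)

explainer = LimeTabularExplainer(
    X, feature_names=feature_names,
    class_names=["legitimate", "fraud"], mode="classification",
)
# Why would the model block this one transaction (say, the Brazil customer
# suddenly paying for lunch in Copenhagen)?
exp = explainer.explain_instance(X[0], model.predict_proba, num_features=3)
print(exp.as_list())  # per-feature weights behind this single decision
```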

Is it all AI? No, human interaction is required to help train the models. For example, investigative officers were brought in to better understand anomaly detection. Human knowledge was needed to better understand the scene alongside the fraud models. Is this a sophisticated ring of “fraudsters”? Is this only one “fraudster”? Is this a group of 10-15 minute “fraudsters” creating a cluster? Is this a trend?

One of our favorite parts of this story is the ‘champion/challenger’ model strategy. Both champion and challenger models are always being tested using production data. And because there are billions of transactions happening every day, they can constantly improve the models. Danske Bank sets thresholds for the models, and when a model drops below a threshold, they determine whether they are feeding it enough data. For example, do they need to add in geo-location data? Add in ATM data? Model comparison is done live! And when appropriate, challenger models become champion models. That’s so cool!
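Here is a minimal sketch of champion/challenger promotion logic, with invented metrics and thresholds; Danske Bank’s actual criteria are not spelled out in this post:

```python
from dataclasses import dataclass

@dataclass
class ModelStats:
    name: str
    true_pos: int
    false_pos: int

    @property
    def precision(self) -> float:
        flagged = self.true_pos + self.false_pos
        return self.true_pos / flagged if flagged else 0.0

def next_champion(champion: ModelStats, challenger: ModelStats,
                  floor: float = 0.60, margin: float = 0.02) -> ModelStats:
    """Both models score the same live production traffic; compare them and
    decide which one is champion for the next period."""
    if challenger.precision > champion.precision + margin:
        return challenger  # the challenger is promoted
    if champion.precision < floor:
        # Below threshold: is the model being fed enough data?
        print(f"{champion.name} below floor; consider adding geo-location or ATM data")
    return champion

print(next_champion(ModelStats("champ", 80, 40), ModelStats("chall", 95, 30)).name)
```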

All of this takes a diverse team.  Danske Bank employs platform, technical & data engineers, data scientists, the business and even highly trained criminal investigators – all of them work with experts in AI and Deep Learning to innovate.  And they will even hire from local universities!

The results for the business are more than impressive. Before applying AI and Deep Learning, Danske Bank had 1,200 false positives a day. Those were cases that had to be analyzed by Danske Bank investigators, sometimes even by external agencies like Interpol. Now that number has been reduced by 60% (to roughly 480 a day), saving bank investigators significant time and allowing them to investigate real cases of fraud. And that’s not all. Detection of true positives has increased by 50%. Teams at Danske Bank believe this is just the beginning.

Congratulations to Danske Bank for all the success in this renaissance of Artificial Intelligence!

Sanofi: Forwarding Medical Advances and Breakthroughs to Help People Have Better Health

June 12, 2017

Faster time to market in any industry is vital, but in bio-pharmaceuticals it is even more than that – faster time to market can improve patient care or even change patient outcomes. For the French multinational bio-pharmaceutical company Sanofi, analytics and data are the key factors in accelerating time to market. Operating in 100 countries around the globe and providing healthcare solutions to more than 170 countries, Sanofi produces critical pharma products for oncology, diabetes, cardiovascular disease, the central nervous system, and vaccines.

“Our goal is to help people have better health.  We produce vaccines to prevent disease in some of the poorest countries in the world.  We are researching solutions to meet unmet needs. We are also in consumer health, helping people feel better and live better.  We provide this to people all around the world.” – Martin Longpré, Solution Architect

It is well recognized that clinical data is one of the industry’s most sensitive assets and a competitive advantage, providing critical evidence of a drug therapy’s efficacy and safety as well as its potential health and economic impact. Managing one of the largest R&D organizations in the bio-pharmaceutical industry, with a diverse pipeline, Sanofi R&D focuses on maximizing the efficiency of its clinical development organization and improving visibility into the progress of its global trials. Aiming toward the business outcomes of product innovation and risk mitigation, Sanofi R&D created MAESTRO. Like a distinguished musician, MAESTRO is an agile, integrated warehouse designed to scale and address the clinical trial challenges that arise from the high variability of the data captured. Clinical trial data is as varied as it gets; everything must be entered – patient data, medical test results, protocol data, and medical side effects (even a headache must be captured).

MAESTRO was designed from the beginning to scale. Today, the solution is able to address the clinical trial challenges arising from the high variability of the data captured and the continuous increase in volume introduced by an extensive portfolio of prescription drugs, vaccines, generics, and consumer healthcare products. Sanofi R&D is able to manage dozens of studies at once (70–75), with the ability to scale to 200. Researchers are able to refresh a study every ten minutes rather than every forty in the previous environment, giving them faster access to new information. And the lockout time on studies has been reduced from eight hours to just under an hour!

“If we think about this system, it allows us to recognize and rapidly address any change in the status of a patient. If a patient experiences an adverse event during a clinical study, especially a serious adverse event, it is important that we get notified immediately so we can respond accordingly. The safety of our patients always comes first, in all the studies we do.” – Martin Longpré, Solution Architect

Teradata’s Enterprise Data Consulting organization helped Sanofi architect, integrate, and migrate the data from Oracle to Teradata, in addition to being instrumental in implementing important features such as JSON, User Defined Functions, and Temporal. Project MAESTRO is impacting business outcomes, including:

  • Product Innovation and the ability of the R&D organization to potentially create new products or designs that are safer, more efficient, and meet the market needs sought by doctors and patients.
  • Risk identification, mitigation and evaluation should any agency (internal or external) want to understand a study’s traceability.

“The major advantage is that we have a history of all the interactions with the patient.  This means from the beginning of the clinical study, let’s say four years ago, to the end of the study, we will be able to provide every action that has taken place with that subject for all the different elements of the study.” – Martin Longpré, Solution Architect

All of which propels Sanofi towards their mission to “shape tomorrow’s health.”

“Félicitations” to Sanofi on project MAESTRO and all of your success!

 

Lufthansa Group: Connecting Europe to the World While Keeping the Customer at the Center of Business

June 5, 2017

“Big data costs money. Big analytics earns money.” Have you ever heard a truer statement? That profound little nugget came from Heiko Merten, Head of Global Sales Business Intelligence Applications at Lufthansa Group. Heiko knows what he is talking about – last year Lufthansa Group maintained critical profit margins and used that “big data and big analytics” to achieve three corporate KPIs: maximize revenue, minimize costs, and maintain customer satisfaction. No easy feat when operating multiple airlines and more than 18 companies that provide services to those airlines (think food, cleaning, and maintenance, just to start).

“That’s our KPI set, which is supported by performance drivers. It means those factors that influence and affect those KPIs. From our point of view, top management should steer the company just via the top-level KPIs, and only if there are questions, uncertainties, or need for clarification will they drill down to have a look at the performance drivers together with middle management.” – Heiko Merten, Head of Global Sales Business Intelligence Applications

So, what are they doing and how are they doing it?

In order to measure those KPIs, Lufthansa had to create a common data language across its multiple airline acquisitions and then integrate the data from many internal and external sources – including revenue, reservations, marketing information, schedules, and market share – while also bringing in competitive information to stay in tune with the competition. Lufthansa integrated its data with the Teradata Unified Data Architecture™ to break down the many different brand silos for crew scheduling, destinations, airplane schedules, jet fuel, and crew efficiency. They now have the ability to understand cross-functional performance across the different Lufthansa Group airlines, establishing a monetary evaluation of the carriers’ bookings.

Within this highly competitive market, Lufthansa uses analytics to measure ‘fair share,’ using pricing, route, airplane model, and even customer-segment data to determine whether travel agents are giving Lufthansa Group its fair share within a growing or shrinking market.

“How did the Lufthansa Group’s market share develop in comparison to the overall market development? Or who gained new share of the market? And to be more concrete, if Lufthansa grows by three percent in a market that overall grows thirty percent, that’s not a good sign. And vice versa. If a market shrinks by ten percent and Lufthansa shrinks just by five, that’s a good sign for Lufthansa…We are currently modeling, or introducing, a KPI called ‘fair share’ to evaluate the performance of agents or corporate customers, reflecting exactly the fair market share of an agent – the Lufthansa share that is sold through an agent.” – Heiko Merten, Head of Global Sales Business Intelligence Applications
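The arithmetic behind those comparisons is simple enough to sketch. This is our formulation for illustration, not Lufthansa’s actual KPI definition:

```python
def growth_vs_market(own_growth: float, market_growth: float) -> float:
    """Relative performance index: >1 means gaining share, <1 means losing it."""
    return (1 + own_growth) / (1 + market_growth)

# From the quote: growing 3% in a market growing 30% is not a good sign...
print(growth_vs_market(0.03, 0.30))    # ~0.79: losing share
# ...while shrinking 5% in a market shrinking 10% is a good sign.
print(growth_vs_market(-0.05, -0.10))  # ~1.06: gaining share

def fair_share_gap(agent_lh_bookings: float, agent_total_bookings: float,
                   lh_market_share: float) -> float:
    """How far an agent's Lufthansa share deviates from Lufthansa's
    overall share of that market; negative means below fair share."""
    return agent_lh_bookings / agent_total_bookings - lh_market_share

print(fair_share_gap(180, 1000, 0.25))  # -0.07: this agent is below fair share
```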

The integration of that data quickly led to another outcome, or purpose, for the analytical ecosystem: enabling customer-specific offers and better service by establishing overarching analysis across the multiple brands/carriers and business units. That same integrated data also:

  1. Helps better steer sales and sales performances within Lufthansa Group airlines;
  2. Gets the full picture of their customer’s performance and rewards loyalty; and
  3. Fulfills the sales strategy and commercial targets on customer levels with optimal attribution of incentives.

With the right analytics, Lufthansa can overcome economic, competitive, and quality challenges to achieve unprecedented levels of excellence. All of which comes back to…

“…Big data costs money.  Big analytics earns money. And that means, in my eyes, that additional revenue cannot be generated by data, it must be generated from the analytics that are based on the data.  This is an important point.  Have a good data set available and have powerful, high-performance reporting on top.”  – Heiko Merten, Global Sales BI Applications

Congratulations to Lufthansa Group for all your success!

DHL Express: Gaining Insights From a Global Finance Transformation

March 10, 2017

What DHL wants is simple: to be “The logistics company for the world.” That fulfills a greater vision to improve the lives of people everywhere, enabling international commerce to boost economic growth, with safer deliveries of medical goods and the ability to reach the most remote areas of the world. Revolutionizing the world of logistics takes innovation and the ability to recognize key opportunities and then transform. Key opportunities in this case meant finding actionable insights in the massive amounts of data DHL collects every day. That’s exactly what DHL Express (a division of DHL) did in 2013, embarking on a global finance transformation they appropriately named “Project INSIGHT.”

“The data that we get from INSIGHT allows us to be very specific when it comes to pricing, profitability, and costing. We can adjust our prices where we have very little capacity. We can lower our prices where we have more capacity. We can offer customers a much more balanced pricing proposition.” – Graeme Aitken, Vice President of Business Controlling

 Starting in one country and then replicating to others around the world, DHL Express integrated data from finance, operations, their customers and more, creating a sophisticated state-of-the-art costing system that takes advantage of the goldmine of data that DHL Express users have access to.

“What we’ve got is we understand very granular costing and profitability of every shipment, and because it’s at the level of the shipment, we can aggregate it up to trade lanes, countries, products, customers, and then we can start to take action on pricing, revenue management, capacity management, and so on.  So we can be very surgical about how we approach pricing, costing and profitability.  If we have a problem with profit, if we have a problem with cost, we can really be very specific about how we fix it.” – Graeme Aitken

This gives DHL Express full visibility into cost management and yield management, and the ability to predict and act on their customers’ economic cycles.
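That shipment-level granularity is easy to picture in code. Here is a minimal sketch with an invented schema, not DHL’s actual system:

```python
import pandas as pd

# Hypothetical shipment-level records: profitability is computed per shipment...
shipments = pd.DataFrame({
    "origin":   ["DE", "DE", "GB", "GB"],
    "dest":     ["US", "US", "CN", "CN"],
    "customer": ["A",  "B",  "A",  "C"],
    "revenue":  [120.0, 95.0, 210.0, 180.0],
    "cost":     [88.0,  90.0, 150.0, 250.0],
})
shipments["profit"] = shipments["revenue"] - shipments["cost"]

# ...and because it exists at shipment level, it aggregates up to any view:
by_lane     = shipments.groupby(["origin", "dest"])["profit"].sum()
by_customer = shipments.groupby("customer")["profit"].sum()

# Surgical view: which trade lanes lose money and need a pricing fix?
print(by_lane[by_lane < 0])
```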

Cost management allows DHL Express to take into account the hard costs of doing business no matter the customer (think planes, trains, automobiles, fuel, personnel, etc.). Knowing those fixed costs allows DHL Express to better manage their customer pricing, ensuring it’s low enough for customer retention and yet high enough to be profitable for DHL Express.

With the capability of yield management, DHL Express can adjust for decreased or increased capacity.  They can adjust prices where they have little capacity and offer lower prices when there is more, resulting in a balanced pricing proposition. They can then pass insights and cost savings to their customers to save them money, improve their service, and increase their customer’s profitability.

“If, for example, we have an issue with a customer with failed delivery– so if we keep trying to deliver a shipment, our customer is not home — it’s bad for us and it’s bad for the original shipper because they’re getting lower customer satisfaction if a shipment’s not delivered. In many cases we’ve actually shared information with our customer. ‘We have an issue delivering shipments. We have an issue with bad address,’ and they get an insight into their own logistics data, and they can make their own improvements.” Graeme Aitken, VP of Business Controlling

Perhaps one of the most impressive results of the finance transformation is DHL Express’ ability to manage customers’ economic cycles with micro and macro third-party data. Whether the economy is up or down, DHL Express is now predicting and forecasting with better accuracy. They can look at historical data and add in economic indicators and strategic priorities to better manage their budget and revenue/profit plan.

“We have a brilliant view of the past. So if you extrapolate the past to the future, we should get a pretty accurate indication of where we’re going. So if you build in variables like inflation rates, Brexit, whatever else is happening in the global economy, we should be able to forecast more accurately using the very detailed costing data that we have currently.” Graeme Aitken, VP of Business Controlling

 DHL Express’ team is proud of the progress they have made in a few short years, knowing that 85% of digital transformation projects fail.  Graeme Aitken credits Teradata Consultants with consistent, innovative and critical work on Project INSIGHT.

“Teradata consultancy helped us build the system. One or two of them are still working on INSIGHT. If we ever have an issue, if we have a change request, if we want to make improvements, we always get the same people back because we had such a strong relationship, and over the years they understand almost as much of the business as we do and they can help us— ‘This is practical, this is not practical.’”  Graeme Aitken, VP of Business Controlling

Congratulations to the DHL Express team on the success of Project INSIGHT!

Lloyds Banking Group: One Ecosystem Serving Multiple Brands Delivering Business Outcomes to Help Britain Prosper

February 7, 2017

Committed to “helping Britain prosper,” Lloyds Banking Group serves 25% of Britain’s first-time home buyers and 20% of business start-ups through their multiple brands. Data and analytics are at the core of the bank and an integral part of Lloyds’ mission for the people of Britain.

 


“It’s across both operational and analytical data. It’s very much understanding the customer, providing a single view of the customer and their interactions with the bank, then being able to provide them with the best offers around products and their finances. It’s also to protect their data with support, risk systems, fraud systems and fraud identification. We provide that deep understanding of the customer in order to be able to set strategy, pricing, and organize the bank in the best way to be able to support both retail and commercial customers.” Simon Howarth, Chief Subject Matter Expert, Information Management.

The ‘mantra’ within IT at Lloyds is “service is the number one priority,” bridging architecture and the practical implementation of strategy. IT designs and manages the ecosystem while different groups ‘crowdfund’ initiatives that benefit all of Lloyds in the areas of income generation, new product strategy, meeting regulatory requirements, customer experience, and cost management.

When it comes to analytics and creating those business outcomes, Lloyds follows a winning process, starting with a small ‘proof of value’ using minimal data and a known ‘kit.’ The team then brings IT and different business areas together to identify common goals. That’s when the unique ‘crowdfunding’ comes in to support different analytics initiatives; they deliver quickly with short sprints, share the findings, and build more support.

In just a short time, Lloyds Banking Group got quick wins with three separate Teradata Aster™ projects using the same 200 datasets:

  • Transactions at the branch, internet banking, or ATM
  • Digital (web, mobile, and tablet platforms)
  • Telephone interactions with call center agents, initiated by the customer or the bank
  • Updates to account/customer profiles
  • Outbound and offline marketing
  • Feedback (NPS surveys or complaints)

Use Case #1: Distribution Strategy

This included an investigation into what drives activity into bank branches, which has a significant, strategic impact on the cost infrastructure. Key questions included where the bank should invest in branches over the next X years, how many ATMs are needed and where, and the mix of skills and staff needed at the branches.

Use Case #2: Call Center Volume Investigation

Following the results of the distribution strategy use case above, our consultants were asked to analyze unexpected changes in the customer call centers. Customer calls had been declining steadily, and naturally the bank was staffing toward that decline. Yet in November 2015 the decline turned upward, and the bank worried that call volumes would continue to trend upward, with service levels and customer satisfaction being impacted. Call center staffing is a significant investment, and the bank wanted to control costs and mitigate the risk of potential dissatisfaction. And customer satisfaction is important!

The team gathered banking transactions and marketing interactions to sessionize the data and place it in chronological order to understand when customers called (4 months of call center data, 4 months of subsequent banking transactions, and 4B rows in Aster). And the triangulation began.

  • Of active account holders, 20% called into the call center
  • Of payment customers, 26% called into the call center
  • Of credit card customers, 33% called into the call center

Upon further investigation, the team realized that the government had tightened credit card rules, which may have triggered those calls. And for fraud alerts, there appeared to be a strong relationship between experiencing an alert and calling the call center, with rapid responses and a pronounced spike in calls. Think about it. If you received a text message that there was fraudulent activity on your credit card, how would you respond?
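Here is a minimal sketch of that sessionize-and-triangulate step, with an invented schema and a 30-minute session gap; Lloyds’ actual rules are not published in this post:

```python
import pandas as pd

# Hypothetical interaction events, one row per customer touchpoint.
events = pd.DataFrame({
    "customer_id": [1, 1, 1, 2, 2],
    "channel": ["fraud_alert", "call", "branch", "payment", "call"],
    "ts": pd.to_datetime([
        "2015-11-02 09:00", "2015-11-02 09:07", "2015-11-10 14:00",
        "2015-11-03 11:00", "2015-11-03 16:30",
    ]),
}).sort_values(["customer_id", "ts"])

# Sessionize: a new session starts whenever more than 30 minutes pass
# between a customer's consecutive events.
gap = events.groupby("customer_id")["ts"].diff() > pd.Timedelta("30min")
events["session_id"] = gap.groupby(events["customer_id"]).cumsum()

# Triangulate: in what share of sessions does a fraud alert appear
# alongside a call to the call center?
per_session = events.groupby(["customer_id", "session_id"])["channel"].agg(list)
alert_then_call = per_session.apply(lambda c: "fraud_alert" in c and "call" in c)
print(alert_then_call.mean())
```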

Use Case #3: Investigate Contactless Payments

There was strong evidence that contactless payment events often preceded call events, but there appeared to be a significant lag between payments and calls, supporting the view that there is a “sudden shock” when customers realize that something isn’t right with their statement or transactions.

“Paths aren’t linear for somebody to actually get a product. Traditionally what we’ve done is look at a product’s completion; therefore the person has gone through step one, two, three. Actually, people go through steps one, two, back to one, then to three, back to two, back to one again, off out somewhere else. You can see a lot of looping going on, and that’s quite inefficient while it’s happening. So, a lot of the changes, or some of the changes that came off the back of that, aren’t actually to do with IT systems and what we’re actually going to do in the IT process. They’re to do with business process and how that then will result in success.” – Simon Howarth, Chief Subject Matter Expert for Information Management

The Group has used Teradata Customer Interaction Manager (CIM) for 8 years. CIM helps to consolidate and organize data so that Lloyds Banking Group can:

  • Segment target audiences into meaningful groups;
  • Visualize the impact segmentation strategies will have on engagement;
  • Select the right message for the right customer;
  • Predict customer responses to strategic offers based on historical data;
  • Send personalized, relevant and timely messages based on specific attributes and behaviors; and
  • Create and manage custom offers that align with a holistic brand messaging strategy.

By knowing a customer’s age and their geo-location from their mobile phone, they can send very targeted campaigns. One-quarter of sales come from the campaign management tool. One of the biggest benefits of the Teradata CIM tool is the ability to tune a campaign, leading to an uplift in revenue.

Congratulations to Lloyds Banking Group for all of your success!

 

 

Maersk Line: Using the Internet of Things, Data, and Analytics to Change Their Culture and Strengthen the Global Supply Chain

November 6, 2016

Digitizing a century-old shipping company that carries 15% of the world’s GDP isn’t easy, but Maersk Line knows data and analytics are already changing its culture, benefiting customers, and strengthening the global supply chain. Maersk Line, with over 600 vessels, has used data and analytics to optimize how ships approach the harbor and consume fuel, and is now predicting failures in refrigerated containers (or reefers) using the Internet of Things (IoT). But that’s just low-hanging fruit – the future is bright.


“We have some single items on the cost list that are fairly significant. If you take, as an example, our investment in fuel (we call it bunker), or if you take something like moving around our containers, there is an asymmetric relationship between where you need containers empty and where they arrive full. They need to be relocated all the time back to where we can fill them. We spend over $1B doing that each year, just moving the empty containers. To do that in a manner that is as efficient, automated, and intelligent as possible is a great win.” – Jan Voetmann, Head of Analytics Engagement, Maersk Line

And those refrigerated containers? Reefer technology dates back to the 1930s, but in this century those reefers are outfitted with sensors that give Maersk Line visibility into exactly what’s happening during the journey. What are the CO₂ and O₂ levels in the containers? What is the temperature? These sensors also give Maersk Line the opportunity to fix the containers before they fall below service level agreements, potentially spoiling their contents.

“It makes the process far more efficient.  If you didn’t know the state of the container, you would expect to have to go through the entire checklist. We can reduce that checklist significantly by checking out in real-time before it arrives what is actually the state of that container and we can get it through much faster and make it available to customers instead.” – Jan Voetmann, Head of Analytics Engagement, Maersk Line
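A minimal sketch of such a pre-arrival check, with invented service limits (real limits depend on the cargo and the SLA):

```python
from dataclasses import dataclass

@dataclass
class ReeferReading:
    container_id: str
    temp_c: float
    co2_pct: float
    o2_pct: float

# Hypothetical allowed bands per sensor reading.
LIMITS = {"temp_c": (-1.0, 4.0), "co2_pct": (0.0, 10.0), "o2_pct": (2.0, 21.0)}

def needs_service(r: ReeferReading) -> list:
    """Return the readings drifting outside their allowed band, so the
    container can be fixed before the SLA is breached and cargo perishes."""
    issues = []
    for field, (lo, hi) in LIMITS.items():
        value = getattr(r, field)
        if not lo <= value <= hi:
            issues.append(f"{field}={value} outside [{lo}, {hi}]")
    return issues

print(needs_service(ReeferReading("MSKU0001", temp_c=6.5, co2_pct=8.0, o2_pct=15.0)))
# ['temp_c=6.5 outside [-1.0, 4.0]'] -> schedule service before arrival
```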

Location. Location. Location.

That term is not only important in real estate; it’s important in Maersk Line’s daily operations. Maersk Line vessels can carry up to 18,000 containers. How would you find them without sensors, data, and analytics? Being data-driven gives Maersk the ability to know critical information, for example:

  • Where do we have empty containers? How can we reposition them?
  • Can they be rerouted and how quickly can they be filled?
  • Can these containers be made full at the same port with our customer’s products?

“When we have to relocate all our empty containers, the historic model has been that you have depots around the world and in each of those depots they individually assess, ‘what’s my need for empties right now to then make available to customers?’ That model can be made more efficient. What we’re building from the analytic side is actually a whole automated process for them that does that job. It assesses how much they need and it also automatically makes sure that those containers are routed to them so that they have the stock that is required without going too far.” – Jan Voetmann, Head of Analytics Engagement, Maersk Line

Maersk Line spends $1B every year moving empty containers, so even a 10% improvement is significant. And that’s low-hanging fruit.
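That automated repositioning job can be framed as a classic transportation problem. Here is a toy sketch with made-up depots and unit costs; Maersk’s real optimizer is far richer than this:

```python
import numpy as np
from scipy.optimize import linprog

supply = np.array([120, 80])       # empty containers sitting at 2 surplus depots
demand = np.array([70, 60, 70])    # empties needed at 3 deficit depots
cost = np.array([[4.0, 6.0, 9.0],  # cost to move one container, depot i -> j
                 [5.0, 3.0, 7.0]])

c = cost.flatten()                 # decision variables x[i, j], row-major

# Each surplus depot ships no more than it holds: sum_j x[i, j] <= supply[i]
A_ub = np.zeros((2, 6))
for i in range(2):
    A_ub[i, i * 3:(i + 1) * 3] = 1

# Each deficit depot gets exactly what it needs: sum_i x[i, j] == demand[j]
A_eq = np.zeros((3, 6))
for j in range(3):
    A_eq[j, [j, 3 + j]] = 1

res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand, bounds=(0, None))
print(res.x.reshape(2, 3))  # shipment plan: containers moved per depot pair
print(res.fun)              # total repositioning cost being minimized
```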

Maersk Line knows they are at the very beginning of this digitization journey and that there are exciting changes ahead for them and their customers.

“Analytics is a relatively new capability within Maersk and my role is really from the inside to help accelerate the digital transformation and see from the outside, from the customer’s point of view, to try and create a more efficient machine and thereby passing on some better costs or lower costs to them.

From the customer’s point of view, it’s also about making sure that they actually have access to the products they need at the right time, of course at the right price. For us the next stage is to then commercialize it, to actually take all that sensor data and make it available both from a transparency point of view to customers. Also, for actions, because you can probably imagine that if a customer had the ability to change the air pressure or the humidity or the temperature or the CO₂ or O₂ distribution within the container, they can actually engage with our goods while they’re in transit and then completely change the value proposition to them.” – Jan Voetmann, Head of Analytics Engagement, Maersk Line

Customer access and transparency to their own goods? A possible new service and revenue stream for Maersk Line?  The future is bright.  All thanks to managing and harnessing the power of IoT, data, and analytics.   Congratulations to Maersk Line for all of your success!

Ticketmaster: Capitalizing on OPEX, Performance, Analytics and Data to Deliver Insights, All in the Cloud

November 2, 2016

This is a story about the cloud. You’ve heard of the cloud, right? If you’re in data and analytics, that’s what people are talking about these days. But who’s doing it right? How are they using it? How difficult or easy was the transition from on-prem to the cloud? We give you… Ticketmaster.

“Striving to put fans first,” Ticketmaster has been getting people into concerts, sporting events, and shows since it was started in a college dorm room in 1976.  Getting tickets has changed a bit in four decades and now Ticketmaster sells hundreds of millions of tickets every year.

“We are fans, too; ‘we get you in.’ That’s what we do. We enable that experience; that once-in-a-lifetime experience of being able to go see your favorite band, or your favorite Broadway show, or your favorite sports team.  We are the beginning, middle and end of that experience!” – Shawn Moon

Ticketmaster considered multiple factors before going to the cloud – including flexibility, performance, and the ability to use OPEX dollars over CAPEX – as well as a company mandate to be in the cloud by 2017. Ticketmaster has a business model that requires the ability to spin resources up and down quickly. Think of a Beyoncé tour going on sale at 10AM on a Saturday. Lots of need. The next Tuesday and the rest of the week: not so much.

“The analytics piece is the way that we’re able to deliver the value that is locked up in the data. These days, you just can’t afford to have a big and long process of discovering something. You need a platform that is responsive, fast, grows as your data grows and as your demands grow.  The questions that the teams will be asking tomorrow can still be answered on the platform you have today. That’s important, because we don’t have time to go out and re-platform just to answer another question that might come up in six months.”

Ticketmaster wanted to get out of the day-to-day work of managing an environment.

“By moving into a cloud platform we have taken ourselves out of the business of managing that hardware, of upgrading that hardware, of securing that hardware, and put it into hands of a group that’s dedicated to those operations. That frees up our resources to be able to work on more value-add tasks.” – Shawn Moon

Ticketmaster made the move to the cloud in only ten weeks and most of Ticketmaster’s users don’t even know they made the move.  Except for the fact that, in many cases, performance is better.

“If you’re looking for drama, there’s not a lot of drama there. You know why there’s not a lot of drama there? It’s largely to do with the fact that, for six years, Teradata’s become a very trusted partner of Ticketmaster. I knew when we were talking about moving to the cloud, I knew and trusted that Teradata was going to make sure that the project was successful. I didn’t have to convince anybody in my organization of that. They’d seen the partnership that’s happened and developed over the years, with Teradata, and the way that they’ve come to the table when we’ve needed their assistance, and really come through for us.” Shawn Moon

Ticketmaster users include finance, marketing, and accounting. But it extends further than that, to the venues and promoters that are putting on these shows. If sales are sagging for a certain concert or show, teams can quickly pivot and personalize offers to boost attendance. But Ticketmaster goes even further, using the data and analytics to help fans enhance their experience.

“In data-warehousing, it doesn’t sound like something that’s really going to get you out there and help the fan experience. But, we’re able to take a look at the way fans interact with our platform. What do they like to see? Where do they like to go? Then we can help them discover new and exciting events and attractions, that they would like to see.”

That, in a nutshell, is your cloud story.  Congratulations to Ticketmaster on your cloud success!

 

Enedis: Innovating with the Digitization of Energy Distribution in France and Around the World

October 11, 2016

Internet of Things (IoT), data labs, open source, Teradata Unified Data Architecture ™ – all of which evoke thoughts of innovation, pushing the limits of analytics and data with cutting edge access to actionable insights.  And Enedis is doing it all.

As a neutral Distribution System Operator (DSO) covering more than 95% of France, Enedis is regulated by the French government and provides electricity to unregulated suppliers. Ultimately though, Enedis is responsible for connecting 35 million customers within France to their electricity. Continuity and quality are primary objectives, and both can be better managed with analytics and data.

What and how are they accomplishing this?

Zero outages is a top priority. Enedis uses data and analytics to help optimize the network, which includes over 400,000 generators and 1.3M kilometers of cables and lines. Large volumes of IoT data come in through smart meters and sensors on the network. Enedis collects consumption, environmental (think weather data), and consumer production data for internal and external initiatives. Teradata’s Unified Data Architecture™ is the analytical ecosystem used to execute and operationalize business actions involving the analysis of that data. Externally, suppliers are able to see consumer usage and grid assets and use that information to determine their connection to the grid and their pricing, and to identify fraud. Internally, users maintain and optimize grid assets, thereby increasing the efficiency of their €3.2B yearly investment in infrastructure.

“Our goal is really to predict because when I was speaking about 3.2B (Euros), if we can win two percent, three percent, or even four percent of this amount of money, it’s huge.  It’s a game changer both for quality and economical efficiency.”  – Christian Buchel, Deputy CEO & Chief Digital Officer

An example use case is developing predictive maintenance on HV/MV transformers to prevent outages. Enedis placed multiple data sources, including 10 years of historical outage data, into a Hadoop data lake, and then used Teradata Aster in a POC to understand patterns of failure and ultimately improve the maintenance of these assets.

“We have put in all the outage problems from the last decade. We put everything in a data lake and we brought in external non-structured data, temperature, and a lot of other data we have from the environment. We brought in all the data available around these outages and mixed structured and non-structured data to develop the best algorithm to help us be more efficient and to predict how we should perform maintenance – what kind of devices should we change, and when? This kind of predictive maintenance has been developed in other industries, such as aircraft, but not yet for energy companies.” – Christian Buchel, Deputy CEO & Chief Digital Officer
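A minimal sketch of what such a transformer failure model could look like; the features and data are invented, and Teradata Aster (not scikit-learn) was the tool actually named in the POC:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Hypothetical per-transformer features joined from outage history and
# environmental (e.g., temperature) data.
df = pd.DataFrame({
    "age_years":       [12, 30, 5, 41, 22, 17, 35, 8],
    "load_factor":     [0.6, 0.9, 0.4, 0.95, 0.7, 0.5, 0.85, 0.3],
    "past_outages":    [0, 3, 0, 5, 1, 0, 2, 0],
    "avg_summer_temp": [24, 28, 22, 30, 26, 23, 29, 21],
    "failed_next_yr":  [0, 1, 0, 1, 0, 0, 1, 0],
})
X, y = df.drop(columns="failed_next_yr"), df["failed_next_yr"]

# A real project would hold out data for validation; this fits on everything
# purely to show the shape of the workflow.
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Rank transformers by failure risk: which devices to change, and when.
risk = pd.Series(model.predict_proba(X)[:, 1], index=df.index)
print(risk.sort_values(ascending=False).head(3))
```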

A final use case for the Teradata Unified Data Architecture™ is reducing non-technical losses, or fraud, thanks to smart meters. Enedis is able to create use cases representing different non-technical losses, such as consumption with an inactive contract, metering dysfunctions, or presumptive fraud.
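A hedged sketch of rule-based flags for those three loss categories (illustrative thresholds, not Enedis’ production logic):

```python
def non_technical_loss_flags(consumption_kwh: float, expected_kwh: float,
                             contract_active: bool, meter_healthy: bool) -> list:
    """Classify a smart-meter reading into the loss categories named above."""
    flags = []
    if consumption_kwh > 0 and not contract_active:
        flags.append("consumption with an inactive contract")
    if not meter_healthy:
        flags.append("metering dysfunction")
    # Presumptive fraud: active contract drawing far less than comparable homes.
    if contract_active and meter_healthy and consumption_kwh < 0.2 * expected_kwh:
        flags.append("presumptive fraud")
    return flags

print(non_technical_loss_flags(250.0, 240.0, contract_active=False, meter_healthy=True))
# ['consumption with an inactive contract']
```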

Changing the Culture

Putting analytics and data into the hands of employees (literally) has changed the culture of Enedis. Recently, they deployed more than 10,000 field tablets and smartphones to their employees and developed an internal app store. This immediate time to market, with transparent and direct access to information, has transformed how the company works.

“They can immediately access the problems they have when they are talking with a consumer or local authority.  They can show the mapping of the grid.  They can say, ‘Okay, we can connect here, here and here.’ The equipment, coupled with the mobile application is a huge game changer for the culture of the company because employees access the information immediately and can give answers to the local authority or consumers.”  – Christian Buchel, Deputy CEO & Chief Digital Officer

With open access, Enedis is also able to give power to the end consumer to have more choice and to save energy and money, improving customer satisfaction.

“To empower the consumer, the consumer needs to have data. They need to view historical load curves of their consumption. We bring them this data so the consumer can say to their supplier, ‘This is not a good price, because in the last month my consumption load curve was this, this, and this…’ They are empowered by the fact that they know their consumption. Data transparency is a way to empower the consumer.” – Christian Buchel, Deputy CEO & Chief Digital Officer

Congratulations to Enedis for its leadership in the digitization of energy distribution, not only in France but paving the way for other DSOs around the world.