Farrah Bostic, presenting a message encouraging both skepticism and genuine intimacy, was one of the most provocative speakers at Strata 2014 in Santa Clara earlier this year. As founder of The Difference Engine, a Brooklyn, NY-based agency that helps companies with research and digital and product strategy, Bostic warns her clients away from research that seems scientific but doesn’t create a clear model of what customers want.

Too often, Bostic says, numbers are used to paint a picture of a consumer, someone playing a limited role in an interaction with a company. The goal of the research is to figure out how to “extract value” from the person playing that role. As Bostic puts it, “People are data too.” Instead of performing research to confirm your strategy, Bostic recommends using research to discover and attack your biases. A better approach is to build a more complete, genuine understanding of the customer, then figure out how your product or service can provide value that makes that person’s life better and helps them achieve their goals.

After hearing Bostic speak, I had a conversation with Dave Schrader, director of marketing and strategy at Teradata, about how to bring a better model of the customer to life. As Scott Gnau, president of Teradata Labs, and I pointed out in “How to Stop Small Thinking from Preventing Big Data Victories,” one of the key ways big data creates value is by improving the resolution of the models used to run a business. Here are some of the ways that models of the customer can be improved.

The first thing that Schrader recommends is to focus on the levers of the business. “What actions can you take? What value will those actions provide? How can those actions affect the future?” said Schrader. This perspective helps focus attention on the part of the model that matters most.

Data should then be used to enhance the model in as many ways as possible. “In a call center, for example, we can tell if someone is pressing the zero button over and over again,” said Schrader. “This is clearly an indication of frustration. If that person is a high-value customer, and we know from the data that they just had a bad experience – like a dropped call with a phone company, or 10 minutes on the banking fees page before calling – it makes sense to raise an event and give them special attention. Even if they aren’t a big spender, something should be done to calm them down and make sure they don’t churn.” Schrader suggests that evidence of customer mood and intent can be harvested in numerous ways, through voice and text analytics and all sorts of other means.
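The kind of rule Schrader describes can be sketched as a simple event-detection policy: combine recent behavioral signals into a frustration score, and escalate when the score and the customer’s value warrant it. The following is a minimal illustration, not Teradata’s implementation; all class names, signal names, and thresholds are invented for the example.

```python
# Illustrative sketch of frustration-event detection in a call center.
# Signal names ("dropped_call", "fees_page_10min") and thresholds are
# hypothetical, chosen to mirror the examples in the article.

from dataclasses import dataclass, field

@dataclass
class CustomerSession:
    customer_value: str                 # e.g. "high" or "standard"
    zero_presses: int = 0               # times the caller mashed "0" for an agent
    recent_events: list = field(default_factory=list)

def frustration_score(session: CustomerSession) -> int:
    """Crude additive score built from the signals the article mentions."""
    score = min(session.zero_presses, 5)            # repeated zero-pressing
    if "dropped_call" in session.recent_events:     # bad prior experience
        score += 3
    if "fees_page_10min" in session.recent_events:  # lingered on the fees page
        score += 2
    return score

def route(session: CustomerSession) -> str:
    """Raise an event (special handling) based on score and customer value."""
    score = frustration_score(session)
    if score >= 3 and session.customer_value == "high":
        return "escalate_to_retention_specialist"
    if score >= 3:
        return "prioritize_queue"   # calm them down, reduce churn risk
    return "normal_queue"
```

In practice the signals would come from voice and text analytics and clickstream data rather than hand-set fields, but the shape of the logic – score the evidence, then act differently for different customer segments – is the same.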

“Of course, you should be focused on what you know and how to do the most with that,” said Schrader. “But you should also be spending money or even 10% of your analyst’s time to expand your knowledge in ways that help you know what you don’t know.” Like Bostic, Schrader recommends that experiments be done to attack assumptions, to find the unknown unknowns.

To really make progress, Schrader recommends finding ways to break out of routine thinking. “Why should our analysts be chosen based on statistical skills alone?” asks Schrader. “Shouldn’t we find people who are creative and empathetic, who will help us think new thoughts and challenge existing biases? Of course we should.” Borrowing from the culture of development, Schrader suggests organizing data hack-a-thons to create a safe environment for wild curiosity. “Are you sincere in wanting to learn from data? If so, you will then tolerate failure that leads to learning,” said Schrader.

Schrader also recommends being careful about where in an organization to place experts such as data scientists. “You must add expertise in areas that will maximize communication and lead to storytelling,” said Schrader. In addition, he recommends having an open data policy wherever possible to encourage experimentation.

In my view, Bostic and Schrader are both crusaders who seek to institutionalize the spirit of the skeptical gadfly. It is a hard trick to pull off, but one that pays tremendous dividends.

By: Dan Woods, Forbes Blogger and Co-Founder of Evolved Media

Throwing your Hat in the Ring

Posted on: August 14th, 2013 by Dan Graham

 

In the 19th century, boxing was a rough sport where anyone could challenge the current champion.  Standing in a big circle, amid all the shouting and hollering, the newest challenger would throw his hat in the circle to bid for the next fight.

It seems like every month there is a vendor throwing a hat in the database analytics ‘ring’ to challenge Teradata. It makes marketing sense to take this approach: why challenge #2 or #5 in the market? The marketing folks from these startups know they will get press and attention if they can make a big claim against the world champion. North of 98% of the time, the challenge is hype – but that was the vendor’s goal. Most of them really don’t want a fight; they want press, buzz, controversy, and of course sales leads.

Consequently, many vendors with a new data access approach or ‘analytic trick’ challenge Teradata. Some of these are called “one-trick ponies” because while they offer some narrow but cool function or feat, they are missing thousands of features, capabilities, and the technological maturity needed for a production-class data warehouse. With due respect, often that one trick is noteworthy and may later be absorbed into Teradata’s products. Established vendors’ products are much more than one-trick ponies, but still far less capable than Teradata’s uniquely comprehensive approach.

So, looking around the market, who has challenged Teradata? Kendall Square Research, Convex, Hypercube, SGI, DB2 Parallel Query Server, Metapa, RedBrick, Informix XPS, Datallegro, Greenplum, SAP BW, HP NeoView, Sybase IQ, Oracle Parallel Server, Paraccel, DB2, Hadoop, Hive, NonStop SQL, and more. Many of these products made an initial splash and got buzz, but have faded from history; some live on in niche markets, and others are still trying to prove themselves.

The Invisible Armor -- Product Maturity [i]

Taking literary license with a famous von Moltke quote: “No product plan survives contact with the customer.”  Until each feature and function survives a couple dozen late night phone calls, the product is just a research prototype, a box of barely usable hacks.  The excitement of the one-trick pony crashes and burns moments after that emotional 2 am phone call saying “It’s not working, come into the office.”

Unlike boxers, software gets incrementally better and more capable with age. But like boxers, software gets beaten up by tough customers. It takes time and money to refine features so that they work in hundreds of circumstances. It takes customers exercising and working the software in a thousand ways, many of them unanticipated by the original laboratory.

If the customer is the hammer and Teradata is the anvil, the product is forged where they meet. Furthermore, customers demand enhancements based on their vertical industry, administrative needs, business policies, security, and so on. Software maturity comes not only with age but with thousands of earnest customer interactions, repairs, and improvements.

It’s not easy to measure product and vendor maturity, but it is a huge barrier to entry for competitors. Depending on who the challenger may be, Teradata has a 10-, 20-, even 30-year head start in parallel data warehousing. We all love to innovate, but innovation doesn’t fill a 10-year gap in real customer experience.

In boxing, you must beat the heavyweight champ decisively, not one round out of twelve. Teradata’s sales executive, Frank Triolo, told me back in 1990, “Dan, you don’t get into this business by throwing your hat in the ring. This parallel stuff is tough, really tough.” Customers hammering products are equally tough – and invaluable.

Triolo was right: customer experience always trumps marketing claims and media buzz. The stakes are high: a company’s data is its source of economic value and true enlightenment. The buzz stops here.

Dan Graham



[i] Also see Curt Monash, DBMS development and other subjects, March 18, 2013