By now, so much has been said about the Apache Hadoop platform for big data that the discussion itself could probably qualify as a data set for petabyte-scale analysis. And if you analyzed the sentiment, you’d see a very clear “good news, bad news” story emerge from those who have used – or tried to use – Hadoop: The platform brings tremendous value in its ability to handle massive and diverse amounts of data, but its sheer complexity makes Hadoop challenging for all but the most seasoned (and scarce) technologists and data scientists.
For the subset of companies out there whose core operations involve data engineering, this may not be a problem. But the rest of the corporate world has been stuck on the outside looking in, envious of the benefits but spooked by the intricacy. Worse yet, some firms have suffered through poorly understood, do-it-yourself Hadoop implementations that fail to leverage the investment and can even create unreliable results. As the Wall Street Journal’s Michael Hickins once put it, “data is inherently messy – it can be wrong, it can be duplicative and it can be irrelevant.”
So I’m not surprised when I see research showing that 85 percent of Fortune 500 organizations remain unable to exploit big data for competitive advantage, or that 75 percent of respondents to our own 2012 survey on big analytics say integrating technologies is an obstacle to deployments. More than three quarters of respondents in that survey felt they didn’t have the right staff to deploy, and more than half remained concerned about the complexity of big data technology.
That’s why I was proud as part of my keynote address today in San Jose at the sixth annual Hadoop Summit to announce the release of Teradata Portfolio for Hadoop, a comprehensive offering that takes the complexity and cost out of deploying and managing Hadoop. Simply stated, Teradata Portfolio for Hadoop is a single source for all things Hadoop that removes the engineering and operations guesswork for non-programmers, business analysts and business consumers looking to extract value and competitive advantage from their data in an unprecedented range of commercial environments.
You can learn a lot more in our press release, but even a quick summary makes the benefits clear. Backed by our partner Hortonworks, Teradata Portfolio for Hadoop includes flexible Apache Hadoop product platforms and software, along with best-in-class services, training and customer support and direct access to the latest technologies coming from the open source Apache Hadoop community.
Every organization is different, so Teradata Portfolio for Hadoop offers four implementation choices. If your IT department wants turnkey, ready-to-run solutions, we deliver two premium platforms: Teradata Appliance for Hadoop, a tightly integrated hardware and software appliance optimized for enterprise-class data storage and management; and Teradata Aster Big Analytics Appliance, for data staging and processing with more than 70 pre-packaged analytic functions and optional nodes of Hadoop. For engineering-led departments that opt to build their own systems but want the backing of Teradata, we’ve developed two commodity platforms for Hadoop: Teradata Software Only for Hadoop and Teradata Commodity Offering for Hadoop, the latter being a partnership with Dell for a lower cost, standard hardware configuration optimized for the Hortonworks Data Platform.
This variety of Hadoop implementations gives organizations real choices depending on their level of enterprise expertise and their budgetary constraints. And Teradata offers additional assistance with software, consulting services and customer support. We can help with installation, mentoring and knowledge transfer, operations, management, administration, backup, security processes and more to ensure end-to-end analytical solutions for enterprises needing to manage almost any type of data – be it social media, documents, text, images, audio or any variety of multi-structured data.
Exponential data growth and velocity demand an ever-faster ability to capture, store and refine data. I’m proud to say Teradata Portfolio for Hadoop expands these capabilities beyond the rarefied world of companies that can afford to hire a corps of data scientists commanding high annual salaries. “Main street” organizations now have comprehensive, intuitive and flexible options to leverage Hadoop’s tremendous power. After all, it may take a certain kind of technophile to get the most out of this week’s Hadoop Summit (and I admit I am one of those folks). But that doesn’t mean the rest of the business world shouldn’t be able to get the most out of Hadoop.