Two recent articles in Wired made me think about where the Web is heading and the role that data will play in it. The first was an interview with Reid Hoffman in which he described the current focus on data, and Big Data in particular, as Web 3.0. The second was an article on the growing prevalence of A/B testing.
For me, the idea that data is Web 3.0 refers to data being the enabling platform for the “applications” that will be associated with Web 3.0, such as the semantic web, pervasive personalisation, intelligent search and behavioural advertising, among others.
A central component of most of these Web 3.0 applications will be A/B testing of options identified by the analysis of detailed multi-structured data. This testing will drive intensive, incremental refinement of cross-channel interactions, coupled with major iterative refinements.
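To make the A/B testing idea concrete, here is a minimal sketch (my own illustration, not from either Wired article, with made-up numbers) of the statistical core of such a test: a two-proportion z-test comparing the conversion rates of two variants.

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B experiment.

    conv_a / conv_b: conversions observed in each variant.
    n_a / n_b: visitors exposed to each variant.
    Returns the z-statistic; roughly, |z| > 1.96 indicates
    significance at the 5% level (two-sided).
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference
    p = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Illustrative numbers: variant B converts 5.5% of 10,000 visitors
# versus 5.0% for variant A -- a difference this small is not yet
# significant at this sample size (z is about 1.6).
z = ab_test_z(500, 10000, 550, 10000)
```

The point of operationalising this is that the test runs continuously against live traffic, so the winning variant can be promoted as soon as the evidence supports it.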
This will require an increased focus on operationalising the analytics of “Big Data”: removing the latency in its collation, integration and analysis, and making it available to tactical decision makers. The methodologies, processes and tools for operationalising Big Data analytics, delivering things such as security, auditability and repeatability, are still under development, as are the enhancements and integration of existing tools. For want of a better term, I think of this capability as Enterprise Analytic Planning, with the implied parallel to ERP intended. I am curious how long it will take for truly integrated, end-to-end Enterprise Analytic tools that support real-time analysis of integrated multi-structured data to become available, and for their adoption to be pervasive.