Real-time data acquisition causes disruption to analytics for SAP R/3

A series on what is next in analytics on SAP R/3 – part 3

Most engineers do not like big words. However, our newest “near-real-time” software has caused disruptive change, at least for my team and for everything we have been working on for so many years.

Why is it disruptive?

Identifying and capturing newly created and changed records in the more than 25,000 SAP R/3 tables is not an easy task. My team has spent many years optimizing what data warehouse people call changed-data capture (“CDC”). SAP R/3 does not process all transactions in the same way, and as a result, different business processes in R/3 require different CDC approaches. This is illustrated by the fact that SAP themselves had to develop a dedicated technical framework, the “business extractors,” to enable data acquisition for their Business Warehouse. We have chosen a different route, using generated ABAP programs that select data directly from the core SAP R/3 tables.
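To make the contrast with replication concrete later on, here is a minimal Python sketch of one application-level CDC approach, delta selection on a change-date field. The table, rows, and field names are illustrative, not actual R/3 structures:

```python
from datetime import datetime

# Illustrative in-memory "table"; real R/3 tables differ per business
# process, which is why no single selection rule covers all of them.
documents = [
    {"docno": "1000", "changed_at": datetime(2024, 1, 1, 8, 0)},
    {"docno": "1001", "changed_at": datetime(2024, 1, 2, 9, 30)},
    {"docno": "1002", "changed_at": datetime(2024, 1, 3, 11, 15)},
]

def extract_delta(table, last_run):
    """Select records created or changed since the previous extraction run."""
    return [row for row in table if row["changed_at"] > last_run]

# Only rows touched after the last run are picked up.
delta = extract_delta(documents, datetime(2024, 1, 2, 0, 0))
```

This style of extraction only works for tables that carry a reliable change timestamp; many R/3 tables do not, which is exactly why different processes need different CDC approaches.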

However, with the latest technique, data replication, the CDC issue disappears. Data replication makes all of the advanced data-acquisition engineering that we (and SAP, for that matter) have been working on redundant. With the new technique, the changes themselves become the starting point: the “change logs” of the database underneath SAP R/3 are now the source of the data feeds. Reading the change logs makes our data replication a watertight CDC approach.
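To illustrate why log reading is watertight, here is a minimal Python sketch of replaying change-log entries against a target table. The log format and row contents are invented for illustration; the real mechanism reads the database's own change log:

```python
# Hypothetical change-log entries: (operation, table, full row image).
change_log = [
    ("INSERT", "VBAK", {"vbeln": "0001", "netwr": 100}),
    ("UPDATE", "VBAK", {"vbeln": "0001", "netwr": 120}),
    ("INSERT", "VBAK", {"vbeln": "0002", "netwr": 50}),
]

def apply_changes(log, target):
    """Replay change-log entries in order against a keyed target table."""
    for op, table, row in log:
        rows = target.setdefault(table, {})
        if op == "DELETE":
            rows.pop(row["vbeln"], None)
        else:  # INSERT and UPDATE both upsert the full row image
            rows[row["vbeln"]] = row
    return target

target = apply_changes(change_log, {})
```

Because every insert, update, and delete passes through the log before it is committed, no change can slip past the replication feed: that is the property the application-level approaches have to work so hard to approximate.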

However, this new acquisition method delivers more than just reliable CDC. It delivers real business value to R/3 users and is an important catalyst for making SAP ERP data useful.

Operational reports and managerial analytics from a single source

I don’t believe that there will ever be a company of substantial size that runs one ‘single’ transaction system. Therefore an integrated data warehouse is the only viable way forward. However, since the data warehouse sits downstream of these transaction systems, a data-transfer interface is needed. And some reports, especially the operational ones, need fresh data. This is exactly what data replication brings to the party, making it an essential technical component!

Flexibility to change and easy to grow

Data replication moves raw, granular data from the upstream source system to the downstream target. The result is that the replicated data is identical in source and target. It makes sense to replicate complete tables with all columns, a principle sometimes referred to as “Touch-It, Take-It”*. This methodology adds flexibility: as the source system changes, we automatically bring all of that data into the warehouse. As your business expands, adding a new source is, from a logical viewpoint, nothing more than adding a table and adding columns for use in reports. And this is an in-database operation that can be done without touching SAP.
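A tiny Python sketch of the “Touch-It, Take-It” principle: copy full row images rather than a fixed projection of columns, so a schema change in the source needs no change to the replication step. The table and column names here are illustrative:

```python
def replicate_full_rows(source_rows, target):
    """Copy every column of every row: no projection, and no column
    list to maintain when the source schema changes."""
    for row in source_rows:
        target.append(dict(row))
    return target

target = []
# Initial source schema.
replicate_full_rows([{"id": 1, "amount": 10}], target)
# The source later gains a 'currency' column; the replication code is
# untouched, yet the new attribute arrives in the target automatically.
replicate_full_rows([{"id": 2, "amount": 20, "currency": "EUR"}], target)
```

Contrast this with a hand-maintained column mapping, where every source-side schema change would force a change to the extraction interface as well.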

Use Teradata and remove the burden from the SAP servers

Raw, replicated data must be put in context. But that can now be done in Teradata, because that is where a copy of the data is stored. And work done in Teradata no longer needs to be done in SAP! Add to this that the replication technique intrinsically spreads the load over the day, and you have another reason why replication significantly reduces the burden on SAP R/3 systems compared with the traditional batch-oriented approach using ETL tools or SAP BW.

Disruption is good!

In my previous post I made reference to a ten-year-old white paper by Neil Raden in which he explained that data needs to be integrated with historical facts and reference data so that it can be analyzed. In his words: “real-time analytics cannot exist without the integrated data and context a data warehouse provides.” In my next post I will write about how my team’s SAP expertise and ingenuity overcame the technical complexities of near-real-time analytics on SAP R/3.

* Note: The ‘touch-it, take-it’ technique is not feasible when the data acquisition technique is a batch operation (ABAP, FTP). That would put significantly more burden on the R/3 application server and require much more initial configuration to deal with the R/3-specific metadata of the different installations.


Category: Other

About Patrick Teunissen

Patrick Teunissen is the Engineering Director at Teradata responsible for the Research & Development of the Teradata Analytics for SAP® Solutions at Teradata Labs in the Netherlands. He is the founder of NewFrontiers, which was acquired by Teradata in May 2013.

One thought on “Real-time data acquisition causes disruption to analytics for SAP R/3”

  1. Jacques Adriaansen

    Always interesting to share ideas and insights, and I definitely want to know more about this, Patrick. As you know, I’m also quite skeptical about the prospects of “one ERP for all”. To be honest, I’m also skeptical about the prospects of “one online data warehouse, fed in real time by the various ERP or other information systems”. On the other hand, my skepticism in this area can only be eliminated by showing me flawlessly working systems delivering this promise.
    I raise my eyebrows when I read the words “real-time” and “analytics” in one sentence, especially when we are talking about heavy analytics, so to my pleasure I read “analytics” in your title and “near-real-time” in your first sentence… I wouldn’t have expected differently. Heavy analysis needs to be done on a snapshot, and making the snapshot shouldn’t take too much time. Any acceleration on this subject in the vastly complex area of the SAP R/3 (and any other huge ERP system) data model is welcome.
    In order to be able to find the answers to practically all “ad hoc” questions, the replicated data has to be granular. And that granular data also has to be put in context. Given the complexity of the data model, that is a task that takes a lot of research. Your experts and ours have done a lot of work and a great job here.
    Let’s keep on talking and exchanging ideas!

