How Big Data is turning the Smart Grid into the Smarter Grid

Monday November 4th, 2013

Analytics is already transforming the Utilities industry, and we have only just started on the journey. This was reinforced for me at the Smart Grid Australia conference held recently in Melbourne, where I was invited to be a guest panellist speaking on the topic of big data.

Smart Grid Australia is a Utilities industry event that is also open to organisations that support Utilities, to tertiary sector specialists, to government and non-government organisations, and to hardware, software and professional services firms involved in Smart Grid developments. The aim of the body is to champion the development of an effective Smart Grid infrastructure for Australia’s energy future. You can visit their website for more information.

Whilst Smart Meters are regularly hot topics on shock jocks’ radio shows (where they sound off on the old furphy that you can be irradiated by your smart meter, and other such drivel), the Smart Meter is really only a part of the much wider issue of the Smart Grid.  The Smart Grid is all about delivering a more reliable, safer, and more efficient (read “cheaper”!) energy service to electricity consumers.  The aim is to achieve this outcome by making better use of data, information and analytics.  Here are some of the examples that were discussed at the forum:

1) The USA is conducting a trial (involving IBM and the CSIRO amongst others) of what’s called a Transactive Energy Grid.  In this grid, information flows from the Utilities to the consumer and from the consumer back to the Utilities, in addition to the flow of energy.  The Utilities (starting with the generators, then the transmission network operator, the distributor and finally the retailer) will send the consumer hourly energy pricing for the next 5-7 days.
This will enable the consumer to shift some uses of electricity to cheaper time periods.  The consumer (or rather their smart meter) will send the Utilities that consumer’s projected energy use for the next 5-7 days.  This two-way flow has many benefits.  The consumer gets lower power bills by being better able to schedule optional load (e.g. pool filters, air conditioning, washing loads etc.) to times of day that have lower rates.
The utilities industry sector gets better information with which to plan generation, as well as the benefit of reduced peak loads, which lowers the requirement for infrequently used spare capacity.  Typically, 25% of the cost of electricity is due to the peak load capability that is only used about 40 hours per year.  Reducing demand during those 40 peak hours would therefore allow much of that 25% cost component to be saved.  This use of big data to create an aggregated forecast of demand and supply is several years away from production, but will be really exciting when introduced here.
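To make the load-shifting idea concrete, here is a minimal sketch of how a smart device might use a day-ahead hourly price forecast to schedule a deferrable load. The prices and the four-hour pool-filter run are invented for illustration; no real utility feed or trial API is shown here.

```python
# Hypothetical sketch: schedule a deferrable load (e.g. a pool filter)
# into the cheapest hours of a day-ahead price forecast sent by the utility.
# All prices below are made-up illustrative numbers with an evening peak.

def cheapest_hours(hourly_prices, hours_needed):
    """Return the indices of the cheapest hours in which to run the load."""
    ranked = sorted(range(len(hourly_prices)), key=lambda h: hourly_prices[h])
    return sorted(ranked[:hours_needed])

# 24 hourly prices in cents/kWh (illustrative only)
forecast = [18, 16, 15, 14, 14, 15, 20, 28, 30, 26, 22, 21,
            20, 19, 19, 22, 27, 35, 40, 38, 30, 24, 20, 18]

# A pool filter that needs to run for 4 hours picks the overnight trough
run_hours = cheapest_hours(forecast, 4)
print(run_hours)  # [2, 3, 4, 5]
```

The same ranking logic, aggregated across thousands of meters, is what would let the Utilities smooth the demand curve away from the expensive peak hours.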

2) Image analysis was discussed as part of maintenance planning for distributed electricity assets such as poles, transformers and fuses.  By analysing variation in digital images of these assets over time, maintenance activities could be planned more effectively, reducing unneeded maintenance while prioritising assets that were deteriorating more rapidly than expected.
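The core of that approach is simply comparing successive images of the same asset. The sketch below is an assumed, stripped-down version of the idea (greyscale images as flat lists of pixel intensities, with invented tolerances), not the actual pipeline discussed at the forum.

```python
# Minimal sketch (assumed approach): compare two greyscale inspection
# images of the same asset and flag it for maintenance when the fraction
# of noticeably changed pixels exceeds a threshold. Thresholds are invented.

def change_fraction(before, after, pixel_tol=10):
    """Fraction of pixels whose intensity changed by more than pixel_tol."""
    changed = sum(1 for b, a in zip(before, after) if abs(b - a) > pixel_tol)
    return changed / len(before)

def needs_inspection(before, after, threshold=0.05):
    """True if the images differ enough to suggest accelerated wear."""
    return change_fraction(before, after) > threshold

before = [100] * 100                  # last year's image (illustrative)
after = [100] * 90 + [180] * 10      # 10% of pixels changed noticeably
print(needs_inspection(before, after))  # True
```

A production system would of course align the images and use far more robust change detection, but the ranking principle (service the assets whose images are changing fastest) is the same.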

3) The Internet of Things, referring to interconnected devices sharing information between them, was a key topic of conversation. The digitisation of assets will deliver a richness of data that has not been possible before and will lead to a clearer understanding of the performance of the grid. Some potential use cases were discussed, including SCADA data analysis to determine failures in earth shielding causing intermittent voltage drops.  Identifying these situations and rectifying them promptly will save lives, so it’s got a very high payoff for both staff and customers of utilities.
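As a rough illustration of that SCADA use case, here is a hedged sketch that scans a voltage trace for brief sags below a nominal band. The 240 V nominal, the 10% sag threshold and the sample readings are all assumptions for illustration, not values from any real SCADA system.

```python
# Hedged sketch: scan a SCADA voltage trace for readings that sag below
# a nominal band -- the kind of intermittent drop that might point to a
# failing earth shield. Nominal voltage, threshold and data are invented.

NOMINAL = 240.0
SAG_LIMIT = 0.9 * NOMINAL  # flag readings more than 10% below nominal

def find_sags(readings):
    """Return (index, voltage) pairs for readings below SAG_LIMIT."""
    return [(i, v) for i, v in enumerate(readings) if v < SAG_LIMIT]

trace = [240, 239, 241, 205, 240, 238, 210, 239, 240]
print(find_sags(trace))  # [(3, 205), (6, 210)]
```

In practice the interesting signal is the *pattern* of such sags over time and across feeders, which is where the big data analysis comes in.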

4) One utility talked about their usage pattern analysis and their ability to detect outliers in usage patterns.  One practical use of this approach is the detection of houses used to grow marijuana, as these houses have decidedly fixed usage patterns that are easily identified once usage data has been standardised and normalised. The revenue saved by the utility (as these houses almost never pay their bills!!), coupled with the public service benefit, has made this use case a very profitable as well as a very effective one.
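The idea described above can be sketched very simply: once each household's usage profile is normalised, a near-constant (low-variance) profile stands out from normal households, whose usage swings between overnight troughs and morning and evening peaks. The profiles and threshold below are invented for illustration; the utility's actual method was not detailed.

```python
# Illustrative sketch: flag hourly usage profiles that are abnormally flat.
# A low coefficient of variation (stdev / mean) stands in for the
# "decidedly fixed usage pattern" described above. Data and the 0.1
# threshold are assumptions, not the utility's real model.

import statistics

def is_suspiciously_flat(hourly_kwh, cv_threshold=0.1):
    """Flag a profile whose coefficient of variation is abnormally low."""
    mean = statistics.mean(hourly_kwh)
    if mean == 0:
        return False  # an empty house is flat too, but not suspicious
    cv = statistics.stdev(hourly_kwh) / mean
    return cv < cv_threshold

# Normal household: overnight trough, morning and evening peaks
normal_house = [0.3, 0.2, 0.2, 0.2, 0.3, 0.5, 1.2, 1.5, 0.8, 0.6, 0.5, 0.5,
                0.6, 0.6, 0.7, 0.9, 1.4, 2.0, 2.2, 1.8, 1.2, 0.8, 0.5, 0.4]
# Grow house: heavy, near-constant load around the clock
grow_house = [2.1, 2.0, 2.1, 2.0, 2.1, 2.1, 2.0, 2.1, 2.0, 2.1, 2.0, 2.1] * 2

print(is_suspiciously_flat(normal_house))  # False
print(is_suspiciously_flat(grow_house))    # True
```

A real deployment would also normalise for seasonality and household size, but the core signal is exactly this lack of the daily rhythm ordinary households exhibit.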

As the utility industry evolves towards the Smarter Grid, we will no doubt see more such use cases for analytics.

David Stewardson is a Senior Consultant in the Teradata Solutions Group. He has a very strong technical background and business acumen, with over 23 years’ experience in the Data Warehouse business, specialising in Business Intelligence. During his extensive career, he has worked in 6 countries and across 8 different industries (including Mining, Finance and Insurance, Utilities and Telecoms), and has been responsible for managing teams of varying sizes from five up to 150 in previous Business Analysis, Project Manager, Program Manager and Program Director roles. You can also follow Dave on Twitter at @d_stewardson.

