The dirty job of data provisioning

By Greenbird CEO Thorsten Heller

It’s a dirty job, but…

Do you remember the Discovery Channel TV series ‘Dirty Jobs’?

Every week the presenter, Mike Rowe, would perform the difficult, strange, or messy parts of people’s occupations. They were usually the jobs the rest of us prefer not to think about. Every episode had the same opening:

“My name's Mike Rowe, and this is my job. I explore the country looking for people who aren't afraid to get dirty — hard-working men and women who earn an honest living doing the kinds of jobs that make civilised life possible for the rest of us. Now, get ready to get dirty.”


Data provisioning is similar. It’s a boring job people would prefer not to think about… or actually do. Yet data cleaning and integration are critical for organizations that rely on data to improve services to customers or to make their businesses more efficient.

What is data provisioning (and why is it a challenge for utilities)?

Data provisioning is the process of making data available in an orderly and secure way to users, application developers, and applications.
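In practice, ‘orderly and secure’ means serving each consumer a governed view of the data rather than raw access to source systems. The short Python sketch below illustrates the idea with a hypothetical dataset and role policy; every name in it is an assumption for illustration, not part of any real product.

```python
# A minimal sketch of 'orderly and secure' provisioning: the same dataset is
# exposed to different consumers according to their role. All names, roles
# and fields here are hypothetical.
READINGS = [
    {"meter_id": "MTR-1", "kwh": 12.5, "customer": "Alice"},
    {"meter_id": "MTR-2", "kwh": 8.9,  "customer": "Bob"},
]

# Which fields each role is allowed to see (an illustrative policy).
POLICY = {
    "analyst":  {"meter_id", "kwh"},             # aggregate work, no personal data
    "operator": {"meter_id", "kwh", "customer"}, # full operational view
}

def provision(role: str) -> list[dict]:
    """Return the dataset with only the fields the given role is cleared for."""
    allowed = POLICY.get(role, set())
    return [{k: v for k, v in row.items() if k in allowed} for row in READINGS]

print(provision("analyst"))   # customer names are withheld
print(provision("operator"))  # full records
```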

Data provisioning is a demanding process for all businesses, but there are some unique challenges for utilities.

For example:

  • Utilities have geographically distributed data sources and an equally distributed workforce.
  • They have mission-critical operational systems that must be insulated from uncontrolled access.
  • They hold large quantities of confidential data that must be kept secure.

A data-powered energy revolution

The energy sector today is undergoing a revolution. And it’s a revolution powered by data.

As our energy system becomes increasingly decentralized, data from grid edge developments such as microgrids, peer-to-peer trading, electric vehicles (EVs) and distributed energy generation is essential for the smooth running of the grid.

Emerging technologies such as digital twins, artificial intelligence and machine learning can all help make utilities more efficient, but only with accessible, accurate data.

Predictive maintenance, real-time energy flow analysis, EV charging impact monitoring, apps and dashboards are all innovations with the potential to help utilities become the modern, efficient, customer-focused organizations needed in the 21st century. However, none of these technologies can be implemented without data. And this data must be ‘good’.

What do we mean by good data? Good data is consistent, complete, up-to-date and accurate. How do we achieve this? Through the ‘dirty’ job that makes innovative, data-driven energy services possible and, in turn, powers the energy revolution. The dirty job not many people, especially tech people, like to do: the dirty job of data provisioning.
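As a rough illustration, those four properties can be expressed as simple validation rules. The following Python sketch is a minimal, hypothetical example: the record schema (meter_id, reading_kwh, unit, timestamp) and the thresholds are assumptions for illustration, not a real utility data model.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical meter reading; field names and thresholds are assumptions.
reading = {
    "meter_id": "MTR-001042",
    "timestamp": datetime.now(timezone.utc) - timedelta(hours=1),
    "reading_kwh": 1534.2,
    "unit": "kWh",
}

def is_complete(rec):
    """Complete: every required field is present and non-empty."""
    required = ("meter_id", "timestamp", "reading_kwh", "unit")
    return all(rec.get(f) not in (None, "") for f in required)

def is_consistent(rec):
    """Consistent: values have the expected types and units."""
    return isinstance(rec.get("reading_kwh"), (int, float)) and rec.get("unit") == "kWh"

def is_current(rec, max_age=timedelta(hours=24)):
    """Up-to-date: the reading is recent enough to act on."""
    return datetime.now(timezone.utc) - rec["timestamp"] <= max_age

def is_accurate(rec, lower=0.0, upper=1_000_000.0):
    """Accurate (plausible): the value falls within a sane physical range."""
    return lower <= rec["reading_kwh"] <= upper

checks = (is_complete, is_consistent, is_current, is_accurate)
print(all(check(reading) for check in checks))  # True for this record
```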

Dirty data strangles innovation

Utilities are innovating and experimenting with emerging technologies. They are running pilot projects to learn and gain experience.

Implementing a pilot with artificial intelligence (AI) or machine learning is one thing. Getting those “first mover” projects into full-scale deployment and generating real business value is much more difficult. Why? Proof of Concept (PoC) projects typically use tiny data sets. The data is usually extracted manually, then compiled and prepared specifically for the pilot.

When the project is scaled up, utilities face a major problem: dirty data.

Dirty data, aka ‘rogue data’, is data that is inaccurate, incomplete or inconsistent: “rubbish in, rubbish out”, as the saying goes.

Research shows that organizations typically spend more than 80% of all IT effort just on preparing data or making ‘good’ data accessible. That means time and effort spent fixing the dirty data problem. How? Through integration, data cleansing, unification, structuring and managing access to data.
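To make that ‘fixing’ concrete, the hedged sketch below (using pandas) shows the kind of routine cleansing and unification work that consumes so much of that effort: dropping incomplete rows, removing duplicates, and converting everything to a common unit. The column names and the Wh-to-kWh conversion are illustrative assumptions, not a real utility schema.

```python
import pandas as pd

# Hypothetical raw export merged from two source systems; schema is an assumption.
raw = pd.DataFrame({
    "meter_id":  ["MTR-1", "MTR-1", "MTR-2", "MTR-3", None],
    "timestamp": ["2024-05-01T00:00"] * 5,
    "value":     [1534.2, 1534.2, 8900.0, 12.1, 55.0],
    "unit":      ["kWh", "kWh", "Wh", "kWh", "kWh"],
})

clean = (
    raw
    .dropna(subset=["meter_id"])                        # drop incomplete rows
    .drop_duplicates(subset=["meter_id", "timestamp"])  # remove duplicate readings
    .assign(timestamp=lambda df: pd.to_datetime(df["timestamp"], utc=True))
)

# Unify units: convert Wh readings to kWh so all values are comparable.
wh = clean["unit"] == "Wh"
clean.loc[wh, "value"] = clean.loc[wh, "value"] / 1000
clean.loc[wh, "unit"] = "kWh"

print(clean)
```

None of these steps is glamorous, which is exactly the point: this is the unglamorous 80%.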

The reality in most utilities today is that less than 20% of IT spending is used to innovate or deliver new functionality.

Fixing the causes of dirty data

Utility data has so much potential value. It contains an abundance of information that can be harnessed for new service innovations. However, too often it’s collected inefficiently and much of that potential value falls by the wayside.

There are many causes of dirty data, but an outdated IT landscape and the typical ‘spaghetti’ integration approach are often crucial issues that must be resolved. Ageing IT landscapes are the perfect breeding ground for dirty data. They also provide a hostile environment for any data-driven innovation.

A modern data integration system such as Greenbird’s Utilihive digital integration hub can sort out the spaghetti mess, making data accessible and usable.
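The core idea behind a digital integration hub can be sketched in a few lines: per-system adapters translate each legacy format into one canonical event, so consumers integrate once against the canonical model instead of once per source system. The Python below is a simplified, hypothetical illustration of that pattern, not Utilihive’s actual API; all payload shapes and names are assumptions.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class CanonicalReading:
    """One canonical event format shared by all downstream consumers."""
    meter_id: str
    kwh: float
    source: str

# Hypothetical per-system adapters: each maps a legacy payload to the canonical model.
def from_scada(payload: dict) -> CanonicalReading:
    return CanonicalReading(payload["tag"], payload["val"], "scada")

def from_mdm(payload: dict) -> CanonicalReading:
    return CanonicalReading(payload["meterNumber"], payload["energyWh"] / 1000, "mdm")

ADAPTERS: dict[str, Callable[[dict], CanonicalReading]] = {
    "scada": from_scada,
    "mdm": from_mdm,
}

def ingest(system: str, payload: dict) -> CanonicalReading:
    """Hub entry point: route each payload through its adapter.
    Consumers only ever see CanonicalReading, never the source formats."""
    return ADAPTERS[system](payload)

print(ingest("scada", {"tag": "MTR-1", "val": 12.5}))
print(ingest("mdm", {"meterNumber": "MTR-2", "energyWh": 8900}))
```

The value of the pattern is that adding a new source system means writing one adapter, not another tangle of point-to-point connections: that is the ‘spaghetti’ the hub removes.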

A digital integration platform enabling the energy revolution

Most utilities today understand the value of ‘good’ data. They also understand the importance of a modern, flexible, event-driven, cloud-native integration platform that handles the data provisioning challenge in an effective way.

A digital platform gives utilities back the resources they currently spend on data provisioning, enabling them to spend more time on the innovation and value creation that are so crucial to their success in the ongoing energy revolution.