In the power grid of the 21st century, distributed energy resources (DERs) will be widespread, if not ubiquitous.
No longer will the utility or system operator have sole control over the grid’s three operational variables – load, generation and power flow. As a result, to meet demand, maintain grid stability and protect equipment, the distribution grid must respond to changing conditions in real time rather than through a centralised command-and-control model like SCADA.
In other words, the distribution grid must become an intelligent network that is capable of responding quickly and locally to changing conditions as they occur and operating more autonomously based on a centrally controlled set of parameters.
These capabilities will be crucial for utilities to manage the technical and business challenges brought on by continually increasing amounts of distributed and renewable generation. There is simply no other way to effectively manage the grid amid these profound changes.
Extending visibility to the edge
A lot has changed in just the last few years. All the action in the industry today is unfolding within the distribution grid, particularly at the edge with distributed generation, energy storage, electric vehicle charging, load control/demand response, energy efficiency, smart homes and smart cities. Yet, utilities have little to no visibility in this realm with current technology.
To maintain stability of the distribution network, there is a critical need for visibility, management and the ability to act across the distribution grid. Previously, these capabilities were only required at the generation and transmission levels down to the substation. The only practical way to achieve this outcome is through distributed edge processing.
Today, utilities are faced with maintaining distribution infrastructure without the necessary level of funding to do so as more customers generate their own electricity.
In addition, the emerging transactive energy marketplace at the grid’s edge will involve many financial transactions that will fall outside of the utility’s traditional business and financial processes. If utilities are to survive and thrive amid these disruptive and competitive challenges, they will need to turn these threats into business opportunities and leverage their relationships with customers to be the key player in the ‘transactive grid’ and the enablement of decentralised, local power pools.
Edge intelligence-enabled meters and other devices on the distribution network provide the platform to manage these transactions and power flows in real time, keeping the utility relevant and in control of the transactions. As the incumbents, utilities will be the logical choice to perform this function, but only if they are ready to provide the services when required. If they are not ready, other disruptive players will happily fulfil that role. Implementation of edge processing capabilities – and the business agility it provides to manage these transactions – is the utility’s best defence against displacement or obsolescence.
The shifting data paradigm
Distributed intelligence, as in the distribution of computing power, analytics, decisions and action away from a central control point, is not a new concept. From smart phones running mobile apps to supply chain management solutions to multiplayer online gaming, distributed intelligence and computing have proven to be consistently effective in managing large, complex and data-intensive systems and organisations. The Internet of Things (IoT) is accelerating this trend. So why is edge processing such a hot topic at utilities today?
There are three main drivers:
• Data volumes
• Latency and resilience
• Technical capabilities
In both the general IT industry and the utility industry, the rate of data creation has consistently outpaced growth in network capacity and data storage by orders of magnitude over the past decade or more. For example, 10 years ago, typical residential electric meters produced one value per month for customer billing. Over the past decade, meters have progressed through hourly data, then 15- or 5-minute load profile data to today, where some distributed meter applications utilise 40 or more values per second on every residential meter. That represents an increase from one value per month to roughly 3.5 million data points per day per meter.
What about network capacities? A decade ago, typical Local Area Networks (LANs) operated at 10 Mbps. Today, typical LANs operate at 10 to 100 Gbps. That represents an increase in network speed of three to four orders of magnitude, while meter data growth over the same period has been eight orders of magnitude. To make matters worse, meter vendors are expected to release standard residential metering devices that deliver sensor sampling data (such as voltage and current) thousands of times per second, representing a further two to three orders of magnitude of data growth.
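The arithmetic behind those comparisons (figures taken from the text; the 30-day month is an approximation) can be checked directly:

```python
import math

SECONDS_PER_DAY = 24 * 60 * 60   # 86 400
DAYS_PER_MONTH = 30              # approximation, fine for orders-of-magnitude comparisons

# Meter data: one value per month then, 40 values per second now
old_values_per_s = 1 / (DAYS_PER_MONTH * SECONDS_PER_DAY)
new_values_per_s = 40.0
daily_values = new_values_per_s * SECONDS_PER_DAY          # ~3.5 million per day
data_growth = new_values_per_s / old_values_per_s

print(f"values per day at 40/s: {daily_values:,.0f}")
print(f"meter data growth: {math.log10(data_growth):.1f} orders of magnitude")

# Network speed: 10 Mbps then, 10-100 Gbps now
print(f"LAN growth: {math.log10(10e9 / 10e6):.0f} to {math.log10(100e9 / 10e6):.0f} orders of magnitude")
```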
The only practical way to manage these data volume challenges is through distributed intelligence, changing the challenge of granular data transfer into the delivery of relevant outcomes only.
Latency and resilience
Many operational applications for tomorrow’s grid will require voltage corrections, power flow adjustments and other control measures to occur in near real time across a large number of devices.
These outcomes are simply not attainable using models that rely on data transfer and subsequent command-and-control between a central system and large volumes of devices in the field.
It is not economically practical to provide multiple redundant, high-speed network paths to so many devices distributed throughout an entire utility service territory; connectivity dropouts between devices and central command are therefore inevitable. The only practical solution to this dilemma is to distribute intelligence and processing power to the edge devices so they do not suffer from unmanageable latencies and can continue to act autonomously during interruptions in central connectivity.
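As an illustration, here is a minimal sketch (hypothetical class, parameter names and thresholds – not any vendor’s API) of an edge device that keeps applying local voltage corrections using the last centrally supplied parameters, whether or not the central link is up:

```python
import time

class EdgeVoltageController:
    """Illustrative edge controller: the central system periodically tunes the
    operating parameters, but the control decision is always made locally, so a
    connectivity dropout does not interrupt regulation."""

    def __init__(self, nominal_v=240.0, band=0.05, heartbeat_timeout=5.0):
        self.target_v = nominal_v            # centrally supplied setpoint
        self.band = band                     # +/-5% acceptable band
        self.heartbeat_timeout = heartbeat_timeout
        self.last_heartbeat = time.monotonic()

    def on_central_update(self, new_target_v):
        # Called when the central system pushes new parameters.
        self.target_v = new_target_v
        self.last_heartbeat = time.monotonic()

    def connected(self):
        # Stale heartbeat means the central link is down; local control continues.
        return time.monotonic() - self.last_heartbeat < self.heartbeat_timeout

    def step(self, measured_v):
        # Local decision loop, identical whether central connectivity is up or not.
        low = self.target_v * (1 - self.band)
        high = self.target_v * (1 + self.band)
        if measured_v < low:
            return "raise_tap"
        if measured_v > high:
            return "lower_tap"
        return "hold"
```

The design point is that the central system only adjusts `target_v` and the band; the decision loop runs at the edge either way, so latency to the control centre never sits on the critical path.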
Technical capabilities
It is only in the last two to three years that technology and communication networks capable of supporting true distributed edge processing across the electric distribution network (e.g. meters, reclosers, capacitor controls, fault recorders) have become technically viable and affordable. Today, it is cost effective to deploy widespread edge processing solutions with net positive business value.
This is the other reason edge processing is a hot topic. Previously, it was only a conceptual or academic discussion; now it is cost effective and practical, even compared with the cost of current-generation AMI and utility networks.
Putting distributed intelligence to work
Distributed intelligence in meters and other edge devices opens up a broad array of new use cases to operate a more efficient, reliable and safer grid, and to transform the customer experience. Here are some key examples.
High-impedance connections are a problem for utilities because they are difficult to detect and result in technical losses, voltage problems and customer complaints. In some cases, they can even lead to fires and other safety issues. With the ability to process and analyse one-second data and run apps directly on the meter, smart meters can measure impedance ubiquitously to detect and identify these ‘hot spots’ on the lower-voltage network and gauge their severity.
This enables utilities to proactively identify and fix problems that lead to losses and customer complaints before they become potentially serious safety problems.
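One way such a check could work – a simplified sketch, assuming paired one-second voltage and current samples at the meter – is to fit the source impedance the meter sees and watch for it drifting upward over time:

```python
def estimate_source_impedance(voltages, currents):
    """Least-squares fit of V = V_open - Z * I over paired samples.
    A rising Z over successive windows can flag a degrading, high-impedance
    connection upstream of the meter (illustrative method, not a vendor app)."""
    n = len(voltages)
    mean_v = sum(voltages) / n
    mean_i = sum(currents) / n
    # Slope of V against I; expected negative, since more current drops more voltage.
    cov = sum((i - mean_i) * (v - mean_v) for i, v in zip(currents, voltages))
    var = sum((i - mean_i) ** 2 for i in currents)
    slope = cov / var
    return -slope  # impedance in ohms
```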
Instead of relying on sporadic event and meter-tamper alarms to detect energy theft, distributed intelligence enables meters to detect energy theft by analysing the flow of current and voltage levels, greatly improving accuracy while reducing false positives.
Meters communicate peer-to-peer to continually identify and analyse voltage fluctuations local to the transformer. The meters determine whether those fluctuations are accounted for by the loads measured through the meters on that transformer. Loads on the low-voltage secondary that are not accounted for by any meter indicate theft. Meters also continually monitor their own outward impedance and can detect meter bypassing and tampering through changes in that impedance.
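The underlying check is an energy balance on the transformer. A simplified sketch (illustrative loss fraction and threshold; a real implementation would also align samples in time and track the residual statistically):

```python
def unmetered_load(transformer_kw, meter_kw, loss_fraction=0.02, threshold_kw=0.5):
    """Compare transformer-level power with the sum of downstream meter readings.
    A persistent residual beyond expected technical losses suggests an unmetered
    (possibly stolen) load on the low-voltage secondary.
    loss_fraction and threshold_kw are illustrative values, not field-calibrated."""
    expected = sum(meter_kw) * (1 + loss_fraction)
    residual = transformer_kw - expected
    # Return the estimated unmetered load, or 0.0 if within the noise threshold.
    return residual if residual > threshold_kw else 0.0
```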
It is now possible to accurately disaggregate loads in the customer premises in real time with no special equipment behind the meter and no back-office statistical modelling that requires historical data sets. Distributed intelligence enables the meter to analyse kWh, kVAR, kVA, Amp and Volt data by phase, all collected at 1-second or faster intervals and stored in the meter. An application resident on the meter analyses this data continuously. Start-up and shut-down events are recorded and characterised through repetition, leading to a set of identified end-uses known by their distinctive signatures. This ability to disaggregate loads at scale opens the door to an entirely new and scalable portfolio of utility energy efficiency services for customers, and to opportunities to optimise demand response and energy efficiency programmes.
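In simplified form, the first stage of such on-meter disaggregation – detecting start-up and shut-down transitions in the one-second power series – might look like the following (a sketch of the general technique; real signatures would also use kVAR, current and per-phase data, as the text notes):

```python
def detect_transitions(power_w, min_step_w=100.0):
    """Flag abrupt start-up/shut-down events in a 1-second real-power series.
    Repeated events of similar magnitude can then be clustered into appliance
    signatures. min_step_w is an illustrative noise threshold."""
    events = []
    for t in range(1, len(power_w)):
        delta = power_w[t] - power_w[t - 1]
        if abs(delta) >= min_step_w:
            events.append((t, "on" if delta > 0 else "off", abs(delta)))
    return events
```

For example, a 1 500 W heating element switching on at second 2 and off at second 5 appears as two matching events, whose repetition over days builds up its signature.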
In the face of mounting technical and business challenges associated with disruptive technologies like distributed energy resources, utilities must rethink their strategies and operational approaches to assure the stability of the grid and the success of their business. By strategically implementing edge intelligence into their distribution networks, utilities will not only help assure the reliability, efficiency and safety of the grid, but they will also stake out an enviable competitive position as we march down the path to the transactive energy marketplace of the future. MI
About the author
Pieter Coetzee is senior director of marketing for Itron’s Electricity Business Line and resides in Paris. He can be contacted at email@example.com.