Meter data management – how to make sure that your system meets current and future scalability needs


By Prosenjit Dutta

A Meter Data Management System (MDMS) is a key component of the investment required to deploy advanced metering infrastructure (AMI). In fact, the MDMS is arguably the most valuable piece, and the backbone, of a successful large-scale AMI deployment, because it controls, validates and cleanses core meter values before the data is made available to any other system.

While it is very important to assess and understand the functionality and features provided by an MDMS and their relevance to a particular utility implementation, it is equally critical to assess non-functional aspects such as performance and scalability. These factors can drive an entire AMI implementation off course, forcing a utility to rethink its implementation roadmap at the moment it least expects. Stress testing MDM solutions against relevant benchmarks is therefore essential to the due diligence process prior to system purchase. Some solutions perform well in pilot projects and small-scale deployments, but disaster can strike as meter volumes ramp up. Unfortunately, performing these tests as part of an MDM evaluation can be complex, time-consuming and costly, and can tie up critical resources; performing similar tests against multiple vendors amplifies these constraints immensely. This vetting process is nevertheless necessary to ensure that the utility deploys the best system for its business needs.

Hence, utilities should seriously consider performing scalability and stress testing on MDM systems before making selections or planning upgrades, and are advised not to depend solely on benchmarks performed and presented by the vendor companies: actual results in a production environment can vary widely and may prove unsatisfactory.

Performance testing of an MDMS should cover the key stages of data import from multiple sources, VEE (Validation, Editing and Estimation) of the imported data, conversion to billing-ready formats, and final export. Depending on the current and projected size of the utility's meter population, performance testing should be executed in production-simulated environments. Utilities should also plan for stress testing of the MDMS: varying the loads and number of meters to identify possible points of failure or unsatisfactory performance, then taking corrective steps based on those findings.
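The VEE stage described above can be illustrated with a minimal sketch. This is a hypothetical example, not any vendor's actual implementation: the `Read` structure, the `MAX_KWH` plausibility limit and the linear-interpolation estimation rule are all assumptions chosen for clarity, assuming hourly kWh interval data.

```python
# Hypothetical sketch of a VEE (Validation, Editing and Estimation)
# pass over hourly interval reads. Names and rules are illustrative,
# not drawn from any commercial MDMS.
from dataclasses import dataclass
from typing import Optional

MAX_KWH = 50.0  # assumed plausibility ceiling for one hourly interval


@dataclass
class Read:
    meter_id: str
    hour: int              # hour index within the day (0-23)
    kwh: Optional[float]   # None marks a missing interval


def vee_pass(reads: list[Read]) -> list[Read]:
    """Validate each read; estimate gaps from valid neighbours."""
    out: list[Read] = []
    for i, r in enumerate(reads):
        kwh = r.kwh
        # Validation: reject negative or implausibly large values.
        if kwh is not None and not (0.0 <= kwh <= MAX_KWH):
            kwh = None
        # Estimation: fill a gap by averaging the nearest valid
        # neighbours, falling back to whichever side is available.
        if kwh is None:
            prev = next((x.kwh for x in reversed(out) if x.kwh is not None), None)
            nxt = next((x.kwh for x in reads[i + 1:] if x.kwh is not None), None)
            if prev is not None and nxt is not None:
                kwh = (prev + nxt) / 2
            else:
                kwh = prev if prev is not None else nxt
        out.append(Read(r.meter_id, r.hour, kwh))
    return out
```

In a real MDMS the validation rules are configurable and the estimation methods far richer (historical profiles, like-day substitution); the point here is only that every imported read passes through this kind of per-interval work, which is why VEE throughput dominates performance testing.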

Benchmark testing for any MDM solution should, at a minimum, address the following critical AMI-related scenarios, which help better evaluate a vendor's performance and scalability claims:

  • Benchmark 1: Receiving, validating, estimating and storing interval and register data
  • Benchmark 2: Querying, aggregating, validating and exporting time-based rate billing determinants
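The core step of Benchmark 2 can be sketched as follows. The time-of-use schedule (on-peak hours 7-22, off-peak otherwise) and the function names are assumptions for illustration only, not any tariff or vendor export format:

```python
# Hedged sketch of aggregating validated hourly interval reads into
# time-of-use (TOU) billing determinants. The rate schedule below is
# an assumed example tariff.
def rate_period(hour: int) -> str:
    """Map an hour of the day to an assumed TOU rate period."""
    return "on_peak" if 7 <= hour < 22 else "off_peak"


def billing_determinants(intervals: list[tuple[int, float]]) -> dict:
    """Sum kWh per rate period from (hour, kwh) pairs for one meter."""
    totals = {"on_peak": 0.0, "off_peak": 0.0}
    for hour, kwh in intervals:
        totals[rate_period(hour)] += kwh
    return totals
```

Benchmarking this stage at scale matters because the query and aggregation cost grows with both the meter population and the interval resolution, and the export must complete within the billing window.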

These two benchmarks cover the essential stages in the meter-to-cash workflow of most utility operations, and this workflow is typically the most technically vulnerable business process in an AMI deployment. Performing these benchmark tests at a significant scale can help accelerate the implementation schedule for an MDM solution. When evaluating MDM vendors, pay close attention not only to the results of benchmark tests, but also to the key factors used to create those results.
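Running benchmarks "at a significant scale" typically means generating synthetic interval data for a ramping meter population and timing each batch, so that non-linear slowdowns surface before production. The harness below is a minimal sketch under assumed names; a real test would target the MDMS import interface rather than an in-process function:

```python
# Illustrative synthetic-load generator and ramp-test harness for
# stress testing an import/VEE stage. All names are assumptions.
import random
import time


def synthetic_day(meter_id: str) -> list[tuple[str, int, float]]:
    """One day of hourly (meter_id, hour, kwh) reads for one meter."""
    return [(meter_id, h, round(random.uniform(0.2, 3.0), 3)) for h in range(24)]


def generate_load(n_meters: int) -> list[tuple[str, int, float]]:
    """Synthetic daily reads for a population of n_meters meters."""
    reads = []
    for m in range(n_meters):
        reads.extend(synthetic_day(f"meter-{m:06d}"))
    return reads


def ramp_test(process, steps=(1_000, 10_000, 100_000)):
    """Time `process` over growing batches; a super-linear rise in
    elapsed time flags a possible point of failure."""
    results = {}
    for n in steps:
        batch = generate_load(n)
        t0 = time.perf_counter()
        process(batch)
        results[n] = time.perf_counter() - t0
    return results
```

Plotting elapsed time against meter count from such a ramp makes vendor scalability claims directly testable against the utility's projected meter population.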

The various aspects of performance testing of an MDMS are detailed in the figure below.

[Figure: Performance testing of an MDMS]

To summarise, while this point of view describes some basic procedures for ensuring the performance and scalability of MDM systems, additional scenarios are useful and relevant to certain deployments, including provisioning and configuration loads, event and alert processing, and reporting. Benchmarking analysis reduces the risk involved in selecting a vendor and supports large-scale AMI and MDM projects by demonstrating these typical scenarios at high scale, allowing a utility to focus only on the subset of additional stress tests required for its own environment. By understanding how benchmarking tests are performed and what variability common scenarios introduce into them, one can more fully understand what a prospective MDM solution can deliver and whether it will meet one's needs now and in the future.