AWS Database Blog
How power utilities analyze and detect harmonics issues using power quality and customer usage data with Amazon Timestream: Part 2
In the first post of the series, we demonstrated how to use an Amazon Timestream database and its built-in time series functionalities to interpolate data and calculate the correlation between customer energy usage and power quality issues. In this post, we show you how to build a power quality analysis Proof of Concept (PoC) using different AWS services, such as AWS IoT Core, Amazon Simple Storage Service (Amazon S3), Timestream, and AWS CloudFormation.
In a typical power quality analysis implementation, the source data normally comes from different systems.
- Power quality data is collected using third-party power quality meters and related applications that perform data acquisition, processing, and storage.
- Commercial and industrial (C&I) meter data is retrieved from advanced metering infrastructure (AMI) and meter data management (MDM) systems.
For more information, refer to Application integration in utility smart metering using AWS.
In this post, we focus on how customer meter data and harmonics meter data can be directly ingested from AWS IoT Core to a Timestream database. As part of the system integration for power quality harmonics analysis, you can use an AWS Lambda function to filter and route the selected meter data information into a Timestream database for harmonics analytics purposes.
Solution overview
The following diagram illustrates the overall data flow for this power quality analysis PoC.
The workflow steps are as follows:
- The Harmonics Meter Data Simulator is used to generate power quality data and send the data to AWS IoT Core every 15 minutes. This data includes 1 month of actual production metering data with identifying information removed. The C&I Meter Data Simulator is used to simulate meter data (interval consumption data kWh, voltage, ambient temperature, power quality harmonics data) and send it to AWS IoT Core over the Message Queuing Telemetry Transport (MQTT) protocol.
- The meter data generated by smart meters is ingested to AWS IoT Core. AWS IoT Core is a managed cloud platform that lets connected devices easily and securely interact with cloud applications and other devices. AWS IoT Core offers secure device connectivity and device communication, with low latency and low overhead. AWS IoT Core can support billions of devices and trillions of messages, and can process and route those messages to AWS endpoints and to other devices reliably and securely. With AWS IoT Core, you can continuously ingest, filter, transform, and route the data streamed from connected devices. You can take actions based on the data and route it for further processing and analytics.
- When the data reaches AWS IoT Core, rules and actions are used to further filter the data and route it to data processing components. The processed data is stored into the Timestream database for power quality analysis.
- The Timestream database uses a built-in scheduler and interpolation and correlation calculation functions to calculate power quality correlations on a regular basis (such as daily or weekly).
- With the Timestream multi-measure feature, you can store multiple measures against a single timestamp in the same table row. However, as of this writing, AWS IoT Core doesn't support writing multi-measure values from AWS IoT Core rule actions to Timestream tables. A Timestream scheduled query is therefore used to convert the single-measure values into multi-measure values with the built-in scheduler.
In the following sections, we explain the process of setting up the different components in detail: the simulator, AWS IoT Core, Timestream database tables, and scheduled queries for single-measure to multi-measure conversion and for correlation calculations.
Prerequisites
This solution requires an active AWS account with the permission to create and modify AWS Identity and Access Management (IAM) roles along with the following services enabled:
- Amazon Timestream
- AWS IoT Core
- Amazon S3
- Amazon VPC
- AWS CloudFormation
You should also be familiar with AWS IoT services, deploying AWS resources using AWS CloudFormation, and Python (version 3.6 or later). This walkthrough assumes an advanced level of technical depth.
Note that deploying AWS Cloud resources incurs costs, and you are responsible for the costs incurred for the deployed AWS resources. Make sure you follow the clean-up steps at the end of this post once you have completed the PoC.
Download the quick start resources
You can download the quick start resources from the following repository onto your own machine or an Amazon Elastic Compute Cloud (Amazon EC2) instance with a Python development environment set up. The download includes the CloudFormation templates to deploy the AWS Cloud resources discussed in this post, the sample data for the customer meter and harmonics meter, and a Python simulator script.
Set up the simulator
In the downloaded files, you will find a Python script named datagenerator.py and a requirement.txt file in the DataGenerator directory. Install the required Python modules for the simulator:
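A typical installation from the DataGenerator directory looks like the following (use whichever pip maps to your Python 3 environment):

```bash
# Install the simulator's dependencies listed in requirement.txt
cd DataGenerator
pip3 install -r requirement.txt
```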
In order for the simulator MQTT client to connect securely to AWS IoT Core, you need to configure the AWS IoT device certificates with the right AWS IoT policy attached. If you are not familiar with AWS IoT device certificate configuration, refer to steps 1–4 in the following workshop lab. Create a certs directory under the directory where the Python script is located, and copy the downloaded certificates into it with the following names:
- Root CA1 – rootCA.pem
- Private key – privateKey.pem
- Certificate – certificate.pem
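Assuming the certificates were downloaded with their default AWS-generated names (the source file names below are illustrative; substitute your own), the copy might look like:

```bash
# Create the certs directory next to datagenerator.py and rename the
# downloaded certificate files to the names the simulator expects
mkdir certs
cp ~/Downloads/AmazonRootCA1.pem               certs/rootCA.pem
cp ~/Downloads/xxxxxxxxxx-private.pem.key      certs/privateKey.pem
cp ~/Downloads/xxxxxxxxxx-certificate.pem.crt  certs/certificate.pem
```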
Deploy AWS IoT rules and Timestream tables
To simplify the steps involved in creating the AWS Cloud resources and the required configurations, we have created CloudFormation templates. In the downloaded files, you will see iot_ts.yaml. Deploy the iot_ts.yaml template by providing a stack name, such as pq-iot-ts-v1. After the stack is deployed successfully, you will see that the AWS IoT rules are created with the rule actions to ingest the filtered data into Timestream tables, as described in the architecture diagram shown earlier.
Simulate customer meter data and harmonics meter data
The simulator datagenerator.py reads the data from the CSV files in the data directory and securely publishes the customer metering data (energy consumption: kwh, voltage: voltage, Received Signal Strength Indicator: rssi, and power factor: pf) and harmonics data to AWS IoT Core MQTT topics.
Before running the simulator, you can subscribe to the MQTT topic filter dpu/+ using the AWS IoT MQTT test client. This topic filter matches both the customer meter data and the harmonics meter data published by the simulator to AWS IoT Core.
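To see why the single filter dpu/+ catches both feeds, here is a minimal sketch of MQTT wildcard matching; the topic names in the examples are illustrative, and the actual topics are defined in the quick start code:

```python
def matches_filter(topic: str, topic_filter: str) -> bool:
    """Return True if an MQTT topic matches a filter containing the
    '+' (single-level) or '#' (multi-level) wildcard."""
    t_levels = topic.split("/")
    f_levels = topic_filter.split("/")
    for i, f in enumerate(f_levels):
        if f == "#":  # '#' matches this level and everything below it
            return True
        if i >= len(t_levels):
            return False
        if f != "+" and f != t_levels[i]:  # '+' matches exactly one level
            return False
    return len(t_levels) == len(f_levels)

# 'dpu/+' matches any topic exactly one level below 'dpu'
print(matches_filter("dpu/customer-meter", "dpu/+"))   # True
print(matches_filter("dpu/harmonics-meter", "dpu/+"))  # True
print(matches_filter("dpu/a/b", "dpu/+"))              # False (two levels deep)
```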
Run the simulator by specifying the AWS IoT Core endpoint, as shown in the following code. To locate the endpoint, navigate to the AWS IoT Core console, choose Settings in the navigation pane, then Device data endpoint, then Endpoint. For more information, refer to AWS IoT Core – data plane endpoints.
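The invocation takes roughly the following shape; the -e endpoint flag name is an assumption here, so check the script's help output if your copy of datagenerator.py defines its arguments differently:

```bash
# Replace the value with your own account-specific device data endpoint
python3 datagenerator.py -e xxxxxxxxxxxxxx-ats.iot.us-east-1.amazonaws.com
```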
The preceding command starts publishing the customer meter data and harmonics meter data to AWS IoT Core. We use the following MQTT topics:
If you want to further experiment with the data by updating the CSV files, you can choose to ingest either harmonics data or customer meter data by using the flag -o HARMONICS or -o CUSTOMER. This simulator code is for educational purposes only and should not be used in production environments.
After the simulator has run, validate the data in the Timestream tables. Check that data exists in the following tables:
| meter_id | measure_name | time | measure_value::double |
| --- | --- | --- | --- |
| Customer_Meter_3 | kwh | 2022-09-01 04:00:00.000000000 | 23.52 |
| Customer_Meter_3 | kwh | 2022-09-01 04:15:00.000000000 | 23.48 |
| Customer_Meter_3 | kwh | 2022-09-01 04:30:00.000000000 | 23.52 |
| Customer_Meter_3 | kwh | 2022-09-01 04:45:00.000000000 | 23.52 |
| Customer_Meter_3 | kwh | 2022-09-01 05:00:00.000000000 | 23.48 |
| Customer_Meter_3 | kwh | 2022-09-01 05:15:00.000000000 | 23.52 |
| Customer_Meter_3 | kwh | 2022-09-01 05:30:00.000000000 | 23.52 |
| Customer_Meter_3 | kwh | 2022-09-01 05:45:00.000000000 | 23.52 |
| Customer_Meter_3 | kwh | 2022-09-01 06:00:00.000000000 | 23.52 |
| Customer_Meter_3 | kwh | 2022-09-01 06:15:00.000000000 | 23.48 |
| harmonic_meter_series_id | measure_name | time | measure_value::double |
| --- | --- | --- | --- |
| Harmonic_Meter_ONE_VTHD_Phase_A | harmonics_value | 2022-09-01 04:00:00.000000000 | 12.0 |
| Harmonic_Meter_ONE_VTHD_Phase_A | harmonics_value | 2022-09-01 04:15:00.000000000 | 12.0 |
| Harmonic_Meter_ONE_VTHD_Phase_A | harmonics_value | 2022-09-01 04:30:00.000000000 | 12.0 |
| Harmonic_Meter_ONE_VTHD_Phase_A | harmonics_value | 2022-09-01 04:45:00.000000000 | 13.0 |
| Harmonic_Meter_ONE_VTHD_Phase_A | harmonics_value | 2022-09-01 05:00:00.000000000 | 12.0 |
| Harmonic_Meter_ONE_VTHD_Phase_A | harmonics_value | 2022-09-01 05:15:00.000000000 | 12.0 |
| Harmonic_Meter_ONE_VTHD_Phase_A | harmonics_value | 2022-09-01 05:30:00.000000000 | 12.0 |
| Harmonic_Meter_ONE_VTHD_Phase_A | harmonics_value | 2022-09-01 05:45:00.000000000 | 12.0 |
| Harmonic_Meter_ONE_VTHD_Phase_A | harmonics_value | 2022-09-01 06:00:00.000000000 | 12.0 |
| Harmonic_Meter_ONE_VTHD_Phase_A | harmonics_value | 2022-09-01 06:15:00.000000000 | 12.0 |
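To run this spot check yourself, a query of the following shape works in the Timestream query editor. The "dpu" database name appears later in this post, but the single-measure table name below is an assumption, so adjust it to match the tables the CloudFormation stack created:

```sql
-- Latest rows ingested for the customer meters
SELECT meter_id, measure_name, time, measure_value::double
FROM "dpu"."customer-meter-data"
ORDER BY time DESC
LIMIT 10
```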
Deploy scheduled queries, multi-measure tables, and correlation table in the Timestream database
After the IoT data is ingested successfully into the single-measure tables in Timestream, the next set of resources can be deployed into the AWS Cloud. Deploy the CloudFormation stack scheduledquery.yaml by specifying a stack name, such as pq-ts-correlation-v1.
The data sent by the meter data simulator is multi-measure in nature. Timestream supports multi-measure data storage against a single timestamp. Multi-measure records store your time-series data in a more compact format in Timestream’s memory and magnetic stores, which helps lower data storage costs. Also, the compact data storage lends itself to writing simpler queries for data retrieval, improves query performance, and lowers the cost of queries.
However, at the time of writing, the AWS IoT Core rule action doesn't support the ingestion of multi-measure data directly into Timestream tables. As a workaround, you can run a scheduled query that combines multiple single-measure records into one multi-measure record per timestamp in the Timestream table. To simplify the configuration, we have automated this in the CloudFormation template, and no further action is required to complete this step.
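Conceptually, the scheduled query pivots the single-measure rows into one row per meter and timestamp. A simplified sketch of that conversion follows; the table and measure names are taken from the sample output in this post, and the query the stack actually deploys may differ:

```sql
-- Pivot four single-measure rows into one multi-measure row
SELECT meter_id,
       time,
       MAX(CASE WHEN measure_name = 'rssi'    THEN measure_value::double END) AS rssi,
       MAX(CASE WHEN measure_name = 'pf'      THEN measure_value::double END) AS pf,
       MAX(CASE WHEN measure_name = 'kwh'     THEN measure_value::double END) AS kwh,
       MAX(CASE WHEN measure_name = 'voltage' THEN measure_value::double END) AS voltage
FROM "dpu"."customer-meter-data"
GROUP BY meter_id, time
```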
After you run the CloudFormation template, validate the scheduled query and multi-measure table results.
We first validate the scheduled query. On the Timestream console, choose Scheduled queries in the navigation pane under Management Tools. Confirm that the Last runtime column displays a date/time, which indicates that the query ran successfully and is ready for the next scheduled run.
Use the following query to validate the output of the scheduled query in the "dpu"."customer-meter-data-multi" table:
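A simple SELECT against the multi-measure table is enough to confirm the pivoted output; for example:

```sql
-- Each row now carries all four measures for one meter and timestamp
SELECT meter_id, measure_name, time, rssi, pf, kwh, voltage
FROM "dpu"."customer-meter-data-multi"
ORDER BY time
LIMIT 10
```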
| meter_id | measure_name | time | rssi | pf | kwh | voltage |
| --- | --- | --- | --- | --- | --- | --- |
| Customer_Meter_9 | Customer-Meter | 2022-09-01 04:00:00.000000000 | -77 | 0.91 | 101.44 | 285.95 |
| Customer_Meter_9 | Customer-Meter | 2022-09-01 04:15:00.000000000 | -75 | 0.99 | 101.76 | 286.16 |
| Customer_Meter_9 | Customer-Meter | 2022-09-01 04:30:00.000000000 | -75 | 0.97 | 102.08 | 285.45 |
| Customer_Meter_9 | Customer-Meter | 2022-09-01 04:45:00.000000000 | -76 | 0.93 | 102.08 | 285.62 |
| Customer_Meter_9 | Customer-Meter | 2022-09-01 05:00:00.000000000 | -75 | 0.92 | 101.76 | 285.64 |
| Customer_Meter_9 | Customer-Meter | 2022-09-01 05:15:00.000000000 | -75 | 0.98 | 102.08 | 287.22 |
| Customer_Meter_9 | Customer-Meter | 2022-09-01 05:30:00.000000000 | -76 | 0.93 | 101.76 | 285.43 |
| Customer_Meter_9 | Customer-Meter | 2022-09-01 05:45:00.000000000 | -77 | 0.99 | 101.6 | 285.86 |
| Customer_Meter_9 | Customer-Meter | 2022-09-01 06:00:00.000000000 | -77 | 0.93 | 101.44 | 286.75 |
| Customer_Meter_9 | Customer-Meter | 2022-09-01 06:15:00.000000000 | -75 | 0.97 | 101.6 | 287.12 |
Perform correlation calculations
After the data is cleaned up, a Pearson correlation calculation is applied. Depending on your business requirements, a Spearman correlation can be applied instead.
As described in Part 1, incoming meter data can have missing intervals, which complicates the calculations. The query first uses linear interpolation to fill in any missing interval data for both the harmonic meter and the regular customer meters. Developers can also use other interpolation algorithms, such as cubic spline, last sampled value, or even a constant value.
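The two steps, filling gaps by linear interpolation and then computing a Pearson coefficient, can be sketched in plain Python. The numbers below are illustrative, not the sample data set:

```python
def linear_interpolate(values):
    """Fill None gaps in a regularly spaced series by linear interpolation.
    Assumes the first and last samples are present."""
    result = list(values)
    for i, v in enumerate(result):
        if v is None:
            lo = i - 1  # nearest known neighbor on the left
            hi = next(j for j in range(i + 1, len(result)) if result[j] is not None)
            step = (result[hi] - result[lo]) / (hi - lo)
            result[i] = result[lo] + step * (i - lo)
    return result

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

kwh = linear_interpolate([23.5, None, 23.6, 23.7])  # gap filled with 23.55
vthd = [12.0, 12.0, 12.5, 13.0]
print(round(pearson(kwh, vthd), 3))  # → 0.968 for these illustrative numbers
```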
We use the following query to interpolate the data and perform the correlation calculations:
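The full query ships with the quick start; in outline, it builds an interpolated series per meter with Timestream's time series functions and then applies the CORR aggregate, along these lines (a simplified sketch with assumed table names, not the deployed query):

```sql
WITH customer AS (
  -- Interpolate each customer meter's kWh onto a regular 15-minute grid
  SELECT meter_id, t.time AS time, t.value AS kwh
  FROM (
    SELECT meter_id,
           INTERPOLATE_LINEAR(
             CREATE_TIME_SERIES(time, kwh),
             SEQUENCE(min(time), max(time), 15m)) AS kwh_series
    FROM "dpu"."customer-meter-data-multi"
    GROUP BY meter_id
  ) CROSS JOIN UNNEST(kwh_series) AS t (time, value)
),
harmonics AS (
  -- Same interpolation for each harmonics series
  SELECT harmonic_meter_series_id, t.time AS time, t.value AS vthd
  FROM (
    SELECT harmonic_meter_series_id,
           INTERPOLATE_LINEAR(
             CREATE_TIME_SERIES(time, measure_value::double),
             SEQUENCE(min(time), max(time), 15m)) AS vthd_series
    FROM "dpu"."harmonics-meter-data"
    GROUP BY harmonic_meter_series_id
  ) CROSS JOIN UNNEST(vthd_series) AS t (time, value)
)
-- Pearson correlation of each meter against each harmonics series
SELECT c.meter_id, h.harmonic_meter_series_id, CORR(c.kwh, h.vthd) AS result
FROM customer c
JOIN harmonics h ON c.time = h.time
GROUP BY c.meter_id, h.harmonic_meter_series_id
```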
The following table contains the output (using sample data).
| meter_id | harmonic_meter_series_id | result |
| --- | --- | --- |
| Customer_Meter_1 | Harmonic_Meter_ONE_VTHD_Phase_A | -0.1575810995751959 |
| Customer_Meter_1 | Harmonic_Meter_ONE_VTHD_Phase_B | -0.1503472793143921 |
| Customer_Meter_1 | Harmonic_Meter_ONE_VTHD_Phase_C | -0.14820913468681032 |
| Customer_Meter_10 | Harmonic_Meter_ONE_VTHD_Phase_A | 0.0467746498221175 |
| Customer_Meter_10 | Harmonic_Meter_ONE_VTHD_Phase_B | 0.047512685080802 |
| Customer_Meter_10 | Harmonic_Meter_ONE_VTHD_Phase_C | 0.024546804070675585 |
| Customer_Meter_11 | Harmonic_Meter_ONE_VTHD_Phase_A | 0.028516057883515952 |
| Customer_Meter_11 | Harmonic_Meter_ONE_VTHD_Phase_B | 0.002778196233820788 |
| Customer_Meter_11 | Harmonic_Meter_ONE_VTHD_Phase_C | -0.014712870331358782 |
| Customer_Meter_2 | Harmonic_Meter_ONE_VTHD_Phase_A | 0.2735744789391751 |
You can download the quick start source code and try it out now.
Clean up
When you’re comfortable with all the steps to complete the correlation and understand the benefits discussed in this post, you can delete the resources you created. To avoid ongoing costs, remove all the resources by deleting the CloudFormation stacks (pq-iot-ts-v1 and pq-ts-correlation-v1).
Conclusion
In this post, we built a quick start with sample code to demonstrate how to use AWS IoT Core, Timestream, and its built-in time series functionalities to calculate the correlation between customer energy usage and power quality issues. In addition, we showed how Timestream multi-measure records and scheduled queries make it easier and more cost-effective to store and analyze different time series data for your advanced IoT applications.
In real-world utility solutions, developers could extend this PoC and use actual metering data for customer energy usage and power quality harmonics correlation calculations. This approach simplifies multi-dimensional correlation analysis without the need for complex infrastructure setup or licensing commitments, and requires minimal additional development effort.
If you have questions or feedback, share them in the comments section.
About the Authors
Sanjeevi Rangan is a Senior Partner Solutions Architect and an IoT specialist based in Melbourne, Australia. He helps AWS partners in the APJ region to succeed in their solution building journey. Sanjeevi has 15 years of experience in software design and development. He is passionate about solving complex IoT problems and has led some of the strategic digital transformation engagements for customers across multiple geographies.
Sreenath Gotur is leading the Partner Solutions Factory on the Solution Architecture team at AWS, based out of Charlotte, NC. Prior to joining AWS, he was heading enterprise data management, enterprise data services, and data innovation portfolios with a large financial firm. Sreenath has a special interest in data and analytics, document databases, time series databases, and graph databases. In his spare time, he enjoys spending quality time with his family.
Bin Qiu is a Global Partner Solution Architect focusing on ER&I at AWS. He has more than 20 years’ experience in the energy and power industries, designing, leading and building different smart grid projects, such as distributed energy resources, microgrid, AI/ML implementation for resource optimization, IoT smart sensor application for equipment predictive maintenance, EV car and grid integration, and more. Bin is passionate about helping utilities achieve digital and sustainability transformations.
Sneh Bhatt is a Data Architect with Deloitte Consulting experienced in digital transformation, cloud database development on AWS, data integration, data architecture, data migration, data warehousing and big data, with 20 years of experience leading projects for LS-HC, FSI, and retail customers. Sneh is passionate about leading, architecting, and building data initiatives on the cloud.