Where can I get datasets of electricity consumption from tens of smart electric meters?

  • Background

    Hello everyone! I am from China and will graduate from college this year, so I have an important graduation thesis, which involves detecting anomalous smart electric meters using data mining. That is to say, the task is to recognize which meters are anomalous and which are normal, based on the data recorded by the meters. It sounds interesting but difficult. The problem is that I cannot find a dataset about smart electric meters that includes power, voltage, current, electricity consumption, etc. Although I have downloaded some datasets related to smart electric meters from UCI, DataHub, gov.cn and Kaggle, my teacher told me that the data does not match real smart electric meters. So I came up with the idea of finding an open forum and asking for your help.

  • Requirements for the datasets
    1. The data should include power, voltage, current, electricity consumption (kWh), a timestamp, etc. It would be better if more information about the meters is also provided.

    2. I need data from a number of meters, not just one. I once downloaded a dataset that records the electricity consumption of only a single household, and I cannot build a mathematical model based on just one meter's data. So I look forward to finding such datasets, as well as to your tips and suggestions.

Finally, I hope to get your help and suggestions; my email is gdzenghaihong@gmail.com. Thank you!

Hi @ZengHaihong

You could try to contact https://teddinet.org, which is a network of UK energy projects – some of the network partners may have, or know about, such data (in a suitably anonymised form).

–Ewan

@ewan_klein Thank you for your help! I will contact them as soon as possible.
–ZengHaihong

We are doing a pilot with TEDDINET for Frictionless Data. You can follow our progress here: https://github.com/frictionlessdata/pilot-dm4t

As part of DM4T (Data Management for TEDDINET), Open Knowledge International (http://okfn.org/) are working with Julian Padget and colleagues to pilot the use of Frictionless Data specifications and tooling (http://frictionlessdata.io/) for TEDDINET datasets. The goal of the pilot is to demonstrate this approach to preparing and publishing research data to facilitate greater re-use. This pilot is being conducted in the open on GitHub (https://github.com/frictionlessdata/pilot-dm4t).

As one example, a dataset containing Electrical Load Measurements from the REFIT project (http://www.refitsmarthomes.org/) has been “packaged” using our tooling: https://github.com/frictionlessdata/pilot-dm4t/tree/master/refit-cleaned. In the process, we made a structured version of the data dictionary found in the dataset’s README and stored it in a file called “datapackage.json”. We did not need to alter the CSVs that comprise the dataset.
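
For illustration, the datapackage.json is essentially a machine-readable data dictionary: it lists each CSV resource and declares a Table Schema (field names and types) for it. The sketch below uses the Python datapackage library with placeholder resource and field names rather than the actual REFIT schema, and shows how such a descriptor can be built, validated, and written out:

```python
# Minimal sketch of building a datapackage.json descriptor with the Python
# `datapackage` library (pip install datapackage).
# NOTE: the resource and field names below are illustrative placeholders,
# not the real REFIT column names.
from datapackage import Package

descriptor = {
    "name": "example-electrical-load-measurements",
    "resources": [
        {
            "name": "house-1",          # one resource per CSV file
            "path": "House_1.csv",
            "schema": {
                "fields": [
                    {"name": "Time", "type": "datetime"},
                    {"name": "Aggregate", "type": "integer"},
                    {"name": "Appliance1", "type": "integer"}
                ]
            }
        }
    ]
}

package = Package(descriptor)
print(package.valid)                  # True if the descriptor conforms to the spec
package.save('datapackage.json')      # write the descriptor next to the CSVs
```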

Over the past several years, we have developed several tools to work with the Data Package format. For instance, our Good Tables tool (https://github.com/frictionlessdata/goodtables-py) can be used to check both that the data is structurally valid and that it conforms to the data types and formats specified in the datapackage.json file. Using the datapackage.json, we can also automatically import the dataset into an SQL database (https://github.com/frictionlessdata/tableschema-sql-py), Google's BigQuery (https://github.com/frictionlessdata/tableschema-bigquery-py), or Python pandas (https://github.com/frictionlessdata/tableschema-pandas-py). The recently developed R library, datapkg (https://github.com/ropensci-archive/datapkg), can also be used to automatically import the full, multi-CSV dataset into R for further analysis: https://github.com/frictionlessdata/pilot-dm4t/blob/master/refit-cleaned/example.R
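
To give a concrete flavour of that workflow, here is a minimal sketch, assuming the Python goodtables and datapackage libraries and a placeholder resource name (not the actual pilot identifiers), that validates a package and then reads one of its resources into a pandas DataFrame:

```python
# Hedged sketch: validate a data package with goodtables, then load one of its
# resources into pandas. The resource name 'house-1' is a placeholder.
import pandas as pd
from goodtables import validate
from datapackage import Package

# Check structural validity and the types/formats declared in datapackage.json
report = validate('datapackage.json', preset='datapackage')
print(report['valid'])

# Read a single CSV resource, with values cast to the types in its schema
package = Package('datapackage.json')
resource = package.get_resource('house-1')
df = pd.DataFrame(resource.read(keyed=True))
print(df.head())
```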

We look forward to sharing more outputs of this pilot as time goes on.