Estimated daily cost: $12
Estimated provisioning time: 15 minutes
For more details about how this solution is built, visit the solution guide in GitHub.
An energy grid consists of energy consumers as well as various types of energy supply, trading and storage components: substations accept power load or export excess power; batteries discharge energy or store it for future use; wind farms and solar panels (self-scheduled generators), micro-turbines (dispatchable generators) and demand response bids can all be engaged to satisfy the demand from the consumers within the grid. The cost of soliciting each type of resource varies, while the capacity and physical characteristics of each resource type limit its dispatch. Given these constraints, a central challenge for the smart grid operator is deciding how much energy each type of resource should commit over a given time frame, so that the forecasted energy demand from the grid is satisfied.
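As a sketch, the operator's decision can be framed as a mixed-integer program of the following general form (the symbols are illustrative, not taken from the solution guide):

```latex
\begin{aligned}
\min_{x,\,u} \quad & \sum_{t=1}^{T} \sum_{i=1}^{N} \left( c_i \, x_{i,t} + f_i \, u_{i,t} \right) \\
\text{s.t.} \quad & \sum_{i=1}^{N} x_{i,t} = d_t, && t = 1, \dots, T \\
& \underline{x}_i \, u_{i,t} \le x_{i,t} \le \bar{x}_i \, u_{i,t}, && i = 1, \dots, N,\ t = 1, \dots, T \\
& u_{i,t} \in \{0, 1\}
\end{aligned}
```

Here $x_{i,t}$ is the energy committed by resource $i$ in period $t$, $u_{i,t}$ its on/off commitment decision, $c_i$ and $f_i$ its variable and fixed costs, $d_t$ the forecasted demand, and $[\underline{x}_i, \bar{x}_i]$ its capacity limits.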
This solution provides an Azure-based smart solution, leveraging external open-source tools, that determines the optimal energy unit commitments from various types of energy resources for an energy grid. The goal is to minimise the overall cost incurred by these commitments while satisfying the energy demand. This solution demonstrates the ability of Azure to accommodate external tools, such as Pyomo and CBC, to solve large-scale numerical optimisation problems such as mixed-integer linear programming, parallelising multiple optimisation tasks over a pool of Azure Virtual Machines managed by Azure Batch. Other involved products include Azure Blob Storage, Azure Queue Storage, Azure Web App, Azure SQL Database and Power BI.
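To make the commitment trade-off concrete, the toy example below brute-forces a tiny version of the problem in plain Python. The resource names, costs and capacities are invented for illustration; the deployed solution formulates the full problem in Pyomo and solves it with CBC at scale.

```python
from itertools import product

# Hypothetical resource catalogue: (fixed commitment cost, variable cost
# per MWh, max MWh). The figures are illustrative, not from the guide.
RESOURCES = {
    "micro_turbine": (50.0, 30.0, 40.0),
    "battery":       (10.0, 20.0, 25.0),
    "demand_bid":    (5.0,  45.0, 30.0),
}

def cheapest_commitment(demand):
    """Enumerate all on/off commitments; dispatch committed resources
    cheapest-first and keep the feasible plan with the lowest cost."""
    best = None
    names = list(RESOURCES)
    for flags in product([0, 1], repeat=len(names)):
        committed = [n for n, on in zip(names, flags) if on]
        # With linear costs and capacity caps, dispatching a fixed
        # commitment set in order of variable cost is optimal.
        committed.sort(key=lambda n: RESOURCES[n][1])
        remaining, cost, plan = demand, 0.0, {}
        for n in committed:
            fixed, var, cap = RESOURCES[n]
            take = min(cap, remaining)
            cost += fixed + var * take
            plan[n] = take
            remaining -= take
        if remaining > 1e-9:
            continue  # this commitment set cannot cover demand
        if best is None or cost < best[0]:
            best = (cost, plan)
    return best

cost, plan = cheapest_commitment(60.0)
# The battery (cheap per MWh) is filled first, the micro-turbine covers
# the rest, and the expensive demand bid stays uncommitted.
```

Exhaustive enumeration is exponential in the number of resources, which is exactly why the real solution hands the problem to a MILP solver such as CBC instead.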
Technical details and workflow
- The sample data is streamed by a newly deployed Azure Web Job. The web job uses resource-related data stored in Azure SQL Database to generate the simulated data.
- The data simulator feeds this simulated data into Azure Blob Storage and writes messages into the Storage Queue, to be used in the rest of the solution flow.
- Another Web Job monitors the Storage Queue and initiates an Azure Batch job once a message is available in the queue.
- The Azure Batch service is used together with Data Science Virtual Machines to optimise the energy supply from a particular resource type given the inputs received.
- Azure SQL Database is used to store the optimisation results received from the Azure Batch service.
- Finally, these results are visualised in a Power BI dashboard.
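The queue-triggered handoff between the simulator and the monitoring Web Job can be mimicked locally with the standard library. Here `queue.Queue` stands in for Azure Queue Storage and plain threads stand in for the Web Jobs; the deployed solution would use the Azure Storage and Batch SDKs instead, and the message fields are invented for the example.

```python
import queue
import threading

# Stand-in for Azure Queue Storage: the simulator enqueues one message
# per optimisation window; the monitor dequeues and launches a "job".
work_queue = queue.Queue()
results = []

def simulator():
    """Mimics the data-simulator Web Job: announce that new input is ready."""
    for window in ("2017-01-01T00:00", "2017-01-01T01:00"):
        work_queue.put({"window": window, "blob": f"inputs/{window}.csv"})
    work_queue.put(None)  # sentinel: no more messages

def monitor():
    """Mimics the monitoring Web Job: poll the queue, start a job per message."""
    while True:
        msg = work_queue.get()
        if msg is None:
            break
        # In the deployed solution this step would submit an Azure Batch
        # job running the Pyomo/CBC optimisation on the referenced inputs.
        results.append(f"optimised {msg['blob']}")

t_sim = threading.Thread(target=simulator)
t_mon = threading.Thread(target=monitor)
t_sim.start(); t_mon.start()
t_sim.join(); t_mon.join()
```

Decoupling the producer and consumer through a queue is what lets the monitoring job scale the Batch workload independently of how fast the simulator emits data.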
©2017 Microsoft Corporation. All rights reserved. This information is provided “as is” and may change without notice. Microsoft makes no warranties, express or implied, with respect to the information provided here. Third-party data was used to generate the solution. You are responsible for respecting the rights of others, including procuring and complying with relevant licences in order to create similar datasets.