




Optimal Economic Control of Energy Storage 
Student(s):

Supervisors:
Marius Schmitt, Benjamin Flamm, Paul Beuchat 
Description:
Motivation
In modern energy markets, commodity prices fluctuate on a daily, weekly, and seasonal basis. Prices typically follow roughly predictable patterns, such as low nighttime prices due to reduced electricity demand. Market participants that can store energy can engage in arbitrage: buying and storing electricity when prices are low, then releasing and selling it when prices are high. In Switzerland, this idea is already commonly used in pumped-storage hydroelectric power plants. Determining optimal arbitrage policies allows more energy storage to be used, thereby increasing the adoption of renewable energy sources and reducing society's dependency on fossil fuels.
Description
In this project, we seek to optimize the operation of energy storage units with respect to uncertain future prices. The main objective is to design algorithms for this problem that scale more favorably with the system dimension than state-of-the-art methods based on discretization of the state space. To this end, we intend to explore approximate stochastic dynamic programming (ASDP) and/or policy-based stochastic model predictive control (MPC). Initially, we intend to gain intuition into the structure of the problem by analyzing a low-dimensional example. Knowledge of the structure of optimal policies and value functions can then inform the choice of policies in MPC or of basis functions for more sophisticated ASDP. The developed algorithms will be validated using a realistic model of several real-world, interconnected pumped-storage units, as well as a hydrogen gas storage plant. This project will be conducted in cooperation with Alpiq, who will provide models for gas and pump storage units as well as historical energy price data.
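As a minimal illustration of the storage arbitrage problem, the deterministic, fully convex special case (perfectly known prices, a single reservoir, no efficiency losses) reduces to a linear program. The sketch below uses hypothetical prices and storage parameters chosen for illustration, not the Alpiq models or data:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical hourly day-ahead prices (CHF/MWh): low at night, peaks morning/evening.
p = np.array([20, 18, 15, 14, 16, 25, 40, 55, 60, 50, 45, 42,
              40, 38, 36, 38, 45, 58, 62, 55, 45, 35, 28, 22], dtype=float)
T = len(p)
S, P, s0 = 10.0, 2.0, 5.0   # capacity (MWh), power limit (MWh/h), initial level

# Decision u_t: energy bought (+) or sold (-) in hour t; cost = sum_t p_t * u_t.
# Storage level s_t = s0 + cumsum(u) must stay within [0, S] at every hour.
L = np.tril(np.ones((T, T)))            # cumulative-sum (lower-triangular) matrix
A_ub = np.vstack([L, -L])               # s0 + L u <= S   and   -(s0 + L u) <= 0
b_ub = np.concatenate([np.full(T, S - s0), np.full(T, s0)])

res = linprog(c=p, A_ub=A_ub, b_ub=b_ub, bounds=[(-P, P)] * T, method="highs")
profit = -res.fun                       # negative optimal cost = arbitrage profit
```

The same structure extends to separate charge/discharge variables with different efficiencies; convexity is lost once unit-commitment effects (one of the possible extensions below) enter the cost.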
Main steps
 Preliminary step: Choose an appropriate price model and select the parameters based on real data. This step can be based on results of a previously completed student project.
 Literature review: Read and understand methods for optimal control in the presence of uncertainty, i.e., approximate stochastic dynamic programming and/or policy-based stochastic MPC.
 Implementation: One or both approaches, depending on the scope of the project (SA/MA).
 ADP: Formulate the stochastic DP problem for a low-dimensional test system. Can results on the structure of optimal policies be derived analytically? Implement a suitable ADP algorithm and compute the optimal policies for the test system to obtain a benchmark result.
 MPC: Choose a suitable policy parametrization and implement a stochastic MPC based on these policies. Solve the optimal control problem for the test system. How does MPC compare to ADP in terms of performance and computational tractability?
 Case study: Based on the results obtained for the test system, refine the implementation of the most promising algorithm. For a set of test systems, evaluate how computational and control performance scale as the system dimension is increased. Finally, apply the chosen algorithm to a realistic model of several real-world, interconnected pumped-storage units.
 Possible extensions, if time permits: We intend to start with simple models for energy storage units, but the following extensions are of practical relevance and can also be explored within steps 3 and 4:
 Non-convex operation costs, in particular unit commitment costs
 Redundant actuators (e.g. pumps, turbines) with different, nonconstant efficiencies
 Theoretical, provable optimality of certain types of policies, e.g. threshold policies
 Final Presentation: At the end of the project, the student will present their work at the Automatic Control Laboratory in a 25-minute seminar.
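The exact-DP benchmark mentioned in the ADP step above can be sketched as backward value iteration on a discretized toy system: storage level on a grid and a two-state (low/high) Markov price model. All numbers below (grid, prices, transition probabilities) are illustrative assumptions, and it is precisely this grid-based approach whose cost explodes with the state dimension, motivating the approximate methods this project targets:

```python
import numpy as np

# Toy setup: storage level discretized to n_s grid points; price follows a
# two-state Markov chain (low/high). Action per hour: discharge, idle, or charge 1 MWh.
n_s, horizon = 11, 24
levels = np.linspace(0.0, 10.0, n_s)          # storage grid (MWh)
prices = np.array([20.0, 60.0])               # low / high price (CHF/MWh)
Pr = np.array([[0.8, 0.2],                    # price-state transition probabilities
               [0.3, 0.7]])
actions = np.array([-1.0, 0.0, 1.0])          # MWh bought (+) or sold (-) per step
ds = levels[1] - levels[0]

V = np.zeros((n_s, 2))                        # terminal value function: 0
policy = np.zeros((horizon, n_s, 2), dtype=int)
for t in reversed(range(horizon)):            # backward induction over the horizon
    V_new = np.full((n_s, 2), -np.inf)
    for i, s in enumerate(levels):
        for k, price in enumerate(prices):
            for a_idx, a in enumerate(actions):
                s_next = s + a
                if s_next < levels[0] - 1e-9 or s_next > levels[-1] + 1e-9:
                    continue                   # action would leave the storage bounds
                j = int(round((s_next - levels[0]) / ds))
                # Stage reward: revenue when selling (a < 0), cost when buying (a > 0),
                # plus expected continuation value over the next price state.
                q = -price * a + Pr[k] @ V[j]
                if q > V_new[i, k]:
                    V_new[i, k], policy[t, i, k] = q, a_idx
    V = V_new
```

Even this toy problem sweeps every (level, price, action) triple at every stage; with d interconnected reservoirs the grid grows as n_s^d, which is the curse of dimensionality that ASDP and policy-based MPC are meant to sidestep. The computed policy also exhibits the threshold-like structure mentioned in the extensions (e.g., discharge when the price is high and the reservoir is full).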
The scope of the project can be adapted to a certain extent depending on the type of project (SA/MA) and your personal interests and expertise. Please contact Ben Flamm flammb@control.ee.ethz.ch, Marius Schmitt schmittm@control.ee.ethz.ch, and/or Paul Beuchat beuchatp@control.ee.ethz.ch for further information.
References
 Approximate Dynamic Programming: de Farias, Daniela Pucci, and Benjamin Van Roy. "The linear programming approach to approximate dynamic programming." Operations Research 51.6 (2003): 850–865.
 Links between MPC and ADP: Bertsekas, Dimitri P. "Dynamic programming and suboptimal control: A survey from ADP to MPC." European Journal of Control 11.4 (2005): 310–334.
 MPC using affine policies: Goulart, Paul J., Eric C. Kerrigan, and Jan M. Maciejowski. "Optimization over state feedback policies for robust control with constraints." Automatica 42.4 (2006): 523–533.
 References detailing energy price and energy storage models are provided by Alpiq and will be made available at the beginning of the project.
Further Information

Professor:
John Lygeros

Project characteristics:
Type:
Type of work:
Prerequisites: Dynamic Programming, Model Predictive Control

Number of students:
Status: taken

Project start: Semester: Fall 2016
