

Optimal Economic Control of Energy Storage



Marius Schmitt, Benjamin Flamm, Paul Beuchat


In modern energy markets, commodity prices fluctuate on daily, weekly, and seasonal time scales. Prices typically follow roughly predictable patterns, such as low nighttime prices caused by reduced electricity demand. Market participants with the ability to store energy can engage in arbitrage: buying and storing electricity when prices are low, then releasing and selling it when prices are high. In Switzerland, this idea is already widely exploited in pumped-storage hydroelectric power plants. Determining optimal arbitrage policies allows more energy storage to be used, thereby supporting the adoption of renewable energy sources and reducing society's dependence on fossil fuels.
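If future prices were known exactly, this arbitrage problem would reduce to a linear program. The following sketch illustrates the idea; all prices and storage parameters are invented for illustration and are not Alpiq data:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical hourly prices (EUR/MWh): low at night, peaking in the evening.
prices = np.array([30, 28, 25, 24, 26, 35, 50, 60, 55, 45, 40, 38,
                   37, 36, 40, 48, 62, 70, 65, 55, 45, 38, 33, 30], dtype=float)
T = len(prices)

capacity = 10.0   # storage capacity in MWh (assumed)
power = 3.0       # charge/discharge limit in MWh per hour (assumed)
s0 = 5.0          # initial state of charge in MWh (assumed)

# Decision x_t = energy bought (+) or sold (-) in hour t.
# Minimize total cost sum_t p_t * x_t; a negative cost is arbitrage profit.
c = prices

# State of charge s_t = s0 + sum_{k<=t} x_k must stay in [0, capacity].
L = np.tril(np.ones((T, T)))                  # cumulative-sum matrix
A_ub = np.vstack([L, -L])
b_ub = np.concatenate([np.full(T, capacity - s0), np.full(T, s0)])

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(-power, power)] * T, method="highs")
profit = -res.fun
schedule = res.x   # optimal buy/sell schedule
print(f"arbitrage profit: {profit:.1f} EUR")
```

The real problem is harder because prices are uncertain, conversion losses and redundant actuators enter the dynamics, and several interconnected storage units are coupled; this deterministic single-unit LP is only the starting point.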


In this project, we seek to optimize the operation of energy storage units with respect to uncertain future prices. The main objective is to design algorithms for this problem that scale more favorably with the system dimension than state-of-the-art methods based on discretization of the state space. To this end, we intend to explore approximate stochastic dynamic programming (ASDP) and/or policy-based stochastic model predictive control (MPC). Initially, we intend to gain intuition into the structure of the problem by analyzing a low-dimensional example. Knowledge of the structure of the optimal policies and value functions can then inform the choice of policies in MPC or of basis functions for more sophisticated ASDP. The developed algorithms will be validated on a realistic model of several real-world, interconnected pumped-storage units, as well as a hydrogen gas storage plant. The project will be conducted in cooperation with Alpiq, who will provide models of the gas and pumped-storage units as well as historical energy price data.
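To illustrate the discretization-based baseline mentioned above, the following toy sketch solves a finite-horizon stochastic DP for a single storage unit with a two-state Markov price model. All numbers are invented; the point is that the value function is tabulated over a grid of states, which is exactly what scales poorly as the number of storage units grows:

```python
import numpy as np

# Toy stochastic DP for one storage unit: the state-space-discretization
# baseline whose scaling with dimension this project aims to improve.
prices = np.array([20.0, 60.0])          # low / high price levels (assumed)
P = np.array([[0.8, 0.2],                # price transition probabilities (assumed)
              [0.3, 0.7]])
levels = np.linspace(0.0, 10.0, 21)      # discretized state of charge (MWh)
actions = np.array([-2.0, 0.0, 2.0])     # sell / idle / buy (MWh per stage)
T = 24                                   # horizon length in stages

V = np.zeros((T + 1, len(levels), len(prices)))   # terminal value = 0
policy = np.zeros((T, len(levels), len(prices)))

for t in range(T - 1, -1, -1):           # backward recursion
    for i, s in enumerate(levels):
        for j, p in enumerate(prices):
            best, best_a = -np.inf, 0.0
            for a in actions:
                s_next = s + a
                if s_next < levels[0] or s_next > levels[-1]:
                    continue             # infeasible action at this state
                k = np.argmin(np.abs(levels - s_next))   # nearest grid point
                # stage reward: revenue when selling (a < 0), cost when buying
                q = -p * a + P[j] @ V[t + 1, k]
                if q > best:
                    best, best_a = q, a
            V[t, i, j] = best
            policy[t, i, j] = best_a
```

Even this tiny example already tabulates 21 x 2 states per stage; with n interconnected units the grid grows exponentially in n, which motivates approximating the value function with basis functions (ASDP) or optimizing directly over parametrized policies (stochastic MPC).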

Main steps

  1. Preliminary step: Choose an appropriate price model and select the parameters based on real data. This step can be based on results of a previously completed student project.
  2. Literature review: Read and understand methods for optimal control in the presence of uncertainty, i.e., approximate stochastic dynamic programming and/or policy-based stochastic MPC.
  3. Implementation: Implement one or both of the following approaches, depending on the scope of the project (SA/MA):
    1. ADP: Formulate the stochastic DP problem for a low-dimensional test system. Can results on the structure of the optimal policies be derived analytically? Implement a suitable ADP algorithm and compute the optimal policies for the test system to obtain a benchmark result.
    2. MPC: Choose a suitable policy parametrization and implement a stochastic MPC based on these policies. Solve the optimal control problem for the test system. How does MPC compare to ADP in terms of performance and computational tractability?
  4. Case study: Based on the results obtained for the test system, refine the implementation of the most promising algorithm. For a set of test systems, evaluate how computational and control performance scale as the system dimension is increased. Finally, apply the chosen algorithm to a realistic model of several real-world, interconnected pumped-storage units.
  5. Possible extensions, if time permits: We intend to start with simple models of energy storage units, but the following extensions are of practical relevance and can also be explored within steps 3 and 4:
    1. Nonconvex operation costs, in particular unit commitment costs
    2. Redundant actuators (e.g. pumps, turbines) with different, non-constant efficiencies
    3. Theoretical, provable optimality of certain types of policies, e.g. threshold policies
  6. Final presentation: At the end of the project, the student will present their work at the Automatic Control Laboratory in a 25-minute seminar.
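To make the MPC approach of step 3 concrete, a simple certainty-equivalence receding-horizon controller could look roughly like the sketch below: at each stage it solves a deterministic LP over a price forecast and applies only the first planned action. The AR(1) price model and all parameters are placeholders, and a policy-based stochastic MPC as intended in this project would optimize over feedback policies rather than a single open-loop plan:

```python
import numpy as np
from scipy.optimize import linprog

def mpc_step(price_forecast, s, capacity, power):
    """One receding-horizon step: plan over the forecast, return the first action."""
    H = len(price_forecast)
    L = np.tril(np.ones((H, H)))                   # cumulative-sum matrix
    A_ub = np.vstack([L, -L])                      # keep state of charge in [0, capacity]
    b_ub = np.concatenate([np.full(H, capacity - s), np.full(H, s)])
    res = linprog(price_forecast, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(-power, power)] * H, method="highs")
    return res.x[0]

# Closed-loop simulation with a placeholder AR(1) price model:
#   p_{t+1} = mu + phi * (p_t - mu) + noise
rng = np.random.default_rng(0)
mu, phi, sigma = 40.0, 0.9, 5.0                    # illustrative model parameters
capacity, power, H = 10.0, 3.0, 12                 # illustrative plant and horizon

s, p = 5.0, 40.0
profit = 0.0
for t in range(48):
    # certainty-equivalent forecast: conditional mean of the AR(1) model,
    # starting from the known current price p
    forecast = mu + phi ** np.arange(H) * (p - mu)
    a = mpc_step(forecast, s, capacity, power)
    a = np.clip(a, -s, capacity - s)               # guard against solver round-off
    profit -= p * a                                # pay when buying, earn when selling
    s += a
    p = mu + phi * (p - mu) + sigma * rng.standard_normal()
```

Comparing such a controller against the DP benchmark of step 3(i), in both achieved profit and computation time, is exactly the question posed in step 3(ii).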
The scope of the project can be adapted to a certain extent depending on the type of project (SA/MA) and your personal interests and expertise. Please contact Ben Flamm, Marius Schmitt, and/or Paul Beuchat for further information.


References

  1. Approximate Dynamic Programming: de Farias, Daniela Pucci, and Benjamin Van Roy. "The linear programming approach to approximate dynamic programming." Operations Research 51.6 (2003): 850-865.
  2. Links between MPC and ADP: Bertsekas, Dimitri P. "Dynamic programming and suboptimal control: A survey from ADP to MPC." European Journal of Control 11.4 (2005): 310-334.
  3. MPC using affine policies: Goulart, Paul J., Eric C. Kerrigan, and Jan M. Maciejowski. "Optimization over state feedback policies for robust control with constraints." Automatica 42.4 (2006): 523-533.
  4. References detailing energy price and energy storage models are provided by Alpiq and will be made available at the beginning of the project.

Further information

John Lygeros

Type of project:
Prerequisites: Dynamic Programming, Model Predictive Control
Number of students:
Status: taken
Semester: Fall 2016