Abstract

Residential buildings involve several stochastic parameters, such as renewable energy production, outdoor air conditions, and occupants' behavior, that are hard to model and predict accurately, some of which are unique to each specific building. This increases the complexity of developing a generalizable optimal control method that can be transferred to different buildings. Rather than hard-coding human knowledge into the controller (as rules or models), the controller can be given the ability to learn, over time and on its own, how to maintain optimal operation in each specific building. This research proposes a model-free control framework based on Reinforcement Learning that accounts for the stochastic hot water use behavior of occupants, solar power generation, and weather conditions, and learns to balance energy use, occupant comfort, and water hygiene in a solar-assisted space heating and hot water production system. A stochastic offline training procedure is proposed to provide the agent with prior experience in a safe simulation environment and to ensure occupants' comfort and health once the algorithm begins online learning in the real house. To enable a realistic assessment without disturbing the occupants, weather conditions and hot water use behavior are experimentally monitored in three case studies in different regions of Switzerland, and the collected data are used in simulations to evaluate the proposed control framework against two rule-based methods. Results indicate that the proposed framework can achieve energy savings of 7% to 60%, mainly by adapting to solar power generation, without violating comfort or compromising occupants' health.
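For illustration only, the following is a minimal, self-contained sketch of the general idea described above, not the authors' implementation: a tabular Q-learning agent is pre-trained offline on a crude stochastic tank simulator and rewarded for balancing grid energy use (cheaper when solar generation is high), occupant comfort, and water hygiene. All state discretizations, dynamics, and penalty values are hypothetical assumptions chosen only to make the example runnable.

# Minimal Q-learning sketch (illustrative assumptions throughout, not the paper's method).
import random

N_TEMP_BINS, N_PV_BINS, N_ACTIONS = 10, 3, 2   # tank temperature bins, PV level bins, heater off/on
Q = [[[0.0] * N_ACTIONS for _ in range(N_PV_BINS)] for _ in range(N_TEMP_BINS)]
ALPHA, GAMMA, EPSILON = 0.1, 0.95, 0.1         # learning rate, discount factor, exploration rate

def simulate_step(temp, pv, action):
    """Very crude tank model: heating raises temperature, stochastic hot-water draws lower it."""
    draw = random.random() < 0.3                      # stochastic occupant hot-water draw
    temp = temp + (2 if action else 0) - (3 if draw else 1)
    temp = max(0, min(N_TEMP_BINS - 1, temp))
    pv_next = random.randrange(N_PV_BINS)             # stochastic solar generation level
    # Reward penalises grid energy use (cheaper when PV is high), discomfort, and hygiene risk.
    energy_pen = action * (1.0 - 0.4 * pv)
    comfort_pen = 5.0 if temp < 3 else 0.0            # water too cold for the occupant
    hygiene_pen = 5.0 if temp < 1 else 0.0            # proxy for a hygiene (e.g. Legionella) risk
    return temp, pv_next, -(energy_pen + comfort_pen + hygiene_pen)

def choose_action(temp, pv):
    if random.random() < EPSILON:                     # epsilon-greedy exploration
        return random.randrange(N_ACTIONS)
    return max(range(N_ACTIONS), key=lambda a: Q[temp][pv][a])

def train(episodes=2000, steps=96):                   # offline pre-training in simulation
    for _ in range(episodes):
        temp, pv = 5, 1
        for _ in range(steps):
            a = choose_action(temp, pv)
            temp2, pv2, r = simulate_step(temp, pv, a)
            Q[temp][pv][a] += ALPHA * (r + GAMMA * max(Q[temp2][pv2]) - Q[temp][pv][a])
            temp, pv = temp2, pv2

if __name__ == "__main__":
    train()
    print("Greedy heater action per (temperature bin, PV bin):")
    for t in range(N_TEMP_BINS):
        print(t, [max(range(N_ACTIONS), key=lambda a: Q[t][p][a]) for p in range(N_PV_BINS)])

In this sketch the learned greedy policy tends to heat when the tank is cold or solar generation is high, which mirrors, in a highly simplified way, the adaptation to solar power generation reported in the abstract.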
