The AEC industry plays a key role in climate change. According to the latest IPCC report (chapter 6), cities generate about 70% of global CO2-eq emissions. These emissions negatively impact people, infrastructure, and business through climate-related hazards. In addition, the negative impact in big cities is magnified by poor decisions in urban design, land use, building design, and human activity. By contrast, energy-efficient buildings reduce greenhouse gas emissions and cut management and operating costs, with the associated benefits of cleaner air and improved wellbeing for users.
Designing and building sustainable buildings is far more cost-effective than building first and retrofitting later, although urban regeneration and building refurbishment are also opportunities to target low carbon emissions and inclusive urban centres. Machine Learning (ML) models have begun a deep transformation of the AEC industry, as the next natural step after the BIM paradigm and parametric and generative design.
In this post I will review some Machine Learning models the AEC industry can use to tackle climate change.
Buildings vary in use, age, construction materials, dimensions, and location, so the optimal strategy varies widely with context and environment. For example, the Johan Cruijff Arena incorporates batteries from 148 electric cars to balance its energy supply. An arena of comparable size in Central America would follow a different approach, because energy there is cheaper and the arena would be located outside the city. Both venues, with the same capacity (~55,000), will be in operation for around 60 years because of the lifespan of concrete; nevertheless, they will follow different strategies to reduce energy consumption.
In the building design process, energy analysis is needed to predict energy demand. The process involves building digital models that represent the building physics and running long thermodynamic computations. In some cases the computational cost is so high that only one or two simulations can be executed during the design process, so most optimal solutions will probably never be explored.
Supervised learning models such as ensemble boosting can speed up this process by learning from previous energy analyses. The model can be trained on different buildings, each represented by features such as size, location, usage, and energy consumption. After training, the model predicts a building's energy consumption from its characteristics. The same applies to existing buildings: data produced by energy meters and home monitors can be collected and used to train supervised models that predict energy demand. This information is useful for evaluating building operation strategies, and for power companies looking to optimize the network.
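As a minimal sketch of the idea, the following trains a hand-rolled boosted-stump regressor (a toy stand-in for a library implementation such as gradient boosting) on made-up building data, then predicts the energy consumption of an unseen building from its floor area:

```python
# Toy dataset of buildings: (floor_area_m2, annual_energy_kwh) -- made-up values.
data = [(100, 120), (150, 170), (200, 230), (250, 270), (300, 340),
        (350, 380), (400, 450), (450, 480), (500, 560)]
X = [x for x, _ in data]
y = [t for _, t in data]

def fit_stump(X, y):
    """Fit a one-split regression tree (stump) minimising squared error."""
    best = None
    for s in sorted(set(X)):
        left = [t for x, t in zip(X, y) if x <= s]
        right = [t for x, t in zip(X, y) if x > s]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = (sum((t - lm) ** 2 for t in left)
               + sum((t - rm) ** 2 for t in right))
        if best is None or err < best[0]:
            best = (err, s, lm, rm)
    _, s, lm, rm = best
    return lambda v: lm if v <= s else rm

def boost(X, y, rounds=200, lr=0.3):
    """Gradient boosting: each stump fits the residuals of the ensemble so far."""
    base = sum(y) / len(y)
    pred = [base] * len(X)
    stumps = []
    for _ in range(rounds):
        resid = [t - p for t, p in zip(y, pred)]
        stump = fit_stump(X, resid)
        stumps.append(stump)
        pred = [p + lr * stump(x) for p, x in zip(pred, X)]
    return lambda v: base + lr * sum(st(v) for st in stumps)

model = boost(X, y)
prediction = model(275)  # energy estimate for a 275 m2 building
```

In practice each building would be described by many features (location, usage, age) rather than a single one, and a library implementation would be used, but the residual-fitting loop is the core of the technique.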
In some cases, a retrofit executed to reduce a building's energy consumption will not produce the expected results. It is therefore important to understand what went well and what failed, and even more critical to understand the causes of the outcome. High energy consumption in a building could stem from an old boiler or HVAC unit. Addressing the effect (high energy consumption) will not solve the problem; addressing the cause can. In supervised learning tasks such as classification, both effects and causes are required to make a successful prediction. To solve the problem, we need to understand the cause.
HVAC systems provide thermal comfort and indoor air quality for building users. An HVAC system includes heating, ventilation, and cooling or air-conditioning equipment, and conditioned air is distributed through the building via ductwork. The different equipment in the system is regulated by the controller, which adjusts the actuators based on the system's sensors and their parameters. The controller is constrained by its knowledge of the system, which in most cases comes from the sensors. For example, a boiler can only heat water at a certain rate and to a specific temperature.
The reinforcement learning process begins with an agent (the controller) that observes the environment (reads the data from the sensors) and takes an action. If the result of the action helps achieve the goal (reducing energy consumption), the agent is "rewarded"; if it does not, the agent is "penalized". The rewards and penalties adjust the weights of the neural network, and by the end of the training process the agent has learned how to achieve the goal in an optimal way.
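The observe-act-reward loop above can be sketched with tabular Q-learning (a table of action values standing in for the neural network) on a deliberately tiny thermostat problem; the room model, temperatures, and costs are all invented for illustration:

```python
import random

random.seed(0)

# Toy HVAC control problem: discretised indoor temperatures and a heater
# that is either on or off. All numbers are illustrative.
ACTIONS = (0, 1)          # 0 = heater off, 1 = heater on
TEMPS = range(15, 26)     # indoor temperature states, in deg C
TARGET = 21

def step(temp, action):
    """Toy room model: heating raises the temperature, otherwise it drifts
    down. The reward penalises both discomfort and energy use."""
    new = min(25, temp + 1) if action else max(15, temp - 1)
    comfort = abs(new - TARGET)          # distance from the setpoint
    energy = 0.5 if action else 0.0      # cost of running the heater
    return new, -(comfort + energy)

# Tabular Q-learning: the agent acts, is rewarded or penalised,
# and updates its action-value estimates accordingly.
Q = {(t, a): 0.0 for t in TEMPS for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.2

for _ in range(500):                     # episodes with random start temps
    temp = random.choice(list(TEMPS))
    for _ in range(30):
        if random.random() < eps:
            a = random.choice(ACTIONS)   # explore
        else:
            a = max(ACTIONS, key=lambda x: Q[(temp, x)])  # exploit
        new, r = step(temp, a)
        best_next = max(Q[(new, x)] for x in ACTIONS)
        Q[(temp, a)] += alpha * (r + gamma * best_next - Q[(temp, a)])
        temp = new

# The learned policy: heat when cold, switch off when warm.
policy = {t: max(ACTIONS, key=lambda a: Q[(t, a)]) for t in TEMPS}
```

A real HVAC agent would use a deep network instead of a table and a far richer state (occupancy, weather, tariffs), but the update rule is the same.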
Reinforcement learning models can perform complex tasks by maximizing the reward function in real time. Conceptually, it works like rewarding your dog with a cookie for doing the right task and penalizing it for the wrong one: reinforcement.
Reinforcement learning is the ML approach behind self-driving cars and other applications. It has been used successfully to automate HVAC systems in data centres, reducing energy consumption and making intelligent decisions about which power sources to use.
Time series analysis
Time series analysis is used for data that changes over time, with applications in finance, healthcare, network traffic, and weather analysis; for example, understanding median surface temperatures across the seasons of the year.
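The seasonal-temperature example can be made concrete with a simple moving average, which separates the long-term trend from the seasonal swing; the monthly values below are illustrative, not real measurements:

```python
# Toy monthly mean surface temperatures (deg C) over two years -- illustrative.
temps = [5, 6, 9, 13, 17, 21, 24, 23, 19, 14, 9, 6,
         6, 7, 10, 14, 18, 22, 25, 24, 20, 15, 10, 7]

def moving_average(series, window):
    """Average over a sliding window; a 12-month window removes seasonality."""
    return [sum(series[i:i + window]) / window
            for i in range(len(series) - window + 1)]

trend = moving_average(temps, 12)             # one value per 12-month window
seasonal = [t - trend[0] for t in temps[:12]]  # deviation from the annual mean
```

Here `trend` rises across the two years (the second year is warmer month by month), while `seasonal` peaks in July, illustrating how the two components are disentangled.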
Urban planning and cities
Coordinating the energy consumption of buildings at neighbourhood or city level is essential because they are connected to the same power network. Some buildings can contribute energy to the network and/or minimize demand by using alternative sources.
Building energy consumption data is a good opportunity to apply ML models to improve systems and make optimized decisions. However, although the smart city model dates from the 60s, useful data might not be available, making it difficult to shape effective sustainable strategies. Good data is critically important for creating effective machine learning models.
Transfer Learning consists of storing the "knowledge" gained while solving one problem and applying it to a different but related problem. The neural network's weights are optimized during the training process; later, the network can be packaged and reused on related problems. Reusing knowledge from previously learned problems on new tasks has the potential to improve efficiency, because the training process can be minimized. In theory, we could predict the energy consumption of a multi-purpose arena with a model trained on stadium data.
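The stadium-to-arena idea can be sketched with the simplest possible "network", a linear model trained by gradient descent: pre-train on (made-up) stadium data, then reuse the learned weights as the starting point for a short fine-tune on arena data, and compare with training from scratch under the same budget:

```python
def train(data, w=0.0, b=0.0, lr=0.01, steps=1000):
    """Plain gradient descent on a linear model y = w*x + b."""
    for _ in range(steps):
        gw = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
        gb = sum(2 * (w * x + b - y) for x, y in data) / len(data)
        w, b = w - lr * gw, b - lr * gb
    return w, b

def mse(data, w, b):
    """Mean squared error of the fitted line on a dataset."""
    return sum((w * x + b - y) ** 2 for x, y in data) / len(data)

# Hypothetical datasets: capacity (in 10,000 seats) vs annual energy (GWh).
stadiums = [(3, 7.1), (5, 11.3), (6, 13.4), (8, 17.6)]
arenas = [(4, 9.4), (5, 11.5), (7, 15.7)]

w0, b0 = train(stadiums, steps=2000)        # "pre-train" on stadium data
w_t, b_t = train(arenas, w0, b0, steps=5)   # transfer: fine-tune briefly
w_s, b_s = train(arenas, steps=5)           # same tiny budget, from scratch
```

Because the two problems are related, the fine-tuned model reaches a lower error on the arena data than the from-scratch model given the same five training steps, which is the essence of transfer learning.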
Gaussian process regression (GPs)
Urban building energy models are used to identify, support, and improve sustainable urban developments and energy efficiency initiatives in neighbourhoods and cities. These large 3D models enable the energy simulation of buildings at neighbourhood or city scale, which is achieved by balancing detailed 3D modelling against energy model accuracy.
A common GPs application is recovering values from incomplete data. For example, given incomplete data from a city area, a GPs model can recover missing values such as energy consumption or temperature. The model works by placing a probability distribution over the functions consistent with the observed data. GPs perform well on small datasets and on incomplete time series data when these are normally distributed.
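A small sketch of this kind of recovery, using a hand-written GP posterior mean with an RBF kernel (the "sensor" values are synthetic, standing in for a smooth signal with a missing reading at t = 3):

```python
import math

def rbf(a, b, length=1.0):
    """Squared-exponential (RBF) covariance between two inputs."""
    return math.exp(-((a - b) ** 2) / (2 * length ** 2))

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Synthetic hourly readings with a gap at t = 3.
t_obs = [0.0, 1.0, 2.0, 4.0, 5.0]
y_obs = [math.sin(t) for t in t_obs]   # stand-in for a smooth sensor signal

# GP posterior mean at the missing point: m(x*) = k*^T (K + noise*I)^-1 y
noise = 1e-6
K = [[rbf(a, b) + (noise if i == j else 0.0)
      for j, b in enumerate(t_obs)] for i, a in enumerate(t_obs)]
alpha = solve(K, y_obs)
recovered = sum(rbf(3.0, t) * a for t, a in zip(t_obs, alpha))
```

The recovered value lands close to the true (held-out) reading; a full GP would also return a variance at each point, which is exactly the uncertainty estimate that makes GPs attractive for sparse urban datasets.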
Architectural design is developed digitally. From concept to BIM, the material properties and dimensions of walls and floors are carefully selected to achieve energy performance. Nevertheless, as soon as construction starts, uncertainty about the project begins to accumulate. For example, the thermal properties of a sandwich panel may vary between construction batches, and further uncertainties can arise during transportation, site storage, and assembly.
Uncertainty quantification anticipates uncertainty in the digital design process by predicting the probability distribution of results and determining which factors matter most. For example, it can answer what is likely to happen when a venue is subjected to a range of uncertainties and variable inputs, and how to optimize when a clear objective is unknown.
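Continuing the sandwich-panel example, a minimal Monte Carlo sketch: instead of a single design U-value, treat the panel's U-value as a distribution (the facade area, temperatures, and spread below are all assumed for illustration) and propagate it to get a distribution of heat loss rather than one number:

```python
import random
import statistics

random.seed(42)

# Hypothetical facade: 500 m2 of sandwich panels with a design U-value of
# 0.25 W/m2K; batch-to-batch manufacturing variation modelled as Gaussian.
AREA = 500.0      # facade area, m2
DELTA_T = 20.0    # indoor-outdoor temperature difference, K

def heat_loss(u_value):
    """Steady-state transmission heat loss through the facade, in W."""
    return u_value * AREA * DELTA_T

# Propagate the input uncertainty by Monte Carlo sampling.
samples = sorted(heat_loss(random.gauss(0.25, 0.03)) for _ in range(10_000))
mean = statistics.mean(samples)
p5, p95 = samples[500], samples[9500]   # 90% interval of likely outcomes
```

The design answer is then not "heat loss is 2.5 kW" but "heat loss is likely between p5 and p95", which is the form of answer uncertainty quantification gives to the "what is likely to happen" question above.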
ML models need to learn from the search space (data) to find optimal solutions, and once trained they can be applied to similar problems. ML models can be retrained any number of times, and the general rule is that the more data used in training, the better the model will perform. This has a big impact on energy consumption predictions during the design process, because every decision can be evaluated almost immediately. In contrast, stochastic optimization methods such as Genetic Algorithms (GAs) only find optimal solutions after evaluating a substantial number of options, because they do not know what kind of problem they are trying to solve, although GA variants such as micro GAs can mitigate this issue.
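For contrast with the trained-model approach, here is a minimal GA in the micro-GA spirit (tiny population, selection, crossover, mutation) searching a made-up one-parameter design problem, a window-to-wall ratio with an assumed optimum at 0.3:

```python
import random

random.seed(1)

def fitness(ratio):
    """Toy objective: hypothetical energy score for a window-to-wall
    ratio, with an assumed optimum at 0.3 (higher is better)."""
    return -(ratio - 0.3) ** 2

# Micro-style GA: small population evolved by selection, crossover, mutation.
pop = [random.random() for _ in range(20)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                       # selection: keep the fittest half
    children = []
    while len(children) < 10:
        a, b = random.sample(parents, 2)
        child = (a + b) / 2                  # crossover: blend two parents
        if random.random() < 0.2:
            child += random.gauss(0, 0.05)   # mutation: small random change
        children.append(min(1.0, max(0.0, child)))
    pop = parents + children

best = max(pop, key=fitness)
```

Note that the GA needs thousands of fitness evaluations to home in on the optimum; if each evaluation were a full energy simulation, that cost is exactly what a trained surrogate model avoids.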
Over the last 20 years, BIM, parametric and generative design, and energy simulations have produced massive amounts of data, and it now makes sense to use that data to train ML models to tackle climate change.
This article is inspired by and based on the paper Tackling Climate Change with Machine Learning.