Continuous Commissioning to ensure and improve building performance
Subject: Greedy correlation screening; Forecasting; Heuristic algorithm; Building monitoring; Continuous commissioning; ICAR/08
The aim of this thesis is the development of a methodology to improve both the energy efficiency and the indoor environmental quality (IEQ) of buildings. In a broader view, this work belongs to the continuous commissioning procedure, in particular to the definition of a building model based on monitoring data, in order to ensure building performance. The work focuses on three main aspects: the definition of a monitoring system for buildings, the feature selection problem for mathematical linear models, and the model selection process. The goal is to define an automatic procedure that, starting from the real data of a monitoring system, is able to identify a mathematical building model by selecting the right features. Several steps are needed to reach this aim. First, three main levels of monitoring system are defined. Each level indicates the degree of accuracy achievable in the analysis of building performance, as well as the measurement points to be carried out. The data gathered from the monitoring system must then be pre-processed to identify possible outliers and errors; this is an important step because the reliability of the model and of the results depends on the quality of the data. Black-box models are chosen to characterize the building: ARX and ARMAX models are an effective instrument to evaluate continuous building performance even from limited monitoring data. However, selecting the right model features is NP-hard. The problem of finding a minimal subset of informative inputs has been studied extensively in various fields, but automatic, fast, and reliable procedures to find optimal models for building performance evaluation are still missing. A novel feature selection algorithm named Greedy Correlation Screening (GCS) is therefore developed, which identifies one candidate solution at a time by greedily maximizing the correlation between inputs and output while minimizing the cross-correlations between the inputs.
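The abstract does not reproduce the GCS pseudocode. As a rough illustration only, a minimal sketch of such a greedy trade-off is given below; the function name, the scalarized `penalty` weight, and the stopping rule are assumptions for this sketch, not the thesis's actual formulation (which treats the two objectives as competing criteria rather than a single weighted score):

```python
import numpy as np

def greedy_correlation_screening(X, y, k, penalty=0.5):
    """Illustrative GCS-style greedy feature selection (hypothetical sketch).

    Greedily picks up to k columns of X. Each candidate is scored by its
    absolute correlation with the output y, minus a penalty proportional to
    its largest absolute cross-correlation with the inputs already selected.
    """
    n_features = X.shape[1]
    # Absolute Pearson correlation of every input with the output.
    corr_y = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_features)])
    selected, remaining = [], set(range(n_features))
    while remaining and len(selected) < k:
        best_j, best_score = None, -np.inf
        for j in remaining:
            # Redundancy of candidate j w.r.t. the features picked so far.
            cross = max((abs(np.corrcoef(X[:, j], X[:, s])[0, 1])
                         for s in selected), default=0.0)
            score = corr_y[j] - penalty * cross  # trade off the two objectives
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
        remaining.remove(best_j)
    return selected
```

On data containing a near-duplicate of an informative input, the cross-correlation term steers the selection away from the redundant copy, which is the behavior the two competing objectives are meant to produce.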
These two objectives are competing, thus leading to a set of best trade-offs. Among these, the best model is automatically selected by applying filters and quality criteria such as the adjusted coefficient of determination and the non-correlation of residuals. The methodology is applied to two case studies with different intended uses and objectives: an office building in San Michele all'Adige (IT), where the goal is to identify flow-rate models of the two main floors in order to calculate the real thermal consumption of each floor, and a residential home in Leicester (UK), where a model of the indoor temperature is required to fill in missing data during the heating season and to predict possible overheating in summer. In the first case, the performance of this heuristic method is compared with two of the best algorithms used in the field, GRASP for feature selection and NSGA-II (Non-dominated Sorting Genetic Algorithm). In the second case, the performance of GCS is validated. The results demonstrate, on the one hand, that the proposed method solves the problem of feature selection in building performance estimation efficiently and reliably. On the other hand, the algorithm can fail to find a good ARX model for the analyzed cases if the aim of the process is a forecast over a long horizon (e.g. 24 hours); in this case, the problem can be overcome by considering ARMAX models instead of ARX. Moreover, the procedure to create the model is automatic, making it ideal for integration into a Building Management System (BMS).
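Two of the building blocks named above, ARX models and the adjusted coefficient of determination used as a quality criterion, can be sketched compactly. The following is a minimal, self-contained illustration, not the thesis's implementation; the function name and the fixed model orders are assumptions of this sketch:

```python
import numpy as np

def fit_arx(y, u, na=1, nb=1):
    """Illustrative least-squares fit of a simple ARX(na, nb) model:

        y[t] = a1*y[t-1] + ... + a_na*y[t-na]
             + b1*u[t-1] + ... + b_nb*u[t-nb] + e[t]

    Returns the coefficient vector and the adjusted R^2 of the one-step fit.
    """
    lag = max(na, nb)
    # Build the regressor matrix from lagged outputs and inputs.
    rows = [np.concatenate([y[t - na:t][::-1], u[t - nb:t][::-1]])
            for t in range(lag, len(y))]
    Phi = np.array(rows)
    target = y[lag:]
    theta, *_ = np.linalg.lstsq(Phi, target, rcond=None)
    resid = target - Phi @ theta
    n, p = Phi.shape
    ss_res = resid @ resid
    ss_tot = ((target - target.mean()) ** 2).sum()
    r2 = 1.0 - ss_res / ss_tot
    # Adjusted R^2 penalizes the number of selected features p.
    adj_r2 = 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)
    return theta, adj_r2
```

Because the adjusted R² decreases when an added feature does not explain enough extra variance, it is a natural filter for ranking the candidate models that a feature selection procedure produces.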
Showing items related by title, author, creator and subject.
Formal verification of wastewater treatment processes using events detected from continuous signals by means of artificial neural networks: Case study: SBR plant
Luccarini, L; Bragadin, GL; Colombini, G; Mancini, M; Mello, P; Montali, M; Sottara, D (Elsevier, 2010)
This paper proposes a modular architecture for the analysis and the validation of wastewater treatment processes. An algorithm using neural networks is used to extract the relevant qualitative patterns, such as apexes, ...
Abiteboul, S; Chan, H; Kharlamov, E; Nutt, W; Senellart, P (ACM, 2010)
Sources of data uncertainty and imprecision are numerous. A way to handle this uncertainty is to associate probabilistic annotations to data. Many such probabilistic database models have been proposed, both in the relational ...
Astromskis, S; Janes, A; Sillitti, A; Succi, G (World Scientific Publishing, 2014)
The reputation of lightweight software development processes such as Agile and Lean is damaged by practitioners that claim benefits of such processes that are not true. Teams that want to demonstrate their seriousness, ...