Maarten De Strycker, Liesje Van Gelder, Valérie Leprince
Languages: English | Pages: 12 pp
Bibliographic info:
39th AIVC Conference "Smart Ventilation for Buildings", Antibes Juan-Les-Pins, France, 18-19 September 2018

Since January 1st, 2018, airtightness testing has become implicitly mandatory for every new residential building in Flanders. There is no minimum requirement for airtightness itself. However, there is one for the global performance of the building envelope (the S-level, which takes into account thermal insulation, airtightness, solar gains, etc.), and poor airtightness would jeopardize the chances of reaching the required S-level. Before 2018, the Flemish Region was already promoting airtightness tests by using a very disadvantageous default value in the Energy Performance Calculation when no test was performed.
To guarantee the reliability of airtightness tests, each test must be declared conformant with the STS-P 71-3 requirements and the legal requirements. This means that the tester must be qualified and his or her company recognised to perform the test. BCCA (Belgian Construction Certification Association) has set up a quality framework to recognise airtightness testers and their companies.
This paper describes the requirements of the Flemish Energy Agency for the organizer of such a quality framework and, more specifically, the requirements for the airtightness testers themselves. Among others, these requirements include the inspection of airtightness testers, both onsite and through desktop inspection, and the obligation to register every test in a database.
This article describes the inspection process and its output. Desktop and onsite inspections each cover 10% of the tests performed. They are carried out by a dozen qualified inspectors across Flanders. The tester has to inform BCCA of every measurement at least one day in advance and send a text message when the test actually starts and ends. Within 5 minutes of the end of the test, the tester is informed whether there will be an inspection. In 2017, 634 onsite inspections were performed, of which fewer than 1% resulted in a difference of more than 10% from the first measurement (and those were performed at low flow rates). Furthermore, 11 major non-conformities (2%) and 40 minor non-conformities were reported through onsite inspection. This inspection process therefore appears effective at deterring testers from manipulating results.
Through desktop inspection, BCCA spots non-conformities in the reports. In 2017, out of 631 desktop inspections, 52 major non-conformities and 111 minor non-conformities were identified.
The database gathers around 7000 tests per year. Information in the database includes: administrative data, the main building use (residential, school, ...), the time and location of the test, the leakage rate (m³/h), the heat loss area (m²) and/or the internal volume (m³), and the full test report. Statistics show a skew-normal distribution of measurement results with an average of v50 = 3.36 m³/(h·m²).
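As a minimal illustration of how the reported average relates to the quantities stored in the database: v50 is the specific leakage rate, i.e. the leakage rate at 50 Pa divided by the heat loss area. The sketch below is not part of the BCCA framework; the function and variable names are illustrative, and the example figures are hypothetical values chosen only to reproduce the reported average of 3.36 m³/(h·m²).

```python
# Illustrative sketch only: deriving the specific leakage rate v50
# from the leakage rate (m³/h) and heat loss area (m²) that the
# database stores per test. Names and example numbers are hypothetical.

def specific_leakage_rate(leakage_rate_m3h: float,
                          heat_loss_area_m2: float) -> float:
    """v50: leakage rate at 50 Pa per unit of heat loss area, m³/(h·m²)."""
    return leakage_rate_m3h / heat_loss_area_m2

# Hypothetical dwelling: 672 m³/h measured at 50 Pa over a 200 m²
# heat loss area gives v50 = 3.36 m³/(h·m²), matching the reported average.
print(round(specific_leakage_rate(672.0, 200.0), 2))  # 3.36
```

The analogous volume-based indicator, the air change rate n50 (in h⁻¹), would divide the same leakage rate by the internal volume (m³) instead, which is why the database records both the heat loss area and the internal volume.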
This paper concludes that it is possible to develop a qualification framework at limited cost to the testers, with an efficient inspection process that deters manipulation of results and hence improves their reliability.