Good Homes Alliance conference: ‘Closing the Performance Gap’

The Good Homes Alliance (GHA) recently held its first annual conference, ‘Closing the Performance Gap’, which provided an overview of different strategies and costs for ongoing monitoring and post-occupancy evaluation. The audience was primarily housebuilders, academics, environmental consultants and product manufacturers, plus a few architects including Justin Bere.

Research conducted since 2005 by Leeds Metropolitan University’s (LMU) Centre for the Built Environment on a variety of new housing developments has shown heat losses up to 120% higher than designed, with an average gap between designed and built performance of 60% (see Figure 1 below). The conference was devoted to exploring ways to close this gap.

Figure 1
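For reference, the ‘performance gap’ quoted throughout is the excess of measured over predicted fabric heat loss, expressed as a percentage of the predicted value. A minimal sketch, with illustrative numbers rather than LMU data:

# Performance gap: excess of measured over predicted heat loss,
# expressed as a percentage of the predicted value.
def performance_gap(predicted_w_per_k, measured_w_per_k):
    return 100 * (measured_w_per_k - predicted_w_per_k) / predicted_w_per_k

# Illustrative only: a dwelling designed for 100 W/K that measures
# at 160 W/K shows the 60% average gap cited above.
print(performance_gap(100.0, 160.0))  # 60.0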

Monitoring implemented to date by a range of developers and academics was presented, including information on costs, protocols, strategies, learning and feedback: how co-heating tests are conducted, what the tests have revealed, and how thermal performance problems can be avoided. The full programme and list of speakers can be seen here.
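In outline, a co-heating test heats an unoccupied dwelling to a constant internal temperature with electric heaters and regresses daily heating power against the internal-external temperature difference; the slope is the whole-house heat loss coefficient (W/K), which can then be compared with the design figure. A minimal sketch of that regression, using made-up daily readings:

import numpy as np

# Daily mean internal-external temperature difference (K) and
# heating power (W) from a co-heating test; values are illustrative.
delta_t = np.array([8.0, 10.5, 12.0, 14.5, 16.0, 18.5])            # K
power = np.array([950.0, 1240.0, 1430.0, 1720.0, 1910.0, 2230.0])  # W

# Fit power = HLC * delta_t (least squares through the origin);
# the slope is the whole-house heat loss coefficient in W/K.
hlc = np.sum(power * delta_t) / np.sum(delta_t ** 2)
print(f"Measured heat loss coefficient: {hlc:.0f} W/K")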

The second half of the conference looked at the monitoring and feedback part of GHA’s LowCarb4Real programme, which explores how to improve the energy performance of new homes. GHA’s Jon Bootland explained how to choose a meaningful monitoring strategy along with an overview of the GHA monitoring programme and first results from recent testing that measured fabric performance of four new dwellings.

To expand the evidence base for the energy performance of new homes and feed the results back to industry, GHA initiated a monitoring programme in October 2009 with a number of academic and developer partners. Sponsors include the Energy Saving Trust (EST) and DCLG, and the research teams are from University College London (UCL), Oxford Brookes University (OBU) and Leeds Metropolitan University (LMU).

Bootland highlighted the complexity of good monitoring. It is important to identify how to measure results, but equally crucial to determine how, and whether, results will be shared: it took around nine months for all the teams involved to agree on how the results would be released. The diagram below has some useful information on the costs of different approaches.

Post-construction prototype testing could cost around £50k, while in-use monitoring and POE cost £1-2k per dwelling for simple meter data and occupant feedback. Temperature and humidity measurements could cost about £5-8k per dwelling.

The sites selected for monitoring were projects by GHA developer members:

Feilden Clegg Bradley Studios’ One Brighton for BioRegional Quintain (EcoHomes Excellent, One Planet Living): Ziegel block construction with external insulation, biomass CHP and PV

Richards Partington Architects’ Temple Avenue in Derwenthorpe, York, for the Joseph Rowntree Housing Trust (Code Level 4): dwelling one, a thin-joint masonry cavity wall system; dwelling two, a timber-frame structurally insulated panel (SIP) system clad externally in brick

Malcolm McAll and BrooksDevlin’s The Old Apple Store for Ecos Homes in Stawell, Somerset (Code Level 5): timber frame with timber cassettes

The GHA study started with post-construction testing of a prototype dwelling, because it is easier to disaggregate the data before dwellings are occupied. Once the baseline for the fabric is set, this variable can be eliminated from any in-use monitoring studies.

The GHA monitoring programme methodology consists of two phases:

1.    Post-construction testing (April 2011): most useful for feedback on construction methods and processes, and for testing prototypes and quality assurance
Co-heating test of the thermal efficiency of the fabric
Standard air-tightness testing (see the conversion sketch after this list)
Measuring the performance of the services (where funded)

2.    Monitoring in use and POE (Jan-Sep 2011): useful for identifying extremes of performance and for studies of occupant satisfaction and behaviour variations
Measuring energy and water consumption
Measuring temperature and Indoor Air Quality (IAQ)
Gathering resident and user behaviour patterns
Interviews and questionnaires on comfort and satisfaction levels
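Standard air-tightness (blower-door) testing reports either an air change rate at 50 Pa (n50, in ach) or, as is usual in the UK, an air permeability per square metre of envelope (q50). A minimal conversion sketch; the flow and dwelling dimensions are assumed, not taken from the GHA tests:

# Convert a blower-door flow at 50 Pa into the two common
# air-tightness metrics; all input values are assumptions.
flow_50pa = 900.0    # m3/h extracted at 50 Pa (illustrative)
volume = 250.0       # m3 internal volume (assumed)
envelope = 280.0     # m2 total envelope area (assumed)

n50 = flow_50pa / volume     # air changes per hour at 50 Pa
q50 = flow_50pa / envelope   # m3/(h.m2), the usual UK metric

print(f"n50 = {n50:.1f} ach, q50 = {q50:.1f} m3/(h.m2)")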

GHA Fabric Performance results
Because the GHA developers were aiming for higher levels of performance to begin with, such as Code 4, Code 5 or Passivhaus, the fabric performance gap on these projects is significantly lower (10-20%) than the 60-100% found on the other sites tested. This was achieved through a whole-systems approach involving designers and consultants and engaging with the supply chain.

Predicted vs measured Heat Loss Parameter (HLP) of the four tested dwellings
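The Heat Loss Parameter normalises the whole-house heat loss coefficient by floor area, giving W/K per m2 so that dwellings of different sizes can be compared on one chart. A minimal sketch of the predicted-versus-measured comparison; the numbers are illustrative, not the GHA results:

# Heat Loss Parameter (HLP): whole-house heat loss coefficient
# divided by total floor area, in W/(K.m2).
def hlp(heat_loss_w_per_k, floor_area_m2):
    return heat_loss_w_per_k / floor_area_m2

# Illustrative dwelling, not one of the four GHA test houses:
predicted = hlp(85.0, 100.0)   # design-stage heat loss coefficient
measured = hlp(98.0, 100.0)    # co-heating test result

gap = 100 * (measured - predicted) / predicted
print(f"HLP {predicted:.2f} -> {measured:.2f} W/(K.m2), gap {gap:.0f}%")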

Bootland also described other lessons learned, namely that performance can be improved, but it takes a lot of work. Testing is currently expensive, though some companies have begun providing products for testing, which could lead to more competitive pricing; a number of companies involved in testing and monitoring (including Stroma, Natural Building Technologies, BSRIA Instrument Solutions, GasTec, Wattbox Ltd and First Utility) displayed their products and services at the event. Industry standards and protocols for testing should be established so that all data are measured in the same way. Fabric testing shows designers and developers what works and what doesn’t, and helps to identify faults in the design and construction process.

With these results, the GHA’s LowCarb4Real programme seeks to develop best-practice exemplars. Next steps include further monitoring as part of the Technology Strategy Board’s (TSB) building performance evaluation programme.

Speaker presentations from the conference can be downloaded here.

by Tanisha Raffiuddin, ex-AJ sustainability intern, now employed at the Passivhaus Trust


2 Responses to “Good Homes Alliance conference: ‘Closing the Performance Gap’”

  1. There is an error in the dates for monitoring given above. Shouldn’t these read 2010?

  2. Thanks for this, Jess. We’ve double-checked, and the work is ongoing and the dates are correct.

    Hattie
