Sustainable Design and Development


Paul Appleby provides strategic advice to design and masterplanning teams on the integrated sustainable design of buildings, based on the principles set out in his 2010 book, covering:

• Sustainability and low carbon design strategy for developments and buildings

• Passive design measures for masterplans and buildings

• Low carbon technologies and renewables

• Land use, density, massing and microclimate

• Social and economic requirements for sustainable communities

• Policy, legislation and planning - history and requirements

• Sustainability and environmental impact assessment methodologies

• Sustainable construction and demolition

• Integrated sustainable transport planning

• Computer simulation of building environments

• Thermal comfort

• Air quality, hygiene and ventilation

• Waste management and recycling

• Materials and pollution

• Water conservation

• Landscaping, ecology and flood risk

• Light and lighting

• Noise and vibration

• Security and future proofing

Paul Appleby has been involved in the sustainable design of buildings for much of his career, including recent high-profile projects such as the award-winning Great Glen House, the Strata tower and the proposed masterplan for the iconic and challenging Battersea Power Station site (see postings below).

Email Paul at paul.appleby7@btinternet.com if you want to get in touch.

Thursday, 23 September 2010

The Zero Carbon Challenge


The Building Regulations for England and Wales are on a trajectory to achieve ‘zero carbon’ for new homes and schools by 2016 and for other buildings by 2019. This is all part of the UK Government’s strategy to meet its commitment to achieve a reduction in overall carbon emissions of 80% by 2050.

In the meantime, the 2010 amendments to Parts L, F and J come into force on 1 October, comprising revisions to the regulations covering respectively:

• Conservation of fuel and power;
• Ventilation; and
• Combustion appliances and fuel storage systems.

These represent the next step on the road to zero carbon and include CO2 emission targets 25% below those required by the 2006 Part L and 40% below the notional 2002 value, once corrected for changes in fuel carbon intensity.

The Labour Government established the Zero Carbon Hub (ZCH) in June 2008, following recommendations arising from the Callcutt Review of House Building Delivery (http://www.callcuttreview.co.uk/default.jsp), under the auspices of the National House Building Council (NHBC), with a remit both to come up with a definition of ‘zero carbon’ and to support the regulatory process. The definition of zero carbon has yet to be agreed, and Grant Shapps, the Housing Minister in the Coalition Government, has told ZCH to review the level of on-site renewables required within the definition.

ZCH is in the process of publishing the results of a number of studies setting out the key areas that need resolving before the 2016 amendments can be made. It has recently published documents covering future climate change, closing the gap between design and built performance, and how performance standards should be expressed, as well as an overview report (http://www.zerocarbonhub.org/). It has still to produce the promised reports on carbon compliance tools and the carbon intensity of fuels.

The work already carried out on the definition of zero carbon has rowed back from the compliance requirements originally set out in the Code for Sustainable Homes for achieving Level 6 under Category 1: Energy and Carbon Dioxide Emissions. That required the dwelling to achieve zero carbon for both regulated and unregulated emissions (i.e. including household appliances and cooking) using some combination of passive design and renewable technology. The requirement currently under consideration will typically allow up to 53% of total emissions to be dealt with through “Allowable Solutions”, provided a reduction of at least 70% in regulated emissions compared with the 2006 Part L target emission rate (TER) is achieved through energy efficiency and on-site low carbon and renewable technologies. For an average dwelling meeting a 2006 TER, regulated emissions represent around 67% of total CO2 emissions. Allowable Solutions have not yet been agreed, but are likely to include importing locally generated renewable electricity; exporting low or zero carbon (LZC) energy; and financial contributions to providing LZC infrastructure and/or improving the energy efficiency of buildings in the neighbouring community. Of course, in most cases some combination of these could be employed, along with going beyond the 70% criterion for passive design and on-site LZC measures.
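
The arithmetic behind that 53% figure can be sketched as a back-of-envelope calculation. This is purely illustrative, using the approximate fractions quoted above; it is not an official compliance calculation:

```python
# Rough arithmetic behind the "up to 53%" Allowable Solutions figure.
# Assumptions (from the text above): regulated emissions are ~67% of a
# typical dwelling's total CO2, and at least 70% of regulated emissions
# are abated on-site through fabric efficiency and on-site LZC technology.

regulated_share = 0.67      # regulated emissions as a fraction of total emissions
on_site_reduction = 0.70    # minimum on-site reduction of regulated emissions

abated_on_site = regulated_share * on_site_reduction   # ~0.47 of total emissions
allowable_solutions = 1.0 - abated_on_site              # ~0.53 of total emissions

print(f"On-site measures cover ~{abated_on_site:.0%} of total emissions")
print(f"Leaving up to ~{allowable_solutions:.0%} for Allowable Solutions")
```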

In July 2009 ZCH published the results of a consultation defining a fabric energy efficiency standard for zero carbon homes. Similar in concept to the Passivhaus and Energy Saving Trust standards, this provides energy targets in kWh/m2/yr along with limiting U values and airtightness standards, although it sets its sights a lot lower, with a recommended target of 39 kWh/m2/yr for an apartment or mid-terraced house compared with the Passivhaus target of 15 kWh/m2/yr, for example. U values are commensurately weaker, particularly for windows, at 1.4 W/m2K compared with 0.8 W/m2K for a Passivhaus window. ZCH recommends an air permeability of 3, compared with 1 for a Passivhaus dwelling. The reasons for not setting more challenging criteria are not clear, but appear to relate to cost and currently available construction techniques. However, designers and house builders may decide it is more economically feasible to use a specification closer to the Passivhaus level rather than invest in an array of “Allowable Solutions”.

Designers may be interested in recent UK experience reported in BRE’s Autumn 2010 Constructing the Future newsletter: “Experience in Europe indicates that while a 6% extra overall cost is likely, the quality assurance procedure can actually help to reduce costs.... (Whilst) a housing project in London, which BRE is advising on, has achieved PassivHaus for the same cost as delivering a typical social housing project.”

In the recently published ZCH studies, overheating was a key focus, resulting in 14 recommendations for urgent action, including the development of an improved technique for predicting overheating for integration into the SAP calculation. The problem is that well-insulated, airtight dwellings are prone to overheating, particularly when window opening is problematic due to the close proximity of noise sources, such as roads, and/or site shape and orientation drive the design towards west-facing bedrooms and/or living rooms. Potential temperature rise from global warming will of course exacerbate this problem. Any technique developed within SAP for predicting overheating must take into account predictions for temperature rise associated with global warming, such as those developed by CIBSE, which publishes future Test Reference Year and Design Summer Year (TRY/DSY) data for 14 sites across the UK for the periods 2011-2040, 2041-2070 and 2071-2100. These take into account the four UKCIP02 climate change scenarios, ranging from Low to High CO2 emission rates (www.ukcip.org.uk/index.php?id=161&option=com_content&task=view).

ZCH is also recommending a change to the method used in SAP for determining the carbon emission factors for electricity. These are currently based on historical data, whereas ZCH recommends using predicted 15-year rolling averages, updated annually. The modelling carried out by ZCH indicates that decarbonisation of the electricity grid will have a major impact on the energy balance for a typical new home, and hence on the most efficient methods for meeting energy demand. ZCH envisages that, as electricity generation decarbonises, targets will have to be set in terms of primary energy demand rather than CO2 emissions.
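
As a minimal sketch of what a 15-year rolling average emission factor might look like in practice, the snippet below averages a set of projected grid carbon intensities; the projection figures are invented placeholders, not SAP or ZCH data:

```python
# Illustrative 15-year rolling average of projected grid carbon intensity.
# The projections below are invented placeholder values (kgCO2/kWh), used
# only to show how a forward-looking rolling average would be formed.

def rolling_average(projections, window=15):
    """Average the next `window` years of projected emission factors."""
    if len(projections) < window:
        raise ValueError("need at least `window` years of projections")
    return sum(projections[:window]) / window

# Hypothetical declining grid intensity from 2011 onwards (placeholders)
projected_factors = [0.52 - 0.01 * year for year in range(20)]

print(f"15-year rolling factor: {rolling_average(projected_factors):.2f} kgCO2/kWh")
```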

SAP 2005 did not make allowance for a comprehensive menu of the upstream CO2-equivalent emissions associated with energy generation. BRE is currently developing a consistent approach to ensuring that factors such as fuel extraction, processing and delivery are accurately estimated in SAP, particularly for biomass and liquid biofuels.

ZCH also considers more work is required to establish consistent and reliable information and guidance to determine the CO2 emissions associated with community energy schemes. Currently Building Regulations require this to be undertaken by a competent person, but provide no standard methodology.

In 2007 Callcutt reported on the disparity between predicted and actual heat losses from homes as an example of the poor standards prevailing in house construction. This has subsequently been confirmed in a study of 16 dwellings by Leeds Metropolitan University, which found that some experienced a heat loss more than double that predicted by SAP. Ideally the heat loss for each new dwelling would be measured after construction and compared with the calculated value. However, this is unlikely to be practicable since current methodologies, such as the co-heating test, take at least a week to carry out and have to be done in winter, and hence are not commercially viable.

ZCH has recommended that a carbon compliance accreditation scheme be developed for designers, suppliers, manufacturers and builders, which could include accredited details such as those developed for Part E under the Robust Details scheme. This would include post-construction whole-house audits of a sample of dwellings and their services as part of the accreditation process. ZCH is also considering incorporating confidence factors (i.e. margins) into the calculation of the dwelling emission rate (DER); these would be reduced for accredited organisations and hence provide an incentive for accreditation.
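
A minimal sketch of how such a confidence factor might operate within a compliance check is shown below; the margin values and the pass/fail rule are hypothetical illustrations, not taken from any published ZCH or SAP methodology:

```python
# Hypothetical illustration of confidence factors (margins) applied to a
# calculated dwelling emission rate (DER) before comparison with the target
# emission rate (TER). The margin values below are invented for illustration.

def complies(der, ter, accredited):
    """Apply a smaller margin to the DER calculated by accredited organisations."""
    confidence_factor = 1.05 if accredited else 1.15  # placeholder margins
    return der * confidence_factor <= ter

# Units: kgCO2/m2/yr (values are illustrative only)
print(complies(der=14.0, ter=16.0, accredited=True))   # True  (14.7 <= 16.0)
print(complies(der=14.0, ter=16.0, accredited=False))  # False (16.1 > 16.0)
```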

It seems that the whole process leading to zero carbon by 2016, with an interim stage in 2013, has been thrown into turmoil by the new Building Regulations Minister, Andrew Stunell, who revealed at a meeting of the Energy Efficiency Partnership for Homes in July that he has instructed civil servants to examine the feasibility of bringing the 2013 Part L revision forward to 2012. This has not gone down well with the construction industry, not least because of the uncertainty it has introduced at an already challenging economic time for the industry.

Thursday, 2 September 2010

Climate Change: the scientist, the journalist and the politician


The frustrating situation that climate change science finds itself in reminds me of one of those disaster movies – you know the one – where a flawed hero is trying to save the planet, or whatever, but is being distracted at every turn by either his own problems, or noises off. In our case the flawed hero is represented by the Intergovernmental Panel on Climate Change (IPCC) and the climate change scientific community; and the flaws have received enough publicity in recent months for me not to dwell on them here. A small but vocal minority of so-called climate change sceptics and deniers have provided the noises off. It could be argued that the ‘hero’ needs these distractions to overcome his flaws and go on to save the day!

This might describe the process that followed the exaggerated claims of Himalayan glacier melting rates quoted in the Working Group II contribution to the IPCC 4th Assessment Report, resulting in publication of the “Review of the Processes and Procedures of the IPCC” by the InterAcademy Council on 29 August (the IAC Report). There have also been two inquiries into the circumstances behind the leaked emails from the University of East Anglia Climatic Research Unit, colloquially known as “Climategate”, chaired by Lord Oxburgh and Sir Muir Russell, which reported in April and July 2010 respectively.


Put simply, all this has revealed that some scientists working in the climate change sector have been both slipshod in their practices and overly protective of the information they hold. On the other hand, some of the tactics of the more extreme ends of the climate sceptic community have been aggressive, bizarre and ignorant. See Skeptical Science for a useful review of these tactics. Sometimes lost in this melee, however, are some perfectly reasonable questions concerning some of the more dramatic certainties coming from the climate change protagonists.

The main problem here is just how much is riding on climate change predictions. You only have to read the 2006 Stern Review on the Economics of Climate Change to get a feel for the sums involved and the potential impacts of the decisions with which Governments are faced. As we saw at COP 15 in Copenhagen in December 2009, some Governments are not yet in a position to sign up to significant cuts in carbon emissions when faced with the prospect of curtailing the rate of economic growth at home. On the other hand, although the energy sector makes much of its involvement in renewable energy, there have been reports from the US that companies involved in oil exploration, such as Koch Industries and Exxon Mobil, have funded climate sceptic groups to the tune of millions of dollars.


Although many politicians and journalists involved in making decisions and reporting on and around climate change have a background in science, most rely on the information provided to them by scientists. But of course climate science is incredibly complex and the prediction of climate change far from certain. Politicians and journalists, on the other hand, like to deal in certainties. In his excellent online book “Sustainable Energy without the Hot Air” (free to download from MacKay), David MacKay provides an example of how journalists can get it so badly wrong. The following is a quote from Dominic Lawson, writing in the 8 June 2007 edition of the Independent, as paraphrased by Professor MacKay:


“The burning of fossil fuels sends about seven gigatons of CO2 per year into the atmosphere, which sounds like a lot. Yet the biosphere and the oceans send about 1,900 gigatons and 36,000 gigatons of CO2 per year into the atmosphere – . . . one reason why some of us are sceptical about the emphasis put on the role of human fuel-burning in the greenhouse gas effect. Reducing man-made CO2 emissions is megalomania, exaggerating man’s significance. Politicians can’t change the weather.”


Unfortunately Mr Lawson makes some fundamental errors in this article. Quite apart from the numbers being wrong, the emissions from the biosphere and oceans into the atmosphere are balanced by almost exactly the same quantity of CO2 flowing in the opposite direction and being absorbed by the biosphere and oceans. As the IPCC and others can testify, getting the numbers right is also important. In Mr Lawson’s case he makes the common error of mixing up carbon and CO2: the figure he quotes for CO2 emissions from anthropogenic activities is actually that for carbon, and should read 26 Gt CO2/annum. Worse still, the 36,000 gigatonnes quoted represents the amount of carbon held in the oceans; the estimated flow rate given by MacKay is 90 gigatonnes of carbon/annum (330 Gt CO2/annum), whilst the flow to and from the biosphere has been estimated at 440 Gt CO2/annum. This cyclical flow of gases between the earth and its atmosphere has been occurring since the atmosphere, oceans and biosphere evolved and is an intrinsic part of the earth’s ‘metabolism’.
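
The conversion at the heart of this mix-up is simply the ratio of molecular masses, 44 for CO2 against 12 for carbon, i.e. a factor of roughly 3.67. A quick check against the figures quoted above (the 120 GtC biosphere flux is inferred from the 440 Gt CO2 figure):

```python
# Converting a mass of carbon to a mass of CO2: multiply by 44/12, the ratio
# of the molecular mass of CO2 to the atomic mass of carbon.

C_TO_CO2 = 44.0 / 12.0  # ~3.67

flows_gtc_per_year = {
    "fossil fuel burning": 7,         # Lawson's "7 gigatons" is carbon, not CO2
    "ocean-atmosphere flux": 90,      # MacKay's estimate
    "biosphere-atmosphere flux": 120  # inferred from the 440 Gt CO2/annum figure
}

for label, gtc in flows_gtc_per_year.items():
    print(f"{label}: {gtc} GtC/yr ≈ {gtc * C_TO_CO2:.0f} Gt CO2/yr")
```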


The challenges of reporting on climate change science are presented by the BBC’s Environment Correspondent, Roger Harrabin, in his essay “Uncertain Climate”, the first half of which was aired on Radio 4 on Monday 30 August. In it he recognises that the nuances of climate change science have been lost in the mix, whilst some scientists have expressed an exaggerated degree of certainty about the prospects for global calamity. In contrasting interviews with Al Gore and Tony Blair, he demonstrates both the potential for evangelism driving out rationality, in the case of Gore, and an acceptance that politicians must apply the precautionary principle whilst communicating uncertainties, in the case of Blair.


Coincidentally, Bjorn Lomborg, a well-known ‘sceptic’ and author of “The Skeptical Environmentalist”, was reported on the same day as having gone through a Damascene conversion in his most recent book “Smart Solutions to Climate Change”. In fact Lomborg is only the editor of this volume, which has a number of contributors. A closer reading of his canon, however, reveals a more nuanced picture than the headlines might portray. Lomborg has never denied the science; he has only questioned priorities, although originally on the basis of what some have claimed is a simplistic comparison of the cost of mitigating climate change with that of tackling malaria, HIV/AIDS and inadequate sanitation and water supply. This latest book focuses specifically on the priorities for mitigating climate change, analysing the likely costs and benefits of a very wide range of policy options, including geo-engineering; mitigation of CO2, methane and 'black carbon' (soot) emissions; expanding forestation; research and development of low-carbon energy technologies; and encouraging green technology transfer. In an interview for the Guardian Lomborg is reported as saying that "the crucial turning point in his argument was the Copenhagen Consensus project (of which he is Director), in which a group of economists were asked to consider how best to spend $50bn. The first results, in 2004, put global warming near the bottom of the list, arguing instead for policies such as fighting malaria and HIV/AIDS. But a repeat analysis in 2008 included new ideas for reducing the temperature rise, some of which emerged about halfway up the ranking. Lomborg said he then decided to consider a much wider variety of policies to reduce global warming, 'so it wouldn't end up at the bottom'." Which sounds a bit like altering the parameters in order to give the result you are after.


The metaphorical hero in our disaster movie may be the IPCC, but the villain of the COP 15 conference proved to be China. As can be seen from the Figure at the start of this post, copied from the MacKay book referred to above, the total CO2-equivalent emissions in 2000, as indicated by the area of each block, were similar for China and the USA, although per capita emissions in the US were about six times those of China. However, China is going through its very own industrial and economic revolution, leading to massive growth in all those factors that result in increasing greenhouse gas emissions.

China is reported to be commissioning new coal-fired power stations at the rate of 2 or 3 per week, with a long-term programme to construct more than 500. It is in the middle of a major airport construction and improvement programme, with 42 new airports in the pipeline and 70 being improved. Car ownership increased five-fold between 2003 and 2008, whilst China’s urbanisation continues unabated, with the percentage of urban residents increasing from 18% in 1978 to 44% in 2006, according to an article in the Economist online.

The latest International Energy Agency statistics (IEA Energy Statistics 2010) indicate that China’s CO2 emissions doubled between 2000 and 2008, coincident with a doubling of coal production.

The same statistics predict a 65% increase in global total primary energy supply (TPES) by 2030 relative to 1990 figures, based on ‘policies under consideration’. No predictions are given for the corresponding change in CO2 emissions.

These are scary statistics, and there are plenty more where they came from! One conclusion that can be drawn from the above is that the failure of COP 15 and the ‘noises off’ from climate sceptics must not be allowed to get in the way of future global agreements to mitigate global warming. What we all need is a clear and consistent message from the scientists who have dedicated their lives to studying this complex subject.