The hunt for gas price riggers shouldn't be directed at the Big Six

The alleged rigging of the wholesale gas market seems to involve a deliberate manipulation of the price of gas for delivery the following day. More precisely, the whistle-blowers allege that unspecified traders successfully forced down the spot price at 4.30pm on a specific day in September, at the end of the gas year. Some commentators have darkly asserted that this suggested manipulation might influence the gas prices paid by UK consumers. The assumption seems to be that it might be in the interests of the Big Six suppliers to force down spot prices at a particular moment. This theory doesn’t bear examination and needs to be abandoned immediately.

There may be good reasons to think that a trader or group of traders tried to illicitly force down the price for immediate delivery, but there is no basis for thinking that this price-fixing would affect retail gas prices in a way beneficial to the Big Six. Though the amount they actually need will depend on the weather on each day, these companies buy most of their gas well in advance. That is, they enter into contracts with suppliers to buy defined amounts for each specific future period. Some contracts will be struck for delivery several years into the future. (If the Big Six didn’t do this, they wouldn’t be able to offer fixed-price retail tariffs.)

Their retail prices, which were all recently adjusted upwards, reflect among other things their costs of buying gas from wholesale suppliers in this way. Below, I’ve extracted statements from their press releases on how they look at wholesale gas prices and the effect these have on costs to retail customers. I’m afraid their comments are written in technical language, but I think they give a reasonably clear impression of how they estimate their future gas costs and of the consequent effect on consumer prices. The utilities say that they look almost exclusively at a range of what are called ‘forward’ prices that reflect gas markets well into the future. The spot price for delivery on one particular day has no influence on their perception of the prices they will have to pay over the following months and years.

SSE

‘..it is the two year rolling average price that is the most relevant in determining future domestic energy prices’.

Figures for increased gas costs are ‘(b)ased on the average forward price for winter 2012 during the 24 months prior to July 2012 compared to the average forward price for winter 2011 during the 24 months prior to July 2011'

Scottish Power

Scottish Power’s estimates of the rising wholesale cost of gas is derived from a calculation of the 'Average percentage (rise) based on the 12 month forward wholesale energy cost as at September 2012 compared to the 12 month forward wholesale energy cost as at January 2012'

British Gas

‘The 13% rise in the wholesale market is based on the average forward price for winter 2012 during the 18 months prior to October 2012 compared to the average forward price for winter 2011 during the 18 months prior to October 2011..’

EDF

‘Wholesale gas energy costs based on the average wholesale gas prices for delivery in each quarter in the coming year.’

Last point: we may believe that the Big Six aren’t completely open (see the previous post on this web site) but how can market rigging that reduces the market quotation encourage them to increase their prices?
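For anyone who wants a feel for how little a single manipulated trading day can matter, here is a toy illustration. It is not any supplier’s actual methodology and the prices are invented: it simply shows that even if a rigged quote fed through into one day’s price assessment, the effect on the kind of 24-month average described above is vanishingly small.

```python
import random

random.seed(1)

# 24 months of daily forward-price assessments, in pence per therm (invented numbers)
days = 24 * 30
forward_prices = [60 + random.uniform(-5, 5) for _ in range(days)]
honest_average = sum(forward_prices) / days

# Suppose a rigged trade knocks 10% off one day's assessment
rigged = list(forward_prices)
rigged[-1] *= 0.90
rigged_average = sum(rigged) / days

print(f"24-month average, honest prices:  {honest_average:.2f} p/therm")
print(f"24-month average, one rigged day: {rigged_average:.2f} p/therm")
print(f"Difference: {100 * (honest_average - rigged_average) / honest_average:.3f}%")
```

On these invented numbers the rigged day shifts the 24-month average by about a hundredth of one per cent.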

Sustainability: All That Matters

In my new book, published this week by Hodder in the UK, I put forward an idiosyncratic view: I suggest that we are wrong to conflate sustainability and the living of an ethical life. Sustainability is essentially a problem of engineering. Can we build an economy that allows all 9 billion people in 2050 to live with approximately the same standard of living as the richest 1 billion of today? I think the answer to this question is unambiguously ‘yes’, but with one important caveat to which I will return. An ethical life – perhaps one which rejects standard Western norms of high levels of consumption of material goods – is a set of rules we may as individuals wish to follow. But such a lifestyle has very little to do with sustainability. If global society manages to achieve sustainability, I suggest it will not come from millions of people living better lives (which we all ought to do anyway) but from using science and economic growth to help us dramatically reduce the impact we have on the planet’s operations.

The crucial finding in the book – and one which I was very surprised to come across – was that the earth’s crust is very likely to contain enough minerals to provide the world of 2050 with all that it needs. With reasonable care, we won’t run out of anything important. Some metals will get quite scarce, but humankind will simply switch to reasonable alternatives. There are few materials that cannot be substituted quite easily by others. Even rare earths are abundant and distributed across the globe; it is simply that only China is mining them at the moment (partly because of the highly polluting nature of some extraction techniques). If we build a proper recycling and reuse infrastructure – ‘the circular economy’ – we can expect to be able to manage quite well.

Another finding which I didn’t expect is that there is quite strong evidence that wealthy human societies reach a peak in their consumption of material resources. Perhaps the best way of putting it is that we need a stock of important metals, of which steel is the best example, and once that stock is attained, our needs fall sharply. In the case of steel, even the richest countries require a stock of about 10 tonnes per person and no more. So we don’t need ever-increasing amounts of metals or other materials to live an increasingly prosperous life. There is a natural limit on human material requirements. We really don’t have infinite needs.

I know this is a contentious view which is rejected by almost everybody working in the field of sustainability. I’m suggesting that economic growth is perfectly compatible with sustainability. In fact I go further, saying that the improvements in science that come with GDP growth will enable us to face the challenges of sustainability more effectively. Once we have reached a certain standard of living, more economic growth doesn't result in us using more natural resources. We may even require less. Growth is good, I tentatively hypothesise.

What about the stresses and strains put upon the earth’s natural systems by thoughtless human exploitation? Aren't we likely to disrupt vital but little-understood ecologies through, for example, our horrifying indifference to falling biodiversity? Perhaps with slightly more confidence than is warranted, I say no: the loss of biodiversity is a tragedy and an ethical disaster, but it is not likely to affect mankind’s ‘sustainability’.

There’s one important exception to my optimism. Climate change seems to me to represent a threat to human life. Market mechanisms and good sense may enable us to live reasonable lives in 2050 were it not for the threat from increased temperatures, rising sea levels and the magnification of weather extremes. I conclude that reducing the rate of increase in global concentrations of greenhouse gases is the only really difficult challenge posed by the requirement for sustainability. Everything else I think we can deal with.

In the space allowed by the publisher I could only write a short book, and I couldn't include much of the numerical analysis my thesis really requires. So perhaps I won’t convince anyone that we need to separate out the really good reasons to live simpler and less material lives from the very different challenge of using advances in science and technology to enable us to reduce our impact on the planet. And I won’t make any friends by emphasising that many things we regard as ‘natural’, such as cotton, are actually far more destructive of the world’s sustainability than manufactured alternatives such as polyester. Unfortunately perhaps, a sustainable world is a less natural one than the one we might ideally want. But as writers such as Stewart Brand and Mark Lynas have pointed out, this is the Anthropocene and humankind has to engineer itself out of its problems.

Myths about Heathrow expansion

The proponents of a third runway at Heathrow – and the many others who think that airport expansion is necessary to boost the economy – have convinced many of the reasonableness of their arguments. But do modern economies need more aviation to boost business? Do the arguments of the expansionists have any validity whatsoever? A quick comparison of 2000 and 2011 traffic levels at Heathrow shows that the advocates are quite simply wrong in many of their assertions. BAA lobbyists have invented reasons for expansion that have no factual basis. Using Civil Aviation Authority (CAA) data, this article examines their arguments in turn.[1]

1, ‘Heathrow is predominantly a business airport’ and therefore economic growth depends on its expansion.

Most of the airport’s traffic comes from passengers travelling for leisure purposes. Only 31% of the people arriving at the airport are there on business. (And ‘business’ is widely defined to include such travellers as au pairs, students and armed forces personnel).

2, ‘Business travel is growing’ and Heathrow’s capacity shortages are constraining business.

As George Monbiot recently pointed out, business air travel is consistently shrinking in the UK. Heathrow is no exception to this trend. In the period between 2000 and 2011, businesspeople’s travel numbers fell by 12%, compared to a rise of 17% in Heathrow leisure travellers.

The percentage of all Heathrow passengers travelling for business purposes fell from 38% in 2000 to 31% in 2011. Heathrow is becoming steadily less reliant on business travellers.

3, ‘Heathrow is principally a hub airport’

There is a grain of truth in this assertion. More passengers do use Heathrow in order to take connecting flights than any other major UK airport. And the percentage of all travellers taking connecting flights from Heathrow is tending to rise. But the percentage of connecting passengers is still only 34%, up from 30% in 2000.

(The number of terminating passengers, as opposed to those connecting to another flight, has barely changed in the last ten years. The sense that Heathrow is full to overflowing arises entirely from the increase in connecting traffic.)

4, ‘The connecting passengers are business customers – and we need to encourage their travel through London’.

This is untrue. 72% of connecting passengers in 2011 were leisure passengers, up from 70% in 2000. Today’s number is marginally higher than the 69% of all travellers who were flying for leisure through Heathrow. The growth in connecting passengers is driven by non-business travel.

In 2011 almost a quarter of Heathrow’s total passenger traffic arose from leisure travellers switching planes. In terms of absolute numbers, over 60% of the increase in passenger numbers at Heathrow between 2000 and 2011 arose from connecting leisure passengers.
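As a rough check on that arithmetic, the sketch below combines the percentages quoted in this post with approximate Heathrow passenger totals. The totals (roughly 64.6m passengers in 2000 and 69.4m in 2011) are my own approximations rather than figures taken from the CAA survey, so treat the output as illustrative.

```python
# Percentages from the post; the passenger totals are approximate and my own
totals = {2000: 64.6e6, 2011: 69.4e6}          # total Heathrow passengers (assumed)
connecting_share = {2000: 0.30, 2011: 0.34}    # share taking connecting flights
leisure_of_connecting = {2000: 0.70, 2011: 0.72}

connecting_leisure = {
    year: totals[year] * connecting_share[year] * leisure_of_connecting[year]
    for year in totals
}

share_of_2011_traffic = connecting_leisure[2011] / totals[2011]
share_of_growth = (connecting_leisure[2011] - connecting_leisure[2000]) / (totals[2011] - totals[2000])

print(f"Connecting leisure passengers, 2011: {connecting_leisure[2011] / 1e6:.1f}m")
print(f"Share of all 2011 Heathrow traffic: {share_of_2011_traffic:.0%}")
print(f"Share of the 2000-2011 growth in passengers: {share_of_growth:.0%}")
```

On these assumed totals, connecting leisure passengers come out at about a quarter of 2011 traffic and roughly 70% of the decade’s growth, consistent with the figures above.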

Perhaps like many others, I’ve never understood why the UK should inflict aircraft noise on millions of people in order to enable international travellers with no connection to the UK to switch from one flight to another. If travellers want to connect at Schiphol or Frankfurt rather than Heathrow, that seems perfectly OK to me. The argument that the UK’s status in the world depends on the possession of the biggest airport hub seems lamentably weak.

5, ‘UK business need Heathrow to expand, otherwise we will lose out to foreign companies’

The number of UK nationals using Heathrow for business purposes fell by 19% over the period. Foreign business travel numbers declined by only 3%. Part of the decline in UK numbers probably arises from the fall in intra-UK flights over the last eleven years as domestic links have moved to other London airports. But the number of international business travellers from the UK also fell much faster than the number of foreign businesspeople using Heathrow.

6, ‘Heathrow has flights to fewer destinations in China and other rapidly industrialising countries’ and so it needs more capacity

I tried to deal with this argument in an earlier article on this blog. In summary, Heathrow doesn’t connect to as many cities in China as some other airports. This is because the preponderance of Heathrow flights go to Hong Kong, from which UK travellers can connect to other cities in China. The number of flights from Heathrow to China is actually far greater than the numbers from other main European airports. The Heathrow/Hong Kong link dwarfs all other European connections, with almost three times as many flights as any other airport pair.

[1] The numbers in this piece are derived from the CAA’s 2000 and 2011 Passenger Survey Reports, to be found at http://www.caa.co.uk/default.aspx?catid=81. I used 2000 to avoid the impact of the 9/11 suspension of flights, and large consequent drop in air travel, in 2001.

 

Wind power variability

In his response to the article on wind power written by Mark Lynas and me, Professor Gordon Hughes says that gas turbines need to be kept running because the amount of electricity generated by wind varies so rapidly. This short note examines the actual variability of wind power generation over the last three months and compares it to the variability of total demand for electricity. I show that the demand for power is typically over ten times as variable as the supply of wind generated electricity. The point is this: if the National Grid can cope with large half hourly swings in the demand for electricity, then it can cope with the erratic supply from wind farms. Because supply and demand must balance on an electricity grid, swings in demand have exactly the same impact as similarly sized variations in supply. I analysed the electricity produced each half hour from 2nd July to today, 2nd October.

 

Degree of variation between adjacent half hours    Number of instances, 2nd July to 2nd October
Less than 50 MW                                    2698
51-100 MW                                          1087
101-200 MW                                         550
201-300 MW                                         91
301-400 MW                                         11
401-500 MW                                         2
501-600 MW                                         1
601-700 MW                                         1
Average variation                                  52 MW

 

The average variation in wind output was 52 MW. I then compared this figure to swings in total electricity demand in the same period. The average variation in demand was 678 MW, more than ten times as great as the average variability of wind output. In fact, the maximum variation in wind output between adjacent half hours (674 MW) was less than the average variation in total demand. The maximum half hourly swing in demand was almost four gigawatts, or about six times the maximum variation in wind power output.
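For anyone who wants to reproduce this comparison, a minimal sketch follows. It assumes you have exported the half-hourly wind and demand figures in megawatts to a CSV file; the file and column names are placeholders, not a real data feed.

```python
import csv

def half_hourly_swings(path, column):
    """Return the absolute MW change between each pair of adjacent half hours."""
    with open(path, newline="") as f:
        values = [float(row[column]) for row in csv.DictReader(f)]
    return [abs(b - a) for a, b in zip(values, values[1:])]

def bin_swings(swings, edges=(50, 100, 200, 300, 400, 500, 600, 700)):
    """Count swings in the MW bands used in the table above (swings above the top band are ignored)."""
    labels = [f"<= {edges[0]} MW"] + [f"{lo + 1}-{hi} MW" for lo, hi in zip(edges, edges[1:])]
    counts = {label: 0 for label in labels}
    for s in swings:
        for label, hi in zip(labels, edges):
            if s <= hi:
                counts[label] += 1
                break
    return counts

wind = half_hourly_swings("half_hourly_output.csv", "wind_mw")      # placeholder file and columns
demand = half_hourly_swings("half_hourly_output.csv", "demand_mw")
print(bin_swings(wind))
print(f"Average wind swing:   {sum(wind) / len(wind):.0f} MW")
print(f"Average demand swing: {sum(demand) / len(demand):.0f} MW")
```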

The National Grid can cope with much more rapid changes in the supply or demand for electricity than are currently ever likely from the use of the current number of wind farms.

Some criticisms can be made of my simple comparison. First, I am using data from a time of year when wind power generation is relatively low. In winter, the variability of wind generation will be greater. But variations in demand will also be much greater in the dark months of December and January. Second, it can be contended that demand variations are more predictable than swings in wind power. This point has some validity: demand moves up and down each day according to a relatively predictable pattern. However, unforecast deviations from the predicted level of demand can and do occur, and these will be far greater than today's wind variability; weather forecasting also allows good prediction of when wind output will increase or drop. Third, what is true today may not be true when the UK grid has to cope with perhaps five times as much wind power as at present. However, even if we multiply the maximum half-hourly variability of wind power in the last three months five-fold, we would still see less variation in supply than the maximum variation in demand experienced over the same period.

Wind does not impose on the National Grid a substantially greater burden of balancing supply and demand than already exists.

 

Spain's grid operator shows how CO2 changes as wind and solar vary.

One of the common responses to the article that Mark Lynas and I wrote for the Guardian earlier this week was to question our assumption that a lower fossil fuel share of total generation would result in lower emissions. It seemed obvious to us that if higher wind generation reduces the number of gas-fired power stations operating, it will cut CO2 emissions. This certainty was not shared by some readers. Other national electricity grids provide real-time data on CO2 emissions that may help settle the issue. Non-UK data will enable enthusiasts - for want of a better word - to track estimated greenhouse gas emissions and watch how they change as the balance of supply on the grid adjusts to higher and lower wind.

I’m a particular fan of the Spanish graphics.

https://demanda.ree.es/demandaEng.html

Follow the yellow line with your mouse and the site will show the CO2 emissions and generation figures for each time period. The wonderful pie chart on the right adjusts to reflect the changing balance of supply. If you want to check that wind cuts emissions without doing any boring spreadsheets, look through some of the days this week when the wind was blowing and compare them with days when it was not. The data for previous days can be shown by adjusting the date in the bottom left-hand box. The tabs on the bottom right allow you to switch between CO2 emissions and power output.

Really, really lovely visualisation of data.

Wind reduces carbon emissions

Plenty of people still say that wind power is useless because fossil fuel power stations still have to operate to back up the wind turbines just in case the wind drops. This is an incorrect view. Yes, the operator of the national electricity grid has to have surplus capacity waiting to generate in case of the unexpected loss of a major power station, as occasionally happens. But the amount of wind power rises and falls relatively slowly across the UK and the grid doesn’t need to have a separate back-up to deal with the variation in this source of power.

Put at its simplest, when the wind is blowing the UK is using less fossil fuel to create electricity. This saves money on fuel and reduces carbon emissions.

In an article on the Guardian web site to be published on Wednesday 25th, Mark Lynas and I estimate the reduction in greenhouse gases arising from the UK’s rapidly growing fleet of wind farms. Our calculations use data produced every half hour for the three months to 20th September 2012 to show that – on average – a one gigawatt increase in wind power reduces the amount of gas generation by almost exactly the same amount. If you are that type of person, all the numbers are available to play with on the Guardian’s pages.

The last couple of days have also been very windy, so I thought it might be a good check to look at the data and show visually that wind turbines save carbon emissions. This chart uses only 48 data points, compared to the 4,400 in our main statistical analysis, but the technique employed is the same. We estimate the expected level of gas-fired output (which varies strongly with the demand placed on the national grid). Then we plot the difference between the actual and expected use of gas power stations against the level of wind power available. The correlation is immediately clear to the eye.
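The sketch below shows one simple way to run this kind of check, assuming half-hourly demand, gas and wind figures are available in a CSV file (the file and column names are placeholders). It fits a straight line of gas output against demand and then asks how the residual moves with wind; this is a simplified stand-in for the fuller analysis on the Guardian’s pages, not a copy of it.

```python
import csv
import numpy as np

with open("half_hourly_output.csv", newline="") as f:   # placeholder file
    rows = list(csv.DictReader(f))

demand = np.array([float(r["demand_mw"]) for r in rows])
gas = np.array([float(r["gas_mw"]) for r in rows])
wind = np.array([float(r["wind_mw"]) for r in rows])

# Expected gas output modelled as a linear function of demand
slope, intercept = np.polyfit(demand, gas, 1)
expected_gas = slope * demand + intercept
residual = gas - expected_gas

# How much does gas output fall for each extra MW of wind?
wind_effect = np.polyfit(wind, residual, 1)[0]
print(f"Each extra MW of wind displaces roughly {-wind_effect:.2f} MW of gas generation")
```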

Lord Lawson gets his facts wrong

In a speech on climate change to an Italian conference, Nigel Lawson concluded with a fierce attack on the honesty of China’s published policies on climate change. He said that the country has ‘no intention whatever of taking the decarbonisation route’ despite its strong public stance on climate change. China, Lord Lawson continued, ‘firmly intends to remain a carbon-based economy’. He implied that the massive growth in wind turbine installation has happened merely to ‘impress credulous foreigners’. Most of the country’s large number of highly successful solar PV manufacturers ‘are on the brink of bankruptcy’, he said. The purpose of this article is to provide some of the figures to contest Lord Lawson’s assertions. Does China intend to remain a ‘carbon-based economy’, in Lawson’s words?

Over the last five years, China has consistently pointed to the severe impacts of climate change. This is what a government white paper said in November 2011: ‘Climate change generates many negative effects on China's economic and social development, posing a major challenge to the country's sustainable development.’ The government further says that ‘China is one of the countries most vulnerable to the adverse effects of climate change’.

The white paper goes on: ‘In recent years, worldwide heat waves, droughts, floods and other extreme climate events have occurred frequently, making the impact of climate change increasingly prominent’. The government’s response has been to introduce some of the most aggressive carbon reduction programmes in the world. It claims ‘remarkable results’ and says that the current five year plan ‘established the policy orientation of promoting green and low-carbon development, and expressly set out the objectives and tasks of addressing climate change.’

Can we believe these statements any more than similar claims made by other governments? Or should we follow the Lawson line that China is dishonestly pretending to back renewable energy in order to increase the sales of its PV and wind turbine manufacturers? This question matters enormously: if China continues to expand its economy on the back of increased use of coal, the prospects for an early peak in global emissions are substantially reduced.

China’s share of installations of the major renewable technologies

Chinese companies have aggressively expanded the number and average size of renewable energy projects. Table 1 gives an estimate of the share of total global installations.

Installed capacity at end 2011 (in gigawatts, GW)

         World capacity   China capacity   China share
Hydro    970              210              22%
PV       70               3                4%
Wind     238              62               26%

 

In other words, China has over a quarter of the world’s total wind generating capacity. Chinese wind resources are substantial, particularly in the west of the country, and huge further expansion is possible: one source suggests a potential of over 2,000 gigawatts. Nevertheless Lawson asserts that the huge number of Chinese turbines is nothing more than a front for the sales efforts of its manufacturers, stating that ‘almost half’ are not actually connected to the electricity grid. This assertion is incorrect. Just over 50 GW of capacity was delivering power in June 2012, or almost 80% of the installed wind turbines. It is certainly true that large amounts of investment are needed to connect wind farms in the west of China to industry and homes in the east, and while this is happening wind farms often have to wait for new transmission lines. But no electricity company in the world erects turbines without planning to have them generate revenue from the sale of power.

China has almost a quarter of worldwide hydro-electric power capacity. The massive Three Gorges dam, which finally reached full power this year, is the most important of its plants but is only about 10% of China’s total hydro capacity. Several other huge projects are under construction.

Until last year, most of China’s solar panels were exported. The high feed-in tariffs in Italy and Germany provided a substantial market and helped push the cost of PV down to less than half the figure of even a few years ago. The Chinese government responded by introducing its own PV feed-in tariff and local installations soared in 2011.

New capacity in 2011

China’s investment in renewable technologies in 2011 dwarfed all other countries.

Capacity added in 2011 (in gigawatts, GW)

         World growth in 2011   China growth   China share of growth
Hydro    25                     12             49%
PV       30                     2.5            8%
Wind     40                     18             44%
 

Of the 40 GW of wind power added worldwide, China’s share was almost half – about three times that of the USA, the next most important market. China took a similar share of new hydro capacity. Its PV installations were smaller, but grew from a very much smaller base, and the country completed by far the world’s largest single PV farm in 2011, a park of about 200 MW.

Share of investment captured by renewables

According to the respected industry newsletter Platts Power in Asia, China invested about $53bn in electricity generation and transmission in the first seven months of this year.(1) This was split roughly 50:50 between transmission and new power stations.

Share of electricity capital expenditure

Power generation 51%
Of which, fossil fuel power 14%
Of which, non fossil power 36%
Power distribution (‘the grid’) 49%

 

Similar total amounts were invested in the corresponding months of 2011. The percentage of all expenditure devoted to renewables and nuclear was also about 36% in that year. The share devoted to fossil fuel plants fell from 17% to 14% between 2011 and 2012. The impression conveyed by Lawson that China continues to emphasise investment in fossil fuel sources is wrong: China puts over twice as much money into low carbon technologies as it does into coal and gas power stations.

Low carbon technologies require much more capital investment per unit of expected annual electricity output. (On the other hand, they cost much less in annual operating expenses.) As a result of its investment China added about 31 GW of new electricity capacity in the first seven months of the year, of which about 18 GW uses fossil fuel. Wind and hydro accounted for about 11 GW. I cannot find an accurate figure for solar PV but it was probably about 2 GW. (For comparison, the total fossil fuel generation capacity of the UK is about 70 GW.)

Share of electricity generation held by renewables and nuclear

China’s economy continues to grow at high rates, but electricity demand now grows substantially less fast than GDP as a result of energy efficiency improvements, particularly in industry. Measured power generation in July 2012 was only 2% higher than in July 2011, and electricity production from fossil fuels actually fell from 337 to 322 terawatt hours between those two months. (For reference, total UK electricity use is about 350 terawatt hours a year.)

Coal and gas generation still provides almost three quarters of total electricity production. But non-carbon sources produced 26% of Chinese electricity in July 2012, up from 20% a year earlier. The increase was largely due to higher rainfall, which lifted hydro production from 68 to 92 terawatt hours. But wind power rose by over 50% to provide 1.5% of total Chinese electricity.

Lord Lawson threw several insults at the Chinese wind industry. As well as claiming that ‘almost half’ the turbines are not connected, he said that wind would not reach 1.5% of electricity production until 2015. July 2012 proved him wrong.

Lawson also said that PV would account for only 0.1% of Chinese power production in 2020. Precise figures are not easily available, but on the basis of the installed capacity of panels I calculate that this pessimistic figure was also exceeded – quite comfortably – during July 2012. As PV installations are growing at an extremely rapid rate, 2012 annual production is likely to be at least twice what Nigel Lawson predicted for 2020. Lawson’s prediction was made in August 2012, suggesting that his researchers at the Global Warming Policy Foundation are simply not keeping up with the pace of Chinese investment in clean technology. A couple of days’ careful research would have shown that his figures for wind and solar do not remotely reflect the current reality.

Future trends

I’ve tried to suggest so far that China is investing extremely heavily in low carbon sources of electricity and that this capital expenditure is beginning to show in the total capacity for power generation and in electricity output.

What about the longer-term future? Nigel Lawson focused his disparaging remarks on the position in 2020. Can we make reasonable estimates of the share held by renewables in eight years’ time? We have to guess at rates of electricity demand growth and forecast how much cash will continue to flow into newer types of electricity generation.

In the case of wind power, a Chinese state research organisation worked with the International Energy Agency to produce a roadmap for 2020. The report suggested a possible total of 200 GW of wind capacity, or nearly three times today’s total UK power generation capacity. Hydro will rise from about 210 GW to about 330 GW, an increase of over 50%. I cannot find an official figure for PV, but I think a figure of 70-75 GW is easily possible. Nuclear capacity is forecast by a leading state organisation to rise from about 11 GW to about 70-80 GW.

If overall electricity production rises at 5% a year – very low by the standards of the last decade but in excess of recent figures – the major renewables and nuclear will capture just under 35% of the electricity market by 2020. Biomass will add to this but I cannot find an estimate for this technology at the end of the decade. Wind will contribute over 5%, nuclear 8% and solar PV in excess of 1%. Hydro power will still dominate with about 20%.
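The arithmetic behind these shares is straightforward, and the sketch below reconstructs it. The 2020 capacities are those given above; the capacity factors, the generation base and the eight-year horizon are my own round-number assumptions, so the resulting shares are indicative only and will not match the figures in the text exactly.

```python
HOURS_PER_YEAR = 8760

capacity_2020_gw = {"hydro": 330, "wind": 200, "nuclear": 75, "solar PV": 72}        # from the text
capacity_factor = {"hydro": 0.42, "wind": 0.22, "nuclear": 0.85, "solar PV": 0.15}   # my assumptions

generation_2012_twh = 5000   # assumed base for total Chinese output
growth_rate = 0.05           # 5% a year, as in the text
total_2020_twh = generation_2012_twh * (1 + growth_rate) ** 8

for tech, gw in capacity_2020_gw.items():
    twh = gw * capacity_factor[tech] * HOURS_PER_YEAR / 1000
    print(f"{tech:9}: {twh:5.0f} TWh, about {100 * twh / total_2020_twh:.1f}% of 2020 demand")
```

With these assumptions wind comes out at just over 5%, nuclear at close to 8% and PV above 1%; a different hydro capacity factor or demand base moves the totals by a few percentage points either way.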

July 2012’s electricity production was only 2% above a year earlier. If this pattern continued, the major renewables and nuclear would contribute over 43% of supply in 2020. But whatever the pattern of growth in demand, China’s investment in renewables and nuclear may mean that fossil fuel use barely rises in the next decade. This isn’t misplaced altruism on China’s part. Though its coal reserves are large, they will only provide about 35 years of power at current rates of consumption. Oil is scarce. Natural gas reserves may be more plentiful, but availability will depend on fracking. Just as important, the poor air quality in cities caused by coal burning is affecting health. And climate change, almost invisible in the temperate UK, is already severely affecting China’s western regions.

Lord Lawson seems to be utterly wrong: China is making prodigious efforts to hold down its fossil fuel use in line with its international commitments and its own national self-interest.

(1) Thank you very much to Raj Gurusamy of Platts for providing me a copy of this newsletter.

At last, a plausible biofuel

Most species of algae contain oil that can be refined to make motor fuel. For ten years or more, entrepreneurs have been looking at ways of growing algae quickly, harvesting the product and then crushing it to create ‘green crude’ oil. What is probably the first commercial-scale algae production plant has just opened in New Mexico. Does it look as though algae-to-fuel will be commercially viable? It is certainly a vastly better biofuel than corn ethanol but doesn’t yet appear to compete with solar PV as a source of low-carbon motor fuel. (I’m classing electricity as a fuel for cars.) Through photosynthesis, almost all algae capture CO2 from the air to fuel their growth. They produce oil, and when this oil is burnt the CO2 returns to the air, so motor fuel made from algae can claim to be close to carbon neutral. Most algae grow best in warmth and strong sunlight, meaning that production is potentially well suited to sunny deserts where the land has few alternative uses.

The Sapphire Energy bio-refinery in Columbus, New Mexico is an extraordinary new venture that demonstrates the viability of commercial algae cultivation. Make no mistake, this seems to be a huge technical success. Huge ponds 200 metres long grow different species of algae depending on the time of year. The refinery extends over 120 hectares (300 acres). All the processing is done on site. It doesn’t use potable water.

By most metrics, Sapphire’s plant isn’t yet competitive with solar PV. And since most places (hot deserts) that are suited to algae are also suited to PV, the long-term future of algae refineries isn't immediately clear. Nevertheless, my quick calculations suggest that the plant produces about five watts of power for each square metre of space.  (The numbers to support this assertion are appended at the bottom – if you find a mistake, please tell me). By contrast, a big solar farm in sunny New Mexico may achieve 15 watts/sq metre, about three times greater.

(Is there a logic to this difference? Yes: PV cells turn about 15% of the energy falling on them into electricity, although not all the light reaching a solar farm is captured because of the gaps between rows of panels. Photosynthesis is much less efficient, averaging less than 2% in most circumstances. So PV will always tend to be more efficient in terms of energy conversion per unit area.)

Although algae cannot easily compete with PV at generating power, they are far better than corn ethanol, a petrol substitute made from corn grain. My calculations suggest that land growing corn/maize produces about 0.2 watts per square metre, less than a twentieth of the figure for algae. So the mad and immoral policy of mandating that almost half the US corn crop be converted into motor fuel is clearly an extremely inefficient way of generating biofuels. Algae production is a much, much better tool for the decarbonisation of oil.

So does it matter that PV is better at converting light into fuel than algae? In the US, with its huge resources of unused desert, probably not. Sapphire has produced some estimates of what its new bio-refinery will produce and my quick calculations suggest that the entire crude oil need of the country could be grown on about 3% of the area of the continental US. This is a huge expanse of land, but slightly less than the approximately 3.5% given over to growing corn.

But more important than the land taken up by algae are the capital and operating costs of a bio-refinery. The company’s press releases suggest that the cost of constructing the algae farm exceeded $135m. Dividing this by the energy value of the oil output suggests a capital cost of about $2.40 for every yearly kilowatt hour. In the same sun-drenched location, PV would cost around a third as much. Of course, the cost of the refinery reflects the fact that it is an ambitious prototype. Perhaps the cost will halve by the time the tenth bio-refinery is constructed? But it would still be more expensive than PV is today.

The much higher operating costs of the algae farm also tilt the economics in favour of PV, which needs no permanent workforce once it is constructed. But all these disadvantages are comprehensively outweighed, you might say, by the fact that Sapphire Energy delivers liquid energy, able to be poured into cars and planes as a direct replacement for refined oils. Algae may well turn out to be the most efficient way of generating low-carbon liquid fuels outside those areas lucky enough to be able to grow sugar cane. It is difficult to see any other way of replacing aviation fuel at reasonable cost to people and planet.

Nevertheless, the better overall performance of PV should cause us to hesitate before backing algae as a petrol replacement for land-based vehicles. And there’s one crushing final argument in favour of using electricity for powering cars. The energy coming from a PV panel flows into the grid and is extracted to charge a battery in an electric car. The car then uses this electricity with about 80% efficiency (the ratio of useful power delivered to the wheels to the energy value of the electricity taken from the grid). By contrast, even efficient modern internal combustion engines only achieve about 25% conversion. Not only are algae plants probably more costly, far more land-hungry and more expensive to run, they also produce a fuel that delivers about a third as much motive energy as electricity. Fantastic achievement, Sapphire Energy, but we should still be pushing for electric cars.

 

Energy performances per square metre – back of the envelope numbers drawn from Sapphire Energy’s press releases

Expected eventual oil output per day – 100 barrels

Energy value of a barrel of oil – 1,600 kWh

Daily energy value of algae oil – 160,000 kWh or 160 MWh

Yearly energy output – 58,400 MWh

Space used – 120 hectares

Energy output per hectare – about 487 MWh

Square metres per hectare – 10,000

Annual output per sq metre – about 48.7 kWh

Average continuous output per sq metre – about 5.5 watts, compared with substantially less than 1 watt for other biofuels.
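For convenience, here is the same arithmetic written out as a short script. The inputs are the appendix figures above; the PV and corn comparators and the roughly $135m construction cost are the values quoted earlier in the post.

```python
# Figures from the appendix above
barrels_per_day = 100
kwh_per_barrel = 1600
site_hectares = 120

daily_kwh = barrels_per_day * kwh_per_barrel              # 160,000 kWh per day
yearly_mwh = daily_kwh * 365 / 1000                       # about 58,400 MWh per year
sq_metres = site_hectares * 10_000
watts_per_sq_metre = yearly_mwh * 1e6 / 8760 / sq_metres  # average continuous power

print(f"Yearly energy output: {yearly_mwh:,.0f} MWh")
print(f"Average power density: {watts_per_sq_metre:.1f} W per square metre")
print("Comparators quoted in the post: PV ~15 W/m2, corn ethanol ~0.2 W/m2")

# Capital cost per yearly kilowatt hour, using the roughly $135m construction cost
capital_cost_usd = 135e6
print(f"Capital cost: ${capital_cost_usd / (yearly_mwh * 1000):.2f} per yearly kWh")
```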


No rationale for Heathrow expansion

A growing number of influential people are saying that Heathrow needs a third runway. The main arguments being voiced are:

a) There’s a crushing shortage of airport capacity in the South East of England.

b) Heathrow’s status as a business airport is threatened by congestion.

c) The shortage of runway slots means that flights to China and other important business destinations are unavailable.

The numbers don’t support any of these conclusions.

a) Shortage of airport capacity in the South East

The number of flights handled by London airports in the year to the first quarter of 2012 was about 975,000, down just over 9% from four years earlier, when the figure was 1,073,000. In other words, we know that London can handle at least 98,000 more flights a year than it does at the moment. There’s no shortage of capacity.

b) Heathrow’s status as a business airport is threatened by congestion

Only 37% of Heathrow passengers were business users in 2011; 63% were leisure fliers. Any capacity problem is therefore caused mainly by leisure passengers.

c) Flights to China and other boom economies are too few in number

The number of passengers flying to and from the Far East, including China, fell about 2.5% between 2007 and 2011. Only about 3% of UK passengers were going to the Far East. More went to Switzerland.

More details about the abundance of UK flights to China can be found in an earlier post on this web site.

 

Business air travel is not rising. This may be a function of the state of the world economy or it might reflect a decreasing need for people to get into aeroplanes to do business with each other. The arguments of politicians and journalists that suggest that UK exports are being held back by the lack of Heathrow capacity are palpably weak.

After a ruling from the Competition Commission, Heathrow's owners have sold Gatwick and have recently reluctantly agreed to dispose of Stansted. Is it unfair to suggest that the public relations campaign to try to convince us that UK plc is being held back by the lack of a third runway at Heathrow is driven simply by a desire to win back business from the newly competitive Gatwick and the threat from Stansted?

Matt Ridley says we are apocalypse junkies

The ever-stimulating science writer Matt Ridley has just published another of his doom-laden warnings about human susceptibility to doom-laden warnings. He tells us that the history of the last fifty years shows that when policy makers are goaded into action by naïve environmentalists, they invariably make things worse.  Scientists exaggerate the potency of ecological threats and their expensive cures often achieve nothing. His closing theme is an increasingly common one: ‘why should we trust the scientists on climate change, when they have been wrong about every single environmental issue of the last half century?’ Fifteen years ago, the Economist published a very similar article to Ridley’s in an attempt to get us to stop worrying about global warming. The examples of false catastrophe were strikingly similar to those that Ridley uses this week: the population explosion, the depletion of oil and gas, acid rain, cancer causing chemicals, exhaustion of metal supplies, food production, Ebola virus and so on. In both cases, the writers are eager to tell us that things are actually getting better every day and ecologists should button their lips. Time for a quick retrospective check: how well has the Economist’s Panglossian optimism about the next decades been matched by reality? Very badly indeed, it turns out.

The price of metals

The Economist article started with the conventional attack on Malthus (also a target for Ridley, of course) for suggesting that population growth would outstrip food supply. But it rapidly switched to the 1972 Club of Rome report (Ridley goes for this as well), which projected a rapid exhaustion of available resources of important minerals, such as metal ores. Prices would rise rapidly, said the Club.

What foolishness, exclaimed the Economist in 1997, as it displayed a chart showing metal prices falling by nearly 50% since the apocalyptic report. There is no shortage of ores and no reason for concern.

How have things changed since 1997? Below is a chart showing the price of probably the most important metal, copper (US $ per tonne). The cost has more than quadrupled since the Economist article. Other metals have also risen substantially in price.

The magazine went on to make the obligatory reference to the bet between Paul Ehrlich and Julian Simon. Ehrlich, a deep environmental pessimist, lost money to Simon, who had correctly predicted that metal prices would fall.

Ridley also covers the wager in detail. Surprisingly, he makes no mention at all of the sharp rise in the price of most commodities since 1997. Instead, he says, ‘they grew cheaper’, a comment that will surprise anyone buying any industrial commodity in 2012. The Economist made light-hearted fun of the school textbooks that said that minerals would run out. Ridley does the same.

[Chart: copper price, US $ per tonne, since the 1997 Economist article]
Food

In 1997, The Economist showed that food prices had fallen significantly since 1960. The prevailing pessimism about agricultural yields was unwarranted.

Was its rosy view of the future correct, or would food prices start to rise again? Unfortunately for the world, food prices have become much more volatile and typically much higher than they were. How does Matt Ridley deal with this inconvenient fact? He says that ‘food prices fell to record lows in the early 2000s’ but that ‘a policy of turning some of the world’s grain into motor fuel has reversed some of that decline.’ Not quite correct: world food prices over the last five years have been substantially higher in real terms than at any time since 1990.

Cancer

In 1997, the Economist said that mortality from cancers not related to smoking ‘is falling steadily’ in the age group 35-69. Ridley repeated the claim last week saying that ‘in general, cancer incidence and death rates, when corrected for the average age of the population have been falling now for 20 years’.

Not strictly true: according to research from Cancer Research UK, 61,000 people in the 40-59 age group were diagnosed with cancer in 2008 compared to 48,000 in 1978, a substantial rise even after taking into account increased population numbers. Part of this increase is due to better screening and earlier diagnosis, but Ridley is choosing to ignore the many troubling signs that cancer rates may well now be rising. The Economist of 1997 and the Ridley of 2012 pour ridicule on the idea that ‘chemicals’ have much to do with cancer incidence – and they are probably right – but any complacency over the number of cases is severely misguided.

Amazonian deforestation and deserts

The Economist said that the problem was exaggerated and the area logged each year was falling. True: 1997 saw a figure of only 1.3m hectares (about 5% of the area of Great Britain). But by 2004, Amazonian deforestation had risen sharply again, to a level over double the earlier figure. After sustained action from the Brazilian government the rate of loss has fallen, but almost 20% of the total forest has now been lost.

Even more surprisingly, the Economist felt able to assert that in dry areas there had been ‘no net advance of the desert at all’. The UN thinks differently today, suggesting that 23 hectares are lost to the desert every minute. Unusually, Ridley doesn’t mention this theme at all, probably acknowledging the overwhelming evidence that fragile drylands are turning into deserts at uncomfortably rapid rates.

Acid rain

It’s on acid rain caused by power station emissions that the Economist of 1997 and Ridley of 2012 are most at one. They even use the same quotation from a 1990 US government report. Ridley calls acid rain ‘a minor environmental nuisance’ and both authors point to 1980s opinions that acidification didn’t affect the total volume of standing wood, once thought to be a severe threat. Ridley asserts that there is little evidence of any connection between acid deposition and increasing acidity of streams and lakes. (He is in a very small minority in his scepticism on this).

Both the Economist and Ridley imply that the environmentalists who demanded restrictions on the pollution from coal-fired power plants had needlessly panicked. Woodlands ‘thrived’ in a more acidic environment said the Economist. Not so, says the US government in its latest (2011) report on the impact of acid rain. ‘Despite the environmental improvements reported here, research over the past few years indicates that recovery from the effects of acidification is not likely for many sensitive areas without additional decreases in acid deposition’. Even now, the acidification of land and rivers remains a serious problem.

****

To both these authors, separated by fifteen years, environmentalists constantly over-estimate the impact of humankind on the world’s ecological systems. The globe, they say, is a much more robust organism than we think and can withstand our meddling. We should look for technological solutions to ecological problems and not needlessly impose costly regulation.

Most sceptics have the intellectual honesty to stop at this point and admit that the degradation of stratospheric ozone is a good counter-example. If the planet’s governments hadn’t introduced a restriction on ozone-depleting chemicals such as CFCs in the late 1980s and early 1990s, the ozone hole would still be rapidly increasing and letting in increasing quantities of dangerous UV-B radiation. (Too much UV-B causes skin cancer in humans and some animals and affects plant growth).

Matt Ridley won't have any of this nonsense. In a jaw-dropping series of paragraphs, he asserts that the connection between CFCs (and other chemicals known to react with ozone) and the decreases in ozone levels is unproven. The careful work by Paul Crutzen and others, which won a Nobel Prize for showing how a single atom of chlorine can unbind many molecules of ozone, is not good enough. Nor is the evidence of the impact of the ozone hole on skin cancer. Ridley says that ‘the mortality rate from melanoma actually levelled off during the growth of the ozone hole’. It’s not unfair to describe this conclusion as utter nonsense: increasing skin cancer incidence has been linked to rising UV-B radiation for several decades.

But to Matt Ridley it seems more important not to allow the environmentalists to be right about anything. He appears to be worried that if his readers believe scientists were right – just once, in the 1970s – to link man-made chemicals to the extreme dangers of rapid ozone destruction, they might also believe scientists are correct to say that global warming threatens mankind’s future. I really think we could have expected more from one of Britain’s best writers on science.

 

James Hansen on extreme weather events

James Hansen’s recent paper uses detailed temperature records to demonstrate that the chance of an area experiencing extremely hot summer weather has increased dramatically in recent years. Several similar publications have shown recently how climate change has increased the likelihood of very adverse weather. Scientists like Hansen do this work because they are wrestling with the need to communicate to the general public that global temperatures won’t increase every year but that the chance of extreme events in the form of ferocious heat or catastrophic rainfall is rising rapidly. Hansen’s conclusions are important in that they are the first attempt to show how the frequency of very high land temperatures has risen. But the second major finding of his paper has not been noted by the scientific press: temperature variability has increased. Not only has the average temperature risen but the distribution of temperature has widened, meaning that extremes are more likely. We didn’t know this, and the finding is deeply worrying.

The ‘bell curve’

Many natural phenomena demonstrate a pattern called the ‘normal distribution’ or ‘bell curve’. Measure the height of Finnish women or the IQ of Singaporean children and the results will follow a predictable form that resembles the shape of a bell. Most observations are clustered around the mean with increasingly small numbers of results away from the central peak.

Temperatures follow this pattern. If I logged the average noon temperature for August days in London each year, I would find that the observations followed a bell curve pattern. Findings would be grouped around a central (mean) figure. The number of years above this level would be approximately equal to the numbers below it. The shape of the observations would be roughly symmetrical above and below the mean.

The bell curve is reassuringly familiar. In fact, it seems to me that humankind naturally assumes that most natural phenomena follow this pattern. This ‘normal distribution’ follows a clear statistical pattern. A calculation called the standard deviation describes the width of the curve. Some distributions are quite tight, meaning that the standard deviation is small and the curve falls away sharply from the mean. Others are fatter, with a high standard deviation. Whatever the size of the standard deviation, a proper bell curve has about 68% of all observations within one standard deviation of the mean. This proportion is almost universal.

Hansen’s paper calculates the standard deviation of summer temperatures (June-July-August) over land in the northern hemisphere. He shows that the standard deviation was about 0.5 degrees C in the period 1950-1980. This number tells us that about 68% of all average 24-hour temperatures over the three month period for a particular spot will fall within the range +0.5 degrees to -0.5 degrees of the average. So if London’s average (24-hour) temperature is 16 degrees in the summer, it will be within the range 15.5 to 16.5 degrees in just over two thirds of all years. This is quite a tight curve. Extreme variations of, say, +2.0 degrees are therefore very rare indeed.

Most models assume that the impact of climate change will be to shift the bell curve upwards. So if land temperatures rise by an average of 1 degree C, then London’s summer warmth will also rise by 1 degree, and the standard deviation will stay the same. One standard deviation (68% of observations are within this figure) will remain 0.5 degrees C. This is a convenient assumption: it implies that the effects of climate change are predictable and smooth. All that happens as the world warms is that the curve of likely future summer temperatures rises but the shape of the curve remains the same.

Hansen and his colleagues show that this is probably an incorrect assumption. They demonstrate that the curves of temperature are widening. The mean temperature is rising but the probability of extremely warm periods is increasing as well. The change isn’t massive. Hansen says that for the average spot in the northern hemisphere the standard deviation was 0.5 degrees in the period 1950 to 1980 but had risen to 0.54 or so in the period 1981 to 2010.

Also, the curve of possible outcomes is no longer symmetrical. Assessed against the average, the chance of very warm summers has increased sharply but the likelihood of colder summers has risen much less. (This means that the curve of temperatures doesn’t resemble a true bell curve any more – there’s a bulge on the higher side). All-in-all, the chance of really hot summers has increased quite sharply, even when assessed against a rising average global temperature. Global warming is significantly raising the chance of really extreme hot periods.
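To see what a warmer mean and a slightly wider curve do to the odds of an extreme summer, here is a small illustration using the standard deviations Hansen reports. The size of the mean shift is my own round-number assumption, so the exact ratio is only indicative.

```python
import math

def prob_hotter_than(threshold_c, mean_c, sigma_c):
    """P(summer temperature anomaly > threshold) for a normal distribution."""
    z = (threshold_c - mean_c) / sigma_c
    return 0.5 * math.erfc(z / math.sqrt(2))

threshold = 1.5  # an anomaly three 1950-1980 standard deviations above the old mean

baseline = prob_hotter_than(threshold, mean_c=0.0, sigma_c=0.50)   # 1950-1980 climate
shifted = prob_hotter_than(threshold, mean_c=0.5, sigma_c=0.54)    # assumed +0.5 C mean shift, wider curve

print(f"Chance of such a summer under the 1950-1980 curve: {baseline:.5f}")
print(f"Chance with a 0.5 C warmer, slightly wider curve:  {shifted:.5f}")
print(f"Roughly {shifted / baseline:.0f} times more likely")
```

On these assumptions a summer that was a once-in-several-centuries event under the old climate becomes more than twenty times as likely; widening the curve adds to what the mean shift alone would do.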

Until now, most climate scientists have assumed that the bell curve will keep its shape. If Hansen’s research is correct and rising greenhouse gas concentrations are producing a sharper increase in extreme hot events than predicted, we have yet another reason to worry. Adapting to a changing climate is more difficult if the extremes of hot weather or major rainfall are more severe (as we have seen in the American corn belt and the rain-hit north-west of England this summer).

Even more fundamentally, some scientists have wondered whether the earth’s response to rising greenhouse gas concentrations will itself follow a bell curve pattern. A doubling of pre-industrial greenhouse gas concentrations, which seems an increasingly likely outcome by 2050, has been predicted to increase temperatures by between 2 degrees and 4.5 degrees, with a probability distribution within this range that resembles a traditional bell curve. Marty Weitzman at Harvard has led the questioning of whether this is a reasonable assumption. He suggests that the right hand side of the probability curve may be much ‘fatter’ than the left (therefore resembling what is known as a Pareto distribution rather than a normal curve – think of the shape of a beached whale). Hansen’s latest gloomy paper gives some important support to Weitzman’s hypothesis of the fat rightward tail.

This may seem abstruse and academic statistical worrying. It is not. Humans are brought up to expect the normal distribution in natural phenomena. Measure the height of your colleagues at work, or the time they take to drink a cup of coffee and you will find an approximately standard bell curve. We have instinctively assumed that temperature changes will continue to exhibit the same probability distribution as they have in the past. Remove that comfortable assumption and we have another major uncertainty to worry about.


Two reverse ferrets on energy policy

British journalists use the expression ‘reverse ferret’ to describe an abrupt change in an organisation’s stance on an important issue. An important feature of a good reverse ferret is that the switch must never be acknowledged. In the last week the Department of Energy and Climate Change (DECC) has reversed five years of British policy in two crucial ways. First, it has abandoned any pretence of technology neutrality in sponsoring additions to electricity generation capacity and now supports nuclear and gas in preference to renewables. Second, it has indicated that gas-fired generation is no longer assumed to be accompanied by carbon capture and storage (CCS) by 2030.

A successful reverse ferret is usually accompanied by a decoy: a story that distracts journalists’ attention while the U-turn is carried out. In this case DECC allowed a minor competing story about the rate of change in wind subsidies to attract press coverage. Masterly work, at least if you don’t worry too much about climate change.

The end of the orthodoxy of technology neutrality.

In its Carbon Plan of December 2011, published less than eight months ago, DECC wrote ‘In the 2020s, the government wants to see nuclear, renewables and CCS competing to deliver energy (meaning electricity) at the lowest possible cost. As we do not know how costs will change over time, we are not setting targets for each technology..’.

This summarised the energy policy of the UK government: it would not set targets but would treat each potential source of low carbon electricity equally, and whichever technology forced down costs fastest would end up as the dominant provider of electricity. That has all changed. The ministerial announcement on support for renewables on 25th July reduced support (as expected) for wind and for large scale solar PV. Onshore wind now gets 0.9 Renewable Obligation Certificates (ROCs), worth about £40 a megawatt hour. PV will no longer be eligible for ROCs and will have to rely on the feed-in tariff of about £68 a megawatt hour, a figure which will be cut to about £41 by 2015. (In both cases, these subsidies will be supplemented by payment for the electricity itself, probably at about £45 per MWh.)

Where does this leave onshore renewables compared to nuclear? Nuclear will benefit from a different form of subsidy, the so-called ‘contract for difference’. In all important respects this is a feed-in tariff, disguised so that government ministers can claim that nuclear receives no direct subsidy. The Times recently reported that the nuclear industry was demanding a feed-in tariff of £165/MWh. Denials rapidly followed from both the government and the electricity generator, and the level at which the tariff is set will probably be around £130/MWh. This support will continue for several decades.

Total payments for low-carbon electricity

Onshore wind £95/MWh
Solar farms £123/MWh (falling sharply to around £96 by 2015)
Nuclear £130/MWh

 

Nuclear power is going to be subsidised far more heavily than low-cost renewables. This may well be a logical decision by government. Without baseload nuclear power, guaranteeing electricity supply is going to be very tricky. But let’s be clear: nuclear is going to receive a higher rate of financial support, guaranteed for longer, than the currently lowest cost renewables. In order to make the nuclear renaissance happen, we now see huge subsidies to draw in EdF and Chinese money. Financial neutrality has gone. We now have an industrial policy that incentivises one technology against another.

Support for gas

Until a few months ago, government policy documents routinely asserted that almost all electricity production would be low-carbon by 2030. The amount of CO2 emitted from power stations would have to fall to an average of a fifth or even a tenth of current levels. If gas or coal were used, they would have to be accompanied by CCS. The December 2011 Carbon Plan said ‘Fossil fuels without CCS will only be used as back-up electricity capacity at times of very high demand’.

That commitment has gone. The 25th July ministerial statement said ‘We do not expect gas to be restricted to providing back up to renewables’. If gas remains cheap ‘we expect it to continue to play a key role ensuring that we have sufficient capacity to meet everyday demand and complementing relatively intermittent and inflexible generation’.  It is only ‘in the longer term (that) we see an important role for gas with CCS’. The statement didn’t admit this, but the carbon targets for 2030 have in consequence been abandoned.

Accompanying the new explicit support for gas was a nice sweetener for the offshore exploration industry. A fund of £500m was announced to back investment in less financially attractive gas fields. We should put that in context. The current support regime for marine renewables is expected to provide £50m for wave, tidal and offshore wind R+D over the next four years. In other words, offshore renewables will get one tenth the help given to offshore gas.

That’s how it stands – high and guaranteed support for nuclear and subsidy for gas. Renewables are to have financial help withdrawn. These extraordinary reverse ferrets were largely ignored by the press, which focused on whether the UK Treasury or DECC ‘won the battle’ over the precise level of support for onshore wind. Did Chancellor Osborne or Energy Secretary Davey beat the other into pulp? A great tactic from the DECC press office, ensuring that a minor skirmish attracted attention while huge policy changes were left unnoticed.

 

The world’s largest community owned PV farm achieves minimum fund raising target

Westmill Solar, a 5 megawatt PV farm sited between Swindon and Oxford, is one of the largest arrays in the UK. It was built a year ago to profit from the high feed-in tariffs then available to large PV installations. Adam Twine, the farmer on whose land the 21,000 panels were sited, kept a right to buy back the solar farm from its original financiers. Twine is an enthusiast for community ownership and recently set up a cooperative to purchase the whole array. Small investors can apply to buy shares now, with local residents given priority. If successful, the new cooperative will be the biggest community owned solar farm in the world. The new business announced yesterday that it has raised the minimum £2.5m necessary to take the deal forward to the next stage. Other communities should copy Twine’s scheme: the UK needs thousands of renewable energy projects like this one, giving decent returns to local people. (NOTE - on August 1st, Westmill announced it had exceeded its £4m total target and applications are now closed).

The investment opportunity

Westmill Solar is seeking to raise £16.5m to buy the PV farm. The business looks to finance up to about a quarter of this (£2.5m to £4m) from individual investors. The remainder (£12.5m to £14m) is being sought from institutional bondholders at an interest rate of about 3.5% above retail price inflation (RPI).

The proposed financing has several unusual features. These make the investment opportunity more difficult to explain than comparable projects. Nevertheless, the innovations should form a model for future renewable energy fundraisings from communities because they improve the returns to small investors.

  • The business will buy back 5% of its shares each year from year 2 to year 10. This is a tax-efficient way of returning capital to shareholders.
  • The dividends[1] paid to shareholders (as opposed to bond investors) will start low and gradually rise as the bondholders are paid off. By the end of year 24 - when the business is expected to be wound up as the Feed-in Tariffs cease - the returns illustrated in the prospectus appear to be over 50% a year on the shareholder capital remaining in the business.
  • The index-linked returns paid on solar PV investments create a very high degree of reliability of cash flow. Compared to wind, PV output is also far more stable from year to year. This means that businesses like Westmill Solar can run themselves with only a thin layer of shareholder capital, enhancing percentage returns on their cash.

The Feed-in Tariffs and revenue from exported electricity will produce a gross income of about £1.7m a year, rising with inflation. At today’s inflation rate, the bondholders will take interest in the first full year of about £0.8m. Running costs are approximately £200,000, leaving about £0.7m to begin to pay back some of the debt and provide a return to the small investors. As bondholders are paid back, an increasing fraction of the total income can be diverted to the shareholders, enhancing returns. This isn’t likely to be a particularly good investment for those seeking high dividends in the early years, but it could be an exceptional opportunity for those looking to build savings for financial needs in ten to twenty years’ time, such as people wanting to improve their pension plans.
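As a minimal sketch of that first-year cash waterfall, using the round numbers above: the bond size and RPI rate below are my assumptions (a bond of roughly £13m, within the quoted range, and RPI at about 2.6%), not figures from the prospectus.

```python
# Rough first-year cash flow for the Westmill purchase, using the article's
# round numbers. Bond size and RPI are assumptions, not prospectus figures.
gross_income = 1_700_000     # Feed-in Tariff plus export revenue, per year
running_costs = 200_000      # approximate annual operating costs

bond_principal = 13_000_000  # assumed, within the quoted £12.5m-£14m range
rpi = 0.026                  # assumed RPI
bond_rate = rpi + 0.035      # bondholders receive RPI + 3.5%

bond_interest = bond_principal * bond_rate
residual = gross_income - running_costs - bond_interest

print(f"Bond interest: £{bond_interest:,.0f}")                        # roughly £0.8m
print(f"Left for debt repayment and shareholders: £{residual:,.0f}")  # roughly £0.7m
```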

What are the risks?

By 18th July, 660 investors had committed £2.5m (an average of just under £4,000). This means that the equity fundraising has achieved its minimum target. The key remaining risk is that the investment bank seeking to raise the bond finance from institutional investors is unsuccessful. If this happens, Westmill Solar’s purchase of the PV farm will not proceed and the private investors’ money will have to be returned. The prospectus notes that the company will deduct up to 5% of investors’ money to pay for the costs of organising the offer to shareholders.

The other main risk is probably very high levels of inflation over the next few years. The bondholders’ return will be set as a percentage over RPI inflation. If, for example, 2015 RPI inflation is 7%, the interest payable to bondholders will use up almost all of the cash coming into the business. Although the income from Feed-in Tariffs will also rise, this will only partially compensate for the higher interest costs. Until the company has paid back a large fraction of the £12.5m-£14m debt to bondholders, very high inflation could represent a serious threat to the viability of the business. How likely is it that we will see inflation rates well above today’s levels within the next fifteen years? Who knows, but there is clearly a risk.
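Extending the sketch above, the same waterfall can be run at higher assumed RPI rates. This is deliberately crude: it keeps my assumed £13m bond and simply scales income and costs by one year of inflation, but it shows how quickly the interest bill eats into the cash.

```python
# Crude inflation stress test on the assumed £13m bond at RPI + 3.5%.
# Income and running costs are assumed to rise in line with RPI.
bond_principal = 13_000_000

for rpi in (0.026, 0.05, 0.07):
    income = 1_700_000 * (1 + rpi)   # index-linked FiT and export revenue
    costs = 200_000 * (1 + rpi)
    interest = bond_principal * (rpi + 0.035)
    print(f"RPI {rpi:.1%}: interest £{interest:,.0f}, "
          f"cash left £{income - costs - interest:,.0f}")
```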

A further, more manageable risk is that levels of sunshine could be lower than projected. Solar radiation hitting the UK doesn’t vary much from year to year, but there is an obvious concern that the poor summers of recent years might be part of a long-term pattern. Or a big volcanic eruption might depress sunshine levels for a year or so.

The wider importance of this fund raising

Communities around the UK can copy this scheme. Although the cuts in the Tariffs temporarily destroyed the viability of large-scale PV, cost reductions now mean that big solar farms are financeable again. Westmill is raising a total of £16.5m to buy an existing PV farm, but a new venture might well be able to build a similar array for less than £7m. Many prospective solar farms of this size are now in the planning approval process in the south west of England.

Many congratulations to the directors of Westmill, who have done an exceptional job in getting the project to this stage of development. My colleagues at Ebico and I remain keen to work with other communities to develop locally-owned renewable energy projects, providing good returns to smaller investors.

Owned by Eden Project employees, the much smaller PV array we developed in late 2011 was recently made runner-up in the Renewable Energy Association project of the year awards. Either using our model, or copying Westmill’s innovations, every town and village in the UK can now have its own wind, biogas or PV farm.

(Full disclosure: I myself haven’t yet applied for shares in Westmill but will probably do so over the next few days).



[1] Because the Westmill Solar business is what is known as an ‘Industrial and Provident Society’, the dividends are paid as untaxed interest.

World land temperatures for June hit record high

You wouldn’t guess this from the UK’s weather, but world temperatures on land were the highest ever recorded for June. May was similarly record-breaking. The April to June quarter exceeded historic records for northern hemisphere land temperatures. Combined land and ocean figures make June the fourth hottest ever across the globe as a whole. As the cool-water phase of the eastern Pacific (La Niña) drew to a close, world land temperatures rose as expected over the last few months. The hot weather continues in July while Britain waits for a sight of the sun.

Many of us trying to communicate climate change issues have been approached by news media over the past few weeks asking whether the UK’s abysmal summer weather indicates that ‘the global warming scare’ is over. No, I say, the science remains exactly the same. In a warmer world, weather may well become more erratic and more unusual. We should not pay much attention to episodes of unusual cold in Britain or elsewhere but focus on global averages. The last few months have been as warm as, or warmer than, the past few years. Many places – on all continents – are experiencing record temperatures.

Somehow, this response simply doesn’t work. Journalists are not interested in extreme temperatures 1,000 km away in Austria (highest ever June temperatures) or in the US (records broken across most of the country). The only thing worth commenting on is that the UK has had the wettest June since record-keeping began and the coldest midsummer month since 1991. Humankind finds it very difficult to comprehend a global mean or a new record set in a strange and unknown part of the world.

A poll this week shows that Americans (experiencing hot weather on their continent) are agreeing with the climate change hypothesis in increasing numbers. But Britons drying their houses after repeated inundations understandably show no such belief. Truly it is going to be difficult to get any substantial global response to the climate change challenge.

 

DECC numbers on energy efficiency need checking

Today’s presentation on electricity efficiency opportunities from the Department of Energy and Climate Change (DECC) makes a series of important errors in its estimates of the savings that can be made in domestic homes.[1] For example, DECC overstates the amount of power used in domestic lighting by almost a factor of three. Its projected efficiency savings are almost twice as great as today’s total use of electricity for this purpose. By contrast, DECC substantially underestimates the use of power for space and water heating. What is most surprising is the clear conflict between many of the figures presented and other recently published DECC data. Today’s document was supposed to show the large possibilities for improvements in the efficiency of electricity use. What seem to be simple mistakes completely undermine its credibility. More fact checking, please.

Domestic homes consume about a third of all UK electricity. This share is tending to rise, both because of de-industrialisation and because of the relatively slow progress at reducing electricity use in households. It is only recently that home use has fallen substantially, whereas industrial and commercial consumption has been falling for most of the last decade.

Efficiency matters. As DECC said on its website when it released the presentation today:

Encouraging greater efficiency in the use of electricity is potentially very valuable to all of us. It can reduce electricity bills both directly and also indirectly through limiting the overall cost of the electricity system in terms of funding for new generation, transmission and distribution infrastructure.

Lighting

This DECC report suggests that domestic homes use 42 terawatt hours (TWh) of electricity for lighting, more than 10% of total UK electricity demand. In the magisterial GB Housing Energy Fact File, published by DECC in September 2011, the figure is estimated at only 16.5 TWh.[2] This second figure is widely used and is assumed to be approximately accurate. The recent Energy Saving Trust report suggests an even lower figure of about 540 kWh a year per house, equivalent to about 14 TWh for all UK homes.

Today’s DECC document estimates that efficiency savings of 26 TWh can be made, largely by the replacement of old-fashioned light bulbs (‘incandescents’) with compact fluorescent lamps (CFLs). So the total savings claimed to be available are far greater than total domestic use. Not only is the number wrong, but the efficiency improvements from the switch to CFLs have already been partly made. The remaining gain will come from moving from comparatively wasteful CFLs to high-efficiency LEDs. The savings from this switch might be about 10 TWh but are unlikely to be more.
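A quick cross-check of the lighting numbers. The household count below is a round-number assumption of mine; the other figures are taken from the documents cited above.

```python
# Sanity check on DECC's domestic lighting figures.
households = 26_000_000           # assumed number of UK homes (round figure)
est_per_home_kwh = 540            # Energy Saving Trust estimate, kWh per year

est_total_twh = households * est_per_home_kwh / 1e9    # kWh -> TWh
print(f"EST-based total: {est_total_twh:.1f} TWh")      # about 14 TWh

fact_file_twh = 16.5    # DECC GB Housing Energy Fact File, September 2011
decc_new_twh = 42.0     # figure in today's DECC presentation
decc_saving_twh = 26.0  # claimed lighting efficiency saving

print(f"New DECC figure vs Fact File: {decc_new_twh / fact_file_twh:.1f}x")
print(f"Claimed saving vs EST-based total: {decc_saving_twh / est_total_twh:.1f}x")
```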

Appliances and electronics

The new DECC document says that appliances and electronics in homes consume about 47 TWh, but its own September 2011 report suggested a figure of 58.4 TWh, almost 25% higher and in rough agreement with the EST’s June analysis.

In the case of home electronics, efficiency savings of 38% are said to come from a reduction in standby losses. (I could find no source for this estimate.) It is not a credible figure. Modern consumer electronics draw little power when not in use, and the achievable efficiency saving will be much lower than 38%.

Heating use

The new report suggests that household electricity use can be reduced by building improvements such as installing ‘high efficiency windows’. The total potential saving identified is almost 15 TWh. But in another DECC document, this time from 2010, the total amount of electricity used to heat UK homes is estimated at 17 TWh.[3] The new estimate therefore implies that almost 90% of the electricity used to heat homes can be avoided by retrofitting insulation and other improvements. This is not a supportable assumption.
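Two more quick checks against DECC’s own earlier publications, covering the appliance and heating figures discussed above:

```python
# Cross-checks against DECC's earlier numbers.

# Appliances and electronics: September 2011 figure vs today's presentation.
new_appliances_twh = 47.0
fact_file_appliances_twh = 58.4
print(f"Fact File figure is {fact_file_appliances_twh / new_appliances_twh - 1:.0%} higher")

# Heating: claimed building-fabric saving vs total electricity used for heating.
claimed_saving_twh = 15.0    # today's DECC presentation
electric_heat_twh = 17.0     # DECC 2010 estimate
print(f"Implied cut in electric heating demand: {claimed_saving_twh / electric_heat_twh:.0%}")
```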

 

Reducing electricity demand in the UK is an important objective. I have only examined the section on domestic use, but this portion is said to offer almost half of the possible efficiency savings. Furthermore, the costs of efficiency improvements are stated to be less than the financial gains to the householder from using less electricity. This assumption, which in my experience is rarely true, is never examined. It seems to me that the DECC report does not meet reasonable expectations for policy proposals from a government department, even in the draft form in which it is currently presented.


Can you predict someone's carbon footprint by knowing how much money they have?

(This article was written in 2009 and uses data from earlier years. Expenditure patterns change slowly and the conclusions are likely to be broadly accurate today. I am posting it now because the data is referred to in a magazine article to be published in the next few weeks.) If we include the full impact of flying, the average person in the UK is responsible for about twelve and a half tonnes of greenhouse gases each year. About half of this total comes directly from running our homes and from personal travel. The rest comes from the things we buy, our carbon dioxide output at work and from manufacturing industry.

In mid November the Prime Minister gave his first speech on climate change. He said that emissions may have to fall by 80% by 2050. This means moving from twelve and a half tonnes per person down to two and a half. Many scientists say that even this is too much and we may eventually have to cut our average emissions to no more than one tonne. This is less than 10% of today's level.

Who is going to find it most difficult to reduce their emissions? What types of families are going to have to make the steepest cuts? We did some work to examine how much carbon dioxide the rich generate compared to the less well-off. The results show that if all of us are going to have to live within a small allowance, the better-off are going to have to really cut back. The richer you are, the more you spend on goods and services that produce carbon dioxide emissions. So, for example, people who don’t have much money don’t fly away on holidays very much. But some people travel by air ten times a year. The wealthiest ten per cent of the country have a carbon dioxide footprint just from motoring of almost two tonnes, enough to use up most of an individual's allowance for 2050. This is over four times the level of the least well-off. When we added up the sums, we found that the richest ten per cent have emissions almost two and a half times as great as people at the other end of the income range.

Our technique.

We used a very good source of government data, the annual Family Spending survey. The information in this huge report is taken from a large number of detailed questionnaires completed in 2005 and 2006, the latest period for which data is available. For each group of one tenth of UK households, moving from poorest to richest, the survey says what they typically spend on hundreds of different items. For example, you can find out what people in various income groups spend on ice cream, alcopops, children's clothing or even reading glasses. Thousands of people fill in the questionnaire and the numbers are thought to be very accurate.

In the latest report, the top ten per cent of households have an average spending, across all family members, of almost £1,300 a week. The middle-ranked households had an expenditure of about £500, and the people at the lowest income level were spending less than £135 a week.

How did we calculate a carbon footprint?

We looked at the main types of household expenditure that involve burning fossil fuels. For example, we noted spending on gas and electricity. When you heat your house, the boiler emits carbon dioxide to the outside world. Slightly differently, when you turn on the lights, a power station has to burn just a bit more fuel, which means more CO2 up the power station chimney. We also looked at how much people spend on petrol and diesel: money spent means litres of fuel bought, burnt in the engine, with carbon dioxide coming out of the exhaust. We also examined the money spent on public transport and flying, though the numbers are slightly less good for aviation than we would like. Information about foreign holidays is there and, finally, we looked at money spent on meat. Meat is important because animals emit greenhouse gases such as methane and because they eat grains, which generally require artificial fertiliser to grow. Fertilisers take a lot of fossil fuel to make.
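The basic conversion is always spend, then physical quantity, then emissions. Here is a minimal sketch for motor fuel; the pump price and the emissions factor are illustrative assumptions of mine for the 2005/06 period, not figures from the survey.

```python
# Converting weekly spend on petrol into an annual CO2 footprint per person.
# Price and emissions factor are illustrative assumptions, not survey data.
PETROL_PRICE_PER_LITRE = 0.90     # pounds, roughly 2005/06 pump prices
KG_CO2_PER_LITRE = 2.3            # tailpipe emissions per litre of petrol

def motoring_footprint_tonnes(weekly_spend_per_person):
    litres_per_year = weekly_spend_per_person * 52 / PETROL_PRICE_PER_LITRE
    return litres_per_year * KG_CO2_PER_LITRE / 1000

for spend in (3, 8, 15):   # pounds per person per week: low, middle, high
    print(f"£{spend} a week -> {motoring_footprint_tonnes(spend):.1f} tonnes CO2 a year")
```

The spend levels here are chosen simply to show the shape of the calculation; they land in roughly the same territory as the motoring figures in the table further down.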

Let's look at the main categories in turn.

Heat and power for the home

People in the top income groups spend a lot more on gas and electricity. The latest data shows the richest 20% of households spending £17 a week on domestic power and heat, while the bills of the poorest 20% were only £9 a week. But richer families have much larger households. Many of the poorest people in the UK live alone, while the top 20% of families have an average of over three people in the house. So when we look at spending per person on gas and electricity, we find that it doesn’t vary much across the income groups. The richest people actually have a slightly lower carbon footprint than the less well-off, but there isn’t a big difference between the various groups. This was surprising. Because rich people generally have much bigger houses and more space per person, we thought that they would spend more on fuel per person. This isn’t the case, suggesting that less well-off people may live in houses that are not particularly well insulated.

Interestingly, we also made a calculation of the carbon footprint of the energy we use at home. We used standard government emissions factors to work out the average emissions per person. Very roughly, it is just under a tonne each for electricity, and between one and a half and two tonnes for gas. If nothing else, this shows how far we have to go to meet the latest targets set by Gordon Brown.
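The same logic applies to household energy, via consumption and the standard emissions factors. The consumption levels and factors below are rough assumptions of mine for the mid-2000s, shown only to illustrate the arithmetic; they are not the government figures we actually used.

```python
# Per-person footprint from household electricity and gas.
# Consumption levels and emissions factors are illustrative assumptions.
GRID_FACTOR = 0.5    # kg CO2 per kWh of electricity, mid-2000s grid mix
GAS_FACTOR = 0.19    # kg CO2 per kWh of gas burned in a domestic boiler

electricity_kwh_per_person = 1_800   # assumed annual use per person
gas_kwh_per_person = 8_500           # assumed annual use per person

print(f"Electricity: {electricity_kwh_per_person * GRID_FACTOR / 1000:.1f} tonnes per person")
print(f"Gas: {gas_kwh_per_person * GAS_FACTOR / 1000:.1f} tonnes per person")
```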

Petrol and diesel for the car

Whereas there isn’t much difference between income groups when it comes to heat and power, we do see large disparities in fuel use for cars. The richest people spend over four times as much per person as the least well-off. Richer people generally have bigger cars and drive them longer distances. Very roughly, people in the least well-off portion of the UK population have a carbon footprint of less than half a tonne from driving, compared to almost two tonnes among the wealthiest.

Public transport

Public transport expenditure is very interesting. Rail use goes up dramatically as people get wealthier. It is almost ten times as much among the richest as among the people at the bottom end of the spectrum. On the other hand, households with less money actually spend more on bus and coach fares. As we know, most people don’t travel much by public transport, so the impact on total carbon dioxide emissions is not that great.

Air travel

It's here that we see the most striking differences. The numbers are less precise than for car travel, but the richest groups spend over ten times as much on foreign holidays (usually taken by air) and perhaps five times as much on flights. This almost certainly means that the top 10% have a footprint of more than four tonnes from flying, compared to well under half a tonne for the least well-off group. In a future world in which carbon dioxide is much more carefully controlled, many people's flying habits are going to have to change, or the airlines are going to have to find a way of burning less fossil fuel.

Meat

At about £4 per person per week, spending on meat doesn’t vary much between income groups, though high-income homes do spend a little more. Household diets vary enormously and there isn’t any obvious evidence that the most carbon-intensive foods are disproportionately eaten by any one income group.

What does it all add up to?

We have looked briefly at the main carbon culprits – the things which have the greatest impact on your personal responsibility for climate changing gases. We can summarise roughly how emissions vary by income group.

 

Approximate greenhouse gas emissions per person

(tonnes per year)

 

                        Bottom 10%   Average income   Top 10%
Electricity                 0.9           0.8            0.9
Gas                         1.7           1.6            1.7
Motor fuels                 0.4           1.1            1.8
Public transport            0.1           0.1            0.2
Air travel                  0.4           1.5            4.0
Meat                        0.3           0.3            0.4
TOTAL                       3.8           5.4            9.0

Approximate expenditure
per person per week         £80          £200           £450

 

Almost all of the difference is driven by the much higher figures for air and car travel in the highest income groups. Add all the numbers up and the richest ten per cent have total emissions per person of almost two and a half times those of the least well-off.


Dutch trial of domestic fuel cells for grid balancing

In late May, Germany met more than 50% of its power needs from solar PV at midday on two successive days. This astounding success brings a problem with it. How does the country manage to balance its electricity grid as solar electricity ramps up towards noon and then falls away later in the afternoon? Most analysts assume that large-scale natural gas power stations are the logical complement to intermittent renewables. An announcement today (June 19th 2012) from a small Australian company should make us question this assumption. It has just announced a trial of its domestic-scale fuel cell power plants for grid balancing in the Netherlands. These tiny fuel cells are highly flexible, powering up and down in a matter of seconds. In theory this technology could be the cheapest way of matching supply and demand in a renewables-dominated world. But they need to be in millions of homes to create a large enough buffer and to push the capital costs down to competitive levels.

I wrote the first edition of Ten Technologies to Fix Energy and Climate four years ago. Each of the ten chapters focuses on one or two companies that looked as though they had the technical edge to prosper in a world in which low carbon energy sources take a larger role. The good news is that almost all of these businesses are still in existence. The bad news is that most of them haven’t broken through to commercial viability. This probably tells us a great deal about the state of the battle against carbon emissions: even the best technologies have yet to take off because of the difficulty of competing against fossil fuel power stations that have had a century to reduce their costs.

One of the most interesting companies I wrote about was Australia’s Ceramic Fuel Cells. Ceramic, as it seems to be known in its home country, is the owner of the grid-balancing technology now on trial in the Netherlands. Ceramic makes what are in effect small domestic electric power plants. These refrigerator-sized devices sit in the kitchen or boiler room generating about one and a half kilowatts of power, about three times the average domestic consumption, by splitting natural gas (mostly methane, CH4) into hydrogen and carbon dioxide. The hydrogen then combines with oxygen from the air in an electrochemical cell, creating water, an electric current and some heat. Most of the time these units export their power into the local grid, and they do so at about 60% fuel efficiency, at least as good as the best full-sized power station. Moreover, the waste heat they generate can be captured to provide 100 litres a day of hot water, enough for most homes.

The Ceramic technology is still expensive – almost £20,000 to install a device that generates electricity worth (at retail prices) no more than £1,200 a year. (For comparison, a modern gas turbine plant might have a capital cost of about £1,500 per 1.5 kilowatts of peak output.) The value of the hot water might be another £300 a year at most. Even with hefty feed-in tariffs, the homeowner is unlikely to see a high return. As with many clean technology companies, Ceramic is stuck making small volumes of its products at a high unit cost. To get down to £4,000-£5,000 per installation, the company needs to make several thousand units a year, not the hundreds it is making at the moment. Although many people say that its technology is further advanced than any other small fuel cell company’s in the world, it still has to fight for every sale and relies on support from big utilities around the world that are charmed by the technology.
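To see why the unit cost matters so much, here is a crude simple-payback sketch. The electricity value is the article’s round figure; the hot water value, and the decision to ignore the cost of the gas the unit consumes, feed-in tariff income and maintenance, are simplifications of mine.

```python
# Crude simple-payback arithmetic for a domestic fuel cell.
# Ignores gas input costs, feed-in tariffs, maintenance and discounting.
electricity_value = 1_200   # pounds a year, valued at retail prices (article figure)
hot_water_value = 300       # pounds a year, my assumption

for capex in (20_000, 4_500):   # today's installed cost vs a volume-production target
    payback = capex / (electricity_value + hot_water_value)
    print(f"Installed cost £{capex:,}: simple payback of about {payback:.0f} years")
```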

Fuel cell technologies are not carbon neutral if they use natural gas. But if the gas comes from biological sources, such as the anaerobic digestion of agricultural wastes, they can provide genuinely renewable electricity. In addition, the ability of Ceramic’s products to turn up and down at a few seconds’ notice can provide very valuable grid balancing. At the time of writing, wind is barely providing any of the UK’s electricity but is expected to generate almost two gigawatts by this time tomorrow. As the wind turbines ramp up, small deviations from the expected increase could be evened out by tens of thousands of Ceramic fuel cells adjusting their output to smooth the power from wind. This service can be worth much more than the standard wholesale price of power and may be the most important single source of income for the owner of a small fuel cell power plant.

Of course the critical thing is to get thousands of fuel cells spread around a country to respond quickly to a signal to increase or decrease their power output. This is the purpose of Ceramic’s trial in the Netherlands with its partners, the utility Liander and IBM. A number of its Blue Gen products will be controlled remotely by ‘smart grid’ software to see how effectively they can be combined to ramp output up or down rapidly to match minute-by-minute variations in the power from wind and solar.

Whether you believe that the carbon-free future for electricity generation should be based on nuclear or on renewables, we all have to face the difficulty of ensuring that the electricity system can match supply and demand minute by minute. Nuclear power stations have to be run at full power or not at all, and wind and solar production can be neither accurately predicted nor managed. We will either need huge amounts of storage (perhaps hydrogen, pumped water or compressed air) or highly flexible generators. As things stand today, Ceramic’s products are the most easily adjustable generators on the market. The company may need another £200m of capital to get its products down to reasonable production costs, but its twenty-year-old technology is one of the most interesting parts of the low carbon future.

The Rothamsted battle

Eight small plots of wheat at the Rothamsted research centre are the focus of an increasingly bitter dispute. These 6 metre by 6 metre squares of genetically modified cereal are threatened with destruction by one group of determined environmental campaigners this weekend (27th May 2012). Other equally committed environmentalists fiercely defend the importance of the science. If successful, the Rothamsted GM wheat will reduce the need for insecticides, particularly the group called pyrethroids, which kill aphids and other pests as well as beneficial insects. Since pyrethroids may be implicated in the collapse in pollinating bee numbers, GM wheat might have major beneficial impacts.

Wheat is the single most important source of human nutrition. About 20% of the world’s calories come from this crop. Increasing the yield from this cereal is therefore a crucial part of the world’s route towards securing food for three billion more people by 2050. Aphid infestation can cause significant losses to the tonnage of wheat taken from a field. One pest – wheat midge – can reduce yields by 50% or more in the most affected fields. Reducing losses to wheat crops caused by aphids is a vital part of improving global food availability.

The Rothamsted GM wheat incorporates a gene that helps create a substance called (E) beta farnesene. The chemical is what is known as an ‘alarm pheromone’ produced by aphids.  It signals danger to other aphids, which therefore tend to avoid it. By contrast, the predators of aphids seem to be attracted to it, perhaps because it identifies where large concentrations of their prey might be found.

(E) beta farnesene is found in several common plants, such as peppermint, and the Rothamsted researchers have added the genes that create this substance to the genome of wheat. (Anyone with mint in the garden knows that it is rarely damaged by insects – so at least in the UK the omens are good.) This genetic modification builds on a recent series of papers suggesting that directly applying (E) beta farnesene to wheat may reduce aphid numbers on the crop. Incorporating the production of farnesene into the wheat itself may be an even better way of reducing aphid damage.

The main benefit from the genetic modification may be a reduction in the need to use synthetic insecticides. In the UK about three quarters of all wheat has an insecticide applied, according to the last government survey. Most of these fields have synthetic pyrethroids sprayed onto the crop. Artificial pyrethroids are similar to the natural insect repellent found in plants such as chrysanthemums. These insecticides work by affecting the sodium ‘gates’ in nerve cells and are particularly destructive to insects and to aquatic animals. They are only toxic to mammals in extremely high doses and their short life means that they are regarded as relatively safe. But they destroy all insects, including the predators of wheat-destroying aphids, and so tend to diminish biodiversity.

There is some evidence that sub-lethal doses of pyrethroids, perhaps in combination with other insecticides such as neonicotinoids, affect the higher functions of creatures such as bees. By ‘higher functions’ I mean such things as memory (for example, of where the home hive is) and the ability to communicate the direction of pollen sources to other hive residents through the bee dance.

Vital though they are to crop protection, pyrethroids may therefore also cause some of the problems we now see in bee survival. Wheat itself does not require bees for pollination but the doses of insecticides are possibly reducing the number of bees in the wild, with severe consequences for the future pollination of many other crops.

The argument in favour of the Rothamsted GM experiment is that – if successful – it will help to reduce the insecticide load experienced by bees during their foraging. The world needs Rothamsted to succeed if we are to produce more food at a lower environmental cost. Many of the complaints about the experiment, such as the risk of contamination of locally grown wheat, are almost certainly wrong, simply because the (E) beta farnesene gene introduced into the Rothamsted wheat is extremely unlikely to be transmitted to non-GM crops. In particular, wheat pollen does not travel more than a few metres and, even if it did reach non-GM wheat, it almost certainly could not transmit the (E) beta farnesene gene.

Rothamsted research centre is probably the oldest plant breeding laboratory in the world. Not only has it assisted in the development of new agricultural technologies, it also claims with much justification to be the ‘birthplace of modern statistical theory and practice’. The new GM wheat trial, properly approved by the regulatory authorities, is a worthy and scientifically robust attempt to see whether techniques can be developed to reduce the use of chemicals, particularly pyrethroids, in the field. There is a painful irony here: the lab that initially developed pyrethroids in the 1960s was none other than Rothamsted itself. The major improvements in insect control that the laboratory developed, to the benefit of people around the world, may just have helped trigger part of the collapse in bee populations. Perhaps GM wheat will bring the same short-term benefits as pyrethroids did, and then cause further problems in ecological stability.

 

Food versus fuel: a debate that has only one possible conclusion

Ben Caldecott of Climate Change Capital argues in the Guardian that ‘sustainable’ aviation requires the use of biofuels. He suggests a target of about 60% bio-based ingredients in the fuel that powers planes at UK airports. He doesn’t begin to address the implications for food supply, or show how biofuels will reduce global emissions. My calculations suggest that replacing 60% of the UK’s aviation kerosene with fuels of biological origin would use all of the UK’s home-produced cereal and oilseed crops and substantially increase food imports. Furthermore, replacing the food used to make aviation fuel on farmland elsewhere in the world would result in a net increase in greenhouse gas emissions. The inconvenient truth is that biofuels are never an answer to climate change problems.

Put crudely, photosynthesis in growing plants captures energy provided by the sun. This energy can either be used to fuel human beings, providing them with the two or three kilowatt hours a day they need to function, or it can be used to create power for other purposes. For example, the energy in corn (maize) can be turned into alcohol that replaces petrol in a car. Or it can provide food for human beings or cattle.

A kilogramme of wheat contains about 3,000 (kilo) calories, equivalent to about three and a half kilowatt hours. Biofuel processing plants use the energy in foods to create liquids that can power engines and jet turbines. Ben Caldecott wants us to switch to 60% biofuels in aviation fuel. How much food would that require?

In 2011, the UK used about 11.4 million tonnes of aviation fuel. The total energy value of this kerosene was about 133 terawatt hours. (For comparison, the UK’s total annual electricity use is about 350 TWh, roughly two and a half times as much.)

Britain produced about 24 million tonnes of grain and oilseeds this year. This was mostly wheat but also included barley, oats and oilseed rape. The energy value of this harvest was about 84 terawatt hours. So even if every grain produced in Britain this year were turned into liquid fuel at 100% energy efficiency, we would only cover about 60% of our need for aviation fuel. But even in the most efficient conversion process, only about half of the energy value in grains can be turned into fuel. So if Britain turned its entire harvest into kerosene, the country would barely meet a third of its need for aviation fuel.
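Laid out step by step, the arithmetic looks like this. The energy density of kerosene (roughly 12 kWh per kilogramme) and the 50% conversion efficiency are the approximate assumptions used above.

```python
# UK aviation fuel demand vs the energy content of the UK grain and oilseed harvest.
KEROSENE_KWH_PER_KG = 11.9    # assumed energy density of jet fuel
GRAIN_KWH_PER_KG = 3.5        # roughly 3,000 kcal per kilogramme, as above

aviation_fuel_tonnes = 11.4e6
harvest_tonnes = 24e6
conversion_efficiency = 0.5   # share of grain energy recoverable as liquid fuel

aviation_twh = aviation_fuel_tonnes * 1000 * KEROSENE_KWH_PER_KG / 1e9
harvest_twh = harvest_tonnes * 1000 * GRAIN_KWH_PER_KG / 1e9
usable_twh = harvest_twh * conversion_efficiency

print(f"Aviation fuel energy: {aviation_twh:.0f} TWh")     # about 135
print(f"Harvest energy: {harvest_twh:.0f} TWh")            # about 84
print(f"Replaceable share of aviation fuel: {usable_twh / aviation_twh:.0%}")  # about a third
```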

No problem, Ben Caldecott and other biofuel fans might say: we simply need to import more food. The question that arises is whether growing more food elsewhere would increase greenhouse gas emissions by more or less than the savings from reduced oil use in aeroplanes. Unfortunately, even simple calculations show that conventional agriculture produces more emissions than aviation per unit of energy. Growing food using conventional agriculture uses large amounts of energy to produce nitrogen and phosphorus fertilisers. More importantly, nitrogen applications to fields increase the emissions from soils and watercourses of nitrous oxide, a far more potent global warming gas than CO2. The net impact on global emissions of producing an extra tonne of food is probably at least 550 kilogrammes of carbon dioxide equivalent (much, much more if new land is converted from forest or grassland to arable). And unfortunately the saving of CO2 from replacing kerosene with fuel made from grains and oilseeds is well below 550 kg per tonne of food.
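A rough check on that last claim. The emissions factor for kerosene below (about 0.26 kg of CO2 per kWh, derived from roughly 3.15 kg CO2 per kilogramme of jet fuel) is my assumption, not a figure from any of the studies mentioned.

```python
# CO2 avoided by turning one tonne of grain into aviation fuel, compared with
# the emissions from growing a replacement tonne of food elsewhere.
GRAIN_KWH_PER_KG = 3.5
CONVERSION_EFFICIENCY = 0.5
KEROSENE_KG_CO2_PER_KWH = 0.26   # assumed: ~3.15 kg CO2 per kg over ~12 kWh per kg

fuel_kwh = 1000 * GRAIN_KWH_PER_KG * CONVERSION_EFFICIENCY   # per tonne of grain
co2_avoided_kg = fuel_kwh * KEROSENE_KG_CO2_PER_KWH

print(f"Kerosene CO2 avoided per tonne of grain: about {co2_avoided_kg:.0f} kg")
print("Extra agricultural emissions per replacement tonne of food: at least 550 kg CO2e")
```

On these assumptions the avoided kerosene emissions fall short of the extra farming emissions even before any land-use change is counted.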

As study after study has shown around the world, biofuels don’t save emissions. As importantly,  every tonne of food that is converted to liquid fuel increases the price of basic foodstuffs for poor people. Ben Caldecott's article welcomes increased air travel. He and his colleagues at Climate Change Capital should ask themselves whether feeding the aviation industry is more important than avoiding hunger and starvation. The numbers simply don’t support the view that aviation can become more ‘sustainable’ by switching from fossil fuels to biologically sourced equivalents.

Rubbish

In a report published this week (1st May 2012), the UK’s Royal Society asserted (p68) that the accumulation of waste products in a modern society is strongly linked to the size of GDP. In simple terms, more growth equals more rubbish. Similar jeremiads about the severe impact of economic growth on global ecologies pervade the report. So the authors might be slightly embarrassed to see the latest data on household rubbish published a couple of days later by the UK government. These numbers show that the average person now produces less waste than fifteen years ago. Let’s get the facts right, please: economic progress is not necessarily bad for the environment.

The volume of waste produced by an economy is a good index of its impact on the natural world. Everything we consume starts by being extracted from the earth’s crust or soil, is then processed to make it useful to us and eventually turns into waste. Whether it is an iPad, a hamburger or a Volkswagen Golf, our goods all ultimately come from the ground. After delivering a service to us, everything is discarded and becomes rubbish, collected by the local council every week or so.

The conventional view of the world is that growth in GDP always takes the form of increased consumption of physical goods. As we get richer, we’re told,  we buy more stuff. For a long while this simplification was broadly correct. A large fraction of the extra income that households gained in wealthy countries between 1960 and about 2000 was spent on things you could touch. We bought cars, washing machines, more clothes, TVs and garden furniture. As the Oxford sociologist Jonathan Gershuny points out, the second half of the 20th century is often portrayed as the beginning of the ‘service’ economy but it is characterised more accurately as the period when household life became mechanised. Households acquired a large number of heavy machines.

That era ended in advanced economies a decade or so ago. In the UK, most indices of physical consumption show a decline from around 2002, a point I have called ‘peak stuff’. That decline will continue. We have the machines we need and the ones we have last longer (compare the lifespan of a car today with one a generation ago, for example), and are generally lighter and easier to recycle. I know it is difficult to believe, but we eat less, use less water and travel fewer kilometres each year. Broadly speaking, we are slowly replacing the consumption of physical goods with the pursuit of pleasurable experiences. Each year, a larger fraction of our income goes on visiting the David Hockney exhibition, attending a Manchester United football match or paying for our Netflix subscription.

We see this in the amount of waste we throw away. Waste production per person in the UK peaked at around 520 kg a year in the twelve months to March 2002. The latest two quarters’ figures suggest an annualised rate of about 443 kg, some fifteen per cent below that peak. The decline from year to year isn’t smooth but is probably getting steeper. (Please note that the last two columns in the chart below are for the most recent quarters; the apparent slackening in the rate of decline is an artefact of the way DEFRA draws the chart.) Today’s waste levels are well below those of 1996/97. By contrast, in the period from 1997 to today, inflation-adjusted GDP has risen by over a third. (This isn’t quite a fair comparison since the UK population has also increased during the last fifteen years.) Household rubbish is actually a small fraction of the total flow of waste out of the economy. Construction waste is far more important, but this is also falling sharply. All in all, we produce far less rubbish than we did a couple of decades ago.

The probable implication? In contrast to what the Royal Society says, growth may be good for the environment. We waste less and are prepared to devote more cash to ecological protection. Technology improvements mean things last longer and use fewer physical resources to make. Regrettably, I have to say that the world’s most prestigious scientific institution should spend more time checking its facts. As people get richer, they don’t buy, and then dispose of, more goods. As England shows, more GDP doesn’t mean more waste.

 

Source: DEFRA, Local Authority Collected Waste for England, May 2012

(http://www.defra.gov.uk/statistics/environment/waste/wrfg22-wrmswqtr/)