
Friday, 28 July 2017

Met Office State of the Climate 2016

The Met Office's third annual State of the UK Climate report, released today, shows that 2016 was the 13th warmest year in a series dating back to 1910.

2016 was 0.5 °C warmer than the 1981-2010 average over the UK as a whole, and the last decade was 0.3 °C warmer. For many it was also a sunny year, with sunshine levels 4% above the 30-year (1981-2010) average for the UK overall.

National Grid release Future Energy Scenarios (FES) 2017

The National Grid's Future Energy Scenarios (FES) document, released in July 2017, aims to encourage and inform a national debate about how the national grid should transition towards an energy system that is secure, affordable and sustainable. It is published every year with the assistance of stakeholders from across the industry.

One of its key messages is that an energy system with high levels of distributed and renewable generation has become a reality, with growth in this area set to continue. However, that in turn increases the complexity of running the energy system. Furthermore, market and regulatory arrangements need to adapt quickly to a more flexible energy system involving an increasing number of participants.

Electricity demand will increase in the future (the increasing entry of electric vehicles (EVs) into the automotive market springs to mind here) and the shape of that demand will also change. A range of solutions need to be introduced in order to deliver the best value for consumers, including smart energy technologies, a coordinated approach across the whole system, improved transmission and distribution infrastructure and various commercial approaches such as encouraging consumer behaviour change.

The document sets out four possible future energy scenarios: Two Degrees, Slow Progression, Steady State and Consumer Power.

In the first of these, increased investment delivers high levels of low carbon energy with consumers making conscious choices to become greener and being able to afford to do so. This scenario sees all the UK low carbon energy targets being achieved.

In the Slow Progression scenario, low economic growth and affordability suppress the transition to a low carbon energy system. The focus moves to longer-term environmental policies.

The Steady State scenario predicts current 'business as usual' attitudes prevailing with the main focus being security of supply at low cost. This is the least affluent of the scenarios and the least green, with little appetite for investment in low carbon technologies.

Finally, the Consumer Power scenario predicts high economic growth, with consumers having more money to spend but showing little interest in switching to low carbon technologies or in becoming environmentally friendly.

So which scenario is the most likely? The document doesn't say, and that isn't really its function anyway; it leaves that question to the wider national debate. What it does do is explore each scenario in detail, explaining what each involves and its potential impacts.

The rest is up to us.

Further information:

National Grid

FES debate

Tuesday, 16 May 2017

Aiming for operational excellence: An interview with David Swindin of Cubico Sustainable Investments

As mentioned previously in Renewable Energy Magazine, Cubico Sustainable Investments has acquired a portfolio of 18 Italian PV plants (109 MW) from Silver Ridge Power Italia, a joint venture between Riverstone and SunEdison. Cubico employs a unique business model in which it invests in and then manages renewable assets for the long term – typically 30-35 years. Cubico is backed by two leading Canadian pension funds and has very ambitious plans, aiming to expand rapidly and become one of the world’s leading renewable energy companies in terms of operational excellence.

REM talked to David Swindin, Head of EMEA at Cubico, to find out more about the company and its vision for the future.

Monday, 15 May 2017

H&M joins EP100 in order to enhance its energy efficiency

International fashion retailer H&M has joined global collaborative initiative EP100 in order to enhance its energy productivity and transition towards becoming a net zero carbon company.

Monday, 8 May 2017

Antwerp scientists discover how to generate power from polluted air

Researchers from the University of Antwerp and KU Leuven (University of Leuven), Belgium, have developed a process that purifies air while at the same time generating power.

"Appealing to Authority": Legitimate approach or not?

In my various debates with climate change deniers, I have often been accused of "appealing to authority", i.e. judging the worthiness of an opinion based on the credentials of the person and/or organisation that presented it. This, to me, is a sensible and logical thing to do: no-one would expect an artist to have a good working knowledge of passenger jetliner mechanics, or be unleashed on a jetliner that required maintenance, for the simple reason that this is properly the job of a qualified aircraft mechanic, not an artist.

Oh, but how they moan. Interestingly, though, I've just seen this comment on a blog piece about appealing to authority which explains the deniers' tactics in this area very well:

"...climate change deniers seem to use the illegitimacy of arguments from authority to dismiss the validity of the scientific consensus that is arrayed against them. That is to say, they invoke the 'marketplace of ideas' fallacy that expert and non-expert opinions are equally valid instead. However, they have no problem heaping praise upon – and almost indulging in idol-worship of – a “friendly” non-expert like Monckton or, even more so, a genuine (but no less mistaken) expert like Lindzen."

Barry Bickmore, writing on Climate Asylum, appears to have found himself in exactly the same situation as I found myself with Willis Eschenbach and What's Up With That:

"Several months ago I wrote a post here about how Lord Christopher Monckton’s handler, Bob Ferguson, had tried to get me to do a live debate with Monckton.  I declined, because I felt that live debates favor people who, well… make up whatever they want.  Instead, I proposed a written online debate, in which we would have time to check each other’s sources.  This proposal was flatly refused."

Some of these deniers are pretty sneaky, but given they are so good at lying, that shouldn't be too much of a surprise. And they will try any rotten tactic they can to avoid listening to or accepting facts while also trying to convince others to accept their nonsense.

Wales’s largest onshore wind farm begins generating energy

The largest onshore wind farm in Wales, constructed by Vattenfall, started operating at full power for the first time late on Sunday 7th May 2017.

Wednesday, 3 May 2017

Online trolls and climate science: The latest attempted lie by Anthony Watts

I posted on Twitter last night that Kevin Grandia had, in 2009, referred to the folks posting on Watts Up With That as a 'bunch of online trolls with precious little science'. That isn't the exact remark, but it's pretty close. Grandia was writing in The Huffington Post, and judging from my own experience he was absolutely right about them.

I could carry on countering the comments on Willis Eschenbach's piece, but it's actually pretty monotonous really, so time to move on. Well, in a fashion.

The comments, or some of them, are fairly good examples of the standard myths put out by deniers, so it's worth countering them, but really only as 'case studies' and opportunities to present the truth of what's going on with climate change. On a wider scale, it's worth watching WUWT generally, as more blog posts appear there every day, and of course these lies have to be countered. However, there are other things to do as well. For that reason, it's probably best now to focus on the headline posts on WUWT and other material that makes it out of the comment zone and into the wider online discussion arena.

The latest WUWT lie by Anthony Watts states that NOAA tide gauge data shows no sea level rise acceleration. It mentions the IPCC AR5 WG1 (the IPCC 5th Assessment Report 2013), and it is worth taking a look at that first. The Summary for Policymakers for AR5 is here. Some of the conclusions it draws are as follows:
  • The atmosphere and ocean have warmed, the amounts of snow and ice have diminished, sea level has risen and the concentrations of greenhouse gases have increased.
  • The globally averaged combined land and ocean surface temperature data, as calculated by a linear trend, show a warming of 0.85 °C (0.65 to 1.06 °C) over the period 1880 to 2012, when multiple independently produced datasets exist. The total increase between the average of the 1850-1900 period and the 2003-2012 period is 0.78 °C (0.72 to 0.85 °C), based on the single longest dataset available.
  • It is virtually certain that the troposphere has warmed since the mid-20th century.
  • It is very likely that the number of cold days and nights has decreased and the number of warm days and nights has increased on the global scale since 1950.
  • On a global scale, the ocean warming is largest near the surface; the upper 75 metres warmed by 0.11 °C (0.09 to 0.13 °C) per decade over the period 1971 to 2010.
  • Proxy and instrumental sea level data indicate a transition in the late 19th to the early 20th century from relatively low mean rates of rise over the previous two millennia to higher rates of rise (high confidence). It is likely that the rate of global mean sea level rise has continued to increase since the early 20th century.
The WUWT blog piece cites one of the AR5 comments:

"It is very likely that there is a substantial anthropogenic contribution to the global mean sea level rise since the 1970s. This is based on the high confidence in an anthropogenic influence on the two largest contributions to sea level rise, that is thermal expansion and glacier mass loss"

Here's the WUWT cruncher:

"NOAA tide gauge coastal sea level rise data measurements encompassing the 46 year period from 1970 through 2016 do not support and in fact clearly contradict the UN IPCC AR5 WG1 conclusion regarding supposed man made contributions to increasing rates of sea level rise since the early 1970s."

There is no explanation of this statement at all, and I suspect it's based on cherry-picking of the data. Meanwhile, we should turn to this NOAA fact sheet on sea level rise:

"Yes, there is strong evidence that global sea level gradually rose in the 20th century and is currently rising at an increased rate, after a period of little change between AD 0 and AD 1900. Sea level is projected to rise at an even greater rate in this century. The two major causes of global sea level rise are thermal expansion of the oceans (water expands as it warms) and the loss of land-based ice due to increased melting"

The NOAA fact sheet doesn't actually say that the thermal expansion of the oceans and the increased melting are due to human activity. However, the mention of increased melting of ice, along with the other NOAA data, correlates with the data available elsewhere suggesting that human activity is causing the warming. This means that Watts needs to support his statement that NOAA data does not support man-made warming. He doesn't.
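Whether tide-gauge data shows acceleration is, in the end, a curve-fitting question: acceleration appears as the quadratic term of a polynomial fit, and cherry-picking a short window lets noise swamp that term. Here is a minimal sketch using synthetic data (the trend, acceleration and noise figures are illustrative assumptions, not NOAA measurements):

```python
import numpy as np

# Synthetic global-mean sea level series (illustrative numbers only):
# a 1.5 mm/yr background rise plus a small acceleration of 0.01 mm/yr^2.
years = np.arange(1900, 2017)
t = years - years[0]
rng = np.random.default_rng(0)
gmsl = 1.5 * t + 0.5 * 0.01 * t**2 + rng.normal(0, 5, t.size)  # mm

# Acceleration is estimated by fitting a quadratic; the leading
# (degree-2) coefficient is half the acceleration.
full_fit = np.polyfit(t, gmsl, 2)
accel_full = 2 * full_fit[0]

# Restricting the fit to a short, recent window (as a cherry-picker
# might) makes the quadratic term far noisier and easy to dismiss.
mask = years >= 1970
short_fit = np.polyfit(t[mask], gmsl[mask], 2)
accel_short = 2 * short_fit[0]

print(f"acceleration, full record:  {accel_full:.4f} mm/yr^2")
print(f"acceleration, 1970 window:  {accel_short:.4f} mm/yr^2")
```

On the full record the fit recovers the built-in acceleration; on the short window the estimate's uncertainty is an order of magnitude larger, which is why the choice of window matters so much in these arguments.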

Elsewhere, the NOAA site features a paper by Bruce B. Parker in the Marine Technology Society Journal, Vol. 25, No. 4, 1992. This is fairly dated now, but it states that:

"The present temperature and salinity profile data base appears to be inadequate to produce reliable, long series of steric heights to look at this question"

So does NOAA believe that sea level rise is due to man-made climate change?

Yes. There is a video on this NOAA Ocean Service website, produced by NASA and NOAA, that explains this perfectly.

What Anthony Watts is doing is taking advantage of NOAA's apparent reluctance to link data concerning sea level rise with man-made global warming, in order to drive doubt into the discussion.

Next, Watts goes on to talk about semi-empirical models (SEMs), citing a passage in AR5:

"Many semi-empirical model projections of global mean sea level rise are higher than process-based model projections (up to about twice as large), but there is no consensus in the scientific community about their reliability and there is thus low confidence in their projections"

Watts uses this problem with SEMs to cast doubt on a report by the California Ocean Science Trust, covered by The Mercury News, which suggests that sea level off California could rise by as much as 10 feet.

On this discussion of sea level rise, NASA says this:

"Projections of global sea level rise by 2100, the year upon which climate modelers typically focus, vary widely depending on modeling methods and on assumptions—the rate of increase in greenhouse gas emissions, for example, and especially how ice sheets will respond to warming air and ocean water."


"SEMs [Rahmstorf et al., 2012 and references therein] take a simple approach—a kind of shortcut—to simulating future sea level rise. Instead of trying to model the processes underlying sea level change, these models rely on sea-level changes observed in previous decades and their relationship to global temperature. Then they apply that same relationship to the century to come. The resulting projections tend to be significantly higher than those derived from process-based modeling."


"An illustrative example can be found in a recent study contrasting the projections of process-based and semi-empirical models [Perrette et al., 2013]. Global mean sea level rise from major sources—thermal expansion, glaciers, and the Greenland and Antarctic ice sheets—total 0.42 meters by 2100 in the process based RCP 6.0 model, considered a mid-range, standard-type emission scenario. But updated with the semi-empirical approach, the same model yields a total of 0.86 meters, more than twice the process-based value." 

A paper by John C. Moore et al (2015) says this in its abstract:

"We review the two main approaches to estimating sea level rise over the coming century: physically plausible models of reduced complexity that exploit statistical relationships between sea level and climate forcing, and more complex physics-based models of the separate elements of the sea level budget. Previously, estimates of future sea level rise from semiempirical models were considerably larger than those from process-based models. However, we show that the most recent estimates of sea level rise by 2100 using both methods have converged..."

Stefan Rahmstorf (2007) in Science explains the situation thus:

"...our capability for calculating future sea-level changes in response to a given surface warming scenario with present physics-based models is very limited, and models are not able to fully reproduce the sea-level rise of recent decades. Rates of sea-level rise calculated with climate and ice sheet models are generally lower than observed rates. Since 1990, observed sea level has followed the uppermost uncertainty limit of the Intergovernmental Panel on Climate Change (IPCC) Third Assessment Report (TAR), which was constructed by assuming the highest emission scenario combined with the highest climate sensitivity and adding an ad hoc amount of sea-level rise for “ice sheet uncertainty”.

While process-based physical models of sea-level rise are not yet mature, semi-empirical models can provide a pragmatic alternative to estimate the sea-level response. This is also the approach taken for predicting tides along coasts (for example, the well-known tide tables), where the driver (tidal forces) is known, but the calculation of the sea-level response from first principles is so complex that semi-empirical relationships perform better. Likewise, with current and future sea-level rise, the driver is known [global warming (1)], but the computation of the link between the driver and the response from first principles remains elusive. "
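The semi-empirical approach Rahmstorf describes can be sketched in a few lines: assume the rate of sea-level rise is proportional to warming above a baseline, then integrate that rate over a temperature pathway. The proportionality constant echoes the ~3.4 mm/yr per °C figure from Rahmstorf (2007), but the temperature pathway below is an illustrative assumption, not a fitted scenario:

```python
import numpy as np

# Minimal Rahmstorf-style semi-empirical model:
#   dH/dt = a * (T(t) - T0)
# i.e. the rate of rise is proportional to warming above a baseline.
a = 3.4    # mm/yr per °C (Rahmstorf 2007 reports roughly this value)
T0 = 0.0   # baseline temperature anomaly, °C

years = np.arange(2000, 2101)
T = 0.02 * (years - 2000)   # assumed pathway: ~2 °C of warming by 2100

# Integrate the rate (trapezoidal rule) to get sea level relative
# to the year 2000, in millimetres.
rate = a * (T - T0)
H = np.concatenate(([0.0], np.cumsum((rate[:-1] + rate[1:]) / 2)))

print(f"projected rise by 2100: {H[-1] / 1000:.2f} m")
```

Under these assumed numbers the model gives about a third of a metre by 2100; the point is not the figure itself but that the whole projection hangs on the observed statistical link between temperature and sea level, which is exactly where the "low confidence" debate sits.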

So, Watts is using the uncertainty among scientists regarding whether or not SEMs can be relied upon to predict future sea-level rise to cast doubt on scientific reports focusing on sea level rise and their inevitable coverage by the media.

However, Rahmstorf makes it clear that this is basically detail – the issue is "computation of the link between the driver and the response". The larger, overriding conclusion, that global warming is causing sea level rise, is already known and accepted.

Except by climate change deniers that is....

Tuesday, 2 May 2017

Maize is not to blame!

Maize has been wrongly blamed for soil erosion. My article in Bioenergy Insight has just been published and you can see a brief summary here:

In February 2014, just after the serious winter floods that hit the UK in that year and in 2013, Guardian columnist George Monbiot went on the warpath against maize, claiming that the crop is a major cause of soil erosion and run-off.

But it's not quite as simple as that, and blaming maize in isolation, rather than inappropriate farming practices generally, is a major error. It's just not true that maize is the culprit, yet maize farmers have still got the blame. It's time to set the record straight.

Fortum Charge & Drive enters UK e-mobility market with new charging infrastructure agreement

Fortum Charge & Drive has signed a cooperation agreement to provide Franklin Energy in the United Kingdom with the Charge & Drive cloud solution.

The agreement will enable Franklin Energy to offer e-mobility solutions to electric vehicle drivers across the UK and operate its growing electric vehicle charging infrastructure. The company has ambitious plans to use the Fortum Charge & Drive solution to further expand its charging infrastructure in the United Kingdom and develop its service, thereby providing the most user-friendly EV charging solutions in the UK. This agreement will enable the introduction of the Fortum Charge & Drive cloud solution to operate Franklin’s existing EV charging stations as well as new chargers to be installed in the future.

Franklin, based in Liverpool, currently offers charging services in multiple locations in the UK and has secured an investment to build a significant charging infrastructure of its own as well as to operate charging stations for others. In 2016 it launched the LiFe charging network, and by the end of 2017 Franklin Energy aims to have a presence in every major UK city in England, Scotland and Ireland.

ADBA slams food waste levels in England

The Anaerobic Digestion & Bioresources Association (ADBA) has joined the Environment, Food and Rural Affairs (EFRA) Select Committee in calling food waste levels in England a ‘scandal’.

EFRA’s final report into the costs of food waste in England has been described by ADBA and EFRA as a ‘wake-up call' for the next Government. The report, published this week, examines what Committee Chair Neil Parish has called the ‘grotesque economic, environmental and social costs’ of food waste in England, estimated at over £10 billion a year excluding the disposal costs to Local Authorities.

It makes a series of recommendations on how food waste levels could be reduced and explicitly recognises the role that anaerobic digestion (AD) plays in recycling inedible food waste into low-carbon heat and power, green transport fuel, and organic biofertiliser, helping to recover valuable resources from food waste and meet the UK’s decarbonisation targets. The report states that AD is the best treatment option for food waste that cannot be avoided or redistributed for human or animal consumption.

Tropospheric Warming

Another comment on Eschenbach's blog:

"does not a tropospheric record of warming over the past 35 years that is less then 1/3 of the projected computer model mean for the troposphere, ( note, not just the missing hotspot) cause you to question the deeply adjusted surface record.

At any rate, whatever the cause of the surface warming, ( real warming, UHI, one way adjustments, etc…) Per CAGW theory, the surface warming CANNOT be from CO2, as that surface warming must occur as a result of prior tropospheric warming."

To answer the first question: according to Sherwood et al (2008), there was indeed some controversy about tropospheric warming, with some scientists observing changes that were inconsistent with modelling. However, this was due to errors in the measurement records, which have now been corrected.

Turning to Skeptical Science again, modern satellite data does now "show warming in the troposphere that is consistent with the surface temperature record". John Christy and Ben Santer of University of Alabama in Huntsville (UAH, 2006) have this to say:

"This significant discrepancy no longer exists because errors in the satellite and radiosonde data have been identified and corrected. New data sets have also been developed that do not show such discrepancies."

More information on this can be accessed here.

Hansen 1988

I am now going to divert away from the Eschenbach blog again and address one of the comments to this blog instead, this one concerning Hansen's 1988 paper. The comment (or at least the relevant bits) is as follows:

"...why not focus on measuring the accuracy of the forecasts based on the models.

Let's start with the Hansen paper of 1988:"

Which is this one:

"Figure 1, shows a forecast of a surface temperature rise of c. 1.1 deg C from 1980 to 2016 under scenario A"

Oh really? I wonder if he is talking about a different paper, although the URL seems to be correct. Figure 1 in my copy of the paper is this:

Actually, the correct graph (having looked at the Skeptical Science entry for Hansen 1988) is this one:

"Plate 4 and section 5.2.3 predicts even faster warming of the troposphere, particularly the tropical troposphere. Indeed, the troposphere warming is cited as an 'useful diagnostic' for the greenhouse effect."

This is Plate 4:

This is 5.2.3:

"Now to what happened.

1) Greenhouse gas emissions have been even higher than predicted under scenario A."

Since I can't find the Figure 1 he cites, I've no idea what he is referring to in this sentence. So I am just going to take the whole lot and treat it as one big conclusion.

"2) According to wft, surface temperatures have risen by about 0.9 deg C (Giss), from 1980 to 2016, but have since fallen as the El Nino effect dissipates. Many challenge the GISS dataset, as the level of warming from 1880 to 2000 shown by that dataset has risen from around 0.5 in 2002 to about 1.2 deg C in 2014. Surely, they knew how to read thermometers in the early 20th century?

3) Troposphere temperatures have risen about 0.4 deg C for the same period.

At the same time, estimates of Equilibrium Climate Sensitivity (ECS) to CO2 doubling have fallen to about 1.6 deg C. This is (just) within the IPCC range of 1.5-4.5 deg C, but interestingly quite a bit below the alleged danger limit of 2 deg C.

In conclusion, Hansen's forecast was wrong. He predicted far more warming than has actually occurred. Crucially, the troposphere has warmed less than the surface, so the prediction about the troposphere being a useful diagnostic for the greenhouse effect has been proven wrong.

Ergo, the greenhouse effect has been over-stated. Moreover, spending trillions to reduce CO2 emissions is probably the wrong solution to global warming. If far more of the warming is natural than we thought, then it would be better to spend on adaptation, since that will also help with CO2 induced warming."

Now it is my turn to make some points.

This URL ( isn't hyperlinked, and when I pasted it into Google it went nowhere, so I can't see the graph.

The URL cited above for 'realclimatescience' leads to something called "The Deplorable Climate Science Blog". According to DeSmogBlog, this is operated by Tony Heller, who writes under the pseudonym Steven Goddard. DeSmogBlog tells us that he has a Masters in Electrical Engineering (not climate science) and a BS in Geology (not climate science). Heller wrote an article in The Register in 2008 stating that Arctic ice was not receding, while also claiming that data from the National Snow and Ice Data Center (NSIDC), which drew the opposite conclusion, was incorrect. However, he later retracted that statement:

“Dr. Walt Meier at NSIDC has convinced me this week that their ice extent numbers are solid…. It is clear that the NSIDC graph is correct, and that 2008 Arctic ice is barely 10% above last year — just as NSIDC had stated.”

Heller/Goddard can't be trusted on climate science. Therefore neither can this comment by 'Anonymous'.

Anonymous's overall conclusion, though, is that the Hansen forecast was wrong. Let's see what others say about this.

Skeptical Science answers this point in detail:

"Hansen's 1988 results are evidence that the actual climate sensitivity is about 3°C for a doubling of atmospheric CO2."

"In 1988, James Hansen projected future warming trends. He used 3 different scenarios, identified as A, B, and C. Each represented different levels of greenhouse gas emissions.  Scenario A assumed greenhouse gas emissions would continue to accelerate.  Scenario B assumed a slowing and eventually constant rate of growth. Scenario C assumed a rapid decline in greenhouse gas emissions around the year 2000.  The actual greenhouse gas emissions since 1988 have been closest to Scenario B. As shown below, the actual warming has been less than Scenario B.

As climate scientist John Christy noted, "this demonstrates that the old NASA [global climate model] was considerably more sensitive to GHGs than is the real atmosphere."  However, Dr. Christy did not investigate why the climate model was too sensitive.  There are two main reasons for Hansen's overestimate:
  1. Scenario B, which was the closest to reality, slightly overestimated how much the atmospheric greenhouse gases would increase. This isn't just carbon dioxide. It also includes methane and chlorofluorocarbons (CFCs).
  2. Hansen's climate model had a rather high climate sensitivity parameter. Climate sensitivity describes how sensitive the global climate is to a change in the amount of energy reaching the Earth's surface and lower atmosphere.
If we take into account the lower atmospheric greenhouse gas increases, we can compare the observed versus projected global temperature warming rates, as shown in the Advanced version of this rebuttal. To accurately predict the global warming of the past 22 years, Hansen's climate model would have needed a climate sensitivity of about 3.4°C for a doubling of atmospheric CO2.  This is within the likely range of climate sensitivity values listed as 2-4.5°C by the IPCC for a doubling of CO2. It is even a bit higher than the most likely value currently widely accepted as 3°C.

In short, the main reason Hansen's 1988 warming projections were too high is that he used a climate model with a high climate sensitivity. His results are actually evidence that the true climate sensitivity parameter is within the range accepted by the IPCC."
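The Skeptical Science argument amounts to a first-order rescaling: projected warming scales roughly with climate sensitivity, so a projection from a model with ~4.2 °C sensitivity per CO2 doubling (Hansen's 1988 GCM) can be scaled down to today's ~3 °C best estimate. A hedged sketch of that arithmetic (the decadal trend figure is illustrative, not taken from the paper):

```python
import math

# First-order rescaling of a model projection by climate sensitivity.
model_ecs = 4.2   # °C per CO2 doubling, Hansen's 1988 GCM
best_ecs = 3.0    # °C per CO2 doubling, IPCC central estimate

hansen_trend = 0.26   # assumed Scenario B warming trend, °C/decade (illustrative)
rescaled = hansen_trend * best_ecs / model_ecs
print(f"rescaled trend: {rescaled:.2f} °C/decade")

# The sensitivity framing rests on the standard logarithmic forcing
# formula dF = 5.35 * ln(C/C0) W/m^2; a doubling of CO2 gives:
f2x = 5.35 * math.log(2)
print(f"forcing for doubled CO2: {f2x:.2f} W/m^2")
```

This is only the first-order correction; the Skeptical Science rebuttal also adjusts for the lower-than-assumed greenhouse gas increases, which the scaling above does not capture.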

So there you go.

Monday, 1 May 2017

CO2 as the driver of climate change

Moving forward a little through the comments on Eschenbach's blog, to another statement by Chimp:

"That is, if CO2 indeed be the predominant driver of “climate change”"

More utter nonsense. That carbon dioxide (CO2) is the driver of man-made climate change is basic science. The warming potential of CO2 has been known since 1859, when John Tyndall conducted laboratory experiments to identify gases in the atmosphere that trap heat; he identified water vapour (H2O) and carbon dioxide (CO2) as two of the most important. The warming potential of CO2 remains true even though it is only present in the atmosphere in small quantities, i.e. a few parts in ten thousand. Tyndall's conclusions were supported by other scientists, such as Svante Arrhenius and Arvid Högbom, and many others afterwards.

There is also the question of whether CO2 lags (i.e. follows, rather than precedes) temperature. However, the science confirms that around 90 percent of the glacial-interglacial warming occurred after the increase in atmospheric CO2. Skeptical Science puts it like this:

Over the past 400,000 years, CO2 and temperature are closely correlated. However, data from Antarctic ice cores show that the initial changes in CO2 followed changes in temperature by about 600 to 1,000 years. This has led deniers to conclude that CO2 can't be responsible for current warming.

The problem with this claim is that the initial changes in temperature were caused by changes in the Earth's orbit around the sun. Orbital changes affect the amount of seasonal sunlight reaching the Earth's surface. The lag between temperature and CO2 is explained by the oceans: as the oceans warm, they release CO2 into the atmosphere, which in turn increases the warming, leading to yet more CO2 being released. The CO2 increase thus becomes both an effect and a cause of further warming. This is known as a positive feedback.
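The ocean-CO2 positive feedback can be sketched as a toy converging series: each round of outgassing adds a fraction of the previous round's warming, so for a feedback fraction below 1 the initial warming is amplified but finite rather than runaway. The numbers here are illustrative only, not measured values:

```python
# Toy model of the ocean-CO2 positive feedback: an initial orbitally
# triggered warming dT0 releases CO2, adding a fraction f of extra
# warming, which releases more CO2, and so on. For f < 1 the series
# dT0 * (1 + f + f^2 + ...) converges to dT0 / (1 - f).
def total_warming(dT0: float, f: float, steps: int = 100) -> float:
    """Sum the feedback series over a finite number of rounds."""
    total, increment = 0.0, dT0
    for _ in range(steps):
        total += increment
        increment *= f  # each round of outgassing adds f times the last
    return total

initial = 1.0  # °C of orbitally triggered warming (assumed)
f = 0.5        # fraction of each round's warming re-amplified via CO2 (assumed)
print(total_warming(initial, f))  # converges towards 1 / (1 - 0.5) = 2.0
```

With these assumed values the feedback doubles the initial orbital warming, which is the sense in which CO2 is "both an effect and a cause" without implying a runaway.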

Shakun et al. (2012) found that:

  • The Earth's orbital cycles triggered warming in the Arctic approximately 19,000 years ago, causing large amounts of ice to melt, flooding the oceans with fresh water. 
  • This influx of fresh water then disrupted ocean current circulation, in turn causing a seesawing of heat between the hemispheres.
  • The Southern Hemisphere and its oceans warmed first, starting about 18,000 years ago. As the Southern Ocean warms, the solubility of CO2 in water falls, causing the oceans to give up more CO2 and release it into the atmosphere.
The New Scientist explains that this proves that "rising CO2 was not the trigger that caused the initial warming at the end of these ice ages", but no scientist has claimed that to be the case anyway. Rather, it was to do with orbital changes (Milankovitch cycles). This does not in any way negate the fact that CO2 drives warming.

Ah, the 1930s temperature record...

Next up on Eschenbach's blog, a comment by Steve Case about the 1930s temperature record:

"Wow, the cooling of the 1930’s temperature record has hit new lows"

The New Scientist has this to say:

"After rising rapidly during the first part of the 20th century, global average temperatures did cool by about 0.2°C after 1940 and remained low until 1970, after which they began to climb rapidly again.

The mid-century cooling appears to have been largely due to a high concentration of sulphate aerosols in the atmosphere, emitted by industrial activities and volcanic eruptions. Sulphate aerosols have a cooling effect on the climate because they scatter light from the Sun, reflecting its energy back into space.

The rise in sulphate aerosols was largely due to the increase in industrial activities at the end of the second world war. In addition, the large eruption of Mount Agung in 1963 produced aerosols which cooled the lower atmosphere by about 0.5 degrees C while solar activity levelled off after increasing at the beginning of the century.

The clean air acts introduced in Europe and North America reduced emissions of sulphate aerosols. As levels fell in the atmosphere, their cooling effect was soon outweighed by the warming effect of the steadily rising levels of greenhouse gases."

I am going to skip forward now, but only a little bit, to a quite ridiculous statement by someone called Chimp, who says:

"There has been no statistically significant warming in the 21st century. The so-called “surface” data are totally bogus, anti-scientific fantasy perpetrated by lying, tough-feeding bureaucrats in order to try to make their GIGO models look less epically failed and to keep the CACA gravy train rolling."

Of course, it's utter bull. Chimp provides no evidence to support either claim. Meanwhile, the Australian Government reported that 2013 was Australia's warmest year on record:

"2013 was Australia’s warmest year since records began in 1910. Mean temperatures across Australia have generally been well above average since September 2012. Long periods of warmer-than-average days have been common, with a distinct lack of cold weather. Nights have also been warmer than average, but less so than days.

The Australian area-averaged mean temperature for 2013 was +1.20 °C above the 1961–1990 average. Maximum temperatures were +1.45 °C above average, and minimum temperatures +0.94 °C above average. Temperatures were above average across nearly all of Australia for maximum, mean and minimum temperatures, with large areas of inland and southern Australia experiencing the highest on record for each.

Australia has experienced just one cooler-than-average year (2011) in the last decade. The 10-year mean temperature for 2004–2013 was 0.50 °C above average, the equal-highest on record. Averages for each of the ten-year periods from 1995–2004 to 2004–2013 have been amongst the top ten records.

The Australian mean rainfall total for 2013 was 428 mm (37 mm below the long-term average of 465 mm). In comparison with rainfall in all years since 1900, 2013 sits close to the median or mid-point of historical observations.

Annual rainfall was below average across a large region of the inland east centred on western Queensland and extending into northern South Australia and the Northern Territory. Rainfall was above average over parts of the Pilbara and the south coast of Western Australia, as well as along the east coast and northern Tasmania."

The World Meteorological Organisation (WMO) has also confirmed that 2016 was the warmest year on record.

So Chimp's statement is just utter nonsense. However, he goes on to say:

"The peaks in 1997-98 and 2015-16 are from super El Ninos, totally natural events. The recent one was fractionally warmer, but there is no reason to suppose that humans are responsible for the insignificant difference."

...except that NOAA data show the 2014-2016 El Niño didn't even begin until October 2014 and remained borderline until mid-2015. It did add to the warming from late 2015 onwards, but even with the El Niño contribution removed, 2016 is still the warmest year on record. So Chimp's statement on this is again a load of nonsense.
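The idea of "removing the El Niño factor" can be illustrated with a toy calculation: regress the temperature anomaly series on an ENSO index, then subtract the fitted ENSO term. The numbers below are invented purely for illustration (they are not real NOAA or ONI data), and this is a deliberately crude sketch of the more careful methods climate scientists actually use:

```python
import numpy as np

# Hypothetical annual temperature anomalies (°C) and a hypothetical
# ENSO index for the same years -- illustrative values only, not real data.
years = np.arange(2010, 2017)
anomaly = np.array([0.72, 0.61, 0.65, 0.68, 0.75, 0.90, 1.00])
oni = np.array([-1.4, -0.9, -0.2, -0.3, 0.6, 2.2, 0.9])

# Least-squares fit: anomaly ≈ intercept + trend * year + sensitivity * ONI
X = np.column_stack([np.ones_like(years, dtype=float),
                     years - years[0],
                     oni])
coef, *_ = np.linalg.lstsq(X, anomaly, rcond=None)

# Subtract only the ENSO term, leaving the underlying trend in place
enso_removed = anomaly - coef[2] * oni
```

The point of the exercise: `enso_removed` keeps the long-term warming signal while stripping out the year-to-year wiggle attributable to El Niño and La Niña, which is the kind of analysis behind the claim that 2016 leads the record even without its El Niño boost.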

Climate scientist Zeke Hausfather of Berkeley Earth has more to say on this:

"It is all but certain now that 2016 will shatter historical records to be the warmest year ever by a wide margin. It was helped along the way by a large El Niño event, which tends to be associated with warmer temperatures globally. But, even without El Niño, 2016 would likely still be the warmest year ever."

He goes on to say that David Rose's article in the Mail on Sunday in November 2016 was "deeply misleading", based as it was on cherry-picking an obscure temperature record whose creator warned that it should be "used with caution". Hausfather says the warmth of 2014, 2015 and 2016 was driven by human greenhouse gas emissions, not El Niño.

Tampering of data by NASA? Oh please, really...

Next up on Eschenbach's blog is this comment by A C Osborn:

"Don't - massive tampering of the data by NASA & NAOO – give you slight pause?"

I am unsure whether Osborn is supporting or criticising the 'tampering of data' claim here, but it gives me an opportunity to squash it anyway.

NASA – the organisation that sent men to the Moon... really? Brian Cox had something to say about this:

So I covered this just now, but let's go over it again:

The series of blog posts, written by climate change denier Paul Homewood, was highly publicised in the Daily Telegraph by Christopher Booker – who has regularly been criticised, for example by George Monbiot, for his woeful knowledge and regular mistakes. Both Homewood and Booker focused on adjustments made to temperature readings at particular monitoring stations around the world, claiming that these adjustments throw the entire science of global warming into question. This isn't the case at all: as I recently explained, these adjustments are a normal and important part of climate science.

The National Oceanic and Atmospheric Administration (NOAA), the U.S. agency responsible for monitoring national and global temperature trends, has debunked these accusations of data manipulation on several occasions on its website. To explain again: over time, the thousands of weather stations around the world undergo changes that can produce sudden or unrealistic discrepancies in observed temperatures, which then require correction. For example, a building might be constructed nearby that affects the readings collected by a station, or its instruments might be updated or replaced. Detailed station histories help to identify and correct these discrepancies, and some of the required corrections are relatively simple.

NOAA maintains about 1,500 monitoring stations, and gathers data from more than a thousand other stations in countries around the world. This data is shared freely by many national and international organisations. There are actually fewer monitoring stations today than there used to be; modern stations have better technology and are accessible in real time, unlike some older outposts no longer in use. The raw, unadjusted data from these stations is available from many sources, including the international collaboration known as the Global Historical Climatology Network and others.

As the years go by, all those stations undergo various types of change: shifts in how monitoring is done, improvements in technology, or even just the addition or removal of nearby buildings.

For example, a new building constructed next to a monitoring station could cast a shadow over a station, or change wind patterns, in such ways that could affect the readings. Also, the timing of temperature measurements has varied over time. And in the 1980s, most U.S. stations switched from liquid-in-glass to electronic resistance thermometers, which could both cool maximum temperature readings and warm minimum readings.

Monitoring organisations like NOAA use data from nearby stations to adjust for these situations and their impact on the accuracy of the record. Readings for a given station are raised or lowered to bring them into line with its neighbours, a process known as homogenisation. The most significant adjustment worldwide, according to NOAA, is actually for temperatures taken over the oceans, and that adjustment lowers rather than raises the global temperature trend.
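The homogenisation process described above can be sketched in a few lines of Python. This is a deliberately simplified toy: real algorithms such as NOAA's pairwise method detect breakpoints statistically rather than being told where they are, and the station data below are invented for illustration:

```python
import numpy as np

def homogenise(station, neighbours, break_year):
    """Adjust readings before `break_year` so the station-minus-neighbour
    difference series has the same mean on both sides of the break."""
    diff = station - neighbours.mean(axis=0)
    offset = diff[break_year:].mean() - diff[:break_year].mean()
    adjusted = station.copy()
    adjusted[:break_year] += offset  # shift the earlier segment into line
    return adjusted

# Hypothetical data: three stable neighbouring stations, and one station
# whose instrument change in year 5 introduced a spurious -0.6 °C step.
neighbours = np.tile(np.linspace(14.0, 14.9, 10), (3, 1))
station = neighbours[0].copy()
station[:5] -= 0.6

fixed = homogenise(station, neighbours, break_year=5)
```

Here the correction recovers the neighbours' shared trend: the difference from nearby stations reveals the artificial step, and the earlier segment is shifted to remove it, exactly the logic (in miniature) of the adjustments Homewood and Booker objected to.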

These homogenisation methods have been validated and peer-reviewed. For example, a 2012 paper in the Journal of Geophysical Research confirmed the effectiveness of the homogenisation processes for NOAA’s network of stations, and even noted that “it is likely that maximum temperature trends have been underestimated.” In other words, there may have actually been more warming than NOAA has reported.

Another paper, from 2010, looked into the siting of U.S. monitoring stations in particular, and again found no problem with the homogenisation methods. “The adjusted [U.S. Historical Climatology Network] temperatures are extremely well aligned with recent measurements. … In summary, we find no evidence that the [conterminous United States] average temperature trends are inflated due to poor station siting.”

Berkeley Earth, a climate science nonprofit founded in early 2010 by scientists who at the time expressed scepticism about global warming, has also found no undue manipulation of temperature data in its own analyses. Its page on the Paraguayan Puerto Casado station that Homewood mentioned shows that the adjusted readings do in fact show a rise in temperature over time.

An October 2011 paper in the Journal of Geophysical Research provides an overview of the entire Global Historical Climatology Network’s temperature data set, including detailed information about adjustments. In total, at least one “bias correction” was applied to 3,297 of the 7,279 stations in use at some point since 1801, though most of these occurred from the 1950s through the 1980s. There are approximately equal numbers of adjustments in the positive and negative directions.

So the claim of 'data tampering' is simply ridiculous.
