
The United Nations: An Unconstrained Bureaucracy

August 10, 2018

By Paul Homewood


It’s two years old, but still a highly relevant insight into the operations of the UN:





The United Nations is financed mostly by taxpayers from a few donor countries but the large and growing bureaucracy is too far removed from those taxpayers to be directly accountable to them. It is run by unelected, unaccountable, undisciplined, and incompetent bureaucrats. The organization’s size, budget, and scope are unconstrained. The budget funding process provides perverse incentives for these bureaucrats to increase the size and scope of their organization simply by creating multitudes of agencies and programs, and by inventing problems and environmental crises set on a global scale.


Putin’s Green Puppets

August 10, 2018

By Paul Homewood


Guido has the lowdown on Russia’s latest attempt to undermine fracking in the West:


Growth In Thermal Generation Continues To Outpace Renewables In China

August 9, 2018

By Paul Homewood





Anybody who thinks China is rapidly shifting to renewable energy needs to look at the latest electricity data from the China Energy Portal.

Whilst wind and solar generation has increased by 51 TWh year-on-year in Q2, thermal has increased by 176.9 TWh.

It was a similar situation in Q1:



To put the figures into perspective, total generation in Q2 was 3194 TWh, so the increase of 51 TWh from wind/solar represents just 1.6%. However, because total generation increased by 245 TWh, demand for coal and gas generation increased even more.

In total, wind and solar accounted for 6.6% of generation in the quarter, compared to 5.5% a year ago.

Year-on-year, installed thermal capacity has risen by 4.1%, following an increase of 4.6% in 2017.
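The figures above are easy to check for yourself. A quick back-of-envelope in Python, using only the numbers quoted from the China Energy Portal (the share of the increase taken by thermal is derived here, not quoted in the post):

```python
# Back-of-envelope check of the quoted Q2 China generation figures.
q2_total_twh = 3194.0           # total generation in Q2
wind_solar_increase_twh = 51.0  # year-on-year increase, wind + solar
thermal_increase_twh = 176.9    # year-on-year increase, thermal
total_increase_twh = 245.0      # year-on-year increase, all sources

# The wind/solar increase as a share of total Q2 generation:
share = wind_solar_increase_twh / q2_total_twh * 100
print(f"Wind/solar increase vs total generation: {share:.1f}%")  # ~1.6%

# Derived (not quoted): thermal's share of the overall demand growth.
thermal_share = thermal_increase_twh / total_increase_twh
print(f"Thermal share of the year-on-year increase: {thermal_share:.0%}")  # ~72%
```

That is, thermal generation absorbed roughly seven-tenths of the demand growth, consistent with the point that coal and gas demand rose even faster than renewables.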

Tornado Trends

August 8, 2018

By Paul Homewood 


As I’ve often commented, NOAA keep insisting on publishing charts of total tornado numbers, even though they know full well that these numbers are grossly misleading, and simply reflect the fact that more tornadoes are reported these days.

As McCarthy & Schaefer pointed out in their paper, “TORNADO TRENDS OVER THE PAST THIRTY YEARS”:

 The increase in reported tornado frequency during the early 1990s corresponds to the operational implementation of Doppler weather radars. Other non-meteorological factors that must be considered when looking at the increase in reported tornado frequency over the past 33 years are the advent of cellular telephones; the development of spotter networks by NWS offices, local emergency management officials, and local media; and population shifts.

The growing “hobby” of tornado chasing has also contributed to the increasing number of reported tornadoes. The capability to easily photograph tornadoes with digital photography, camcorders, and even cell phone cameras not only provides documentation of many weak tornadoes, but also, on occasion, shows the presence of multiple tornadoes immediately adjacent to each other.


NOAA themselves state:

One of the main difficulties with tornado records is that a tornado, or evidence of a tornado must have been observed. Unlike rainfall or temperature, which may be measured by a fixed instrument, tornadoes are short-lived and very unpredictable. If a tornado occurs in a place with few or no people, it is not likely to be documented. Many significant tornadoes may not make it into the historical record since Tornado Alley was very sparsely populated during the 20th century.

Much early work on tornado climatology in the United States was done by John Park Finley in his book Tornadoes, published in 1887. While some of Finley’s safety guidelines have since been refuted as dangerous practices, the book remains a seminal work in tornado research. The University of Oklahoma created a PDF copy of the book and made it accessible at John Finley’s Tornadoes (link is external).

Today, nearly all of the United States is reasonably well populated, or at least covered by NOAA’s Doppler weather radars. Even if a tornado is not actually observed, modern damage assessments by National Weather Service personnel can discern if a tornado caused the damage, and if so, how strong the tornado may have been. This disparity between tornado records of the past and current records contributes a great deal of uncertainty regarding questions about the long-term behavior or patterns of tornado occurrence. Improved tornado observation practices have led to an increase in the number of reported weaker tornadoes, and in recent years EF-0 tornadoes have become more prevalent in the total number of reported tornadoes. In addition, even today many smaller tornadoes still may go undocumented in places with low populations or inconsistent communication facilities.

With increased National Doppler radar coverage, increasing population, and greater attention to tornado reporting, there has been an increase in the number of tornado reports over the past several decades. This can create a misleading appearance of an increasing trend in tornado frequency. To better understand the variability and trend in tornado frequency in the United States, the total number of EF-1 and stronger, as well as strong to violent tornadoes (EF-3 to EF-5 category on the Enhanced Fujita scale) can be analyzed. These tornadoes would have likely been reported even during the decades before Doppler radar use became widespread and practices resulted in increasing tornado reports. The bar charts below indicate there has been little trend in the frequency of the stronger tornadoes over the past 55 years.


To illustrate the reality, let’s first look at EF-0 trends.

Read more…

Hothouse Earth

August 7, 2018

By Paul Homewood

Latest crap from the warmist establishment, gleefully blown up by the BBC:



Climate change: ‘Hothouse Earth’ risks even if CO2 emissions slashed

It may sound like the title of a low budget sci-fi movie, but for planetary scientists, “Hothouse Earth” is a deadly serious concept.

Researchers believe we could soon cross a threshold leading to boiling hot temperatures and towering seas in the centuries to come.

Even if countries succeed in meeting their CO2 targets, we could still lurch on to this “irreversible pathway”.

Their study shows it could happen if global temperatures rise by 2C.

An international team of climate researchers, writing in the journal, Proceedings of the National Academy of Sciences, says the warming expected in the next few decades could turn some of the Earth’s natural forces – that currently protect us – into our enemies.

Each year the Earth’s forests, oceans and land soak up about 4.5 billion tonnes of carbon that would otherwise end up in our atmosphere adding to temperatures.

But as the world experiences warming, these carbon sinks could become sources of carbon and make the problems of climate change significantly worse.

So whether it is the permafrost in northern latitudes that now holds millions of tonnes of warming gases, or the Amazon rainforest, the fear is that the closer we get to 2 degrees of warming above pre-industrial levels, the greater the chances that these natural allies will spew out more carbon than they currently take in.

Back in 2015, governments of the world committed themselves to keeping temperature rises well below 2 degrees, and to strive to keep them under 1.5. According to the authors, the current plans to cut carbon may not be enough if their analysis is correct.

“What we are saying is that when we reach 2 degrees of warming, we may be at a point where we hand over the control mechanism to Planet Earth herself,” co-author Prof Johan Rockström, from the Stockholm Resilience Centre, told BBC News.

“We are the ones in control right now, but once we go past 2 degrees, we see that the Earth system tips over from being a friend to a foe. We totally hand over our fate to an Earth system that starts rolling out of equilibrium.”


The utterly corrupt body of climate science has been getting ever more desperate to scare people about climate change, and thereby make them submit to its radical anti-capitalist agenda.

People are not falling for it, so we are now being subjected to ever more absurd announcements like this.


The whole premise of this latest study is so wholly ridiculous that, had it been in any other field of science, it would have been instantly dismissed as juvenile fantasy ramblings with no evidence at all.

But, unfortunately, this is climate science, where any poorly qualified hack with a taste for easy grant money can publish whatever nonsense they want, in the knowledge that the corrupt pal review system will give it the nod, and that the complicit left wing media will give it top billing as indisputable truth.


This is the latest puerile attempt to fool the public.


I am on holiday, but it took me about ten minutes to spot the gaping fallacies in their argument.


1) The world has been much warmer than now, even in the recent past, yet we have never had this supposed runaway warming.


In particular, temperatures in the Arctic have been much higher throughout just about all of the Holocene. We know from ice cores that temperatures in Greenland in the 19th century were the lowest since the ice age.

Yet these con artists expect us to believe that current temperatures are leading us into oblivion.

By the same logic, the plunge into the Little Ice Age should also have led to runaway cooling.


The simple fact is that the world has a remarkably stable climate, which reacts to natural climatic changes, but does not run out of control.


2) The study relies heavily on warming in the Arctic, which will supposedly lead to ice loss, methane release etc, which will in turn cause further warming.

But we know that Arctic temperatures were much higher a few thousand years ago. We also know they were just as high in the 1930s and 40s.

There was no runaway warming then, and won’t be now.


3) The study also invokes the prospect of the Amazon rainforest dying off because of global warming, and the Sahel returning to desert. It is said that these events will add to CO2 in the atmosphere.

Yet we know from historical evidence of the early Holocene that a warmer world is a wetter, greener one. That is why the Sahara was verdant 5000 years ago.

Proper climate scientists, such as Hubert Lamb, knew long ago that cold global climates lead to droughts, not warm ones.


The public has shown itself remarkably resilient in its determination not to have its standard of living damaged by left wing climate policies. Hence the increasingly desperate attempts by the climate mafia to blame every hurricane, flood, heatwave, drought and wildfire on fossil fuels.

And when that does not work, why not try hell and damnation as well?


You would be entitled, reading the above warnings of apocalypse, to think that we are all doomed anyway. Yet, surprise, surprise, the authors offer us a get-out-of-jail-free card! Worldwide communism:

The authors say a total re-orientation of human values, equity, behaviour and technologies is required. We must all become stewards of the Earth.

Now, isn’t that a surprise?

Guardian: UK Churches Latest Victims of ‘Climate Change’

August 7, 2018

By Paul Homewood


From Breitbart:


“Climate change” threatens the future of UK’s historic churches, endangering roofs, towers, and spires, according to the National Churches Trust.


Read more…

Is your energy supply vegan?

August 5, 2018

By Paul Homewood


The mind boggles!


We’re more conscious than ever about the impact our buying choices have on the planet. But the energy industry still hides a secret about how it uses animals in its energy production.

You’ve got a right to know how your energy is made, so you can choose a supplier that fits your principles. Here’s everything you need to know about vegan energy, and how to switch to an animal-free supplier.

What is vegan energy?

Vegan energy is the production of electricity or gas that doesn’t involve the use of animals or animal by-products.

There are two main sources of non-vegan energy generation in the UK: anaerobic digestion (AD), and biomass. Both AD and biomass energy production can contain by-products of animal farming – like factory-farmed livestock, slaughterhouse waste, fish parts, and animal slurry.

While energy providers are required by Ofgem to declare the fuel mix of the energy they supply, there’s no obligation to declare whether animals are used in its production.

What’s the difference between green energy and vegan energy?

Green energy is the production of electricity or gas through renewable sources, in place of traditional fossil fuels. Green energy is produced by harnessing power from solar, wind, wave and tidal sources.

But just because an energy supply is green, it doesn’t make it vegan.

Many green energy companies, like Good Energy, Bulb, and Octopus, supply electricity that’s been generated using animals or animal by-products.

List of UK energy suppliers who use animal by-products in their fuel supply.


Do vegans really object to burning poo?

The $2.5 trillion reason we can’t rely on batteries to clean up the grid

August 4, 2018

By Paul Homewood


H/t Dave Ward


This article is reposted from MIT Technology Review:


The $2.5 trillion reason we can’t rely on batteries to clean up the grid

James Temple

A pair of 500-foot smokestacks rise from a natural-gas power plant on the harbor of Moss Landing, California, casting an industrial pall over the pretty seaside town.

If state regulators sign off, however, it could be the site of the world’s largest lithium-ion battery project by late 2020, helping to balance fluctuating wind and solar energy on the California grid.

The 300-megawatt facility is one of four giant lithium-ion storage projects that Pacific Gas and Electric, California’s largest utility, asked the California Public Utilities Commission to approve in late June. Collectively, they would add enough storage capacity to the grid to supply about 2,700 homes for a month (or to store about 0.0009 percent of the electricity the state uses each year).

The California projects are among a growing number of efforts around the world, including Tesla’s 100-megawatt battery array in South Australia, to build ever larger lithium-ion storage systems as prices decline and renewable generation increases. They’re fueling growing optimism that these giant batteries will allow wind and solar power to displace a growing share of fossil-fuel plants.

But there’s a problem with this rosy scenario. These batteries are far too expensive and don’t last nearly long enough, limiting the role they can play on the grid, experts say. If we plan to rely on them for massive amounts of storage as more renewables come online—rather than turning to a broader mix of low-carbon sources like nuclear and natural gas with carbon capture technology—we could be headed down a dangerously unaffordable path.

Small doses

Today’s battery storage technology works best in a limited role, as a substitute for “peaking” power plants, according to a 2016 analysis by researchers at MIT and Argonne National Lab. These are smaller facilities, frequently fueled by natural gas today, that can afford to operate infrequently, firing up quickly when prices and demand are high.

Lithium-ion batteries could compete economically with these natural-gas peakers within the next five years, says Marco Ferrara, a cofounder of Form Energy, an MIT spinout developing grid storage batteries.

“The gas peaker business is pretty close to ending, and lithium-ion is a great replacement,” he says.

This peaker role is precisely the one that most of the new and forthcoming lithium-ion battery projects are designed to fill. Indeed, the California storage projects could eventually replace three natural-gas facilities in the region, two of which are peaker plants.

But much beyond this role, batteries run into real problems. The authors of the 2016 study found steeply diminishing returns when a lot of battery storage is added to the grid. They concluded that coupling battery storage with renewable plants is a “weak substitute” for large, flexible coal or natural-gas combined-cycle plants, the type that can be tapped at any time, run continuously, and vary output levels to meet shifting demand throughout the day.

Not only is lithium-ion technology too expensive for this role, but limited battery life means it’s not well suited to filling gaps during the days, weeks, and even months when wind and solar generation flags.

This problem is particularly acute in California, where both wind and solar fall off precipitously during the fall and winter months. Here’s what the seasonal pattern looks like:

[Chart: If renewables provided 80 percent of California electricity – half wind, half solar – generation would fall precipitously beginning in the late summer. Clean Air Task Force analysis of CAISO data]

This leads to a critical problem: when renewables reach high levels on the grid, you need far, far more wind and solar plants to crank out enough excess power during peak times to keep the grid operating through those long seasonal dips, says Jesse Jenkins, a coauthor of the study and an energy systems researcher. That, in turn, requires banks upon banks of batteries that can store it all away until it’s needed.

And that ends up being astronomically expensive.

Read the full post here:


I did my own analysis a few weeks ago, which also showed that battery storage capable of providing any meaningful backup for intermittent renewables would be ridiculously expensive and, in practical terms, pie in the sky.

This latest study comes to similar conclusions.


Interestingly, if we take the quoted four 300 MW storage plants proposed, and work back from “to supply 2,700 homes for a month”, I reckon this equates to 1350 MWh (assuming annual consumption of 6000 kWh).

In other words, the storage plants can supply the full 1200 MW for only slightly longer than an hour – about the same, from memory, as the Tesla battery in South Australia.

It seems an awfully expensive way to provide such a pitifully tiny amount of energy. And, of course, the power needed to charge them up in the first place must be paid for as well. At least small-scale gas peakers can run for as long as you want them to.
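The back-calculation above can be sketched in a few lines of Python, assuming (as the post does) annual household consumption of 6000 kWh:

```python
# Working back from "2,700 homes for a month" to storage capacity and duration.
homes = 2700
annual_kwh_per_home = 6000                 # assumption used in the post
monthly_kwh_per_home = annual_kwh_per_home / 12   # 500 kWh per month

storage_mwh = homes * monthly_kwh_per_home / 1000  # convert kWh to MWh
total_power_mw = 4 * 300                           # four 300 MW plants

hours_at_full_output = storage_mwh / total_power_mw
print(f"Implied storage capacity: {storage_mwh:.0f} MWh")       # 1350 MWh
print(f"Duration at full 1200 MW: {hours_at_full_output:.2f} h")  # ~1.1 hours
```

So the four plants together could sustain their full rated output for barely more than an hour.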

Latest Air Pollution Scare Debunked

August 3, 2018

By Paul Homewood


H/t stewgreen


The BBC has gone to town on this latest pollution scare research:

Low levels of air pollution linked to changes in the heart

Regular exposure to even low levels of air pollution may cause changes to the heart similar to those in the early stages of heart failure, experts say.

A study of 4,000 people in the UK found those who lived by loud, busy roads had larger hearts on average than those living in less polluted areas.

This was despite the fact people in the study were exposed to pollution levels below the UK guidelines.

Researchers called on the government to reduce air pollution more quickly.

A team of scientists, led from Queen Mary University of London, analysed health data of people who had no underlying heart problems and were part of the UK Biobank study, including the size, weight and function of their hearts.

Researchers also looked at the pollution levels in the areas they lived in.

Their study found a clear link between exposure to higher pollution levels and larger right and left ventricles – important pumping chambers in the heart.

For every extra one microgram per cubic metre of PM2.5 – small particles of air pollution – and for every 10 extra micrograms per cubic metre of nitrogen dioxide, the heart enlarged by about 1%.

The changes were comparable to being consistently inactive or having elevated blood pressure, said Dr Nay Aung, who led the study’s data analysis.

“Air pollution should be seen as a modifiable risk factor,” he said.

While the exact locations where people lived were not included in the study, most were outside of the major UK cities and all of them were exposed to levels of PM2.5 air pollution well below current UK limits.

In the study, average annual exposures to PM2.5 ranged from eight to 12 micrograms per cubic metre.

This is lower than the UK limits of 25 micrograms per cubic metre but closer to the World Health Organization’s recommended limit of 10 micrograms per cubic metre.

This fine particle pollution is particularly dangerous because it can penetrate deep into the lungs and cardiovascular system.

Exposure to nitrogen dioxide in the study ranged from 10-50 micrograms per cubic metre – the UK and WHO limits are 40 micrograms per cubic metre.
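For scale, a rough illustration of what the quoted dose-response figures imply across the exposure ranges reported (about 1% enlargement per 1 µg/m³ of PM2.5, and per 10 µg/m³ of NO2) — this is a sketch from the article's own numbers, not a result from the study itself:

```python
# Illustrative only: applying the quoted dose-response rates across the
# reported exposure ranges to get the implied spread in ventricle size.
pm25_low, pm25_high = 8, 12    # µg/m³, annual average PM2.5 range in the study
no2_low, no2_high = 10, 50     # µg/m³, NO2 exposure range in the study

pm25_spread = (pm25_high - pm25_low) * 1.0        # ~1% per µg/m³
no2_spread = (no2_high - no2_low) / 10 * 1.0      # ~1% per 10 µg/m³

print(f"Implied spread across the PM2.5 range: ~{pm25_spread:.0f}%")  # ~4%
print(f"Implied spread across the NO2 range:  ~{no2_spread:.0f}%")    # ~4%
```

In other words, even taking the study's figures at face value, the difference in heart size between the least and most exposed subjects would be of the order of a few percent.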


It’s hard to know where to start with this junk science, which even a sixth former should be ashamed of putting forward.

For instance, what about a whole range of social factors, such as smoking, alcohol, diet and exercise? Are people who live in areas of high pollution also likely to be more exposed to these risks?

And as Stew points out, your heart does not suddenly change as a result of pollution, or anything else. It could have taken decades to reach the current state.

The study explains that it is based on 3920 individuals’ cardiovascular images taken in 2010, and correlated to air pollution stats in 2005.

But the key parameter is that the studied cohort was 62 ± 7 years old.

In other words, they have all lived most of their lives amongst far greater pollution than they do now. And it’s highly likely that those still living in areas of heavy traffic have been exposed to much greater pollution in past decades.

There is zero evidence that current, low levels of pollution have had any impact at all on the health of the sampled population.


This is the link to the study:



July Arctic Sea Ice Extent Highest Since 2005

August 3, 2018

By Paul Homewood


According to DMI, average Arctic sea ice extent in July is at its highest level since 2005.

With temperatures at normal levels, there is little prospect of Peter Wadhams’ predictions coming true this year!