From buckets to beehives: how scientists collect climate data

From old seafarers to modern scientists, RTCC looks at the past and the present of temperature records

Pic: Jeremy Potter NOAA/OAR/OER

By Sophie Yeo

In 1852, a US Naval officer drew up a map of the North Atlantic.

The lines and figures neatly stencilled in by Lieutenant Matthew Fontaine Maury represented data on wind, currents and water temperature from the log books of dozens of ships in the waters off Florida, the Gulf Coast and the Caribbean.

For scientists today, this sepia document shows how keen sailors and explorers were to measure the heat of the ocean even before the climate had started to change.

More than 150 years ago, their reasons were different from those of the scientists who set out to gather sea temperature data today. The captains of these 19th century ships hoped their records would improve weather forecasting, allowing them to avoid the storms that could wreck their vessels.

But the data that they collected – using thermometers dunked into wooden buckets of water hauled on deck – is still used today by climate scientists curious to see how the planet is warming.

Maury’s 1852 map (Pic: United States Hydrographical Office)

Such data makes up the historical records on which the US federal agencies NASA and the National Oceanic and Atmospheric Administration (NOAA) relied to reach the conclusion they announced today: that 2014 was the hottest year on record.

The world of climate data itself has a tempestuous history. The leaked Climategate emails of 2009 cast doubt on the records and led many to question the integrity of the scientists responsible for compiling them.

Parliamentary inquiries found the allegations to be unfounded, but the scandal – which unfolded just before the UN’s fateful climate conference in Copenhagen began – prompted a surge of interest in how far the figures can be trusted.

“In the short term it had a certain amount of influence in certain circles, but there’s no doubt that the world has moved on a long way since then. But overall it didn’t affect the picture of the warming earth,” says Richard Black, director of the Energy and Climate Intelligence Unit.

From wood to canvas

Since the days of wooden buckets and sailing ships, the history of climate data collection has been one of constant improvement and ever greater accuracy, over both land and sea.

By the late 1800s, sailors had switched to more robust canvas buckets to collect the water. After 1940, ships began to use sensors at their engine water intakes, which could transmit the information automatically. And since the late 1990s, a large number of drifting buoys have been deployed across the world’s oceans, making it easier to monitor temperatures in areas beyond the reach of ships.

On land, problems with thermometers being exposed to sunlight were resolved around the 1870s with the invention of the Stevenson Screen – a white apparatus that looks like a beehive, which protects the instruments from outside interference. Many of these have now been replaced with automatic weather stations, which read and transmit the data without human intervention.

Pic: A Stevenson Screen

In 1938, an amateur climatologist called Guy Callendar was the first to use the records to show that land temperatures were rising, and to argue that carbon dioxide was to blame – a finding he celebrated, as it would delay “the return of the deadly glaciers”.

Nowadays, there are thousands of temperature data stations all over the world, which are installed, monitored and shared by each country’s meteorological agency.

The network is extensive. As long as a country is a member of the World Meteorological Organisation, it is obliged to gather the data. Oceanographic centres, mainly in developed countries, work with marine industries to gather sea surface data.

This means that, from the Antarctic to North Korea, scientists have a good grasp of what the climate is doing almost everywhere.

Corrections

But that doesn’t mean the data is perfect.

Scientific institutions compile the data into datasets – including the UK Met Office’s HadCRUT, NASA’s GISTEMP, NOAA’s MLOST and the Japan Meteorological Agency’s own record – but because some of these measurements are handled manually, quality control exercises have to be carried out to ensure the data is clean.

“Occasionally people do things like miss out a minus sign when the data gets submitted, or lose a decimal place, for example. The quality control process is going to vary from centre to centre,” says John Kennedy, a senior scientist on climate monitoring at the UK’s Met Office.
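To give a flavour of what such a check might look like, here is a minimal sketch of a crude range test that would catch a dropped minus sign or a slipped decimal place. The function, thresholds and readings are illustrative assumptions, not any agency’s actual quality control code.

```python
# A minimal, illustrative sanity check - not any agency's real QC procedure.
def flag_suspect_readings(temps_c, lo=-90.0, hi=60.0):
    """Return the indices of monthly mean temperatures (deg C) that fall
    outside a physically plausible range. A lost minus sign or a slipped
    decimal place usually pushes a value far outside anything reasonable,
    so even a crude range check catches it."""
    return [i for i, t in enumerate(temps_c) if not (lo <= t <= hi)]

# Example: the third value looks like 5.2 submitted with a misplaced
# decimal point (invented data for illustration).
readings = [14.1, 13.8, 520.0, 12.9]
print(flag_suspect_readings(readings))  # -> [2]
```

Flagged values would then be checked by hand or against neighbouring stations rather than simply discarded.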

Handling the historic data has also proved something of a headache for scientists, as different collection methods – and stations that have moved over time – mean the records are not always directly comparable.

This has been particularly problematic with sea temperature data, as it affects a large portion of the results. Comparisons with automatic ship sensors show that temperatures collected in canvas buckets read about 0.4C too cold, because the water cooled during the time it took to haul it up on deck, and so they have to be adjusted upwards.

The growth of cities around the stations can also have an impact, as the centres of Vienna and London, for instance, are warmer than the surrounding countryside. But studies have shown that the effect is relatively small, as the instruments are normally placed in a city’s parkland or at its airport. Unlike the bucket problem, it only affects a portion of the data.

These are all issues that scientists are able to correct with carefully formulated algorithms.
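As a rough illustration of the bucket correction described above – the flat offset and the records below are invented for this sketch; real adjustments vary by season, region and bucket type – such an adjustment might look something like this:

```python
# A toy sketch of a bucket-bias adjustment. The flat 0.4C offset and the
# records are illustrative assumptions, not values any real dataset applies.
CANVAS_BUCKET_OFFSET_C = 0.4  # assumed average cooling while hauling water on deck

def adjust_sst(record):
    """Return a sea surface temperature with a crude method-dependent correction."""
    if record["method"] == "canvas_bucket":
        return record["sst_c"] + CANVAS_BUCKET_OFFSET_C
    return record["sst_c"]  # engine intakes and buoys left untouched in this sketch

records = [
    {"sst_c": 18.2, "method": "canvas_bucket"},
    {"sst_c": 18.7, "method": "engine_intake"},
]
print([round(adjust_sst(r), 1) for r in records])  # -> [18.6, 18.7]
```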

But when a job involves collecting consistent and reliable data in some of the remotest areas of the world, the issue of maintenance inevitably crops up – even for the most technologically advanced devices.

“If they’re in the Antarctic, they can get buried by snow. Wind storms can get them as well. It can take months to get them back because they can only go to them in the summer months,” says Professor Phil Jones, director of the Climate Research Unit at the University of East Anglia.

He adds that the cheaper Stevenson Screens, which are often made of wood and need to be painted white, come with their own problems. “If they’re not well maintained they tend to rot and paint flakes off, so it doesn’t reflect as much light as it should do, and warms up more than it should do.”

And when it comes to the distribution of weather stations, not all countries are created equal. Large, sparsely populated countries and remote or hostile environments have fewer stations, simply because it is difficult to install and maintain them.

The Empty Quarter in the Arabian Peninsula, for instance – which consists of some 650,000 square kilometres of sand desert – is unmonitored. Data is also sparse in the Antarctic and the Sahara.

This can be problematic when scientists want to observe temperature patterns in a particular area of the world. While global temperatures are rising, some areas are warming faster than others. In the Arctic, for instance, feedback mechanisms mean the region is warming at roughly twice the global average rate.

This can have implications for vulnerable regions trying to adapt to the new patterns of the changing climate.

“Africa contains a lot of climate vulnerable societies, so it is one place where you would love to have really accurate data. But the situation is improving,” says Black.

Keeping secrets

What these aberrations do not affect is the large-scale climate record. To work out global temperatures, all scientists need is around 100 stations sending them an accurate reading once a month.
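The arithmetic behind that claim is simple enough to sketch: average each station’s anomaly – its departure from its own long-term mean – rather than its raw temperature, and weight by latitude so that clusters of stations near the poles do not dominate. The three stations below are invented for illustration; real datasets grid the anomalies before averaging.

```python
import math

# A back-of-the-envelope sketch of a latitude-weighted global mean anomaly.
# The station values are invented for illustration only.
stations = [
    {"lat": 51.5, "anomaly_c": 0.62},   # hypothetical mid-latitude station
    {"lat": -2.0, "anomaly_c": 0.35},   # hypothetical tropical station
    {"lat": 70.0, "anomaly_c": 1.40},   # hypothetical Arctic station
]

weights = [math.cos(math.radians(s["lat"])) for s in stations]
global_anomaly = sum(w * s["anomaly_c"] for w, s in zip(weights, stations)) / sum(weights)
print(f"Global anomaly: {global_anomaly:.2f} C above baseline")
```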

This is lucky, because hoovering up the figures from the thousands of weather stations and depositing them into datasets is not necessarily easy.

In fact, NOAA forms its dataset from around 7,000 land stations, while the Met Office has around 5,500, allowing the agencies to build up a detailed picture of how climate change is happening regionally, as well as globally.

But much of this information is sensitive, both commercially and nationally.

In African countries, for instance, there has been reluctance in recent decades to make the data freely available for everyone.

“They don’t want to be a collection agency for western scientists,” says Jones. “They want to get involved in the science as well. They don’t want their scientists to work for companies in Europe, and so by keeping hold of the data it might help their scientists stay in the country more.”

European countries, and particularly Poland, have also been reluctant to share their data in recent years – particularly on rain and sunshine – as consultancies in the US are able to cash in on the findings by selling forecasts, often to agricultural companies.

In many EU countries, the meteorological service is expected to make a profit, and the data it collects can provide a valuable stream of revenue. Wars and changes in government can also interrupt the flow of information coming out of a country.

Ocean data collected by ships can also be delayed. Fishing fleets in Japan, Taiwan and South Korea may be reluctant to reveal their whereabouts “if they are catching really good fish”, says Jones. Shipping companies around Somalia may have similar qualms, due to the threat of piracy.

But these concerns are merely glitches in a long history of remarkably robust data collection.

Lay the records of the four main meteorological agencies – each working independently and with their own methods – on top of each other on a graph, and what is striking is how the lines match up almost exactly.

Pic: Global surface temperature records compared (NASA)

That is why, when NASA revealed today that the world is hotter than at any time in the modern record, scientists know they can count on the numbers – even those collected by old seafarers with wooden buckets.
