STATION-BASED DATA SETS.
The weather and climatic conditions that matter most immediately to people are those near the surface of the earth on which they live. A coordinated network of ground observatories measuring temperature, pressure, precipitation, humidity, and sometimes other variables near the surface began to emerge in the mid-19th century and expanded rapidly in the 20th (Fleming 1998: Ch. 3). Today, thousands of observatories around the world record these conditions daily, usually overseen by national meteorological agencies. In recent decades, major efforts have been made to collect past surface observation records and generate long-term global datasets useful for climate change studies (e.g., Menne et al. 2012; Rennie et al. 2014). These ongoing efforts include substantial 'data rescue' activities, including the imaging and digitization of paper records, carried out with international cooperation and, in some cases, the help of ordinary citizens.
However, retrieving digitized station data is only the first step. As Edwards (2010: 321) emphasizes, "...if you need global data, you must create it." Thousands of station records, comprising millions of individual observations, must be merged, quality-controlled, and transformed into homogenized gridded form to build the global temperature data sets used in climate research. Records come from multiple sources, and merging aims to avoid duplication while maximizing station coverage (Rennie et al. 2014). Quality-control procedures identify and remove invalid data: for example, production of the Global Historical Climatology Network-Daily (GHCN-Daily) database includes 19 automated quality-assurance tests designed to detect duplicated data, climatological outliers, and various spatial, temporal, and internal inconsistencies (Durre et al. 2010). Homogenization seeks to eliminate jumps and trends in station time series caused by non-climatic factors, for example when instruments are replaced, buildings are constructed nearby, or the timing of observations changes.
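The processing steps described above — merging records while avoiding duplication, screening for climatological outliers, and detecting non-climatic jumps — can be illustrated with a minimal sketch. The functions below are toy stand-ins for illustration only, not the actual GHCN-Daily or homogenization algorithms (the 19 tests in Durre et al. 2010 are far more elaborate); all names and thresholds here are hypothetical.

```python
# Illustrative sketch of the three pipeline steps: merge/de-duplicate,
# outlier screening, and jump detection. Simplified stand-ins only.
from statistics import mean, stdev

def merge_records(*sources):
    """Merge station records from several sources, keeping the first
    copy of each (station, date) observation to avoid duplication."""
    merged = {}
    for source in sources:
        for (station, date), value in source.items():
            merged.setdefault((station, date), value)
    return merged

def flag_outliers(values, z_max=4.0):
    """Flag values more than z_max standard deviations from the
    station's own mean -- a crude climatological-outlier check."""
    mu, sigma = mean(values), stdev(values)
    return [abs(v - mu) > z_max * sigma for v in values]

def detect_jump(series, threshold=2.0):
    """Return the index where the difference between the means of the
    two segments is largest, if it exceeds `threshold` -- a minimal
    mean-shift test for a non-climatic break (e.g. a station move)."""
    best_idx, best_shift = None, 0.0
    for i in range(2, len(series) - 2):
        shift = abs(mean(series[i:]) - mean(series[:i]))
        if shift > best_shift:
            best_idx, best_shift = i, shift
    return best_idx if best_shift > threshold else None
```

For example, a temperature series that runs ten values at 10.0 and then ten at 13.0 yields a detected break at index 10; real homogenization methods additionally use neighboring stations and station metadata to decide whether such a break is non-climatic.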