
Robert Rohde 1, Judith Curry 2, Donald Groom 3, Robert Jacobsen 3,4, Richard A. Muller 1,3,4, Saul Perlmutter 3,4, Arthur Rosenfeld 3,4, Charlotte Wickham 5, ...


In all three of these prior approaches, every record used in gridded averaging is assumed to be equally reliable. More precisely, they make the assumption that their quality control and homogenization processes address erroneous and biased data prior to the gridding and averaging step in such a way that each resulting time series is deserving of equal weight. (GISS makes a partial exception in that a corrective model for urban heat island biases is applied after gridding.) This has the effect that records subject to many bias “corrections” can be given the same weight in an average as a record where no bias adjustments were found to be necessary. In such cases, the differences in data quality may play a role in how the uncertainty is assessed, but not in the construction of the global average.

All three of the averaging processes currently in use rely on the concept of a “baseline” parameter to define the “normal” weather. The baseline can either be introduced for each record before gridding (e.g. HadCRU) or introduced after gridding and defined at the level of the grid cell average (e.g. NASA). The intent of the baseline temperature parameter is to capture the “normal” climate at that location by reference to the average weather over some specific reference period (e.g. 1960-1980). Each time series is then replaced by an “anomaly” time series consisting of the differences from the baseline. This approach is motivated by the observation that temperatures change rapidly with latitude (about 1 °C per 150 km poleward) and altitude (about 1 °C for every 220 m of surface elevation), and that these changes are quite large compared to the approximately 1 °C / century of global warming that one wants to investigate. In effect, the baseline parameters are meant to capture most of the spatial variability between sites.

In particular, the average of anomaly series should be much less sensitive to biases due to the start and stop of individual records. Without some adjustment for such spatial variability, an excess of high (or low) latitude stations could erroneously pull the corresponding global average to lower (or higher) values.

The use of an individual baseline parameter per station (or grid cell) makes no assumptions about the underlying spatial structure. This means the maximum spatial information can, in principle, be removed from each record; however, several trade-offs are incurred in doing so. First, the use of predefined reference intervals will limit the usability of stations that were not active during the corresponding period (though other compensating approaches are often used). Secondly, by defining all stations to have zero anomaly during the reference period, one may suppress true structure in the temperature field at that time.

Specifically, reconstructions using this method will have lower spatial variability during the reference interval than at other times due to the artificial constraint that all regions have the same mean value during the reference period.
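The anomaly construction described above can be sketched as follows; the reference window and the two synthetic stations are illustrative, not taken from any of the cited analyses:

```python
import numpy as np

def to_anomaly(years, temps, ref_start=1960, ref_end=1980):
    """Convert an absolute temperature series into an anomaly series by
    subtracting the station's mean over a fixed reference period."""
    years = np.asarray(years)
    temps = np.asarray(temps, dtype=float)
    in_ref = (years >= ref_start) & (years < ref_end)
    if not in_ref.any():
        raise ValueError("station has no data in the reference period")
    return temps - temps[in_ref].mean()

# A warm low-altitude station and a cold high-altitude station sharing
# the same trend yield essentially identical anomaly series, removing
# the large spatial offsets discussed above.
years = np.arange(1950, 2000)
trend = 0.01 * (years - 1970)          # ~1 C/century of warming
warm_station = 25.0 + trend
cold_station = 5.0 + trend
print(np.allclose(to_anomaly(years, warm_station),
                  to_anomaly(years, cold_station)))  # True
```

Note that both anomaly series are forced toward zero mean during 1960-1980, which is exactly the artificial suppression of reference-period structure described above.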

Lastly, after gridding the data and creating anomaly series, each existing group creates a large-scale average using an area-weighted average of non-empty grid cells. HadCRU and GISS add an additional nuance, as they apply a post-stratification procedure prior to their final average.

Specifically, they create averages of specific latitude bands (or hemispheres in HadCRU’s case), and then combine those averages to create the final global average. This has the effect that each missing cell in a latitude band is essentially replaced by the average of the valid cells in the band before constructing the ultimate global average. To a degree this approach also compensates for the fact that certain areas (e.g. the Northern Hemisphere) tend to have much greater historical coverage than others. Monte Carlo tests we conducted generally confirm that latitudinal banding improves the accuracy of the overall average given the techniques employed by HadCRU and GISS; however, we observe that such approaches are largely an indirect means of incorporating information about the spatial structure of the temperature field that could be modeled more directly.
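The post-stratification step can be sketched as follows; the band edges and the assumption of equal-area cells within each band are ours, not the actual HadCRU or GISS choices:

```python
import numpy as np

def banded_average(cell_temps, cell_lats, band_edges=(-90.0, -30.0, 0.0, 30.0, 90.0)):
    """Area-weighted global mean with latitude-band post-stratification:
    valid cells are averaged within each band first, so a missing cell is
    in effect replaced by its band mean before the bands are combined."""
    cell_temps = np.asarray(cell_temps, dtype=float)  # NaN marks a missing cell
    cell_lats = np.asarray(cell_lats, dtype=float)
    total = 0.0
    total_area = 0.0
    for lo, hi in zip(band_edges[:-1], band_edges[1:]):
        valid = (cell_lats >= lo) & (cell_lats < hi) & ~np.isnan(cell_temps)
        if not valid.any():
            continue  # a band with no observations contributes nothing
        # On a sphere, the area of a latitude band is proportional to the
        # difference of the sines of its bounding latitudes.
        area = np.sin(np.radians(hi)) - np.sin(np.radians(lo))
        total += area * cell_temps[valid].mean()
        total_area += area
    return total / total_area

# One observed cell per mid-latitude band; the tropical cell (10 N) is
# missing, so its band drops out of the stratified mean entirely.
print(round(banded_average([1.0, np.nan, 0.0], [45.0, 10.0, -45.0]), 6))  # 0.5
```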

3. New Averaging Model

The global average temperature is a simple descriptive statistic that aims to characterize the Earth. Operationally, the global average may be defined as the integral average of the temperatures over the surface of the Earth as would be measured by an ideal weather station sampling the air at every location. As the true Earth has neither ideal temperature stations nor infinitely dense spatial coverage, we can never capture the ideal global average temperature completely; however, we can use the data we do have to constrain its value.

As described in the preceding section, the existing global temperature analysis groups use a variety of well-motivated algorithms to generate a history of global temperature change.

However, none of their approaches would generally correspond to a statistical model in the more formal sense.

Let T(x, t) be the global temperature field in space and time, where x denotes position on the Earth's surface. We define the decomposition:

    T(x, t) = θ(t) + C(x) + W(x, t)    [1]

Uniqueness can be guaranteed by applying the constraints:

    ∫ C(x) dx = 0,    ∫ W(x, t) dx = 0 for all t,    ∫ W(x, t) dt = 0 for all x    [2]
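As a toy numerical check of this decomposition, the three components can be recovered from a fully-sampled field by successive averaging; the zero-mean constraints applied here (zero spatial mean for C, zero space- and time-means for W) are our reading of equation [2], and a complete regular grid is an idealization of the actual irregular station sampling:

```python
import numpy as np

# Recover theta(t), C(x), and W(x, t) from a fully-sampled toy field by
# successive averaging; the zero-mean constraints then hold by
# construction on this complete 12-site x 30-time grid.
rng = np.random.default_rng(0)
T = rng.normal(size=(12, 30))          # 12 sites x 30 times, toy data

theta = T.mean(axis=0)                 # global mean temperature vs time
C = (T - theta).mean(axis=1)           # time-invariant "climatology"
W = T - theta[None, :] - C[:, None]    # residual "weather" fluctuations

print(np.allclose(C.mean(), 0),
      np.allclose(W.mean(axis=0), 0),  # zero spatial mean at each time
      np.allclose(W.mean(axis=1), 0))  # zero time mean at each site
```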

Given this decomposition, we see that θ(t) corresponds to the global mean temperature as a function of time. C(x) captures the time-invariant spatial structure of the temperature field, and hence can be seen as a form of spatial “climatology”, though it differs from the normal definition of a climatology by a simple additive factor corresponding to the long-term average of θ(t). The last term, W(x, t), is meant to capture the “weather”, i.e. those fluctuations in temperature over space and time that are neither part of the long-term evolution of the average nor part of the stable spatial structure.

In this paper, we show how it is possible to estimate the global temperature field by simultaneously constraining all three pieces of T(x, t) using the available data. (Because we are introducing a large number of symbols, we summarize all the key symbols in the Appendix.) As our study is based solely on the use of land-based temperature data, we choose to restrict the spatial integrals in equation [2] to the Earth’s land surface only. As a result, our study will identify θ(t) with the land temperature average only. Rather than defining a specific base interval (e.g. 1950-1980) as has been common in prior work, we will show below how it is possible to reconcile all time periods simultaneously. As a result, the time integral in equation [2] is taken over the entire period of record rather than a fixed reference interval.

A natural first approach would be to determine θ̂(t) and the station baseline parameters by a least-squares fit to the station records:

    SSD = Σ_i Σ_t (T_i(t) − θ̂(t) − b̂_i)²    [6]

where T_i(t) is the time series reported by station i and b̂_i is that station's baseline parameter.

Here SSD denotes the sum of squared deviations, and such a minimization would attempt to minimize the error terms directly. Though appealing, [6] is ultimately misguided, as the station network is distributed highly non-uniformly in both space and time, and the temperature histories at neighboring stations are highly correlated. A naïve application of [6] would result in θ̂(t) being biased towards the most densely sampled regions of the globe.

However, [6] does inspire our first natural set of constraint equations, namely:

    b̂_i = (1 / N_i) Σ_t (T_i(t) − θ̂(t) − Ŵ(x_i, t))    [7]

where N_i is the number of times at which station i reports and x_i is its location. Since b̂_i is specific to a single station, there is no disadvantage to simply stating that it be chosen to minimize the error at that specific station.
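A small numerical sketch of this point: for a single station, the mean residual is exactly the least-squares choice of baseline. All series below are synthetic stand-ins for the estimated quantities:

```python
import numpy as np

# For one station, the equation-[7] baseline is the mean residual after
# removing the estimated global trend and local weather.
rng = np.random.default_rng(1)
T_i = rng.normal(8.0, 1.0, size=120)         # station record (synthetic)
theta_hat = rng.normal(0.0, 0.3, size=120)   # estimated global mean (synthetic)
W_hat_i = rng.normal(0.0, 0.5, size=120)     # estimated local weather (synthetic)

resid = T_i - theta_hat - W_hat_i
b_hat = resid.mean()                         # the equation [7] baseline

# No other constant gives a smaller squared error for this station,
# which is why choosing b_hat per station carries no disadvantage.
sse = lambda b: np.sum((resid - b) ** 2)
candidates = b_hat + np.linspace(-1.0, 1.0, 201)
print(all(sse(b_hat) <= sse(c) + 1e-12 for c in candidates))  # True
```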

To determine the other fields, it is instructive to consider the properties that we expect Ŵ(x, t) to have. To begin, it should have (at least approximately) zero mean over space and time in accordance with equation [2]. Next, we expect that weather fluctuations should be highly correlated over short distances in space. These considerations are very similar to the fundamental assumptions of the spatial statistical analysis technique known as Kriging (Krige 1951).

The Kriging estimate of the weather field takes the form of a weighted sum over stations, Ŵ(x, t) = Σ_i K_i(x) Ŵ(x_i, t), where the coefficients K_i(x) minimize the expected squared error and therefore satisfy the normal equations:

    Σ_j Cov(x_i, x_j) K_j(x) = Cov(x, x_i)    [9]

In most practical uses of Kriging it is necessary to estimate or approximate the covariance matrix in equation [9] based on the available data (Krige 1951, Cressie 1990, Journel 1989). NOAA also requires the covariance matrix for their optimal interpolation method; they estimate it by first constructing a variogram during a time interval with dense temperature sampling and then decomposing it into empirical spatial modes that are used to model the spatial structure of the data (Smith and Reynolds 2005). Their approach is nearly ideal for capturing the spatial structure of the data during the modern era, but has several weaknesses. Specifically, this method assumes that the spatial structures are adequately constrained during a brief calibration period and that such relationships remain stable even over an extended period of climate change.

We present an alternative that preserves many of the natural spatial considerations provided by Kriging, but also shares characteristics with the local averaging approach adopted by GISS (Hansen et al 1999, Hansen and Lebedeff 1987). If the variance of the underlying field changes slowly as a function of location, then the covariance function can be replaced with the correlation function, R(x, y), which leads to the formulation:

    Ŵ(x, t) = Σ_{i,j} R(x, x_i) [R⁻¹]_{ij} Ŵ(x_j, t),    R_{ij} = R(x_i, x_j)    [12]

where R is the matrix of correlations between the station sites and Ŵ(x_j, t) = T_j(t) − θ̂(t) − b̂_j is the estimated weather anomaly at station j.


The Kriging formulation is most efficient at capturing fluctuations that have a scale length comparable to the correlation length; however, it also permits the user to find finer structure if more densely positioned data is provided. In particular, in the limit of infinitely dense data, the Kriging estimate of the field will necessarily match the field exactly. This is in direct contrast to the GISS and HadCRU averaging approaches that will always smooth over fine structure.
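A minimal simple-Kriging sketch in the spirit of equation [12]; the coordinates and the exponential correlation kernel below are illustrative (not the fitted curve of equation [14]), and the exact-interpolation property just described can be seen directly:

```python
import numpy as np

def krige_weather(x, station_xy, station_w, corr):
    """Interpolate station weather anomalies to a point x by simple
    Kriging with a distance-based correlation function `corr`."""
    station_xy = np.asarray(station_xy, dtype=float)
    d_ij = np.linalg.norm(station_xy[:, None, :] - station_xy[None, :, :], axis=-1)
    d_i = np.linalg.norm(station_xy - np.asarray(x, dtype=float), axis=-1)
    R = corr(d_ij)                   # station-station correlation matrix
    r = corr(d_i)                    # target-to-station correlations
    weights = np.linalg.solve(R, r)  # solve R w = r; avoids forming R^-1
    return float(weights @ np.asarray(station_w, dtype=float))

# Distances in km.  Evaluating at a station location reproduces that
# station's value exactly (the exact-interpolation property).
corr = lambda d: np.exp(-np.asarray(d) / 1000.0)   # illustrative kernel
xy = np.array([[0.0, 0.0], [800.0, 0.0], [0.0, 600.0]])
w = np.array([1.2, -0.3, 0.5])
print(round(krige_weather([0.0, 0.0], xy, w, corr), 6))  # 1.2
```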

A further modification is made by assuming that R(x, y) ≈ R(d), where d denotes the distance between x and y. This allows us to parameterize the correlation field as a simple function of one variable, though it admittedly neglects differences in correlation that might be related to factors such as latitude, altitude, and local vegetation. The correlation function is parameterized using:

    R(d) = exp(−(a₀ + a₁ d + a₂ d² + a₃ d³ + a₄ d⁴)) + μ    [14]

This is compared to a reference data set based on randomly selecting 500,000 pairs of stations and measuring the correlation of their non-seasonal temperature fluctuations, provided they have at least ten years of overlapping data. The resulting data set and fit are presented in Figure 2. Pair selection was accomplished by choosing random locations on the globe and locating the nearest temperature records, subject to the requirement that each record be no more than 100 km from the chosen random location. The small constant term μ measures the correlation over the very largest distance scales; however, for the purposes of equation [12] it is computationally advantageous to set μ = 0, which we did while rescaling the rest of equation [14] to compensate at short distances. This allows us to treat stations at distances greater than ~4000 km as completely uncorrelated, which greatly simplifies the matrix inversion in equation [12] since a majority of the matrix elements are then zeros. Figure 2 shows that the correlation structure is substantial out to a distance of ~1000 km, and non-trivial to ~2000 km from each site.

Figure 2. Mean correlation versus distance curve constructed from 500,000 pair-wise comparisons of station temperature records.

Each station pair was selected at random, and the measured correlation was calculated after removing seasonality and with the requirement that they have at least 10 years of overlapping data. Red, green, and yellow curves show a moving range corresponding to the inner 80, 50, and 20% of data respectively. The black curve corresponds to the modeled correlation vs. distance reported in the text. This correlation versus distance model is used as the foundation of the Kriging process used in the Berkeley Average.

Based on the data, the best-fit values in equation [14] were a₀ = 0.1276, a₁ = 2.4541 × 10⁻⁴ / km, a₂ = 5.3881 × 10⁻⁷ / km², a₃ = −2.7452 × 10⁻¹¹ / km³, a₄ = 8.3007 × 10⁻¹⁴ / km⁴, and μ = 0.0272. These were the values used in the Berkeley Earth temperature reconstruction method.
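For reference, the fitted curve can be evaluated directly. The exponential-of-a-quartic form with a small additive constant μ used here is an assumed parameterization, chosen to be consistent with the reported coefficient units and the behavior described for Figure 2:

```python
import numpy as np

# Best-fit coefficients reported in the text (units 1/km ... 1/km^4).
a0, a1, a2, a3, a4 = 0.1276, 2.4541e-4, 5.3881e-7, -2.7452e-11, 8.3007e-14
mu = 0.0272  # small constant correlation at the largest scales

def R(d_km):
    """Correlation vs distance; the exponential-of-a-quartic functional
    form is an assumption consistent with the coefficient units."""
    d = np.asarray(d_km, dtype=float)
    return np.exp(-(a0 + a1*d + a2*d**2 + a3*d**3 + a4*d**4)) + mu

# Substantial correlation to ~1000 km and non-trivial correlation to
# ~2000 km, as described; correlation at d = 0 stays below 1, which is
# consistent with station-level noise in Figure 2.
print(np.round(R([0.0, 1000.0, 2000.0]), 3))  # [0.907 0.407 0.048]
```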
