Period of Concentration: Stochastic climate models

Abstracts for the talks

Estimation of stochastic models from data and application to El Niño data

Markus Abel  (Universität Potsdam)
Wednesday, June 01, 2005
We analyze data from temperature measurements in the Pacific in three steps to obtain a stochastic model. 1) Public data are embedded using the Isomap algorithm to obtain an estimate of the dimension of the system and the corresponding embedded time series. 2) These time series are treated by nonparametric regression, yielding a system of three coupled differential equations. The unexplained deviance can be modeled by a stochastic term, corresponding to observational noise. As a side result we find a delay of 6 months in one of the variables, consistent with a recently proposed model by Tziperman that takes Rossby waves in the Pacific Ocean into account. 3) The model so obtained is integrated, and an ensemble prediction is performed to yield probabilities for El Niño occurrence.
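
A minimal sketch of the three-step pipeline described above, in Python. The data file, Isomap parameters, noise model and occurrence threshold are illustrative assumptions, not the speaker's choices.

import numpy as np
from sklearn.manifold import Isomap
from sklearn.neighbors import KNeighborsRegressor

# Assumed input: monthly SST anomalies, one row per month (hypothetical file).
sst = np.load("sst_anomalies.npy")

# 1) Nonlinear embedding: estimate dimension and a low-dimensional state vector.
emb = Isomap(n_neighbors=12, n_components=3).fit_transform(sst)

# 2) Nonparametric regression of the time derivative on the state.
dt = 1.0                                          # one month
demb = np.gradient(emb, dt, axis=0)
drift = [KNeighborsRegressor(n_neighbors=20).fit(emb, demb[:, i]) for i in range(3)]
resid = np.column_stack([demb[:, i] - drift[i].predict(emb) for i in range(3)])
sigma = resid.std(axis=0)                         # unexplained deviance as noise level

# 3) Ensemble integration (Euler-Maruyama) from the last observed state.
rng = np.random.default_rng(0)
x = np.tile(emb[-1], (500, 1))                    # 500 ensemble members
for _ in range(12):                               # one-year forecast
    f = np.column_stack([drift[i].predict(x) for i in range(3)])
    x += f * dt + sigma * np.sqrt(dt) * rng.standard_normal(x.shape)
p_nino = np.mean(x[:, 0] > 1.0)                   # illustrative occurrence threshold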

Geometric singular perturbation theory applied to stochastic climate models

Nils Berglund  (University of Toulon)
Tuesday, May 31, 2005
Geometric singular perturbation theory offers an efficient framework for the study of ordinary differential equations with well-separated time scales. It combines the construction of invariant manifolds, which allow a low-dimensional effective description of the dynamics reduced to slow variables, with a local analysis near bifurcation points. We present extensions of this theory to systems of slow-fast stochastic differential equations, constructing, in particular, neighbourhoods of invariant manifolds in which sample paths concentrate. This approach will be illustrated on a few simple models of the North-Atlantic Thermohaline Circulation. Joint work with Barbara Gentz (WIAS, Berlin).
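
A minimal numerical illustration of the slow-fast setting (a toy linear system, not one of the lecture's circulation models): sample paths of the fast variable settle into a neighbourhood of width of order sigma around the slow manifold x = y, as predicted by the concentration results.

import numpy as np

# Toy slow-fast SDE: fast variable x relaxes towards the slow manifold x = y.
#   eps * dx = (y - x) dt + sqrt(eps) * sigma dW1
#         dy = -y dt + sigma_slow dW2
eps, sigma, sigma_slow = 0.01, 0.05, 0.1
dt, n = 1e-4, 200_000
rng = np.random.default_rng(1)
x, y = 1.0, 0.0                        # start away from the slow manifold
dist = np.empty(n)
for k in range(n):
    dW1, dW2 = rng.standard_normal(2) * np.sqrt(dt)
    x += (y - x) / eps * dt + sigma / np.sqrt(eps) * dW1
    y += -y * dt + sigma_slow * dW2
    dist[k] = abs(x - y)               # distance from the slow manifold
print("typical distance from the slow manifold:", np.median(dist[n // 10:]))
# After a short transient the distance is of order sigma.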

Some Applications of Generalized Stability Theory to Climate Dynamics I

Brian Farrell  (Harvard University)
Tuesday, May 24, 2005
These lectures review a set of ideas and approaches to climate theory based broadly on stochastic methods and specifically on the non-normality of the linear system underlying perturbation dynamics in both certain and uncertain systems. Because the term non-normal is imprecise in its connotation, the term Generalized Stability Theory is used for this approach. Areas to be covered are i) The concept of stability of a statistical quantity with application to sensitivity of storm track statistics. ii) The concept of stability of uncertain systems with application to climate forecast. iii) The concept of structural stability of turbulent systems with application to jet vacillation. iv) The concept of the intrinsic non-normality of time dependent systems with application to the statistical stability of dynamical systems.
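
A toy illustration of the non-normality underlying this approach (the matrix A below is a hypothetical stand-in, not a climate operator): eigenvalue stability does not preclude large transient growth, which is measured by the largest singular value of the propagator exp(tA).

import numpy as np
from scipy.linalg import expm, svdvals

A = np.array([[-0.1, 10.0],
              [ 0.0, -0.2]])           # both eigenvalues decay, but A is non-normal
for t in (0.0, 2.0, 5.0, 10.0, 50.0):
    growth = svdvals(expm(t * A))[0] ** 2     # optimal energy growth at time t
    print(f"t = {t:5.1f}   max energy growth = {growth:8.2f}")
# Perturbation energy grows by orders of magnitude before decaying, even though
# every eigenmode decays monotonically.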

The Stochastic Parametric Mechanism for Generation of Surface Water Waves by Wind

Brian Farrell  (Harvard University)
Wednesday, June 01, 2005
Synoptic-scale eddy variance and associated fluxes of heat and momentum in mid-latitude jets are sensitive to small alterations in mean jet velocity, dissipation, and static stability. In this lecture the sensitivity of variance and fluxes to such structured changes in the mean jet is examined. In particular, the structured change in the jet producing the greatest change in disturbance variance or flux is obtained. The method used builds on previous work in which storm track statistics were obtained using a stochastic model of jet turbulence. This work extends generalized stability theory from addressing the stability of deterministic forecasts to addressing stability in the context of statistical forecasts.
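
A hedged sketch of the kind of calculation involved, with a generic stable non-normal operator standing in for a jet model: in a stochastically forced linear system dx = A x dt + F dW, the maintained variance is the trace of the covariance C solving the Lyapunov equation A C + C A^T + Q = 0 (Q = F F^T), and its sensitivity to a structured change in A can be probed by re-solving.

import numpy as np
from scipy.linalg import solve_continuous_lyapunov

rng = np.random.default_rng(2)
n = 6
A = -np.eye(n) + 0.8 * np.triu(rng.standard_normal((n, n)), k=1)   # stable, non-normal
Q = np.eye(n)                                                       # unit forcing covariance

C = solve_continuous_lyapunov(A, -Q)          # solves A C + C A^T = -Q
print("maintained variance:", np.trace(C))

dA = np.zeros((n, n)); dA[0, 1] = 0.05        # one structured change to the operator
C_pert = solve_continuous_lyapunov(A + dA, -Q)
print("variance change:    ", np.trace(C_pert) - np.trace(C))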

Averaging Principle for Deterministic and Stochastic Perturbations I

Mark Freidlin  (University of Maryland)
Monday, May 23, 2005
First, I will consider averaging in the simplest case: for systems with one degree of freedom and a first integral without singularities. Then I will introduce various regularizations for systems with one degree of freedom and saddle points, and show that in the general situation one should consider random perturbations of the equation, not just of the initial condition, to regularize the averaging principle for deterministic perturbations. I will describe the limiting slow motion as a stochastic process on the corresponding graph. Next, I will give conditions for the averaging principle to be valid for perturbations of multifrequency systems in action-angle coordinates. I will consider some many-degrees-of-freedom systems with singularities in the first integrals. In this case, an open-book space should be considered as the phase space for the limiting slow motion. I will consider some applications of this theory to the dynamics of an incompressible fluid.
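
For orientation, the standard formulation of the simplest setting (one degree of freedom; the notation is generic and not necessarily that of the lectures) reads:

\begin{align*}
  \dot X^{\varepsilon}_t
    &= \bar\nabla H\bigl(X^{\varepsilon}_t\bigr)
       + \varepsilon\, b\bigl(X^{\varepsilon}_t\bigr),
  \qquad X^{\varepsilon}_t \in \mathbb{R}^2,
  \qquad \bar\nabla H
    = \Bigl(\tfrac{\partial H}{\partial x_2},\,
            -\tfrac{\partial H}{\partial x_1}\Bigr).
\end{align*}

On the time scale $t/\varepsilon$, and away from the saddle points of $H$, the first integral $H\bigl(X^{\varepsilon}_{t/\varepsilon}\bigr)$ converges to the solution of the averaged equation; after regularization by small random perturbations of the equation itself, the limiting slow motion becomes a diffusion on the graph obtained by identifying the points of each connected component of the level sets of $H$, with gluing conditions at the vertices describing the passage through the saddles.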

Convection-Diffusion in Stationary Incompressible 3D-Flow which is Close to Planar Motion

Mark Freidlin  (University of Maryland)
Monday, May 30, 2005
We will show that a purely deterministic motion should, in certain situations, be approximated by a stochastic motion. We will introduce a relative entropy for the deterministic system and describe the motion of the wavefronts for a class of reaction-convection equations.

Residence-time distributions as a measure for stochastic resonance

Barbara Gentz  (WIAS Berlin)
Wednesday, June 01, 2005
Stochastic resonance (SR) is believed to play an important role not only in numerous technological and physical applications, but also in biological and climate systems. Apart from spectral properties of the signal, residence-time distributions have been proposed as a measure for SR. For the paradigm of the motion of a periodically forced Brownian particle in a bistable potential, we explain the relation between first-passage-time and residence-time distributions. Going beyond exponential asymptotics, we are able to give rigorous expressions for the densities of these distributions. In a broad range of forcing frequencies and amplitudes, the distributions are found to be close to periodically modulated exponential ones, where the periodic modulations are governed by a universal function, depending on a single parameter related to the forcing period. Joint work with Nils Berglund (CPT-CNRS Luminy, France).
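
A minimal simulation sketch of this paradigm (the parameters are illustrative, not taken from the talk): an overdamped particle in the double-well potential V(x) = x^4/4 - x^2/2 with weak periodic forcing; residence times are the times spent in one well between transitions.

import numpy as np

# dX = (X - X^3 + A cos(2 pi t / T)) dt + sqrt(2 eps) dW
A, T, eps = 0.1, 100.0, 0.08
dt, n = 0.01, 2_000_000
rng = np.random.default_rng(3)
x, t, last_switch, well = -1.0, 0.0, 0.0, -1
residence = []
for _ in range(n):
    x += (x - x**3 + A * np.cos(2 * np.pi * t / T)) * dt \
         + np.sqrt(2 * eps * dt) * rng.standard_normal()
    t += dt
    if well == -1 and x > 1.0:          # crossed into the right well
        residence.append(t - last_switch); last_switch, well = t, +1
    elif well == +1 and x < -1.0:       # crossed back into the left well
        residence.append(t - last_switch); last_switch, well = t, -1
hist, edges = np.histogram(residence, bins=50, density=True)
# For suitable (A, T, eps) the histogram shows the periodically modulated
# exponential shape discussed in the abstract.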

Large deviations for diffusions and stochastic resonance

Samuel Herrmann  (Université Nancy)
Tuesday, May 31, 2005
We consider potential type dynamical systems in finite dimensions with two meta-stable states. They are subject to two sources of perturbation: a slow external periodic perturbation of period T and a small Gaussian random perturbation of intensity ε, and therefore mathematically described as weakly time inhomogeneous diffusion processes. A system is in stochastic resonance provided the small noisy perturbation is tuned in such a way that its random trajectories follow the exterior periodic motion in an optimal fashion, i.e. for some optimal intensity ε = ε(T). The physicists' favorite measures of quality of periodic tuning - and thus stochastic resonance - such as spectral power amplification or signal-to-noise ratio have proven to be defective. They are not robust w.r.t. effective model reduction, i.e. for the passage to a simplified finite state Markov chain model reducing the dynamics to a pure jumping between the meta-stable states of the original system. An entirely probabilistic notion of stochastic resonance based on the transition dynamics between the domains of attraction of the meta-stable states - and thus failing to suffer from this robustness defect - is investigated by using extensions and refinements of the Freidlin-Wentzell theory of large deviations for time homogeneous diffusions. Large deviation principles developed for weakly time inhomogeneous diffusions prove to be key tools for a treatment of the problem of diffusion exit from a domain and thus for the approach of stochastic resonance via transition probabilities between meta-stable sets.
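
In standard Freidlin-Wentzell notation (generic, not necessarily the speaker's), the key asymptotics behind this transition-based notion of resonance read:

\begin{align*}
  dX^{\varepsilon}_t &= -\nabla_x U\bigl(X^{\varepsilon}_t,\, t/T\bigr)\,dt
      + \sqrt{\varepsilon}\; dW_t, \\
  \mathbb{E}\bigl[\tau_{\mathrm{exit}}\bigr]
      &\asymp \exp\!\Bigl(\tfrac{2\,\Delta U}{\varepsilon}\Bigr)
      \qquad (\varepsilon \to 0),
\end{align*}

where $\Delta U$ is the depth of the metastable well being left. Periodic tuning is optimal when this exit time scale roughly matches half the forcing period $T$, which singles out a noise intensity $\varepsilon = \varepsilon(T)$.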

Probability density functions, ensembles and limits to statistical predictability

Richard Kleeman  (Courant Institute)
Monday, May 30, 2005
Ensemble predictions are an integral part of routine weather and climate prediction because of the sensitivity of such projections to the specification of the initial state. In many discussions it is tacitly assumed that ensembles are equivalent to probability density functions (p.d.f.s) of the random variables of interest. In general, for vector-valued random variables this is not the case (not even approximately), since practical ensembles do not adequately sample the high-dimensional state spaces of dynamical systems of practical relevance. In this talk these ideas are placed on a rigorous footing using concepts derived from Bayesian analysis and information theory. In particular it is shown that ensembles must imply a coarse graining of state space and that this coarse graining implies loss of information relative to the converged p.d.f. To cope with the needed coarse graining in the context of practical applications, a hierarchy of entropic functionals is introduced. These measure the information content of multivariate marginal distributions of increasing order. For fully converged distributions (i.e. p.d.f.s) these functionals form a strictly ordered hierarchy. As one proceeds up the hierarchy, however, increasingly coarser partitions are required by the functionals, which implies that the strict ordering of the p.d.f.-based functionals breaks down. This breakdown is symptomatic of the poor sampling by ensembles of high-dimensional state spaces and is unavoidable for most practical applications. In the second part of the talk the theoretical machinery developed above is applied to the practical problem of mid-latitude weather prediction. It is shown that the functionals derived in the first part all decline essentially linearly with time, and there appears in fact to be a fairly well-defined cut-off time (roughly 45 days for the model analyzed) beyond which initial-condition information is unimportant to statistical prediction.
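
A hedged sketch of the coarse-graining idea in the first part (not the talk's implementation): the information content of binned low-order marginals of a finite ensemble, measured as relative entropy against a climatological reference sample; higher-order marginals force coarser partitions before the bins are populated.

import numpy as np

def binned_relative_entropy(forecast, climatology, bins):
    """D( p_forecast || p_climatology ) on a common histogram partition."""
    edges = [np.linspace(lo, hi, bins + 1)
             for lo, hi in zip(climatology.min(0), climatology.max(0))]
    p, _ = np.histogramdd(forecast, bins=edges)
    q, _ = np.histogramdd(climatology, bins=edges)
    p, q = p / p.sum(), q / q.sum()
    q = np.where(q > 0, q, 1e-12)            # guard against empty reference bins
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

rng = np.random.default_rng(4)
climatology = rng.standard_normal((100_000, 2))            # reference p.d.f. sample
forecast = 0.5 * rng.standard_normal((100, 2)) + 1.0       # 100-member ensemble

d1 = binned_relative_entropy(forecast[:, :1], climatology[:, :1], bins=20)
d2 = binned_relative_entropy(forecast, climatology, bins=5)  # coarser bins in 2-D
print(f"order-1 marginal: {d1:.3f} nats   order-2 marginal: {d2:.3f} nats")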

A Stochastic View of El Niño I

Cécile Penland  (Climate Diagnostics Center)
Monday, May 23, 2005
In this presentation, I review the paradigm of El Niño as a stochastically forced system. In particular, the interplay of stochastic forcing and non-normal linear dynamics explains a large part of El Niño variability. Since controlled laboratory simulations of climate cannot be performed (except on computers), careful falsification, as opposed to verification, of theory by data was required to establish this paradigm. I shall describe this procedure and, in so doing, show how complex a simple model can be.
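
A hedged sketch of the stochastically forced linear paradigm, in the spirit of linear inverse modelling, with synthetic data standing in for observed indices: fit dx/dt = L x + noise from lag covariances and recover a stable but non-normal drift matrix.

import numpy as np
from scipy.linalg import logm, solve_continuous_lyapunov

rng = np.random.default_rng(5)
L_true = np.array([[-0.2, 1.0], [0.0, -0.3]])       # stable, non-normal "truth"
dt, n = 0.1, 50_000
x = np.zeros((n, 2))
for k in range(1, n):                                 # generate synthetic data
    x[k] = x[k-1] + L_true @ x[k-1] * dt + 0.3 * np.sqrt(dt) * rng.standard_normal(2)

lag = 10                                              # 10 steps = 1 time unit
C0 = (x[:-lag].T @ x[:-lag]) / (n - lag)              # zero-lag covariance
Clag = (x[lag:].T @ x[:-lag]) / (n - lag)             # lag covariance
L_est = logm(Clag @ np.linalg.inv(C0)).real / (lag * dt)
Q_est = -(L_est @ C0 + C0 @ L_est.T)                  # noise covariance (fluctuation-dissipation)
print("estimated drift matrix:\n", L_est)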

Date and Location

May 23 - June 01, 2005
Max Planck Institute for Mathematics in the Sciences
Inselstraße 22
04103 Leipzig
Germany

Scientific Organizers

Peter Imkeller
Humboldt Universität zu Berlin
Berlin

Stefan Müller
Max Planck Institute for Mathematics in the Sciences
Leipzig

Administrative Contact

Katja Bieling
Max Planck Institute for Mathematics in the Sciences
Leipzig