
Abstract for MSRI Preprint 1999-001

A Conditioning Function for the Convergence of Numerical ODE Solvers and Lyapunov's Theory of Stability

Divakar Viswanath

For the ordinary differential equation (ODE) $\dot{x}(t) = f(t,x)$, $x(0) = x_0$, $t\geq 0$, $x\in \mathbb{R}^d$, assume $f$ to be at least continuous in $t$ and locally Lipschitz in $x$, and, if necessary, several times continuously differentiable in $t$ and $x$. We associate with each solution $x(t)$ a conditioning function $E(t)$ that captures the accumulation of global error in a numerical approximation in the following sense: if $\tilde{x}(t;h)$ is an approximation derived from a single step method of time step $h$ and order $r$, then $\Vert \tilde{x}(t;h) - x(t) \Vert < K(E(t)+\epsilon)h^r$ for $0\leq t\leq T$, any $\epsilon > 0$, sufficiently small $h$, and a constant $K > 0$.
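As a rough numerical illustration (not taken from the preprint), the bound above says that for a single step method of order $r$ the global error at a fixed time $T$ scales like $h^r$. The sketch below, under the assumption of the classical fourth-order Runge-Kutta method applied to the simple test problem $\dot{x} = -x$, $x(0)=1$, checks this scaling by halving $h$ and reporting the observed order.

```python
# Minimal sketch (illustrative only, not the paper's method): global error of a
# single step method of order r at fixed T behaves like C * h^r.
# Test problem: x' = -x, x(0) = 1, exact solution exp(-t); method: classical RK4 (r = 4).
import math

def f(t, x):
    return -x

def rk4_solve(x0, T, h):
    """Integrate x' = f(t, x) from t = 0 to t = T with classical RK4 and step h."""
    t, x = 0.0, x0
    for _ in range(int(round(T / h))):
        k1 = f(t, x)
        k2 = f(t + h / 2, x + h / 2 * k1)
        k3 = f(t + h / 2, x + h / 2 * k2)
        k4 = f(t + h, x + h * k3)
        x += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return x

T, x0 = 5.0, 1.0
exact = math.exp(-T)
errors = []
for h in (0.1, 0.05, 0.025):
    err = abs(rk4_solve(x0, T, h) - exact)
    errors.append((h, err))
    print(f"h = {h:6.3f}   global error at T = {err:.3e}")

# When h is halved, the error should drop by roughly 2^4, i.e. observed order close to 4.
for (h1, e1), (h2, e2) in zip(errors, errors[1:]):
    print(f"observed order = {math.log(e1 / e2, 2):.2f}")
```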

Using techniques from the stability theory of differential equations, this paper gives conditions on $x(t)$ under which $E(t)$ is bounded above by a constant, or by a linearly growing function of $t$, for all $t\geq 0$. More concretely, these techniques give constant or linear bounds on $E(t)$ when $x(t)$ is a trajectory of a dynamical system that falls into a stable, hyperbolic fixed point; or into a stable, hyperbolic cycle; or into a normally hyperbolic and contracting manifold with quasiperiodic flow on the manifold.
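A hedged illustration of the first case (again, not from the preprint): when the trajectory falls into a stable fixed point, the global error at time $T$ should remain bounded as $T$ grows, whereas along an expanding trajectory it grows with $T$. The sketch below contrasts the scalar problems $\dot{x} = -x$ (attracting fixed point at $0$) and $\dot{x} = x$ (repelling fixed point at $0$) using forward Euler with a fixed step, purely to show the qualitative difference in error growth.

```python
# Minimal sketch (illustrative only): error growth in T for a contracting versus an
# expanding linear flow, integrated with forward Euler at a fixed step h.
import math

def euler_error(lam, T, h, x0=1.0):
    """Global error of forward Euler for x' = lam * x at time T (exact: x0 * exp(lam * T))."""
    x = x0
    for _ in range(int(round(T / h))):
        x += h * lam * x
    return abs(x - x0 * math.exp(lam * T))

h = 0.01
for T in (5.0, 10.0, 20.0, 40.0):
    stable = euler_error(-1.0, T, h)    # trajectory falls into the stable fixed point
    unstable = euler_error(+1.0, T, h)  # trajectory leaves the unstable fixed point
    print(f"T = {T:5.1f}   error (stable) = {stable:.2e}   error (unstable) = {unstable:.2e}")
```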