
Solutions of dissipative dynamical systems cannot fill a volume of the phase
space, since dissipation is synonymous with a contraction of volume elements
under the action of the equations of motion. Instead, trajectories are confined
to lower dimensional subsets which have measure zero in the phase space. These
subsets can be extremely complicated, and frequently they possess a fractal
structure, which means that they are in a nontrivial way
self-similar. Generalized dimensions are one class of quantities to
characterize this fractality. The *Hausdorff dimension* is, from the
mathematical point of view, the most natural concept to characterize fractal
sets [67], whereas the *information dimension* takes into
account the relative visitation frequencies and is therefore more attractive for
physical systems. Finally, for the characterization of measured data, other
similar concepts, like the *correlation dimension*, are more useful. One
general remark is essential for understanding the limitations of any
numerical approach: dimensions characterize a set or an invariant measure whose
support is the set, whereas any data set contains only a finite number of
points representing the set or the measure. By definition, the dimension of a
finite set of points is zero. When we determine the dimension of an attractor
numerically, we extrapolate from finite length scales, where the statistics we
apply are insensitive to the finite number of data points, to the
infinitesimal scales, where the concept of dimensions is defined. This
extrapolation can fail for many reasons which will be partly discussed
below. Dimensions are invariant under smooth transformations and thus again
computable in time delay embedding spaces.
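The scaling statistics alluded to above can be sketched concretely. The following is a minimal sketch of a Grassberger-Procaccia-type correlation sum in a time delay embedding; the function and parameter names (`correlation_sum`, `theiler`, the defaults) are illustrative choices, not a fixed API:

```python
import numpy as np

def correlation_sum(x, eps, m=2, tau=1, theiler=10):
    """Fraction of embedding-vector pairs closer than eps (max norm).

    x: scalar time series; m: embedding dimension; tau: delay;
    theiler: window excluding temporally close pairs from the count.
    All names and defaults are illustrative.
    """
    n = len(x) - (m - 1) * tau
    # build the delay vectors (x_i, x_{i+tau}, ..., x_{i+(m-1)tau})
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(m)])
    count, pairs = 0, 0
    for i in range(n):
        for j in range(i + theiler + 1, n):
            pairs += 1
            if np.max(np.abs(emb[i] - emb[j])) < eps:
                count += 1
    return count / pairs if pairs else 0.0

# The correlation dimension is then read off from the scaling
# C(eps) ~ eps**D2, i.e. as the slope of log C(eps) versus log eps
# over the finite range of length scales where that plot is linear.
```

The extrapolation problem discussed above appears here directly: the slope can only be fitted over length scales populated by the finitely many available pairs.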

Entropies are an information-theoretic concept to characterize the amount of
information needed to predict the next measurement with a certain
precision. The most popular one is the Kolmogorov-Sinai entropy. We will
discuss here only the correlation entropy, which can be computed in a much more
robust way. The occurrence of entropies in a section on dimensions reflects the
fact that both can be determined with the same statistical tool.
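To illustrate that shared tool: the correlation entropy can be estimated from correlation sums at successive embedding dimensions, via h2 ≈ ln C_m(eps) − ln C_{m+1}(eps). This is a sketch under assumed, illustrative names and parameters, not a robust implementation:

```python
import numpy as np

def corr_sum(x, eps, m, tau=1, theiler=10):
    # Correlation sum (max norm) over all pairs outside a Theiler window.
    n = len(x) - (m - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(m)])
    dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=-1)
    i, j = np.triu_indices(n, k=theiler + 1)
    return np.mean(dist[i, j] < eps)

def corr_entropy(x, eps, m, tau=1, theiler=10):
    # h2 estimate: the information gained per step, read off from the drop
    # of the correlation sum when the embedding dimension grows by one.
    return np.log(corr_sum(x, eps, m, tau, theiler)
                  / corr_sum(x, eps, m + 1, tau, theiler))
```

For deterministic data the estimate should converge for sufficiently large m and small eps, whereas for uncorrelated noise it stays large at all length scales, which is itself a useful diagnostic.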

*Thomas Schreiber*

Wed Jan 6 15:38:27 CET 1999