# Markus-Lyapunov Fractals

*A Markus-Lyapunov fractal: a representation of the regions of chaos and stability over the space of two population growth rates.*


## Basic Explanation

When mathematicians see a system that changes size, we like to try to model it with formulas. This is easy for systems that always change in one way – say, by always increasing – but such systems rarely show up in the real world. Much more common are systems such as those created by animal populations. A population cannot grow infinitely, but rather is constrained by the amounts of food, space, etc. available to it. To model the pattern of growth and reduction that occurs as populations approach and retreat from their maximum sizes, mathematicians have developed the logistic formula, which models population growth fairly accurately by including a factor that diminishes as population size grows, just as food and space would diminish.

The logistic formula is driven by the initial population size and by the potential rate of change of that population. Mathematically, the more powerful of these values is the rate of change; it will determine whether the size of the population settles to a specific value, oscillates between two or more values, or becomes chaotic. To help determine which of these outcomes would occur, a mathematician named Aleksandr Lyapunov developed a method for comparing changes in growth and time in order to calculate what has been dubbed the Lyapunov exponent. This is a handy little indicator, and here's why:

• If it is zero, the population change is neutral; at some point in time, it reaches a fixed point and remains there.
• If it is less than zero, the population will become stable. The lower the number, the faster and more thoroughly the population will stabilize.
• If it is positive, the population will become chaotic.

*Another example of a Markus-Lyapunov fractal, this one with chaos in black and stability in gold.*

What does all this have to do with the fantastical shapes of the Markus-Lyapunov fractal? Well, a scientist named Mario Markus wanted a way to visualize the potential represented by the Lyapunov exponent as a population moved between two different rates of growth. So he created a graphical space with one rate of growth measured along the x-axis and the other along the y. Thus for any point, (x,y), there is one specific Lyapunov exponent that predicts how a population with those rates of change will behave. Markus then assigned a color to every possible Lyapunov exponent – one color for positive numbers and another for negative numbers and zero. This second color he placed on a gradient, so that lower negative numbers are lighter and those closer to zero are darker, with zero itself being black. Some Markus-Lyapunov fractals also display superstable points in a third color or black. By this code, Markus could color every point on his graph space based on its Lyapunov exponent.

Consider the main image on this page. The blue "background" shows all the points where the combination of the rate of change on the x and y axes will result in chaotic population growth. The "floating" yellow shapes show where the population will move toward stability. The lighter the yellow, the more stable the population.

*An enlargement of "Zircon Zity," showing self-similarity.*

The movements from light to dark and the dramatic curves of the boundaries between stability and chaos here create an astonishing 3D effect. But the image is striking not only for its beauty but also for its self-similarity. Self-similarity is that trait that makes fractals what they are – zooming in on the image reveals smaller and smaller parts that resemble the whole. Consider the image to the right, enlarged from a section of the main image above. Here we see several shapes that repeat in smaller and smaller iterations. Perhaps ironically, this type of pattern is a common property of chaos.

## A More Mathematical Explanation

### The Logistic Formula

This comes from the field of Verhulst Dynamics. Basic, unrestricted growth can be represented by

1        $x_{(n+1)}=Rx_n$

But this, as discussed above, is not a realistic model of population growth in the ecological world. To account for the changing rate of change, R, of an actual population, Verhulst constructed

2        $R=\mathbf{r}(1-x_n)$

In this way, the overall rate of change, R, is higher when $x_n$ is lower and lower when $x_n$ is higher. Re-inserting this into our initial representation of growth, we have:

3        $x_{(n+1)}=(1-x_n)\mathbf{r}x_n$

This is the logistic formula.
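The iteration above is simple enough to sketch directly. The following is a minimal Python illustration (the parameter values here are arbitrary choices, not values from the article's images); with r = 2.5, the population settles to the fixed point 1 − 1/r = 0.6:

```python
def logistic_orbit(r, x0, n):
    """Return the first n iterates of the logistic formula x_{n+1} = r x_n (1 - x_n)."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# With r = 2.5 the orbit converges to the stable fixed point 1 - 1/r = 0.6.
orbit = logistic_orbit(2.5, 0.2, 200)
print(round(orbit[-1], 6))  # -> 0.6
```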

*A bifurcation diagram for the logistic formula. The population sizes resulting from 120 iterations of the formula for each r value are plotted above that r value (after an initial, unrecorded period of 5000 iterations to let the systems settle). Note the self-similarity shown in the enlarged section.*

#### Bifurcation

The logistic equation is interesting partly for its properties of bifurcation. Bifurcation occurs when a system "branches" (hence the name) into multiple values. In the logistic formula, this means that $x_n$ goes from a single value to oscillating among two or more values; the population volume ceases to be constant and begins fluctuating between multiple volumes.

The diagram to the left shows how the logistic formula bifurcates as the value of r changes. Population sizes, $x_n$ (y-axis), are plotted against the r values (x-axis) that generate them. The most stable state therefore appears as a single horizontal line. When this line appears to "branch" into two, we are observing bifurcation; the population is now oscillating between two volumes. As the branching continues, so does bifurcation: three lines show oscillation among three volumes, four lines show oscillation among four volumes, and so forth. The grey areas show where the system bifurcates to the extent that it essentially "oscillates" among all possible $x_n$ values. That is, it becomes chaotic. The first bifurcation occurs at r = 3, and the period-doublings then come faster and faster; beyond the accumulation point near r ≈ 3.57, we see wider and wider "bands" of chaos, interrupted only by narrow windows of r values that yield periodic behavior again.

In the enlarged portion, we can see that the diagram of logistic bifurcation is self-similar. This is the fractal property that carries through into Markus-Lyapunov fractals.
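A diagram like the one described can be sampled numerically: for each r, discard an initial transient, then record the distinct values the orbit visits. This is a minimal sketch; the iteration counts are smaller than the article's 5000/120 purely to keep it fast, and the tolerance is an arbitrary choice:

```python
def attractor_points(r, x0=0.5, transient=1000, keep=64, tol=1e-6):
    """Return the distinct long-term values visited by the logistic formula at rate r."""
    x = x0
    for _ in range(transient):          # let the system settle, unrecorded
        x = r * x * (1 - x)
    seen = []
    for _ in range(keep):               # record the post-transient orbit
        x = r * x * (1 - x)
        if all(abs(x - s) > tol for s in seen):
            seen.append(x)
    return seen

print(len(attractor_points(2.9)))   # -> 1 (stable fixed point)
print(len(attractor_points(3.2)))   # -> 2 (period-2 oscillation)
```

Plotting `attractor_points(r)` against r for a fine grid of r values reproduces the branching structure described above.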

### The Lyapunov Exponent

The discrete form of the Lyapunov exponent is

4        $\lambda=\lim_{N \to \infty}\frac{1}{N}\sum_{n=1}^N \log_2 \left|\frac{dx_{(n+1)}}{dx_n}\right|$

In other words, the Lyapunov exponent $\lambda$ represents the limit of the mean of the exponential rates of change that occur in each transition, $x_n \rightarrow x_{(n+1)}$, as the number of transitions approaches infinity.

What does this have to do with stability? The key is the log2 component, which renders magnitudes under 1 negative and those over 1 positive. This is what yields the properties of Lyapunov exponents laid out in the "Basic Explanation" – those mean overall rates of change that keep the system finite must be less than 1 in magnitude, giving us a negative Lyapunov exponent, while those rates of change that expand the system to the point of chaos must be greater than 1, giving us a positive exponent. When the rate of change at some step is exactly zero, the logarithm diverges to negative infinity; this is the signature of a superstable system, in which perturbations die out completely.

In other words, the Lyapunov exponent is a method for examining the rate of change of a system considered over infinite iterations, then taking that rate of change and making it easily identifiable as a value that induces either chaos or stability.
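The sign behavior of the log2 component can be seen directly with a few values (a trivial check, using only Python's standard library):

```python
import math

# Rates of change with magnitude below 1 contribute negatively (stability),
# a rate of exactly 1 contributes zero, and rates above 1 contribute
# positively (expansion toward chaos).
print(math.log2(0.5))   # -> -1.0
print(math.log2(1.0))   # -> 0.0
print(math.log2(2.0))   # -> 1.0
```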

#### In the Logistic Formula

Basic differentiation shows us that, for the logistic formula (3):

5        $\frac{dx_{(n+1)}}{dx_n}=\mathbf{r}-2\mathbf{r}x_n$

Using this and a sufficiently large N number of iterations, we can approximate the Lyapunov exponent for the logistic formula to be:

6        $\lambda \approx \frac{1}{N}\sum_{n=1}^N \log_2 \left|\mathbf{r}-2\mathbf{r}x_n\right|$

Here we can see much more clearly a property that we have been assuming – the variable that has the greatest impact on the stability of the logistic equation is r, not $x_n$. The derivative at every step is scaled by r, and when these contributions are averaged over a great many iterations, the particular starting volume $x_0$ washes out: almost any starting value carries the system to the same long-term behavior, and therefore to the same Lyapunov exponent. The exponent is in effect a function of r alone, and so changing the initial value does not change whether the logistic function yields chaos or stability.
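Formula (6) translates directly into a numerical estimate. The following is an illustrative sketch (the transient and iteration counts are arbitrary choices); at r = 2.5 the orbit sits at the fixed point 0.6, where the derivative is r(1 − 2·0.6) = −0.5, so the exponent is exactly −1:

```python
import math

def lyapunov(r, x0=0.5, transient=1000, n=100_000):
    """Estimate the Lyapunov exponent (6) of the logistic formula at rate r."""
    x = x0
    for _ in range(transient):                        # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        total += math.log2(abs(r - 2 * r * x))        # log2 |dx_{n+1}/dx_n|
        x = r * x * (1 - x)
    return total / n

print(lyapunov(2.5))   # -> -1.0 (stable: fixed point at 0.6)
print(lyapunov(3.9) > 0)   # -> True (chaotic)
```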

### Forcing the Rates of Change

Mathematically, the important part of Markus's contribution to understanding this type of system was not his method for generating fractals, but his use of periodic rate-of-change forcing. We have been discussing the great impact of the r value in determining the output of the logistic formula, but this value can have still greater impact if we do not choose to keep it constant. Anyone who has studied biology, as Markus has, knows that the rates of change of a population's size do not simply fluctuate with changing supplies of food and space, but also often alternate between two or more specific potential rates of change depending on such things as weather and mating seasons.

In terms of the logistic formula, this means we choose a set of rates of change, $r_1, r_2, r_3, \ldots, r_p$, where p is the period over which the rates of change loop. When we force the rates of change to follow such a loop, we have a new version of the logistic equation (3), with the rate selected modularly:

*A Markus-Lyapunov fractal with rate-of-change pattern ab.*

7        $x_{(n+1)}=\mathbf{r}_{(n \bmod p)}x_n(1-x_n)$

It is in these forced alternations in rates of change that the fascinating shapes of the Markus-Lyapunov fractal emerge. Each of the fractals is formed from some pattern of two rates of change, $r_1 = a$ and $r_2 = b$. Because the axes used to map these fractals are measurements of changes in a and b, the pattern a would simply yield a set of vertical bars, just as the pattern b would yield horizontal bars. Once the patterns become mixed, however, more interesting results appear. The image to the right shows an ab pattern. Note that it is much simpler than the other images shown on this page; the main image, for instance, uses a bbbbbbaaaaaa pattern.
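Putting the pieces together, the exponent of equation (6) can be evaluated under the forcing of equation (7). This is a minimal sketch, not Markus's actual rendering code; the pattern string, parameter values, and clamp for superstable points are all illustrative choices. Coloring each (a, b) grid point by the sign and magnitude of the result produces the images described above:

```python
import math

def forced_lyapunov(pattern, a, b, x0=0.5, transient=200, n=2000):
    """Estimate the Lyapunov exponent of the logistic formula whose rate
    cycles through `pattern` ('a' -> a, any other character -> b)."""
    rates = [a if c == "a" else b for c in pattern]
    p = len(rates)
    x = x0
    for i in range(transient):                 # discard the transient
        x = rates[i % p] * x * (1 - x)
    total = 0.0
    for i in range(n):
        r = rates[(transient + i) % p]         # keep the forcing phase
        d = abs(r - 2 * r * x)                 # |dx_{n+1}/dx_n|
        total += math.log2(d) if d > 0 else -50.0   # clamp superstable points
        x = r * x * (1 - x)
    return total / n

# Color rule from the "Basic Explanation": one color where lambda > 0
# (chaos), a gradient of another where lambda <= 0 (stability).
lam = forced_lyapunov("ab", 2.0, 3.0)
print("chaotic" if lam > 0 else "stable")   # -> stable
```

Sweeping a along the x-axis and b along the y-axis, and evaluating `forced_lyapunov` at every pixel, yields the fractal; longer patterns such as bbbbbbaaaaaa simply change the `pattern` argument.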