# Markus-Lyapunov Fractals


## Revision as of 09:11, 26 May 2011

Markus-Lyapunov Fractal
A representation of the regions of chaos and stability over the space of two population growth rates.

# Basic Description

The Markus-Lyapunov fractal is much more than a pretty picture; it is a map. The curving bodies and sweeping arms of the image are in fact a color-coded plot that shows us how a population changes as its rate of growth moves between two values. All the rich variations of color in the fractal come from the different levels of stability and chaos possible in such change.

Before anything else, in order to generate a Markus-Lyapunov fractal, we must be able to represent animal population change mathematically. This entails more than simply writing an equation for constant growth or constant reduction – a population cannot grow infinitely, but rather is constrained by the amounts of food, space, etc., available to it. To model the pattern of growth and reduction that occurs as populations approach and retreat from their maximum sizes, mathematicians have developed the logistic formula, which models population growth fairly accurately by including a factor that diminishes as population size grows, just as food and space would diminish.

The logistic formula is driven by the initial population size and by the potential rate of change of that population. Mathematically, the more powerful of these values is the rate of change; it will determine whether the size of the population settles to a specific value, oscillates between two or more values, or becomes chaotic. To help determine which of these outcomes would occur, a mathematician named Aleksandr Lyapunov developed a method for comparing changes in growth and time in order to calculate what has been dubbed the Lyapunov exponent. This is a handy little indicator, and here's why:

• If it is zero, the population change is neutral; at some point in time, it reaches a fixed point and remains there.
• If it is less than zero, the population will become stable. The lower the number, the faster and more thoroughly the population will stabilize.
• If it is positive, the population will become chaotic.
Another example of a Markus-Lyapunov fractal, this one with chaos in black and stability in gold.

What does all this have to do with the fantastical shapes of the Markus-Lyapunov fractal? Well, a scientist named Mario Markus wanted a way to visualize the potential represented by the Lyapunov exponent as a population moved between two different rates of growth. So he created a graphical space with one rate of growth measured along the x-axis and the other along the y. Thus for any point, (x,y), there is one specific Lyapunov exponent that predicts how a population with those rates of change will behave. Markus then assigned a color to every possible Lyapunov exponent – one color for positive numbers and another for negative numbers and zero. This second color he placed on a gradient, so that lower negative numbers are lighter and those closer to zero are darker, with zero itself being black. Some Markus-Lyapunov fractals also display superstable points in a third color or black. By this code, Markus could color every point on his graph space based on its Lyapunov exponent.
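The color coding described above can be sketched as a simple mapping from a Lyapunov exponent to a pixel color. The function below is an illustrative sketch, not Markus's actual palette; the flat blue for chaos, the yellow gradient for stability, and the `floor` clamp that sets where the gradient saturates are all assumptions chosen to match the main image on this page.

```python
def lyapunov_color(lam, floor=-5.0):
    """Map a Lyapunov exponent to an (R, G, B) color, 0-255 per channel.

    Positive exponents (chaos) get a flat blue. Negative exponents
    (stability) get a yellow gradient: lighter for strongly negative
    (more stable) values, darkening to black as lam approaches zero.
    The palette and the `floor` clamp are illustrative assumptions.
    """
    if lam > 0:
        return (0, 0, 200)                  # chaos: flat blue
    t = max(lam, floor) / floor             # 0.0 at lam = 0 ... 1.0 at lam <= floor
    level = int(255 * t)                    # brightness of the yellow
    return (level, level, 0)                # stability: yellow gradient
```

Coloring every point of a graph space this way, with one exponent computed per `(x, y)` pair of growth rates, produces the fractal images shown here.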

Consider the main image on this page. The blue "background" shows all the points where the combination of the rates of change on the x and y axes will result in chaotic population growth. The "floating" yellow shapes show where the population will move toward stability. The lighter the yellow, the more stable the population.

# A More Mathematical Explanation

### The Logistic Formula


This comes from the field of Verhulst Dynamics. Basic, unrestricted growth can be represented by

1        $x_{(n+1)}=Rx_n$

Where $x_{n+1}$ is the population size at time $n + 1$. But this, as discussed above, is not a realistic model of population growth in the ecological world. To account for the changing rate of change, R, of an actual population, Verhulst constructed

2        $R=\mathbf{r}(1-x_n)$

Where r is a parameter for the potential rate of change of the population. In this way, the overall rate of change, R, is higher when $x_n$ is lower and lower when $x_n$ is higher. Re-inserting this in our initial representation of growth, we have:

3        $x_{(n+1)}=(1-x_n)\mathbf{r}x_n$

This is the logistic formula.
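A few lines of Python (an illustrative sketch, not part of the original article) show the logistic formula in action. For a modest rate such as r = 2.5, the population settles to the equilibrium volume 1 - 1/r no matter where it starts:

```python
def logistic_step(x, r):
    """One iteration of the logistic formula: x_{n+1} = r * x_n * (1 - x_n)."""
    return r * x * (1 - x)

def iterate(x0, r, n):
    """Iterate the map n times from starting volume x0; return the final value."""
    x = x0
    for _ in range(n):
        x = logistic_step(x, r)
    return x

# For r = 2.5, any starting volume in (0, 1) is driven toward the
# fixed point 1 - 1/r = 0.6, illustrating the self-limiting growth
# described above.
```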

A bifurcation diagram for the logistic formula. The population sizes resulting from 120 iterations of the formula based on each r value are plotted above that r value (after an initial, unrecorded period of 5000 iterations to let the systems level out). Note the self-similarity shown in the enlarged section.[1]

#### Bifurcation

The logistic equation is interesting partly for its properties of bifurcation. Bifurcation occurs when a system "branches" (hence the name) into multiple values. In the logistic formula, this means that, as the r value grows, $x_n$ goes from a single value to oscillating among two or more values; the population volume ceases to be constant and begins fluctuating between multiple volumes.

The diagram to the left shows how the logistic formula bifurcates as the value of r changes. Population sizes, $x_n$ (y-axis), are plotted against the r values (x-axis) that generate them. The most stable state therefore appears as a single horizontal line. When this line appears to "branch" into two, we are observing bifurcation; the r value has changed so that the population is now oscillating between two volumes. As the branching continues, so does bifurcation: Three lines show oscillation among three volumes, four lines show oscillation among four volumes, and so forth. The grey areas show where the system bifurcates to the extent that it essentially "oscillates" among all possible $x_n$ values. That is, it becomes chaotic. As r values increase, we see wider and wider "bands" of chaos where ranges of r values yield only chaotic systems. Beyond an r value of about 3.57, these "bands" dominate, and almost all r values yield chaos, interrupted only by narrow windows of periodic behavior.
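The diagram's construction (a long, unrecorded transient to let the system level out, then a batch of recorded iterates per r value) can be sketched in Python. This is an illustration, not the diagram's original code; the starting volume and the rounding used to count distinct attractor values are assumptions.

```python
def attractor_values(r, x0=0.3, transient=5000, samples=120, digits=6):
    """Follow the caption's recipe: discard `transient` iterations so the
    system levels out, then record `samples` population sizes.
    Returns the set of distinct (rounded) values the system visits."""
    x = x0
    for _ in range(transient):          # unrecorded settling period
        x = r * x * (1 - x)
    seen = set()
    for _ in range(samples):            # recorded iterations
        x = r * x * (1 - x)
        seen.add(round(x, digits))
    return seen

# One distinct value means a stable population; two values, a period-2
# oscillation (one branching); a large count indicates chaos.
```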

In the enlarged portion, we can see that the diagram of logistic bifurcation is self-similar. This is the fractal property that carries through into Markus-Lyapunov fractals.

### The Lyapunov Exponent

The discrete form of the Lyapunov exponent is

4        $\lambda=\lim_{N \to \infty}\frac{1}{N}\sum_{n=1}^N \log_2 \left| \frac{dx_{(n+1)}}{dx_n} \right|$

In other words, the Lyapunov exponent $\lambda$ represents the limit of the mean of the exponential rates of change that occur in each transition, $x_n \rightarrow x_{n+1}$, as the number of transitions approaches infinity.

What does this have to do with stability? The key is the $\log_2$ component, which renders numbers under 1 negative and those over 1 positive. This is what yields the properties of Lyapunov exponents laid out in the "Basic Explanation" – those mean overall rates of change that keep the system finite must be less than 1, giving us a negative Lyapunov exponent, while those rates of change that expand the system to the point of chaos must be greater than 1, giving us a positive exponent. When the overall rate of change reaches zero, the logarithm diverges toward negative infinity, which is exactly what happens in a superstable system; the rate of change ceases to exist altogether.

In other words, the Lyapunov exponent is a method for examining the rate of change of a system considered over infinite iterations, then taking that rate of change and making it easily identifiable as a value that induces either chaos or stability.
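Equation 4 can be approximated numerically by cutting the limit off at a large N. The Python sketch below is an illustration, not the article's code; the warmup length, starting volume, and the tiny guard against taking the log of zero are all assumptions. It uses the logistic map's derivative, r(1 - 2x), as the rate-of-change term:

```python
import math

def lyapunov_exponent(r, x0=0.3, warmup=1000, N=20000):
    """Approximate eq. 4 for the logistic map: the mean of
    log2 |dx_{n+1}/dx_n| over N transitions, after a warmup."""
    x = x0
    for _ in range(warmup):                      # settle onto the attractor
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(N):
        deriv = abs(r * (1 - 2 * x))             # |dx_{n+1}/dx_n| at x_n
        total += math.log2(max(deriv, 1e-300))   # guard against log(0)
        x = r * x * (1 - x)
    return total / N
```

A negative result for r = 2.5 (stable) and a positive one for r = 4 (fully chaotic, where the exact base-2 exponent is 1) match the sign rules laid out in the Basic Description.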

#### In the Logistic Formula

Basic differentiation shows us that, for the logistic formula (3):

5        $\frac{dx_{(n+1)}}{dx_n}=\mathbf{r}-2\mathbf{r}x_n$

Using this and a sufficiently large number of iterations, N, we can approximate the Lyapunov exponent for the logistic formula to be:

6        $\lambda \approx \frac{1}{N}\sum_{n=1}^N \log_2 \left| \mathbf{r}-2\mathbf{r}x_n \right|$

Here we can see much more clearly a property that we have been assuming -- the variable that has the greatest impact on the stability of the logistic equation is r, not $x_n$. The rate of change that drives the exponent, $\mathbf{r}-2\mathbf{r}x_n$, is scaled entirely by r, and the average in (6) is taken over the whole long-run trajectory, so the starting volume $x_0$ washes out of the result. Changing the initial population size therefore does not change whether the logistic function yields chaos or stability; only r does.
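This dominance of r can be checked numerically. The sketch below (an illustration under the same assumptions as before; the helper is redefined here so the block stands alone) shows that two very different starting volumes yield the same exponent, while changing r flips its sign:

```python
import math

def lyap(r, x0, warmup=1000, N=20000):
    """Base-2 Lyapunov exponent of the logistic map, as in eq. 4/6."""
    x = x0
    for _ in range(warmup):
        x = r * x * (1 - x)
    s = 0.0
    for _ in range(N):
        s += math.log2(max(abs(r * (1 - 2 * x)), 1e-300))
        x = r * x * (1 - x)
    return s / N

# Varying the starting volume x0 leaves the exponent (and thus the
# stability verdict) unchanged; varying r changes it from negative
# (stable) to positive (chaotic).
```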

### Forcing the Rates of Change

Mathematically, the important part of Markus's contribution to understanding this type of system was not his method for generating fractals, but his use of periodic rate-of-change forcing. We have been discussing the great impact of the r value in determining the output of the logistic formula, but this value can have still greater impact if we choose not to keep it constant. Anyone who has studied biology, as Markus has, knows that the rates of change of a population's size do not simply fluctuate with changing supplies of food and space, but also often alternate between two or more specific potential rates of change depending on such things as weather and mating seasons.
A Markus-Lyapunov fractal with rate-of-change pattern ab

In terms of the logistic formula, this means we choose a set of rates of change, $r_1, r_2, r_3, \ldots, r_p$, where p is the period over which the rates of change loop. When we force the rates of change to follow such a loop, we have a new, modular version of the logistic equation (3):

7        $x_{(n+1)}=\mathbf{r}_{n \bmod p}\,x_n(1-x_n)$

It is in these forced alterations in rates of change that the fascinating shapes of the Markus-Lyapunov fractal come out. Each of the fractals is formed from some pattern of two rates of change, a and b. So a pattern aba would mean each point on the fractal is colored based on the Lyapunov exponent of the logistic formula 7, where $r_1 = a$, $r_2 = b$, and $r_3 = a$. That is, the r values would cycle a, b, a, a, b, a, a, b, a, ....

Because the axes used to map these fractals are measurements of changes in a and b, the pattern a would simply yield a set of vertical bars, just as the pattern b would yield horizontal bars. However, once the patterns start to become mixed, more interesting results come out. The image to the right shows an ab pattern. Note that it is much simpler than other images shown on this page; the main image, for instance, is a bbbbbbaaaaaa pattern.
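Putting the pieces together, the exponent that colors one point (a, b) of a Markus-Lyapunov fractal follows eq. 7: the growth rate cycles through the pattern while the rate-of-change terms accumulate. This sketch is illustrative; the starting volume, warmup, and iteration counts are assumptions, not Markus's published parameters.

```python
import math

def markus_lyapunov(pattern, a, b, x0=0.5, warmup=600, N=6000):
    """Base-2 Lyapunov exponent of the logistic map whose rate r_n
    cycles through a pattern of the two rates a and b (eq. 7)."""
    rates = [a if c == "a" else b for c in pattern]
    p = len(rates)
    x = x0
    for n in range(warmup):                  # settle onto the attractor
        x = rates[n % p] * x * (1 - x)
    s = 0.0
    for n in range(warmup, warmup + N):
        r = rates[n % p]                     # forced, looping rate of change
        s += math.log2(max(abs(r * (1 - 2 * x)), 1e-300))
        x = r * x * (1 - x)
    return s / N

# A negative result marks stability (yellow in the main image);
# a positive result marks chaos (blue).
```

Evaluating this function over a grid of (a, b) values and coloring each point by the sign and magnitude of the result reproduces the plotting scheme described in the Basic Description.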

# Why It's Interesting

An enlargement of a section of "Zircon Zity," showing self-similarity.

### Fractal Properties

The movements from light to dark and the dramatic curves of the boundaries between stability and chaos here create an astonishing 3D effect. But the image is striking not only for its beauty but also for its self-similarity. Self-similarity is that trait that makes fractals what they are – zooming in on the image reveals smaller and smaller parts that resemble the whole. Consider the image to the right, enlarged from a section of the main image above. Here we see several shapes that repeat in smaller and smaller iterations. Perhaps ironically, this type of pattern is a common property of chaos.

For more images of the fractal properties of chaotic systems, see the Henon Attractor, the Harter-Heighway Dragon Curve, and Julia Sets.

One artist superimposed and edited several real Markus-Lyapunov fractals to create this piece of art.

### Artistic Extensions

After Markus saw the incredible beauty and intriguing three-dimensionality of the images generated by his plotting system, he immediately sent the images to a gallery in the hopes that it would display his images in an exhibition.[2] It's easy to see why he did so, and in fact, pictures based on these fractals have become a large part of what is called "fractalist" art. As with all domains of fractalist art, there is a great deal of debate in the art community over whether these images are truly "art" given their intrinsic reliance on a purely scientific, algorithmically-generated chart. One could say that such a process is devoid of creativity, but it is equally valid to say that the identification and presentation of the beauty in the science is an art in itself – a concept that is critical in modern art. Either way, there has been an undeniable artistic fascination with Markus-Lyapunov fractals; if the image seems familiar, you have likely seen it on posters, t-shirts, or any other canvas for graphic design.

# References

1. Peitgen, H., & Richter, P. (1986). The Beauty of Fractals: Images of Complex Dynamic Systems. Berlin: Springer-Verlag.
2. Dewdney, A. K. (1991). Leaping into Lyapunov Space. Scientific American, 130-132.

Other Sources Consulted

Elert, G. (2007). The Chaos Hypertextbook. http://hypertextbook.com/chaos/