Markus-Lyapunov Fractals - Math Images

# Markus-Lyapunov Fractals


## Revision as of 09:18, 20 May 2011

Markus-Lyapunov Fractal

A representation of the regions of chaos and stability over the space of two population growth rates.


## Basic Explanation

When mathematicians see a system that changes size, they like to try to model it with formulas. This is easy for systems that always change in one way -- say, by always increasing -- but such systems rarely show up in the real world. Much more common are systems such as those created by animal populations. A population cannot grow infinitely; rather, it is constrained by the amounts of food, space, and other resources available to it. To model the pattern of growth and reduction that occurs as populations approach and retreat from their maximum sizes, mathematicians developed the logistic formula, which models population growth fairly accurately by including a factor that diminishes as population size grows, just as food and space would diminish.

The logistic formula is driven by the initial population size and by the potential rate of change of that population. Mathematically, the more powerful of these values is the rate of change; it will determine whether the size of the population settles to a specific value, oscillates between two or more values, or becomes chaotic. To help determine which of these outcomes would occur, a mathematician named Aleksandr Lyapunov developed a method for comparing changes in growth and time in order to calculate what has been dubbed the Lyapunov exponent. This is a handy little indicator, and here's why:

• If it is zero, the population change is neutral -- at some point in time, it reaches a fixed point and remains there.
• If it is less than zero, the population will become stable. The lower the number, the faster and more thoroughly the population will stabilize.
• If it is positive, the population will become chaotic.
Another example of a Markus-Lyapunov fractal, this one with chaos in black and stability in gold.

What does all this have to do with the fantastical shapes of the Markus-Lyapunov fractal? Well, a scientist named Mario Markus wanted a way to visualize the potential represented by the Lyapunov exponent as a population moved between two different rates of growth. So he created a graphical space with one rate of growth measured along the x-axis and the other along the y. Thus for any point, (x,y), there is one specific Lyapunov exponent that predicts how a population with those rates of change will behave. Markus then assigned a color to every possible Lyapunov exponent -- one color for positive numbers and another for negative numbers and zero. This second color he placed on a gradient, so that lower negative numbers are lighter and those closer to zero are darker, with zero itself being black. Some Markus-Lyapunov fractals also display superstable points in a third color or black. By this code, Markus could color every point on his graph space based on its Lyapunov exponent.

Consider the main image on this page. The blue "background" shows all the points where the combination of the rate of change on the x and y axes will result in chaotic population growth. The "floating" yellow shapes show where the population will move toward stability. The lighter the yellow, the more stable the population.

An enlargement of "Zircon Zity," showing self-similarity.

The movements from light to dark and the dramatic curves of the boundaries between stability and chaos here create an astonishing 3D effect. But the image is striking not only for its beauty but also for its self-similarity. Self-similarity is that trait that makes fractals what they are -- zooming in on the image reveals smaller and smaller parts that resemble the whole. Consider the image to the right, enlarged from a section of the main image above. Here we see several shapes that repeat in smaller and smaller iterations. Perhaps ironically, this type of pattern is a common property of chaos.

## A More Mathematical Explanation

### The Logistic Formula

This comes from the field of Verhulst Dynamics. Basic, unrestricted growth can be represented by

1        $x_{(n+1)}=Rx_n$

But this, as discussed above, is not a realistic model of population growth in the ecological world. To account for the changing rate of change, R, of an actual population, Verhulst constructed

2        $R=\mathbf{r}(1-x_n)$

In this way, the overall rate of change, R, is higher when $x_n$ is lower and lower when $x_n$ is higher. Re-inserting this in our initial representation of growth, we have:

3        $x_{(n+1)}=(1-x_n)\mathbf{r}x_n$

This is the logistic formula.
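The iteration described by equation 3 can be sketched in a few lines of Python. This is a minimal illustration; the function name and parameters are our own, not part of the original article.

```python
# Iterate the logistic formula x_{n+1} = r * x_n * (1 - x_n).
def logistic_orbit(r, x0, n):
    """Return the first n iterates of the logistic map starting from x0."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# With r = 2.0 the population settles to the fixed point 1 - 1/r = 0.5.
orbit = logistic_orbit(2.0, 0.2, 50)
```

Trying other r values reproduces the behaviors described above: values near 3.2 give an oscillation between two sizes, while values near 3.9 give chaos.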

A bifurcation diagram for the logistic formula. The population sizes resulting from 120 iterations of the formula for each r value are plotted above that r value (after an initial, unrecorded period of 5000 iterations to settle the systems). Thus oscillations between two values appear as two lines, oscillations among four values appear as four lines, and chaos generates so many points that the graph space appears grey. Note the self-similarity shown in the enlarged section.

#### Bifurcation

The logistic equation is interesting partly for its properties of bifurcation. Bifurcation occurs when a system "branches" (hence the name) into multiple values. In the logistic formula, this means that $x_n$ oscillates between two or more values; the population is fluctuating between these sizes. The diagram to the left shows how the logistic formula bifurcates as the value of r changes. The population sizes (y-axis) are plotted against the r values (x-axis) that generate them. The grey areas show where the system bifurcates to the extent that it essentially "oscillates" among all possible $x_n$ values. That is, it becomes chaotic. As r values increase, we see wider and wider "bands" of chaos where ranges of r values yield only chaotic systems. Beyond roughly r ≈ 3.57 these bands become nearly continuous, and almost all r values yield chaos, interrupted only by narrow windows of stability.

In the enlarged portion, we can see that the diagram of logistic bifurcation is self-similar. This is the fractal property that carries through into Markus-Lyapunov fractals.
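The construction of the bifurcation diagram described in the caption can be sketched directly: for each r, discard a transient, then record the distinct values the orbit visits. The function name, the rounding trick, and the parameter defaults are illustrative choices, not the article's.

```python
# For a given r, return the set of values the logistic orbit settles into.
def bifurcation_points(r, transient=500, keep=120, x0=0.5):
    x = x0
    for _ in range(transient):          # unrecorded settling period
        x = r * x * (1 - x)
    points = set()
    for _ in range(keep):               # recorded iterations
        x = r * x * (1 - x)
        points.add(round(x, 6))         # rounding groups near-identical values
    return sorted(points)

# r = 3.2 lies in the period-2 band: the orbit oscillates between two values.
```

Plotting `bifurcation_points(r)` against r for a fine grid of r values reproduces the diagram: one point per r before the first bifurcation, two in the period-2 band, and a grey smear of many points in the chaotic bands.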

### The Lyapunov Exponent

The discrete form of the Lyapunov exponent is

4        $\lambda=\lim_{N \to \infty}\frac{1}{N}\sum_{n=1}^N \log_2 \frac{dx_{(n+1)}}{dx_n}$

In other words, the Lyapunov exponent $\lambda$ represents the limit of the mean of the exponential rates of change that occur in each transition, $x_n \rightarrow x_{(n+1)}$, as the number of transitions approaches infinity.

What does this have to do with stability? The key is the $\log_2$ component, which renders the logarithm of numbers under 1 negative and of numbers over 1 positive. This is what yields the properties of Lyapunov exponents laid out in the "Basic Explanation": those mean overall rates of change that keep the system finite must be less than 1, giving us a negative Lyapunov exponent, while those rates of change that expand the system to the point of chaos must be greater than 1, giving us a positive exponent. When the mean overall rate of change is zero, the logarithm is undefined (it diverges to negative infinity), which is exactly what happens in a superstable system: the rate of change ceases to exist.

In other words, the Lyapunov exponent is a method for examining the rate of change of a system considered over infinite iterations, then taking that rate of change and making it easily identifiable as a value that induces either chaos or stability.
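Equation 4 can be estimated numerically for the logistic map. Since $x_{(n+1)} = \mathbf{r}x_n(1-x_n)$, the rate of change in each transition is $\frac{dx_{(n+1)}}{dx_n} = \mathbf{r}(1-2x_n)$, so a finite-N sketch of the infinite limit simply averages $\log_2|\mathbf{r}(1-2x_n)|$ over many iterations. The function name and cutoff parameters below are our own choices.

```python
import math

# Finite-N estimate of the Lyapunov exponent for the logistic map.
def lyapunov_logistic(r, x0=0.3, transient=1000, n=100000):
    x = x0
    for _ in range(transient):                    # let the system settle
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        # derivative of the logistic map at x is r * (1 - 2x)
        total += math.log2(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return total / n
```

As the text predicts, `lyapunov_logistic(2.5)` comes out negative (a stable fixed point) while `lyapunov_logistic(3.9)` comes out positive (chaos).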


### Forcing the Rates of Change

Mathematically, the important part of Markus's contribution to understanding this type of system was not his method for generating fractals but his use of periodic forcing. We have been discussing the great impact of the r value in determining the output of the logistic formula, but this value can have still greater impact if we do not keep it constant. Anyone who has studied biology, as Markus has, knows that the rate of change of a population's size does not simply fluctuate with changing supplies of food and space, but also often alternates between two or more specific potential rates of change, depending on such things as weather and mating seasons.

In terms of the logistic formula, this means we choose a set of rates of change, $\mathbf{r}_1, \mathbf{r}_2, \mathbf{r}_3,..., \mathbf{r}_p$, where p is the period over which the rates of change loop. When we force the rates of change to follow such a loop, we have a new, modular logistic equation (3):

A Markus-Lyapunov fractal with rate-of-change pattern ab
5        $x_{(n+1)}=\mathbf{r}_{(n \bmod p)}x_n(1-x_n)$
It is in these forced alternations in rates of change that the fascinating shapes of the Markus-Lyapunov fractal come out. Each of the fractals is formed from some pattern of two rates of change, $\mathbf{r}_1 = a$ and $\mathbf{r}_2 = b$. Because the axes used to map these fractals are measurements of changes in a and b, the pattern a alone would simply yield a set of vertical bars, just as the pattern b alone would yield horizontal bars. Once the patterns are mixed, however, more interesting results emerge. The image to the right shows an ab pattern. Note that it is much simpler than the other images shown on this page; the main image, for instance, is a bbbbbbaaaaaa pattern.
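The forced version of the map, equation 5, combines the two previous ideas: iterate with a repeating pattern of rates, and average the $\log_2$ of the derivatives to get the Lyapunov exponent for one point (a, b) of the image. This is a sketch of the coloring computation, with illustrative names, a small iteration count, and a crude clamp for superstable hits where the derivative is exactly zero.

```python
import math

# Lyapunov exponent of the forced logistic map x_{n+1} = r_{n mod p} x_n (1 - x_n).
# One point of a Markus-Lyapunov image corresponds to one pattern of rates,
# e.g. rates = [a, b] for the pattern "ab".
def forced_lyapunov(rates, x0=0.5, transient=600, n=4000):
    p = len(rates)
    x = x0
    for i in range(transient):                    # settle onto the attractor
        x = rates[i % p] * x * (1 - x)
    total = 0.0
    for i in range(n):
        r = rates[(transient + i) % p]            # keep the pattern's phase
        d = abs(r * (1 - 2 * x))
        total += math.log2(d) if d > 0 else -50.0  # clamp superstable hits
        x = r * x * (1 - x)
    return total / n

# Pattern "ab" with a = 3.4, b = 2.8; sign of the result picks the pixel's color.
lam = forced_lyapunov([3.4, 2.8])
```

To render a full fractal, this function would be evaluated on a grid of (a, b) pairs, coloring points with negative exponents on a gradient for stability and points with positive exponents in the contrasting "chaos" color.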