# Markus-Lyapunov Fractals


Markus-Lyapunov Fractal
Fields: Dynamic Systems and Fractals
Image Created By: BernardH
Website: [1]


Markus-Lyapunov fractals are representations of the regions of chaos and stability over the space of two population growth rates.

# Basic Description

The Markus-Lyapunov fractal is much more than a pretty picture – it is a chart. The curving bodies and sweeping arms of the image are a color-coded plot that shows us how a population changes as its rate of growth moves between two values. All the rich variations of color in the fractal come from the different levels of stability and chaos possible in such change.

The logistic map is one of the most concise mathematical representations of population growth. Depending on the rate of fecundity used in the map, it will generate either a neutral system, a stable oscillating system, or a chaotic system. To help determine which of these outcomes will occur, the mathematician Aleksandr Lyapunov developed a method for comparing changes in growth rate in order to calculate a value called the Lyapunov exponent. This is a useful indicator because, for the logistic map,

• If the Lyapunov exponent is zero, the population change is neutral; the population size begins and remains at a constant, fixed point.
• If the Lyapunov exponent is less than zero, the population will become stable. The more negative the number, the faster and more thoroughly the population will stabilize.
• If the Lyapunov exponent is positive, the population will become chaotic.
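These three outcomes are easy to observe directly by iterating the logistic map, x_{n+1} = r·x_n(1 − x_n), past its initial transient. The sketch below is illustrative only; the function name, starting value, and sample rates are our own choices, not from the source.

```python
# Iterate the logistic map x_{n+1} = r*x_n*(1 - x_n) past its transient,
# then report the next few values to reveal the long-term behavior.
# (Illustrative sketch; names and sample rates are our own choices.)
def logistic_tail(r, x0=0.4, transient=1000, keep=4):
    x = x0
    for _ in range(transient):
        x = r * x * (1 - x)
    tail = []
    for _ in range(keep):
        x = r * x * (1 - x)
        tail.append(round(x, 4))
    return tail

print(logistic_tail(2.0))  # settles to a single fixed point
print(logistic_tail(3.2))  # stable oscillation between two values
print(logistic_tail(3.9))  # chaotic: no settling, no repetition
```

The three rates land in the three regimes the bullets describe: a fixed point, a stable two-value oscillation, and chaos.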
Another example of a Markus-Lyapunov fractal, this one with chaos in black and stability in gold.

What does all this have to do with the fantastical shapes of the Markus-Lyapunov fractal? The scientist Mario Markus wanted a way to visualize the potential represented by the Lyapunov exponent as a population moved between two different rates of growth. So he created a graphical space with one rate of growth measured along the x-axis and the other along the y. Thus for any point (x,y) there is one specific Lyapunov exponent that predicts how a population with those rates of change will behave.

Markus then created a color scheme to represent different Lyapunov exponents – one color represents positive numbers, and another represents negative numbers and zero. This second color he placed on a gradient from light to dark, so that lower negative numbers are lighter and those closer to zero are darker. The bands of black that appear in many fractals therefore show where the Lyapunov exponent is exactly zero, and bands of white indicate superstable points. By this code, Markus could color every point on his graph space based on its Lyapunov exponent.

Consider the main image on this page. The blue "background" shows all the points where the combination of the rates of change on the x and y axes will result in chaotic population growth. The "floating" yellow shapes show where the population will move toward stability. The lighter the yellow, the more stable the population.

Based on this color assignment, if a logistic system with rates of change x = 2 and y = 1.6 has a Lyapunov exponent of -0.3, there will be a dark yellow pixel at the graph location (2, 1.6), showing that the system moves slowly toward stability. If, instead, that logistic system had a Lyapunov exponent of 1, that same pixel would be blue, showing chaos.
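This coloring rule can be sketched in a few lines of Python. The color names, the threshold treated as "fully light," and the return format are illustrative choices of ours, not Markus's exact palette.

```python
# A simplified version of the coloring rule described above.
# Positive exponents get the "chaos" color; non-positive exponents get
# the "stability" color with a lightness that grows as lambda becomes
# more negative (so lambda = 0 is darkest and very negative is lightest).
# min_lam, the lambda treated as "fully light," is an arbitrary choice.
def pixel_color(lam, min_lam=-5.0):
    if lam > 0:
        return ("blue", None)              # chaos
    lightness = min(lam / min_lam, 1.0)    # 0.0 at lam = 0, 1.0 at lam <= min_lam
    return ("yellow", round(lightness, 2))

print(pixel_color(1.0))    # chaotic point
print(pixel_color(-0.3))   # dark yellow: slow approach to stability
print(pixel_color(-5.0))   # light yellow: strongly stable
```

With this rule, the example above works out as described: λ = -0.3 maps to a dark yellow pixel, while λ = 1 maps to the chaos color.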

# A More Mathematical Explanation

Note: understanding this explanation requires an understanding of Logistic Bifurcation.

### The Lyapunov Exponent


As a logistic system progresses through n iterations, the distance, dx0, between two arbitrarily close points entered into the system will become a new distance, dxn.
In both graphs, we see the evolution of ${\operatorname{d}x_n\over\operatorname{d}x_0}$ from n = 0 to n = 10. The upper graph shows this evolution for λ = -1, while the lower shows this for λ = 1. Notice how quickly we observe convergence in the upper graph and divergence in the lower.

The Lyapunov exponent is a measure of the rate of divergence of two infinitesimally close points in a dynamic system. For a single-variable system such as the logistic map, we can consider two points at an arbitrarily small distance, dx0, from each other. After n iterations of the system, they will be at distance dxn from each other. The Lyapunov exponent λ represents this change with the approximation:

Eq. 1        ${\operatorname{d}x_n\over\operatorname{d}x_0}\approx 2^{\lambda n}$

Or, isolating the Lyapunov exponent λ:

$\frac{1}{n}\log_2{\operatorname{d}x_n\over\operatorname{d}x_0}\approx \lambda$

Generalizing this over every step of iteration, we use summation notation to average the per-step rates of change over N iterations:

Eq. 2        $\frac{1}{N}\sum_{n=1}^N \log_2 {\operatorname{d}x_{(n+1)}\over\operatorname{d}x_n} \approx \lambda$

In order for this to be accurate, however, it must measure the system's divergence over an infinite period of time, so we define λ as the limit of Eq. 2 as N approaches infinity:

Eq. 3        $\lambda=\lim_{N \to \infty}\frac{1}{N}\sum_{n=1}^N \log_2 {\operatorname{d}x_{(n+1)}\over\operatorname{d}x_n}$

This is the discrete form of the Lyapunov exponent.

To consider the implications of λ, let us return to Eq. 1:

${\operatorname{d}x_n\over\operatorname{d}x_0}\approx 2^{\lambda n}$

We can see that, for λ < 0, the distance between the points shrinks toward zero as n grows. (Indeed, the lower the value of λ, the more quickly it shrinks.) For λ = 0, the distance stays essentially unchanged. For λ > 0, however, the distance between the points expands exponentially as n grows. In other words, a positive Lyapunov exponent indicates a system in which an infinitesimal change in initial conditions can result in massively different final conditions. A negative Lyapunov exponent, on the other hand, indicates a system in which the effects of such an initial change fade over time.
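This contrast can be checked numerically by running two nearby logistic trajectories side by side. The sketch below is ours, not from the article; the function name and parameter choices are illustrative:

```python
def track_separation(r, x, y, steps):
    """Iterate two logistic trajectories in parallel and record how far
    apart they drift, returning (maximum separation, final separation)."""
    max_sep = abs(x - y)
    for _ in range(steps):
        x = r * x * (1 - x)
        y = r * y * (1 - y)
        max_sep = max(max_sep, abs(x - y))
    return max_sep, abs(x - y)
```

At r = 4, where the Lyapunov exponent is positive, an initial gap of one billionth grows to order one within a few dozen iterations; at r = 2.5, where the exponent is negative, even a large initial gap collapses as both trajectories settle onto the fixed point 1 - 1/r = 0.6.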

Recall that the phenomenon captured by a positive Lyapunov exponent – wide variation resulting from infinitesimally small initial changes – is one of the conditions for a system to be chaotic. Thus a positive Lyapunov exponent, in the presence of the other conditions of chaos, implies a chaotic system.

#### In the Logistic Map

Recall that the logistic map is:

Eq. 4        $x_{(n+1)}=\mathbf{r}x_n(1-x_n)$

From which we find

${\operatorname{d}x_{(n+1)}\over\operatorname{d}x_n}=\mathbf{r}-2\mathbf{r}x_n$

Inserting the magnitude of this derivative into the Lyapunov exponent, Eq. 3 (distances are positive, so we take the absolute value), we have the Lyapunov exponent for the logistic map:

Eq. 5        $\lambda=\lim_{N \to \infty}\frac{1}{N}\sum_{n=1}^N \log_2 \left|\mathbf{r}-2\mathbf{r}x_n\right|$

As noted above, negative values of λ here indicate stability in the logistic map. Also, in the case of the logistic map, any system with a Lyapunov exponent greater than zero is a chaotic system. We can see this when we compare the logistic bifurcation diagram with a graph of the Lyapunov exponents for the logistic maps of changing r values:

Here a graph of the Lyapunov exponents for logistic maps with 0 < r < 4 is overlaid in red on the logistic bifurcation diagram. The points of the red graph were calculated directly in Matlab using Eq. 5.
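The values behind that red curve can be reproduced with a few lines of code. The original was computed in Matlab; below is a minimal Python equivalent of Eq. 5 with a finite sum in place of the limit. The function and parameter names are illustrative, and a guard is added for superstable points where the derivative vanishes:

```python
import math

def logistic_lyapunov(r, x0=0.4, transient=200, samples=20000):
    """Estimate the Lyapunov exponent of the logistic map via Eq. 5."""
    x = x0
    for _ in range(transient):              # discard start-up behavior
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(samples):
        d = abs(r - 2 * r * x)              # |dx_{n+1} / dx_n|
        total += math.log2(max(d, 1e-300))  # guard against log2(0)
        x = r * x * (1 - x)
    return total / samples
```

For r = 4 the map is fully chaotic and the estimate approaches the known value λ = 1 (in base-2 logarithms); for r = 3.2 the map settles into a stable 2-cycle and the estimate is negative, matching the bifurcation diagram.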

### Forcing the Rates of Change

Mathematically, the important part of Markus's contribution to understanding this type of system was not his method for generating fractals, but his use of periodic rate-of-change forcing. The value of r has a great impact on the output of the logistic map, but this value can have still greater impact if we choose not to keep it constant.

A Markus-Lyapunov fractal with rate-of-change pattern ab

In terms of the logistic map, this means we choose a set of rates of change, r1, r2, r3, ..., rp, where p is the period over which the rates of change loop. When we force the rates of change to follow such a loop, Eq. 4 becomes a new, periodically forced logistic map:

Eq. 6        $x_{(n+1)}=\mathbf{r}_{(n \bmod p)}x_n(1-x_n)$

where the subscript n mod p is read cyclically, with rp used when the remainder is 0.

It is in these forced alterations of the rate of change that the fascinating shapes of the Markus-Lyapunov fractal emerge. Each fractal is formed from some pattern of two rates of change, a and b. A pattern aba means each point of the fractal is colored according to the Lyapunov exponent (computed as in Eq. 5, but with the r values cycling as in Eq. 6) of the forced logistic map with r1 = a, r2 = b, and r3 = a. That is, the r values cycle a, b, a, a, b, a, a, b, a, ....

Because the axes used to map these fractals measure changes in a and b, the pattern a alone would simply yield a set of vertical bars, just as the pattern b alone would yield horizontal bars. Once the patterns mix, however, more interesting results appear. The image to the right shows an ab pattern. Note that it has far fewer spires and crossing arms than the other images on this page; the main image, for instance, uses a bbbbbbaaaaaa pattern.
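Putting Eq. 5 and Eq. 6 together gives the number behind each pixel. This is a hedged sketch rather than Markus's own code: the function name, the starting population, and the transient and sample counts are conventional choices assumed here. (A start of exactly 0.5 is customary for these plots, but we nudge it to 0.4 to avoid the degenerate orbit 0.5 → 1 → 0 at r = 4.)

```python
import math

def markus_lyapunov(a, b, pattern="ab", transient=100, samples=4000):
    """Lyapunov exponent of the logistic map whose rate of change is
    forced through `pattern`, e.g. "bbbbbbaaaaaa" for the main image."""
    rates = [a if ch == "a" else b for ch in pattern]
    p = len(rates)
    x = 0.4                                  # initial population (see note above)
    for n in range(transient):               # discard start-up behavior
        x = rates[n % p] * x * (1 - x)
    total = 0.0
    for n in range(samples):
        r = rates[(transient + n) % p]       # continue the forcing loop
        d = abs(r - 2 * r * x)               # |dx_{n+1} / dx_n|
        total += math.log2(max(d, 1e-300))   # guard against log2(0)
        x = r * x * (1 - x)
    return total / samples

# One pixel of an "ab" fractal at graph position (a, b) is colored
# by the sign and magnitude of markus_lyapunov(a, b, "ab").
```

As a sanity check, setting a = b makes the forcing invisible, so the result should agree with the plain logistic map: chaotic (λ near 1) at a = b = 4, stable (λ < 0) at a = b = 3.2.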

# Why It's Interesting

An enlargement of a section of "Zircon Zity," showing self-similarity.

### Fractal Properties

The movements from light to dark and the dramatic curves of the boundaries between stability and chaos here create an astonishing 3D effect. But the image is striking not only for its beauty but also for its self-similarity. Self-similarity is that trait that makes fractals what they are – zooming in on the image reveals smaller and smaller parts that resemble the whole. Consider the image to the right, showing an enlarged section of the main image above. Here we see several shapes that repeat in smaller and smaller iterations. Perhaps ironically, this type of pattern is a common property of chaos.

For more images of the fractal properties of chaotic systems, see the Blue Fern, the Henon Attractor, the Harter-Heighway Dragon Curve, and Julia Sets.

One artist superimposed and edited several real Markus-Lyapunov fractals to create this piece of art.

### Artistic Extensions

After Markus saw the incredible beauty and intriguing three-dimensionality of the images generated by his plotting system, he immediately sent the images to a gallery in the hopes that it would display his images in an exhibition.[1] It's easy to see why he did so, and in fact, pictures based on these fractals have become a large part of what is called "fractalist" art. As with all domains of fractalist art, there is a great deal of debate in the art community over whether these images are truly "art" given their intrinsic reliance on a purely scientific, algorithmically-generated chart. One could say that such a process is devoid of creativity, but it is equally valid to say that the identification and presentation of the beauty in the science is an art in itself – a concept that is critical in modern art. Either way, there has been an undeniable artistic fascination with Markus-Lyapunov fractals; if the image seems familiar, you have likely seen it on posters, t-shirts, or any other canvas for graphic design.

# References

1. Dewdney, A. K. (1991). Leaping into Lyapunov Space. Scientific American, 130-132.