Field:Dynamic Systems
Roughly speaking, dynamical systems theory is the study of the long-term behavior of evolving systems. A dynamical system can be thought of as a model consisting of an abstract state space (see below) and a set of equations that specify how the state of the system changes with time. These models can treat time as continuous, using differential equations, or as coming in discrete steps, using recurrence equations, also called difference equations.

History
Modern dynamical systems theory originated with questions about the stability and evolution of the solar system. A major stimulus to the field's development came in 1885, when King Oscar II of Sweden offered a prize to anyone who could mathematically determine whether the solar system would remain in its current arrangement or whether the planets would eventually spiral into the sun or whiz off into space. This problem turned out to be too complex for an exact mathematical solution, but the work done on it laid the groundwork for future examinations of dynamical systems. The work of Henri Poincaré was especially groundbreaking, and as a result he is considered the father of dynamical systems theory.^{[1]} Since then, dynamical systems theory has become a field with applications in mathematics, physics, engineering, biology, and economics, among others.^{[2]}
Major Concepts
Although dynamical systems theory is often used to describe actual systems, it can be hard to see the real-world significance of all the big words, symbols, and swirly pictures unless you know the lingo. Below are a couple of the major concepts you'll encounter when reading about dynamical systems.
State Space
Dynamical systems are represented in a state space, or phase space. The state space of a system is an abstract space whose dimensions correspond to the variables of the system. The state of the system at a given time is represented by a single point in state space specified by coordinates that are the values of all the system’s variables at that time. The evolution of a particular state of the system, represented by the path a point follows through state space, is called a trajectory or orbit.
For example, imagine a system of two particles. We can visualize the motion of the particles with two moving points (x_{1}, y_{1}, z_{1}) and (x_{2}, y_{2}, z_{2}) in three-dimensional space, or we can think of the system of particles as a single unit, a moving point (x_{1}, y_{1}, z_{1}, x_{2}, y_{2}, z_{2}) in an imaginary, six-dimensional state space.
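The bookkeeping above can be sketched in a few lines of Python; the particle coordinates are made-up illustrative values, not data from the article:

```python
# A hypothetical two-particle system: each particle has an (x, y, z)
# position, so the combined state is one point in a 6-dimensional
# state space.
particle_1 = (0.0, 1.0, 2.0)   # (x1, y1, z1)
particle_2 = (3.0, 4.0, 5.0)   # (x2, y2, z2)

# The state of the whole system is a single 6-dimensional point.
state = particle_1 + particle_2

print(len(state))   # 6 dimensions
print(state)        # (0.0, 1.0, 2.0, 3.0, 4.0, 5.0)
```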
Coordinates in state space are by no means limited to positions in physical space. They simply represent the variables of the system in question. In an economy, the quantities of interest, or variables, are things like number of products produced, price levels, interest rates, etc. Dynamical systems theory treats each of these quantities as a dimension in the system's state space, so that the overall state of the economy could be visualized by a single point and its development by the trajectory this point follows through state space.
Evolution Rule
How a dynamical system evolves depends on its evolution rule. A system's evolution rule provides a prediction of its future state or states based on its current state. We can think of any given trajectory as the collection of all future states resulting from applying the evolution rule to the initial state of the system.
Mathematically, evolution rules are equations or sets of equations. Roughly speaking, you plug the state space values of the initial state of the system into the evolution rule equation and it spits out a prediction of the state space values of the state, or states, to follow. If this prediction is a unique result, a single set of state space values, the evolution rule is said to be deterministic. If instead an initial state has multiple possible results, the evolution rule is said to be stochastic, or "random". Note that this is not the same as being chaotic.
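The deterministic/stochastic distinction can be made concrete with a minimal Python sketch. Both rules below are invented for illustration (a doubling rule and a random walk); they are not from the article:

```python
import random

# Deterministic rule: the same state always yields one unique successor.
def deterministic_step(x):
    return 2 * x                      # simple doubling rule

# Stochastic rule: the same state has multiple possible successors.
def stochastic_step(x, rng):
    return x + rng.choice([-1, 1])    # a one-step random walk

assert deterministic_step(3) == 6     # always the same result

rng = random.Random(0)
successors = {stochastic_step(0, rng) for _ in range(100)}
print(successors)                     # contains both -1 and 1
```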
Evolution rules are further differentiated by how they treat time. Some rules use continuous time, where a future state of the system can be calculated based on the initial state and the amount of time elapsed. Other rules use discrete time, where the state of the system is only evaluated after discrete intervals of time and each consecutive state is calculated from the previous state. The trajectory of a system with continuous time looks like a continuous curve, while the trajectory of a system with discrete time is a series of individual points spiraling through state space.
Which type of time is used depends on the system being modeled. For a system of a planet orbiting a star, you would probably use continuous time so that you can evaluate the state of the system at any time along the orbit. For a system of coin tosses, only the result of each toss matters, so you would normally use discrete time in order to "skip over" the states of the coin mid-toss. The same goes for many population models, where it is convenient to use discrete time with generations as your time intervals.^{[3]}
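A discrete-time population model of the kind just described can be sketched as follows; the growth rate r = 1.5 and the initial population are illustrative values, not from the article:

```python
# Discrete-time population sketch: the state is only evaluated once per
# generation, so the trajectory is a sequence of points, not a curve.
def next_generation(population, r=1.5):
    return r * population       # each generation is r times the last

trajectory = [100.0]            # initial population
for _ in range(3):              # advance three generations
    trajectory.append(next_generation(trajectory[-1]))

print(trajectory)   # [100.0, 150.0, 225.0, 337.5]
```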
Attractor
In the long run, many dynamical systems show some regularity in their behavior. In the state space model, such regularity is represented by an attractor. An attractor is the set of points in state space that the trajectory of the system migrates towards over time, if it does at all. An attractor can be a single fixed point, multiple points that the system alternates between, a loop or more complex curve, or even an infinite set of points.
Fixed point attractors represent systems that eventually settle down towards some final, stable state. Attractors shaped like closed curves represent periodic systems.
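A quick numerical illustration of a fixed-point attractor: iterating x → cos(x) from almost any starting point settles onto a single value (the so-called Dottie number, about 0.739). This example is chosen for illustration; it is not from the article:

```python
import math

# Repeatedly applying cos() pulls the trajectory toward one fixed point,
# a state x* where cos(x*) == x*: a fixed-point attractor.
x = 1.0
for _ in range(100):
    x = math.cos(x)

print(x)                      # ~0.7390851...
print(abs(math.cos(x) - x))   # tiny: x is (numerically) a fixed point
```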
Infinite-point attractors possessing non-integer, or fractal, dimension are known as strange attractors. They arise in systems with non-periodic, chaotic dynamics. The Lorenz Attractor and the Hénon Attractor are examples of strange attractors.
More Concepts
Maps
A deterministic evolution rule with discrete time is called a map. Mathematically, in a state space S, a map is represented as a function

f : S → S,

and the evolution of the system is defined by

x_{t+1} = f(x_{t}).

This is essentially the same as saying that a map f takes a point in the state space S at time t and turns it into another point in S at time t + 1.
A fairly simple example of a map is the Logistic map, which is used to model population dynamics.
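The logistic map x_{t+1} = r·x_{t}·(1 − x_{t}) can be iterated in a few lines of Python. The parameter value r = 2.5 and the starting point are illustrative choices; at this r the trajectory settles onto the fixed-point attractor x* = 1 − 1/r = 0.6:

```python
# The logistic map: a deterministic, discrete-time evolution rule
# on the interval [0, 1], often used to model population dynamics.
def logistic_map(x, r):
    return r * x * (1 - x)

x = 0.2                          # initial state (illustrative)
for _ in range(100):             # iterate the map 100 times
    x = logistic_map(x, r=2.5)

print(x)   # converges to the fixed point 0.6
```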
Flows
A deterministic evolution rule with continuous time is called a flow. The trajectory of a flow is given by the function

x(t) = φ^{t}(x(0)).

In other words, if you know the initial state x(0), the state at any time t can be found using the function of time represented by φ^{t}.
Flows are differentiable with respect to time everywhere, which means that there is a vector field associated with any given flow.
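As a minimal sketch, the differential equation dx/dt = −x has the closed-form flow φ^{t}(x(0)) = x(0)·e^{−t}. The example below (an illustrative choice, not from the article) checks the defining flow property: evolving for time s and then time t is the same as evolving for time s + t:

```python
import math

# Flow of the ODE dx/dt = -x. Unlike a map, t can be any real number,
# so the state is defined at every instant along the trajectory.
def phi(t, x0):
    return x0 * math.exp(-t)

x0 = 2.0
a = phi(0.7, phi(0.3, x0))   # evolve 0.3, then 0.7 more
b = phi(1.0, x0)             # evolve 1.0 in one step
print(abs(a - b))            # ~0: equal up to floating-point rounding

assert phi(0.0, x0) == x0    # time 0 leaves the state unchanged
```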
Poincaré Sections
A Poincaré section converts a continuous-time flow into a discrete-time map by recording only the successive points at which a trajectory crosses a chosen lower-dimensional surface in state space, making the long-term structure of the flow easier to analyze.
Examples of Dynamical Systems
To see all the pages related to dynamical systems, see the dynamical systems category.
References
 ↑ "Dynamical Systems Theory". Encyclopedia Brittanica Online Academic Edition. Encyclopedia Brittanica, Inc. 2012. Web. 16 Jul. 2012. http://www.britannica.com/EBchecked/topic/175173/dynamicalsystemstheory
 ↑ Brin, M., & Stuck, G. (2002). Introduction to dynamical systems. Cambridge; New York: Cambridge University Press.
 ↑ James Meiss (2007). Dynamical systems. Scholarpedia, 2(2):1629.