A dynamical system is any system that evolves over time according to a pre-determined rule, and the goal of dynamical systems theory is to understand this evolution. For example: fix your favourite function f from the unit interval to itself (for example cos(x)); now choose some point x(0) in the interval and define x(1)=f(x(0)), x(2)=f(f(x(0))), and so on, so that x(n) is the result of applying the function f to the point x(0) n times. How does the sequence of points x(n) behave as n tends to infinity? How does this behaviour change if we choose a different initial point x(0)? What if we investigate a system which evolves continuously over time rather than in discrete steps? Dynamical systems theory seeks to answer such questions. The more interesting systems are the 'chaotic' ones, in which even a tiny change to the initial point x(0) leads to very different behaviour of the sequence x(n).
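
As a minimal sketch of this kind of iteration (assuming Python, and using the cos(x) example mentioned above; the starting points and the number of steps are purely illustrative), one might write:

    import math

    def iterate(f, x0, n):
        # Return the orbit x(0), x(1), ..., x(n) obtained by applying f repeatedly.
        orbit = [x0]
        for _ in range(n):
            orbit.append(f(orbit[-1]))
        return orbit

    # Iterate f(x) = cos(x) starting from two nearby points in the unit interval.
    orbit_a = iterate(math.cos, 0.3, 50)
    orbit_b = iterate(math.cos, 0.3001, 50)

    # For cos(x) both orbits settle down to the same fixed point (roughly 0.739),
    # so this particular map is not chaotic; for a chaotic map the two orbits
    # would diverge despite the nearly identical initial points.
    print(orbit_a[-1], orbit_b[-1])
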
