Lecture 2

This is the summary of Lecture 2, 02/11/2011.

Fundamental Solutions
Our basic problem is to solve a system of differential equations that can be written as

$$\begin{cases} \dot{X}=AX, A \in M(N\times N)\\ X(0)=X_{0} \end{cases} $$

By using the Superposition Principle, we aim to find a simple way to get a solution from the given initial state $$X_{0}\,$$.

If we have

$$Y_{0}=\lambda X_{0} + \mu X_{0}'\,$$, then, by linearity, $$\Phi (t,Y_{0})=\lambda \Phi (t, X_{0}) + \mu \Phi (t, X_{0}')\,$$

So we can establish a correspondence:

$$X_{0}\leftrightarrow \Phi (t, X_{0})\,$$ $$X_{0}'\leftrightarrow \Phi (t, X_{0}')\,$$

So we are mapping vectors in $$\mathbb{R}^{N}$$ to the vector space $$\mathcal{S}$$ of solutions via a one-to-one, linear operator $$\mathcal{L}$$:

$$\mathbb{R}^{N}\xrightarrow{\mathcal{L}}\mathcal{S}$$ $$X\xrightarrow{\mathcal{L}}\Phi(\bullet,X)$$

Therefore, $$\mathcal{L}$$ is an isomorphism, and the image of a basis for $$\mathbb{R}^{N}$$ is a basis for $$\mathcal{S}$$. Of course, an obvious basis for $$\mathbb{R}^{N}$$ is the set of unit vectors $$e_{1}, ..., e_{N}\,$$ such that

$$e_{1}=\begin{bmatrix}1\\ 0\\ \vdots \\ 0 \end{bmatrix}, e_{N}=\begin{bmatrix}0\\ \vdots \\ 0 \\ 1 \end{bmatrix}$$

and, thus, its image under $$\mathcal{L}\,$$ is a basis for $$\mathcal{S}$$. So we can solve our initial system for $$X_{0}=e_{1},...,e_{N}\,$$, obtaining solutions $$X_{1},...,X_{N}\,$$. Finally, writing a matrix $$M\,$$ with the vectors $$X_{1},...,X_{N}\,$$ as columns, we see that it obeys

$$\begin{cases} \frac{dM}{dt}=AM\\ M(0)=Id \end{cases} $$

The matrix $$M(t)\,$$ is called the fundamental solution. This way, we can calculate:

$$\Phi(t,X_{0})=M(t)X_{0}\,$$
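As a numerical sketch (the matrix A below is a hypothetical example, not from the lecture): for a constant-coefficient system the fundamental solution is $$M(t)=e^{tA}\,$$, whose columns are the solutions starting from $$e_{1},...,e_{N}\,$$, and $$\Phi(t,X_{0})=M(t)X_{0}\,$$.

```python
import numpy as np

def expm(M, terms=30):
    """Matrix exponential via a truncated Taylor series (adequate for small ||M||)."""
    result = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k          # next Taylor term M^k / k!
        result = result + term
    return result

# hypothetical example: harmonic oscillator x' = y, y' = -x
A = np.array([[0.0, 1.0], [-1.0, 0.0]])
X0 = np.array([1.0, 0.0])            # the unit vector e1
t = 0.5
M_t = expm(A * t)                    # fundamental solution M(t), with M(0) = Id
X_t = M_t @ X0                       # Phi(t, X0) = M(t) X0
# for this A the exact flow from e1 is (cos t, -sin t)
print(np.allclose(X_t, [np.cos(t), -np.sin(t)]))
```

Each column of $$M(t)\,$$ here is the solution with initial condition $$e_{1}\,$$ or $$e_{2}\,$$, matching the construction above.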

Periodic Linear Systems
In this case, the system to be solved changes slightly:

$$\begin{cases} \dot{X}=A(t)X\\ X(0)=X_{0} \end{cases} $$, where $$A(t)\,$$ is T-periodic, i.e. $$A(t+T)=A(t)\,$$

In this case, the analysis of the system becomes more complicated. For instance, restricting the eigenvalues $$\lambda_{1}(t),...,\lambda_{N}(t)\,$$ to have strictly negative real parts for all $$t\,$$ no longer ensures stability. Therefore, we need stronger tools to cope with this (such as Floquet theory, for example).
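A classic illustration of this failure (often attributed to Markus and Yamabe; this concrete matrix is not from the lecture): a $$\pi$$-periodic system whose eigenvalues have strictly negative real parts for every $$t\,$$, yet which has unbounded solutions. A rough RK4 check:

```python
import numpy as np

def A(t):
    """pi-periodic coefficient matrix; its eigenvalues are (-1 +/- i*sqrt(7))/4
    for every t, so both real parts are -1/4 < 0."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[-1 + 1.5 * c * c,  1 - 1.5 * s * c],
                     [-1 - 1.5 * s * c, -1 + 1.5 * s * s]])

# eigenvalue real parts are strictly negative at every sampled t ...
assert all(np.linalg.eigvals(A(t)).real.max() < 0 for t in np.linspace(0, np.pi, 50))

# ... yet the solution from X0 = (-1, 0) grows like e^{t/2} (RK4 integration)
X, t, h = np.array([-1.0, 0.0]), 0.0, 1e-3
while t < 10.0:
    k1 = A(t) @ X
    k2 = A(t + h / 2) @ (X + h / 2 * k1)
    k3 = A(t + h / 2) @ (X + h / 2 * k2)
    k4 = A(t + h) @ (X + h * k3)
    X = X + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    t += h
print(np.linalg.norm(X) > 50)        # the norm has grown far beyond ||X0|| = 1
```

The exact solution here is $$e^{t/2}(-\cos t, \sin t)\,$$, so the norm at $$t=10\,$$ is about $$e^{5}\approx 148\,$$ despite the "stable-looking" frozen-time eigenvalues.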

Perturbations in a linear system
We can postulate two kinds of perturbations to our linear system (represented below by the systems for Y and Z):

$$\begin{matrix}\begin{cases} \dot{X}=AX\\ X(0)=X_{0} \end{cases} & \begin{cases} \dot{Y}=(A+B)Y\\ Y(0)=Y_{0} \end{cases} & \begin{cases} \dot{Z}=AZ+F(Z)\\ Z(0)=Z_{0} \end{cases}\end{matrix} $$

As we have already seen, if a system is not on the border between stability regions, a sufficiently small perturbation will not change its stability. Moreover, it is possible to show that, for example, for a 2-dimensional system,

$$\exists h: \mathbb{R}^{2}\rightarrow \mathbb{R}^{2}\,$$, with $$h\,$$ a homeomorphism, such that

$$h(\phi_{t}(h^{-1}(X)))=\psi_{t}(X)\,$$

where $$\phi_{t}\,$$ and $$\psi_{t}\,$$ denote the flows of the unperturbed and perturbed systems, respectively.
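As a hypothetical one-dimensional analogue (not from the lecture): the flows of $$\dot{x}=-x\,$$ and $$\dot{x}=-2x\,$$ are conjugate via the homeomorphism $$h(x)=\mathrm{sign}(x)\,x^{2}\,$$, which can be checked numerically:

```python
import numpy as np

phi = lambda t, x: np.exp(-t) * x        # flow of x' = -x
psi = lambda t, x: np.exp(-2 * t) * x    # flow of x' = -2x
h = lambda x: np.sign(x) * x ** 2        # homeomorphism (not a diffeomorphism at 0)
h_inv = lambda x: np.sign(x) * np.sqrt(np.abs(x))

x, t = 0.7, 1.3
print(np.isclose(h(phi(t, h_inv(x))), psi(t, x)))  # conjugacy identity holds
```

Note that $$h\,$$ is only a homeomorphism, not a diffeomorphism: topological conjugacy is the right notion here, since a smooth conjugacy would force the two systems to have the same eigenvalues.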

Unfolding
Here, we consider the case below where the perturbation is represented by a matrix to be added to A:

$$\begin{cases} \dot{Y}=(A+B)Y\\ Y(0)=Y_{0} \end{cases}$$ $$C=A+B\,$$

In this case, C will take one out of a finite set of shapes and will be a deformation (or unfolding) of A. This unfolding can be versal, miniversal or universal. For a 2x2 matrix, for example, a generic perturbation B has 4 parameters. However, there exist an invertible matrix $$h\,$$ and parameters $$\alpha, \beta, \gamma\,$$ such that $$A+B\,$$ is similar to a normal form $$C(\alpha, \beta, \gamma)\,$$ with at most 3 parameters.

Note that not necessarily all 3 parameters need to be used. An unfolding can have, in this case, two or even only one parameter.
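For instance (following Arnold's theory of versal deformations; this particular normal form is an illustration, not taken from the lecture), the nilpotent Jordan block $$A=\begin{bmatrix}0 & 1\\ 0 & 0 \end{bmatrix}$$ admits the two-parameter miniversal deformation $$C(\alpha, \beta)=\begin{bmatrix}0 & 1\\ \alpha & \beta \end{bmatrix}\,$$: every matrix sufficiently close to $$A\,$$ is similar to $$C(\alpha, \beta)\,$$ for suitable small $$\alpha, \beta\,$$, even though a generic perturbation B has 4 free entries.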

Linearization
Here, we will deal with the other kind of perturbation, in which the system is no longer linear:

$$\begin{cases} \dot{X}=AX + f(X)\\ X(0)=0 \end{cases}$$ $$f(0)=0,Df(0)=0\,$$

Here, without proof, we present a theorem: if $$Re(\lambda)<0\,$$ for every eigenvalue $$\lambda\,$$ of A, then 0 is a locally asymptotically stable equilibrium of the equation. If $$\exists \lambda : Re(\lambda)>0\,$$, then 0 is unstable.
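A numerical sketch of the first case (the concrete system below is a hypothetical example, not from the lecture): take $$A\,$$ with eigenvalues $$-1, -2\,$$ and a higher-order term $$f\,$$ with $$f(0)=0, Df(0)=0\,$$; a trajectory starting near 0 decays to 0.

```python
import numpy as np

def rhs(X):
    """X' = AX + f(X) with A = diag(-1, -2) and f(x, y) = (y^2, x*y),
    so f(0) = 0 and Df(0) = 0 as the theorem requires."""
    x, y = X
    return np.array([-x + y ** 2, -2 * y + x * y])

X, h = np.array([0.3, -0.2]), 1e-3   # start near the equilibrium 0
for _ in range(20000):               # explicit Euler steps out to t = 20
    X = X + h * rhs(X)
print(np.linalg.norm(X) < 1e-6)      # trajectory has decayed toward 0
```

Near 0 the quadratic terms are dominated by the linear part, which is what makes the local conclusion possible; the theorem says nothing about initial conditions far from the equilibrium.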

Local Stability (Lyapunov Sense)
We consider a point $$X^{*}\,$$. The point is said to be Lyapunov stable if, for every neighborhood $$\mathcal{U}\,$$ of $$X^{*}\,$$, there exists a neighborhood $$\mathcal{W} \subseteq \mathcal{U}\,$$ of $$X^{*}\,$$ such that

$$\phi_{t}(X) \in \mathcal{U} \quad \forall X \in \mathcal{W},\; \forall t \ge 0$$

If, moreover, $$\lim_{t\rightarrow \infty}\phi_{t}(X) = X^{*}\,$$ for all $$X \in \mathcal{W}\,$$,

the point is said to be asymptotically stable.

Hartman-Grobman theorem
$$\begin{cases} \dot{X}=f(X) & (\xi)\\ X(0)=0 \end{cases}$$ $$f(0)=0,Df(0)=A\,$$

We establish the extra condition that the system should be far from the borders of stability regions, that is, the real parts of the eigenvalues of A should all be nonzero (the equilibrium is hyperbolic).

The Hartman-Grobman theorem shows that, locally around 0, the flow of the system $$(\xi)\,$$ is similar (i.e. there is a homeomorphism conjugating the two flows) to that of its linearized version, $$\dot{Y}=AY\,$$. The proof will not be presented here; it can be found in chapter 5 of the book "Systèmes Dynamiques - une introduction" by Charles-Michel Marle.

Numerical methods for solving ODEs
This subject is discussed extensively in the article dedicated to Numerical Methods.