Numerical Methods

Numerical Resolution of ODEs
Here we will be dealing with single (scalar) ODEs rather than systems. Therefore, our problem is of the form

$$\begin{cases} \dot{x}=f(x)\\ x(0)=x_{0} \end{cases}$$

As we are dealing with numerical solutions, what we obtain will in fact be an approximation $$\tilde{x}\,$$.

Taylor Algorithm
$$\begin{cases}\tilde{x}(t+\Delta t)=\tilde{x}(t)+ \Delta t f(\tilde{x}(t))\\ \tilde{x}(0)=x_{0}\end{cases}\,$$

This is a fixed-timestep iterative method. It calculates the result in an explicit fashion (this first-order scheme is also known as the explicit Euler method). The global error is of $$O(\Delta t)\,$$.
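As a minimal sketch of this first-order scheme (the test ODE $$\dot{x}=-x\,$$, the step size, and the function names are illustrative choices, not taken from the text):

```python
import math

# One explicit first-order (Euler) update per loop iteration:
# x(t + dt) = x(t) + dt * f(x(t))
def euler(f, x0, dt, steps):
    x = x0
    for _ in range(steps):
        x = x + dt * f(x)
    return x

# Example: dx/dt = -x with x(0) = 1; the exact solution is x(t) = exp(-t)
approx = euler(lambda x: -x, 1.0, 0.001, 1000)   # integrate up to t = 1
```

Halving $$\Delta t\,$$ roughly halves the error, consistent with $$O(\Delta t)\,$$.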

The Taylor algorithm can be extended to higher orders. For example, for order 2, we have

$$\frac{d^{2}x}{dt^{2}}=\frac{d}{dt}f(x)=f'(x)\frac{dx}{dt}=f'(x)f(x)\,$$

$$\tilde{x}((k+1)\Delta t)=\tilde{x}(k\Delta t)+\Delta t f(\tilde{x}(k\Delta t)) + \frac{\Delta t^{2}}{2}f'(\tilde{x}(k \Delta t)) f(\tilde{x}(k \Delta t))\,$$

For the 2nd-order Taylor algorithm, the error is of $$O(\Delta t^{2})\,$$.
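A sketch of the order-2 update above (function names and the test ODE are illustrative choices; note that the method needs $$f'\,$$ as well as $$f\,$$):

```python
import math

# Second-order Taylor step:
# x_{k+1} = x_k + dt * f(x_k) + dt^2/2 * f'(x_k) * f(x_k)
def taylor2(f, fprime, x0, dt, steps):
    x = x0
    for _ in range(steps):
        fx = f(x)
        x = x + dt * fx + 0.5 * dt ** 2 * fprime(x) * fx
    return x

# Example: dx/dt = -x, so f(x) = -x and f'(x) = -1
approx = taylor2(lambda x: -x, lambda x: -1.0, 1.0, 0.01, 100)  # integrate to t = 1
```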

Runge-Kutta Algorithms
This algorithm is similar to the previous one, but instead of estimating the slope over the interval between $$t\,$$ and $$t + \Delta t\,$$ by the slope at the beginning of the interval, it also evaluates the slope at intermediate points. For example, a 4-stage Runge-Kutta method (RK4) uses a total of 4 slope evaluations in the interval. It is still explicit and fixed-timestep, but it is more precise than the Taylor algorithm.
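A sketch of the classical 4-stage scheme (the stage coefficients and weights below are the standard RK4 ones; the test problem is an illustrative choice):

```python
import math

# Classical 4-stage Runge-Kutta: four slope evaluations per step,
# combined with weights 1/6, 2/6, 2/6, 1/6. Global error is O(dt^4).
def rk4(f, x0, dt, steps):
    x = x0
    for _ in range(steps):
        k1 = f(x)
        k2 = f(x + 0.5 * dt * k1)
        k3 = f(x + 0.5 * dt * k2)
        k4 = f(x + dt * k3)
        x = x + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
    return x

# Even with a coarse dt = 0.1, RK4 is far more accurate than the first-order method
approx = rk4(lambda x: -x, 1.0, 0.1, 10)   # integrate to t = 1
```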

Multistep Algorithm
In this case, the value of $$\tilde{x}(t + \Delta t)\,$$ will be calculated as a linear combination of the $$N\,$$ previous values of the solution (and of $$f\,$$ evaluated at them).
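As one concrete member of this family, here is a sketch of the two-step Adams-Bashforth method (the choice of this particular method, and the bootstrap of the second point with a single first-order step, are illustrative assumptions):

```python
import math

# Two-step Adams-Bashforth:
# x_{k+1} = x_k + dt * (3/2 * f(x_k) - 1/2 * f(x_{k-1}))
def adams_bashforth2(f, x0, dt, steps):
    x_prev = x0
    x = x0 + dt * f(x0)            # bootstrap the second point with one Euler step
    for _ in range(steps - 1):
        x, x_prev = x + dt * (1.5 * f(x) - 0.5 * f(x_prev)), x
    return x

approx = adams_bashforth2(lambda x: -x, 1.0, 0.01, 100)   # integrate to t = 1
```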

Implicit Methods
$$ \tilde{x}(t+\Delta t)=\tilde{x}(t)+ \Delta t f(\tilde{x}(t+\Delta t))\,$$

In this case, the unknown $$\tilde{x}(t+\Delta t)\,$$ appears on both sides, so at each step we must solve a nonlinear equation, typically by Newton-Raphson. The timestep and the order of the error are the same, but the constant multiplying $$\Delta t\,$$ is improved, so the result is more precise (implicit methods are also more stable, which matters for stiff problems).
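A sketch of this implicit (backward Euler) update, solving for $$\tilde{x}(t+\Delta t)\,$$ with Newton-Raphson at each step (the tolerance, iteration cap, and test ODE are illustrative choices):

```python
import math

# Backward Euler: at each step, solve g(y) = y - x - dt * f(y) = 0
# for y = x(t + dt) using Newton-Raphson (needs f' for the derivative of g).
def implicit_euler(f, fprime, x0, dt, steps, newton_iters=20, tol=1e-12):
    x = x0
    for _ in range(steps):
        y = x                        # initial Newton guess: the current value
        for _ in range(newton_iters):
            g = y - x - dt * f(y)
            dg = 1.0 - dt * fprime(y)
            delta = g / dg
            y -= delta
            if abs(delta) < tol:
                break
        x = y
    return x

approx = implicit_euler(lambda x: -x, lambda x: -1.0, 1.0, 0.001, 1000)  # t = 1
```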

All three of the methods presented above can also be formulated implicitly.

Semi-implicit Methods
For these, we set up the Newton-Raphson problem exactly as in the implicit case, but keep only the first iteration of the solving algorithm as the result.
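A sketch under the same assumptions as the implicit example: a single Newton-Raphson iteration started from $$\tilde{x}(t)\,$$, which gives a closed-form update:

```python
import math

# Semi-implicit step: one Newton-Raphson iteration on
# g(y) = y - x - dt * f(y) = 0, starting from y = x. That single iteration
# yields x(t + dt) = x + dt * f(x) / (1 - dt * f'(x)).
def semi_implicit_euler(f, fprime, x0, dt, steps):
    x = x0
    for _ in range(steps):
        x = x + dt * f(x) / (1.0 - dt * fprime(x))
    return x

approx = semi_implicit_euler(lambda x: -x, lambda x: -1.0, 1.0, 0.001, 1000)  # t = 1
```

Note that for a linear $$f\,$$ one Newton iteration is already exact, so on this test problem the result coincides with the fully implicit method.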

Adaptive Timestep Methods
In this case, the algorithm adjusts the value of the timestep according to precision parameters established by the user. It is certainly an improvement, but at the cost of added complexity.
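A minimal sketch of one such strategy, step doubling with the explicit first-order method (the tolerance, the growth/shrink factors, and the test ODE are illustrative choices, not a standard prescription):

```python
import math

# Step doubling: compare one full step with two half steps; shrink dt when
# the estimated local error exceeds the tolerance, grow it when well below.
def adaptive_euler(f, x0, t_end, dt=0.1, tol=1e-5):
    x, t = x0, 0.0
    while t < t_end:
        dt = min(dt, t_end - t)          # do not overshoot the final time
        full = x + dt * f(x)             # one step of size dt
        half = x + 0.5 * dt * f(x)       # two steps of size dt/2
        fine = half + 0.5 * dt * f(half)
        err = abs(fine - full)           # local error estimate
        if err > tol:
            dt *= 0.5                    # reject the step, retry with smaller dt
            continue
        x, t = fine, t + dt              # accept the (more accurate) fine result
        if err < tol / 4.0:
            dt *= 2.0                    # error comfortably small: enlarge dt
    return x

approx = adaptive_euler(lambda x: -x, 1.0, 1.0)   # integrate to t = 1
```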

Adaptive Timestep and Order
These are complex, robust algorithms that typically define a cost function in order to optimize both the timestep and the order used to calculate the solution. Using a higher order means more computation per step, but since the error is then of a higher order, larger timesteps can be used, so fewer steps are needed.