Introduction to Numerical Differentiation

Welcome! In your standard Maths course, you’ve likely spent a lot of time finding the derivative (the gradient) of functions like \(x^2\) or \(\sin(x)\) using algebra. But what happens if the function is so messy that you can't differentiate it normally? Or what if you only have a table of data points instead of a formula?

This is where Numerical Differentiation comes to the rescue! It allows us to estimate the gradient of a curve at a specific point using just the coordinates. It’s like estimating the steepness of a mountain by looking at the ground a few centimetres in front of and behind you, rather than having a perfect 3D map of the whole range.

Quick Review: Remember that a derivative \(f'(x)\) represents the gradient of the tangent at a point. In numerical methods, we use a small step, called \(h\), to help us find this gradient.

1. The Forward Difference Method

The Forward Difference Method is the simplest way to estimate a gradient. It’s based on the very first definition of differentiation you likely saw in Year 12 (differentiation from first principles).

Imagine you are standing at a point \(x\). To find the gradient, you take a tiny step forward to \(x + h\), find the change in "height" (the y-values), and divide by the step size.

The Formula

\(f'(x) \approx \frac{f(x + h) - f(x)}{h}\)

Step-by-Step Guide:

1. Identify the point \(x\) where you want the gradient.
2. Choose a small value for \(h\) (the smaller, the better, usually!).
3. Calculate \(f(x)\) and \(f(x + h)\).
4. Plug these into the formula.
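The four steps above can be turned into a few lines of Python. This is a minimal sketch: the function \(f(x) = x^2\) and the point \(x = 3\) are purely illustrative choices (the true gradient there is \(2 \times 3 = 6\)).

```python
def forward_difference(f, x, h):
    """Estimate f'(x) with one point ahead: (f(x + h) - f(x)) / h."""
    return (f(x + h) - f(x)) / h

# Illustrative example: f(x) = x^2 at x = 3, where the true gradient is 6.
estimate = forward_difference(lambda x: x**2, 3.0, 0.01)
print(estimate)  # ≈ 6.01 -- slightly high, as expected for a one-sided estimate
```

Notice the estimate is about \(6.01\), not \(6\): for \(f(x) = x^2\) the forward difference works out algebraically to exactly \(2x + h\), so the step size \(h\) shows up directly as the error.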

Analogy: Imagine you are walking up a hill. To guess how steep it is exactly where you are standing, you look at the ground exactly one step in front of you and see how much higher it is than where your feet are right now.

Don't worry if this seems a bit rough! Because we are only looking "forward," this method often over-estimates or under-estimates the true gradient if the curve is bending sharply.

Key Takeaway: The Forward Difference Method uses the point itself and one point ahead. It is easy to use but not the most accurate.

2. The Central Difference Method

If the Forward Difference is like looking one step ahead, the Central Difference Method is like looking one step ahead and one step behind. This usually gives a much better "balance" and a more accurate result.

The Formula

\(f'(x) \approx \frac{f(x + h) - f(x - h)}{2h}\)

Watch out! A common mistake is to divide by \(h\). Because you have moved a distance of \(h\) forward and \(h\) backward, the total horizontal distance between your two points is \(2h\).

Why is it better?

By using points on either side of \(x\), the "errors" from the curve bending often cancel each other out. Graphically, the line connecting \(f(x-h)\) and \(f(x+h)\) is almost always closer to the actual tangent line than the line used in the Forward Difference method.
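In Python, the only changes from a forward-difference version are the second sample point and the \(2h\) denominator (the test case \(x^2\) at \(x = 3\) is again just an illustrative choice):

```python
def central_difference(f, x, h):
    """Estimate f'(x) with points either side: (f(x+h) - f(x-h)) / (2h)."""
    return (f(x + h) - f(x - h)) / (2 * h)  # note: divide by 2*h, not h!

# Illustrative example: f(x) = x^2 at x = 3, where the true gradient is 6.
estimate = central_difference(lambda x: x**2, 3.0, 0.01)
print(estimate)  # ≈ 6.0 -- the errors on either side cancel for a quadratic
```

For a quadratic the cancellation is perfect: the \(+h\) and \(-h\) "bend" terms are equal and opposite, so the answer is \(2x\) exactly (up to rounding).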

Memory Aid: "Central" means the point we care about is in the center of our two calculation points. "One step forward, one step back, divide by the double-track (\(2h\))."

Key Takeaway: The Central Difference Method uses points on both sides of \(x\). It is generally much more accurate than the Forward Difference method for the same value of \(h\).

3. Accuracy and the "Order" of Methods

In Numerical Methods, we talk about how fast an error disappears as we make \(h\) smaller. This is called the Order of the Method.

First Order vs. Second Order

1. Forward Difference is a First Order Method, written as \(O(h)\).
What this means: If you halve the step size \(h\), the error in your gradient estimate also roughly halves.

2. Central Difference is a Second Order Method, written as \(O(h^2)\).
What this means: If you halve the step size \(h\), the error roughly quarters (\(0.5^2 = 0.25\)). This is why it gets accurate so much faster!
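You can verify both claims with a short experiment. Here \(\sin x\) at \(x = 1\) is an arbitrary test function (its true gradient is \(\cos 1\)); halving \(h\) from \(0.1\) to \(0.05\) should roughly halve one error and quarter the other:

```python
import math

def forward(f, x, h):
    return (f(x + h) - f(x)) / h

def central(f, x, h):
    return (f(x + h) - f(x - h)) / (2 * h)

x, true = 1.0, math.cos(1.0)  # d/dx sin(x) = cos(x)
for h in (0.1, 0.05):
    err_f = abs(forward(math.sin, x, h) - true)
    err_c = abs(central(math.sin, x, h) - true)
    print(f"h = {h}: forward error {err_f:.2e}, central error {err_c:.2e}")
# Halving h roughly halves the forward error (first order, O(h))
# but quarters the central error (second order, O(h^2)).
```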

Did you know? Even though smaller \(h\) values usually mean more accuracy, if you make \(h\) too small (like \(0.000000000001\)), computers and calculators start making "rounding errors": subtracting two almost identical function values wipes out most of the meaningful digits. There is a "sweet spot" for \(h\)!
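A quick experiment shows this sweet spot in action. Using \(\sin x\) at \(x = 1\) as an arbitrary test function (true gradient \(\cos 1\)), the forward-difference error shrinks as \(h\) falls from \(10^{-2}\) to \(10^{-6}\), then grows again at \(10^{-12}\):

```python
import math

true = math.cos(1.0)  # exact gradient of sin(x) at x = 1
for h in (1e-2, 1e-6, 1e-12):
    estimate = (math.sin(1.0 + h) - math.sin(1.0)) / h  # forward difference
    print(f"h = {h:.0e}: error = {abs(estimate - true):.2e}")
# At h = 1e-12 the two sine values are so close together that subtracting
# them destroys most of the significant figures -- the error gets WORSE.
```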

Quick Review Box:
- Forward Difference: \(O(h)\) - Error halves when \(h\) halves.
- Central Difference: \(O(h^2)\) - Error quarters when \(h\) halves.

4. Using a Sequence of \(h\) values

In your exam, you might be asked to calculate the gradient for a sequence of decreasing \(h\) values (e.g., \(h = 0.1\), then \(h = 0.05\), then \(h = 0.025\)).

By comparing how the answers change, you can judge how accurate your estimate is. If the answers for \(h = 0.05\) and \(h = 0.025\) agree to 4 decimal places, you can be fairly confident that your answer is accurate to that level.
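Here is how that halving sequence might look in Python, using \(e^x\) at \(x = 1\) as an illustrative function (true gradient \(e \approx 2.71828\)):

```python
import math

def central(f, x, h):
    return (f(x + h) - f(x - h)) / (2 * h)

# Central-difference estimates for a halving sequence of h values.
for h in (0.1, 0.05, 0.025):
    print(f"h = {h:<6} estimate = {central(math.exp, 1.0, h):.6f}")
# Each halving of h cuts the error by roughly a factor of 4; once two
# successive estimates agree to a given number of decimal places, you can
# quote the gradient to about that accuracy.
```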

Common Mistakes to Avoid:

- Denominator Drama: Using \(h\) instead of \(2h\) for the Central Difference.
- Rounding too early: If you round your \(f(x+h)\) values to 2 decimal places in the middle of the sum, your final answer will be junk! Keep as many digits as possible until the very end.
- Negative Signs: Be extra careful when calculating \(f(x-h)\) if \(x\) is already negative or if the function involves subtracting terms.

Summary: Numerical differentiation is about estimating gradients using small steps. The Forward Difference is a simple first-order method, while the Central Difference is a superior second-order method that provides better accuracy by looking at points on both sides of our target.