A brief introduction to Taylor series

Today marks the day I finally do it: the day I explain the most powerful and wonderful mathematical formula born from calculus, and what I personally feel is one of the most beautiful ideas the human mind has ever conceived. This beautiful idea is known as the Taylor series. The Taylor series is a method of turning non-polynomial functions into polynomial functions. When I say functions, I mean all of the nice kinds: trigonometric, exponential, logarithmic. Now, this all probably sounds too good to be true, and you may scratch your head in disbelief and ask yourself, “How could it be possible to do such a thing?” And I say to that: sometimes dreams can come true.

0. A deep relation between derivatives and coefficients of a polynomial

Consider a general polynomial p(x) of degree ‘m’, in the form

p(x) = a_0 + a_1 x + a_2 x² + a_3 x³ + … + a_m x^m

Now, the question I propose is: how do the coefficients of a polynomial relate to its derivatives? To start, we evaluate our polynomial at x = 0. Then, all terms carrying any nonzero power of ‘x’ die off.

p(0) = a_0

Suppose we take the derivative of both sides of our original polynomial,

p'(x) = a_1 + 2 a_2 x + 3 a_3 x² + … + m a_m x^(m-1)

If we again put x = 0, we get a result similar to what happened when we put x = 0 in the original polynomial,

p'(0) = a_1

And we do that again to find the third coefficient (a_2),

p''(x) = 2 a_2 + 6 a_3 x + H.O.T.
(H.O.T. means higher-order terms)

but this time, it is slightly different because we have a two downstairs,

p''(0) = 2 a_2, and so a_2 = p''(0) / 2

Right, but how would we get the coefficient of the ‘kth’ term? To motivate it, I introduce another induction-based result.

Let us consider,

g(x) = x^k

Now, the first derivative of this is

g'(x) = k x^(k-1)

And the second derivative of our original expression is

g''(x) = k (k-1) x^(k-2)

With some inductive thinking, we can say that the ‘jth’ derivative would be as follows:

g^(j)(x) = k (k-1) (k-2) … (k-j+1) x^(k-j)

(Note: the exponent in parentheses on the left means the jth derivative, not g raised to the jth power.)

So, for the kth coefficient we need the kth derivative; for that, just put j = k, and voilà, we get

g^(k)(x) = k (k-1) (k-2) … (2)(1) = k!
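If you'd like to check this claim on a computer, here's a minimal sketch (assuming sympy is available; the exponent k = 5 is just an arbitrary example):

```python
import sympy as sp

x = sp.symbols('x')
k = 5                       # an arbitrary example exponent
g = x**k

print(sp.diff(g, x, 3))     # jth derivative with j = 3: 60*x**2, i.e. (5*4*3) * x**2
print(sp.diff(g, x, k))     # kth derivative: the constant 120
print(sp.factorial(k))      # k! = 120, matching the line above
```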

Right, so bringing back our original polynomial p(x),

p(x) = a_0 + a_1 x + a_2 x² + … + a_k x^k + … + a_m x^m

Taking the kth derivative of both sides,

p^(k)(x) = k! a_k + H.O.T. (terms that still contain x)

The thing to notice here is that the terms with exponent greater than ‘k’ still have an ‘x’ on them, so when we evaluate the kth derivative at x = 0, all of those terms vanish. Hence we get a formula relating the kth coefficient to the kth derivative of the polynomial evaluated at zero:

a_k = p^(k)(0) / k!

Fantastic. So we can see that each coefficient of the polynomial is the corresponding derivative of the polynomial evaluated at zero, divided by the factorial of its index. Now let’s rewrite the polynomial using the general formula we got for its coefficients:

p(x) = p(0) + p'(0) x + p''(0)/2! x² + … + p^(m)(0)/m! x^m

So, our last term is x raised to m because our premise was that the polynomial is one of degree ‘m’.
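As a quick sanity check of this coefficient formula, here's another minimal sketch, again assuming sympy; the sample polynomial is just something made up for illustration:

```python
import sympy as sp

x = sp.symbols('x')
p = 3 + 2*x - 7*x**2 + 5*x**3            # a_0 = 3, a_1 = 2, a_2 = -7, a_3 = 5

for k in range(4):
    kth_derivative_at_zero = sp.diff(p, x, k).subs(x, 0)
    a_k = kth_derivative_at_zero / sp.factorial(k)   # a_k = p^(k)(0) / k!
    print(k, a_k)                        # recovers 3, 2, -7, 5 in turn
```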

1. Motivating the Taylor theorem

Consider the function,

f(x) = sin x

and suppose I wanted to break this function down into a polynomial. To start, we consider the graph of sin x, shown below.

[Figure: the sine curve... a true classic]

Now, I propose that we can write this as a weighted sum of monomials,

[Figure: a graphical representation of combining monomials to generate the sin x curve]

So now the question is how to figure out the weights, and how many terms this polynomial formulation of our sin x curve would need. We could start by equating sin x to an ‘nth’-degree polynomial.

sin x ≈ a_0 + a_1 x + a_2 x² + … + a_n x^n

The problem for us is finding the “weights”, i.e., the coefficients of the polynomial… well, why don’t we fall back on the previous result and say that the polynomial’s coefficients are determined by sin x and its derivatives? That is, if sine really had a polynomial form, we should be able to set up the same relation between its derivatives evaluated at zero and the coefficients of the polynomial.

So, a smart way to calculate the coefficients from the derivatives is to make a table, as I have shown below:

k | kth derivative of sin x | value at x = 0 | coefficient a_k = value / k!
0 | sin x  |  0 | 0
1 | cos x  |  1 | 1
2 | -sin x |  0 | 0
3 | -cos x | -1 | -1/3!
4 | sin x  |  0 | 0
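If tables aren't your thing, the same bookkeeping can be done in a few lines (a sketch, assuming sympy is available):

```python
import sympy as sp

x = sp.symbols('x')
f = sp.sin(x)

for k in range(6):
    kth_derivative = sp.diff(f, x, k)
    value_at_zero = kth_derivative.subs(x, 0)
    print(k, kth_derivative, value_at_zero, value_at_zero / sp.factorial(k))
# The derivatives cycle sin, cos, -sin, -cos, so the values at 0 cycle 0, 1, 0, -1.
```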

Right, so we have found the coefficients up to the fourth term. Plugging them back into the polynomial expression we developed initially,

sin x ≈ 0 + 1·x + 0·x² - x³/3! + 0·x⁴ = x - x³/3!

And, with some clever induction,

sin x = x - x³/3! + x⁵/5! - x⁷/7! + …

Now, if you really sat down and tried calculating this, you would see that the series goes on forever. But no one wants to calculate an infinite sum, and no one really has to, because the partial sums get very close to the true value within just a few terms! Let’s take a finite number of terms and see how the fit of the polynomial curve changes as we take more and more of them. Below I have shown a gif of how the polynomial fits the sine curve better and better as we add terms.

[Animation: polynomial approximations of the sine curve. The width of the fit increases with more and more terms!]
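If you'd rather see numbers than a gif, here's a rough sketch using only the standard library; the sample point x = 2 is an arbitrary choice:

```python
import math

def sin_partial_sum(x, n_terms):
    # Sum of the first n_terms terms of x - x^3/3! + x^5/5! - ...
    return sum((-1)**n * x**(2*n + 1) / math.factorial(2*n + 1)
               for n in range(n_terms))

x = 2.0                                  # an arbitrary sample point
for n_terms in (1, 2, 3, 5, 8):
    print(n_terms, sin_partial_sum(x, n_terms), math.sin(x))
# After a handful of terms the partial sums settle right on math.sin(2.0).
```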

Now, I wish to address a common misconception: that the value of the series should be infinite because we take infinitely many terms. But notice that, as we go to higher and higher terms, the factorial in the denominator starts growing faster and faster than the power of x on top.

x^k / k! → 0 as k → ∞, for any fixed x

So what happens is that after some kth term, the sum of the series is hardly affected by whether you include the later terms or not. For example, suppose x = 2; then already at k = 4 the factorial overtakes the power (as shown below), so after the fourth term in our polynomial we can start neglecting terms because they become very small.

2⁴ = 16 < 4! = 24
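And here's a tiny sketch of that "factorial eventually wins" observation, again for x = 2 and standard library only:

```python
import math

x = 2
for k in range(9):
    term = x**k / math.factorial(k)
    print(k, x**k, math.factorial(k), term)
# From k = 4 onward we have x^k < k! (16 < 24), and the terms shrink rapidly.
```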

What we have done here is actually called the Maclaurin series, but what we wanted was the Taylor series. For the Taylor series, instead of having the higher-order polynomial terms vanish at x = 0, we make them vanish at a point x = a, so that the coefficients are determined by the derivatives of our function around the point x = a.

That is, if we shift the polynomial so that all of the nonconstant terms go to zero at the point x = a,

p(x) = c_0 + c_1 (x - a) + c_2 (x - a)² + … + c_n (x - a)^n

and, in the same spirit as before, put x = a,

p(a) = c_0

Skipping the formalities (the algebra is similar to what we did before), we can show that the coefficients of this polynomial are as follows:

c_k = f^(k)(a) / k!

Now, using this formula, we can write the Taylor series of sin(x) around x = a as

sin x = sin a + cos a (x - a) - sin a (x - a)²/2! - cos a (x - a)³/3! + …

Now, how does our polynomial change as we change ‘a’? Well, in the original polynomial we put a = 0 (the Maclaurin case), and hence the polynomial fit the sine curve best around x = 0. But suppose we choose some other ‘a’, say a = 1; then the polynomial would hug the sin x curve best around x = 1. Below I have shown a gif of how that plays out.

[Animation: the Taylor polynomial of sin x as the expansion point a moves]

The purple line is x = a; as it moves, the point around which we create the series changes. The red curve is the sine curve, over which the polynomial curve looks like it's having a wobbly ride.

And finally, in full generality, I present the formula for turning a general ‘nice’ function f(x) into a polynomial that approximates it well near the point x = a:

f(x) = f(a) + f'(a)(x - a) + f''(a)/2! (x - a)² + f'''(a)/3! (x - a)³ + …
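Here's a minimal sketch of that general formula in code, assuming sympy; the choice of sin x, a = 1, and degree 5 is just an example:

```python
import sympy as sp

x = sp.symbols('x')

def taylor_polynomial(f, a, n):
    # sum over k of f^(k)(a)/k! * (x - a)^k, for k = 0 .. n
    return sum(sp.diff(f, x, k).subs(x, a) / sp.factorial(k) * (x - a)**k
               for k in range(n + 1))

# Example: sin x expanded around a = 1, kept up to degree 5.
approx = taylor_polynomial(sp.sin(x), 1, 5)
print(sp.N(approx.subs(x, 1.2)), sp.N(sp.sin(1.2)))   # both come out ≈ 0.932
```

Swapping any other sufficiently smooth function in for sp.sin(x) should work the same way.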

2. Applications

The most profound application of this is that we can easily derive the famous Euler’s identity using it. Doing some work, we find the Maclaurin series of some famous functions (all of these are nice functions, which give the correct output for a given input if we take enough terms of the series):

e^x = 1 + x + x²/2! + x³/3! + x⁴/4! + …

sin x = x - x³/3! + x⁵/5! - x⁷/7! + …

cos x = 1 - x²/2! + x⁴/4! - x⁶/6! + …

Now suppose we plug x=it, where i is the imaginary unit, into the series of the exponential and then do some sneaky ‘rearrangement’…

e^(it) = 1 + it + (it)²/2! + (it)³/3! + (it)⁴/4! + (it)⁵/5! + …

= 1 + it - t²/2! - i t³/3! + t⁴/4! + i t⁵/5! - …

= (1 - t²/2! + t⁴/4! - …) + i (t - t³/3! + t⁵/5! - …)

= cos t + i sin t

Ta-da! Oh, wait. HOLY SHIT! DID WE JUST PROVE EULER'S FORMULA?!?!

Now, to get the identity they often reference in pop math, replace every ‘t’ you see with π!

e^(iπ) = cos π + i sin π = -1 + 0i

e^(iπ) + 1 = 0
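If you want to convince yourself numerically, here's a rough check using only the standard library (the angle t = 1.3 and the 30-term cutoff are arbitrary choices):

```python
import cmath
import math

t = 1.3                                   # an arbitrary angle
series_value = sum((1j * t)**n / math.factorial(n) for n in range(30))

print(series_value)                       # ≈ (0.2675 + 0.9636j)
print(complex(math.cos(t), math.sin(t)))  # cos t + i sin t, the claimed rearrangement
print(cmath.exp(1j * t))                  # the built-in exponential agrees too
```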

Ehm, further discussion would get ‘complex’, so we will avoid that. But… BONUS TIME!!!

3. BONUS

A cool application of the Taylor series is that it lets us explore the motion of objects around equilibrium when we give them tiny disturbances. Consider a single-variable potential function V(x), and let us Taylor expand it around a point x = a, where ‘a’ is the position of equilibrium. Then,

V(x) = V(a) + V'(a)(x - a) + V''(a)/2! (x - a)² + V'''(a)/3! (x - a)³ + …

Now, the force on a particle at equilibrium is zero, hence V'(a) = 0, and we can shift our coordinate system so that the potential at equilibrium is zero, V(a) = 0. Finally, we assume that our disturbances are small enough that we can neglect the terms after (x-a)².

V(x) ≈ ½ V''(a) (x - a)²

Then, the force near the equilibrium is given by,

F = -dV/dx = -V''(a)(x - a) = -k(x - a), where k = V''(a)

So now we have F = -k(x - a), and this is exactly the force equation for a shifted oscillator! Physically, this means that for small perturbations near an equilibrium point x = a, the motion is simple harmonic!
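Here's a small sketch of that idea for a made-up potential, assuming sympy; the potential V(x) = (x - 1)²(x + 2), with its equilibrium at x = 1 (where, conveniently, V = 0), is purely a hypothetical example:

```python
import sympy as sp

x = sp.symbols('x')
V = (x - 1)**2 * (x + 2)          # hypothetical potential with a minimum at x = 1

a = 1
print(sp.diff(V, x).subs(x, a))   # V'(a) = 0, so x = a really is an equilibrium
k = sp.diff(V, x, 2).subs(x, a)   # effective spring constant k = V''(a)
print(k)                          # -> 6

V_approx = sp.Rational(1, 2) * k * (x - a)**2   # the term we keep in the expansion
force = -sp.diff(V_approx, x)
print(force)                      # F = -k(x - a) with k = 6
```

The positive second derivative at x = 1 confirms it is a minimum, which is exactly the situation where the oscillator picture applies.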

Now let’s try to interpret this mathematical idea using physics! Suppose an object has some excess energy that prevents it from settling at the mean position x = a; this means that when the body crosses the mean position it has some velocity. Even though the force at equilibrium is zero, the particle arrives there with velocity, overshoots to the next extreme position, and then oscillates back through our least-potential state to the other side again. Oho, could you find the time period of such a motion? (Note: this may be easier for someone familiar with physics.)

We could generalize this even further to see how the motion changes as we increase the perturbation to the system! What if we had motion where we didn’t neglect the terms beyond (x-a)²... Now that’s something to ponder!

End.

Hope you enjoyed my article and now share the same love for the Taylor series that I do.
