Apparently this is a thing. Consider the function $\frac{1}{1 - x}$. Maybe you know that you can express it as an infinite polynomial:

$$\frac{1}{1 - x} = 1 + x + x^2 + x^3 + \cdots$$
I find this difficult to believe, despite already being familiar with it. Here's a rather standoffish proof: just multiply the right-hand side by $(1 - x)$ and see what you get.
Okay. We can rescue our intuition a bit by noting that the equality only makes sense when the series converges, i.e. when $|x| < 1$. For example, when $x = \frac{1}{2}$, we have

$$1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots = 2 = \frac{1}{1 - \frac{1}{2}}.$$
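If you want to watch that convergence happen, here's a throwaway Python sketch (nothing official, just partial sums):

```python
# Partial sums of 1 + x + x^2 + ... creeping up on 1/(1-x) at x = 1/2.
x = 0.5
target = 1 / (1 - x)  # = 2
partial = 0.0
for n in range(1, 21):
    partial += x ** (n - 1)
    if n in (1, 2, 3, 5, 10, 20):
        print(f"{n:2d} terms: {partial:.6f}   (target {target})")
```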
This is just a geometric series. It all is, if you think of $x$ as the common ratio between terms. It's also a Taylor series, since we can get it by differentiating $\frac{1}{1 - x}$ a bunch of times. You can also get it by doing polynomial long division -- just like regular division, except you replace $10$ by $x$.
```
        _________
1 - x | 1
```

Consider the first term only. One goes into one one time, so write that at the top.

```
          1
        _________
1 - x | 1
```

Now multiply $1 \cdot (1 - x)$ and write it beneath, with the subtraction's minus sign already distributed: $-(1 - x) = -1 + x$.

```
          1
        _________
1 - x | 1
        - 1 + x
```

Subtract.

```
          1
        _________
1 - x | 1
        - 1 + x
        ________
          0 + x
```

How many times does one go into $x$?

```
          1 + x
        _________
1 - x | 1
        - 1 + x
        ________
          0 + x
```

Now multiply $x \cdot (1 - x) = x - x^2$ and write it beneath, again with the minus distributed.

```
          1 + x
        _________
1 - x | 1
        - 1 + x
        ________
          0 + x
          - x + x^2
          _________
            0 + x^2
```

Subtract again. We'll put $x^2$ up top, and then do this forever.
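If you'd rather have a machine turn the crank, here's a little Python sketch of the same procedure. The `divide_one_by` name and the `(coefficient, power)` representation are just my own choices for this post, not anything standard:

```python
from fractions import Fraction

def divide_one_by(divisor, steps):
    """Long-divide 1 by a two-term polynomial, always matching the
    divisor term that is listed FIRST -- exactly the procedure above.
    `divisor` is a list of (coefficient, power) pairs, so 1 - x is
    [(1, 0), (-1, 1)]. Returns `steps` quotient terms the same way."""
    (lead_c, lead_p), (other_c, other_p) = divisor
    rem_c, rem_p = Fraction(1), 0   # the dividend: 1 = 1 * x^0
    terms = []
    for _ in range(steps):
        # How many times does the leading term go into the remainder?
        q_c, q_p = rem_c / lead_c, rem_p - lead_p
        terms.append((q_c, q_p))
        # Subtract (quotient term) * divisor: the leading part cancels,
        # so only the other term survives as the new remainder.
        rem_c, rem_p = -q_c * other_c, q_p + other_p
    return terms

for c, p in divide_one_by([(1, 0), (-1, 1)], 5):
    print(f"{c} * x^{p}")   # 1*x^0, 1*x^1, 1*x^2, ... -- the series above
```

Note that nothing in there knows or cares about convergence; it just pushes symbols around.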
But I pulled a bit of a fast one there. Why did I write the divisor as $1 - x$ instead of $-x + 1$? We usually put the larger term first (we certainly do in decimal).
It actually changes the answer when you do that. Try dividing $1$ by $-x + 1$ for yourself. (The first thing you have to do is decide how many times $-x$ goes into $1$: that would be $-\frac{1}{x}$ times, which is your first term, which means the answer definitely isn't the same.) The answer you get is:

$$-\frac{1}{x} - \frac{1}{x^2} - \frac{1}{x^3} - \cdots$$
(Source). But the world hasn't gone mad: $1 - x$ and $-x + 1$ are still equivalent algebraically, so we must have equality to $\frac{1}{1 - x}$ as well. And indeed we do:

$$\frac{1}{1 - x} = -\frac{1}{x} - \frac{1}{x^2} - \frac{1}{x^3} - \cdots \qquad (|x| > 1).$$
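You can check this without trusting me: multiply a truncated version of the new series by $1 - x$ and watch the product telescope toward $1$. A quick sketch ($x = 2$ is an arbitrary test point with $|x| > 1$, and `Fraction` keeps the arithmetic exact). For what it's worth, feeding the divisor into the earlier `divide_one_by` sketch in the other order, as `[(-1, 1), (1, 0)]`, spits out exactly these $-1/x^k$ terms.

```python
from fractions import Fraction

x = Fraction(2)                     # any |x| > 1 behaves the same way
for n in (2, 5, 10, 20):
    tail = sum(-x ** (-k) for k in range(1, n + 1))  # -1/x - ... - 1/x^n
    print(n, tail, tail * (1 - x))  # the product telescopes to 1 - x^(-n)
print(1 / (1 - x))                  # the target value: -1
```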
Until today, I implicitly believed that "Taylor series" and "series expansion" were the same thing. But I'm here to tell you that every Taylor series, converging when $|x|$ is small, has an evil twin, a mystical equivalent, a hippie cousin: the Laurent series, which converges when $|x|$ is large. This is the Laurent series for $\frac{1}{1 - x}$.
You can see that, because of all the $x$'s in the denominators, it won't converge unless $|x| > 1$. So it's a perfect complement to the Taylor series, which occupies the $|x| < 1$ space. Both are infinite polynomials that represent the function in some sense.[1]
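To see the two regions split apart, truncate both series and evaluate them on either side of $|x| = 1$ (a rough sketch; 30 terms is arbitrary):

```python
N = 30  # how many terms of each series to keep

def taylor(x):    # 1 + x + x^2 + ... + x^(N-1)
    return sum(x ** k for k in range(N))

def laurent(x):   # -1/x - 1/x^2 - ... - 1/x^N
    return sum(-x ** (-k) for k in range(1, N + 1))

for x in (0.5, 2.0):
    print(f"x = {x}:  1/(1-x) = {1 / (1 - x):+.4f}   "
          f"Taylor = {taylor(x):+.4g}   Laurent = {laurent(x):+.4g}")
```

Each one blows up exactly where the other nails the function.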
Questions: Could there be more types of series expansions? Wolfram Alpha refers to Laurent series as "centered around $x = \infty$" (vs. Taylor series centered around $x = 0$) -- what does that mean? The forbidding Wikipedia page makes it clear that Laurent series have something to do with complex analysis, but what's the relation?
Another Question: Here we've taken a division problem that ordinarily gives an infinite but convergent answer, and "flipped it around" to get an answer that doesn't converge. Is there anything to doing this for a decimal long division problem, i.e. where we keep pulling down larger and larger powers of 10?
[1] To me, it even makes a loose kind of sense which ordering gives you which series. Suppose you're always putting the most significant terms on the left, to mimic positional number systems. Then, writing $1 - x$ means that $1$ outweighs $x$, i.e. $|x| < 1$, and this gives you the Taylor series, which is only true for $|x| < 1$.
[Edited 10/31/2019 to remove an incorrect statement.]