Tuesday, December 12, 2006

The strange case of summation

It is easy for one to be fascinated by the summation of series, as it seems to give us an opportunity to bridge the divide between here and infinity.

Allow me to present an example,
1 - 1/2 + 1/3 - 1/4 +....ad inf. = log 2 --------------------(1)

The series of absolute values of the summands is the familiar harmonic series. Henceforth we will refer to the series (1) as the alternating harmonic series, owing to the polarity shifts. The general term is [(-1)^(n-1)]/n for n from 1 to infinity.
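As a quick numerical sanity check of (1), here is a minimal sketch (the function name is my own choice) that sums the first n terms of the alternating harmonic series and compares the result against log 2:

```python
import math

def alternating_harmonic_partial(n):
    """Sum of the first n terms (-1)^(k-1)/k of the alternating harmonic series."""
    return sum((-1) ** (k - 1) / k for k in range(1, n + 1))

# The partial sums creep toward log 2 (slowly: the error is roughly 1/(2n)).
for n in (10, 100, 10000):
    print(n, alternating_harmonic_partial(n))
print("log 2 =", math.log(2))
```

The slow convergence here is typical of conditionally convergent series: each term shrinks only like 1/n.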

There are a variety of summation techniques in the modern mathematician's arsenal, though none of these guarantees a closed form evaluation given a particular series.

But whether it is the summation of a basic arithmetic progression, a geometric progression, or a particular value of a function's series expansion that we employ for those evaluations we can manage, we will not have got to the heart of the operation and answered "what summation is". Addition itself is well defined, but the passage from a finite sum to an infinite one was marked by unique difficulties. Certainly, following Cauchy, we would today define the sum of an infinite series to be the limiting value of its sequence of partial sums, and much of modern analysis builds upon this notion. However, stopping here would deny us the breadth of examples mathematicians have been considering since antiquity. The posts to follow will consider some of these in detail, while this post introduces the notion of 'conditional convergence'.

Let us investigate by considering the general case of our initial example. We have that,

log (1+x) = x - x^2/2 + x^3/3 - x^4/4 +....ad inf. ---------------------(2)

is an expansion, called the Taylor expansion about 0 (or just the Maclaurin expansion), of the function log (1+x). We must consider log (1+x) and not simply log x because log x diverges at x = 0: since log x = a means e^a = x, a value at x = 0 would require e^a = 0, which never happens on the real line.

Coming back to the series, it can be shown to be valid within the unit circle and at the extremity x = 1, where we obtain the first identity.
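One can probe this region of validity numerically. The sketch below (the function name is my own choice) truncates the series (2) and compares it with the logarithm at x = 1/2, well inside the interval of convergence, where the powers of x make the convergence rapid:

```python
import math

def log1plus(x, terms=200):
    """Partial sum of the Maclaurin series x - x^2/2 + x^3/3 - ...,
    which represents log(1+x) for -1 < x <= 1."""
    return sum((-1) ** (k - 1) * x ** k / k for k in range(1, terms + 1))

print(log1plus(0.5), "vs", math.log(1.5))
```

At the boundary point x = 1 the same function recovers the alternating harmonic series, but there many more terms are needed for the same accuracy.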

The validity of (2) at the perimeter, however, is not without ambiguity. This is because (1) is only conditionally convergent: it converges, yet the corresponding series of all positive terms (its absolute values) diverges. Riemann showed that such a series can remarkably be made to sum to any value we like by permuting the order in which the terms are summed! We certainly wouldn't want a finite sum to change its value depending on how it was evaluated, but in the realm of infinite series such situations seem unavoidable.
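Riemann's rearrangement argument is in fact constructive, and a minimal numerical sketch of it is easy to write (the function name and the target value 1.5 are my own choices): greedily add unused positive terms of (1) while the running total is at or below the target, and unused negative terms while it is above.

```python
def rearranged_partial_sums(target, n_terms=100000):
    """Greedily rearrange the terms +-1/k of the alternating harmonic
    series so the partial sums home in on `target` (Riemann's argument)."""
    pos = 1    # next unused positive term is +1/pos (odd denominators)
    neg = 2    # next unused negative term is -1/neg (even denominators)
    total = 0.0
    for _ in range(n_terms):
        if total <= target:
            total += 1.0 / pos
            pos += 2
        else:
            total -= 1.0 / neg
            neg += 2
    return total

print(rearranged_partial_sums(1.5))
```

Since the unused terms shrink to zero, the overshoot past the target shrinks as well, so the rearranged series genuinely converges to the chosen target rather than merely oscillating around it.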

So the only way to hold on to our beautiful result (1) is to acknowledge that rearranging an infinite series can sometimes alter its sum, and to regard the value log 2 as a sort of principal value, arising when the 'natural order' of the expansion is retained.



What may be just as surprising, however, is that the series of positive terms corresponding to (1), viz.

H = 1 + 1/2 + 1/3 + 1/4 + .......

diverges! We shall hereafter refer to H as the harmonic series, and shall presently see that, despite its terms becoming smaller and smaller along the way, its partial sums grow without bound.

Since every term of H is positive, grouping terms cannot alter its behaviour, and we can show that it diverges as follows,

H = 1 + 1/2 + (1/3 + 1/4) + (1/5 + 1/6 + 1/7 + 1/8) + .......

> 1 + 1/2 + (1/4 + 1/4) + (1/8 + 1/8 + 1/8 + 1/8) + ......
= 1 + 1/2 + 1/2 + 1/2 + .......

where the last series is seen to grow without bound; by comparison, so does H, completing the proof.
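Oresme's grouping in fact shows something quantitative: the partial sum of the first 2^k terms of H is at least 1 + k/2, since each successive block contributes more than 1/2. A short check of this bound (the helper name `harmonic` is my own):

```python
def harmonic(n):
    """Partial sum 1 + 1/2 + ... + 1/n of the harmonic series."""
    return sum(1.0 / k for k in range(1, n + 1))

# Oresme's grouping gives H_(2^k) >= 1 + k/2, so H grows without bound
# even though doubling n adds less and less each time.
for k in range(1, 15):
    print(k, harmonic(2 ** k), ">=", 1 + k / 2)
```

The bound also conveys just how slow the divergence is: reaching a partial sum of 100 this way requires on the order of 2^198 terms.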

The above argument is due to Nicole Oresme and dates from medieval times. A generalisation of it is the Cauchy condensation test for the convergence of series.
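To see the flavour of the condensation test, here is a minimal sketch (the helper name `condensed_terms` is my own choice). For a non-increasing sequence a(n) >= 0, the series sum a(n) converges exactly when the condensed series sum 2^k a(2^k) does. Applied to a(n) = 1/n the condensed terms are all 1, recovering the divergence of H; applied to a(n) = 1/n^2 they halve each time, a convergent geometric series.

```python
def condensed_terms(a, K):
    """First K terms 2^k * a(2^k) of the Cauchy condensation of sum a(n)."""
    return [2 ** k * a(2 ** k) for k in range(K)]

print(condensed_terms(lambda n: 1 / n, 5))       # constant terms: diverges
print(condensed_terms(lambda n: 1 / n ** 2, 5))  # geometric decay: converges
```

The grouping-by-powers-of-two in Oresme's proof above is exactly the lower-bound half of this comparison.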
