In this post I want to discuss an unintuitive fact about infinite series that often confuses early students of Real Analysis.

Even if you have not seen the formal definition of a series before, you may have seen examples of them, such as the harmonic series or the infinite geometric series, perhaps calling them “infinite sums”.

The notation using the Greek letter sigma can be quite misleading, since series do not always behave as nicely as one might initially expect. We will discuss what strange things can happen if one is not careful.

Formally, given an infinite sequence $(a_n)_{n=1}^{\infty}$, a series is nothing but another special sequence $(s_n)_{n=1}^{\infty}$ where $s_n = a_1 + a_2 + \cdots + a_n$.

This sequence is denoted $\sum_{n=1}^{\infty} a_n$ and we call the $s_n$ the partial sums of the series.
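As a minimal illustration (a hypothetical Python sketch, not part of the definitions above), the partial sums are just the running totals of the sequence:

```python
# Partial sums s_n of a series are the running totals of the terms a_n.
from itertools import accumulate

a = [1 / 2**n for n in range(1, 11)]  # a_n = 1/2^n for n = 1..10
s = list(accumulate(a))               # s_n = a_1 + a_2 + ... + a_n

# The partial sums approach 1, the value of the geometric series.
```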

Apart from obvious issues, such as divergence to infinity (i.e. the fact that the sum can grow uncontrollably large in either direction), there are other more subtle ways in which series may behave differently from honest sums.

Consider a permutation $\sigma : \mathbb{N} \to \mathbb{N}$ and a convergent series $\sum_{n=1}^{\infty} a_n$.

We may ask: *Does $\sum_{n=1}^{\infty} a_{\sigma(n)}$ converge to the same value as $\sum_{n=1}^{\infty} a_n$?*

Or to give a more concrete informal example:

Is $1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \frac{1}{5} - \cdots$ the same as $1 - \frac{1}{2} - \frac{1}{4} + \frac{1}{3} - \frac{1}{6} - \frac{1}{8} + \frac{1}{5} - \cdots$?

Perhaps surprisingly the answer is No in general.

The example I gave above was a rearrangement of the alternating harmonic series.

It is a well-known fact that this series converges to $\ln 2$, and one can show that the rearrangement converges to something different, namely $\frac{1}{2}\ln 2$.

The diagram below shows the numeric values of the first 30 partial sums of each series.
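The same comparison can be made numerically. Here is a hedged Python sketch; the block pattern of the rearrangement (one positive term followed by two negative terms) is an assumption matching the example above:

```python
import math

# Partial sum of the alternating harmonic series 1 - 1/2 + 1/3 - 1/4 + ...
def alt_harmonic(n):
    return sum((-1) ** (k + 1) / k for k in range(1, n + 1))

# Partial sum of the rearrangement 1 - 1/2 - 1/4 + 1/3 - 1/6 - 1/8 + ...,
# taken over whole blocks of the form 1/(2k-1) - 1/(4k-2) - 1/(4k).
def rearranged(n_blocks):
    return sum(
        1 / (2 * k - 1) - 1 / (4 * k - 2) - 1 / (4 * k)
        for k in range(1, n_blocks + 1)
    )

original = alt_harmonic(100_000)  # close to ln 2
shuffled = rearranged(100_000)    # close to (1/2) ln 2
```

Both sums use exactly the same terms, yet settle on different values.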

It seems like our intuition about series as ordinary sums breaks down here.

In fact, as we will show in the rest of this post, matters are even worse.

It turns out that you can rearrange this series (and other series like it) to converge to any real number you like! If you wish you can even make it diverge to infinity.

The defining property of series that converge, yet are sensitive to rearrangement, is called **Conditional Convergence**.

Usually this property is stated in another way: a series is conditionally convergent if it is convergent but not Absolutely Convergent.

A series $\sum_{n=1}^{\infty} a_n$ is **Absolutely Convergent** if $\sum_{n=1}^{\infty} |a_n|$ converges.

The true meaning of this property is somewhat hidden behind this definition.

As it happens,

*convergent series which are insensitive to rearrangement are precisely those which are absolutely convergent*.

Let’s prove this statement in one direction:

**Proposition 1:**

Let $\sum_{n=1}^{\infty} a_n$ be an absolutely convergent series, converging to $\ell$.

Then $\sum_{n=1}^{\infty} a_{\sigma(n)}$ also converges to $\ell$ for every permutation $\sigma : \mathbb{N} \to \mathbb{N}$.

**Informal Proof:**

Let $s_n = \sum_{k=1}^{n} a_k$ and $t_n = \sum_{k=1}^{n} a_{\sigma(k)}$.

To show convergence, our goal is to show that the quantity $|t_n - \ell|$ can be made arbitrarily small for sufficiently large $n$, that is to say $|t_n - \ell| \to 0$ as $n \to \infty$.

Since $\sum_{n=1}^{\infty} a_n$ converges to $\ell$ we know that $|s_n - \ell| \to 0$ as $n \to \infty$.

To understand if $(t_n)$ also converges to $\ell$ we should suggestively compare $t_n$ with $s_n$. For if $t_n$ and $s_n$ get very close to each other, knowing that $s_n$ gets very close to $\ell$, we ought to see that $t_n$ gets very close to $\ell$ as well for sufficiently large $n$.

This type of idea in analysis is usually expressed via the triangle inequality:

$$|t_n - \ell| \le |t_n - s_n| + |s_n - \ell|. \qquad (*)$$

We know how to manage the quantity $|s_n - \ell|$.

We want to know what happens to $|t_n - s_n|$ as $n$ grows large.

Consider a fixed $N \in \mathbb{N}$.

Since $\sigma$ is a bijection (permutations are by definition bijections), there exists $M \ge N$ s.t. $\{1, \dots, N\} \subseteq \{\sigma(1), \dots, \sigma(M)\}$.

Let $n \ge M$ and consider the set $I_n = \{k \le n : \sigma(k) > N\}$ of subscripts that do not belong to $\{1, \dots, N\}$ after applying $\sigma$.

Then, since the terms $a_1, \dots, a_N$ appear in both $t_n$ and $s_n$ and cancel, every term surviving in the difference $t_n - s_n$ has index greater than $N$ and appears at most once, so

$$|t_n - s_n| \le \sum_{j=N+1}^{\infty} |a_j|.$$

This is where the hypothesis about absolute convergence kicks in.

Absolute convergence implies $\sum_{j=N+1}^{\infty} |a_j| \to 0$ as $N \to \infty$.

Thus, since for every $N$ this bound holds for all sufficiently large $n$, we find that $|t_n - s_n| \to 0$ as $n \to \infty$.

In combination with $|s_n - \ell| \to 0$ as $n \to \infty$ we get from (*) that $|t_n - \ell| \to 0$ as $n \to \infty$.

This shows that $\sum_{n=1}^{\infty} a_{\sigma(n)}$ converges to $\ell$.
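As a numeric sanity check of Proposition 1 (a sketch, not a proof), applying a one-positive-two-negatives block rearrangement to the absolutely convergent series $\sum (-1)^{n+1}/n^2$ leaves its value, $\pi^2/12$, unchanged:

```python
import math

# Original alternating series sum (-1)^(n+1)/n^2, truncated.
original = sum((-1) ** (n + 1) / n**2 for n in range(1, 300_001))

# Same terms rearranged in blocks: +1/(2k-1)^2 - 1/(4k-2)^2 - 1/(4k)^2.
rearranged = sum(
    1 / (2 * k - 1) ** 2 - 1 / (4 * k - 2) ** 2 - 1 / (4 * k) ** 2
    for k in range(1, 100_001)
)

# Unlike the alternating harmonic series, both orderings give pi^2/12.
```

Absolute convergence is exactly what makes this rearrangement harmless.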

To prove the converse we need to show that every convergent series that is not absolutely convergent (i.e. conditionally convergent) is non-invariant under rearrangement.

From now on we will assume $\sum_{n=1}^{\infty} a_n$ is conditionally convergent.

As I hinted initially something stronger holds true for conditionally convergent series.

Not only are the limits of these series sensitive to rearrangement, these series can be rearranged to converge to any real number. This result is commonly known as Riemann’s rearrangement theorem.

The idea behind the proof is essentially to construct a rearrangement by indefinitely adding positive and negative terms from the original series in order to oscillate around the chosen real value, and show that these oscillations get smaller and smaller.

Define $a_n^+ = \max(a_n, 0)$ and $a_n^- = \min(a_n, 0)$, so that $a_n = a_n^+ + a_n^-$ and $|a_n| = a_n^+ - a_n^-$.

There are a few technical issues that need to be resolved before we can claim the construction idea is well defined.

- Are there infinitely many non-zero $a_n^+$ and $a_n^-$, so that we can indefinitely add positive and negative terms?
- Are the partial sums of $\sum a_n^+$ unbounded above and those of $\sum a_n^-$ unbounded below, so that we have a chance at getting close to any real value?
- Do $a_n^+$ and $a_n^-$ both converge to $0$, so that we can make the oscillations arbitrarily small?

The answer to all 3 questions needs to be Yes for this construction to work.

To see (1) we could argue by contradiction and e.g. assume there are only finitely many non-zero $a_n^-$ (the case of $a_n^+$ works similarly). In fact, let's assume something even weaker and just suppose that $\sum_{n=1}^{\infty} a_n^-$ is convergent.

Then

$$\sum_{k=1}^{n} |a_k| = \sum_{k=1}^{n} a_k - 2\sum_{k=1}^{n} a_k^-.$$

By assumption both series on the right hand side converge, implying $\sum_{n=1}^{\infty} |a_n|$ is convergent. But we assumed $\sum_{n=1}^{\infty} a_n$ was not absolutely convergent, so this is a contradiction. Therefore $\sum_{n=1}^{\infty} a_n^-$ cannot converge, and in particular the number of negative terms is infinite.

From this we can also deduce (2), because the sequence of partial sums of $\sum a_n^-$ is monotonically decreasing (being a series of non-positive terms), and if it were bounded below then a standard result from real analysis (the Monotone Convergence Theorem) would say that $\sum a_n^-$ needs to converge, which we know it cannot. Hence the partial sums of $\sum a_n^-$ cannot be bounded below.

A symmetrical argument establishes (1) and (2) for $a_n^+$.

(3) follows directly from the fact that the terms of a convergent series necessarily tend to zero.

To see this, just write $a_n = s_n - s_{n-1}$ and notice that the right hand side converges to zero as $n \to \infty$, since both sequences converge to the same value. Since $|a_n^+| \le |a_n|$ and $|a_n^-| \le |a_n|$, it follows that $a_n^+ \to 0$ and $a_n^- \to 0$ as well.

Hence we can answer all 3 questions in the affirmative.
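For the alternating harmonic series these facts can be observed directly (a hypothetical sketch; here $a_n^+$ consists of the terms $1, 1/3, 1/5, \dots$):

```python
# Partial sums of the positive part a^+ of the alternating harmonic series.
def positive_part_sum(K):
    """Sum of the first K non-zero positive terms 1 + 1/3 + ... + 1/(2K-1)."""
    return sum(1 / (2 * k - 1) for k in range(1, K + 1))

s_small = positive_part_sum(1_000)
s_big = positive_part_sum(100_000)
# The sums keep growing like (1/2) log K (unbounded, questions (1)-(2)),
# while the individual terms 1/(2K-1) shrink to zero (question (3)).
```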

Now let’s sketch a proof of Riemann’s Rearrangement Theorem.

**Riemann Rearrangement Theorem:**

Let $\sum_{n=1}^{\infty} a_n$ be a conditionally convergent series and let $x \in \mathbb{R}$.

Then there exists a permutation $\sigma : \mathbb{N} \to \mathbb{N}$ s.t. $\sum_{n=1}^{\infty} a_{\sigma(n)}$ converges to $x$.

**Proof (Sketch):**

The proof is constructive (the best kind).

Fix $x \in \mathbb{R}$.

By (2):

Since the partial sums of $\sum a_n^+$ are unbounded above, we can choose the smallest $m_1$ s.t.

$$a_1^+ + a_2^+ + \cdots + a_{m_1}^+ > x.$$

Since the partial sums of $\sum a_n^-$ are unbounded below, we can choose the smallest $n_1$ s.t.

$$a_1^+ + \cdots + a_{m_1}^+ + a_1^- + \cdots + a_{n_1}^- < x.$$

We can continue to choose the smallest $m_2$ s.t. $a_1^+ + \cdots + a_{m_1}^+ + a_1^- + \cdots + a_{n_1}^- + a_{m_1+1}^+ + \cdots + a_{m_2}^+ > x$, etc.

Inductively, if we denote

$$P_k = a_1^+ + \cdots + a_{m_1}^+ + a_1^- + \cdots + a_{n_1}^- + \cdots + a_{m_{k-1}+1}^+ + \cdots + a_{m_k}^+$$

and

$$Q_k = P_k + a_{n_{k-1}+1}^- + \cdots + a_{n_k}^-,$$

we obtain a rearrangement of $\sum_{n=1}^{\infty} a_n$ s.t.

$$P_k > x \quad \text{and} \quad Q_k < x.$$

We can make this definition since we have an infinite supply of positive and negative terms by (1).

An arbitrary partial sum $t_n$ of the rearrangement has one of the two forms below:

$$t_n = Q_{k-1} + a_{m_{k-1}+1}^+ + \cdots + a_j^+ \quad (m_{k-1} < j \le m_k)$$

or

$$t_n = P_k + a_{n_{k-1}+1}^- + \cdots + a_j^- \quad (n_{k-1} < j \le n_k).$$

We want to show $|t_n - x| \to 0$ as $n \to \infty$.

We clearly have $Q_{k-1} \le t_n \le P_k$ in the first case and $Q_k \le t_n \le P_k$ in the second.

By construction, the minimality of $m_k$ and $n_k$ gives

$$0 < P_k - x \le a_{m_k}^+$$

and

$$0 < x - Q_k \le -a_{n_k}^-.$$

Therefore $|t_n - x| \le \max(a_{m_k}^+,\, -a_{n_{k-1}}^-)$ in the first case and $|t_n - x| \le \max(a_{m_k}^+,\, -a_{n_k}^-)$ in the second.

By (3) we know that $a_{m_k}^+ \to 0$ and $-a_{n_k}^- \to 0$ as $k \to \infty$, so $|t_n - x|$ is being squeezed between two sequences converging to zero and is therefore forced to converge to zero as well, by the so-called Sandwich Theorem.

This shows that the rearranged series converges to $x$.

Since $x \in \mathbb{R}$ was arbitrary, the result follows.
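The greedy construction in the proof translates almost directly into code. Below is a hedged Python sketch for the alternating harmonic series; `rearrange_to` is a hypothetical helper that draws the next positive or negative term depending on which side of the target $x$ the running sum currently sits:

```python
def rearrange_to(x, n_terms=100_000):
    """Greedily rearrange the terms (-1)^(n+1)/n so that the partial sums
    oscillate around x, as in the rearrangement theorem's construction."""
    total = 0.0
    next_pos, next_neg = 1, 2  # next unused odd / even denominator
    for _ in range(n_terms):
        if total <= x:
            total += 1 / next_pos  # positive terms: 1, 1/3, 1/5, ...
            next_pos += 2
        else:
            total -= 1 / next_neg  # negative terms: -1/2, -1/4, ...
            next_neg += 2
    return total

# The same terms, reordered, approach whatever target we choose:
# rearrange_to(0.0), rearrange_to(2.0), rearrange_to(-1.0), ...
```

The oscillations shrink because the unused terms shrink, which is exactly the role of property (3) above.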

To put some intuition back behind this phenomenon: the proof tells us that a conditionally convergent series is the result of two divergent series, $\sum a_n^+$ and $\sum a_n^-$, cancelling each other out to allow for convergence.

In the cancelling race between the two series we can give one series an “infinite head start” by pushing terms from the other series further away towards infinity. This way we accumulate terms from one of the series more quickly in the partial sums, at a pace we set, and can thus make the series converge to anything we like. I am being purposely vague here, but hopefully this makes some intuitive sense.

In a way, one could think of the terms of a conditionally convergent series as infinite streams of positive revenue and negative expenditure. By rearranging expenditures ahead in time we could alter our book balance (and in theory make infinite profit).

Funnily enough, this is the basic idea behind Ponzi schemes.

Of course in reality Ponzi schemes do not generate infinite profit.

A Ponzi scheme inevitably falls apart because the criminal cannot find sufficiently many revenue sources to sustain the expenditure commitments of the scheme, so when the scheme finally starts to collapse the criminal has to run away with the money.

The basic fallacy in applying this idea to real life is that things are usually not infinite, even though people sometimes behave as if they are. Economic bubbles are a good real-world example: they behave very similarly to Ponzi schemes, but on a much larger scale. Eventually money is solely being made at the expense of new investors entering the market, the profit being completely detached from the market’s intrinsic value. Of course, neither the number of investors nor their money is infinite, so sooner or later people are bound to lose money as the bubble implodes due to a lack of new buying market participants (i.e. only sellers remain).
