Extract from the Book

1 Numbers Shmumbers: Why Being ‘Bad with Numbers’ May Actually Be a Good Thing

 

‘The invention of the laws of numbers was made on the basis of the error, dominant even from the earliest times, that there are identical things (but in fact nothing is identical with anything else).’ Nietzsche

 

 

Nietzsche was a German philosopher who died in 1900, perhaps from a brain tumour, tertiary syphilis or an early form of dementia, depending on the source you choose to believe. At the age of 44 he suffered a complete nervous breakdown that may have been precipitated by witnessing the savage whipping of a horse. After that he never really recovered his mental faculties.

So he seems hardly worth quoting at all, but many authors find themselves compelled to do so. This is because he was a dangerous and heretical thinker who was very good at producing rather terse and pointed quotes. ‘God is dead’ is probably his most famous; ‘That which does not kill us makes us stronger’ is another good one. Indeed one whole book of his was given over to these aphorisms [1]. This plethora of quotable phrases makes Nietzsche irresistible to many authors because you can usually find one to support whatever case you are trying to make. So wide-ranging are these scattergun quotes that Nietzsche’s work was at one point misappropriated by the Nazis, though he appears to have been neither anti-Semitic nor nationalistic.

But Nietzsche is worth quoting here because he was one of the first of the modern philosophers to harbour grave doubts about science and also about its most powerful tool: mathematics. Much of this doubt came from his own maverick nature but also from his early career as a classical scholar of Greek and Roman texts. Concerns about mathematics have been around for thousands of years but, eclipsed by maths’ apparent success, they are often forgotten. These concerns had produced disbelief and even anger in some mathematicians but have, over time, been airbrushed out of its teaching so that few realise they constitute any problems at all.

In this chapter we will examine some fundamental problems with mathematics and how a lack of understanding of the innate unreality of mathematics has led to some of the most absurd concepts in science (Black Holes: I’m looking at you!).

For those who don’t like or understand maths I want to reassure you that we’re only going to use one simple arithmetical equation in this chapter. Lest you think I am being patronising to those who aren’t ‘good with numbers’, I can assure you I mean quite the opposite, as you will see later.

 

Here comes the simple but extremely dangerous equation:

 

1 + 1 = ?

 

What’s the answer?

Before we go any further, I want you to focus on the certainty you feel about knowing the correct answer. I’m going to take a wild guess and assume the answer you came to was ‘2’.

Well now I’m going to tell you the most awful truth about that equation. A truth that you never hear in schools or universities and that has, in many ways, been ignored for centuries. A truth that some might even argue is the root of all the evils in the world.

The truth is simply that, in the real world, 1+1 NEVER equals 2.

How can that be? Add one thing, say a soccer ball, to another identical soccer ball and you’ve got two soccer balls, surely?

But the fact is, as Nietzsche pointed out, that no two things in the whole universe are identical. From grains of sand all the way up to galaxies: all have differences. Some of these differences are extremely subtle, but they are still there. No matter how closely things are machined by nature (for example sand grains) or by man (for example footballs or sugar grains), there are always tiny differences.

Now you may consider this nit-picking. Grains of sand are so similar that it hardly matters that they are not identical. And for most of the works of man, for example shovelling sand into a cement mixer to help construct a building, that’s true. But by making that simple approximation, by ignoring the leap we’ve taken for the purposes of simple expediency, we’ve disregarded what may be the most profound truth of all.

Science, in whatever shape or form, is the search for truth. All hard science, and much of the so-called ‘softer’ social sciences, is based on numbers. It’s how we handle the universe. And all our equations and calculations are based on simple arithmetic. And that arithmetic is based on one plus one equalling two.

But, however you cut it, all our mathematics is built on a fundamental untruth because it supposes that two things can be considered identical.

Basing whole systems, whether they be scientific or political or whatever, on untruths can still work, for a while anyway. You could, for example, design a well-lubricated mechanical system to run forever by pretending that friction didn’t exist. It may run for a while but, sooner or later, any such system will fail. Nazi ideology was based on the idea that Germanic races were inherently superior to others. How did that work out in the end?

For quite a while now I’ve been putting forward these arguments to physics colleagues and friends. This has elicited a range of comments and ripostes. Let me try to summarise these comments, leaving out the occasional rude word, and supply an appropriate response.

 

What about atoms? They’re identical. One carbon atom plus one carbon atom equals two carbon atoms. Give me a break!

The truth is I don’t know if atoms are identical and neither does anyone else, because we can’t see down to that scale. As far as we are aware, they behave pretty much identically under our coarse methods of measurement. This might lead some to suppose they are identical, but it certainly isn’t proof. There are tiny differences in everything else we can see, so why shouldn’t the same thing apply at the atomic scale? In earlier times, grains of sand, fleas and just about anything else that was tiny were thought to be identical, until the microscope was invented. The finer the detail we can resolve, the greater the variation we inevitably find. The idea that atoms are really identical supposes a radical departure from our real-world experience.

 

It doesn’t matter because maths works! Where would your iPad and your smartphone and your Kindle be if it didn’t? You’re wasting my time.

Maybe. Maths does indeed work, within certain limits we will discuss later, and will continue to work up to a point. Then it won’t, just like any other system based on a fundamental untruth. Maybe we should be at least cognisant that this approach sets fundamental limits on our perception of reality. Nothing is identical to anything else but in everything we do, from calculating population size to ordering a round of drinks, we ignore this. What are we missing by ignoring the one thing which appears to be universally true: that nothing is the same as anything else?

 

 

1+1=2 was never meant to be taken literally. It’s about representation. One dollar bill plus one dollar bill may not give you two identical dollar bills because of slight variations in their physicality such as their weight (and not least because of the varying amounts of cocaine adhering to them), but what we are actually talking about is the representation of their worth.

Though a more subtle response than the others, it falls into the basic trap set by the equation in the first place. Financial value, corresponding to what a dollar represents, is, like the identical objects required for arithmetic, a notional concept that resides only in our heads. It is not some hard, absolute, real-world thing, though I can’t help but accept that not having financial worth, notional or otherwise, can make life very real indeed.

 

The equation is only an approximation. Two grains of sand may vary in terms of their shape and mass but they are approximately the same.

That’s a better answer but it only comes after you’ve pointed out that the equation is never actually true. Ask a hundred people what 1+1 equals and I doubt a single one will say: approximately 2.

And it is that word approximately that is so crucial here. Two objects can only be identical enough for the 1+1 to equal 2 if we determinedly ignore all the things that make them different. If, for example, we ignore the different shapes and masses and colours and elemental contaminants that go into each grain of sand. One Nelson Mandela plus one Joseph Stalin can indeed equal two humans but only if we ignore all their manifold differences.

The point is that even this most simple of arithmetic is an artificial construct that never directly corresponds to the real world. And yet this artificial way of thinking is drummed into us from an early age. It’s one of the ‘three Rs’: reading, ‘riting and ‘rithmetic. Yet most teachers, themselves unaware they have taken this step into unreality (in other words treating some things as equal by the simple expedient of ignoring all the aspects of them that are different) do not pass on this awareness to the young people they teach in turn.

Perhaps this is why some people struggle with arithmetic and mathematics generally. These are often intelligent people who are perhaps labelled as ‘artistic’ because they are ‘bad with numbers’. Maybe the reason for this is that arithmetic doesn’t make sense to them at a fundamental level. Perhaps they are innately aware of the complexity of life where nothing is identical to anything else. Yet in their maths classes they have to pretend things are identical and will get punished if they don’t toe the party line.

The problem for such people is exacerbated because they are taught arithmetic when they are only a few years old, far too young to articulate any feelings of unease. Arithmetic is taught in a dogmatic and unquestioned fashion. How can a six-year-old stand up to that? Instead they retreat, tail between their legs, convinced they are somehow inferior because they are ‘bad with numbers’.

As well as making perfectly intelligent people feel inferior, the nature of arithmetic, or perhaps the mindset from which it springs, arguably paves the way for some unfortunate consequences.

Early teaching of arithmetic is perhaps the first and most forceful way we are inculcated with the view that ‘things’ can indeed be regarded as identical. It is a powerful tool. After all, if in later life you own a pub then it is important to work out how many bottles of beer you need to order a month. It doesn’t help you to focus on the slight differences in the shape and weight of the bottles and the minor variations in their contents.

The propensity to regard two objects as equal probably didn’t originate with arithmetic, but instead may represent aspects of how our brains are hard-wired to work, as we will see later. A universe where everything is different from everything else is a scary place. If you were on the African plains ten thousand years ago, and were starving and needing to hunt an antelope, then you couldn’t allow yourself to be distracted by the fact that each one was different. Instead, you needed to focus on the commonalities: for example the footprints antelopes make, the way they move and so on.

For our ancestors, each animal didn’t have a unique identity; they just became ‘antelopes’. It’s how our limited, as opposed to omniscient, minds handle the notion of them. In a complicated universe one has to simplify to survive. There will, of course, be some nuances on top of this; each animal won’t act identically, and an experienced huntsman will factor this in when tracking and hunting them.

As man moved away from hunting single animals and supplementing his diet with the odd misshapen fruit, he made the move towards agriculture. In doing so he began to deal with many animals, such as flocks of chickens, or cultivated vast orchards and fields of fruit and vegetables. Selective breeding over time made all these animals and plants more and more alike, and this made the need for mechanisms to handle these greater numbers even more compelling. Arithmetic was invented and the awareness of the non-identical nature of animals and plants began to fade. What room is there for individual identity in a factory-farmed chicken amongst a multitude of others?

And arithmetic spoke to something in the way our limited minds worked. An example is found in the matter of tribal identity. This concept allows us to deal intellectually with a ragtag bunch of wildly different individuals. If you want to warn a child not to go into the territory of a few hundred hostile individuals, you don’t want to go through a list of all of them. It’s easier to define them all as a specific tribe.

It’s a handy, shorthand way of dealing with a complex situation but the problem is that by stripping out the complexity of individuality it starts to characterise all the people within each tribe as the same, especially tribes which aren’t friendly.

And that is the first step on a very slippery slope. As soon as we start to mentally handle large groups of individuals by ignoring their differences, then we start to see them as ‘all the same’ and terrible things can happen. To Hitler, Jews were an undifferentiated mass, which meant that his earlier bad experiences with a few individuals tarred them all with the same brush.

Perhaps the earliest and clearest example of this linkage between arithmetic and evil can be found in the ancient practice of decimation. The word is often used wrongly nowadays to denote a massacre in which all, or at least many, of a group of people are killed. In fact this was a technique that originated in the early days of the Roman Republic. It was employed as a form of punishment for groups of people. The first recorded use was in 471 BC, when decimation was applied to soldiers in a legion that had shown cowardice or had misbehaved in some way. Men were divided into groups of ten and drew lots. The man who drew the short straw was then bludgeoned to death by his colleagues. In other words, only one man in ten was killed.

No attempt was made to discern individual guilt or to discriminate in terms of any actions, good or bad, perpetrated by each individual. Instead they were all equated as ‘the same’ and their level of guilt was considered exactly equal.

Incidentally, though decimation suggests one in ten were killed, the number could be one in five or whatever was thought appropriate by the one doing the decimating.

Decimation did not begin and end with the Romans but was practised by the Italians during the First World War and the Soviets during the Second. Indeed a Soviet general at Stalingrad personally shot every tenth man until his ammunition ran out [2].

And that’s only what armies did to their own troops; decimating their enemies, such as captured prisoners, was also practised.

But the arithmetical ‘quotas’ that exactly equated one man with another, and consigned them to death in vast multitudes, reached their pinnacle in Russia at the time of Stalin. This became known as the Great Purge.

Russia faced huge potential unrest due to a famine largely caused by Stalin’s forced collectivisation of farming. In order to tame this general disenchantment, Stalin developed an essentially random mechanism to keep Soviet citizens in a state of perpetual terror. Stalin’s stated aim was to reduce the reservoir of terrorists and spies. Later on, it was used to reduce the threat from other wings of the communist party led by Bukharin and Trotsky.

Perhaps a million people were killed during the Great Purge or died due to the terrible conditions in the prison camps to which they were consigned as part of other arithmetically determined quotas. Victims came from the top to the bottom of Soviet society: five of the six original Politburo members succumbed, as did three of the five army marshals and eight of the nine admirals. Intellectuals of all persuasions were imprisoned and perhaps only a quarter survived. Peasants, churchgoers and clergy suffered similar fates.

At first, victims were targeted because of at least some suspected activities; for example, the study of sunspots was considered un-Marxist and nearly thirty astronomers paid for this with their lives. However, the situation became even grimmer when the Soviet leaders resorted to arithmetic alone. Top-down calls provided the actual numbers to be executed within the military and across the regions of the country. Tens of thousands of executions were ordered without naming any specific individuals. Local party officials, to show their zeal and loyalty, sometimes asked for their quotas to be increased.

Arithmetic can clearly be a dangerous business, playing as it does to the limitations of how our minds work, so where did our interest in numbers come from and how did it lead to other aspects of what we call mathematics?

 

 

The History of Numbers

 

Counting has been around for at least 30,000 years, as the 55 marks in groups of five found on a wolf bone in the Czech Republic would seem to attest [3]. The grouping in fives is presumably because of the number of fingers on a human hand. Bearing in mind the material on which the marks were made, it’s surprising the person had any fingers or even a hand left at all. That was one tough arithmetician!

Nowadays we’re used to counting in units of ten and it was the Egyptians who started using this decimal system in about 3000 BC.

Pythagoras took numbers to a new level in the 6th century BC. Though a Greek, he learnt his numbers from the Egyptians but took the whole business further, transcending their basic use in counting. Numbers became sacred things in themselves. ‘Number is the first principle’, ‘Number is the essence of all things’, ‘All is number’, he is said to have declared.

From being used to count chickens to becoming sacred is quite a jump for arithmetic, but this is only the first example of where something that was really only a tool has become the subject of veneration. As we will see, in the millennia that followed Pythagoras, scientists and mathematicians would often elevate the tools of their trade to the point of worship. The tools were the theories (mathematical, physical, biological) that tried to explain reality, but somehow in the process these often became that reality. The tools became the Laws.

Describing the history of numbers is dull stuff. Books that deal with it struggle to find much humour so, when it comes to Pythagoras, authors inevitably focus on the issue of beans.

Poor old Pythagoras couldn’t stand beans. Not only were they the cause of flatulence but he also thought they too closely resembled human genitals. No, I’m not sure why either.

So great was his aversion to beans that he would rather have had his throat cut than run across a bean field. And indeed one day, chased by his enemies and finding his way blocked by just such a bean field, that is exactly what happened.

Pythagoras did other unusual things. He founded his own sect and even, at one point, claimed he was a god. The sect was secret and new disciples weren’t allowed to speak or to make any other noises during their first years of membership.

Pythagoras even had a man called Hippasus killed because he had the temerity to give away the most dangerous secret of Pythagoras’ sect. I am going to explain what this terrible secret is, so you’d better brace yourself for a major revelation.

In Pythagoras’ perfect world of sacred numbers, any number could be expressed as the ratio of two whole numbers. For example, three and a half can be expressed as 7 divided by 2.

All was indeed perfect until Hippasus came up with his filthy heresy, namely that some numbers could not be expressed as the ratio of two whole numbers. One example of this is the pesky number pi (3.14159… and on and on).
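For readers who like to see such things for themselves, Hippasus’ heresy is easy to demonstrate on a computer. This short Python sketch, using the standard library’s fractions module, shows that three and a half really is the ratio of two whole numbers, while pi can only ever be approximated by one, however large a denominator we allow:

```python
from fractions import Fraction
import math

# Pythagoras' happy case: 3.5 really is the ratio of two whole numbers.
three_and_a_half = Fraction(7, 2)
print(three_and_a_half, "=", float(three_and_a_half))  # 7/2 = 3.5

# Hippasus' heresy: pi is irrational, so every fraction misses it.
# limit_denominator() finds the closest fraction whose denominator
# does not exceed the given limit.
for limit in (10, 1000, 100000):
    approx = Fraction(math.pi).limit_denominator(limit)
    print(approx, "is off by", abs(math.pi - approx))
```

The first two fractions it finds, 22/7 and 355/113, are the very approximations that ancient and medieval mathematicians used for pi; the error keeps shrinking as the denominators grow, but it never reaches zero.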

If that’s not worth killing someone for, then I don’t know what is!

As we will sadly and repeatedly see, Hippasus was only the first in a long line of people to suffer and even die because they came up with an uncomfortable truth that did not conform to prevailing theory.

That’s a terrible shame and waste because theory may never be the truth.

It would appear from Pythagoras’ statements such as ‘All is number’ that he was perhaps the first one not to understand that numbers could only ever be an approximation to reality. Certainly Euclid, whose work followed on from Pythagoras, stated as his first ‘common notion’ that: ‘Things that are equal to the same thing are also equal to one another.’ This implies that, as far as Euclid was concerned, there were at least three identical things in the universe whereas the truth is that there aren’t even two.

The Greeks thought all of nature could eventually be understood through mathematics and that all its workings could be unearthed by mathematical reasoning.

They’re all long dead, and therefore can’t sue, so I’m going to blame them for all the problems with science and mathematics described in this book.

That said, it must have been easy and comforting to be taken in by this way of thinking. If you’d never conceived of numbers then they might indeed seem like powerful magic. For example, any system that uses coinage is made possible only by the rules of arithmetic. Numbers make many things possible and seem to put the world on a firmer footing. It’s not surprising that some ancient peoples thought numbers held magical powers, often using them in their religious rituals.

The early Muslim world did a lot of thinking about mathematics and numbers (unlike Pythagoras, its scholars weren’t afraid of the irrational ones), but then Muslim theologians pretty much stopped further development. The reason seems to be that they were concerned it would uncover secrets Allah might want to remain hidden.

It would seem that in the Roman, Greek and Muslim worlds there was a widespread belief in the absolute, and indeed sacred, meaning of numbers. Left far behind was the awareness that numbers were just simple tools that could only ever reflect reality in approximate ways.

Even in these supposedly enlightened times this is still essentially the case.