Is Mathematics Philosophically Analytical?

Mike Arnautov, February 2008

There is a philosophical view, going back at least to David Hume, that mathematical thinking is purely analytical in nature. In other words, that it consists of working out more and more complex but inescapably tautological consequences of some initial postulates. This perception was clearly shaped by exposure to Euclidean geometry, seen as the epitome of a mathematical theory.

Plausible as it is, this view of mathematics is deeply misguided. The problem lies in a phenomenon one might describe as a trap of over-familiarity – a condition in which one is so familiar with a particular line of thought that it appears to be completely seamless. We look at basic school-level mathematics and can no longer see the non-deductive leaps that were necessary for its construction.

It is certainly true that a lot of mathematics is indeed concerned with deductive or inductive (in the mathematical sense) reasoning from given premises. However, major developments are generally neither deductive nor mathematically inductive in nature. They typically consist either in extending existing concepts in a novel way beyond the domain of their initial definition, or in forming entirely new concepts and applying to them some of the pre-existing mathematical methods.

The earliest example of an extending move can be clearly seen in the extension of the natural numbers (1, 2, 3 ...) to the integers (... −2, −1, 0, 1, 2 ...). We are so familiar with negative numbers and with the crucial concept of zero that we can no longer see just how "unnatural" they actually are. It is only by exploring the history of mathematics that one can appreciate the boldly innovative moves involved in their creation.
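Modern set theory makes this extending move explicit: an integer can be constructed as a pair of naturals (a, b), read as "a minus b", so that negative numbers and zero fall out of machinery already available for the naturals. A minimal sketch of that standard construction (mine, not the essay's – the function names are illustrative):

```python
# An integer as a pair of naturals (a, b), read as "a - b".
# Pairs (a, b) and (c, d) represent the same integer when a + d == b + c.

def normalize(a, b):
    """Reduce a pair to canonical form, in which one component is 0."""
    m = min(a, b)
    return (a - m, b - m)

def add(x, y):
    # (a - b) + (c - d) = (a + c) - (b + d)
    return normalize(x[0] + y[0], x[1] + y[1])

def neg(x):
    # -(a - b) = (b - a): negation needs no new numbers, just a swap
    return (x[1], x[0])

three = normalize(3, 0)       # represents  3
minus_two = normalize(0, 2)   # represents -2
print(add(three, minus_two))  # (1, 0), i.e. 1
print(neg(three))             # (0, 3), i.e. -3
```

The point is not that ancient mathematicians thought in pairs, but that once the construction is seen, the "unnatural" negative numbers look forced by the arithmetic of the naturals themselves.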
The initial resistance to this innovation, and the struggles of Indian mathematicians to come to grips with the rules governing non-positive integers in basic arithmetical operations, leave one in no doubt that considerably more was involved than mere deduction or induction.

A somewhat less familiar, and therefore more instructive, example can be seen in the development of the concept of exponentiation. The initial move of defining exponentiation in the first place is a purely analytical one: "X to the power of N" is simply a shorthand for "N instances of X all multiplied together", where X is called the "base" and N the "exponent". In this definition, exponents are necessarily positive integers – it clearly makes no sense to multiply a number by itself a fractional, let alone a negative, number of times.

However, once the rules for multiplying and dividing powers of the same base are established, it becomes quite clear that it makes sense to define the zeroth power of any number to have the value 1, and to define powers with negative exponents as reciprocals of powers with the corresponding positive exponents. In retrospect, this first extension of the concept of exponentiation can be seen as obvious to the point of being inevitable. However, it flatly contradicts the original "naive" definition of exponentiation and therefore cannot be arrived at either by straightforward deduction or by mathematical induction. The highly imaginative and distinctly non-analytic move of discarding the original definition was motivated by an ill-defined, yet immediately plausible, notion of "making sense".

As it happens, two more creative mathematical moves also concern exponentiation. One was the fairly familiar move of defining fractional powers in terms of inverse exponentiation ("taking roots"). The other, stunning in its consequences but far less familiar, was the generalisation of exponentiation to complex numbers.
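The chain of extensions can be traced concretely. The quotient rule x**m / x**n == x**(m−n), established for positive integer exponents, is what "forces" x**0 == 1 and x**(−n) == 1/x**n; fractional powers then behave as roots, and Euler's formula carries the concept into the complex plane. A sketch (my illustration, not the essay's):

```python
import cmath
import math

def naive_power(x, n):
    """The original definition: n instances of x multiplied together (n >= 1)."""
    result = x
    for _ in range(n - 1):
        result *= x
    return result

x = 2.0
# The quotient rule holds where the naive definition applies:
assert naive_power(x, 5) / naive_power(x, 3) == naive_power(x, 2)
# Applying the same rule where the naive definition is silent:
print(naive_power(x, 3) / naive_power(x, 3))  # 1.0   -> motivates x**0 == 1
print(1 / naive_power(x, 3))                  # 0.125 -> motivates x**-3
print(x ** 0, x ** -3)                        # Python's ** agrees: 1.0 0.125
# Fractional exponents as roots:
print(9 ** 0.5)                               # 3.0, the square root of 9
# And the leap to complex exponents, via e**(i*t) = cos t + i sin t:
z = cmath.exp(1j * math.pi)                   # Euler's identity: e**(i*pi) == -1
print(abs(z + 1) < 1e-9)                      # True (up to rounding)
```

Each extension contradicts the "repeated multiplication" reading – you cannot multiply a number by itself π·i times – yet each is the unique choice that preserves the established rules.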
Once again, both of these generalisations can be seen as the only ones that make sense, but neither can be arrived at by purely analytical methods. An imaginative leap is required in both cases.

As noted above, mathematics also develops by defining entirely new concepts – a process once again not reducible to purely analytic thought. A classic example with far-reaching consequences is Galois's definition of a group of transformations, with its consequent group classifications (e.g. into commutative and non-commutative groups, or into simple and composite groups), generalisations (rings, algebras, fields, lattices...) and manipulations (e.g. decomposition of groups into subgroups). By no means all such subsequent elaborating activity is analytical – some at least is concerned with further "making sense" of the new concept and its relation to the rest of mathematics.

Another rather interesting example is Descartes' creation of what we now call analytic geometry. Initially, the invention of the Cartesian coordinate system appeared to be no more than a new formalism for describing the familiar world of Euclidean geometry. The full, explosive consequences of that invention started coming into view only over 100 years later, with Gauss's work on the geometry of curved surfaces, leading to the creation of fully-fledged non-Euclidean geometries in the 19th century (and the subsequent revolution in physics in the 20th). As is now clear, Descartes' innovation paved the way to a profound generalisation of geometry in a way inconceivable within the original Euclidean formulation.

Once the presence of non-analytical moves in the development of mathematics is appreciated, it quickly becomes apparent just how all-pervasive such moves are. Practically all significant advances in mathematics can be attributed to imaginative leaps of one sort or another.
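The group concept mentioned above is easy to make concrete. The permutations of three objects form a group under composition – closed, associative, with an identity and inverses – and, unlike ordinary multiplication, composition is not commutative. A small sketch (mine, for illustration):

```python
# A permutation of (0, 1, 2) is a tuple p, where p[i] is where i is sent.

def compose(p, q):
    """Apply q first, then p: (p . q)[i] = p[q[i]]."""
    return tuple(p[i] for i in q)

identity = (0, 1, 2)
swap01 = (1, 0, 2)  # exchange elements 0 and 1
cycle = (1, 2, 0)   # rotate 0 -> 1 -> 2 -> 0

print(compose(swap01, cycle))   # (0, 2, 1)
print(compose(cycle, swap01))   # (2, 1, 0) -- a different result: order matters
print(compose(swap01, swap01))  # (0, 1, 2): swap01 is its own inverse
```

Nothing in arithmetic forces this definition; it is a freshly formed concept, to which pre-existing mathematical methods (classification, decomposition) were then applied.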
To list at random just a few of those not mentioned above: the concepts of a variable and of a function, infinitesimals and the calculus, enumeration of infinite sets, topological manifolds, imaginary and complex numbers, probability, matrices and tensors, analytic continuation of functions in the complex plane... This is hardly consistent with the notion of mathematics as a purely analytical science.

The really curious feature of mathematics has already been alluded to above: while clearly non-analytical in nature, all of these advances are seen in retrospect as "forced", as making sense in a unique and distinctly non-arbitrary way. Needless to say, it is conceivable that there are, or were, other alternatives which would also have made sense, in ways incompatible with the ones chosen in the historical process of development. However, looking at specific cases, it is hard to see how this could be so. Furthermore, mathematical innovations sometimes reveal unexpected connections between different branches of mathematics – a celebrated recent example being the "moonshine" (an official term!) connection between the "monster" group, discovered in the cataloguing of all finite simple groups, and more traditional number theory.

So how inevitable are all these developments? Could mathematics be constructed in some other, possibly incompatible, way? This is an old argument, echoing the Platonic and Aristotelian views of the world: are we inventing mathematics, or are we discovering it? As far as I am aware, if pressed on the subject, most practising mathematicians incline to the view that we are inventing mathematical tools for discovering structures which in some sense do objectively exist. This view is strongly reinforced by Gödel's stunning discovery of the essential incompleteness of mathematics, and by the subsequent working out of its consequences.
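One of the leaps just listed – the enumeration of infinite sets – rewards a closer look, since it underlies what follows. Cantor's diagonal argument shows that the real numbers cannot be enumerated: given any list of infinite binary sequences, flipping the k-th digit of the k-th entry yields a sequence that differs from every entry in the list. A finite sketch of the trick (mine, truncated to four digits):

```python
def diagonal_complement(listed):
    """Flip the k-th digit of the k-th sequence: the result differs from
    entry k at position k, for every k, so it appears nowhere in the list."""
    return [1 - row[k] for k, row in enumerate(listed)]

listed = [
    [0, 1, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
    [1, 0, 1, 0],
]
missing = diagonal_complement(listed)
print(missing)            # [1, 0, 1, 1]
print(missing in listed)  # False: the new sequence evades the whole list
```

The argument works for any list, however produced, and so establishes that the set of reals is strictly "bigger" than the set of integers – the gap interrogated by the Continuum Hypothesis below.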
Take, for example, Cantor's Continuum Hypothesis, which says that there is no set with cardinality strictly greater than that of the integers and strictly smaller than that of the real numbers. As Gödel and Paul Cohen showed between them, the Continuum Hypothesis is an undecidable proposition: it can be neither proved nor disproved from the axioms of set theory. Either the existence or the non-existence of a set of such intermediate cardinality can be added as a set-theoretic axiom (clearly a non-analytical move in either case). And yet a set of such intermediate cardinality either can or cannot be constructed, so the Hypothesis is either true or false – mathematics simply does not allow us to determine which is the case.

Roger Penrose, in his books "The Emperor's New Mind" and "Shadows of the Mind", argues (to my mind inconclusively, yet fairly persuasively) that Gödel's incompleteness theorem also has profound implications for our conception of the mind. He argues that while Gödel's proof could be constructed "mechanically" by a computer, some "non-computational" move must be necessary to grasp the full range of the proof's consequences.

All of this raises some profound philosophical questions. It seems clear that Hume's view of mathematics was at best simplistic, and at worst quite misguided. The discovery of mathematical truths has much more in common with our exploration of physical reality than is usually acknowledged. Is this in any way related to the issue of the "unreasonable effectiveness of mathematics"? I do not think such an inference can be made, but that is a different topic.

o O o
