With the help of my friend Ali, whose interest in the Mandelbrot set sparked a discussion of imaginary numbers, negative numbers, etc., I've had a chance to clarify some of my views about whether mathematical concepts have some transcendental reality or not. I'm not going to pretend I've solved this question, but here's where my thinking stands right now:
First, the puzzle. Some logical/mathematical concepts, most obviously the Law of Identity, seem to be analytic a priori truths. But consider the thought experiment of a universe in which there is nothing rather than something, which prima facie appears possible, although perhaps in fact it is not. In such a world, would these logical concepts exist? And what would they mean?
Well, decades ago Russell and Whitehead attempted, in Principia Mathematica, to build a formal system for mathematics on pure logic. Russell himself had discovered that any attempt to build such a system on the naive concept of a 'set' was doomed to failure, because a 'set' cannot be rigorously defined in a way that can ground a number system without leading to contradictory results.
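For concreteness, here is the standard modern statement of the contradiction Russell found, in set-builder notation rather than the notation of Principia. Consider the set of all sets that are not members of themselves:

$$ R = \{\, x \mid x \notin x \,\}. $$

Asking whether R is a member of itself then yields

$$ R \in R \iff R \notin R, $$

a flat contradiction, so the naive assumption that every property determines a set has to go.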
So this attempt to define numbers in terms of sets failed. But clearly numbers are useful, and they were historically discovered because they describe our world. Whatever formal system is true, it's true because it corresponds with reality. Consistency (the law of non-contradiction) is a necessary condition for any formal system to be considered true, but not a sufficient one. It doesn't tell us, for example, how to decide which of three alternative geometries, each of them internally consistent, is true.
We might think that we could choose among such systems based on their correspondence with the real world, and in a given, socially agreed-upon context, we can. But Gödel demonstrated that no consistent formal system rich enough to express arithmetic can prove everything that is in fact true of the domain it describes, and it follows that no such system can live up to the ideal of a perfectly transparent mirror of reality in itself.
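Stated a little more carefully (this is just the standard form of the first incompleteness theorem, nothing beyond what the claim above needs): for any formal system F that is consistent, has an effectively listable set of axioms, and can represent elementary arithmetic, there is a sentence G_F in the language of F such that

$$ F \nvdash G_F \quad\text{although}\quad \mathbb{N} \models G_F, $$

that is, G_F is true on the standard interpretation precisely because it is unprovable in F.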
Consider, for example, addition. Any reasonable system of arithmetic will have, either as axioms or as theorems, the basic additive equations: a + 0 = a, a + b = b + a, a + (-a) = 0, (a + b) + c = a + (b + c), and so forth. We assume, because we derive these formalized intuitions about addition from the world and apply them in talking about the world, that they represent truths about the world.
This assumes the existence of a 'true' mathematical description of the world that any given mathematical description--one inconsistent with other internally consistent systems built on different axioms--merely approximates. However, it is clear that such a system cannot, even in theory, be derived either from pure logic or from the correspondence theory of truth. It is nothing but a chimera.
The only thing any formal system tells us about unambiguously is the rule set for manipulating a set of symbols defined with reference to that system. And since no such system can in fact provide a complete description of capital-T Truth, there is an apparent gulf between the formal system's symbols and the universe they purport to describe.
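To make the point concrete, here is what '1+1=2' amounts to inside one such formal system, Peano arithmetic, writing 1 as S(0) and 2 as S(S(0)) and using the defining rules a + 0 = a and a + S(b) = S(a + b):

$$ 1 + 1 \;=\; S(0) + S(0) \;=\; S(S(0) + 0) \;=\; S(S(0)) \;=\; 2. $$

Nothing in that chain of rule applications mentions dogs, clouds, or moles of atoms; the symbols are cashed out entirely by the system's own rules.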
But everyone knows that 1+1=2, right? So '1', '+', '=', and '2' must all mean something, right? Perhaps not. Wittgenstein famously observed that it is a mistake to assume that because the same word is applied to a class of objects, they must all have something in common. He recommended, instead, that we simply look and see whether they do.
Do they? Consider '1'. There's 1 dog, 1 mole of atoms, 1 o'clock, 1 "the loneliest number", number 1 signifying preeminence or priority, the ace in my solitaire game, "the One" a single (unitary?) person hopes to meet, 1 thousand, 1 million, 1 in a million, the '1' I'm using to make my point, etc., and it's not at all clear that all the instantiations here have something in common. The same point can be made for such "clearly" intuited concepts as '+', '2', '=', etc.
It does no good to point out that my examples are imprecise applications of a concept, since that is exactly my point. In the context of the formal system in which they are defined, the only precise, transparent application of a number defined in that system is with reference to that system's rules for manipulating that number. Any application beyond that system--for example, extending '1+1=2' to 'Woody the dog and Misty the dog together make two dogs'--is by analogy, by a perceived family resemblance between '1' and 'Woody', even though in a different context, perhaps, Woody, like Walt Whitman, contains multitudes. It does no good to add rules explaining how to apply, say, the 'Woody' symbol, since no set of such rules can provide a complete description of reality. My ability to see that the equation applies depends on perceiving the "oneness" of Woody. The same goes for threes, fours, negative twos, etc.
In this example, we are required to accept the mathematical equation as a model describing reality. But just as when we use Tonka cars as a model to describe a real-life car accident, we are bound to focus on the family resemblance and ignore irrelevant differences. Our capacity to do this rests on our ability to see that in this context, this or that symbol can be applied to this or that thing. Again, though, it was Wittgenstein who demonstrated that ostensive definition of a thing, acceptance of 'models', etc., all depend on a tacit set of background assumptions that are taken to be understood, and there is no way to make ALL such assumptions explicit. That doesn't mean these analogies are meaningless, only that they are limited in scope. More to the point, '1+1=2' applies to this situation given one set of background assumptions, and to that situation given an overlapping but different set. For the mathematical symbols to be good models, they can't "stand for" the same things in every situation. (In fact, there are situations, such as the addition of clouds, where '1+1=2' doesn't apply in the usual way: one cloud drifting into another may simply make one bigger cloud.)
I too have intuitions about what is '1' and what is '2', but I think it is a very facile and ultimately incoherent assumption that, unless we are talking about pure, arbitrary (within the confines of consistency) rules for manipulating symbols, these concepts have unambiguous application to the things in the world from which, by experience, we derive them.