
Dreams and Things

There is mathematics and mathematics. Most scientists work on quite practical projects, adjusting the formal instrumentation for better performance in applications (including mathematics itself). Within this activity, it seems reasonable to explore a number of possibilities in search of the most viable schemes. Yet another group of problems comes from mathematical entertainment, bringing forth all kinds of funny brain twisters of no visible utility. Amateurs can thus assimilate the historically accumulated heaps of knowledge; for professionals, such musings may serve both as a valuable pause in the academic routine and as a chance to widen the horizon, getting acquainted with the current trends in the adjacent (or maybe faraway) areas of research. Finally, there is fundamental science, seeking the foundations of everything, to justify the available formalistic technologies, as well as to soothe the doubts of the influential sponsors.

For an outsider, fundamental mathematics may seem a perfect instance of delirium. Deep in our hearts, we can understand and concede the passion of a respectable person for making paper soldiers somewhere in the backyard; however, his ambition to thus gather an invincible army and conquer the whole world would certainly drive us to inquire into the kind of biochemical and neurological disorder he might have. Of course, many cases do not require any intensive medication: with a harmless idiot, mere regular surveillance and minimal behavioral corrections would be enough. Unlike a schizophrenic, who is aware of his disease but cannot control the phantoms of the mind, which makes him suffer, a great mathematician can live in his imaginary world and be entirely happy, believing that this is the only possible reality. This dreamer perceives the world around him as an imperfect "model" of the stainless truth and accepts only the agreeable side of things.

Occasionally, almost everybody may experience similar dim moments. Still, if I regrettably go crazy and start bombarding the academic authorities with, say, some well-developed theory of the budding habits of green three-horned demons, I am sure, at best, to be ostracized and blacklisted for all time; with lesser luck, there is a solid prospect of an asylum. On the contrary, some (originally wealthy) guy may significantly improve his well-being studying the intricate aspects of undecidable nontriviality...

A typical fundamental theory is born like that: we have some idea of how a certain class of things is designed; now, let us imagine something that is not designed exactly that way and explore the observable deviations. Up to this moment, no serious objection. We do not know what this stranger is like, but we are free to call it names (the tongue won't break) and play with our paper soldiers to the edge of doom. Persistent complications spoil the party when the (possibly collective) author forgets about the fantastic nature of such imaginary creatures and pictures the real world as it has never been. Later authors begin with the already formatted fancy and contrive more freaks, to obtain yet another (presumably more fundamental) abstract theory. Finally, we get what we fought for: a fundamental mathematics entirely alienated from real life (science included).

On the funny side, even such off-the-head mathematics will bring about many valuable and useful inventions. The world as it is rules it all, so that any ultimate perversity is only possible within some objective setup. Crazy science is a mirror of the crazy social organization of present-day humanity, still lacking the truly human reason. The exaggerated adherence to the rules of the game became a value in itself millennia ago, when prehistoric savagery gradually grew into a savage civilization; indeed, it is only in formal communication that those who cannot yet communicate in the human way could be brought and held together. With the fall of the class economy (provided we manage to witness it some day), everything will become much simpler, if only because there will be no need to prove anything to anybody.

Note that any (however abstruse) theory grows, in its deepest depth, from a quite earthly root. That is why the admirable inventors rarely present anything dramatically new; for the most part, they just transfer the already known onto their fantastic world, leaving us with the same ideas of a point, a number, ordering, addition and multiplication... That is, everything is decent and proper: the truncated two-valued logic, every species under an appropriate genus, the deductive method, structured artificiality, and an impressive reference apparatus all through. It shines with angelic beauty, but nobody knows what it is, or why. As for convincing power, let us keep silent. Today, mathematical proofs do not convince even great mathematicians. The very possibility of arbitrarily distorting any (however rigorous) theory is a clear indication of an inner (or even undisguised) slackness. No doubt, there are enough fancy words to explain even that. For example, in nonstandard analysis, they turn themselves inside out to prove the existence of nonstandard individuals. Just that many queer ultrafilters! Still, after all, here comes a reluctant admission that the ends won't meet without the banal axiom of choice (or one of its equivalents). Which is by now a kind of standard synonym, a symbol of arbitrariness. While the ill-starred Russell classes have eventually been deprecated among the honest folk, entirely dismissing the axiom of choice would bring the whole enterprise to a final crash.

Nonstandard analysis discusses its artificial entities on the basis of a number of ad hoc rules. The whole construction is intentionally tailored to mimic the well-known high-school results, and hence its value is reduced to occasional heuristic hints. Why? Just because there is no real-life trace of any of the exotic objects inhabiting the theory. Our everyday existence is quite rational. Even real numbers are already beyond practical accessibility, and we can only judge about them by indirect evidence, interpolating a chain of observables. Quantum theory is replete with higher-scale infinities; still, they are judiciously said to be unobservable by their very essence, to eliminate the prospect of eventual migration to some Hilbert space... Does that mean that all the mathematical infinities are primordial evil? Not at all. The logical implication only calls for a little more prudence in handling them, discriminating sheer fantasy from established fact.

For an illustration, take the ordinary mathematics of ordinals. Everybody is well acquainted with integer numbers. Used to indicate the order of enumeration, they are referred to as finite ordinals. For every integer, there is an integer immediately following it, and so on, until one gets bored. That is, integer numbers form a uniform scale to measure all kinds of finite things: with any step length, we can get from point A to point B after an integer number of steps. If the last step brings us too far, just use twice as short steps, and so on. Within the desired precision, we are bound to get right where we need. With the appropriate units, the (practical) distance between A and B is always expressible as an integer. In honor of an Ancient Greek, such scales are called Archimedean.
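
For the skeptical reader, here is a minimal sketch of that step-halving procedure (my own illustration; the helper name steps_to_reach is hypothetical): keep halving the step until some integer multiple of it lands within the desired precision of the target distance.

    import math

    def steps_to_reach(distance, step, tolerance):
        """Halve the step until an integer number of steps of that length
        lands within `tolerance` of `distance`."""
        while True:
            n = round(distance / step)        # integer number of steps of the current length
            if abs(n * step - distance) <= tolerance:
                return n, step
            step /= 2                          # "just use twice as short steps"

    # Even an "irrational" distance is practically reached on the integer scale:
    print(steps_to_reach(math.sqrt(2), 1.0, 1e-6))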

At this stage, mathematicians employ a sleight of hand. Their pet trick is inductive (or recursive) definitions. We are to believe that the existence of the integer number 1, together with the possibility to fancy an increment for any given integer n, implies the existence of a recursively defined set N containing all the positive integer (natural) numbers. Every layman will immediately see a striking resemblance of this logic to the axiom of choice: take one element from each set of a given family, and this will produce yet another set. In view of the common treatment of finite ordinals as sets (in the von Neumann scheme), the similitude becomes almost perfect. However, mathematicians will never agree: this is different...
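
For reference, a minimal sketch of that von Neumann coding of finite ordinals (the helper name von_neumann is mine), in which every number is literally the set of all smaller numbers:

    def von_neumann(n):
        """Finite ordinal n as a nested frozenset: 0 = {}, k + 1 = k ∪ {k}."""
        ordinal = frozenset()
        for _ in range(n):
            ordinal = ordinal | frozenset([ordinal])
        return ordinal

    three = von_neumann(3)
    print(len(three))               # 3: the ordinal n has exactly n elements
    print(von_neumann(2) in three)  # True: the order relation 2 < 3 becomes membership, 2 ∈ 3

The contentious leap, of course, is from each such finite stage to the completed totality of them all.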

Alright, let it be. With all that, why should feasibility always mean existence? Theoretically, any large enough ground can be used to construct a house. However, if I wish a tiny sweet home of mine on the Place de la Concorde in Paris, do I have the slightest chance? So, I'll have to settle elsewhere. Even if we are sure to imagine an integer number greater than any currently available one (and not merely denoted by the character n), this does not make such an integer a sure thing. Quite possibly, this new entity has not yet been involved in any practical activity, and hence it remains merely plausible, rather than actually constructed. Some computer, for instance, might object that big numbers cannot be represented within the present architecture, and hence they cannot be really used and should be qualified as only potentially existent. One must be a mathematician to mix up imaginary and real things.

Once again, let us swallow the far-from-nice tricks and admit the existence of a collection of all the integer numbers. But why should it necessarily be a set? After the experience of Russell's paradoxes, it seems to be commonly accepted that proper sets are all contained in a standard universe. All the rest should be referred to as classes, with a significantly less rigorous treatment. From nonstandard analysis, one learns that the collection N cannot be an "internal" set (that is, it does not belong to the standard universe). Still, following the old tradition, it retains the status of a set (albeit an "external" one). Similarly, real numbers are postulated to form a set, and the set of all subsets of a given set is a common construct... Such verbal exercises are not innocent. The perverted game of infinities is rooted right there.

Now, accepting N as a set, it is logical to ask: how many elements does it contain? Obviously, this number is greater than any given integer, and a different scale is to be used. We are told that the number of all integers is infinite, but we are free to invent some handy name and notation: let this particular infinity be denoted as ℵ0 (assuming that there can be other infinities, to be labelled accordingly). One could wonder why we need yet another sign for infinity, in addition to the centuries-old character ∞. The answer is: to make the game funnier. Anyway, it never had anything to do with reality, from the very beginning. In other words, there is infinity in general (∞) and the specific types of infinity, cardinal and ordinal numbers.

Since, by definition, the "set" N is greater than any finite number, it will represent an ordinal number greater than any finite ordinal. In this context, it is commonly denoted as ω. Meaning that, once mentally introduced, it is certain to really exist. More freaks, more fun. By analogy with the natural numbers, one might conjecture the existence of the next (infinite) ordinal ω + 1, which is greater than ω. Then, once again, the induction trick is to ensure the existence of the set of all elements of the form ω + n, for every (finite) integer n. As an expectable continuation, there is the ordinal ω + ω = ω ⋅ 2, which can be incremented in its turn... This immediately supplies all the combinations like ω ⋅ k + n, and then those like ω ⋅ ω = ω², and all the rest, following the same scheme. A real killer! Just keep playing on, up to the new countable type ε0, and even greater curiosities, finally coming to the first uncountable infinity ω1... As soon as an axiomatic layout is introduced, the dreamland becomes exact science, a matter of professional pride and a high-profit business.

For a passer-by, everything looks utterly strange. Well, with minimal reserve, one can accept the axiom α + 0 = α for all ordinals; on the contrary, axioms like α ⋅ 0 = 0 or α⁰ = 1 raise an inner protest. Since high school, we are accustomed to treating ∞ ⋅ 0 and ∞⁰ as "indeterminate forms" that should be resolved using additional information specific to a particular situation. Further, ordinal addition is said to be non-commutative:

ω + 1 > ω, 1 + ω = ω

In the general case (at a limit ordinal γ), the sum is to be defined in a crazy manner as

α + γ = sup{α + β ∣ β < γ }
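
To see how this non-commutative addition actually computes, here is a minimal sketch for ordinals below ω^ω written in Cantor normal form; the representation and the helper name cnf_add are my own toy choices, not any standard library:

    def cnf_add(a, b):
        """Ordinal addition in Cantor normal form: a and b are lists of
        (exponent, coefficient) pairs with strictly decreasing exponents."""
        if not b:
            return list(a)
        lead_exp = b[0][0]
        # every term of a with a smaller exponent is absorbed by b's leading term
        kept = [(e, c) for (e, c) in a if e > lead_exp]
        same = [c for (e, c) in a if e == lead_exp]
        if same:
            return kept + [(lead_exp, same[0] + b[0][1])] + list(b[1:])
        return kept + list(b)

    W = [(1, 1)]     # ω = ω^1 · 1
    ONE = [(0, 1)]   # 1 = ω^0 · 1

    print(cnf_add(W, ONE))  # [(1, 1), (0, 1)]  -- ω + 1, strictly greater than ω
    print(cnf_add(ONE, W))  # [(1, 1)]          -- 1 + ω = ω, the finite part is absorbed
    print(cnf_add(W, W))    # [(1, 2)]          -- ω + ω = ω · 2

The absorption of the lower terms is exactly where commutativity gets lost.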

It is generally known that the existence of the exact upper limit is far from being a trivial circumstance in the world of infinities; roughly speaking, one cannot properly define it without the axiom of choice (at least implicitly invoked). Of course, when our paper soldiers are to fight exclusively under the conditions of the a priori applicability of all the assumptions, there are no objections. That will do fine for a game. In science, such a generous environment can hardly ever happen. We don't need a theory of nobody knows what; we are to provide high-performance tools for everyday usage. The applicability of the formalism is then related to the object area rather than to any good intentions.

As nonstandard analysis indicates, the numbers of the form ω – n are also infinite, as compared to finite ordinals. The indeterminate form ω – ω is understood as yet another hyperinteger number which is less than ω, but still greater than any finite integer. The class of hyperinteger numbers is, therefore, everywhere dense: between any two infinities, there is yet another one. This is more compliant with the standard of a complete ordered field; on the other hand, the existence of all those infinities remains a matter of subjective conviction.

A lay person will logically wonder why, for infinite ordinals, we are to sacrifice the commutativity of addition and multiplication rather than some other properties of the ordinary numbers. Admittedly, the expected series of infinities will no longer be a field in the classical sense, and we need to weaken our axioms. But why not define addition in some other manner? For instance,

α + β = sup{α' + β' ∣ α' < α & β' < β }

Addition thus defined is obviously commutative and has really wonderful properties: ω + 1 = ω, 1 + ω = ω, ω + ω = ω ⋅ 2 = 2 ⋅ ω = ω. In this view, the whole zoo of countable ordinals would be a sheer fiction: there is a single countable infinity, and no need for extra junk in the head. Then, similarly, we can introduce the negative infinity –ω, so that –ω – n = –ω. As a result, there is a well-ordered field, with the only little exception that the inequalities n + 1 ≥ n and n – 1 ≤ n are not always strict. In any case, this is a much weaker assumption than broken commutativity! Theoretically, one could demand that ω – ω = 0, ω + (–ω) = 0. Then our system would be closed with respect to addition and subtraction. However, the correct decision would rather follow the usual rule of calculus and consider such expressions as indeterminate forms of the ∞ – ∞ type, which are to be resolved on the basis of practical considerations.
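
A minimal sketch of this symmetrized arithmetic (my own illustration, abusing IEEE infinities as stand-ins for ±ω; the helper name sym_add is hypothetical):

    OMEGA = float('inf')        # stands for ω
    NEG_OMEGA = float('-inf')   # stands for -ω

    def sym_add(a, b):
        """Commutative addition on Z ∪ {ω, -ω}; ω + (-ω) is left indeterminate."""
        if {a, b} == {OMEGA, NEG_OMEGA}:
            raise ValueError("indeterminate form of the ∞ - ∞ type")
        return a + b            # IEEE arithmetic already absorbs finite terms into ±inf

    assert sym_add(OMEGA, 1) == sym_add(1, OMEGA) == OMEGA   # ω + 1 = 1 + ω = ω
    assert sym_add(OMEGA, OMEGA) == OMEGA                    # ω + ω = ω
    assert sym_add(NEG_OMEGA, -5) == NEG_OMEGA               # -ω - n = -ω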

So, the fantastic worlds invented by reverend mathematicians are of no use for an ordinary person. There are the positive and negative infinities, and the infinite limit can be defined in a quite natural way. Similarly, there is a single infinitesimal value 1 / ω (the upper limit), complemented with the single negative infinitesimal 1 / (–ω) (the lower limit). We always keep within standard analysis and contemplate the harmony and beauty of the only world.

There is yet the specific issue of the existence of the uncountable infinity ℵ1 (or ω1), which should be reconsidered elsewhere. Still, it is clearly predictable that things are much simpler than the academic compendiums narrate. To start with, our "symmetrized" definition of addition is perfectly applicable to real numbers too, which means that the universal countable ordinal ω neatly coincides with the uncountable infinity ω1 and all the other uncountable ordinals! That is, uncountable sets differ from countable ones qualitatively, rather than on some quantitative grounds; basically, this is the opposition of discreteness and continuity, which can gracefully coexist in a bounded area, like, say, the interval (0, 1). As a bonus, one comes to the comprehension of the different levels of countability: the uniformly dense set of rational numbers is obviously an entity of a different kind than the sparsely spaced natural sequence. Along these lines, many great ideas are yet to come.

Just learn somewhat better manners while working with sets and their mappings, and do not put too much trust in the words "and so on" (in the inductive sense); this will tame the higher-order infinities and put them on good terms with reality. On such a foundation, no quanta can entangle us, and the light barrier is no obstacle.

1997


[Mathematics] [Science] [Unism]