>Is the multiplication of two irrationals, especially two non-algebraic irrationals like the product e·π, a computable procedure?
The multiplication of two computable numbers is computable under commonly accepted definitions. Both e and π are computable, so in particular e·π is.
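To make that concrete, here is a minimal sketch (my illustration, not part of the original exchange) using the standard encoding: a computable real is a function that, given n, returns a rational within 2^-n of the true value. Multiplying two such numbers just means asking the factors for enough extra precision.

```python
from fractions import Fraction

# A computable real is modelled as a function f(n) -> Fraction
# satisfying |f(n) - x| <= 2**-n.

def e(n):
    # e = sum of 1/k!; the tail after the 1/k! term is below 2/k!.
    eps = Fraction(1, 2**n)
    s = term = Fraction(1)
    k = 0
    while 2 * term > eps:
        k += 1
        term /= k
        s += term
    return s

def atan_inv(q, n):
    # arctan(1/q) via its alternating series; the truncation error
    # is at most the first omitted term.
    eps = Fraction(1, 2**n)
    s, k, term = Fraction(0), 0, Fraction(1, q)
    while term > eps:
        s += term if k % 2 == 0 else -term
        k += 1
        term = Fraction(1, (2 * k + 1) * q ** (2 * k + 1))
    return s

def pi(n):
    # Machin's formula: pi = 16*arctan(1/5) - 4*arctan(1/239).
    # The coefficients sum to 20 < 2**5, so 5 extra bits suffice.
    return 16 * atan_inv(5, n + 5) - 4 * atan_inv(239, n + 5)

def mul(x, y):
    # Product of two computable reals.
    def product(n):
        # Magnitude bounds: |x| <= |x(0)| + 1 and |y(m)| <= |y(0)| + 2,
        B = int(abs(x(0)) + abs(y(0))) + 4   # so B > |x| + |y(m)|
        m = n + B.bit_length()               # and 2**B.bit_length() > B
        # |x*y - x(m)*y(m)| <= (|x| + |y(m)|) * 2**-m < B * 2**-m <= 2**-n
        return x(m) * y(m)
    return product

e_times_pi = mul(e, pi)
print(float(e_times_pi(60)))   # ~ 8.5397342226735...
```

Note that the procedure never "finishes" e·π; it hands back as many correct bits as you care to ask for, which is all "computable" ever promises.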
By contrast, the product of two arbitrary real numbers is, in general, not computable. An arbitrary "real" number is not even nameable or definable. You have no handle on it whatsoever, other than to say it occurs somewhere on the mythical number line. You can even define particular real numbers for which no one knows, or has any method to figure out, *where* they occur on the line.
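A standard example of that last point (my illustration, not from the original question): fix an enumeration M_1, M_2, ... of Turing machines and let the n-th binary digit record whether M_n halts,

$$
x \;=\; \sum_{\substack{n \ge 1 \\ M_n \text{ halts}}} 2^{-n}.
$$

This x is perfectly well defined, yet computing its digits would solve the halting problem, so no procedure can locate it on the line to arbitrary precision.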
You say we teach these "real" numbers as everyday numbers to children, and that's true. I'm not sure it's necessary or good, but it's true. But we forget to tell them some key differences.
We teach them that multiplying two numbers means taking the two numbers and finding the answer. That's a computation. Merely imagining that there "exists" an answer, one we can neither name nor compute, is a key difference, I would say. Those are rather different games to be playing.
I played Avalon Hill games as a kid: Gettysburg, Stalingrad, Panzer Blitz. I imagined I had armies filled with men and equipment and supply lines and even variable weather conditions. Only a few of the men had actual names; mostly they were dealt with in aggregate form. (Similar games are played by real generals, to aid in decision making.) Vividly imagining these "men" and whatnot was an essential part of the game, without which the game would lose its luster and probably become impossible to take any interest in. So it is with infinite-precision numbers and infinitely long straight lines.
When we want to take two numbers and actually get an answer (which is also what we teach kids to expect), we need to move out of the imagination and into the real (not "real") world. What nature hands us is computation, not infinite-precision numbers or infinitely long straight lines. Nature, it seems, does computation with or without us; in DNA processes, it is quite literally a fact of life.
Let's take stock. Take two numbers, multiply them, and find the answer: computation. Imagine the result of the logically defined or primitive operator '*': logic.
The axiomatically defined operator '*', with all its properties and relations, inhabits the world of logic. In *that* world we can assert things exist without ever producing them (imagine the ramifications for the legal world!). But it turns out that, as far as anyone has ever managed to really pin logic down, its procedures and workings are computable procedures anyway. You can't fool Mother Nature! (You can fool people, though.) As with the Avalon Hill games: I had to have real physical markers and boards and dice to play. They weren't the same as my imaginary men, but they somehow pointed to them.
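In fact, the usual Peano-style axioms for '*' over the natural numbers already read as a recipe (a standard observation; the formula is my addition, not from the original post):

$$
x \cdot 0 = 0, \qquad x \cdot S(y) = (x \cdot y) + x.
$$

Read left to right, the two equations *are* a recursive procedure for multiplying; the "logical" operator, once pinned down, hands you a computation.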
Now, Devlin says there are two different processes, repeated addition and multiplication, that can be used to arrive at the same answer when multiplying two numbers. Under my reading that's a lie, or a deep mistake he can't get out from under. First, the logical '*' has no "process" associated with it. Second, it doesn't produce actual answers.
(There is certainly more than one way to do computational multiplication; many, in fact. To anyone who says this is computer science and not math: have a look at http://cr.yp.to/papers/m3.pdf. I call that math.)
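To make that concrete, here is a sketch of two genuinely different procedures arriving at the same product: repeated addition, and a Karatsuba-style split-and-recombine. (My illustration; not Devlin's argument and not the algorithms from Bernstein's paper.)

```python
def repeated_add(a, b):
    # Multiplication as iterated addition (b a non-negative integer).
    total = 0
    for _ in range(b):
        total += a
    return total

def karatsuba(a, b):
    # A different procedure entirely: split each non-negative integer
    # into high and low halves and recombine three sub-products.
    if a < 16 or b < 16:
        return a * b
    m = max(a.bit_length(), b.bit_length()) // 2
    hi_a, lo_a = a >> m, a & ((1 << m) - 1)
    hi_b, lo_b = b >> m, b & ((1 << m) - 1)
    z2 = karatsuba(hi_a, hi_b)
    z0 = karatsuba(lo_a, lo_b)
    z1 = karatsuba(hi_a + lo_a, hi_b + lo_b) - z2 - z0
    return (z2 << (2 * m)) + (z1 << m) + z0

assert repeated_add(1234, 5678) == karatsuba(1234, 5678) == 1234 * 5678
```

Same answer, entirely different processes. The logical '*' is indifferent to which one you ran, because it never specified a process in the first place.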