On Jun 12, 2013, at 1:21 AM, GS Chandy <email@example.com> wrote:
> I disagree with you. I'd be grateful if you could explain as to just how that 'thought-experiment' suggested by Joe Niederberger describes, in your view, "the development of mathematics".
Joe described the process of reverse engineering an alien technology by studying its construction and behavior. That is how mathematics was and is developed, often simultaneously by separate cultures. That is how all knowledge is developed: you study and note recurring themes, and when you have enough of those themes you form a theory, and as you form that theory you develop an ontology to label its elements.
Back to Joe's example...
A computer is an electronic device, and I assume from his description that the beings examining the "computer" at least understood electricity and magnetism. So rather than complicate this with aliens, let's suppose we take a computer back in time and show it to "radio" engineers of the early 1900s. Let's also assume it is a vacuum tube computer, so as not to compound the technology gap with a physics gap as well. If you know anything about the development of computers, vacuum tube computers were every "bit" as much a computer as the PC right in front of you: the same principles and the same components, just different physics.
So, how long would it take the radio engineers to become computer engineers? What differentiates a vacuum tube computer from a vacuum tube radio? The answer is discrete logic versus analog logic. In other words, once they recognize that these tubes and components have been arranged around scenarios involving "on and off," they will be in business. That is not a remarkable paradigm shift for radio engineers of the early 1900s, because the telegraph engineers of the mid 1800s had already been there and done that. In fact, those same telegraph engineers had already established bit encoding patterns serving the same purpose as the encoding patterns used by your hard drive. I guess necessity is the mother of invention. Recognizing that this device is "digital" is the key, but it isn't the whole story. Once they understood the paradigm (digital), they would still have to reverse engineer this particular application of it (a computer). I am not saying that would be a no brainer, but it would progress quickly enough. The math was already established: binary arithmetic since the 1600s and Boolean logic since the 1800s. However, actually combining that math with circuits didn't come until Claude Shannon published his 1937 paper, "A Symbolic Analysis of Relay and Switching Circuits"...
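To make the "on and off" paradigm concrete, here is a minimal sketch (my illustration in Python, not anything from the original discussion) of the insight Shannon formalized: binary addition built from nothing but Boolean operations, which is exactly what a relay or tube circuit computes with switches.

    # A half adder built from Boolean logic alone. Shannon's insight was
    # that switching circuits (relays, tubes, transistors) compute
    # exactly these Boolean functions.

    def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
        """Add two one-bit values; return (sum, carry)."""
        return (a != b, a and b)   # sum = XOR, carry = AND

    def full_adder(a: bool, b: bool, carry_in: bool) -> tuple[bool, bool]:
        """Chain two half adders to add three one-bit values."""
        s1, c1 = half_adder(a, b)
        s2, c2 = half_adder(s1, carry_in)
        return (s2, c1 or c2)

    def add_bits(x: list[bool], y: list[bool]) -> list[bool]:
        """Ripple-carry addition of two equal-length bit lists (LSB first)."""
        result, carry = [], False
        for a, b in zip(x, y):
            s, carry = full_adder(a, b, carry)
            result.append(s)
        return result + [carry]

    # 3 (binary 011) + 5 (binary 101) = 8 (binary 1000), bits LSB first:
    print(add_bits([True, True, False], [True, False, True]))
    # -> [False, False, False, True]

The ripple-carry structure mirrors how the physical circuits were wired: each stage is a handful of gates, and the carry propagates from one stage to the next.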
And then, as Joe mentioned, there is the system (block) organization to contend with: RAM here, the CPU over there, and so on. But I don't think any of this would stump the radio engineers, because they had already dealt with complexity and block structure in their own work with RF equipment. For Joe's experiment to work, I would say we would have to send this vacuum tube computer back as far as the 1500s. That, I think, would be far enough back that the technology would be truly out of reach of the engineers of the day. They would have to put this vacuum tube curiosity on the shelf for probably 200 years. Even so, we would probably have had computers in the 1800s rather than the 1900s. Actually, the thing would probably have been lost in a fire long before that. (Things getting lost in fires appears to have been common back then.)
And that is where we are with reverse engineering people. Primarily, we do not have in our mental possession the paradigm on which natural intelligence is based. It isn't analog and it isn't digital. Even if we understand how a neuron works, we certainly don't understand this application of neurons. Not only do we lack a pivotal paper like Shannon's, we don't even have the arithmetic for such a paper to be based on. My focus has been on the arithmetic and the paradigm on which natural intelligence is based, not the science. I also don't call it "information processing," because I think that term applies to a different paradigm.
Back to math...
An example of a paradigm in mathematics is the concept of limits. That isn't the only paradigm that can lead to calculus, just as "digital" wasn't the only paradigm that could lead to computers, but when the dust settled those were the paradigms. If I handed you the paradigm of limits, I have essentially handed you calculus. That doesn't mean I gave you a book of calculus, but over time, starting there, you would create a book of calculus. And it would be quick, if the time was right, just as, once we caught on to "digital," we rewrote electronics in a single decade. But no one was handed the paradigm of limits. What we had were problems of calculus in nature, and mathematicians had to elicit a paradigm from them. And yes, it took thousands of years, and other problems are still beyond our current reach. Joe's process continues.
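To show how much the paradigm hands you, here is the standard modern statement in LaTeX notation (my illustration; the notation postdates the discovery by centuries): define the limit once, and the derivative, the core of differential calculus, drops out as a special case.

    % The epsilon-delta definition of a limit:
    %   lim_{x -> a} f(x) = L  means
    \forall \varepsilon > 0 \;\exists \delta > 0 :
        0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon

    % Given that single definition, the derivative is just one
    % particular limit, and calculus unfolds from there:
    f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}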