
Re: all of math is just addition and multiplication
Posted:
Jun 8, 2013 1:56 PM


On Jun 7, 2013, at 5:22 PM, Joe Niederberger <niederberger@comcast.net> wrote:
> Rather than focusing on "artificial intelligence" per se, some people have begun to investigate "natural computing". Here's a few links along those lines:
>
> http://cs.gmu.edu/cne/pjd/PUBS/CACMcols/cacmJul07.pdf
> http://www.santafe.edu/media/workingpapers/1009021.pdf
> http://www.epjournal.net/wpcontent/uploads/ep04434458.pdf
>
> From this viewpoint, a key question is whether the definitions and concepts surrounding computation (that we have from mathematics and logic) can be adapted to faithfully describe natural computational processes.
> And in fact, can "natural computation" be defined in a crisp way?
>
> Cheers, Joe N
It wouldn't be Turing based. That is not to say that an analog computer cannot produce a useful result that can be harnessed, but for the process to be called "computation" in the sense of a computer, I think it must have a discrete and algorithmic component (a Turing basis). In any event, I would start there. Artificial intelligence is Turing based; natural intelligence is not.

Another aspect I think is important is the separation of the state of the machine from the state of the data (or of the machine from the data, if you wish). In Turing-based solutions the state of the machine and the state of the data are separate. In natural computers they are one and the same.

I would classify most of the cryptographic machines of World War II and before as natural computation, while the tabulators, going all the way back to the late 1800s, were Turing based. In fact, as early as the 1920s tabulators were performing query-like operations (the sort of thing SQL later formalized), so even while the machines were mechanical their process was Turing. A non-digital watch, even one with very sophisticated complications, would still be a natural computer, while a digital watch would not, because of this Turing aspect.
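To make the machine/data separation concrete, here is a minimal Turing-machine sketch in Python. Everything in it (the function names, the sparse-tape representation, the bit-flipping rule table) is my own illustration rather than anything from the cited papers. The point is structural: the machine's state lives in the `state` variable and the fixed `rules` table, while the data lives entirely on the `tape`; neither contaminates the other.

```python
# Illustrative only: a tiny Turing machine where machine state
# (state register + transition table) is separate from data (the tape).

def run(tape, rules, state="start", head=0, max_steps=1000):
    """Run the machine until it reaches 'halt' (or max_steps)."""
    cells = dict(enumerate(tape))          # sparse tape; blank cell = "_"
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, "_")
        # The transition table is fixed data about the MACHINE,
        # consulted but never modified during the run.
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol           # only the tape (data) mutates
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).rstrip("_")

# A rule table that flips every bit, then halts at the first blank.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run("1011", flip))  # -> 0100
```

In a "natural computer" as described above there is no analogue of the `rules` table sitting apart from the tape: the gears of a mechanical watch, say, are simultaneously the mechanism and the record of its state.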
I haven't defined natural computation, unless you accept that any form of computation that is not Turing based is natural.
By the way, and maybe it is just me, but the articles you cite seem not to want to define computation at all, natural or otherwise. I have noticed this about a lot of papers lately. Maybe it is just a pet peeve I have developed over years of reading these things. It isn't that I think papers must always have bulletproof development behind them, but at the same time they shouldn't be mere word play. They should be sufficiently critical to address an intelligent reader's immediate concerns. It's like leaving a fantasy movie and allowing yourself the belief that it could have happened, versus realizing you just wasted $20 on a really poorly developed plot.
Bob Hansen

