R Hansen says: >I think it must have a discrete and algorithmic component
Nerve cells are typically a sort of hybrid between the analog and the discrete, by which I mean the all-or-none "firing" behavior can be considered a discrete, or quantizing, mechanism. But then, the electronics in a microchip can be viewed the same way. I'm just bringing that up as a point of moderation -- not to deny that the physical substrates are vastly different. Other people think that large-scale parallelism accounts for much, but I think that fundamentally you can't compute anything in parallel that cannot also be computed serially -- faster in parallel, sure, but not different in kind. Likewise, the notion of "algorithm" is vague: what we know about computation in the artificial sense can be extended to ongoing, open-ended processes that do not fit the traditional notion of an algorithm.
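To make the parallel-vs-serial point concrete, here's a toy sketch of my own (not from any of the cited articles): two "parallel" workers summing halves of a list, simulated on a single thread by interleaving one step of each worker at a time. The interleaved serial run produces exactly the answer a genuinely concurrent run would -- slower, but not different in kind.

```python
def worker(chunk):
    """Adds one element per 'step', yielding control between steps."""
    total = 0
    for x in chunk:
        total += x
        yield None  # pause point: another worker could run here
    yield total     # final yield carries the result

def run_serially(workers):
    """Round-robin the workers one step at a time on a single thread."""
    results = [None] * len(workers)
    pending = list(enumerate(workers))
    while pending:
        still_running = []
        for i, w in pending:
            out = next(w)
            if out is not None:
                results[i] = out      # worker finished; record its result
            else:
                still_running.append((i, w))
        pending = still_running
    return results

data = list(range(100))
halves = [worker(data[:50]), worker(data[50:])]
print(sum(run_serially(halves)))  # prints 4950, same as sum(data)
```

The scheduler here is hypothetical and deliberately simple; the point is only that any parallel step sequence can be flattened into a serial one.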
So, those are all reasons why there is a challenge -- but it's not clear any of them are show-stoppers.
R Hansen says: > Another aspect I think is important is the separation of the state of the machine from the state of the data
I think the notion of an "abstract machine" is that a sufficiently well-defined programming language can be viewed somewhat independently of the hardware it actually runs on. Real hardware will obviously determine absolute performance, etc., but fundamental limitations (e.g., whether one can find a polynomial-time algorithm for the TSP) remain. Given an abstract machine, it is certainly allowable in the standard view for a program to modify its own instructions. I don't believe there are any fundamental differences in computing power between programs that do so and those that do not -- and likewise, by this "abstract machine" principle, I wouldn't expect truly fundamental differences between processes that can actually modify their hardware component and those that cannot. But I'm just thinking aloud here -- I could be wrong.
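A toy illustration of that point about self-modifying programs (my own construction, nothing authoritative): a little register machine whose program is ordinary data, so one instruction can overwrite another mid-run. The self-modifying program halts with the same answer a fixed program computes directly -- the self-modification buys no new computing power.

```python
def run(program):
    """Interpret a tiny instruction list; the program itself is mutable data."""
    acc, pc = 0, 0
    prog = [list(ins) for ins in program]  # mutable copy: code is data
    while pc < len(prog):
        op, arg = prog[pc]
        if op == "add":
            acc += arg
        elif op == "store_op":            # self-modification: rewrite an instruction
            target, new_ins = arg
            prog[target] = list(new_ins)
        elif op == "halt":
            break
        pc += 1
    return acc

# Instruction 1 rewrites instruction 3 from ("add", 100) to ("add", 5)
# before the program counter ever reaches it.
selfmod = [
    ("add", 2),
    ("store_op", (3, ("add", 5))),
    ("add", 10),
    ("add", 100),   # replaced at runtime by ("add", 5)
    ("halt", None),
]
fixed = [("add", 2), ("add", 10), ("add", 5), ("halt", None)]

print(run(selfmod), run(fixed))  # both print 17
```

Of course this only shows one case, not a proof -- the general argument is that a fixed universal interpreter (like `run` itself) can simulate any self-modifying program.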
R. Hansen says: >By the way, and maybe it is just me, but the articles you cite seem to not want to define computation at all, natural or otherwise.
Computation in the artificial sense was nicely defined by Turing -- and there is also the Church-Turing thesis. The lack of a definition for "natural information processes" is because that work remains to be done! That's the point. In an informal sense, I think we could agree that when I throw a ball and my dog catches it, there's an information-processing component at work, and that it's really real, not just some abstraction.
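For concreteness, Turing's definition amounts to a finite state control plus an unbounded tape. Here's a minimal sketch of my own: a machine whose transition table increments a binary number on the tape, with the head starting at the least significant bit.

```python
def tm_increment(bits):
    """Turing-machine-style increment of a binary string, e.g. '1011' -> '1100'."""
    tape = {i: b for i, b in enumerate(reversed(bits))}  # LSB at cell 0
    # Transition table: (state, symbol) -> (write_symbol, head_move, new_state)
    delta = {
        ("carry", "1"): ("0", +1, "carry"),  # 1 + carry = 0, propagate carry
        ("carry", "0"): ("1", 0, "halt"),    # 0 + carry = 1, done
    }
    state, head = "carry", 0
    while state != "halt":
        sym = tape.get(head, "0")            # blank cells read as 0
        write, move, state = delta[(state, sym)]
        tape[head] = write
        head += move
    width = max(tape) + 1
    return "".join(tape[i] for i in reversed(range(width)))

print(tm_increment("1011"))  # prints "1100"  (11 + 1 = 12)
```

The point isn't this particular machine but the shape of the definition: a finite rule table, a read/write head, and unbounded storage -- that's the whole artificial notion, and it's exactly the kind of crisp definition that "natural" computation still lacks.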