Vincent R. Johns (firstname.lastname@example.org) wrote:

: David K. Davis wrote:
: >
: > Alberto C Moreira (email@example.com) wrote:
: > : firstname.lastname@example.org (Gerry LaValley) wrote:
: > : > <stuff snipped>
: ...
: > I said I'd shut up for a while, but I'm having trouble. I believe that
: > there is knowledge that is not essentially reducible to algorithm, and I
: > believe that there is a mathematical argument that shows why. Some
: > (most; almost all, actually) data is algorithmically incompressible;
: > that is, the shortest algorithm that encodes the data is almost as long
: > as the data itself. Can it be that physical systems implement processes
: > that, while perhaps reducible to algorithm in theory, would in practice
: > require an algorithm of such immense complexity that it would not count
: > as an algorithm by ordinary standards, that is, would be unusable by a
: > computer? Can a physical (biological) process be such that any
: > algorithmic emulation would be impractical? I think it's possible, and
: > I think it's likely in the case of the mind. (G. Chaitin is the one
: > responsible for the algorithmic complexity ideas, but not for my use or
: > abuse of them here.)
: >
: > -Dave D.
: Your comments about "almost all" data may be true, but they are likely
: the necessary result, IMHO, of our desire to simplify the universe to
: make it comprehensible. That is a major function of scientific theories.
: If I have no theory to summarize some measurements that I have taken, I
: need to remember the measurements; with a theory, I can just remember
: the exceptions. Random errors can be reduced by improving the
: instrumentation, broadening the population of events observed, etc. So
: it sounds to me as if you're saying that any data worth recording (i.e.,
: those not already adequately explained by theory) are algorithmically
: incompressible (i.e., not explained by theory). Much of everyday
: experience (e.g., the sound of the fan in my computer) is easily
: explainable (I expect the noise to be there and do not attend to it) and
: can thus be ignored. Unusual events are the incompressible part of life,
: at least until a suitable compression scheme becomes available.
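The "theory as compression" idea above has a concrete analogue in ordinary data compression, which one can sketch roughly like this (my illustration, not from the thread; Python's standard `zlib` module stands in for a "theory" that exploits regularity):

```python
import os
import zlib

n = 10_000
structured = b"a" * n        # highly regular: a one-line "theory" describes it all
random_data = os.urandom(n)  # no regularity for the compressor to exploit

packed_structured = zlib.compress(structured)
packed_random = zlib.compress(random_data)

# The regular data shrinks to a few dozen bytes; the random data does not
# shrink at all (the compressed form is, if anything, slightly larger).
print(len(packed_structured), len(packed_random))
```

The random bytes play the role of the "unusual events": no compression scheme available to the compressor accounts for them, so they must be stored essentially verbatim.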
What I'm focusing on are theories of mind, and in particular the idea that the mind can be emulated by an algorithm. I'm suggesting that there may exist no algorithm of reasonable (implementable, comprehensible) complexity that emulates the mind. The algorithmic complexity argument suggests the possibility, and evolution suggests the likelihood. The mind is the result of eons of physical evolution; it is molded by the experience of the individual; and the individual lives in a culture with tens of thousands of years of development (at least). Can the end result be effectively summed up in a neat algorithm? I don't think so.
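For the record, the "almost all data is incompressible" claim invoked here rests on a standard counting argument from Kolmogorov/Chaitin complexity theory (my summary, not from the thread):

```latex
\#\{\text{descriptions shorter than } n \text{ bits}\}
  \;=\; \sum_{i=0}^{n-1} 2^i
  \;=\; 2^n - 1
  \;<\; 2^n
  \;=\; \#\{n\text{-bit strings}\}
```

So there are not enough short descriptions to go around: at least one $n$-bit string has no description shorter than itself, and more generally at most a $2^{-(k-1)}$ fraction of the $n$-bit strings can be compressed by $k$ or more bits, since only $2^{n-k+1}-1$ descriptions of length at most $n-k$ exist. Whether this mathematical fact licenses the conclusions about mind drawn above is, of course, exactly what the thread is arguing about.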