
Topic: Matheology § 192
Replies: 1   Last Post: Jan 18, 2013 2:50 PM

mueckenh@rz.fh-augsburg.de

Posts: 15,308
Registered: 1/29/05
Matheology § 192
Posted: Jan 18, 2013 8:06 AM


Matheology § 192

We first consider the total amount of energy that one can harvest
centrally. [...] one finds
E_max = 3.5*10^67 J, comparable to the total rest-mass energy of
baryonic matter within today's horizon. This total accessible energy
puts a limit on the maximum amount of information that can be
registered and processed at the origin in the entire future history of
the Universe. [...] Dividing the total energy by this value yields a
limit on the number of bits that can be processed at the origin for
the future of the Universe: Information Processed [...] = 1.35*10^120.
[..] It is remarkable that the effective future computational capacity
for any computer in our Universe is finite, although, given the
existence of a global event horizon, it is not surprising. Note that
if the equation of state parameter w for dark energy is less than -1,
implying that the rate of acceleration of the Universe increases with
time, then similar although much more stringent bounds on the future
computational capacity of the universe can be derived. In this latter
case, distributed computing is more efficient than local computing (by
a factor as large as 10^10 for
w = -1.2, for example), because the Hawking-Bekenstein temperature
increases with time, and thus one gains by performing computations
earlier in time. [...]
On a more concrete level, perhaps, our limit gives a physical
constraint on the length of time over which Moore's Law can continue
to operate. In 1965 Gordon Moore speculated that the number of
transistors on a chip, and with that the computing power of computers,
would double every year. Subsequently this estimate was revised to
between 18 months and 2 years, and for the past 40 years this
prediction has held true, with computer processing speeds actually
exceeding the 18 month prediction. Our estimate for the total
information processing capability of any system in our Universe
implies an ultimate limit on the processing capability of any system
in the future, independent of its physical manifestation and implies
that Moore's Law cannot continue unabated for more than 600 years for
any technological civilization. {{Not a breathtakingly large
number.}}
[Lawrence M. Krauss, Glenn D. Starkman: "Universal Limits on
Computation" (2004)]
http://aps.arxiv.org/PS_cache/astro-ph/pdf/0404/0404510v2.pdf
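
A rough back-of-the-envelope check of the two figures quoted above can
be written as a small Python sketch. This is only an order-of-magnitude
illustration under assumed values, not the authors' own calculation:
the energy cost per bit is taken as the Landauer cost k_B*T*ln(2) at an
assumed de Sitter (Gibbons-Hawking) horizon temperature of about
2.4*10^-30 K, and the Moore's-Law horizon assumes a present rate of
10^20 bit operations per second and a doubling time of two years.

import math

# Assumed inputs (illustrative values, not taken from the paper's text)
k_B   = 1.381e-23      # Boltzmann constant, J/K
T_dS  = 2.4e-30        # assumed de Sitter horizon temperature, K
E_max = 3.5e67         # total harvestable energy quoted above, J

# Landauer cost of one irreversible bit operation at temperature T_dS
E_bit = k_B * T_dS * math.log(2)      # roughly 2.3e-53 J per bit

bits_total = E_max / E_bit            # near 10^120, cf. 1.35*10^120 above
print("bit budget  ~ %.1e bits" % bits_total)

# Moore's-Law horizon: how many doublings fit into that total budget?
bits_per_year_now = 1e20 * 3.15e7     # assumed: 1e20 bit-ops/s for one year
doubling_time_yr  = 2.0               # upper end of the quoted 18-month-to-2-year range

doublings = math.log(bits_total / bits_per_year_now, 2)
print("doublings   ~ %.0f" % doublings)
print("time scale  ~ %.0f years" % (doublings * doubling_time_yr))

With exponential growth the cumulative total is dominated by the last
doubling period, so this sketch only confirms the order of magnitude of
the ~600-year figure; the exact number depends on the assumed starting
rate and doubling time.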

Therefore it is not only theoretically wrong that a process can always
be completed whenever every single step can be, but it is already
practically impossible to perform a step whose identification requires
more than 10^130 bits. At least genuine mathematicians would hesitate
to accept steps that are impossible in principle - that is reserved
for matheologians and lunatics.

Regards, WM



