Hi All, A friend of mine and I are working on a theoretical math problem.
This is a play on the infinite monkeys with typewriters, but with some finite boundaries.
The idea is that if an artist were creating digital art, there would be a finite number of images he could create given a few simple boundaries. For the purposes of this calculation, we used a 1920x1200 resolution and 32-bit color (2^32 possible values per pixel).
The question is, how many possible images could you create at this resolution before you would have to create a repeat?
If each pixel can take 2^32 values, and there are 2,304,000 pixels in an image of that size, then there are 73,728,000 bits per image (2,304,000 x 32). Therefore, the number of possibilities is 2^73,728,000.
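The counting above is easy to sanity-check with exact arithmetic; here is a minimal sketch in Python, whose integers are arbitrary-precision, so 2^73,728,000 can be computed exactly rather than estimated:

```python
# Each pixel holds one of 2**32 values; an image is 1920 x 1200 pixels.
pixels = 1920 * 1200                 # 2,304,000 pixels
bits_per_image = pixels * 32         # 73,728,000 bits per image
total_images = 2 ** bits_per_image   # exact count, thanks to Python's big ints

print(pixels)                        # 2304000
print(bits_per_image)                # 73728000
# 2**n has a bit length of n + 1, which confirms the exponent is right.
print(total_images.bit_length())     # 73728001
```

Printing `total_images` itself in decimal would take tens of millions of digits, so the bit-length check is a cheap way to confirm the exact value.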
In decimal form, this is roughly 3.3 x 10^22,194,339.*
*We had to write our own calculator to do this, so this is a very, very rough estimate, using 64-bit math.
If one were to produce that number of unique images, one would have every (1920x1200) image of dogs playing poker, of LOLcats, every Rembrandt and Caravaggio, and every photograph or painting that ever has been or ever will be created.
Since neither of us is a math wizard, we want to make sure we didn't mess up. The program we wrote to test this (obviously) had to do a lot of rounding off, so we could be off by a googolplex in either direction.
If anyone here has written a really powerful or accurate calculator, I would love to see an attempt at refining this number.
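As one possible approach, here's a minimal sketch (Python, standard library only) that converts 2^73,728,000 into scientific notation using logarithms, sidestepping big-number arithmetic entirely. The fractional part of the base-10 log has roughly eight significant digits of precision here, which is plenty for the mantissa:

```python
import math

WIDTH, HEIGHT = 1920, 1200
BITS_PER_PIXEL = 32

bits_per_image = WIDTH * HEIGHT * BITS_PER_PIXEL   # 73,728,000

# Total images = 2 ** bits_per_image. Taking log10 gives the decimal form:
# 2**n = 10**(n * log10(2)), so split that into integer exponent + mantissa.
log10_total = bits_per_image * math.log10(2)       # ~22,194,339.52
exponent = math.floor(log10_total)
mantissa = 10 ** (log10_total - exponent)

print(f"{mantissa:.2f} x 10^{exponent:,}")         # 3.31 x 10^22,194,339
```

This agrees with the rough figure above: about 3.3 x 10^22,194,339 possible images.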
Again, the question posed is:
If a person (or computer) is working at a 1920x1200 resolution, in a 32-bit raster program (Photoshop, etc.), how many possible images could they make before they were forced to make an *exact* repeat?