> It is not clear what the real purpose is here, but there seems to be an
> emphasis on computing time, and there needs to be a balance depending on
> the actual context. There seems scope for introducing a pre-computed
> list of permutations held in a file, so that multiple random selections
> (in real-time) can be avoided. Different strategies can be thought up
> depending on whether the file contents were themselves generated
> systematically or randomly. In some applications it might be enough to
> do a single "full" (expensive) randomisation to initialize, and then to
> read permutations of this result from a file (or array) ...
> re-initialising every so often would mean that the file need not be too
> large (1000 permutations would/could reduce computation time by a
> factor of 1000). I guess such ideas must be fairly standard, but perhaps
> modern computing scenarios of file-space limitations may have changed
> the relevant balances.
Thanks for the comment. My personal desire, for the purposes I have in mind, is to employ as few PRNs as possible (see my second post in this thread). In my applications the random permutations must depend on the context of the actual runs and hence cannot be pre-computed -- just as in real card games. Any gain in efficiency (assuming that what holds for interpreted Python in this respect holds analogously for optimized versions in compiled languages) is only a secondary benefit. Currently I am repeating some more computations of the kind reported earlier, trying a couple of code variations along the way.
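For what it's worth, one standard way to pursue the "as few PRNs as possible" goal is to draw a single random integer in [0, n!) and decode it into a permutation via the factorial number system (Lehmer code), so a whole shuffle costs one PRN call instead of n-1. This is only a sketch of that general technique, not the code under discussion in this thread; the function name is mine:

```python
import math
import random

def permutation_from_index(index, n):
    """Decode an integer in [0, n!) into the permutation of range(n)
    with that Lehmer-code index. The mapping is a bijection, so a
    uniform index yields a uniformly random permutation."""
    items = list(range(n))
    result = []
    for k in range(n, 0, -1):
        f = math.factorial(k - 1)
        pos, index = divmod(index, f)  # next "digit" in factorial base
        result.append(items.pop(pos))
    return result

# A single PRN draw suffices for a full random permutation of n items.
n = 5
idx = random.randrange(math.factorial(n))
print(permutation_from_index(idx, n))
```

Whether this actually saves time depends on the cost of the big-integer arithmetic versus the cost of the avoided PRN calls, so for large n it is mainly attractive when PRN consumption itself (rather than raw speed) is the thing being minimized.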