In order to implement Latin Hypercube Sampling, I am interested in an algorithm that generates a random permutation uniformly distributed among the n! possible permutations of [1, ..., n]. Such an algorithm is well known (see Donald E. Knuth, `The Art of Computer Programming: Sorting and Searching' (Vol 3, 3rd Ed, 1997), Addison-Wesley, ISBN 0201896850) and is widely available in open-source packages, but I cannot use it to generate a really huge permutation (say n = 10^9). What I am actually looking for is an algorithm that tells me where a given i in [1, ..., n] is sent by the permutation, for i = 1..p in that order, with p << n (hopefully), and whose memory cost is much smaller than n (in the worst case, or on average).
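For reference, the well-known algorithm I mean is the Fisher-Yates shuffle described by Knuth. A minimal Python sketch (the function name is mine) shows why it does not fit my case: it must materialize the whole array of n elements, so the memory cost is O(n).

```python
import random

def fisher_yates(n):
    """Standard Fisher-Yates shuffle (Knuth, TAOCP Vol. 3):
    uniform over all n! permutations of [1, ..., n], but it
    needs the full array in memory, i.e. O(n) space."""
    perm = list(range(1, n + 1))
    for i in range(n - 1, 0, -1):
        j = random.randint(0, i)  # uniform index in [0, i]
        perm[i], perm[j] = perm[j], perm[i]
    return perm
```

For n = 10^9 this array alone is prohibitive, which is why I am asking about a lazy/incremental alternative.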
Any reference/algorithm/source code/theory would be GREATLY appreciated!