I need some help: I'm working on a project modeling a hard disk, and I fear my probability theory is rusty. The problem is this: given that the drive head is at some random location on the disk and must then move radially to perform a read at another random location, what is the average distance it will travel? For now I'm ignoring acceleration and disk rotation time.
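To sanity-check whatever closed form comes out, here is a quick Monte Carlo sketch. It assumes both positions are independent and uniform on a radius normalized to [0, 1], which is a modeling assumption (real drives store more data on outer tracks, so the target distribution may not be uniform in radius):

```python
import random

def avg_seek_distance(trials=100_000, seed=0):
    """Estimate the mean radial travel distance by simulation.

    Assumes head position and read target are independent and
    uniform on [0, 1] (normalized radius) -- an assumption, not
    a given of the original problem.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        head = rng.random()    # current head position
        target = rng.random()  # requested read location
        total += abs(head - target)
    return total / trials

print(avg_seek_distance())  # close to 1/3 of the radius
```

Under the uniform assumption the exact answer is E|X - Y| = 1/3, so the simulation should hover near 0.333.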
It gets harder: now suppose the drive always has a choice of n random locations to read from, and it always chooses the closest one. Now how far will the head travel on average?
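The same simulation extends naturally to the n-choices case (same normalized-radius assumption as before). For n = 1 it should reproduce the single-target average, and the mean distance should shrink as n grows:

```python
import random

def avg_min_seek(n, trials=100_000, seed=0):
    """Mean travel distance when the head may pick the nearest of
    n independent uniform candidate locations on [0, 1]."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        head = rng.random()
        # distance to the closest of the n candidate targets
        total += min(abs(head - rng.random()) for _ in range(n))
    return total / trials

for n in (1, 2, 4, 8):
    print(n, avg_min_seek(n))
```

Plotting these estimates against n may suggest the functional form to aim for in a derivation.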