Hello, here's a problem I've been working on but can't see an immediate easy solution to.
I'm designing a codewheel for a piece of apparatus that rotates. A binary code printed on the rim of the wheel defines the position of the wheel and is read off with LEDs: a black square indicates a 0 and a white square a 1. Of course I want to use a coding scheme that is optimal and gives me good resolution. As such, I've come up with the following scheme for detecting, for example, eight different positions of the wheel.
The following set of squares is printed on the rim of the wheel:
and read off three at a time by three LEDs. As you can see, taking three 'bits' at a time you can define eight positions. Reading from the left we would get:
The electronics can then convert this into a binary number and use a lookup table to work out the position of the wheel.
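To illustrate the decoding step, here is a sketch in Python. The 8-position rim sequence, the names, and the table layout are my own example (one valid 3-bit sequence), not necessarily what is printed on the actual wheel:

```python
# Sketch of the lookup-table decoding step, using an example cyclic
# 3-bit rim sequence (00010111) whose eight 3-bit windows are all
# distinct.  The sequence and names here are illustrative only.

RIM = "00010111"  # example rim: every 3-bit window occurs exactly once
BITS = 3

# Build the lookup table: each 3-bit window -> wheel position.
lookup = {}
for pos in range(len(RIM)):
    window = "".join(RIM[(pos + i) % len(RIM)] for i in range(BITS))
    lookup[window] = pos

def decode(leds):
    """Convert the three LED readings (e.g. '101') to a wheel position."""
    return lookup[leds]
```

For instance, with this example sequence `decode("000")` gives position 0, since the window starting at position 0 reads 000.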
In fact, if I wanted 16 possible positions, I could use the following sequence on the rim:
This would give 16 unique positional values: 0010 0101 1010 0100 1001 0011 0110 1101 1011 0111 1111 1110 1100 1000 0000 0001.
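If it helps to see the defining property mechanically, the requirement is that every position of the wheel produces a different reading. The sketch below (my own code; the two example sequences are illustrative, not taken from the wheel above) collects every cyclic window of a candidate rim sequence and checks that none repeats:

```python
# Check that a candidate rim sequence gives a unique reading at every
# position: collect all cyclic n-bit windows and test for duplicates.

def windows(rim, n):
    """All n-bit readings of the cyclic sequence, one per position."""
    return ["".join(rim[(p + i) % len(rim)] for i in range(n))
            for p in range(len(rim))]

def is_valid_codewheel(rim, n):
    w = windows(rim, n)
    return len(set(w)) == len(w)  # every position reads differently

print(is_valid_codewheel("0000100110101111", 4))  # a valid 16-position wheel
print(is_valid_codewheel("0000111100001111", 4))  # position repeats: invalid
```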
Since I had to find these sequences by trial and error, it would be nice to have an algorithm that would generate a sequence for any number of bits of resolution I wanted. Is there a general method for higher numbers of bits, and if not, why not? Can I also generate sequences of other sizes, such as 7 positional values or 100 positional values?
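For reference, the trial-and-error search I did by hand could at least be automated as a simple backtracking search: extend the rim one bit at a time, keep only extensions whose newest window has not already been used, and check the wrap-around windows once the rim is full. This is only a sketch of the brute-force idea (the name `find_rim` is my own), not a claim about the best general method:

```python
# Backtracking search for a cyclic bit string of length 2**n whose
# n-bit windows are all distinct (i.e. a rim giving 2**n positions).

def find_rim(n):
    length = 2 ** n  # one position for every possible n-bit window

    def extend(rim, seen):
        if len(rim) == length:
            # Check the windows that wrap around the end of the rim.
            wrap = [(rim + rim)[p:p + n] for p in range(length - n + 1, length)]
            return rim if len(seen | set(wrap)) == length else None
        for bit in "01":
            cand = rim + bit
            if len(cand) >= n:
                w = cand[-n:]          # newest complete window
                if w in seen:
                    continue           # already used: prune this branch
                result = extend(cand, seen | {w})
            else:
                result = extend(cand, seen)
            if result:
                return result
        return None

    return extend("", set())

print(find_rim(3))  # prints one valid 8-position rim sequence
```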