You wish to cut down an 8-foot sheet of plywood from 96 inches to 92-5/8 inches. You position the hook of your steel tape measure on the left end of the sheet so that the top edge of the tape is everywhere flush with the top of the sheet's edge. You make a pencil mark at 92-5/8 inches. So far so good.
But what if the steel tape is hooked at the left end correctly but the right end -- the end where you're going to make a pencil mark at 92-5/8 inches -- is low by 1 inch? I can see that the resulting cut will be short, but I can't figure out by how much.
Is there a formula? It seems like there should be, but I never took whatever area of math would teach me this.
Idealized, I think of a circle of radius 1 centered at the origin. If I measure from the origin STRAIGHT right 1 unit along the x-axis, the measurement is correct, but if I measure out 1 unit with the ruler erroneously TILTED DOWN by some amount, the spot I mark will fall short -- its x-coordinate will be less than 1.
Is there a formula that converts this erroneous drop, which presumably corresponds to an angle, into a calculated error distance? I also understand that the longer the distance from the origin to the marked spot, the smaller the error will be.
Asked perhaps yet another way: as a radius sweeps around a circle, how does its x-coordinate shrink as the endpoint drops from one y-value to a lower one? I think.
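In case it helps to state my model concretely: I believe the tape, the droop, and the true horizontal distance form a right triangle, so by the Pythagorean theorem the mark should land at sqrt(L^2 - d^2) horizontally when the tape reads L and the far end is low by d. Here is a quick sketch of that idea (my own attempt -- I may be mis-modeling the situation), using my numbers of a 92-5/8 inch reading and a 1-inch droop:

```python
import math

def mark_error(tape_reading, droop):
    """The tape (hypotenuse), the droop (one leg), and the true
    horizontal distance (other leg) form a right triangle, so the
    mark lands at sqrt(tape_reading**2 - droop**2) horizontally.
    Returns how far short of tape_reading the mark falls."""
    horizontal = math.sqrt(tape_reading**2 - droop**2)
    return tape_reading - horizontal

L = 92.625  # 92-5/8 inches
d = 1.0     # far end of the tape is low by 1 inch
print(mark_error(L, d))  # roughly 0.0054 inches short
```

If this model is right, the error for small droops is approximately d^2 / (2L), which would also explain why a longer measurement gives a smaller error for the same droop.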