Einsteinians universally teach that a light source moving towards the observer sends a shorter wavelength than a stationary light source:
http://www.amazon.com/Brief-History-Time-Stephen-Hawking/dp/0553380168 Stephen Hawking, "A Brief History of Time", Chapter 3: "...we must first understand the Doppler effect. As we have seen, visible light consists of fluctuations, or waves, in the electromagnetic field. The wavelength (or distance from one wave crest to the next) of light is extremely small, ranging from four to seven ten-millionths of a meter. The different wavelengths of light are what the human eye sees as different colors, with the longest wavelengths appearing at the red end of the spectrum and the shortest wavelengths at the blue end. Now imagine a source of light at a constant distance from us, such as a star, emitting waves of light at a constant wavelength. Obviously the wavelength of the waves we receive will be the same as the wavelength at which they are emitted (the gravitational field of the galaxy will not be large enough to have a significant effect). Suppose now that the source starts moving toward us. When the source emits the next wave crest it will be nearer to us, so the distance between wave crests will be smaller than when the star was stationary."
Let us assume that the moving source does send a shorter wavelength than the stationary source. Clearly the shortening of the wavelength occurs at the very beginning, as two consecutive wave crests leave the source - then the shortened wavelength starts its journey towards the observer. In other words, the wavelength that travels between source and observer is shortened.
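Under this assumption, the classical moving-source picture can be sketched numerically. A minimal illustration (the speed and wavelength values below are arbitrary, chosen only for the example):

```python
# Classical (non-relativistic) Doppler picture for a source approaching the observer:
# each successive crest is emitted from a position closer to the observer, so the
# wavelength in transit becomes lambda * (1 - v/c).
C = 3.0e8            # speed of light, m/s
EMITTED = 6.0e-7     # wavelength of the stationary source, m

def wavelength_in_transit(v_source: float) -> float:
    """Wavelength travelling towards the observer when the source approaches at v_source."""
    f_emit = C / EMITTED               # emission frequency, unchanged by the motion
    return (C - v_source) / f_emit     # crest spacing laid down over one emission period

print(wavelength_in_transit(0.0))      # stationary source: wavelength unchanged
print(wavelength_in_transit(0.1 * C))  # source approaching at 0.1c: 10% shorter
```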
If the observer starts moving towards the stationary source, the situation is entirely different. This time the wavelength travelling between source and observer is not shortened - rather, it is the original wavelength (produced by the stationary source), which can only appear shorter upon interaction with the moving observer.
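In the same classical picture, the moving-observer case leaves the in-transit wavelength untouched; only the rate at which the observer meets crests rises. A sketch with the same illustrative values:

```python
# Classical picture for an observer approaching a stationary source:
# the wavelength in transit stays at the emitted value; the observer simply
# meets crests at the higher rate (c + v) / lambda.
C = 3.0e8            # speed of light, m/s
EMITTED = 6.0e-7     # wavelength produced by the stationary source, m

def observed_frequency(v_observer: float) -> float:
    """Crest-meeting rate for an observer closing on the source at v_observer."""
    return (C + v_observer) / EMITTED

print(observed_frequency(0.0))      # stationary observer: c / lambda
print(observed_frequency(0.1 * C))  # approaching at 0.1c: 10% higher rate
```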
The two different pictures - the wavelength in transit is shorter when the source moves towards the observer, but unchanged (merely measured as shorter) when the observer moves towards the source - contradict the principle of relativity.
We have a reductio ad absurdum: either the principle of relativity is incorrect, or our assumption that the moving source sends a shorter wavelength is false. The principle of relativity is correct, so the conclusion is:
Conclusion: The moving source sends the same wavelength as the stationary source.
But if the moving source sends the same wavelength as the stationary source, why does the observer measure a greater frequency when the light comes from the moving source? The formula:
(frequency measured by the observer) = (speed of light relative to the observer)/(wavelength)
gives the only possible answer: the observer measures a greater frequency when the light comes from the moving source because, relative to him, the speed of the light coming from the moving source is greater, in violation of special relativity.
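Rearranging the formula above, the light speed implied relative to the observer is simply the measured frequency times the in-transit wavelength. The snippet below only performs that arithmetic (with illustrative values); whether a relative speed above c is physically admissible is precisely the point of dispute with special relativity:

```python
C = 3.0e8            # speed of light in the source frame, m/s
EMITTED = 6.0e-7     # wavelength taken to be unchanged by the source's motion, m

def implied_relative_speed(f_measured: float) -> float:
    """Solve f = c_rel / lambda for c_rel, the light speed relative to the observer."""
    return f_measured * EMITTED

# An observer measuring a 10% higher frequency, with an unshortened wavelength,
# would infer a relative light speed of 1.1 * c under this formula:
print(implied_relative_speed(1.1 * (C / EMITTED)) / C)
```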