The Math Forum


Math Forum » Discussions » sci.math.* » sci.math


Topic: Sending Messages in Morse Code
Replies: 16   Last Post: Aug 25, 2000 8:04 PM


John Savard

Posts: 601
Registered: 12/8/04
Re: Sending Messages in Morse Code
Posted: Aug 2, 2000 2:18 AM

On Tue, 01 Aug 2000 22:18:42 +0200, Mok-Kong Shen
<> wrote, in part:

>Do I understand correctly that you want to transform a random bit
>sequence (almost uniformly distributed) into a sequence of Morse
>code symbols and study the optimal transformation?

I'm assuming I have a bit sequence that is precisely uniformly
distributed, and my goal is to find the optimal transformation of that
sequence - within limits on the complexity of the transformation used
- into Morse code.

>I admit that
>I haven't fully understood your article, hence the question: Do
>you keep the set of the original Morse code symbols or do you
>consider possible extensions, i.e. introducing more symbols?

I do keep the original Morse code symbols. I note that if I introduced
more symbols, even symbols from the original Morse code, such as those
for digits, this would result in changes to the transformation.

Normally, in constructing a Huffman code for the English alphabet, for
example, one has probabilities for the symbols, and then uses those
probabilities to construct codewords. Suppose one already has a fixed
set of symbols of different lengths: can one go in reverse, and
determine how to use those symbols optimally?
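This reverse problem can be sketched numerically. Under the standard treatment of a discrete noiseless channel, if the available symbols have durations l_i, the usage probabilities that maximize information per unit time are p_i = x^(-l_i), where x is the root greater than 1 of sum_i x^(-l_i) = 1 - a plausible reading of the "fancy polynomial equation" mentioned below, though the post doesn't spell it out. The function name and the durations here are illustrative assumptions, not taken from the post:

```python
# Given fixed symbol durations l_i, find the probabilities that maximize
# information per unit time: p_i = x**(-l_i), where x > 1 solves
# sum_i x**(-l_i) = 1.  Solved by bisection; f(x) is decreasing in x.
from math import log2

def optimal_symbol_probs(durations, tol=1e-12):
    """Solve sum x^(-l) = 1 for x > 1; return (x, optimal probabilities)."""
    def f(x):
        return sum(x ** -l for l in durations) - 1.0
    lo, hi = 1.0 + 1e-9, 2.0
    while f(hi) > 0:          # widen the bracket until the sign changes
        hi *= 2
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    x = (lo + hi) / 2
    return x, [x ** -l for l in durations]

# Four illustrative symbols with durations 1, 2, 3, 3 time units;
# for these the equation x^3 = x^2 + x + 2 has the root x = 2,
# giving the dyadic probabilities 1/2, 1/4, 1/8, 1/8.
x, probs = optimal_symbol_probs([1, 2, 3, 3])
capacity = log2(x)            # bits per unit time achievable
print(x, probs, capacity)
```

With dyadic probabilities like these, the subsequent Huffman construction is exact; for real Morse durations the probabilities are generally not dyadic, and the Huffman code only approximates the optimum.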

That is the question I address, and I find that I have to determine
the potential efficiency of a code using those symbols before I can
compute the probability for each symbol that would make the code
optimal. This requires solving a fancy polynomial equation. And this
also means I have to take into account per-symbol overhead (as opposed
to per-bandwidth-unit or per-bit overhead, which being constant can be
ignored). Then I can construct a Huffman code based on those
probabilities normally, although I am using it in reverse to express
an input in bits as an output in symbols of differing probability.
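The "Huffman code in reverse" step can be sketched as follows: build codewords whose lengths match the optimal probabilities, then parse the uniform input bits against those codewords, emitting one channel symbol per match. Because the input bits are uniform, each symbol comes out with exactly its codeword's probability. The codebook below (symbol names and codewords) is an illustrative assumption continuing the dyadic example, not from the post:

```python
# Uniform input bits parsed against Huffman codewords, read "in reverse":
# each matched codeword emits one channel symbol, so a symbol with a
# k-bit codeword appears with probability 2**-k.
CODEBOOK = {          # codeword -> channel symbol (dyadic example)
    "0":   "E",       # probability 1/2
    "10":  "A",       # probability 1/4
    "110": "S",       # probability 1/8
    "111": "O",       # probability 1/8
}

def bits_to_symbols(bits):
    """Parse a bit string into channel symbols via the prefix-free codebook."""
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in CODEBOOK:
            out.append(CODEBOOK[buf])
            buf = ""
    return out, buf   # buf holds any trailing partial codeword

symbols, leftover = bits_to_symbols("0101100")
print(symbols, leftover)
```

Because the codebook is prefix-free, the parse is unambiguous and the mapping is invertible: concatenating the codewords of the output symbols recovers the original bits, which is what lets the transmitted Morse be decoded back into the message.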

I just don't remember seeing this kind of thing done before, and so it
seems an illustration of the math involved in finding optimal codes of
a different kind.

John Savard (teneerf <-)
Now Available! The Secret of the Web's Most Overused Style of Frames!



© The Math Forum at NCTM 1994-2018. All Rights Reserved.