A question on communication over a lossy channel

Discussion in 'Probability' started by Karthick, Aug 30, 2010.

Karthick (Guest)

Greetings!

A question on information theory that I couldn't find a more
suitable group for. I hope you can help me.

Suppose we have 16 symbols {S0..S15}. A transmitter (TX) and a
receiver (RX) communicate over a lossy channel (C). TX sends
symbol x; RX receives y. The following probabilities are known:
a. Probability of a drop: P(y=""|x=Si) - the receiver doesn't get anything
b. Probability of a mutation: P(y=Sj|x=Si), i not equal to j
c. Probability of an alien symbol: P(y=Si|x="") - the receiver gets a
symbol that was never sent by TX
d. Probability of a successful transmission: P(y=Si|x=Si).

These four probabilities sum to 1; no other outcomes are possible.
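For concreteness, one use of this channel can be sketched as below. The numeric probabilities are made-up illustrative values, and case c (alien symbols, x="") is left out to keep the sketch simple:

```python
import random

SYMBOLS = [f"S{i}" for i in range(16)]

# Assumed per-symbol probabilities (illustrative values only):
P_DROP = 0.05    # a. P(y="" | x=Si)
P_MUT = 0.05     # b. total probability of y=Sj, j != i (uniform over the 15 others)
P_OK = 0.90      # d. P(y=Si | x=Si); here P_DROP + P_MUT + P_OK = 1
                 # (case c, alien symbols with x="", is omitted for simplicity)

def channel(x):
    """One use of the lossy channel: returns the received symbol, or "" on a drop."""
    r = random.random()
    if r < P_DROP:
        return ""                                             # drop
    if r < P_DROP + P_MUT:
        return random.choice([s for s in SYMBOLS if s != x])  # mutation
    return x                                                  # successful transmission
```

Sampling this many times recovers the assumed probabilities, which is a handy sanity check before trying any coding scheme on top of it.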

Now let the symbols be encoded before transmission (by adding
redundant symbols) and decoded after reception; call the recovered
symbol "z". Ideally, z equals x, of course. Let us assume some
desired per-symbol probability of successful recovery: P(z=Si|x=Si).

Given these parameters, what is the theoretical limit on the channel
efficiency? And what practical coding algorithms are available?
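For the "theoretical limit" part, one standard numerical tool is the Blahut-Arimoto algorithm, which computes the capacity of any discrete memoryless channel from its transition matrix. A rough sketch of my own below: drops are modeled as a 17th "erasure" output symbol, while alien-symbol insertions (case c) don't fit the memoryless-channel matrix and are ignored here:

```python
import numpy as np

def blahut_arimoto(W, iters=1000, tol=1e-12):
    """Capacity in bits/use of a DMC with transition matrix W[x, y] = P(y | x)."""
    nx = W.shape[0]
    p = np.full(nx, 1.0 / nx)                  # input distribution, start uniform
    for _ in range(iters):
        q = p @ W                              # induced output distribution
        # D[x] = sum_y W[x,y] * ln(W[x,y]/q[y]), using the 0 * ln 0 = 0 convention
        ratio = np.divide(W, q, out=np.ones_like(W), where=W > 0)
        D = np.sum(W * np.log(ratio), axis=1)
        p_new = p * np.exp(D)
        p_new /= p_new.sum()
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    return np.sum(p * D) / np.log(2)           # convert nats -> bits

# Example: 16 inputs, 17 outputs (last column = drop/erasure), no mutations:
n = 16
W = np.zeros((n, n + 1))
for i in range(n):
    W[i, i] = 0.95                             # successful transmission
    W[i, n] = 0.05                             # drop
print(blahut_arimoto(W))                       # ~3.8 bits/use = (1 - 0.05) * log2(16)
```

With mutation probabilities filled into the off-diagonal entries of W, the same routine gives the capacity of the full drop-plus-mutation channel.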

I'm reading up on the relevant material, in particular:
a. the Shannon limit
b. Hamming space
...but I'm still far from any answers. These topics seem to deal
with mutation errors, but not with symbol drops (I could be wrong).

Any pointers on what I should be checking/reading will be great!
