Interleaving is often encountered in digital communication systems. I have seen interleaving between codewords within a single OFDM symbol, as follows.
Assume the bits are denoted by
$$\{c_0, c_1, c_2, \cdots\}$$
Assume we use 16-QAM and have many codewords, each of length $N_{CW}$ bits. The interleaving then works as follows: the first 16-QAM symbol (first subcarrier) in the OFDM symbol is built from $$\{c_0, c_1, c_2, c_3\}$$ while the second 16-QAM symbol is built from $$\{c_{N_{CW}}, c_{N_{CW}+1}, c_{N_{CW}+2}, c_{N_{CW}+3}\}$$ i.e. the first four bits of the second codeword.
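To make the mapping concrete, here is a minimal Python sketch of the scheme as I understand it. The codeword length and number of codewords are illustrative placeholders, and the assumption that the pattern continues round-robin (subcarrier $k$ drawing its four bits from codeword $k \bmod N_{\text{codewords}}$) is mine, extrapolated from the first two subcarriers above.

```python
# Sketch of the codeword-to-subcarrier interleaving described above.
# Assumption: subcarrier k takes its 4 bits (16-QAM) from codeword
# k mod N_CODEWORDS, so consecutive subcarriers use different codewords.

BITS_PER_SYMBOL = 4          # 16-QAM carries 4 bits per subcarrier
N_CW = 8                     # illustrative codeword length in bits
N_CODEWORDS = 3              # illustrative number of codewords

# Flat bit stream c_0, c_1, ... laid out codeword after codeword;
# indices stand in for the bit values so the mapping is visible.
bits = list(range(N_CW * N_CODEWORDS))

def bits_for_subcarrier(k):
    """Return the 4 bit indices mapped onto subcarrier k."""
    cw = k % N_CODEWORDS                 # which codeword this subcarrier uses
    group = k // N_CODEWORDS             # which 4-bit group within that codeword
    start = cw * N_CW + group * BITS_PER_SYMBOL
    return bits[start:start + BITS_PER_SYMBOL]

for k in range(3):
    print(f"subcarrier {k}: bits {bits_for_subcarrier(k)}")
# subcarrier 0: bits [0, 1, 2, 3]      -> c_0..c_3 (codeword 0)
# subcarrier 1: bits [8, 9, 10, 11]    -> c_{N_CW}..c_{N_CW}+3 (codeword 1)
# subcarrier 2: bits [16, 17, 18, 19]  -> codeword 2
```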
The mapping itself is easy to understand; what I don't understand is how it helps in an OFDM system. Essentially we are mapping non-adjacent bits onto one OFDM symbol? What are the benefits over mapping adjacent bits onto non-adjacent OFDM symbols?
Thanks
Answer
Interleaving is not specific to OFDM. It is an age-old technique for improving the performance of error-correcting codes.
Basically, most error-correction codes are good at correcting bit errors that are independent and random. However, when the channel is not memoryless, burst errors can occur: once one bit error happens, several more are likely to follow close by. In such cases, typical ECCs become much weaker.
Burst errors are generally more harmful, not only because they get the better of error-correcting codes, but also because they produce more noticeable local gaps (such as hiccups in voice or corrupted patches in video). Yet in most wireless and satellite channels they are the more common case. The same applies to storage media.
Interleaving is a very simple technique that resolves this very efficiently. You take a block of bits and reorder (interleave) it in a known pattern. When a burst error occurs, the affected bits end up, after de-interleaving at the receiver, spread much further apart than in the original burst, so each looks like an isolated random error. This significantly improves the effectiveness of the ECC, as the sketch below illustrates.
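Here is a small Python sketch of the burst-spreading effect using a classic row/column block interleaver (write row by row, read column by column); the block dimensions and burst position are arbitrary choices for the demo, not taken from any particular system.

```python
# Classic block interleaver: write bits row by row into an R x C matrix,
# read them out column by column. De-interleaving is the exact inverse.

R, C = 4, 6   # illustrative interleaver dimensions (depth 4, width 6)

def interleave(bits):
    # Output position c*R + r holds input bit r*C + c (column-wise readout)
    return [bits[r * C + c] for c in range(C) for r in range(R)]

def deinterleave(bits):
    # Inverse mapping: recover input position r*C + c from c*R + r
    return [bits[c * R + r] for r in range(R) for c in range(C)]

original = list(range(R * C))            # indices stand in for bits
tx = interleave(original)

# Simulate a burst error hitting 4 consecutive transmitted positions
corrupted = set(range(8, 12))
rx = ['X' if i in corrupted else b for i, b in enumerate(tx)]

print(deinterleave(rx))
# The 4-bit burst lands on positions 2, 8, 14, 20 after de-interleaving:
# the errors are now C = 6 positions apart, so each looks isolated to the ECC.
```

The key design parameter is the interleaver depth: a burst of length up to $R$ (here 4) is guaranteed to be split into single, isolated errors spaced $C$ positions apart.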
Many modern channel coding schemes have interleaving built in.