- Low Density Parity Check (LDPC) Codes

Created on 06-05-2015 08:11 PM, edited on 03-19-2019 02:04 PM

- LDPC codes are a class of linear block codes characterized by a sparse parity-check matrix ‘H’, i.e. ‘H’ has a low density of 1’s.
- LDPC codes were originally invented by Robert Gallager in the early 1960s but were largely ignored until they were ‘rediscovered’ in the mid-1990s by MacKay.
- Sparseness of ‘H’ can yield a large minimum distance ‘d_min’ and reduces decoding complexity.
- They can perform within 0.0045 dB of the Shannon limit.

There are basically two different ways to represent LDPC codes. Like all linear block codes, they can be described via matrices. The second possibility is a graphical representation.

Let’s look at an example of an LDPC matrix first. The matrix defined in equation (1) is a parity-check matrix with dimension n x m for an (8,4) code. We can now define two numbers describing this matrix: ‘Wt’ for the number of 1’s in each row and ‘Wc’ for the number of 1’s in each column. For a matrix to be called low-density, the two conditions Wc << n and Wt << m must be satisfied. To achieve this, the parity-check matrix usually has to be very large, so the example matrix below can’t really be called low-density.
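As a minimal sketch of these two numbers, the snippet below counts the row weight Wt and column weight Wc of the small (8,4) parity-check matrix used in the worked decoding example later in this article:

```python
# The (8,4) parity-check matrix from the worked example in this article.
H = [
    [0, 1, 0, 1, 1, 0, 0, 1],
    [1, 1, 1, 0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0, 1, 1, 1],
    [1, 0, 0, 1, 1, 0, 1, 0],
]

Wt = [sum(row) for row in H]        # number of 1's in each row
Wc = [sum(col) for col in zip(*H)]  # number of 1's in each column
print(Wt)  # [4, 4, 4, 4]
print(Wc)  # [2, 2, 2, 2, 2, 2, 2, 2]
```

With n = 8 and m = 4, a row weight of 4 and column weight of 2 clearly do not satisfy Wc << n and Wt << m, which is why this toy matrix can’t really be called low-density.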

Tanner introduced an effective graphical representation for LDPC codes. Not only do these graphs provide a complete representation of the code, they also help to describe the decoding algorithm. Tanner graphs are bipartite graphs. That means that the nodes of the graph are separated into two distinct sets and edges only connect nodes of different types. The two types of nodes in a Tanner graph are called variable nodes (v-nodes) and check nodes (c-nodes). The figure below is an example of such a Tanner graph and represents the same code as the matrix in (1). The creation of such a graph is rather straightforward. It consists of m check nodes (the number of parity bits) and n variable nodes (the number of bits in a codeword), and check node i is connected to variable node j whenever element (i, j) of H is a 1.
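The edge construction can be sketched in a few lines: check node F_i is connected to variable node C_j exactly when H[i][j] == 1 (using the example matrix from the worked decoding example later in this article):

```python
# Parity-check matrix from the worked example below.
H = [
    [0, 1, 0, 1, 1, 0, 0, 1],
    [1, 1, 1, 0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0, 1, 1, 1],
    [1, 0, 0, 1, 1, 0, 1, 0],
]

# m check nodes (one per row), n variable nodes (one per column).
check_to_vars = {i: [j for j, bit in enumerate(row) if bit]
                 for i, row in enumerate(H)}
var_to_checks = {j: [i for i, row in enumerate(H) if row[j]]
                 for j in range(len(H[0]))}

print(check_to_vars[0])  # F0 connects to C1, C3, C4, C7 -> [1, 3, 4, 7]
print(var_to_checks[0])  # C0 connects to F1, F3 -> [1, 3]
```

These adjacency lists are exactly the Tanner graph: message-passing decoders only ever move information along these edges.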

The Equations for a simple LDPC code with n=12

There are actually only 7 independent equations, so there are 7 parity digits.

The Parity check matrix for a simple LDPC code

Note above that each symbol is contained in 3 equations and each equation involves 4 code symbols.

Graph for simple LDPC code

The ability of LDPC codes to perform near the Shannon limit of a channel exists only for large block lengths. For example, there have been simulations that perform within 0.04 dB of the Shannon limit at a bit error rate of 10^(-6) with a block length of 10^(7). An interesting fact is that those high-performance codes are irregular. The large block length also results in large parity-check and generator matrices. The complexity of multiplying a codeword with a matrix depends on the number of 1’s in the matrix. If we put the sparse matrix H in the form [P^(T) I] via Gaussian elimination, the generator matrix G can be calculated as G = [I P]. The sub-matrix P is generally not sparse, so the encoding complexity will be quite high.
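The [P^(T) I] route requires the last m columns of H to be invertible (column swaps may be needed first). An equivalent, more general sketch computes the rows of G directly as a basis of the null space of H over GF(2); by construction every row of G then satisfies all parity checks:

```python
def gf2_nullspace(H):
    """Return a basis of the null space of H over GF(2) (rows of G)."""
    m, n = len(H), len(H[0])
    A = [row[:] for row in H]
    pivots = []  # pivot column of each reduced row, in row order
    r = 0
    for c in range(n):
        pivot = next((i for i in range(r, m) if A[i][c]), None)
        if pivot is None:
            continue
        A[r], A[pivot] = A[pivot], A[r]
        for i in range(m):
            if i != r and A[i][c]:
                A[i] = [a ^ b for a, b in zip(A[i], A[r])]
        pivots.append(c)
        r += 1
    free = [c for c in range(n) if c not in pivots]
    basis = []
    for f in free:
        v = [0] * n
        v[f] = 1
        for row, p in zip(A, pivots):
            v[p] = row[f]  # back-substitute the pivot variables
        basis.append(v)
    return basis

H = [
    [0, 1, 0, 1, 1, 0, 0, 1],
    [1, 1, 1, 0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0, 1, 1, 1],
    [1, 0, 0, 1, 1, 0, 1, 0],
]
G = gf2_nullspace(H)
# Sanity check: every generator row is orthogonal to every parity check.
assert all(sum(g * h for g, h in zip(grow, hrow)) % 2 == 0
           for grow in G for hrow in H)
# Note: this toy H happens to have one linearly dependent row (row 4 is the
# XOR of rows 1-3), so the code it defines has dimension 8 - 3 = 5.
print(len(G))  # 5
```

This is only an illustrative encoder-construction sketch, not a production implementation; real LDPC encoders exploit structure in H to avoid the dense-matrix multiplication discussed above.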

Since the complexity grows in O(n^2), even sparse matrices don’t result in good performance if the block length gets very high. So iterative decoding (and encoding) algorithms are used. Those algorithms perform local calculations and pass those local results around via messages. This step is typically repeated several times. The term "local calculations" already indicates that a divide-and-conquer strategy, which separates a complex problem into manageable sub-problems, is being used. A sparse parity-check matrix helps these algorithms in several ways. First, it keeps the local calculations simple and also reduces the complexity of combining the sub-problems by reducing the number of messages needed to exchange all the information. Furthermore, it has been observed that iterative decoding algorithms for sparse codes perform very close to the optimal maximum-likelihood decoder.

Decoding is conceptually simple. The most powerful algorithm is belief propagation: each check node in the Tanner graph (or the parity-check matrix, which is just a different representation of the same entity) tells the connected codeword nodes what it believes the valid bit to be, together with a calculated probability. Once the error probability that a given bit is zero or one drops below the requested value (according to Shannon’s theorem, a code that allows the error probability to be reduced arbitrarily always exists for a given rate below the channel capacity), the codeword calculation (decoding) is complete. There are two mechanisms, hard-decision and soft-decision algorithms; the former is simpler, but the latter frequently converges faster. Let me show an example of the hard-decision algorithm.

Let the parity-check matrix be this (not very sparse) 4 x 8 example:

0 1 0 1 1 0 0 1
1 1 1 0 0 1 0 0
0 0 1 0 0 1 1 1
1 0 0 1 1 0 1 0

And the source codeword is:

1 0 0 1 0 1 0 1

The check word, calculated by multiplying the matrix by the codeword, is:

0 0 0 0
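A minimal sketch of that multiplication over GF(2): each row of the matrix selects codeword bits, and their XOR (sum modulo 2) gives one check bit.

```python
H = [
    [0, 1, 0, 1, 1, 0, 0, 1],
    [1, 1, 1, 0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0, 1, 1, 1],
    [1, 0, 0, 1, 1, 0, 1, 0],
]
codeword = [1, 0, 0, 1, 0, 1, 0, 1]

# Matrix-vector product modulo 2: XOR of the codeword bits each row selects.
check_word = [sum(h * c for h, c in zip(row, codeword)) % 2 for row in H]
print(check_word)  # [0, 0, 0, 0] -> the codeword satisfies all parity checks
```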

Suppose that during transmission of the codeword and check word over the channel, the codeword was changed to this (second bit flipped):

1 1 0 1 0 1 0 1

Here is the Tanner graph (this graphical representation was originally proposed by Tanner) of the given matrix: check node F0 connects to C1, C3, C4 and C7; F1 to C0, C1, C2 and C5; F2 to C2, C5, C6 and C7; and F3 to C0, C3, C4 and C6.

The decoding algorithm starts with each codeword node `C` sending its bit to each connected check node `F`.

The F0 node will receive bits 1 1 0 1 from C1, C3, C4 and C7, respectively.

The F1 node will receive bits 1 1 0 1 from C0, C1, C2 and C5.

The F2 node will receive bits 0 1 0 1 from C2, C5, C6 and C7.

The F3 node will receive bits 1 1 0 0 from C0, C3, C4 and C6.

The next step is to calculate an answer for each codeword node.

The received check word is 0 0 0 0 (calculated above), so a set of simple equations can be written. Each check node F takes three of its four received bits, XORs them (summing modulo 2, since this example works in the Galois field of order 2, GF(2)), and sends to the remaining codeword node the bit it expects to be correct to satisfy the received check bit. Here is an example for the first check node:

X0 ^ 1 ^ 0 ^ 1 = 0, so X0 = 0
1 ^ X1 ^ 0 ^ 1 = 0, so X1 = 0
1 ^ 1 ^ X2 ^ 1 = 0, so X2 = 1
1 ^ 1 ^ 0 ^ X3 = 0, so X3 = 0
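The four check-node equations above can be sketched in code: for each bit attached to F0, XOR the check bit with the three other received bits to solve for the unknown.

```python
received = [1, 1, 0, 1, 0, 1, 0, 1]  # corrupted codeword (bit C1 flipped)
f0_vars = [1, 3, 4, 7]               # variable nodes attached to check node F0
check_bit = 0                        # the received check bit for F0

results = []
for target in f0_vars:
    x = check_bit
    for j in f0_vars:
        if j != target:
            x ^= received[j]  # XOR in the other three known bits
    results.append(x)

print(results)  # [0, 0, 1, 0] -> X0 = 0, X1 = 0, X2 = 1, X3 = 0
```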

Each Xi is then sent back to the corresponding codeword node. After all check nodes are processed, the codeword nodes have the following sets of bits:

C0: 0 from F1, 1 from F3, 1 from originally received codeword.

C1: 0 from F0, 0 from F1, 1 from originally received codeword.

C2: 1 from F1, 0 from F2, 0 from originally received codeword.

C3: 0 from F0, 1 from F3, 1 from originally received codeword.

C4: 1 from F0, 0 from F3, 0 from originally received codeword.

C5: 0 from F1, 1 from F2, 1 from originally received codeword.

C6: 0 from F2, 0 from F3, 0 from originally received codeword.

C7: 1 from F0, 1 from F2, 1 from originally received codeword.

Then, voting on each bit (i.e. taking whichever value has more ‘votes’ out of the three cases in the table above), we get a new codeword:

1 0 0 1 0 1 0 1

The same steps are then repeated until the codeword stops changing. In our case it converges after the first run.

The soft-decision algorithm uses essentially the same logic, but it operates on the probabilities of each bit being 1 or 0. The probabilities are recalculated on each run, and once the probability is higher than the requested value (or the error probability is lower than the requested value), the loop stops.
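As a hedged sketch of the soft-decision idea, the decoder below works with log-likelihood ratios (LLRs, where positive means “probably 0”) and uses the min-sum approximation of belief propagation; the specific LLR values are an illustrative assumption, chosen so the flipped bit arrives with low confidence.

```python
def min_sum_decode(H, llr, max_iters=20):
    """Illustrative min-sum (approximate belief propagation) LDPC decoder."""
    m, n = len(H), len(H[0])
    edges = [(i, j) for i in range(m) for j in range(n) if H[i][j]]
    msg_cv = {e: 0.0 for e in edges}  # check -> variable messages
    word = [0 if l >= 0 else 1 for l in llr]
    for _ in range(max_iters):
        # Variable -> check: channel LLR plus all OTHER incoming messages.
        msg_vc = {(i, j): llr[j] + sum(msg_cv[(k, j)] for k in range(m)
                                       if H[k][j] and k != i)
                  for (i, j) in edges}
        # Check -> variable: sign product, magnitude = min of the others.
        for (i, j) in edges:
            others = [msg_vc[(i, k)] for k in range(n) if H[i][k] and k != j]
            sign = 1.0
            for v in others:
                sign *= 1.0 if v >= 0 else -1.0
            msg_cv[(i, j)] = sign * min(abs(v) for v in others)
        # Tentative decision from the total belief per bit.
        total = [llr[j] + sum(msg_cv[(i, j)] for i in range(m) if H[i][j])
                 for j in range(n)]
        word = [0 if t >= 0 else 1 for t in total]
        if all(sum(H[i][j] * word[j] for j in range(n)) % 2 == 0
               for i in range(m)):
            return word  # all parity checks satisfied: stop early
    return word

H = [[0, 1, 0, 1, 1, 0, 0, 1],
     [1, 1, 1, 0, 0, 1, 0, 0],
     [0, 0, 1, 0, 0, 1, 1, 1],
     [1, 0, 0, 1, 1, 0, 1, 0]]
# Assumed LLRs for the corrupted word 1 1 0 1 0 1 0 1: the flipped bit C1
# is deliberately given a weak (low-confidence) LLR of -0.5.
llr = [-4.0, -0.5, 4.0, -4.0, 4.0, -4.0, 4.0, -4.0]
print(min_sum_decode(H, llr))  # [1, 0, 0, 1, 0, 1, 0, 1]
```

Because the wrong bit carries the weakest belief, the strong extrinsic messages from its two check nodes overrule it in a single iteration, mirroring what the hard-decision vote did above.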

Real-world examples use much bigger codewords (up to several thousand bits), but the logic is always the same.
