Essentials of Error-Control Coding

by Jorge Castiñeira Moreira; Patrick G. Farrell
  • ISBN13: 9780470029206
  • ISBN10: 047002920X

  • Edition: 1st
  • Format: Hardcover
  • Copyright: 9/1/2006
  • Publisher: WILEY
  • Purchase Benefits
  • Free Shipping On Orders Over $35!
    Your order must be $35 or more to qualify for free economy shipping. Bulk sales, POs, Marketplace items, eBooks and apparel do not qualify for this offer.
  • Get Rewarded for Ordering Your Textbooks! Enroll Now
List Price: $95.94 Save up to $0.48
  • Buy New
    $95.46

    PRINT ON DEMAND: 2-4 WEEKS. THIS ITEM CANNOT BE CANCELLED OR RETURNED.

Summary

Rapid advances in electronic and optical technology have enabled the implementation of powerful error-control codes, which are now used in almost the entire range of information systems with close to optimal performance. These codes and decoding methods are required for the detection and correction of the errors and erasures which inevitably occur in digital information during transmission, storage and processing because of noise, interference and other imperfections. Error-control coding is a complex, novel and unfamiliar area, not yet widely understood and appreciated. This book sets out to provide a clear description of the essentials of the subject, with comprehensive and up-to-date coverage of the most useful codes and their decoding algorithms. A practical engineering and information technology emphasis, as well as relevant background material and fundamental theoretical aspects, provides an in-depth guide to the essentials of Error-Control Coding.

  • Provides extensive and detailed coverage of Block, Cyclic, BCH, Reed-Solomon, Convolutional, Turbo, and Low Density Parity Check (LDPC) codes, together with relevant aspects of Information Theory
  • EXIT chart performance analysis for iteratively decoded error-control techniques
  • Heavily illustrated with tables, diagrams, graphs, worked examples, and exercises
  • Invaluable companion website features slides of figures, algorithm software, updates and solutions to problems

Offering a complete overview of Error-Control Coding, this book is an indispensable resource for students, engineers and researchers in the areas of telecommunications engineering, communication networks, electronic engineering, computer science, information systems and technology, digital signal processing and applied mathematics.
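As a rough illustration of the basic idea (a minimal sketch in Python, not taken from the book), the simple repetition code introduced in Section 2.2.1 of the contents below corrects a single flipped bit per block purely by adding redundancy and taking a majority vote:

    # Minimal sketch (illustrative only): a rate-1/3 repetition code.
    # Each information bit is sent three times; the decoder majority-votes
    # each block of three received bits, so any single error per block is
    # detected and corrected.

    def encode(bits):
        return [b for b in bits for _ in range(3)]

    def decode(received):
        return [1 if sum(received[i:i + 3]) >= 2 else 0
                for i in range(0, len(received), 3)]

    message = [1, 0, 1]
    codeword = encode(message)          # [1, 1, 1, 0, 0, 0, 1, 1, 1]
    codeword[4] ^= 1                    # the channel flips one bit
    assert decode(codeword) == message  # the error is corrected

The codes treated in the book (BCH, Reed-Solomon, convolutional, turbo and LDPC codes, among others) achieve far stronger protection with much less redundancy than this.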

Author Biography

Jorge Castiñeira Moreira is Associate Professor (Senior Lecturer) in Communication Systems in the Electronics Department, School of Engineering, Mar del Plata University, Argentina.
He is Director of the Communications Laboratory, Director of the research project “Open Source software applications for wireless networks” and co-director of the research project “Information Theory. Data Networks. Chaos and Communications”. Jorge is also responsible for the teaching area “Communications”.

Patrick G. Farrell is Visiting Professor in the Department of Communication Systems at Lancaster University, UK, where he supervises 7 research assistants, 50 PhD and 35 MSc students. His research interests include error-control coding, coded modulation, digital communications, multi-user communications, information theory and source coding. Patrick has over 350 publications, reports and presentations, and is Editor of two book series (Academic Press and Research Studies Press).

Table of Contents

Preface & Acknowledgements.
Symbols & Abbreviations.
1 INFORMATION AND CODING THEORY.
1.1 Information.
1.1.1 Information measure.
1.2 Entropy and Information Rate.
1.3 Extended Discrete Memoryless Source.
1.4 Channels and Mutual Information.
1.4.1 Information transmission over discrete channels.
1.4.2 Information channels.
1.5 Channel Probability Relationships.
1.6 The a priori and a posteriori Entropies.
1.7 Mutual Information.
1.7.1 Mutual information: Definition.
1.7.2 Mutual information: Properties.
1.8 Capacity of a Discrete Channel.
1.9 Shannon’s Theorems.
1.9.1 Source coding theorem.
1.9.2 Channel capacity and coding.
1.9.3 Channel coding theorem.
1.10 Signal Spaces and the Channel Coding Theorem.
1.10.1 Capacity of the Gaussian Channel.
1.11 Error Control Coding.
1.12 Limits to Communication and their Consequences.
Bibliography and References.
Problems.
2 BLOCK CODES.
2.1 Error Control Coding.
2.2 Error Detection and Correction.
2.2.1 Simple codes: The repetition code.
2.3 Block Codes: Introduction and Parameters.
2.4 The Vector Space over the Binary Field.
2.4.1 Vector subspaces.
2.4.2 Dual subspaces.
2.4.3 Matrix form.
2.4.4 Dual subspace matrix.
2.5 Linear Block Codes.
2.5.1 Generator matrix G.
2.5.2 Block codes in systematic form.
2.5.3 Parity check matrix H.
2.6 Syndrome Error Detection.
2.7 Minimum Distance of a Block Code.
2.7.1 Minimum distance and the structure of the H matrix.
2.8 Error Correction Capability of a Block Code.
2.9 Syndrome Detection and the Standard Array.
2.10 Hamming Codes.
2.11 Forward Error Correction (FEC) and Automatic Repeat ReQuest (ARQ).
2.11.1 Forward error correction (FEC).
2.11.2 Automatic Repeat ReQuest (ARQ).
2.11.3 ARQ schemes.
2.11.3.1 Stop and Wait.
2.11.3.2 Go back N.
2.11.3.3 Selective Repeat.
2.11.4 ARQ scheme efficiencies.
2.11.5 Hybrid-ARQ schemes.
Bibliography and References.
Problems.
3 CYCLIC CODES.
3.1 Description.
3.2 Polynomial Representation of Codewords.
3.3 Generator Polynomial of a Cyclic Code.
3.4 Cyclic Codes in Systematic Form.
3.5 Generator Matrix of a Cyclic Code.
3.6 Syndrome Calculation and Error Detection.
3.7 Decoding of Cyclic Codes.
3.8 An Application Example: CRC Code for the Ethernet Standard.
Bibliography and References.
Problems.
4 BCH CODES.
4.1 Introduction: The Minimal Polynomial.
4.2 Description of BCH Cyclic Codes.
4.2.1 Bounds on the error-correction capability of a BCH code: the Vandermonde determinant.
4.3 Decoding of BCH Codes.
4.4 Error Location and Error Evaluation Polynomials.
4.5 The Key Equation.
4.6 Decoding of BCH Codes using the Euclidean Algorithm.
4.6.1 The Euclidean algorithm.
Bibliography and References.
Problems.
5 REED-SOLOMON CODES.
5.1 Introduction.
5.2 Error Correction Capability of RS Codes: The Vandermonde Determinant.
5.3 RS Codes in Systematic Form.
5.4 Syndrome Decoding of RS Codes.
5.5 The Euclidean Algorithm. Error Location and Evaluation Polynomials.
5.6 Decoding of RS Codes using the Euclidean Algorithm.
5.6.1 Steps of the Euclidean algorithm.
5.7 Decoding of RS and BCH Codes using the Berlekamp-Massey Algorithm.
5.7.1 Berlekamp-Massey iterative algorithm for finding the error location polynomial.
5.7.2 Berlekamp-Massey decoding of RS codes.
5.7.3 Relationship between the error location polynomials of the Euclidean and Berlekamp-Massey algorithms.
5.8 A Practical Application: Error-Control Coding for the Compact Disc (CD).
5.8.1 CD Characteristics.
5.8.2 Channel characteristics.
5.8.3 Coding procedure.
5.9 Encoding for RS codes C(RS)(28,24), C(RS)(32,28) and C(RS)(255,251).
5.10 Decoding of RS Codes C(RS)(28,24) and C(RS)(32,28).
5.10.1 Berlekamp-Massey decoding.
5.10.2 Alternative decoding methods.
5.10.3 Direct solution of syndrome equations.
5.11 Importance of Interleaving.
Bibliography and References.
Problems.
6 CONVOLUTIONAL CODES.
6.1 Linear Sequential Circuits.
6.2 Convolutional Codes and Encoders.
6.3 Description in the D-Transform Domain.
6.4 Convolutional Encoder Representations.
6.4.1 Representation of connections.
6.4.2 State diagram representation.
6.4.3 Trellis representation.
6.5 Convolutional Codes in Systematic Form.
6.6 General Structure of FIR and IIR Finite State Sequential Machines.
6.6.1 FIR FSSM.
6.6.2 IIR FSSM.
6.7 State Transfer Function Matrix: Calculation of the Transfer Function.
6.7.1 State transfer function for FIR FSSMs.
6.7.2 State transfer function for IIR FSSMs.
6.8 Relationship between the Systematic and Non-Systematic Forms.
6.9 Distance Properties of Convolutional Codes.
6.10 Minimum Free Distance of a Convolutional Code.
6.11 Maximum Likelihood Detection (MLD).
6.12 Decoding of Convolutional Codes: The Viterbi Algorithm.
6.13 Extended and Modified State Diagram.
6.14 Error Probability Analysis for Convolutional Codes.
6.15 Hard and Soft Decisions.
6.15.1 Maximum likelihood criterion for the Gaussian channel.
6.15.2 Bounds for soft decision detection.
6.15.3 An example of soft decision decoding of convolutional codes.
6.16 Punctured Convolutional Codes and Rate Compatible Schemes.
Bibliography and References.
Problems.
7 TURBO CODES.
7.1 A Turbo Encoder.
7.2 Decoding of Turbo Codes.
7.2.1 Turbo decoder.
7.2.2 Probabilities and estimates.
7.2.3 Symbol detection.
7.2.4 The log likelihood ratio.
7.3 Markov Sources and Discrete Channels.
7.4 The BCJR Algorithm: Trellis Coding and Discrete Memoryless Channels.
7.5 Iterative Coefficient Calculation.
7.6 MAP BCJR Algorithm and the Log Likelihood Ratio.
7.6.1 The BCJR MAP algorithm: LLR calculation.
7.6.2 Calculation of coefficients γi(u′,u).
7.7 Turbo Decoding.
7.7.1 Initial conditions of coefficients αj-1(u′) and βj(u).
7.8 Construction Methods for Turbo Codes.
7.8.1 Interleavers.
7.8.2 Block interleavers.
7.8.3 Convolutional interleavers.
7.8.4 Random interleavers.
7.8.5 Linear interleavers.
7.8.6 Code concatenation methods.
7.8.6.1 Serial concatenation.
7.8.6.2 Parallel concatenation.
7.8.7 Turbo code performance as a function of size and type of interleaver.
7.9 Other Decoding Algorithms for Turbo Codes.
7.10 EXIT Charts for Turbo Codes.
7.10.1 Introduction to EXIT charts.
7.10.2 Construction of the EXIT chart.
7.10.3 Extrinsic transfer characteristics of the constituent decoders.
Bibliography and References.
Problems.
8 LOW-DENSITY PARITY-CHECK (LDPC) CODES.
8.1 Different Systematic Forms of a Block Code.
8.2 Description of LDPC Codes.
8.3 Construction of LDPC Codes.
8.3.1 Regular LDPC codes.
8.3.2 Irregular LDPC codes.
8.3.3 Decoding of LDPC codes: the Tanner graph.
8.4 The Sum-Product Algorithm.
8.5 Sum-Product Algorithm for LDPC Codes: An Example.
8.6 Simplifications of the Sum-Product Algorithm.
8.7 A Logarithmic LDPC Decoder.
8.7.1 Initialization.
8.7.2 Horizontal step.
8.7.3 Vertical step.
8.7.4 Summary of the logarithmic decoding algorithm.
8.7.5 Construction of the lookup table.
8.8 EXIT Charts for LDPC Codes.
8.8.1 Introduction.
8.8.2 Iterative decoding of block codes.
8.8.3 EXIT chart construction for LDPC codes.
8.8.4 Mutual information function.
8.8.5 EXIT chart for the symbol node decoder (SND).
8.8.6 EXIT chart for the parity check node decoder (PCND).
8.9 Fountain and LT Codes.
8.9.1 Introduction.
8.9.2 Fountain codes.
8.9.3 Linear random codes.
8.9.4 LT codes.
8.9.4.1 LT encoder.
8.9.4.2 LT decoder.
8.10 LDPC and Turbo Codes.
Bibliography and References.
Problems.
APPENDIX A: Error Probability in the Transmission of Digital Signals.
A.1 Digital Signalling.
A.1.1 Pulse amplitude modulated digital signals.
A.2 Bit Error Rate (BER).
Bibliography.
APPENDIX B: Galois Fields GF(q).
B.1 Groups.
B.2 Addition and Multiplication modulo-2.
B.3 Fields.
B.4 Polynomials over Binary Fields.
B.5 Construction of a Galois Field GF(2m).
B.6 Properties of Extended Galois Fields GF(2m). 
B.7 Minimal Polynomials.
B.7.1 Properties of Minimal Polynomials.
Bibliography.
Answers to Problems.
Index.

Supplemental Materials

What is included with this book?

The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any access cards, study guides, lab manuals, CDs, etc.

The Used, Rental and eBook copies of this book are not guaranteed to include any supplemental materials. Typically, only the book itself is included. This is true even if the title states it includes any access cards, study guides, lab manuals, CDs, etc.
