Amazon no longer offers textbook rentals. We do!

We're the #1 textbook rental company. Let us show you why.

Science and Information Theory, Second Edition

by Léon Brillouin
  • ISBN13:

    9780486439181

  • ISBN10:

    0486439186

  • Edition: 2nd
  • Format: Hardcover
  • Copyright: 2004-09-07
  • Publisher: Dover Publications
Purchase Benefits
  • Free Shipping On Orders Over $35! Your order must be $35 or more to qualify for free economy shipping. Bulk sales, POs, Marketplace items, eBooks, and apparel do not qualify for this offer.
  • Get Rewarded for Ordering Your Textbooks! Enroll Now
List Price: $65.00

Summary

A classic source for understanding the connections between information theory and physics, this text was written by one of the giants of 20th-century physics. Topics include the principles of coding, coding problems and solutions, the analysis of signals, a summary of thermodynamics, thermal agitation and Brownian motion, plus an examination of Maxwell's demon. 81 figures. 14 tables.

Table of Contents

Preface v
Introduction vii
Chapter 1. THE DEFINITION OF INFORMATION 1(10)
1. Definition of Information 1(1)
2. Unit Systems 2(1)
3. Generalization and Examples 3(1)
4. Information Using the Alphabet 4(1)
5. Information Content in a Set of Symbols with Different a priori Probabilities 5(3)
6. General Remarks 8(3)
Chapter 2. APPLICATION OF THE DEFINITIONS AND GENERAL DISCUSSION 11(10)
1. Definitions 11(1)
2. Property A 12(1)
3. Property B 13(1)
4. Property C 14(3)
5. Joint Events 17(2)
6. Conditional Information 19(2)
Chapter 3. REDUNDANCY IN THE ENGLISH LANGUAGE 21(7)
1. Correlation and Joint Events 21(1)
2. Correlation in Language 22(1)
3. Redundancy in Language 23(2)
4. Some Typical Experiments 25(1)
5. Coding Devices 26(2)
Chapter 4. PRINCIPLES OF CODING, DISCUSSION OF THE CAPACITY OF A CHANNEL 28(23)
1. Introduction 28(1)
2. Definition of a Channel and its Capacity 28(2)
3. Symbols, Words, and Messages in Sequential Coding 30(2)
4. Discussion 32(2)
5. Examples 34(3)
6. Computation of the Capacity of a Channel 37(1)
7. Matching a Code with a Channel 38(3)
8. General Problem: Symbols with Different Lengths 41(3)
9. The Matching Problem 44(1)
10. Problems of Word Statistics (Mandelbrot) 44(3)
11. Solving the Matching Problem 47(2)
Appendix 49(2)
Chapter 5. CODING PROBLEMS 51(11)
1. Alphabetic Coding, Binary System 51(2)
2. Alphabetic Coding, Ternary System 53(1)
3. Alphabet and Numbers 54(1)
4. Binary Coding by Words 55(3)
5. Alphabetic Coding by Words 58(1)
6. Coding Based on Letter Groups and on Correlation 58(4)
Chapter 6. ERROR DETECTING AND CORRECTING CODES 62(9)
1. Error Detecting Codes 62(1)
2. Single Error Detecting Codes 63(3)
3. Single Error Correcting and Double Error Correcting Codes 66(1)
4. Efficiency of Self-Correcting Codes 67(2)
5. The Capacity of a Binary Channel with Noise 69(2)
Chapter 7. APPLICATIONS TO SOME SPECIAL PROBLEMS 71(7)
1. The Problem of Filing Using a Miscellaneous Cell 71(2)
2. Filing with Cross Referencing 73(2)
3. The Most Favorable Number of Signals per Elementary Cell 75(3)
Chapter 8. THE ANALYSIS OF SIGNALS: FOURIER METHOD AND SAMPLING PROCEDURE 78(36)
1. Fourier Series 78(2)
2. The Gibbs' Phenomenon and Convergence of Fourier Series 80(3)
3. Fourier Integrals 83(4)
4. The Role of Finite Frequency Band Width 87(2)
5. The Uncertainty Relation for Time and Frequency 89(4)
6. Degrees of Freedom of a Message 93(4)
7. Shannon's Sampling Method 97(2)
8. Gabor's Information Cells 99(2)
9. Autocorrelation and Spectrum; the Wiener-Khintchine Formula 101(2)
10. Linear Transformations and Filters 103(2)
11. Fourier Analysis and the Sampling Method in Three Dimensions 105(6)
12. Crystal Analysis by X-Rays 111(2)
Appendix. Schwarz' Inequality 113(1)
Chapter 9. SUMMARY OF THERMODYNAMICS 114(14)
1. Introduction 114(1)
2. The Two Principles of Thermodynamics; Entropy and Negentropy 114(3)
3. Impossibility of Perpetual Motion; Thermal Engines 117(2)
4. Statistical Interpretation of Entropy 119(2)
5. Examples of Statistical Discussions 121(1)
6. Energy Fluctuations; Gibbs Formula 122(2)
7. Quantized Oscillator 124(1)
8. Fluctuations 125(3)
Chapter 10. THERMAL AGITATION AND BROWNIAN MOTION 128(13)
1. Thermal Agitation 128(1)
2. Random Walk 129(3)
3. Shot Effect 132(2)
4. Brownian Motion 134(3)
5. Thermal Agitation in an Electric Circuit 137(2)
Appendix 139(2)
Chapter 11. THERMAL NOISE IN AN ELECTRIC CIRCUIT; NYQUIST'S FORMULA 141(11)
1. Random Impulses Model 141(2)
2. The Nyquist Method 143(2)
3. Discussion and Applications 145(1)
4. Generalizations of Nyquist's Formula 146(2)
5. Thermal Agitation in a Rectifier 148(4)
Chapter 12. THE NEGENTROPY PRINCIPLE OF INFORMATION 152(10)
1. The Relation between Information and Entropy 152(1)
2. The Negentropy Principle of Information; Generalization of Carnot's Principle 153(3)
3. Some Typical Physical Examples 156(3)
4. Some General Remarks 159(3)
Chapter 13. MAXWELL'S DEMON AND THE NEGENTROPY PRINCIPLE OF INFORMATION 162(22)
1. Maxwell's Demon: Historical Survey 162(2)
2. The Demon Exorcised 164(2)
3. Discussion 166(2)
4. The Demon's Operation as a Transformation of Information into Negative Entropy 168(4)
5. The Negentropy Required in the Observation 172(4)
6. Szilard's Problem: The Well-Informed Heat Engine 176(3)
7. Gabor's Discussion 179(3)
Appendix I 182(1)
Appendix II 183(1)
Chapter 14. THE NEGENTROPY PRINCIPLE OF INFORMATION IN GENERAL PHYSICS 184(18)
1. The Problem of Measurements in Physics 184(1)
2. Observations Made on an Oscillator 185(3)
3. High-Frequency Resonator and the Cost of an Observation 188(2)
4. Experiments Requiring Many Simultaneous Observations at Low Frequencies 190(4)
5. Problems Requiring High Reliability 194(2)
6. A More Accurate Discussion of Experiments Using High Frequencies 196(2)
7. An Example Showing the Minimum Negentropy Required in an Observation 198(4)
Chapter 15. OBSERVATION AND INFORMATION 202(27)
1. Experimental Errors and Information 202(2)
2. Length Measurements with Low Accuracy 204(2)
3. Length Measurements with High Accuracy 206(3)
4. Efficiency of an Observation 209(1)
5. Measurement of a Distance with an Interferometer 210(3)
6. Another Scheme for Measuring Distance 213(4)
7. The Measurement of Time Intervals 217(2)
8. Observation under a Microscope 219(4)
9. Discussion of the Focus in a Wave Guide 223(3)
10. Examples and Discussion 226(2)
11. Summary 228(1)
Chapter 16. INFORMATION THEORY, THE UNCERTAINTY PRINCIPLE, AND PHYSICAL LIMITS OF OBSERVATION 229(16)
1. General Remarks 229(2)
2. An Observation is an Irreversible Process 231(1)
3. General Limitations in the Accuracy of Physical Measurements 232(3)
4. The Limits of Euclidean Geometry 235(1)
5. Possible Use of Heavy Particles Instead of Photons 236(2)
6. Uncertainty Relations in the Microscope Experiment 238(3)
7. Measurement of Momentum 241(2)
8. Uncertainty in Field Measurements 243(2)
Chapter 17. THE NEGENTROPY PRINCIPLE OF INFORMATION IN TELECOMMUNICATIONS 245(14)
1. The Analysis of Signals with Finite Band Width 245(1)
2. Signals and Thermal Noise: Representation in Hyperspace 246(1)
3. The Capacity of a Channel with Noise 247(1)
4. Discussion of the Tuller-Shannon Formula 248(4)
5. A Practical Example 252(2)
6. The Negentropy Principle Applied to the Channel with Noise 254(3)
7. Gabor's Modified Formula and the Role of Beats 257(2)
Chapter 18. WRITING, PRINTING, AND READING 259(8)
1. The Transmission of Information: Live Information 259(1)
2. The Problem of Reading and Writing 260(1)
3. Dead Information and How to Bring It Back to Life 261(2)
4. Writing and Printing 263(1)
5. Discussion of a Special Example 264(1)
6. New Information and Redundancy 265(2)
Chapter 19. THE PROBLEM OF COMPUTING 267(20)
1. Computing Machines 267(2)
2. The Computer as a Mathematical Element 269(4)
3. The Computer as a Circuit Element, Sampling and Desampling (Linvill and Salzer) 273(2)
4. Computing on Sampled Data at Time I 275(2)
5. The Transfer Function for a Computer 277(2)
6. Circuits Containing a Computer, The Problem of Stability 279(2)
7. Discussion of the Stability of a Program 281(2)
8. A Few Examples 283(4)
Chapter 20. INFORMATION, ORGANIZATION, AND OTHER PROBLEMS 287(15)
1. Information and Organization 287(2)
2. Information Contained in a Physical Law 289(2)
3. Information Contained in a Numerical Table 291(2)
4. General Remarks 293(1)
5. Examples of Problems Beyond the Present Theory 294(3)
6. Problems of Semantic Information 297(5)
Chapter 21. INEVITABLE ERRORS, DETERMINISM, AND INFORMATION 302(19)
1. Information in Science 302(1)
2. Information is Finite 302(1)
3. The Viewpoint of M. Born 303(1)
4. Observation and Experimental Errors 304(1)
5. A Simple Example for Discussion: Laplace's Demon Exorcised 305(3)
6. Some More Examples: Anharmonic Oscillators, Rectifier 308(3)
7. The Anomaly of the Harmonic Oscillator 311(3)
8. The Problem of Determinism 314(2)
9. Information Theory and our Preceding Examples 316(2)
10. Observation and Interpretation 318(2)
11. Conclusions 320(1)
Chapter 22. THE PROBLEM OF VERY SMALL DISTANCES 321(8)
1. The Difficulties in Measuring Extremely Small Distances 321(1)
2. The Possible Use of These Remarks for the Computation of Diverging Integrals in Physics 322(2)
3. Example: Electromagnetic Mass of the Electron 324(1)
4. A Justification of our Assumptions: Schrödinger's Zitterbewegung 325(1)
5. Discussion and Possible Generalizations 326(3)
Author Index 329(2)
Subject Index 331(18)
Books published by L. Brillouin 349

Supplemental Materials

What is included with this book?

A new copy of this book will include any supplemental materials advertised. Please check the title of the book to determine whether it should include any access cards, study guides, lab manuals, CDs, etc.

Used, rental, and eBook copies of this book are not guaranteed to include any supplemental materials. Typically, only the book itself is included, even if the title states that it includes access cards, study guides, lab manuals, CDs, etc.
