Information Theory and Statistics BibTeX Bibliography

Abstract of E. T. Jaynes, "Information Theory and Statistical Mechanics," Phys. Rev. 106, 620 (1957):

Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available.

It is concluded that statistical mechanics need not be regarded as a physical theory dependent for its validity on the truth of additional assumptions not contained in the laws of mechanics (such as ergodicity, metric transitivity, equal a priori probabilities, etc.). Furthermore, it is possible to maintain a sharp distinction between its physical and statistical aspects. The former consists only of the correct enumeration of the states of a system and their properties; the latter is a straightforward example of statistical inference.

  • Received 4 September 1956

DOI: https://doi.org/10.1103/PhysRev.106.620

©1957 American Physical Society
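As a concrete illustration of the procedure the abstract describes, the following is a minimal numerical sketch (not taken from the paper) of the maximum-entropy estimate. Suppose the only partial knowledge about a system with states of energy E_i is the mean energy. Maximizing the entropy subject to that single constraint gives the Gibbs form p_i = exp(-beta E_i) / Z(beta), where Z(beta) = sum_i exp(-beta E_i) is the partition function and beta is the Lagrange multiplier fixed by the constraint. The energy levels, the target mean, and the use of Python with NumPy/SciPy below are all assumptions chosen for illustration only.

    import numpy as np
    from scipy.optimize import brentq

    # Hypothetical energy levels and an assumed known mean energy (illustrative values only).
    E = np.array([0.0, 1.0, 2.0, 3.0])
    target_mean = 1.2

    def mean_energy(beta):
        # Gibbs distribution p_i = exp(-beta * E_i) / Z(beta); return its mean energy.
        w = np.exp(-beta * E)
        return (w @ E) / w.sum()

    # <E>(beta) decreases monotonically from max(E) to min(E), so bracket the root and solve.
    beta = brentq(lambda b: mean_energy(b) - target_mean, -50.0, 50.0)

    w = np.exp(-beta * E)
    Z = w.sum()                   # partition function
    p = w / Z                     # maximum-entropy (least biased) distribution
    S = -(p * np.log(p)).sum()    # entropy of that distribution, in nats
    print(beta, Z, p, S)

Any additional expectation constraint would simply add another Lagrange multiplier to the exponent, with the normalizer again playing the role of a partition function; this is the sense in which the usual computational rules of statistical mechanics follow directly from the maximum-entropy principle.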

Elements of Information Theory, Second Edition — Contents:

Preface to the Second Edition.

Preface to the First Edition.

Acknowledgments for the Second Edition.

Acknowledgments for the First Edition.

1. Introduction and Preview.

1.1 Preview of the Book.

2. Entropy, Relative Entropy, and Mutual Information.

2.1 Entropy.

2.2 Joint Entropy and Conditional Entropy.

2.3 Relative Entropy and Mutual Information.

2.4 Relationship Between Entropy and Mutual Information.

2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information.

2.6 Jensen’s Inequality and Its Consequences.

2.7 Log Sum Inequality and Its Applications.

2.8 Data-Processing Inequality.

2.9 Sufficient Statistics.

2.10 Fano’s Inequality.

Summary.

Problems.

Historical Notes.

3. Asymptotic Equipartition Property.

3.1 Asymptotic Equipartition Property Theorem.

3.2 Consequences of the AEP: Data Compression.

3.3 High-Probability Sets and the Typical Set.

Summary.

Problems.

Historical Notes.

4. Entropy Rates of a Stochastic Process.

4.1 Markov Chains.

4.2 Entropy Rate.

4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph.

4.4 Second Law of Thermodynamics.

4.5 Functions of Markov Chains.

Summary.

Problems.

Historical Notes.

5. Data Compression.

5.1 Examples of Codes.

5.2 Kraft Inequality.

5.3 Optimal Codes.

5.4 Bounds on the Optimal Code Length.

5.5 Kraft Inequality for Uniquely Decodable Codes.

5.6 Huffman Codes.

5.7 Some Comments on Huffman Codes.

5.8 Optimality of Huffman Codes.

5.9 Shannon–Fano–Elias Coding.

5.10 Competitive Optimality of the Shannon Code.

5.11 Generation of Discrete Distributions from Fair Coins.

Summary.

Problems.

Historical Notes.

6. Gambling and Data Compression.

6.1 The Horse Race.

6.2 Gambling and Side Information.

6.3 Dependent Horse Races and Entropy Rate.

6.4 The Entropy of English.

6.5 Data Compression and Gambling.

6.6 Gambling Estimate of the Entropy of English.

Summary.

Problems.

Historical Notes.

7. Channel Capacity.

7.1 Examples of Channel Capacity.

7.2 Symmetric Channels.

7.3 Properties of Channel Capacity.

7.4 Preview of the Channel Coding Theorem.

7.5 Definitions.

7.6 Jointly Typical Sequences.

7.7 Channel Coding Theorem.

7.8 Zero-Error Codes.

7.9 Fano’s Inequality and the Converse to the Coding Theorem.

7.10 Equality in the Converse to the Channel Coding Theorem.

7.11 Hamming Codes.

7.12 Feedback Capacity.

7.13 Source–Channel Separation Theorem.

Summary.

Problems.

Historical Notes.

8. Differential Entropy.

8.1 Definitions.

8.2 AEP for Continuous Random Variables.

8.3 Relation of Differential Entropy to Discrete Entropy.

8.4 Joint and Conditional Differential Entropy.

8.5 Relative Entropy and Mutual Information.

8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information.

Summary.

Problems.

Historical Notes.

9. Gaussian Channel.

9.1 Gaussian Channel: Definitions.

9.2 Converse to the Coding Theorem for Gaussian Channels.

9.3 Bandlimited Channels.

9.4 Parallel Gaussian Channels.

9.5 Channels with Colored Gaussian Noise.

9.6 Gaussian Channels with Feedback.

Summary.

Problems.

Historical Notes.

10. Rate Distortion Theory.

10.1 Quantization.

10.2 Definitions.

10.3 Calculation of the Rate Distortion Function.

10.4 Converse to the Rate Distortion Theorem.

10.5 Achievability of the Rate Distortion Function.

10.6 Strongly Typical Sequences and Rate Distortion.

10.7 Characterization of the Rate Distortion Function.

10.8 Computation of Channel Capacity and the Rate Distortion Function.

Summary.

Problems.

Historical Notes.

11. Information Theory and Statistics.

11.1 Method of Types.

11.2 Law of Large Numbers.

11.3 Universal Source Coding.

11.4 Large Deviation Theory.

11.5 Examples of Sanov’s Theorem.

11.6 Conditional Limit Theorem.

11.7 Hypothesis Testing.

11.8 Chernoff–Stein Lemma.

11.9 Chernoff Information.

11.10 Fisher Information and the Cramér–Rao Inequality.

Summary.

Problems.

Historical Notes.

12. Maximum Entropy.

12.1 Maximum Entropy Distributions.

12.2 Examples.

12.3 Anomalous Maximum Entropy Problem.

12.4 Spectrum Estimation.

12.5 Entropy Rates of a Gaussian Process.

12.6 Burg’s Maximum Entropy Theorem.

Summary.

Problems.

Historical Notes.

13. Universal Source Coding.

13.1 Universal Codes and Channel Capacity.

13.2 Universal Coding for Binary Sequences.

13.3 Arithmetic Coding.

13.4 Lempel–Ziv Coding.

13.5 Optimality of Lempel–Ziv Algorithms.

Summary.

Problems.

Historical Notes.

14. Kolmogorov Complexity.

14.1 Models of Computation.

14.2 Kolmogorov Complexity: Definitions and Examples.

14.3 Kolmogorov Complexity and Entropy.

14.4 Kolmogorov Complexity of Integers.

14.5 Algorithmically Random and Incompressible Sequences.

14.6 Universal Probability.

14.7 The Halting Problem and the Noncomputability of Kolmogorov Complexity.

14.8 Ω.

14.9 Universal Gambling.

14.10 Occam’s Razor.

14.11 Kolmogorov Complexity and Universal Probability.

14.12 Kolmogorov Sufficient Statistic.

14.13 Minimum Description Length Principle.

Summary.

Problems.

Historical Notes.

15. Network Information Theory.

15.1 Gaussian Multiple-User Channels.

15.2 Jointly Typical Sequences.

15.3 Multiple-Access Channel.

15.4 Encoding of Correlated Sources.

15.5 Duality Between Slepian–Wolf Encoding and Multiple-Access Channels.

15.6 Broadcast Channel.

15.7 Relay Channel.

15.8 Source Coding with Side Information.

15.9 Rate Distortion with Side Information.

15.10 General Multiterminal Networks.

Summary.

Problems.

Historical Notes.

16. Information Theory and Portfolio Theory.

16.1 The Stock Market: Some Definitions.

16.2 Kuhn–Tucker Characterization of the Log-Optimal Portfolio.

16.3 Asymptotic Optimality of the Log-Optimal Portfolio.

16.4 Side Information and the Growth Rate.

16.5 Investment in Stationary Markets.

16.6 Competitive Optimality of the Log-Optimal Portfolio.

16.7 Universal Portfolios.

16.8 Shannon–McMillan–Breiman Theorem (General AEP).

Summary.

Problems.

Historical Notes.

17. Inequalities in Information Theory.

17.1 Basic Inequalities of Information Theory.

17.2 Differential Entropy.

17.3 Bounds on Entropy and Relative Entropy.

17.4 Inequalities for Types.

17.5 Combinatorial Bounds on Entropy.

17.6 Entropy Rates of Subsets.

17.7 Entropy and Fisher Information.

17.8 Entropy Power Inequality and Brunn–Minkowski Inequality.

17.9 Inequalities for Determinants.

17.10 Inequalities for Ratios of Determinants.

Summary.

Problems.

Historical Notes.

Bibliography.

List of Symbols.

Index.

Elements of Information Theory, Second Edition, further updates the most successful book on information theory currently on the market.

"As expected, the quality of exposition continues to be a high point of the book. Clear explanations, nice graphical illustrations, and illuminating mathematical derivations make the book particularly useful as a textbook on information theory." (Journal of the American Statistical Association, March 2008)

"This book is recommended reading, both as a textbook and as a reference." (Computing Reviews.com, December 28, 2006)

  • The chapters have been reorganized to make the book more useful as a teaching tool.
  • Over 100 new problems have been added.
  • Updated references and historical notes refer to new areas of research.
  • Coverage of universal methods for source coding and for investment in the stock market, the feedback capacity of Gaussian channels, and the duality between source coding and channel coding has been expanded.
  • The new edition is also accompanied by a solutions manual.
  • An Instructor's Manual presenting detailed solutions to all the problems in the book is available from the Wiley editorial department. 
