Elements of Information Theory, 2nd Edition

Thomas M. Cover, Joy A. Thomas

ISBN: 978-0-471-74882-3 April 2005 776 Pages

Formats: E-Book ($103.99), Hardcover ($129.00), O-Book

Description

The latest edition of this classic is updated with new problem sets and material.


The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory.

All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points.
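
As an illustration of the book's first essential topic (a sketch of ours, not an excerpt from the text), the short Python snippet below computes the empirical Shannon entropy H(X) = -Σ p(x) log2 p(x) of a symbol sequence, in bits; the function name and sample data are assumptions made for this example.

    import math
    from collections import Counter

    def entropy_bits(symbols):
        """Empirical Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits."""
        counts = Counter(symbols)
        n = len(symbols)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    # A fair coin carries 1 bit per toss; a heavily biased one carries much less.
    print(entropy_bits("HTHTHTHT"))   # 1.0
    print(entropy_bits("HHHHHHHT"))   # ~0.54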

The Second Edition features:
* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references

Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.
Table of Contents

Preface to the Second Edition.

Preface to the First Edition.

Acknowledgments for the Second Edition.

Acknowledgments for the First Edition.

1. Introduction and Preview.

2. Entropy, Relative Entropy, and Mutual Information.

3. Asymptotic Equipartition Property.

4. Entropy Rates of a Stochastic Process.

5. Data Compression.

6. Gambling and Data Compression.

7. Channel Capacity.

8. Differential Entropy.

9. Gaussian Channel.

10. Rate Distortion Theory.

11. Information Theory and Statistics.

12. Maximum Entropy.

13. Universal Source Coding.

14. Kolmogorov Complexity.

15. Network Information Theory.

16. Information Theory and Portfolio Theory.

17. Inequalities in Information Theory.

Bibliography.

List of Symbols.

Index.

Elements of Information Theory, Second Edition, updates the most successful book on information theory currently on the market.

"As expected, the quality of exposition continues to be a high point of the book. Clear explanations, nice graphical illustrations, and illuminating mathematical derivations make the book particularly useful as a textbook on information theory." (Journal of the American Statistical Association, March 2008)

"This book is recommended reading, both as a textbook and as a reference." (Computing Reviews.com, December 28, 2006)
  • The chapters have been reorganized to make the book more useful as a teaching tool.
  • Over 100 new problems have been added.
  • Updated references and historical notes refer to new areas of research.
  • Coverage of universal methods for source coding and for investment in the stock market, of the feedback capacity of Gaussian channels, and of the duality between source and channel coding has been expanded.
  • The new edition will also be accompanied by a solutions manual.
  • An Instructor's Manual presenting detailed solutions to all the problems in the book is available from the Wiley editorial department.