
Mathematical Foundations of Information Theory (Dover Books on Mathematics)

ISBN-13: 9780486604343
ISBN-10: 0486604349
Edition: 1st Dover Edition
Author: A. Ya. Khinchin
Publication date: 1957
Publisher: Dover Publications
Format: Paperback 128 pages

Summary

Mathematical Foundations of Information Theory (Dover Books on Mathematics) (ISBN-13: 9780486604343, ISBN-10: 0486604349), written by A. Ya. Khinchin, was published by Dover Publications in 1957. With an overall rating of 3.9 stars, it is a notable title among Information Theory books (Computer Science, Electrical & Electronics, Engineering, History & Philosophy, Applied, Mathematics). You can easily purchase or rent Mathematical Foundations of Information Theory (Dover Books on Mathematics) (Paperback) from BooksRun, along with many other new and used Information Theory books and textbooks. And if you're looking to sell your copy, our current buyback offer is $0.58.

Description

The first comprehensive introduction to information theory, this book places the work begun by Shannon and continued by McMillan, Feinstein, and Khinchin on a rigorous mathematical basis. For the first time, mathematicians, statisticians, physicists, cyberneticists, and communications engineers are offered a lucid, comprehensive introduction to this rapidly growing field.
In his first paper, Dr. Khinchin develops the concept of entropy in probability theory as a measure of uncertainty of a finite “scheme,” and discusses a simple application to coding theory. The second paper investigates the restrictions previously placed on the study of sources, channels, and codes and attempts “to give a complete, detailed proof of both … Shannon theorems, assuming any ergodic source and any stationary channel with a finite memory.”
Partial Contents:
I. The Entropy Concept in Probability Theory — Entropy of Finite Schemes. The Uniqueness Theorem. Entropy of Markov chains. Application to Coding Theory.
II. On the Fundamental Theorems of Information Theory — Two generalizations of Shannon’s inequality. Three inequalities of Feinstein. Concept of a source. Stationarity. Entropy. Ergodic sources. The E property. The martingale concept. Noise. Anticipation and memory. Connection of the channel to the source. Feinstein’s Fundamental Lemma. Coding. The first Shannon theorem. The second Shannon theorem.
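As a small illustration of the central idea in the first paper — the entropy of a finite scheme as a measure of its uncertainty — here is a minimal Python sketch. It is not from the book; the function name `entropy` and the error handling are our own, and the formula is the standard Shannon entropy H = −Σ pᵢ log₂ pᵢ over a finite probability distribution.

```python
import math

def entropy(probabilities):
    """Shannon entropy (in bits) of a finite scheme: H = -sum(p_i * log2(p_i))."""
    if not math.isclose(sum(probabilities), 1.0):
        raise ValueError("probabilities must sum to 1")
    # Terms with p == 0 contribute nothing (p * log2(p) -> 0 as p -> 0).
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is the most uncertain two-outcome scheme: H = 1 bit.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is less uncertain, as the uniqueness theorem's axioms demand.
print(entropy([0.9, 0.1]))   # ≈ 0.469
```

The uniform scheme always maximizes H among schemes with the same number of outcomes, which is one of the properties Khinchin uses to characterize entropy axiomatically in the uniqueness theorem.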
