The Mathematical Theory of Information
Author: Jan Kåhre
Publisher: Springer Science & Business Media
ISBN: 1461509750
Category : Technology & Engineering
Languages : en
Pages : 517
Book Description
The general concept of information is here, for the first time, defined mathematically by adding a single axiom to probability theory. This Mathematical Theory of Information is explored in fourteen chapters:
1. Information can be measured in different units, in anything from bits to dollars. We argue here that any measure is acceptable if it does not violate the Law of Diminishing Information. This law is supported by two independent arguments: one derived from the Bar-Hillel ideal receiver, the other based on Shannon's noisy channel. The entropy of 'classical information theory' is one of the measures conforming to the Law of Diminishing Information, but it has properties, such as symmetry, that make it unsuitable for some applications. The measure called reliability is found to be a universal information measure.
2. For discrete and finite signals, the Law of Diminishing Information is defined mathematically, using probability theory and matrix algebra.
3. The Law of Diminishing Information is used as an axiom to derive essential properties of information, among them Byron's law (there is more information in a lie than in gibberish) and preservation (no information is lost in a reversible channel).
The Mathematical Theory of Information supports colligation, i.e. the ability to bind facts together, making 'two plus two greater than four'. Colligation is essential when the information carries knowledge or serves as a basis for decisions; in such cases, reliability is always a useful information measure. Entropy does not allow colligation.
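The Law of Diminishing Information is closely related to the classical data processing inequality: passing a signal through a further channel can never increase the information it carries about the source. As an illustration only (the book's own formalism and measures differ), here is a minimal Python sketch using Shannon's mutual information and made-up channel matrices:

```python
import numpy as np

def mutual_information(p_x, channel):
    """I(X;Y) in bits, for input distribution p_x and a row-stochastic
    channel matrix with entries P(Y=j | X=i)."""
    p_xy = p_x[:, None] * channel            # joint P(X=i, Y=j)
    p_y = p_xy.sum(axis=0)                   # output marginal P(Y=j)
    prod = p_x[:, None] * p_y[None, :]       # product of the marginals
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / prod[mask])))

# Hypothetical binary source and two noisy channels in cascade: X -> Y -> Z.
p_x = np.array([0.5, 0.5])
ch1 = np.array([[0.9, 0.1],
                [0.2, 0.8]])
ch2 = np.array([[0.7, 0.3],
                [0.3, 0.7]])

i_xy = mutual_information(p_x, ch1)
i_xz = mutual_information(p_x, ch1 @ ch2)   # cascade = product of channel matrices
print(f"I(X;Y) = {i_xy:.4f} bits >= I(X;Z) = {i_xz:.4f} bits")
assert i_xz <= i_xy + 1e-12                 # the information can only diminish
```

Note that cascading the two channels is just the matrix product of their transition matrices, which is the 'matrix algebra' the description alludes to.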
The Mathematical Theory of Communication
Author: Claude E Shannon
Publisher: University of Illinois Press
ISBN: 025209803X
Category : Language Arts & Disciplines
Languages : en
Pages : 141
Book Description
Scientific knowledge grows at a phenomenal pace--but few books have had as lasting an impact or played as important a role in our modern world as The Mathematical Theory of Communication, published originally as a paper on communication theory more than fifty years ago. Republished in book form shortly thereafter, it has since gone through four hardcover and sixteen paperback printings. It is a revolutionary work, astounding in its foresight and contemporaneity. The University of Illinois Press is pleased and honored to issue this commemorative reprinting of a classic.
The Mathematical Theory of Communication
Author: Claude Elwood Shannon
Publisher:
ISBN:
Category : Information theory
Languages : en
Pages : 136
Book Description
Mathematical Foundations of Information Theory
Author: Aleksandr Yakovlevich Khinchin
Publisher: Courier Corporation
ISBN: 0486604349
Category : Mathematics
Languages : en
Pages : 130
Book Description
First comprehensive introduction to information theory explores the work of Shannon, McMillan, Feinstein, and Khinchin. Topics include the entropy concept in probability theory, fundamental theorems, and other subjects. 1957 edition.
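The entropy concept the description mentions is compact enough to show directly; a minimal sketch (not code from the book):

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum p * log2(p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))    # 1.0 bit: a fair coin
print(entropy([0.9, 0.1]))    # ~0.47 bits: a biased coin is more predictable
print(entropy([0.25] * 4))    # 2.0 bits: four equally likely outcomes
```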
Mathematical Theory of Entropy
Author: Nathaniel F. G. Martin
Publisher: Cambridge University Press
ISBN: 9780521177382
Category : Computers
Languages : en
Pages : 292
Book Description
This excellent 1981 treatment of the mathematical theory of entropy gives an accessible exposition of its application to other fields.
Information: A Very Short Introduction
Author: Luciano Floridi
Publisher: OUP Oxford
ISBN: 0191609544
Category : Computers
Languages : en
Pages : 153
Book Description
We live an information-soaked existence - information pours into our lives through television, radio, books, and of course, the Internet. Some say we suffer from 'infoglut'. But what is information? The concept of 'information' is a profound one, rooted in mathematics, central to whole branches of science, yet with implications for every aspect of our everyday lives: DNA provides the information to create us; we learn through the information fed to us; we relate to each other through information transfer - gossip, lectures, reading. Information is not only a mathematically powerful concept, but its critical role in society raises wider ethical issues: who owns information? Who controls its dissemination? Who has access to information? Luciano Floridi, a philosopher of information, cuts across many subjects, from a brief look at the mathematical roots of information - its definition and measurement in 'bits' - to its role in genetics (we are information), and its social meaning and value. He ends by considering the ethics of information, including issues of ownership, privacy, and accessibility; copyright and open source. For those unfamiliar with its precise meaning and wide applicability as a philosophical concept, 'information' may seem a bland or mundane topic. Those who have studied some science, philosophy, or sociology will already be aware of its centrality and richness. But for all readers, whether from the humanities or sciences, Floridi gives a fascinating and inspirational introduction to this most fundamental of ideas. ABOUT THE SERIES: The Very Short Introductions series from Oxford University Press contains hundreds of titles in almost every subject area. These pocket-sized books are the perfect way to get ahead in a new subject quickly. Our expert authors combine facts, analysis, perspective, new ideas, and enthusiasm to make interesting and challenging topics highly readable.
Information Theory
Author: JV Stone
Publisher: Sebtel Press
ISBN: 0956372856
Category : Business & Economics
Languages : en
Pages : 259
Book Description
Originally developed by Claude Shannon in the 1940s, information theory laid the foundations for the digital revolution, and is now an essential tool in telecommunications, genetics, linguistics, brain sciences, and deep space communication. In this richly illustrated book, accessible examples are used to introduce information theory in terms of everyday games like ‘20 questions’ before more advanced topics are explored. Online MATLAB and Python computer programs provide hands-on experience of information theory in action, and PowerPoint slides give support for teaching. Written in an informal style, with a comprehensive glossary and tutorial appendices, this text is an ideal primer for novices who wish to learn the essential principles and applications of information theory.
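The '20 questions' framing rests on a simple fact: each ideal yes/no question contributes one bit, so k questions can distinguish 2^k equally likely possibilities. A small illustration in that spirit (the book's actual online programs are separate):

```python
import math

def questions_needed(n_items):
    """Yes/no questions needed to identify one of n equally likely items:
    each ideal question halves the candidates, so ceil(log2(n))."""
    return math.ceil(math.log2(n_items))

# Twenty well-chosen questions can separate 2**20 = 1,048,576 possibilities.
for n in (2, 64, 1_000_000):
    print(f"{n:>9} items -> {questions_needed(n)} questions "
          f"({math.log2(n):.2f} bits)")
```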
A Mathematical Theory of Evidence
Author: Glenn Shafer
Publisher: Princeton University Press
ISBN: 0691214697
Category : Mathematics
Languages : en
Pages :
Book Description
Both in science and in practical affairs we reason by combining facts only inconclusively supported by evidence. Building on an abstract understanding of this process of combination, this book constructs a new theory of epistemic probability. The theory draws on the work of A. P. Dempster but diverges from Dempster's viewpoint by identifying his "lower probabilities" as epistemic probabilities and taking his rule for combining "upper and lower probabilities" as fundamental. The book opens with a critique of the well-known Bayesian theory of epistemic probability. It then proceeds to develop an alternative to the additive set functions and the rule of conditioning of the Bayesian theory: set functions that need only be what Choquet called "monotone of order ∞," and Dempster's rule for combining such set functions. This rule, together with the idea of "weights of evidence," leads to both an extensive new theory and a better understanding of the Bayesian theory. The book concludes with a brief treatment of statistical inference and a discussion of the limitations of epistemic probability. Appendices contain mathematical proofs, which are relatively elementary and seldom depend on mathematics more advanced than the binomial theorem.
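Dempster's rule of combination, which the description takes as fundamental, combines two bodies of evidence by multiplying the masses on intersecting hypotheses and renormalizing away the conflict. A minimal Python sketch with invented evidence, not code from the book:

```python
def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass functions whose focal elements
    are frozensets, discarding and renormalizing conflicting mass."""
    combined, conflict = {}, 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            a = b & c
            if a:                                   # non-empty intersection
                combined[a] = combined.get(a, 0.0) + mb * mc
            else:                                   # contradictory evidence
                conflict += mb * mc
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

# Two hypothetical pieces of evidence about a suspect's identity.
m1 = {frozenset({"alice"}): 0.6, frozenset({"alice", "bob"}): 0.4}
m2 = {frozenset({"bob"}): 0.5, frozenset({"alice", "bob"}): 0.5}
print(dempster_combine(m1, m2))   # alice ~0.43, bob ~0.29, {alice, bob} ~0.29
```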
Entropy and Information Theory
Author: Robert M. Gray
Publisher: Springer Science & Business Media
ISBN: 1475739826
Category : Computers
Languages : en
Pages : 346
Book Description
This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. These tools form an area common to ergodic theory and information theory and comprise several quantitative notions of the information in random variables, random processes, and dynamical systems. Examples are entropy, mutual information, conditional entropy, conditional information, and discrimination or relative entropy, along with the limiting normalized versions of these quantities such as entropy rate and information rate. Much of the book is concerned with their properties, especially the long-term asymptotic behavior of sample information and expected information. This is the only up-to-date treatment of traditional information theory emphasizing ergodic theory.
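Of the quantities the description lists, the entropy rate shows the ergodic-theory flavor most directly: for a stationary Markov chain it is the stationary-weighted average of the row entropies of the transition matrix. A small numpy sketch with invented numbers:

```python
import numpy as np

def entropy_rate(P):
    """Entropy rate (bits per step) of a stationary Markov chain with
    row-stochastic transition matrix P: sum_i pi_i * H(P[i, :])."""
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])  # stationary dist.
    pi = pi / pi.sum()
    safe = np.where(P > 0, P, 1.0)             # avoid log(0); those terms vanish
    row_h = -np.sum(P * np.log2(safe), axis=1) # per-state entropy H(P[i, :])
    return float(pi @ row_h)

# A sticky two-state chain (hypothetical numbers): state 0 tends to persist.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
print(f"entropy rate: {entropy_rate(P):.4f} bits/step")  # ~0.57
```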
Information Theory
Author: Sergio Verdú
Publisher: Wiley-IEEE Press
ISBN:
Category : Computers
Languages : en
Pages : 776
Book Description