A General Theory of Entropy
Author: Kofi Kissi Dompere
Publisher: Springer
ISBN: 3030181596
Category : Technology & Engineering
Languages : en
Pages : 286
Book Description
This book presents an epistemic framework for dealing with information-knowledge and certainty-uncertainty problems within the space of quality-quantity dualities. It bridges theoretical concepts of entropy and entropy measurement, proposing the concept and measurement of fuzzy-stochastic entropy, which is applicable to all areas of knowing under human cognitive limitations over the epistemological space. The book builds on two previous monographs by the same author concerning the theories of info-statics and info-dynamics, which deal with identification and transformation problems respectively. The theoretical framework is developed using toolboxes such as the principle of opposites, systems of actual-potential polarities and negative-positive dualities, under different cost-benefit time-structures. Category theory and the fuzzy paradigm of thought, under the methodological constructionism-reductionism duality, are used in the fuzzy-stochastic and cost-benefit spaces to point to directions of global application in knowing, knowledge and decision-choice actions. The book is thus concerned with a general theory of entropy, showing how the fuzzy paradigm of thought is developed to deal with problems of qualitative-quantitative uncertainty over the fuzzy-stochastic space, applicable to conditions of soft-hard data, facts, evidence and knowledge over the spaces of problem-solution dualities and decision-choice actions in the sciences, non-sciences, engineering and planning sciences, in order to abstract acceptable information-knowledge elements.
Social Entropy Theory
Author: Kenneth D. Bailey
Publisher: SUNY Press
ISBN: 9780791400562
Category : Social Science
Languages : en
Pages : 336
Book Description
Social Entropy Theory illuminates the fundamental problems of societal analysis with a nonequilibrium approach, a new frame of reference built upon contemporary macrological principles, including general systems theory and information theory. Social entropy theory, using Shannon's H and the entropy concept, avoids the common (and often artificial) separation of theory and method in sociology. The hallmark of the volume is integration, as seen in the author's interdisciplinary discussions of equilibrium, entropy, and homeostasis. Unique features of the book are the introduction of the three-level model of social measurement, the theory of allocation, the concepts of global-mutable-immutable, discussion of order and power, and a large set of testable hypotheses.
Entropy and Information Theory
Author: Robert M. Gray
Publisher: Springer Science & Business Media
ISBN: 1475739826
Category : Computers
Languages : en
Pages : 346
Book Description
This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. These tools form an area common to ergodic theory and information theory and comprise several quantitative notions of the information in random variables, random processes, and dynamical systems. Examples are entropy, mutual information, conditional entropy, conditional information, and discrimination or relative entropy, along with the limiting normalized versions of these quantities such as entropy rate and information rate. Much of the book is concerned with their properties, especially the long term asymptotic behavior of sample information and expected information. This is the only up-to-date treatment of traditional information theory emphasizing ergodic theory.
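The core quantities the description lists are easy to illustrate numerically. The sketch below is not from the book; it is a minimal illustration of the standard definitions of Shannon entropy and relative entropy (discrimination) for discrete distributions:

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def relative_entropy(p, q):
    """Relative entropy (discrimination) D(p || q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A fair coin carries one bit of information per toss.
print(entropy([0.5, 0.5]))                        # 1.0
# A biased coin carries less.
print(entropy([0.9, 0.1]))                        # ~0.469
# Discrimination measures how far the biased coin is from fair.
print(relative_entropy([0.9, 0.1], [0.5, 0.5]))   # ~0.531
```

Entropy rate, one of the book's central objects, is the limit of such entropies per symbol as a random process runs on.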
The General Theory of Thermodynamics
Author: Joseph Ellis Trevor
Publisher:
ISBN:
Category : Thermodynamics
Languages : en
Pages : 128
Book Description
Mathematical Theory of Entropy
Author: Nathaniel F. G. Martin
Publisher: Cambridge University Press
ISBN: 9780521177382
Category : Computers
Languages : en
Pages : 292
Book Description
This excellent 1981 treatment of the mathematical theory of entropy gives an accessible exposition of its application to other fields.
Discover Entropy And The Second Law Of Thermodynamics: A Playful Way Of Discovering A Law Of Nature
Author: Arieh Ben-naim
Publisher: World Scientific
ISBN: 9814465267
Category : Science
Languages : en
Pages : 287
Book Description
This is a sequel to the author's book "Entropy Demystified" (World Scientific, 2007). The aim is essentially the same as that of the previous book: to present entropy and the Second Law as simple, meaningful and comprehensible concepts. In addition, this book presents a series of "experiments" designed to help the reader discover entropy and the Second Law. While doing the experiments, the reader will encounter three of the most fundamental probability distributions in physics: the uniform, the Boltzmann and the Maxwell-Boltzmann distributions. The concepts of entropy and the Second Law emerge naturally from these experiments without a tinge of mystery, explained with the help of a few familiar ideas from probability and information theory. The main value of the book is to introduce entropy and the Second Law in simple language, making them accessible to any reader who is curious about the basic laws of nature. The book is addressed to anyone interested in science and in understanding natural phenomena. It affords the reader the opportunity to discover one of the most fundamental laws of physics, a law that has resisted complete understanding for over a century. The book is also designed to be enjoyable. There is no other book of its kind (except "Entropy Demystified" by the same author) that offers the reader the opportunity to discover one of the most profound laws, sometimes viewed as mysterious, while comfortably playing with familiar games. No prerequisites are expected of the reader; all the reader need do is follow the experiments, or imagine doing them, and reach the inevitable conclusions.
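The three distributions the blurb names are standard results. As a minimal illustration (not code from the book), the Boltzmann distribution over discrete energy levels assigns p_i proportional to exp(-E_i / kT); the energy levels below are hypothetical values chosen for the example:

```python
import math

def boltzmann(energies, kT):
    """Boltzmann distribution: p_i proportional to exp(-E_i / kT)."""
    weights = [math.exp(-e / kT) for e in energies]
    z = sum(weights)  # partition function normalizes the weights
    return [w / z for w in weights]

levels = [0.0, 1.0, 2.0]             # illustrative energy levels
print(boltzmann(levels, kT=1.0))     # low-energy levels dominate
print(boltzmann(levels, kT=1000.0))  # high kT: nearly uniform
```

At low temperature the population concentrates in the ground state; at high temperature the distribution approaches the uniform one, the other limiting case the book's experiments lead the reader to.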
The Theory of Ecology
Author: Samuel M. Scheiner
Publisher: University of Chicago Press
ISBN: 0226736865
Category : Science
Languages : en
Pages : 416
Book Description
Despite claims to the contrary, the science of ecology has a long history of building theories. Many ecological theories are mathematical, computational, or statistical, though, and rarely have attempts been made to organize or extrapolate these models into broader theories. The Theory of Ecology brings together some of the most respected and creative theoretical ecologists of this era to advance a comprehensive, conceptual articulation of ecological theories. The contributors cover a wide range of topics, from ecological niche theory to population dynamic theory to island biogeography theory. Collectively, the chapters ably demonstrate how theory in ecology accounts for observations about the natural world and how models provide predictive understandings. The book organizes these models into constitutive domains that highlight the strengths and weaknesses of ecological understanding. It is a milestone in ecological theory and is certain to motivate future empirical and theoretical work in one of the most exciting and active domains of the life sciences.
Entropy and Information
Author: Mikhail V. Volkenstein
Publisher: Springer Science & Business Media
ISBN: 303460078X
Category : Science
Languages : en
Pages : 214
Book Description
"This is just... entropy," he said, thinking that this explained everything, and he repeated the strange word a few times. (Karel Čapek, "Krakatit") This "strange word" denotes one of the most basic quantities of the physics of heat phenomena, that is, of thermodynamics. Although the concept of entropy did indeed originate in thermodynamics, it later became clear that it was a more universal concept, of fundamental significance for chemistry and biology, as well as physics. Although the concept of energy is usually considered more important and easier to grasp, it turns out, as we shall see, that the idea of entropy is just as substantial, and moreover not all that complicated. We can compute or measure the quantity of energy contained in this sheet of paper, and the same is true of its entropy. Furthermore, entropy has remarkable properties. Our galaxy, the solar system, and the biosphere all take their being from entropy, as a result of its transference to the surrounding medium. There is a surprising connection between entropy and information, that is, the total intelligence communicated by a message. All of this is expounded in the present book, thereby conveying information to the reader and decreasing his entropy; but it is up to the reader to decide how valuable this information might be.
Methods, Models, Simulations and Approaches Towards a General Theory of Change - Proceedings of the Fifth National Conference on Systems Science
Author: Gianfranco Minati
Publisher: World Scientific
ISBN: 9814383325
Category : Computers
Languages : en
Pages : 725
Book Description
The book contains the Proceedings of the 2010 Conference of the Italian Systems Society. Papers deal with the interdisciplinary study of processes of change related to a wide variety of specific disciplinary aspects. Classical attempts to deal with them, based on generalising the approaches used to study the movement of bodies and environmental influence, have involved ineffective reductionistic simplifications. Indeed, change also relates, for instance, to processes of acquiring and varying properties, as in software; growing and aging biological systems; learning/cognitive systems; and socio-economic systems growing and developing through innovation. Some approaches to modelling such processes consider changes in structure, e.g., phase transitions. Others consider (1) periodic changes in structure, as in processes of self-organisation; (2) non-periodic but coherent changes in structure, as in processes of emergence; or (3) the quantum level of description. Papers in the book study the problem in its transdisciplinary nature, i.e., systemic properties studied per se and not within specific disciplinary contexts. The aim of these studies is to outline a transdisciplinary theory of change in systemic properties. Such a theory should have simultaneous, corresponding and possibly hierarchical disciplinary aspects, as expected of a general theory of emergence. Within this transdisciplinary context, specific disciplinary research activities and results are assumed to be mutually represented within a philosophical and conceptual framework based on the theoretical centrality of the observer and the conceptual non-separability of context and observer, related to logically open systems and quantum entanglement.
Contributions deal with such issues in interdisciplinary ways considering theoretical aspects and applications from Physics, Cognitive Science, Biology, Artificial Intelligence, Economics, Architecture, Philosophy, Music and Social Systems.
New Foundations for Information Theory
Author: David Ellerman
Publisher: Springer Nature
ISBN: 3030865525
Category : Philosophy
Languages : en
Pages : 121
Book Description
This monograph offers a new foundation for information theory that is based on the notion of information-as-distinctions, being directly measured by logical entropy, and on the re-quantification as Shannon entropy, which is the fundamental concept for the theory of coding and communications. Information is based on distinctions, differences, distinguishability, and diversity. Information sets are defined that express the distinctions made by a partition, e.g., the inverse-image of a random variable so they represent the pre-probability notion of information. Then logical entropy is a probability measure on the information sets, the probability that on two independent trials, a distinction or “dit” of the partition will be obtained. The formula for logical entropy is a new derivation of an old formula that goes back to the early twentieth century and has been re-derived many times in different contexts. As a probability measure, all the compound notions of joint, conditional, and mutual logical entropy are immediate. The Shannon entropy (which is not defined as a measure in the sense of measure theory) and its compound notions are then derived from a non-linear dit-to-bit transform that re-quantifies the distinctions of a random variable in terms of bits—so the Shannon entropy is the average number of binary distinctions or bits necessary to make all the distinctions of the random variable. And, using a linearization method, all the set concepts in this logical information theory naturally extend to vector spaces in general—and to Hilbert spaces in particular—for quantum logical information theory which provides the natural measure of the distinctions made in quantum measurement. 
Relatively short but dense in content, this work can be a reference to researchers and graduate students doing investigations in information theory, maximum entropy methods in physics, engineering, and statistics, and to all those with a special interest in a new approach to quantum information theory.
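The contrast between logical entropy and Shannon entropy described above can be shown numerically. The sketch below is an illustration of the standard formulas (h(p) = 1 - Σ p_i², H(p) = -Σ p_i log₂ p_i), not code from the monograph:

```python
import math

def logical_entropy(p):
    """h(p) = 1 - sum(p_i^2): the probability that two independent
    draws from p are distinct, i.e., yield a distinction ("dit")."""
    return 1 - sum(pi * pi for pi in p)

def shannon_entropy(p):
    """H(p) = -sum(p_i log2 p_i): average number of binary
    distinctions (bits) needed to make all the distinctions of p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]
print(logical_entropy(p))    # 0.625
print(shannon_entropy(p))    # 1.5

# Equiprobable case with n = 4 outcomes: h = 1 - 1/n, H = log2(n).
u = [0.25] * 4
print(logical_entropy(u))    # 0.75
print(shannon_entropy(u))    # 2.0
```

Both quantities measure distinctions, but on different scales: logical entropy counts the probability of a dit directly, while Shannon entropy re-quantifies the same distinctions in bits.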