Fisher, Neyman, and the Creation of Classical Statistics
Author: Erich L. Lehmann
Publisher: Springer Science & Business Media
ISBN: 1441995005
Category : Mathematics
Languages : en
Pages : 123
Book Description
Classical statistical theory—hypothesis testing, estimation, and the design of experiments and sample surveys—is mainly the creation of two men: Ronald A. Fisher (1890-1962) and Jerzy Neyman (1894-1981). Their contributions sometimes complemented each other, sometimes occurred in parallel, and, particularly at later stages, often were in strong opposition. The two men would not be pleased to see their names linked in this way, since throughout most of their working lives they detested each other. Nevertheless, they worked on the same problems, and through their combined efforts created a new discipline. This new book by E.L. Lehmann, himself a student of Neyman’s, explores the relationship between Neyman and Fisher, as well as their interactions with other influential statisticians, and the statistical history they helped create together. Lehmann uses direct correspondence and original papers to recreate an historical account of the creation of the Neyman-Pearson Theory as well as Fisher’s dissent, and other important statistical theories.
Neyman
Author: Constance Reid
Publisher: Springer Science & Business Media
ISBN: 9780387983578
Category : Biography & Autobiography
Languages : en
Pages : 338
Book Description
Jerzy Neyman received the National Medal of Science "for laying the foundations of modern statistics and devising tests and procedures that have become essential parts of the knowledge of every statistician." Until his death in 1981 at the age of 87, Neyman was vigorously involved in the concerns and controversies of the day, a scientist whose personality and activity were integral parts of his contribution to science. His career is thus particularly well-suited for the non-technical life-story which Constance Reid has made her own in such well-received biographies of Hilbert and Courant. She was able to talk extensively with Neyman and have access to his personal and professional letters and papers. Her book will thus appeal to professional statisticians as well as amateurs wanting to learn about a subject which permeates almost every aspect of modern life.
R. A. Fisher, the Life of a Scientist
Author: Joan Fisher Box
Publisher: John Wiley & Sons
ISBN:
Category : Biography & Autobiography
Languages : en
Pages : 560
Book Description
Contents: Nature and nurture; In the wilderness; Mathematical statistics; Rothamsted Experimental Station; Tests of significance; The design of experiments; The genetical theory of natural selection; The evolution of dominance; The role of a statistician; Galton Professor of Eugenics; Evolutionary ideas; In the United States and India; Blood groups in man; Losses of war; Arthur Balfour Professor of genetics; The biometrical movement; Scientific inference; Retirement.
Classic Topics on the History of Modern Mathematical Statistics
Author: Prakash Gorroochurn
Publisher: John Wiley & Sons
ISBN: 1119127939
Category : Mathematics
Languages : en
Pages : 776
Book Description
"There is nothing like it on the market...no others are as encyclopedic...the writing is exemplary: simple, direct, and competent." —George W. Cobb, Professor Emeritus of Mathematics and Statistics, Mount Holyoke College Written in a direct and clear manner, Classic Topics on the History of Modern Mathematical Statistics: From Laplace to More Recent Times presents a comprehensive guide to the history of mathematical statistics and details the major results and crucial developments over a 200-year period. Presented in chronological order, the book features an account of the classical and modern works that are essential to understanding the applications of mathematical statistics. Divided into three parts, the book begins with extensive coverage of the probabilistic works of Laplace, who laid much of the foundations of later developments in statistical theory. Subsequently, the second part introduces 20th century statistical developments including work from Karl Pearson, Student, Fisher, and Neyman. Lastly, the author addresses post-Fisherian developments. 
Classic Topics on the History of Modern Mathematical Statistics: From Laplace to More Recent Times also features:
A detailed account of Galton's discovery of regression and correlation, as well as the subsequent development of Karl Pearson's χ² and Student's t
A comprehensive treatment of the permeating influence of Fisher in all aspects of modern statistics, beginning with his work in 1912
Significant coverage of Neyman–Pearson theory, including a discussion of its differences from Fisher's work
Discussions of key historical developments, as well as the various disagreements, contrasting information, and alternative theories in the history of modern mathematical statistics, in an effort to provide a thorough historical treatment
Classic Topics on the History of Modern Mathematical Statistics: From Laplace to More Recent Times is an excellent reference for academicians with a mathematical background who are teaching or studying the history or philosophical controversies of mathematics and statistics. The book is also a useful guide for readers with a general interest in statistical inference.
Statistical Inference as Severe Testing
Author: Deborah G. Mayo
Publisher: Cambridge University Press
ISBN: 1108563309
Category : Mathematics
Languages : en
Pages : 503
Book Description
Mounting failures of replication in social and biological sciences give a new urgency to critically appraising proposed reforms. This book pulls back the cover on disagreements between experts charged with restoring integrity to science. It denies two pervasive views of the role of probability in inference: to assign degrees of belief, and to control error rates in a long run. If statistical consumers are unaware of assumptions behind rival evidence reforms, they can't scrutinize the consequences that affect them (in personalized medicine, psychology, etc.). The book sets sail with a simple tool: if little has been done to rule out flaws in inferring a claim, then it has not passed a severe test. Many methods advocated by data experts do not stand up to severe scrutiny and are in tension with successful strategies for blocking or accounting for cherry picking and selective reporting. Through a series of excursions and exhibits, the philosophy and history of inductive inference come alive. Philosophical tools are put to work to solve problems about science and pseudoscience, induction and falsification.
Large-Scale Inference
Author: Bradley Efron
Publisher: Cambridge University Press
ISBN: 1139492136
Category : Mathematics
Languages : en
Pages :
Book Description
We live in a new age for statistical inference, where modern scientific technology such as microarrays and fMRI machines routinely produce thousands and sometimes millions of parallel data sets, each with its own estimation or testing problem. Doing thousands of problems at once is more than repeated application of classical methods. Taking an empirical Bayes approach, Bradley Efron, inventor of the bootstrap, shows how information accrues across problems in a way that combines Bayesian and frequentist ideas. Estimation, testing and prediction blend in this framework, producing opportunities for new methodologies of increased power. New difficulties also arise, easily leading to flawed inferences. This book takes a careful look at both the promise and pitfalls of large-scale statistical inference, with particular attention to false discovery rates, the most successful of the new statistical techniques. Emphasis is on the inferential ideas underlying technical developments, illustrated using a large number of real examples.
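The false discovery rate methods the description highlights can be illustrated with a minimal sketch (not taken from the book): the standard Benjamini-Hochberg step-up rule, which rejects the hypotheses with the k smallest p-values, where k is the largest rank whose sorted p-value falls below (k/m)·α.

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Return a boolean mask of hypotheses rejected at FDR level alpha
    using the Benjamini-Hochberg step-up procedure."""
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)              # indices that sort the p-values
    ranked = p[order]
    # find the largest k with p_(k) <= (k/m) * alpha
    below = ranked <= (np.arange(1, m + 1) / m) * alpha
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])
        reject[order[:k + 1]] = True   # reject all hypotheses up to rank k
    return reject
```

For example, `benjamini_hochberg([0.01, 0.02, 0.03, 0.5], alpha=0.05)` rejects the first three hypotheses and retains the fourth, since 0.5 exceeds its threshold of (4/4)·0.05.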
A Chronicle of Permutation Statistical Methods
Author: Kenneth J. Berry
Publisher: Springer Science & Business Media
ISBN: 3319027441
Category : Mathematics
Languages : en
Pages : 535
Book Description
The focus of this book is on the birth and historical development of permutation statistical methods from the early 1920s to the near present. Beginning with the seminal contributions of R.A. Fisher, E.J.G. Pitman, and others in the 1920s and 1930s, permutation statistical methods were initially introduced to validate the assumptions of classical statistical methods. Permutation methods have advantages over classical methods in that they are optimal for small data sets and non-random samples, are data-dependent, and are free of distributional assumptions. Permutation probability values may be exact, or estimated via moment- or resampling-approximation procedures. Because permutation methods are inherently computationally-intensive, the evolution of computers and computing technology that made modern permutation methods possible accompanies the historical narrative. Permutation analogs of many well-known statistical tests are presented in a historical context, including multiple correlation and regression, analysis of variance, contingency table analysis, and measures of association and agreement. A non-mathematical approach makes the text accessible to readers of all levels.
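The resampling-approximation idea mentioned in the description can be sketched in a few lines (a minimal illustration, not drawn from the book): a two-sample permutation test estimates the p-value by repeatedly shuffling the pooled observations across the two group labels and counting how often the permuted difference in means is at least as extreme as the observed one.

```python
import numpy as np

def permutation_test(x, y, n_perm=10000, rng=None):
    """Two-sided Monte Carlo permutation p-value for a difference in group means."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    observed = abs(x.mean() - y.mean())
    pooled = np.concatenate([x, y])
    n = len(x)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)            # random relabeling of the pooled data
        diff = abs(pooled[:n].mean() - pooled[n:].mean())
        if diff >= observed:
            count += 1
    # include the observed labeling itself, the usual Monte Carlo correction
    return (count + 1) / (n_perm + 1)
```

With small samples the reference distribution can instead be enumerated exactly over all label assignments, which is the "exact" case the description distinguishes from resampling approximation.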
The Logical Foundations of Statistical Inference
Author: Henry E. Kyburg Jr.
Publisher: Springer Science & Business Media
ISBN: 9401021759
Category : Philosophy
Languages : en
Pages : 440
Book Description
Everyone knows it is easy to lie with statistics. It is important then to be able to tell a statistical lie from a valid statistical inference. It is a relatively widely accepted commonplace that our scientific knowledge is not certain and incorrigible, but merely probable, subject to refinement, modification, and even overthrow. The rankest beginner at a gambling table understands that his decisions must be based on mathematical expectations - that is, on utilities weighted by probabilities. It is widely held that the same principles apply almost all the time in the game of life. If we turn to philosophers, or to mathematical statisticians, or to probability theorists for criteria of validity in statistical inference, for the general principles that distinguish well-grounded from ill-grounded generalizations and laws, or for the interpretation of that probability we must, like the gambler, take as our guide in life, we find disagreement, confusion, and frustration. We might be prepared to find disagreements on a philosophical and theoretical level (although we do not find them in the case of deductive logic) but we do not expect, and we may be surprised to find, that these theoretical disagreements lead to differences in the conclusions that are regarded as 'acceptable' in the practice of science and public affairs, and in the conduct of business.
Statistical Modeling With R
Author: Pablo Inchausti
Publisher: Oxford University Press
ISBN: 0192675036
Category : Science
Languages : en
Pages : 519
Book Description
To date, statistics has tended to be neatly divided into two theoretical approaches or frameworks: frequentist (or classical) and Bayesian. Scientists typically choose the statistical framework to analyse their data depending on the nature and complexity of the problem, and based on their personal views and prior training on probability and uncertainty. Although textbooks and courses should reflect and anticipate this dual reality, they rarely do so. This accessible textbook explains, discusses, and applies both the frequentist and Bayesian theoretical frameworks to fit the different types of statistical models that allow an analysis of the types of data most commonly gathered by life scientists. It presents the material in an informal, approachable, and progressive manner suitable for readers with only a basic knowledge of calculus and statistics. Statistical Modeling with R is aimed at senior undergraduate and graduate students, professional researchers, and practitioners throughout the life sciences, seeking to strengthen their understanding of quantitative methods and to apply them successfully to real world scenarios, whether in the fields of ecology, evolution, environmental studies, or computational biology.
The Significance Test Controversy Revisited
Author: Bruno Lecoutre
Publisher: Springer
ISBN: 3662440466
Category : Mathematics
Languages : en
Pages : 140
Book Description
The purpose of this book is not only to revisit the "significance test controversy," but also to provide a conceptually sounder alternative. As such, it presents a Bayesian framework for a new approach to analyzing and interpreting experimental data. It also prepares students and researchers for reporting on experimental results. Normative aspects: The main views of statistical tests are revisited and the philosophies of Fisher, Neyman-Pearson, and Jeffreys are discussed in detail. Descriptive aspects: The misuses of Null Hypothesis Significance Tests are reconsidered in light of Jeffreys' Bayesian conceptions concerning the role of statistical inference in experimental investigations. Prescriptive aspects: The current effect size and confidence interval reporting practices are presented and seriously questioned. Methodological aspects are carefully discussed and fiducial Bayesian methods are proposed as a more suitable alternative for reporting on experimental results. In closing, basic routine procedures regarding the means and their generalization to the most common ANOVA applications are presented and illustrated. All the calculations discussed can be easily carried out using the freeware LePAC package.