Modeling the Predictive Validity of SAT Mathematics Items Using Item Characteristics
Author: Jennifer L. Kobrin
Publisher:
ISBN:
Category :
Languages : en
Pages : 28
Book Description
There is much debate on the merits and pitfalls of standardized tests for college admission, with questions regarding the format (multiple-choice versus constructed response), cognitive complexity, and content of these assessments (achievement versus aptitude) at the forefront of the discussion. This study addressed these questions by investigating the relationship between SAT Mathematics (SAT-M) item characteristics and the items' ability to predict college outcomes. Using multiple regression, SAT-M item characteristics (content area, format, cognitive complexity, and abstract/concrete classification) were used to predict three outcome measures: the correlation of item score with first-year college grade point average (FYGPA), the correlation of item score with mathematics course grades, and the percentage of students who answered the item correctly and chose to major in a mathematics or science (STEM) field. Separate models were run including and excluding item difficulty and discrimination as covariates. The results revealed that many of the item characteristics were related to the outcome measures, and that item difficulty and discrimination had a mediating effect on several of the predictor variables, particularly on the effects of non-routine/insightful items and multiple-choice items. [Slides presented at AERA 2011.]
Differential Item Functioning in the SAT I
Author: Maria Veronica Santelices
Publisher:
ISBN:
Category :
Languages : en
Pages : 406
Book Description
Resources in Education
Author:
Publisher:
ISBN:
Category : Education
Languages : en
Pages : 1032
Book Description
Dissertation Abstracts International
Author:
Publisher:
ISBN:
Category : Dissertations, Academic
Languages : en
Pages : 610
Book Description
Research in Education
Author:
Publisher:
ISBN:
Category : Education
Languages : en
Pages : 1208
Book Description
Current Index to Journals in Education
Author:
Publisher:
ISBN:
Category : Education
Languages : en
Pages : 964
Book Description
Resources in Education
Author:
Publisher:
ISBN:
Category : Education
Languages : en
Pages : 1112
Book Description
Current Index to Journals in Education, Semi-Annual Cumulation, January-June, 1977
Author: Educational Resources Information Center Staff
Publisher: Macmillan Reference USA
ISBN:
Category : Education
Languages : en
Pages : 1404
Book Description
Advancing Human Assessment
Author: Randy E. Bennett
Publisher: Springer
ISBN: 3319586890
Category : Education
Languages : en
Pages : 717
Book Description
This book is open access under a CC BY-NC 2.5 license. This book describes the extensive contributions made toward the advancement of human assessment by scientists from one of the world’s leading research institutions, Educational Testing Service. The book’s four major sections detail research and development in measurement and statistics, education policy analysis and evaluation, scientific psychology, and validity. Many of the developments presented have become de facto standards in educational and psychological measurement, including in item response theory (IRT), linking and equating, differential item functioning (DIF), and educational surveys like the National Assessment of Educational Progress (NAEP), the Programme for International Student Assessment (PISA), the Progress in International Reading Literacy Study (PIRLS), and the Trends in International Mathematics and Science Study (TIMSS). In addition to its comprehensive coverage of contributions to the theory and methodology of educational and psychological measurement and statistics, the book gives significant attention to ETS work in cognitive, personality, developmental, and social psychology, and to education policy analysis and program evaluation. The chapter authors are long-standing experts who provide broad coverage and thoughtful insights that build upon decades of experience in research and best practices for measurement, evaluation, scientific psychology, and education policy analysis. Opening with a chapter on the genesis of ETS and closing with a synthesis of the enormously diverse set of contributions made over its 70-year history, the book is a useful resource for all interested in the improvement of human assessment.
Psychology Science
Author:
Publisher:
ISBN:
Category : Psychology
Languages : en
Pages : 662
Book Description