Comparability of computer and paper-and-pencil versions of algebra and biology assessments

Do Hong Kim, Huynh Huynh

Research output: Contribution to journal › Article

17 Citations (Scopus)

Abstract

This study examined comparability of student scores obtained from computerized and paper-and-pencil formats of the large-scale statewide end-of-course (EOC) examinations in the two subject areas of Algebra and Biology. Evidence in support of comparability of computerized and paper-based tests was sought by examining scale scores, item parameter estimates, test characteristic curves, test information functions, Rasch ability estimates at the content domain level, and the equivalence of the construct. Overall, the results support the comparability of computerized and paper-based tests at the item-level, subtest-level, and whole test-level in both subject areas. No evidence was found to suggest that the administration mode changed the construct being measured.

Original language: English (US)
Pages (from-to): 1-29
Number of pages: 29
Journal: Journal of Technology, Learning, and Assessment
Volume: 6
Issue number: 4
State: Published - Dec 1 2007
Externally published: Yes

ASJC Scopus subject areas

  • Education
  • Computer Science Applications

Cite this

Comparability of computer and paper-and-pencil versions of algebra and biology assessments. / Kim, Do Hong; Huynh, Huynh.

In: Journal of Technology, Learning, and Assessment, Vol. 6, No. 4, 01.12.2007, p. 1-29.

Research output: Contribution to journal › Article

@article{647bba4194de4b0ca50412a2bf953b46,
title = "Comparability of computer and paper-and-pencil versions of algebra and biology assessments",
abstract = "This study examined comparability of student scores obtained from computerized and paper-and-pencil formats of the large-scale statewide end-of-course (EOC) examinations in the two subject areas of Algebra and Biology. Evidence in support of comparability of computerized and paper-based tests was sought by examining scale scores, item parameter estimates, test characteristic curves, test information functions, Rasch ability estimates at the content domain level, and the equivalence of the construct. Overall, the results support the comparability of computerized and paper-based tests at the item-level, subtest-level, and whole test-level in both subject areas. No evidence was found to suggest that the administration mode changed the construct being measured.",
author = "Kim, {Do Hong} and Huynh Huynh",
year = "2007",
month = "12",
day = "1",
language = "English (US)",
volume = "6",
pages = "1--29",
journal = "Journal of Technology, Learning, and Assessment",
issn = "1540-2525",
publisher = "Boston College",
number = "4",
}

TY - JOUR

T1 - Comparability of computer and paper-and-pencil versions of algebra and biology assessments

AU - Kim, Do Hong

AU - Huynh, Huynh

PY - 2007/12/1

Y1 - 2007/12/1

N2 - This study examined comparability of student scores obtained from computerized and paper-and-pencil formats of the large-scale statewide end-of-course (EOC) examinations in the two subject areas of Algebra and Biology. Evidence in support of comparability of computerized and paper-based tests was sought by examining scale scores, item parameter estimates, test characteristic curves, test information functions, Rasch ability estimates at the content domain level, and the equivalence of the construct. Overall, the results support the comparability of computerized and paper-based tests at the item-level, subtest-level, and whole test-level in both subject areas. No evidence was found to suggest that the administration mode changed the construct being measured.

AB - This study examined comparability of student scores obtained from computerized and paper-and-pencil formats of the large-scale statewide end-of-course (EOC) examinations in the two subject areas of Algebra and Biology. Evidence in support of comparability of computerized and paper-based tests was sought by examining scale scores, item parameter estimates, test characteristic curves, test information functions, Rasch ability estimates at the content domain level, and the equivalence of the construct. Overall, the results support the comparability of computerized and paper-based tests at the item-level, subtest-level, and whole test-level in both subject areas. No evidence was found to suggest that the administration mode changed the construct being measured.

UR - http://www.scopus.com/inward/record.url?scp=37649023386&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=37649023386&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:37649023386

VL - 6

SP - 1

EP - 29

JO - Journal of Technology, Learning, and Assessment

JF - Journal of Technology, Learning, and Assessment

SN - 1540-2525

IS - 4

ER -