Article: Computerized Adaptive Testing for Cognitively Based Multiple-Choice Data

Title: Computerized Adaptive Testing for Cognitively Based Multiple-Choice Data
Authors: Yigit, H; Sorrel, MA; de la Torre, J
Keywords: cognitive diagnosis models
computerized adaptive testing
MC-DINA
G-DINA
item selection methods
Issue Date: 2019
Publisher: Sage Publications, Inc. The Journal's web site is located at http://www.sagepub.com/journal.aspx?pid=184
Citation: Applied Psychological Measurement, 2019, v. 43, n. 5, p. 388-401
Abstract: Cognitive diagnosis models (CDMs) are latent class models that hold great promise for providing diagnostic information about student knowledge profiles. The increasing use of computers in classrooms enhances the advantages of CDMs for more efficient diagnostic testing by using adaptive algorithms, referred to as cognitive diagnosis computerized adaptive testing (CD-CAT). When multiple-choice items are involved, CD-CAT can be further improved by using polytomous scoring (i.e., considering the specific options students choose), instead of dichotomous scoring (i.e., marking answers as either right or wrong). In this study, the authors propose and evaluate the performance of the Jensen–Shannon divergence (JSD) index as an item selection method for the multiple-choice deterministic inputs, noisy “and” gate (MC-DINA) model. Attribute classification accuracy and item usage are evaluated under different conditions of item quality and test termination rule. The proposed approach is compared with the random selection method and an approximate approach based on dichotomized responses. The results show that under the MC-DINA model, JSD improves the attribute classification accuracy significantly by considering the information from distractors, even with a very short test length. This result has important implications in practical classroom settings as it can allow for dramatically reduced testing times, thus resulting in more targeted learning opportunities.
Persistent Identifier: http://hdl.handle.net/10722/274093
ISSN: 0146-6216
2023 Impact Factor: 1.0
2023 SCImago Journal Rankings: 1.061
PubMed Central ID: PMC6572910
ISI Accession Number ID: WOS:000471767900004
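
The abstract above describes selecting items by a Jensen–Shannon divergence (JSD) index under the MC-DINA model, so that the option a student picks (not just right/wrong) informs the next item choice. As an illustration only, and not the authors' implementation, the following is a minimal sketch of one common way a posterior-weighted JSD index can be computed for polytomous item options. All names, array shapes, and the numpy-based setup here are assumptions for this example; in practice the conditional option probabilities would come from a calibrated MC-DINA model.

```python
# Illustrative sketch (hypothetical, not the authors' code): posterior-weighted
# Jensen-Shannon divergence item selection for a polytomously scored CD-CAT.
# Assumed inputs:
#   posterior: (C,) array, current posterior over the C attribute profiles
#   probs:     (J, C, K) array, P(option k | profile c) for each candidate item j
import numpy as np

def entropy(p, eps=1e-12):
    """Shannon entropy along the last axis (natural log)."""
    p = np.clip(p, eps, 1.0)
    return -np.sum(p * np.log(p), axis=-1)

def jsd_index(posterior, item_probs):
    """Generalized JSD of an item's option distributions, weighted by the posterior.

    item_probs: (C, K) conditional option probabilities for one item.
    Returns H(mixture) - sum_c posterior_c * H(P(.|c)); larger values indicate
    that the item's options better discriminate among the remaining profiles.
    """
    mixture = posterior @ item_probs                  # predictive option distribution
    return entropy(mixture) - posterior @ entropy(item_probs)

def select_next_item(posterior, probs, administered):
    """Pick the unadministered item with the largest JSD index."""
    scores = np.array([
        -np.inf if j in administered else jsd_index(posterior, probs[j])
        for j in range(probs.shape[0])
    ])
    return int(np.argmax(scores))

# Toy usage: 2 items, 4 attribute profiles, 4 response options per item.
rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(4), size=(2, 4))        # shape (J=2, C=4, K=4)
posterior = np.full(4, 0.25)                          # uniform start over profiles
print(select_next_item(posterior, probs, administered=set()))
```

After each response, the posterior over attribute profiles would be updated and the selection repeated until the test termination rule is met; the dichotomized comparison condition in the abstract amounts to collapsing the K options into correct/incorrect before computing the same index.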

 

DC Field: Value
dc.contributor.author: Yigit, H
dc.contributor.author: Sorrel, MA
dc.contributor.author: de la Torre, J
dc.date.accessioned: 2019-08-18T14:54:54Z
dc.date.available: 2019-08-18T14:54:54Z
dc.date.issued: 2019
dc.identifier.citation: Applied Psychological Measurement, 2019, v. 43, n. 5, p. 388-401
dc.identifier.issn: 0146-6216
dc.identifier.uri: http://hdl.handle.net/10722/274093
dc.description.abstract: Cognitive diagnosis models (CDMs) are latent class models that hold great promise for providing diagnostic information about student knowledge profiles. The increasing use of computers in classrooms enhances the advantages of CDMs for more efficient diagnostic testing by using adaptive algorithms, referred to as cognitive diagnosis computerized adaptive testing (CD-CAT). When multiple-choice items are involved, CD-CAT can be further improved by using polytomous scoring (i.e., considering the specific options students choose), instead of dichotomous scoring (i.e., marking answers as either right or wrong). In this study, the authors propose and evaluate the performance of the Jensen–Shannon divergence (JSD) index as an item selection method for the multiple-choice deterministic inputs, noisy “and” gate (MC-DINA) model. Attribute classification accuracy and item usage are evaluated under different conditions of item quality and test termination rule. The proposed approach is compared with the random selection method and an approximate approach based on dichotomized responses. The results show that under the MC-DINA model, JSD improves the attribute classification accuracy significantly by considering the information from distractors, even with a very short test length. This result has important implications in practical classroom settings as it can allow for dramatically reduced testing times, thus resulting in more targeted learning opportunities.
dc.language: eng
dc.publisher: Sage Publications, Inc. The Journal's web site is located at http://www.sagepub.com/journal.aspx?pid=184
dc.relation.ispartof: Applied Psychological Measurement
dc.rights: Applied Psychological Measurement. Copyright © Sage Publications, Inc.
dc.rights: Copyright © [year] (Copyright Holder). DOI: [DOI number]
dc.subject: cognitive diagnosis models
dc.subject: computerized adaptive testing
dc.subject: MC-DINA
dc.subject: G-DINA
dc.subject: item selection methods
dc.title: Computerized Adaptive Testing for Cognitively Based Multiple-Choice Data
dc.type: Article
dc.identifier.email: de la Torre, J: jdltorre@hku.hk
dc.identifier.authority: de la Torre, J=rp02159
dc.description.nature: link_to_OA_fulltext
dc.identifier.doi: 10.1177/0146621618798665
dc.identifier.pmid: 31235984
dc.identifier.pmcid: PMC6572910
dc.identifier.scopus: eid_2-s2.0-85059608693
dc.identifier.hkuros: 302291
dc.identifier.volume: 43
dc.identifier.issue: 5
dc.identifier.spage: 388
dc.identifier.epage: 401
dc.identifier.isi: WOS:000471767900004
dc.publisher.place: United States
dc.identifier.issnl: 0146-6216
