Article: Using Odds Ratios to Detect Differential Item Functioning

Title: Using Odds Ratios to Detect Differential Item Functioning
Authors: Jin, KY; Chen, HF; Wang, WC
Keywords: differential item functioning; logistic regression; Mantel–Haenszel; missing data; odds ratio; scale purification
Issue Date: 2018
Publisher: Sage Publications, Inc. The Journal's web site is located at http://www.sagepub.com/journal.aspx?pid=184
Citation: Applied Psychological Measurement, 2018, v. 42 n. 8, p. 613-629
Abstract: Differential item functioning (DIF) makes test scores incomparable and substantially threatens test validity. Although conventional approaches, such as the logistic regression (LR) and the Mantel–Haenszel (MH) methods, have worked well, they are vulnerable to high percentages of DIF items in a test and to missing data. This study developed a simple but effective method to detect DIF using the odds ratio (OR) of two groups’ responses to a studied item. The OR method uses all available information from examinees’ responses, and it can eliminate the potential influence of bias in the total scores. Through a series of simulation studies in which the DIF pattern, impact, sample size (equal/unequal), purification procedure (with/without), percentage of DIF items, and proportion of missing data were manipulated, the performance of the OR method was evaluated and compared with the LR and MH methods. The results showed that the OR method without a purification procedure outperformed the LR and MH methods in controlling false positive rates and yielding high true positive rates when tests had a high percentage of DIF items favoring the same group. In addition, only the OR method was feasible when tests adopted the item matrix sampling design. The effectiveness of the OR method was illustrated with an empirical example.
Persistent Identifier: http://hdl.handle.net/10722/258754
ISSN: 0146-6216
2021 Impact Factor: 1.522
2020 SCImago Journal Rankings: 2.083
ISI Accession Number ID: WOS:000453461500002
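To make the core quantity concrete, the sketch below computes the odds ratio of a correct response on a single studied item between a reference and a focal group. This is an illustrative simplification, not the authors' exact procedure (their method aggregates information across item pairs and handles missing responses); the counts and the 0.5 continuity correction are assumptions for the example. Under no DIF and no impact, the OR should be near 1, so a log-OR far from 0 flags the item for inspection.

```python
import math

def item_odds_ratio(ref_correct, ref_incorrect, foc_correct, foc_incorrect):
    """Odds ratio of a correct response, reference vs. focal group.

    A 0.5 continuity correction is added to every cell to guard
    against zero counts (a common convention, assumed here).
    """
    a, b = ref_correct + 0.5, ref_incorrect + 0.5
    c, d = foc_correct + 0.5, foc_incorrect + 0.5
    return (a / b) / (c / d)

# Hypothetical counts: 400 examinees per group responding to one item.
or_hat = item_odds_ratio(280, 120, 220, 180)
log_or = math.log(or_hat)  # values far from 0 suggest possible DIF
```

In practice the log-OR would be standardized by its estimated standard error before being compared against a critical value; that step is omitted here for brevity.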

 

DC Field: Value
dc.contributor.author: Jin, KY
dc.contributor.author: Chen, HF
dc.contributor.author: Wang, WC
dc.date.accessioned: 2018-08-22T01:43:33Z
dc.date.available: 2018-08-22T01:43:33Z
dc.date.issued: 2018
dc.identifier.citation: Applied Psychological Measurement, 2018, v. 42 n. 8, p. 613-629
dc.identifier.issn: 0146-6216
dc.identifier.uri: http://hdl.handle.net/10722/258754
dc.language: eng
dc.publisher: Sage Publications, Inc. The Journal's web site is located at http://www.sagepub.com/journal.aspx?pid=184
dc.relation.ispartof: Applied Psychological Measurement
dc.rights: Applied Psychological Measurement. Copyright © Sage Publications, Inc.
dc.subject: differential item functioning
dc.subject: logistic regression
dc.subject: Mantel–Haenszel
dc.subject: missing data
dc.subject: odds ratio
dc.subject: scale purification
dc.title: Using Odds Ratios to Detect Differential Item Functioning
dc.type: Article
dc.identifier.email: Jin, KY: kyjin@hku.hk
dc.description.nature: postprint
dc.identifier.doi: 10.1177/0146621618762738
dc.identifier.scopus: eid_2-s2.0-85044394674
dc.identifier.hkuros: 286633
dc.identifier.volume: 42
dc.identifier.issue: 8
dc.identifier.spage: 613
dc.identifier.epage: 629
dc.identifier.isi: WOS:000453461500002
dc.publisher.place: United States
dc.identifier.issnl: 0146-6216
