Appears in Collections: Conference Paper: Across disciplines, cultures and technologies: An Item Response Theory approach to assessment of learning
Field | Value |
---|---|
Title | Across disciplines, cultures and technologies: An Item Response Theory approach to assessment of learning |
Authors | Huen, MYJ; Zhao, Y; Yip, PSF |
Issue Date | 2017 |
Citation | The Asian Conference on Technology in the Classroom 2017, Kobe, Japan, 11-14 May 2017 |
Abstract | In the process of educating for change, we must strategically design assessment to examine how well our students are learning. This subject is important but easily neglected by educators or misrepresented in the education field. This research applied Item Response Theory (IRT), a contemporary measurement theory that models the relationship between the probability of an item response and the underlying proficiency being measured, to examine the psychometric properties of binary (true-or-false) question items designed to check how much students have learned in a web-based learning program, based on a sample of Hong Kong Chinese students. The IRT analysis procedure is illustrated step by step, from checking model assumptions and calibrating items to assessing goodness-of-fit. The principal results estimate item discrimination and item difficulty for each question item, produce a proficiency estimate for each student, and provide item information indicating how well an individual item contributes to the assessment of learning along a continuum ranging from low to high proficiency levels. In this way, the IRT approach offers useful information for the design, diagnosis and revision of question items. For example, items with high information value are particularly useful and should be retained, whereas items with low information value contribute little and could be considered for removal. In conclusion, this research puts forward an IRT approach that can be widely applied to design and modify assessment items so that assessment of learning is better suited to the discipline, culture and technology in context. |
Description | Saturday Session IV: Student Learning & Learner Experiences - no. 36936 |
Persistent Identifier | http://hdl.handle.net/10722/242951 |
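The record contains no code, but the quantities the abstract turns on — item discrimination, item difficulty, and item information — can be sketched with the two-parameter logistic (2PL) model, a standard IRT formulation for binary items. The paper does not state which IRT model was fitted, and the parameter values below are hypothetical; this is a minimal illustration, not the authors' analysis:

```python
import math

def p_correct(theta, a, b):
    """2PL model: probability of a correct response, given student
    proficiency theta, item discrimination a, and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at proficiency theta:
    I(theta) = a^2 * P(theta) * (1 - P(theta)).
    Higher values mean the item measures more precisely near theta."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

# Two hypothetical items of equal difficulty but different discrimination.
sharp = {"a": 2.0, "b": 0.0}  # high discrimination: informative, keep
flat = {"a": 0.5, "b": 0.0}   # low discrimination: candidate for removal

for theta in (-1.0, 0.0, 1.0):
    print(f"theta={theta:+.1f}  "
          f"I_sharp={item_information(theta, **sharp):.3f}  "
          f"I_flat={item_information(theta, **flat):.3f}")
```

At every proficiency level the high-discrimination item carries more information, which is the abstract's retention criterion: items whose information curve is flat everywhere contribute little to estimating proficiency.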
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Huen, MYJ | - |
dc.contributor.author | Zhao, Y | - |
dc.contributor.author | Yip, PSF | - |
dc.date.accessioned | 2017-08-25T02:47:47Z | - |
dc.date.available | 2017-08-25T02:47:47Z | - |
dc.date.issued | 2017 | - |
dc.identifier.citation | The Asian Conference on Technology in the Classroom 2017, Kobe, Japan, 11-14 May 2017 | - |
dc.identifier.uri | http://hdl.handle.net/10722/242951 | - |
dc.description | Saturday Session IV: Student Learning & Learner Experiences - no. 36936 | - |
dc.description.abstract | In the process of educating for change, we must strategically design assessment to examine how well our students are learning. This subject is important but easily neglected by educators or misrepresented in the education field. This research applied Item Response Theory (IRT), a contemporary measurement theory that models the relationship between the probability of an item response and the underlying proficiency being measured, to examine the psychometric properties of binary (true-or-false) question items designed to check how much students have learned in a web-based learning program, based on a sample of Hong Kong Chinese students. The IRT analysis procedure is illustrated step by step, from checking model assumptions and calibrating items to assessing goodness-of-fit. The principal results estimate item discrimination and item difficulty for each question item, produce a proficiency estimate for each student, and provide item information indicating how well an individual item contributes to the assessment of learning along a continuum ranging from low to high proficiency levels. In this way, the IRT approach offers useful information for the design, diagnosis and revision of question items. For example, items with high information value are particularly useful and should be retained, whereas items with low information value contribute little and could be considered for removal. In conclusion, this research puts forward an IRT approach that can be widely applied to design and modify assessment items so that assessment of learning is better suited to the discipline, culture and technology in context. | - |
dc.language | eng | - |
dc.relation.ispartof | The Asian Conference on Technology in the Classroom 2017 | - |
dc.title | Across disciplines, cultures and technologies: An Item Response Theory approach to assessment of learning | - |
dc.type | Conference_Paper | - |
dc.identifier.email | Zhao, Y: myzhao@hku.hk | - |
dc.identifier.email | Yip, PSF: sfpyip@hku.hk | - |
dc.identifier.authority | Zhao, Y=rp02230 | - |
dc.identifier.authority | Yip, PSF=rp00596 | - |
dc.identifier.hkuros | 273930 | - |