Links for fulltext
(May Require Subscription)
- Publisher Website: 10.12809/hkjr1716804
- Scopus: eid_2-s2.0-85032895133
- WOS: WOS:000418601200009
Article: Inter-rater reliability of examiners in the Hong Kong College of Radiologists' palliative medicine oral examination
Title | Inter-rater reliability of examiners in the Hong Kong College of Radiologists' palliative medicine oral examination |
---|---|
Authors | Chow, R.; Zhang, L.; Soong, I. S.; Mang, O. W.K.; Lui, L. C.Y.; Wong, K. H.; Siu, S. W.K.; Lo, S. H.; Yuen, K. K.; Yau, Y. S.H.; Wong, K. Y.; Leung, C.; Wong, S. Y.; Ngan, R.; Chow, E.; Yeung, R. |
Keywords | Oncologists; Palliative medicine |
Issue Date | 2017 |
Citation | Hong Kong Journal of Radiology, 2017, v. 20, n. 3, p. 232-236 |
Abstract | © 2017 Hong Kong College of Radiologists. Objective: To analyse the inter-rater reliability of scores in the Palliative Medicine Oral Examination among examiners, among observers, and between examiners and observers. Methods: The Palliative Medicine Subspecialty Board aims to train oncology specialists for palliative medicine through a 4-year accreditation programme. At the end of the programme, trainees undergo a Board Examination involving subjective ratings by examiners. Each candidate rotated through two panels during the 1-day examination; one panel covered the written dissertation and questions on symptom management (viva 1), and the other covered psychosocial issues (viva 2) and ethics (viva 3). A total of 10 candidates were evaluated on the four occasions using a 10-point scale by six examiners and four observers, along with one external examiner. The intraclass correlation coefficient (ICC) was calculated to determine inter-rater reliability (concordance) among examiners, among observers, and between examiners and observers. ICC values were classified as poor (≤0.20), fair (0.21-0.40), moderate (0.41-0.60), good (0.61-0.80), or very good (0.81-1.00). Results: Among examiners, concordance was overall good at the different stations. Among observers, concordance was fair to very good across the different stations. Between examiners and observers, concordance was fair to moderate at two stations. Across all stations, concordance was good between examiners and observers. Conclusion: Inter-rater reliability was good at the Board Examination administered by the Palliative Medicine Subspecialty Board of the Hong Kong College of Radiologists. The examination is reliable in accrediting practitioners for subspecialty certification. |
Persistent Identifier | http://hdl.handle.net/10722/251694 |
ISSN | 2223-6619 (2023 Impact Factor: 0.2; 2023 SCImago Journal Rankings: 0.127) |
ISI Accession Number ID | WOS:000418601200009 |
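The ICC interpretation bands quoted in the abstract (poor ≤0.20, fair 0.21-0.40, moderate 0.41-0.60, good 0.61-0.80, very good 0.81-1.00) can be expressed as a small helper. This is an illustrative sketch only; the function name and structure are not from the article.

```python
def classify_icc(icc: float) -> str:
    """Map an intraclass correlation coefficient to the qualitative
    bands used in the abstract: poor (<=0.20), fair (0.21-0.40),
    moderate (0.41-0.60), good (0.61-0.80), very good (0.81-1.00)."""
    if not 0.0 <= icc <= 1.0:
        raise ValueError("ICC expected in [0, 1] for this scheme")
    if icc <= 0.20:
        return "poor"
    if icc <= 0.40:
        return "fair"
    if icc <= 0.60:
        return "moderate"
    if icc <= 0.80:
        return "good"
    return "very good"
```

Under this scheme, for example, an ICC of 0.75 among examiners would fall in the "good" band reported in the Results.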
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Chow, R. | - |
dc.contributor.author | Zhang, L. | - |
dc.contributor.author | Soong, I. S. | - |
dc.contributor.author | Mang, O. W.K. | - |
dc.contributor.author | Lui, L. C.Y. | - |
dc.contributor.author | Wong, K. H. | - |
dc.contributor.author | Siu, S. W.K. | - |
dc.contributor.author | Lo, S. H. | - |
dc.contributor.author | Yuen, K. K. | - |
dc.contributor.author | Yau, Y. S.H. | - |
dc.contributor.author | Wong, K. Y. | - |
dc.contributor.author | Leung, C. | - |
dc.contributor.author | Wong, S. Y. | - |
dc.contributor.author | Ngan, R. | - |
dc.contributor.author | Chow, E. | - |
dc.contributor.author | Yeung, R. | - |
dc.date.accessioned | 2018-03-08T05:00:42Z | - |
dc.date.available | 2018-03-08T05:00:42Z | - |
dc.date.issued | 2017 | - |
dc.identifier.citation | Hong Kong Journal of Radiology, 2017, v. 20, n. 3, p. 232-236 | - |
dc.identifier.issn | 2223-6619 | - |
dc.identifier.uri | http://hdl.handle.net/10722/251694 | - |
dc.description.abstract | © 2017 Hong Kong College of Radiologists. Objective: To analyse the inter-rater reliability of scores in the Palliative Medicine Oral Examination among examiners, among observers, and between examiners and observers. Methods: The Palliative Medicine Subspecialty Board aims to train oncology specialists for palliative medicine through a 4-year accreditation programme. At the end of the programme, trainees undergo a Board Examination involving subjective ratings by examiners. Each candidate rotated through two panels during the 1-day examination; one panel covered the written dissertation and questions on symptom management (viva 1), and the other covered psychosocial issues (viva 2) and ethics (viva 3). A total of 10 candidates were evaluated on the four occasions using a 10-point scale by six examiners and four observers, along with one external examiner. The intraclass correlation coefficient (ICC) was calculated to determine inter-rater reliability (concordance) among examiners, among observers, and between examiners and observers. ICC values were classified as poor (≤0.20), fair (0.21-0.40), moderate (0.41-0.60), good (0.61-0.80), or very good (0.81-1.00). Results: Among examiners, concordance was overall good at the different stations. Among observers, concordance was fair to very good across the different stations. Between examiners and observers, concordance was fair to moderate at two stations. Across all stations, concordance was good between examiners and observers. Conclusion: Inter-rater reliability was good at the Board Examination administered by the Palliative Medicine Subspecialty Board of the Hong Kong College of Radiologists. The examination is reliable in accrediting practitioners for subspecialty certification. | -
dc.language | eng | - |
dc.relation.ispartof | Hong Kong Journal of Radiology | - |
dc.subject | Oncologists | - |
dc.subject | Palliative medicine | - |
dc.title | Inter-rater reliability of examiners in the Hong Kong College of Radiologists' palliative medicine oral examination | -
dc.type | Article | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.12809/hkjr1716804 | - |
dc.identifier.scopus | eid_2-s2.0-85032895133 | - |
dc.identifier.volume | 20 | - |
dc.identifier.issue | 3 | - |
dc.identifier.spage | 232 | - |
dc.identifier.epage | 236 | - |
dc.identifier.isi | WOS:000418601200009 | - |
dc.identifier.issnl | 2223-6619 | - |