Determining the performance of five multiple choice scoring methods in estimating examinee’s ability / Lau Sie Hoe ... [et. al]
Despite the current popularity of performance-based assessment and the emergence of new assessment methods, multiple choice (MC) items remain a major form of assessment. The conventional Number Right (NR) scoring method, which awards one point for a correct response and zero for any other response, has been consistently criticized for failing to credit partial knowledge and for encouraging guessing.
Saved in:
Main Authors: Lau, Sie Hoe; Paul Lau, Ngee Kiong; Ling, Siew Eng; Hwa, Tee Yong
Format: Thesis
Language: English
Published: 2006
Subjects: H Social Sciences (General). Study and teaching. Research
Online Access: https://ir.uitm.edu.my/id/eprint/94756/1/94756.pdf https://ir.uitm.edu.my/id/eprint/94756/
id |
my.uitm.ir.94756 |
record_format |
eprints |
spelling |
my.uitm.ir.94756 2024-05-08T22:57:49Z https://ir.uitm.edu.my/id/eprint/94756/ Determining the performance of five multiple choice scoring methods in estimating examinee’s ability / Lau Sie Hoe ... [et. al] Lau, Sie Hoe Paul Lau, Ngee Kiong Ling, Siew Eng Hwa, Tee Yong H Social Sciences (General) Study and teaching. Research Despite the current popularity of performance-based assessment and the emergence of new assessment methods, multiple choice (MC) items remain a major form of assessment. The conventional Number Right (NR) scoring method, which awards one point for a correct response and zero for any other response, has been consistently criticized for failing to credit partial knowledge and for encouraging guessing. Various alternative scoring methods, such as Number Right with Correction for Guessing (NRC), Elimination Testing (ET), Confidence Weighting (CW) and Probability Measurement (PM), have been proposed to overcome these two weaknesses. However, to date, none has been widely accepted, although the theoretical rationale behind the various scoring methods under Classical Test Theory (CTT) is sound. A major cause of concern is the possibility that complicated scoring instructions might introduce other factors, which may affect the reliability and validity of the test scores. Studies on whether examinees can realistically be trained to follow the new test instructions have been inconclusive. Whether they can consistently follow the test instructions throughout the whole test remains an open question. There have been intensive comparison studies of scores obtained through various CTT scoring methods against NR scores. What has yet to be explored is the comparison of these scores with Item Response Theory (IRT) ability estimates. 2006 Thesis NonPeerReviewed text en https://ir.uitm.edu.my/id/eprint/94756/1/94756.pdf Determining the performance of five multiple choice scoring methods in estimating examinee’s ability / Lau Sie Hoe ... [et. al].
(2006) Masters thesis, Universiti Teknologi MARA, Sarawak. |
institution |
Universiti Teknologi Mara |
building |
Tun Abdul Razak Library |
collection |
Institutional Repository |
continent |
Asia |
country |
Malaysia |
content_provider |
Universiti Teknologi Mara |
content_source |
UiTM Institutional Repository |
url_provider |
http://ir.uitm.edu.my/ |
language |
English |
topic |
H Social Sciences (General) Study and teaching. Research |
spellingShingle |
H Social Sciences (General) Study and teaching. Research Lau, Sie Hoe Paul Lau, Ngee Kiong Ling, Siew Eng Hwa, Tee Yong Determining the performance of five multiple choice scoring methods in estimating examinee’s ability / Lau Sie Hoe ... [et. al] |
description |
Despite the current popularity of performance-based assessment and the emergence of new assessment methods, multiple choice (MC) items remain a major form of assessment. The conventional Number Right (NR) scoring method, which awards one point for a correct response and zero for any other response, has been consistently criticized for failing to credit partial knowledge and for encouraging guessing. Various alternative scoring methods, such as Number Right with Correction for Guessing (NRC), Elimination Testing (ET), Confidence Weighting (CW) and Probability Measurement (PM), have been proposed to overcome these two weaknesses. However, to date, none has been widely accepted, although the theoretical rationale behind the various scoring methods under Classical Test Theory (CTT) is sound. A major cause of concern is the possibility that complicated scoring instructions might introduce other factors, which may affect the reliability and validity of the test scores. Studies on whether examinees can realistically be trained to follow the new test instructions have been inconclusive. Whether they can consistently follow the test instructions throughout the whole test remains an open question. There have been intensive comparison studies of scores obtained through various CTT scoring methods against NR scores. What has yet to be explored is the comparison of these scores with Item Response Theory (IRT) ability estimates. |
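The abstract only names the scoring methods it compares; as a minimal illustrative sketch (not the thesis's own implementation), the two simplest ones can be written down directly. NR awards one point per correct response, and NRC applies the standard CTT correction-for-guessing penalty S = R - W/(k - 1), where k is the assumed number of options per item and omitted items are neither rewarded nor penalized:

```python
def score_number_right(responses, key):
    """Number Right (NR): one point per correct response, zero otherwise."""
    return sum(1 for r, k in zip(responses, key) if r == k)

def score_nr_corrected(responses, key, n_options):
    """Number Right with Correction for Guessing (NRC): the standard CTT
    formula S = R - W / (k - 1); omissions (None) are not penalized."""
    right = sum(1 for r, k in zip(responses, key) if r == k)
    wrong = sum(1 for r, k in zip(responses, key) if r is not None and r != k)
    return right - wrong / (n_options - 1)

key = ["A", "C", "B", "D", "A"]
answers = ["A", "C", "D", None, "B"]  # two right, two wrong, one omitted

print(score_number_right(answers, key))     # 2
print(score_nr_corrected(answers, key, 4))  # 2 - 2/3 = 1.33...
```

Under this correction, a pure random guesser on 4-option items has an expected NRC score of zero, which is the rationale the abstract alludes to for discouraging guessing; ET, CW and PM require richer response formats and are not sketched here.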
format |
Thesis |
author |
Lau, Sie Hoe Paul Lau, Ngee Kiong Ling, Siew Eng Hwa, Tee Yong |
author_facet |
Lau, Sie Hoe Paul Lau, Ngee Kiong Ling, Siew Eng Hwa, Tee Yong |
author_sort |
Lau, Sie Hoe |
title |
Determining the performance of five multiple choice scoring methods in estimating examinee’s ability / Lau Sie Hoe ... [et. al] |
title_short |
Determining the performance of five multiple choice scoring methods in estimating examinee’s ability / Lau Sie Hoe ... [et. al] |
title_full |
Determining the performance of five multiple choice scoring methods in estimating examinee’s ability / Lau Sie Hoe ... [et. al] |
title_fullStr |
Determining the performance of five multiple choice scoring methods in estimating examinee’s ability / Lau Sie Hoe ... [et. al] |
title_full_unstemmed |
Determining the performance of five multiple choice scoring methods in estimating examinee’s ability / Lau Sie Hoe ... [et. al] |
title_sort |
determining the performance of five multiple choice scoring methods in estimating examinee’s ability / lau sie hoe ... [et. al] |
publishDate |
2006 |
url |
https://ir.uitm.edu.my/id/eprint/94756/1/94756.pdf https://ir.uitm.edu.my/id/eprint/94756/ |
_version_ |
1800100625839554560 |
score |
13.160551 |