Automated essay assessment: an evaluation on PaperRater’s reliability from practice / Nguyen Vi Thong
Main Author: | Nguyen, Vi Thong |
---|---|
Format: | Article |
Language: | English |
Published: | Universiti Teknologi MARA, Kedah, 2017 |
Subjects: | Computers in education. Information technology. Systems of individual educators and writers |
Online Access: | http://ir.uitm.edu.my/id/eprint/30281/1/AJ_NGUYEN%20VI%20THONG%20CPLT%20K%2017.pdf http://ir.uitm.edu.my/id/eprint/30281/ https://cplt.uitm.edu.my/ |
id | my.uitm.ir.30281 |
---|---|
record_format | eprints |
spelling | my.uitm.ir.30281 2020-05-04T03:07:08Z http://ir.uitm.edu.my/id/eprint/30281/ Nguyen, Vi Thong (2017) Automated essay assessment: an evaluation on PaperRater’s reliability from practice / Nguyen Vi Thong. Journal of Creative Practices in Language Learning and Teaching (CPLT), 5 (1). pp. 1-18. ISSN 1823-464X http://ir.uitm.edu.my/id/eprint/30281/1/AJ_NGUYEN%20VI%20THONG%20CPLT%20K%2017.pdf https://cplt.uitm.edu.my/ |
institution | Universiti Teknologi Mara |
building | Tun Abdul Razak Library |
collection | Institutional Repository |
continent | Asia |
country | Malaysia |
content_provider | Universiti Teknologi Mara |
content_source | UiTM Institutional Repository |
url_provider | http://ir.uitm.edu.my/ |
language | English |
topic | Computers in education. Information technology. Systems of individual educators and writers |
description | From the perspective of a PaperRater user, the author investigates the reliability of the program. Twenty-four freshman students and one writing teacher at Dalat University, Vietnam, were recruited for the study, and the author also served as a scorer. The scores generated by PaperRater and the two human scorers were analyzed quantitatively and qualitatively. The statistical results indicate an excellent correlation among the means of the scores produced by the three scorers. With the aid of SPSS and further calculations, PaperRater is shown to have acceptable reliability, which implies that the program can, to some extent, assist in grading students’ papers. At the qualitative stage, a semi-structured interview with the teacher scorer pointed out several challenges that writing teachers might encounter when assessing students’ responses to writing prompts. From her perspective, the assistance of PaperRater would greatly relieve the burden of assessing a large number of responses within a short period. However, how teachers can employ the program should be carefully investigated. This study therefore provides writing teachers with pedagogical implications for how PaperRater should be used in writing classrooms, and it is expected to shed new light on the possibility of adopting an automated evaluation instrument as a scoring assistant in large writing classrooms. (An illustrative sketch of this kind of scorer-correlation analysis follows the record below.) |
format | Article |
author | Nguyen, Vi Thong |
title | Automated essay assessment: an evaluation on PaperRater’s reliability from practice / Nguyen Vi Thong |
publisher | Universiti Teknologi MARA, Kedah |
publishDate | 2017 |
url | http://ir.uitm.edu.my/id/eprint/30281/1/AJ_NGUYEN%20VI%20THONG%20CPLT%20K%2017.pdf http://ir.uitm.edu.my/id/eprint/30281/ https://cplt.uitm.edu.my/ |
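The abstract above describes a quantitative stage in which the scores awarded by PaperRater and the two human scorers were correlated and a reliability estimate was computed with SPSS. As a rough illustration of that kind of analysis only, the following Python sketch computes pairwise Pearson correlations and a Cronbach’s alpha across three scorers; the score values, the 10-point scale, and the choice of Cronbach’s alpha as the reliability statistic are assumptions made for the example and are not taken from the study.

```python
import numpy as np
from scipy import stats

# Hypothetical scores for eight essays from three scorers (0-10 scale assumed);
# these figures are illustrative and not the study's data.
paperrater = np.array([6.5, 7.0, 5.5, 8.0, 6.0, 7.5, 5.0, 6.5])
teacher = np.array([6.0, 7.5, 5.0, 8.5, 6.5, 7.0, 5.5, 6.0])
author = np.array([6.5, 7.0, 5.5, 8.0, 6.0, 7.0, 5.0, 6.5])
scorers = {"PaperRater": paperrater, "Teacher": teacher, "Author": author}

# Pairwise Pearson correlations between the three sets of scores.
names = list(scorers)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        r, p = stats.pearsonr(scorers[names[i]], scorers[names[j]])
        print(f"{names[i]} vs {names[j]}: r = {r:.2f} (p = {p:.3f})")

# Cronbach's alpha across the three scorers as a simple consistency estimate.
ratings = np.vstack(list(scorers.values())).T      # shape: (essays, scorers)
k = ratings.shape[1]
sum_item_var = ratings.var(axis=0, ddof=1).sum()   # sum of per-scorer variances
total_var = ratings.sum(axis=1).var(ddof=1)        # variance of per-essay summed scores
alpha = k / (k - 1) * (1 - sum_item_var / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")
```

With scores this closely aligned the printed correlations come out high, mirroring the kind of agreement the abstract reports; with real classroom data the same script would show where the automated and human scores diverge.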