Empowering teachers in item response theory analysis using R Studio in Manokwari

Authors

  • Achmad Rante Suparman, Department of Chemistry Education, Faculty of Teacher Training and Education, Universitas Papua, https://orcid.org/0000-0003-3734-8919
  • Murtihapsari Murtihapsari, Department of Chemistry Education, Faculty of Teacher Training and Education, Universitas Papua, https://orcid.org/0000-0001-6014-6827
  • Purwati Purwati, Department of Mathematics Education, Faculty of Teacher Training and Education, Universitas Papua

DOI:

https://doi.org/10.26905/abdimas.v10i1.14258

Keywords:

Item response theory, R Studio, teacher, creativity training

Abstract

The training program on item response theory (IRT) analysis using R Studio for teachers in Manokwari aimed to deepen their understanding of modern test theory and their ability to use R Studio as an analytical tool. Because many teachers were unfamiliar with these concepts, the program paired in-depth material with practical applications for analyzing test data and estimating student abilities. This interactive, practice-based training involved 30 teachers and comprised socialization, training, technology application, mentoring, and evaluation of R Studio use in test item analysis. The results showed a marked improvement in teachers' knowledge and skills: before the training, 90 percent of teachers were unfamiliar with R Studio and 70 percent lacked an understanding of IRT; afterward, 70 percent could use R Studio to analyze student answers and 80 percent demonstrated a good grasp of modern test theory concepts. This improvement reflects the program's success in achieving its objectives and has the potential to raise the quality of educational assessment in Manokwari. By applying IRT analysis more effectively, teachers can make better-informed decisions in the teaching and learning process, ultimately contributing to improved educational outcomes.
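
As an illustration of the kind of item analysis taught in the program, the sketch below fits a two-parameter logistic (2PL) IRT model in R. The ltm package and its bundled LSAT data set are illustrative assumptions; the abstract does not name the specific packages or data used in the training.

    # Minimal 2PL IRT analysis sketch (assumed package/data, not from the article)
    library(ltm)

    data(LSAT)  # responses of 1,000 examinees to 5 dichotomous items

    # Fit a two-parameter logistic model: each item gets a difficulty
    # and a discrimination parameter
    fit_2pl <- ltm(LSAT ~ z1)
    coef(fit_2pl)  # item difficulty (Dffclt) and discrimination (Dscrmn)

    # Expected a posteriori (EAP) estimates of examinee ability (theta)
    abilities <- factor.scores(fit_2pl, method = "EAP")
    head(abilities$score.dat)

    # Item characteristic curves for visual item inspection
    plot(fit_2pl, type = "ICC")

The same workflow applies to classroom tests: substituting a 0/1 scored response matrix for LSAT yields the item parameters and student ability estimates that the training emphasized.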

Published

2025-02-28

How to Cite

Suparman, A. R., Murtihapsari, M., & Purwati, P. (2025). Empowering teachers in item response theory analysis using R Studio in Manokwari. Abdimas: Jurnal Pengabdian Masyarakat Universitas Merdeka Malang, 10(1). https://doi.org/10.26905/abdimas.v10i1.14258

Issue

Vol. 10 No. 1 (2025)

Section

Social and Humanities