Validation of a Teacher Personality Competence Assessment: A Rasch Analysis
DOI: https://doi.org/10.30998/sap.v10i3.1773

Keywords: personality competence, teachers, Rasch model

Abstract
This study aims to validate a personality competence assessment instrument for teachers using the Rasch Model. This quantitative study involved 165 teachers who completed a 72-item instrument measuring 13 dimensions of personality competence. Analysis with Winsteps 5.5.0 yielded a person reliability of 0.83 and an item reliability of 0.99, indicating excellent measurement consistency. Dimensionality analysis showed that the measure explained 66.1% of the variance, with 6.7% unexplained variance in the first contrast, satisfying the unidimensionality criterion. The response categories functioned optimally, with ordered thresholds (-1.16, -0.46, 1.62). Of the 72 items, 64 (88.9%) showed good fit, while 8 required revision. The "wise and prudent" dimension had the highest difficulty level (3.42 logits), while "continuous professional development" had the lowest (-2.01 logits). The instrument offers teacher education institutions a practical, objective, evidence-based means of evaluating teachers' personality competence in the digital era.
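The ordered thresholds reported in the abstract can be illustrated with Andrich's rating-scale model, the polytomous Rasch formulation that Winsteps applies to Likert-type data. The sketch below is a hypothetical illustration (not code from the study): `rsm_probs` is an assumed helper name, and the person ability and item difficulty values are arbitrary; only the three thresholds (-1.16, -0.46, 1.62) come from the paper.

```python
import math

def rsm_probs(theta, delta, taus):
    """Category probabilities under Andrich's rating-scale model.

    theta: person ability in logits (illustrative value, not from the paper)
    delta: item difficulty in logits (illustrative value, not from the paper)
    taus:  ordered Andrich thresholds; the paper reports (-1.16, -0.46, 1.62)
    """
    # Numerator for category x is exp(sum over k<=x of (theta - delta - tau_k)),
    # with the empty sum (category 0) equal to exp(0) = 1.
    numerators = [1.0]
    s = 0.0
    for tau in taus:
        s += theta - delta - tau
        numerators.append(math.exp(s))
    total = sum(numerators)
    return [n / total for n in numerators]

# With ordered thresholds, each category is the most probable response
# somewhere along the ability continuum, which is what "response categories
# functioned optimally" refers to.
probs = rsm_probs(theta=0.0, delta=0.0, taus=[-1.16, -0.46, 1.62])
```

A quick property check: when person ability minus item difficulty equals a threshold, the two adjacent categories are equally probable, which is the defining feature of an Andrich threshold.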
License
Copyright (c) 2026 Tb. Moh. Irma Ari Irawan, Hengki Satrianta, Hendrik Tuaputimain, Alice Asiedu, Ahman Ahman (Author)

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.