ASSESSING INDONESIAN CIVIL SERVANT CANDIDATES: AN ANALYSIS OF ACADEMIC POTENTIAL TEST CHARACTERISTICS USING THE ITEM RESPONSE THEORY APPROACH

Abstract

The Academic Potential Test (APT) is a crucial instrument in the selection of Indonesian Civil Servant Candidates (CSC), measuring the cognitive potential of prospective employees. This study aims to evaluate the characteristics of the items in the Academic Potential Test for Indonesian Civil Servant Candidates using the Item Response Theory (IRT) approach with dichotomous scoring, in order to improve the quality and validity of the test. The research method was descriptive quantitative, with 330 participants in APT CSC simulation training as the research sample. Data were analyzed using three logistic models in IRT: one-parameter (1-PL), two-parameter (2-PL), and three-parameter (3-PL). The results revealed that the 1-PL model is the best-fitting model for estimating the item parameters and the difficulty level of APT CSC questions in dichotomous data. Analysis of the item parameters showed that, overall, the items in the APT CSC met the "good" criterion for item difficulty. Based on these findings, the study concluded that the 1-PL model can serve as a valid and reliable tool for estimating the difficulty level of APT CSC items. In addition, the results provide a basis for future efforts to improve and develop the APT CSC.
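The three logistic models compared in the abstract differ only in how many item parameters they estimate: difficulty (b) in the 1-PL model, plus discrimination (a) in the 2-PL, plus a guessing floor (c) in the 3-PL. As a minimal sketch (not the authors' analysis code; the function name `icc` and the example parameter values are illustrative assumptions), the item characteristic curve shared by all three models can be written as:

```python
import numpy as np

def icc(theta, b, a=1.0, c=0.0):
    """Item characteristic curve of the 3PL logistic model:

        P(theta) = c + (1 - c) / (1 + exp(-a * (theta - b)))

    With c = 0 this reduces to the 2PL model, and with a = 1 and
    c = 0 to the 1PL (Rasch) model that the study found best-fitting.
    """
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

# An item of average difficulty (b = 0) across a range of ability levels
theta = np.linspace(-3, 3, 7)
p_1pl = icc(theta, b=0.0)                 # 1PL: discrimination fixed at 1
p_2pl = icc(theta, b=0.0, a=1.5)          # 2PL: steeper discrimination
p_3pl = icc(theta, b=0.0, a=1.5, c=0.2)   # 3PL: 20% guessing floor
```

At ability equal to item difficulty (theta = b), the 1PL and 2PL curves pass through 0.5, while the 3PL curve passes through c + (1 − c)/2; this is why the guessing parameter shifts the whole curve upward.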


Author Biographies

Amran Hapsan, Universitas Patompo, Makassar; Inspection Canal Street, Citraland No. 10, Rappocini District, Makassar City, 90233, Indonesia.

Doctoral student; Department of Mathematics Education

Endang Mulyatiningsih, Yogyakarta State University

Doctor, Master of Education, Professor of Learning Assessment and Educational Research at the Department of Culinary Arts and Fashion Engineering Education, Faculty of Engineering

Kana Hidayati, Yogyakarta State University

Doctor, Master of Education, Professor in the Education and Educational Assessment study program

Indiriani H. Ismail, Yogyakarta State University

Doctoral candidate


Published
2026-03-23
How to Cite
Hapsan A., Mulyatiningsih E., Hidayati K., & Ismail I. H. (2026). ASSESSING INDONESIAN CIVIL SERVANT CANDIDATES: AN ANALYSIS OF ACADEMIC POTENTIAL TEST CHARACTERISTICS USING THE ITEM RESPONSE THEORY APPROACH. Public Administration Issues, (5), 117-138. https://doi.org/10.17323/1999-5431-2026-0-5-117-138
Issue
Section
GOVERNANCE ISSUES: THEORY AND PRACTICE