DIPF database of publications

Detailed view of result


Author:
Buchholz, Janine; Hartig, Johannes

Title:
Measurement invariance testing in questionnaires: A comparison of three Multigroup-CFA and IRT-based approaches

Source:
In: Psychological Test and Assessment Modelling, 62 (2020) 1, 29-54

URL of full text:
https://www.psychologie-aktuell.com/fileadmin/Redaktion/Journale/ptam-2020-1/03_Buchholz.pdf

Language:
English

Document type:
3a. Articles in peer-reviewed journals; contribution to a special issue

Keywords:
PISA (Programme for International Student Assessment), Item response theory, Factor analysis, Student achievement, Achievement measurement, Measurement, Invariance, Validity, Statistical method


Abstract (English):
International Large-Scale Assessments aim at comparisons of countries with respect to latent constructs such as attitudes, values and beliefs. Measurement invariance (MI) needs to hold in order for such comparisons to be valid. Several statistical approaches to test for MI have been proposed: While Multigroup Confirmatory Factor Analysis (MGCFA) is particularly popular, a newer, IRT-based approach was introduced for non-cognitive constructs in PISA 2015, thus raising the question of consistency between these approaches. A total of three approaches (MGCFA for ordinal and continuous data, multi-group IRT) were applied to simulated data containing different types and extents of MI violations, and to the empirical non-cognitive PISA 2015 data. Analyses are based on indices of the magnitude (i.e., parameter-specific modification indices resulting from MGCFA and group-specific item fit statistics resulting from the IRT approach) and direction of local misfit (i.e., standardized parameter change and mean deviation, respectively). Results indicate that all measures were sensitive to (some) MI violations and more consistent in identifying group differences in item difficulty parameters.
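The kind of measurement invariance violation examined in the abstract, a group-specific shift in an item's difficulty parameter, can be illustrated with a small simulation. The sketch below (a minimal illustration, not the authors' actual analysis pipeline; all variable names and the 0.8 shift are assumptions) generates dichotomous responses for two groups under a Rasch model, makes one item harder in the second group, and flags it with a crude item-level comparison of endorsement rates:

```python
import numpy as np

rng = np.random.default_rng(0)

n, k = 2000, 5                      # respondents per group, number of items
theta_a = rng.normal(0, 1, n)       # latent trait, group A
theta_b = rng.normal(0, 1, n)       # latent trait, group B (same distribution)

# Item difficulties: identical across groups except item 2 (0-indexed),
# which is made harder in group B -- a deliberate MI violation.
b_a = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
b_b = b_a.copy()
b_b[2] += 0.8

def simulate(theta, b):
    """Dichotomous responses under a Rasch model: P(X=1) = logistic(theta - b)."""
    p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
    return (rng.uniform(size=p.shape) < p).astype(int)

x_a = simulate(theta_a, b_a)
x_b = simulate(theta_b, b_b)

# Crude invariance screen: with equal trait distributions, a large gap in
# item-level endorsement rates points at a group-specific difficulty shift.
diff = x_a.mean(axis=0) - x_b.mean(axis=0)
flagged = int(np.argmax(np.abs(diff)))
print("per-item rate differences:", diff.round(3))
print("flagged item:", flagged)
```

The MGCFA modification indices and group-specific IRT item-fit statistics compared in the paper are far more principled versions of this screen: they, too, localize misfit to individual item parameters rather than rejecting invariance wholesale.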


DIPF-Departments:
Educational Quality and Evaluation
