DIPFdocs Open Access


Boer, Diana; Hanke, Katja; He, Jia:

On detecting systematic measurement error in cross-cultural research
A review and critical reflection on equivalence and invariance tests

In: Journal of Cross-Cultural Psychology, 49 (2018) 5, 713-734

3a. Contributions in peer-reviewed journals; article (no special category)

Cross-cultural comparison, Empirical research, Measurement method, Psychological research, Error, Cultural influence, Influencing factor, Cognition, Behavior, Language, Validity, Taxonomy

One major threat to revealing cultural influences on psychological states or processes is the presence of bias (i.e., systematic measurement error). When quantitative measures do not target the same construct or differ in metric across cultures, the validity of inferences about cultural variability (and universality) is in doubt. The objectives of this article are to review what can be done about bias and what is currently being done about it. To date, a multitude of useful techniques and methods for reducing or assessing bias in cross-cultural research has been developed. We explore the limits of invariance/equivalence testing and suggest more flexible means of dealing with bias. First, we review currently available established and novel methods that reveal bias in cross-cultural research. Second, we analyze current practices in a systematic content analysis. The content analysis of more than 500 culture-comparative quantitative studies (published from 2008 to 2015 in three outlets in cross-cultural, social, and developmental psychology) gauges current practices and approaches in the assessment of measurement equivalence/invariance. Surprisingly, the analysis revealed a rather low penetration of invariance testing in cross-cultural research. Although a multitude of classical and novel approaches to invariance testing is available, they are employed infrequently rather than habitually. We discuss reasons for this hesitation and derive suggestions for creatively assessing and handling biases across different research paradigms and designs. (DIPF/Orig.)

Bildungsqualität und Evaluation