Search results from the DIPF publication database
Your query:
(Persons: "Naumann" and "Alexander")
19 items found
Authors:
Schnoor, Birger; Hartig, Johannes; Klinger, Thorsten; Naumann, Alexander; Usanova, Irina
Title:
Measuring the development of general language skills in English as a foreign language. Longitudinal invariance of the C-test
In:
Language Testing, 40 (2023) 3, pp. 796-819
DOI:
10.1177/02655322231159829
URL:
https://journals.sagepub.com/doi/10.1177/02655322231159829
Document type:
3a. Articles in peer-reviewed journals; article (no special category)
Language:
English
Abstract:
Research on assessing English as a foreign language (EFL) development has been growing recently. However, empirical evidence from longitudinal analyses based on substantial samples is still needed. In such settings, tests for measuring language development must meet high standards of test quality such as validity, reliability, and objectivity, as well as allow for valid interpretations of change scores, requiring longitudinal measurement invariance. The current study has a methodological focus and aims to examine the measurement invariance of a C-test used to assess EFL development in monolingual and bilingual secondary school students (n = 1956) in Germany. We apply longitudinal confirmatory factor analysis to test invariance hypotheses and obtain proficiency estimates comparable over time. As a result, we achieve residual longitudinal measurement invariance. Furthermore, our analyses support the appropriateness of altering texts in a longitudinal C-test design, which allows for the anchoring of texts between waves to establish comparability of the measurements over time using the information of the repeated texts to estimate the change in the test scores. If used in such a design, a C-test provides reliable, valid, and efficient measures for EFL development in secondary education in bilingual and monolingual students in Germany. (DIPF/Orig.)
DIPF department:
Lehr- und Lernqualität in Bildungseinrichtungen
Authors:
List, Marit Kristine; Leininger, Stephanie; Schönenberger, Stephan; Hartig, Johannes; Hochweber, Jan; Naumann, Alexander
Title:
COINS-Mathematiktest
Publication details:
Frankfurt am Main: Forschungsdatenzentrum Bildung am DIPF (FDZ Bildung), 2023
DOI:
10.7477/1119:391:61
URL:
https://www.fdz-bildung.de/test.php?id=61
Document type:
6. Research data; instruments; diagnostic instruments (incl. test procedures)
Language:
German
DIPF department:
Lehr- und Lernqualität in Bildungseinrichtungen
Authors:
Koch, Alisha; Wißhak, Susanne; Spener, Claudio; Naumann, Alexander; Hochholdinger, Sabine
Title:
Transferwissen von Lehrenden in der berufsbezogenen Weiterbildung. Entwicklung und Pilotierung eines Testinstruments
In:
Zeitschrift für Weiterbildungsforschung, 45 (2022) 1, pp. 89-105
DOI:
10.1007/s40955-022-00210-0
URL:
https://link.springer.com/article/10.1007/s40955-022-00210-0
Document type:
3a. Articles in peer-reviewed journals; article (no special category)
Language:
German
Abstract:
Transfer is considered the central goal of work-related continuing education. Although trainers in work-related continuing education can initiate and support transfer, little is known about the professional knowledge they need to do so, and a measurement instrument to capture this knowledge has been lacking. For this reason, we constructed an instrument for assessing scientifically based knowledge about the determinants of transfer and about its facilitation. This paper reports the construction and piloting of the instrument. Based on item response theory (IRT), the answers of N = 105 trainers were scaled and interpreted. The 32 items showed a good fit with respect to the one-dimensional partial credit model, with infit values ranging from 0.93 to 1.16. The EAP/PV reliability was 0.81. The results indicate that transfer knowledge can be operationalized and reliably measured using the instrument. (DIPF/Orig.)
DIPF department:
Lehr- und Lernqualität in Bildungseinrichtungen
Authors:
Köhler, Carmen; Hartig, Johannes; Naumann, Alexander
Title:
Detecting instruction effects. Deciding between covariance analytical and change-score approach
In:
Educational Psychology Review, 33 (2021) 3, pp. 1191-1211
DOI:
10.1007/s10648-020-09590-6
URN:
urn:nbn:de:0111-pedocs-252368
URL:
https://nbn-resolving.org/urn:nbn:de:0111-pedocs-252368
Document type:
3a. Articles in peer-reviewed journals; article (no special category)
Language:
English
Keywords:
Empirical research; Teaching; Effect; Research on teaching; Method; Measurement procedure; Learning success; Variable; Prediction; Student; Support; Self-efficacy; Teacher; Research design; Data analysis; Model
Abstract:
The article focuses on estimating effects in nonrandomized studies with two outcome measurement occasions and one predictor variable. Given such a design, the analysis approach can be to include the measurement at the previous time point as a predictor in the regression model (ANCOVA), or to predict the change-score of the outcome variable (CHANGE). Researchers demonstrated that both approaches can result in different conclusions regarding the reported effect. Current recommendations on when to apply which approach are, in part, contradictory. In addition, they lack direct reference to the educational and instructional research contexts, since they do not consider latent variable models in which variables are measured without measurement error. This contribution assists researchers in making decisions regarding their analysis model. Using an underlying hypothetical data-generating model, we identify for which kind of data-generating scenario (i.e., under which assumptions) the defined true effect equals the estimated regression coefficients of the ANCOVA and the CHANGE approach. We give empirical examples from instructional research and discuss which approach is more appropriate, respectively. (DIPF/Orig.)
DIPF department:
Lehr- und Lernqualität in Bildungseinrichtungen
Authors:
Kolovou, Dimitra; Naumann, Alexander; Hochweber, Jan; Praetorius, Anna-Katharina
Title:
Content-specificity of teachers' judgment accuracy regarding students' academic achievement
In:
Teaching and Teacher Education, 100 (2021), Article 103298
DOI:
10.1016/j.tate.2021.103298
URN:
urn:nbn:de:0111-pedocs-238937
URL:
https://nbn-resolving.org/urn:nbn:de:0111-pedocs-238937
Document type:
3a. Articles in peer-reviewed journals; article (no special category)
Language:
English
Keywords:
Student achievement; Teacher; Assessment; Accuracy; Consistency <Psy>; School subject; Educational content; Mathematics; German; Test; Multilevel analysis; Multivariate analysis; Model; Lower secondary education; Grade 7; Empirical study; Zurich; Canton; Switzerland
Abstract:
Teachers' accuracy in judging students' achievement is often assumed to be a general ability of teachers. Based on this assumption, teachers should be at least consistent in their accuracy across different content domains within a school subject. Yet, this assumption has rarely been investigated empirically so far. Data from 54 mathematics teachers (N = 1170 students) and 55 language teachers (N = 1255 students) were analysed using a Bayesian multivariate multilevel modelling approach. Results indicate that latent accuracy measures across content domains indeed are substantially correlated within both investigated subjects, but may still be considered to represent different dimensions. (DIPF/Orig.)
DIPF department:
Lehr- und Lernqualität in Bildungseinrichtungen
Authors:
Naumann, Alexander; Köhler, Carmen; Hartig, Johannes
Title:
Der Beitrag der Kompetenztheorie für die Fachdidaktiken
In:
Weißeno, Georg; Ziegler, Béatrice (Eds.): Handbuch Geschichts- und Politikdidaktik, Wiesbaden: Springer VS, 2021, pp. 1-15
DOI:
10.1007/978-3-658-29673-5_4-1
URL:
https://link.springer.com/referenceworkentry/10.1007%2F978-3-658-29673-5_4-1
Document type:
4. Chapters in edited volumes; edited volume (no special category)
Language:
German
DIPF department:
Lehr- und Lernqualität in Bildungseinrichtungen
Authors:
Hartig, Johannes; Köhler, Carmen; Naumann, Alexander
Title:
Using a multilevel random item Rasch model to examine item difficulty variance between random groups
In:
Psychological Test and Assessment Modeling, 62 (2020) 1, pp. 11-27
URL:
https://www.psychologie-aktuell.com/fileadmin/Redaktion/Journale/ptam-2020-1/02_Hartig.pdf
Document type:
3a. Articles in peer-reviewed journals; article (no special category)
Language:
English
Keywords:
Rasch model; Multilevel analysis; Method; Performance; Comparative study; Simulation
Abstract:
In educational assessments, item difficulties are typically assumed to be invariant across groups (e.g., schools or countries). We refer to variances of item difficulties on the group level violating this assumption as random group differential item functioning (RG-DIF). We examine the performance of three methods to estimate RG-DIF: (1) three-level Generalized Linear Mixed Models (GLMMs), (2) three-level GLMMs with anchor items, and (3) item-wise multilevel logistic regression (ML-LR) controlling for the estimated trait score. In a simulation study, the magnitude of RG-DIF and the covariance of the item difficulties on the group level were varied. When group level effects were independent, all three methods performed well. With correlated DIF, estimated variances on the group level were biased with the full three-level GLMM and ML-LR. This bias was more pronounced for ML-LR than for the full three-level GLMM. Using a three-level GLMM with anchor items allowed unbiased estimation of RG-DIF.
DIPF department:
Bildungsqualität und Evaluation
Authors:
Köhler, Carmen; Kuger, Susanne; Naumann, Alexander; Hartig, Johannes
Title:
Multilevel models for evaluating the effectiveness of teaching. Conceptual and methodological considerations
In:
Zeitschrift für Pädagogik. Beiheft, 66 (2020), pp. 197-209
DOI:
10.3262/ZPB2001197
Document type:
3a. Articles in peer-reviewed journals; contribution to a special issue
Language:
English
Keywords:
Teacher; Behavior; Influencing factor; Effectiveness; Learning; Teaching; Modeling; Methodology; Research on teaching; Evaluation; Number of students; Data analysis; Reliability; Multilevel analysis; Model
Abstract:
In research on teaching, a primary focus lies in identifying teacher behavior that positively influences relevant student outcomes. Adequately designing such studies, statistically modeling the data, and interpreting the results pose several challenges for researchers; for example, the multilevel structure inherent in studies on teaching requires multilevel analysis models. This paper applies varying multilevel models to one exemplary data set, illustrating how the choice of model affects the substantive interpretation of the research question. In all settings, the research question concerned the effects of teacher behavior on student outcomes. The overall purpose of this paper is to give an overview of appropriately modeling and interpreting results regarding the effectiveness of teaching. (DIPF/Orig.)
DIPF department:
Bildungsqualität und Evaluation
Authors:
Naumann, Alexander; Kuger, Susanne; Köhler, Carmen; Hochweber, Jan
Title:
Conceptual and methodological challenges in detecting the effectiveness of learning and teaching
In:
Zeitschrift für Pädagogik. Beiheft, 66 (2020), pp. 179-196
Document type:
3a. Articles in peer-reviewed journals; contribution to a special issue
Language:
English
Keywords:
Effectiveness; Teaching; Learning; Effect; Student achievement; Measurement procedure; Conceptualization; Modeling; Teaching process; Achievement measurement; Validity; Methodology
Abstract:
One major goal of research on educational effectiveness is to detect the effects of teaching and learning. Reliably detecting the effects of teaching and learning requires the identification and adequate measurement of (a) the relevant classroom processes and (b) outcomes on the student and the classroom level and also (c) modeling the link between both. The present paper aims to identify and discuss current conceptual and methodological challenges in regard to making inferences on the effectiveness of teaching and learning. We give a brief overview of current practices, discuss key quality criteria with respect to these three aspects, and identify areas in need of further development. (DIPF/Orig.)
DIPF department:
Bildungsqualität und Evaluation
Authors:
Naumann, Alexander; Musow, Stephanie; Katstaller, Michaela
Title:
Instructional sensitivity as a prerequisite for determining the effectiveness of interventions in educational research
In:
Astleitner, Hermann (Ed.): Intervention research in educational practice: Alternative theoretical frameworks and application problems, Münster: Waxmann, 2020, pp. 149-170
Document type:
4. Chapters in edited volumes; edited volume (no special category)
Language:
English
Keywords:
Teaching; Intervention; Effectiveness; Educational research; Measurement procedure; Test; Change; Time; Modeling; Research design; Psychometrics
Abstract:
Student achievement has become a major criterion for evaluating the effectiveness of schooling and teaching. However, valid interpretation and use of test scores in educational contexts require more detailed information about the degree to which the applied test instruments are appropriate to evaluate the intended educational and interventional effects. Instructional sensitivity is the psychometric property of tests or single items to capture effects of classroom instruction. Although instructional sensitivity is a prerequisite for valid inferences on teaching effectiveness, sensitivity is rather assumed than verified in practice. The aim of this chapter is to improve the understanding of instructional sensitivity and its measurement in educational intervention research. Specifically, it first provides an overview of the theoretical framework and relevance of instructional sensitivity. Then, different approaches of measuring instructional sensitivity are outlined and procedures of implementing instructional sensitivity in educational intervention studies are introduced and contrasted with each other. Finally, the role of time spans is discussed and modelling change for short-time and long-time effects in pretest-posttest-follow-up designs is addressed. (DIPF/Orig.)
DIPF department:
Bildungsqualität und Evaluation