Search results in the DIPF database of publications
Your query:
(Keywords: "Evaluation")
328 items matching your search terms.
Author(s):
Hahnel, Carolin; Eichmann, Beate; Goldhammer, Frank
Title:
Evaluation of online information in university students. Development and scaling of the screening instrument EVON
In:
Frontiers in Psychology, 11 (2020), Article 562128
DOI:
10.3389/fpsyg.2020.562128
URN:
urn:nbn:de:0111-pedocs-232241
URL:
https://nbn-resolving.org/urn:nbn:de:0111-pedocs-232241
Publication Type:
3a. Articles in peer-reviewed journals; article (no special category)
Language:
English
Keywords:
Germany; Internet; Information literacy; Resource; Credibility; Relevance; Assessment; Test; Test development; Item analysis; Search engine; Simulation; Technology-based testing; Interview; Survey instrument; Evaluation; Student; Rasch model; Empirical study
Abstract:
As Internet sources provide information of varying quality, the ability to evaluate the relevance and credibility of online information is an indispensable prerequisite skill. Based on the assumption that competent individuals can use different properties of information to assess its relevance and credibility, we developed the EVON (evaluation of online information), an interactive computer-based test for university students. The developed instrument consists of eight items that assess the skill to evaluate online information in six languages. Within a simulated search engine environment, students are requested to select the most relevant and credible link for a respective task. To evaluate the developed instrument, we conducted two studies: (1) a pre-study for quality assurance and observing the response process (cognitive interviews of n = 8 students) and (2) a main study aimed at investigating the psychometric properties of the EVON and its relation to other variables (n = 152 students). The results of the pre-study provided initial evidence of a theoretically sound test construction with regard to students' item-processing behavior. The results of the main study showed acceptable psychometric outcomes for a standardized screening instrument with a small number of items. The item design criteria affected the item difficulty as intended, and students' choice to visit a website had an impact on their task success. Furthermore, the probability of task success was positively predicted by general cognitive performance and reading skill. Although the results uncovered a few weaknesses (e.g., a lack of difficult items), and the efforts of validating the interpretation of EVON outcomes still need to be continued, the overall results speak in favor of a successful test construction and provide an initial indication that the EVON assesses students' skill in evaluating online information in search engine environments. (DIPF/Orig.)
DIPF-Departments:
Bildungsqualität und Evaluation
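The EVON items above are scaled with the Rasch model named in the keywords. As a minimal illustration of that model (not the authors' estimation code; the function names and grid range are illustrative assumptions), the sketch below defines the Rasch item characteristic function and recovers an item's difficulty by a simple grid-search maximum likelihood:

```python
import math

def rasch_prob(theta, b):
    """Rasch model: probability of a correct response given
    person ability theta and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_difficulty(responses, thetas):
    """Estimate item difficulty b by maximizing the Rasch likelihood
    over a coarse grid (abilities assumed known; illustrative only)."""
    best_b, best_ll = None, -math.inf
    for step in range(-400, 401):          # grid from -4.00 to 4.00 in 0.01 steps
        b = step / 100.0
        ll = sum(
            math.log(rasch_prob(t, b)) if x else math.log(1.0 - rasch_prob(t, b))
            for x, t in zip(responses, thetas)
        )
        if ll > best_ll:
            best_b, best_ll = b, ll
    return best_b
```

With all abilities at 0, the maximum-likelihood difficulty for three successes out of four is -log(3), which the grid search approximates to two decimal places.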
Author(s):
Köhler, Carmen; Kuger, Susanne; Naumann, Alexander; Hartig, Johannes
Title:
Multilevel models for evaluating the effectiveness of teaching. Conceptual and methodological considerations
In:
Zeitschrift für Pädagogik. Beiheft, 66 (2020), pp. 197-209
DOI:
10.3262/ZPB2001197
Publication Type:
3a. Articles in peer-reviewed journals; contribution to a special issue
Language:
English
Keywords:
Teacher; Behavior; Influencing factor; Effectiveness; Learning; Teaching; Modeling; Methodology; Research on teaching; Evaluation; Number of students; Data analysis; Reliability; Multilevel analysis; Model
Abstract:
In research on teaching, one focus is on identifying teacher behavior that positively influences learners. An appropriate study design, the statistical modeling, and the interpretation of results all pose challenges. For example, the multilevel structure inherent in this field of research requires multilevel analysis models. In this article, an exemplary data set was used to which different multilevel models were applied in order to illustrate how these models affect the substantive interpretation of the research question. In all settings, the research question concerned the effects of teacher behavior on learners' outcomes. (DIPF/Orig.)
Abstract (english):
In research on teaching, the primary focus lies in identifying teacher behavior that positively influences relevant student outcomes. Adequately designing such studies, statistically modeling the data, and interpreting the results pose challenges for researchers. For example, the inherent multilevel structure in studies on teaching requires the application of multilevel models. This research used one exemplary data set, to which varying multilevel models were applied, thus illustrating how these models variously affect the substantive interpretation of the research question. The research question in all settings concerned the effects of teacher behavior on student outcomes. The overall purpose of this paper is to give an overview of how to appropriately model and interpret results regarding the effectiveness of teaching. (DIPF/Orig.)
DIPF-Departments:
Bildungsqualität und Evaluation
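The multilevel structure described in this abstract is commonly handled by decomposing a class-level predictor into between-class and within-class components before fitting a multilevel model. The sketch below shows only that decomposition step in plain Python; it is an illustrative assumption about the general technique, not the analysis code used in the paper:

```python
from statistics import mean

def decompose(scores, class_ids):
    """Split observed ratings (e.g. of teacher behavior) into a
    between-class component (the class mean, constant within a class)
    and a within-class component (each student's deviation from it),
    as multilevel analyses of teaching typically require."""
    groups = {}
    for score, cid in zip(scores, class_ids):
        groups.setdefault(cid, []).append(score)
    class_means = {cid: mean(vals) for cid, vals in groups.items()}
    between = [class_means[cid] for cid in class_ids]
    within = [score - class_means[cid] for score, cid in zip(scores, class_ids)]
    return between, within
```

The between component carries the class-level (level-2) effect, while the within component carries the student-level (level-1) effect; conflating the two is one of the interpretation pitfalls such models address.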
Author(s):
Köhler, Carmen; Robitzsch, Alexander; Hartig, Johannes
Title:
A bias corrected RMSD item fit statistic. An evaluation and comparison to alternatives
In:
Journal of Educational and Behavioral Statistics, 45 (2020) 3, pp. 251-273
DOI:
10.3102/1076998619890566
URL:
https://journals.sagepub.com/doi/10.3102/1076998619890566
Publication Type:
3a. Articles in peer-reviewed journals; article (no special category)
Language:
English
Keywords:
Item response theory; Test construction; Model; Question; Answer; Measurement method; Statistical method; Evaluation; Comparison; Educational research; Empirical research
Abstract:
Testing whether items fit the assumptions of an item response theory model is an important step in evaluating a test. In the literature, numerous item fit statistics exist, many of which show severe limitations. The current study investigates the root mean squared deviation (RMSD) item fit statistic, which is used for evaluating item fit in various large-scale assessment studies. The three research questions of this study are (1) whether the empirical RMSD is an unbiased estimator of the population RMSD; (2) if this is not the case, whether this bias can be corrected; and (3) whether the test statistic provides an adequate significance test to detect misfitting items. Using simulation studies, it was found that the empirical RMSD is not an unbiased estimator of the population RMSD, and nonparametric bootstrapping falls short of entirely eliminating this bias. Using parametric bootstrapping, however, the RMSD can be used as a test statistic that outperforms the other approaches (infit, outfit, and S-X²) with respect to both Type I error rate and power. The empirical application showed that parametric bootstrapping of the RMSD results in rather conservative item fit decisions, which suggests more lenient cut-off criteria. (DIPF/Orig.)
DIPF-Departments:
Bildungsqualität und Evaluation
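The RMSD statistic examined in this abstract is commonly computed as the square root of the weighted mean squared deviation between observed and model-implied item response probabilities across ability groups. The following minimal sketch illustrates that definition only (the bias correction via parametric bootstrapping discussed above is not shown); the function and its inputs are illustrative assumptions, not the paper's implementation:

```python
import math

def rmsd_item_fit(observed_probs, model_probs, weights):
    """RMSD item fit: root of the weighted mean squared deviation between
    observed and model-implied probabilities of a correct response,
    evaluated over ability groups. `weights` is the (discretized) ability
    distribution and must sum to 1."""
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    msd = sum(
        w * (obs - mod) ** 2
        for obs, mod, w in zip(observed_probs, model_probs, weights)
    )
    return math.sqrt(msd)
```

A perfectly fitting item yields RMSD 0; a constant deviation of 0.1 across all ability groups yields RMSD 0.1, which is why fixed cut-offs on this scale are used in large-scale assessments.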
Author(s):
Praetorius, Anna-Katharina; Grünkorn, Juliane; Klieme, Eckhard
Title:
Empirische Forschung zu Unterrichtsqualität - theoretische Grundfragen und quantitative Modellierungen. Einleitung in das Beiheft
In:
Zeitschrift für Pädagogik. Beiheft, 66 (2020), pp. 9-14
DOI:
10.3262/ZPB2001009
Publication Type:
3a. Articles in peer-reviewed journals; bibliographies/reviews etc. (e.g. link tips)
Language:
German
Keywords:
Teaching; Quality; Empirical research; Research on teaching; Evaluation; Model; Theory; Methodology; Measurement; Measurement method; Effect; Introduction
DIPF-Departments:
Bildungsqualität und Evaluation
Author(s):
Praetorius, Anna-Katharina; Klieme, Eckhard; Kleickmann, Thilo; Brunner, Esther; Lindmeier, Anke; Taut, Sandy; Charalambous, Charalambos
Title:
Towards developing a theory of generic teaching quality. Origin, current status, and necessary next steps regarding the Three Basic Dimensions model
In:
Zeitschrift für Pädagogik. Beiheft, 66 (2020), pp. 15-36
Publication Type:
3a. Articles in peer-reviewed journals; contribution to a special issue
Language:
English
Keywords:
Theory building; Teaching; Quality; Empirical research; Research on teaching; Quantitative research; Model; Evaluation; Theory of teaching; Didactics; Mathematics teaching; Methodology; Measurement; Measurement method
Abstract:
In this paper we elaborate upon the relevance of theories of teaching (quality) in quantitative empirical research on teaching. First, we introduce the quantitative empirical research approach. Then, we present the origin and current status of research on the Three Basic Dimensions model of teaching quality, which is especially popular in quantitative research on teaching quality in German-speaking countries. Next, we reflect on the extent to which the model fulfills the criteria for a good theory, before deriving conclusions for future research that focuses on a process of successive theory building. (DIPF/Orig.)
DIPF-Departments:
Bildungsqualität und Evaluation
Author(s):
Vieluf, Svenja; Denk, Albert; Rožman, Mojca; Roczen, Nina
Title:
How do the participants evaluate the HAND in HAND programme? Results of semi-structured focus group interviews
In:
Kozina, Ana (Ed.): Social, emotional and intercultural competencies for inclusive school environments across Europe: Relationships matter, Hamburg: Kovač, 2020 (Studien zur Schulpädagogik, 89), pp. 195-218
URL:
https://www.verlagdrkovac.de/volltexte/11406/11406_Kozina%20ED%20-%20Social%20emotional%20and%20intercultural%20competencies%20for%20inclusive%20school%20environments%20across%20Europe.pdf#page=196
Publication Type:
4. Contributions to edited volumes; edited volume (no special category)
Language:
English
Keywords:
Participant; Evaluation; Programme; Summative evaluation; Formative evaluation; Group; Interview; Intervention; Effect
Abstract (english):
This chapter summarises and discusses how participants evaluated the HAND in HAND programme. It is based on responses to two questions asked during semi-structured focus group interviews which comprised part of a more comprehensive evaluation of the HAND in HAND programme. The findings complement the experimental outcome evaluation but they also serve a formative purpose, i.e. to help identify the starting points for improving the programme. In terms of the summative outcome evaluation, the results show that many participants liked the programme (particularly the teachers, less so the students) and that the programme had positive short-term effects on the participants' mood and on the group atmosphere in several groups. However, only a few participants observed long-term effects. One reason for the latter finding may be that many school staff groups reported difficulties in implementing exercises, practices or ideas from the programme within their everyday pedagogical practice. Hence, the programme could probably gain from including longer-term support with respect to implementation. Other suggestions for improvement derived from the interviews are designing the intervention according to a real 'whole-school' approach and better supporting the autonomy of students. (DIPF/Orig.)
DIPF-Departments:
Bildungsqualität und Evaluation
Author(s):
Horn, Florian; Schiffner, Daniel; Gattinger, Thorsten; Sacher, Patrick
Title:
Usability design and evaluation for a formative assessment feedback
In:
Zender, Raphael; Ifenthaler, Dirk; Leonhardt, Thiemo; Schumacher, Clara (Eds.): DELFI 2020 - die 18. Fachtagung Bildungstechnologien der Gesellschaft für Informatik e. V., 14.-18. September 2020, online, Bonn: Gesellschaft für Informatik, 2020 (Lecture Notes in Informatics, P-308), pp. 121-126
URL:
https://dl.gi.de/handle/20.500.12116/34149
Publication Type:
4. Contributions to edited works; conference proceedings
Language:
English
Keywords:
Adaptive testing; Formative evaluation; User behavior; Usability; Feedback; Assessment; Student; Higher education teaching; Programming; Germany
Abstract (english):
In a current research project, we implemented an approach for delivering computer-assisted adaptive testing as an additional form of formative assessment for a lecture. We primarily used these tests as a formative assessment during a "fundamentals of computer programming" course. The tests also included individual feedback to further guide the students. To evaluate and improve the formative assessments, we performed a usability study focused on user satisfaction while taking a test and reading the feedback. The study used the User Experience Questionnaire. The results indicate that an adaptive assessment can provide more support, but they also show shortcomings of the current implementation. (DIPF/Orig.)
DIPF-Departments:
Informationszentrum Bildung
Author(s):
Rettinger, Tanja; Feldhoff, Tobias; Wurster, Sebastian
Title:
Innerschulische Verarbeitung und Verwendung von Inspektionsergebnissen. Darstellung am Beispiel von Hamburg
In:
Frommelt, Bernd; Ullrich, Heiner (Eds.): Transfer Forschung - Schule: Wenn Theorie auf Praxis trifft, Köln: Link, 2020 (Grundkurs Schulmanagement, 23), pp. 34-39
Publication Type:
4. Contributions to edited works; edited volume (no special category)
Language:
German
Keywords:
School inspection; Quality assurance; School system; School development; School achievement; Evaluation; Data; Use; Feedback; Reception; Hamburg; Germany
Abstract:
In some German federal states, school inspections are a component of quality assurance and quality development in the school system; besides the external control of the quality of school processes and the enforcement of standards, their central aim lies in particular in stimulating school and teaching development (Landwehr, 2011; Maritzen, 2006). (DIPF/Orig.)
DIPF-Departments:
Bildungsqualität und Evaluation
Author(s):
Roczen, Nina; Endale, Wubamlak; Vieluf, Svenja; Rožman, Mojca
Title:
Development of the assessment for use in evaluation of the HAND in HAND programme
In:
Kozina, Ana (Ed.): Social, emotional and intercultural competencies for inclusive school environments across Europe: Relationships matter, Hamburg: Kovač, 2020 (Studien zur Schulpädagogik, 89), pp. 131-156
URL:
https://www.verlagdrkovac.de/volltexte/11406/11406_Kozina%20ED%20-%20Social%20emotional%20and%20intercultural%20competencies%20for%20inclusive%20school%20environments%20across%20Europe.pdf#page=132
Publication Type:
4. Contributions to edited works; edited volume (no special category)
Language:
English
Keywords:
Assessment; Measurement; Development; Pilot study; External evaluation; Method; Diversity; Measurement instrument
Abstract (english):
This chapter presents how the measures targeting social, emotional and intercultural/transcultural (SEI) competencies as well as classroom climate used to externally evaluate the HAND in HAND programme were developed and selected. In the first section, we describe the assessment strategy for our summative and formative evaluation, which consists of applying a multimethod approach that combines self-reports, other-reports, a sociometric measure, vignettes and interviews to measure possible effects of the HAND in HAND programme, find out how participants experienced the programmes and discover levers to help improve the programmes. In the second section, we look at the process of selecting the questionnaire scales based on a pilot study that was conducted in three countries (Slovenia, Croatia, Sweden). We conclude by presenting the final instruments used for the HAND in HAND programme evaluation. (DIPF/Orig.)
DIPF-Departments:
Bildungsqualität und Evaluation
Author(s):
Rožman, Mojca; Roczen, Nina; Vieluf, Svenja
Title:
Evaluation of the HAND in HAND programme. Results from questionnaire scales
In:
Kozina, Ana (Ed.): Social, emotional and intercultural competencies for inclusive school environments across Europe: Relationships matter, Hamburg: Kovač, 2020 (Studien zur Schulpädagogik, 89), pp. 157-194
URL:
https://www.verlagdrkovac.de/volltexte/11406/11406_Kozina%20ED%20-%20Social%20emotional%20and%20intercultural%20competencies%20for%20inclusive%20school%20environments%20across%20Europe.pdf#page=158
Publication Type:
4. Contributions to edited works; edited volume (no special category)
Language:
English
Keywords:
External evaluation; Questionnaire; Scale; Quantitative data; Programme; Effect
Abstract (english):
A principal focus of the evaluation of the HAND in HAND programme is tracing causal effects back to the student and/or school staff programmes. We investigate whether the programme had the expected effects on social, emotional and intercultural competencies (hereinafter SEI competencies) and classroom learning environments. In this chapter, we present results regarding the programme's effectiveness that are based on questionnaire scales from the student and school staff evaluation instrument. These results are part of the experimental outcome evaluation. We compare the experimental groups to the control group in the pre- and post-measurements. Our analysis of the short-term programme effects reveals some of the programme's expected effects in all participating countries. However, many effects in an unexpected direction were also observed. Hence, the HAND in HAND programme may be judged as effective, although its effects are complex and appear to be both positive and negative depending on the specific outcome being examined.
DIPF-Departments:
Bildungsqualität und Evaluation