Search results in the DIPF database of publications
Your query:
(Keywords: "Testentwicklung")
2 items matching your search terms.
Measuring system competence in education for sustainable development
Journal Article | In: Sustainability | 2021
Author(s):
Roczen, Nina; Fischer, Frank; Fögele, Janis; Hartig, Johannes; Mehren, Rainer
Title:
Measuring system competence in education for sustainable development
In:
Sustainability, 13 (2021) 9, p. 4932
DOI:
10.3390/su13094932
URN:
urn:nbn:de:0111-pedocs-238471
URL:
https://nbn-resolving.org/urn:nbn:de:0111-pedocs-238471
Publication Type:
3a. Articles in peer-reviewed journals; article (no special category)
Language:
English
Keywords:
Education; Sustainable development; System competence; Test development; Measurement; Assessment; Grade 8; Grade 9; Comprehensive school; Realschule; Hauptschule; Gymnasium; Bavaria; Hesse; North Rhine-Westphalia; Germany
Abstract (English):
This paper presents the development of an instrument for the assessment of system competence in the field of Education for Sustainable Development (ESD). Based on an existing, more complex model of system competence for the school subject geography, we developed a test that refers to central themes and principles of ESD using exclusively closed-response formats. Building on the results of cognitive laboratories and expert feedback from various fields, the instrument was developed further in an iterative process of feedback and revision. We conducted a quantitative pilot study with N = 366 8th and 9th grade students. The results indicate that the development of our system competence test was successful: the overall test yielded high reliability, and only a very few items failed to function as intended. Furthermore, the difficulties of the items were appropriate for the ability levels of the students, and the results of a confirmatory factor analysis (CFA) suggest that the newly developed test measures system competence as a single dimension. As the test is compact, easy to interpret, and yet reliable, it is particularly suitable for monitoring purposes in the field of ESD.
DIPF-Departments:
Lehr- und Lernqualität in Bildungseinrichtungen
Evaluation of online information in university students. Development and scaling of the screening instrument EVON
Journal Article | In: Frontiers in Psychology | 2020
Author(s):
Hahnel, Carolin; Eichmann, Beate; Goldhammer, Frank
Title:
Evaluation of online information in university students. Development and scaling of the screening instrument EVON
In:
Frontiers in Psychology, 11 (2020), Art. 562128
DOI:
10.3389/fpsyg.2020.562128
URN:
urn:nbn:de:0111-pedocs-232241
URL:
https://nbn-resolving.org/urn:nbn:de:0111-pedocs-232241
Publication Type:
3a. Articles in peer-reviewed journals; article (no special category)
Language:
English
Keywords:
Germany; Internet; Information literacy; Resource; Credibility; Relevance; Assessment; Test; Test development; Item analysis; Search engine; Simulation; Technology-based testing; Interview; Survey instrument; Evaluation; Student; Rasch model; Empirical study
Abstract:
As Internet sources provide information of varying quality, the ability to evaluate the relevance and credibility of online information is an indispensable skill. Based on the assumption that competent individuals can use different properties of information to assess its relevance and credibility, we developed the EVON (evaluation of online information), an interactive computer-based test for university students. The instrument consists of eight items, available in six languages, that assess the skill of evaluating online information. Within a simulated search engine environment, students are requested to select the most relevant and credible link for a given task. To evaluate the developed instrument, we conducted two studies: (1) a pre-study for quality assurance and observation of the response process (cognitive interviews with n = 8 students) and (2) a main study aimed at investigating the psychometric properties of the EVON and its relation to other variables (n = 152 students). The results of the pre-study provided initial evidence for a theoretically sound test construction with regard to students' item processing behavior. The results of the main study showed acceptable psychometric outcomes for a standardized screening instrument with a small number of items. The item design criteria affected the item difficulty as intended, and students' choice to visit a website had an impact on their task success. Furthermore, the probability of task success was positively predicted by general cognitive performance and reading skill. Although the results uncovered a few weaknesses (e.g., a lack of difficult items), and validation of the interpretation of EVON outcomes must continue, the overall results speak in favor of a successful test construction and provide an initial indication that the EVON assesses students' skill in evaluating online information in search engine environments. (DIPF/Orig.)
DIPF-Departments:
Bildungsqualität und Evaluation