Dynamic assessment and the “Interactive Examination”
Malmö University (LISMA). ORCID iD: 0000-0002-3251-6082
Malmö University.
2007 (English). In: Journal of Educational Technology & Society, ISSN 1176-3647, E-ISSN 1436-4522, Vol. 10, no. 4, pp. 17-27. Article in journal (Refereed). Published.
Abstract [en]

Assessing one's own actions and defining one's individual learning needs is fundamental to professional development, and developing self-assessment skills requires practice and feedback during the course of studies. The “Interactive Examination” is a methodology aimed at assisting students in developing their self-assessment skills. The present study describes the methodology and presents the results from a multicentre evaluation study at the Faculty of Odontology (OD) and the School of Teacher Education (LUT) at Malmö University, Sweden. During the examination, students assessed their own competence, and their self-assessments were matched against the judgement of their instructors (OD) or their examination results (LUT). Students then received a personal task, to which they had to respond in writing. After submitting their response, the students received a document representing the way an “expert” in the field chose to deal with the same task. They then had to prepare a “comparison document” in which they identified differences between their own answer and the “expert” answer. Results showed that students at both institutions appreciated the examination. The pattern of self-assessment differed somewhat between the two centres, and the qualitative analysis of students’ comparison documents also revealed some interesting institutional differences.

Place, publisher, year, edition, pages
2007. Vol. 10, no. 4, pp. 17-27.
Keyword [en]
Assessment, Self-assessment, Oral health education, Teacher education
National Category
Pedagogy
Identifiers
URN: urn:nbn:se:hkr:diva-6336
OAI: oai:DiVA.org:hkr-6336
DiVA: diva2:301396
Available from: 2010-03-03. Created: 2010-03-03. Last updated: 2014-08-13. Bibliographically approved.
In thesis
1. Educative assessment for/of teacher competency: a study of assessment and learning in the ”Interactive examination” for student teachers
2008 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

The aim of this dissertation is to explore some of the problems associated with introducing authentic assessment in teacher education. In the first part of the dissertation, a literature review investigates whether the use of scoring rubrics can support credible assessment of complex performance while at the same time supporting student learning of such performance. In the second part, the conclusions drawn from the first part are implemented in the design of the so-called “Interactive examination” for student teachers, which is designed to be an authentic assessment of teacher competency. In this examination, the students are shown short video sequences displaying critical classroom situations and are then asked to describe, analyze, and suggest ways to handle the situations, as well as to reflect on their own answers. It is investigated whether the competencies aimed for in the “Interactive examination” can be assessed in a credible manner, and whether the examination methodology supports student learning. From these investigations, involving three consecutive cohorts of student teachers (n = 462), it is argued that three main contributions to research have been made. First, by reviewing empirical research on performance assessment and scoring rubrics, a set of assumptions has been reached on how to design authentic assessments that both support student learning and provide reliable and valid data on student performance. Second, by articulating teacher competency in the form of criteria and standards, it is possible to assess students’ skills in analyzing classroom situations, as well as their self-assessment skills; furthermore, it is demonstrated that making the assessment demands transparent greatly improves students’ performance. Third, it is shown how teacher competency can be assessed in a valid way without compromising reliability. The dissertation thus illustrates how formative and summative purposes might co-exist within the boundaries of the same (educative) assessment.

Place, publisher, year, edition, pages
Malmö: School of Teacher Education, Malmö University, 2008. 149 p.
Series
Malmö Studies in Educational Sciences, ISSN 1651-4513 ; 41
Keyword
Authentic assessment, Formative assessment, Learning, Reliability, Performance assessment, Scoring rubrics, Teacher education
National Category
Pedagogy
Identifiers
URN: urn:nbn:se:hkr:diva-6338
ISBN: 978-91-977100-3-9
Available from: 2010-03-03. Created: 2010-03-03. Last updated: 2014-06-10. Bibliographically approved.

Open Access in DiVA

No full text

Other links

Fulltext

Search in DiVA

By author/editor
Jönsson, Anders
In the same journal
Journal of Educational Technology & Society
