Complement or contamination: a study of the validity of multiple-choice items when assessing reasoning skills in physics
Jönsson, Anders: Kristianstad University, School of Education and Environment, Avdelningen för Naturvetenskap; Kristianstad University, Research environment Learning in Science and Mathematics (LISMA). ORCID iD: 0000-0002-3251-6082
Malmö högskola.
Malmö högskola.
2017 (English). In: Frontiers in Education - Assessment, Testing and Applied Measurement, Vol. 2, article id 48. Article in journal (Refereed). Published
Abstract [en]

The purpose of this study is to investigate the validity of using multiple-choice (MC) items as a complement to constructed-response (CR) items when making decisions about student performance on reasoning tasks. CR items from a national test in physics have been reformulated into MC items, and students' reasoning skills have been analyzed in two substudies. In the first study, 12 students answered the MC items and were asked to explain their answers orally. In the second study, 102 students from five randomly chosen schools answered the same items. Their answers were scored, and the frequency of correct answers was calculated for each of the items. The scores were then compared to a sample of student performance on the original CR items from the national test. Findings suggest that results from MC items might be misleading when making decisions about student performance on reasoning tasks, since students use skills other than those intended when answering the items. Results from MC items may also contribute to an overestimation of students' knowledge in science.
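The comparison described in the abstract amounts to computing, for each item, the proportion of correct answers in the MC format and setting it against the corresponding proportion for the original CR item. The sketch below is illustrative only and not taken from the study; the item identifiers, response data, and CR proportions are invented for the example.

    # Minimal sketch (not from the article): per-item frequency of correct
    # MC answers, compared with hypothetical CR proportions.
    from collections import defaultdict

    # Hypothetical responses: (student_id, item_id, is_correct)
    mc_responses = [
        ("s01", "item1", True), ("s01", "item2", False),
        ("s02", "item1", True), ("s02", "item2", True),
        ("s03", "item1", False), ("s03", "item2", True),
    ]

    # Hypothetical proportion correct on the original CR versions of the items
    cr_proportion_correct = {"item1": 0.41, "item2": 0.35}

    counts = defaultdict(lambda: [0, 0])  # item_id -> [n_correct, n_total]
    for _, item_id, is_correct in mc_responses:
        counts[item_id][0] += int(is_correct)
        counts[item_id][1] += 1

    for item_id, (n_correct, n_total) in sorted(counts.items()):
        mc_p = n_correct / n_total
        cr_p = cr_proportion_correct[item_id]
        print(f"{item_id}: MC {mc_p:.2f} vs CR {cr_p:.2f} (diff {mc_p - cr_p:+.2f})")

A positive difference for an item would indicate that students score higher on the MC version than on the original CR version, the kind of gap the abstract describes as a possible overestimation of students' knowledge.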

Place, publisher, year, edition, pages
2017. Vol. 2, article id 48
Keywords [en]
Argumentation skills; assessment; multiple-choice items; national testing; socio-scientific issues
National Category
Didactics
Identifiers
URN: urn:nbn:se:hkr:diva-17254
DOI: 10.3389/feduc.2017.00048
OAI: oai:DiVA.org:hkr-17254
DiVA, id: diva2:1140792
Available from: 2017-09-13. Created: 2017-09-13. Last updated: 2017-09-13. Bibliographically approved.

Open Access in DiVA

fulltext (201 kB), 217 downloads
File information
File name: FULLTEXT01.pdf
File size: 201 kB
Checksum (SHA-512): ee9d1a9b6fd743bb0f03a338b52c374962cb0d867d1448a43546289055273c5d5d852c9e124a5fbd8ac436aff4c1d77111f46ff36ee84b4f5e57324ce272177a
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text: http://journal.frontiersin.org/article/10.3389/feduc.2017.00048/full

Search in DiVA

By author/editor
Jönsson, Anders
By organisation
Avdelningen för Naturvetenskap; Research environment Learning in Science and Mathematics (LISMA)
Didactics
