July 26-27, 2006
Syracuse University
Syracuse, NY



Targeted Poster Session: TP-A

Issues and Innovations in Concept Inventory Development and Administration

Organizer: Rebecca Lindell (rlindel@siue.edu), Southern Illinois University Edwardsville

When/Where

Thursday, July 27   8:15 – 9:45  in Regency C

Thursday, July 27   1:45 – 3:15 in Waverly


Theme:  Since the creation of the Force Concept Inventory (FCI), the development of research-based, distracter-driven multiple-choice instruments has surged, and nearly every scientific discipline now has multiple concept inventories available for its use. A quick literature search yielded concept inventories in Physics, Astronomy, Engineering, Biology, Chemistry, Geoscience, Mathematics, and Statistics. While many innovations have been made in the development of these different concept inventories, many issues have also been raised about their development, validity, reliability, and the conclusions that can be drawn from them. This targeted poster session will bring together discipline-based education researchers to discuss these issues and innovations in concept inventory development and administration.

Goals:  The goal of this session is to communicate many of the issues and innovations in concept inventory development and administration. Our hope is that attendees will leave this targeted poster session more aware of these issues and innovations.
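The theme above names reliability as one recurring issue in concept inventory development. Purely as an illustration, and not drawn from any of the posters below, here is a minimal sketch of one statistic commonly reported for dichotomously scored inventories, the Kuder-Richardson 20 (KR-20) reliability coefficient; the student-by-item score matrix is hypothetical.

```python
import numpy as np

def kr20(item_scores):
    """Kuder-Richardson 20 reliability for dichotomously scored items.

    item_scores: 2-D array, shape (n_students, n_items), entries 0 or 1.
    """
    scores = np.asarray(item_scores, dtype=float)
    n_items = scores.shape[1]
    p = scores.mean(axis=0)               # proportion answering each item correctly
    q = 1.0 - p                            # proportion answering each item incorrectly
    total_var = scores.sum(axis=1).var()   # variance of students' total scores
    return (n_items / (n_items - 1.0)) * (1.0 - (p * q).sum() / total_var)

# Hypothetical data: 5 students x 4 items, scored right/wrong.
responses = [[1, 0, 1, 1],
             [0, 0, 1, 0],
             [1, 1, 1, 1],
             [0, 1, 0, 0],
             [1, 1, 1, 0]]
print(f"KR-20 = {kr20(responses):.2f}")
```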


Individual Poster Abstracts


TP-A1
Switch Effect in Quantitative Assessment: Models, Techniques, and Applications
Lei Bao & Jin Wang, The Ohio State University
Abstract: Measurement of student knowledge and of changes in that knowledge is the prime focus of education assessment. A quantitative measurement result is the outcome of many complicated and often entangled processes affected by a wide range of internal and external factors. Existing research has shown that the features and order of the questions can have a significant impact on student responses. [1, 2] Building on this research, we were able to create a switch effect: by varying certain context variables of equivalent questions, the order in which these questions are given can lead to significantly different response patterns. Combining the contextual variables and the underlying conceptual knowledge of the test questions, we can design specific switch pairs to measure the level of context dependency of students' knowledge, which can in turn be used to evaluate the transferability of that knowledge across contexts of varying similarity. In the poster, we will introduce question design methods for creating the switch effect and present examples, data analysis methods, and results. We will also discuss applications of this method in research and education testing.
Footnotes: [1] L. Bao and E. F. Redish, "Model analysis: Representing and assessing the dynamics of student learning," Phys. Rev. ST Phys. Educ. Res. 2, 010103 (2006). [2] K. Gray, S. Rebello, and D. Zollman, "The effect of question order on responses to multiple-choice questions," Proceedings of the 2000 Physics Education Research Conference.
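The abstract does not spell out the analysis; as an illustrative sketch only (not the authors' method), one simple way to ask whether the two orderings of a switch pair produce different response patterns on the shared target question is a chi-square test on the answer-choice counts. The counts below are invented.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts of answer choices (A-D) on the same target question,
# split by which ordering of the switch pair the students received.
choices_order1 = [34, 12, 40, 14]   # students choosing A, B, C, D
choices_order2 = [18, 11, 58, 13]

chi2, p_value, dof, expected = chi2_contingency([choices_order1, choices_order2])
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.3f}")
```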



TP-A2
Reliability, Compliance and Security of Web-based Pre/Post-testing
Scott Bonham, Western Kentucky University
Abstract: Pre/post testing is an important tool for improving science education. Standard in-class administration has drawbacks such as “lost” class time and the need to convert data into electronic format. These are not issues for unproctored web-based administration, but there are concerns about assessment validity, compliance rates, and instrument security. A preliminary investigation compared astronomy students taking pre/post tests on paper to those taking the same tests over the web. The assessments included the Epistemological Beliefs Assessment for Physical Science and a conceptual assessment developed for this study. Preliminary results on validity show no significant difference on scores or on most individual questions. Compliance rates were similar between web and paper on the pretest and much better for web on the posttest. Remote monitoring of student activity during the assessments recorded no clear indication of any copying, printing, or saving of questions, and no widespread use of the web to search for answers.
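As a hedged illustration of the kind of comparison the abstract describes (the actual data and analysis are not reproduced here), a minimal sketch comparing scores and completion rates between a web group and a paper group; all numbers below are invented.

```python
from scipy.stats import ttest_ind

# Hypothetical posttest scores (percent correct) for the two administration modes.
web_scores   = [62, 71, 55, 80, 68, 74, 59, 66]
paper_scores = [64, 69, 58, 77, 70, 61, 72, 65]

t_stat, p_value = ttest_ind(web_scores, paper_scores)
print(f"score difference: t = {t_stat:.2f}, p = {p_value:.3f}")

# Compliance: fraction of enrolled students who completed the posttest (hypothetical counts).
web_compliance   = 84 / 96
paper_compliance = 61 / 98
print(f"posttest compliance: web {web_compliance:.0%} vs. paper {paper_compliance:.0%}")
```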



TP-A3
Ed's Tools: Web-Based Language Analysis Tools for Concept Inventory Development
Isidoros Doxas, Kathy Garvin-Doxas & Michael W. Klymkowsky, The Bioliteracy Project, Molecular, Cellular & Developmental Biology, University of Colorado
Abstract: Despite their crucial role in reforming the way we teach science in general, and Physics in particular, concept inventories remain, well over a decade after their initial introduction, highly time- and labor-intensive instruments to produce. Ed's Tools are a collection of web-based database and productivity tools that can accelerate the development of concept inventories by facilitating the collection and analysis of relevant student language and by codifying some of the critical stages of development. We will describe Ed's Tools, their use in the development of the Biology Concept Inventory (BCI), and how they can be used to facilitate the development of concept inventories in other fields. (Ed's Tools are available upon request through http://bioliteracy.net.) The BCI development project is funded in part by a grant from the NSF.
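Ed's Tools itself is the web application described above; purely as a sketch of the sort of language analysis such tools automate (not the actual Ed's Tools implementation), a word-frequency tally over student free responses might look like the following, with hypothetical response text and a hand-picked stop-word list.

```python
import re
from collections import Counter

# Hypothetical free-response answers collected from students.
responses = [
    "The gene is used up when the protein is made",
    "Random mutations are directed toward what the organism needs",
    "Proteins are made directly from the gene without any intermediate",
]

stop_words = {"the", "is", "are", "a", "an", "of", "to", "when",
              "what", "any", "up", "from", "without", "toward"}

# Count content words across all responses to surface recurring student language.
words = Counter(
    w for response in responses
    for w in re.findall(r"[a-z']+", response.lower())
    if w not in stop_words
)
print(words.most_common(5))
```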



TP-A4
Are They All Created Equal? A Comparison of Different Concept Inventory Development Methodologies
Rebecca Lindell, Elizabeth Peak and Thomas Foster, Physics, Astronomy, Chemistry, Biology, and Earth Science Education Research Group, Southern Illinois University Edwardsville
Abstract: The creation of the Force Concept Inventory (FCI) [1] was a seminal moment for Physics Education Research. Following the FCI, many more concept inventories have been developed. The problem with all of these concept inventories is that there does not seem to be a common methodology for developing them, nor is there a concise definition of what they measure. By comparing the development methodologies of many common Physics and Astronomy concept inventories, we can draw inferences about the different types of concept inventories, as well as the different valid conclusions that can be drawn from administering them. Inventories compared include: Astronomy Diagnostic Test (ADT), Brief Electricity and Magnetism Assessment (BEMA), Conceptual Survey in Electricity and Magnetism (CSEM), Diagnostic Exam Electricity and Magnetism (DEEM), Determining and Interpreting Resistive Electric Circuits Concept Test (DIRECT), Energy and Motion Conceptual Survey (EMCS), Force Concept Inventory (FCI), Force and Motion Conceptual Evaluation (FMCE), Lunar Phases Concept Inventory (LPCI), Test of Understanding Graphs in Kinematics (TUG-K), and Wave Concept Inventory (WCI).



TP-A5

Validation Studies of the Colorado Physics Problem Solving Survey
Wendy Adams and Carl Wieman, University of Colorado, Boulder
Abstract: Researchers have created several tools for evaluating conceptual understanding as well as students' attitudes and beliefs about physics; however, the field of problem solving is sorely lacking a broad-use evaluation tool. This missing tool is an indication of the complexity of the field. The most obvious and largest hurdle to evaluating physics problem-solving skills is the physics content knowledge necessary to solve problems. We are tackling this problem by looking for the physics problem-solving skills that are useful in other disciplines as well as in physics. We will report on the results of a series of double-blind interviews comparing physics students' skills when solving physics problems with the results of the Colorado Physics Problem Solving Survey.
1. Supported in part by funding from the National Science Foundation DTS.



 


PERC 2006 Organizing Committee

Steve Kanim
Department of Physics, MSC 3D
New Mexico State University, PO Box 30001
Las Cruces, NM 88003-8001
(505) 646-1208  office
(505) 646-1934 fax
skanim@nmsu.edu
 
Rebecca Lindell
Department of Physics
Southern Illinois University Edwardsville
Edwardsville, IL 62026-1654
(618) 650-2934 office
(618) 650-3556 fax
rlindel@siue.edu
 
Michael Loverude
Department of Physics, MH-611
California State University Fullerton
Fullerton, CA 92834
(714) 278-2270 office
(714) 278-5810 fax
mloverude@exchange.fullerton.edu
 
Chandralekha Singh
Department of Physics & Astronomy
University of Pittsburgh
Pittsburgh, PA 15260
(412) 624-9045 office
(412) 624-9163 fax
clsingh@pitt.edu