
Thursday, January 15, 2009

Sabella & Steinberg, Physics Teacher (1997)

Performance on multiple-choice diagnostics and complementary exam problems
M. S. Sabella & R. N. Steinberg, The Physics Teacher, 35(3), p 150-155 (1997).

Abstract: Multiple-choice diagnostic tests are becoming increasingly popular at many levels in the physics education community. They are regularly used to assess curriculum and to measure student understanding of basic concepts. Their multiple-choice format makes them easy to implement and analyze. This has led to the great benefit of an increased awareness of students’ conceptual difficulties.

Since its publication in this journal, the Force Concept Inventory (FCI) has become extremely popular, with much attention given to student scores. The FCI therefore plays a major role in the development of curriculum and instructional strategies. Despite such importance, only a few studies have been published on how student performance on the FCI correlates with understanding of the subject matter.

In order to help physics educators interpret the results of the FCI, as well as other multiple-choice diagnostics, it is clear that further research is needed. The Physics Education Research Group at the University of Maryland has written open-ended examination problems that correspond to several FCI questions. The FCI was administered during the last week of the semester and the exam problems were included the following week on final exams of first semester introductory calculus-based physics classes at the University of Maryland. In this article, we describe the correlation between student performance on the FCI and the corresponding exam problems.

Wednesday, January 14, 2009

Bao, PhD Dissertation (1999)

Using the Context of Physics Problem Solving to Evaluate the Coherence of Student Knowledge
L. Bao, Ph.D. Dissertation, E. F. Redish (advisor) (1999).


Abstract: A good understanding of how students understand physics is of great importance for developing and delivering effective instruction. This research is an attempt to develop a coherent theoretical and mathematical framework to model student learning of physics. The theoretical foundation is based on useful ideas from theories in cognitive science, education, and physics education. The emphasis of this research is on the development of a mathematical representation to model the important mental elements and the dynamics of these elements, and on numerical algorithms that allow quantitative evaluations of conceptual learning in physics.

In part I, a model-based theoretical framework is proposed. Based on the theory, a mathematical representation and a set of data analysis algorithms are developed. This new method, called Model Analysis, can be used to obtain quantitative evaluations of student models from data on multiple-choice questions. Two specific algorithms are discussed in detail. The first algorithm is the concentration factor. It measures how student responses on multiple-choice questions are distributed. A significant concentration on certain choices of a question often implies the existence of common student models associated with those choices. The second algorithm is model evaluation, which analyzes student responses to form student model vectors and a student model density matrix. By studying the density matrix, we can obtain quantitative evaluations of specific models used by students. Application examples with data from the FCI, the FMCE, and the Wave Test are discussed. A number of additional algorithms are introduced to deal with unique aspects of different tests and to make quantitative assessments of various features of the tests. Implications for test design techniques are also discussed in light of the results from the examples.

Based on the theory and algorithms developed in part I, research is conducted to investigate student understanding of quantum mechanics. Common student models on classical prerequisites and important quantum concepts are identified. For example, many students interpret the quantum wavefunction as the representation of the energy of a particle. Based on the research results, multiple-choice instruments are developed to probe student models with the model analysis algorithms. A set of quantum tutorials is also developed and implemented in instruction. Results from exams and student interviews indicate that the quantum tutorials are effective.
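The abstract does not give the formulas, but the concentration factor has a compact closed form in the published work (see the Bao & Redish concentration analysis paper below). Here is a minimal Python sketch, assuming the standard definition for a question with m choices answered by N students:

import numpy as np

def concentration_factor(counts):
    # counts[i] = number of students choosing option i of an m-choice question.
    # Standard form: C = sqrt(m)/(sqrt(m)-1) * (sqrt(sum n_i^2)/N - 1/sqrt(m)),
    # so C = 0 for an even spread over all choices and C = 1 when every
    # student picks the same choice.
    n = np.asarray(counts, dtype=float)
    m, N = len(n), n.sum()
    return np.sqrt(m) / (np.sqrt(m) - 1) * (np.linalg.norm(n) / N - 1 / np.sqrt(m))

# A 5-choice question answered by 100 students:
print(concentration_factor([20, 20, 20, 20, 20]))  # 0.0: responses evenly spread
print(concentration_factor([0, 100, 0, 0, 0]))     # 1.0: everyone on one choice
print(concentration_factor([10, 60, 10, 10, 10]))  # ~0.34: one choice dominates

A strong concentration on a single incorrect choice is the signature of a common student model.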

Tuesday, January 13, 2009

McCaskey, Dancy & Elby, Proceedings of 2003 PER Conference (2004)

Effects on assessment caused by splits between belief and understanding
T. L. McCaskey, M. H. Dancy & A. Elby, in Proceedings of the 2003 Physics Education Research Conference, S. Franklin, J. Marx & K. Cummings (Eds.), 720, p 37-40, Melville, NY: American Institute of Physics (2004). 

Abstract: We performed a new kind of FCI study to get at the differences between what students believe and what they think scientists believe. Students took the FCI in the standard way, and then made a second pass indicating “the answer they really believe” and “the answer they think a scientist would give.” Students split on a large number of the questions, with women splitting more often than men.

Monday, January 12, 2009

Bao & Redish, UMD preprint (2001)

Model Analysis: Assessing the Dynamics of Student Learning
L. Bao & E. F. Redish, University of Maryland preprint (Mar 2001).

Abstract: In this paper we present a method of modeling and analysis that permits the extraction and quantitative display of detailed information about the effects of instruction on a class’s knowledge. The method relies on a cognitive model of thinking and learning that represents student thinking in terms of patterns of association in long-term memory structures that we refer to as schemas or mental models. As shown by previous research, students frequently fail to recognize relevant conditions that lead to appropriate uses of their mental models and, as a result, can use multiple models inconsistently to treat problems that appear equivalent to an expert. Once the most common mental models have been determined via qualitative research, they can be mapped onto probing instruments such as a multiple-choice test. We have developed Model Analysis to analyze the results of these instruments; it treats the student as if he/she were in a mixed state – a state which, when probed with a set of scenarios under diverse contextual settings, gives the probability that the student will choose a particular mental model to analyze the scenario. We illustrate the use of our method by analyzing results from the Force Concept Inventory, a research-based multiple-choice instrument developed to probe students’ conceptual understanding of Newtonian Mechanics in a physics class. Model Analysis allows one to use qualitative research results to provide a framework for analyzing and interpreting the meaning of students’ incorrect responses on a well-designed research-based multiple-choice test. These results can then be used to guide instruction, either for an individual teacher or for developers of reform curricula.
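As a rough sketch of the bookkeeping this implies (the three-model coding and the class data below are hypothetical): each student's answers are coded by the mental model they reflect, collected into a unit "model vector", and the class is summarized by a density matrix, the average outer product of those vectors, whose dominant eigenvector approximates the most common class model state:

import numpy as np

def model_vector(model_counts):
    # model_counts[i] = number of questions on which this student's answer
    # was coded as mental model i. The squared components of the resulting
    # unit vector estimate the probability the student applies each model.
    n = np.asarray(model_counts, dtype=float)
    return np.sqrt(n / n.sum())

def class_density_matrix(class_counts):
    # Average outer product u u^T over all students; the trace is 1.
    vecs = [model_vector(c) for c in class_counts]
    return sum(np.outer(u, u) for u in vecs) / len(vecs)

# Hypothetical class of four students, five questions each, answers coded
# to three models (Newtonian, common naive model, other):
D = class_density_matrix([[5, 0, 0], [3, 2, 0], [1, 4, 0], [2, 2, 1]])
vals, vecs = np.linalg.eigh(D)   # eigenvalues in ascending order
print(vals[-1], vecs[:, -1])     # dominant eigenvalue and its model state

A dominant eigenvalue near 1 means the class applies one model consistently; comparable eigenvalues signal a class in a mixed state.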

Bao & Redish, PER Suppl to Am J Phys (2001)

Concentration Analysis: A Quantitative Assessment of Student States
L. Bao & E. F. Redish, Physics Education Research Supplement to the American Journal of Physics, 69, S45-S53 (July 2001).

Abstract: Multiple-choice tests such as the Force Concept Inventory (FCI) provide useful instruments to probe the distribution of student difficulties on a large scale. However, traditional analysis often relies solely on scores (the number of students giving the correct answer). This ignores what can be significant and important information: the distribution of wrong answers given by the class. In this paper we introduce a new method, concentration analysis, to measure how students’ responses on multiple-choice questions are distributed. This information can be used to study whether the students have common incorrect models or whether the question is effective in detecting student models. When combined with information obtained from qualitative research, the method allows us to identify cleanly what FCI results are telling us about student knowledge.
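To see why the wrong-answer distribution matters, compare two invented 5-choice questions with the same score but different structure (the formula repeats the sketch in the Bao (1999) post above so this snippet runs on its own):

import numpy as np

def concentration(counts):
    # Concentration factor, as sketched in the Bao (1999) post above.
    n = np.asarray(counts, dtype=float)
    m = len(n)
    return np.sqrt(m) / (np.sqrt(m) - 1) * (np.linalg.norm(n) / n.sum() - 1 / np.sqrt(m))

# Both questions have score S = 0.30 (choice A correct, 100 students):
print(concentration([30, 70, 0, 0, 0]))     # ~0.57: wrong answers pile onto B
print(concentration([30, 18, 18, 17, 17]))  # ~0.03: wrong answers scattered

Identical scores, very different stories: the first pattern points to a common incorrect model, the second to no dominant model at all. This pairing of score and concentration is the kind of pattern the paper's analysis is designed to separate.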

Bao & Redish, Conf: Phys Teacher Beyond 2000 (2000)

What can you learn from a (good) multiple choice exam?
L. Bao & E. F. Redish, contributed paper, GIREP Conference: Physics Teacher Education beyond 2000, Barcelona, Spain (2000).

Abstract: The information that a teacher typically extracts from a multiple-choice exam is limited. Basically, one learns: how many students in my class can answer each question correctly? Careful studies of student thinking [1] demonstrate that student responses may reflect strongly held naïve conceptions and that students may function as if they think about a particular topic using contradictory models (typically their naïve model and the scientific one taught in class). We have developed tools for extracting information about the state of knowledge of a class from multiple-choice exams that goes beyond how many students answered each question correctly. First, a mathematical function, the concentration factor, allows one to determine whether a particular question triggers students’ naïve models. Second, by treating the students as if they can exist in “mixed states” of knowledge, we create methods of extracting measures of the state of confusion of the class; by this we mean how likely the students are to use mixed models. Our method assists in the construction of multiple-choice tests that respond to what is known about the difficulties students bring into classes, and we provide ways of extracting more detail about what students have learned than traditional analysis tools allow.

Redish, International Conf of Phys Teachers & Educators (1999)

Diagnosing Student Problems Using the Results and Methods of Physics Education Research
E. F. Redish, International Conference of Physics Teachers and Educators: Guilin, People's Republic of China (19 August, 1999). 

Redish, Saul & Steinberg, Am J Phys (1998)

Student Expectations in Introductory Physics
E. F. Redish, J. M. Saul & R. N. Steinberg, Am J Phys, 66, p 212-224 (1998).

Abstract: Students' understanding of what science is about and how it is done and their expectations as to what goes on in a science course, can play a powerful role in what they get out of introductory college physics. In this paper, we describe the Maryland Physics Expectations (MPEX) Survey; a 34-item Likert-scale (agree-disagree) survey that probes student attitudes, beliefs, and assumptions about physics. We report on the results of pre- and post-instruction delivery of this survey to 1500 students in introductory calculus-based physics at six colleges and universities. We note a large gap between the expectations of experts and novices and observe a tendency for student expectations to deteriorate rather than improve as a result of the first term of introductory calculus-based physics.