FMCE results Fall 2011


One of the goals of the NEXUS Physics course is to introduce more relevant biological content without sacrificing too much of the development of a strong understanding of the Newtonian synthesis at the core of basic classical physics.

 

Background: The Instrument

In the first semester in which the trial class was taught (Fall 2011), the Force and Motion Conceptual Evaluation (FMCE)* was given to the class as a pre-post test. The FMCE is one of a set of concept evaluations developed by the Physics Education Research community as a result of substantial research into the ideas students commonly bring into and activate in an introductory physics class. This test was chosen instead of the more commonly used Force Concept Inventory (FCI) for three reasons.

 

(1) It has more choices per item and more attractive distractors than the FCI, so scores at the low end spread all the way down to 0. The FCI tends to "bottom out" at 25-30%; that is, one typically gets a cluster of students at that level with few below it. Students with scores close to zero may be very confident of their (wrong) answers.

(2) The FMCE includes a set of items on mechanical energy conservation, while the FCI is restricted to forces. Since one strategy of the NEXUS Physics class was to increase the emphasis on energy and reduce the focus on forces, we considered this might provide some interesting insight into what was happening in the class.

(3) We have a considerable baseline of data for the FMCE for this population as a result of the UMd PERG's project, Learning How to Learn Science (LHtLS). In this project, the algebra-based physics class was modified to focus more strongly on scientific competency building (but without the emphasis on biologically appropriate content and examples that marks the NEXUS project).

 

Methodology and Comparisons

The FMCE was administered to the test class on Scantron sheets, during recitation in the first week of class and during the last lecture of the term. Students were given 5 points of participation credit for taking the test but were told that it would not be graded for correctness. To offer some motivation and pedagogical value, students were told that they would be sent their pre-post results and could obtain a consultation, before the final exam, to improve their understanding of any areas the test showed to be problematic.

 

Of the 20 students completing the class, 19 completed both the pre and the post tests. One student started the class late and did not complete the pre-test. His score has not been included so as to provide a fully matched sample.

 

We compare the results for three classes:

  1. The NEXUS development test class: In this class, recitation sections were used for group problem solving, often with biological content. Treatment of forces received reduced emphasis, with, for example, projectile motion reduced to one day's discussion. Treatment of energy received increased emphasis. (Fall 2011, N = 19)
  2. The LHtLS class:  In this class, recitation sections were used for intuition-building concept-oriented tutorials. The instructor had many years of experience teaching this class and was one of the tutorial developers. He integrated tutorial concepts into lectures, homework, and exams. The content was traditional and not adjusted to match the needs of biology students (though some biologically oriented examples were included). (Fall 2010, N = 185)
  3. The enhanced traditional class: In this class, recitation sections were used for intuition-building concept-oriented tutorials. The instructor was teaching this class (and any large lecture class) for the first time. It was his first time working with tutorials, but he "bought in" and attended training sessions.

 

Note that the totals ONLY refer to the force and motion items.  The totals do NOT include the four energy items (see below). Two "totals" are given, one with 36 items and one with 32: some items on the test repeat other items in slightly different ways or are "trivial" items included to put students in the right "frame", and the shorter total omits these. We cite the full 36-item totals.

 

The comparisons use the figure of merit, "fraction of the class's possible gain" -- equal to the number of points the class average increased divided by the number of points the average could have increased:

 

<g> = [post - pre]/[100 - pre]

 

where the pre and post scores are given in percentages.
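
As a concrete illustration, here is a minimal sketch in Python of how this figure of merit can be computed from matched pre/post class averages; the score lists shown are hypothetical and are not the actual class data.

    def fractional_gain(pre_scores, post_scores):
        # Class-average pre- and post-test scores, in percent.
        pre = sum(pre_scores) / len(pre_scores)
        post = sum(post_scores) / len(post_scores)
        # Fraction of the possible gain actually achieved: <g> = (post - pre) / (100 - pre)
        return (post - pre) / (100.0 - pre)

    # Hypothetical matched-sample percentages, for illustration only:
    pre = [20, 35, 25, 40]
    post = [55, 70, 60, 75]
    print(fractional_gain(pre, post))   # 0.5: the class achieved half of its possible gain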

 

Typical fractional gains for first term college physics classes range from 0.1-0.3 for traditionally taught classes, 0.3-0.4 for classes with a research-based active-engagement recitation (Tutorials or Group Problem Solving), and 0.4-0.8 for highly interactive classes (like Workshop Physics).** The figure below shows a sample distribution from Redish & Steinberg (from real data but smoothed, including both FMCE and FCI) with results from class B added (Trad + AE/Epist = traditional class structure with epistemologized active-engagement elements).

We also display histograms for all three classes showing the pre and post scores on subclusters associated with narrower concepts:

 

  1. "Force sled" -- a set of questions focusing on whether students get the idea that net force is associated with acceleration or whether they use the common misconception that force is associated with velocity.
  2. Reversing direction -- a set of questions testing whether students understand that an object can continue accelerating (indeed, MUST accelerate) as it changes direction and its velocity passes through zero.
  3. Force graphs -- a set of questions testing whether students can translate from a description of motion in words to graphs of the net force the object must be feeling to produce that motion.
  4. Acceleration graphs --  a set of questions testing whether students can translate from a description of motion in words to graphs of the acceleration the object must have as a result of that motion.
  5. Newton III -- a set of questions testing whether students can recognize the applicability of Newton's third law ("When two objects interact, the forces they exert on each other are equal and opposite.") in a variety of physical situations.
  6. Velocity graphs --  a set of questions testing whether students can translate from a description of motion in words to graphs of the velocity the object must have as a result of that motion.
  7. Energy -- a set of questions testing whether students understand the implications of mechanical energy conservation for speed.

 

Results and Implications

The results for the test class and the two comparison classes are given in the table below.

 

                              Class A (test class)   Class B   Class C
<g> total (force and motion)  0.41                   0.46      0.26
<g> energy                    0.71                   0.50      0.22

 

 

The detailed pre-post histograms are shown in the figure at the right. Note that the initial scores on the total and on the subclusters are very close in each of the three classes. This addresses a potential concern and shows that our smaller test class A was NOT somehow self-selected for students who were better prepared in physics.

 

We see that our test class A did better than the enhanced traditional (+ tutorials) class C in all categories except Newton's third law (and in most cases, substantially better). The Newton's third law tutorial used in classes B and C has been shown to be particularly effective.***

 

Our test class did not do as well in comparison with class B. That class scored significantly higher than our test class on all of the force and motion concept categories except force sled. Class B spent significantly more time on these topics, including using a set of tutorials explicitly designed to help students with these challenging concepts.

 

The test class performed significantly better than the comparison classes on the energy problems (post score of 78% compared to post scores of 62% for class B and 39% for class C). It seems likely that this reflects the additional stress placed on the concept in the test class.
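
As a rough consistency check (using only the rounded numbers reported here), the gain formula can be inverted to recover the implied pre-test averages: pre = (post - 100<g>)/(1 - <g>). For the energy cluster this gives roughly 24% for class A, 24% for class B, and 22% for class C, consistent with the observation above that the three classes started from very similar levels.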

 

While these results appear reasonable and consistent with the goals and shifted emphasis of the test class, a critical reason for stressing multiple graphs and Newtonian interpretations is that this is an excellent place to develop two fundamental competencies, whose acquisition is an explicit goal of the class:

 

 

It is plausible that a major component of the success of class B is the use of the research-based tutorials in recitation. One modification we might test is alternating tutorials and group problem solving, selecting the tutorials that are most effective and most focused on the competencies we are attempting to teach.

 

 

* R. K. Thornton and D. R. Sokoloff, "Assessing student learning of Newton's laws: The Force and Motion Conceptual Evaluation," Am. J. Phys. 66(4), 338-352 (1998).

 

** See Hake, "Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses," Am. J. Phys. 66, 64-74 (1998); and Redish & Steinberg, "Teaching physics: figuring out what works," Physics Today 52(1), 24-30 (Jan. 1999).

 

*** Smith & Wittmann, "Comparing three ways of teaching Newton's third law," 2009 Physics Education Research Conference, AIP Conf. Proc. 1179, 301-304 (2009); doi:10.1063/1.3266742.

 

 

Joe Redish 12/26/11

 

 

 

 

 
