Findings from Research on Introductory Physics Students' Understanding
about Measurement Uncertainty
The following is a summary of common difficulties, misconceptions, or mistakes
observed in physics students when they were asked to perform laboratory
activities or answer questions about measurements. These findings
come from the doctoral research conducted by Duane Deardorff at North Carolina
State University.
The nature of uncertainty in measurements
- Some students do not recognize that all measured values have uncertainty
  and are not exact.
- Some students believe that "theoretical" values are exact and have no
  uncertainty.
- Some students believe that measurement errors can be eliminated by using
  careful procedures and quality equipment. This belief is evidenced partly
  by students' use of the term "human error" as a source of experimental
  error.
Uncertainties should be estimated and clearly reported
- Students generally do not recognize the importance of accurate estimation
  and reporting of uncertainty.
- Students are reluctant to show explicit uncertainties with their
  measurements, even when required to do so. Part of the reason is that they
  lack experience with the task, but more significant is that it is an extra
  step that often seems not worth the time and effort to complete (this is
  true even for "experts").
- Students rely on memorized rules for finding uncertainty (e.g., half the
  smallest division on a measuring instrument) instead of finding a total
  uncertainty by considering multiple sources of error, some of which may be
  reasonable guesses.
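The contrast in the last point can be sketched in code: instead of quoting only the memorized half-smallest-division rule, independent error sources are estimated separately and combined in quadrature. The sources and their sizes below are invented for illustration, not taken from the research.

```python
import math

# Hypothetical error sources for a single length measurement, in cm.
# The values are rough estimates invented for illustration.
sources = {
    "instrument resolution (half smallest division)": 0.05,
    "parallax / reading judgment": 0.1,
    "sample-to-sample variation": 0.2,
}

# Combine independent sources in quadrature (root sum of squares).
total = math.sqrt(sum(u**2 for u in sources.values()))

print(f"total uncertainty: {total:.2f} cm")
```

Note that the combined estimate (about 0.23 cm here) is larger than the 0.05 cm the memorized rule alone would give, because the rule ignores the other sources.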
Reporting proper number of significant figures
- Many students do not recognize that the rules of significant figures are an
  efficient means of propagating error and estimating the precision of a
  calculated result based on the precision of the available data. Most
  textbooks do not explain the rationale for significant figures or the
  connection between significant figures and relative uncertainty. Even when
  confronted with this connection (via WebAssign's 1% tolerance), a large
  fraction of students (~90%), and even many instructors, cannot identify the
  relative uncertainty that corresponds to the implied uncertainty in a
  measured value.
- Students typically report calculated values with more precision than can be
  justified, while directly measured values are often reported with less
  precision than is possible. Students who write down all the digits
  displayed on their calculator report that they do so because they do not
  want to lose any information (or grade points for an incomplete answer).
  For measured values, the use of too few significant figures reflects a lack
  of sophistication in experimental measurement skills.
- Students also tend to give imprecise explanations when discussing sources
  of error. Why this apparent inconsistency?
- When an explicit uncertainty is stated, the measured value is often not
  properly rounded to be consistent with the amount of uncertainty.
- Students often report uncertainties with too much precision; only 1 or 2
  significant figures are justified, since the uncertainty is a rough
  estimate that is itself generally only about 50% accurate.
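The connection between significant figures, implied relative uncertainty, and the rounding conventions above can be sketched as follows. The variable names and the sample numbers are my own, chosen for illustration.

```python
# A value reported as 9.8 (two significant figures) carries an implied
# absolute uncertainty of about +/- 0.05: half of the last reported digit.
value = 9.8
implied_abs = 0.05
implied_rel = implied_abs / value          # roughly 0.5% relative uncertainty

# Rounding convention: quote the uncertainty to 1-2 significant figures,
# then round the value to the same decimal place.
raw_value, raw_unc = 3.4578, 0.1234        # invented numbers for illustration
unc = round(raw_unc, 1)                    # 0.1 (one significant figure)
val = round(raw_value, 1)                  # 3.5, consistent with the uncertainty
print(f"implied relative uncertainty of {value}: {implied_rel:.1%}")
print(f"{val} +/- {unc}")
```

Reporting 3.4578 +/- 0.1 would be inconsistent: the extra digits in the value claim precision the stated uncertainty has already ruled out.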
Propagation of errors
- Students avoid, or have difficulty with, propagating uncertainties
  properly.
- Students lack the sophistication to discern which uncertainty factors are
  most significant, so they blindly calculate the total RMS error without
  first considering simplifications.
- They do not know how to estimate by hand the uncertainty of the slope or
  intercept of a graph.
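A small sketch of the second point: checking the relative contributions first often shows that one term dominates the RMS combination, so a simplified estimate is already adequate. The measurement scenario and numbers are hypothetical.

```python
import math

# Hypothetical measurement: density = mass / volume.
m, dm = 25.3, 0.2      # grams (values invented for illustration)
V, dV = 9.5, 0.5       # cm^3

rho = m / V

# For a quotient, relative uncertainties add in quadrature.
rel_m = dm / m         # about 0.8%
rel_V = dV / V         # about 5.3% -- clearly the dominant term
rel_rho = math.sqrt(rel_m**2 + rel_V**2)

# Inspecting the terms before computing shows the volume term dominates,
# so rel_rho is essentially rel_V and the mass term could be neglected.
print(f"rho = {rho:.2f} +/- {rel_rho * rho:.2f} g/cm^3")
```

This is the simplification step the finding says students skip: comparing rel_m and rel_V before grinding through the full formula.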
Identifying and classifying sources of error
- Students use "human error" as a catch-all phrase, often confusing it with
  "mistake."
- It is nearly impossible for students to rank the sources of error based on
  a quantitative analysis of their contribution to the total error (an "error
  budget"), even though this is an important step in improving the design of
  an experiment.
- Students have difficulty identifying and classifying sources of error in
  data.
- Students have difficulty explaining how each source of error could affect
  the result.
- Students have difficulty distinguishing between systematic and random
  errors.
- Students have difficulty identifying what, why, and where certain kinds of
  errors occur.
Interpreting and reducing errors
- When asked whether the error in an experimental result is acceptable,
  students often rationalize the discrepancy between their experimental and
  predicted values by attributing it to some vague factor (like "human error"
  or "equipment error"), without considering the uncertainty estimate. This
  clearly shows that students do not understand the purpose of finding a
  quantitative value for the uncertainty.
- Students do not recognize that taking multiple measurements reduces random
  error but does not reduce systematic error.
- Students do not know how to reduce errors.
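The random-versus-systematic distinction above can be illustrated with a small simulation. The bias and scatter values are invented; the point is only that averaging shrinks the random error but leaves the systematic offset untouched.

```python
import random
import statistics

random.seed(0)  # fixed seed so the illustration is reproducible

TRUE_VALUE = 10.0
BIAS = 0.5      # hypothetical systematic offset (e.g., a miscalibrated scale)
SPREAD = 1.0    # random scatter of a single reading

def measure():
    # Every reading carries the same bias plus independent random noise.
    return TRUE_VALUE + BIAS + random.gauss(0, SPREAD)

results = {}
for n in (5, 50, 500):
    readings = [measure() for _ in range(n)]
    mean = statistics.mean(readings)
    sem = statistics.stdev(readings) / n ** 0.5  # standard error of the mean
    results[n] = (mean, sem)
    print(f"n={n:3d}  mean={mean:5.2f}  random error ~{sem:.3f}")

# The random error shrinks roughly as 1/sqrt(n), but the means cluster
# around 10.5, not 10.0: averaging cannot remove the systematic bias.
```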
Use of uncertainty for comparing results or designing experiments
- Students often compare results without considering their uncertainties,
  even when the uncertainty is known.
- They do not plot the uncertainties to help them visualize the comparison,
  as experts do.
- Judgments about the quality of an experimental result are based on some
  arbitrary criterion: "our result is acceptable since we only had a 5%
  error, and my high school physics teacher said that less than 10% error
  was acceptable."
- Students do not recognize that error analysis can be used to design or
  improve an experimental procedure.
- Students confuse relative uncertainty with relative error, percent
  difference with percent error, and relative uncertainty with confidence
  interval. These confusions indicate that students do not fully understand
  the differences between these concepts.
- Students tend to confirm their hypothesis (stating that their result agrees
  with the theoretical value) even when this conclusion is not supported by
  their data (a Type I error). This error in judgment sometimes goes the
  other way (a Type II error) when a student critiques someone else's
  results: students are eager to blame someone else for making a mistake, but
  reluctant to admit that their own data do not agree with the expected
  value. When a discrepancy is apparent, students blame the difference on
  experimental or human error, instead of considering that the theoretical
  value may be invalid. [per lab reports and Lab Exam falling ball question]
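One common expert alternative to the arbitrary percent-error cutoff criticized above is to ask whether the discrepancy lies within a few combined standard uncertainties. A minimal sketch, with invented numbers and a helper name of my own:

```python
import math

def compatible(a, ua, b, ub, k=2.0):
    """Judge agreement by whether the discrepancy is within k combined
    standard uncertainties, rather than by an arbitrary percent cutoff."""
    return abs(a - b) <= k * math.sqrt(ua**2 + ub**2)

# Hypothetical measurement of g versus the accepted value, in m/s^2.
g_meas, u_meas = 9.5, 0.3
g_ref = 9.81                  # treated as exact here for simplicity

print(compatible(g_meas, u_meas, g_ref, 0.0))
```

With an uncertainty of 0.3, a discrepancy of 0.31 is well within two combined standard uncertainties, so the results are compatible; had the uncertainty been 0.1, the same discrepancy would signal a real disagreement.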
Other
- Students confuse accuracy and precision.
- Students do not understand the meaning of a "confidence interval."
- Students often fail to include error bars on graphs, and when they do, they
  do not explain or understand what confidence interval the bars represent.
- Discarding data is often not adequately justified or explained.
- Students do not feel comfortable with uncertainty; they prefer answers that
  can be considered right or wrong.
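Since confidence intervals recur in several findings above, a minimal sketch of what one is may help. The readings are invented, and the large-sample 1.96 multiplier is an assumption; with this few readings a Student-t multiplier would give a slightly wider interval.

```python
import statistics

# Hypothetical repeated readings of one quantity (numbers invented).
readings = [9.7, 9.9, 10.1, 9.8, 10.0, 10.2, 9.9, 10.0]

mean = statistics.mean(readings)
sem = statistics.stdev(readings) / len(readings) ** 0.5  # std. error of mean

# Approximate 95% confidence interval for the mean: the range expected to
# contain the true value about 95% of the time under repeated experiments.
lo, hi = mean - 1.96 * sem, mean + 1.96 * sem
print(f"mean = {mean:.2f}, 95% CI ~ ({lo:.2f}, {hi:.2f})")
```

This is also what error bars on a graph should communicate: each bar marks a stated confidence interval, not an arbitrary decoration.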