The ability to think critically about scientific data and models is a crucial skill and an important goal of education.
Researchers developed a framework for learning quantitative critical thinking, consisting of iterated cycles of decision-making based on comparisons between datasets, or between data and models.
The authors applied the framework to 130 students in an introductory physics lab course. As the students worked through simple physics experiments, they were given explicit instructions to compare new data with existing data or models and to decide how to act on the comparisons based on statistical tests.
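The summary does not specify which statistical test the students used, but a comparison cycle of this kind can be sketched with a simple ratio of the difference between two measurements to its combined uncertainty. The function names and the decision thresholds below are illustrative assumptions, not details taken from the study:

```python
import math

def difference_ratio(a, da, b, db):
    """Difference between two measurements divided by its combined uncertainty.

    a, b   -- the two measured values (e.g. from two datasets, or data vs. model)
    da, db -- their one-standard-deviation uncertainties
    """
    return abs(a - b) / math.sqrt(da**2 + db**2)

def decide(a, da, b, db, agree_below=1.0, disagree_above=3.0):
    """Act on the comparison. Thresholds are illustrative placeholders."""
    t = difference_ratio(a, da, b, db)
    if t < agree_below:
        return "agree"         # consistent within uncertainty: combine or stop
    if t > disagree_above:
        return "disagree"      # genuine discrepancy: revisit methods or model
    return "inconclusive"      # improve precision and measure again

# Hypothetical example: two pendulum-period measurements (seconds)
print(decide(2.01, 0.02, 2.05, 0.01))  # prints "inconclusive"
```

The "inconclusive" branch is what drives the cycle: rather than stopping, a student would reduce the uncertainties (more trials, better technique) and repeat the comparison.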
After the instructions were removed, the students were 12 times more likely to make or propose changes to improve their data or methods than a control group of 130 students who had taken the course the previous year.
The students in the experimental group were also four times more likely to identify and explain a limitation of a model using their data, compared with the control group. The differences between the two groups persisted in another lab course the following year, suggesting a long-term improvement in critical thinking skills.
The learning framework may have led to a significant and sustained improvement in students’ critical thinking behaviors, and the authors suggest that the framework could be adapted to a range of settings.