Harnessing Contrasts: A Vital Tool in Science Education
Contrasts of various kinds are crucial for capturing attention. When we examine a set of similar yet distinct cases, our perspective shifts. This comparative approach offers two main advantages.
First, students become aware of details they might overlook when viewing a single case. A single bicycle, viewed on its own, is simply a bicycle: frame, wheels, gear system, handlebars. Put two bicycles side by side, however, and their differences stand out. These wheels are not just any wheels; they are larger and have a thicker tread. This gear system is simpler than that one. Features that would otherwise go unnoticed become visible.
The second advantage stems from recognizing patterns amid variations. Bicycles can vary widely in design, and thoughtfully chosen examples can elucidate relationships—such as the correlation between gear size and the bicycle's mechanical advantage. Research indicates that engaging with contrasting cases prepares students to delve into the underlying reasons for these variations.
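To make that gearing relationship concrete, here is a rough sketch; the tooth counts and wheel size are hypothetical, and real drivetrains are more complicated than a single ratio:

```python
import math

def distance_per_pedal_turn(chainring_teeth, cog_teeth, wheel_diameter_m):
    """A larger chainring relative to the rear cog turns the wheel further
    per pedal revolution, but at a lower mechanical advantage: the rider
    has to push harder on the pedals."""
    gear_ratio = chainring_teeth / cog_teeth
    return gear_ratio * math.pi * wheel_diameter_m

print(distance_per_pedal_turn(50, 11, 0.7))  # "hard" gear: roughly 10 m per pedal turn
print(distance_per_pedal_turn(34, 28, 0.7))  # "easy" gear: roughly 2.7 m per pedal turn
```

Comparing the two calls is itself a small contrasting case: same wheel, different gearing, and a very different trade-off between distance covered and effort required.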
Science classes abound with opportunities to employ contrasting cases. In anatomy, contrasting bone specimens can reveal significant details. In geology, contrasting rock types can showcase the diversity within the subject. Yet the importance of contrasts extends beyond content delivery.
Contrasts serve as an excellent framework for facilitating meaningful discussions.
At its core, science is about explanation. It revolves around arguing from evidence, developing candidate explanations, and proposing ways to evaluate them. For educators who want to cultivate scientific thinking in their students, rather than merely imparting scientific knowledge, contrasts are invaluable assets in both laboratories and classrooms.
Reflect on the conventional science lab. I enrolled in college-level chemistry three times. During lab sessions, we would gather in groups and receive a step-by-step experimental guide, a recipe for conducting the experiment: measure out 5 grams of sodium chloride, add 12 ounces of hydrogen peroxide. The most enjoyable part was handling the cool glass beakers. Each experiment typically illustrated concepts we were studying in class, though I often struggled to articulate their relevance.
Did I gain a real understanding of scientific processes? Not at all. Did the lab experience enhance my comprehension of the lecture material? Research suggests it likely did not. This “recipe” style of lab work incorporates no meaningful contrasts and fosters little productive discussion. Get the expected result and you succeeded; fail to get it and you made a mistake somewhere, leaving only a list of potential errors to contemplate. And if you did get the desired outcome, there was rarely anything more to discuss.
Traditional science labs merely simulate the surface of scientific inquiry. They present students with the polished final product of experiments that demonstrate important principles when executed correctly, while offering little practice in scientific decision-making. If argumentation is essential to science, as some argue, these recipes provide scant opportunities for meaningful debate. Similarly, if model-based reasoning is a cornerstone of scientific understanding, these methods lack avenues for constructing models. Consequently, they do not facilitate rich scientific discourse.
Conversely, contrasts can.
Consider these three scenarios:
- A student assesses whether data aligns with a specific hypothesis.
- A student compares two differing explanations for a phenomenon.
- A student examines two conflicting data sources that represent the same phenomenon.
The first scenario lacks contrasts. Some debate is possible: how close must the data be to count as consistent with the hypothesis? It can also teach essential scientific practices, such as statistical analysis. But simply judging whether data fit a single hypothesis offers limited opportunities for deep scientific reasoning.
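As a minimal sketch of what that first scenario often reduces to (the measurements, the hypothesized value, and the choice of a one-sample t-test are all invented for illustration):

```python
# Hypothetical check: are the measured data consistent with a predicted value?
from scipy import stats

measurements = [9.6, 9.9, 10.3, 9.8, 10.1, 10.4, 9.7]  # made-up lab data
hypothesized_mean = 10.0                                # value the hypothesis predicts

t_stat, p_value = stats.ttest_1samp(measurements, hypothesized_mean)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A large p-value is read as "consistent with the hypothesis";
# the verdict is binary, and there is little left to argue about.
```

The statistics are worth teaching, but the outcome is a yes-or-no verdict rather than a choice among competing explanations.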
Now, reflect on the discovery of Neptune. By the 1830s, astronomers recognized a problem with Uranus: predictions of its orbit based on Newtonian physics did not match its observed positions. Saturn and Jupiter showed no such discrepancy; Uranus did.
Identifying that the observations contradicted the Newtonian predictions was relatively straightforward; the intriguing question was what to do about it. Should Newtonian physics be discarded? Probably not, since it still worked remarkably well elsewhere. Could an undiscovered planet be perturbing Uranus's orbit? Plausible, yet astronomers had been using powerful telescopes for two centuries; how could they have missed it? Measurement error was also a possibility, but Saturn and Jupiter showed no comparable discrepancies.
Here, we enter the realm of contrasting explanations. Ultimately, the correct explanation revealed a previously unidentified planet—Neptune—that accounted for Uranus's orbital anomalies. Arriving at this conclusion required deliberation over potential explanations and selecting the most viable one. Which explanations hold the most merit, and why? What evidence can differentiate one explanation from another? What is feasible to investigate?
The competing explanations for Uranus's orbital irregularities were markedly different from one another. Explanations that are similar yet distinct, however, come closer to the spirit of contrasting cases. Consider the single-celled organism Euglena, which reacts to light. These organisms possess photoreceptors that detect light, and they tend to move towards it. Engineers and educators have developed various tools that let students interact with Euglena in real time.
Without contrasts, one might ask, “Does the Euglena's behavior support the hypothesis that it has an eye?” The evidence might indicate:
- Consistent with the hypothesis
- Inconsistent with the hypothesis
Determining whether the evidence supports this hypothesis is quite simple. If Euglena respond to light or color, it suggests they have an eye; if not, they likely do not.
By contrast, if we pose the question, “Which model of Euglena aligns better with the observed behavior: a one-eye model or a two-eye model?” the evidence can now take on various forms:
- Consistent with both models
- Inconsistent with both models
- Consistent with the one-eye model but not the two-eye model
- Consistent with the two-eye model but not the one-eye model
Both models predict that Euglena will respond to light, yet each offers a slightly different explanation of how this occurs. Determining which experiments or observations could distinguish one model from another epitomizes scientific practice. This process also requires deep engagement with content knowledge—understanding how unicellular organisms navigate their environments.
However, it can be challenging for students to grasp the implications of each model, and for many, contemplation alone is not enough. Teachers must act as facilitators of discussion rather than arbiters of correctness. Well-designed technologies can also make each model's implications clearer. Students might, for example, run a simulation of each model and see how the models map onto the organisms' predicted behaviors, as in the sketch below.
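What might such a simulation look like? Here is a minimal sketch, assuming one simple way of translating each model into behavior: the two-eye agent compares light at two sensors and turns toward the brighter side, while the one-eye agent samples a single sensor over time and turns whenever the light is fading. This is an illustrative toy, not the actual classroom software and not a faithful model of Euglena.

```python
import math
import random

LIGHT = (0.0, 0.0)  # light source at the origin

def intensity(x, y):
    """Light intensity falls off with distance from the source."""
    return 1.0 / (1.0 + (x - LIGHT[0]) ** 2 + (y - LIGHT[1]) ** 2)

def step_two_eye(x, y, heading, speed=0.1, turn=0.3):
    """Two-eye model: compare light at two offset sensors, turn toward the brighter one."""
    left = intensity(x + 0.2 * math.cos(heading + 0.5), y + 0.2 * math.sin(heading + 0.5))
    right = intensity(x + 0.2 * math.cos(heading - 0.5), y + 0.2 * math.sin(heading - 0.5))
    heading += turn if left > right else -turn
    return x + speed * math.cos(heading), y + speed * math.sin(heading), heading

def step_one_eye(x, y, heading, prev, speed=0.1, turn=0.3):
    """One-eye model: a single sensor sampled over time; turn whenever the light is fading."""
    now = intensity(x, y)
    if now < prev:
        heading += turn + random.uniform(-0.1, 0.1)
    return x + speed * math.cos(heading), y + speed * math.sin(heading), heading, now

# Run both agents from the same starting point and compare their paths.
x2, y2, h2 = 3.0, 3.0, 0.0
x1, y1, h1, prev = 3.0, 3.0, 0.0, 0.0
for _ in range(200):
    x2, y2, h2 = step_two_eye(x2, y2, h2)
    x1, y1, h1, prev = step_one_eye(x1, y1, h1, prev)
print("two-eye agent ends near", (round(x2, 2), round(y2, 2)))
print("one-eye agent ends near", (round(x1, 2), round(y1, 2)))
```

In this sketch both agents tend to home in on the light, which is exactly the point: the models make overlapping predictions, and distinguishing them means looking at how each one gets there, not just where it ends up.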
Contrasting models are not the only productive contrasts available. Scientists frequently utilize multiple data sources when observing and exploring potential explanations. When these data sources conflict, they must determine which one better represents reality.
Participants in the citizen science project Eterna faced such contradictions. In the simulation, RNA molecules appeared to fold in one manner, yet actual lab tests suggested a different outcome. Which should be trusted, and why?
Resolving these dilemmas necessitates investigating—and debating—the processes that yield the data. Most volunteers and developers instinctively favored lab results over simulations, as the project's purpose was to enhance existing RNA folding models through open laboratory experiments. However, lab results often rely on indirect measurements of molecular structure, lacking a comprehensive view. These indirect measures can introduce biases that must be considered. Occasionally, the simulation offers the more accurate portrayal.
This contrast spurred discussions: How are structures currently measured in the lab? What methods does the simulation employ to model molecular structures? Under what circumstances should one be favored over the other?
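As a toy illustration of the kind of comparison at stake (this is not Eterna's actual scoring pipeline; the structure, reactivity values, and threshold are invented, and the only assumption borrowed from chemical probing is that unpaired bases tend to be more reactive than paired ones):

```python
def agreement(dot_bracket, reactivities, threshold=0.5):
    """Fraction of positions where a predicted secondary structure agrees with
    an indirect lab signal: '.' marks a predicted unpaired base, and a
    reactivity above the threshold is read as evidence the base is unpaired."""
    matches = 0
    for symbol, r in zip(dot_bracket, reactivities):
        predicted_unpaired = (symbol == ".")
        measured_unpaired = (r > threshold)
        matches += predicted_unpaired == measured_unpaired
    return matches / len(dot_bracket)

structure = "((((....))))"                      # the simulation's predicted fold
lab_signal = [0.1, 0.2, 0.6, 0.3, 0.9, 0.8,     # hypothetical per-base reactivities
              0.7, 0.9, 0.2, 0.1, 0.2, 0.1]
print(agreement(structure, lab_signal))         # most, but not all, positions agree
```

Even this toy version surfaces the real questions: where does the threshold come from, how noisy is the signal, and when a position disagrees, is the prediction wrong or the measurement?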
Such conversations occurred repeatedly as volunteers encountered new contrasts. Why do algorithmic designs fail differently from human designs? Answering that question requires understanding both how the design algorithms work and how human players design molecules. Why do two lab measurements appear to contradict each other? Resolving that requires knowing how each measurement is produced. Contrasts keep students engaged in this kind of reasoning; simply asking whether they got the expected answer does not.
Contrasts are not only for identifying bird species or bicycle designs. They can be employed to foster productive scientific discussions and actively engage students in scientific practices.
Originally published at www.benjaminkeep.com.