This work is distributed under the Creative Commons Attribution 4.0 License.
Teaching Uncertainty: A new framework for communicating unknowns in traditional and virtual field experiences
Abstract. Managing uncertainty is fundamental to geoscience practice, yet geoscience education generally does not incorporate explicit instruction on uncertainty. To the extent that students are exposed to scientific uncertainty, it is through in-person field experiences. Virtual field experiences – which rely on pictures, maps, and previously collected measurements – should therefore explicitly address uncertainty or risk losing this critical aspect of students' experience. In this paper we present a framework for teaching students to assess and communicate their uncertainty, which is grounded in best expert practices for conveying uncertainty and familiar terms of art in geology. The starting point of our framework is the recognition of uncertainty in both geologic data and models, the latter of which we use as an encompassing term to refer to potential geological processes and structures inferred on the basis of incomplete information. We present a concrete application of the framework to geological mapping and discuss how it could enhance student learning in both traditional in-person and virtual experiences. Our framework is extensible in that it can be applied to a variety of geologic features beyond those where uncertainty is traditionally assessed, as well as to other geological subdisciplines.
Withdrawal notice
This preprint has been withdrawn.
-
Interactive discussion
Status: closed
-
CC1: 'Comment on se-2021-69', Bernd Andeweg, 21 May 2021
I just reacted bluntly and quickly on Twitter but was asked to leave my comments here. So, without pretending to have an elaborate statement, here we go:
Including uncertainty is key to field courses in our teaching at Vrije Universiteit Amsterdam. We tried to incorporate some of these aspects in our hybrid online/VU fieldwork: we sampled a lot (100s!) of rocks in the area and students were given random samples from different levels in the formations and members. In this way they saw part of the variety within sequences. In the field we often pay a lot of attention to choosing boundaries wisely, sometimes literally walking back and forth through a (partial) section to narrow down the location of a boundary between units ('here I am sure I am in unit 1, here the doubt comes in, but another 10 m and for sure in unit 2').
Normally we start with measuring in a seemingly constant-dip section, with the students at different locations in the section. Then we gather, compare our measurements, and see the variation. Another thing: we grade the certainty of measurements in the field (rank 1-5). Later, in interpretation, the sure ones cannot be altered or neglected, while rank 5 is treated as: if it fits, fine; if not, the model might still hold. Or just stroll around a bit to search for a better outcrop showing bedding. And if not, rank 5 is all you have.
The most prominent point, however, is that students in the field can show their skills and understanding by determining for themselves where to go and look for that one outcrop that supports (or not!) their hypothesis, staying very flexible with hypotheses that can sometimes change from one outcrop to the next.
Many of these things cannot, in my opinion, be reached in virtual fieldwork, although we all pushed quite some limits in creating the best possible alternatives with a lot of effort. We even included quite some doubts in our explanatory videos accompanying the fieldwork.
My main point: sure, it is very good to pay more explicit attention to uncertainty if you did not already do that in your teaching. Exactly this aspect is closely tied to experience in the field.
Sorry for this rather unstructured 'thinking out loud'. I hope it may help you a bit further.
Citation: https://doi.org/10.5194/se-2021-69-CC1 -
AC3: 'Reply on CC1', Cristina Wilson, 01 Sep 2021
Thank you for taking the time to respond to our submission. It is fascinating to learn of a specific example of how uncertainty might be retained in a virtual field trip (by massive sampling effort). We recognize that our point about students not being taught explicitly about uncertainty is overemphasized in the current manuscript. A better framing of the point we are trying to make is described by Reviewer 2: that there is no formal (systematic) approach to teaching about geoscience uncertainty. In the absence of a formal approach, students can be left to construct their own understanding of how to cope with uncertainty. It is interesting, though, that your program uses a 1-5 uncertainty rating scale similar to the one in our framework. Ultimately we have decided to not revise the manuscript for publication in the special issue, but your comments will be helpful as we continue to refine these ideas and eventually pursue publication elsewhere.
Citation: https://doi.org/10.5194/se-2021-69-AC3
-
RC1: 'Comment on se-2021-69', Clare Bond, 29 Jun 2021
Please find attached a Word document that summarises my thoughts and a PDF of the manuscript with further comments.
I would be very happy to talk to you about my thoughts and comments if that would be helpful.
-
AC1: 'Reply on RC1', Cristina Wilson, 01 Sep 2021
Thank you for your constructive feedback. Ultimately, we agree with many of the concerns raised during review. The uncertainty framework we present in the paper is not specific to virtual field activities, but is also applicable to traditional in-person activities; therefore, we agree that the ideas in this manuscript are constrained by the theme of the special issue. We also recognize a need to demonstrate the effectiveness of our framework in both virtual and in-person activities; field data collected over the 2021 summer season using the uncertainty framework may be helpful in this regard. For these reasons, we have decided to not re-submit to the current special issue, but instead pursue the design work at a professional level to obtain data on student/expert outcomes, and eventually submit to a publication that is a better fit. The review provided will be incredibly helpful as we continue to refine the uncertainty framework and the manuscript.
Citation: https://doi.org/10.5194/se-2021-69-AC1
-
RC2: 'Comment on se-2021-69', Heather Petcovic, 14 Jul 2021
General comments:
This paper introduces a new technique designed to help students explicitly identify and communicate the uncertainty inherent in geological fieldwork. Though created in the context of geologic mapping, the framework could be useful in virtual geological work and in other field-based geoscience disciplines.
The framework has six levels (ranging from no evidence to certain) that are applied to four key properties of an outcrop. One of the things I particularly like about the framework is how it separates data uncertainty from model uncertainty. I agree that there is a high-level distinction between what a geoscientist is immediately observing and what they interpret from their observations. Other major strengths of the paper are the grounding of the framework in research and expert practice, and the ease with which it may potentially be used with students. Helping students to identify, manage, and communicate uncertainty in any type of geoscience research or practice is immensely important, and this technique has the potential to be a major contribution to field-based teaching practice. Lastly, the figures (especially Figure 2) are excellent additions to the paper, which is clearly written and easy to follow.
My biggest issue with the paper is shared by Referee 1 (R1), namely that the expert development and testing of the framework is not clearly explained and the data on which the framework is built are not shared. The paper could greatly benefit from the addition of a table or figure that shows how the framework is applied, ideally using some of the expert data. This will not only help readers to better understand how the framework is put into practice, but will also help to validate the design process. At present, the design of the framework is not replicable or accessible to other researchers because the raw data (quotes, examples of maps or notes, etc.) are not shared. I also have some comments related to the introductory arguments and claims, specified below. Overall, I recommend the paper for publication with the reviewer comments sufficiently addressed.
Specific comments:
Having read R1’s comments, I would agree that the claim about students not being explicitly taught uncertainty is overstated. In a follow-up to the 2009 study referenced in the paper, I was surprised by the range of tactics used by both experts and novices to depict uncertainty during geologic mapping – for example dotted, dashed, and solid lines; heavy and light shading with colored pencils; marking of outcrops on the map; use of question marks and other symbols; and markings in field notes. Perhaps the issue isn’t that students are never taught how to manage uncertainty, but rather that there is no systematic approach to teaching this skill. Anecdotally, the field courses I have worked with each had a set of tactics that they taught – so my sense (not supported by empirical research, since I’m not aware that this has been studied) is that geologists use whatever system they learned from their instructor or mentor. And if they were not taught a system, they made one up.
I’ll also echo R1’s comment that the paper needs to make a distinction between published geologic maps (or “final” maps that a student would submit for a grade) and working field maps and notes. Working maps are by their nature messy, subject to annotation, erasure, and multiple changes of mind. Published maps conform to community standards like the dotted-dashed-solid line notation discussed in the text.
I disagree somewhat with the claim (Lines 38-39) that the field is a student’s first exposure to geologic uncertainty; other situations, such as interpretation of geophysical data or remotely sensed imagery, also require managing uncertainty. Though I do agree that the field is often a student’s first encounter with raw and messy geologic phenomena that do not match the tidy photographs in the textbook or the samples in the lab.
Lines 113-138 would be very well served with a figure or table that shows field examples and how those would be rated using the data and model categories and the uncertainty scale. This would really help readers to understand how the scale is applied in the field.
I really appreciate the argument on lines 147-158 that the framework could help students articulate exactly how they are uncertain. My experience as an instructor is that many students find themselves unable to say exactly how and why they are confused, and simply give up. An instructor could easily use this framework to prompt students to explain where they are stuck. And I agree that the framework could help students guard against getting so set on their geological model or interpretation that they disregard compelling contrary evidence (something that we saw people do in the expert-novice mapping study).
We also found that one of the challenges in working with students (related to lines 182-200) is their reluctance to form models or hypotheses during mapping. Whereas experts made and tested geological interpretations and hypotheses as they collected data, most novices waited until very late in the field exercise to form any interpretations. Could the framework help to address this problem and teach students to be more expert-like in how they approach making and testing hypotheses?
The paper would greatly benefit from further information about the group of experts who used the framework. What was their expertise and other demographic characteristics?
Not only is there a quantitative range of spatial uncertainty for mappable features (lines 231-239) during geological mapping, there is also an element of locational uncertainty. Students especially often struggle to accurately identify their physical location on a map, so it is very possible that they have presumptive or compelling data (or interpretations/models) but have entirely misplaced the location of key geological features. I wonder if there is space in the framework to recognize this additional form of uncertainty (e.g., where am I?).
I found the order of the paper a bit odd. I wonder if the flow might be better if the expert development and testing of the framework was introduced earlier. So, moving sections 4 and 5 of the paper ahead of section 3. This ordering could address my prior comment and one of R1’s that an example of the framework would help readers understand it. Examples from the expert use of the framework could be used to construct this figure/table and its validity could be established if this section (lines 217-240) were moved earlier to when the framework is introduced. The paper could then focus on the applications of the framework to student use both in the field and in virtual instruction (presently section 3).
I am curious to learn more about how challenging it was to norm students to the uncertainty framework scale (Lines 241-249). I am also curious how students compared with experts – I agree with the point that students will likely express a higher degree of uncertainty; what may be presumptive evidence to an expert may be only suggestive to a student. [On the other hand, I know some experts who would never use “certain,” no matter how unmistakable the evidence.]
I was able to access the Sage Hen Pluton teaching materials located on the SERC website. It is not clear to me whether this teaching activity has been empirically evaluated for effectiveness. If not, I suggest softening the conclusions (specifically lines 260-266) to say that these are the potential or proposed benefits to students. As written, it sounds like these benefits are empirically tested.
Technical comments:
There may well be typos in this manuscript, but none leapt out at me, so I have no technical comments.
Citation: https://doi.org/10.5194/se-2021-69-RC2 -
AC2: 'Reply on RC2', Cristina Wilson, 01 Sep 2021
Thank you for your complimentary review. We value hearing that the uncertainty framework is applicable to your experience as an educator and researcher. However, we agree that the paper suffers from a lack of data on the framework’s effectiveness, particularly with regard to norming students and experts to the uncertainty scale. For this reason (and others described in our response to Reviewer 1) we have decided to not revise the manuscript for publication in the current special issue. Our intent is to pursue the design work at a professional level to obtain data on student/expert outcomes, and eventually submit to a publication that is a better fit. Your comments and feedback will greatly benefit us in the process of refining the uncertainty framework and manuscript.
Citation: https://doi.org/10.5194/se-2021-69-AC2
Viewed
HTML | PDF | XML | Total | BibTeX | EndNote
---|---|---|---|---|---
853 | 432 | 62 | 1,347 | 60 | 49