
Responding to incorrect ideas: science graduate teaching assistants’ operationalization of error framing and undergraduate students’ perception

Abstract

Background

In college science laboratory and discussion sections, student-centered active learning strategies have been implemented to improve student learning outcomes and experiences. Research has shown that active learning activities can increase student anxiety if students fear that they could be negatively evaluated by their peers. Error framing (i.e., framing errors as natural and beneficial to learning) is proposed in the literature as a pedagogical tool to reduce student anxiety. However, little research empirically explores how an instructor can operationalize error framing and how error framing is perceived by undergraduate students. To bridge the gap in the literature, we conducted a two-stage study that involved science graduate teaching assistants (GTAs) and undergraduate students. In stage one, we introduced cold calling (i.e., calling on non-volunteering students) and error framing to 12 chemistry and 11 physics GTAs. Cold calling can increase student participation but may increase student anxiety. Error framing has the potential to mitigate student anxiety when paired with cold calling. GTAs were then tasked with rehearsing cold calling paired with error framing in a mixed-reality classroom simulator. We identified GTA statements that aligned with the definition of error framing. In stage two, we selected a few example GTA error framing statements and interviewed 13 undergraduate students about their perceptions of those statements.

Results

In the simulator, all the GTAs rehearsed cold calling multiple times while only a few GTAs made error framing statements. A thematic analysis of GTAs’ error framing statements identified ways of error indication (i.e., explicit and implicit) and framing (i.e., natural, beneficial, and positive acknowledgement). Undergraduate student interviews revealed specific framing and tone that are perceived as increasing or decreasing student comfort in participating in classroom discourse. Both undergraduate students and some GTAs expressed negative opinions toward responses that explicitly indicate student mistakes. Undergraduate students’ perspectives also suggest that error framing should be implemented differently depending on whether errors have already occurred.

Conclusion

Error framing is challenging for science GTAs to implement. GTAs’ operationalizations of error framing in the simulator and undergraduate students’ perceptions contribute to defining and operationalizing error framing for instructional practice. To increase undergraduate student comfort in science classroom discourse, GTAs can use implicit error indication. In response to students’ incorrect answers, GTAs can positively frame students’ specific ideas rather than discussing broadly how errors are natural or beneficial.

Introduction

To improve student learning experiences and outcomes, research-based instructional strategies have been increasingly implemented in college science laboratory and tutorial sections. Graduate teaching assistants (GTAs) are often the ones who teach in such environments. To actively engage students in learning, GTAs are expected to facilitate student-centered activities, such as classroom discussion and group work. Compared to traditional lecture, active learning can increase student engagement as well as perceived affective risk (Conlin & Scherr, 2018; Cooper et al., 2018). For example, students may fear that their ideas will be negatively evaluated by other students or the instructor (Cooper et al., 2017; Cooper et al., 2018). Consequently, students may withdraw from participation to avoid making mistakes. Since error is a natural part of the learning process, instructors need to establish a positive error climate and encourage students to learn from mistakes. Research has demonstrated a positive impact of teachers’ error management behavior on students’ attitudes towards learning from mistakes (Tulis, 2013). GTA professional development should include discussion of strategies for error management.

In this study, chemistry and physics GTAs were tasked with rehearsing cold calling (i.e., calling on non-volunteering students) paired with error framing (i.e., framing errors as natural and beneficial to learning) in a classroom simulator. We examined how the GTAs conceptualized and operationalized error framing. Using interviews with undergraduate students, we investigated undergraduate students’ perspectives on example error framing statements made by GTAs. We conclude with implications for GTA training on error management.

Pedagogical skills: cold call and error framing

Classroom discussion is arguably one of the most common practices instructors use during active learning. However, many instructors report that students are often unwilling to engage in active learning (Michael, 2007). Cold calling is proposed as a strategy to increase student participation in classroom discussion (Dallimore et al., 2012). Cold calling is calling on students to answer a question or participate in a discussion, regardless of whether they have volunteered to do so (e.g., by raising their hand) (Lemov, 2010). Eddy et al. (2015) categorize cold calling as an accountability practice. Frequent cold calling has been shown to increase student comfort in participation and volunteer participation (Dallimore et al., 2012). Random call, a particular version of cold call in which instructors generate a randomized list that determines the order in which students are called on, has been shown to increase gender-based equity in participation (Eddy et al., 2015).
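To make the logistics of random call concrete, the sketch below (not drawn from the cited studies; the roster names and seed are hypothetical) shows one way an instructor might pre-generate a shuffled calling order so that who is called on is independent of who volunteers.

```python
import random

def make_random_call_order(roster, seed=None):
    """Return a shuffled copy of the roster to use as a calling order.

    Shuffling the whole roster (rather than drawing names with replacement)
    means every student is called once before anyone is called a second time.
    """
    rng = random.Random(seed)
    order = list(roster)
    rng.shuffle(order)
    return order

# Hypothetical roster; in practice this would come from the course roll.
roster = ["Maria", "CJ", "Dev", "Ava", "Sean"]
for name in make_random_call_order(roster, seed=2019):
    print(f"Next to call on: {name}")
```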

However, researchers demonstrated in both quantitative and exploratory interview analyses that cold calling may lead to increased student anxiety (Cooper et al., 2018; Downing et al., 2020; England et al., 2017). Cooper et al. (2018), for example, investigated students’ perceptions of how three active learning strategies affected their anxiety: cold call/random call, clicker questions, and group work. Students provided examples of how clicker questions and group work both increased and decreased their anxiety; however, cold call was only described as increasing anxiety. They propose that cold call increases student anxiety through the mechanism “fear of negative evaluation”, the fear of being negatively evaluated that can arise while participating or anticipating participation in a social activity (Watson & Friend, 1969; Weeks et al., 2005). As the Yerkes–Dodson law (Yerkes & Dodson, 1908) suggests, some anxiety may increase performance, but too much anxiety decreases performance [see Cooper et al. (2018) for a thorough discussion of anxiety in college].

Error framing is proposed in the literature as a pedagogical tool to reduce student apprehension (Eddy et al., 2015) and anxiety (Cooper et al., 2018). Error framing involves framing mistakes or misconceptions as natural or useful (Bell & Kozlowski, 2008). An instructor can error frame both at the introduction of an activity as well as during student engagement (Eddy et al., 2015). Examples of error framing include telling students “errors are a positive part of the training process”, “you can learn from mistakes and develop a better understanding” (Bell & Kozlowski, 2008, pg. 17), and “explicitly telling the class that it is OK to answer clicker questions incorrectly, by explaining that an incorrect answer is a common misconception, or by suggesting that he or she understands why students might think an incorrect answer was correct” (Cooper et al., 2018, pg. 10). Research has shown that error framing can decrease anxiety about making mistakes (Bell & Kozlowski, 2008) as well as increase student motivation (Steele-Johnson & Kalinoski, 2014) and improve connections between students and faculty (Cooper et al., 2017).

In addition to error framing, instructors’ response to students’ errors has been shown to impact students’ perceptions of errors as learning opportunities (Steuer & Dresel, 2011; Tulis, 2013). Some researchers have broadly characterized instructors’ responses to student errors as adaptive or maladaptive error management (Spychiger et al., 1998; Tulis, 2013). Adaptive error management includes, for example, discussion with the whole class, emphasizing the learning potential, and impeding negative reactions from peers. Error framing is particularly in line with adaptive error management since it emphasizes the benefits of errors in learning. In contrast, maladaptive error management includes, for instance, ignoring the mistake, criticizing the student, and asking another student to correct the mistake. Instructors’ maladaptive error management is likely to increase students’ fear of failure and tendency to avoid challenges (e.g., Dweck, 1986; Goetz et al., 2006).

We also argue that instructors have a role in helping students “save face” when the instructor has created a social interaction where a student has publicly shared an incorrect response. This is related to the framework of “facework” in sociology and instructional communication (Goffman, 1955), where “face” refers to the self-image one hopes to portray in social situations. Thus, fear of negative evaluation may be one type of “face threat”. Different types of facework have been conceptualized: solidarity, approbation, and tact (Lim & Bowers, 1991). Solidarity refers to strategies that emphasize commonalities and the desire to be included. Approbation refers to actions that avoid expressing doubt or disapproval and strategies that reduce the extent to which others’ perceived competence is challenged. Tact refers to indirect and tentative communication behaviors that respect others’ autonomy.

An instructor can engage in “facework” by using particular strategies and behaviors to protect students’ face. Instructional communication research suggests possible strategies, such as encouraging students’ “ownership of and investment in the class”; “maintaining a safe climate for independent thought and risk taking”; “expressing respectful interest in students’ classroom contributions”; and “focusing on improvement” (Kerssen-Griep, 2001, pg. 265). We see error framing strategies as particularly aligned with maintaining a safe classroom environment for risk taking and expressing respectful interest in students’ contributions. Research in introductory physics has shown that students’ perceptions of their instructor doing “protective facework” is one factor in a complex model predicting student satisfaction with active learning courses (Gaffney & Gaffney, 2016). Gaffney and Housley Gaffney argue that instructors may not perform protective facework through their natural behaviors so “instructors should make additional directed efforts to mitigate threats to face when interacting with their students” (pg. 14).

Graduate teaching assistant professional development

Risks of face threat not only exist in lecture, but also in labs and tutorials. In labs and tutorials, especially in inquiry-based courses, students are expected to work in groups, share ideas, and critically evaluate those ideas. Group work can increase student anxiety when students fear that they will be negatively evaluated by their peers (Conlin & Scherr, 2018; Cooper et al., 2018). To reduce students’ fear of negative evaluation, GTAs could engage in protective facework.

Since GTAs play an important role in undergraduate students’ learning experiences and outcomes, effective GTA professional development (PD) is essential. GTA PD programs exist in a variety of formats and structures, such as weekly preparation meetings, didactic workshops, and pedagogy seminars. However, many of these programs do not facilitate repeated practice, which is required to develop essential teaching skills. Microteaching is a popular and effective technique in K-12 teacher preparation worldwide (Remesh, 2013) and has also been adapted for GTA PD (Etkina, 2010). Microteaching provides teachers with opportunities to practice specific teaching skills and receive feedback. Microteaching often makes use of role play in which a teacher plays the “teacher”, and the other teachers play “students.” A potential drawback of role play is that students’ states of knowledge, reasoning skills, and behaviors may not be accurately characterized. The lack of authentic teaching experience may reduce the effectiveness of professional development (Dawson & Lignugaris/Kraft, 2013).

Mixed-reality classroom simulator

Virtual classroom simulations have been increasingly used to facilitate teachers’ practice of teaching skills. Simulations afford a safe environment for teachers to practice before teaching in real classrooms. Simulated students’ states of knowledge can be set such that they closely resemble real students’ states. Additionally, simulation facilitates iterative practice; simulated students can be reset to an earlier state, allowing the teacher to retry a lesson segment. Therefore, teachers have opportunities to practice specific pedagogical skills deliberately, reflect on their practices, and receive targeted feedback. TeachLivE™ is a mixed-reality classroom simulator, which combines virtual reality with the physical world (Straub et al., 2015). That is, a teacher is present in a physical classroom and faces a large screen, where a simulated classroom and students are displayed. The simulated students are digitally puppeteered by a trained interactor, allowing the practicing teacher to have authentic disciplinary interactions with the students (Geraets et al., 2021). Research has demonstrated a positive impact of rehearsal in the simulator on math and science teachers’ practices in both simulator and real classrooms (Dawson & Lignugaris/Kraft, 2013; Elford et al., 2013; Garland et al., 2012; Straub et al., 2015; Whitten et al., 2013). At the college level, a prior study by our team explored the utility of the simulator in a Learning Assistant (LA) program (Chini et al., 2016). Chini, Straub, and Thomas found that the classroom simulator created a safe and effective environment for LAs to practice a variety of teaching skills. Additionally, in another study our team integrated the simulator in professional development for math GTAs (Saitta et al., 2020).

Current study

As part of a larger project on integrating a classroom simulator in science GTAs’ professional development, we explored science GTAs’ operationalization of error framing in simulated classroom discourse and the potential impact on undergraduate students’ learning experience. We believe that the effectiveness of error framing depends on how it is operationalized. Although the literature has documented the positive impact of error framing on student learning experiences, we are not aware of any studies that investigated how science GTAs conceptualize and operationalize this skill. In order to gain further insights into effective operationalization of error framing, we also investigated undergraduate students’ perception of error framing. We examined specific ways of error framing that may increase or decrease students’ comfort in classroom discourse. We address the following research questions:

  1. How do science GTAs operationalize error framing in simulated classroom discourse?

  2. How are science GTAs’ error framing statements perceived by undergraduate students?

    a. What specific aspects of an error framing statement increase student comfort in participation?

    b. What specific aspects of an error framing statement decrease student comfort in participation?

Methods

Instructional context

The study was conducted in an introductory physics sequence and a general chemistry lab course at a very large, research-intensive university in the southeastern U.S.A. The physics sequence is intended for students who major in life sciences. The first course of the sequence covers Mechanics, and the second focuses on Electricity and Magnetism. Both courses have two components: lecture and a combined tutorial and laboratory session (“mini-studio”; Chini & Pond, 2014). The lecture component is taught by a faculty member, and the mini-studio sessions are taught by GTAs. The typical enrollment for lecture is 250–300 students, and the enrollment for a mini-studio session is up to 32 students. The mini-studio meets every week for a total of 2 h and 50 min, which includes a 75-min tutorial based on the University of Maryland Open Source Tutorials (Elby et al., 2007), a 15-min group quiz, and an 80-min lab based on the Investigative Science Learning Environment (ISLE) curriculum (Etkina & Van Heuvelen, 2007). Both the tutorial and lab make use of a collaborative, guided-inquiry approach. The tutorial focuses on physics concepts, and the lab aims to improve critical thinking. The GTAs are expected to facilitate learning by engaging students in group inquiries.

The general chemistry lab course investigated is a one credit stand-alone course for students who major in science. General chemistry I (a lecture course) is a pre-requisite and General chemistry II (a lecture course) is a pre-requisite or co-requisite for the lab course. Typical enrollments for the lecture courses are 450 students. The lab meets for 2 h and 50 min every week. The typical enrollment for a lab section is up to 24 students. The lab course is intended to foster student inquiry and conceptual discovery. Each lab starts with a 15-min quiz, and the rest of the lab follows a 5E learning cycle approach (Bybee & Landes, 1990). The GTA first demonstrates a phenomenon leading up to a key question (engagement). Students then develop an experimental procedure and collect data to answer the key question (exploration). Toward the end of the lab session, the GTA leads the class in communicating their findings, identifying patterns in data, and summarizing the scientifically accepted understanding (explanation). After the lab session, students read about the topic they discovered in lab (elaboration) and conduct follow-up exercises (evaluation).

Integrating a mixed-reality classroom simulator in GTA professional development

Both the physics and chemistry departments require GTAs to attend professional development activities. All first-year physics GTAs are required to attend a semester-long weekly pedagogy seminar, led by a physics faculty member (J.J.C.). During the seminar, GTAs discuss education literature, reflect on teaching practices, and evaluate research-based reform in the courses. In addition, all the physics GTAs attend a 90-min preparation meeting every week led by an experienced GTA and/or a postdoctoral scholar (C.M.D. and T.W.). During the meetings, GTAs work through the activities in groups. They also discuss common student difficulties with specific topics and strategies to facilitate student learning.

The chemistry GTAs participate in three 3-h inquiry-based workshops across two days before each semester. First-year GTAs are required to attend all three workshops, and experienced GTAs are required to attend the last workshop only. The workshops provide GTAs with opportunities to learn guided inquiry and to read relevant literature. During the semester, GTAs attend 90-min weekly preparation meetings led by a chemistry faculty member (E.K.H.S.). The weekly preparation meetings focus on both content and pedagogy. Additionally, all first-year GTAs attend a two-semester pedagogy seminar led by the same faculty member (E.K.H.S.). During the seminar, GTAs discuss constructivist teaching and learning principles, as well as evidence-based teaching.

In Spring 2019, 12 chemistry and 13 physics (as well as 11 calculus) GTAs participated in a simulator session at the beginning of the semester in addition to the department-level professional development activities. GTAs were asked to sign up for one of 14 time slots based on their availability; each time slot accommodated two to three GTAs. In some time slots, GTAs were all from the same discipline; in others, GTAs were from different disciplines. In this training, the simulated students were five young adults, with a range of personalities, in a physical science laboratory environment. The behaviors of the student avatars are enacted by a trained professional using human-in-the-loop technology to create an authentic and interactive simulation. GTAs wear a portable microphone and their movements are tracked by a motion sensor (see figure on pg. 4 in Geraets et al., 2021, for details).

The simulator activity was facilitated by researchers in discipline-based education. In the simulator, GTAs were tasked with practicing cold calling and error framing in small groups. Each group was given about 5 min to get familiar with the simulated environment (by introducing themselves to the simulated students) before they practiced the pedagogical skills. GTAs then took turns, each leading a 7-min whole-class discussion (round 1). When one GTA was leading the class discussion, the other GTAs in the same group observed. After every GTA in the group finished, they then reflected on their rehearsals and received feedback from the facilitators. GTAs then took turns individually leading another 7-min discussion (round 2), followed by another reflection and feedback session.

Before attending the simulator activity, GTAs received instructions about the activity via email. The instructions included an introduction to the pedagogical skills and a disciplinary activity (physics or chemistry). Error framing was referred to as “normalizing error” in the instructional material, as used in Lemov (2010). The instructional material provided a few examples of normalizing errors and links for videos on strategies to implement normalizing error (see Additional file 1: Appendix for detail). The disciplinary activities were adapted from the curriculum used in the physics mini-studios and the chemistry labs, respectively.

We also prepared material for and discussed expected student-avatar behaviors with the simulator team (i.e., the professional team who orchestrates the simulation to provide a consistent user experience). Specifically, student avatars were expected to make mistakes so that GTAs could practice error framing. We reviewed the literature for common incorrect student ideas about the relevant disciplinary topics and assigned those ideas to student avatars. The assigned incorrect student ideas, along with the introduction of pedagogical skills and the disciplinary activities, were provided to the simulator team. These written activities were developed and tested in a pilot study (Geraets et al., 2021).

Data collection and analysis

GTAs’ rehearsals in the simulator

We video recorded GTAs’ rehearsals of cold calling and error framing in the simulator in Spring 2019. The research participants included 11 (out of 13) physics GTAs and 12 (out of 12) chemistry GTAs, including both U.S. citizens and international students. Before Spring 2019, 10 physics and nine chemistry GTAs had already taught at least one semester of the physics mini-studio or chemistry labs.

We conducted formal analyses on GTAs’ class discussions with avatar students but not on GTAs’ reflections. The feedback and reflection sessions were informal in Spring 2019, which was the first semester we implemented simulator training in science GTA professional development. To determine how frequently GTAs implemented cold calling, we watched the video recordings and conducted content analysis. First, we analyzed three out of 12 groups of GTAs in both disciplines. Each group was analyzed by two or three researchers independently. We identified instances in which GTAs implemented cold calling by counting the number of times a GTA called on a non-volunteering student in each round. We then discussed coding and resolved inconsistencies, such as how many occurrences of cold calling to tally if a GTA asked the same student the same question twice in a row. Seven videos were coded by teams of two or three researchers and another two videos were coded by individuals. On average, each researcher coded five groups of GTAs. Within each team, the researchers conducted analysis independently and then discussed and resolved inconsistencies. The researchers reached agreement on all instances of cold calling among the GTAs from the recordings after discussion.

The analysis of error framing implementations involved two steps: a video analysis and a qualitative analysis of excerpts from GTAs. The video analysis of error framing followed the same process as the analysis of cold calling. We identified 18 instances of GTAs addressing students’ incorrect answers that were potentially aligned with error framing. We did not include instances in which GTAs responded with another question (to lead students to the correct answer) or simply answered “no” (which was very rare). The 18 instances were then transcribed for further analysis of whether they aligned with error framing. Additionally, we identified one instance where the GTA made error-friendly climate statements before class discussion; while this instance was in line with error framing, we did not include it in further analysis due to low incidence.

We then conducted a thematic analysis of the 18 excerpts of GTAs’ responses that were potentially aligned with error framing. The thematic analysis allowed us to refine the definition of error framing based on GTAs’ operationalizations and to identify patterns in the operationalizations. One of the authors (T.W.) reviewed all the excerpts and identified emergent themes. T.W. developed a coding scheme based on the emergent themes. J.J.C. reviewed the coding scheme and modified the code definitions. T.W. coded all the excerpts using the modified coding scheme. To assess interrater reliability, A.A.G. coded the excerpts independently. Cohen’s kappa between T.W. and A.A.G. was 0.72, indicating substantial agreement (Cohen, 1960). T.W. and A.A.G. discussed their coding and resolved disagreements. We report both frequencies of implementation and themes in GTAs’ operationalization of error framing.
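For readers unfamiliar with the statistic, the sketch below illustrates how interrater agreement of this kind can be computed from two coders’ labels; it is a minimal example using scikit-learn’s cohen_kappa_score with made-up labels, not our actual coding data.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical framing codes assigned by two coders to the same excerpts
# (illustrative only; these are not the study's data).
coder_a = ["natural", "beneficial", "positive", "natural", "natural", "positive"]
coder_b = ["natural", "beneficial", "natural", "natural", "natural", "positive"]

# Cohen's kappa adjusts raw percent agreement for the agreement expected by chance;
# values in the 0.61-0.80 range are often interpreted as substantial agreement.
kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa = {kappa:.2f}")
```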

Undergraduate student interviews

During the semester, we recruited 13 undergraduate students enrolled in the target courses to participate in 1-h interviews. We selected a few error-framing statements GTAs made in the simulator and asked undergraduate students what aspects of the statements could increase or decrease their comfort in participating in class discussion. Participants were compensated with $15. The interviews were audio-recorded and transcribed for analysis. To help the participants situate the scenario of error framing in the target courses, two disciplinary versions (physics and chemistry) were used. Three undergraduate interview participants were enrolled in the physics mini-studios associated with two different physics GTAs, and 10 were enrolled in the chemistry labs associated with three chemistry GTAs. In order to accommodate students’ schedules and comfort levels with individual interviews, students were allowed to choose either individual interviews or focus groups. As a result, four chemistry students participated in two focus groups (two per group). In total, we conducted 11 interviews: nine individual interviews and two focus groups.

To investigate undergraduate students’ perceptions of GTAs’ error framing, we analyzed the transcripts of student interviews using thematic analysis. Because the interviews were a mixture of individual interviews and focus groups, we analyzed the data by interview transcript rather than by participant. T.W. reviewed all the interview transcripts and developed a coding scheme based on the themes emergent from the transcripts. J.J.C. reviewed the coding scheme and provided feedback. T.W. then modified the codes and definitions based on the feedback. T.W. and C.M.D. independently coded about half (five out of 11) of the transcripts. They compared the codes and resolved inconsistencies. Then, they independently coded the remaining transcripts and calculated interrater reliability based on those transcripts. T.W. and C.M.D. reached a Cohen’s kappa of 0.71, indicating substantial agreement (Cohen, 1960). T.W. and C.M.D. discussed their coding and resolved disagreements.

Limitations

This study has limitations in its design and data samples. First, in order for GTAs to implement error framing, they needed to be able to identify opportunities for implementation (i.e., moments when students gave incorrect answers). Although very rare, there were instances when GTAs reported that they did not catch any mistakes avatar students made. The facilitator then pointed out specific opportunities when GTAs could have addressed students’ mistakes. Furthermore, GTAs may have experienced performance anxiety in the simulator, although no GTA in this cohort reported this was the case.

Second, undergraduate students’ perspectives were investigated using a written form of GTA statements. However, during classroom discourse, GTAs’ tone of voice, body language, and the prior GTA–student relationship may also play a role in how undergraduate students interpret GTAs’ responses to errors. Moreover, undergraduate students were only presented with GTAs’ responses rather than the dialogue between student avatars and the GTAs, which could have provided more context for students to interpret the GTAs’ statements. Additionally, the undergraduate student interviews were a mixture of individual interviews and focus groups. We acknowledge that group dynamics may impact the interview results. For example, a group member may choose not to express a contrary thought if the other member was perceived to have a strong opinion.

Additionally, we did not explore differences in how GTAs implemented error framing or how undergraduate students perceived error framing across demographic groups. It is likely that characteristics such as gender, race/ethnicity, and nationality impact both how GTAs implement error framing and how undergraduate students perceive error framing statements. This analysis was beyond the scope of this exploratory study, but should be the topic of future work.

Lastly, due to small sample sizes of GTA error framing statements and undergraduate student interviews, the results about GTAs’ operationalization of error framing and undergraduate students’ perception of GTAs’ error framing are not exhaustive. In particular, the results only allowed us to focus on how GTAs respond to students after errors occurred, but not on how GTAs establish an error-positive climate at the beginning of instruction.

Results and discussion

Science GTAs struggle with implementing error framing

The average frequencies of cold calling and error framing implemented by GTAs are shown in Table 1. On average, chemistry GTAs implemented cold calling 8.2 ± 2.9 times in round 1 and 7.7 ± 3.2 times in round 2; physics GTAs implemented cold calling 6.2 ± 2.6 times in round 1 and 7.3 ± 2.4 times in round 2. In contrast, the average frequencies of error framing were very low (less than one time) for both chemistry and physics GTAs in both rounds. Additionally, only about half of the GTAs (six chemistry and four physics) ever implemented error framing, while all the GTAs implemented cold calling several times. In round 1, five GTAs implemented error framing. After receiving feedback, seven GTAs implemented error framing in round 2. The results indicate that GTAs were able to implement cold calling readily, but many of them struggled to implement error framing.

Table 1 The mean (± standard deviation) frequencies of cold calling and error framing implemented by GTAs in each discipline
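As a minimal illustration of how summary statistics like those in Table 1 can be produced from raw tallies (the counts below are hypothetical, not our data), one could compute the mean and standard deviation of per-GTA cold-calling counts as follows.

```python
from statistics import mean, stdev

# Hypothetical per-GTA cold-calling tallies for one discipline in one round.
cold_calls_round1 = [8, 11, 6, 9, 7, 10]

print(f"Cold calling, round 1: {mean(cold_calls_round1):.1f} ± {stdev(cold_calls_round1):.1f} times")
```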

We reviewed GTAs’ reflections with their group and facilitators to gain insight into GTAs’ perception of error framing. All names are pseudonyms. During the reflection and feedback sessions, some GTAs expressed that normalizing error (error framing) is challenging to implement. Below is an excerpt from two chemistry GTAs, GTA A and GTA B, who were in the same group.

GTA A: The normalizing part it’s like the most difficult part of that is, you know, telling them that their answer is incorrect, but not telling them what the correct answer is because you do not just want to directly tell them, you know. So, it’s more like starting a discussion towards that or something.

GTA B: Also, just not telling them, ‘no, let’s move on’. You have to address that and then, because, I don’t know like y’ all’s undergrad background or whatever, but that was just like ‘no, that’s not the right answer, let’s just keep going.’ It’s like when, I have always like been taught that way, it’s harder to transition to something different.

GTA A: Yeah you don’t want to blow dart their answer off, but you want to address it. ‘Well it’s good that you are bringing this up but let’s talk about why it might be different’ you know.

Both GTAs realized it is important to show respect for students’ ideas and to build on those ideas. For GTA A, the challenging part is building on students’ ideas and providing scaffolding, as she felt that one should not just give students the correct answers. For GTA B, his prior experience as an undergraduate student had a negative impact on his teaching practices. When he was an undergraduate student, he was told his idea was incorrect without his idea being discussed. Since GTA B had never seen his own instructors modeling error framing and building on students’ ideas, he found it challenging to adopt this teaching practice. The excerpt from GTA A and GTA B sheds light on what science GTAs find challenging in operationalizing error framing and why.

Science GTAs’ operationalization of error framing

We identified two themes (error indication and framing) from GTAs’ excerpts in response to students’ incorrect answers (see Table 2). The two themes were distinct but not mutually exclusive. That is, each error framing statement was examined from two perspectives: how the error is indicated and what framing is used. Examples are provided in Table 2, and the codes are discussed below.

Table 2 Codes and themes emergent from GTAs’ excerpts of error framing statements

The error indication theme included two codes: explicit and implicit. If a GTA directly commented that the answer is not correct using words such as “incorrect” or “misconception”, we coded it as explicit error indication. In contrast, if a GTA avoided a direct statement that the answer is incorrect, we coded it as implicit error indication.

The framing theme included three codes: natural, beneficial, and positive acknowledgement. An excerpt was coded natural if the GTA framed errors as natural or common. It was coded beneficial if the GTA framed errors as useful for learning. If the GTA used positive framing to acknowledge a student contribution but did not elaborate on why the contribution was valuable, it was coded positive acknowledgement. If none of the framing codes were assigned, the response was not considered an error framing statement.

During coding, we determined that 16 of the 18 excerpts we had identified as potentially aligned with error framing were consistent with error framing. In the other two excerpts, GTAs let students know that the answer was not quite correct or that some changes needed to be made to the answer. Those two instances did not fit into any of the three framing codes (natural, beneficial, and positive acknowledgement), and therefore were not considered error framing.

Avoiding explicit error indication

About half of the GTAs who implemented error framing used implicit error indication. Instead of saying “that is not correct”, some GTAs avoided using a direct statement to point out students’ mistakes. For example, GTA A responded to a student avatar’s mistake by stating “So I think that’s where some people might get tripped up where you think if it lights up then it must be ionic just because the ionic ones all seem to light up. So, we can’t interchange those two things.” By saying “I think that’s where some people might get tripped up …”, GTA A implied that the answer was not correct. Moreover, by using “some people”, GTA A was able to avoid putting the student in the spotlight because not only that student but also others would think “if it lights up then it must be ionic.” GTA A’s response aligned well with approbation facework items in the facework instrument by Kerssen-Griep et al. (2003) as she worked to “avoid making students look bad” and made sure that she “doesn’t cast students in a negative light” (pg. 381).

Similarly, during the feedback session, chemistry GTA C pointed out that he did not feel comfortable being explicit that students’ answers were incorrect. When a facilitator provided suggestions for how GTA C could express how he understands the student’s point of view but also point out what is missing in the student’s reasoning, GTA C responded: “You know it’s funny that you mention that. Maybe I am wrong here, I don’t like to be explicit with that sort of thing. That’s why I tried to take what CJ [one of the student avatars] was saying and bring it to the rest of the team and tie it back. Saying like ‘I know where you are coming from, this is what’s happening, do you see why you would think that?’ … I don’t like to be explicit about it. Maybe I should be.” Later, GTA C explained that he did not want to be explicit because he did not want to single out the student as being wrong. “Yeah it’s not about singling anybody out as being wrong but rather where our misconceptions come from.” GTA C’s concern about singling out CJ as being wrong indicates that he cares about students’ feelings. By using implicit error indication, he avoided making CJ look bad. This suggests that GTA C was engaged in solidarity facework, which attends to the desire to be included. Instead of focusing on the fact CJ made a mistake, he intended to build on CJ’s idea and discuss it with the whole group. This action of discussing with the whole group or class was also documented in the teacher education literature (Tulis, 2013) as an adaptive error management behavior.

Framing errors as natural or beneficial

Eight out of 10 GTAs who implemented error framing framed students’ incorrect ideas as natural and common. For example, physics GTA D responded “So that’s kind of like what’s inside of our intuition, right? We see something that is moving upwards and we naturally are going to think that there’s a force acting onto it. But like Maria [a student avatar] said, there is also another force, that gravity force that is acting on it.” GTA D framed the idea as intuitive and natural. Additionally, by using the word “we”, GTA D implied that the idea was common. This verbal action of using “we” can be considered a way of engaging in solidarity facework, which emphasizes the commonality among students as well as commonality between the GTA and students. Similarly, physics GTA E stated, “What you were thinking, CJ, it kind of makes sense that you would think he would not move at all, right, but he is already moving with a certain velocity to begin with.” By saying “it kind of makes sense”, GTA E acknowledged that the student’s idea is natural and also implied that he could see why CJ would think that way. This can be considered as having elements of both approbation and solidarity facework. Praising a student’s idea is one way of doing approbation facework, and the implication of understanding why a student would think that way reinforces the commonality between the GTA and student, which is solidarity facework.

A couple of GTAs framed students’ incorrect ideas as beneficial for learning. Chemistry GTA F, for example, framed the lab as a safe place for risk taking and learning by stating, “In that sense, not necessarily but because this is a lab, we do want you to be comfortable making errors so we can go ahead and try to explain that to kind of iron some of those out.” The framing of errors as beneficial aligns with one of the adaptive error management behaviors discussed in Tulis (2013), “emphasizing the learning potential.”

Framing student contributions with positive acknowledgement

Three GTAs used positive framing to give value to a student’s contribution, but did not elaborate on why the contribution was valuable. GTA C, for example, stated “I am glad you brought that up because it actually behaves like one but technically isn’t one.” By saying “I am glad you brought that up”, GTA C was engaged in protective facework. By his positive affect, GTA C suggested a respectful interest in the student’s contribution. Similarly, GTA B stated, “That’s a very good point that you brought up, but it’s not one hundred percent correct so …” GTA B praised the student for making a good point, and also pointed out the answer was not completely correct.

Both GTA B and GTA C gave value to students’ contributions, but they did not make explicit why the contribution was valuable. Initially, we did not think such positive framing aligned with the definition of error framing, which is to frame errors as natural and beneficial. However, results from the undergraduate student interviews suggest that students strongly favor positive framing of student contributions, even if the response does not explicitly frame errors as natural and beneficial (as will be discussed in the next subsection). Moreover, the literature (Gaffney & Gaffney, 2016; Tulis, 2013) also suggests this behavior can positively impact student learning experiences. For example, positively highlighting students’ contributions was documented in Tulis (2013) as an adaptive error management behavior and was found to be associated with students’ positive affective reactions. Additionally, positive framing of student contributions aligns with protective, approbation facework because it highlights the student’s contribution. Protective facework has been shown to be positively correlated with student satisfaction (Gaffney & Gaffney, 2016). Therefore, we argue that positive framing of a student contribution is an implicit form of error framing, which can help reduce student anxiety and maintain a safe learning environment.

Undergraduate students’ perception of science GTAs’ error framing statements

The themes emergent from GTAs’ responses to students’ incorrect answers informed the undergraduate student interviews. We selected a few GTA excerpts (and made minor modifications for clarity) that involved a mixture of explicit and implicit error indication and/or a mixture of beneficial, natural, and positive acknowledgement framing. This allowed us to examine how, if at all, these categories of error framing statements would be perceived differently by undergraduate students. In the physics version of undergraduate student interviews, all the statements used natural framing because we only identified natural framing used by physics GTAs in the simulator. We were cautious not to alter GTAs’ wording because we were not sure to what extent an alteration might influence undergraduate students’ interpretation. The interview protocols and the research team’s coding of the error framing statements are shown in Fig. 1.

Fig. 1
figure 1

Chemistry and Physics versions of the interview protocol. The coding of GTA statements is shown in square brackets. Note that the coding was not shown to students during interviews

We identified two themes from undergraduate students’ interviews: framing and tone (see Table 3). The theme of framing describes specific ways of framing perceived by students that can increase or decrease their comfort. It is important to note that codes from the theme of framing for undergraduate students’ interviews are different from those generated by the research team for GTAs’ excerpts. Since undergraduate students were not introduced to error framing, we do not expect their perceptions of the exemplar excerpts to align directly with the definition of error framing. The other theme concerns tone perceived by students that can increase or decrease their comfort. We first report specific ways of framing and tone perceived by students that can increase or decrease their comfort (regardless of the themes in GTAs’ operationalization of error framing). The unit of analysis was the interview rather than the individual student, given that our data consist of a mixture of individual interviews and focus groups. We then discuss the relationship between themes in GTAs’ operationalization of error framing as described by the research team and student comfort.

Table 3 Themes and codes of undergraduate students’ perception of GTAs’ error framing statements and impact on student comfort

Specific ways of framing and tone that increase student comfort

Acknowledge idea as natural and sensible

Students expressed that their comfort would increase if GTAs acknowledged that their idea is sensible or GTAs understand why students would think in a particular way. Student A, for example, stated “That’s what I’m saying, it’s all it’s understandable. I guess my problem is I’ll like, people understanding where my thinking is coming from, more than it is whether I’m right or wrong. I’m glad you like, yeah, I get that [inaudible] I see where you’re thinking where that kind of thinking is coming from, if that makes sense.” Later, student A continued, “If someone’s all like, ‘it makes sense how you think that way.’ ‘It makes sense,’ they’re justifying your thought process. And then they’re like, ‘but here’s why it’s fine’ so like, yeah yeah. And like, I see we’re on the same page here why I would think this way. Let’s go ahead and fix that together. Take my hand. You know?” Student A emphasized that it is particularly appreciated when others understand why they think in a particular way. This understanding can help create a supportive and collaborative learning environment as student A stated “I see we’re on the same page here why I would think this way. Let’s go ahead and fix that together. Take my hand.” Another chemistry student (student B) stated, “So, I said the third one was the best I think, because you’re kind of like coming down to their level, ‘so it kind of makes sense that you would think it is ionic,’ so you’re kind of agreeing with them in a way. You’re saying, ‘okay, I understand that with you, I understand that’, but then you say ‘but’ so then you can explain it to them. I think that’s just a really kind way to go about that because, like I said, you’re kind of just going on their level and just talking to them like normally.” Student B also emphasized that the GTA is “coming down to their level” and praised the GTA for being kind. In line with findings from prior studies, GTAs are often perceived by undergraduate students as more relatable and understanding compared to faculty (Kendall & Schussler 2012; Kendall & Schussler 2013), and students’ respect for GTAs tends to increase over the semester (Kendall & Schussler 2013). GTAs can show their understanding by “coming down to students’ level” and acknowledging students’ ideas as natural and sensible.

Acknowledge sensemaking effort

Students reported that their comfort would increase if GTAs acknowledged their effort in sensemaking about the subject matter. Unlike the code “acknowledge idea as natural and sensible”, this code emphasizes an effort being made by students rather than the idea being sensible. For example, a physics student (student C) stated that they liked statement 2 because “It’s acknowledging that you tried to make sense of a question and tried to work your way through it.” Similarly, a chemistry student (student A) stated “Here’s why I like number three, because I think it does the opposite of giving credit where credit’s due. It gives credit where it’s not due. And as far as trying to understand something, even if someone is wrong and you give them credit, I feel like they’re more open to taking criticism.” Both students appreciated the recognition of the effort for trying to understand the subject matter even if their answers were incorrect. Student A further explained that such recognition may help people become more open to criticism since effort rather than the outcome is emphasized.

Acknowledge idea as common

Students reported acknowledging (incorrect) ideas as common can increase their comfort. For example, a physics student (student C) favored statement 3 because “They said, ‘That’s okay, that’s a common misconception.’ Again, just so that they know that most people would think that way and it’s not just them getting it wrong.” By framing the incorrect idea as common, students do not feel singled out as being wrong, which can increase student comfort.

Acknowledge a learning opportunity

Students reported that when a GTA pointed out the error creates a learning opportunity for everyone, their comfort level increases. For example, a chemistry student (student D) stated that they liked statement 1 “Because he expresses that he’s happy that the error was made so that you can explain it later and be like, ‘Okay, so it behaves like one but it’s really not.’ And the way that … and then continues to explain it. It makes me feel better making that error because he says, ‘I’m happy that you made that error so that I can explain it to everyone later’”.

Provide explanation to subject matter

Students reported that their comfort increases if the GTA explains why the student’s idea is incorrect instead of just stating it is incorrect. For example, a chemistry student (student E) stated “I mean probably the third one just ‘cause like it comes with an explanation. So, at least I know why it’s wrong.” We agree that it is important for students to know why their answer is incorrect, but we do not recommend that GTAs explain without first providing scaffolding to help students critically examine their own ideas. GTAs should engage in practices that align with the goals of inquiry-based learning rather than using “teaching by telling”.

Positive and conversational

Students reported that their comfort increases when they perceive that the GTA uses a positive tone and engages students in a back-and-forth discussion. A chemistry student (student F) favored statement 1 “Because it’s rewarding you for speaking and being a part of the conversation, while also telling you that you’re wrong.” Similarly, another chemistry student (student G) also favored statement 1 because it “Makes it more positive and makes it into a learning experience or whatever.”

Specific ways of framing and tone that decrease student comfort

Start with a direct comment of idea being incorrect

Students reported that their comfort decreases when a GTA starts with a direct comment that the answer is incorrect. For example, a chemistry student (student D) stated “For TA two, I didn’t like the fact that he goes … it’s just straight ‘not necessarily,’ period. Because I know that would feel like, ‘oh, my God,’ for me. I made a mistake.” A direct error indication at the very beginning of the response triggered a negative emotion in student D and may have led student D to perceive making errors as unacceptable. Similarly, another chemistry student (student A) stated, “Okay not to act like a child, because that’s how I feel like I’m about to be acting like. But dang, number two, he’s all like, straight shutdown, ‘not necessarily.’ You’re like, oh!” Student A expressed frustration towards the GTA’s response and possibly felt embarrassed by their discomfort with the response.

Focus on the error

Students reported that their comfort decreases when they perceive that a GTA focuses too much on the fact that a student made an error. A chemistry student (student E), for example, stated that they disliked statement 2. “To me it’s kind of condescending, just because like, we’d already talked about that in the beginning of the lab, like it’s okay we were wrong, so it doesn’t really matter. So if I’m wrong, I don’t need another explanation like it’s okay to be wrong, like I’m not a five year old …” Later, student E continued, “Cause it’s like if you are responding like you already know that I am wrong, like you don’t have to be like ‘no it’s okay to be wrong,’ like it’s kind of like ‘okay, I’m not two, I got you, you know?’ And then it’s like you are taking more time to go out of your way to tell me that I’m wrong instead of explaining the actual right answer, like the third one does.” Student E pointed out that a positive error climate was already established at the beginning of the lab. Responding with “it’s okay to be wrong” after students answered incorrectly puts too much focus on the fact that the students made an error. Similarly, another chemistry student (student H) stated “TA two, I did not like the response. I kind of just felt like it was very derogatory, like, ‘Oh, we do want you to be comfortable making errors.’ I don’t know. I guess it was too much focus on the fact that they got it wrong.” It seems both students would feel negatively evaluated and “lose face” as they used words like “condescending” and “derogatory.”

Use hedging

Students reported that their comfort decreases when a GTA uses hedging in the response. A chemistry student (student H) stated “I guess the wording of, ‘It kind of makes sense,’ I just feel like you’re enunciating the fact that it didn’t make sense. Like, ‘It might kind of make sense a little bit, but it really doesn’t make any sense.’ It’s like, ‘Okay, thanks for telling me it didn’t make any sense.’” The use of “kind of” makes the student perceive that their thinking is not validated. It is worth mentioning that the GTA used the phrase “it kind of makes sense” to avoid explicitly pointing out the answer was incorrect, intending to protect the student’s face. However, undergraduate students seem to prefer an even more positive response from the GTA without the phrase “kind of.”

Use negative words

One student reported that their comfort decreases when a GTA uses negative words when referring to students’ ideas. A physics student (student C) stated that referring to a student’s incorrect idea as a “misconception” can upset some people. “I don’t want to say it’s … I don’t think I’d be too bothered by it, but someone could feel like using a word like ‘misconception’ would be kind of condescending maybe. It’s just a certain type … I just know certain people who would find that a bit upsetting but I don’t know how to explain it.” Although only one student reported it, we believe this theme is generalizable. When students perceive that their GTA uses negative words to describe their ideas, they may feel that their competence is being challenged, which is a face threat to students.

Formal and unconcerned

Students reported that their comfort decreases when a GTA addresses students formally without affirmation. For example, a chemistry student (student G) stated “the second one just sounds robotic” because “you’re supposed to say that, so you do.” Student G agreed with the idea that “you should be comfortable making errors”, as they acknowledged later in the interview that “conceptually that’s good.” However, student G perceived that the GTA was just saying what they are supposed to say without showing a genuine interest in the student’s contribution or feelings.

Another chemistry student (student B) stated “…then they go into the fact that it’s a lab and I feel like that’s kind of awkward. ‘We do want you to be comfortable making errors so we can go ahead and try to explain that.’ It kind of sounds like they are trying to be very formal with it. I think that’s a big thing. If you’re addressing a student very formally, I know it’s the way it’s supposed to be done, but like in terms of just making the student feel a certain way, that makes a student feel like, ‘Oh, I’m just a student, what do I know?’ Instead of just talking to them like a normal person. So, yeah, that’s why I don’t like the second one.” Student B perceived that the GTA is “addressing a student very formally” rather than “talking to them like a normal person.” This seems to reinforce the image of a GTA as an authority figure who does not show affirmation of students’ contributions. Instead of being addressed formally, students prefer that a GTA engages them in a conversation and shows a more positive attitude towards their contributions.

Relationship between themes in GTAs’ operationalization of error framing and student comfort

To gain insight into the relationship between themes in GTAs’ operationalization of error framing and student comfort, we examined the favored and unfavored reasons cited by undergraduate students during interviews for each GTA statement. The number of interviews during which a reason was cited is shown in Table 4. Due to the small number of exemplar GTA responses, we do not claim that one category of GTAs’ operationalization of error framing is perceived better than another. Instead, we describe which categories of error framing statements tend to be perceived as increasing or decreasing student comfort.

Table 4 Number of interviews (out of the total number of interviews for each discipline) during which each reason was cited as favored or unfavored toward each exemplar GTA response

Explicit error indication decreases undergraduate student comfort

Among all reasons reported (favored and unfavored), two are related to error indication: starting the response with a direct comment that the student’s answer was incorrect, and using negative words (e.g., misconception) to refer to students’ incorrect answers. These reasons were cited to explain why students did not like a GTA response or why the response decreased their comfort, and they were cited only in reaction to statements with explicit error indication. This suggests that explicit error indication decreases undergraduate student comfort.

Undergraduate students’ perceptions of explicit error indication appear to be consistent with some GTAs’ preferences in operationalizing error framing. As mentioned previously, some GTAs intended to avoid explicit error indication so that they could protect students’ face and focus on learning from the mistakes. Both GTAs and undergraduate students expressed negative opinions toward explicit error indication. Therefore, we suggest that science GTAs indicate students’ errors implicitly when they implement error framing.

Positive acknowledgement of student contribution increases undergraduate student comfort

The only response that used positive acknowledgement framing appeared to be the undergraduate students’ favorite, as it received the most favored reasons and none of the unfavored reasons. The most frequently cited favored reason for this response (four out of eight interviews) is that it acknowledges student ideas as natural and sensible. For example, a chemistry student (student I) stated “I guess TA 1 would recognize that it does make sense that you do think like that because of this rather than the TA3, they just said ‘Yeah it makes sense but.’ TA1 explains kind of why [‘because it actually behaves like one’] you would think like that.” This perception from undergraduate students aligns with the definition of error framing in the literature, in which errors are framed as natural in the learning process. It is worth mentioning that we, as researchers (and professional development facilitators), coded the statement of chemistry TA1 (i.e., “Okay, I am glad you brought that up because it actually behaves like one but technically isn’t one”) as “positive acknowledgement” rather than “natural” (or “beneficial”) because the GTA did not explicitly elaborate on why the student contribution is valuable. From the perspective of professional development facilitators, we would like GTAs to use the framing explicitly so that (ideally) all students receive the message. The results suggest that some students were able to receive the message even without explicit framing.

The phrase “I am glad you brought that up” was perceived by students as acknowledging a learning opportunity. For example, student D stated that “It makes me feel better making that error because he says, ‘I’m happy that you made that error so that I can explain it to everyone later.’” While we as researchers did not code this statement as “beneficial,” some students perceived it as “acknowledging a learning opportunity.” The phrase was also perceived as positive and conversational; student F, for instance, stated “it’s rewarding you for speaking and being a part of the conversation.”

As discussed previously, although this statement does not explicitly frame errors as natural or beneficial, it does value the student’s contribution, and students perceived it as increasing their comfort. Therefore, we encourage science GTAs to use positive acknowledgement of students’ contributions in response to students’ incorrect answers.

Responding to errors with error encouragement decreases student comfort

Notably, statement C2 was cited most frequently as decreasing student comfort, for a variety of reasons. Although it was the only exemplar statement that framed errors as beneficial, we do not believe this was why students disfavored C2, as students expressed appreciation for the part of the statement “we can go ahead and try to explain that.” Rather, students seemed uncomfortable with the part “because this is a lab, we want you to feel comfortable making errors,” even though this part of the statement was intended to encourage students to feel comfortable making errors.

As discussed previously, students recognized that it is conceptually good to encourage students to feel comfortable making errors. However, because an error had already been made, this error encouragement made students perceive that the GTA was focusing on the fact that an error had occurred. As student E mentioned, “we’d already talked about that in the beginning of the lab, like it’s okay we were wrong, so it doesn’t really matter. So if I’m wrong, I don’t need another explanation like it’s okay to be wrong.” This suggests that error framing should be implemented differently depending on whether errors have already occurred. Error encouragement can be helpful for establishing a positive error climate at the beginning of instruction, with the whole class listening. When errors have already occurred, GTAs should avoid broadly discussing how errors are valuable for learning; instead, they can respond to a student’s specific idea and frame it as natural and beneficial, or use positive acknowledgement to value the student’s contribution.

Conclusions and implications

This paper demonstrates how science GTAs operationalize error framing, a pedagogical skill that many GTAs struggle to implement, as well as how GTAs’ error framing statements are perceived by undergraduate students. During the simulator rehearsal session, some GTAs framed students’ errors as natural or beneficial to learning, which aligns with the definition of error framing in the literature. Others used implicit (rather than explicit) indication of students’ errors and/or positively acknowledged students’ contributions without elaborating on why or how the contributions are valuable. Both implicit error indication and positive acknowledgement of student contributions are intended to mitigate potential face threat; therefore, both are in line with protective facework and adaptive error management. Undergraduate students reported that statements using explicit error indication decreased their comfort, whereas statements using positive acknowledgement increased their comfort in participating in classroom discourse.

These results suggest that the definition of error framing should not be limited to framing errors as natural or beneficial. Implicit error indication and positive acknowledgement are also effective ways to implement error framing in undergraduate STEM courses. When paired with cold calling during class discussion, error framing with implicit error indication may be more effective in reducing student anxiety than explicit error indication. Implicit error indication and positive acknowledgement can be considered specific ways of doing approbation facework, while explicit error indication may be perceived as a threat to students’ competence face. In addition to pairing error framing with cold calling, we argue that it would be even more effective for GTAs to explicitly discuss, at the beginning of instruction, how errors are part of the learning process and useful for learning. This would help establish a positive error climate.

Results from undergraduate student interviews provide specific suggestions for responding to students’ incorrect answers. For example, students’ comfort can increase if GTAs acknowledge students’ sensemaking efforts or acknowledge that students’ ideas are sensible. In contrast, undergraduate students’ comfort can decrease if GTAs, for example, start the response with a direct comment that the student’s answer is incorrect. This is consistent with some GTAs’ preference for implicit error indication. Additionally, undergraduate students pay attention to the tone of the response; their comfort increases when they perceive the GTA to be positive and conversational, and it decreases when they perceive the GTA to be formal and unconcerned.

Interestingly, we found that the way error framing is implemented should depend on whether errors have already occurred. Undergraduate students overwhelmingly reported that their comfort would decrease if the GTA encouraged students to feel comfortable making mistakes after a student had already answered incorrectly. Some undergraduate students even felt that such a response sounded “condescending” or “derogatory” because they perceived the GTA as focusing on the fact that the student had made a mistake. We suggest that GTAs encourage students to feel comfortable making and learning from mistakes at the beginning of instruction, which can help establish a positive error climate. When errors have already occurred, GTAs can respond to the student’s specific idea and frame it as natural and beneficial, or use positive acknowledgement to value the student’s contribution.

Findings of this study have implications for science GTA professional development. Not only should GTAs be introduced to a variety of error framing techniques, but they should also be made aware of undergraduate students’ perceptions of specific ways of operationalizing error framing. Being aware of how students may receive the messages can help GTAs effectively compose and tailor the messages they send. For example, GTAs should consider using implicit error indication in response to students’ incorrect ideas, especially after cold calling, to reduce student anxiety. When operationalizing error framing, GTAs can use the natural or beneficial framing, or acknowledge that students’ contributions are valuable. Specifically, GTAs can praise student effort, acknowledge student ideas as sensible or common, and emphasize the learning opportunity. When errors have already occurred, GTAs should avoid discussing broadly how errors are beneficial and instead frame their responses around students’ specific ideas. GTAs should also be careful about word choice and avoid hedging and negative words.

Future research should explore how error framing is implemented in STEM classrooms and how students perceive the practice. Specifically, it is worth investigating how STEM instructors establish an error-positive climate at the beginning of instruction, as well as how they respond to errors. We suggest researchers take into account not only verbal statements but also instructors’ tone of voice and body language. Classroom observations and undergraduate student surveys can be used to obtain quantitative results, which would allow for identifying correlations between variables. For example, how often does error framing need to be implemented to have a measurable impact on students’ learning experience? Is it more effective to implement a variety of error framing techniques than to use the same technique repeatedly? Answers to these questions would provide insights into developing effective strategies to support STEM instructors and GTAs in implementing error framing.

Availability of data and materials

The human subject data used for this study are not openly available due to Institutional Review Board protocol. Parties with reasonable research interests may contact JJC to request potential access.

Abbreviations

GTA: Graduate teaching assistant

PD: Professional development

LA: Learning assistant

ISLE: Investigative Science Learning Environment



Acknowledgements

Not applicable.

Funding

This work was funded by National Science Foundation Division of Undergraduate Education Grant No. 1725554.

Author information


Contributions

JJC and EKHS conceived of and designed the overarching research study. All authors conducted simulator training sessions. TW, CMD, and AAG conducted the course observations and student interviews. All authors participated in data analysis and contributed to the writing of the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Jacquelyn J. Chini.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Graduate Student Guide to TeachLivE Teaching Simulator.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


About this article


Cite this article

Wan, T., Doty, C.M., Geraets, A.A. et al. Responding to incorrect ideas: science graduate teaching assistants’ operationalization of error framing and undergraduate students’ perception. IJ STEM Ed 10, 5 (2023). https://doi.org/10.1186/s40594-023-00398-8


Keywords