
Instructor strategies to aid implementation of active learning: a systematic literature review

Abstract

Background

Despite the evidence supporting the effectiveness of active learning in undergraduate STEM courses, the adoption of active learning has been slow. One barrier to adoption is instructors’ concerns about students’ affective and behavioral responses to active learning, especially student resistance. Numerous education researchers have documented their use of active learning in STEM classrooms. However, no research yet systematically analyzes these studies for strategies to aid implementation of active learning and address students’ affective and behavioral responses. In this paper, we conduct a systematic literature review and identify 29 journal articles and conference papers that researched active learning and affective and behavioral student responses and recommended at least one strategy for implementing active learning. We ask: (1) What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies? (2) What instructor strategies to aid implementation of active learning do the authors of these studies provide?

Results

In our review, we noted that most active learning activities involved in-class problem solving within a traditional lecture-based course (N = 21). We found mostly positive affective and behavioral outcomes for students’ self-reports of learning, participation in the activities, and course satisfaction (N = 23). From our analysis of the 29 studies, we identified eight strategies to aid implementation of active learning, organized into three categories. Explanation strategies included providing students with clarifications and reasons for using active learning. Facilitation strategies entailed working with students and ensuring that the activity functions as intended. Planning strategies involved working outside of class to improve the active learning experience.

Conclusion

To increase the adoption of active learning and address students’ responses to active learning, this study provides strategies to support instructors. The eight strategies are listed with evidence from numerous studies within our review on affective and behavioral responses to active learning. Future work should examine instructor strategies and their connection with other affective outcomes, such as identity, interests, and emotions.

Introduction

Prior reviews have established the effectiveness of active learning in undergraduate science, technology, engineering, and math (STEM) courses (e.g., Freeman et al., 2014; Lund & Stains, 2015; Theobald et al., 2020). In this review, we define active learning as classroom-based activities designed to engage students in their learning through answering questions, solving problems, discussing content, or teaching others, individually or in groups (Prince & Felder, 2007; Smith, Sheppard, Johnson, & Johnson, 2005), and this definition is inclusive of research-based instructional strategies (RBIS, e.g., Dancy, Henderson, & Turpen, 2016) and evidence-based instructional practices (EBIPs, e.g., Stains & Vickrey, 2017). Past studies show that students perceive active learning as benefitting their learning (Machemer & Crawford, 2007; Patrick, Howell, & Wischusen, 2016) and increasing their self-efficacy (Stump, Husman, & Corby, 2014). Furthermore, the use of active learning in STEM fields has been linked to improvements in student retention and learning, particularly among students from some underrepresented groups (Chi & Wylie, 2014; Freeman et al., 2014; Prince, 2004).

Despite the overwhelming evidence in support of active learning (e.g., Freeman et al., 2014), prior research has found that traditional teaching methods such as lecturing are still the dominant mode of instruction in undergraduate STEM courses, and low adoption rates of active learning in undergraduate STEM courses remain a problem (Hora & Ferrare, 2013; Stains et al., 2018). There are several reasons for these low adoption rates. Some instructors feel unconvinced that the effort required to implement active learning is worthwhile, and as many as 75% of instructors who have attempted specific types of active learning abandon the practice altogether (Froyd, Borrego, Cutler, Henderson, & Prince, 2013).

When asked directly about the barriers to adopting active learning, instructors cite a common set of concerns including the lack of preparation or class time (Finelli, Daly, & Richardson, 2014; Froyd et al., 2013; Henderson & Dancy, 2007). Among these concerns, student resistance to active learning is a potential explanation for the low rates of instructor persistence with active learning, and this negative response to active learning has gained increased attention from the academic community (e.g., Owens et al., 2020). Of course, students can exhibit both positive and negative responses to active learning (Carlson & Winquist, 2011; Henderson, Khan, & Dancy, 2018; Oakley, Hanna, Kuzmyn, & Felder, 2007), but due to the barrier student resistance can present to instructors, we focus here on negative student responses. Student resistance to active learning may manifest, for example, as lack of student participation and engagement with in-class activities, declining attendance, or poor course evaluations and enrollments (Tolman, Kremling, & Tagg, 2016; Winkler & Rybnikova, 2019).

We define student resistance to active learning (SRAL) as a negative affective or behavioral student response to active learning (DeMonbrun et al., 2017; Weimer, 2002; Winkler & Rybnikova, 2019). The affective domain, as it relates to active learning, encompasses not only student satisfaction and perceptions of learning but also motivation-related constructs such as value, self-efficacy, and belonging. The behavioral domain relates to participation, putting forth a good effort, and attending class. The affective and behavioral domains differ from much of the prior research on active learning, which centers on measuring cognitive gains in student learning; systematic reviews are readily available on that topic (e.g., Freeman et al., 2014; Theobald et al., 2020). Schmidt, Rosenberg, and Beymer (2018) explain the relationship between affective, cognitive, and behavioral domains, asserting that all three types of engagement are necessary for science learning, and conclude that “students are unlikely to exert a high degree of behavioral engagement during science learning tasks if they do not also engage deeply with the content affectively and cognitively” (p. 35). Thus, SRAL, as a negative affective and behavioral student response, is a critical but underexplored component of STEM learning.

Recent research on student affective and behavioral responses to active learning has uncovered mechanisms of student resistance. Deslauriers, McCarty, Miller, Callaghan, and Kestin’s (2019) interviews of physics students revealed that the additional effort required by the novel format of an interactive lecture was the primary source of student resistance. Owens et al. (2020) identified a similar source of student resistance to their carefully designed biology active learning intervention: students were concerned about the additional effort required and the unfamiliar student-centered format. Deslauriers et al. (2019) and Owens et al. (2020) go a step further in citing self-efficacy (Bandura, 1982), mindset (Dweck & Leggett, 1988), and student engagement (Kuh, 2005) literature to explain student resistance. Similarly, Shekhar et al.’s (2020) review framed negative student responses to active learning in terms of expectancy-value theory (Wigfield & Eccles, 2000); students reacted negatively when they did not find active learning useful or worth the time and effort, or when they did not feel competent enough to complete the activities. Shekhar et al. (2020) also applied expectancy violation theory from physics education research (Gaffney, Gaffney, & Beichner, 2010) to explain how students’ initial expectations of a traditional course produced discomfort during active learning activities. To address both theories of student resistance, Shekhar et al. (2020) suggested that instructors provide scaffolding (Vygotsky, 1978) and support for self-directed learning activities. So, while framing the research as SRAL is relatively new, ideas about working with students to actively engage them in their learning are not. Prior literature on active learning in STEM undergraduate settings includes clues and evidence about strategies instructors can employ to reduce SRAL, even if the authors do not necessarily frame them as such.

Student affective and behavioral responses to active learning, including SRAL, are a relatively new research focus. But, given the discipline-based education research (DBER) knowledge base around RBIS and EBIP adoption, we need not reinvent the wheel. In this paper, we conduct a systematic review. Systematic reviews are designed to methodically gather and synthesize results from multiple studies to provide a clear overview of a topic, presenting what is known and what is not known (Borrego, Foster, & Froyd, 2014). Such clarity informs decisions when designing or funding future research, interventions, and programs. Relevant studies for this paper are scattered across STEM disciplines and across DBER and general education venues, including journals and conference proceedings. Quantitative, qualitative, and mixed methods approaches have been used to understand student affective and behavioral responses to active learning. Thus, a systematic review is appropriate for this topic given the long history of research on the development of RBIS, EBIPs, and active learning in STEM education; the distribution of primary studies across fields and formats; and the different methods used to evaluate students’ affective and behavioral responses.

Specifically, we conducted a systematic review to address two interrelated research questions. (1) What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies? (2) What instructor strategies to aid implementation of active learning do the authors of these studies provide? These two questions are linked by our goal of sharing instructor strategies that can either reduce SRAL or encourage positive student affective and behavioral responses. Therefore, the instructor strategies in this review are only from studies that present empirical data of affective and behavioral student response to active learning. The strategies we identify in this review will not be surprising to highly experienced teaching and learning practitioners or researchers. However, this review does provide an important link between these strategies and student resistance, which remains one of the most feared barriers to instructor adoption of RBIS, EBIPs, and other forms of active learning.

Conceptual framework: instructor strategies to reduce resistance

Recent research has identified specific instructor strategies that correlate with reduced SRAL and positive student response in undergraduate STEM education (Finelli et al., 2018; Nguyen et al., 2017; Tharayil et al., 2018). For example, Deslauriers et al. (2019) suggested that physics students perceive the additional effort required by active learning to be evidence of less effective learning. To address this, the authors included a 20-min lecture about active learning in a subsequent course offering. By the end of that course, 65% of students reported increased enthusiasm for active learning, and 75% said the lecture intervention positively impacted their attitudes toward active learning. Explaining how active learning activities contribute to student learning is just one of many strategies instructors can employ to reduce SRAL (Tharayil et al., 2018).

DeMonbrun et al. (2017) provided a conceptual framework for differentiating instructor strategies that includes not only explanation strategies (e.g., Deslauriers et al., 2019; Tharayil et al., 2018) but also facilitation strategies. Explanation strategies involve describing to students the purpose of the activity (such as how it relates to their learning) and the expectations for it. Typically, instructors use explanation strategies before the in-class activity has begun. Facilitation strategies include promoting engagement and keeping the activity running smoothly once it has begun; specific strategies include walking around the classroom and directly encouraging students. We use the existing categories of explanation and facilitation as a conceptual framework to guide our analysis and systematic review.

As a conceptual framework, explanation and facilitation strategies describe ways to aid the implementation of RBIS, EBIP, and other types of active learning. In fact, the work on these types of instructor strategies is related to higher education faculty development, implementation, and institutional change research perspectives (e.g., Borrego, Cutler, Prince, Henderson, & Froyd, 2013; Henderson, Beach, & Finkelstein, 2011; Kezar, Gehrke, & Elrod, 2015). As such, the specific types of strategies reviewed here are geared to assist instructors in moving toward more student-centered teaching methods by addressing their concerns of student resistance.

SRAL is a particular negative form of affective or behavioral student response (DeMonbrun et al., 2017; Weimer, 2002; Winkler & Rybnikova, 2019). Affective and behavioral student responses are conceptualized at the reactionary level (Kirkpatrick, 1976) of outcomes, which consists of how students feel (affective) and how they conduct themselves within the course (behavioral). Although affective and behavioral student responses to active learning are less frequently reported than cognitive outcomes, prior research suggests a few conceptual constructs within these outcomes.

Affective outcomes encompass students’ feelings about, preferences regarding, and satisfaction with the course. Affective outcomes also include students’ self-reports of whether they thought they learned more (or less) during active learning instruction. Relevant affective outcomes include students’ perceived value or utility of active learning (Shekhar et al., 2020; Wigfield & Eccles, 2000), their positivity toward or enjoyment of the activities (DeMonbrun et al., 2017; Finelli et al., 2018), and their self-efficacy or confidence in doing the in-class activity (Bandura, 1982).

In contrast, students’ behavioral responses to active learning consist of their actions and practices during active learning. This includes students’ attendance in the class, their participation, engagement, and effort with the activity, and students’ distraction or off-task behavior (e.g., checking their phones, leaving to use the restroom) during the activity (DeMonbrun et al., 2017; Finelli et al., 2018; Winkler & Rybnikova, 2019).

We conceptualize negative or low scores in either affective or behavioral student outcomes as an indicator of SRAL (DeMonbrun et al., 2017; Nguyen et al., 2017). For example, a low score in reported course satisfaction would be an example of SRAL. This paper aims to synthesize instructor strategies to aid implementation of active learning from studies that either address SRAL and its negative or low scores or relate instructor strategies to positive or high scores. Therefore, we also conceptualize positive student affective and behavioral outcomes as the absence of SRAL. For ease of categorization in this review, we summarize each study’s affective and behavioral outcomes for active learning as positive, mostly positive, mixed/neutral, mostly negative, or negative.

Methods

We conducted a systematic literature review (Borrego et al., 2014; Gough, Oliver, & Thomas, 2017; Petticrew & Roberts, 2006) to identify primary research studies that describe active learning interventions in undergraduate STEM courses, recommend one or more strategies to aid implementation of active learning, and report student response outcomes to active learning.

A systematic review was warranted due to the popularity of active learning and the publication of numerous papers on the topic. Multiple STEM disciplines and research audiences have published journal articles and conference papers on the topic of active learning in the undergraduate STEM classroom. However, it was not immediately clear which studies addressed active learning, affective and behavioral student responses, and strategies to aid implementation of active learning. We used the systematic review process to efficiently gather results of multiple types of studies and create a clear overview of our topic.

Definitions

For clarity, we define several terms in this review. Researchers refers to us, the authors of this manuscript. Authors and instructors wrote the primary studies we reviewed, and we refer to these primary studies as “studies” consistently throughout. We use the term activity or activities to refer to the specific in-class active learning tasks assigned to students. Strategies refers to the instructor strategies used to aid implementation of active learning and address student resistance to active learning (SRAL). Student response includes affective and behavioral responses and outcomes related to active learning. SRAL is an acronym for student resistance to active learning, defined here as a negative affective or behavioral student response. Categories or category refers to a grouping of strategies to aid implementation of active learning, such as explanation or facilitation. Excerpts are quotes from studies; these excerpts serve as the coded examples of specific strategies.

Study timeline, data collection, and sample selection

From 2015 to 2016, we worked with a research librarian to locate relevant studies and conduct a keyword search within six databases: two multidisciplinary databases (Web of Science and Academic Search Complete), two major engineering and technology indexes (Compendex and Inspec), and two popular education databases (Education Source and the Education Resources Information Center). We created inclusion criteria that listed both search strings and study requirements:

  1. Studies must include an in-class active learning intervention. This does not include laboratory classes. The corresponding search string was:

    “active learning” or “peer-to-peer” or “small group work” or “problem based learning” or “problem-based learning” or “problem-oriented learning” or “project-based learning” or “project based learning” or “peer instruction” or “inquiry learning” or “cooperative learning” or “collaborative learning” or “student response system” or “personal response system” or “just-in-time teaching” or “just in time teaching” or clickers

  2. Studies must include empirical evidence addressing student response to the active learning intervention. The corresponding search string was:

    “affective outcome” or “affective response” or “class evaluation” or “course evaluation” or “student attitudes” or “student behaviors” or “student evaluation” or “student feedback” or “student perception” or “student resistance” or “student response”

  3. Studies must describe a STEM course, as defined by the topic of the course, rather than by the department of the course or the major of the students enrolled (e.g., a business class for mathematics majors would not be included, but a mathematics class for business majors would).

  4. Studies must be conducted in undergraduate courses and must not include K-12, vocational, or graduate education.

  5. Studies must be in English and published between 1990 and 2015 as journal articles or conference papers.
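Criteria 1 and 2 combine into a single boolean query: the first OR-block AND the second OR-block. The exact syntax differs across the six databases; the following is only a minimal sketch of assembling such a query, with abbreviated phrase lists (the full lists are given in the criteria above):

```python
# Build a boolean query of the form (intervention terms) AND (response terms).
# Phrase lists are abbreviated for illustration; see the full criteria above.
intervention_terms = [
    "active learning", "peer instruction", "problem-based learning",
    "cooperative learning", "collaborative learning", "clickers",
]
response_terms = [
    "affective outcome", "course evaluation", "student perception",
    "student resistance", "student response",
]

def or_block(terms):
    """Quote each phrase, join with OR, and wrap in parentheses."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

query = or_block(intervention_terms) + " AND " + or_block(response_terms)
print(query)
```

Database front ends differ in how they handle phrase quoting and truncation, so a generated query like this typically needs per-database adjustment.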

In addition to searching the six databases, we emailed solicitations to U.S. National Science Foundation Improving Undergraduate STEM Education (NSF IUSE) grantees. Between the database searches and email solicitation, we identified 2364 studies after removing duplicates. Most studies were from the database search, as we received just 92 studies from email solicitation (Fig. 1).

Fig. 1 PRISMA screening overview styled after Liberati et al. (2009) and Passow and Passow (2017)

Next, we followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines for screening studies with our inclusion criteria (Borrego et al., 2014; Petticrew & Roberts, 2006). From 2016 to 2018, a team of seven researchers conducted two rounds of review in RefWorks: the first round with only titles and abstracts and the second round with the entire full text. In both rounds, two researchers independently decided whether each study should be retained based on the inclusion criteria listed above. At the abstract review stage, if the two independent coders disagreed, we passed the study on to the full-text screening round. We screened a total of 2364 abstracts, and 746 studies passed the first round of title and abstract screening (see the PRISMA flow chart in Fig. 1). If the independent coders still disagreed at the full-text screening round, the seven researchers met, discussed the study, clarified the inclusion criteria as needed to resolve potential future disagreements, and, when necessary, took a majority vote (4 of the 7 researchers) on inclusion; full consensus among all 7 coders was rare, so a majority vote was used to finalize the inclusion of certain studies. We resolved these disagreements on a rolling basis, and depending on the round (abstract or full text), we disagreed about 10–15% of the time on the inclusion of a study. In both rounds of screening, studies were most often excluded because they did not gather novel empirical data or evidence (inclusion criterion #2) or were not conducted in an undergraduate STEM course (inclusion criteria #3 and #4). Only 412 studies met all our final inclusion criteria.

Coding procedure

From 2017 to 2018, a team of five researchers coded these 412 studies for detailed information. To gather information about all 412 studies efficiently and to answer our first research question (What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies?), we developed an online coding form using Google Forms and Google Sheets. The five researchers piloted and refined the coding form over three rounds of pair coding, using 19 studies to test and revise early versions of the form. The final coding form (Borrego et al., 2018) used a mix of multiple-choice and free-response items regarding study characteristics (bibliographic information, type of publication, location of study), course characteristics (discipline, course level, number of students sampled, and type of active learning), methodology (main type of evidence collected, sample size, and analysis methods), study findings (types of student responses and outcomes), and strategy reported (whether the study explicitly mentioned using strategies to aid implementation of active learning).

In the end, only 29 studies explicitly described strategies to aid implementation of active learning (Fig. 1), and we used these 29 studies as the dataset for this study. The main difference between these 29 studies and the other 383 studies was that these 29 explicitly described the ways authors implemented active learning in their courses to address SRAL or promote positive student outcomes. Although some readers who are experienced active learning instructors or educational researchers may view pedagogies and strategies as integrated, we found that most papers described active learning methods in terms of student tasks, while advice on strategies, if included, tended to appear separately. We chose not to overinterpret passing mentions of how active learning was implemented as strategies recommended by the authors.

Analysis procedure for coding strategies

To answer our second research question (What instructor strategies to aid implementation of active learning do the authors of these studies provide?), we closely reviewed the 29 studies to analyze the strategies in more detail. We used Boyatzis’s (1998) thematic analysis technique to compile all mentions of instructor strategies to aid implementation of active learning and categorize these excerpts into certain strategies. This technique uses both deductive and inductive coding processes (Creswell & Creswell, 2017; Jesiek, Mazzurco, Buswell, & Thompson, 2018).

In 2018, three researchers reread the 29 studies, marking excerpts related to strategies independently. We found a total of 126 excerpts. The number of excerpts within each study ranged from 1 to 14 excerpts (M = 4, SD = 3). We then took all the excerpts and pasted each into its own row in a Google Sheet. We examined the entire spreadsheet as a team and grouped similar excerpts together using a deductive coding process. We used the explanation and facilitation conceptual framework (DeMonbrun et al., 2017) and placed each excerpt into either category. We also assigned a specific strategy (i.e., describing the purpose of the activity, or encouraging students) from the framework for each excerpt.

However, there were multiple excerpts that did not easily match either category; we set these aside for the inductive coding process. We then reviewed all excerpts without a category and suggested the creation of a new third category, called planning. We based this new category on the idea that the existing explanation and facilitation conceptual framework did not capture strategies that occurred outside of the classroom. We discuss the specific strategies within the planning category in the Results. With a new category in hand, we created a preliminary codebook consisting of explanation, facilitation, and planning categories, and their respective specific strategies.

We then passed the spreadsheet and preliminary codebook to another researcher who had not previously seen the excerpts. The second researcher looked through all the excerpts and assigned categories and strategies without being able to see the suggestions of the initial three researchers. The second researcher also created new strategies and codes, especially when a specific strategy was not present in the preliminary codebook; all of their new strategies and codes fell within the planning category. The second researcher agreed on assigned categories and implementation strategies for 71% of the total excerpts. A researcher from the initial strategies coding met with the second researcher to discuss all disagreements. Most of the 29% of excerpts with disagreements involved the specific strategies within the new third category, planning: because the second researcher created new planning strategies, these assigned codes counted as disagreements by default. The two researchers resolved the disagreements by finalizing a codebook with the full, combined list of planning strategies and the previous explanation and facilitation strategies. They then conducted a final round of coding with the finalized codebook, this time working together in the same coding sessions and immediately resolving any disagreements through discussion and updates to the final strategy codes. In the end, all 126 excerpts were coded and retained.
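The 71% figure above is a simple percent agreement: the fraction of excerpts to which both coders assigned the same category and strategy. A minimal sketch of the calculation, using made-up code labels rather than the study’s actual data:

```python
# Percent agreement between two independent coders over the same excerpts.
# The labels below are hypothetical, for illustration only.
coder_a = ["explanation", "facilitation", "planning", "planning", "explanation"]
coder_b = ["explanation", "facilitation", "planning", "facilitation", "explanation"]

def percent_agreement(a, b):
    """Fraction of items to which both coders assigned the same code."""
    assert len(a) == len(b), "coders must rate the same items"
    return sum(x == y for x, y in zip(a, b)) / len(a)

print(f"{percent_agreement(coder_a, coder_b):.0%}")  # 4 of 5 codes match
```

Percent agreement does not correct for chance agreement; a statistic such as Cohen’s kappa is a common alternative when a chance-corrected measure is needed.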

Results

Characteristics of the primary studies

To answer our first research question (What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies?), we report the results from our coding and systematic review process. We discuss characteristics of studies within our dataset below and in Table 1.

Table 1 List of course characteristics and strategies of primary studies

Type of publication and research audience

Of the 29 studies, 11 studies were published in conference proceedings, while the remaining 18 studies were journal articles. Examples of journals included the European Journal of Engineering Education, Journal of College Science Teaching, and PRIMUS (Problems, Resources, and Issues in Mathematics Undergraduate Studies).

In terms of research audiences and perspectives, both US and international views were represented: eighteen studies were from North America, two from Australia, three from Asia, and six from Europe. Full bibliographic information for all 29 studies is included in the Appendix.

Types of courses sampled

Studies sampled different types of undergraduate STEM courses. In terms of course year, most studies sampled first-year courses (13 studies), though all four course years were represented (4 second-year, 3 third-year, 2 fourth-year, 7 not reported). Regarding course discipline, all major STEM education disciplines were represented. Fourteen studies were conducted in engineering courses, and most major engineering subdisciplines were represented, such as electrical and computer engineering (4 studies), mechanical engineering (3 studies), general engineering (3 studies), chemical engineering (2 studies), and civil engineering (1 study). Thirteen studies were conducted in science courses (3 physics/astronomy, 7 biology, 3 chemistry), and 2 studies were conducted in mathematics or statistics courses.

For teaching methods, most studies sampled traditional courses that were primarily lecture-based but included some in-class activities. The most common activity was giving class time for students to do problem solving (PS) (21 studies). Students did problem solving in groups (16 studies) or individually (5 studies), and sometimes both in the same course. Project- or problem-based learning (PBL) was the second most frequently reported activity (8 studies), with implementations ranging from end-of-term final projects to an entire project- or problem-based course. Clickers (4 studies) and class discussions (4 studies) tied as the third most common activity.

Research design, methods, and outcomes

The 29 studies used quantitative (10 studies), qualitative (6 studies), or mixed methods (13 studies) research designs. Most studies contained self-made instructor surveys (IS) as their main source of evidence (20 studies). In contrast, only 2 studies used survey instruments with evidence of validity (IEV). Other forms of data collection included using institutions’ end of course evaluations (EOC) (10 studies), observations (5 studies), and interviews (4 studies).

Studies reported a variety of different measures for researching students’ affective and behavioral responses to active learning. The most common measure was students’ self-reports of learning (an affective outcome); twenty-one studies measured whether students thought they learned more or less due to the active learning intervention. Other common measures included whether students participated in the activities (16 studies, participation), whether they enjoyed the activities (15 studies, enjoyment), and if students were satisfied with the overall course experience (13 studies, course satisfaction). Most studies included more than one measure. Some studies also measured course attendance (4 studies) and students’ self-efficacy with the activities and relevant STEM disciplines (4 studies).

We found that 23 of the 29 studies reported positive or mostly positive outcomes for students’ affective and behavioral responses to active learning. Five studies reported mixed/neutral outcomes, and only one study reported a negative student response to active learning. We discuss the implications of this scarcity of negative study outcomes and reports of SRAL in our dataset in the “Discussion” section.

Strategies

To answer our second research question (What instructor strategies to aid implementation of active learning do the authors of these studies provide?), we provide descriptions, categories, and excerpts of specific strategies found within our systematic literature review.

Explanation strategies

Explanation strategies provide students with clarifications and reasons for using active learning (DeMonbrun et al., 2017). Within the explanation category, we identified two specific strategies: establish expectations and explain the purpose.

Establish expectations

Establishing expectations means setting the tone and routine for active learning at both the course and in-class activity level. Instructors can discuss expectations at the beginning of the semester, at the start of a class session, or right before the activity.

For establishing expectations at the beginning of the semester, studies provide specific ways to ensure students become familiar with active learning as early as possible. This included “introduc[ing] collaborative learning at the beginning of the academic term” (Herkert, 1997, p. 450) and making sure that “project instructions and the data were posted fairly early in the semester, and the students were made aware that the project was an important part of their assessment” (Krishnan & Nalim, 2009, p. 5).

McClanahan and McClanahan (2002) described the importance of explaining how the course will use active learning and purposely using the syllabus to do this:

Set the stage. Create the expectation that students will actively participate in this class. One way to accomplish that is to include a statement in your syllabus about your teaching strategies. For example: I will be using a variety of teaching strategies in this class. Some of these activities may require that you interact with me or other students in class. I hope you will find these methods interesting and engaging and that they enable you to be more successful in this course. In the syllabus, describe the specific learning activities you plan to conduct. These descriptions let the students know what to expect from you as well as what you expect from them (emphasis added, p. 93).

Early on, students see that the course is interactive, and they also see the activities required to be successful in the course.

These studies and excerpts demonstrate the importance of explaining to students how in-class activities relate to course expectations. Instructors using active learning should start the semester with clear expectations for how students should engage with activities.

Explain the purpose

Explaining the purpose includes offering students reasons why certain activities are being used and convincing them of the importance of participating.

One way that studies explained the purpose of the activities was by leveraging and showing assessment data on active learning. For example, Lenz (2015) dedicated class time to show current students comments from previous students:

I spend the first few weeks reminding them of the research and of the payoff that they will garner and being a very enthusiastic supporter of the [active learning teaching] method. I show them comments I have received from previous classes and I spend a lot of time selling the method (p. 294).

Providing current students comments from previous semesters may help students see the value of active learning. Lake (2001) also used data from prior course offerings to show students “the positive academic performance results seen in the previous use of active learning” on the first day of class (p. 899).

However, sharing the effectiveness of the activities does not have to be constrained to the beginning of the course. Autin et al. (2013) used mid-semester test data and comparisons to sell the continued use of active learning to their students. They said to students:

Based on your reflections, I can see that many of you are not comfortable with the format of this class. Many of you said that you would learn better from a traditional lecture. However, this class, as a whole, performed better on the test than my other [lecture] section did. Something seems to be working here (p. 946).

Showing students comparisons between active learning and traditional lecture classes is a powerful way to explain how active learning benefits them.

Explaining the purpose of the activities by sharing course data with students appears to be a useful strategy, as it tells students why active learning is being used and convinces students that active learning is making a difference.

Facilitation strategies

Facilitation strategies ensure the continued engagement in the class activities once they have begun, and many of the specific strategies within this category involve working directly with students. We identified two strategies within the facilitation category: approach students and encourage students.

Approach students

Approaching students means engaging with students during the activity. This includes physical proximity and monitoring students, walking around the classroom, and providing students with additional feedback, clarifications, or questions about the activity.

Several studies described how instructors circulated around the classroom to check on the progress of students during an activity. Lenz (2015) stated this plainly in her study, “While the students work on these problems I walk around the room, listening to their discussions” (p. 284). Armbruster et al. (2009) described this strategy and noted positive student engagement, “During each group-work exercise the instructor would move throughout the classroom to monitor group progress, and it was rare to find a group that was not seriously engaged in the exercise” (p. 209). Haseeb (2011) combined moving around the room and approaching students with questions, and they stated, “The instructor moves around from one discussion group to another and listens to their discussions, ask[ing] provoking questions” (p. 276). Certain group-based activities worked better with this strategy, as McClanahan and McClanahan (2002) explained:

Breaking the class into smaller working groups frees the professor to walk around and interact with students more personally. He or she can respond to student questions, ask additional questions, or chat informally with students about the class (p. 94).

Approaching students not only helps facilitate the activity but also gives the instructor a chance to work with students more closely and receive feedback. By walking around the classroom, instructors ensure that both they and their students continue to engage with the activity.

Encourage students

Encouraging students includes creating a supportive classroom environment, motivating students to do the activity, building respect and rapport with students, demonstrating care, and having a positive demeanor toward students’ success.

Ramsier et al. (2003) provided a detailed explanation of the importance of building a supportive classroom environment:

Most of this success lies in the process of negotiation and the building of mutual respect within the class, and requires motivation, energy and enthusiasm on behalf of the instructor… Negotiation is the key to making all of this work, and building a sense of community and shared ownership. Learning students’ names is a challenge but a necessary part of our approach. Listening to student needs and wants with regard to test and homework due dates…projects and activities, etc. goes a long way to build the type of relationships within the class that we need in order to maintain and encourage performance (pp. 16–18).

Here, the authors described a few specific strategies for supporting a positive demeanor, such as learning students’ names and listening to student needs and wants, which helped maintain student performance in an active learning classroom.

Other ways to build a supportive classroom environment were for instructors to appear more approachable. For example, Bullard and Felder (2007) worked to “give the students a sense of their instructors as somewhat normal and approachable human beings and to help them start to develop a sense of community” (p. 5). As instructors and students become more comfortable working with each other, instructors can work toward easing “frustration and strong emotion among students and step by step develop the students’ acceptance [of active learning]” (Harun, Yusof, Jamaludin, & Hassan, 2012, p. 234). In all, encouraging students and creating a supportive environment appear to be useful strategies to aid implementation of active learning.

Planning strategies

The planning category encompasses strategies that occur outside of class time, distinguishing it from the explanation and facilitation categories. Four strategies fall into this category: design appropriate activities, create group policies, align the course, and review student feedback.

Design appropriate activities

Many studies took into consideration the design of appropriate or suitable activities for their courses. This meant making sure the activity was suitable in terms of time, difficulty, and constraints of the course. Activities were designed to strike a balance between being too difficult and too simple, to be engaging, and to provide opportunities for students to participate.

Li et al. (2009) explained the importance of outside-of-class planning and considering appropriate projects: “The selection of the projects takes place in pre-course planning. The subjects for projects should be significant and manageable” (p. 491). Haseeb (2011) further emphasized balance in problem design (within problem-based learning), noting that “the problem is deliberately designed to be open-ended and vague in terms of technical details” (p. 275). Armbruster et al. (2009) expanded on the idea of balanced activities by connecting it to group work and positive outcomes: “The group exercises that elicited the most animated student participation were those that were sufficiently challenging that very few students could solve the problem individually, but at least 50% or more of the groups could solve the problem by working as a team” (p. 209).

Instructors should consider the design of activities outside of class time. Activities should be appropriately challenging but achievable for students, so that students remain engaged and participate with the activity during class time.

Create group policies

Creating group policies means considering rules when using group activities. This strategy is unique in that it directly addresses a specific subset of activities, group work. These policies included setting team sizes and assigning specific roles to group members.

Studies outlined a few specific approaches for assigning groups. For example, Ramsier et al. (2003) recommended frequently changing and randomizing groups: “When students enter the room on these days they sit in randomized groups of 3 to 4 students. Randomization helps to build a learning community atmosphere and eliminates cliques” (p. 4). Another strategy in combination with frequent changing of groups was to not allow students to select their own groups. Lehtovuori et al. (2013) used this to avoid problems of freeriding and group dysfunction:

For example, group division is an issue to be aware of...An easy and safe solution is to draw lots to assign the groups and to change them often. This way nobody needs to suffer from a dysfunctional group for too long. Popular practice that students self-organize into groups is not the best solution from the point of view of learning and teaching. Sometimes friendly relationships can complicate fair division of responsibility and work load in the group (p. 9).

Here, Lehtovuori et al. (2013) considered different types of group policies and concluded that frequently changing groups worked best for students. Kovac (1999) also described changing groups but assigned specific roles to individuals:

Students were divided into groups of four and assigned specific roles: manager, spokesperson, recorder, and strategy analyst. The roles were rotated from week to week. To alleviate complaints from students that they were "stuck in a bad group for the entire semester," the groups were changed after each of the two in-class exams (p. 121).

The use of four specific group roles is a potential group policy, and Kovac (1999) continued the trend of changing group members often.

Overall, these studies describe the importance of thinking about ways to implement group-based activities before enacting them during class, and they suggest that groups should be reconstituted frequently. Instructors using group activities should consider whether to use specific group member policies before implementing the activity in the classroom.

Align the course

Aligning the course emphasizes the importance of purposely connecting multiple parts of the course together. This strategy involves planning to ensure students are graded on their participation with the activities as well as considering the timing of the activities with respect to other aspects of the course.

Li et al. (2009) described aligning classroom tasks by discussing the importance of timing, and they wrote, “The coordination between the class lectures and the project phases is very important. If the project is assigned near the directly related lectures, students can instantiate class concepts almost immediately in the project and can apply the project experience in class” (p. 491).

Krishnan and Nalim (2009) aligned class activities with grades to motivate students and encourage participation: “The project was a component of the course counting for typically 10-15% of the total points for the course grade. Since the students were told about the project and that it carried a significant portion of their grade, they took the project seriously” (p. 4). McClanahan and McClanahan (2002) expanded on the idea of using grades to emphasize the importance of active learning to students:

Develop a grading policy that supports active learning. Active learning experiences that are important enough to do are important enough to be included as part of a student's grade…The class syllabus should describe your grading policy for active learning experiences and how those grades factor into the student's final grade. Clarify with the students that these points are not extra credit. These activities, just like exams, will be counted when grades are determined (p. 93).

Here, they suggest a clear grading policy that includes how activities will be assessed as part of students’ final grades.

de Justo and Delgado (2014) connected grading and assessment to learning and further suggested that reliance on exams may negatively impact student engagement:

Particular attention should be given to alignment between the course learning outcomes and assessment tasks. The tendency among faculty members to rely primarily on written examinations for assessment purposes should be overcome, because it may negatively affect students’ engagement in the course activities (p. 8).

Instructors should consider their overall assessment strategies, as overreliance on written exams could mean that students engage less with the activities.

When planning to use active learning, instructors should consider how activities are aligned with course content and students’ grades. Instructors should decide before active learning implementation whether class participation and engagement will be reflected in student grades and in the course syllabus.

Review student feedback

Reviewing student feedback includes both soliciting feedback about the activity and using that feedback to improve the course. This strategy can be an iterative process that occurs over several course offerings.

Many studies utilized student feedback to continuously revise and improve the course. For example, Metzger (2015) commented that “gathering and reviewing feedback from students can inform revisions of course design, implementation, and assessment strategies” (p. 8). Rockland et al. (2013) further described changing and improving the course in response to student feedback, “As a result of these discussions, the author made three changes to the course. This is the process of continuous improvement within a course” (p. 6).

Herkert (1997) also demonstrated the use of student feedback for improving the course over time: “Indeed, the [collaborative] learning techniques described herein have only gradually evolved over the past decade through a process of trial and error, supported by discussion with colleagues in various academic fields and helpful feedback from my students” (p. 459).

In addition to incorporating student feedback, McClanahan and McClanahan (2002) commented on how student feedback builds a stronger partnership with students, “Using student feedback to make improvements in the learning experience reinforces the notion that your class is a partnership and that you value your students’ ideas as a means to strengthen that partnership and create more successful learning” (p. 94). Making students aware that the instructor is soliciting and using feedback can help encourage and build rapport with students.

Instructors should review student feedback for continual and iterative course improvement. Much of the student feedback review occurs outside of class time, and it appears useful for instructors to solicit student feedback to guide changes to the course and build student rapport.

Summary of strategies

We list the appearance of strategies within studies in Table 1 in short-hand form. No study included all eight strategies. The studies that included the most strategies were Bullard and Felder (2007) (7 strategies), Armbruster et al. (2009) (5 strategies), and Lenz (2015) (5 strategies). However, these three studies were exemplars, as most studies included only one or two strategies.

Table 2 presents a summary list of specific strategies, their categories, and descriptions. We also note the number of unique studies (N) and excerpts (n) that included the specific strategies. In total, there were eight specific strategies within three categories. Most strategies fell under the planning category (N = 26), with align the course being the most reported strategy (N = 14). Approaching students (N = 13) and reviewing student feedback (N = 11) were the second and third most common strategies, respectively. Overall, we present eight strategies to aid implementation of active learning.

Table 2 Strategies descriptions by category and numerical count

Discussion

Characteristics of the active learning studies

To address our first research question (What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies?), we discuss the different ways studies reported research on active learning.

Limitations and gaps within the final sample

First, we must discuss the gaps within our final sample of 29 studies. We excluded numerous active learning studies (N = 383) that did not discuss or reflect upon the efficacy of their strategies to aid implementation of active learning. We also began this systematic literature review in 2015 and did not finish our coding and analysis of 2364 abstracts and 746 full-texts until 2018. We acknowledge that there have been multiple studies published on active learning since 2015. Acknowledging these limitations, we discuss our results and analysis in the context of the 29 studies in our dataset, which were published from 1990 to 2015.

Types of courses sampled

Our final sample included only 2 studies that sampled mathematics and statistics courses. There was also a lack of studies beyond the first year. Much of the active learning research literature introduces interventions in first-year (cornerstone) or fourth-year (capstone) courses, and we found within our dataset a tendency to oversample first-year courses. However, all four course-years were represented, as were all major STEM disciplines, with the most common being engineering (14 studies) and biology (7 studies).

Thirteen studies implemented course-based active learning interventions, such as project-based learning (8 studies), inquiry-based learning (3 studies), or a flipped classroom (2 studies). Only one study, Lenz (2015), used a previously published active learning intervention, which was Process-Oriented Guided Inquiry Learning (POGIL). Other examples of published active learning programs include the Student-Centered Active Learning Environment for Upside-down Pedagogies (SCALE-UP, Gaffney et al., 2010) and Chemistry, Life, the Universe, and Everything (CLUE, Cooper & Klymkowsky, 2013), but these were not included in our sample of 29 studies.

In contrast, most of the active learning interventions involved adding in-class problem solving (either with individual students or groups of students) to a traditional lecture course (21 studies). For some instructors attempting to adopt active learning, using this smaller active learning intervention (in-class problem solving) may be a good starting point.

Research design, methods, and outcomes

Despite the variety of quantitative, qualitative, and mixed methods research designs, most studies used either self-made instructor surveys (20 studies) or their institution’s course evaluations (10 studies). The variation among so many different versions of instructor surveys and course evaluations made it difficult to compare data or attempt a quantitative meta-analysis. Further, only 2 studies used instruments with evidence of validity. However, that trend may change as there are now more examples of instruments with evidence of validity, such as the Student Response to Instructional Practices (StRIP, DeMonbrun et al., 2017), the Biology Interest Questionnaire (BIQ, Knekta, Rowland, Corwin, & Eddy, 2020), and the Pedagogical Expectancy Violation Assessment (PEVA, Gaffney et al., 2010).

We were also concerned about the use of institutional course evaluations (10 studies) as evidence of students’ satisfaction and affective responses to active learning. Course evaluations capture more than just students’ responses to active learning, as the scores are biased toward the instructors’ gender (Mitchell & Martin, 2018) and race (Daniel, 2019), and they are strongly correlated with students’ expected grade in the class (Nguyen et al., 2017). Despite these limitations, we kept course evaluations in our keyword search and inclusion criteria, because they relate to instructors’ concerns about student resistance to active learning, and these scores continue to be used for important instructor reappointment, tenure, and promotion decisions (DeMonbrun et al., 2017).

In addition to students’ satisfaction, there were other measures related to students’ affective and behavioral responses to active learning. The most common measure was students’ self-reports of whether they thought they learned more or less (21 studies). Other important affective outcomes included enjoyment (15 studies) and self-efficacy (4 studies). The most common behavioral measure was students’ participation (16 studies). However, missing from this sample were other affective outcomes, such as students’ identities, beliefs, emotions, values, and buy-in.

Positive outcomes for using active learning

Twenty-three of the 29 studies reported positive or mostly positive outcomes for their active learning intervention. At the start of this paper, we acknowledged that much of the existing research suggested widespread positive benefits of using active learning in undergraduate STEM courses. However, most of that evidence centered on students’ cognitive learning outcomes (e.g., Theobald et al., 2020) rather than students’ affective and behavioral responses to active learning. Here, we show positive affective and behavioral outcomes in terms of students’ self-reports of learning, enjoyment, self-efficacy, attendance, participation, and course satisfaction.

Due to the lack of mixed/neutral or negative affective outcomes, it is important to acknowledge potential publication bias within our dataset. Authors may be hesitant to report negative outcomes to active learning interventions. It could also be the case that negative or non-significant outcomes are not easily published in undergraduate STEM education venues. These factors could help explain the lack of mixed/neutral or negative study outcomes in our dataset.

Strategies to aid implementation of active learning

We aimed to answer the question: what instructor strategies to aid implementation of active learning do the authors of these studies provide? We addressed this question by providing instructors and readers a summary of actionable strategies they can take back to their own classrooms. Here, we discuss the range of strategies found within our systematic literature review.

Supporting instructors with actionable strategies

We identified eight specific strategies across three major categories: explanation, facilitation, and planning. Each strategy appeared in at least seven studies (Table 2), and each strategy was written to be actionable and practical.

Strategies in the explanation category emphasized the importance of establishing expectations and explaining the purpose of active learning to students. The facilitation category focused on approaching and encouraging students once activities were underway. Strategies in the planning category highlighted the importance of working outside of class time to thoughtfully design appropriate activities, create policies for group work, align various components of the course, and review student feedback to iteratively improve the course.

However, as we note in the “Introduction” section, these strategies are not entirely new, and they will not surprise experienced researchers and educators. Even so, there has yet to be a systematic review that compiles these instructor strategies in relation to students’ affective and behavioral responses to active learning. For example, the “explain the purpose” strategy is similar to productive framing (e.g., Hutchison & Hammer, 2010) of the activity for students. “Design appropriate activities” and “align the course” relate to Vygotsky’s (1978) theories of scaffolding for students (Shekhar et al., 2020). “Review student feedback” and “approach students” relate to ideas on formative assessment (e.g., Pellegrino, DiBello, & Brophy, 2014) or revising course materials in response to students’ ongoing needs.

We also acknowledge that ours is not an exhaustive list of specific strategies to aid implementation of active learning. More work needs to be done measuring and observing these strategies in action and testing their use against specific outcomes. Some of this work of measuring instructor strategies has already begun (e.g., DeMonbrun et al., 2017; Finelli et al., 2018; Tharayil et al., 2018), but further testing and analysis would benefit the active learning community. We hope that our framework of explanation, facilitation, and planning strategies provides a guide for instructors adopting active learning. Since these strategies are compiled from the undergraduate STEM education literature and research on affective and behavioral responses to active learning, instructors have compelling reasons to use them to aid implementation of active learning.

One way to use these strategies is to consider the various phases of instruction and their sequence. Planning strategies are most applicable during the work that occurs prior to classroom instruction; explanation strategies are most useful when introducing students to active learning activities; and facilitation strategies are best enacted while students are already working on the assigned activities. Of course, these strategies may also be used in conjunction with one another and are not strictly limited to these phases. For example, one plausible approach could be to emphasize the planning strategies of design and alignment while explaining the purpose of activities to students. Overall, we hope that this framework of strategies supports instructors’ adoption and sustained use of active learning.

Creation of the planning category

At the start of this paper, we presented a conceptual framework for strategies consisting of only the explanation and facilitation categories (DeMonbrun et al., 2017). One of the major contributions of this paper is the addition of a third category, planning, to that existing conceptual framework. Planning strategies were common throughout the systematic literature review, and many studies emphasized how much time and effort is needed when adding active learning to a course. Although students may not see this preparation, and we did not anticipate this type of strategy initially, explicitly adding the planning category acknowledges the work instructors do outside of the classroom.

The planning strategies also highlight the need for instructors not only to think about implementing active learning before they enter the class, but also to revise their implementation after the class is over. Instructors should refine their use of active learning through feedback, reflection, and practice over multiple course offerings. We hope this persistence can lead to long-term adoption of active learning.

Conclusion

Despite our review ending in 2015, most STEM instruction remains didactic (Laursen, 2019; Stains et al., 2018), and there has not been long-term, sustained adoption of active learning. In a push to increase the adoption of active learning within undergraduate STEM courses, we hope this study provides support and actionable strategies for instructors who are considering active learning but are concerned about student resistance to it.

We identified eight specific strategies to aid implementation of active learning based on three categories. The three categories of strategies were explanation, facilitation, and planning. In this review, we created the third category, planning, and we suggested that this category should be considered first when implementing active learning in the course. Instructors should then focus on explaining and facilitating their activity in the classroom. The eight specific strategies provided here can be incorporated into faculty professional development programs and readily adopted by instructors wanting to implement active learning in their STEM courses.

There remains important future work in active learning research, and we noted these gaps within our review. It would be useful to specifically review and measure instructor strategies in action and compare their use against other affective outcomes, such as identity, interest, and emotions.

Before this review, no study had compiled and synthesized the strategies reported across multiple active learning studies, and we hope that this paper fills this important gap. The strategies identified in this review can help instructors persist beyond awkward initial implementations, avoid some problems altogether, and, most importantly, address student resistance to active learning. Further, the planning strategies emphasize that the use of active learning can be improved over time, which may help instructors set more realistic expectations for the first or second time they implement a new activity. There are many benefits to introducing active learning in the classroom, and we hope that these benefits are shared among more STEM instructors and students.

Availability of data and materials

The journal articles and conference proceedings that make up this review can be found through reverse citation lookup. See the Appendix for the references of all primary studies within this systematic review. We used the following databases to find studies for the review: Web of Science, Academic Search Complete, Compendex, Inspec, Education Source, and Education Resource Information Center. More details and keyword search strings are provided in the “Methods” section.

Abbreviations

STEM: Science, technology, engineering, and mathematics

SRAL: Student resistance to active learning

IEV: Instrument with evidence of validity

IS: Instructor surveys

PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses

PS: Problem solving

PBL: Problem- or project-based learning

EOC: End-of-course evaluations

References

  1. Armbruster, P., Patel, M., Johnson, E., & Weiss, M. (2009). Active learning and student-centered pedagogy improve student attitudes and performance in introductory biology. CBE Life Sciences Education, 8(3), 203–213. https://doi.org/10.1187/cbe.09-03-0025.

  2. Autin, M., Bateiha, S., & Marchionda, H. (2013). Power through struggle in introductory statistics. PRIMUS, 23(10), 935–948. https://doi.org/10.1080/10511970.2013.820810.

  3. Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37(2), 122 https://psycnet.apa.org/doi/10.1037/0003-066X.37.2.122.

  4. Berkling, K., & Zundel, A. (2015). Change Management: Overcoming the Challenges of Introducing Self-Driven Learning. International Journal of Engineering Pedagogy (iJEP), 5(4), 38–46. https://www.learntechlib.org/p/207352/.

  5. Bilston, L. (1999). Lessons from a problem-based learning class in first year engineering statics. Paper presented at the 2nd Asia-Pacific Forum on Engineering and Technology Education, Clayton, Victoria.

  6. Borrego, M., Cutler, S., Prince, M., Henderson, C., & Froyd, J. E. (2013). Fidelity of implementation of research-based instructional strategies (RBIS) in engineering science courses. Journal of Engineering Education, 102(3), 394–425. https://doi.org/10.1002/jee.20020.

  7. Borrego, M., Foster, M. J., & Froyd, J. E. (2014). Systematic literature reviews in engineering education and other developing interdisciplinary fields. Journal of Engineering Education, 103(1), 45–76. https://doi.org/10.1002/jee.20038.

  8. Borrego, M., Nguyen, K., Crockett, C., DeMonbrun, M., Shekhar, P., Tharayil, S., … Waters, C. (2018). Systematic literature review of students’ affective responses to active learning: Overview of results. San Jose: Paper presented at the 2018 IEEE Frontiers in Education Conference (FIE). https://doi.org/10.1109/FIE.2018.8659306.

  9. Boyatzis, R. E. (1998). Transforming qualitative information: Thematic analysis and code development. Sage Publications Inc.

  10. Breckler, J., & Yu, J. R. (2011). Student responses to a hands-on kinesthetic lecture activity for learning about the oxygen carrying capacity of blood. Advances in Physiology Education, 35(1), 39–47. https://doi.org/10.1152/advan.00090.2010.

  11. Bullard, L., & Felder, R. (2007). A student-centered approach to the stoichiometry course. Honolulu: Paper presented at the 2007 ASEE Annual Conference and Exposition https://peer.asee.org/1543.

  12. Carlson, K. A., & Winquist, J. R. (2011). Evaluating an active learning approach to teaching introductory statistics: A classroom workbook approach. Journal of Statistics Education, 19(1). https://doi.org/10.1080/10691898.2011.11889596.

  13. Chi, M. T., & Wylie, R. (2014). The ICAP framework: Linking cognitive engagement to active learning outcomes. Educational Psychologist, 49(4), 219–243. https://doi.org/10.1080/00461520.2014.965823.

  14. Christensen, T. (2005). Changing the learning environment in large general education astronomy classes. Journal of College Science Teaching, 35(3), 34.

  15. Cooper, M., & Klymkowsky, M. (2013). Chemistry, life, the universe, and everything: A new approach to general chemistry, and a model for curriculum reform. Journal of Chemical Education, 90(9), 1116–1122. https://doi.org/10.1021/ed300456y.

  16. Creswell, J. W., & Creswell, J. D. (2017). Research design: Qualitative, quantitative, and mixed methods approaches. Sage Publishing Inc.

  17. Dancy, M., Henderson, C., & Turpen, C. (2016). How faculty learn about and implement research-based instructional strategies: The case of peer instruction. Physical Review Physics Education Research, 12(1). https://doi.org/10.1103/PhysRevPhysEducRes.12.010110.

  18. Daniel, B. J. (2019). Teaching while black: Racial dynamics, evaluations, and the role of white females in the Canadian academy in carrying the racism torch. Race Ethnicity and Education, 22(1), 21–37. https://doi.org/10.1080/13613324.2018.1468745.

  19. de Justo, E., & Delgado, A. (2014). Change to competence-based education in structural engineering. Journal of Professional Issues in Engineering Education and Practice, 141(3). https://doi.org/10.1061/(ASCE)EI.1943-5541.0000215.

  20. DeMonbrun, R. M., Finelli, C., Prince, M., Borrego, M., Shekhar, P., Henderson, C., & Waters, C. (2017). Creating an instrument to measure student response to instructional practices. Journal of Engineering Education, 106(2), 273–298. https://doi.org/10.1002/jee.20162.

  21. Deslauriers, L., McCarty, L. S., Miller, K., Callaghan, K., & Kestin, G. (2019). Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proceedings of the National Academy of Sciences, 116(39), 19251–19257. https://doi.org/10.1073/pnas.1821936116.

  22. Dweck, C. S., & Leggett, E. L. (1988). A social-cognitive approach to motivation and personality. Psychological Review, 95(2), 256–273. https://doi.org/10.1037/0033-295X.95.2.256.

  23. Finelli, C., Nguyen, K., Henderson, C., Borrego, M., Shekhar, P., Prince, M., … Waters, C. (2018). Reducing student resistance to active learning: Strategies for instructors. Journal of College Science Teaching, 47(5), 80–91 https://www.nsta.org/journal-college-science-teaching/journal-college-science-teaching-mayjune-2018/research-and-1.

  24. Finelli, C. J., Daly, S. R., & Richardson, K. M. (2014). Bridging the research-to-practice gap: Designing an institutional change plan using local evidence. Journal of Engineering Education, 103(2), 331–361. https://doi.org/10.1002/jee.20042.

  25. Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415. https://doi.org/10.1073/pnas.1319030111.

  26. Froyd, J. E., Borrego, M., Cutler, S., Henderson, C., & Prince, M. J. (2013). Estimates of use of research-based instructional strategies in core electrical or computer engineering courses. IEEE Transactions on Education, 56(4), 393–399. https://doi.org/10.1109/TE.2013.2244602.

  27. Gaffney, J. D., Gaffney, A. L. H., & Beichner, R. J. (2010). Do they see it coming? Using expectancy violation to gauge the success of pedagogical reforms. Physical Review Special Topics - Physics Education Research, 6(1), 010102. https://doi.org/10.1103/PhysRevSTPER.6.010102.

  28. Gough, D., Oliver, S., & Thomas, J. (2017). An introduction to systematic reviews. Sage Publishing Inc.

  29. Harun, N. F., Yusof, K. M., Jamaludin, M. Z., & Hassan, S. A. H. S. (2012). Motivation in problem-based learning implementation. Procedia-Social and Behavioral Sciences, 56, 233–242. https://doi.org/10.1016/j.sbspro.2012.09.650.

  30. Haseeb, A. (2011). Implementation of micro-level problem based learning in a course on electronic materials. Journal of Materials Education, 33(5-6), 273–282 http://eprints.um.edu.my/id/eprint/5501.

  31. Henderson, C., Beach, A., & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. Journal of Research in Science Teaching, 48(8), 952–984. https://doi.org/10.1002/tea.20439.

  32. Henderson, C., & Dancy, M. (2007). Barriers to the use of research-based instructional strategies: The influence of both individual and situational characteristics. Physical Review Special Topics - Physics Education Research, 3(2). https://doi.org/10.1103/PhysRevSTPER.3.020102.

  33. Henderson, C., Khan, R., & Dancy, M. (2018). Will my student evaluations decrease if I adopt an active learning instructional strategy? American Journal of Physics, 86(12), 934–942. https://doi.org/10.1119/1.5065907.

  34. Herkert, J. R. (1997). Collaborative learning in engineering ethics. Science and Engineering Ethics, 3(4), 447–462. https://doi.org/10.1007/s11948-997-0047-x.

  35. Hodgson, Y., Benson, R., & Brack, C. (2013). Using action research to improve student engagement in a peer-assisted learning programme. Educational Action Research, 21(3), 359-375. https://doi.org/10.1080/09650792.2013.813399.

  36. Hora, M. T., & Ferrare, J. J. (2013). Instructional systems of practice: A multi-dimensional analysis of math and science undergraduate course planning and classroom teaching. Journal of the Learning Sciences, 22(2), 212–257. https://doi.org/10.1080/10508406.2012.729767.

  37. Hutchison, P., & Hammer, D. (2010). Attending to student epistemological framing in a science classroom. Science Education, 94(3), 506–524. https://doi.org/10.1002/sce.20373.

  38. Jaeger, B., & Bilen, S. (2006). The one-minute engineer: Getting design class out of the starting blocks. Paper presented at the 2006 ASEE Annual Conference and Exposition, Chicago, IL. https://peer.asee.org/524.

  39. Jesiek, B. K., Mazzurco, A., Buswell, N. T., & Thompson, J. D. (2018). Boundary spanning and engineering: A qualitative systematic review. Journal of Engineering Education, 107(3), 318–413. https://doi.org/10.1002/jee.20219.

  40. Kezar, A., Gehrke, S., & Elrod, S. (2015). Implicit theories of change as a barrier to change on college campuses: An examination of STEM reform. The Review of Higher Education, 38(4), 479–506. https://doi.org/10.1353/rhe.2015.0026.

  41. Kirkpatrick, D. L. (1976). Evaluation of training. In R. L. Craig (Ed.), Training and development handbook: A guide to human resource development. McGraw Hill.

  42. Knekta, E., Rowland, A. A., Corwin, L. A., & Eddy, S. (2020). Measuring university students’ interest in biology: Evaluation of an instrument targeting Hidi and Renninger’s individual interest. International Journal of STEM Education, 7, 1–16. https://doi.org/10.1186/s40594-020-00217-4.

  43. Kovac, J. (1999). Student active learning methods in general chemistry. Journal of Chemical Education, 76(1), 120. https://doi.org/10.1021/ed076p120.

  44. Krishnan, S., & Nalim, M. R. (2009). Project based learning in introductory thermodynamics. Austin: Paper presented at the 2009 ASEE Annual Conference and Exposition https://peer.asee.org/5615.

  45. Kuh, G. D. (2005). Student engagement in the first year of college. In M. L. Upcraft, J. N. Gardner, & B. O. Barefoot (Eds.), Challenging and supporting the first-year student: A handbook for improving the first year of college, (pp. 86–107). Jossey-Bass.

  46. Laatsch, L., Britton, L., Keating, S., Kirchner, P., Lehman, D., Madsen-Myers, K., Milson, L., Otto, C., & Spence, L. (2005). Cooperative learning effects on teamwork attitudes in clinical laboratory science students. American Society for Clinical Laboratory Science, 18(3). https://doi.org/10.29074/ascls.18.3.150.

  47. Lake, D. A. (2001). Student performance and perceptions of a lecture-based course compared with the same course utilizing group discussion. Physical Therapy, 81(3), 896–902. https://doi.org/10.1093/ptj/81.3.896.

  48. Laursen, S. (2019). Levers for change: An assessment of progress on changing STEM instruction. American Association for the Advancement of Science. https://www.aaas.org/resources/levers-change-assessment-progress-changing-stem-instruction.

  49. Lehtovuori, A., Honkala, M., Kettunen, H., & Leppävirta, J. (2013). Interactive engagement methods in teaching electrical engineering basic courses. Paper presented at the IEEE Global Engineering Education Conference (EDUCON), Berlin, Germany. https://doi.org/10.1109/EduCon.2013.6530089.

  50. Lenz, L. (2015). Active learning in a math for liberal arts classroom. PRIMUS, 25(3), 279–296. https://doi.org/10.1080/10511970.2014.971474.

  51. Li, J., Zhao, Y., & Shi, L. (2009). Interactive teaching methods in information security course. Paper presented at the International Conference on Scalable Computing and Communications; The Eighth International Conference on Embedded Computing. https://doi.org/10.1109/EmbeddedCom-ScalCom.2009.94.

  52. Liberati, A., Altman, D. G., Tetzlaff, J., Mulrow, C., Gotzsche, P. C., Ioannidis, J. P., … Moher, D. (2009). The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate healthcare interventions: Explanation and elaboration. Journal of Clinical Epidemiology, 62(10), e1–e34. https://doi.org/10.1016/j.jclinepi.2009.06.006.

  53. Lund, T. J., & Stains, M. (2015). The importance of context: An exploration of factors influencing the adoption of student-centered teaching among chemistry, biology, and physics faculty. International Journal of STEM Education, 2(1). https://doi.org/10.1186/s40594-015-0026-8.

  54. Machemer, P. L., & Crawford, P. (2007). Student perceptions of active learning in a large cross-disciplinary classroom. Active Learning in Higher Education, 8(1), 9–30. https://doi.org/10.1177/1469787407074008.

  55. Maib, J., Hall, R., Collier, H., & Thomas, M. (2006). A multi-method evaluation of the implementation of a student response system. Paper presented at the 12th Americas’ Conference on Information Systems (AMCIS), Acapulco, Mexico. https://aisel.aisnet.org/amcis2006/27.

  56. McClanahan, E. B., & McClanahan, L. L. (2002). Active learning in a non-majors biology class: Lessons learned. College Teaching, 50(3), 92–96. https://doi.org/10.1080/87567550209595884.

  57. McLoone, S., & Brennan, C. (2015). On the use and evaluation of a smart device student response system in an undergraduate mathematics classroom. AISHE-J: The All Ireland Journal of Teaching and Learning in Higher Education, 7(3). http://ojs.aishe.org/index.php/aishe-j/article/view/243.

  58. Metzger, K. J. (2015). Collaborative teaching practices in undergraduate active learning classrooms: A report of faculty team teaching models and student reflections from two biology courses. Bioscene: Journal of College Biology Teaching, 41(1), 3–9 http://www.acube.org/wp-content/uploads/2017/11/2015_1.pdf.

  59. Mitchell, K. M., & Martin, J. (2018). Gender bias in student evaluations. PS: Political Science & Politics, 51(3), 648–652. https://doi.org/10.1017/S104909651800001X.

  60. Nguyen, K., Husman, J., Borrego, M., Shekhar, P., Prince, M., DeMonbrun, R. M., … Waters, C. (2017). Students’ expectations, types of instruction, and instructor strategies predicting student response to active learning. International Journal of Engineering Education, 33(1(A)), 2–18 http://www.ijee.ie/latestissues/Vol33-1A/02_ijee3363ns.pdf.

  61. Oakley, B. A., Hanna, D. M., Kuzmyn, Z., & Felder, R. M. (2007). Best practices involving teamwork in the classroom: Results from a survey of 6435 engineering student respondents. IEEE Transactions on Education, 50(3), 266–272. https://doi.org/10.1109/TE.2007.901982.

  62. Oliveira, P. C., & Oliveira, C. G. (2014). Integrator element as a promoter of active learning in engineering teaching. European Journal of Engineering Education, 39(2), 201–211. https://doi.org/10.1080/03043797.2013.854318.

  63. Owens, D. C., Sadler, T. D., Barlow, A. T., & Smith-Walters, C. (2020). Student motivation from and resistance to active learning rooted in essential science practices. Research in Science Education, 50(1), 253–277. https://doi.org/10.1007/s11165-017-9688-1.

  64. Parker Siburt, C. J., Bissell, A. N., & Macphail, R. A. (2011). Developing Metacognitive and Problem-Solving Skills through Problem Manipulation. Journal of Chemical Education, 88(11), 1489–1495. https://doi.org/10.1021/ed100891s.

  65. Passow, H. J., & Passow, C. H. (2017). What competencies should undergraduate engineering programs emphasize? A systematic review. Journal of Engineering Education, 106(3), 475–526. https://doi.org/10.1002/jee.20171.

  66. Patrick, L. E., Howell, L. A., & Wischusen, W. (2016). Perceptions of active learning between faculty and undergraduates: Differing views among departments. Journal of STEM Education: Innovations and Research, 17(3), 55 https://www.jstem.org/jstem/index.php/JSTEM/article/view/2121/1776.

  67. Pellegrino, J., DiBello, L., & Brophy, S. (2014). The science and design of assessment in engineering education. In A. Johri, & B. Olds (Eds.), Cambridge handbook of engineering education research, (pp. 571–598). Cambridge University Press. https://doi.org/10.1017/CBO9781139013451.036.

  68. Petticrew, M., & Roberts, H. (2006). Systematic reviews in the social sciences: A practical guide. Blackwell Publishing. https://doi.org/10.1002/9780470754887.

  69. Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, 93, 223–232. https://doi.org/10.1002/j.2168-9830.2004.tb00809.x.

  70. Prince, M., & Felder, R. (2007). The many faces of inductive teaching and learning. Journal of College Science Teaching, 36(5), 14–20.

  71. Ramsier, R. D., Broadway, F. S., Cheung, H. M., Evans, E. A., & Qammar, H. K. (2003). University physics: A hybrid approach. Nashville: Paper presented at the 2003 ASEE Annual Conference and Exposition https://peer.asee.org/11934.

  72. Regev, G., Gause, D. C., & Wegmann, A. (2008). Requirements engineering education in the 21st century, an experiential learning approach. 2008 16th IEEE International Requirements Engineering Conference, Catalunya. https://doi.org/10.1109/RE.2008.28.

  73. Rockland, R., Hirsch, L., Burr-Alexander, L., Carpinelli, J. D., & Kimmel, H. S. (2013). Learning outside the classroom—Flipping an undergraduate circuits analysis course. Atlanta: Paper presented at the 2013 ASEE Annual Conference and Exposition https://peer.asee.org/19868.

  74. Schmidt, J. A., Rosenberg, J. M., & Beymer, P. N. (2018). A person-in-context approach to student engagement in science: Examining learning activities and choice. Journal of Research in Science Teaching, 55(1), 19–43. https://doi.org/10.1002/tea.21409.

  75. Shekhar, P., Borrego, M., DeMonbrun, M., Finelli, C., Crockett, C., & Nguyen, K. (2020). Negative student response to active learning in STEM classrooms: A systematic review of underlying reasons. Journal of College Science Teaching, 49(6) https://www.nsta.org/journal-college-science-teaching/journal-college-science-teaching-julyaugust-2020/negative-student.

  76. Smith, K. A., Sheppard, S. D., Johnson, D. W., & Johnson, R. T. (2005). Pedagogies of engagement: Classroom-based practices. Journal of Engineering Education, 94(1), 87–101. https://doi.org/10.1002/j.2168-9830.2005.tb00831.x.

  77. Stains, M., Harshman, J., Barker, M., Chasteen, S., Cole, R., DeChenne-Peters, S., … Young, A. M. (2018). Anatomy of STEM teaching in North American universities. Science, 359(6383), 1468–1470. https://doi.org/10.1126/science.aap8892.

  78. Stains, M., & Vickrey, T. (2017). Fidelity of implementation: An overlooked yet critical construct to establish effectiveness of evidence-based instructional practices. CBE Life Sciences Education, 16(1). https://doi.org/10.1187/cbe.16-03-0113.

  79. Stump, G. S., Husman, J., & Corby, M. (2014). Engineering students' intelligence beliefs and learning. Journal of Engineering Education, 103(3), 369–387. https://doi.org/10.1002/jee.20051.

  80. Tharayil, S., Borrego, M., Prince, M., Nguyen, K. A., Shekhar, P., Finelli, C. J., & Waters, C. (2018). Strategies to mitigate student resistance to active learning. International Journal of STEM Education, 5(1), 7. https://doi.org/10.1186/s40594-018-0102-y.

  81. Theobald, E. J., Hill, M. J., Tran, E., Agrawal, S., Arroyo, E. N., Behling, S., … Freeman, S. (2020). Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math. Proceedings of the National Academy of Sciences, 117(12), 6476–6483. https://doi.org/10.1073/pnas.1916903117.

  82. Tolman, A., Kremling, J., & Tagg, J. (2016). Why students resist learning: A practical model for understanding and helping students. Stylus Publishing, LLC.

  83. Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Harvard University Press.

  84. Weimer, M. (2002). Learner-centered teaching: Five key changes to practice. Wiley.

  85. Wigfield, A., & Eccles, J. S. (2000). Expectancy–value theory of achievement motivation. Contemporary Educational Psychology, 25(1), 68–81. https://doi.org/10.1006/ceps.1999.1015.

  86. Winkler, I., & Rybnikova, I. (2019). Student resistance in the classroom—Functional-instrumentalist, critical-emancipatory and critical-functional conceptualisations. Higher Education Quarterly, 73(4), 521–538. https://doi.org/10.1111/hequ.12219.

Acknowledgements

We thank our collaborators, Charles Henderson and Michael Prince, for their early contributions to this project, including screening hundreds of abstracts and full papers. Thank you to Adam Papendieck and Katherine Doerr for their feedback on early versions of this manuscript. Finally, thank you to the anonymous reviewers at the International Journal of STEM Education for your constructive feedback.

Funding

This work was supported by the National Science Foundation through grant #1744407. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

Author information

Contributions

All authors contributed to the design and execution of this paper. KN, MB, and CW created the original vision for the paper. RR solicited, downloaded, and catalogued all studies for review. All authors contributed in reviewing and screening hundreds of studies. KN then led the initial analysis and creation of strategy codes. CF reviewed and finalized the analysis. All authors drafted, reviewed, and finalized sections of the paper. KN, MB, MD, and CC led the final review of the paper. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Cynthia J. Finelli.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Nguyen, K.A., Borrego, M., Finelli, C.J. et al. Instructor strategies to aid implementation of active learning: a systematic literature review. IJ STEM Ed 8, 9 (2021). https://doi.org/10.1186/s40594-021-00270-7

Keywords

  • Active learning
  • Systematic review
  • Instructor strategies
  • Student response