A scoping review of literature assessing the impact of the learning assistant model

Abstract

Much of modern education reform is focused on the implementation of evidence-based teaching, but these techniques are sometimes met with trepidation from faculty, due to inexperience or lack of necessary resources. One near-peer teaching model designed to facilitate evidence-based teaching in Science, Technology, Engineering, and Mathematics classrooms is the Learning Assistant (LA) model. Here, we describe the details of the LA model, present a scoping review of literature using the four original goals of the LA model as a framework, and suggest future areas of research that would deepen our understanding of the impact that the LA model may have on education. We summarize how the LA model improves student outcomes and teacher preparation and identify a relative deficiency of literature that addresses how the LA model impacts faculty and departmental/institutional change. Additionally, of the 39 papers reviewed, 11 use strictly pre-experimental study designs, 28 use quasi-experimental designs or a combination of quasi- and pre-experimental designs, and none includes a true experimental design. Thus, we conclude that current studies suggest that the LA model positively impacts education, but more refined assessment would improve our understanding of the model. Furthermore, despite the encouraging research on the impact of the LA model and the proliferation of LA programs at institutions across the world, the study of the LA model has been, for the most part, limited to a small group of education researchers. Therefore, a major objective of this review is to introduce the LA model to a new group of instructors and researchers who can further our understanding of this promising model.

Near-peer instruction and the Learning Assistant model

For decades, near-peer teaching has been implemented to supplement education from faculty instructors (Whitman & Fife, 1988). In the literature, there are many examples of near-peer teaching including peer-assisted learning, team-based learning, peer tutoring, education through student interaction, peer mentoring, supplemental instruction, and peer-led team learning (Evans & Cuffe, 2009; Lockspeiser, O’Sullivan, Teherani, & Muller, 2008; ten Cate & Durning, 2007; Williams & Fowler, 2014). However, the central concept of near-peer teaching is consistent: students helping other students learn. Often the near-peer instructor is a student who has recently passed the course and they interact with students during regular class time, which distinguishes near-peer instruction from small group learning and remedial tutoring models. Importantly, the role of a near-peer instructor is distinct from that of a Teaching Assistant (TA), who may aid instructors in their responsibilities as teachers (i.e., grading, evaluation, preparing assignments). In contrast, near-peer instructors work as aides to students in their responsibilities as learners.

The benefits of near-peer teaching in general have been demonstrated among medical and nursing students, where near-peer instructors create supportive learning environments and improve grades (Evans & Cuffe, 2009; Irvine, Williams, & McKenna, 2018; ten Cate, van de Vorst, & van den Broek, 2012; Williams & Fowler, 2014). Two specific models of near-peer instruction with a record of demonstrated success in a broad range of undergraduate STEM courses are supplemental instruction (Arendale, 1994) and peer-led team learning (Gosser & Roth, 1998). Both programs result in higher mean grades and higher retention or persistence rates (Dawson, van der Meer, Skalicky, & Cowley, 2014; Wilson & Varma-Nelson, 2016).

The Learning Assistant (LA) model is a form of near-peer instruction specifically designed to stimulate instructional change in classrooms and shift attitudes among students, teachers, and administrators to adopt evidence-based teaching methods (Otero, 2015). The near-peer instructors in this model (LAs) encourage active student engagement in classrooms and work with faculty and staff to provide a student-centered learning environment.

There are three hallmarks that distinguish LAs from other near-peer instructors (Fig. 1; Otero, 2015; Talbot, Hartley, Marzetta, & Wee, 2015):

  1. Practice: An LA’s primary role is to interact with students during formal class time to help them better understand course content by guiding students in their own learning process. LA-student interaction can happen in many forums including, but not limited to, lecture, laboratory, or recitations.

  2. Preparation: LAs meet weekly with the course instructor to discuss course content, plan for upcoming lessons, and reflect on activities from previous weeks. This also serves as an opportunity for LAs to provide input on the student perspective to the instructor.

  3. Pedagogy: First-time LAs attend a pedagogy-focused seminar typically staffed by a school of education faculty member. The seminar is an opportunity for LAs to learn about teaching, reflect on their experiences, and get support from fellow LAs when they face challenges with students or their working relationship with instructors.

Fig. 1

The three essential elements of the LA program. Each week LAs meet with the instructional staff for their course (preparation) and in their pedagogy course (pedagogy) to reflect on their experiences with students (practice). LAs use the knowledge they gain in their pedagogy course to inform discussion with faculty during preparation, and in turn, use their experience with faculty to inform discussion with other LAs in their pedagogy class. Finally, the LAs apply what they learn in preparation and in their pedagogy class to their practice with students. Adapted from Otero et al. (2010)

Incorporating LAs into a course gives the instructor a team of teachers to help facilitate a more student-centered learning environment and improves the student-to-teacher ratio. Thus, active learning techniques that are difficult to implement in large classroom settings become more feasible, and students have an additional resource from which to learn.

Initially, the implementation of the LA model was developed with four goals in mind (Otero, 2015; Otero, Pollock, & Finkelstein, 2010): (1) transforming undergraduate Science, Technology, Engineering, and Mathematics (STEM) curriculum, (2) recruiting and preparing future STEM teachers, (3) engaging faculty in discipline-based educational research literature, and (4) changing departmental and institutional culture to value evidence-based teaching.

The model was developed and first implemented at the University of Colorado (CU) Boulder campus in 2001 when Drs. Valerie Otero and Dick McCray introduced LAs in the Astrophysical and Planetary Sciences department. Since then, the program has expanded throughout CU Boulder, and other programs have been introduced at institutions around the world (Otero, 2015; Otero, Finkelstein, McCray, & Pollock, 2006). In 2009, an International Learning Assistant Alliance was established, and as of August 10, 2020, it has 2228 members from 456 institutions, 97 of which report having an LA program (Learning Assistant Alliance, 2020). The growth of these programs has stimulated interest in the model as a topic for study, and the founders of the model were recently recognized by the American Physical Society for excellence in physics education (American Physical Society, 2020). However, to date, a comprehensive review of the literature assessing the model has not been published. Additionally, much of the literature has been published in journals and conference proceedings with an audience that is largely physics education researchers. Since the application of this model is not specific to physics, it is important to disseminate the findings from this research to a broader academic audience.

Methods

Methodological framework

We present a scoping review (Arksey & O’Malley, 2005) to analyze literature on the LA model, to summarize and disseminate a broad selection of literature, and to identify areas that have not been addressed. To date, there is no existing review of literature focusing on LAs; our scoping review aims to comprehensively analyze literature using rigorous and transparent methods to present all relevant literature in this topic area. We present articles with a range of study designs and methodologies, rather than focusing on selected studies and assessment of quality and bias, as would be found in a systematic review (Campbell Collaboration, 2020).

There is no standard methodology for scoping reviews and continued debate and discussion about optimizing protocols to improve their usefulness and rigor are encouraged (Levac, Colquhoun, & O’Brien, 2010; Pham et al., 2014). Here, we relied on an established protocol with five key phases: (1) identifying the research question, (2) identifying relevant studies, (3) study selection, (4) charting the data, and (5) collating, summarizing and reporting the results (Arksey & O’Malley, 2005). Additionally, there is an optional consultation phase we chose not to implement.

A common criticism of the Arksey & O’Malley protocol is the lack of quality assessment for included articles, and more recent publications have argued that this should be an essential step (Daudt, Van Mossel, & Scott, 2013; Levac et al., 2010). However, because literature on the LA model is scarce in some areas of interest and our review includes publications with diverse methodologies, quality assessment is difficult and potentially limiting. Thus, instead of excluding articles based on a standard of rigor, we provide information on the study designs of each article we reviewed and encourage our readers to make their own quality assessment based on that information (Table 1). The only level of quality assessment we did use during study selection was to ensure that each article was peer-reviewed. Even though many of the articles included in this study are published in either the Physics Education Research Conference Proceedings or American Institute of Physics Conference Proceedings, both publications have rigorous and transparent peer-review processes (AIP Conf. Proc., 2020; PER Central, 2020).

Table 1 A summary of studies referenced in this paper that assess one or more of the original goals of the LA model (see text for details of goals)

Identifying the research question

This review was guided by the goals of the LA model described by Otero et al. (2010) and Otero (2015). We thus ask the question “Does implementation of the LA model improve undergraduate courses and curricula, facilitate teacher recruitment and preparation, encourage faculty to study discipline-based education research, and promote departmental and institutional change?”

Identifying relevant studies

Articles for this review were obtained from four sources: a search of “learning assistant” in the databases (1) “Education Database” (ProQuest) and (2) “Academic Search Premier” (EBSCO), and (3) a list of published articles that cited Otero et al. (2006) and/or (4) Otero et al. (2010) generated by Google Scholar. All of the searches were performed in January 2020 and resulted in a combined total of 722 articles.

Study selection and charting

The first author screened these articles to determine whether they were unique (i.e., not appearing in more than one of our sources) primary studies that used LAs or the LA model as part of an educational intervention. Studies that did not meet these criteria were excluded; 80 studies are included in this review (Tables 1, 2, and 3).

Table 2 A summary of studies cited in this article that reference LAs, but do not assess any of the original goals
Table 3 Studies that use the LA model, but as only a part of or in addition to another intervention

After summarizing the findings from each article, the first author subdivided the studies into three categories: (1) those that addressed one or more of the four original goals of the LA model (n = 39), (2) those that did not address any of those goals (n = 9), and (3) studies with interventions that included the LA model, but it was not the main focus of the study (n = 32). Since our research question focuses on the four original LA model goals, the results of those 39 articles are discussed in detail, and the other 41 are summarized briefly. A summary of our search procedure and inclusion criteria is in Fig. 2.

Fig. 2

Flow diagram of article selection and inclusion criteria

We also identified whether the studies described in our 39 reviewed articles use a true experimental, quasi-experimental, or pre-experimental design (Martella, Nelson, Morgan, & Marchand-Martella, 2013). Briefly, the qualifications for a true experimental design were random selection of participants, random assignment of participants to experimental and control groups, and equal treatment of participants except in relation to the independent variable of interest. Quasi-experimental design includes an experimental and control group, but participant selection and assignment are not random. Lastly, pre-experimental design describes a study that does not include a control. Categorizing the reviewed studies in this way should help discern the extent to which conclusions about causal relationships between the LA model and desired outcomes can be made. The experimental designs used in our reviewed articles are included in Table 1.

Inclusion and exclusion transparency

The second author of this manuscript has professional appointments that may result in them benefitting from the success of the LA program at Boston University and the LA Alliance. Thus, the first author was assigned the task of determining inclusion and exclusion criteria.

Results

Goal 1: Improve undergraduate course and curriculum transformation

For over a decade, researchers have aimed to understand how the LA model influences STEM education at the undergraduate level. The intended effect of course transformation with the LA model is to improve the student experience and outcomes. Therefore, we will assess (1) students’ attitudes toward science and satisfaction with science classes, (2) student retention in STEM majors and the combined rates of students earning a D or F or withdrawing (DFW) from a STEM course, (3) student learning gains and performance, and (4) student identity and perceived skills gained.

Attitudes toward science and satisfaction with science classes

Students’ attitudes and satisfaction with their classes are correlated with performance in STEM classes and retention in STEM majors (Bok, 2008; Docktor & Mestre, 2014; Halloun, 1996; House, 1994; Osborne, Simon, & Collins, 2003). Unfortunately, some STEM courses, especially introductory physics courses, are associated with negative attitudinal shifts (Adams et al., 2006; Redish, Saul, & Steinberg, 1998). Thus, researchers have sought to adapt courses to foster more positive attitudinal shifts among students, with some success (Brewe, Kramer, & O’Brien, 2009; Otero & Gray, 2008). Of the four studies reviewed that analyze the impact of LAs on student satisfaction and/or attitudes, three demonstrate evidence of a positive impact, while the fourth suggests no significant association with improved overall course satisfaction.

Students report that LAs made class more engaging, interactive, and personal, and helped them better understand concepts. In a survey distributed to undergraduate students in large-enrollment introductory biology and chemistry classes, the majority of respondents (≈58%) reported using their LAs during class at least once a month, and close to two thirds of that population sought help from LAs during class more than once a month. Additionally, nearly 70% of students either “agree” or “strongly agree” that LAs helped them learn, increased their overall satisfaction with the course, and increased their satisfaction with the teaching of their course (Talbot et al., 2015).

Additional studies corroborate these findings. Survey responses from 387 students in LA-supported STEM courses revealed that LAs encourage thinking and participation in class and increase their appreciation for course material (Schick, 2018). At a different institution, 227 students in an LA-supported chemistry course were surveyed and respondents agree that the course is better suited for learning (≈90%) and they are more motivated (≈65%) and enjoy the course (≈80%) more than in courses without LAs. Additionally, students agree that in LA-supported courses, they interact more with their peers (≈90%) and concepts are better connected (≈75%), which could explain why LAs increase enjoyment, understanding, and appreciation for course material (Kiste, Scott, Bukenberger, Markmann, & Moore, 2017).

Understanding student motivation is important when considering the potential impact of LAs. Survey responses (n = 622) revealed that students report high satisfaction with their LAs. However, the LAs had an insignificant effect on overall course satisfaction, and course satisfaction was the strongest predictor of final grade. Evidence from student focus groups and interviews with their LAs suggests that students in the course are primarily concerned with their final grade, but LAs are focused on learning for understanding. Thus, the lack of a significant relationship between course satisfaction and LAs may be due to students not recognizing the LAs as a source to improve their grade. This idea is further supported by a student who pointed out that exam grades carry the majority of the weight for their final grade, and exams are individual assignments where LAs have little influence (Thompson & Garik, 2015).

Others have explored the effect of the LA experience on the LAs themselves. Using the Colorado Learning Attitudes about Science Survey (CLASS; Adams et al., 2006), researchers assessed attitudinal shifts in two physics courses during one semester. They found that LAs had positive shifts in their attitudes about learning physics and their overall interest in physics, but non-LAs had negative attitudinal shifts (Otero et al., 2010). A limitation of this study is that responses from only six participants were analyzed; thus, this is an area for future work.

Retention in STEM majors and DFW rates in STEM courses

High DFW rates are commonly associated with large introductory or “gateway” courses with hundreds of students (Webb, Stade, & Grover, 2014). Although these courses seem cost-effective because of the high student-to-teacher ratio, failure in these courses can drive STEM majors to switch majors or even drop out of school, which ultimately results in funds lost (Crisp, Nora, & Taggart, 2009). This high student-to-teacher ratio creates an impersonal environment and makes it difficult to incorporate evidence-based teaching (Cuseo, 2007; Geske, 1992). For example, collaborative learning is difficult to implement in a large lecture hall with stadium seating and one instructor to mediate discussion in hundreds of small student groups; however, the use of collaborative learning in first year undergraduate courses is positively associated with persistence to the second year of college (Loes, An, Saichaie, & Pascarella, 2017). Here, we summarize three studies that demonstrate improved DFW rates for students in LA-supported classes.

A logistic regression analysis found that students who were enrolled in at least one LA-supported STEM gateway course (n = 3696) experienced a 4–15% lower probability of failing or withdrawing from introductory physics courses (Physics I and II) compared to students who were not enrolled in any LA-supported courses (n = 1245). Additionally, this study suggests that the impact on DFW rates was larger among female students, first-generation college students, and students with average high school GPAs (Alzen, Langdon, & Otero, 2017).
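The logistic-regression approach described above can be illustrated with a small simulation. This is a hedged sketch, not the authors' actual analysis: the variable names (`la`, `gpa`) and effect sizes are hypothetical, and the model is fit with plain Newton-Raphson so only NumPy is required.

```python
import numpy as np

# Hypothetical setup: dfw = 1 if the student earned a D/F or withdrew;
# la = 1 if they took at least one LA-supported course; gpa is a
# standardized prior-achievement control.
rng = np.random.default_rng(0)
n = 5000
la = rng.integers(0, 2, n)
gpa = rng.normal(0.0, 1.0, n)
true_logit = -1.0 - 0.6 * la - 0.8 * gpa   # LA support lowers DFW log-odds
dfw = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

# Fit the logistic regression by Newton-Raphson (no stats package needed)
X = np.column_stack([np.ones(n), la, gpa])
beta = np.zeros(3)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))            # predicted DFW probabilities
    grad = X.T @ (dfw - p)                     # gradient of the log-likelihood
    hess = (X * (p * (1 - p))[:, None]).T @ X  # observed information matrix
    beta += np.linalg.solve(hess, grad)

print(beta)  # beta[1], the LA coefficient, should come out negative
```

Interpreting the fitted coefficients on the probability scale (as the 4–15% figures above do) requires converting log-odds to predicted probabilities for representative students.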

A follow-up study from the same researchers explored DFW rates in Physics, General Chemistry I and II, Calculus I and II, and Calculus I and II for Engineers. In total, the dataset included information for 32,071 unique students, 23,074 of whom enrolled in at least one of the above courses with LA support. Here, the authors report a 6% reduction in failure rate for students with LA support in STEM gateway courses, and in contrast to their previous findings, regression analysis demonstrated that exposure to LA support had a larger effect on male students than on female students (Alzen, Langdon, & Otero, 2018). One factor that may account for the contrasting results is that the use of LAs varies among the departments involved in the study.

Alzen et al. (2018) do not present a true experimental design and therefore cannot be used to make causal claims. However, their analytic approach controlled for high school GPA, standardized admissions test scores, and standardized credits at entry to account for differences in prior aptitude; the year of matriculation and the year in which students were enrolled in each gateway course also varied between cohorts. This limits several threats to validity typically associated with quasi-experimental design. Thus, the observations from this study serve as some of the most compelling evidence that the LA model has a causal effect on student outcomes.

Further work has elucidated how the LA model impacts different populations of students. This is especially important for students that face systemic inequities. For example, a study conducted on 2312 students in introductory physics courses from Fall 2012 to Spring 2019 demonstrated that students as a whole had lower average DFW rates in LA-supported sections, but the student demographics with the largest changes in DFW rates were non-first generation men and women of color. Additionally, among first-generation students, men and women of color show the largest differences in DFW rates when comparing LA-supported and traditional sections (Van Dusen & Nissen, 2020). Thus, LAs may be having an even stronger impact on students who are underrepresented in STEM fields.

Outcomes and performance

In this section, we review 13 studies that aimed to assess whether the LA model can improve students’ conceptual understanding. The majority of these studies incorporate the use of concept inventories, which are criterion-referenced tests that assess the accuracy of students’ understanding of a specific set of concepts. These are especially prevalent in physics education research where the Force Concept Inventory (FCI; Hestenes, Wells, & Swackhamer, 1992), Force and Motion Conceptual Evaluation (FMCE; Thornton & Sokoloff, 1998), Brief Electricity and Magnetism Assessment (BEMA; Ding, Chabay, Sherwood, & Beichner, 2006), and Conceptual Survey of Electricity and Magnetism (CSEM; Maloney, O’Kuma, Hieggelke, & Van Heuvelen, 2001) are all established tools. In general, studies suggest that LA support improves student learning gains as measured by concept inventories and performance on higher-order assessment, and LAs have much deeper content knowledge than their peers.

Average normalized learning gains on the FMCE and BEMA for LA-supported introductory physics courses ranged from 44 to 66%, which is 2–3 times higher than national averages observed in traditional courses (Kohlmyer et al., 2009; Otero et al., 2006, 2010). In a separate study, researchers compared student learning gains on the FMCE before and after LA implementation. Before LAs, students (n = 263) averaged a normalized learning gain of 32.4%, and after LAs were added (n = 462), the average increased slightly to 35.8%. However, when they controlled for the instructor of record, those numbers changed to 32% and 47%, respectively (Miller, Carver, Shinde, Ratcliff, & Murphy, 2013). This indicates that although LAs may impact student learning, there are other factors that could mute those positive effects.
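The normalized learning gain used in these comparisons is Hake's gain: the fraction of the possible improvement a class actually achieves between pre- and post-test. A minimal sketch, using hypothetical class averages:

```python
def normalized_gain(pre, post, max_score=100):
    """Hake's normalized gain <g>: the achieved improvement divided by
    the improvement that was possible given the pre-test score."""
    return (post - pre) / (max_score - pre)

# A hypothetical class averaging 40% before and 70% after instruction
# captures half of its possible improvement:
print(normalized_gain(40, 70))  # → 0.5
```

Note that a gain computed from class-average scores (as here) can differ slightly from the average of individual students' gains; studies vary in which convention they report.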

Importantly, these findings extend beyond physics. Using the Conceptual Inventory of Natural Selection (Anderson, Fisher, & Norman, 2002), researchers measured learning gains for students in General Biology II with and without an LA. When compared to previously published results (Andrews, Leonard, Colgrove, & Kalinowski, 2011), effect sizes for students in a non-LA course were at the bottom end of the published range, and students in the LA-supported course were at the top (Talbot et al., 2015).

More in-depth studies with larger sample sizes were made possible with the generation of the Learning about STEM Student Outcomes (LASSO) online platform, which gives researchers the ability to request data about students outside of their home institution and in diverse classroom settings. To make meaningful statistical comparisons with the nested data from LASSO, researchers generated hierarchical linear models (HLM), which fit a unique equation for each classroom and combine those fits into an effect estimate across all classrooms assessed, allowing researchers to correlate student outcomes with other factors (Nissen, Donatello, & Van Dusen, 2019; Van Dusen & Nissen, 2019).
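The nested-data idea behind HLM can be sketched as a two-pass "intercepts-as-outcomes" procedure: fit a regression within each classroom, then relate the classroom-level estimates to a classroom-level predictor. All names and effect sizes below are hypothetical, and real HLM software estimates both levels jointly rather than in two passes; this sketch only shows why the nesting matters.

```python
import numpy as np

rng = np.random.default_rng(1)
intercepts, la_hours_all = [], []
for c in range(30):                          # 30 simulated classrooms
    la_hours = rng.uniform(0, 1)             # classroom-level LA contact (hypothetical)
    class_effect = 6.0 * la_hours + rng.normal(0, 1.5)
    pre = rng.normal(50, 10, 40)             # 40 students per classroom
    post = 10 + class_effect + 0.8 * pre + rng.normal(0, 5, 40)
    # Level 1: within-classroom regression post ~ centered pre, so the
    # intercept b0 is that classroom's adjusted mean post-test score.
    A = np.column_stack([np.ones(40), pre - pre.mean()])
    b0, b1 = np.linalg.lstsq(A, post, rcond=None)[0]
    intercepts.append(b0)
    la_hours_all.append(la_hours)

# Level 2: do classrooms with more LA contact have higher intercepts?
B = np.column_stack([np.ones(30), la_hours_all])
g0, g1 = np.linalg.lstsq(B, intercepts, rcond=None)[0]
print(g1)  # estimated classroom-level LA effect (true simulated value: 6)
```

Treating the 1200 simulated students as independent observations would understate the uncertainty in `g1`, which is the statistical motivation for HLM with data like LASSO's.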

One such study analyzed 3315 unique concept inventory scores from 17 courses in 13 institutions; the courses were all STEM courses, but varied in terms of their discipline. Thus, a host of different concept inventories was distributed to test students on appropriate content. They found that gender, race, time spent working with LAs, and instructors’ experiences with LAs all had significant correlations with student outcomes. Male students had higher effect sizes than female students, and Black students had higher average effect sizes than their White and Asian peers. So, although the traditional gender-based learning gap does not appear to be closed by LA support, underrepresented racial minority students may disproportionately benefit. Additionally, the average effect size of students who spent 16–30 min/week interacting with LAs was more than double that of students who spent 0 min/week interacting with LAs (Van Dusen, Langdon, & Otero, 2015).
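The effect sizes compared across demographics in studies like this are typically standardized mean differences such as Cohen's d. As a point of reference (the specific numbers below are hypothetical), a minimal stdlib-only computation:

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d: difference in group means divided by the pooled SD."""
    na, nb = len(group_a), len(group_b)
    ma = sum(group_a) / na
    mb = sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled_sd

# Hypothetical post-test gains for two small groups of students:
print(cohens_d([5, 6, 7, 8], [3, 4, 5, 6]))  # → 1.549...
```

Expressing differences in SD units is what lets these studies compare outcomes across courses that used entirely different concept inventories.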

Another study focused on only physics students, but still maintained a large sample size (n = 2868) and analyzed pre- and post-test scores on the FCI, FMCE, and CSEM from students in 67 classes from 16 institutions. For this study, they compared culturally “dominant” demographics (White or Asian, non-Hispanic/Latino, male students; n = 1304) to “non-dominant” populations (n = 1564; Estrada et al., 2016). LA support was associated with removal, and in some cases, reversal of traditional learning gaps in physics. Using data from all three concept inventories, the learning gap was significantly negative (i.e., dominant students outperformed their non-dominant peers) in courses without LAs, and in courses with LA support, the learning gap was significantly positive (Van Dusen, White, & Roualdes, 2016). An important caveat to this study is that HLM was not used. Thus, the researchers published a follow-up analysis that included HLM. Here, they found that LA support is meaningfully associated with improvement in overall student performance. However, LA support did not eliminate the learning gaps between dominant and non-dominant student demographics. Their model predicts that students from dominant and non-dominant genders who begin the class with the same pre-test scores will have a difference in posttest scores of 3.5%, and a similar gap (4.1%) emerges between students from dominant and non-dominant races/ethnicities. Additionally, there appears to be a compound effect, as students with non-dominant genders and races/ethnicities will score 7.6% lower than dominant peers who score equivalently on their pre-test. Predicted learning gaps in LA-supported courses are not significantly different (Van Dusen & Nissen, 2017).

Different implementations of LAs have different effects on student outcomes in introductory physics. Paired student concept inventory scores (n = 3753) were collected over three semesters from 69 courses offered at a total of 17 institutions. Using combined results from the FCI, FMCE, CSEM, and BEMA, researchers found that LA support in a laboratory setting was associated with a 1.9 times higher effect size than non-LA classes, which was significantly higher than the difference observed in courses where there is LA support in lecture (1.4 times higher), recitation (1.5 times higher), and “unknown” (1.3 times higher). The best practice for LAs is likely dependent on a number of variables, but this study does raise some interesting questions regarding the efficacy of LAs in different settings (White, Van Dusen, & Roualdes, 2016).

One of the challenges of understanding how LAs impact student outcomes is that incorporation of LAs facilitates the use of other research-based teaching methods. Thus, determining whether outcomes should be attributed to the implementation of the LA model or another factor is difficult. Therefore, researchers analyzed learning gains measured by the FMCE and FCI in first-semester physics courses taught in three styles: lecture-based instruction (18 courses, 791 students), collaborative instruction alone (24 courses, 1068 students), and collaborative instruction with LAs (70 courses, 4100 students). Results align with previous findings that collaborative learning correlates with higher learning gains than traditional lecture-based courses (Hake, 1998). However, their model shows that collaborative learning alone results in post-semester scores 1.07 times higher than traditional courses, and collaborative learning with LA support is associated with a 1.14 times higher average score. There is significant variation depending on LA usage (1.12 times higher in lecture vs 1.3 times higher in lab), but all gains are larger than with collaborative learning alone (Herrera, Nissen, & Van Dusen, 2018).

Beyond learning gains measured by concept inventories, LA support influences student performance on higher-order assessments. Prior to implementation of the LA model, an introductory molecular biology course had been transformed to a highly structured, flipped classroom (Lage, Platt, & Treglia, 2000). This study demonstrated that LA-supported students in a flipped classroom (n = 411) did not have significantly better learning gains than the unsupported, flipped-classroom cohort (n = 97) on an adapted concept inventory. However, LA-supported students did perform better on exam questions that require higher-order cognitive skills, and this improvement was greater among underrepresented minority students (Sellami, Shaked, Laski, Eagan, & Sanders, 2017).

Using qualitative analysis, researchers observed how LAs impact the types of discussions students have while answering in-class clicker questions (Caldwell, 2007). They found that when students interacted with LAs, they spent significantly more time in discussion and the percentage of their discussion that was productive increased. Additionally, students interacting with LAs were more likely to request feedback and reasoning and less likely to request information from instructors. However, the LAs’ technique had a significant impact on the student discussion. For example, when LAs asked a prompting question, provided a background statement, or requested information, it was more likely to facilitate discussion among students, but when LAs explained their own reasoning for an answer, they were less likely to elicit student reasoning (Knight, Wise, Rentsch, & Furtak, 2015). Since peer discussion is a critical part of the benefits that come with in-class clicker questions (Smith et al., 2009; Smith, Wood, Krauter, & Knight, 2011), the authors of this study urge instructors and LAs to ask questions of students to promote discussion rather than provide explanations.

In addition to learning gains for students in LA-supported courses, LAs themselves demonstrate cognitive gains. One study reported that LAs display content knowledge comparable to physics graduate students (Otero et al., 2010). Additionally, LAs have larger learning gains than students who taught in another near-peer learning program or participated in undergraduate research. Methodological details for this study are scant, but the authors report that LAs posted significantly better normalized learning gains than the cohort to which they were compared (Price & Finkelstein, 2008). Furthermore, the pre- and posttest scores on the Introductory Molecular and Cell Biology Assessment for LAs and TAs were averaged together, and those scores were significantly higher than those of undergraduate students with no LA experience. However, their posttest scores were still significantly lower than those of Biology “experts” (Shi et al., 2010).

Identity and perceived skill gains

For a deeper look at how the LA experience affects the LAs, Close et al. (2016, 2013) explored how LAs develop a strong “physics identity”—that is, thinking of oneself as a physicist rather than as a student who is taking a physics course. The physics identity framework is dependent on personal interest, performance or competence, and recognition by others. Regression analysis suggests that physics identity is a strong predictor of whether students pursue careers in physics (Hazari, Sonnert, Sadler, & Shanahan, 2010; Lock, Hazari, & Potvin, 2013). To determine whether being an LA contributed to the development of a strong physics identity, researchers performed a multi-layered qualitative analysis of LAs. First, researchers analyzed over 180 written reflections from 61 unique first-semester LAs over five semesters. A subset of these LAs (n = 29) reapplied to the program, and the responses on those applications were used as a second source of qualitative data. A third source of qualitative data was obtained by interviewing another subset of LAs (n = 12), probing both self-perceptions and practice. Their analysis suggests that participating as an LA results in more comfort interacting with peers, near peers, and faculty, and that this comfort contributes to the development of a stronger physics identity (Close et al., 2016; Close, Close, & Donnelly, 2013).

There is also evidence that the LA experience shapes professional identities beyond physics. Twenty STEM majors hired as LAs were surveyed to better understand their professional identities. LA responses were analyzed using the self-authorship framework, which posits that as people engage in challenging experiences, they come to rely on internal references rather than external cues in their identity expression (Baxter Magolda, 2009). In describing their work, 60% of LAs used language indicative of a mastery approach, positioning themselves as a “collaborator” instead of a “follower”. Furthermore, 65% of LAs used internal cues when discussing professional interactions, and more experienced LAs were more likely to express more advanced professional identities than first-semester LAs (Nadelson & Finnegan, 2014).

Goal 2: Teacher recruitment and preparation

The USA is facing a shortage of quality K-12 math and science teachers (García & Weiss, 2019; Hill & Gruber, 2011). One of the major objectives of the LA program was to address this growing problem by providing undergraduates with an easy mechanism to explore teaching as a potential career path. However, few studies have addressed whether the LA model promotes K-12 teacher recruitment. Before the LA program at CU Boulder, an average of less than one physics/astrophysics major enrolled in the teaching certification program per year; in academic year 2007/2008 (5 years after the first cohort of LAs), 13 physics/astrophysics majors enrolled in the teaching certification program. By Fall of 2009, 10 physics/astrophysics majors who were former LAs were in-practice teachers and 6 more were enrolled in teacher certification programs (Otero et al., 2006, 2010). Beyond that, it is unclear how the LA experience influences teacher recruitment, and this is an area for future research.

The second piece of this goal, improving teacher preparation, has been studied more extensively. First, we will discuss studies that focus on the teaching and learning skills that LAs gain. Then, we will describe studies that analyzed teacher practice and how instruction varied between in-service teachers who formerly served as LAs and those who did not. Importantly, the participants in these studies were certified through the same teaching program, so the non-LA group is a reasonably matched comparison group.

Knowledge and understanding of teaching and learning among LAs

At the end of their first semester, it is not uncommon for LAs to still feel uneasy about teaching and learning. Analysis of interview data from physics LAs revealed that experienced LAs reflect on their own learning and express a refined understanding of competence, one that moves away from a “correct answer” mindset and towards the idea that “it’s okay to be wrong”. Novice LAs, in contrast, are more focused on teaching and connect competence to being able to remember answers (Conn, Close, & Close, 2014). This suggests that students continue to grow and benefit from the LA experience after their first semester.

Improving pedagogical education for LAs has also been a point of emphasis. For example, using the theoretical framework of “sense-making,” the process by which people rationalize their actions based on collective experience (Weick, 1995), it has been argued that LAs need to understand how a teaching strategy fits with their current ideas in order to implement it. The authors demonstrated that LAs who engage in discussion about which techniques fit with their existing ideas about good teaching are more engaged and are better at identifying “responsiveness” as an attribute of quality instruction (Robertson & Richards, 2017). Others highlight the importance of language when introducing pedagogical concepts to LAs. In a study of 304 first-semester LAs’ teaching reflections, researchers found that LAs most often discuss student ideas and that such discussion increases by the end of the semester. However, in one semester, the discussion of mental models (Redish, 1994) was left out of the curriculum for the LA pedagogy course, and in that semester the least growth was observed. The authors suggest that the term “mental model” resonates with LAs because, as science students, they are familiar with using models to learn complex topics (Top, Schoonraad, & Otero, 2018). Thus, building on LAs’ pre-existing knowledge could be beneficial for developing a strong understanding of pedagogy.

In a pedagogy course specifically aimed at training LAs for undergraduate engineering design courses, 13 LAs responded to a survey asking them to rate the productivity of the topics covered in the course and of the lessons used to teach those topics. According to LAs, the most productive topic was “convergent/divergent thinking”, and others rated highly were “tinkering, making, & fun” (Quan & Gupta, 2020), “design thinking” (Brown, 2008), and “proudness” (Little, 2015). Among the most productive lessons for LAs were “classroom role play”, “watching and discussing video”, “final poster”, and “roses & thorns” (Quan, Turpen, Gupta, & Tanu, 2017). One common theme among these lesson designs is that they require reflection on the part of the LA, which is a central focus of the pedagogy course in the LA model.

LAs (n = 55) and faculty (n = 16) were surveyed to assess whether a new LA program improved active learning, effectively trained LAs, and adequately supported faculty. Results suggested that LAs promote active learning, and both faculty and LAs perceived an improvement in collaborative learning due to LAs. However, faculty and LAs felt that LA training was only somewhat adequate, and many participants did not sufficiently explain how LAs align with course learning objectives (Campbell, Malcos, & Bortiatynski, 2019). Thus, improvements to training and course transformation could elucidate further pedagogical advantages that LAs bring to a classroom.

Within a thermodynamics course required for mechanical engineering students, researchers observed the LA pedagogy seminar and coded LA interview responses to determine what LAs “notice” about the course. One thing LAs noticed was a lack of metacognitive ability among their students: LAs reported that students often misdiagnosed their own understanding and had difficulty addressing their misconceptions about the first law of thermodynamics. Additionally, LAs began to notice systemic inequities and suggested opportunities for more inclusive teaching. They recognized that during collaborative learning, some groups were dominated by a small percentage of students who were more confident in their knowledge, and the LAs suggested that more diverse representation on the instructional staff would encourage students other than the “white male nerds from high school” (a direct quote from an LA) to feel immediately comfortable in an engineering program (Wendell, Matson, Gallegos, & Chiesa, 2019).

Teaching practice of former LAs

In a qualitative study, researchers analyzed interview responses from 10 first year middle and high school STEM teachers. Non-LAs were more likely to express discomfort with incorporating group work into their teaching and to talk about group work as a necessity due to a lack of resources. Additionally, some of the non-LAs mentioned concerns with student behavior during group work or feeling that they lack the skills or knowledge required to create assignments for group work. Both former LAs and non-LAs recognize that group work provides the opportunity for students to build important skills, but only former LAs mentioned that students can improve their argumentation and justification skills (Gray & Otero, 2009).

In a more in-depth analysis of 14 first year teachers, researchers combined data from interviews, classroom observations, artifact packages, and observations made with the Reformed Teacher Observation Protocol (RTOP) to compare teaching practices of former LAs and non-LAs. The RTOP is made up of 25 statements that cover lesson design and implementation, content (propositional and procedural knowledge), and classroom culture (interactions and relationships). Respondents rate each statement on a scale from 0 to 4, with 4 being the most in line with national standards for teaching (Sawada et al., 2002). In this study, each subject was observed at least twice by multiple observers, for a total of 19 former-LA observations and 20 non-LA observations. Former LAs outperformed non-LAs on the content and classroom culture sections of the RTOP, which demonstrates that former LAs’ teaching practice tends to be more aligned with national standards and research on teaching. Specifically, former LAs present course content in a more organized way that students can better relate to, and they encourage students to challenge ideas and generate alternative solutions (Gray, Webb, & Otero, 2010).

To expand on those findings, researchers performed 178 observations of 29 math and science teachers with 0–4 years of experience. Consistent with previous results, former LAs scored better on the RTOP, on average, than non-LAs in both math and science. Additionally, 24 participants (12 former LAs and 12 non-LAs) were interviewed to better understand the goals of assessment in their classrooms. The majority of both former LAs and non-LAs reported using assessment primarily to inform instruction. However, only non-LAs used assessment to evaluate learning, and only former LAs used assessment to inform students about their own understanding (Gray, Webb, & Otero, 2011). These results suggest that former LAs are more likely to utilize formative assessment, while some non-LAs rely solely on more traditional summative assessment (Wolfe, 1999).

In the most comprehensive study with this focus, researchers completed 178 observations of 29 middle and high school science and math teachers over the course of 5 years. Consistent with previous results (Gray et al., 2010; Gray et al., 2011), former LAs had higher RTOP scores on average and performed significantly better in nearly every subcategory of the RTOP. The difference in RTOP scores was largest among first-year teachers. Importantly, former LAs more commonly received ratings of 3 and 4, and non-LAs more commonly received ratings of 0 and 1, meaning that non-LAs more often did not implement the teaching practices described in the RTOP and, when they did, implemented them poorly or incorrectly (Gray, Webb, & Otero, 2016). Together, this series of studies strongly suggests that the LA experience has a longitudinal impact on K-12 teachers and serves as a valuable supplement to traditional teacher certification programs.

Similar results were observed using the Scoop Notebook to assess teachers’ use of reform-oriented practices. The Scoop Notebook is used to collect data about classroom instruction without the labor and cost demands of typical classroom observations (Borko, Stecher, & Kuffner, 2007). The study included 19 middle and high school science and math teachers, 11 of whom were former LAs. Former LAs scored significantly better in the categories of “Grouping”, “Discourse Community”, and “Explain & Justify”, concepts that link to lessons in the LA pedagogy course and to common activities in weekly prep sessions with faculty (Barr, Ross, & Otero, 2012).

Goal 3: Discipline-based educational research

We expect tenure-track faculty to adopt cutting-edge research methods, but adopting cutting-edge teaching methods seems to be less of a priority for STEM faculty. For example, despite overwhelming evidence that active learning is a superior teaching method (Freeman et al., 2014), many STEM faculty continue to use traditional, less effective styles (Handelsman et al., 2004; Stains et al., 2018; Vickrey, Rosploch, Rahmanian, Pilarz, & Stains, 2015). Sometimes faculty have difficulty incorporating active learning techniques due to circumstance (e.g., large class sizes, restrictive classrooms) or a lack of information about how to implement them. However, there are some holdouts who still defend traditional methods, and others who agree that active learning techniques are beneficial but are not appropriately motivated to practice them (Handelsman et al., 2004). For these reasons, the LA model was designed to motivate STEM faculty to implement evidence-based teaching methods.

Two studies present quantitative analyses using LASSO and HLM that suggest that the LA model does influence faculty practice. First, student learning gains (n = 3315) increased in courses led by faculty (n = 17) who had more experience teaching with LAs: post-semester concept inventory scores were significantly higher for each semester of experience an instructor had with LAs, up to 6 semesters (experience beyond that was not measured; Van Dusen et al., 2015). Second, analysis of 4365 pre- and post-semester concept inventory (FCI or FMCE) scores collected over 3 years revealed that learning gains steadily decreased for faculty without LAs, and that LA support remediated that decline. When instructors have 6 terms of experience teaching their respective courses, students in LA-supported courses are predicted to outperform those in non-LA courses by 10.3% on their post-semester concept inventory. Given that the average raw learning gain for students in non-LA classes was approximately 20%, instructors who teach without LAs are losing approximately half of the predicted student learning after 6 semesters (Caravez, De La Torre, Nissen, & Van Dusen, 2017). Thus, faculty may be able to improve their teaching skills by working with LAs and teaching in LA-supported courses.
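The “approximately half” claim is simply the ratio of the two reported percentages. As a rough illustration (the figures below are the percentages quoted above, not the authors’ raw data, and the variable names and helper function are ours, not part of the original HLM analysis), the arithmetic can be sketched as:

```python
# Back-of-the-envelope check of the "approximately half" claim.
# Values are percentages reported in the text (Caravez et al., 2017);
# this is an illustration, not the authors' statistical model.

def normalized_gain(pre: float, post: float) -> float:
    """Hake's normalized gain: fraction of the possible pre-to-post
    improvement actually achieved (scores as percent-correct)."""
    return (post - pre) / (100.0 - pre)

predicted_la_advantage = 10.3  # predicted post-test advantage after 6 terms (points)
avg_raw_gain_non_la = 20.0     # approximate raw gain in non-LA classes (points)

# The predicted LA advantage is roughly half of the baseline raw gain:
ratio = predicted_la_advantage / avg_raw_gain_non_la
print(f"LA advantage / non-LA raw gain = {ratio:.2f}")  # ≈ 0.52

# Example of the normalized-gain metric used throughout this literature:
print(f"g for 40% -> 60%: {normalized_gain(40.0, 60.0):.2f}")  # ≈ 0.33
```

The normalized-gain metric matters here because raw gains depend on pretest scores; the studies above compare courses on concept inventory outcomes precisely to control for such differences.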

Evidence from a case study offers some explanation for how student-faculty collaboration impacts course effectiveness, as observed in the two studies summarized in the previous paragraph (Caravez et al., 2017; Van Dusen et al., 2015). A team composed of two faculty members and three LAs was formed and, under the guidance of two pedagogy coaches, was tasked with redesigning courses and monitoring the progress of those courses during the semester. The most prevalent theme among interview responses from LAs and faculty was “expanded conceptions”: both LAs and faculty reported a broadening of their conceptions about teaching and learning, and the faculty in particular reported an increased awareness of new course design strategies they believed to be successful. Thus, the authors conclude that LAs and pedagogy coaches improve faculty understanding of discipline-based education research and that an LA program can expand conceptions about teaching and learning (McHenry, Martin, Castaldo, & Ziegenfuss, 2009).

Further analysis of transcripts from one-on-one interviews with five LAs and seven faculty members yielded three distinct frameworks for faculty-LA partnerships: (1) mentor-mentee, (2) faculty-driven collaboration, and (3) collaborative partnership. The mentor-mentee partnership is one-directional and involves limited input from LAs. In faculty-driven collaboration partnerships, faculty elicit feedback and insights from LAs, but LAs are not involved in course design. This is distinct from a true collaborative partnership, in which faculty elicit feedback and insights from LAs and work with the LAs to determine the best ways to present material and concepts to students. The authors conclude that collaborative partnerships require faculty to invest more time and to cede some control of the course to LAs, but that such partnerships positively impact classroom structure and the LAs themselves (Sabella, Van Duzor, & Davenport, 2016).

Goal 4: Departmental and institutional change

While this goal may have been met at many institutions, it is arguably the most difficult to formally assess. The success, sustainability, and growth of an LA program depend upon (1) reliable financial support and (2) pedagogical support (Otero, 2015; Otero et al., 2006, 2010). First, an LA program requires a financial commitment from the administration, which is most sustainable through internal institutional funding, and as a program grows, its cost (in both faculty/staff time and LA stipends) increases as well. Thus, to grow and sustain an LA program, it is critical that the administration recognize and value the outcomes of the programs described in this review. Second, LA programs focus on individual course reform and pedagogical training for LAs. To staff pedagogy courses with qualified instructors, it is beneficial for STEM departments to partner with their schools of education. Establishing STEM-education relationships within institutions will strengthen the LA program and foster an appreciation for evidence-based instruction among students and STEM faculty.

Two published instances of departmental and institutional change are the LA program-driven partnerships between Chicago State University, California State University San Marcos, and their respective local community colleges. These institutions have developed partnerships focused on improving curriculum with the use of LAs. Case studies on these partnerships demonstrated the potential to create an LA network that makes it easier for LAs to transfer to 4-year schools. Additional outcomes include faculty development, course and programmatic transformation, evolution of faculty roles, and generally improved alignment between partnered institutions. The two partnerships share several features to which they attribute their success; these include, but are not limited to, leadership from faculty, equitable and regular communication between representatives of both partnered institutions, non-hierarchical partnerships, and a strong focus on evidence-based teaching (Cochran, Van Duzor, Sabella, & Geiss, 2016; De Leone, Price, Sabella, & Van Duzor, 2019).

Additional outcomes of the LA model

Beyond the four original goals of the LA model, researchers have studied other benefits of the model (Table 2). For example, LAs often work closely with TAs or teaching staff other than faculty, and multiple studies have aimed to characterize those partnerships (Becker, Goldberg, & Jariwala, 2016; Davenport, Amezcua, Sabella, & Van Duzor, 2017). Others have looked more deeply at LA perceptions to assess the reflective practice that is often part of LA pedagogical training (Cao, Smith, Lutz, & Koretsky, 2018; Cochran, Brookes, & Kramer, 2013). Additionally, the LA model can be a source of student networks that improve the connection of underrepresented minority students to a STEM department (Goertzen, Brewe, & Kramer, 2013). Furthermore, some studies have used the LA model to generate and validate new teaching methods and assessments (Chini, Straub, & Thomas, 2016; Cochran et al., 2016; Davenport et al., 2017; Talbot, 2013). The LA model can also facilitate the implementation of other evidence-based methods, as demonstrated by a number of studies (Table 3). Lastly, a recent study focused on developing a better method to measure student success and outcomes in LA-supported, large-enrollment courses using the Cultural Historical Activity Theory framework (Talbot et al., 2016).

Conclusions and future research

In the nearly 20 years since the LA model was first introduced at CU Boulder, a major effort has been focused on assessing outcomes for students in LA-supported courses. However, designing controlled studies is often complicated, it may not be possible to account for all confounding factors, and there are varying implementations of the model that may influence outcomes at a local level. Even given these challenges, the LA model has been well explored in select contexts. This review highlights studies that demonstrate associations between LA model implementation and improved academic outcomes for both LAs themselves and the students in LA-supported courses. Additionally, we summarize findings that describe how being an LA is a valuable supplement to traditional teacher education. However, in this review, we make it clear that some major goals of the LA model’s creators remain understudied. Thus, future research should focus on how the LA model impacts teacher recruitment, or, more generally, how the experience of being an LA impacts career decisions, and an effort should be made to better understand how implementing an LA program affects departmental/faculty attitudes towards evidence-based teaching.

As the only literature review to date that specifically assesses the LA model, this article serves as an important resource for teachers, administrators, and education researchers. Our comprehensive synthesis provides faculty and administrators who are interested in implementing the LA model with key details to make informed decisions about the specifics of their programs. Additionally, a critical look at the literature reviewed in this article reveals that a small group of authors are responsible for the bulk of research and many of the studies were published in Physics Education Research Conference proceedings. Thus, this review may serve as an introduction to the LA model for many education researchers outside of physics, and we hope this stimulates further LA model research in diverse fields.

Lastly, this review has made clear that research assessing the LA model lacks true experimental designs. The absence of this “gold-standard” study design was also highlighted by Dawson et al. (2014) in their review of Supplemental Instruction, a similar near-peer instructional model. However, given the evidence that near-peer instruction improves student outcomes and benefits faculty and the near-peer instructors, it could be argued that testing near-peer instruction in a true experimental design would be unethical, especially with traditional lecture as a control (Freeman et al., 2014). Researchers must weigh the benefits of such studies against the potential detriment to students and faculty members in a control group with no near-peer instructors. Therefore, for future studies, we encourage quasi-experimental designs coupled with advanced statistical modeling and/or carefully considered experiments that limit uncontrolled variables and bias (Alzen et al., 2018; Nissen et al., 2019; Van Dusen & Nissen, 2019).

Availability of data and materials

Data sharing is not applicable to this article as no datasets were generated during the current study.

Abbreviations

LA:

Learning Assistant

TA:

Teaching Assistant

STEM:

Science, Technology, Engineering, and Mathematics

CU:

University of Colorado

DFW:

D, F, or Withdrawal

CLASS:

Colorado Learning Attitudes about Science Survey

FCI:

Force Concept Inventory

FMCE:

Force and Motion Conceptual Evaluation

BEMA:

Brief Electricity and Magnetism Assessment

CSEM:

Conceptual Survey of Electricity and Magnetism

LASSO:

Learning about STEM Student Outcomes

HLM:

Hierarchical Linear Models

RTOP:

Reformed Teacher Observation Protocol

References

  1. Adams, W. K., Perkins, K. K., Podolefsky, N. S., Dubson, M., Finkelstein, N. D., & Wieman, C. E. (2006). New instrument for measuring student beliefs about physics and learning physics: The Colorado Learning Attitudes about Science Survey. Physical Review Special Topics - Physics Education Research, 2(1), 1–14. https://doi.org/10.1103/PhysRevSTPER.2.010101

  2. AIP Conference Proceedings. (2020). For Organizers: Peer Review. https://aip.scitation.org/apc/organizers/peerreview. Accessed 3 Nov 2020.

  3. Alzen, J. L., Langdon, L., & Otero, V. K. (2017). The Learning Assistant model and DFW rates in introductory physics courses. Physics Education Research Conference Proceedings, 36–39. https://doi.org/10.1119/perc.2017.pr.004

  4. Alzen, J. L., Langdon, L. S., & Otero, V. K. (2018). A logistic regression investigation of the relationship between the Learning Assistant model and failure rates in introductory STEM courses. International Journal of STEM Education, 5(1), 1–12. https://doi.org/10.1186/s40594-018-0152-1

  5. American Physical Society. (2020). Excellence in Physics Education Award. https://www.aps.org/programs/honors/prizes/education.cfm. Accessed 8 Dec 2020.

  6. Anderson, D. L., Fisher, K. M., & Norman, G. J. (2002). Development and evaluation of the conceptual inventory of natural selection. Journal of Research in Science Teaching, 39(10), 952–978. https://doi.org/10.1002/tea.10053

  7. Andrews, T. M., Leonard, M. J., Colgrove, C. A., & Kalinowski, S. T. (2011). Active learning not associated with student learning in a random sample of college biology courses. CBE Life Sciences Education, 10(4), 394–405. https://doi.org/10.1187/cbe.11-07-0061

  8. Arendale, D. R. (1994). Understanding the supplemental instruction model. New Directions for Teaching and Learning, 1994(60), 11–21.

  9. Arksey, H., & O’Malley, L. (2005). Scoping studies: Towards a methodological framework. International Journal of Social Research Methodology: Theory and Practice, 8(1), 19–32. https://doi.org/10.1080/1364557032000119616

  10. Baily, C. (2011). Perspectives in quantum physics: Epistemological, ontological and pedagogical [Unpublished doctoral dissertation]. University of Colorado at Boulder.

  11. Barr, S. A., Ross, M. J., & Otero, V. (2012). Using artifact methodology to compare learning assistants’ and colleagues’ classroom practices. AIP Conference Proceedings, 1413, 119–122. https://doi.org/10.1063/1.3680008

  12. Baxter Magolda, M. B. (2009). Self-Authorship: The foundation for twenty-first-century education. New Directions for Teaching and Learning, 109, 69–83. https://doi.org/10.1002/tl.266

  13. Becker, A. P., Goldberg, B., & Jariwala, M. (2016). Self-perception of teaching fellows and learning assistants in introductory physics classes. Physics Education Research Conference Proceedings, 48–51. https://doi.org/10.1119/perc.2016.pr.007

  14. Bok, D. (2008). Our underachieving colleges: A candid look at how much students learn and why they should be learning more. Princeton: Princeton University Press.

  15. Bonham, S. W., Jones, K., Luna, B., & Pauley, L. (2018). An integrated model for teaching writing in the introductory laboratory. Journal of College Science Teaching, 48(2), 40–47.

  16. Borko, H., Stecher, B., & Kuffner, K. (2007). Using artifacts to characterize reform-oriented instruction: The Scoop Notebook and rating guide. In Education.

  17. Brewe, E., Kramer, L., & O’Brien, G. (2009). Modeling instruction: Positive attitudinal shifts in introductory physics measured with CLASS. Physical Review Special Topics - Physics Education Research, 5(1), 1–5. https://doi.org/10.1103/PhysRevSTPER.5.013102

  18. Brown-Robertson, L. T. N., Ntembe, A., & Tawah, R. (2015). Evaluating the “underserved student” success in economics principles courses. Journal of Economics and Economic Education Research, 16(3), 13–24.

  19. Brown, T. (2008). Design thinking. Harvard Business Review, June, 85–92. https://doi.org/10.5437/08956308X5503003

  20. Bullock, D., Callahan, J., & Shadle, S. E. (2015). Coherent calculus course design: Creating faculty buy-in for student success. ASEE Annual Conference and Exposition, Conference Proceedings. https://doi.org/10.18260/p.23694.

  21. Caldwell, J. E. (2007). Clickers in the large classroom: Current research and best-practice tips. CBE Life Sciences Education, 6, 9–20. https://doi.org/10.1187/cbe.06

  22. Callahan, J., Pyke, P., Shadle, S., & Landrum, R. E. (2014). Creating a STEM identity: Investment with return. ASEE Annual Conference and Exposition, Conference Proceedings.

  23. Campbell, B. J. M., Malcos, J. L., & Bortiatynski, J. M. (2019). Growing a learning assistant improvement. Journal of College Science Teaching, 48(3), 66–74.

  24. Campbell Collaboration. (2020). What Is a Systematic Review? https://campbellcollaboration.org/what-is-a-systematic-review.html. Accessed 2 Nov 2020.

  25. Cao, Y., Smith, C., Lutz, B., & Koretsky, M. (2018). Cultivating the next generation: Outcomes from a Learning Assistant program in engineering. ASEE Annual Conference and Exposition. https://doi.org/10.1590/s1809-98232013000400007

  26. Caravez, D., De La Torre, A., Nissen, J. M., & Van Dusen, B. (2017). Longitudinal associations between Learning Assistants and instructor effectiveness. Physics Education Research Conference Proceedings, 80–83. https://doi.org/10.1119/perc.2017.pr.015

  27. Chasteen, S., Perkins, K., Beale, P., Pollock, S., & Wieman, C. (2011). A thoughtful approach to instruction: Course transformation for the rest of us. Journal of College Science Teaching, 40(4), 24–30.

  28. Chini, J. J., Straub, C. L., & Thomas, K. H. (2016). Learning from avatars: Learning assistants practice physics pedagogy in a classroom simulator. Physical Review Physics Education Research, 12(1), 1–15. https://doi.org/10.1103/PhysRevPhysEducRes.12.010117

  29. Close, E. W., Close, H. G., & Donnelly, D. (2013). Understanding the learning assistant experience with physics identity. AIP Conference Proceedings, 1513(January), 106–109. https://doi.org/10.1063/1.4789663

  30. Close, E. W., Conn, J., & Close, H. G. (2016). Becoming physics people: Development of integrated physics identity through the Learning Assistant experience. Physical Review Physics Education Research, 12(1), 1–18. https://doi.org/10.1103/PhysRevPhysEducRes.12.010109

  31. Cochran, G. L., Brookes, D. T., & Kramer, L. H. (2013). A framework for assessing Learning Assistants’ reflective writing assignments. AIP Conference Proceedings, 1513(January), 15–18. https://doi.org/10.1063/1.4789640

  32. Cochran, G. L., Van Duzor, A. G., Sabella, M. S., & Geiss, B. (2016). Engaging in self-study to support collaboration between two-year colleges and universities. Physics Education Research Conference Proceedings, 76–79. https://doi.org/10.1119/perc.2016.pr.014

  33. Co, E. (2019). The power of practice: Adjusting curriculum to include emphasis on skills. Journal of College Science Teaching, 48(5), 22–27.

  34. Conn, J., Close, E. W., & Close, H. G. (2014). Learning Assistant identity development: Is one semester enough? Physics Education Research Conference Proceedings, 55–58. https://doi.org/10.1119/perc.2014.pr.010

  35. Cracolice, M. S., & Queen, M. (2019). Maximizing learning efficiency in General Chemistry. ACS Symposium Series, 1322, 55–67. https://doi.org/10.1021/bk-2019-1322.ch004.

  36. Crisp, G., Nora, A., & Taggart, A. (2009). Student characteristics, pre-college, college, and environmental factors as predictors of majoring in and earning a STEM degree: An analysis of students attending a hispanic serving institution. American Educational Research Journal, 46(4), 924–942. https://doi.org/10.3102/0002831209349460

  37. Cuseo, J. (2007). The empirical case against large class size: Adverse effects on the teaching, learning, and retention of first-year students. Journal of Faculty Development, 21(1), 5–21.


  38. Daudt, H. M. L., Van Mossel, C., & Scott, S. J. (2013). Enhancing the scoping study methodology: A large, inter-professional team’s experience with Arksey and O’Malley’s framework. BMC Medical Research Methodology, 13(1), 1–9. https://doi.org/10.1186/1471-2288-13-48

  39. Davenport, F., Amezcua, F., Sabella, M. S., & Van Duzor, A. G. (2017). Exploring the underlying factors in Learning Assistant-faculty partnerships. Physics Education Research Conference Proceedings, 104–107. https://doi.org/10.1119/perc.2017.pr.021

  40. Dawson, P., van der Meer, J., Skalicky, J., & Cowley, K. (2014). On the effectiveness of supplemental instruction: A systematic review of supplemental instruction and peer-assisted study sessions literature between 2001 and 2010. Review of Educational Research, 84(4), 609–639. https://doi.org/10.3102/0034654314540007

  41. De Leone, C., Price, E., Sabella, M., & Van Duzor, A. (2019). Developing and sustaining faculty-driven, curriculum-centered partnerships between two-year colleges and four-year institutions. Journal of College Science Teaching, 48(6), 20–28. https://doi.org/10.2505/4/jcst19_048_06_20

  42. Ding, L., Chabay, R., Sherwood, B., & Beichner, R. (2006). Evaluating an electricity and magnetism assessment tool: Brief electricity and magnetism assessment. Physical Review Special Topics - Physics Education Research, 2(1), 1–7. https://doi.org/10.1103/PhysRevSTPER.2.010105

  43. Docktor, J. L., & Mestre, J. P. (2014). Synthesis of discipline-based education research in physics. Physical Review Special Topics - Physics Education Research, 10(2), 1–58. https://doi.org/10.1103/PhysRevSTPER.10.020119

  44. Elliott, E. R., Reason, R. D., Coffman, C. R., Gangloff, E. J., Raker, J. R., Powell-Coffman, J. A., & Ogilvie, C. A. (2016). Improved student learning through a faculty learning community: How faculty collaboration transformed a large-enrollment course from lecture to student centered. CBE Life Sciences Education, 15(2), 1–14. https://doi.org/10.1187/cbe.14-07-0112

  45. Estrada, M., Burnett, M., Campbell, A. G., Campbell, P. B., Denetclaw, W. F., Gutiérrez, C. G., Hurtado, S., John, G. H., Matsui, J., McGee, R., Okpodu, C. M., Joan Robinson, T., Summers, M. F., Werner-Washburne, M., & Zavala, M. E. (2016). Improving underrepresented minority student persistence in STEM. CBE Life Sciences Education, 15(3), 1–10. https://doi.org/10.1187/cbe.16-01-0038

  46. Evans, D. J. R., & Cuffe, T. (2009). Near-peer teaching in anatomy: An approach for deeper learning. Anatomical Sciences Education, 2(5), 227–233. https://doi.org/10.1002/ase.110

  47. Foote, K., Knaub, A., Henderson, C., Dancy, M., & Beichner, R. J. (2016). Enabling and challenging factors in institutional reform: The case of SCALEUP. Physical Review Physics Education Research, 12(1), 1–22. https://doi.org/10.1103/PhysRevPhysEducRes.12.010103

  48. Franklin, S., Hane, E., Kustusch, M., Ptak, C., & Sayre, E. (2018). Improving retention through metacognition: A program for deaf/hard-of-hearing and first-generation STEM college students. Journal of College Science Teaching, 48(2), 21–28. https://doi.org/10.2505/4/jcst18_048_02_21

  49. Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences of the United States of America, 111(23), 8410–8415. https://doi.org/10.1073/pnas.1319030111

  50. García, E., & Weiss, E. (2019). The teacher shortage is real, large and growing, and worse than we thought. Economic Policy Institute.


  51. Geller, B. D., Gouvea, J., Dreyfus, B. W., Sawtelle, V., Turpen, C., & Redish, E. F. (2019). Bridging the gaps: How students seek disciplinary coherence in introductory physics for life science. Physical Review Physics Education Research, 15(2), 20142. https://doi.org/10.1103/physrevphyseducres.15.020142.

  52. Geske, J. (1992). Overcoming the drawbacks of the large lecture class. College Teaching, 40(4), 151–154. https://doi.org/10.1080/87567555.1992.10532239

  53. Goertzen, R. M., Brewe, E., & Kramer, L. (2013). Expanded markers of success in introductory university physics. International Journal of Science Education, 35(2), 262–288. https://doi.org/10.1080/09500693.2012.718099

  54. Goertzen, R. M., Brewe, E., Kramer, L. H., Wells, L., & Jones, D. (2011). Moving toward change: Institutionalizing reform through implementation of the Learning Assistant model and Open Source Tutorials. Physical Review Special Topics - Physics Education Research, 7(2), 1–9. https://doi.org/10.1103/PhysRevSTPER.7.020105.

  55. Goldhaber, S., Pollock, S., Dubson, M., Beale, P., & Perkins, K. (2009). Transforming upper-division quantum mechanics: Learning goals and assessment. AIP Conference Proceedings, 1179, 145–148. https://doi.org/10.1063/1.3266699.

  56. Gosser, D. K., & Roth, V. (1998). The workshop chemistry project: Peer-led team learning. Journal of Chemical Education, 75(2), 185–187. https://doi.org/10.1021/ed075p185

  57. Gray, K. E., & Otero, V. K. (2009). Analysis of former Learning Assistants’ views on cooperative learning. AIP Conference Proceedings, 1179, 149–152. https://doi.org/10.1063/1.3266700

  58. Gray, K. E., Webb, D. C., & Otero, V. K. (2010). Are Learning Assistants better K-12 science teachers? AIP Conference Proceedings, 1289, 157–160. https://doi.org/10.1063/1.3515186

  59. Gray, K. E., Webb, D. C., & Otero, V. K. (2012). Effects of the Learning Assistant model on in-service teacher practice. Physics Education Research Conference Proceedings. https://doi.org/10.1063/1.3680029

  60. Gray, K. E., Webb, D. C., & Otero, V. K. (2016). Effects of the Learning Assistant model on teacher practice. Physical Review Physics Education Research, 12(2), 1–10. https://doi.org/10.1103/PhysRevPhysEducRes.12.020126

  61. Hake, R. R. (1998). Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics, 66(1), 64–74. https://doi.org/10.1119/1.18809

  62. Halloun, I. (1996). Views about science and physics achievement: The VASS story. International Conference on Undergraduate Physics Education.

  63. Handelsman, J., Ebert-May, D., Beichner, R., Bruns, P., Chang, A., DeHaan, R., Gentile, J., Lauffer, S., Stewart, J., Tilghman, S. M., & Wood, W. B. (2004). Scientific teaching. Science, 304(5670), 521–522.

  64. Hazari, Z., Sonnert, G., Sadler, P. M., & Shanahan, M.-C. (2010). Connecting high school physics experiences, outcome expectations, physics identity, and physics career choice: A gender study. Journal of Research in Science Teaching, 47(8), 978–1003. https://doi.org/10.1002/tea.20363

  65. Herrera, X., Nissen, J., & Van Dusen, B. (2018). Student outcomes across collaborative-learning environments. Physics Education Research Conference Proceedings, 2018, 1–4.

  66. Hestenes, D., Wells, M., & Swackhamer, G. (1992). Force Concept Inventory. The Physics Teacher, 30(3), 141–158. https://doi.org/10.1119/1.2343497

  67. Hill, J. G., & Gruber, K. J. (2011). Education and certification qualifications of departmentalized public high school-level teachers of core subjects. National Center for Education Statistics.

  68. House, J. D. (1994). Student motivation and achievement in college chemistry. International Journal of Instructional Media, 21(1), 1–11.


  69. Irvine, S., Williams, B., & McKenna, L. (2018). Near-peer teaching in undergraduate nurse education: An integrative review. Nurse Education Today, 70(March), 60–68. https://doi.org/10.1016/j.nedt.2018.08.009

  70. Jeffery, K., Frawley Cass, S., & Sweeder, R. (2019). Comparison of students’ readily accessible knowledge of reaction kinetics in lecture- and context-based courses. Journal of STEM Education: Innovations and Research, 19(5), 5–13.

  71. Kiste, A. L., Scott, G. E., Bukenberger, J., Markmann, M., & Moore, J. (2017). An examination of student outcomes in studio chemistry. Chemistry Education Research and Practice, 18(1), 233–249. https://doi.org/10.1039/c6rp00202a

  72. Klymkowsky, M. W. (2007). Teaching without a textbook: Strategies to focus learning on fundamental concepts and scientific process. CBE Life Sciences Education, 6, 190–193. https://doi.org/10.1187/cbe.07.

  73. Knaub, A. V., Foote, K. T., Henderson, C., Dancy, M., & Beichner, R. J. (2016). Get a room: The role of classroom space in sustained implementation of studio style instruction. International Journal of STEM Education, 3(1), 1–22. https://doi.org/10.1186/s40594-016-0042-3

  74. Knight, J. K., Wise, S. B., Rentsch, J., & Furtak, E. M. (2015). Cues matter: Learning Assistants influence introductory biology student interactions during clicker-question discussions. CBE Life Sciences Education, 14(4), 1–14. https://doi.org/10.1187/cbe.15-04-0093

  75. Kohlmyer, M. A., Caballero, M. D., Catrambone, R., Chabay, R. W., Ding, L., Haugan, M. P., Marr, M. J., Sherwood, B. A., & Schatz, M. F. (2009). Tale of two curricula: The performance of 2000 students in Introductory Electromagnetism. Physical Review Special Topics - Physics Education Research, 5(2), 1–10. https://doi.org/10.1103/physrevstper.5.020105

  76. Koretsky, M., Bouwma-Gearhart, J., Brown, S., Dick, T., Brubaker-Cole, S., Sitomer, A., Quardokus Fisher, K., Smith, C., Ivanovitch, J., Risien, J., Kayes, L., & Quick, D. (2016). Enhancing STEM education at Oregon State University - Year 2. ASEE Annual Conference and Exposition. https://doi.org/10.18260/p.26704.

  77. Koretsky, M. D. (2017). Cognitive and social aspects of engagement in active learning. Chemical Engineering Education, 51(4), 198–204.

  78. Lage, M. J., Platt, G. J., & Treglia, M. (2000). Inverting the classroom: A gateway to creating an inclusive learning environment. Journal of Economic Education, 31(1), 30–43. https://doi.org/10.1080/00220480009596759

  79. Learning Assistant Alliance. (2020). Institution and Member Directory. https://www.learningassistantalliance.org/. Accessed 10 Aug 2020

  80. Levac, D., Colquhoun, H., & O’Brien, K. K. (2010). Scoping studies: Advancing the methodology. Implementation Science, 5(1), 1–9. https://doi.org/10.1186/1748-5908-5-69

  81. Little, A. (2015). Proudness: What is it? Why is it important? And how do we design for it in college physics and astronomy education?

  82. Lock, R. M., Hazari, Z., & Potvin, G. (2013). Physics career intentions: The effect of physics identity, math identity, and gender. AIP Conference Proceedings, 1513, 262–265. https://doi.org/10.1063/1.4789702

  83. Lockspeiser, T. M., O’Sullivan, P., Teherani, A., & Muller, J. (2008). Understanding the experience of being taught by peers: The value of social and cognitive congruence. Advances in Health Sciences Education, 13(3), 361–372. https://doi.org/10.1007/s10459-006-9049-8

  84. Loes, C. N., An, B. P., Saichaie, K., & Pascarella, E. T. (2017). Does collaborative learning influence persistence to the second year of college? Journal of Higher Education, 88(1), 62–84. https://doi.org/10.1080/00221546.2016.1243942

  85. Maloney, D. P., O’Kuma, T. L., Hieggelke, C. J., & Van Heuvelen, A. (2001). Surveying students’ conceptual knowledge of electricity and magnetism. American Journal of Physics, 69(S1), S12–S23. https://doi.org/10.1119/1.1371296

  86. Martella, R. C., Nelson, J. R., Morgan, R. L., & Marchand-Martella, N. E. (2013). Understanding and interpreting educational research. New York, The Guilford Press.

  87. McHenry, N., Martin, A., Castaldo, A., & Ziegenfuss, D. (2009). Learning Assistants program: Faculty development for conceptual change. International Journal of Teaching and Learning in Higher Education, 22(3), 258–268.


  88. Miller, P. M., Carver, J. S., Shinde, A., Ratcliff, B., & Murphy, A. N. (2013). Initial replication results of learning assistants in university physics. AIP Conference Proceedings, 1513, 30–33. https://doi.org/10.1063/1.4789644

  89. Moore, J. C. (2018). Efficacy of multimedia learning modules as preparation for lecture-based tutorials in electromagnetism. Education Sciences, 8(1). https://doi.org/10.3390/educsci8010023.

  90. Nadelson, L. S., & Finnegan, J. (2014). Path less traveled: Fostering STEM majors’ professional identity development through engagement as STEM Learning Assistants. Journal of Higher Education Theory & Practice, 14(5), 29–40 http://search.ebscohost.com/login.aspx?direct=true&db=eue&AN=100405018&site=ehost-live. Accessed 14 Jan 2020

  91. Nelson, M. A. (2011). Oral assessments: Improving retention, grades, and understanding. Primus, 21(1), 47–61. https://doi.org/10.1080/10511970902869176.

  92. Newman, D. L., Stefkovich, M., Clasen, C., Franzen, M. A., & Wright, L. K. (2018). Physical models can provide superior learning opportunities beyond the benefits of active engagements. Biochemistry and Molecular Biology Education, 46(5), 435–444. https://doi.org/10.1002/bmb.21159.

  93. Nissen, J., Donatello, R., & Van Dusen, B. (2019). Missing data and bias in physics education research: A case for using multiple imputation. Physical Review Physics Education Research, 15(2), 1–15. https://doi.org/10.1103/PhysRevPhysEducRes.15.020106

  94. Osborne, J., Simon, S., & Collins, S. (2003). Attitudes towards science: A review of the literature and its implications. International Journal of Science Education, 25(9), 1049–1079. https://doi.org/10.1080/0950069032000032199

  95. O’Shea, B., Terry, L., & Benenson, W. (2013). From F = ma to flying squirrels: Curricular change in an introductory physics course. CBE Life Sciences Education, 12(2), 230–238. https://doi.org/10.1187/cbe.12-08-0127.

  96. Otero, V. (2015). Recruiting and educating future physics teachers: Case studies and effective practices. In C. Sandifer & E. Brewe (Eds.), Effective Practices in Preservice Teacher Education (pp. 107–116). American Physical Society. http://www.phystec.org/webdocs/EffectivePracticesBook.cfm?. Accessed 12 March 2019.

  97. Otero, V., Finkelstein, N., McCray, R., & Pollock, S. (2006). Who is responsible for preparing science teachers? Science, 313(5786), 445–446. https://doi.org/10.1126/science.1129648

  98. Otero, V., & Gray, K. (2008). Attitudinal gains across multiple universities using the Physics and Everyday Thinking curriculum. Physical Review Special Topics - Physics Education Research, 4(2), 1–7. https://doi.org/10.1103/PhysRevSTPER.4.020104

  99. Otero, V., Pollock, S., & Finkelstein, N. (2010). A physics department’s role in preparing physics teachers: The Colorado Learning Assistant model. American Journal of Physics, 78(11), 1218–1224. https://doi.org/10.1119/1.3471291

  100. PER Central. (2020). Physics Education Research Conference: Conference Proceedings. https://www.compadre.org/per/perc/proceedings.cfm. Accessed 3 Nov 2020.

  101. Pham, M. T., Rajić, A., Greig, J. D., Sargeant, J. M., Papadopoulos, A., & Mcewen, S. A. (2014). A scoping review of scoping reviews: Advancing the approach and enhancing the consistency. Research Synthesis Methods, 5(4), 371–385. https://doi.org/10.1002/jrsm.1123

  102. Pollock, S. J. (2007). A longitudinal study of the impact of curriculum on conceptual understanding in E&M. AIP Conference Proceedings, 951, 172–175. https://doi.org/10.1063/1.2820925

  103. Pollock, S. J., & Finkelstein, N. D. (2007). Sustaining change: Instructor effects in transformed large lecture courses. AIP Conference Proceedings, 883, 109–112. https://doi.org/10.1063/1.2508704.

  104. Pollock, S. J., & Finkelstein, N. D. (2008). Sustaining educational reforms in introductory physics. Physical Review Special Topics - Physics Education Research, 4(1), 1–8. https://doi.org/10.1103/PhysRevSTPER.4.010110.

  105. Pollock, S. J., & Finkelstein, N. D. (2013). Impacts of curricular change: Implications from 8 years of data in introductory physics. AIP Conference Proceedings, 1513, 310–313. https://doi.org/10.1063/1.4789714.

  106. Pollock, S. J. (2009). Longitudinal study of student conceptual understanding in electricity and magnetism. Physical Review Special Topics - Physics Education Research, 5(2), 1–8. https://doi.org/10.1103/physrevstper.5.020110.

  107. Price, E., & Finkelstein, N. D. (2008). Preparing physics graduate students to be educators. American Journal of Physics, 76(7), 684–690. https://doi.org/10.1119/1.2897288

  108. Price, E., Tsui, S., Hart, A., & Saucedo, L. (2011). Don’t erase that whiteboard! Archiving student work on a photo-sharing website. The Physics Teacher, 49(7), 426–428. https://doi.org/10.1119/1.3639151.

  109. Quan, G., & Gupta, A. (2020). Tensions in the productivity of design task tinkering. Journal of Engineering Education, 109(1), 88–106. https://doi.org/10.1002/jee.20303

  110. Quan, G., Turpen, C., Gupta, A., & Tanu, E. (2017). Designing a course for peer educators in undergraduate engineering design courses. ASEE Annual Conference and Exposition. https://doi.org/10.18260/1-2--28124

  111. Redish, E. F. (1994). Implications of cognitive studies for teaching physics. American Journal of Physics, 62(9), 796–803. https://doi.org/10.1119/1.17461

  112. Redish, E. F., Saul, J. M., & Steinberg, R. N. (1998). Student expectations in introductory physics. American Journal of Physics, 66(3), 212–224. https://doi.org/10.1119/1.18847

  113. Robertson, A. D., & Richards, J. (2017). Teacher sense-making about being responsive to students’ science ideas: A case study. European Journal of Science and Mathematics Education, 5(4), 314–342 https://files-eric-ed-gov.proxy1.cl.msu.edu/fulltext/EJ1158186.pdf. Accessed 6 Jan 2020.

  114. Sabella, M. S., Van Duzor, A. G., & Davenport, F. (2016). Leveraging the expertise of the urban STEM student in developing an effective LA Program: LA and instructor partnerships. Physics Education Research Conference Proceedings, 288–291. https://doi.org/10.1119/perc.2016.pr.067

  115. Sawada, D., Piburn, M. D., Judson, E., Turley, J., Falconer, K., Benford, R., & Bloom, I. (2002). Measuring reform practices in science and mathematics classrooms: The Reformed Teaching Observation Protocol. School Science and Mathematics, 102(6), 245–253. https://doi.org/10.1111/j.1949-8594.2002.tb17883.x

  116. Schick, C. P. (2018). Trying on teaching: Transforming STEM classrooms with a Learning Assistant program. ACS Symposium Series, 1280, 3–27. https://doi.org/10.1021/bk-2018-1280.ch001

  117. Sellami, N., Shaked, S., Laski, F. A., Eagan, K. M., & Sanders, E. R. (2017). Implementation of a Learning Assistant program improves student performance on higher-order assessments. CBE Life Sciences Education, 16(4), 1–10. https://doi.org/10.1187/cbe.16-12-0341

  118. Shi, J., Wood, W. B., Martin, J. M., Guild, N. A., Vicens, Q., & Knight, J. K. (2010). A diagnostic assessment for Introductory Molecular and Cell Biology. CBE Life Sciences Education, 9, 453–461. https://doi.org/10.1187/cbe.10

  119. Smith, M. K., Wood, W. B., Adams, W. K., Wieman, C., Knight, J. K., Guild, N., & Su, T. T. (2009). Why peer discussion improves student performance on in-class concept questions. Science, 323(5910), 122–124. https://doi.org/10.1126/science.1165919

  120. Smith, M. K., Wood, W. B., Krauter, K., & Knight, J. K. (2011). Combining peer discussion with instructor explanation increases student learning from in-class concept questions. CBE Life Sciences Education, 10(1), 55–63. https://doi.org/10.1187/cbe.10-08-0101

  121. Stains, M., Harshman, J., Barker, M. K., Chasteen, S. V., Cole, R., DeChenne-Peters, S. E., … Young, A. M. (2018). Anatomy of STEM teaching in North American universities. Science, 359(6383), 1468–1470.


  122. Stone, K. L., Shaner, S. E., & Fendrick, C. M. (2018). Improving the success of first term general chemistry students at a liberal arts institution. Education Sciences, 8(1). https://doi.org/10.3390/educsci8010005.

  123. Talbot, R. M. (2013). Taking an item-level approach to measuring change with the Force and Motion Conceptual Evaluation: An application of Item Response Theory. School Science and Mathematics, 113(7), 356–365. https://doi.org/10.1111/ssm.12033

  124. Talbot, R. M., Doughty, L., Nasim, A., Hartley, L., Le, P., Kramer, L. H., Kornreich-Leshem, H., & Boyer, J. (2016). Theoretically framing a complex phenomenon: Student success in large enrollment active learning courses. Physics Education Research Conference Proceedings, 344–347. https://doi.org/10.1119/perc.2016.pr.081

  125. Talbot, R. M., Hartley, L. M., Marzetta, K., & Wee, B. S. (2015). Transforming undergraduate science education with Learning Assistants: Student satisfaction in large enrollment courses. Journal of College Science Teaching, 29(5), 28–34 http://stemgateway.unm.edu/documents/PLFsClassroom.pdf. Accessed 6 Jan 2020.

  126. ten Cate, O., & Durning, S. (2007). Dimensions and psychology of peer teaching in medical education. Medical Teacher, 29(6), 546–552. https://doi.org/10.1080/01421590701583816

  127. ten Cate, O., van de Vorst, I., & van den Broek, S. (2012). Academic achievement of students tutored by near-peers. International Journal of Medical Education, 3, 6–13. https://doi.org/10.5116/ijme.4f0c.9ed2

  128. Thompson, M. M., & Garik, P. (2015). The effect of Learning Assistants on student learning outcomes and satisfaction in large science and engineering courses. International Conference of the National Association for Research in Science Teaching, 1–9.

  129. Thornton, R. K., & Sokoloff, D. R. (1998). Assessing student learning of Newton’s laws: The Force and Motion Conceptual Evaluation and the evaluation of active learning laboratory and lecture curricula. American Journal of Physics, 66(4), 338–352. https://doi.org/10.1119/1.18863

  130. Top, L. M., Schoonraad, S. A., & Otero, V. K. (2018). Development of pedagogical knowledge among learning assistants. International Journal of STEM Education, 5(1), 1–18. https://doi.org/10.1186/s40594-017-0097-9

  131. Tsai, J. Y., Kotys-Schwartz, D. A., & Hannigan, M. P. (2013). Learning statics by feeling: Effects of everyday examples on confidence and identity development. ASEE Annual Conference and Exposition.

  132. Van Dusen, B., Langdon, L., & Otero, V. (2015). Learning Assistant Supported Student Outcomes (LASSO) study initial findings. Physics Education Research Conference Proceedings, 343–346. https://doi.org/10.1119/perc.2015.pr.081

  133. Van Dusen, B., & Nissen, J. (2017). Systemic inequities in introductory physics courses: The impacts of Learning Assistants. Physics Education Research Conference Proceedings, 400–403. https://doi.org/10.1119/perc.2017.pr.095

  134. Van Dusen, B., & Nissen, J. (2019). Modernizing use of regression models in physics education research: A review of hierarchical linear modeling. Physical Review Physics Education Research, 15(2), 20108. https://doi.org/10.1103/PhysRevPhysEducRes.15.020108

  135. Van Dusen, B., & Nissen, J. (2020). Associations between learning assistants, passing introductory physics, and equity: A quantitative critical race theory investigation. Physical Review Physics Education Research, 16(1). https://doi.org/10.1103/physrevphyseducres.16.010117

  136. Van Dusen, B., White, J.-S. S., & Roualdes, E. A. (2016). The impact of Learning Assistants on inequities in physics student outcomes. Physics Education Research Conference Proceedings, 360–363. https://doi.org/10.1119/perc.2016.pr.085

  137. Vickrey, T., Rosploch, K., Rahmanian, R., Pilarz, M., & Stains, M. (2015). Research-based implementation of peer instruction: A literature review. CBE Life Sciences Education, 14(1), 1–11. https://doi.org/10.1187/cbe.14-11-0198

  138. Webb, D. C., Stade, E., & Grover, R. (2014). Rousing students’ minds in postsecondary mathematics: The undergraduate Learning Assistant model. Journal of Mathematics Education At Teachers College, 5(2), 39–47.


  139. Weick, K. E. (1995). Sensemaking in organizations. SAGE Publications, Inc.

  140. Wendell, K. B., Matson, D., Gallegos, H., & Chiesa, L. (2019). Work-in-progress: Learning Assistant “noticing” in an undergraduate engineering science course. ASEE Annual Conference and Exposition.

  141. White, J.-S. S., Van Dusen, B., & Roualdes, E. A. (2016). The impacts of Learning Assistants on student learning of physics. Physics Education Research Conference Proceedings, 384–387. https://doi.org/10.1119/perc.2016.pr.091

  142. Whitman, N. A., & Fife, J. D. (1988). Peer Teaching: To Teach is to Learn Twice.


  143. Williams, B., & Fowler, J. (2014). Can Near-Peer Teaching Improve Academic Performance? International Journal of Higher Education, 3(4), 142–149. https://doi.org/10.5430/ijhe.v3n4p142

  144. Wilson, S. B., & Varma-Nelson, P. (2016). Small groups, significant impact: A review of peer-led team learning research with implications for STEM education researchers and faculty. Journal of Chemical Education, 93(10), 1686–1702. https://doi.org/10.1021/acs.jchemed.5b00862

  145. Wilton, M., Gonzalez-Niño, E., McPartlan, P., Terner, Z., Christoffersen, R. E., & Rothman, J. H. (2019). Improving academic performance, belonging, and retention through increasing structure of an introductory biology course. CBE Life Sciences Education, 18(4). https://doi.org/10.1187/cbe.18-08-0155.

  146. Wolfe, P. (1999). How people learn: Brain, mind, experience and school. Educational Leadership, 57(3).


Acknowledgements

We thank Dr. Valerie Otero and Dr. Ben Van Dusen for valuable critical reading of and feedback on this manuscript. We also thank the Boston University College of Arts and Sciences and Center for Teaching and Learning for partially funding this project. We also extend a thank you to Dr. Kim McCall and the Department of Biology for their backing of this project. Additionally, we thank the Learning Assistant Alliance for providing free, accessible resources for obtaining information about the LA model. Lastly, we thank the anonymous reviewers for their invaluable suggestions that significantly improved our work.

Funding

We thank the Department of Biology, the Center for Teaching and Learning, the College of Arts and Sciences, and the Office of the Provost at Boston University for funding this project.

Author information

Contributions

AB designed the parameters for and performed the literature search for this review, determined which articles met the inclusion criteria, and was the major contributor in writing the manuscript. KS made contributions to the interpretation and analysis of studies included in this review and made substantial revisions to this manuscript. Both authors read and approved the final manuscript.

Corresponding author

Correspondence to Anthony P. Barrasso.

Ethics declarations

Competing interests

KS has professional appointments that may result in her benefitting from the success of the LA program at Boston University and the LA Alliance.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


About this article


Cite this article

Barrasso, A.P., Spilios, K.E. A scoping review of literature assessing the impact of the learning assistant model. IJ STEM Ed 8, 12 (2021). https://doi.org/10.1186/s40594-020-00267-8


Keywords

  • Learning assistant
  • Near-peer
  • Curriculum reform
  • Institutional change
  • Peer instruction