
The importance of context: an exploration of factors influencing the adoption of student-centered teaching among chemistry, biology, and physics faculty

Abstract

Background

Research at the secondary and postsecondary levels has clearly demonstrated the critical role that individual and contextual characteristics play in instructors’ decision to adopt educational innovations. Although recent research has shed light on factors influencing the teaching practices of science, technology, engineering, and mathematics (STEM) faculty, it is still not well understood how unique departmental environments impact faculty adoption of evidence-based instructional practices (EBIPs) within the context of a single institution. In this study, we sought to characterize the communication channels utilized by STEM faculty, as well as the contextual and individual factors that influence the teaching practices of STEM faculty at the departmental level. Accordingly, we collected survey and observational data from the chemistry, biology, and physics faculty at a single large research-intensive university in the USA. We then compared the influencing factors experienced by faculty in these different departments to their instructional practices.

Results

Analyses of the survey data reveal disciplinary differences in the factors influencing adoption of EBIPs. In particular, the physics faculty (n = 15) had primarily student-centered views about teaching and experienced the most positive contextual factors toward adoption of EBIPs. At the other end of the spectrum, the chemistry faculty (n = 20) had primarily teacher-centered views and experienced contextual factors that hindered the adoption of student-centered practices. Biology faculty (n = 25) fell between these two groups. Classroom observational data reflected these differences: The physics classrooms were significantly more student-centered than the chemistry classrooms.

Conclusions

This study demonstrates that disciplinary differences exist in the contextual factors that STEM faculty experience and the teaching conceptions they hold, even among faculty within the same institution. Moreover, it shows that these differences are associated with the level of adoption of student-centered teaching practices. This work thus identifies the critical need to carefully characterize STEM faculty’s departmental environment and conceptions about teaching before engaging in instructional reform efforts, and to adapt reform activities to account for these factors. The results of this study also caution against overgeneralizing findings from a study focused on one type of STEM faculty in one environment to all STEM faculty in any environment.

Background

Calls to reform instructional practices in science, technology, engineering, and mathematics (STEM) courses at the undergraduate level have multiplied over the last decade in the USA (Boyer Commission on Educating Undergraduates in the Research University 1998; National Research Council 1999, 2003, 2011, 2012; National Science Foundation 1996; President’s Council of Advisors on Science and Technology 2012; Project Kaleidoscope 2002, 2006; Rothman and Narum 1999). These calls have been prompted by insufficient uptake, among instructors teaching STEM courses, of the results produced by discipline-based education research (DBER). Numerous initiatives in the USA have been developed to attempt to address this research-practice gap (e.g., American Association of Universities 2011; Executive Office of the President of the United States 2012). These initiatives often focus on transforming the instructional practices of STEM faculty by raising faculty’s awareness of evidence-based instructional practices (EBIPs) (Handelsman et al. 2004; Handelsman et al. 2006; National Research Council 2011, 2012) and training faculty to implement them. EBIPs are instructional practices that have been empirically demonstrated to promote students’ conceptual understanding and attitudes toward STEM (Eberlein et al. 2008; Handelsman et al. 2004; National Research Council 2011, 2012), with the greatest impacts observed among women and members of underrepresented groups (President’s Council of Advisors on Science and Technology 2012).

Studies have demonstrated that even if faculty adopt EBIPs, they often modify them, sometimes at the expense of critical elements that make the practice effective (Henderson 2008; Henderson and Dancy 2009; Turpen and Finkelstein 2009, 2010). A gap is thus formed between the developers’ vision of an instructional strategy and the actual implementation of the strategy by instructors. This gap can be partly explained by the one-sided focus of educational research on the development of EBIPs and the collection of evidence demonstrating their impact on student learning. This focus is valuable, but comes at the expense of understanding the fit of EBIPs within actual teaching environments, including norms and culture of these environments and faculty’s beliefs and knowledge about teaching and learning (Doyle and Rosemartin 2012; Janssen et al. 2013; National Research Council 2012; Tobin and Dawson 1992; Van Driel et al. 2001). Indeed, extensive research on instructional change has positioned instructors at the center of successful educational reforms (Henderson et al. 2011). In particular, researchers have sought for years to investigate instructors’ actual teaching practices and knowledge about teaching, how instructors integrate their teaching knowledge with new instructional practices, and how instructors’ conceptual framework about teaching and their instructional practices grow as a result of participation in professional development (Pfund et al. 2009; Trigwell et al. 1994; Van Driel 2014; Verloop et al. 2001). Unfortunately, little progress has been made in most of these areas with respect to STEM faculty in postsecondary settings (Janssen et al. 2013; National Research Council 2012; Talanquer 2014). This study addresses aspects of this gap by building on a growing set of studies that focus on characterizing faculty’s knowledge of EBIPs, their instructional practices, as well as their perceived barriers to instructional innovation.

National online survey studies in physics, engineering, and geosciences have provided some insight into the teaching knowledge and practices of STEM faculty at various institutions (Borrego et al. 2010; Henderson and Dancy 2009; Macdonald et al. 2005). Results of these studies, presented in Table 1, show that although faculty in these particular STEM fields are aware of various EBIPs, only half of them implement one or more EBIPs in their courses. Since these studies are based on self-reported use, which typically overestimates the student-centeredness of actual practices, the actual rate of EBIP adoption is most likely even lower (D’Eon et al. 2008; Ebert-May et al. 2011; Kane et al. 2002; National Research Council 2012).

Table 1 Awareness and implementation of evidence-based instructional practices among STEM faculty

Although national studies such as these are necessary to capture the instructional landscape in STEM courses in higher education, research has demonstrated that the institutional context plays a critical role in faculty’s decisions about teaching (Austin 2011; D’Eon et al. 2008; Hora and Anderson 2012; Prosser and Trigwell 1997; Walczyk et al. 2007). Moreover, studies have found disciplinary differences in instructional practices and ways of thinking about teaching (Lindblom-Ylänne et al. 2006; Norton et al. 2005; Singer 1996), which make it challenging to generalize the findings of the physics, engineering, and geosciences studies to other disciplines where similar national studies have not yet been conducted (e.g., biology and chemistry). There is thus a need to better understand the variations in instructional practices and decision-making processes across a variety of STEM disciplines, and within a variety of institutions, in order for instructional reforms to be effective. The current study addresses this need by characterizing differences in awareness and adoption of EBIPs, as well as factors influencing instructional decisions, between faculty in the biology, chemistry, and physics departments of a research-intensive institution.

Conceptual framework

Instructional decisions

Several models have been developed to describe the decision-making process that faculty employ when deciding whether they will adopt an instructional innovation.

One model used in a national study on the awareness and adoption of EBIPs among physics faculty (Henderson et al. 2012b) is Rogers’ model of the innovation-decision process (Rogers 2003). This model describes the five decision-making stages through which an individual passes when deciding whether to adopt an innovation (Rogers 2003). The model starts at the knowledge stage, in which a faculty member has been made aware of the innovation and has some understanding of how it works. It is followed by the persuasion stage, in which the faculty member forms a positive or negative attitude toward the innovation. Next, at the decision stage, they evaluate whether they will implement the innovation in their course, culminating in a positive or negative decision. If a positive decision is made, they move to the implementation stage and try the innovation in their course. If they observe positive outcomes during the implementation stage, they finally commit to the long-term implementation of the innovation. Rogers’ model also includes the following four factors affecting these stages: 1) individual factors (e.g., faculty’s perceived need for the innovation), 2) social system variables (e.g., norms of the social system), 3) perceived characteristics of innovations (e.g., complexity), and 4) communication channels (i.e., means by which a person learns about the innovation) (Rogers 2003).

The factors identified by Rogers are present in other models describing the instructional decision processes of faculty in higher education. For example, drawing from the organizational change literature, which highlights the importance of the context within which faculty work and make decisions (Austin 2011; Gess-Newsome et al. 2003; Kezar 2001), Austin (2011) proposed a systems-approach framework for instructional change. This model closely aligns with the teacher-centered systemic reform model developed by Gess-Newsome et al. (2003). Both models identify a) the various organizational levels (e.g., department, institution, professional organizations) that influence faculty decision processes regarding teaching (Anderson et al. 2011; Austin 2011; Brownell and Tanner 2012; Childs 2009; Fixsen et al. 2005; Froyd 2011; Gess-Newsome et al. 2003; Graham et al. 2006; Henderson and Dancy 2007, 2011; Hora 2012a; Lomas 1993; Macoubrie and Harrison 2013; National Research Council 2012; Seymour et al. 2011; Walczyk et al. 2007), as well as b) individual influences, such as self-efficacy, beliefs about teaching, and pedagogical training, that shape faculty’s instructional decisions (Dancy and Henderson 2007; Ghaith and Yaghi 1997; Gordon and Debus 2002; Guskey 1988; Henderson et al. 2011; Henderson et al. 2012a; Hora 2012b; Kember and Kwan 2000; Murray and Macdonald 1997; Postareff et al. 2007, 2008; Schuster and Finkelstein 2006; Trigwell and Prosser 1996b).

Figure 1, which combines these complementary models, presents the model for faculty instructional decision-making that informs this project. Initially, faculty are unaware of a particular EBIP. In the next stage, faculty have been made aware of the instructional innovation and perhaps know some very basic information about it. At the interested stage, faculty are learning more about the innovation, forming an opinion about it, and deciding whether they want to implement it. In the adopted stage, faculty are testing out the innovation and eventually adopt it for long-term use. Figure 1 also highlights the factors, considered in this study, that influence these stages.

Fig. 1 Stages and factors influencing the instructional innovation-decision process

Factors influencing instructional decisions

Although there may be other factors influencing faculty decision-making, in this study we focused on three main categories: communication channels, contextual influences, and individual influences. Below, we highlight only the research associated with these factors.

Communication channels

Rogers (2003) identifies two types of communication channels, that is, ways by which information about an innovation is transmitted to the targeted population: mass media and interpersonal communication. Mass media in our study include peer-reviewed journals and conferences. Interpersonal communication includes workshops and discussions with colleagues, broadly defined (e.g., colleagues within one’s own department, high school teachers, discipline-based education researchers at another institution, etc.).

The communication channels used in DBER in the USA have several important similarities and differences across the three disciplines under study. All three communities have peer-reviewed journals targeting faculty within the disciplines, as well as education research symposia at national scientific meetings. However, some DBER fields have a longer tradition of disseminating EBIPs through interpersonal channels at the national level; moreover, the goals of these dissemination initiatives differ by discipline. In particular, the physics education research (PER) community has been running a workshop about EBIPs for new physics faculty since 1996 (Henderson 2008). This workshop has reached about 25 % of all new physics faculty each year (Henderson 2008). One recent study indicates that attendance at this workshop was one of the significant variables related to awareness of EBIPs (Henderson et al. 2012b). The goal of this workshop is to introduce physics faculty to several specific EBIPs. Biology education research (BER) has implemented the following two national programs since 2004: the FIRST series and the National Academies Summer Institutes (Ebert-May et al. 2011; Handelsman et al. 2004). These programs seek to raise biology faculty’s awareness of EBIPs and of the instructional implications of the results of DBER. They are also structured to promote faculty’s adoption of these practices. A similar type of workshop was implemented for the first time in chemistry in 2012 (Baker et al. 2014; Stains et al. 2015). Beyond this workshop, chemistry education research (CER) has implemented workshops at scale that introduce faculty to specific EBIPs and promote their adoption. Examples include the Multi-initiative Dissemination Workshop series in the 1990s, which included Peer-Led Team Learning, ChemConnections, and what is now known as Process Oriented Guided Inquiry Learning (POGIL) (Burke et al. 2004; Peace et al. 2002).

At the time of this study, there were no BER faculty at the institution where this study took place; there was one CER faculty member who had started a year prior to the beginning of the study (although two other CER faculty had left the department over a decade before the new CER faculty member’s arrival); and there were no PER faculty (although a PER faculty member was active in the department until 2005, and the Chair of the department had extensive experience as a high school physics teacher).

Due to these variations in national propagation efforts as well as the variation in availability of individuals likely to propagate EBIPs within the STEM departments investigated in this study, we can expect differences in awareness and adoption of EBIPs between the biology, chemistry, and physics faculty in our study.

Contextual influences

Barriers to instructional change in academia have been studied extensively over the past decades (Anderson et al. 2011; Austin 2011; Brownell and Tanner 2012; Childs 2009; Froyd 2011; Gess-Newsome et al. 2003; Henderson et al. 2011; Henderson and Dancy 2007, 2011; Hora 2012a; National Research Council 2012; Seymour et al. 2011; Trigwell and Prosser 1996b; Walczyk et al. 2007). In this study, we focus on departmental influences and characteristics of the learning environment, since the research has identified these factors as prominent barriers to instructional change (Gess-Newsome et al. 2003; Henderson and Dancy 2007; Hora 2012a; Hora and Anderson 2012). Departmental influences include perceived norms toward student-centered teaching within the department and felt pressure around promotion and tenure policies (Gess-Newsome et al. 2003; Henderson and Dancy 2007; Hora and Anderson 2012; Prosser and Trigwell 1997; Seymour et al. 2011). The characteristics of the learning environment include class size and layout, level of student preparation, and content coverage (Henderson and Dancy 2007; Hora and Anderson 2012; Prosser and Trigwell 1997).

Individual influences

Practical theories are complex conceptual and belief networks that constrain faculty’s instructional practices (Feldman 2000; Gess-Newsome et al. 2003). They include beliefs about teaching and learning, the roles of the instructor and students, as well as knowledge of instructional methods and their role in teaching specific content (Gess-Newsome et al. 2003). All instructors, independent of their level of experience, enter the classroom with personal practical theories, which have been developed through diverse avenues, including their experiences as students, reflections on their own or others’ teaching, and experiences as instructors in various settings (Feldman 2000). Research suggests that instructional behaviors are tied to at least one component of practical theories, the instructor’s conceptions of teaching, and that these conceptions constitute important barriers to instructional change (Dancy and Henderson 2007; Hora 2012b; Kember and Kwan 2000; Murray and Macdonald 1997; Schuster and Finkelstein 2006; Trigwell and Prosser 1996a, 1996b). Research on postsecondary faculty has shown that “to create and sustain fundamental change, there must be specific and concentrated attention to the personal practical theories of the faculty involved” (Gess-Newsome et al. 2003). Unfortunately, these practical theories have received little study among STEM faculty in the USA. This study addresses this need by characterizing and comparing the pedagogical experiences, attitudes, and beliefs toward student-centered teaching of biology, chemistry, and physics faculty at a research-intensive institution.

Research questions

The conceptual framework described above informed the development of the following research questions investigated in this study:

  1.

    To what extent do disciplinary differences exist in the (a) awareness and (b) adoption of evidence-based instructional practices among biology, chemistry, and physics faculty?

  2.

    To what extent do disciplinary differences exist in factors influencing biology, chemistry, and physics faculty’s awareness and adoption of evidence-based instructional practices?

We hypothesized that we would observe differences on these questions among the three departments examined in this study, since the communication channels employed to disseminate EBIPs differ by discipline. Moreover, the three departments function independently of each other, and we thus expect differences among the faculty in perceived pedagogical norms and promotion and tenure pressures, which, according to our conceptual framework, would lead to different levels of awareness and adoption.

Methods

Participants

Our aim in this study is to describe faculty from the departments of chemistry, biology, and physics at a single public university in the Midwest. The university is categorized by the Carnegie Foundation as a high undergraduate, large four-year, primarily residential, very high research activity institution (The Carnegie Classification of Institutions of Higher Education™. http://classifications.carnegiefoundation.org/ Accessed 03/08 2015). We attempted to recruit every faculty member from the biology, chemistry, and physics departments as part of a larger study characterizing the impact of a professional development program targeting STEM faculty on campus. The sample includes faculty who intended to participate in the on-site professional development program and faculty who did not. The data presented in this study (online surveys and classroom observations) were collected before the former group participated in the professional development. As Table 2 indicates, the response rate to the online survey was very high for all departments (54 to 74 %). A reasonable number of faculty also allowed the collection of video recordings of their lectures (from 17 to 41 %; see Table 2). Table 2 also shows that the population of study participants within each department is representative of the population of faculty in the whole department. In particular, the ratio of tenure-track faculty to lecturers among those who provided survey and observation data is similar to the ratio of these two types of faculty within the biology and physics departments as a whole; for the chemistry department, the observation sample is slightly skewed toward lecturers, but the survey sample is similar.

Table 2 Number of faculty surveyed and observed; percentages represent the proportion of tenure-track or lecturer within the sample considered

Data collected

Survey data was collected via the online survey software Qualtrics®. The research questions informed the design of the survey. Some questions were adapted from online surveys used in other research studies of STEM faculty (Borrego et al. 2011; Borrego et al. 2010; Henderson and Dancy 2009; Hora and Anderson 2012; Macdonald et al. 2005; Walczyk and Ramsey 2003). Others were created by the research team. Interviews were conducted with nine faculty in order to ensure that the questions and options were appropriately understood by potential participants. As a result of these interviews, refinements to several questions and associated options were made.

The survey was separated into the five following sections: 1) participant background, 2) awareness and adoption of EBIPs, 3) communication channels, 4) attitudes and beliefs toward student-centered teaching, and 5) contextual factors. The first section enabled us to collect relevant background data about the participants, including the distribution of their faculty appointment (teaching/research/service), their teaching load, the number of years since their first faculty appointment, and their prior experience with teaching workshops. The second section of the survey addressed our first research question by characterizing the participants’ awareness and adoption of various EBIPs. The remaining three sections of the survey addressed our second research question by characterizing different factors that have been identified in the literature as influencing faculty instructional practices. Specifically, the third section of the survey characterized the type of communication channels participants rely on for advice about teaching, including academic journals, professional conferences, and fellow faculty members. The fourth section of the survey included the Approaches to Teaching Inventory (ATI; Trigwell and Prosser 2004; Trigwell et al. 2005), which indirectly measures faculty beliefs about teaching. We also added questions that further probed their attitudes toward student-centered instructional practices. The last section enabled us to identify the extent to which participants’ teaching is influenced by various contextual factors, including characteristics of the learning environment (e.g., classroom environment, level of student preparation), departmental expectations toward active learning practices, and time constraints due to promotion and tenure pressures. The survey is provided in the Additional file 1.

In addition to the survey, we collected observational data on faculty teaching practices via videotaped classroom visits. A week’s worth of class periods (2–3 consecutive classes) were recorded for each faculty member. These classroom recordings were coded using the Reformed Teaching Observation Protocol (RTOP) and the Classroom Observation Protocol for Undergraduate STEM (COPUS) as described previously (Lund et al. 2015). The method to establish inter-rater reliability with these two protocols is described elsewhere (Lund et al. 2015). High levels of inter-rater reliability were achieved with both protocols (Lund et al. 2015); the average intraclass correlation coefficient for exact agreement among ten different pairs of coders was 0.875 ± 0.085 for the RTOP. For the COPUS, the average Cohen’s kappa was 0.908 ± 0.045 and 0.852 ± 0.69 for the students’ and instructor’s set of codes, respectively.
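As an illustration of how the agreement statistic can be computed for a single pair of coders, the following minimal sketch uses scikit-learn’s cohen_kappa_score. The per-interval code labels are invented placeholders (and, for simplicity, each interval is treated as carrying a single code); they are not data from Lund et al. (2015), whose reported values were averaged over ten coder pairs.

# Minimal sketch: Cohen's kappa for one pair of COPUS coders.
# Each coder assigns one code label per 2-min interval; labels below are placeholders.
from sklearn.metrics import cohen_kappa_score

coder_a = ["Lec", "Lec", "CQ", "SQ", "Lec", "MG", "Lec", "AnQ"]
coder_b = ["Lec", "Lec", "CQ", "Lec", "Lec", "MG", "Lec", "AnQ"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa for this pair: {kappa:.3f}")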

Data analysis

We first examined our participants’ background data (section 1 of the survey) to determine whether there were systematic differences between faculty from the three departments that could explain trends related to our research questions. Table 3 displays the average faculty appointment distributions as a percentage of total appointment responsibilities. A one-way ANOVA shows no significant difference between groups for teaching [F(2,53) = 0.573, p = 0.568], research [F(2,53) = 1.128, p = 0.331], and service [F(2,53) = 0.283, p = 0.754] responsibilities. In addition, a one-way ANOVA shows no significant differences between departments for the number of courses taught by the faculty [F(2,53) = 2.785, p = 0.071], indicating similar teaching-related time commitments across the departments (Table 3). Experience in teaching was also investigated through a one-way ANOVA. No differences were observed between the three groups of faculty who have, on average, been faculty members for 15 years [F(2,53) = 0.225, p = 0.799]. Although a lower percentage of chemists report having attended teaching workshops than biologists or physicists (Table 3), a chi-square analysis shows no statistically significant differences between groups for teaching workshop attendance [χ 2(2) = 4.923, p = 0.085]. Finally, a chi-square analysis shows no statistically significant differences between groups for the type of institution they attended as an undergraduate student, with roughly three quarters having attended a PhD-granting research institution [χ 2(2) = 0.049, p = 0.976], rather than an institution granting only a M.S., B.S., or B.A. in the faculty’s field of study. Assumptions underlying each statistical test were checked, and appropriate measures were taken when assumptions were not met. These various analyses indicated that the three groups of faculty were comparable on background variables that could be related to our outcome variables. We thus did not need to control for any of these variables for the rest of the analyses. All analyses were carried out using the SPSS package, version 22.
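For readers who want to reproduce this style of comparison outside SPSS, the sketch below runs a one-way ANOVA on a background variable and a chi-square test on workshop attendance using SciPy; all values and group sizes are hypothetical placeholders, not the study’s data.

# Hypothetical sketch of the background comparisons (one-way ANOVA and chi-square).
import numpy as np
from scipy import stats

# Placeholder teaching-appointment percentages for three departments.
teaching_bio = np.array([20, 30, 25, 40, 35])
teaching_chem = np.array([25, 30, 20, 45, 30])
teaching_phys = np.array([30, 25, 35, 40, 20])

f_stat, p_val = stats.f_oneway(teaching_bio, teaching_chem, teaching_phys)
print(f"ANOVA: F = {f_stat:.3f}, p = {p_val:.3f}")

# Placeholder counts of faculty who did / did not attend a teaching workshop.
workshop_counts = np.array([[15, 10],   # biology: attended, not attended
                            [8, 12],    # chemistry
                            [11, 4]])   # physics
chi2, p, dof, expected = stats.chi2_contingency(workshop_counts)
print(f"Chi-square: chi2({dof}) = {chi2:.3f}, p = {p:.3f}")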

Table 3 Faculty background data

In addition to examining the faculty background data for differences, we also examined whether there were systematic differences in the class context. In a prior study, we identified statistically significant differences in instructional styles based on the characteristics of the learning environment (i.e., course level, class size, classroom physical layout) (Lund et al. 2015). The lectures recorded for this study varied on all three of these characteristics, and statistically significant differences between disciplines exist (Table 4). Specifically, Fisher’s exact tests were conducted on all three characteristics of observed classrooms and demonstrated that statistically significant differences in the type of course level (p < 0.001), class size (p = 0.003), and classroom physical layout (p < 0.001) existed between the three disciplines. We will take these differences into consideration when discussing the results of the analysis of classroom instructional practices.

Table 4 Learning environment characteristics of observed classes by discipline

The analysis of the classroom observations is described in depth in a previous paper (Lund et al. 2015). In brief, RTOP scores were assigned for each lecture recorded, representing the level of “reformed teaching practices” observed in the classroom. Additionally, each class period was assigned to one of ten different COPUS teaching profiles via a cluster analysis of COPUS behavioral codes. These ten COPUS profiles represent different instructional strategies, ranging from teacher-centered to student-centered pedagogies. The profiles include the following: lecture (with slides or at the board), transitional lecture (primarily consisting of lecturing, but with a small percentage of time spent on student-student interactions), Socratic (with slides or at the board), limited peer instruction (with slides or at the board), extensive peer instruction, student-centered peer instruction, and group work; peer instruction is an instructional strategy in which students answer a conceptual question individually, vote, discuss the question further with peers, and then revote (Vickrey 2015). These ten instructional strategies can be categorized into four general instructional styles: lecturing, Socratic instruction, peer instruction, and collaborative learning.
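To make the profile-assignment step concrete, here is a minimal sketch of clustering class periods on their COPUS code frequencies. It assumes each observation is summarized as the fraction of 2-minute intervals in which selected codes occurred and uses Ward hierarchical clustering, which is one common choice; the exact codes, preprocessing, and clustering settings of Lund et al. (2015) may differ, and the values below are made up for illustration.

# Illustrative sketch: clustering class periods on COPUS code frequencies.
# Rows are observed class periods; columns are fractions of intervals in which
# selected codes occurred (placeholder values).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

copus_fractions = np.array([
    [0.95, 0.02, 0.00, 0.03],   # mostly lecturing
    [0.60, 0.25, 0.10, 0.05],   # lecture with some clicker questions
    [0.30, 0.40, 0.25, 0.05],   # extensive peer instruction
    [0.10, 0.20, 0.10, 0.60],   # group work
])

# Ward linkage on the code-frequency vectors, then cut the tree into k clusters.
tree = linkage(copus_fractions, method="ward")
profiles = fcluster(tree, t=3, criterion="maxclust")
print(profiles)  # cluster label (teaching profile) for each class period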

Results

In this study, we are interested in characterizing the level of EBIP awareness, the types of instructional practices, and the factors influencing teaching among faculty from the departments of biology, chemistry, and physics at one research-intensive institution. We will present results addressing each research question in turn.

Disciplinary differences in the awareness and adoption of EBIPs among biology, chemistry, and physics faculty

Awareness of EBIPs

First, we sought to establish the level of awareness of EBIPs among the chemistry, biology, and physics faculty participating in the study. In the online survey, faculty were presented with a list of seventeen of the most common EBIPs, including a brief description of each EBIP (e.g., peer instruction, just-in-time teaching, case studies, process oriented guided inquiry learning; see Additional file 1 for complete list). The majority of this list was adapted from prior surveys (Borrego et al. 2010; Henderson and Dancy 2009; Macdonald et al. 2005). Faculty indicated their orientation to each EBIP by selecting one of the following statements: (1) I have never heard of it, (2) I have heard the name but don’t know much else, (3) I am familiar but have not used it, (4) I am familiar and plan to implement it, (5) In the past, I have used all or part of it, but I am no longer using it, or (6) I currently use all or part of it. We defined “awareness” as a response of (3) or above on this scale.
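Operationally, the awareness tallies reported below can be derived directly from these response codes. The sketch below counts, for each respondent, how many EBIPs received a response of 3 or higher; the response matrix is randomly generated for illustration and does not reflect the survey data.

# Hypothetical sketch: counting how many EBIPs each respondent is aware of
# (awareness = response code 3 or higher on the 1-6 scale described above).
import numpy as np

# Rows = faculty, columns = the 17 EBIPs; values are placeholder response codes 1-6.
responses = np.random.default_rng(0).integers(1, 7, size=(60, 17))

aware = responses >= 3                  # statements 3 through 6
aware_counts = aware.sum(axis=1)        # number of EBIPs each respondent is aware of
print(aware_counts.mean(), aware.mean(axis=0))  # per-person mean and per-EBIP awareness rates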

The mean number of EBIPs that faculty were aware of is presented in Fig. 2a. Regardless of department, faculty professed awareness of approximately 11 out of the 17 EBIPs; a one-way ANOVA shows no significant differences between groups [F(2,57) = 0.223, p = 0.801]. Fig. 2b provides the percentage of faculty who were aware of X or more EBIPs. Although the curves have similar shapes, it is notable that every physics faculty surveyed professed awareness of at least six EBIPs, while only 80 and 88 % of the chemistry and biology faculty did so.

Fig. 2 Number of EBIPs faculty are aware of. a Average and standard deviation. b Percentage of faculty who are aware of X or more EBIPs

We were also interested in determining which specific EBIPs faculty were most and least aware of. Table 5 lists the percentage of faculty reporting awareness of each of the seventeen EBIPs, ranked by the overall average awareness. Clickers, collaborative learning, and animations were among the most well-known EBIPs in every department. The physicists surveyed were universally aware of clickers and peer instruction, strategies that are frequently coupled; interestingly, chemists were universally aware of clickers, but were far less aware of peer instruction (74 %). The institution at which this study took place had adopted clickers and installed the system in most classrooms a couple of years earlier; this may explain the high level of awareness of clickers among the faculty in this study.

Table 5 Percent of faculty aware of the indicated EBIPs. Results are displayed as the percent of faculty selecting statement 3 (I am familiar…) or higher. EBIPs familiar to two thirds or more of the faculty are bolded; EBIPs familiar to less than half of the faculty are italicized

A large proportion of faculty from at least two different departments were relatively well aware of formative assessment (chemists and biologists), case studies (biologists and physicists), and computer simulations (chemists and physicists). Physicists were unique in their strong awareness of just-in-time teaching, concept inventories, and interactive demonstrations. Biologists were unique in their relatively high awareness of cooperative learning and problem-based learning. SALG, concept maps, and SCALE-UP were among the least well-known EBIPs across all three departments. In addition, less than half of the biologists were aware of POGIL, interactive demonstrations, and concept inventories. Less than half of the physicists were aware of problem-based learning and cooperative learning.

Interest in EBIPs

As described in our conceptual framework, after a faculty member becomes aware of an EBIP, they may or may not proceed to a stage in which they are interested in that EBIP. In this stage, they seek out information about the innovation and develop a more informed opinion about it. Although this step is not the focus of our research questions, it is an important transitional step on the way to adoption of an EBIP. In our survey, a response of statement 4 (I am familiar and plan to implement it) represents a faculty member who is interested in a particular EBIP, and has even decided to implement it in their classroom, but has not yet done so.

The mean number of EBIPs that faculty were interested in is presented in Table 6. For the sake of this analysis, in addition to those who selected statement 4, we also included those who selected statements 5 (In the past, I have used all or part of it, but I am no longer using it) and 6 (I currently use all or part of it), since by definition, these users must have passed through a stage of interest in which they learned more about the EBIP. On average, faculty professed interest in approximately 6 of the 17 EBIPs, a drop from the eleven that they reported awareness of. Although we are beginning to see differences between the departments, with chemists expressing interest in approximately five EBIPs while physicists express interest in approximately seven, a one-way ANOVA shows no significant differences between groups [F(2,57) = 19.087, p = 0.212].

Table 6 Number of EBIPs faculty are interested in

Adoption of EBIPs

We next explored the self-reported level of adoption of EBIPs among chemistry, biology, and physics faculty. Only faculty who selected statement 6 (I currently use all or part of it) for a given EBIP were considered a current user of that particular EBIP. The mean number of EBIPs that faculty reported using is presented in Fig. 3a. On average, physicists reported using more EBIPs (5.6 ± 3.6) than biologists (3.2 ± 3.2), who themselves reported using more EBIPs than chemists (1.8 ± 2.0). A Welch F test (which we selected instead of a one-way ANOVA due to a violation of the assumption of homogeneity of variances) shows a significant difference between groups in reported EBIP usage [F(2,31.58) = 7.34, p = 0.002] with a large effect size (η 2 = 0.204). The results of a Games-Howell post-hoc test show that physicists had adopted significantly more EBIPs than chemists (p = 0.004). Therefore, despite being aware of a similar number of EBIPs, chemists reported implementing significantly fewer of these practices than physicists. Biologists’ usage rates fall between those of chemists and physicists.
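A sketch of this comparison, assuming the pingouin library for the Welch ANOVA and the Games-Howell post-hoc test; the department labels and per-faculty EBIP counts below are made-up placeholders, not the study’s data.

# Illustrative sketch of the Welch ANOVA and Games-Howell post-hoc comparison,
# assuming the pingouin library; the counts below are placeholders.
import pandas as pd
import pingouin as pg

df = pd.DataFrame({
    "department": ["chem"] * 5 + ["bio"] * 5 + ["phys"] * 5,
    "n_ebips":    [1, 2, 0, 3, 2,  4, 2, 5, 1, 6,  7, 5, 9, 3, 8],
})

print(pg.welch_anova(data=df, dv="n_ebips", between="department"))
print(pg.pairwise_gameshowell(data=df, dv="n_ebips", between="department"))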

Fig. 3 Number of EBIPs faculty are using. a Average and standard deviation. b Percentage of faculty who are using X or more EBIPs

Figure 3b indicates that although the percentage of faculty reporting use of at least one EBIP is somewhat similar across departments (65, 76, and 87 % for chemistry, biology, and physics, respectively), the reported EBIP adoption rates in the different departments quickly diverge. Thirty, fifty-two, and eighty-seven percent of chemists, biologists, and physicists, respectively, reported current usage of at least three EBIPs, while 5, 32, and 53 %, of chemists, biologists, and physicists, respectively, reported current usage of at least five EBIPs. The usage trends do not reconverge until approximately 11 EBIPs; less than 10 % of faculty in any department report using 11 or more EBIPs.

As in the previous section investigating EBIP awareness, we were interested in determining which specific EBIPs were the most and least adopted across the departments. Table 7 lists the percentage of faculty reporting current usage of each of the seventeen EBIPs, ranked by the overall average EBIP adoption. Clickers were the most highly adopted EBIP across all three departments, consistent with their status as the EBIP that faculty are most aware of. Again, despite the popularity of clickers, the level of adoption of peer instruction varied across the departments, ranging from only 5 % among chemists to a remarkable 87 % among physicists. Adoption of animations, formative assessment, collaborative learning, and case studies varied widely between departments (Table 7). Physicists were unique in their prevalent adoption of concept inventories, just-in-time teaching, interactive demonstrations, think-pair-share, and computer simulations. SALG was one of the least-used EBIPs across all departments. In addition, biologists reported very low levels of adoption (<10 %) of concept inventories and interactive demonstrations, while few physicists reported adoption of problem-based learning, POGIL, and concept maps. Remarkably, 11 of the 17 EBIPs exhibited very poor (<10 %) adoption rates for the chemistry faculty (Table 7).

Table 7 Percent of faculty reporting current use of the indicated EBIP. Results are displayed as the percent of faculty reporting current use. Percentages above one third are bolded; percentages below 10 % are italicized

Limitations to self-reported descriptions of instructional practices are widely recognized (D'Eon et al. 2008; Ebert-May et al. 2011; Kane et al. 2002). We thus collected observational data in addition to our survey to provide more accurate insight into the level of EBIP adoption in the classroom. First, the level of EBIP adoption was measured indirectly using the RTOP, since many EBIPs are comparable to the reformed instructional practices assessed by this observation protocol. Table 8 provides the means and standard deviations of RTOP scores across the three departments. A one-way ANOVA shows no significant difference between groups on RTOP scores [F(2,81) = 0.060, p = 0.942]. We also conducted a cluster analysis, which classified the video recordings into ten COPUS teaching profiles based on observed classroom behaviors. This process of video analysis and statistical clustering is described in detail in a previous paper (Lund et al. 2015).

Table 8 Averaged RTOP scores of classroom observations by discipline

Table 9 demonstrates that although the RTOP scores are similar across the three groups, there are important differences between departments when it comes to which instructional practices are being enacted in the classroom. The majority of the chemistry classroom observations (69 %) fell into the lecturing instructional style, while the physics classroom observations were split between lecturing and peer instruction (45 % each). Biology classroom observations were split across all four instructional styles, including collaborative learning (11 %). None of the physics or chemistry classroom observations fell into this particular instructional style. In order to evaluate whether these disciplinary differences in the level of student interaction were significant, we compared the frequency of lecture-based instruction (i.e., lecturing and Socratic instruction) and the frequency of instruction that integrates student interactions (i.e., peer instruction and collaborative learning) among the three disciplines. A chi-square analysis demonstrated a significant relationship between the type of instruction and the discipline, χ 2(2, N = 84) = 6.293, p = 0.043, Φ = 0.274. Significantly fewer of the chemistry observations fell into the category of instruction that integrates student interactions.
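This chi-square comparison can be reproduced from a 3 × 2 table of observation counts (discipline by instruction type) with SciPy, together with an effect size; the counts used below are hypothetical stand-ins chosen only to illustrate the layout, not the study’s actual observation counts.

# Hypothetical sketch: chi-square test of instruction type (lecture-based vs.
# interactive) by discipline, with an effect size (Cramer's V / phi).
import numpy as np
from scipy.stats import chi2_contingency

# Rows: biology, chemistry, physics; columns: lecture-based, interactive.
counts = np.array([[22, 13],
                   [22, 4],
                   [13, 10]])

chi2, p, dof, expected = chi2_contingency(counts)
n = counts.sum()
cramers_v = np.sqrt(chi2 / (n * (min(counts.shape) - 1)))
print(f"chi2({dof}) = {chi2:.3f}, p = {p:.3f}, V = {cramers_v:.3f}")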

Table 9 Distribution of lectures across the COPUS profiles by discipline

As was noted above (see Table 4), the significant differences in learning environments (i.e., course level, class size, and classroom layout) that we observed among the departments we surveyed would be expected to produce different distributions of COPUS teaching profiles. Thus, it was necessary to check whether the differences observed in Table 9 were merely due to these departmental differences in classroom contexts. Using the ratio of the instructor-centered instructional styles (lecturing and Socratic instruction) to the student-centered instructional styles (peer instruction and collaborative learning) found in each of these teaching conditions (see Lund et al. 2015, Fig. 2b–d), it is possible to estimate the expected rate of instructor-centered instructional styles according to how frequently each of these teaching conditions occurs in the biology, chemistry, and physics departments we surveyed. According to these calculations (Additional file 2), we would predict that the rate of instructor-centered instructional styles (lecturing and Socratic instruction) would be approximately 65 % in the biology and chemistry departments, and approximately 70 % in the physics department (due primarily to the higher percentage of small classes with tables). Notably, the rate of instructor-centered teaching observed in the biology department (62 % lecturing/Socratic instruction) matches our prediction. However, the observed rate in chemistry (85 %) is 20 percentage points higher than our prediction, and the observed rate in physics (54 %) is 19 percentage points lower than our prediction. This suggests that the different distributions of COPUS profiles observed in Table 9 are not only due to the differences in teaching contexts, but due to other factors as well. These potential factors will be discussed below.
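The expected rates quoted above amount to a weighted average: the instructor-centered fraction observed under each learning-environment condition (from Lund et al. 2015), weighted by how often that condition occurs among a department’s observed classes (Additional file 2). A minimal sketch of that calculation follows; both the condition-level rates and the weights are hypothetical placeholders, not values from the study.

# Illustrative sketch of the expected instructor-centered rate for one department:
# a weighted average of condition-level rates (hypothetical numbers throughout).
condition_rates = {                 # fraction of instructor-centered observations per condition
    "large_fixed_seating": 0.80,    # e.g., >100 students, fixed-seat lecture hall
    "medium_fixed_seating": 0.65,
    "small_with_tables": 0.50,      # e.g., <25 students, movable tables
}

dept_condition_weights = {          # share of the department's observed classes per condition
    "large_fixed_seating": 0.50,
    "medium_fixed_seating": 0.30,
    "small_with_tables": 0.20,
}

expected_rate = sum(condition_rates[c] * w for c, w in dept_condition_weights.items())
print(f"Expected instructor-centered rate: {expected_rate:.0%}")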

Disciplinary differences in factors influencing biology, chemistry, and physics faculty’s awareness and adoption of evidence-based instructional practices

Communication channels

Research on the diffusion of innovation has clearly demonstrated that the communication channels utilized by adopters influence their level of awareness and adoption of EBIPs (Rogers 2003). A portion of our survey asked faculty to indicate the extent to which they relied on various communication channels (e.g., academic journals, fellow faculty members, etc.) for advice about teaching. Participants indicated how frequently they utilized these resources using the following Likert scale: (0) Not Applicable, (1) Never/Rarely, (2) Sometimes, (3) Often, (4) Always. In addition, our survey asked faculty how often they attended a number of professional conferences that have some focus on teaching. Participants indicated their conference attendance using the following rating scale: (1) I have never heard of it/I have heard the name but I have never attended, (2) I have attended it once or twice in the past, (3) I attend this conference regularly, (4) I attend every conference offering. Table 10 displays the percentage of faculty who selected options 3 or 4 (Often/Regularly and Always/Every, respectively) on these scales.

Lecturers (or professors of practice) and “bench” faculty were among the top three most utilized communication channels across all three departments, although used at very different levels in each department. For example, physicists rely more on their bench colleagues than their lecturer(s), while the opposite is true for chemists. Faculty conducting science education research were modestly utilized across all departments; they were the second most utilized resource for chemists and physicists, although they were only the fifth most utilized resource for biologists, with just 17 % of biologists reporting high utilization of this communication channel. This could partly be explained by the fact that the chemistry and physics departments each currently have or have had at least one education researcher, but there was no biology education researcher in the biology department at the time of the data collection, raising the question of who biology faculty assume is taking on this role in their department. Finally, a fifth of physicists reported regular reliance on their department chair regarding teaching issues, whereas only one biologist (4 %) and no chemists reported such contact. At the time of data collection, the chair of the physics department had significant secondary teaching experience; this experience may explain why his colleagues saw him as a teaching resource.

Educational publications were largely underutilized except for educational texts or websites, which were used “often” by almost half of the biologists and a quarter of the chemists. Science education journals such as discipline-based education research journals and the Journal of College Science Teaching are notably not used by faculty, despite their explicit goals of targeting this population.

The educational conferences were among the least-utilized of the communication channels presented in Table 10. No faculty reported (4) I attend every conference offering, and only 4 faculty out of 60 (6.7 %) reported (3) I attend this conference regularly. These choices are closely analogous to (4) Always and (3) Often, the choices provided for utilization of the other communication channels, and thus are the most appropriate responses for direct comparison of conference attendance rates with the other communication channels. However, there clearly are far greater time and resource barriers to attending a conference than reading a journal article or communicating with a fellow faculty member. Accordingly, we were interested in considering what percentage of faculty had reported at least the rating of (2) I have attended it once or twice in the past for each conference. These results are presented in Table 11. Education talks at national conferences are clearly among the most utilized of these venues, with roughly half of all faculty reporting attendance at one or more of these talks. Chemists are unique in their participation in Science Education Conferences (e.g., Biennial Conference on Chemical Education, Gordon Research Conferences in Education), with roughly a third having attended such a conference.

Table 10 Percentage of faculty reporting often or always utilizing the following communication channels. Percentages above one quarter are bolded; percentages below 10 % are italicized
Table 11 Percentage of faculty reporting attendance at one or more meeting of the indicated conference

Contextual influences

Our conceptual framework highlights the importance of the context in which faculty work in their decisions to engage in instructional reforms. The literature has identified two critical aspects: the departmental context and the characteristics of the learning environment.

Departmental influences

Among the external factors that can influence faculty’s teaching practices, we were interested in faculty perceptions of the departmental expectations surrounding their teaching. Table 12 lists the percent of faculty indicating that the stated expectation is true (somewhat, quite a bit, or a great deal) in their department. Interestingly, a majority of the physicists reported each of the three expectations to be present in their department, whereas less than half of the chemists and biologists did so. Furthermore, although half of the chemists reported an expectation for active student involvement in class, a mere quarter of chemists reported an expectation that non-lecture techniques or a variety of techniques would be implemented in class.

Table 12 Percentage of faculty who identified the following departmental expectations as true. Percentages above two thirds are bolded; percentages below 50 % are italicized

As indicated in the conceptual framework, the departmental context can influence (positively or negatively) faculty’s decision toward instructional reform. We asked faculty to rate the extent to which several departmental characteristics influence their teaching on a scale from (0) Not Applicable, (1) Not at all, (2) A little, (3) Somewhat, to (4) Very/Extremely. Table 13 lists the results as a percentage of faculty selecting (3) or (4). The majority of the faculty felt that they have flexibility in how they teach their course (more than 90 % of faculty in every department) and that the priority placed by their department on research and the priority placed by their department on teaching influence their own teaching (over 73 % of the faculty in each department). Time constraints due to research commitments were more influential than administrative or service commitments within most departments, although chemists reported them to be equally influential. Departmental promotion or tenure pressures were reported as influential by roughly half of the chemists and physicists and by two thirds of the biologists. Interestingly, less than half of the faculty in each department reported that their departmental reward system influences their teaching.

Table 13 Percentage of faculty who identified the following departmental characteristics as influencing their teaching. Percentages above two thirds are bolded; percentages below 50 % are italicized

Characteristics of the learning environment

Significant variations between departments exist with respect to the influence of the learning environment on faculty’s teaching, as illustrated in Table 14. The percent of faculty reporting that class size dictates teaching methods ranged from three quarters of physicists to a striking 100 % of biologists. The physical classroom space was reported as influential for only 40 % of physicists, but for nearly 90 % of chemists and biologists. These faculty perceptions about the influence of class size and classroom space appear consistent with the actual class sizes and classrooms found across the departments. It is perhaps unsurprising that the biologists are particularly concerned with class size and classroom space since, as described in Table 4, the biologists we surveyed and observed were significantly more likely to be responsible for teaching large classes (>100 students) in fixed-seating lecture halls. Physicists, on the other hand, were significantly more likely to teach small classes (<25 students) in classrooms with tables.

Table 14 Percentage of faculty who identified the following pedagogical aspects as influences on their teaching. Percentages above two thirds are bolded; percentages below one third are italicized

TA availability and textbook selection were each cited as influential by over 70 % of chemists, but by only 50–60 % of biologists and by 40 % of physicists. The ability to cover all necessary content was reported as influential by nearly half of chemists, but by only a third of physicists and a quarter of biologists. Interestingly, although one of the least influential items for chemists and biologists was working with a required textbook or syllabus planned by others, physicists were as likely (40 %) to cite this influence as they were physical classroom space, TA availability, or their own textbook selection.

Half of chemists and two thirds of physicists reported being influenced by student preparation, as opposed to only a third of biologists. However, very few faculty reported that teaching evaluations by students influenced their selection of teaching methods.

Individual influences

Our conceptual framework highlights the importance of individual factors on faculty-teaching practices. These factors primarily relate to faculty past experiences in the classroom (both as a student and a teacher), as well as their attitudes and beliefs about teaching.

Pedagogical experiences

As previously mentioned, physicists were more likely to have attended pedagogical professional development programs than the biologists and chemists (Table 3). Interestingly, a strong majority of physicists (80 %) reported that knowledge of instructional methods influenced their teaching, while only a third of chemists or biologists did so. Moreover, the majority of biologists (60 %) reported basing their teaching on their own best teachers, although only half of chemists and a fifth of physicists did so.

We were interested in further exploring the relationships between the teaching methods faculty had personally experienced as students and their current teaching practices. We thus asked faculty to identify which of thirteen of the seventeen EBIPs they had experienced as students (Additional file 1). On average, faculty had experienced only one or two of the thirteen EBIPs during their entire undergraduate career (Fig. 4a). It was very rare for faculty to have experienced more than four of the EBIPs as an undergraduate, and no single faculty member reported having experienced more than six (Fig. 4b). In fact, roughly half of the faculty in each department had experienced none of the EBIPs at any time as a student, and no single EBIP had been experienced by all, or even most, of the faculty (see Table 15).

Fig. 4 Number of EBIPs faculty experienced as students. a Average and standard deviation. b Percentage of faculty who experienced X or more EBIPs when they were students

Table 15 Percent of surveyed faculty who had experienced the indicated EBIP

Among the faculty who had experienced EBIPs as students, we also quantified how likely they were to be currently using some of those EBIPs. Although there are very wide variations among faculty within each department, the physicists we surveyed implemented significantly more of the EBIPs that they had experienced as students than did the chemists (Table 16); F(2,10.958) = 6.235, p = 0.016, η 2 = 0.34.

Table 16 Average percentage of EBIPs each faculty is using, of the total EBIPs they personally experienced

Attitudes and beliefs toward student-centered teaching

We first explored faculty’s attitudes toward student-centered teaching by asking them the extent to which they agree with various statements regarding this style of pedagogy. Table 17 presents the results as a percentage of faculty selecting Agree or Strongly Agree with the indicated statement. A strong majority of faculty in all departments reported an interest in implementing non-lecture strategies. Chemists were somewhat more likely to believe that teaching with new instructional methods will limit content coverage (consistent with their views on covering content in Table 14), and much more likely to believe that group work is more appropriate in recitation.

Table 17 Attitudes toward student-centered teaching. Percent selecting agree or strongly agree. Percentages above two thirds are bolded; percentages below 50 % are italicized

Faculty beliefs about student-centered teaching were also measured through a validated instrument that was included in our survey, the Approaches to Teaching Inventory (ATI; Trigwell and Prosser 2004; Trigwell et al. 2005). The 22 items in this instrument are designed to measure two embedded construct variables: the conceptual change/student-focused (CCSF) approach to teaching and the information transmission/teacher-focused (ITTF) approach to teaching. This survey has been demonstrated to be a measure of teachers’ beliefs about teaching (Trigwell and Prosser 1996a). Each survey item was evaluated by the participant on a five-point scale from 1 (rarely or never true) to 5 (almost always or always true). Table 18 provides the means and standard deviations for each construct. A one-way ANOVA shows a significant difference with medium effect size between the groups on the CCSF and ITTF constructs [F(2,57) = 4.054, p = 0.023, η 2 = 0.12 and F(2,57) = 4.744, p = 0.012, η 2 = 0.14]. The results of a post-hoc Tukey test show that chemists score statistically significantly lower on the CCSF scale than physicists, and that chemists score statistically significantly higher on the ITTF scale than both biologists and physicists. We thus see disciplinary differences in faculty’s beliefs about teaching.
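For illustration, the following sketch scores the two ATI constructs from item-level responses and compares departments with a one-way ANOVA and an eta-squared effect size. The item-to-subscale assignment and all responses here are placeholders; the actual mapping of items to the CCSF and ITTF subscales is defined by the ATI (Trigwell and Prosser 2004).

# Hypothetical sketch: scoring the ATI subscales and comparing departments.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(1)
n = 60
df = pd.DataFrame(rng.integers(1, 6, size=(n, 22)),           # 1-5 Likert responses
                  columns=[f"item_{i}" for i in range(1, 23)])
df["department"] = rng.choice(["bio", "chem", "phys"], size=n)

# Placeholder item assignments; the real ATI maps specific items to each subscale.
ccsf_items = [f"item_{i}" for i in range(1, 12)]
ittf_items = [f"item_{i}" for i in range(12, 23)]
df["CCSF"] = df[ccsf_items].mean(axis=1)
df["ITTF"] = df[ittf_items].mean(axis=1)

groups = [g["CCSF"].values for _, g in df.groupby("department")]
f_stat, p_val = stats.f_oneway(*groups)
eta_sq = (f_stat * 2) / (f_stat * 2 + (n - 3))   # eta^2 from F with df = (2, n - 3)
print(f"CCSF: F = {f_stat:.3f}, p = {p_val:.3f}, eta^2 = {eta_sq:.2f}")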

Table 18 Results of the approaches to teaching inventory by discipline
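
A minimal sketch of this type of analysis, assuming hypothetical ATI scale scores and the reported group sizes (20 chemists, 25 biologists, 15 physicists), is shown below: a one-way ANOVA with an eta-squared effect size, followed by a Tukey HSD post-hoc test via statsmodels. The simulated scores and the reduction of the 22 items to a scale mean are illustrative assumptions, not the study's data or exact pipeline.

```python
# Minimal sketch: compare a hypothetical ATI scale (e.g., CCSF) across
# departments with a one-way ANOVA and a Tukey HSD post-hoc test.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)

def simulated_scale_scores(n, center):
    """Hypothetical scale scores (mean of the relevant ATI items, 1-5)."""
    return np.clip(rng.normal(center, 0.6, n), 1, 5)

# Hypothetical department-level CCSF scale scores (group sizes from the study)
scores = {
    "chemistry": simulated_scale_scores(20, 3.0),
    "biology": simulated_scale_scores(25, 3.4),
    "physics": simulated_scale_scores(15, 3.8),
}

# One-way ANOVA across the three departments
f_stat, p_val = stats.f_oneway(*scores.values())

# Eta-squared from sums of squares
all_vals = np.concatenate(list(scores.values()))
ss_total = np.sum((all_vals - all_vals.mean()) ** 2)
ss_between = sum(len(v) * (v.mean() - all_vals.mean()) ** 2 for v in scores.values())
print(f"F(2, {len(all_vals) - 3}) = {f_stat:.3f}, p = {p_val:.3f}, "
      f"eta^2 = {ss_between / ss_total:.2f}")

# Tukey HSD post-hoc comparisons between departments
groups = np.concatenate([[name] * len(v) for name, v in scores.items()])
print(pairwise_tukeyhsd(all_vals, groups, alpha=0.05))
```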

Discussion

The conceptual framework used in this study (Fig. 1) led us to hypothesize that disciplinary differences between biology, chemistry, and physics faculty would exist in their level of knowledge and adoption of EBIPs, as well as in the factors influencing their instructional practices. The results described above support these hypotheses. Our findings, which are summarized below, are organized by research questions and framed within the conceptual framework.

Disciplinary differences in the awareness and adoption of EBIPs among biology, chemistry, and physics faculty

The analyses of the level of awareness and adoption of EBIPs among biology, chemistry, and physics faculty at one research-intensive university in the USA show variations between disciplines in faculty's progression through the instructional innovation-decision process (Fig. 1). Although all three groups of faculty have a similar level of awareness of EBIPs (from M = 10.5 to M = 11.5 EBIPs; Fig. 2), the physics faculty have adopted three times as many EBIPs as the chemistry faculty (a statistically significant difference) and twice as many as the biology faculty (Fig. 3). Classroom observations confirmed that physics and biology faculty employ student-centered instructional practices more frequently than chemistry faculty (Table 8). Interestingly, eight of the most popular EBIPs are common to all three disciplines (animations, case studies, clickers, collaborative learning, computer simulations, formative assessment, peer instruction, and think-pair-share), and four more EBIPs are popular among two of the disciplines (concept inventories, interactive demonstrations, Just-in-Time Teaching, and Problem-Based Learning). Although the three groups of faculty are aware of the same types of EBIPs, the EBIPs that interest them and that they have adopted are not as similar: only two EBIPs interested all three types of faculty (clickers and formative assessment) and four interested two types of faculty (collaborative learning, computer simulations, concept inventories, and interactive demonstrations); similarly, only clickers were adopted by all three types of faculty, while animations and collaborative learning were adopted only by biologists and physicists.

Although several factors that influence the different rates of EBIP awareness and adoption among the three departments are discussed in the next section, these factors do not directly explain the departmental interest in the different types of EBIPs described in the previous paragraph. These departmental differences in the specific EBIPs that faculty are interested in and implement may reflect the EBIP characteristics that faculty in different disciplines prefer. Indeed, Rogers identified the following five characteristics of an innovation that a future adopter considers at the persuasion stage (Rogers 2003): relative advantage (the extent to which the innovation is perceived as better than current practices), compatibility with current instructional practices and norms, complexity (the extent to which the innovation is perceived as difficult to understand and use), trialability (the extent to which the innovation is perceived as being testable in a small setting), and observability (the extent to which outcomes of the implementation of the innovation will be seen by other faculty and members of the community). It was beyond the scope of this study to identify the specific characteristics of EBIPs that faculty in each of these three disciplines value, but these results indicate that this question should be investigated in future studies.

Disciplinary differences in factors influencing biology, chemistry, and physics faculty’s awareness and adoption of EBIPs

The results described above indicate that the faculty in the three departments we studied experience a wide spectrum of influences toward instructional innovation, ranging from supportive to impeding, which we summarize in Figs. 5, 6, and 7. First, in terms of communication channels, biologists and chemists use mass media (e.g., journals, websites) as well as colleagues when seeking solutions to teaching problems, while physicists primarily rely on their colleagues (Table 9). One similarity among all three departments is the reliance on colleagues within their own department when facing teaching problems. This is consistent with the importance of interpersonal channels that Rogers highlights in his model (Rogers 2003). The influence of communication channels can also be seen in the types of EBIPs that the faculty know about, are interested in, and use. For example, the use of case studies was one of the most popular EBIPs and one of the four most used EBIPs for biologists. This EBIP has been widely disseminated within the biology discipline, in particular through the National Center for Case Study Teaching in Science (National Center for Case Study Teaching in Science). Similarly, Peer Instruction, interactive demonstrations, and Just-in-Time Teaching are popular among physicists (Henderson and Dancy 2009). These EBIPs are promoted at the New Faculty Workshop for Physics and Astronomy faculty, which is attended by a quarter of new physics faculty in the country each year (Henderson 2008).

Fig. 5 EBIP awareness/adoption rates and factors affecting the instructional practices of biology faculty

Fig. 6 EBIP awareness/adoption rates and factors affecting the instructional practices of chemistry faculty

Fig. 7 EBIP awareness/adoption rates and factors affecting the instructional practices of physics faculty

Second, although biologists report a balance of supportive and impeding influences toward student-centered teaching, chemists report primarily impeding contextual influences and physicists report primarily supportive contextual influences. For example, the chemists perceive that there are weak norms toward student-centered teaching within their department and feel some constraints on their time due to research expectations; they also feel that certain elements of the learning environment (class size, layout, and availability of teaching assistants) constrain their teaching. In contrast, physicists perceive strong norms toward student-centered teaching within their department, although they feel similar constraints on their time due to research expectations; they identify class size as influential on their teaching, along with the level of student preparation, which indicates a focus on students rather than infrastructure. Clearly, faculty within these three departments are experiencing different contextual influences on their teaching, despite the fact that the departments are all on the same campus and managed by the same college. Our conceptual framework would indicate that the contextual influences experienced by the physicists are more conducive to the adoption of student-centered teaching practices than those experienced by the chemistry faculty.

Third, the same trends are observed for individual influences: chemists have had limited experience with and training in EBIPs, and hold teacher-centered beliefs and attitudes; in contrast, the majority of the physicists had participated in teaching workshops and held student-centered beliefs and attitudes. The biologists fell between the chemists and physicists, with a mix of student- and teacher-centered beliefs and attitudes. Faculty in the three departments thus exhibit different individual characteristics which, according to our conceptual framework, should result in departmental differences in the level of adoption of student-centered instructional practices.

These important differences in contextual and individual influences that we observed in the three disciplines suggest that we should be cautious when generalizing from studies exploring faculty instructional practices and decision-making about teaching; studies of faculty in one particular STEM discipline within one particular type of institution may not generalize well to all STEM faculty at all institutions.

Relationships between awareness/adoption rates and factors influencing instructional practices

Our conceptual framework indicates that communication channels and both contextual and individual influences will impact the instructional innovation-decision process (Fig. 1). Our findings support this framework. Indeed, our data demonstrate that faculty who experience contextual and individual influences supportive of student-centered instructional practices are significantly more likely to be interested in and to adopt EBIPs than faculty who experience impeding contextual and individual influences. For example, the chemists in this study primarily experienced contextual and individual influences that can impede adoption of EBIPs, including weak departmental norms and negative attitudes and beliefs regarding student-centered teaching; accordingly, chemists had the lowest adoption rate, adopting on average just 2 of the 11 EBIPs they are aware of. This self-reported adoption data is confirmed by classroom observations (Table 8), which produced an overwhelmingly instructor-centered distribution of COPUS profiles (lecturing, 69 %, and Socratic instruction, 17 %) at the expense of student-centered COPUS profiles (peer instruction, 13 %). In sharp contrast, physicists in this study primarily experienced contextual and individual influences that can support adoption of EBIPs, including strong departmental norms and positive attitudes and beliefs regarding student-centered teaching; this is reflected in their adoption of an average of 6 of the 11 EBIPs they are aware of, a rate triple that of the chemists. This self-reported adoption data is again confirmed by classroom observations, which showed a strikingly higher proportion of student-centered COPUS profiles (peer instruction, 45 %). The contextual and individual influences experienced by the biologists were intermediate between those of the chemists and physicists, as were their self-reported EBIP adoption rates and their distribution of COPUS profiles.

Readers may recall that statistically significant differences were observed in the learning environments across the three departments (Table 4) which, according to prior work (Lund et al. 2015), would be expected to produce differences in the distribution of COPUS profiles (Table 8). Our calculations (Additional file 2) resulted in a predicted rate of instructor-centered instructional styles (lecturing and Socratic instruction) that matched the observed rate among biologists in this study. However, the observed rate of instructor-centered instructional styles among chemists was 20 % higher than predicted, and the observed rate among physicists was 19 % lower than predicted. These differences are consistent with the imbalance of supporting and impeding influences summarized in Figs. 5, 6, and 7.
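
As a purely illustrative sketch of the comparison above: the chemistry observed proportion of instructor-centered profiles (86 %) follows from the percentages reported earlier (lecturing 69 % plus Socratic instruction 17 %), while the predicted values and the other observed values below are hypothetical placeholders standing in for the calculation in Additional file 2.

```python
# Illustrative only: observed vs. predicted proportions of instructor-centered
# COPUS profiles (lecturing + Socratic instruction). The chemistry observed
# value is derived from the text; all other numbers are hypothetical.
observed = {"biology": 0.66, "chemistry": 0.86, "physics": 0.55}
predicted = {"biology": 0.66, "chemistry": 0.66, "physics": 0.74}

for dept in observed:
    diff = (observed[dept] - predicted[dept]) * 100
    print(f"{dept:9s} observed - predicted = {diff:+.0f} percentage points")
```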

These observations and findings further confirm the need to take into account the characteristics of the targeted population in order for instructional reforms to be effective. The disciplinary differences observed in this study among faculty within the same institution call for more extensive investigations of the characteristics of faculty within each STEM discipline and across various types of institutions. These baseline data are critical to the design and implementation of instructional reforms that are adequately tailored to the needs of these various populations.

Limitations

First, this study was conducted at one research-intensive institution within the USA and should be replicated at similar institutions within and outside the USA. Second, the number of faculty who volunteered to be videotaped in the physics department was significantly lower than in the biology and chemistry departments, which limits the generalizability of the results to other faculty members within this department. Finally, data measuring faculty awareness of EBIPs were collected through self-report. Although a short description was provided for each EBIP, it is possible that faculty misinterpreted an EBIP and indicated knowing about it or using it when this was not the case. This issue was probed during the interviews undertaken to validate the online survey; adjustments to the descriptions were made accordingly, and certain EBIPs were eliminated due to a high probability of misunderstanding by the faculty (e.g., writing and sharing learning goals with students). However, we suspect that there may still be some confusion regarding the nature of certain EBIPs, despite our best efforts to eliminate it.

Conclusions

In this study, we explored and compared the awareness and adoption rates of EBIPs, as well as factors influencing instructional decisions, across faculty from three STEM disciplines: biology, chemistry, and physics. Faculty within one research-intensive institution in the USA were surveyed, and their classroom practices were analyzed. Results demonstrate that physicists at that institution are more likely to experience contextual and individual influences in support of student-centered teaching, while chemists are more likely to experience contextual and individual influences that impede student-centered teaching. These variations in the type of influences are consistent with the variations in the self-reported level of adoption of EBIPs, with physicists adopting the largest number of EBIPs and chemists the fewest. These trends in EBIP adoption rates are particularly striking, since the rates of EBIP awareness are very similar among the three departments. The self-reported EBIP adoption data were confirmed by observational data, which showed a heavily instructor-focused distribution of COPUS teaching profiles among chemists, while the distribution in physics included significantly more student-focused COPUS profiles. Biologists fall between the physicists and the chemists in terms of contextual and individual influences, self-reported EBIP adoption rates, and the observed distribution of COPUS teaching profiles. This study highlights that important departmental and disciplinary differences can exist, even across faculty within the same institution. Clearly, STEM faculty cannot and should not be treated identically when an instructional reform is initiated at an institution. Moreover, findings from this study caution against overgeneralizing the results of studies exploring the teaching practices and thought processes of faculty within one STEM field to all STEM faculty.

Finally, awareness rates in all three departments were quite high, which is consistent with findings from other studies. Reform efforts should thus focus not only on raising awareness that certain EBIPs exist, but also on increasing interest in and adoption of these practices.

Abbreviations

ATI: approaches to teaching inventory
BER: biology education research
CCSF: conceptual change/student-focused approach to teaching
CER: chemical education research
COPUS: classroom observation protocol for undergraduate STEM
DBER: discipline-based education research
EBIP: evidence-based instructional practice
ITTF: information transmission/teacher-focused approach to teaching
PER: physics education research
RTOP: reformed teaching observation protocol
STEM: science, technology, engineering, and mathematics

References

  • American Association of Universities (2011). Undergraduate STEM education initiative. https://www.aau.edu/policy/article.aspx?id=12588. Accessed 12/30 2014.

  • Anderson, W, Banerjee, U, Drennan, C, Elgin, S, Epstein, I, Handelsman, J, et al. (2011). Changing the culture of science education at research universities. Science, 331, 152.

  • Austin, A. (2011). Promoting evidence-based change in undergraduate science education. A white paper commissioned by the National Academies National Research Council Board on Science Education. http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_072578.pdf. Accessed 07/30/2015

  • Baker, LA, Chakraverty, D, Columbus, L, Feig, AL, Jenks, WJ, Pilarz, M, Stains, M, Waterman, R and Wesemann, JL. (2014). Cottrell Scholars Collaborative New Faculty Workshop: Professional Development for New Chemistry Faculty, Journal of Chemical Education, 91(11), 1874-1881 DOI:10.1021/ed500547n

  • Borrego, M, Froyd, JE, & Hall, TS. (2010). Diffusion of engineering education innovations: a survey of awareness and adoption rates in US engineering departments. Journal of Engineering Education, 99(3), 185–207.

  • Borrego, M, Cutler, S, Froyd, J, Prince, M, & Henderson, C. Faculty use of research based instructional strategies. In AAEE Conference, Fremantle, Western Australia, 2011

  • Boyer Commission on Educating Undergraduates in the Research University. (1998). Reinventing undergraduate education: a blueprint for America’s research universities (In Carnegie Foundation for the Advancement of Teaching (Ed.)). CA: Menlo Park.

  • Brownell, SE, & Tanner, KD. (2012). Barriers to faculty pedagogical change: lack of training, time, incentives, and … tensions with professional identity? CBE-Life Sciences Education, 11(4), 339–346.

  • Burke, K, Greenbowe, TJ, & Gelder, JI. (2004). The multi-initiative dissemination project workshops: Who attends them and how effective are they? Journal of Chemical Education, 81(6), 897.

  • Childs, PE. (2009). Improving chemical education: turning research into effective practice. Chemistry Education Research and Practice, 10(3), 189–203.

  • National Research Council. (1999). Transforming undergraduate education in science, mathematics, engineering, and technology. Washington, DC: National Academy Press.

  • National Research Council. (2003). Evaluating and improving undergraduate teaching in science, technology, engineering, and mathematics. Washington, D.C.: National Academy Press.

  • National Research Council. (2011). Promising practices in undergraduate science, technology, engineering, and mathematics education: summary of two workshops. Washington, DC: The National Academies Press.

  • D’Eon, M, Sadownik, L, Harrison, A, & Nation, J. (2008). Using self-assessments to detect workshop success: Do they work? American Journal of Evaluation, 29(1), 92–98.

  • Dancy, MH, & Henderson, C. (2007). Framework for articulating instructional practices and conceptions. Physical Review Special Topics-Physics Education Research, 3(1), 010103.

  • Doyle, W, & Rosemartin, D. (2012). The ecology of curriculum enactment. In Interpersonal Relationships in Education (pp. 137-147). Rotterdam, The Netherlands: Sense Publishers.

  • Eberlein, T, Kampmeier, J, Minderhout, V, Moog, RS, Platt, T, Varma-Nelson P, et al. (2008). Pedagogies of engagement in science: a comparison of PBL, POGIL, and PLTL. Biochemistry and Molecular Biology Education, 36(4), 262–273. doi:10.1002/bmb.20204.

  • Ebert-May, D, Derting, TL, Hodder, J, Momsen, JL, Long, TM, & Jardeleza, SE. (2011). What we say is not what we do: effective evaluation of faculty professional development programs. Bioscience, 61(7), 550–558. doi:10.1525/bio.2011.61.7.9.

  • Executive Office of the President of the United States (2012). Presidential commitments in support of PCAST recommendations on science, technology, engineering, and mathematics education. https://www.whitehouse.gov/sites/default/files/microsites/ostp/pcast-related_initiatives_fact_sheet.pdf. Accessed 07/30/2015

  • Feldman, A. (2000). Decision making in the practical domain: a model of practical conceptual change. Science Education, 84(5), 606–623.

  • Fixsen, DL, Naoom, SF, Blase, KA, Friedman, RM, & Wallace, F. (2005). Implementation research: a synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network.

  • Froyd, J. (2011). Propagation and realization of educational innovations in the system of undergraduate STEM education. A White Paper commissioned for the National Academy of Engineering Forum “Characterizing the Impact and Diffusion of Engineering Education Innovations”. https://www.nae.edu/File.aspx?id=36824. Accessed 07/30/2015

  • Gess-Newsome, J, Southerland, SA, Johnston, A, & Woodbury, S. (2003). Educational reform, personal practical theories, and dissatisfaction: the anatomy of change in college science teaching. American Educational Research Journal, 40, 731–767.

  • Ghaith, G, & Yaghi, H. (1997). Relationships among experience, teacher efficacy, and attitudes toward the implementation of instructional innovation. Teaching and Teacher Education, 13(4), 451–458.

  • Gordon, C, & Debus, R. (2002). Developing deep learning approaches and personal teaching efficacy within a preservice teacher education context. British Journal of Educational Psychology, 72(4), 483–511.

  • Graham, ID, Logan, J, Harrison, MB, Straus, SE, Tetroe, J, Caswell, W, et al. (2006). Lost in knowledge translation: time for a map? Journal of Continuing Education in the Health Professions, 26(1), 13–24. doi:10.1002/Chp.47.

  • Guskey, TR. (1988). Teacher efficacy, self-concept, and attitudes toward the implementation of instructional innovation. Teaching and Teacher Education, 4(1), 63–69.

  • Handelsman, J, Ebert-May, D, Beichner, R, Bruns, P, Chang, A, DeHaan, R, et al. (2004). Scientific teaching. Science, 304(5670), 521–522.

  • Handelsman, J, Miller, S, & Pfund, C. (2006). Scientific teaching. USA: W.H. Freeman & Company, in collaboration with Roberts & Company Publishers.

  • Henderson, C. (2008). Promoting instructional change in new faculty: an evaluation of the physics and astronomy new faculty workshop. American Journal of Physics, 76, 179.

  • Henderson, C, & Dancy, MH. (2007). Barriers to the use of research-based instructional strategies: the influence of both individual and situational characteristics. Physical Review Special Topics-Physics Education Research, 3(2), 020102.

  • Henderson, C., & Dancy, M. H. (2009). Impact of physics education research on the teaching of introductory quantitative physics in the United States. Physical Review Special Topics-Physics Education Research, 5(2), doi:10.1103/PhysRevSTPER.5.020107

  • Henderson, C., & Dancy, M. H. (2011). Increasing the Impact and Diffusion of STEM Education Innovations. A White Paper commissioned for the National Academy of Engineering Forum “Characterizing the Impact and Diffusion of Engineering Education Innovations”. https://www.nae.edu/File.aspx?id=36304. Accessed 07/30/2015

  • Henderson, C, Beach, A, & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: an analytic review of the literature. Journal of research in science teaching, 48(8), 952–984.

  • Henderson, C, Cole, R, Froyd, J, & Khatri, R. (2012a). Five claims about effective propogation: A white paper prepared for January 30–31, 2012 meetings with NSF-TUES Program Directors. http://homepages.wmich.edu/~chenders/Publications/2012WhitePaperFiveClaims.pdf. Accessed 07/30/2015

  • Henderson, C, Dancy, MH, & Niewiadomska-Bugaj, M. (2012b). Use of research-based instructional strategies in introductory physics: where do faculty leave the innovation-decision process? Physical Review Special Topics-Physics Education Research, 8(2), 020104.

  • Hora, MT. (2012a). Organizational factors and instructional decision-making: a cognitive perspective. The Review of Higher Education, 35(2), 207–235.

  • Hora, M. T. (2012b). A Situative Analysis of the Relationship between Faculty Beliefs and Teaching Practice: Implications for Instructional Improvement at the Postsecondary Level. (Vol. Wisconsin Center for Education Research Working Paper No. 2012–10).

  • Hora, MT, & Anderson, C. (2012). Perceived norms for interactive teaching and their relationship to instructional decision-making: a mixed methods study. Higher Education, 64(4), 573–592.

  • Janssen, FJJM, Westbroek, HB, Doyle, W, & Van Driel, JH. (2013). How to make innovations practical. Teachers College Record, 115(7), 1–43.

  • Kane, R, Sandretto, S, & Heath, C. (2002). Telling half the story: a critical review of research on the teaching beliefs and practices of university academics. Review of Educational Research, 72(2), 177–228.

  • Kember, D, & Kwan, KP. (2000). Lecturers’ approaches to teaching and their relationship to conceptions of good teaching. Instructional Science, 28(5), 469–490.

  • Kezar, AJ. (2001). Understanding and facilitating organizational change in the 21st century: recent research and conceptualizations (ASHE-ERIC higher education report (Vol. 28, no 4)). San Francisco: Jossey-Bass.

  • Lindblom Ylänne, S, Trigwell, K, Nevgi, A, & Ashwin, P. (2006). How approaches to teaching are affected by discipline and teaching context. Studies in Higher Education, 31(03), 285–298.

  • Lomas, J. (1993). Diffusion, dissemination, and implementation: Who should do what? Annals of the New York Academy of Sciences, 703(1), 226–237. doi:10.1111/j.1749-6632.1993.tb26351.x.

  • Lund, TJ, Pilarz, M, Velasco, JB, Chakraverty, D, Rosploch, K, Undersander, M, et al. (2015) The Best of Both Worlds: Building on the COPUS and RTOP Observation Protocols to Easily and Reliably Measure Various Levels of Reformed Instructional Practices, CBE Life Sciences Education, 14(2), ar18 DOI:10.1187/cbe.14-10-0168

  • Macdonald, RH, Manduca, CA, Mogk, DW, & Tewksbury, BJ. (2005). Teaching methods in undergraduate geoscience courses: results of the 2004 on the cutting edge survey of US faculty. Journal of Geoscience Education, 53(3), 237.

  • Macoubrie, J, & Harrison, C. (2013). Human services research dissemination: what works? In U. S. D. o. H. a. H. Services. Washington, D.C: Office of Planning, Research and Evaluation, Administration for Children and Families.

  • Murray, K, & Macdonald, R. (1997). The disjunction between lecturers’ conceptions of teaching and their claimed educational practice. Higher Education, 33(3), 331–349.

  • National Center for Case Study Teaching in Science. Case studies in science. http://sciencecases.lib.buffalo.edu/cs/. Accessed 03/12 2015.

  • National Research Council. (2012). Discipline-based education research: understanding and improving learning in undergraduate science and engineering. (Vol. 2012). Washington, D.C: The National Academies Press.

  • National Science Foundation (1996). Shaping the future: New expectations for undergraduate education in science, mathematics, engineering, and technology. In N. S. F. D. f. E. a. H. Resources (Ed.). Arlington, VA

  • Norton, L, Richardson, T, Hartley, J, Newstead, S, & Mayes, J. (2005). Teachers’ beliefs and intentions concerning teaching in higher education. Higher Education, 50(4), 537–571.

  • Peace, GE, Lewis, EL, Burke, K, & Greenbowe, TJ. (2002). The multi-initiative dissemination project: active-learning strategies for college chemistry. Journal of Chemical Education, 79(6), 699.

  • Pfund, C, Miller, S, Brenner, K, Bruns, P, Chang, A, Ebert-May, D, et al. (2009). Summer institute to improve university science teaching. Science, 324(5926), 470.

  • Postareff, L, Lindblom-Ylänne, S, & Nevgi, A. (2007). The effect of pedagogical training on teaching in higher education. Teaching and Teacher Education, 23(5), 557–571.

  • Postareff, L, Lindblom-Ylänne, S, & Nevgi, A. (2008). A follow-up study of the effect of pedagogical training on teaching in higher education. Higher Education, 56(1), 29–43.

  • President’s Council of Advisors on Science and Technology (2012). Engage to excel: Producing one million additional college graduates with degrees in science, technology, engineering, and mathematics. In Executive Office of the President (Ed.)

  • Project Kaleidoscope. (2002). Report on reports: recommendations for action in support of undergraduate science, technology, engineering and mathematics. Washington, D.C.: Project Kaleidoscope.

  • Project Kaleidoscope. (2006). Report on reports II: transforming America’s scientific and technological infrastructure: recommendations for urgent action. Washington, D.C.: Project Kaleidoscope.

  • Prosser, M, & Trigwell, K. (1997). Relations between perceptions of the teaching environment and approaches to teaching. British Journal of Educational Psychology, 67(1), 25–35.

  • Rogers, EM. (2003). Diffusion of innovations (5th ed.). New York: Free Press.

  • Rothman, FG, & Narum, JL. (1999). Then, now, & in the next decade: a commentary on strengthening undergraduate science, mathematics, engineering and technology education. Washington, D.C.: Project Kaleidoscope.

  • Schuster, JH, & Finkelstein, MJ. (2006). The American faculty: the restructuring of academic work and careers. USA: Johns Hopkins University Press.

  • Seymour, E., DeWelde, K., & Fry, C. (2011). Determining progress in improving undergraduate STEM education: The reformers’ tale. A White Paper commissioned for the National Academy of Engineering Forum, “Characterizing the Impact and Diffusion of Engineering Education Innovations”. https://www.nae.edu/File.aspx?id=36664. Accessed 07/30/2015

  • Singer, ER. (1996). Espoused teaching paradigms of college faculty. Research in Higher Education, 37(6), 659–679.

  • Stains, M, Pilarz, M., and Chakraverty, D. (2015) Short and Long-Term Impacts of the Cottrell Scholars Collaborative New Faculty Workshop, Journal of Chemical Education, ASAP DOI:10.1021/acs.jchemed.5b00324

  • Talanquer, V. (2014). DBER and STEM education reform: are we up to the challenge? Journal of Research in Science Teaching, 51(6), 809–819.

  • The Carnegie Classification of Institutions of Higher Education™. http://carnegieclassifications.iu.edu/. Accessed 03/08 2015.

  • Tobin, K, & Dawson, G. (1992). Constraints to curriculum reform: teachers and the myths of schooling. Educational Technology Research and Development, 40(1), 81–92.

  • Trigwell, K, & Prosser, M. (1996a). Changing approaches to teaching: a relational perspective. Studies in Higher Education, 21(3), 275–284.

  • Trigwell, K, & Prosser, M. (1996b). Congruence between intention and strategy in university science teachers’ approaches to teaching. Higher Education, 32(1), 77–87.

  • Trigwell, K, & Prosser, M. (2004). Development and use of the approaches to teaching inventory. Educational Psychology Review, 16(4), 409–424.

  • Trigwell, K, Prosser, M, & Taylor, P. (1994). Qualitative differences in approaches to teaching first year university science. Higher Education, 27(1), 75–84.

  • Trigwell, K, Prosser, M, & Ginns, P. (2005). Phenomenographic pedagogy and a revised approaches to teaching inventory. Higher Education Research & Development, 24(4), 349–360.

  • Turpen, C, & Finkelstein, ND. (2009). Not all interactive engagement is the same: variations in physics professors’ implementation of peer instruction. Physical Review Special Topics-Physics Education Research, 5(2), 020101.

  • Turpen, C, & Finkelstein, ND. (2010). The construction of different classroom norms during peer instruction: students perceive differences. Physical Review Special Topics-Physics Education Research, 6(2), 020123.

  • Van Driel, JH. (2014). Professional learning of science teachers. Topics and Trends in Current Science Education; Contributions from Science Education Research, 1, 139–157.

  • Van Driel, JH, Beijaard, D, & Verloop, N. (2001). Professional development and reform in science education: the role of teachers’ practical knowledge. Journal of Research in Science Teaching, 38(2), 137–158.

  • Verloop, N, Van Driel, JH, & Meijer, P. (2001). Teacher knowledge and the knowledge base of teaching. International Journal of Educational Research, 35(5), 441–461.

  • Vickrey, T, Rosploch, K, Rahmanian, R, Pilarz, M, & Stains, M. (2015). Research-based implementation of Peer Instruction: a literature review. CBE Life Sciences Education, 14(1). doi:10.1187/cbe.14-11-0198

  • Walczyk, JJ, & Ramsey, LL. (2003). Use of learner-centered instruction in college science and mathematics classrooms. Journal of Research in Science Teaching, 40(6), 566–584.

  • Walczyk, JJ, Ramsey, LL, & Zha, P. (2007). Obstacles to instructional innovation according to college science and mathematics faculty. Journal of Research in Science Teaching, 44(1), 85–106.

Acknowledgements

We would like to thank all members of the Stains research group who assisted with the collection of video recordings. This work was supported by the National Science Foundation, grant #1256003, and by start-up funding from the Department of Chemistry at the University of Nebraska-Lincoln.

Author information

Corresponding author

Correspondence to Marilyne Stains.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

MS conceived and designed the study. TL and MS managed the collection of surveys and video recordings, analyzed and interpreted the data, and drafted the manuscript. All authors read and approved the final manuscript.

Additional files

Additional file 1:

Online Survey. This file contains all the questions that were included in the online survey that was used to collect the data presented in the paper.

Additional file 2:

Learning Environments and Predicted Instructional Practices. This file contains an explanation of the instructional practices that would be predicted from the learning environments in which the video recordings were taken.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

About this article

Cite this article

Lund, T.J., Stains, M. The importance of context: an exploration of factors influencing the adoption of student-centered teaching among chemistry, biology, and physics faculty. IJ STEM Ed 2, 13 (2015). https://doi.org/10.1186/s40594-015-0026-8

