Open Access

Situated instructional coaching: a case study of faculty professional development

International Journal of STEM Education, 2016, 3:10

DOI: 10.1186/s40594-016-0044-1

Received: 10 March 2016

Accepted: 16 June 2016

Published: 24 June 2016

Abstract

Background

Barriers to reforming traditional lecture-based undergraduate STEM classes are numerous and include time constraints, lack of training, and instructor’s beliefs about teaching and learning. This case study documents the use of a situated instructional coaching process as a method of faculty professional development. In this model, a geoscience education graduate student (the coach) assisted a faculty member in reforming and teaching an introductory geoscience course on dinosaurs using evidence-based teaching strategies. The revision process occurred in three phases to progressively transfer responsibility for lesson design from the coach to the instructor over the length of the course. Data on instructional practices were collected using the Reformed Teaching Observation Protocol (RTOP), and belief changes experienced by the instructor were captured using the Teacher Beliefs Interview (TBI) and Beliefs about Reformed Science Teaching and Learning (BARSTL) survey.

Results

RTOP data confirm that the instructor was successful in teaching the lessons as designed and also gained skills in designing reformed lessons. TBI and BARSTL were indicative of a shift in the instructor's beliefs toward a more student-centered perspective.

Conclusions

Data collected on instructional practice and teaching beliefs indicate that this model served as an effective method of professional development for the instructor.

Keywords

Reformed instruction · Teaching beliefs · Active learning · Student-centered · Geoscience · Dinosaurs · Professional development

Background

The President’s Council of Advisors on Science and Technology’s (PCAST) report expresses concern that we are losing too many science, technology, engineering, and mathematics (STEM) students along their academic journey (PCAST 2012). Fewer than 40 % of students entering college as STEM majors complete a STEM degree, which may result in a future STEM workforce shortage (PCAST 2012). The reasons that students leave STEM are numerous, but among them are loss of interest in the discipline, lack of confidence in one’s ability to succeed in STEM, negative experiences in introductory STEM courses, and poor performance (Chen 2013). The PCAST report’s principal recommendation for boosting STEM majors is to “catalyze widespread adoption of empirically validated teaching practices.” The recognition that students leave STEM majors in significant numbers is not new (Seymour and Hewitt 1997). Neither is the call to change how introductory STEM courses are taught (Handelsman et al. 2004; NRC 1999). However, attention has recently shifted from defining effective teaching practices to discovering effective means of diffusing these practices into the STEM higher education community.

Empirically validated instructional practices are known by an array of names such as student-centered teaching, active learning, research-based instructional practices, or reformed teaching (Handelsman et al. 2004; NRC 1999; PCAST 2012). Examples of these practices include peer instruction (Crouch and Mazur 2001), lecture tutorials (Kortz et al. 2008; LoPresto and Murrell 2009), and a variety of other in-class, student-centered activities (Knight and Wood 2005; McConnell et al. 2003; NRC 2015). These practices have been demonstrated to improve student learning in a variety of instructional settings from smaller laboratory courses (Luckie et al. 2012) to large lecture courses (Crouch and Mazur 2001; Deslauriers et al. 2011; Freeman et al. 2014; Kortz et al. 2008; Walker and Cotner 2008). STEM programs that incorporate reformed teaching practices into introductory courses have also been shown to retain majors (Graham et al. 2013).

A critical aspect of the challenge of reforming undergraduate STEM education is encouraging faculty adoption of reformed teaching strategies in the classroom. Surveys of faculty in engineering (Borrego et al. 2010), geosciences (Macdonald and Manduca 2005), and physics (Henderson and Dancy 2009) reveal that less than half of respondents report some use of reformed teaching practices. Instructors using such methods may do so with modifications that reduce the efficacy of these strategies (Turpen et al. 2010). Ebert-May et al. (2011) utilized classroom observations to show that only 20 % of instructors who reported implementing reformed teaching practices following professional development actually moved toward a more student-centered classroom. A significant proportion of instructors who initially adopt these methods discontinue their use (Henderson and Dancy 2009) while approximately a third report integrating these practices extensively into their courses (Henderson et al. 2012).

Instructors report that the most prevalent barrier to the adoption of reformed teaching strategies is insufficient time to learn about these strategies and to revise courses to implement them (Dancy and Henderson 2010; Henderson and Dancy 2007; Sunal et al. 2001). Instructors also mention obstacles such as limited training in the use of reformed teaching strategies, lack of resources, uncertainty with the practice, and the absence of institutional support (Henderson and Dancy 2009; Walczyk et al. 2007). Additionally, many faculty may struggle with professional identity, where being regarded as a successful researcher holds higher status than being an effective teacher, especially at institutions that have a significant research culture (Brownell and Tanner 2012).

The impetus for this study came from an Earth Science faculty member at North Carolina State University who was discontented with her teaching and desired a change but was uncertain how to make it happen. The instructor sought help from the discipline-based education researchers within the Marine, Earth, and Atmospheric Sciences department. The resulting study describes the piloting of a collaborative professional development model, dubbed situated instructional coaching, for promoting change in both the instructional practice and beliefs of a faculty member teaching an introductory geoscience course. Under this model, the faculty member was assisted in revising and teaching the course by the lead author and primary investigator (PI), a second-year master’s student in a geoscience education program. The PI had one year of reform-based teaching training but no personal teaching experience in a lecture course. In addition to collecting data throughout the process, the PI assisted in revising lessons, creating student activities, and coaching the faculty member in the use of reformed teaching strategies. Data were collected to assess the effect of implementing reformed teaching strategies on the instructor’s professional growth as evidenced in teaching practice, lesson design skills, and beliefs about teaching and learning. All American Psychological Association (APA) ethical standards for the treatment of research subjects were complied with, and the study was carried out under the approval of the Institutional Review Board at NCSU (IRB #3172).

Conceptual framework: professional development

Professional development programs seek to facilitate instructional reforms and assist faculty in overcoming barriers to change. The standard short-term, workshop-based programs are among the least effective professional development methods (Ebert-May et al. 2011; Garet et al. 2001; Stes et al. 2010). Effective professional development programs typically last a significant period of time, involve many hours of faculty engagement (Garet et al. 2001; Stes et al. 2010), and are most successful when instructors are able to take an active role in the development process through implementing strategies, observing other instructors, and receiving feedback (Garet et al. 2001). On the basis of Henderson et al.’s (2011) review of change strategies and literature on effective professional development, the National Research Council (NRC 2012) suggests that at least two of the following strategies are necessary as part of successful programs for changing instructional practice: (1) sustained, focused efforts, lasting from 4 weeks to a semester, or longer; (2) feedback on instructional practice; and (3) a deliberate focus on changing faculty conceptions about teaching and learning.

Collaborative models for professional development provide a way to incorporate the NRC’s (2012) suggested features of successful programs and overcome many of the barriers to instructional change. Alternative professional development models that utilize some form of collaboration like peer learning, team teaching, or coaching have resulted in positive reforms to instruction (Emerson and Mosteller 2000; Stes et al. 2010). Working with a collaborator experienced in reformed teaching reduces the time constraints on faculty for revising courses while providing class-situated training on the implementation of reformed practices. This collaborative approach may result in a more positive experience with reformed instruction which in turn could lead to the achievement of desired outcomes such as the continued use of reformed teaching practices and the development of more student-centered teaching beliefs (Bailey and Nagamine 2012; Brogt 2007; Henderson et al. 2009; Wieman et al. 2013).

Conceptual framework: theory of change

Numerous theories of change have been put forth to explain the complex relationship of factors that contribute to instructional change and professional growth. Some theories stress the importance of changing teacher conceptions or beliefs as part of development, which in turn leads to changes in practice and desired student outcomes (Fullan 1982). This model of change is supported by research showing that changing an instructor’s beliefs leads to a change in practice (Ho et al. 2001) or that an instructor’s beliefs about teaching can impact their implementation of curricula (Brickhouse 1990; Cronin-Jones 1991; Roehrig and Kruse 2005) and scientific inquiry lessons (Roehrig and Luft 2004).

Guskey (1986) presented an alternative to this model, proposing that the change in instructors’ beliefs was driven by positive student outcomes that resulted from a change in practice. Evidence to support this “practice first” model comes from studies that show change in beliefs as a result of changes in practice (Devlin 2003; Hativa 2000). Similarly, some authors argue that beliefs may exist as a product of reflection on practice (Eley 2006).

These theories of change attempt to explain the relationships between beliefs, practice, and student outcomes in a linear fashion, with changes to one aspect of instruction driving subsequent changes to the other in a defined temporal order. It can be argued that the relationship of these variables is not well understood (Devlin 2006) and that the interplay between the variables of professional growth is more complex than a linear model might suggest. This has resulted in the development of more dynamic models of change that attempt to explain these relationships and provide a better understanding of professional growth (Clarke and Hollingsworth 2002; Saroyan and Amundsen 2001). Clarke and Hollingsworth (2002) generated the Interconnected Model of Teacher Professional Growth, in which change is the result of various interactions between four domains: (1) the external domain, any outside source of information or motivation; (2) the personal domain, one’s knowledge, beliefs, and attitudes; (3) the domain of practice, experimentation in the classroom; and (4) the domain of consequence, any student-related or other relevant outcomes.

The Interconnected Model of Teacher Professional Growth is a useful analytic tool for guiding and evaluating professional development programs. Clarke and Hollingsworth distinguish between change sequences and growth networks in their model, a distinction which is critical to professional development. Change sequences are simply defined as the change in one domain that creates a change in another, such as a teacher learning a new strategy (external domain) and experimenting with it in the classroom (domain of practice). Growth networks are those changes that involve numerous connections of enactment and reflection between multiple domains, essentially indicating permanent changes in both practice and teaching beliefs (Clarke and Hollingsworth 2002). The goal of professional development programs may be to change teacher practice, but programs that do not foster continued enactment and reflection across multiple domains limit the professional growth of teachers and may not ensure the long-term adoption of effective change.

Methods

Course revisions

The professional development study described in this paper was tied to the revision of a medium-sized, introductory-level course on dinosaurs in the geology department at a large research university. The course has no associated lab, an enrollment cap of 70 students, and was taught during two 75-min class periods per week. The course is generally taken by non-STEM majors to fulfill a general science requirement, and there is high interest due to the topic. The course is taught by instructor M, a faculty member with an internationally recognized research program, numerous publications in prominent journals, and considerable experience communicating her research to the general public. In the past, instructor M co-taught approximately half of the course (8 weeks). In the spring of 2013, she provided the PI with all the PowerPoint presentations from a previous iteration of the course. These presentations averaged 95 slides per 75-min class. Although no observational data are available from the previous iterations of the course, the slide counts, content coverage, and informal conversations with instructor M suggest a teacher-centered, lecture-dominated instructional style.

The revised course utilized reorganized lessons created with edited versions of the pre-existing PowerPoint presentations. The content covered in these presentations was expanded from 8 weeks to cover the entire 16-week semester with new topics added to fill content gaps. Instructor M and the PI outlined a syllabus that would allow for a reasonable amount of content coverage and time for student activities in each class period. The syllabus was divided into seven distinct modules on the basis of common content themes, with each module representing three to six class periods.

A three-phase system was used in creating the revised lessons for each class. For the first third of the semester (nine classes), all revisions and student activities were created by the PI, which allowed instructor M to focus on implementing the reformatted lessons and becoming comfortable with a student-centered instructional style. For the middle part of the course, the PI and instructor M worked together on the revisions, and during the last third of the course, all revisions were primarily handled by instructor M with minimal assistance from the PI.

New lessons were designed using the Integrated Course Design (ICD) model (Fink 2009). The ICD model focuses on five components of course design: situational factors, learning objectives, learning activities, feedback and assessment activities, and course integration. Using the ICD model, course design begins by considering situational factors such as class size, class time, technology available, and students’ prior knowledge. These factors are considered in the construction of appropriate learning objectives. Based on the situational factors and learning objectives, learning activities and related feedback and assessments are developed. Finally, these components are integrated into the classroom lesson plan by making sure that activities and assessments support the learning objectives, beneficial feedback is provided to support student learning, and situational factors will not hinder implementation of the lesson (Fink 2009).

While many of instructor M’s original PowerPoint presentations contained objectives at the beginning, they were mostly teacher-centered or not measurable. Four or five new student-centered learning objectives at varying levels of Bloom’s taxonomy (Bloom et al. 1956) were written for each of the revised lessons. The new learning objectives used appropriate action verbs that would allow for measurement of student learning on formative and summative assessments.

After drafting the learning objectives, conceptual multiple choice questions (ConcepTests; Mazur 1997) and student activity worksheets were created to accompany the lesson. Students responded to ConcepTests using clickers to indicate their answer choices, providing instructor M and the students with immediate feedback. Student worksheets were loosely modeled after lecture tutorials (Kortz et al. 2008), which require students to answer questions or complete activities related to material just presented. ConcepTests and worksheets served not only as formative assessment tools but also as a way to track student participation and attendance. The most common complaint about instructor M on previous student course evaluations was the fast pace of lecture. The worksheets and associated activities were designed to create breaks in lecture and address this concern. Working in pairs or small groups was encouraged and often necessary for many of the activities. The activities varied by lecture and often included multiple choice and short-answer questions, evaluating or writing hypotheses, or solving problems with provided data. Often these activities included learning reflection questions, such as asking students to summarize what they learned in a couple of sentences, identify what they felt were the most important points of the lesson, or rank their confidence in completing learning objectives. Two new mid-term exams and a final exam were written to align with the new learning objectives and assess students at various levels of Bloom’s taxonomy (Bloom et al. 1956).

Data collection instruments

The Reformed Teaching Observation Protocol (RTOP; Sawada and Piburn 2002) was used to characterize the level of reformed teaching that occurred during each class period. The supplemental RTOP rubric created by Budd et al. (2013) was also utilized to ensure consistent scoring. The RTOP rates a lesson on 25 items across five categories: (1) lesson design and implementation, (2) content propositional knowledge, (3) content procedural knowledge, (4) student-student interactions, and (5) student-teacher relationships. Each item is scored from 0 to 4, with the overall RTOP score for a class ranging from 0 to 100. The PI was able to observe instruction and generate RTOP scores for all but two classes taught by instructor M. These two classes were observed by peers who had undergone the same three-phase RTOP training and calibration as the PI. An additional observer was also used during four of the classes to ensure reliability of the PI’s RTOP scores. Inter-rater reliability for the additional observers and the PI was determined using a weighted Cohen’s kappa (k = 0.829, p < 0.001), and there was excellent agreement for total RTOP score (r = 0.95). No observational or RTOP data were available for prior iterations of the course.
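The RTOP arithmetic described above is simple enough to sketch in a few lines; the item ratings below are purely illustrative, not observations from this study:

```python
# Minimal sketch of RTOP total scoring. The item ratings used in the
# example are invented for illustration, not data from this study.

def rtop_total(item_scores):
    """Sum of 25 items, each rated 0-4, giving a total from 0 to 100."""
    assert len(item_scores) == 25
    assert all(0 <= s <= 4 for s in item_scores)
    return sum(item_scores)

# A class rated 2 on twenty items and 1 on the remaining five totals 45,
# placing it in the transitional band (31-49) of Budd et al. (2013).
print(rtop_total([2] * 20 + [1] * 5))  # 45
```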

The Teacher Beliefs Interview (TBI; Luft and Roehrig 2007) was used to capture instructor M’s beliefs about teaching and learning during the course redesign and implementation process. The TBI consists of seven short, focused questions pertaining to teaching and learning (Table 1). Answers to the seven questions were then coded and classified into one of five categories: traditional, instructive, transitional, responsive, and reformed. “Traditional” and “instructive” are considered teacher-centered views, while “responsive” and “reformed” are considered student-centered. Using a technique adopted from Roehrig and Kruse (2005), the coding of each question’s response can be converted to a number, ranging from 1 point for a traditional response up to 5 points for a reformed response. An interview of all seven questions can then be assigned a TBI score between 7 and 35.
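The coding-to-score conversion adopted from Roehrig and Kruse (2005) can be sketched as follows; the coded answers in the example are hypothetical, not instructor M’s responses:

```python
# Sketch of converting coded TBI responses to a total score, following
# the Roehrig and Kruse (2005) technique: traditional = 1 ... reformed = 5.
TBI_CODES = {"traditional": 1, "instructive": 2, "transitional": 3,
             "responsive": 4, "reformed": 5}

def tbi_score(coded_answers):
    """Total score for seven coded answers; ranges from 7 (all
    traditional) to 35 (all reformed)."""
    assert len(coded_answers) == 7
    return sum(TBI_CODES[code] for code in coded_answers)

# A hypothetical mixed interview: 2 + 3 + 2 + 2 + 1 + 3 + 4 = 17.
print(tbi_score(["instructive", "transitional", "instructive",
                 "instructive", "traditional", "transitional",
                 "responsive"]))  # 17
```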
Table 1

Teacher Beliefs Interview (TBI) questions (Luft and Roehrig 2007)

1. How do you maximize student learning in your classroom? (learning)

2. How do you describe your role as a teacher? (knowledge)

3. How do you know when your students understand? (learning)

4. In the school setting, how do you decide what to teach and what not to teach? (knowledge)

5. How do you decide when to move on to a new topic in your classroom? (knowledge)

6. How do your students learn science best? (learning)

7. How do you know when learning is occurring in your classroom? (learning)

During this study, an initial pre-course TBI was conducted 5 months prior to the semester to determine instructor M’s initial beliefs about teaching and learning. A second mid-course interview was conducted 5 weeks into the course after the first exam. A third end-of-course interview was conducted immediately after the last day of classes, and a final post-course interview was completed 1 month after the semester ended. All interviews were conducted and transcribed by the PI, and interviews were co-coded by the PI and a peer. Inter-rater reliability analysis performed using the Cohen Kappa statistic determined substantial agreement (k = 0.643, p < 0.001) between coders. It should be noted that no attempt was made to coach instructor M into more student-centered beliefs during the study. Discussions that could possibly lead to student-centered answers on the TBI were avoided throughout the project. Additionally, interviews were not transcribed, coded, and scored until after the final interview was complete.

After the final TBI, an additional informal, loosely structured interview was conducted to allow instructor M a chance to reflect on the experience. This interview used three simple questions as reflection prompts: (1) What went well? (2) What did not go so well? and (3) What would you change in the future? While data collected during this interview were not used to answer the guiding research questions, the responses help to frame some of the discussion below.

Instructor M also completed a Beliefs about Reformed Science Teaching and Learning (BARSTL) survey before each TBI (Sampson et al. 2013). The BARSTL is a four-option Likert scale survey composed of 32 items divided among four subscales: (1) how people learn about science, (2) lesson design and implementation, (3) characteristics of teachers and the learning environment, and (4) the nature of the science curriculum. The BARSTL survey is scored from 32 to 128 points. Respondents indicate their level of agreement with a statement by circling a number from 1 through 4, with 1 being strongly disagree, 2 disagree, 3 agree, and 4 strongly agree. Half the statements are phrased from a traditional perspective and are thus reverse scored, where circling a 1 (strongly disagree) is scored as four points.
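The scoring scheme, including reverse scoring, can be sketched as follows. Note that the reverse-scored item set below is a placeholder: only item 4 is confirmed as reverse scored in Table 2, and the instrument’s full published key should be used in practice.

```python
# Sketch of BARSTL item scoring. The reverse-scored item set here is a
# hypothetical placeholder (only item 4 is confirmed by Table 2); the
# published instrument defines which half of the 32 items are reversed.
REVERSE_SCORED = {4}

def score_item(item_no, response):
    """Convert a 1-4 Likert response (1 = strongly disagree) to points,
    reversing traditionally phrased items."""
    assert 1 <= response <= 4
    return 5 - response if item_no in REVERSE_SCORED else response

def barstl_total(responses):
    """Sum of all item scores; a full 32-item survey ranges 32-128."""
    return sum(score_item(no, r) for no, r in responses.items())

# Strongly disagreeing (1) with reverse-scored item 4 earns 4 points,
# while agreeing (3) with a positively phrased item earns 3.
print(score_item(4, 1))   # 4
print(score_item(13, 3))  # 3
```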

Results

RTOP

Instructor M taught a total of 24 classes over the semester, and the average RTOP score was 44.9 (Fig. 1). The highest-scoring class earned a 74 (class 13) and the lowest a 31 (class 28; Fig. 1). In the four classes where two observers were used, the average score was reported, although the observers’ scores never varied by more than four points. Missing class numbers (6, 7, 10, 21; Fig. 1) represent two days that the PI taught in instructor M’s absence and two exam days. The shaded regions of the graph represent generalized score characterizations from Budd et al. (2013), where scores of ≤30 represent teacher-centered, lecture-based classes, 31–49 are transitional classes with moderate student talk and activity, and ≥50 are student-centered active-learning classrooms.
Fig. 1

RTOP scores from each revised class. Missing class numbers represent exams or instructor absences. Score ranges for student-centered, transitional, and teacher-centered taken from Budd et al. (2013)

While the ICD model used to revise each class period allowed for some consistency in lesson design, the student activities that were created often dictated the format and structure of each class. Some classes featured better integration and implementation of activities than others, resulting in fluctuations in RTOP scores, which are discussed further below.

TBIs and TBI scores

Results from the TBIs showed an overall increase of seven TBI points from pre- to post-course (Fig. 2). The seven-point shift resulted from instructor M changing her beliefs on question nos. 1, 2, 3, and 6 (Fig. 3). On question no. 1, she changed from a teacher-centered to a transitional view, on question nos. 2 and 6 from a transitional to a student-centered view, and on no. 3 from a teacher-centered to a student-centered view. Her responses to question no. 4 exhibited dichotomous views, expressing either a teacher- or student-centered viewpoint. Question nos. 5 and 7 showed little or no change throughout the course revision process. Further analyses of TBI results with sample responses for each question are provided below.
Fig. 2

Total TBI interview scores

Fig. 3

TBI score for each TBI question across all four interviews. See Table 1 for complete text of TBI questions. TBI score codes: 1 traditional, 2 instructive, 3 transitional, 4 responsive, 5 reformed

No. 1: How do you maximize student learning in your classroom?

Instructor M maintained a teacher-centered belief on this question from pre- to end-of-course. Responses were focused on providing information in a structured environment or monitoring student actions: “And so what I do is give you the background that you need to understand…I provide them with notes to follow…by asking if everyone is on the same page.” By the final interview, after time to reflect on the experience, the answer had become transitional:

I think hands on experience and observation, if you give them the opportunity to see things…I think asking them questions that all are centered around what you want to pull out of them makes them think more than if you just tell them the information or if they read the information.

This response reflects a belief that students should be involved in the classroom environment.

No. 2: How do you describe your role as a teacher?

During the semester, instructor M maintained an instructive belief regarding this question, but her views changed on the post-semester interview. She often referred to the term facilitator, suggesting that she saw facilitating as providing an experience for the student to learn. During the pre-course interview, she did express an affective transitional belief regarding students’ fear of science: “many freshman in particular are scared to death of science…So I think to start by trying to un-threaten science.” However, the post-course answer to this same question revealed a reform-based view that considered the prior knowledge and interests of the students:

Giving students a framework, starting with where they’re at, building the root of the tree and then hanging the branches off is more your job than giving them de novo information. They won’t retain it if it’s just memorizing a bunch of isolated facts. You need to give them a way to tie them to what they already know and what they want to know and teach it.

No. 3: How do you know when your students understand?

On this question, instructor M showed a steady progression from an instructive view on the pre-course interview to a reform-based view on the final post-course interview. Her initial view focused on students being able to repeat presented information: “I try to ask questions in the lecture and if they can’t answer the questions I figure they’re not getting it…I guess ultimately it’s how they do on the test.” By post-course, she expressed a reform-based view: “…different applications of the basic concept. When they can do that, when they can take the concept and apply it in novel situations, I think they got it.”

No. 4: In the school setting, how do you decide what to teach and what not to teach?

This was a question where instructor M expressed strongly dichotomous views. During the pre- and post-course interview, she expressed instructive responses, reflecting a teacher focus on deciding what to teach with responses like “I basically go on my own experiences” or “I don’t teach what I don’t like.” During the mid-course and end-of-course interviews, she expressed reform-based views with a student focus. On the mid-course interview she stated, “I struggle with what does a future banker or accountant really need to know about dinosaurs?…it’s more about looking and learning how to see their world,” and from the end-of-course, “first of all, I want to be able to have them relate everything they learn about dinosaurs to the world they live in because it’s the only way it’s going to be pertinent.”

No. 5: How do you decide when to move on to a new topic in your classroom?

This was a question where instructor M never expressed anything other than a traditional view of the teacher controlling the direction of class. Every answer reflected the idea of sticking to the syllabus, from “I guess when I’m setting up the syllabus…” on the pre-course interview to “I decide when I’m making up my syllabus what topics I want to cover; I move on according to the syllabus rather than whether or not they’re getting it, and that’s probably wrong” on the post-course interview. As in the previous quote, in most interviews, she acknowledges or hints that this may not be the best strategy but did not move beyond a traditional belief.

No. 6: How do your students learn science best?

Instructor M initially expressed a transitional view on this question that students best learn science by doing it or using procedures. In both the mid- and post-course interviews, she expressed responsive views, acknowledging that students learn science best by not only doing but also interpreting: “By doing it…if they make an observation that they are surprised by, then they need to ask the question ‘why did this happen this way?’…They’ve got to be observing, they’ve got to be asking questions.” The traditional coding on the end-of-course interview is the result of instructor M not directly answering the question, forcing it to be coded as traditional.

No. 7: How do you know when learning is occurring in your classroom?

This question consistently received responsive replies, including some form of student-student or student-teacher interaction as a sign of learning. Instructor M’s responses usually included comments about students asking questions on the topic, talking to each other, or having discussions. Responses included “When they ask questions” and “I can see them playing with the bones, turning them upside down, trying to talk about them” on the pre-course interview to “when they get so excited about an idea that they actually look things up on their own or ask questions that are outside the box” on the post-course interview.

BARSTL survey

The BARSTL survey results showed an overall increase of six points from pre- to post-course (Fig. 4), albeit with a three-point drop from mid- to end-of-course. Three survey items (nos. 5, 19, and 23) were identified where instructor M’s score went from 3 (pre-) to 4 (mid-) to 3 (end-of-course) and back to 4 (post-course). The slight vacillation on these three items led to the drop in score between mid- and end-of-course, where there would otherwise have been no change.
Fig. 4

BARSTL survey results

With Likert-response surveys, there is an unclear degree of difference between the ordinal values selected by the respondent. The degree of difference between “agree” and “strongly agree” may vary between individuals, and the direction of agreement is a bigger factor in score changes (Peabody 1962). Changes between “agree” and “disagree” (score changes from 2 to 3 or 3 to 2) are also likely to represent a more fundamental shift in viewpoint. From pre- to mid-course, instructor M showed a positive scoring viewpoint change from disagree to agree on item nos. 9, 17, and 24. These items are related to student independence and talk in the classroom (the full text of these items can be seen in Table 2). By end-of-course, she had also switched to agreement on item no. 32, related to the focus of the science curriculum. The table also shows three positive scoring items where instructor M’s level of agreement changed, either from “agree” to “strongly agree” or from “strongly disagree” to “disagree.”
Table 2

BARSTL survey items showing change from pre- to post-course

| Item | Pre- | Mid- | End- | Post- | Item text |
|------|------|------|------|-------|-----------|
| Positive scoring viewpoint change | | | | | |
| 9 | 2 | 3 | 3 | 3 | During a lesson, students should explore and conduct their own experiments with hands-on materials before the teacher discusses any scientific concepts with them. |
| 17 | 2 | 3 | 3 | 3 | Students should do most of the talking in geoscience classrooms. |
| 24 | 2 | 3 | 3 | 3 | Geoscience teachers should primarily act as a resource person, working to support and enhance student investigations rather than explaining how things work. |
| 32 | 2 | 2 | 3 | 3 | A good science curriculum should focus on the history and nature of science and how science affects people and societies. |
| Positive scoring agreement level change | | | | | |
| 4a | 1 | 1 | 1 | 2 | Students are more likely to understand a scientific concept if the teacher explains the concept in a way that is clear and easy to understand. |
| 13 | 3 | 3 | 4 | 4 | Lessons should be designed in a way that allows students to learn new concepts through inquiry instead of through a lecture, a reading, or a demonstration. |
| 28 | 3 | 3 | 3 | 4 | The science curriculum should encourage students to learn and value alternative modes of investigation or problem solving. |
| Total of selected items | 15 | 18 | 20 | 22 | |

a Reverse scored (“strongly disagree” earns 4 points)
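The scoring convention behind Table 2 — a four-point Likert scale on which reform-oriented responses earn higher scores, with reverse-scored items (such as item 4) flipped — can be sketched as follows. This is an illustrative sketch, not part of the BARSTL instrument: the function names are ours, and the raw responses shown are inferred from the post-course scores in Table 2.

```python
# Sketch of Likert scoring with reverse-scored items, as used in Table 2.
# Assumes a 4-point scale: 1 = strongly disagree ... 4 = strongly agree.
# Only item 4 is reverse scored here, matching the table footnote
# ("strongly disagree earns 4 points"). Item numbers and responses are
# illustrative, inferred from the Table 2 post-course scores.

SCALE_MAX = 4

def score_item(response: int, reverse: bool = False) -> int:
    """Return the points an item earns; reverse-scored items flip the scale."""
    if not 1 <= response <= SCALE_MAX:
        raise ValueError(f"response must be 1..{SCALE_MAX}, got {response}")
    return (SCALE_MAX + 1 - response) if reverse else response

def total_score(responses: dict[int, int], reverse_items: set[int]) -> int:
    """Sum the scored responses over a set of survey items."""
    return sum(score_item(r, item in reverse_items)
               for item, r in responses.items())

# Post-course raw responses for the Table 2 items: "agree" (3) on items
# 9, 17, 24, 32, and 4; "strongly agree" (4) on items 13 and 28. For the
# reverse-scored item 4, a raw response of 3 earns 5 - 3 = 2 points.
post = {9: 3, 17: 3, 24: 3, 32: 3, 4: 3, 13: 4, 28: 4}
print(total_score(post, reverse_items={4}))  # 22, matching Table 2's post- total
```

Applying the same function to the pre-course responses reproduces the pre- total of 15, so the sketch is consistent with the scoring direction described above.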

Discussion

Change in instructional practice

The change in instructional practice undertaken by instructor M over the course of the semester is an example of a change sequence within the Interconnected Model. This change sequence started when instructor M sought help in revising her course (Fig. 5, arrow 1). The external domain was represented by the PI who became the source of information on new instructional strategies for instructor M. This led to a change in the domain of practice for instructor M, who was using many of these new strategies for the first time (Fig. 5, arrow 2). This change in practice represents a critical first step of the professional development experience for instructor M, and the RTOP observational data were collected as evidence that this change was successful compared to what was known about instructor M’s prior practice.
Fig. 5

Interconnected Model of Professional Growth, adapted from Clarke and Hollingsworth (2002)

RTOP data show that the revised course was never taught below a transitional level and that a third of the classes (8 of 24) were student-centered, scoring 50 or above on the RTOP (Fig. 1). The RTOP data provide evidence that instructor M was able to implement the redesigned lessons and reformed teaching practices effectively. No class was taught in a teacher-centered style (Fig. 1). This represents a significant shift for an instructor who began this experience with traditional, teacher-centered practices and beliefs. There was a concern at the start of this study that instructor M would follow the example of other STEM faculty (Turpen et al. 2010) and modify the proposed teaching strategies so that they aligned more closely with traditional methods. Even with over 20 years of teaching experience, she admitted to anxiety over teaching and public speaking. Talking non-stop during lecture was a coping mechanism, which she touched on during the informal interview:

I like to turn all the lights off and hide behind the slides. They’re not looking at me, they’re looking at the slides. I’m really shy. So having to get up there and do these things that are out of my comfort zone like stop in the middle of lecture and give them time to work on their own, that’s hard for me.

Having the PI as a collaborator in the classroom and being able to participate in post-class reflections and discussions on the implementation of the teaching practices may have eased the instructor’s concern over effectively implementing the new lessons. Additionally, the presence of a collaborator in the classroom may have added a level of accountability, ensuring that she did not slip back into traditional teaching practices that may have been more comfortable.

This is not to say that implementation was flawless or went smoothly in every class. Most classes that scored in the transitional RTOP range featured lessons with a student-centered design but less than ideal implementation. One example is class 9 (RTOP score = 45), which contained a student debate on whether Tyrannosaurus rex was a predator or a scavenger. The class was divided in half, and students worked in small groups to formulate supporting evidence for their side of the debate. Students would then present the case for their side, and groups from the other side would have a chance to counter their points. Instructor M hurried the students through the activity in 10 min, insufficient time for them to achieve the lesson objective of compiling evidence and formulating an argument to support their position. As a result, only a couple of groups presented their cases before the class moved on to a lecture on the topic. These actions, which occurred several times during the semester, may be the result of anxiety about content coverage or impatience with student-centered activities.

We interpret these responses as a consequence of instructor M’s traditional beliefs about when to move on to a new topic in the classroom, which never varied throughout the experience (Fig. 3). A desire to stick rigidly to the syllabus and get through the material often led to rushing student activities so as not to run out of lecture time. For example, class 9 finished several minutes early; it would most likely have earned a student-centered score (RTOP ≥ 50) had this time been used for further student-student interaction and class discussion associated with the planned activity. In light of our conceptual model of change (Fig. 5), this can be seen as a change sequence in which instructor M uses new teaching practices, but the new practices do not impact or reshape her traditional beliefs about when to move on to new topics. For this impact to occur, an effort may need to be made to focus on the student learning outcomes of these new practices. If instructor M is able to see learning gains from students being active and having sufficient time to achieve their goals, these outcomes could help shape her beliefs in this area. Creating this connection among three domains (practice, consequence, and personal) would represent a growth network more likely to influence instructor M’s beliefs and continued practice.

Another factor may also partially explain some of the lower RTOP scores. Unlike in some other STEM disciplines, very few resources or activities appropriate for a college-level course on dinosaurs were available in the literature or online. While some activities could be borrowed or modified from geology or biology resources, most of the activities and ConcepTests were created by instructor M and the PI. Consequently, every class required redesigning the lesson by developing or modifying presentations, writing learning objectives, generating ConcepTests, and creating student activities and worksheets. The one major exception was the highest-scoring RTOP class (no. 13), which discussed cladistics using a student-centered activity called The Great Clade Race (Goldsmith 2003) that had been shown to promote student learning on the topic (Perry et al. 2008). Exam questions, online quiz question banks, and learning journals were also being created during the semester. Creating all these materials was often time consuming, and some lesson designs were less than ideal due to these time constraints. We anticipate that this will be less of an issue in future iterations of the course, as activities should require only moderate modification and refinement.

The three-phased design process used in revising the course also allowed instructor M to develop lesson design skills as the semester progressed. Instructor M was largely responsible for creating lessons for the final third of the course, which comprised seven classes following the second mid-term exam. The average RTOP score for these classes was 41.7, slightly below the overall average of 44.9, and none of the lessons she created was taught below a transitional level (RTOP 31–49). By the mid-point of the semester, instructor M had become competent in writing clear and measurable learning objectives from the student perspective. After about 10 weeks, she had become comfortable with the ICD model (Fink 2009) and was also generating appropriate student activities, feedback, and assessments. This is evidence for a link between the external domain and the personal domain in the Interconnected Model (Fig. 5, arrow 3), where instructor M acquired new teaching knowledge.
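The RTOP bands used throughout this discussion can be summarized in a small sketch. The cutoffs follow the text (transitional 31–49, student-centered ≥ 50); treating scores below 31 as teacher-centered is our inference from those bands, and the example scores are illustrative values drawn from the discussion, not the study’s full data set.

```python
# Sketch of the RTOP score bands referenced in the text: 50 or above is
# student-centered, 31-49 is transitional, and (by implication) scores
# below 31 are teacher-centered.

def rtop_category(score: float) -> str:
    """Classify an RTOP score into the bands used in this study."""
    if score >= 50:
        return "student-centered"
    if score >= 31:
        return "transitional"
    return "teacher-centered"

# Class 9 (the T. rex debate) scored 45; class 13 (the Great Clade Race)
# scored 74. These two scores are the only individual values given in the text.
for label, score in [("class 9", 45), ("class 13", 74)]:
    print(label, score, rtop_category(score))
```

Under these bands, the reported averages of 41.7 (final third of the course) and 44.9 (overall) both fall in the transitional range, consistent with the claim that no lesson was taught below that level.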

Situated instructional coaching as professional growth

If situated instructional coaching is to be seen as a successful model for professional development or growth, evidence for continued and lasting change should appear in the domains of the Interconnected Model. Evidence for such changes came from the multiple BARSTL surveys and TBIs conducted during and after the semester of changed practice. These tools provided evidence that changes in the domains of practice and consequence also led to changes in the personal domain for instructor M, driven by personal reflection on the change in practice as well as on the domain of consequence (outcomes). The informal interview provided insight into these reflective moments that occurred between the domains as instructor M changed her practice during the semester.

Instructor M exhibited a shift in beliefs on question nos. 1 and 3 of the TBI (Fig. 3). These questions deal with how to maximize student learning and knowing when students understand. During the final TBI, she mentioned hands-on experience and observation as ways of maximizing student learning and the ability to apply information as a signal of student conceptual understanding. These views were also reflected in changes to her responses on item nos. 9, 13, and 17 on the BARSTL survey which deal with student exploration, talk, and inquiry (Table 2). Teaching the revised course format was the first time she had given her students substantial responsibility for their own learning in an undergraduate classroom. In doing so, she recognized the value of students taking an active role in their learning, and this likely influenced changes to her answers on the TBI questions. When asked during the informal interview about the course revision experience and what went well, she stated:

I think that the chance to break up into small groups to talk over the concepts or work on guided exercises, I think that went really well. It’s the first time I’ve ever been exposed to anything like that and I think it really did help the students kind of cement. I would have liked to have seen more breaking up into small groups and having them argue, or have them teach something.

This response indicates not only that instructor M sees the value of active learning for students but also that active learning gives the instructor an opportunity to formatively assess student learning and conceptual understanding by interpreting how students utilize and apply their new knowledge. From this response, we see that instructor M has reflected on the positive student outcomes of the new instructional strategy (Fig. 5, arrow 6), which has in turn led to a reflective change in her own beliefs (Fig. 5, arrow 8), as evidenced by the TBI results. She also notes that she would have liked to see more student debate or teaching, an indication that she may increase the use of, and refine, such strategies (Fig. 5, arrow 7).

In the above example, instructor M was reflecting on the student outcome of learning, by noting that the activities helped students to “cement” concepts. Another student outcome that was noted is that of student participation or engagement: “My favorite thing I think…I liked the fact the students were more engaged throughout the semester than I’ve had before, comparing the old way of teaching and the new.” This represents another reflective link between the domain of practice and consequence (Fig. 5, arrow 6).

The other TBI question that showed significant change was no. 2, which dealt with instructor M’s role as a teacher. During the final interview, she described her role as a facilitator responsible for determining what students know and what they want to know, and building from there. Responses on the BARSTL to item nos. 4 and 24 also demonstrate a change in how instructor M views the role of the teacher, toward the idea that guiding student investigations and inquiry is more important than teacher explanations and lecture. Reflection on the highly student-centered practices used in classes 4, 5, and 13 may have been influential in changing instructor M’s views on her role as teacher (Fig. 5, arrow 4). During the informal interview, she talked about class 13, the Great Clade Race class that scored a 74 on the RTOP, saying “I loved the clade race activity because it’s the first time in all the years I’ve been in paleo that I’ve gotten it. That was a really good way to teach those concepts.” She acknowledges the efficacy of a lesson that contained virtually no lecture and in which her role was to guide the students through an activity.

Further reflection on the domain of practice also occurred, since the new instructional style was one she was uncomfortable with. She stated in the informal interview “It’s hard for me…so to stop, I’m on a roll and I’m teaching these concepts, and then I have to stop and let them break up into groups. It’s really uncomfortable, but I like it and I would do it again.” She reflects on the use of the new practices (Fig. 5, arrow 4), indicates a changing attitude toward them, and says she would use them again (Fig. 5, arrow 5).

The last link in the model that has not been discussed is arrow 9 (Fig. 5), the reflective link leading from the personal domain to the domain of consequence. Just as the outcomes of an instructional strategy can impact a teacher’s beliefs or attitudes, changes in the personal domain can change how an instructor views certain outcomes. For instructor M, who initially had a very traditional lecture style, stopping to let students talk and work together was uncomfortable and signaled a loss of control; she initially viewed it as a negative outcome. As seen in some of the previous quotes, she is becoming more comfortable with student talk and activity in the classroom and beginning to view it as a positive outcome indicative of increased engagement.

Limitations

Measuring student learning outcomes under the reformed teaching strategies would have helped reinforce the value of student-centered instruction. These strategies have repeatedly been shown to improve student learning and reduce failure rates (Freeman et al. 2014), benefits which may help lower STEM attrition rates. Comparing student performance on exams with past iterations of the course was not possible because instructor M had not preserved exam grade data from when the course was last taught. Additionally, the old exams assessed students at the lowest levels of Bloom’s taxonomy (knowledge and comprehension), while the new exams were written to assess students at various higher levels and to align with the new learning objectives. Although student performance data were not collected, the revised course did utilize research-based strategies that have been shown to promote student learning (e.g., Freeman et al. 2014; Knight and Wood 2005; Kortz et al. 2008; McConnell et al. 2003).

No data were available from prior iterations of the course regarding instructional practice. Instructor M reported having a lecture-dominated, traditional instructional style, but no RTOP data from prior iterations of the course were available to provide a baseline score. Other limitations to the use of this model may also exist. The PI and instructor had a pre-existing amicable relationship, and instructor M was comfortable with the daily observation in the classroom. Not all instructors may be as willing to collaborate with a coach who is observing their class frequently. While we conducted daily observations for this analysis, weekly or bi-weekly observations would likely have yielded similar results.

The first requirement for this process is a willing instructor. Like many colleagues, instructor M was aware that alternative teaching strategies existed but did not have the time or expertise to incorporate them. We had prior experience with professional development in college-level geoscience programs and were well placed to provide assistance; finding a qualified collaborator may present a challenge in other settings. While discipline-based education research groups are becoming more common in STEM departments at higher education institutions, they are far from the norm. However, collaboration could occur with a coach in an education department (Bailey and Nagamine 2012), a faculty peer with experience in reformed teaching practices, or a representative of a campus teaching and learning center; with sufficient institutional support, an education specialist could be hired (Wieman et al. 2010). Another viable option is the use of graduate teaching assistants in lecture courses. While graduate teaching assistants are not necessarily experts in reformed teaching strategies, they can help instructors overcome the time barrier by assisting in learning about and creating effective teaching strategies and materials.

Conclusions

This research utilized a model for professional development that occurred in the classroom and involved collaboration with a reformed teaching coach, in this case a geoscience education graduate student. The model incorporates all three of the strategies that, according to the NRC (2012), make up successful programs for changing instructional practice: (1) sustained, focused efforts, (2) feedback on instructional practice, and (3) a deliberate focus on changing faculty conceptions about teaching and learning. First, the situated instructional coaching model tested was sustained, lasting an entire semester, and focused on revising a single course using a structured model (the ICD model; Fink 2009). Second, feedback on instructional practice was possible because a collaborator assisted with the revisions and frequently attended class. Finally, the main objective of this research was to evaluate instructor M’s professional development as a teacher by supporting her change of teaching practice, which served as a driver for changes to teaching beliefs.

The situated instructional coaching model can also help instructors overcome commonly cited barriers to changing their instructional practice. A collaborator who can aid in revisions significantly reduces the time burden of creating objectives, activities, feedback, and assessments. Since training is specific to the course and occurs in the classroom, instructors do not spend time attending workshops, listening to talks, or reading and interpreting unfamiliar literature. Training occurs during their normal teaching responsibilities, with feedback and reflection provided immediately by the presence of a coach. Successful implementation drives the process forward as the instructor sees an immediate reward for their efforts. The observation and feedback provided by the coach also help ensure that implementation is effective and positive student outcomes are realized.

Future work on the situated instructional coaching model will examine longer-term impacts of the experience. Does instructor M continue to use or further refine the reform-based changes in future iterations of the course? Will she apply the course design and reformed teaching skills she learned to other courses? Essentially, are instructor M’s experiences during this study indicative of lasting changes that will become part of her professional growth as an instructor? This case study had only one subject, so it is of interest how the model can be scaled to produce similar results with a larger population. The model could be scaled up so that one coach works with multiple instructors concurrently, which would be manageable if the instructors were teaching the same or similar courses. Such a larger community of practice may also have the added benefit of increasing the amount of feedback and reflection the instructors experience, benefitting their professional growth.

Through the lens of the Interconnected Model of Teacher Professional Growth, situated instructional coaching not only produced a change in the domain of practice but also enabled instructor M to reflect on and enact changes in other domains of the change environment. Strategies for increasing the use of student-centered teaching among faculty should be viewed holistically, as the Interconnected Model allows. Faculty professional development is least effective when the goal is simply to inform faculty of best practices and expect adoption. Professional development should instead find methods that not only change and monitor an instructor’s practice but also incorporate and foster reflection on the outcomes of that practice and the resulting impacts on the instructor’s personal beliefs, knowledge, and attitudes. In this way, adoption of reformed teaching strategies and professional growth are likely to become a long-term part of one’s teaching experience. Situated instructional coaching represents one promising method for achieving this end.

Abbreviations

BARSTL, Beliefs About Reformed Science Teaching and Learning; ICD, Integrated Course Design; NRC, National Research Council; PCAST, President’s Council of Advisors on Science and Technology; PI, primary investigator; RTOP, Reformed Teaching Observation Protocol; STEM, science, technology, engineering, and math; TBI, Teacher Beliefs Interview

Declarations

Acknowledgements

We thank instructor M for not only reaching out to us for help in changing her teaching practice but also in allowing us to observe her classroom and investigate her beliefs about teaching and learning. Her willingness to participate made the piloting of this model possible. We would also like to thank Michael Pelch for his assistance with classroom observations and co-coding of interviews as well as Hayley Joyell Smith, April Grissom, and Melissa Johnson for assisting with classroom observations. Finally, we would like to thank all the reviewers of this manuscript for their valuable feedback.

Funding

No funding sources were used to complete this research.

Availability of data and materials

Data from this project will not be shared through a data repository. Any data collected and used to draw conclusions in this project can be obtained by emailing the corresponding author.

Open Access

This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors’ Affiliations

(1)
Department of Marine, Earth, and Atmospheric Sciences, North Carolina State University

References

1. Bailey, J. M., & Nagamine, K. (2012). Experiencing conceptual change about teaching: a case study from astronomy. American Journal of Physics, 80(6), 542.
2. Bloom, B. S., Engelhart, M., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: the classification of educational goals. Handbook I: cognitive domain. New York: David McKay.
3. Borrego, M., Froyd, J., & Hall, T. (2010). Diffusion of engineering education innovations: a survey of awareness and adoption rates in US engineering departments. Journal of Engineering Education, 99(3), 185–207.
4. Brickhouse, N. W. (1990). Teachers’ beliefs about the nature of science and their relationship to classroom practice. Journal of Teacher Education, 41(3), 53–62.
5. Brogt, E. (2007). Instruction as a scientific experiment: a professional development case study of a professor changing the introductory astronomy course for non-science majors. Astronomy Education Review, 6(2), 20–31.
6. Brownell, S. E., & Tanner, K. D. (2012). Barriers to faculty pedagogical change: lack of training, time, incentives, and…tensions with professional identity? CBE Life Sciences Education, 11(4), 339–346.
7. Budd, D., van der Hoeven Kraft, K., McConnell, D., & Vislova, T. (2013). Characterizing teaching in introductory geology courses: measuring classroom practices. Journal of Geoscience Education, 61(4), 461–475.
8. Chen, X. (2013). STEM attrition: college students’ paths into and out of STEM fields (NCES 2014-001). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education.
9. Clarke, D., & Hollingsworth, H. (2002). Elaborating a model of teacher professional growth. Teaching and Teacher Education, 18(8), 947–967.
10. Cronin-Jones, L. L. (1991). Science teacher beliefs and their influence on curriculum implementation: two case studies. Journal of Research in Science Teaching, 28(3), 235–250.
11. Crouch, C. H., & Mazur, E. (2001). Peer instruction: ten years of experience and results. American Journal of Physics, 69(9), 970.
12. Dancy, M., & Henderson, C. (2010). Pedagogical practices and instructional change of physics faculty. American Journal of Physics, 78(10), 1056–1063.
13. Deslauriers, L., Schelew, E., & Wieman, C. (2011). Improved learning in a large-enrollment physics class. Science, 332(6031), 862–864.
14. Devlin, M. (2003). A solution focused model for improving individual university teaching. International Journal for Academic Development, 8(1-2), 77–89.
15. Devlin, M. (2006). Challenging accepted wisdom about the place of conceptions of teaching in university teaching improvement. International Journal of Teaching & Learning, 18(2), 112–119.
16. Ebert-May, D., Derting, T. L., Hodder, J., Momsen, J. L., Long, T. M., & Jardeleza, S. E. (2011). What we say is not what we do: effective evaluation of faculty professional development programs. BioScience, 61(7), 550–558.
17. Eley, M. G. (2006). Teachers’ conceptions of teaching, and the making of specific decisions in planning to teach. Higher Education, 51(2), 191–214.
18. Emerson, J. D., & Mosteller, F. (2000). Development programs for college faculty: preparing for the twenty-first century. Educational Media and Technology Yearbook, 25, 26–42.
19. Fink, L. D. (2009). Preface. New Directions for Teaching and Learning, 119, 1–7.
20. Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., et al. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences of the United States of America, 111, 8410–8415.
21. Fullan, M. (1982). The meaning of educational change. New York: Teachers College Press.
22. Garet, M. S., Porter, A. C., Desimone, L., Birman, B. F., & Yoon, K. S. (2001). What makes professional development effective? Results from a national sample of teachers. American Educational Research Journal, 38(4), 915–945.
23. Goldsmith, D. W. (2003). The Great Clade Race: presenting cladistic thinking to biology majors & general science students. The American Biology Teacher, 65(9), 679–682.
24. Graham, M. J., Frederick, J., Byars-Winston, A., Hunter, A.-B., & Handelsman, J. (2013). Increasing persistence of college students in STEM. Science, 341, 1455–1456.
25. Guskey, T. R. (1986). Staff development and the process of teacher change. Educational Researcher, 15(5), 5–12.
26. Handelsman, J., Ebert-May, D., Beichner, R., Bruns, P., Chang, A., DeHaan, R., … Wood, W. B. (2004). Scientific teaching. Science, 304, 521–522.
27. Hativa, N. (2000). Becoming a better teacher: a case of changing the pedagogical knowledge and beliefs of law professors. Instructional Science, 28, 491–523.
28. Henderson, C., Beach, A., & Famiano, M. (2009). Promoting instructional change via co-teaching. American Journal of Physics, 77(3), 274.
29. Henderson, C., Beach, A., & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: an analytic review of the literature. Journal of Research in Science Teaching, 48(8), 952–984.
30. Henderson, C., & Dancy, M. (2007). Barriers to the use of research-based instructional strategies: the influence of both individual and situational characteristics. Physical Review Special Topics - Physics Education Research, 3(2), 020102.
31. Henderson, C., & Dancy, M. (2009). Impact of physics education research on the teaching of introductory quantitative physics in the United States. Physical Review Special Topics - Physics Education Research, 5(2), 020107.
32. Henderson, C., Dancy, M., & Niewiadomska-Bugaj, M. (2012). Use of research-based instructional strategies in introductory physics: where do faculty leave the innovation-decision process? Physical Review Special Topics - Physics Education Research, 8(2), 020104.
33. Ho, A., Watkins, D., & Kelly, M. (2001). The conceptual change approach to improving teaching and learning: an evaluation of a Hong Kong staff development programme. Higher Education, 42(2), 143–169.
34. Knight, J., & Wood, W. (2005). Teaching more by lecturing less. Cell Biology Education, 4, 298–310.
35. Kortz, K., Smay, J., & Murray, D. (2008). Increasing learning in introductory geoscience courses using lecture tutorials. Journal of Geoscience Education, 56(3), 280–290.
36. LoPresto, M., & Murrell, S. (2009). Using the Star Properties Concept Inventory to compare instruction with lecture tutorials to traditional lectures. Astronomy Education Review, 8, 1–5.
37. Luckie, D. B., Aubry, J. R., Marengo, B. J., Rivkin, A. M., Foos, L. A., & Maleszewski, J. J. (2012). Less teaching, more learning: 10-yr study supports increasing student learning through less coverage and more inquiry. Advances in Physiology Education, 36(4), 325–335.
38. Luft, J., & Roehrig, G. (2007). Capturing science teachers’ epistemological beliefs: the development of the teacher beliefs interview. Electronic Journal of Science Education, 11(2).
39. Macdonald, R., & Manduca, C. (2005). Teaching methods in undergraduate geoscience courses: results of the 2004 On the Cutting Edge survey of US faculty. Journal of Geoscience Education, 53(3), 237–252.
40. Mazur, E. (1997). Peer instruction: a user’s manual. New Jersey: Prentice Hall.
41. McConnell, D., Steer, D., & Owens, K. (2003). Assessment and active learning strategies for introductory geology courses. Journal of Geoscience Education, 51(2), 205–216.
42. National Research Council. (1999). Transforming undergraduate education in science, mathematics, engineering, and technology. Washington, DC: The National Academy Press.
43. National Research Council. (2012). Discipline-based education research: understanding and improving learning in undergraduate science and engineering. Washington, DC: The National Academies Press.
44. National Research Council. (2015). Reaching students: what research says about effective instruction in undergraduate science and engineering. Washington, DC: The National Academies Press.
45. Peabody, D. (1962). Two components in bipolar scales: direction and extremeness. Psychological Review, 69(2).
46. Perry, J., Meir, E., & Herron, J. (2008). Evaluating two approaches to helping college students understand evolutionary trees through diagramming tasks. CBE-Life Sciences Education, 7, 193–201.
47. President’s Council of Advisors on Science and Technology. (2012). Engage to excel: producing one million additional college graduates with degrees in science, technology, engineering, and mathematics. Washington, DC: Executive Office of the President.
48. Roehrig, G. H., & Luft, J. A. (2004). Constraints experienced by beginning secondary science teachers in implementing scientific inquiry lessons. International Journal of Science Education, 26(1), 3–24.
  49. Roehrig, G., & Kruse, R. (2005). The role of teachers’ beliefs and knowledge in the adoption of a reform-based curriculum. School Science and Mathematics, 105(8), 412–422.View ArticleGoogle Scholar
  50. Sampson, V., Enderle, P., & Grooms, J. (2013). Development and initial validation of the Beliefs about Reformed Science Teaching and Learning (BARSTL) questionnaire. School Science and Mathematics, 113(1), 3–15.View ArticleGoogle Scholar
  51. Saroyan, A., & Amundsen, C. (2001). Evaluating university teaching: time to take stock. Assessment & Evaluation in Higher Education, 26(4), 37–41.View ArticleGoogle Scholar
  52. Sawada, D., & Piburn, M. (2002). Measuring reform practices in science and mathematics classrooms: the reformed teaching observation protocol. School Science and Mathematics, 102(October), 245–253.View ArticleGoogle Scholar
  53. Seymour, E., & Hewitt, N. M. (1997). Talking about leaving: why undergraduates leave the sciences. Boulder: Westview Press.Google Scholar
  54. Stes, A., Min-Leliveld, M., Gijbels, D., & Van Petegem, P. (2010). The impact of instructional development in higher education: the state-of-the-art of the research. Educational Research Review, 5(1), 25–49.View ArticleGoogle Scholar
  55. Sunal, D. W., Hodges, J., Sunal, C. S., Whitaker, K. W., Freeman, L. M., Edwards, L., et al. (2001). Teaching science in higher education: faculty professional development and barriers to change. School Science and Mathematics, 101(5), 246–257.Google Scholar
  56. Turpen, C., Dancy, M., & Henderson, C. (2010). Faculty perspectives on using peer instruction: a national study. AIP Conference Proceedings, 1289(1), 325–328.View ArticleGoogle Scholar
  57. Walczyk, J., Ramsey, L., & Zha, P. (2007). Obstacles to instructional innovation according to college science and mathematics faculty. Journal of Research in Science Teaching, 44(1), 85–106.View ArticleGoogle Scholar
  58. Walker, J., & Cotner, S. (2008). A delicate balance: integrating active learning into a large lecture course. CBE-Life Sciences Education, 7, 361–367.View ArticleGoogle Scholar
  59. Wieman, C., Deslauriers, L., & Gilley, B. (2013). Use of research-based instructional strategies: how to avoid faculty quitting. Physical Review Special Topics-Physics,9(2), 1–7.Google Scholar
  60. Wieman, C., Perkins, K., & Gilbert, S. (2010). Transforming science education at large research universities: a case study in progress. Change: The Magazine of Higher Learning, 42(2), 6–14.View ArticleGoogle Scholar

Copyright

© The Author(s). 2016