
Increasing high school teachers' self-efficacy for integrated STEM instruction through a collaborative community of practice

Abstract

Background

Teachers can have a significant impact on student interest and learning in science, technology, engineering, and math (STEM) subjects and careers. Teacher self-efficacy can also significantly affect student learning. Researchers investigated the effects of teacher professional development and integrated STEM curriculum development on teacher self-efficacy. Participants in the study included high school science and engineering technology teachers enrolled in a National Science Foundation–ITEST project called Teachers and Researchers Advancing Integrated Lessons in STEM (TRAILS). The TRAILS program sought to prepare teachers to integrate STEM content using engineering design, biomimicry, science inquiry, and 3D printing as pedagogical approaches. Teachers learned within a community of practice working alongside industry partners and college faculty. The purpose of the study was to investigate the impact on teacher self-efficacy of the 70 h of professional development used to train three cohorts of teachers over 3 years. The research design utilized a quasi-experimental nonequivalent control group approach, including an experimental group and an untreated control group.

Results

Measurements on beliefs about teacher self-efficacy were collected on pretest, posttest, and delayed posttest survey assessments. Researchers analyzed the T-STEM survey results for teaching self-efficacy using the Wilcoxon signed-rank test for detecting significant differences. Science teachers showed a significant increase in teacher self-efficacy comparing the pretest and delayed posttest scores after TRAILS professional development and STEM lesson implementation (p = .001, effect size = .95). Additionally, significant differences between groups (science experimental vs science control group teachers) using the Wilcoxon rank-sum test were detected from pretest to posttest (p = .033, effect size = .46), posttest to delayed posttest (p = .029, effect size = .47), and pretest to delayed posttest (p = .005, effect size = .64). There were no significant differences detected in the control group. Engineering technology teachers showed no significant differences between the pretest, posttest, and delayed posttest self-efficacy scores.

Conclusions

The results indicate the science teachers’ self-efficacy increased after professional development and after lesson implementation. Potential implications from this research suggest that the science teacher participants benefited greatly from learning within a community of practice, engaging in science practices, and using science knowledge to solve a real-world problem (engineering design).

Introduction

Efforts in improving STEM education have been on the rise for over two decades (Honey, Pearson, & Schweingruber, 2014). Global concern for improvement in STEM education increases as a STEM-skilled workforce is critical to meet economic challenges and sustainability in the 21st century (Partnership for 21st Century Skills, 2017; Rockland et al., 2010). Furthermore, as STEM fields expand and increase demands for workers skilled in STEM fields, it is difficult to predict all workforce skills and knowledge necessary to remain competitive (Caprile, Palmen, Sanz, & Dente, 2015; English, 2017). For example, worldwide challenges require collaboration by experts across STEM fields to design and implement effective solutions. Although there is a global workforce demand for STEM expertise, enthusiasm toward STEM learning has declined among students in many countries (Thomas & Watters, 2015). In the USA, government agencies and educational organizations are promoting the development of effectively integrated STEM curricula. These initiatives have also generated a rallying cry for more research to investigate the impacts that integrated STEM education has on student STEM learning. Equally, there is a need for more research on integrated STEM learning activities that impact students’ interest in and pursuit of STEM careers (Honey et al., 2014; PCAST, 2010).

Teachers can have a significant influence on student interest in and understanding of STEM pathways and careers (Autenrieth, Lewis, & Butler-Perry, 2017; Brophy, Klein, Portsmore, & Roger, 2008). Teachers often focus on the STEM knowledge and skills they are comfortable teaching. Therefore, when teachers lack confidence in STEM teaching, they may limit students’ exposure to the full breadth of STEM knowledge. Teacher self-efficacy has been found to be a significant factor in student learning (Nadelson, Seifert, Moll, & Coats, 2012; Yoon, Evans, & Strobel, 2012, 2014). Nadelson et al. (2012) conducted research on integrated STEM education teacher professional development and emphasized the importance of teacher self-efficacy; findings revealed a relationship between teachers’ comfort level and motivation in teaching STEM content. Teachers feel less knowledgeable and comfortable teaching in subject areas outside their expertise, which affects their self-efficacy and confidence in teaching integrated STEM curricula (Stohlmann, Moore, & Roehrig, 2012). Teacher self-efficacy is an important component of teacher effectiveness that influences teacher behavior and student outcomes. Therefore, high-quality teacher professional development must support and reinforce the growth in teacher skills and self-efficacy (Bray-Clark & Bates, 2003). Often, the development of STEM instruction is not informed by research that has focused on integrated STEM teacher professional development approaches (Honey et al., 2014; Nadelson et al., 2012). More efforts in integrated STEM teacher professional development must seek to address these issues, and existing research findings must be leveraged to develop the most effective approaches in order to improve student STEM learning.

The following research focuses on high school science and engineering technology education (ETE) teachers participating in a project called TRAILS (National Science Foundation Award # DRL-1513248). Teachers attended a 10-day, 70-h summer professional development institute based upon an integrated STEM education model that challenges teachers to cogenerate their own integrated lessons. Science and engineering technology teachers were challenged to address Next-Generation Science Standards (NGSS) for teaching science through engineering design and promoting 21st century skills by co-developing crosscutting STEM concepts (NGSS Lead States, 2013; Partnership for 21st Century Skills, 2017).

Purpose of the study

This research investigates the effectiveness of a teacher professional development program using an integrated STEM education model (Kelley & Knowles, 2016) on teacher self-efficacy. The TRAILS project seeks to increase teacher self-efficacy as a result of quality professional development instruction that includes science inquiry and engineering design experiences, and collaborative approaches to situate learning within a community of practice. Novices and experts work together within a community of practice to learn and connect STEM content and skills (Kelley & Knowles, 2016). By combining various pedagogical and learning theories, the TRAILS model of integrated STEM education addresses the needs of multiple student learning styles within meaningful and authentic contexts.

This research was guided by the questions:

1) Does teacher self-efficacy increase after participation in integrated STEM education professional development?

2) Does teacher self-efficacy increase after the implementation of integrated STEM lessons?

Theoretical framework

The theoretical framework for this research is based on a larger framework for integrated STEM education (Kelley & Knowles, 2016; Fig. 1). We theorize that integrated STEM education is more than a blend of two or more school subjects. The TRAILS professional development approach is built upon the theory that integrated STEM requires a complex system of pedagogical approaches working in harmony, grounded in key learning theories, to realize the benefits of subject integration. Our theoretical framework is grounded in situated cognition theory (Brown, Collins, & Duguid, 1989; Lave & Wenger, 1991; Putnam & Borko, 2000). Therefore, students are taught to understand how knowledge and skills are applied around a specific activity, a situated context that is authentic and relevant. Engineering design provides an ideal platform for situated learning because it provides a context situated in a problem that is authentic and bound by science and engineering practices. Additionally, situated cognition can be achieved through teaching these commonly shared practices of scientists and engineers while working through an engineering design problem (NRC, 2012). We believe all science and engineering technology education teachers can use science inquiry and engineering design pedagogical approaches to naturally integrate STEM contents, thus helping students achieve technological literacy, promoting computational thinking, and developing 21st century skills (Kelley, 2014b; Kelley & Knowles, 2016). The theoretical framework is represented in Fig. 1 as a “block and tackle” bound by a “rope” of a community of practice. The TRAILS theory uses a community of practice of educators, researchers, and corporate community partners to help students and teachers understand STEM career pathways in real practices.
The TRAILS project remains true to Lave and Wenger’s (1991) concept of legitimate peripheral participation, in which learning occurs within a community of practitioners, helping students move from novice toward expert as they engage “in a social practice of a community” (p. 29). We hypothesize that teaching science and engineering technology education teachers how to integrate STEM content through the TRAILS approach will also help improve teachers’ self-efficacy.

Fig. 1 The conceptual framework for STEM learning (Kelley & Knowles, 2016, p. 4)

Background literature: teacher self-efficacy

Albert Bandura was a pioneer in identifying the construct of self-efficacy, which still provides a framework for research on self-efficacy today. Bandura (1994) explained self-efficacy as “people’s beliefs about their capabilities to produce designated levels of performance that exercise influence over events to affect their lives. Self-efficacy beliefs determine how people feel, think, motivate themselves, and behave.” (p. 71). Robust feelings of self-efficacy improve achievement and general well-being. Individuals with high self-efficacy see difficulties as challenges to overcome with persistent effort and opportunities to gain the necessary knowledge and skills. Individuals with low self-efficacy, in contrast, are prone to doubt their own competencies and avoid challenges, consequently leading to low ambitions and limited persistence when facing difficult circumstances (Bandura, 1994).

Academic achievement may be impacted by many factors, including self-efficacy, motivation, attitude, and aptitude (Witt-Rose, 2003). Furthermore, research has shown that self-efficacy is generally a solid predictor of academic success. A domain-specific instrument should be used to research self-efficacy in a particular subject area rather than a more general assessment, to ensure more precise measurements (Bandura, 1994). When measuring the construct of teaching self-efficacy, a context-specific instrument should be utilized to determine a respondent’s belief about their own capability and the strength of that belief in teaching a specific domain in science, mathematics, engineering, and technology (Bandura, 1997; Rittmayer & Beier, 2008). Teacher self-efficacy has proven to be an important factor in student learning (Nadelson et al., 2012). Self-efficacy has been described as follows:

Belief in one’s ability to perform a specific task is referred to as self-efficacy. Self-efficacy is defined as a judgment about one’s ability to organize and execute the courses of action necessary to attain a specific goal–self-efficacy judgments are related to specific tasks in a given domain… (Rittmayer & Beier, 2008, p. 1)

Teacher self-efficacy greatly influences teachers’ preparation, teaching strategies, pedagogical approaches, and their students’ own self-efficacy and achievement in that subject (Bray-Clark & Bates, 2003; Yoon et al., 2012, 2014). Self-efficacy is specific to a particular goal or domain and is measured in assessments asking respondents to rate their confidence in achieving a particular goal. There may be a direct link between teachers’ level of comfort and motivation in teaching STEM content (Nadelson et al., 2012). Significant gains were detected in teacher “perceived efficacy, comfort, contentment, and knowledge” resulting in greater teacher aptitude to teach STEM content after attending integrated STEM professional development (Nadelson et al., 2012, p. 81; Wang, Moore, Roehrig, & Park, 2011; Nathan, Atwood, Prevost, Phelps, & Tran, 2011). Teacher self-efficacy has been found to influence student cognitive achievement and success in school as well as the student’s own sense of efficacy (Caprara, Barbaranelli, Steca, & Malone, 2006). Yoon et al. (2012) stated that “teacher self-efficacy has received attention from researchers because of findings that indicate its direct relationship with teachers’ classroom behaviors that influence the student performance” (p. 1).

Based on these previous studies, TRAILS researchers designed professional development to better prepare teachers in order to help increase teacher self-efficacy and impact student learning. The researchers acknowledge that teacher self-efficacy is important to teacher success and student performance in STEM education. Therefore, the researchers sought to determine if self-efficacy increased upon completion of teacher professional development. TRAILS teacher professional development trains and supports teachers to teach using an integrated STEM approach. These professional development experiences improve pedagogical content knowledge and domain-specific STEM content knowledge within a community of practice. Furthermore, the researchers sought to determine if improvement in self-efficacy is sustained after completion of integrated STEM lesson implementation.

Method

Research design

The research design utilized a quasi-experimental nonequivalent control group approach which matches an experimental (treatment) group and a control (untreated) group on non-randomized participants as shown in Table 1 (Ary, Jacobs, Sorensen, & Walker, 2014; Creswell, 2009; Gall, Gall, & Borg, 2007; Shadish, Cook, & Campbell, 2002). TRAILS incorporated 3 years of professional development working with three cohorts of teachers. Cohort 1–3 (years 1–3) teachers in the experimental group attended the TRAILS 2-week professional development institute (the treatment). Cohort 1 (year 1) teachers attended the professional development in June 2016, cohort 2 (year 2) teachers in 2017, and cohort 3 (year 3) teachers in 2018. The project used a delayed model approach allowing a control group to later become a treatment group member (Table 1).

Table 1 Research design: quasi-experimental design delayed approach

Applicants to TRAILS who could not attend the professional development were invited to be a part of the control group. A delayed model provides an opportunity for the control group to receive the treatment in a later cohort. As a result, there were two control groups for the TRAILS program, so that all control group members were provided with an opportunity to attend the TRAILS professional development by cohort 3 (year 3). This quasi-experimental approach ensured an ethical way to provide the control group an opportunity to participate in and benefit from the educational program (Ary et al., 2014; Gall et al., 2007). This was the rationale for including only two control groups for a 3-year program.

The control group did not participate in the professional development. Although each member of the control group was offered the opportunity to participate in the TRAILS professional development in the following year’s cohort, no control group teacher did, in fact, participate in the professional development. Therefore, they are considered a true control group. In most cases, this was not because of a lack of interest but because they could not recruit a partnering teacher. Both experimental and control groups were given the same pretest preceding the TRAILS summer professional development institute and the same posttest after the experimental group completed the professional development. Teachers in both groups took the same assessment again as a delayed posttest during the school year, after the experimental group had implemented the TRAILS lessons. The delayed posttest was given after teachers implemented both the exemplar lesson and a teacher-created custom lesson.

Context of the study

Participant teacher demographics are presented in Table 2. The breakdown of each cohort is shown by gender, subject, and group, including experimental, control, and returning experimental teachers.

Table 2 Cohort 1–3 participant teacher demographics

Participants were required to have 2 years or more of teaching experience at their current school and to be teaching primarily in ETE or, for science teachers, in physics or biology. This allowed teachers to have teaching experience and relationships with colleagues in their school to collaborate within an integrated model. Preferably, two teachers from the same school would attend professional development to collaborate on lesson plans and implementation. In practice, this was not possible for all teachers and schools. Although the TRAILS program was open and advertised to all schools in the state, the cohort demographic was limited to the teachers who applied and were interested in and available to participate in the TRAILS project. ETE teachers were required to have experience with parametric modeling and access to 3D printing equipment. The control group teachers were chosen from the applicants who were not able to attend the summer professional development due to summer schedule conflicts or the lack of a partner teacher. These teachers were a self-selecting control group for the study because they applied to participate and expressed interest in integrated STEM. The control group helps account for other factors that may influence the results (Creswell, 2009; Gall et al., 2007).

The year 2 experimental group had four teachers who returned from the previous year’s experimental group. The year 3 experimental group had four teachers from the year 2 experimental group and four teachers who participated in both the year 1 and year 2 cohorts. The teachers who returned from previous years improved their custom lesson plans during the professional development based on lesson implementation experiences in the previous academic school year. These returning teachers also developed new custom lessons and mentored participating experimental group teachers during professional development.

Intervention: integrated STEM teacher professional development

The TRAILS project included 70 h of teacher professional development (PD) led by Purdue University and Ivy Tech Community College faculty, graduate students, and in-service master high school teachers. The first day of training provided teachers with an overview of pedagogical approaches to science and engineering technology. Specifically, the PD team taught fundamentals of scientific inquiry grounded in science practices (Kelley & Knowles, 2016; National Research Council, 1996; Purzer, Goldstein, Adams, Xie, & Nourian, 2015) and the use of a student inquiry prompt (a modified KWL chart called KWHLAQ; Barell, 2006) designed to engage students in utilizing prior science knowledge, discovering new knowledge, and seeking ways to apply knowledge to solve new problems.

Instruction on engineering design pedagogies included engineers’ notebook instruction (Kelley, 2011, 2014a), utilizing a decision matrix (Eide, Jenison, Marshaw, & Northrup, 2001; Kelley, 2010), brainstorming techniques (Mentzer, Farrington, & Tennenhouse, 2015), and avoiding design fixation (Kelley & Sung, 2017).

On the second day of the PD, both science and engineering technology teachers were taught basic information about entomology, including aquatic habitats, food webs, adaptations, and evolution. Science faculty modeled a biology science investigation by challenging the teachers with a few inquiry questions: (a) “What aquatic insects are in the local pond?”, (b) “What do the aquatic insect samples tell us about the quality of the water?”, and (c) “What insects become food for fish?” Next, the teachers and faculty left the classroom and visited a retention pond at the Ivy Tech Community College campus in Crawfordsville, IN, which hosted the PD.

At the pond, the Purdue entomology professor and a graduate student demonstrated how to properly collect aquatic insect specimens. With the gathered samples, the teachers returned to the wet lab and examined the specimens with hand lenses. A field guide was provided to the teachers to identify insect specimens. Next, teachers determined the number of specimens in their sample that belonged to each insect family. Using this data, teachers were taught how to use types of aquatic insects as bio-indicators to assess water quality. A bio-indicator guide (Speelman & Carroll, 2012) provided pictures of aquatic insects and a tolerance factor for each specimen, used to calculate the quality of the water. Additionally, while at the pond, teachers identified bluegill and bass. This information was used to inform the engineering design challenge.

For the rest of the first week, teachers worked in the CAD lab creating designs for fishing lure prototypes to mimic an identified aquatic insect as bait for fish (in this case, bluegill or bass). Both science and engineering technology teachers were taught by Ivy Tech Community College instructors how to draw insects in CAD software and used these drawings to design split molds for soft plastic lures. Both science and engineering technology teachers created CAD designs and printed lure prototypes. On the last day of the first week, the teachers returned to the pond with their prototypes and tested their designs using common fishing poles. Some cohorts had fish follow lures, indicating attraction to the bait. One teacher’s lure caught a sizable bass. These experiences make up a series of lessons within an integrated STEM unit called Dbait (Knowles, Kelley, & Hurd, 2016). Dbait provided teachers with an integrated STEM experience similar to the one their students would have. The Dbait unit plan thus uses a range of student experiences to highlight science and engineering practices in an integrated STEM flow of activities (Fig. 2).

Fig. 2 D-Bait Lesson Activities: Science and Engineering Practices (Kelley, 2019; adapted from NRC, 2011; NRC, 2012; NGSS Lead States, 2013)

In the second week of the PD, teachers were challenged to identify the core fundamentals of the Dbait lesson. These fundamentals included science inquiry investigations, using engineers’ notebooks to capture design procedures, brainstorming biomimicry solutions, and 3D printing and testing prototypes. Next, working in pairs, science and engineering technology teachers created STEM unit plans to teach science content (addressing Indiana science standards and NGSS) through entomology, featuring a biomimicry-inspired engineering design challenge. All student design challenges required 3D-printed prototype solutions. All core fundamentals from the Dbait lesson were also required to be embedded within the teacher-created unit plans.

The community of practice approach was used to potentially increase teachers’ pedagogical content knowledge (PCK) and their self-efficacy in teaching authentic STEM practices. Presentations by industry partners led to discussions helping teachers create authentic contexts for learning STEM content and practices (Brown et al., 1989; Bruner, 1996; Lave & Wenger, 1991). Furthermore, teachers networked with these professionals for advice while developing lessons, invited them as guest speakers in the classroom, and asked them to serve on design assessment panels at the end of TRAILS design projects. Teachers were challenged to expand their own communities of practice by adding local STEM professionals. This feature of TRAILS had a significant impact on teachers’ STEM career awareness (Knowles, Kelley, & Holland, 2018).

Survey instrument

For the pretest, posttest, and delayed posttest measures of teacher self-efficacy, the T-STEM Survey for technology (ETE) and science teachers was given to the participants (The Friday Institute for Educational Innovation, 2012a). The Qualtrics online survey platform was used to disseminate and collect data on the pretests and posttests. Science teachers completed the Science T-STEM Survey and the ETE teachers completed the Technology T-STEM Survey. The T-STEM Survey was created to measure multiple constructs, including teacher self-efficacy. The survey items used a Likert-type scale with 1 being “Strongly Disagree,” 2 “Disagree,” 3 “Neither Agree nor Disagree,” 4 “Agree,” and 5 being “Strongly Agree” (The Friday Institute for Educational Innovation, 2012a; Sekaran & Bougie, 2009). The instrument utilizes 63 Likert-scale questions with seven sub-constructs. The focus of this paper is on the specific construct labeled “teaching efficacy and beliefs” (Friday Institute for Educational Innovation, 2012a). Since these data are self-reported by participants, some bias may exist (Sekaran & Bougie, 2009). Higher scores are associated with more robust views of capability and strength of belief in capability toward teaching in a specific subject domain, except in the case of negatively worded items, which are reverse scored.
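The scoring convention described above can be sketched as follows. This is a minimal illustration, not the T-STEM scoring script; the item names and responses are hypothetical.

```python
# Sketch of summing a Likert construct with reverse-scored items.
# Item names and responses are hypothetical, not actual T-STEM items.
def score_construct(responses, reverse_items, scale_min=1, scale_max=5):
    """Sum Likert responses, flipping negatively worded items."""
    total = 0
    for item, value in responses.items():
        if item in reverse_items:
            value = scale_max + scale_min - value  # 5 -> 1, 4 -> 2, etc.
        total += value
    return total

answers = {"q1": 4, "q2": 5, "q3": 2}  # q3 is negatively worded
print(score_construct(answers, reverse_items={"q3"}))  # 4 + 5 + 4 = 13
```

On a 1–5 scale, reversing maps a response of 2 on a negatively worded item to 4, so higher totals consistently indicate stronger self-efficacy beliefs.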

The T-STEM survey measures multiple constructs on seven subscales, including teaching self-efficacy toward teaching STEM content, the degree to which teachers believe effective teaching can increase student learning (outcome expectancy), teacher attitudes toward 21st century skills, use of STEM instructional practices, STEM career awareness, and technology use by students (Friday Institute for Educational Innovation, 2012a; Table 3). This research made use of the eleven items within the construct of teaching self-efficacy (Table 3). According to the Friday Institute, “The Personal Teaching Efficacy and Beliefs (PTEB) construct and the Teaching Outcome Expectancy Beliefs (TOEB) constructs were derived from a well-known survey of science teachers, the Science Teaching Efficacy Belief Instrument, or the STEBI (Riggs & Enochs, 1990)” (Friday Institute for Educational Innovation, 2012b), which helps ensure its reliability and validity.

Table 3 T-STEM Survey Subscale Summary (T-STEM Science & T-STEM Technology)

The T-STEM instrument consists of five different forms, one for each of the subject areas in STEM: T-STEM Science Teacher, T-STEM Technology Teacher, T-STEM Engineering Teacher, and T-STEM Mathematics Teacher, and one version for elementary teachers. For all survey items, developers calculated Cronbach’s alpha at 0.95 (Friday Institute for Educational Innovation, T-STEM Survey, 2012b), indicating good internal reliability (Caliendo, 2015; Tavakol & Dennick, 2011). For the science domain, Cronbach’s alpha for teaching efficacy was reported to be .908. The technology domain Cronbach’s alpha was not reported for self-efficacy (Friday Institute for Educational Innovation, T-STEM Survey, 2012b). Permission was obtained for using the T-STEM survey instruments (T. Collins, personal communication, March 26, 2014).

To investigate the change of teacher self-efficacy after the professional development, the current study focused on the first sub-construct within the science (T-STEM Science) or technology (T-STEM Technology) teaching efficacy and beliefs (Table 3). Each version of the T-STEM inquires about the teacher’s confidence in teaching within his or her domain, science for science teachers, and technology for ETE teachers. Table 4 contains all eleven of the survey items.

Table 4 Self-efficacy: personal STEM teaching efficacy and beliefs

Data collection

To answer research question 1, “Does teacher self-efficacy increase after participation in integrated STEM education professional development?”, participants were surveyed before and after the TRAILS professional development. To answer research question 2, “Does teacher self-efficacy increase after the implementation of integrated STEM lessons?”, participants were surveyed after lesson implementation during the school year as a delayed posttest. The timing of the pretest, posttest, and delayed posttest surveys was coordinated within the experimental and control groups as closely as possible. The pretest was conducted the week before the professional development, the posttest was conducted the last day of the professional development, and the delayed posttest was conducted after the lesson implementation during the school year. Each teacher entered a unique code at the beginning of the survey instead of a name to pair data for statistical analysis and maintain confidentiality.
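Pairing pretest and posttest records by anonymous code can be sketched as below; the codes and scores are hypothetical, and only teachers who completed both surveys enter the paired analysis.

```python
# Sketch: matching pretest and posttest records by each teacher's
# anonymous code. Codes and scores are hypothetical.
pretest = {"A1": 44, "B2": 40, "C3": 47}
posttest = {"A1": 47, "C3": 48}  # B2 did not complete the posttest

# Keep only teachers present in both surveys, matched by code
paired = {code: (pretest[code], posttest[code])
          for code in pretest.keys() & posttest.keys()}
print(sorted(paired.items()))  # [('A1', (44, 47)), ('C3', (47, 48))]
```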

Approval from the Institutional Research Board (IRB) was obtained from both higher education institutions (Purdue University and Ivy Tech Community College) involved in the research. Reminders to complete the surveys were sent, if necessary, approximately 7 days and again 14 days after the initial survey link was emailed (Couper, 2008; Dillman, Tortora, & Bowker, 1999). Participants took the T-STEM survey for their appropriate subject area of expertise.

Statistical analysis

The researchers determined that the data were ordinal, the sample size was small, and the distributions were non-normal using the Kolmogorov-Smirnov and Shapiro-Wilk tests. Since the data were ordinal, the samples relatively small, and the distributions non-normal, the researchers used the nonparametric Wilcoxon signed-rank test and Wilcoxon rank-sum test. The critical alpha level was set to 0.05, as is commonly used in educational and social science research (Cumming, 2012; Krzywinski & Altman, 2013). The researchers acknowledge that a small sample size and low power can fail to detect a significant effect when one may exist. To increase statistical power, a larger sample size would need to be used. However, the TRAILS teacher professional development was constrained by funding to a maximum of fifteen participants, while the control group was limited to ten teachers per cohort.
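The normality screening described above can be sketched with SciPy; the score vector here is hypothetical, since the study's raw data are not reproduced.

```python
# Sketch of normality screening with SciPy (hypothetical scores).
from statistics import mean, pstdev

from scipy import stats

scores = [44, 47, 48, 39, 51, 42, 45, 50, 38, 46]

sw_stat, sw_p = stats.shapiro(scores)  # Shapiro-Wilk test
ks_stat, ks_p = stats.kstest(scores, "norm",
                             args=(mean(scores), pstdev(scores)))

# With alpha = .05, p < .05 would suggest a non-normal distribution
print(f"Shapiro-Wilk p = {sw_p:.3f}, Kolmogorov-Smirnov p = {ks_p:.3f}")
```

With samples this small, such tests have limited power, which is one reason nonparametric tests were a conservative choice regardless of the screening outcome.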

The researchers used the Wilcoxon signed-rank test to compare pretest, posttest, and delayed posttest scores of each group. To test for significant differences between the groups (control vs. experimental, science vs. ETE) in the score changes between assessments (pretest, posttest, and delayed posttests), the Wilcoxon rank-sum test was implemented.

When analyzing small sample sizes where normality is questionable or when handling ordinal data, nonparametric tests such as Wilcoxon signed-rank test and Wilcoxon rank-sum test are recommended instead of a paired two-sample t test and a two independent sample t test, respectively. In the nonparametric test, the medians or “distribution shapes” are analyzed; descriptive ordinal statistics, including the minimum, first quartile, median, third quartile, and maximum values for individual items and constructs, are calculated for Likert scores (Bowerman & O’Connell, 2007; Doane & Seward, 2007; Keller, 2005; Meek, Ozgur, & Dunning, 2007).
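The two nonparametric tests named above can be sketched with SciPy. The score lists are hypothetical Likert-sum totals, not the study's data.

```python
# Sketch of the study's two nonparametric tests (hypothetical data).
from scipy import stats

pre = [44, 40, 47, 43, 46, 41, 45, 48]      # paired: same teachers
post = [47, 43, 48, 47, 49, 44, 46, 50]
control_post = [42, 41, 43, 40, 44, 42, 45, 41]

# Within-group change: Wilcoxon signed-rank test on paired scores
sr_stat, sr_p = stats.wilcoxon(pre, post)

# Between-group comparison: Wilcoxon rank-sum (Mann-Whitney U) test
rs_stat, rs_p = stats.mannwhitneyu(post, control_post,
                                   alternative="two-sided")
print(f"signed-rank p = {sr_p:.3f}, rank-sum p = {rs_p:.3f}")
```

The signed-rank test pairs each teacher with themself across assessments, while the rank-sum test compares two independent groups, mirroring the within- and between-group analyses described above.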

While statistical significance is important for hypothesis testing, the magnitude of an effect is not conveyed by p values. An accepted measure of effect size for the Wilcoxon rank-sum test is Cliff’s delta measure, also known as Cliff’s dominance measure (Grissom, 2015; Grissom & Kim, 2012).

$$ \mathrm{Cliff's}\ \delta =\frac{2\left({M}_{r2}-{M}_{r1}\right)}{n_1+n_2} $$

where Mr1 and Mr2 are the rank means, and n1 and n2 are the sample sizes of the two groups.
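A minimal implementation of this rank-mean form of Cliff's delta is sketched below; the two score lists are illustrative, not the study's data.

```python
# Cliff's delta via joint rank means: delta = 2*(M_r2 - M_r1)/(n1 + n2).
from scipy.stats import rankdata

def cliffs_delta(group1, group2):
    n1, n2 = len(group1), len(group2)
    ranks = rankdata(list(group1) + list(group2))  # average ranks for ties
    m_r1 = ranks[:n1].mean()                       # rank mean, group 1
    m_r2 = ranks[n1:].mean()                       # rank mean, group 2
    return 2 * (m_r2 - m_r1) / (n1 + n2)

# All of group 2 above group 1 -> complete dominance, delta = 1.
print(cliffs_delta([1, 2, 3], [4, 5, 6]))   # 1.0
# Interleaved groups -> partial dominance.
print(cliffs_delta([1, 3, 5], [2, 4, 6]))   # ~0.333
```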

For paired two-sample nonparametric analysis, matched-pairs rank biserial r is used as an effect size. It is the difference between the ratio of the positive rank-sum to the total rank-sum and the ratio of the negative rank-sum to the total rank-sum (Glass, 1966; Kerby, 2014).

$$ r=\frac{\mid {S}_{r+}-{S}_{r-}\mid }{{S}_{r+}+{S}_{r-}}, $$

where Sr+ is the sum of positive ranks and Sr− is the sum of negative ranks.

The effect size varies from 0 to 1, where 0 indicates no difference between the conditions and 1 indicates that one condition completely dominates the other.
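The matched-pairs rank-biserial correlation defined above can be computed directly from the paired differences, as in the sketch below. The pre/post pairs are hypothetical, and zero differences are dropped, as in the standard Wilcoxon signed-rank procedure.

```python
# Matched-pairs rank-biserial r = |S_r+ - S_r-| / (S_r+ + S_r-),
# where ranks are assigned to the absolute paired differences.
from scipy.stats import rankdata

def rank_biserial(pre, post):
    diffs = [b - a for a, b in zip(pre, post) if b != a]  # drop zero diffs
    ranks = rankdata([abs(d) for d in diffs])             # rank |differences|
    s_pos = sum(r for r, d in zip(ranks, diffs) if d > 0) # positive rank sum
    s_neg = sum(r for r, d in zip(ranks, diffs) if d < 0) # negative rank sum
    return abs(s_pos - s_neg) / (s_pos + s_neg)

# Three gains and one loss of equal magnitude -> strong positive effect.
print(rank_biserial([10, 12, 14, 11], [15, 13, 13, 16]))  # 0.7
```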

The researchers used the matched-pairs rank-biserial correlation to calculate the effect size for the Wilcoxon signed-rank test (within-group comparisons of pretest, posttest, and delayed posttest scores) and Cliff's delta for the Wilcoxon rank-sum test (between-group comparisons).

Results

Participant survey results

All participants completed the surveys except one teacher in the control group and seven teachers in the experimental group. Returning teachers were counted only once, within their original cohort, in the statistical analysis, as summarized in Table 5.

Table 5 Descriptive ordinal statistics by test and group

Before conducting the within-group Wilcoxon signed-rank tests, the researchers summed each teacher's item scores. For experimental teachers, the median summed score was 44.5 on the pretest, 47 on the posttest, and 48 on the delayed posttest. The experimental group median increased while the control group median did not.

Significant effects on self-efficacy were detected with the Wilcoxon signed-rank test in the experimental group when comparing pretest to delayed posttest scores and posttest to delayed posttest scores (Table 6). Although both TRAILS science and ETE teachers experienced the same professional development, the impact on self-efficacy was significantly greater for science teachers. Comparing the teacher groups by subject area, only science teachers showed significant differences in scores (Table 6). Among science teachers in the experimental group, significant increases in scores with large effect sizes were detected between posttest and delayed posttest (p = .0099, effect size = .71) and between pretest and delayed posttest (p = .001, effect size = .95). No significant differences were found in the control group (Table 7).

Table 6 Summary of statistical tests for significant differences for the experimental group (V, p value and effect size)
Table 7 Summary of statistical tests for significant differences for the control group (V, p value and effect size)

The researchers also conducted a Wilcoxon rank-sum test to compare between-group differences, as seen in Table 8. The test shows a significant difference between the experimental and control groups in score changes from pretest to posttest (p = .048, effect size = .29) and from pretest to delayed posttest (p = .034, effect size = .32) (Table 8). Differences were also detected between all (cohorts 1–3) experimental science teachers and control group science teachers for the pretest-to-posttest changes (p = .033, effect size = .46), pretest-to-delayed-posttest changes (p = .005, effect size = .64), and posttest-to-delayed-posttest changes (p = .029, effect size = .47).

Table 8 Summary of Wilcoxon rank-sum test for significant differences between-groups (W, p value, and effect size)

The focus of TRAILS professional development was to build a partnership between high school science and ETE teachers in order to create and implement integrated STEM lessons. It appears that TRAILS professional development and lesson implementation had a greater impact on teacher self-efficacy for science teachers than for ETE teachers. The science teachers showed a significant increase in self-efficacy from pretest to delayed posttest and from posttest to delayed posttest after the TRAILS professional development and lesson implementation, with relatively large effect sizes, as shown in Table 6. ETE teachers showed no significant increase in self-efficacy from pretest to posttest (after professional development) or from posttest to delayed posttest (after lesson implementation). Between-group differences were not detected when comparing TRAILS ETE teachers with control group teachers.

Discussion

Several variables may have influenced the self-efficacy results for TRAILS science teachers compared to the ETE teachers. Upon review of the self-efficacy questions within the Science T-STEM survey, a few themes emerge. Table 4 shows that the self-efficacy questions ask about science teachers' confidence in the general teaching of science (questions #1 and #4), in teaching science practices, including the steps of science and explaining experimental results (questions #2 and #3), and in engaging with colleagues (inviting colleagues to evaluate their teaching). The TRAILS professional development required all teachers to engage in science and technology practices: teachers collected samples in the field, categorized aquatic insect specimens, made observations, and used data to assess water quality conditions. Science practice 3, Planning and Carrying Out Investigations (NGSS Lead States, 2013, p. 389), requires students to learn how to plan and conduct investigations collaboratively to generate data as evidence and to make informed decisions about the accuracy, reliability, and precision of measurements when drawing conclusions. TRAILS teachers collected, sorted, and counted aquatic insect specimens and used these data to assess the quality of the water in a local pond, thus benefiting from engaging in science practices as outlined in NGSS. The TRAILS science teachers also collaborated with colleagues, both school partners and teachers from other schools, to create custom STEM lessons; the teachers peer-assessed the custom lessons and provided feedback for revision. Finally, science teachers engaged in technology practices by working with engineering technology teachers to create a fishing lure design in CAD and to 3D print prototype solutions.
This experience prepared science teachers to help their students design and evaluate solutions to a complex real-world problem based on scientific knowledge, generate evidence, prioritize criteria, and weigh trade-off considerations (NGSS Lead States, 2013, HS-ETS1-2, HS-ETS1-3, p. 126). These experiences helped science teachers develop a deeper understanding of technology. Potentially the most important impact on self-efficacy was the increased ability to help students build interest in science (T-STEM question #11). TRAILS teachers experienced a blend of science and technology practices during professional development and learned from guest speakers how these practices are used in industry and scientific research. These experiences may have contributed to teachers' increased self-efficacy in teaching STEM.

Additionally, the instrument inquires about teachers' confidence that they understand science well enough to teach technology effectively (questions #5 and #6). Research indicates that teachers with limited knowledge in a subject outside their area of expertise struggle to teach those subjects confidently (Stohlmann et al., 2012). The T-STEM science survey results indicate that TRAILS science teachers increased their knowledge of technology, which may be a factor in their increased confidence in teaching both science and technology, as well as in increasing students' interest in STEM careers (Knowles et al., 2018).

TRAILS professional development included providing authentic engineering design activities and common STEM practices during teacher training sessions (Annetta & Minogue, 2016; Bray-Clark & Bates, 2003; NRC, 2015; Stohlmann et al., 2012). These sessions explicitly modeled how to teach the key features of authentic engineering design. Additionally, the researchers believe one key element to building authentic engineering design learning experiences is to do so within a community of practice (Lave & Wenger, 1991). The TRAILS teacher professional development was delivered within a community of STEM practitioners, allowing novices to learn engineering design, science inquiry, 3D printing, and 3D scanning by working and learning alongside experts.

ETE teachers appear to have already been confident (a potential ceiling effect) in teaching STEM subjects using an integrated approach. The pretest scores for ETE teachers began higher than those of science teachers (median ETE = 49, median science = 44), and a significant difference was detected when comparing the pretest self-efficacy scores of ETE and science teachers (p = .026). Because their pretest scores were higher, the ETE teachers had less room for growth in self-efficacy compared with the science teachers.

Teacher self-efficacy has been found to influence students' persistence and retention in STEM subjects (Painter & Bates, 2012) and overall improvement in student learning (Nadelson et al., 2012; Yoon et al., 2012, 2014). The findings for TRAILS science teachers' self-efficacy are significant in showing positive effects of professional development, which may improve student learning of STEM content and career interest. Moreover, it is important to acknowledge the significant increase, with a large effect size, in self-efficacy for the TRAILS science teachers from the posttest to the delayed posttest. These findings reveal that science teachers were significantly impacted by implementing integrated STEM lessons, both the exemplar and their own custom lessons, and indicate that they reinforced their self-efficacy through the process of implementation. This supports Bandura's (1994) theory that self-efficacy improves through continuous feedback between learning new skills, putting those skills into practice, and receiving feedback on success and failure, and it underscores the necessity of persistence in the face of failure or obstacles when integrating STEM. Members of the TRAILS community of practice encouraged the teachers during professional development and shared their own failures from manufacturing, scientific research, and other STEM contexts; hearing these examples of persistence may have positively impacted teachers' self-efficacy.

Results from this research suggest that science teachers benefited greatly from learning within this community of practice, engaging in science practices including data collection and observation in the field, and using this science knowledge to solve a real-world problem through engineering design (NGSS Lead States, 2013). Additionally, these teachers benefited from the opportunity to successfully implement their own integrated STEM lessons, as measured by the delayed T-STEM posttest results.

Limitations of this study

The authors note several limitations of this study. The teachers surveyed include only those who applied and were selected to participate in the TRAILS summer professional development workshop within a rural region of a Midwestern state. The participants were not a random sample but were selected from a pool of applicants because they met the criteria and were able to attend the entire summer professional development; however, any high school science or ETE teacher in this Midwestern state could apply to the program. Participant-support funding limited the experimental group to a maximum of 15 teachers each year. Given the small sample, the researchers calculated effect sizes to convey the magnitude of the significant differences detected.

Recommendations

Based on the findings presented here, the authors would like to make the following recommendations:

  1. STEM teachers seem to have increased their self-efficacy after successfully implementing integrated STEM lessons in their classrooms. Researchers should consider conducting delayed posttest assessments after teachers implement professional development STEM lessons and should investigate further factors that impact teacher self-efficacy.

  2. STEM teachers benefit from learning within a community of practice, specifically when cogenerating and peer evaluating integrated STEM lessons. Teacher professional developers should give teachers time to talk about their own practices with their peers and discuss ways to incorporate their own approaches to integrated STEM. When establishing a community of practice, it is important to include master teachers who can share best practices from their own successful implementation of integrated STEM lessons.

  3. More research needs to be conducted on how to effectively establish a community of practice, incorporating industry and university partners, that impacts teachers' self-efficacy in integrated STEM teaching and STEM career awareness.

  4. More research to understand the best practices of quality teacher professional development is necessary to improve integrated STEM education (Honey et al., 2014; Miles, Slagter van Tryon, & Mensah, 2015).

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

ETE: Engineering Technology Education

IRB: Institutional Review Board

ITEST: Innovative Technology Experiences for Students and Teachers

NGSS: Next Generation Science Standards

NSF: National Science Foundation

PCK: Pedagogical content knowledge

STEM: Science, Technology, Engineering, and Math

References

  1. Annetta, L., & Minogue, J. (2016). Connecting science and engineering education practices in meaningful ways: Building bridges (Contemporary Trends and Issues in Science Education, Vol. 44).

  2. Ary, D., Jacobs, L., Sorensen, C., & Walker, D. (2014). Introduction to research in education (9th ed.). Belmont: Wadsworth.

  3. Autenrieth, R., Lewis, C., & Butler-Perry, K. (2017). Long-term impact of the enrichment experiences in engineering (E3) summer teacher program. Journal of STEM Education, 18(1), 25–31.

  4. Bandura, A. (1994). Self-efficacy. In V. S. Ramachaudran (Ed.), Encyclopedia of human behavior (Vol. 4) (pp. 71–81). New York: Academic Press.

  5. Bandura, A. (1997). Self-efficacy: the exercise of control. New York: W. H. Freeman/Times Books/Henry Holt & Co.

  6. Barell, J. F. (2006). Problem-based learning: an inquiry approach. Corwin Press.

  7. Bowerman, B., & O’Connell, R. (2007). Business statistics in practice. New York: McGraw-Hill Irwin Publishing Co.

  8. Bray-Clark, N., & Bates, R. (2003). Self-efficacy beliefs and teacher effectiveness: implications for professional development. The Professional Educator, XXVI(1), 13–22.

  9. Brophy, S., Klein, S., Portsmore, M., & Roger, C. (2008). Advancing engineering education in P-12 classrooms. Journal of Engineering Education, 97(3), 369–387.

  10. Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42.

  11. Bruner, J. (1996). The culture of education. Cambridge: Harvard University Press.

  12. Caliendo, J. (2015). Pre-service elementary teachers: scientific reasoning and attitudes toward STEM subjects (Doctoral dissertation). Available from ProQuest Dissertations and Theses database. (UMI No. 3706494).

  13. Caprara, G., Barbaranelli, C., Steca, P., & Malone, P. (2006). Teachers’ self-efficacy beliefs as determinants of job satisfaction and students’ academic achievement: a study at the school level. Journal of School Psychology, 44(2006), 473–490.

  14. Caprile, M., Palmen, R., Sanz, & Dente, G. (2015). Encouraging STEM studies for the labour market (Directorate-General for Internal Policies: European Parliament). Brussels: European Union Retrieved from http://www.europarl.europa.eu/RegData/etudes/STUD/2015/542199/IPOLSTU/282015/29542199_EN.pdf.

  15. Couper, M. (2008). Designing effective web surveys. Cambridge: Cambridge Press.

  16. Creswell, J. (2009). Research design: qualitative, quantitative, and mixed methods approaches (3rd ed.). Los Angeles: Sage.

  17. Cumming, G. (2012). Understanding the new statistics: effect sizes, confidence intervals, and meta-analysis (pp. 27–28). New York: Routledge.

  18. Dillman, D. A., Tortora, R.D. & Bowker, D. (1999). Principles of constructing web surveys. Retrieved July, 6th 2017, from http://claudiaflowers.net/rsch8140/PrinciplesforConstructingWebSurveys.pdf

  19. Doane, D., & Seward, L. (2007). Applied statistics in business and economics. New York: McGraw-Hill Irwin Publishing Co.

  20. Eide, A. R., Jenison, R. D., Marshaw, L. H., & Northrup, L. (2001). Engineering fundamentals and problem solving (4th ed.). Boston: McGraw-Hill.

  21. English, L. (2017). Advancing elementary and middle school STEM education. International Journal of Science and Mathematics Education, 15(1), 5–24.

  22. Friday Institute for Educational Innovation. (2012a). Teacher efficacy and attitudes toward STEM survey (T-STEM). Raleigh: North Carolina State University Retrieved from: http://miso.ncsu.edu/articles/t-stem-survey-2.

  23. Friday Institute for Educational Innovation. (2012b). Teacher efficacy and attitudes toward STEM (T-STEM) survey: development and psychometric properties. Raleigh: North Carolina State University Retrieved from: http://miso.ncsu.edu/wp-content/uploads/2013/06/TSTEM_FridayInstitute_DevAndPsychometricProperties_FINAL.pdf.

  24. Gall, M. D., Gall, J. P., & Borg, W. R. (2007). Educational research: an introduction (8th ed.). Boston: Pearson/Allyn & Bacon.

  25. Glass, G. V. (1966). Note on rank biserial correlation. Educational and Psychological Measurement, 26, 623–631.

  26. Grissom, R. J. (2015). Nonparametric effect size estimators. Post in EDSTAT-L (listserv), 16-June-2015.

  27. Grissom, R. J., & Kim, J. J. (2012). Effect sizes for research univariate and multivariate applications (2nd ed.). New York: Routledge.

  28. Honey, M., Pearson, G., & Schweingruber, A. (2014). STEM integration in K–12 education: status, prospects, and an agenda for research. Washington, DC: National Academies Press.

  29. Keller, G. (2005). Statistics for management and economics (7th ed.). Belmont: Thomson Publishing Co.

  30. Kelley, T. (2011). Engineer’s notebook – a design assessment tool. Technology and Engineering Teacher, 70(7), 30–35.

  31. Kelley, T. (2014a). Constructing an engineer’s notebook rubric. The Technology and Engineering Teacher, 73(5), 26–32.

  32. Kelley, T. (2014b). STL guiding the 21st century thinker. The Technology and Engineering Teacher, 73(4), 18–23.

  33. Kelley, T. (2019). TRAILS Annual NSF Project Report. Unpublished Report.

  34. Kelley, T., & Sung, R. (2017). Sketching by design: teaching sketching to young learners. International Journal of Technology and Design Education, 27(3), 363–386.

  35. Kelley, T. R. (2010). Design assessment: consumer reports style. The Technology Teacher, 69(8), 12.

  36. Kelley, T. R., & Knowles, J. G. (2016). A conceptual framework for integrated STEM education. International Journal of STEM Education, 3, 11. https://doi.org/10.1186/s40594-016-0046-z.

  37. Kerby, D. S. (2014). The simple difference formula: an approach to teaching nonparametric correlation. Comprehensive Psychology, 3. https://doi.org/10.2466/11.IT.3.1.

  38. Knowles, J. G., Kelley, T., & Holland, J. D. (2018). Increasing teacher awareness of STEM Careers. Journal of STEM Education, 13(3), 26–34.

  39. Knowles, J. G., Kelley, T., & Hurd, B. (2016). Innovate the intersection between entomology and technology. The Technology and Engineering Teacher, 76(1), 1–7.

  40. Krzywinski, M., & Altman, N. (2013). Points of significance: significance, P values and t-tests. Nature Methods. Nature Publishing Group., 10(11), 1041–1042 Retrieved from: http://www.nature.com/nmeth/journal/v10/n11/full/nmeth.2698.html.

  41. Lave, J., & Wenger, E. (1991). Situated learning. Legitimate peripheral participation. Cambridge, England: Cambridge University Press.

  42. NGSS Lead States. (2013). Next generation science standards: for states, by states. Washington, DC: National Academies Press.

  43. Meek, G., Ozgur, C., & Dunning, K. (2007). Does scale of measurement really make a difference in test selection? An empirical comparison of t test vs. Mann-Whitney (pp. 951–953). Proceedings of the 2000 National Annual Meeting of the Decision Sciences Institute.

  44. Mentzer, N., Farrington, S., & Tennenhouse, J. (2015). Strategies for teaching brainstorming in design education. Technology and Engineering Teacher, 74(8), 8.

  45. Miles, R., Slagter van Tryon, P., & Mensah, F. (2015). Mathematics and science teacher’s professional development with local businesses to introduce middle and high school students to opportunities in STEM careers. Science Educator, 24(1), 1–11.

  46. Nadelson, L., Seifert, A., Moll, A., & Coats, B. (2012). i-STEM summer institute: an integrated approach to teacher professional development in STEM. Journal of STEM Education, 13(2), 69–83.

  47. Nathan, M. J., Atwood, A. K., Prevost, A., Phelps, L. A., & Tran, N. A. (2011). How professional development in project lead the way changes high school STEM teachers’ beliefs about engineering education. Journal of Pre-College Engineering Education Research (J-PEER), 1(1), 3 http://docs.lib.purdue.edu/jpeer/vol1/iss1/3.

  48. National Research Council [NRC]. (1996). National Science Education Standards. National Committee for Science Education Standards and Assessment. Washington: National Academies Press.

  49. National Research Council [NRC]. (2011). Successful K-12 STEM education: identifying effective approaches in science, technology, engineering, and mathematics. Washington, DC: National Academies Press.

  50. National Research Council [NRC]. (2012). A framework for K-12 science education: practices, crosscutting concepts, and core ideas. Washington: National Academies Press.

  51. National Research Council [NRC]. (2015). Guide to implementing the next generation science standards. Board on Science Education, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.

  52. Painter, P., & Bates, R. (2012). Statistical models of self-efficacy in STEM students. Journal of Undergraduate Research at Minnesota State University, Mankato, 12(7), 1–13.

  53. Partnership for 21st Century Skills. (2017) Retrieved from: http://www.p21.org/framework.

  54. President’s Council of Advisors on Science and Technology (PCAST). (2010). Prepare and inspire: K–12 education in science, technology, engineering, and math (STEM) for America’s future. Washington, DC: Author.

  55. Purzer, S., Goldstein, M., Adams, R., Xie, C., & Nourian, S. (2015). An exploratory study of informed engineering design behaviors associated with scientific explanations. International Journal of STEM Education, 2(9), 1–12.

  56. Putnam, R. T., & Borko, H. (2000). What do new views of knowledge and thinking have to say about research on teacher learning? Educational researcher, 29(1), 4–15.

  57. Riggs, I. M., & Enochs, L. G. (1990). Toward the development of an elementary teacher’s science teaching efficacy beliefs instrument. Science Education, 74(6), 625–637.

  58. Rittmayer, A., & Beier, M. (2008). Overview: self-efficacy in STEM. Houston: SWE-AWE-CASEE ARP Resources, Rice University Retrieved from: http://www.AWEonline.org.

  59. Rockland, R., Bloom, D. S., Carpinelli, J., Burr-Alexander, L., Hirsch, L. S., & Kimmel, H. (2010). Advancing the “E” in K-12 STEM education. Journal of Technology Studies, 36(1), 53–64.

  60. Sekaran, U., & Bougie, R. (2009). Research methods for business: A skill building approach. Cornwall: Wiley.

  61. Shadish, W., Cook, T., & Campbell, D. (2002). Experimental and quasi-experimental designs for generalized causal inference. New York: Houghton Mifflin Company.

  62. Speelman, J., & Carroll, N. (2012). Bioindicators of water quality. West Lafayette: Purdue Extension Retrieved from https://mdc.itap.purdue.edu/item.asp?ItemID=20801.

  63. Stohlmann, M., Moore, T., & Roehrig, G. (2012). Considerations for teaching integrated STEM education. Journal of Pre-College Engineering Education Research, 2(1), 28–34.

  64. Tavakol, M., & Dennick, R. (2011). Making sense of Cronbach’s alpha. International Journal of Medical Education, 2, 53–55.

  65. Thomas, B., & Watters, J. (2015). Perspectives on Australian, Indian and Malaysian approaches to STEM education. International Journal of Educational Development, 45(November 2015), 42–53.

  66. Wang, H., Moore, T., Roehrig, G., & Park, M. (2011). The impact of professional development on integrating engineering into science and mathematics classroom. American Society for Engineering Education, 01434–01417.

  67. Witt-Rose, D. (2003). Student self-efficacy in college science: an investigation of gender, age, and academic achievement. (Unpublished Master’s Thesis). Menomonie: University of Wisconsin-Stout.

  68. Yoon, S., Evans, M., & Strobel, J. (2012). Development of the teaching engineering self-efficacy scale (TESS) for K–12 teachers. In Proceeding of the American Society of Engineering Education (ASEE) Annual Conference and Exposition. San Antonio: ASEE.

  69. Yoon, S., Evans, M., & Strobel, J. (2014). Validation of the teaching engineering self-efficacy scale for K–12 teachers: a structural equation modeling approach. Journal of Engineering Education., 103(3), 463–485.


Acknowledgements

None

Funding

Elements of this paper are supported by the National Science Foundation, award #DRL-1513248. Any opinions and findings expressed in this material are of the authors’ and do not necessarily reflect the views of NSF.

Author information

Contributions

TK, GK, JDH, and JH were the major contributors in writing the manuscript. GK administered the surveys and collected data. JH conducted the statistical analysis, which was reviewed and approved by JDH. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Todd R. Kelley.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


About this article


Cite this article

Kelley, T.R., Knowles, J.G., Holland, J.D. et al. Increasing high school teachers self-efficacy for integrated STEM instruction through a collaborative community of practice. IJ STEM Ed 7, 14 (2020). https://doi.org/10.1186/s40594-020-00211-w


Keywords

  • Teacher self-efficacy
  • Integrated STEM
  • Teacher professional development
  • Engineering design