
Engineering student experience and self-direction in implementations of blended learning: a cross-institutional analysis

Abstract

Background

Much of researchers’ effort to foster wider implementation of educational innovations in STEM has focused on understanding and facilitating the implementation efforts of faculty. However, student engagement in blended learning and other innovations relies heavily on students’ self-directed learning behaviors, implying that students are likely key actors in the implementation process. This paper explores the ways in which engineering students at multiple institutions experience the self-directed selection and implementation of blended learning resources in the context of their own studies. To accomplish this, it adopts a research perspective informed by Actor-Network Theory, allowing students themselves to be perceived as individual actors and implementors rather than a population that is implemented upon.

Results

A thematic analysis was conducted in two parts. First, analysis identified sets of themes unique to the student experience at four participant institutions. Then, a second round of analysis identified and explored a subset of key actors represented in students’ reported experiences across all institutions. The findings show clear similarities and differences in students’ experiences of blended learning across the four institutions, with many themes echoing or building upon the results of prior research. Distinct institutional traits, the actions of the instructors, the components of the blended learning environment, and the unique needs and preferences of the students themselves all helped to shape students’ self-directed learning experiences. Students’ engagement decisions and subsequent implementations of blended learning resulted in personally appropriate, perhaps even idiosyncratic, forms of engagement with their innovative learning opportunities.

Conclusion

The institutional implementation of blended learning, and perhaps other educational innovations, relies in part on the self-directed decision-making of individual students. This suggests that instructors, too, hold an additional responsibility: to act as facilitators of their students’ implementation processes and as catalysts for growth and change in students’ learning behaviors. Developing a greater understanding of students’ implementation behaviors could inform the future implementation efforts of faculty and better empower students to succeed in the innovative classroom.

Introduction

Researchers in STEM education, and in the context of engineering more specifically, have long called for broader dissemination and application of research-based innovations in undergraduate teaching (Besterfield-Sacre et al., 2014; Borrego et al., 2013; Jamieson & Lohmann, 2012). This push for increased translation of research into practice has coincided with an increasing academic interest in the implementation of such innovations (Henderson et al., 2011; Reinholz et al., 2020). The resulting literature has been predominantly focused on the actions and experiences of faculty, and for good reason: faculty act to enhance the learning opportunities available to their students through the implementation of educational innovations, and facilitating or streamlining that implementation process is integral to the dissemination of new innovations (Finelli & Borrego, 2020; Liu et al., 2020; Mirriahi et al., 2015). As a result, researchers have come to identify a variety of factors that influence the implementation of innovations among STEM instructors.

However, sole focus on this faculty-oriented perspective may limit our understanding of students’ experiences and the roles they play in the implementation process (Kezar et al., 2015). For the past six years, our research team has been collecting data on the implementation of a blended learning environment called Freeform. The Freeform environment was developed to combine a variety of research-based pedagogical innovations and learning resources (Rhoads et al., 2014) and has been implemented at several engineering institutions—both domestically within the USA and internationally—since its formal introduction in 2014 (Kandakatla et al., 2018). One of the major affordances of blended learning is flexibility, allowing students to determine how best to tailor resource use to fit their individual needs (Means et al., 2009). This flexibility, however, depends on students’ own self-directed engagement. Furthermore, in our prior work, we have seen student decision-making play a vital role in determining how the overall implementation of blended learning manifested in the behaviors, experiences, and outcomes of students and faculty alike (Evenhouse et al., 2018, 2020; Kandakatla et al., 2020; Stites et al., 2019).

A better understanding of how students engage with the innovative resources contained in blended learning environments could provide valuable insight for future implementations of hybrid and blended learning, which have become even more prevalent in the wake of COVID-19. In addition, we suspect that a deeper understanding of the student experience and recognition of students as central actors in implementation could inspire a change in perspective. As Freeform has been disseminated to institutions beyond our own, it has also been exposed to students of many different contexts and backgrounds, allowing us to examine the variation in student implementations of blended learning innovations across institutional types and cultures. The purpose of this paper is to further examine the role that students play in the practical implementation of their blended learning environment: not just as a factor influencing the actions of faculty, but as actors working with the innovation itself.

Blended learning context: the learning environment

Freeform is a pedagogical system which has been iteratively developed, researched, and propagated through the joint effort of faculty in the Purdue University—West Lafayette (PUWL) Schools of Engineering Education (ENE) and Mechanical Engineering (ME). Originally conceived to enhance teaching and learning in second-year dynamics courses (Rhoads et al., 2014), the Freeform environment has since been applied to other courses in ME mechanics (Kandakatla et al., 2018) and other core engineering sciences (e.g., in Chemical Engineering). The environment was designed to combine best practices in blended learning (Halverson et al., 2014; Means et al., 2009) with active (Christie & de Graaff, 2017; Freeman et al., 2014) and collaborative (Barkley et al., 2014; Dillenbourg, 1999) instructional approaches, encouraging the adoption of innovative pedagogical methods and allowing students to engage with a wide variety of in-person and online learning resources (Rhoads et al., 2014). This is also the primary difference between the pedagogical environment and similar applications of educational technology, such as Learning Management Systems (LMS) or enhanced textbooks—the physical and digital resources in the environment are intended to complement and facilitate the simultaneous use of various research-based teaching practices. The online resources also act as a supplement to in-class learning (Francis & Shannon, 2013), thereby promoting opportunities for self-regulated learning behavior outside of class (Zimmerman, 2001). Table 1 gives an overview of the learning resources typically available to students in Freeform courses on the PUWL campus.

Table 1 Learning resources in the Freeform environment (Evenhouse et al., 2020)

Having these resources distributed to multiple implementing institutions provides a unique opportunity to study how and why students interact with specific subsets of resources and learning opportunities (Kandakatla et al., 2018). Previous work by Stites et al. (2019, 2020) used cluster analysis to identify nine distinct resource usage patterns among students. These patterns, with few exceptions, did not predict significant differences in academic performance. Rather, students chose to engage with resources they expected to best address their needs and could readily explain their engagement decisions when interviewed (Evenhouse et al., 2020; Stites et al., 2019). The analyses and findings resulting from these students’ explanations echoed the conclusions of previous theories such as the Technology Acceptance Model (Davis, 1993) or the expectancy-value model (Makara & Karabenick, 2013). However, the manifest depth and variety of student experience further complicated our understanding of academic implementation. Students’ decisions were influenced by a host of factors, including future goals, scheduling limitations, personal preferences, background factors, and prior educational experiences (Evenhouse et al., 2018, 2020; Kandakatla et al., 2020). Even personal context such as students’ housing situations could influence how they engaged with learning resources in their blended learning environment (Evenhouse et al., 2018, 2020).

Although we have evaluated the implementation of Freeform from a number of perspectives (Evenhouse et al., 2018; Kandakatla et al., 2018), we have been increasingly drawn to the experiences, behaviors, and demonstrated agency of students themselves. For example, in their discussion of resource usage and agency, Stites et al. (2019) encouraged faculty to consider allowing students to make their own implementation decisions, rather than expecting all students to use specific, tailored approaches to learning in the innovative classroom. Evenhouse et al. (2020) came to a similar conclusion, emphasizing that instructors should encourage mindsets and behaviors that promote deep learning, rather than attempting to prescribe specific methods and resources for all of their students to use. In implementing blended learning environments across diverse contexts, we expect students to play a pivotal role in the introduction, use, and long-term success of their learning innovations. To better understand the experiences and roles of students in the implementation of such innovations, we address the following research questions:

  • RQ1: How do students experience and engage in implementation within the Freeform environment?

  • RQ2: How do these experiences compare across institutionally distinct contexts?

Literature review

Implementation in undergraduate STEM

Borrego and Henderson (2014) introduced implementation in higher education as an intentional and targeted process of curricular change. Through implementation, faculty adopt, adapt, and subsequently integrate specific educational innovations into new environments via strategies designed to promote the innovations' efficacy. In this way, the decision to adopt a new research-based innovation is only the first step of any implementation (Taylor et al., 2018b; Tornatzky & Klein, 1982): faculty then apply, test, and iteratively adapt the innovation to fit with the unique needs and values of their classroom's context (Aarons et al., 2019; Baumann et al., 2017). Ideally, these intentional processes of targeted change result in the successful, effective, and long-term use of research-based innovations (Moullin et al., 2019; Rogers, 2003), applied and adapted in ways that benefit both faculty and their students.

Henderson and Dancy (2007), Lattuca et al. (2009), and others model change in higher education using a combination of individual and environmental—or institutional—factors, each of which influence individual decision-making and shape the final form of a given implementation. In short, the process of implementation in education is shaped by its implementors and context (Lattuca et al., 2009). In this body of work, students are most often included as an environmental factor, as these models of implementation typically assume that the implementors in question are faculty or administrators. For example, in the 2014 Journal of Engineering Education special issue on systematic change in STEM higher education, students were almost exclusively represented in the form of student resistance as experienced by faculty members (McKenna et al., 2014).

It is the influence of these individual and environmental factors in implementation that often results in the development of local adaptations. Individual factors may require that implementors better adapt the innovation to fit with their personal needs, abilities, or preferences. For example, faculty may have strong conceptions of learning theory or a long history of educational experience, leading them to alter the ways in which they utilize innovations to better match their prior experience and skillset (Englund et al., 2017). In contrast, environmental factors could require implementors to adapt innovations to fit with the needs, values, or capabilities of their context. For example, faculty may have to adapt their use of new educational technologies to fit with their institutional infrastructure or their students’ preferences (Cohen & Ball, 2007; Henderson & Dancy, 2008). Such adaptations are a necessary part of the implementation process, allowing innovations to be applied broadly despite differences in institutional, cultural, or physical context (Dancy & Henderson, 2008). Thus, any adaptations to an educational innovation that are determined to be necessary, and the resulting changes to its final form after implementation, depend both upon the context in which it is implemented and the unique individuals engaged in the implementation process.

Undergraduate students in the implementation of blended learning

Prior research on blended instruction strongly supports the efficacy of blended learning in higher education (Halverson et al., 2014; Means et al., 2009; Porter et al., 2014). Yet there is little literature that evaluates blended learning in the context of implementation strategies specific to such innovations (e.g., Brown, 2016; Taylor et al., 2018a). In addition, those studies specifically targeting the implementation of blended learning rarely examine blended learning environments as a whole (Porter et al., 2016). Instead, they rely heavily on literature related to the dissemination, implementation, or use of specific educational technologies by new faculty or at new institutions.

As a result, formalized frameworks and models of blended learning implementation tend to feature students as a strictly environmental factor (e.g., Brown, 2016) when they feature students at all (e.g., Porter et al., 2014). This is somewhat surprising, even if it follows the general trends of implementation literature. Students’ self-regulated or self-directed learning behaviors are widely recognized as a critical component driving student success in blended learning environments (Kintu et al., 2017; Stacey & Gerbic, 2008). This implies that student interaction with blended learning innovations not only influences the implementation process, but directly contributes to the implementation’s success.

Student resource adoption and self-direction in blended learning

Though research that formally examines student agency in the context of educational change remains rare, many studies have examined what factors might act as motivators or barriers to student engagement in the context of newly introduced, research-based educational innovations. The expectancy-value model, for example, frames decision-making in terms of students’ perceptions regarding resource availability, applicability, and quality: students weigh these factors alongside their learning needs to determine the potential value of each resource they encounter (Makara & Karabenick, 2013). The Technology Acceptance Model has likewise been widely applied in education research and frames students’ technology adoption decisions in similar terms, predicting students’ attitudes and engagement behaviors based on the resources’ perceived ease of use and perceived usefulness (Davis, 1993). Speaking generally, we can expect students’ adoption decisions to depend heavily on their perceptions of how valuable a resource is (i.e., how much relevant help it can provide) and how easy that resource is to engage with. For example, students with a high degree of digital literacy might view digital learning resources as being particularly easy to use (Sayaf et al., 2022). Likewise, students whose peers speak highly of certain resources might see more value in utilizing those resources themselves due to social influence (VanDerSchaaf et al., 2021). However, these studies tend to revolve around assessments of student adoption and engagement decisions. To examine how students go on to further implement the resources they choose to adopt (i.e., how those adopted resources are adapted and utilized in the long term), we must look beyond literature on adoption and instead examine learning behaviors.

Self-directed learning describes the process through which students select, plan, carry out, and evaluate the efficacy of their own learning experiences. In self-directed learning, choices regarding when and how to engage with specific opportunities for learning lie partially with the students, rather than being wholly directed by their instructors. Therefore, self-directed learning relies heavily on students’ own internal motivations to learn, understandings of how learning works, and abilities to plan ahead (Litzinger et al., 2005). These students must be able to set their own goals, engage actively and metacognitively in the learning process, and evaluate their own learning outcomes (Jossberger et al., 2010).

Self-directed learning readiness has long been discussed as an essential contributor to student success and positive student perceptions of learning in blended environments (Ausburn, 2004), and engaging in blended learning can help students to develop their own self-directed learning aptitude (De George-Walker & Keeffe, 2010). However, blended learning does not automatically teach students how to effectively regulate their own learning (Adinda & Mohib, 2020; Sirakaya & Özdemir, 2018). Instructors must intentionally design for the enhancement of students’ self-directed and self-regulatory learning behaviors for development to reliably occur (Adinda & Mohib, 2020; Van Laer & Elen, 2020). Van Laer and Elen (2020) published a list of curricular design attributes that typify the facilitation of self-direction in blended learning. Together, these attributes are intended to foster students’ internal motivation, facilitate the adoption and use of blended learning resources, and encourage metacognitive and reflective learning practice in the presence of blended learning innovations (Van Laer & Elen, 2020). Put another way, instructors in blended learning are encouraged to empower their students, enabling them to better adopt, adapt to, and utilize their wide range of resources. Thus, implementing blended learning in a way that targets student self-direction involves adopting methods that treat the students as implementors themselves.

Theoretical framework

Implementors, innovations, and their environments interact in complex ways during the implementation process. In this study, we employ Actor-Network Theory (ANT) as a theoretical framework to help us better confront this complexity. In ANT, all stakeholders, participants, technologies, and environments are conceived as being actors connected through mutual interactions (Latour, 1984). In the same way that people can act on objects around them to define the object’s purpose, uses, or roles, objects are theorized to act upon people to inform their own use (Tatnall & Davey, 2015). This can make ANT especially useful for the examination of innovation and implementation efforts (Fenwick, 2011; Harty, 2010). ANT provides a means of conceptualizing interactions between the various components and stakeholders involved in the implementation process. Within the scope of ANT, implementation becomes a process of renegotiation of roles and connections involving innovations, users, and their surrounding environments (Latour, 2005).

By adopting ANT, we are intentionally considering students as controlling actors, with each student working to influence their immediate learning network (Aheto, 2017). When given the agency to implement new, innovative approaches to learning, students attempt to change their learning networks in ways that incorporate those innovations in roles that are comprehensible and useful to the students themselves. In ANT, implementation is represented by the process of translation. Translation is the alteration of existing networks as certain actors attempt to exert control over their surroundings, altering them and attempting to redefine the roles of other actors as desired (Latour, 2005). However, these controlling actors (or implementors) can also be guided and changed throughout their work due to the influence of other actors, including the innovation they wish to implement.

In this way, the process of implementation is framed as a renegotiation of roles, one which may require compromise from any or all of the actors involved. Such processes of negotiation are well documented in implementation literature even outside the context of ANT. The concept of mutual adaptation, for example, has existed in discussions of business management for decades (Leonard-Barton, 1988). Incorporating ANT as a theoretical framework can help researchers identify and interpret mutual adaptations by allowing all the actors involved in the network, whether human, technological, or otherwise, to demonstrate agency during the implementation process. For example, ANT has proven useful in the study of Information and Communications Technology, disrupting researchers’ prior understandings to identify complex interactions in the use of digital learning technologies (Arif et al., 2017) and online learning spaces (Rowan & Bigum, 2003; Tatnall, 2019).

We are not employing ANT as a methodological approach. Rather, we are creating a hybrid study, incorporating ANT as a theoretical framework to inform our analysis and subsequent interpretation of findings in light of its sociomaterial perspective. As Fenwick and Edwards note in their introduction to Revisiting Actor-Network Theory in Education, “this practice of ANT hybrid is becoming more the norm than the exception” (Fenwick & Edwards, 2019: p. 3), and several chapters in their book follow a similar, hybridized approach. They reiterate from their previous book that ANT offers “a way of intervening in or interrupting education rather than simply a way of representing education” (Fenwick & Edwards, 2019: p. 4). Although ANT has been applied to Engineering Education Research (EER) before (Johri & Olds, 2011; Paledi, 2019), studies that intentionally treat students as controlling actors and implementors appear far more common outside of EER (e.g., Buhl, 2017; Luke, 2020; Pillai, 2017). By employing ANT, we intend to interrupt our current conceptions of implementation in education, opening the door to a wider range of possible interpretations.

Methods

To study the experiences of individual students as they implement the Freeform environment, we employed a cross-institutional thematic analysis. We conducted this analysis in two parts. First, we analyzed the experiences of students at each separate institution, looking for commonalities and trends in the data from different implementations. Next, we took those initial codes and findings to compare them across the institutional populations, once again looking for clear themes and further interpreting them through the lens of ANT.

Participant selection and institutional context

In approaching this multi-institutional study of implementation, we treated the student populations at each location as separate actor-networks. We assumed that all the students in these four distinct institutional populations considered their use of the Freeform environment to be of “relevance and personal significance” (Pietkiewicz & Smith, 2014: p. 10) to their experience in the course. Although students would experience implementation in individually distinct ways, every student could, in some way, speak to our phenomenon of interest due to their participation in the course. All students in each Freeform course were invited to participate in interviews, maximizing our number of potential participants given the relatively small population sizes. A summary of each participating institution and their corresponding sample sizes may be found in Table 2. By subsequently comparing our findings from participants in each context, we drew insight into any unifying, diversifying, or singular aspects of the experienced phenomenon of interest (the implementation of Freeform).

Table 2 Summary of participating institutions

All implementations of Freeform included in this study were within the context of engineering dynamics courses, and all occurred prior to the arrival of COVID-19 (years have been deidentified to preserve anonymity). All institutions received their own instantiations of the Freeform course blog (a course website containing digital learning resources), although some instructors, as in the case of Trine, chose to distribute Freeform resources using their own LMS. Decisions regarding the use of supplementary instruction or other out-of-class teaching activities were left to the discretion of each instructor.

Data collection

Data were collected using a semi-structured interview protocol during in-person visits to each institution. The interview questions were written to address students’ experiences with learning resources and pedagogical approaches characteristic of Freeform, highlighting relevant stories and personally significant moments of experience. In addition, an initial set of questions asked students to describe themselves and their home institution, helping to establish the context for their experience. Interviews lasted 30–50 min, with duration largely driven by the interviewee. Interviews were held approximately two-thirds of the way through the semester, and participation was incentivized by individual $20 gift cards. After recording, interviews were sent to a third-party service for transcription.

A portion of our data (from PUWL) has been used in previous research (Evenhouse et al., 2020). By employing a new theoretical framework and broadening the scope to our multi-institutional dataset, we expect to discover new insights from these students’ stories and draw fresh comparisons across contexts.

Data analysis

The structure of our analytical process drew heavily from Braun and Clarke (2006), following their step-by-step guide to thematic analysis while taking advantage of their framework’s flexibility. Analysis began with immersing ourselves in the dataset, reading through each transcript in its entirety and checking its accuracy against the interview recordings. Inspired by Kirn and Benson (2018), we concluded this initial read-through by creating summary descriptions of each of our participants. This helped us to keep the individuality and personality of each of our participants in mind during subsequent analysis. Next, our first round of coding followed an in vivo approach, taking direct quotations from the students’ interviews and using them as initial codes (Saldaña, 2013). These codes were then iteratively grouped, described, and categorized into sets of emergent and clustered themes through repeated engagement with the dataset, as well as with our participant summaries and other notes (Braun & Clarke, 2006; Pietkiewicz & Smith, 2014). This process was continuously accompanied by analytical memo writing to inform the creation and interpretation of results (Creswell & Miller, 2000).

In contrast to the inductive nature of our initial analysis, our subsequent comparison across institutional contexts was, to use the terminology of Braun and Clarke (2006), far more theoretical. Rather than allowing themes to again inductively emerge from the data, our team approached previous codes and themes through the lens of ANT, first identifying key actors in the students’ learning networks and subsequently examining their roles in the students’ experiences of implementation. The resulting high-level themes categorize and describe the actors themselves, using key points of similarity or difference exhibited across institutions to clarify each actor’s influence on the implementation experience.

Key limitations

Often, researchers employing thematic analysis continue incorporating data until initial codes and emergent themes reach saturation, a state wherein the inclusion of additional data fails to produce new insights. However, the number of participants required to reach saturation has proven extremely difficult to determine, with typical recommendations for small populations hovering around 10 participants (Fugard & Potts, 2015). While the total number of students included in this study (39) exceeds typical guidelines, participants are spread across four different instantiations of the Freeform environment. This makes it difficult to demonstrate saturation at any one institution, especially in the case of PUNW (n = 4). While our findings demonstrate clear differences in student experience between contexts, allowing for practical comparisons to be made, these findings should not be assumed to be freely generalizable. Rather, we hope that this study encourages readers to reflect upon their own experiences of implementation in new ways, guided by the subsequent analyses and informed by our results.

We approached analysis from an intentionally constructivist perspective, acknowledging the experiences of each student as individual, storied, and deeply complex (Ekebergh, 2009; Noon, 2018). As such, we took care at each stage to ensure that themes contained honest and accurate representations of our participants’ stories. Therefore, the themes below do not elucidate one common, universally defining aspect of implementation experience. Rather, we are using these themes to describe trends in the stories we collected, concisely representing a subset of illustrative, unique experiences and interpreting them through the lens of ANT. We remind readers of this individuality of experience as they engage with our results and connect this work to their own contexts and experiences.

Positionality statement

In addition to the limitations inherent to our research, we encourage readers to keep our own positionality as authors in mind when engaging with this work (Secules et al., 2021). One of our co-authors (JFR) was an original developer of the Freeform environment, and our team has been conducting research within the context of Freeform for more than 5 years. As such, we have developed understandings and expectations regarding its effects on student experience. We have no doubt that, despite our best efforts to bracket our assumptions (Fischer, 2009), our research and later findings will be colored by these previous research efforts. We recommend that readers contextualize this study within prior findings as they interpret and apply our results to their own contexts.

Our current interest in students’ experience and agency has taken years to develop. Research regarding Freeform has slowly shifted from early efforts centered on program development and evaluation to a more nuanced, intentional, and empathetic engagement with student experience. As tenure-track faculty (all PUWL faculty members were tenured at the time of writing), researchers, and former undergraduate students ourselves, we are each invested in the effort to better understand and improve the learning experiences of engineering undergraduates. However, due to our positions as paid researchers at a research-intensive university, we necessarily inhabit a position of privilege and perceived authority in our interactions with student (and perhaps faculty) participants. In addition, most of our team identifies as white or as men, labels which bear their own inherent privilege. The ramifications of this privilege and perceived power naturally extend to the collection of data and the creation of our research products, as the findings depicted here are interpreted and represented based on our own academic insight. The data represented here have been filtered multiple times, and in multiple ways, through our own scholarly lens—a fact which should be considered when engaging with this study. Though we have tried to empathize with the experiences described here as best we can, and to depict stories in an authentic manner, our analysis cannot relate the unmediated voices of our student participants.

Findings

The results of this study are presented in two parts. First, we describe our findings at each institution, representing student experiences in the context of distinct implementations of blended learning. Second, we compare student experiences and themes across institutions, looking not only at students’ reported experiences, but also interpreting how they acted and interacted within their context and with the actors around them. Through this, we establish a cross-institutional perspective on the implementation of blended learning in the formation of individual students’ learning networks.

McGill University (McGill)

Student interviews were collected during Freeform’s first semester at McGill University. According to student participants, dynamics was notorious within the McGill student community for the course’s difficulty. Many students reported that the course had high attrition, even saying that dynamics acted as a weed-out course for the ME program. While dynamics is widely acknowledged in literature to be challenging (Goldfinch et al., 2008; Martín-Blas et al., 2010), the number of interviewees who took time to call out their course’s difficulty was surprising. Most interviewees contextualized their experience in the Freeform environment by juxtaposing their personal, positive stories against this preexisting narrative of failure. In our first theme from McGill (as shown in Table 3), many students tied current successes in the course to the new structure brought to their dynamics curriculum. The blended approach and its intentionally aligned resources were often credited with students’ perceived improvements over their prior expectations.

Table 3 McGill institutional analysis—Theme 1

Students often highlighted their individually accessible course resources and their instructor as the most important contributors to their counter-narrative experiences of success, leading to a second theme (see Table 4) that builds upon the content of the first. Each of the students’ sources of help took on a distinct role: course resources provided students with a unique opportunity for personal growth, and their instructor facilitated the use of those resources by incorporating them into students’ in-class experiences via active and collaborative pedagogies.

Table 4 McGill institutional analysis—Theme 2

The majority of student participants at McGill were heavily invested in their use of out-of-class learning resources, seeing these resources as an opportunity to improve and advance their learning. Few had previously experienced similar blended learning environments, but this did not prevent them from trying to take full advantage of their resources by seeking out new opportunities for engagement. Many students reported that they used a wide variety of resources to find help and aid learning outside of class, and that the experience had taught them new lessons about how to learn and study effectively. Some went so far as to seek out additional non-Freeform resources by using dynamics textbooks from prior years, public online discussion forums, past exams, and student discussions on the PUWL course blog.

Interestingly, few students brought up their instructor as an individually accessible learning resource. The course instructor was more often portrayed as a facilitator of resource engagement rather than as a resource themselves. Many students highlighted their instructor’s efforts to encourage collaboration and engagement with educational technology, but few reported that out-of-class interactions with their instructor were particularly helpful, or even accessible, leading to our final theme from McGill University (see Table 5). However, those students who did report using their instructor as a learning resource were eager to share their experience. Two students reported using their instructor as a resource on a regular (at the least, weekly) basis, making them a somewhat vocal minority within our sample. Similarly vocal minorities were not present in the other themes from our McGill dataset.

Table 5 McGill institutional analysis—Theme 3

Purdue University—Northwest (PUNW)

The impact of extracurricular influences on the experience of students at PUNW was by far the strongest out of any of the institutions studied, laying out a compelling narrative for our first theme despite low student participation (see Table 6). During an early visit, an informal poll during class indicated that more than 90% of the students in the course were working 40-h weeks on top of their academic obligations. This included jobs inside and outside of engineering: some were taken as internships, while others were simply income in unrelated areas used to pay for tuition. Our study at PUNW not only occurred during the first instantiation of Freeform at the institution, but also during a Summer semester. As such, this extracurricular influence may have been, in part, due to the timing of data collection. However, from our interviews it appears that students at PUNW regularly expected to have home and work obligations outside of their academic load. Our only participant who was not working a full-time job had in fact lost their job several months prior and had yet to find a new position.

Table 6 PUNW institutional analysis—Theme 1

Students at PUNW were quick to notice when they encountered difficulties or barriers to learning that they deemed to be unnecessary or avoidable. Accompanying their personal needs for expediency when doing academic work, students were quick to point out teaching methods or even learning resources which they considered to be counterproductive, difficult to understand, or misaligned with the course or its assessments. Collaborative learning opportunities were often discussed in this context. In-class collaborative learning often required the instructor to move between multiple groups of students to provide them with targeted advice and feedback, a process which limited the number of students they could provide instruction to at any given time. At home, students were often forced to do homework at odd hours, or on a limited timetable, making it impossible for them to wait for responses to their discussion forum posts, emails, or even text messages.

However, every interviewee also highlighted how the broad range of resources provided by the Freeform environment helped them expediently complete homework assignments and allowed them to address questions as needed outside of class. Due to their work and school schedules, many students were forced to do homework late into the night. The solution videos and other easily accessible learning resources allowed students to find the help they needed, regardless of the time of day.

The videos helped in other ways, as three of the four interviewees discussed how the lecture example videos changed the ways they learned while in-class as well. Two discussed how the example videos could be used to help students overcome difficulties during collaborative in-class activities. Sometimes as the instructor moved between groups, multiple groups of students could require help simultaneously. Using the lecture example videos to provide guidance allowed collaborative efforts to move forward without requiring the instructor to personally address every question. Another student mentioned how the videos could be used to effectively make up for gaps in students’ concentration during lecture or to help students catch up to their peers after missing a day of class. Having an out-of-class instructional resource freed this student to pay attention more effectively during lecture and to make up for any incidental absences due to factors outside their control. Collectively, this interplay between in-class and out-of-class learning experiences gave rise to our second theme from PUNW, as shown in Table 7.

Table 7 PUNW institutional analysis—Theme 2

Trine University (Trine)

Our student participants at Trine talked extensively about the relationships they had with their institution and its members. In most cases, students took time to highlight the supportive and community-oriented culture cultivated at Trine. This broader sense of community was one of the biggest factors that initially drew our participants to the university and was frequently cited as a strong contributor to their overall academic success, giving rise to our first theme from Trine University (see Table 8). This was even true for participants who did not identify as particularly collaborative learners. In fact, several students who discussed the Trine community as an important aspect of their current learning experience preferred to work alone when outside of class. Even these students discussed the importance of the close interactions and personal connections, with instructors and peers alike, that had been fostered by the small campus environment at Trine. All students, to some extent, discussed the intentional culture of community support that their institution embodied.

Table 8 Trine institutional analysis—Theme 1

Although the semester included in this study was not the first time Freeform had been used to teach dynamics at Trine, neither of the instructors at Trine had used the blended environment previously. The blended resources, pedagogical approaches, and even notation were largely unfamiliar to the instructors themselves. In contrast, the community-focused culture at Trine meant that students were very familiar with their instructors and comfortable looking to their instructors for help both inside and outside the classroom. Students described faculty as “approachable”, “caring”, and “available”, conveying a general expectation that Trine faculty were ready and willing to engage with their students to help them to succeed. As a result, many students expressed negative emotions regarding outside influences that they felt inhibited their instructor’s ability to teach. In the case of our research, this included any perceived impositions originating from the Freeform environment or its developers from PUWL. Our second theme (see Table 9) combines these observations—highlighting both the value of the students’ instructors, and the frustration that grew from seeing those instructors constrained by an outside influence.

Table 9 Trine institutional analysis—Theme 2

However, this does not mean that students refrained from using their new resources. Rather, students appeared to have a clear set of priorities when it came to studying outside of class, and they adapted their use of Freeform resources to suit those needs, giving rise to our final theme from Trine University (see Table 10). Students who highly valued interactive sources of help, including from faculty, reported that they continued to prioritize those sources of help despite the changes in their learning environment. Some students reported attending office hours of faculty other than their instructor when seeking additional help. Others reported attending office hours as a group or seeking help during odd times of the day or night, continuing with the same study behaviors they employed previously. However, students who worked alone for a significant portion of their study time frequently mentioned how the online solution videos (both the lecture examples and homework solution videos) empowered them to make the most of their time while studying alone, even if their overall study strategies did not exhibit any drastic changes. Many students also reported that their out-of-class and in-class engagement with learning resources mutually influenced one another. Students’ in-class collaborative learning activities, for example, fostered out-of-class collaborative engagement between peers by forging new social connections. As a result, the suite of blended resources provided by the Freeform environment empowered and enhanced the learning behaviors of many students, but was not capable of addressing all of the students’ learning priorities—especially those tied directly to their academic community.

Table 10 Trine institutional analysis—Theme 3

Purdue University—West Lafayette (PUWL)

The majority of the students interviewed at PUWL identified their institution, and in no small part themselves, by the challenging nature of their degree program. Many talked extensively about the difficulty of their courses, the challenges posed to them by their professors, and the perceived value of hard work, giving shape to our first theme from PUWL (see Table 11). Student interviewees often communicated how they had to work to live up to the expectations of their program and the dedication it took to pursue success. Many students also communicated that the community around them, including administration, faculty, and especially their peers, was very supportive and willing to help. Although our interviewees rarely claimed to know all the students in their courses and often professed to be primarily individual learners, most interviewees were quick to compliment their instructors and offered help to peers when needed throughout the semester.

Table 11 PUWL institutional analysis—Theme 1

Students at PUWL emphasized how useful they considered their Freeform resources to be. In fact, most students’ discussions of their resources were not limited to the context of completing homework assignments; students described how their resources helped them to better learn dynamics concepts and content, or further helped them to become better learners. When rising to the challenge posed by their curriculum, some students were accustomed to using outside resources such as alternative textbooks, online videos, or even educational websites like Coursera or Khan Academy to reinforce their learning. The blended resources provided by the Freeform environment addressed these students’ needs for additional learning opportunities in a similar way, but used content that was intentionally aligned with their actual course material. Many students found that the multiple representations of dynamics content offered by their blended and collaborative resources helped them to better understand and retain their engineering knowledge, which sets the tone for our second theme from PUWL (see Table 12). Although some students reported that it took time to get acquainted with their new blended resources, this investment was widely considered to be worthwhile.

Table 12 PUWL institutional analysis—Theme 2

Many students reported that Freeform connected in- and out-of-class learning experiences together in new ways, leading to our third and final theme from PUWL (see Table 13). This was most prevalent in discussions of how resources could address gaps in understanding. If certain topics were not covered during lecture, or if errors or lapses in concentration inhibited in-class learning, students felt confident that they could address their learning needs outside of class. Even complex topics that would typically require asking questions to their professor or peers could be solved at odd hours of the night through use of the online discussion forums or lecture videos. Instructors took time early in the course to encourage students to use all of the resources available, and many students subsequently benefitted from intentional engagement with blended and collaborative sources of help. Although each student could make their own decisions regarding what resources to use and why, the fact that each resource was intentionally aligned to the dynamics curriculum ensured that many different approaches to studying could be productive.

Table 13 PUWL institutional analysis—Theme 3

Theoretical interpretations: cross-institutional analysis

In interpreting our findings across the universities in this dataset, we identified actors who influenced the development of students’ learning networks and explored the roles those actors played in implementation. Our interpretation of these findings resulted in an examination of four primary actors: the institutional context, the subject of the implementation (in this case, the Freeform environment), the instructors, and the students themselves. Each of these actors influenced students’ resource usage decisions, thus impacting the development of each student’s learning network and study behaviors.

The role of the (institutional) context

The institutional context influenced students’ priorities and affective perceptions of their engagement with Freeform resources. Students at PUNW, for example, were heavily influenced by extracurricular pressures and responsibilities. This led them to prioritize resources that helped address gaps in knowledge and expeditiously solve homework problems. Although some participants at each of the four institutions reported a similar need for expediency, the students at PUNW spoke extensively about time constraints, the need to do homework quickly, and the importance of being able to find help when needed.

In contrast, very few students at McGill brought up extracurricular pressures. Instead, the majority of students viewed their new resources as an additional opportunity for personal growth. Students avidly pursued resources to support their own academic success, motivated by existing narratives of failure and pressures to perform reported at their institution. The institutional context also de-emphasized the instructors’ availability as an out-of-class resource, all of which led students at McGill to be especially invested in individually accessible learning materials on the blog or elsewhere. McGill was the only partner institution where multiple students reported using Visualizing Mechanics videos (pre-recorded mechanics demonstrations), and a large number of McGill participants reported using the PUWL blog as a supplemental resource. Students highlighted how they were able to approach dynamics concepts from many different perspectives by utilizing multiple learning resources, which in turn enhanced their learning and helped them to perform better on assessments.

Students at PUNW reported that they appreciated the structure that the Freeform environment imposed, highlighting how it tied together their in-class and online learning opportunities in a way that complemented the needs of their extracurricular responsibilities. Students at Trine, on the other hand, often discussed the imposition of Freeform’s structure in a negative light. The culture at Trine promotes personal engagement and individual investment, and students at Trine were accustomed to personalized instruction and interaction with professors. Perhaps as a result, the structure of the Freeform environment felt overly restrictive. Students felt as though the Freeform environment was preventing their instructors from adapting the course and its content to the needs of the students and the values of Trine.

Students’ implementation experiences at Trine were also complicated due to the use of an institutional LMS that took precedence over the Freeform blog. Students reported that many (but not all) of the videos that would typically be available on the Freeform blog site were also posted to their course’s LMS page. Some students even reported that they were not aware of the blog’s existence, as they had used the LMS to address all of their online study needs.

This could be contrasted with PUWL and its integrated instantiation of Freeform, as the environment had originally been developed for use at that institution. The resources and, more specifically, the online presence of the Freeform environment were better established at PUWL than at any other institution. In fact, when students at other partner institutions searched the internet using phrases such as “Freeform” or “Freeform dynamics”, the PUWL blog site would be the first to appear, not the blog site created for their own institution. In light of the maturity of this institutional implementation and its resulting online presence, many students at Trine, McGill, and even PUNW reported finding the PUWL blog and using its discussion forum as a source of help. Thus, these students had access to an additional resource. One discussion forum came to be used, albeit in a limited capacity, at all four of the participating institutions.

The role of the Freeform environment

Integrating students’ responses from all four participant institutions, the distinct “structure” of the Freeform environment was almost universally praised. However, students used the word “structure” to refer to multiple concepts. Some used “structure” to refer to the alignment and synchronicity between the various resources in the Freeform environment, describing the ways in which the lecturebook, videos, and even in-class lectures can be used to supplement, enhance, and interact with one another. Other students used the word “structure” to refer to the dynamics curriculum itself, describing the conceptual progression between chapters over the semester’s duration. This aspect of Freeform’s structure was most obviously represented through the content of the lecturebook, which provided a clear breakdown of dynamics concepts and acted as the students’ textbook (and notebook) throughout the semester.

Both aspects of “structure” drove students’ engagement with Freeform’s resources, but the reasoning behind those engagement behaviors varied. Students who promoted the alignment between the various aspects of the learning environment spoke extensively about how they could use their resources to complement each other, supplementing their learning and addressing misconceptions as they arose. For example, many students used the lecture example videos to clarify questions that arose during lecture. Others reported reading the lecturebook as a means of better preparing for class, or using the online discussion forums to further clarify dynamics concepts or problem-solving processes.

In comparison, students who promoted the structure of the course in terms of its curriculum and content discussed how it taught them both what they needed to learn and how they could go about learning it. Dynamics concepts built on each other sequentially and, rather than directly influencing which resources students engaged with, this informed how those resources were used to address relevant topics or ideas. This helped students to better understand underlying dynamics concepts, perform well on assessments, and develop a robust understanding of core engineering content knowledge.

The single factor that seemed to best facilitate both of these positive aspects of “structure” was the sheer variety of resources provided by the Freeform environment. Students who valued having multiple representations of their course content benefitted from a wide range of physical and digital resources (for more on the role of multiple representations see Ainsworth, 1999). Students who valued having a clear progression of concepts benefitted from the ease with which they were able to find the one or two resources that best addressed their needs. In either case, variety (in resources) helped to foster students’ learning activity, allowing them greater flexibility and confidence in shaping their own, personalized learning networks.

The role of the instructor

Instructors played an important role in guiding, encouraging, and facilitating students’ use of learning resources. This included resources that were accessed both collaboratively and individually, whether inside or outside of the classroom. Every institution had students who discussed a mutual influence between their in-class learning activities and their out-of-class studying. Instructors contributed to this process by introducing their students to innovative resources, facilitating collaborative and social engagement, and establishing norms for students’ engagement behaviors.

Students’ positive stories of their instructor’s facilitative actions most often described well-organized and highly structured in-class learning activities. For example, the collaborative in-class experiences at PUWL and McGill were well-received, with students highlighting how well they had been prepared for the activities and how easily they were able to find help from their instructor when needed. This can be contrasted with the students’ descriptions from PUNW, where some students felt ill-prepared or unsupported during their collaborative in-class activities. Unstructured deployment of innovations was typically frustrating to students, while highly structured activities with ample preparation and support were better received. This was especially true for interactive and collaborative experiences, such as those encountered during in-class group work or online via the discussion forums. This follows the same trends seen in prior literature, which demonstrate how students in blended learning prefer learning resources that are well organized and for which they have been adequately prepared (Finelli & Borrego, 2020; Martin et al., 2020; Taylor et al., 2018a).

The importance of instructors’ planning and structure can also be seen through their impact on the perceived alignment among resources and learning opportunities encountered in the course. Alignment depended heavily on the actions of the instructor, e.g., how they utilized the lecturebook material, what variable notation they used when writing, and whether they incorporated digital resources like videos into the classroom. Unfortunately, it appears that instructors’ influence on perceived alignment was most obvious in its absence. Students were quick to notice when there were inconsistencies between the content presented in-class and the content they encountered using other resources in Freeform. Such inconsistencies forced students to make decisions regarding which sources they could trust, or which way was the “right way” to learn the course material, drastically complicating the engagement decisions they had to make while studying.

The role of the students

At each institution, we saw examples of students who chose not to implement resources that are typically considered to be essential parts of the Freeform learning environment. This is not necessarily surprising, as previous works found numerous distinct resource usage patterns and behaviors among their student participants. However, seeing vast differences in student engagement within populations at multiple institutions has clarified and emphasized the important role that students take in determining their own, personalized use of the innovations they encounter.

For example, most students at McGill reported overwhelmingly positive experiences with, and perceptions of, the Freeform lecturebook. Despite an institutional narrative of failure, many students commented on the utility of the lecturebook and its accompanying lecture example videos. These comments could be simple and direct, such as “…just follow the lecturebook and you'll be fine” (McGill: Student 8), but some were more comprehensive.

“It's, like, I use it almost every day. I just read it over and over. To prepare myself for midterm assignments, I'm going to prepare myself with the dynamics [lecturebook]. For the final as well. I also use it all the time during class… Yeah, we call the dynamics book the Bible. It's really what we base everything [on]... Every time we have a question we look at the book, you know?” (McGill: Student 9)

Despite what would appear to be overwhelming support for the lecturebook, some students still chose not to use it. Upon reflection, these students could even acknowledge that they differed from the norm, saying for example, “Many people do write directly into the lecture book, but I personally write into a notebook of my own.” (McGill: Student 4)

Looking more broadly, there was at least one student from every institution who reported not using the Freeform lecturebook, which prior work has identified as the most popular resource among students at PUWL (Stites et al., 2020). This illustrates that students at all institutions had agency to determine the people, practices, and resources that made up their own personal learning networks. It also highlights the importance of students’ ability to choose not to adopt, and thus not to implement, some of the resources they encounter. Students determined their own engagement behavior and study strategies in light of, or sometimes in spite of, any outside influence from their instructor, their institution, their peers, or otherwise.

In discussing their decisions and resulting behaviors, students often talked about their individual resources as filling a specific role: a role which either did or did not align with their own understanding of their individual needs and objectives. In addition, different students could perceive the same resources as filling different roles. For example, Student 13 from Trine chose not to purchase a lecturebook, justifying their choice by specifically referencing notetaking.

“Actually, I think this is a disadvantage for me. I'm a note taker. I learn by taking notes and, not regurgitating, but modifying what is put on the board and representing what it means in my personal perspective. In this class, it's actually been harder for me because it's filling out notes in that book, in the dynamics [PUWL] book, and it's less of me modifying and thinking.” (Trine: Student 13)

The student above determined that the lecturebook held little value for them due to how closely their notetaking had to align with their lecture content. In contrast, most of our interviewees valued the lecturebook for this same quality, highlighting how the lecturebook could interact with the pre-written lecture example videos to reinforce learning outside of class. Of course, students’ interactions with the lecturebook serve as only one example. Many students reported avoiding particular resources such as the online discussion forums, peer collaboration, or instructor office hours for reasons tied to their own learning goals and preferences.

However, far more often we encountered examples at each institution of students who went out of their way to engage with new resources or opportunities for learning—students who adopted, adapted, and implemented the innovations they encountered in the context of their own study behaviors. These students changed the resources they engaged with, or changed the ways in which they engaged with their resources, in an effort to improve their own learning. Many students reported “learning how to learn” as a result of their self-directed engagement with the Freeform environment by not only tailoring the resources of the environment to fit their needs, but also changing their approach to learning in order to better interact with new innovations. This metacognitive engagement is a hallmark of self-directed learning (Jossberger et al., 2010) and was demonstrated in students’ use and non-use of their blended learning resources.

Discussion

Practical observations

Students across every institution expressed appreciation for the “structure” of the Freeform environment. We had observed this theme in previous work on the PUWL campus (Evenhouse et al., 2020), but seeing its emphasis across multiple institutions reflects broader research on the implementation of blended learning. Taylor and Newton (2013) collected data on an institution-wide implementation of blended learning, reporting that students appreciated “well-structured subjects and relevant and accessible learning resources” (p. 56), among a variety of other factors. Their findings emphasized a need for implementors of blended learning to focus on “learning design and learning support, rather than technologies” (p. 57), and our findings reinforce this conclusion. Instructors at each institution in this study employed different technologies while teaching, and students employed a wide range of digital learning resources in their studies. However, students most appreciated those resources that aligned closely with the content of the course and their understanding of course assessments, highlighting the importance of students’ perceptions of resource relevance (i.e., perceived value or perceived usefulness) in both the design of resources and their implementation.

While this study has re-emphasized the importance of alignment and structure in blended learning environments, we are not suggesting that local adaptations and changes are detrimental to the implementation process. Rather, adaptations could help to reinforce alignment by correcting for contextual factors that the original developers could not account for. Our findings from Trine University testify to this: forcing instructors to adopt Freeform and its practices with an extremely high degree of fidelity would directly conflict with salient aspects of the institution’s prevailing culture. Students at Trine have an understanding that their instructors are both competent and caring, and that they will adjust instruction to their students’ benefit. Placing strict limitations on instructors’ ability to adapt would compromise the efficacy of the students’ most highly valued learning resources and could adversely affect their experience in the course.

A new perspective on implementation

The ANT-informed findings in this study illustrate the value of viewing students as individual, controlling actors within the scope of their own learning networks. Based on our interpretation and a broader review of the literature, students in this study went through individual processes of implementation by adopting and adapting the various innovations they encountered. This would suggest that the success of institutional implementations of blended learning depends, in part, on the implementation experiences and decision-making processes of individual students.

In a prior study, we identified how the perceived availability of a resource, and the perceived relevancy of the help provided by that resource, directed students’ behavior in the Freeform environment, tying our participants’ engagement decisions in a resource-rich environment to contemporary models of student resource adoption (Evenhouse et al., 2020). Based on our findings here, the ways in which other actors influence students’ engagement behaviors seem to extend beyond the individual adoption decisions motivated by perceptions of availability and relevancy, or ease-of-use and usefulness. At the very least, this study hints at a complicated, contextual, and perhaps idiosyncratic web of factors which mediates the formation of students’ perceptions of, engagement with, and use of learning resources. Thus, this study supports using a nuanced, contextually informed approach to understanding the implementation experiences of undergraduate students in the presence of educational innovations. As students in this study described their approaches to learning both inside and outside of the classroom, they engaged with a variety of actors who impacted their decision-making processes and subsequent behaviors. In future work, researchers may benefit not only from examining the perceptions and self-directed behaviors of individual students, but also from exploring how these experiences are embedded within a larger network of concerned actors: human, institutional, or otherwise.

These conclusions call for an expansion of the ways we conceptualize processes of pedagogical change in undergraduate STEM education. Taking an instructor-centric approach has proven useful, and systematic reviews of the literature have demonstrated both the nuance and complexity of the implementation efforts of faculty (e.g., Liu et al., 2020). However, respecting students as independent actors in the implementation of innovations allows for a similarly nuanced discussion of students’ own engagement decisions and their subsequent implications for educational change efforts.

A new role for instructors

ANT provides language to better discuss and situate these findings. In describing the implications of utilizing an ANT perspective, Latour writes, “…with ANT: the theory of action itself is different, since we are now interested in mediators making other mediators do things.” (Latour, 2005: p. 216–217, emphasis ours). In conceptualizing the role of instructors as mediators influencing mediators, and thereby acknowledging the agency of students in the implementation process, we gain new insight into what instructors can do to help students succeed in the innovative classroom. Rather than snapping students and learning resources together like puzzle pieces, instructors can change the way in which innovations interact with students and, in turn, direct the ways in which students interact with the innovations they encounter—an actor facilitating actors, and an implementor facilitating implementors.

The idea of faculty as facilitators of students’ in-class engagement and growth as learners is not a new concept. The facilitative efforts of faculty come up repeatedly in the contexts of blended learning (Taylor et al., 2018a), collaborative engagement (Martin et al., 2020), and the use of online resources (Lewis & Abdul-Hamid, 2006). As such, we emphasize that students’ own processes of decision-making, adoption, and subsequent adaptation of new learning opportunities are especially important in light of the unique affordances and requirements of blended learning (Gedik, Kiraz, & Özden, 2012). In fact, Tang and Chaw (2016) concluded that students’ ability to adapt in the presence of technological innovations is a significant predictor of whether individual students are ready to engage with blended learning. Unfortunately, the role that faculty can play in shaping such adaptations remains relatively unexplored.

With students not only adopting new technologies, but engaging in entire processes of individual implementation, faculty could be said to hold an additional responsibility: to act as facilitators of their students’ implementation processes and as catalysts for their metacognitive growth as learners. Though there are pedagogical resources and frameworks that can help instructors to foster such metacognitive growth (Gamby & Bauer, 2022; Van Laer & Elen, 2020), few have been intentionally extended to inform the broader implementation and use of other research-based educational innovations. ANT can help us understand how the actions of an instructor are connected not only to statistical measures of their students’ engagement and performance, but also to the actions of their students as agents in the implementation process. Future work should interrogate this role of instructors as facilitators of student action, not only in terms of pedagogy, but in terms of student perceptions and behaviors in their personal interactions with educational innovations.

Conclusion

Using thematic analysis, we examined students’ experiences of implementation in a blended learning environment applied to Mechanical Engineering dynamics courses across four universities. In contrast to a more typically instructor-centered perspective on implementation, we examined the role(s) of students as actors and agents of change who shape the adoption and subsequent expression of learning innovations. By intentionally examining students’ use of blended resources as a process of implementation through the lens of ANT, we described a number of actors who influenced our participants’ learning behaviors. The institutional context, the innovation being implemented, and the course instructors all contributed to the formation of students’ personal learning networks.

Students reported making informed and reflective decisions regarding which blended resources to use, sometimes rejecting popular resources due to their own personal learning needs. The other actors in the network helped to inform, shape, and facilitate students’ engagement behaviors. For example, many students were motivated to engage with new resources when there was an overarching structure, or obvious alignment, tying the content of their course, their assessments, and their blended resources together. Likewise, instructors could help to facilitate students’ engagement with new resources by utilizing or promoting them in class, demonstrating the structure or alignment of the course environment, or by maintaining clear channels of communication with their students for feedback, guidance, and support. Many students reported that the innovative resources they encountered helped them to “learn how to learn”, indicating that they not only adapted the resources they encountered to fit their needs, but also adapted their own habits and behaviors to better utilize resources that they perceived to be valuable and effective.

In conclusion, we believe there is value in further investigating the active role that students play in the implementation of innovations, even when those innovations are not explicitly participatory in nature. Simply examining student experiences with, and perceptions of, educational innovations may be casting students in too passive a role. Developing a more nuanced understanding of students’ behaviors and how they are influenced by other actors in their learning network could help instructors to not only successfully implement innovations, but also to better empower their students to succeed in the innovative classroom.

Availability of data and materials

The data sets generated and analyzed during this study are not publicly available due to the identifiable nature of the data.

References

  • Aarons, G., Askew, R. A., Green, A., Yalon, A. J., Reeder, K., & Palinkas, L. A. (2019). Evidence-based practice adaptation during large-scale implementation: A taxonomy of process and content adaptations. Journal of Children’s Services, 14(2), 61–77. https://doi.org/10.1108/JCS-02-2018-0003

  • Adinda, D., & Mohib, N. (2020). Teaching and instructional design approaches to enhance students’ self-directed learning in blended learning environments. Electronic Journal of eLearning, 18(2), 162–174. https://doi.org/10.34190/EJEL.20.18.2.005

  • Aheto, S.-P. K. (2017). Patterns of the use of technology by students in Higher Education. PhD Dissertation. University of Cape Coast, Department of Mathematics and Science Education. Retrieved from http://etd.cput.ac.za/handle/20.500.11838/2541

  • Ainsworth, S. (1999). The functions of multiple representations. Computers & Education, 33(2–3), 131–152. https://doi.org/10.1016/S0360-1315(99)00029-9

  • Arif, S., Sidek, S., & Bakar, N. A. (2017). Actor-network theory (ANT) as an interpretative tool to understand the use of online technologies: A review. Asian Journal of Information Technology, 16(1), 61–68. https://doi.org/10.36478/ajit.2017.61.68

  • Ausburn, L. J. (2004). Course design elements most valued by adult learners in blended online education environments: An American perspective. Educational Media International, 41(4), 327–337. https://doi.org/10.1080/0952398042000314820

  • Barkley, E. F., Major, C. H., & Cross, K. P. (2014). Collaborative Learning Techniques: A Handbook for College Faculty (2nd ed.). San Francisco, CA.

  • Baumann, A. A., Cabassa, L. J., & Stirman S. W. (2017). Adaptation in dissemination and implementation science. In Brownson, R. C., Colditz, G. A., & Proctor, E. K. (Eds.) Dissemination and Implementation Research in Health: Translating Science to Practice. Ch 17. Oxford University Press: New York, NY. doi: https://doi.org/10.1093/oso/9780190683214.003.0017

  • Besterfield-Sacre, M., Cox, M. F., Borrego, M., Beddoes, K., & Zhu, J. (2014). Changing engineering education: Views of US faculty, chairs, and deans. Journal of Engineering Education, 103(2), 193–219. https://doi.org/10.1002/jee.20043

  • Borrego, M., Cutler, S., Prince, M., Henderson, C., & Froyd, J. E. (2013). Fidelity of implementation of Research-Based Instructional Strategies (RBIS) in engineering science courses. Journal of Engineering Education, 102(3), 394–425. https://doi.org/10.1002/jee.20020

  • Borrego, M., & Henderson, C. (2014). Increasing the use of evidence-based teaching in STEM higher education: A comparison of eight change strategies. Journal of Engineering Education, 103(2), 220–252. https://doi.org/10.1002/jee.20040

  • Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa

  • Brown, M. G. (2016). Blended instructional practice: A review of the empirical literature on instructors’ adoption and use of online tools in face-to-face teaching. The Internet and Higher Education, 31(1), 1–10. https://doi.org/10.1016/j.iheduc.2016.05.001

  • Buhl, M. (2017). Students and teachers as developers of visual learning designs with augmented reality for visual arts education. In Mesquita, A., & Peres, P. (eds.). Proceedings of the 16th European Conference on e-Learning. Academic Conferences and Publishing International. 94–101. Retrieved from https://vbn.aau.dk/en/publications/students-and-teachers-as-developers-of-visual-learning-designs-wi

  • Carnegie Foundation for the Advancement of Teaching. (2018). The Carnegie Classification of Institutions of Higher Learning (2018th ed.). Bloomington, IN.

  • Christie, M., & de Graaff, E. (2017). The philosophical and pedagogical underpinnings of active learning in engineering education. European Journal of Engineering Education, 42(1), 5–16. https://doi.org/10.1080/03043797.2016.1254160

  • Cohen, D. K., & Ball, D. L. (2007). Educational innovation and the problem of scale. In B. Schneider & S.-K. McDonald (Eds.), Scale-up in Education: Ideas in Principle (pp. 19–37). Rowman & Littlefield Publishers, INC.

  • Creswell, J. W., & Miller, D. L. (2000). Determining validity in qualitative inquiry. Theory into Practice, 39(3), 124–130. https://doi.org/10.1207/s15430421tip3903_2

  • Dancy, M. H., & Henderson. C. (2008). Barriers and promises in STEM reform. In National Academies of Science Board on Science Education: Evidence on Promising Practices in Undergraduate STEM Education Workshop 2. Retrieved from https://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_072636.pdf

  • Davis, F. D. (1993). User Acceptance of information technology: System characteristics, user perceptions and behavioral impacts. International Journal of Man-Machine Studies, 38(3), 475–487. https://doi.org/10.1006/imms.1993.1022

  • De George-Walker, L., & Keeffe, M. (2010). Self-determined blended learning: A case study of blended learning design. Higher Education Research & Development, 29(1), 1–13. https://doi.org/10.1080/07294360903277380

  • Dillenbourg, P. (1999). What do you mean by “Collaborative Learning”? In P. Dillenbourg (Ed.), Collaborative-learning: Cognitive and Computational Approaches (pp. 1–19). Emerald Group Publishing Limited.

  • Ekebergh, M. (2009). Developing a didactic method that emphasizes lifeworld as a basis for learning. Reflective Practice, 10(1), 51–63. https://doi.org/10.1080/14623940802652789

  • Englund, C., Olofsson, A. D., & Price, L. (2017). Teaching with technology in higher education: Understanding conceptual change and development in practice. Higher Education Research & Development, 36(1), 73–87. https://doi.org/10.1080/07294360.2016.1171300

  • Evenhouse, D., Kandakatla, R., Berger, E., Rhoads, J. F., & DeBoer, J. (2020). Motivators and barriers in undergraduate mechanical engineering students’ use of learning resources. European Journal of Engineering Education. https://doi.org/10.1080/03043797.2020.1736990

  • Evenhouse, D., Patel, N., Gerschutz, M., Stites, N. A., Rhoads, J. F., Berger, E., & DeBoer, J. (2018). Perspectives on pedagogical change: Instructor and student experiences of a newly implemented undergraduate engineering dynamics curriculum. European Journal of Engineering Education, 43(5), 664–678. https://doi.org/10.1080/03043797.2017.1397605

  • Fenwick, T. (2011). Reading educational reform with actor network theory: Fluid spaces, otherings, and ambivalences. Educational Philosophy and Theory, 43(1), 114–134. https://doi.org/10.1111/j.1469-5812.2009.00609.x

  • Fenwick, T., & Edwards, R. (2019). Introduction: How is actor-network theory contributing to educational research? A critical revisitation. In T. Fenwick & R. Edwards (Eds.), Revisiting Actor-Network Theory in Education. New York: Routledge.

  • Finelli, C. J., & Borrego, M. (2020). Evidence-based strategies to reduce student resistance to active learning. In J. Mintzes & E. Walter (Eds.), Active Learning in College Science. Cham: Springer.

  • Fischer, C. T. (2009). Bracketing in qualitative research: Conceptual and practical matters. Psychotherapy Research, 19(4–5), 583–590. https://doi.org/10.1080/10503300902798375

  • Francis, R., & Shannon, S. J. (2013). Engaging with blended learning to improve students’ learning outcomes. European Journal of Engineering Education, 38(4), 359–369. https://doi.org/10.1080/03043797.2013.766679

  • Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415. https://doi.org/10.1073/pnas.1319030111

  • Fugard, A. J. B., & Potts, H. W. W. (2015). Supporting thinking on sample sizes for thematic analyses: A quantitative tool. International Journal of Social Research Methodology, 18(6), 669–684. https://doi.org/10.1080/13645579.2015.1005453

  • Gamby, S., & Bauer, C. F. (2022). Beyond “study skills”: a curriculum-embedded framework for metacognitive development in a college chemistry course. International Journal of STEM Education, 9, 61. https://doi.org/10.1186/s40594-022-00376-6

  • Gedik, N., Kiraz, E., & Özden, M. Y. (2012). The optimum blend: Affordances and challenges of blended learning for students. Turkish Online Journal of Qualitative Inquiry, 3(3), 102–117. https://dergipark.org.tr/en/pub/tojqi/issue/21396/229377

  • Goldfinch, T., Carew, A., & Mccarthy, T. (2008). Improving learning in engineering mechanics: The significance of understanding. In AAEE - Annual Conference of Australasian Association for Engineering Education, edited by L. Mann, A. Thompson, and P. Howard, 1–6. Rockhampton, QLD: Faculty of Sciences, Engineering Health, CQUniversity. Retrieved from https://ro.uow.edu.au/engpapers/2626/

  • Halverson, L. R., Graham, C. R., Spring, K. J., Drysdale, J. S., & Henrie, C. R. (2014). A thematic analysis of the most highly cited scholarship in the first decade of blended learning research. Internet and Higher Education, 20(1), 20–34. https://doi.org/10.1016/j.iheduc.2013.09.004

  • Harty, C. (2010). Implementing innovation: Designers, users and actor-networks. Technology Analysis & Strategic Management, 22(3), 297–315. https://doi.org/10.1080/09537321003647339

  • Henderson, C., Beach, A., & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. Journal of Research in Science Teaching, 48(8), 952–984. https://doi.org/10.1002/tea.20439

  • Henderson, C., & Dancy, M. H. (2007). Barriers to the use of research-based instructional strategies: The influence of both individual and situational characteristics. Physical Review Special Topics - Physics Education Research, 3(2), 1–14. https://doi.org/10.1103/PHYSREVSTPER.3.020102

  • Henderson, C., & Dancy, M. H. (2008). Physics faculty and educational researchers: Divergent expectations as barriers to the diffusion of innovations. American Journal of Physics, 76(1), 79–91. https://doi.org/10.1119/1.2800352

  • Jamieson, L. H., & Lohmann, J. R. (2012). Innovation with Impact: Creating a Culture for Scholarly and Systematic Innovation in Engineering Education. American Society for Engineering Education.

  • Johri, A., & Olds, B. M. (2011). Situated Engineering learning: bridging engineering education research and the learning sciences. Journal of Engineering Education, 100(1), 151–185. https://doi.org/10.1002/j.2168-9830.2011.tb00007.x

  • Jossberger, H., Brand-Gruwel, S., Boshuizen, H., & van de Wiel, M. (2010). The challenge of self-directed and self-regulated learning in vocational education: A theoretical analysis and synthesis of requirements. Journal of Vocational Education and Training, 62(4), 415–440. https://doi.org/10.1080/13636820.2010.523479

  • Kandakatla, R., Goldenstein, A., Evenhouse, D. A., Berger, E. J., Rhoads, J. F., & DeBoer, J. (2018). MEERCat: A Case Study of How Faculty-led Research Initiatives Gave Rise to a Cross-departmental Research Center with Potential to Inform Local Policy. In 125th ASEE Annual Conference & Exposition. doi: https://doi.org/10.18260/1-2--30802

  • Kandakatla, R., Berger, E., Rhoads, J. F., & DeBoer, J. (2020). Student Perspectives on the Learning Resources in an Active, Blended, and Collaborative (ABC) Pedagogical Environment. International Journal of Engineering Pedagogy, 10(2), 7–31. https://doi.org/10.3991/ijep.v10i2.11606

  • Kezar, A., Gehrke, S., & Elrod, S. (2015). Implicit Theories of Change as a Barrier to Change on College Campuses: An Examination of STEM Reform. The Review of Higher Education, 38(4), 479–506. https://doi.org/10.1353/rhe.2015.0026

  • Kintu, M. J., Zhu, C., & Kagambe, E. (2017). Blended learning effectiveness: The relationship between student characteristics, design features and outcomes. International Journal of Educational Technology in Higher Education, 14(7), 1–20. https://doi.org/10.1186/s41239-017-0043-4

  • Kirn, A., & Benson, L. (2018). Engineering students’ perceptions of problem solving and their future. Journal of Engineering Education, 107(1), 87–112. https://doi.org/10.1002/jee.20190

  • Latour, B. (1984). The powers of association. In J. Law (Ed.), The Sociological Review, Special Issue: Sociological Review Monograph Series: Power, Action and Belief. A New Sociology of Knowledge? (pp. 264–280). London: Routledge & Kegan Paul.

  • Latour, B. (2005). Reassembling the social: An introduction to actor network theory. Oxford University Press.

  • Lattuca, L. R., & Stark, J. (2009). Shaping the College Curriculum: Academic Plans in Context. Jossey-Bass.

  • Leonard-Barton, D. (1988). Implementation as mutual adaptation of technology and organization. Research Policy, 17, 251–267. https://doi.org/10.1016/0048-7333(88)90006-6

  • Lewis, C. C., & Abdul-Hamid, H. (2006). Implementing effective online teaching practices: Voices of exemplary faculty. Innovative Higher Education, 31(2), 83–98. https://doi.org/10.1007/s10755-006-9010-z

  • Litzinger, T. A., Wise, J. C., & Lee, S. H. (2005). Self-directed learning readiness among engineering undergraduate students. Journal of Engineering Education, 94(2), 215–221. https://doi.org/10.1002/j.2168-9830.2005.tb00842.x

  • Liu, Q., Geertshuis, S., & Grainger, R. (2020). Understanding academics’ adoption of learning technologies: A systematic review. Computers & Education, 151, 1–19. https://doi.org/10.1016/j.compedu.2020.103857

  • Luke, K. (2020). The pause/play button actor-network: Lecture capture recordings and (re)configuring multi-spatial learning practices. Interactive Learning Environments. https://doi.org/10.1080/10494820.2019.1706052

  • Makara, K. A., & Karabenick, S. A. (2013). Characterizing sources of academic help in the age of expanding educational technology: A new conceptual framework. In S. A. Karabenick & M. Puustinen (Eds.), Advances in Help Seeking Research and Applications: The Role of Emerging Technologies (pp. 37–72). Information Age Publishing.

  • Martin, F., Wang, C., & Sadaf, A. (2020). Facilitation matters: Instructor perception of helpfulness of facilitation strategies in online courses. Online Learning, 24(1), 28–49. https://doi.org/10.24059/olj.v24i1.1980

  • Martín-Blas, T., Seidel, L., & Serrano-Fernández, A. (2010). Enhancing force concept inventory diagnostics to identify dominant misconceptions in first-year engineering physics. European Journal of Engineering Education, 35(6), 597–606. https://doi.org/10.1080/03043797.2010.497552

  • McKenna, A. F., Froyd, J., & Litzinger, T. (2014). The complexities of transforming engineering higher education: Preparing for next steps. Journal of Engineering Education, 103(2), 188–192. https://doi.org/10.1002/jee.20039

  • Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2009). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. Washington, DC: U.S. Department of Education, Office of Planning, Evaluation, and Policy Development. Retrieved from https://www2.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf

  • Mirriahi, N., Alonzo, D., McIntyre, S., Kligyte, G., & Fox, B. (2015). Blended learning innovations: Leadership and change in one Australian institution. International Journal of Education and Development Using ICT, 11(1), 4–16.

  • Moullin, J. C., Dickson, K., Stadnick, N., Rabin, B., & Aarons, G. (2019). Systematic review of the exploration, preparation, implementation, sustainment (epis) framework. Implementation Science. https://doi.org/10.1186/s13012-018-0842-6

  • Noon, E. (2018). Interpretive phenomenological analysis: An appropriate methodology for educational research? Journal of Perspectives in Applied Academic Practice, 6(1), 75–83. https://doi.org/10.14297/jpaap.v6i1.304

  • Paledi, V. N. (2019). Perceived Actors and Factors for Sustaining M-learning in Higher Education: A South African Students Perspective. 2019 Open Innovations (OI): Cape Town, South Africa. 342–350, doi: https://doi.org/10.1109/OI.2019.8908200.

  • Pietkiewicz, I., & Smith, J. A. (2014). A Practical Guide to Using Interpretative Phenomenological Analysis in Qualitative Research Psychology. Psychological Journal, 20(1), 7–14. https://doi.org/10.14691/CPPJ.20.1.7

  • Pillai, S. (2017). An investigation of implementation, adoption and use of technology for enhancing students’ CoreLife Skills in a vocational institute: A Case Study informed by Actor-Network Theory. PhD Dissertation. Lancaster University: Department of Educational Research. doi: https://doi.org/10.17635/lancaster/thesis/45

  • Porter, W. W., Graham, C. R., Bodily, R., & Sandberg, D. (2016). A qualitative analysis of institutional drivers and barriers to blended learning adoption in higher education. Internet and Higher Education, 28(1), 17–27. https://doi.org/10.1016/j.iheduc.2015.08.003

  • Porter, W. W., Graham, C. R., Spring, K. A., & Welch, K. R. (2014). Blended learning in higher education: Institutional adoption and implementation. Computers and Education, 75(1), 185–195. https://doi.org/10.1016/j.compedu.2014.02.011

  • Reinholz, D. L., Rasmussen, C., & Nardi, E. (2020). Time for (research on) change in mathematics departments. International Journal of Research in Undergraduate Mathematics Education, 6(2), 147–158. https://doi.org/10.1007/s40753-020-00116-7

  • Rhoads, J. F., Nauman, E., Holloway, B., & Krousgrill, C. M. (2014). The Purdue Mechanics Freeform Classroom: A new approach to engineering mechanics education. In 121st ASEE Annual Conference and Exposition. Retrieved from https://peer.asee.org/23174

  • Rogers, E. M. (2003). Diffusion of innovations (5th ed.). Free Press.

  • Rowan L., & Bigum C. (2003). Actor network theory and the study of online learning. In: Davies G., Stacey E. (eds) Quality Education @ a Distance. IFIP—The International Federation for Information Processing, 131. Springer: Boston, MA. doi: https://doi.org/10.1007/978-0-387-35700-3_20

  • Saldaña, J. (2013). The Coding Manual for Qualitative Researchers (pp. 91–95). SAGE Publications.

  • Sayaf, A. M., Alamri, M. M., Alqahtani, M. A., & Alrahmi, W. M. (2022). Factors influencing university students’ adoption of digital learning technology in teaching and learning. Sustainability, 14(1), 493. https://doi.org/10.3390/su14010493

  • Secules, S., McCall, C., Mejia, J. A., Beebe, C., Masters, A. S., Sánchez-Peña, M. L., & Svyantek, M. (2021). Positionality practices and dimensions of impact on equity research: A collaborative inquiry and call to the community. Journal of Engineering Education, 110(1), 19–43. https://doi.org/10.1002/jee.20377

  • Sirakaya, A. D., & Özdemir, S. (2018). The effect of a flipped classroom model on academic achievement, self-directed learning readiness, motivation and retention. Malaysian Online Journal of Educational Technology, 6(1), 76–91.

  • Stacey, E., & Gerbic, P. (2008). Success factors for blended learning. In Proceedings ascilite Melbourne 2008. 964–968. Retrieved from https://www.ascilite.org/conferences/melbourne08/procs/index.htm

  • Stites, N. A., Berger, E., DeBoer, J., & Rhoads, J. F. (2019). A cluster-based approach to understanding students’ resources-usage patterns in an active, blended, and collaborative learning environment. International Journal of Engineering Education, 35(6), 1738–1757.

  • Stites, N. A., Berger, E., DeBoer, J., & Rhoads, J. F. (2020). Are resource-usage patterns related to achievement? A study of an active, blended, and collaborative learning environment for undergraduate engineering courses. European Journal of Engineering Education, in Print. https://doi.org/10.1080/03043797.2020.1783208

  • Tang, C. M., & Chaw, L. Y. (2016). Digital Literacy: A Prerequisite for Effective Learning in a Blended Learning Environment? The Electronic Journal of e-Learning, 14(1), 54–65.

  • Tatnall, A., & Davey, B. (2015). The internet of things and beyond: Rise of the non-human actors. International Journal of Actor-Network Theory and Technological Innovation, 7(4), 58–69. https://doi.org/10.4018/IJANTTI.2015100105

  • Taylor, C., Spacco, J., Bunde, D., Butler, Z., Bort, H., Hovey, C., Maiorana, F., & Zeume, T. (2018b). Propagating the adoption of CS educational innovations. In Proceedings of the 23rd Annual ACM Conference. Larnaca, Cyprus. doi: https://doi.org/10.1145/3293881.3295785

  • Taylor, J. A., & Newton, D. (2013). Beyond blended learning: A case study of institutional change at an Australian regional university. The Internet and Higher Education, 18(1), 54–60. https://doi.org/10.1016/j.iheduc.2012.10.003

  • Taylor, M., Ghani, S., Atas, S., & Fairbrother, M. (2018a). A pathway towards implementation of blended learning in a medium sized Canadian university. International Journal of Online Pedagogy and Course Design (IJOPCD), 8(1), 60–76. https://doi.org/10.4018/IJOPCD.2018010105

  • Tornatzky, L. G., & Klein, K. J. (1982). Innovation characteristics and innovation adoption-implementation: A meta-analysis of findings. IEEE Transactions on Engineering Management, 29(1), 28–45. https://doi.org/10.1109/TEM.1982.6447463

  • Van Laer, S., & Elen, J. (2020). Adults’ self-regulatory behaviour profiles in blended learning environments and their implications for design. Technology, Knowledge, and Learning, 25(1), 509–539. https://doi.org/10.1007/s10758-017-9351-y

  • VanDerSchaaf, H. P., Daim, T. U., & Basoglu, N. A. (2021). Factors influencing student information technology adoption. IEEE Transactions on Engineering Management (early Access). https://doi.org/10.1109/TEM.2021.3053966

  • Zimmerman, B. J. (2001). Theories of self-regulated learning and academic achievement: An overview and analysis. In B. J. Zimmerman & D. H. Schunk (Eds.), Self-Regulated Learning and Academic Achievement: Theoretical Perspectives (pp. 1–37). Lawrence Erlbaum Associates.

Acknowledgements

We gratefully acknowledge the support of our fellow Freeform team members, as well as the administration, faculty, and staff at Purdue University who have assisted us in this work. We would also like to acknowledge the work of the Purdue University Institutional Review Board (IRB), who provided approval and oversight of this study.

Funding

This study is based upon work supported by the National Science Foundation (NSF), Division of Undergraduate Education [Grant Number 1525671]. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the NSF.

Author information

Contributions

JD serves as PI for this project. JFR was integral to the creation of the Freeform learning environment, and both JFR and EB serve as co-PIs. Data collection was conducted by DE, EB, JFR, and JD. Data analysis was primarily conducted by DE, with support from JD and EB. DE wrote much of the manuscript with help from YL, who contributed most heavily to the literature review and theoretical framework. All authors reviewed the manuscript on multiple occasions and made distinct contributions to its development. All authors read and approved the final manuscript.

Corresponding author

Correspondence to David Evenhouse.

Ethics declarations

Competing interests

The authors have no competing interests to disclose.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Evenhouse, D., Lee, Y., Berger, E. et al. Engineering student experience and self-direction in implementations of blended learning: a cross-institutional analysis. IJ STEM Ed 10, 19 (2023). https://doi.org/10.1186/s40594-023-00406-x

Keywords