Beyond “group work”: an integrated approach to support collaboration in engineering education

Abstract

Background

Working effectively in a collaborative team is not only an outcome required by ABET but also one that scholars and practitioners recognize as necessary for being a successful professional engineer. Technology-based solutions hold promise for supporting collaboration; however, research has shown that technology alone is not sufficient to develop students’ collaborative skills. The authors created a combined pedagogical and technological environment—the Google Drive Environment for Collaboration (GDEC)—to support collaborative problem-solving during a semester-long undergraduate team human factors engineering design project. The environment uniquely used an “off-the-shelf” tool to implement collaborative scripts, taking advantage of affordances of the cloud-based collaboration environment that may contribute positively to learning and collaboration. We examined the following research questions:

  • What is the relationship between the use of an online collaboration environment and student learning outcomes?

  • What is the relationship between the use of an online collaboration environment and student collaboration skills?

We used individual and per team collaborative contributions to GDEC as the independent measure of collaboration, and project, homework, and exam scores as dependent variables to show evidence of student learning. GDEC contributions were collected for the three project phases and regressed against the student learning measures. Pre/post student collaboration skills were measured using the Dimensions of Teamwork Survey. Students’ open-ended responses to per phase surveys were analyzed for additional evidence of collaborative skills and use of the GDEC environment.

Results

Regression analyses clustered by group showed statistically significant relationships between individual student contributions to the collaborative environment and homework, project, and second exam scores.

Pre- to posttest collaboration skill scores increased on all Dimensions of Teamwork scales; however, the differences were not statistically significant.

Conclusions

We argue these results are promising because the combination of pedagogical strategies with the readily available off-the-shelf technology tools used to create GDEC can be easily replicated. Further, student comments indicated that they found the GDEC environment easy to use and effective and that they intended to use similar tools for future collaborative activities.

Background

Collaboration and higher order thinking are essential in the modern workplace. As global competition increases, organizations need to perform smarter, faster, and more efficiently. This requires embedding collaborative technologies deep into processes and incentivizing collaborative behaviors—ultimately transforming the way organizations—even classroom learning situations—turn knowledge into action (Hamilton et al. 2013).

In engineering, collaborative skills are mandated by the Accreditation Board for Engineering and Technology’s (ABET) required program outcomes, “to function on multidisciplinary teams” and “an ability to communicate effectively”; further, research conducted with practicing engineers has validated the need for these skills (Jonassen et al. 2006).

Accordingly, collaboration is often required in capstone engineering courses and other project-based courses. Although engineering faculty see the value of collaborative experiences, many have not worked in the engineering industry, where collaboration is common, and thus lack the ability to support students’ development of effective collaboration skills (Ahern 2007).

Technology-based solutions may offer some promise for better supporting and developing collaborative skills; however, technology alone has been shown to be insufficient to effectively support collaboration (Hsu et al. 2014). In the current study, we examined the impact of student use of a technology- and pedagogy-based solution for supporting engineering students engaged in meaningful collaborative design activities. We developed the Google Drive Environment for Collaboration (GDEC) to support collaborative design and problem-solving activities in a required junior-level Industrial and Manufacturing Systems Engineering (IMSE) course, “Ergonomics and Workstation Design.” The environment was designed to address oft-encountered problems associated with collaboration, including regulating and monitoring tasks, as well as practical considerations such as access to collaborative artifacts from multiple locations and simultaneous editing of collaborative work.

This study addresses the following research questions:

  • (RQ1) What is the relationship between the use of an online collaboration environment and student learning outcomes?

  • (RQ2) What is the relationship between the use of an online collaboration environment and student collaboration skills?

Background literature

Collaboration in STEM disciplines

Collaborative learning has a rich history in STEM disciplines. Springer et al.’s (1999) meta-analysis found that, across many STEM disciplines, collaborative learning promotes greater academic achievement, more favorable attitudes toward learning, and increased persistence in STEM courses compared to control groups. In addition to building skills necessary for an engineering career, collaborative learning processes were found by Terenzini et al. (2001) to result in greater scholarly achievement and productivity; more supportive and committed relationships among students; and greater psychological health, social competence, and self-esteem. Johnson et al. (1998) showed similar results across STEM and non-STEM disciplines. Further, collaboration is argued to be most essential in domains where knowledge capital is key, such as the STEM fields (Bughin et al. 2010), providing further evidence of the need for engineering students to develop collaboration skills.

In engineering, motivated by both ABET accreditation (2016) and the common use of teamwork in professional engineering settings, team projects are quite frequent in both first-year and senior capstone design courses (Froyd 2005). In a recent literature review, Borrego et al. (2013) noted that the engineering literature shows that “it is taken for granted in engineering education that team projects are valuable because they will prepare engineering students to work in industry” (p. 479). Even so, Borrego et al. (2013) also summarize the negative experiences that both students and faculty report regarding team experiences, including managing conflict among team members and dealing with team members who do not do their share of the work.

Technology approaches to collaboration—as were used in this study—have also been used in engineering education settings to facilitate distributed communication among team members. For instance, Glier et al. (2011) used tablets to facilitate communication among globally distributed team members. Similarly, Serce et al. (2011) used computer-mediated communication (CMC) to support collaboration and then analyzed communication patterns in distributed teams. Finger et al. (2006) also combined technology support for collaboration with data collection by developing a web-based repository that captured all group artifacts and discussions, allowing team members to schedule meetings, record and develop action items, build on each other’s work, and draw relevant relationships between the information provided. They found that the “rigid structures” of the environment did not align with the students’ own preferred modes of managing time and group processes and thus did not improve student learning.

Although many studies of collaboration in engineering education have shown positive effects on performance, some, even those employing technologies, have experienced difficulties. For example, students using groupware tools to complete three tasks typically performed by members of an engineering design team—idea generation (e.g., brainstorming solution concepts), co-editing (e.g., reviewing and revising a technical report), and negotiation (e.g., deciding what should be done and who should do it)—were limited by the available hardware, software, and network bandwidth, which compromised the predicted effects (Kirschman and Greenstein 2002). Further, studies of collaboration communication patterns have found that participants do not use the technology interfaces to engage in fundamental collaborative activities such as reflection, monitoring, or challenging others (Serce et al. 2011; Ellis et al. 2008).

The state of digital technologies, however, particularly those aimed at facilitating distributed communications, is changing rapidly, and new studies that utilize these technologies are in order. Additionally, as Borrego et al. (2013) note, many of the articles they reviewed, although they implemented collaborative activities in engineering (and computer science) classrooms, did not adequately reference and build on prior research and collaborative literature. The current research takes advantage of these technological improvements to implement an online collaborative environment that facilitates collaboration via the affordances of the Internet and also includes collaboration scaffolds to guide students through the collaboration process relative to the problem-solving activities in which they were engaged.

Supporting collaboration: technology tools integrated with pedagogy

Technologies can facilitate improved collaboration by several means. For example, social workflow platform tools such as “Huddle” provide roles, tasks, and templates that can help guide groups through an optimized and standardized work plan. Teams can then use such tools to exchange and discuss work, review progress, and obtain approvals. Other advantages of technology-enhanced collaborative learning include the following: (1) built-in assessment and data collection capability, (2) flexibility (e.g., Resta and Laferrière 2007), and (3) the capacity for diverse cultures and disciplines to interact.

Another advantage of technology-enabled collaboration tools is that a great many of them are available in “off-the-shelf” form. These tools—such as Google Drive—provide the technology infrastructure for educators and designers to support activities key to enabling collaboration at a distance, such as cloud storage and simultaneous editing. They also have the advantages of being accessible (many are free) and not specific to any discipline, so they can be used in nearly any academic setting.

Beyond their practical advantages, these off-the-shelf technology tools, when paired with appropriate pedagogies, can promote communication and collaboration and have the potential to encourage co-construction of knowledge and meaning negotiation among students (Serce et al. 2011). For instance, Raitman et al.’s (2005) results indicated that students using Wikis to collaborate found the anytime, anywhere nature of editing the Wiki to be “relaxing” and to lead to a “democratic feeling among members,” helping them feel they could make contributions in a non-confrontational setting where all are on equal footing. Similarly, simple tools such as synchronous chats can contribute positively to student motivation and learning outcomes, and students further indicate that they enjoy using such tools in learning contexts (Dickey 2003; Shotsberger 2000).

However, research has found that in practice, students do not tend to use these features nor show evidence of building shared understanding unless they receive appropriate scaffolding or support in the technology-supported collaborative environment (Hsu et al. 2014). Thus, technology alone is not sufficient to address the problem; nor are pedagogical methods alone sufficient to address the needs of online collaboration at a distance. Integrating off-the-shelf technology with purposeful pedagogical design, we posit, can support meaningful collaborative practice that helps student groups or dyads construct meaning and achieve deeper learning.

Scaffolds and scripts

Wood et al. (1976) coined the term “scaffold” and defined it as assistance from experts that enables learners to achieve what is beyond their ability to accomplish independently, as well as to learn from the experience. In general, scaffolding can serve multiple functions, including engaging, motivating, and challenging learners; drawing attention to critical features of the problem at hand; demonstrating techniques; and reducing frustration (Wood et al. 1976). Scaffolding differs from supports such as job aids in that scaffolds may both simplify processes and highlight certain aspects of their complexity (Reiser 2004); these functions are based upon the barriers learners often face for a given task or learning outcome.

Scaffolding has been shown to improve learner performance by providing the appropriate level of support in a just-in-time fashion (Belland 2014; Pressley et al. 2006). Although the original concept of scaffolding was applied to teacher or expert support, in the past decade considerable attention has been paid to software or technology-enabled scaffolds (e.g., Kolodner et al. 2004; Linn et al. 2004; Reiser 2004; Zahn et al. 2012); these are also referred to as “hard” scaffolds. In these technology implementations of scaffolding, the technology features support the learning activity rather than relying on direct intervention from instructors. Clearly, this makes scaffolding more feasible in the typical learning situation, where students far outnumber teachers. Technology-based scaffolds are also useful for providing “just-in-time” support when students are working together without an instructor present, or for learners in asynchronous online settings.

As the learning outcomes in the Ergonomics course used in this study were focused on students not only developing and mastering human factors design skills but also learning to collaborate effectively, we implemented scaffolding techniques to support both sets of skills. To accomplish this, we embedded scripted prompts as a structuring type of scaffold (Reiser 2004) within GDEC. Scripting consists of constraints that structure conversation or discourse among collaborators with the aim of guiding the exchange of knowledge and information (Kirschner et al. 2008). For instance, one way to enhance the effectiveness of collaborative learning and to teach students how to collaborate is to structure interactions by engaging students in defined scripts (Dillenbourg 2005). Collaborative scripts prescribe how students should form groups, interact, solve problems, and so on. Scripts work by specifying activities that help learners engage in tasks that elaborate new knowledge (relating new ideas to what is already known, or making knowledge personally meaningful by adding details, examples, analogies, visualizations, explanations, argumentation, and question asking) (King 2010; Kobbe et al. 2007). Scripts can also help to sequence these activities.

Because collaborative learning includes both epistemic and social components (Fischer et al. 2002) and these have been shown to be predictive of collaborative learning results (Cohen 1994; Fischer et al. 2002), Weinberger et al. (2005) developed and tested the use of epistemic and social scripts to support collaborative learning. Epistemic scripts are designed to support how learners work on a specific task, while social scripts are designed to support how learners interact with one another during collaborative activities (Weinberger et al. 2005). For instance, in their study where students were discussing an attribution theory case, Weinberger et al. (2005) used guiding analytical questions such as “Does a success or failure precede this attribution? Is the cause for the attribution stable or variable?” (p. 14) as epistemic scripts. Their social script prompts included sentence starters for collaborative conversation such as “These aspects are not yet clear to me,” or “My proposal for an adjustment of the analysis is” (p. 14). Their studies of independent groups using epistemic and social scripts found that social scripts were beneficial toward individual learning; however, epistemic scripts did not consistently produce improvements. We discuss the specific content and design of the scripts we created under the “Methods” section.

Our implementation of scripts using the Google Drive technology constitutes a new contribution to the literature in this area. Prior work has shown the potential effectiveness of using epistemic scripts (e.g., Weinberger et al. 2005). We hypothesized that implementing such scripts using Google Drive, which allows team members a great deal of flexibility and many modes for responding to the script prompts, would reduce barriers commonly encountered in meaningful collaboration. Specifically, features such as cloud storage allowing access from any Internet-connected device, and synchronous and asynchronous editing by multiple users, may allow learners to engage in dialog-type activities within the technology-based collaborative environment.

Literature summary

This study builds on prior work establishing the need to better support the development of collaboration skills in engineering students while they engage in engineering activities, the potential of technology to support collaboration, and the use of scripts as scaffolds that enable learners to use the technology for meaningful knowledge building as well as for developing collaboration skills. This study uniquely implements computer-based scripting using epistemic and social scripts in an online environment developed with an “off-the-shelf” collaboration tool—in this case, Google Drive. The technology combined with the pedagogical design allowed for simultaneous editing and use of the epistemic scripts by team members in either face-to-face or distance settings. This was paired with social scripts completed individually at intervals throughout the project, allowing for individual reflection that could be applied to improving the next phase’s collaborative activities. The study thus builds on past research by combining the affordances of the technology tools with the intentional pedagogical design of the two types of scripts.

Methods

Research context

We collected data in the 2012 fall semester of one undergraduate industrial engineering course—Ergonomics and Workstation Design (hereafter Ergonomics)—at a large Midwestern US public university. The writing-intensive course enrolled 40 engineering students in their third or fourth year of degree completion. Participants were predominantly male; 13 students in the course were female. Ergonomics was designed around a collaborative group project in which participants identified a human factors problem and designed a solution for it. The instructor agreed to have students use GDEC to support their project work.

Design of GDEC

We created GDEC via the Google Drive technology platform. We chose Google Drive because it is a free, widely available service that supports version-controlled simultaneous editing, multiple types of artifacts (e.g., spreadsheets, word processing documents), artifact commenting, online linking, folder structures, and image sharing. Using Google Drive also allowed student team members to accomplish collaboration tasks and address the scripting prompts asynchronously or simultaneously, whether team members were co-located or working at a distance from one another. We note, however, that other cloud-based collaboration tools with the same affordances could be used in the same way.

GDEC’s affordances, however, derive not solely from the Google Drive technology but also from the theoretical and pedagogical foundations of collaboration and scaffolding. Figure 1 conceptually illustrates the design and student usage of the GDEC environment. As Fig. 1 shows, the Google Drive technology combined with collaboration and scaffolding theory provides the foundation for the design of GDEC. Student teams used the GDEC environment as a workspace to accomplish their project work and received further “human” coaching via per phase project feedback from the instructor as well as the supports provided in the in-class workshop described under “Procedures.”

Fig. 1 GDEC conceptual framework

Table 1 further defines how both the literature on collaboration and scaffolding provide a basis for the technological and pedagogical features we designed into the GDEC environment. For instance, Johnson and Johnson’s framework that defines the necessary elements for successfully supporting cooperative or collaborative learning (e.g., individual accountability, positive interdependence) provides the rationale for design elements to help learners develop these skills. Further, the GDEC environment also targets several ABET outcomes—such as supporting engineering design and problem-solving.

Table 1 GDEC pedagogical and technological design elements mapped to outcomes

Scaffolding scripts in GDEC

Based on prior work from Weinberger et al. (2005) and Nussbaum et al. (2009), we developed epistemic and social scripts as scaffolds to support students in their collaborative human factors design tasks for the project, as well as to support their development of teamwork skills. Our approach to the epistemic scripting was to structure students’ work on the human factors design problem (Reiser 2004, pp. 283–284). According to Reiser, structuring scaffolds can have three purposes:

  • Decomposing complex tasks—reducing the task’s open-endedness and difficulty by reducing choices and thus reducing complexity.

  • Focusing effort—reducing the problem space or offloading more routine parts of the task.

  • Monitoring—such scaffolds may be implemented as prompts, agendas, or graphical organizers that help learners keep track of their plans and monitor progress. These monitors can remind learners of important goals and criteria that must guide their work.

Similar to Weinberger et al. (2005), we designed epistemic scripts in the form of prompting questions to aid group members in the analysis and problem-solving skills required for each project phase. The instructor and researchers made these available as bulleted lists in a document placed inside each group’s GDEC folder on Google Drive. This document became the working document for each group’s collaborative writing. Table 2 shows sample epistemic scripting prompts used by each group. The content of these scripts was based on the three purposes of structured scaffolding described above, applied to the course content area of the human factors design process. The prompt “what methods will we use to assess risks,” for instance, is designed to decompose the complex analysis process, while “how are humans affected” helps students focus on an important issue in the design process. Monitoring occurred within phases, as teams could see their progress (e.g., what was done, what remained) on the epistemic scaffolds in the GDEC environment in real time. Further, the phased approach to the project’s completion allowed students to monitor their own progress in addition to receiving monitoring feedback from the instructor.

Table 2 Epistemic script example

Our approach to the social scripting was to structure student reflections on their team’s work processes, communication, and collaboration while completing the project deliverables. Students were asked to complete a reflection activity using the social scripting prompts housed on GDEC following completion of each phase. Each student completed an individual reflection. Table 3 shows sample social scripting prompts for the phase 2 reflection activity.

Table 3 Social script example

Procedures

Research and project activities occurred over the semester as shown in Fig. 2.

Fig. 2
figure 2

Project and data collection timeline

To implement the technology component of GDEC, at the beginning of the semester, we established a Google Drive account for the course with permissions for the instructor, students, teaching assistants, and research assistants. Students formed themselves into 11 groups of three to four members for the semester-long project. The instructor and researchers had viewing and editing access to all folders in GDEC; team members only had access to their group’s collaborative space.
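The paper does not detail how this account and folder provisioning was carried out. As a purely illustrative sketch (not the authors’ procedure), a comparable setup could be automated with the Google Drive API v3, which postdates the 2012 study; the folder naming and helper function below are hypothetical:

    from googleapiclient.discovery import build

    def provision_team_folder(creds, team_name, member_emails, staff_emails):
        # Build a Drive API v3 client from pre-authorized credentials.
        drive = build("drive", "v3", credentials=creds)
        # Create the team's private collaborative folder.
        folder = drive.files().create(
            body={"name": f"GDEC-{team_name}",
                  "mimeType": "application/vnd.google-apps.folder"},
            fields="id",
        ).execute()
        # Grant edit access to team members, instructor, TAs, and researchers;
        # students outside the team are simply never granted permission.
        for email in member_emails + staff_emails:
            drive.permissions().create(
                fileId=folder["id"],
                body={"type": "user", "role": "writer", "emailAddress": email},
                fields="id",
            ).execute()
        return folder["id"]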

The instructor assigned the group project at the beginning of week 4 of the semester; students worked on the project in three phases: (1) identifying and justifying the problem, (2) selecting methods and conducting preliminary analyses, and (3) redesigning and reporting on the final project. Each team submitted its phase work for a grade and instructor feedback. Students were required to collaborate with their teammates using GDEC to complete all project tasks. Because the environment was based in Google Drive, which supports word processing and spreadsheet editors, the groups were able to produce finished project products directly in GDEC. Additionally, multiple users could perform all of these tasks simultaneously on the same artifact, as the underlying Google Drive technology supports concurrent editing. This created a seamless process in which the GDEC environment did not introduce any additional tasks for students to complete during their project work. The instructor then accessed each group’s completed work by phase for grading purposes directly from the group’s GDEC folder.

Per the writing-intensive designation of Ergonomics, students were required to collaboratively write the reports for each project phase. Good collaborative writing requires more from teams than simply dividing up the writing among team members and “bolting” it together. However, research has shown, and the instructor’s experience confirms, that students are reluctant to engage in reflecting on or challenging their peers’ ideas in either synchronous or asynchronous communications in ways that would get beyond this non-integrated, divide-and-conquer method of writing (Janssen et al. 2009; Munneke et al. 2007; Volet and Mansfield 2006).

Thus, at week 6 (just after students completed individual project proposals within their teams), we conducted an in-class workshop to teach and model for students how to provide written constructive feedback to peers on both engineering content and writing, as well as to model the use of GDEC in the writing and feedback process. Before the workshop, each student group was paired with another group in the class. Using GDEC, the groups shared their individual project proposals and were instructed to prepare constructive criticism of their assigned proposal. The constructive criticism documents were prepared and stored in the groups’ GDEC workspaces. During the in-class workshop, we introduced a rubric for guiding constructive writing feedback (Massachusetts Institute of Technology (MIT) 1999) as well as examples of techniques for professional criticism of peer projects. Students were then instructed to use the rubric and the examples to revisit and improve their initial written project critiques; they completed this graded exercise in class on their laptops using the GDEC environment. Additionally, the instructor reminded students of the rubric and how to apply it to their tasks throughout the semester.

Students continued to work on their projects using GDEC to complete all three phases of the project. Each phase included both epistemic and social scripts. Epistemic scripts were adjusted for each phase to support the types of thinking most needed, or anticipated to be most difficult, during that phase (e.g., phase 1 scripts focused on problem identification; phase 2, on selecting and justifying analysis methods; and phase 3, on critical interpretation of results and evaluation of project outcomes). Social scripts remained largely static across phases. The instructor and research team created the scripts per the process described above and made them available to each team simply by uploading each script file into the individual teams’ GDEC folders.

At the end of each of the three project phases (see the Fig. 2 timeline), the instructor assessed each team’s project work. All team members received the same score for each phase of the project. For research purposes, we also downloaded GDEC usage data from the Google Drive servers after the completion of each phase (further described under the “Analysis” section).

The exams in the course comprised a mix of short-answer and longer “work-out” problems. These longer problems included structured cases or scenarios that required students to apply concepts from the class to identify critical ergonomic challenges, select and apply appropriate analysis tools, interpret results, and recommend interventions. These components align with the requirements of the course project. While the exam problems were structured (as opposed to the project, which was unstructured), the items did require students to use the same ergonomic approach as used in the design project.

Additionally, homework assignments were due throughout the term. Each of these assignments was designed as a semi-structured case scenario where students were required to conduct various components of the ergonomics process, justify their thinking, and communicate their findings in writing to a specified audience (e.g., a memo to a company executive or a proposal for a consulting job). Thus, the assignments required both strong writing skills and ergonomic analytical and design skills similar to those utilized in the course project.

Analysis

At the end of the semester, Google Drive data from GDEC were copied into a text editor. The data were amended to remove extraneous information, and multiple email identities were standardized for consistency. Participant identities were then separated to account for individual contributions, with each student’s original inputs, edits, and comments each counted as a single contribution. All data were transposed into a spreadsheet to allow further analysis (described under “Results”) with statistical processing software. Table 4 summarizes the data frequencies for each group after this amendment process.

Table 4 Per group and per phase GDEC contributions
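As an illustration of this tallying step, a minimal Python sketch follows; the row format, alias table, and function name are our assumptions for exposition, not the authors’ actual processing code:

    from collections import Counter

    # Multiple email identities for the same participant collapse to one.
    ALIASES = {"jsmith@gmail.com": "jsmith@university.edu"}  # illustrative only

    def tally_contributions(rows):
        """Count each original input, edit, or comment as one contribution,
        keyed by (participant, project phase)."""
        counts = Counter()
        for editor, phase in rows:
            counts[(ALIASES.get(editor, editor), phase)] += 1
        return counts

    rows = [("jsmith@gmail.com", 1), ("jsmith@university.edu", 1),
            ("adoe@university.edu", 2)]
    # jsmith's two identities collapse to one, giving 2 contributions in phase 1.
    print(tally_contributions(rows))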

To answer research question 1, we first conducted an intraclass correlation (ICC) analysis (Table 5) to see how strongly each dependent variable’s scores were correlated within project teams. The highest correlations were for homework and project scores, with much lower values for the exams. The very high ICC value for project scores is due to members within a single group generally receiving the same per phase project scores.

Table 5 Intraclass correlations (ICC) for learning outcome variables by group membership
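For readers unfamiliar with the statistic, a one-way ICC can be computed from the between- and within-group mean squares; the sketch below uses invented scores, not study data:

    import numpy as np

    def icc1(groups):
        """One-way ICC(1) for equal-size groups:
        (MSB - MSW) / (MSB + (k - 1) * MSW)."""
        data = np.asarray(groups, dtype=float)
        n, k = data.shape  # number of groups, members per group
        grand = data.mean()
        msb = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)
        msw = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
        return (msb - msw) / (msb + (k - 1) * msw)

    # Teams whose members share nearly identical scores yield an ICC near 1,
    # mirroring the very high ICC reported for project scores.
    print(icc1([[92, 92, 92], [85, 85, 86], [78, 78, 78]]))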

Given that two of the learning outcome variables were somewhat (or highly) correlated with group membership, we used MPLUS to conduct a complex regression analysis that clusters the individual data by group membership. This type of analysis accounts for the potential group effect while still analyzing relationships between variables at the individual level (Begg and Parides 2003).
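The study used MPLUS; for illustration only, an analogous cluster-robust regression can be sketched in Python with statsmodels (the file and column names below are hypothetical, not the authors’ code):

    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per student: GDEC contribution count, outcome scores, team id.
    df = pd.read_csv("gdec_individual.csv")  # hypothetical file

    for outcome in ["homework", "exam2", "project"]:
        # Cluster-robust standard errors account for group membership
        # while the regression remains at the individual level.
        fit = smf.ols(f"{outcome} ~ contributions", data=df).fit(
            cov_type="cluster", cov_kwds={"groups": df["group_id"]}
        )
        print(outcome, fit.pvalues["contributions"])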

For the second research question concerning students’ knowledge of teamwork processes, we administered a pre- and posttest version of the Dimensions of Teamwork Survey (Ryan 2008). Using a 6-point Likert scale (disagree to agree), the survey measures seven scales:

  1. Customer and inter-team issues

  2. Roles and interdependence

  3. Communication and conflict management

  4. Team member skills

  5. Clarity of team goals

  6. Decision authority and accountability

  7. Support from organization

Although “teamwork” does not equate to collaboration, these scales map well to the literature on characteristics necessary for effective collaborative learning to take place (e.g., Johnson and Johnson 2007; Cottell and Millis 1994), including positive interdependence and the ability to communicate and manage conflict (King 2010; Serce et al. 2011) (Table 6). We modified the Dimensions of Teamwork survey by removing scales one (customer and inter-team issues) and seven (support from organization) because these two scales pertained only to a business realm and were not applicable; the remaining five scales had test/retest reliabilities from 0.84 to 0.96. Thirty-one participants completed the pretest and 22 completed the posttest. Due to a limitation in our permissions for using the instrument, data were collected anonymously; thus, connecting these data to student group membership was not possible. For this reason, we compared individual pre- and postscores using t tests.

Table 6 Collaborative activity codes applied to social script responses
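Because the anonymous responses could not be paired, the pre/post comparison per scale reduces to an independent-samples t test, sketched below with invented scores (in the study, n was 31 pretest and 22 posttest):

    import numpy as np
    from scipy import stats

    # Illustrative individual scores on one DOT scale (6-point Likert).
    pre = np.array([4.9, 5.1, 4.7, 5.0, 4.8])
    post = np.array([5.2, 5.0, 5.3, 5.1])

    # Unpaired test: anonymity means pre and post responses cannot be matched.
    t, p = stats.ttest_ind(pre, post)
    print(f"t = {t:.2f}, p = {p:.3f}")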

To provide further explanation of the quantitative Dimensions of Teamwork survey scores and to inform our overall discussion, we qualitatively analyzed students’ responses to the phase 2 and 3 social script prompts that captured their reflections on their collaborative activities. We did not analyze phase 1 responses for this research question as they contained mostly comments about learning the mechanics of Google Drive and did not provide a perspective on student collaboration.

To conduct this qualitative analysis, we:

  • Downloaded all three phases’ open-ended items from GDEC.

  • Both authors read through the student responses to ascertain their overall content relative to students developing collaborative skills, then met to discuss our initial impressions. Because phase 1 data were focused mostly on the logistics of logging into the system and not on aspects of project collaboration, we decided to eliminate phase 1 responses from further analysis.

  • Developed a preliminary code set based on Johnson and Johnson’s cooperative learning theory (2007) as well as the project’s desired learning outcomes.

  • Coded the social script items that were most focused on how students collaborated: “How has your team changed how it accomplishes its work during this phase of the project?” and “What have been the biggest benefits to using Google Drive to collaborate as a team on this project?”

The prompts were focused and student responses were brief; thus, we used each student’s entire response to a single prompt as the unit of analysis for coding. A single response could garner multiple codes. For example, the following passage was coded as both positive interdependence (“more hands on team collaboration”) and knowledge construction (“discussion… achieve better solutions”).

During Phase 3 we completed a lot more hands on team collaboration during the data collection and risk analysis phase and everyone worked really well together. There was a lot more discussion in the face to face meetings before writing out the prompts and this allowed as to achieve better solutions to the current problems at MBS Textbooks. Google Drive was also used as a space to store parts of our future report draft, thus increasing efficiency in the long run.

Coding by both authors occurred in two phases; to refine our codes and develop common understandings of the coding scheme, we first coded responses to the phase 2 prompts. Both authors coded all data independently and then met to resolve differences and discuss the codes. At this point, we added an “NC” (no code) category for responses that did not address the prompt. The resulting set of codes and their definitions is shown in Table 6. We then separately applied this code set to the remaining responses and resolved all coding differences.

Results

(RQ1) Relationship between the use of GDEC and student learning outcomes

To answer this question, we conducted regression analyses between GDEC environment contributions and the student learning outcome variables, at both the group and the individual student level. For the group analysis, we examined whether there was a significant relationship between a group’s total contributions to the GDEC artifacts and each of the three phases’ project scores and the score on the final report. Analysis at the group level is appropriate because all students in each group received identical project scores. Simple regression analysis was conducted with the group project scores as dependent variables and the group contribution counts to GDEC artifacts as predictors for each of the three phases of the project. The results showed no significant relationships (Table 7).

Table 7 Regression analysis for group contribution predicting group (n = 11) project score in each phase
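A sketch of these group-level regressions follows (one per phase plus the final report); the variable names are illustrative assumptions, not the authors’ code:

    import pandas as pd
    import statsmodels.formula.api as smf

    teams = pd.read_csv("gdec_groups.csv")  # hypothetical: one row per group

    # With only 11 groups, statistical power at this level is low.
    for outcome in ["phase1_score", "phase2_score", "phase3_score", "final_score"]:
        fit = smf.ols(f"{outcome} ~ total_contributions", data=teams).fit()
        print(outcome, round(fit.pvalues["total_contributions"], 3))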

Using the clustered regression described previously, we examined whether there was a significant relationship between students’ individual contributions and homework, exam, project phase, and final report scores. The results (Table 8) indicated that when clustered by project team, there were significant relationships between individual GDEC contributions and homework scores and the second exam score (both at p < 0.01), as well as project scores (p < 0.05).

Table 8 Regression analysis clustered by group for individual contribution predicting students’ learning outcomes

(RQ2) Relationship between GDEC and collaboration skills

Because the team project and the use of GDEC were intended to help students develop their individual collaboration skills, we collected and analyzed pre- and postdata from the Dimensions of Teamwork (DOT) survey (Ryan 2008). The anonymous data results (Table 9) showed that students’ average scores increased from pre- to posttest on all five scales, but the increases were not statistically significant. We posit that this is due to the high level of the pretest scores (4.8–5.1 on a 6-point Likert scale). Although the DOT increases are a potential indicator of students engaging in productive collaborative activities while performing project work using GDEC, we find further evidence in the qualitative coding of the phase 2 and 3 social script responses (Table 6).

Table 9 Dimensions of Teamwork Survey pre- and posttest survey t tests on mean scores for each scale

Figure 3 shows the per team frequencies of the collaboration activity codes from Table 6 (e.g., positive interdependence, individual accountability) evidenced in students’ responses to the reflection question “How has your team changed how it accomplishes its work during this phase of the project?” For example, Fig. 3 shows that team one members’ responses showed evidence of a combined 11 occurrences of these positive collaborative activities. Teams one through six, eight, and 11 all showed fairly consistent frequencies (within a count of two) from phase 2 to phase 3 for these collaborative activities. Although the raw frequencies are not high (seven was the highest count), they came from responses to questions that did not prompt students to discuss positive collaborative activities.

Fig. 3 Per team collaboration codes

Discussion

Our results for using GDEC to support collaborative learning and develop teamwork skills in this Ergonomics course were mixed but promising. For RQ1, which addressed how contributions to GDEC were related to student learning outcomes, we found, taking into account the effect of group membership, positive significant relationships between individual student contributions to their GDEC-based team activities and individual student learning outcomes (Table 8): individual GDEC contributions were positively related to homework scores, the second exam score, and the final project score. The relationship between the use of GDEC and learning is also supported by our finding of a statistically significant positive relationship between total per group GDEC contributions and individual final project scores (Table 8).

We did not find significant relationships between per group collaborative contributions and any of the per group learning outcomes measured (Table 7). We attribute this lack of significance to the relatively small number of groups (11), resulting in low statistical power. Regarding students’ knowledge of effective collaboration skills (RQ2), the anonymous data showed positive (but not significant) improvements in students’ knowledge on all five Dimensions of Teamwork scales measured.

We posit that the significant results we found provide initial support for the effectiveness of a combined pedagogical and technological GDEC-type model that pairs the affordances of an online collaborative technology platform with pedagogical scaffolds (specifically, epistemic and social scripts) designed to support collaboration and learning. We realize we cannot make causal claims about our results. Our analysis shows that students who made contributions to the GDEC environment performed better on most of the learning outcomes we measured. However, we cannot ascertain whether the combined pedagogical and technological environment influenced their contributions and ultimately their scores; we only know via this study that there is a correlation between contributions and performance.

Nonetheless, even the presence of the correlation is cause for some optimism about this pedagogical and technology combination. Revisiting our conceptual framework (Fig. 1) for the design of GDEC and this research, we discuss the results in terms of the attributes of the pedagogical and technological environment students operated within. Figure 4 shows how each aspect of the GDEC technology and pedagogical framework was intended to support collaboration and learning outcomes.

Fig. 4 GDEC elements mapped to outcomes

Epistemic scripts and cloud-based collaborative technology

Although not a causal relationship, the statistically significant results around student learning point to the effectiveness of the collaborative epistemic scripts. The epistemic scripts consisted of guiding questions, collaboratively completed by all group members, that were designed to help them complete the analyses required for each phase of the project. Learners’ contributions to the environment, which were the basis for our independent variable, were guided by these scripts: team members not only contributed directly to the script files but were also guided by the scripting questions to make other, separate contributions.

Project grades for each phase (one of the learning outcome dependent variables) were based upon the quality of the analyses contained in that phase’s work, which again was guided by these scripts. Thus, significant relationships between GDEC contributions and project phase scores would constitute near transfer of the skills being supported by the GDEC technological and pedagogical environment to the learning outcome being measured.

In contrast, the pedagogical scripts in GDEC would be less aligned with supporting students’ exam and homework scores. Both exams and homework included aspects that were broader than the team-project tasks; thus, the significant relationships found between GDEC contributions and these outcomes may show support for “far” transfer of the researched activity.

A further potential explanation for the significant results is the affordances made available by implementing the scripts, and the overall collaborative work, using a cloud-based collaborative technology. This integration of technological and pedagogical features supported students’ collaboration activities and ultimately their individual learning. We posited that combining epistemic scripts with the technology affordances of a cloud-based collaborative tool like Google Drive would enhance the collaborative experience and might also increase learning. Specifically, the following affordances are seen as important to collaboration and learning:

  • Ability to accept multiple types of artifacts (e.g., word processing documents, spreadsheets, image files) necessary in engineering problem-solving.

  • Access anytime anywhere via the “Cloud” reduced barriers for team members to access and build upon each other’s work.

  • Support for simultaneous artifact editing (whether sitting in the same room at multiple computers or at a distance) supported nearly instantaneous knowledge exchange and building. Further, from a logistical standpoint, simultaneous editing allowed team members to work productively on an artifact, such as the per phase project report, at the same time.

  • Support for “commenting” on artifacts and real-time chatting allowed team members to communicate directly and easily with one another in the context of their teamwork. Adding a comment anchored to a specific word or phrase in the team’s per phase report is more direct than an email that refers to the report indirectly, and allows for clearer communication.

  • Support for asynchronous communication and editing in a shared folder structure meant less potential for confusion from emailing different document versions among team members.

Quotations from students’ per phase reflections support our assertion of the effectiveness of the integration of technology and pedagogy to support each team’s work. Text in curly braces comprises author comments on the significance of each quote.

  • Being able to see our other team members’ work so that we could elaborate each other’s work and build our work off of their work. Also, we could see all of the completed work second by second…

  • We all have access to the data we need to do calculations and to set up the redesign of the workstation as well as completing the phases of the project. We all have access to what each other write so we can take a look at it and help each other editing our information to make it better. {both of the above show how the cloud-based collaboration environment helped to reduce barriers to sharing collaborative work; facilitated efficiency of access}

  • Instead of assigning a question [prompt] for each person to address, we simply got together and collaborated via Google drive and answered all the questions with a group effort in order to better follow the rubric. {supported collaborative editing and providing immediate feedback to each other’s work; this is something that is often lacking in undergraduate teamwork (Janssen et al. 2009)}

  • I think Google Drive is a great tool for group projects. I am using it for a couple other group projects currently. I think you should keep encouraging student to use it. {representative of many student comments that indicated they would continue to use Google Drive in the future.}

Social scripts and classroom workshop

The instructor had taught this writing-intensive course previously and reported that students have difficulty providing critical but constructive feedback to their peers during problem-solving and resist the collaborative writing process (Cho et al. 2006; Hyde 1993; Kinsella 1996). The instructor’s observations are consistent with research showing that students do not wish to challenge others’ work (Janssen et al. 2009). However, the difficulties that students experience in collaborating and writing collaboratively may not be sufficiently addressed by the affordances of technology alone (Kirschman and Greenstein 2002). Beyond the technological features, learners need explicit instruction in, and modeling of, constructive collaborative work. We attempted to address these needs via the in-class workshop, which included an inter-team assignment requiring teams to use a writing feedback job aid to provide constructive feedback on another team’s writing; the per phase social scripts also provided ongoing motivation to reflect on collaboration and writing. Although these interventions did not produce statistically significant results on the Dimensions of Teamwork survey, we did see positive pre/post gains on all scales measured by this instrument.

The lack of Dimensions of Teamwork significance may be attributed to unusually high prescores for this group of students. It is unclear whether these students actually entered the project with well-developed collaboration skills (although instructor/author Steege notes that many of these students knew each other from their study of industrial engineering and may have worked collaboratively before) or whether their pretest scores were inflated. In subsequent projects, we plan not only to implement both a pretest and a posttest of the Dimensions of Teamwork, as we did for this study, but also to collect “retrospective pretest” data at the end of the collaborative experience. A retrospective pretest asks respondents to rate their current abilities after the collaborative experience as well as to reflect upon and report their assessment of their collaborative abilities before it, thus allowing respondents to report perceived changes in their skills.

Data analysis of the students’ per phase reflections did provide some evidence of the project’s impact on student collaboration skills. Figure 3 shows that all teams described their use of productive collaborative activities. Further, students’ comments point to how their use of GDEC supported meaningful collaboration activities. The first quote below, from a team 6 member, was coded as evidencing knowledge construction (“…we could elaborate each other’s work…”), which is perhaps the most sophisticated desired outcome of collaboration.

[on the biggest benefit to using google drive] Being able to see our other team members work so that we could elaborate each others [sic] work and build our work off of their work. Also, we could all do everything at the same time because we could see all of the completed work second by second on our own computers.

More common were comments such as the one below, from a member of team 1, that do not explicitly provide evidence of knowledge construction but do point to the GDEC environment supporting team members’ participation in meaningful problem-solving sessions.

Google Drive tremendously increases team efficiency and collaboration between members. It allows the team to comment, analyze, change, or see what other team members are working on or have completed. This also allows the entire team to proofread or see the final product and not have one guy be the “finisher”. It takes loads of stress off of the team and allows people to work together in a more productive environment. (Team 1)

Limitations and future work

As in many initial studies, there are some limitations. As previously discussed, we did find significant relationships between GDEC contributions and individual learning; however, without a control group, we do not have evidence to support a causal relationship for these findings. A study that implements a control group would allow us to analyze these important questions.

Additionally, the Dimensions of Teamwork survey data were anonymous, and thus we were not able to regress them with the GDEC contributions and learning outcome scores. Lastly, the significant results we found occurred in the context of the limited measures of collaboration used in this early instantiation of GDEC. Although the frequency of individual and total group contributions (defined as each student’s original inputs, edits, and comments) is one indicator of the collaboration that occurred in each student group, it is only one indicator, and a strictly quantitative one. Dillenbourg (2000) and Serce et al. (2011) posit that collaborative knowledge construction arises not simply from having more individuals work together but rather from meaningful activities such as “reading, building, predicting, and negotiating” (Serce et al. 2011, p. 500) that trigger meaningful collaboration. Although the existence of a GDEC contribution is a necessary aspect of these potential triggers, it is not necessarily a sufficient one. Clearly, more nuanced measures are needed that capture the quality of collaborative contributions in addition to their quantity.

We are currently piloting an enhanced version of GDEC that will implement more nuanced ways of measuring and describing learners’ collaborative activities. Specifically, we will pilot the use of Epistemic Network Analysis (ENA) (Shaffer et al. 2009), a method that models the nature of the relationships among qualitatively coded data. We anticipate qualitatively coding the content of student collaborative contributions (thus getting at the existence of the “triggers” described by Serce et al. (2011)) as an input to the ENA. Nonetheless, even with the current study’s limited measure of collaboration, the GDEC environment coupled with our pedagogical scaffolds did yield significant positive results in student learning outcomes.

In future work, in addition to the control group study previously mentioned, we would like to examine the use of GDEC with students working at a distance as this would provide a more authentic context to test its effectiveness.

Conclusions

Prior research has shown that technologies can be used to effectively support student collaboration. However, research has also shown that in many cases, technology alone is not enough. This study examined the impact of using an environment to support effective collaboration—GDEC (supported by Google Drive). GDEC used pedagogical strategies for supporting engineering learning (e.g., scripting) in conjunction with the cloud-based collaboration tool to help in the development of collaborative skills and positively influence learning outcomes. In particular, the implementation of epistemic scripts that learners completed in a collaborative and synchronous editing environment was unique.

Although not causal evidence, we found quantitative evidence that the use of the environment was significantly correlated with improved student learning outcomes. Additionally, the qualitative feedback indicated that students saw value in GDEC for facilitating collaboration activities and improving the quality of their project work. Notable student comments such as this one—“I think Google Drive is a great tool for group projects. I am using it for a couple other group projects currently. I think you should keep encouraging student to use it”—showed support for carrying their experience with GDEC forward to support collaboration in scenarios beyond this course project.

References

  • Ahern, A. (2007). What are the perceptions of lecturers towards using cooperative learning in civil engineering? European Journal of Engineering Education, 32(5), 517–526.


  • Begg, M. D., & Parides, M. K. (2003). Separation of individual‐level and cluster‐level covariate effects in regression analysis of correlated data. Statistics in Medicine, 22(16), 2591–2602.


  • Belland, B. (2014). Scaffolding: definition, current debates, and future directions. In M. Spector, M. D. Merrill, J. Elan, & M. Bishop (Eds.), Handbook of research on educational communications and technology (pp. 505–518). New York: Springer.


  • Borrego, M., Karlin, J., McNair, L., & Beddoes, K. (2013). Team effectiveness theory from industrial and organizational psychology applied to engineering student project teams: a research review. Journal of Engineering Education, 102(4), 472–512.


  • Bughin, J., Chui, M., & Manyika, J. (2010). Clouds, big data, and smart assets: ten tech-enabled business trends to watch. McKinsey Quarterly, 56(1), 75–86.


  • Cho, K., Schunn, C. D., & Charney, D. (2006). Commenting on writing: typology and perceived helpfulness of comments from novice peer reviewers and subject matter experts. Written Communication, 23(3), 260–294.


  • Cohen, E. G. (1994). Restructuring the classroom: conditions for productive small groups. Review of Educational Research, 64, 1–35.


  • Cottell, P. G., Jr., & Millis, B. J. (1994). Complex cooperative learning structures for college and university courses. Retrieved from http://digitalcommons.unl.edu/podimproveacad/304/

  • Dickey, M. D. (2003). Teaching in 3D: pedagogical affordances and constraints of 3D virtual worlds for synchronous distance learning. Distance Education, 24(1), 105–121.

    Article  Google Scholar 

  • Dillenbourg, P. (2000). Virtual learning environments. In Workshop on virtual learning environments of the EUN conference: learning in the new millennium: building new education strategies for schools. Retrieved from http://tecfa.unige.ch/tecfa/publicat/dil-papers-2/Dil.7.5.18.pdf/.

    Google Scholar 

  • Dillenbourg, P. (2005). Designing biases that augment socio-cognitive interactions. In R. Bromme, F. Hesse, & H. Spada (Eds.), Barriers and biases in computer-mediated knowledge communication and how they may be overcome (pp. 243–264). New York: Springer.

    Chapter  Google Scholar 

  • Ellis, R. A., Goodyear, P., Calvo, R. A., & Prosser, M. (2008). Engineering students’ conceptions of and approaches to learning through discussions in face-to-face and online contexts. Learning & Instruction, 18(3), 267–282.

    Article  Google Scholar 

  • Finger, S., Gelman, D., Fay, A., Szczerban, M., Smailagic, A., & Siewiorek, D. P. (2006). Supporting collaborative learning in engineering design. Expert Systems with Applications, 31, 734–741.

    Article  Google Scholar 

  • Fischer, F., Bruhn, J., GraÅN sel, C., & Mandl, H. (2002). Fostering collaborative knowledge construction with visualization tools. Learning and Instruction, 12, 213–232.

    Article  Google Scholar 

  • Froyd, J. E. (2005). The engineering education coalitions program. In National Academy of Engineering (Ed.), Educating the engineer of 2020: adapting engineering education to the new century. Washington, DC: National Academies Press.

    Google Scholar 

  • Glier, M. W., Schmidt, S. R., Linsey, J. S., & McAdams, D. A. (2011). Distributed ideation: idea generation in distributed capstone engineering design teams. International Journal of Engineering Education, 27(6), 1281–1294.

    Google Scholar 

  • Hamilton, M., Kass, A., Alter, A. E., & Coffey, R. T. (2013). From talking to transforming: getting real value from enterprise collaboration technology.

    Google Scholar 

  • Hsu, Y. C., Ching, Y. H., & Grabowski, B. L. (2014). Web 2.0 applications and practices for learning through collaboration. In Handbook of research on educational communications and technology (pp. 747–758). New York: Springer.

    Chapter  Google Scholar 

  • Hyde, M. (1993). Pair work—a blessing or a curse? An analysis of pair work from pedagogical, cultural, social and psychological perspectives. System, 2(3), 343–348.

    Article  Google Scholar 

  • Janssen, J., Erkens, G., Kirschner, P. A., & Kanselaar, G. (2009). Influence of group member familiarity on online collaborative learning. Computers in Human Behavior, 25(1), 161–170.

    Article  Google Scholar 

  • Johnson, D., & Johnson, R. (2007). Creative controversy: intellectual challenge in the classroom (4th ed.). Edina: Interaction Book Co.

    Google Scholar 

  • Johnson, D. W., Johnson, R. T., & Smith, K. A. (1998). Cooperative learning returns to college: what evidence is there that it works? Change, 30(4), 26–35.

    Article  Google Scholar 

  • Jonassen, D., Strobel, J., & Lee, C. B. (2006). Everyday problem solving in engineering: lessons for engineering educators. Journal of Engineering Education, 95(2), 139–151.

    Article  Google Scholar 

  • King, A. (2010). Scripting collaborative learning processes: a cognitive perspective. In F. Fischer, I. Kollar, H. Mandl, & J. M. Haake (Eds.), Scripting computer-supported collaborative learning (pp. 13–37). New York: Springer.

    Google Scholar 

  • Kinsella, K. (1996). Designing group work that supports and enhances diverse classroom work style. TESOL Journal, 6(1), 24–30.

    Google Scholar 

  • Kirschman, J. S., & Greenstein, J. S. (2002). The use of groupware for collaboration in distributed student engineering design teams. Journal of Engineering Education, 91, 403–407.

    Article  Google Scholar 

  • Kirschner, P. A., Beers, P. J., Boshuizen, H. P. A., & Gijselaers, W. H. (2008). Coercing shared knowledge in collaborative learning environments. Computers in Human Behavior, 24, 403–420.

    Article  Google Scholar 

  • Kobbe, L., Weinberger, A., Dillenbourg, P., Harrer, A., Hämäläinen, R., Häkkinen, P., & Fischer, F. (2007). Specifying computer-supported collaboration scripts. Computer-Supported Collaborative Learning, 2, 211–224.

    Article  Google Scholar 

  • Kolodner, J. L., Owensby, J. N., & Guzdial, M. (2004). Case-based learning aids. In D. H. Jonassen (Ed.), Handbook of research for educational communications and technology (2nd ed., pp. 829–862). Mahwah: Lawrence Erlbaum Associates Inc.

    Google Scholar 

  • Linn, M. C., Bell, B., & Davis, E. A. (2004). Internet environments for science education. Mahwah: Lawrence Erlbaum Associates, Inc.

    Google Scholar 

  • Massachusetts Institute of Technology (MIT). (1999). Rubric for professional writing. Retrieved from http://tll.mit.edu/sites/default/files/examples/rubric-tll-writing.pdf

  • Munneke, L., Andriessen, J., Kanselaar, G., & Kirschner, P. (2007). Supporting interactive argumentation: influence of representational tools on discussing a wicked problem. Computers in Human Behavior, 23(3), 1072–1088.

    Article  Google Scholar 

  • Nussbaum, M., Alvarez, C., McFarlane, A., Gomez, F., Claro, S., & Radovic, D. (2009). Technology as small group face-to-face collaborative scaffolding. Computers & Education, 52(1), 147–153.

    Article  Google Scholar 

  • Reiser, B. (2004). Scaffolding complex learning: the mechanisms of structuring and problematizing student work. The Journal of the Learning Sciences, 13(3), 273–304.

    Article  Google Scholar 

  • Resta, P., & Laferrière, T. (2007). Technology in support of collaborative learning. Educational Psychology Review, 19(1), 65–83.

    Article  Google Scholar 

  • Raitman, R., Geelong, V., Augar, N., & Wanlei, Z. (2005). Employing Wikis for online collaboration in the E-learning environment: case study. In Proceedings of Information Technology and Applications, 2005. ICITA 2005. Third International Conference (Vol. 2, pp. 142–146).

    Google Scholar 

  • Ryan, D. P. (2008). Dimension of teamwork survey. Toronto: Regional Geriatric Program of Toronto, University of Toronto. Retrieved from http://rgp.toronto.on.ca/PDFfiles/Dteamsurvey.pdf.

    Google Scholar 

  • Serce, F. C., Swigger, K., Alpaslan, F. N., Brazile, R., Dafoulas, G., & Lopez, V. (2011). Online collaboration: collaborative behavior patterns and factors affecting globally distributed team performance. Computers in Human Behavior, 27(1), 490–503.

    Article  Google Scholar 

  • Shaffer, D. W., Hatfield, D., Svarovsky, G. N., Nash, P., Nulty, A., Bagley, E., Frank, K., Rupp, A. A., & Mislevy, R. (2009). Epistemic network analysis: a prototype for 21st-century assessment of learning. International Journal of Learning and Media, 1(2), 1–22.

    Article  Google Scholar 

  • Shotsberger, P. G. (2000). The human touch: synchronous communication in web-based learning. Educational Technology, 40(1), 53–56.

    Google Scholar 

  • Springer, L., Stanne, M. E., & Donovan, S. S. (1999). Effects of small-group learning on undergraduates in science, mathematics, engineering, and technology: a meta-analysis. Review of Educational Research, 69(1), 21–51.

    Article  Google Scholar 

  • Terenzini, P. T., Cabrera, A. F., Colbeck, C. L., Parente, J. N., & Bjorklund, S. A. (2001). Collaborative learning vs. lecture/discussion: students’ reported learning gains. Journal of Engineering Education, 90, 123–130.

    Article  Google Scholar 

  • Violet, S., & Mansfield, C. (2006). Group work at university: significance of personal goals in the strategies of students with positive and negative appraisals. Higher Education Research & Development, 25(4), 341–356.

    Article  Google Scholar 

  • Weinberger, A., Ertl, B., Fischer, F., & Mandl, H. (2005). Epistemic and social scripts in computer-supported collaborative learning. Instructional Science, 33(1), 1–30.

    Article  Google Scholar 

  • Wood, D., Bruner, J. S., & Ross, G. (1976). The role of tutoring in problem solving. Journal of Child Psychology and Psychiatry and Allied Disciplines, 17, 89–100.

    Article  Google Scholar 

  • Zahn, C., Krauskopf, K., Hesse, F. W., & Pea, R. (2012). How to improve collaborative learning with video tools in the classroom? Social vs. cognitive guidance for student teams. International Journal of Computer-Supported Collaborative Learning, 7(2), 259–284.

    Article  Google Scholar 

  • Pressley, M., Gaskins, I. W., Solic, K., & Collins, S. (2006). A portrait of benchmark school: How a school produces high achievement in students who previously failed. Journal of Educational Psychology, 98(2), 282.

  • Quintana, C., Reiser, B. J., Davis, E. A., Krajcik, J., Fretz, E., Duncan, R. G., & Soloway, E. (2004). A scaffolding design framework for software to support science inquiry. The journal of the learning sciences, 13(3), 337-386.

  • ABET (2016). Accreditation Policy and Procedure Manual (APPM), 2016 – 2017. Retrieved from http://www.abet.org/accreditation/accreditation-criteria/accreditation-policy-and-procedure-manual-appm-2016-2017/.

Download references

Acknowledgements

This work would not have been possible without the leadership and guidance of the late David H. Jonassen (Curators' Professor, University of Missouri), who conceptualized the initial design of the GDEC environment. The authors also wish to thank the National Science Foundation for its support of this work under grant NSF DUE #1044297.

Authors’ contributions

Author RMM carried out the study design, developed the in-class workshop, and analyzed the data. Author LS helped with the study design and analyzed the data. Authors CLT and NET contributed to the research methods and statistical analyses. Authors RMM and LS contributed equally and were responsible for the majority of the manuscript preparation. All authors read and approved the final manuscript.

Competing interests

The authors declare that they have no competing interests.

Author information

Corresponding author

Correspondence to Rose M. Marra.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article


Cite this article

Marra, R.M., Steege, L., Tsai, CL. et al. Beyond “group work”: an integrated approach to support collaboration in engineering education. IJ STEM Ed 3, 17 (2016). https://doi.org/10.1186/s40594-016-0050-3

