
Measuring supports from learning assistants that promote engagement in active learning: evaluating a novel social support instrument

Abstract

Background

Active learning supports student performance, but can be challenging to implement in large courses. The Learning Assistant (LA) Program is a growing intervention to support students in large active learning classrooms. This program places advanced undergraduates who have training in pedagogical methods in active learning classrooms to interact with and support students during in-class activities. LAs increase student performance, but the mechanism behind this effect is still unclear. Social support is a promising framework to help elucidate the types and extent of assistance LAs provide to students and begin exploring the "how" behind LAs' effectiveness. The aim of this study was to develop an instrument measuring undergraduate students' perceptions of the social supports for active learning available to them in the classroom. This instrument was based on both the broader social support literature and the literature on what factors encourage students to engage deeply in active learning. To provide initial evidence of validity, the instrument was completed in six sections of General Chemistry I at one R1 university. Exploratory and confirmatory factor analyses were applied to determine the internal structure of the instrument. Then the instrument's relationship to engagement in active learning was evaluated as another form of validity evidence.

Results

These analyses best supported a three-factor instrument that included five items representing supportive feedback provided during active learning (appraisal support), eight items representing emotional support during active learning, and six items representing the communication of norms and values related to active learning (informational support). All three factors were individually correlated with three measures of engagement. In regression analyses with all three factors measured together, only informational support predicted changes in two of the three measures of engagement.

Conclusions

This study supports the use of the Perception of Social Supports for Active Learning Instrument (PSSALI) to understand students' perceptions of the supports they are receiving to engage in active learning in chemistry courses. One implication of this work is that in order to increase engagement, learning assistants should clearly communicate the value of active learning and the classroom norm of active participation.

Introduction

Across the United States, the use of active learning in large college STEM courses has rapidly increased and now encompasses over 40 percent of courses (Stains et al. 2018). Recent meta-analyses demonstrate that, on average, the use of active learning methods increases undergraduate achievement over traditional lecture courses across the STEM disciplines (Freeman et al. 2014) and reduces performance disparities between students from marginalized groups and their peers (Theobald et al. 2020). One of the challenges of using active learning in large classes is the limited ability of one instructor to manage and support the engagement and learning of so many students at one time (Michael 2007; Mulryan-Kyne 2010).

One solution to this challenge of large class sizes is the Learning Assistant model. Learning assistants (LAs) are undergraduates who have previously taken the course that they are embedded in and are trained to facilitate in-class groupwork (Otero et al. 2010). LAs engage with students during in-class activities by providing clarifying instructions, asking guiding questions, and sharing timely feedback on student ideas and overall course success strategies (Otero 2015). The ratio of LAs to students is generally more favorable than that of instructors to students, so LAs are able to spend more time with individual students and develop relationships that can impact student engagement in active learning activities.

The learning assistant model has three core elements that differentiate it from other peer tutoring approaches: LAs are required to take a pedagogy seminar that presents effective teaching practices, they are embedded in regular class sessions, and they meet weekly with their faculty mentors. The pedagogy course introduces LAs to relevant educational practices for supporting students during active learning, including managing group dynamics, questioning techniques, and building rapport with students. The seminar content is reinforced through weekly reflection assignments. LAs also meet with the faculty member leading the course they are embedded in to review the in-class activities for the upcoming week. Thus, LAs combine their own disciplinary content knowledge with concepts learned in their pedagogy seminar to optimize student learning during in-class group discussions (Learning Assistant Alliance 2020).

Multiple positive impacts on student achievement have been documented in LA-supported active learning classrooms. For example, students with LAs perform better on concept inventories both within (Kohlmyer et al. 2009; Otero et al. 2010) and across institutions (White 2016) and are more likely to correctly answer higher-order exam questions (Sellami et al. 2017). In addition, performance gaps between students from marginalized groups and their peers are reduced when LAs are present (Van Dusen et al. 2016). LAs can also have lasting effects. For example, students who had LAs have higher odds of passing their subsequent STEM courses than those who did not (Alzen et al. 2018). Beyond performance, LAs also promote positive student attitudes about learning discipline-specific material (Brewe et al. 2013) and student satisfaction with the teaching of their courses (Talbot et al. 2015).

Understanding how LA-student interactions can lead to these positive outcomes is critical for helping optimize the LA program. Although LAs can offer assistance through "office hours" and "recitations," students report benefiting from LAs most during regular class time (Talbot et al. 2015). This suggests that one possible explanation for the positive impacts of LAs on student learning is their ability to engage frequently and extensively with students during active learning. Initial studies on the interactions of LAs during active learning demonstrate that LAs can change how students engage with in-class activities by both prompting students (Knight et al. 2015) and sharing feedback (Thompson et al. 2020) on their reasoning. LAs also encourage all students to contribute to the exchange of ideas (Chini et al. 2016). These descriptions of LA behaviors suggest that social support theory, which characterizes the types and extent of assistance provided by others, may be a promising approach for thinking about how LA-student interactions promote students' in-class engagement, and ultimately performance, in active learning courses.

In this paper, we develop and present initial validity evidence for an instrument that measures students’ perceptions of the social supports for active learning provided by LAs. The instrument developed here builds on existing questionnaires of social supports developed for use primarily with K-12 students. Factor analysis was applied to evaluate the internal structure of the instrument and resulting factors were tested to see if they were related to student engagement with in-class activities.

Theoretical framework: social supports

Social support refers to any aid and assistance provided to someone by others (Caplan 1974). Positive relationships are built around the provisioning of these supports (Turner et al. 2014). Critically for thinking about how LA behaviors could influence student engagement and performance, the provisioning of these supports improves a person’s ability and motivation to engage in the behaviors being supported (House 1971). Social support theory has been widely and successfully applied in mental health research and these supports are considered crucial for human well-being and health as they help buffer individuals against stress (Turner et al. 2014).

Social support theory has also been applied in education at both the K-12 and college levels. In the education context, the others providing supports (to students) that are commonly measured are teachers, peers, and parents. Studies at the K-12 level that simultaneously measure social supports from multiple sources have found that supports from each group have independent effects on student outcomes (Estell and Perdue 2013; Wentzel 1998). Thus, it is reasonable to focus on one source of social supports in a study even though the ecology of the classroom is composed of multiple sources of supports.

Much of the work on social supports in K-12 has focused on student academic engagement at the level of the school. Teacher social support has been shown to have positive effects on a range of outcomes including school compliance (Wang and Eccles 2012), social responsibility (Wentzel 1998), school identification (Wang and Eccles 2012), and increased value of learning (Midgley et al. 1989; Wang and Eccles 2012). An additional positive outcome of social supports from teachers is that they help students regulate anxiety and stress (Sarason et al. 1990). At the classroom level, engagement in class has been documented by multiple studies to be positively influenced by teacher social support (Bryson and Hand 2007; Goodenow 1993; Wentzel 1994, 1998, 2004), although there are also counterexamples (Estell and Perdue 2013). In addition, a related dimension, compliance with instructor expectations, is also positively related to teacher social supports (Garnefski and Diekstra 1996; Wentzel 1994).

Current research on social supports with undergraduates has focused on the holistic experience of college students as well as the mentor-mentee relationship in research experiences. Researchers have found that social supports from university personnel, peers, and parents collectively impact student adjustment and persistence (Espinoza 2011; Halamandaris and Power 1999; Holahan et al. 1995; Tao et al. 2000; Wilcox et al. 2005) and their mental health (Whiteman et al. 2013). Social supports from mentors during research experiences can influence professional identity development and persistence in research career pathways for both undergraduates and graduate students (Estrada et al. 2018; Hernandez et al. 2018; Robnett et al. 2019). Our understanding of the role of social supports in the college classroom is more limited: only a few studies have explored this context. Initial evidence suggests that social supports from instructors remain important for the classroom engagement of undergraduates and decrease students' perceptions of the workload of a class (Kember and Leung 2006; Xerri et al. 2018). There are no existing studies of the impact of social supports from LAs in the classroom.

Social supports can take many forms and the types of social supports experienced may have differential impacts on classroom engagement. Although the number and nomenclature can vary by researcher, four commonly recognized supports are emotional, appraisal, instrumental, and informational (Cohen and McKay 1984; Haley et al. 1987; House 1971; Semmer et al. 2008; Tardy 1985). Even within these categories of support, the context can change how the support is provided. Thus, the design of social support instruments and the way a support manifests will be context specific. Below, we provide a general definition of each social support and some initial evidence of what that social support could look like in the context of LAs supporting engagement in in-class activities.

Appraisal support is helping individuals understand and cope with stressful events through feedback (Cohen and McKay 1984). Wentzel (2004) provides a possible characterization of appraisal support in the classroom as acknowledging the effort students are putting toward achieving goals as well as creating a safe space for their engagement. Emphasizing effort over ability is also a best practice for promoting a mastery orientation in students, which is correlated with increased engagement (Meece et al. 1988). LAs could provide appraisal support to increase engagement by providing supportive feedback to students that acknowledges the effort they are putting toward the in-class activities. This might look like acknowledging their improvement and providing encouraging feedback.

Emotional support is the support that helps an individual recognize that they are valued and accepted regardless of any difficulties they may be experiencing (Cohen and Wills 1985). Teachers might provide social support by demonstrating to students that they are valued and accepted as they strive toward achieving goals, even when they make mistakes. This dimension of social support is probably one of the most studied. Researchers have found that the degree to which students feel a teacher likes them (Goodenow 1993; Midgley et al. 1989) and has positive expectations for all students (Patrick et al. 2001) impacts engagement. In addition, teachers that students perceive as friendly promote a mastery orientation (Smart 2014) which is correlated with increased engagement (Handelsman et al. 2005). LAs could provide emotional support during active learning by being friendly with students, showing they like and value them. This support may be especially important after a student makes mistakes during in-class activities.

Informational support is providing information and advice (House 1971). In the classroom, this could look like teachers helping students identify the goals and norms of the class that they should focus on (Wentzel 2004). Norms for a course that are predicted to increase engagement in active learning include that mistakes are not something to be embarrassed about (Ames 1992), that putting effort into learning can lead to improvement (Meece 1991), that the class is focused on understanding rather than performance (Turner et al. 2002), and that learning requires being active (Patrick et al. 2001). LAs could share these types of messages with students as they talk with them during in-class activities.

Finally, instrumental support is providing aid, material resources, or needed services (Cohen and McKay 1984). During active learning aid might look like LAs clarifying task instructions, encouraging students to challenge themselves, and answering questions or helping with points of confusion with the content (Wentzel 2004).

Existing measures of social supports and the need for a new instrument to measure social support for active learning

Several existing measures describe social supports in educational contexts (Dubow and Ullman 1989; Eccles and Barber 1993; Harter 1985; Johnson et al. 1985; Malecki and Demaray 2002; Malecki and Elliott 1999; Nolten 1995; Reid et al. 1989; Robnett et al. 2019; see Table 1). These measures were primarily developed to work with elementary and middle school children, so their application to young adults in college contexts is limited. For example, some of the support items reference interactions that would rarely happen in the college classroom because of its scale or the students' age ("If upset, the children will seek comfort from me" or "When I praise the children, they beam with pride"). In the one social support scale developed for use with college students (Robnett et al. 2019), the educational context differs (undergraduate research vs. classroom interactions) such that many of the items are inappropriate.

Table 1 Characteristics of existing social support instruments for use in education

In addition, many of the existing instruments focus on who is providing social supports rather than what supports are being provided. On these surveys, each item had to be completed multiple times (i.e., once for each source, which usually included teachers, peers, and parents), so researchers chose to keep the number of items per type of support low (often 2-3 items per type). Thus, instruments focused on sources of support either measure social supports globally or measure only a subset of the four types. Yet some studies have shown that different social supports impact different aspects of students' behaviors and affect. For example, Murray et al. (2016), using version 2 of the Child and Adolescent Social Support Scale (CASSS), found that appraisal support alone influenced instances of school conduct problems, whereas informational support alone influenced school satisfaction. This suggests that a global measure of social supports, while still predictive, may mask the value of particular social supports for a given outcome. In addition, measuring the different types of social supports separately is important because each is enacted differently in the classroom and, thus, has different implications for teaching practices. Even in the current instruments that separate out all four forms of social supports (Table 1), each support is represented by only a small number of items (2-3), which can limit the content validity of the factor.

Finally, the type of social support we were measuring was for a specific aspect of the classroom experience, active learning, rather than the global experience. This made existing informational items particularly challenging to apply. Informational support is often defined as providing information and advice about content and can be measured by items such as “My teacher explains things to me that I don’t understand” (Reid et al. 1989). In our case, we focused on the communication of information on the value of active learning and of classroom norms that support engagement.

Given these challenges with existing instruments, we have attempted to create a novel social support instrument that measures four common types of social support separately and focuses on supporting undergraduates to engage in active learning. In this paper, we present initial validity evidence for the Perception of Social Supports for Active Learning Instrument (PSSALI).

Methods

Overview of types of validity evidence explored

There are many types of validity evidence that can be considered when developing an instrument (American Educational Research Association et al. 2014). In this article, we present, to varying degrees, four types of validity evidence. In the first half of the "Methods" section, we present initial validity evidence based on content and on response process. Evidence based on content concerns the alignment between the content of the instrument (the items) and the constructs they are intended to measure (the different social supports). Evidence based on response process represents information on how respondents answer the instrument items; preliminary evidence for this dimension comes from think-aloud interviews with students and discussions with undergraduate researchers engaged in the project. In the second half of the "Methods" and in the "Results" sections, we present more extensive arguments for evidence based on internal structure and on relations to other variables. Evidence based on internal structure is the analysis of the relationships between items and how they relate to their intended constructs; it is evaluated here through factor analyses. Finally, evidence based on relations to other variables is evaluated by determining whether the predicted relationships between the constructs represented by the survey and outside variables are found using the survey items. We specifically examine the relationship between social supports for active learning and students' self-reported engagement in in-class activities.

Instrument development and validity evidence based on content

Our instrument aims to measure social supports that contribute to student engagement in in-class activities in large STEM classrooms (Perception of Social Supports for Active Learning Instrument, PSSALI). The instrument is a self-report survey of the availability of social supports from learning assistants to students. To evaluate social supports, the survey measured four types: appraisal, emotional, informational, and instrumental.

The PSSALI was assembled by writing 22 new items and adapting 19 items from existing social support, teacher-student relationship, and classroom culture/climate instruments. The new items were inspired by concepts in existing instruments, descriptions of social supports in classrooms, and conversations with undergraduate students and learning assistants about what these supports could look like in large college classes. The informal conversations with undergraduates included brainstorming sessions between S. Eddy and the two undergraduate co-authors (one of whom, G. Jacomino, was an LA and one of whom, D. Hernandez, had been a student in LA-supported courses), as well as conversations the undergraduate co-authors had with several of their peers. The initial instrument included 9 appraisal items, 13 emotional items, 9 informational items, and 10 instrumental items. A six-point Likert scale (strongly disagree [1], disagree [2], somewhat disagree [3], somewhat agree [4], agree [5], strongly agree [6]) was used to rate these items. We also provided students the option to choose "prefer not to respond" on each item.

Appraisal support items

Appraisal support items focused on supportive feedback and feedback that acknowledged student effort (Table 2). One item was adapted from an instrument measuring classroom climate (Midgley et al. 2000). The remaining 8 items were developed.

Table 2 Original set of Social Supports for Active Learning Instrument (SSAL) items. The prompt was: "Please consider your experiences with your learning assistant(s), LAs, in the lecture portion of your General Chemistry 1 course and answer how much you agree with the statements below." Each item started with "The learning assistant(s)…" Mean values and standard deviations from general chemistry students are given. The original source for each item is indicated in the last column. D, developed for this survey

Emotional support items

Emotional support items focused on whether students felt a learning assistant cared about them, respected them, and believed in them (Table 2). Seven items were adapted from existing social support instruments (Johnson et al. 1985; Malecki and Demaray 2002; Robnett et al. 2019) and one item from an instructor-student relationship instrument (Pianta 2001). Five new items were developed.

Informational support items

Informational support items focused on conveying norms and information that would influence engagement such as encouraging students to assume a mastery orientation and conveying that learning occurs when students are active (Table 2). Six items were adapted from a classroom climate instrument that measured instructor impact on that climate (Midgley et al. 2000) and three items were developed.

Instrumental support items

Instrumental items focused on helping students gain skills and providing assistance (Table 2). Four items were adapted from existing social support instruments (Malecki and Demaray 2002; Robnett et al. 2019) and six items were developed.

Think-alouds and evidence based on response processes

To explore whether students would interpret items in the way we intended and to gather initial validity evidence based on response process, we conducted six think-aloud interviews with STEM undergraduates at a large Hispanic-serving R1 university. Students were recruited who had previously taken General Chemistry I with LAs; they were predominately Hispanic. In the think-aloud interviews, students described their reasoning for answering each question the way they did, providing insight into how they interpreted each question. Given the length of the instrument, each student only worked through two of the four factors, so three students provided feedback on each scale. Although a small sample, it should be sufficient to identify severe problems with any items (Virzi 1992). Based on the think-alouds, one item from the instrumental construct was edited further because of interpretation challenges. The students in the think-alouds all independently suggested the same change (which was adopted), but the item was not re-interviewed after the change.

Participants, procedures, and course context

The questionnaire was distributed in Fall 2018 to undergraduates in six General Chemistry I sections at a large southern Hispanic-serving R1 university (n = 827). The instrument was administered through an online survey platform 4 weeks before the end of the semester to ensure students had had sufficient interactions with LAs to respond to the questions. Students completed the PSSALI outside of class and received a small amount of extra credit for their participation. In total, 691 students at least partially completed the survey. Demographic information was not collected for these sections, but the overall demographic breakdown of the university is found in Table 3.

Table 3 Demographics of undergraduate population at the university

General Chemistry I is a team-taught course, meaning there is a common curriculum (including textbooks and worksheets) and shared exams across the sections. The instructional team was composed of three instructors who each taught one section and one instructor who taught the remaining three sections. The four instructors met weekly to discuss the next week's classes and the in-class activities to be employed. Students worked together in instructor-assigned permanent groups of 3-4 people on a variety of activities and problems. Based on instructor self-report, class time was evenly divided between lecturing by the professors and group activities. Each class day, the professor would introduce a topic and then students would complete activities for that topic. During these activities, students attempted to answer questions and could seek help from the LAs if needed or receive LA feedback on their initial answers. Each LA was responsible for 9-12 groups of students (27-48 students total). After class, the LAs provided feedback to the professor on student progress and difficulties. This cycle of interactions was present within all studied sections.

LA preparation for these six sections was similar. All LAs had previously taken General Chemistry I and experienced the same active learning setting they would now contribute to. Thus, they had experience with the class content and understood its challenges from a student's perspective. Before the semester began, all LAs were interviewed and given clear expectations regarding their responsibilities by the instructor. All LAs had also taken or were currently enrolled in an LA training seminar. This seminar met once a week and covered a range of topics including techniques for stimulating discussion, encouraging students to adopt a growth mindset, helping students engage in metacognition, creating a positive classroom climate, and promoting equity in the classroom. In addition to the seminar, LAs met weekly with the instructor leading their particular section. In these meetings, LAs received practice in mock class sessions and guidance from more experienced LAs.

Factor analyses and evidence based on internal structure

Data analysis was run in R version 3.5.1 (R Core Team 2020). First, descriptive statistics were examined, and then factor analysis was applied to determine the number of subscales present in the instrument. Because we heavily modified items and drew from multiple sources, we treated this instrument as if it were new, applying first an exploratory factor analysis (EFA) on half the sample (n = 346) and then a confirmatory factor analysis (CFA) on the other half (n = 345). The EFA was run using the R package psych (Revelle 2018) and the CFA with the R package lavaan (Rosseel 2012). Splitting the sample and running the EFA with a different set of students than the CFA is considered best practice in measurement development (Bandalos and Finney 2010).
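
As a rough sketch of this split-sample workflow (the data frame pssali, the seed, and all object names below are illustrative, not taken from the study), the split could be implemented as:

```r
# Hypothetical data frame 'pssali': one row per respondent, one column per item
library(psych)   # EFA: fa(), fa.parallel()
library(lavaan)  # CFA: cfa(), fitMeasures()

set.seed(1)                                    # make the random split reproducible
efa_rows <- sample(nrow(pssali), nrow(pssali) %/% 2)
efa_half <- pssali[efa_rows, ]                 # half of the sample for the EFA
cfa_half <- pssali[-efa_rows, ]                # the other half for the CFA
```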

Exploratory factor analysis

Due to preliminary data exploration results, a weighted least squares (WLS) estimator was chosen to extract the variance from the data. We hypothesized correlations between the sub-scales of the instrument (appraisal, emotional, instrumental, and informational supports), so an oblique rotation (oblimin) was applied. To identify the number of factors to retain, we considered visual inspection of a scree plot, parallel analysis, the fit of candidate factor solutions, and theory. The total variance explained, communalities, pattern coefficients, and factor correlations were used to evaluate the fit of the data to the model and the fit of each item to each sub-scale. A pattern coefficient > 0.5 on the theorized sub-scale was considered sufficient for retention of an item, and a pattern coefficient > 0.25 for the same item on any other subscale was considered problematic.

The total sample size for the EFA was 346, which is sufficient for performing factor analysis when the number of items per factor and the item correlations are high (Gagne and Hancock 2006; Wolf et al. 2013).
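
To illustrate the retention and extraction steps described above, a minimal psych sketch might look like the following (assuming the hypothetical efa_half data frame from the previous sketch; the four-factor case is shown):

```r
library(psych)

# Scree plot and parallel analysis to suggest how many factors to retain
fa.parallel(efa_half, fm = "wls", fa = "fa")

# WLS extraction with an oblique (oblimin) rotation
efa_fit <- fa(efa_half, nfactors = 4, fm = "wls", rotate = "oblimin")

# Apply the retention rules: a pattern coefficient > 0.5 on the theorized
# factor is sufficient; > 0.25 on any other factor flags a crossloader
print(efa_fit$loadings, cutoff = 0.25)
efa_fit$communality   # communality of each item
efa_fit$Phi           # correlations between the rotated factors
```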

Confirmatory factor analysis

To confirm the results of the EFA, CFA was applied to the second half of the sample (n = 345). A weighted least square mean and variance adjusted (WLSMV) estimator was used to extract the variance from the data. Like the WLS in the EFA, this estimator treats the data as ordinal rather than continuous. Multiple fit indices were used to evaluate model fit (chi-squared value; Tucker-Lewis fit index, TLI; root-mean-squared error of approximation, RMSEA). We followed Hu and Bentler's (1999) recommendations to evaluate the adequacy of model fit: TLI > 0.95 and RMSEA < 0.06. Coefficient α was computed based on model results and used to assess reliability (Gignac 2009); coefficient α values > 0.70 were considered acceptable.
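
A minimal lavaan sketch of this step follows, again assuming the hypothetical cfa_half data frame; the one-factor model string and item names are placeholders (the full item sets appear in the Results). The article computed α from model results (Gignac 2009); psych::alpha() on the raw responses is shown here only as a common approximation:

```r
library(lavaan)

# Placeholder specification: one latent factor measured by five items
model <- 'support =~ A4 + A5 + A6 + A8 + A9'
items <- c("A4", "A5", "A6", "A8", "A9")

fit <- cfa(model, data = cfa_half[, items],
           estimator = "WLSMV",   # treats item responses as ordinal
           ordered   = items)

# Compare against Hu and Bentler's (1999) guidelines:
# TLI > 0.95 and RMSEA < 0.06
fitMeasures(fit, c("chisq.scaled", "df.scaled", "pvalue.scaled",
                   "tli.scaled", "rmsea.scaled", "srmr"))

# Raw-score reliability for the same item set (values > 0.70 acceptable)
psych::alpha(cfa_half[, items])
```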

Validity evidence based on relationships to other variables: testing relationship to in-class engagement

The final type of validity evidence we collected was evidence based on relationships to other variables. The literature on LAs suggests that they have their largest impact through their in-class interactions with students (Talbot et al. 2015), which implies that LAs influence how students engage in classroom activities (Chini et al. 2016; Knight et al. 2015; Thompson et al. 2020). Several studies have specifically documented that LA prompting can lead to increased sharing of logic and increased participation. In addition, from the literature on social supports, we know that when individuals feel supported they are more likely to engage in the supported behavior (House 1971). Our survey focused on supports for active learning, so we predicted that students would engage more fully in active learning if they experienced greater informational, instrumental, appraisal, and emotional support from LAs.

In-class engagement was measured using three constructs from the Formative Assessment Buy-in and Utilization Survey (FABUS) developed by Brazeal and Couch (Brazeal et al. 2016, 2018, 2019; Brazeal and Couch 2016). This survey gauges how students perceive and interact with in-class formative assessments. The three FABUS constructs we used were student buy-in to active learning (4 items), surface approach (4 items), and deep approach (4 items). The Buy-in scale measured students' perceptions of the value of in-class activities for their learning (example item: "The in-class activities help improve my learning in this course."). The deep and surface approach scales focus on how and with what goals students engage with the in-class activities. Students using deep approaches seek to gain conceptual understanding (Davidson 2003; Elias 2005), whereas students using surface approaches give less effort, tend to resort to memorization, and exhibit a lack of reflection (Baeten et al. 2010). An example item from the deep approach scale is: "When completing the in-class activities, I try to work on them until I have a better understanding." An example item from the surface approach scale is: "I complete the in-class activities without really understanding the concepts." For these three measures of engagement, we specifically predicted that students who experienced higher levels of social supports for active learning from LAs would exhibit greater buy-in to active learning and would employ deep engagement strategies. We predicted social supports would be inversely related to employing surface-level strategies.

A preliminary CFA of the FABUS did not demonstrate good model fit (TLI = 0.886, RMSEA = 0.092 [90% CI: 0.084-0.10], and SRMR = 0.097), so the sample was split in half and an EFA was run. The EFA identified one item on the Buy-in scale that crossloaded strongly, so we removed that item. A second CFA was run on this new structure with the other half of the sample, and reasonable model fit was achieved (TLI = 0.95, RMSEA = 0.073 [90% CI: 0.056-0.09], SRMR = 0.049).

To test whether there were relationships between the social support measures and in-class engagement, we first used the entire data set to examine correlations between all of our variables. Our EFA and CFA results suggested we could create a single measure for each social support and engagement outcome by averaging a student's responses on the items representing each construct. Because our social supports had skewed distributions, we used the Kendall rank correlation coefficient.
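
As a sketch (the data frame dat and its column names are hypothetical), each construct score is the row mean of that construct's retained items, and Kendall's tau is then computed pairwise:

```r
# Construct scores as row means of each student's retained items
dat$appraisal <- rowMeans(dat[, c("A4", "A5", "A6", "A8", "A9")], na.rm = TRUE)
dat$buyin     <- rowMeans(dat[, c("BI1", "BI2", "BI3")], na.rm = TRUE)

# Kendall's rank correlation is appropriate for the skewed support scores
cor.test(dat$appraisal, dat$buyin, method = "kendall")
```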

To establish which social support types were important for predicting engagement in active learning (controlling for the other types), we ran regressions with the entire dataset. Two of the engagement measures (deep engagement and buy-in) were highly left skewed, so we employed Tobit regressions to correct for the ceiling effect observed (Theobald et al. 2019). Because the study sample came from the classrooms of four different instructors, we included instructor in the model. We ran a model with all the social supports together as predictors (as well as a variable for instructor). In addition, because of the substantial correlations between the different measures of social supports, we also ran each support individually as a predictor of each type of engagement. Tobit regressions were implemented in R with the censReg package (Henningsen 2020).
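
One such model might be specified with censReg as below (continuing the hypothetical dat data frame; right-censoring at 6, the scale maximum, handles the ceiling effect):

```r
library(censReg)

# Tobit regression of buy-in on all three supports, with instructor
# included as a categorical control; right-censored at the scale maximum
tobit_fit <- censReg(buyin ~ appraisal + emotional + informational +
                       factor(instructor),
                     left = -Inf, right = 6, data = dat)
summary(tobit_fit)
```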

Results

Descriptive statistics

Within the sample used for the EFA, all social support items had means between 4.90 and 5.53, and standard deviations ranged from 0.81 to 1.25. The majority of items exhibited univariate skewness > 1.5, with four items above 2. Kurtosis ranged from 1.2 to 6.74. Mardia's test of multivariate normality and Royston's revision of the Shapiro-Wilk W test (a goodness-of-fit multivariate extension suitable for small sample sizes) both indicated a lack of multivariate normality. In addition, 52 outliers were identified using Mahalanobis distance (p < 0.001). These cases were examined; all were instances of individuals reporting low values for an item, and we found no justification for removing any of them. Across all items, about 7% of the data were missing. This low rate implies that the estimation method for the missing data would not strongly influence our results, so EFAs were run using the median value of an item to replace any missing values in the data set (Tabachnick and Fidell 2013).
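
These checks could be reproduced along the following lines (a sketch using base R and psych, continuing the hypothetical efa_half data frame from the Methods sketches):

```r
library(psych)

describe(efa_half)   # per-item means, SDs, skewness, and kurtosis
mardia(efa_half)     # Mardia's test of multivariate normality

# Mahalanobis distances to flag multivariate outliers at p < 0.001
mu <- colMeans(efa_half, na.rm = TRUE)
S  <- cov(efa_half, use = "pairwise.complete.obs")
d2 <- mahalanobis(efa_half, mu, S)
outliers <- which(pchisq(d2, df = ncol(efa_half), lower.tail = FALSE) < 0.001)

# Median imputation of the ~7% missing responses before running the EFAs
efa_half[] <- lapply(efa_half, function(x) {
  x[is.na(x)] <- median(x, na.rm = TRUE)
  x
})
```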

Given that our data demonstrated multivariate skewness and had a large number of outliers, we chose to use a weighted least squares estimator rather than the traditional maximum likelihood estimator. Maximum likelihood estimators are sensitive to both skewness and outliers, whereas weighted least squares estimators are more robust (Zygmont and Smith 2014).

Factor analyses and evidence based on internal structure

Scree plot and parallel analysis revealed the presence of 1-4 factors. Thus, initial EFAs including all items were tested. The total variance explained by each EFA was similar (1 factor, 57%; 2 factors, 61%; 3 factors, 64%; 4 factors, 66%).

First round of EFAs

The correlation matrix and initial EFAs revealed high correlations between items both on the same sub-scale and across sub-scales (Supp. Table 1). Crossloading was common in the multiple-factor EFAs. For the 1-factor solution, all items loaded with pattern coefficients > 0.64. The two-factor solution had little support, as all items loaded primarily on the first of the two factors (Supp. Table 1).

With the 3-factor solution, we began to see resolution of the different subscales (Supp. Table 1). Emotional support items loaded onto the first factor (with three items cross-loading with the second factor) and informational support items loaded on the second factor (with 1 item crossloading with the first factor and 1 item with poor fit on any factor). Appraisal support items loaded most strongly on the third factor, but demonstrated strong cross-loading across all three sub-scales and low fit in general (pattern coefficients on third factor < 0.55). Instrumental support items did not demonstrate cohesion, crossloading across factors 1 and 2 with some items loading primarily on factor 1 and others primarily on factor 2.

The 4-factor solution demonstrated the most promise for separating the four theoretical sub-scales (Supp. Table 1). Emotional support items loaded on factor 1, with three items crossloading with other factors. Informational support items loaded primarily on the second factor, although one item demonstrated cross-loading. Appraisal support items loaded primarily on the third factor, although three items still had pattern coefficients < 0.5 on this factor. Instrumental support items still did not demonstrate cohesion, with items loading on the first, second, and fourth factors. These items crossloaded strongly, and the majority did not load well onto any factor, with communalities ranging from 0.16 to 0.35.

Given these results, we chose to pursue two options: a 1-factor solution with all the items and the 4-factor solution, which discriminated the most among the four sub-scales.

Second round of EFAs: working with the four-factor solution

We began by removing problematic items from the three sub-scales that already demonstrated reasonable resolution, in the hope that this would reduce crossloading of the items related to instrumental support. We first removed items with pattern coefficients < 0.5 on all factors for these three sub-scales. Then we removed items that showed pattern coefficients > 0.5 on one factor but also had pattern coefficients > 0.25 on another factor for these three subscales. Iterative pruning of items achieved clear factors for appraisal, emotional, and informational support, but instrumental support continued to have low pattern coefficients on its focal factor and to crossload strongly across the other factors, with communalities for items ranging from 0.16 to 0.28.

We concluded that the items we thought made up instrumental support did not form their own factor; instead, they were influenced by the three other sub-scales of social supports. Given that the items had low pattern coefficients even in the three-factor model from round 1, we made the decision to drop instrumental support and continue working with the items for emotional, appraisal, and informational supports only, using a three-factor solution.

Third round of EFAs: three-factor solution with emotional, appraisal, and informational supports

This round of EFAs began again with all the initial items for the three sub-scales. This round had a dual purpose: reducing the number of items to make the survey more user-friendly and creating a clean factor structure. We began dropping items in an iterative fashion. The emotional support subscale had the most items, so we began there. All items had pattern coefficients > 0.5 on the focal factor, but three also had pattern coefficients ≥ 0.2 on another factor. We removed these three items (E1, E4, and E13). In addition, two pairs of items were determined to be redundant (E7 and E8, as well as E6 and E12). The item from each pair with the lower pattern coefficient on the focal factor was removed (E6 and E8). The final emotional support scale had eight items (E2, E3, E5, E7, E9, E10, E11, and E12). Cronbach's alpha for the final scale was 0.93.

Next, we focused on the informational support scale. Item If1 had a low pattern coefficient (< 0.5) on the focal factor and was thus removed. Items If2 and If5 both had pattern coefficients > 0.2 on a second factor and were removed. The final informational support scale had six items (If3, If4, If6, If7, If8, and If9). Cronbach's alpha for the final scale was 0.92.

Appraisal support had two items (A1 and A2) with pattern coefficients < 0.5 on the focal factor, and these were removed. In addition, two pairs of items (A3 and A4, and A6 and A7) were considered redundant, so the item from each pair with the lower pattern coefficient on the focal factor was removed (A3 and A7). The final appraisal support scale had five items (A4, A5, A6, A8, and A9; Supp. Table 2). Cronbach's alpha for the final scale was 0.90.

We ran the final 3-factor model (Supp. Table 2) and the total variance explained was 67%.

Fourth set of EFAs: refining the one-factor solution

We ran an EFA on the reduced set of items identified through the refinement of the three-factor solution, adding back in the instrumental items, to further explore a one-factor model for social supports. All items had pattern coefficients greater than 0.64. We then removed two items originally on the instrumental scale (Is3 and Is4) that were deemed redundant with item A4 on the appraisal scale. The final 1-factor model included items A4, A5, A6, A8, A9, E2, E3, E5, E7, E9, E10, E11, E12, Is1, Is2, Is5, Is6, Is7, Is8, Is9, Is10, If3, If4, If6, If7, If8, and If9 (Supp. Table 2). Cronbach's alpha for the global social supports scale was 0.97.

The total variance explained by the final 1-factor model was 57%.

Confirmatory factor analysis: one-factor solution and three-factor solution

To confirm the factor structures suggested by the EFAs, two CFA models were specified. First, a one-factor CFA model was tested that included items A4, A5, A6, A8, A9, E2, E3, E5, E7, E9, E10, E11, E12, Is1, Is2, Is5, Is6, Is7, Is8, Is9, Is10, If3, If4, If6, If7, If8, and If9. The one-factor model did not demonstrate model fit that met our guidelines (χ2 = 72.01, df = 22.8, p < 0.001; TLI = 0.91, RMSEA = 0.089 [CI: 0.066-0.114], and SRMR = 0.055). The modification indices suggested that dropping items A7 and Is2 could increase model fit. With this change, we met our guideline for the RMSEA (0.08 [CI: 0.055-0.108]) but remained below our guideline for the TLI (0.93). The standardized factor loadings were above 0.7 for all items, indicating that for most items > 50% of the variance in the item was explained by the theorized factor.

The second solution we tested was the 3-factor model, in which items A4, A5, A6, A8, and A9 represented appraisal support; E2, E3, E5, E7, E9, E10, E11, and E12 represented emotional support; and If3, If4, If6, If7, If8, and If9 represented informational support (Fig. 1). Correlations between the factors were allowed. The 3-factor solution demonstrated good model fit (χ2 = 35.8, df = 22.2, p = 0.034; TLI = 0.98, RMSEA = 0.05 [CI: 0.00-0.095], and SRMR = 0.032). The standardized factor loadings were above 0.7 for all items, indicating that for most items > 50% of the variance was explained by the theorized factors. The correlation between appraisal and emotional support was 0.56, and between appraisal and informational support it was 0.49. The lowest correlation between the latent factors was between emotional and informational support, at 0.42.
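
In lavaan syntax, this final model could be specified as follows (a sketch; with cfa(), correlations between the three latent factors are estimated by default, matching the model described above):

```r
model_3f <- '
  appraisal     =~ A4 + A5 + A6 + A8 + A9
  emotional     =~ E2 + E3 + E5 + E7 + E9 + E10 + E11 + E12
  informational =~ If3 + If4 + If6 + If7 + If8 + If9
'
items_3f <- c("A4", "A5", "A6", "A8", "A9",
              "E2", "E3", "E5", "E7", "E9", "E10", "E11", "E12",
              "If3", "If4", "If6", "If7", "If8", "If9")

fit_3f <- cfa(model_3f, data = cfa_half[, items_3f],
              estimator = "WLSMV", ordered = items_3f)

# Standardized loadings and latent factor correlations
standardizedSolution(fit_3f)
fitMeasures(fit_3f, c("tli.scaled", "rmsea.scaled", "srmr"))
```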

Fig. 1 Final three-factor CFA model. Instrument items (for the text of each item, see Table 2) are represented by squares and factors by ovals. The numbers on the double-headed arrows between the factors are the correlations between each pair of factors. The numbers next to the single-headed arrows are standardized factor loadings. Small arrows indicate error terms. p < 0.001 for all estimates

We had the best support for the three-factor solution, so we continued with that solution.

Testing relationships to other variables: can the Perception of Social Supports for Active Learning Instrument predict engagement in active learning?

Descriptive statistics and correlations

Students perceived high levels of social support from LAs (Fig. 2). The highest level was for emotional support (median: 5.5), but the other two were not much lower (appraisal support median: 5.2; informational support median: 5.3).

Fig. 2 Distribution of students' perceptions of the appraisal, emotional, and informational social supports they receive from LAs. The dotted line represents the median response value. Each item was scored on a 6-point Likert scale (strongly disagree [1], disagree [2], somewhat disagree [3], somewhat agree [4], agree [5], strongly agree [6])

All three social support scales were significantly correlated with each other (paralleling results from the EFA) and the three measures of engagement (Table 4). These correlations with engagement ranged from weak to moderate (Dancey and Reidy 2007).

Table 4 Kendall rank coefficient correlations between appraisal, emotional, and informational support and three measures of engagement: buy-in, surface engagement, and deep engagement

Regressions

For each of the three engagement variables, we ran a total of four models. First, each social support was run individually and then all were run together in a final model.

Buy-in

All three types of social supports significantly (at the α = 0.05 level) predicted buy-in when run in individual Tobit regression models (Supp. Table 3). After accounting for variation based on instructor, a one-point increase in emotional (β = 0.74 ± 0.079) or informational (β = 0.74 ± 0.072) support increased student buy-in by almost three-quarters of a point on the Likert scale ranging from 1 to 6. Appraisal support increased buy-in by a slightly smaller amount (β = 0.59 ± 0.062). When the supports were combined in the same regression model, only informational support was a significant predictor of buy-in (Supp. Table 3). As students' perceptions of receiving informational support increased (β = 0.68 ± 0.123), so did their buy-in to active learning. In addition, in all models, buy-in varied between instructors (Supp. Table 3).

Surface engagement

None of the three social supports significantly predicted surface engagement when run in individual Tobit regressions (appraisal, β = −0.04 ± 0.055; emotional, β = −0.10 ± 0.067; informational, β = −0.03 ± 0.064; Supp. Table 4). These results held true when all the social supports were combined in a single model (Supp. Table 4). There were differences in surface engagement between instructors in all models (Supp. Table 4).

Deep engagement

All three types of social supports significantly predicted deep strategies of engagement when run in individual Tobit regression models (Supp. Table 5). After accounting for variation based on instructor, both emotional (β = 0.45 ± 0.055) and informational (β = 0.44 ± 0.052) support increased deep engagement by almost half a point on the Likert scale ranging from 1 to 6. Appraisal support increased deep engagement by a slightly smaller amount (β = 0.35 ± 0.242). When the supports were combined in the same regression, only informational support was significant. As students' perceptions of receiving informational support increased (β = 0.32 ± 0.089), so did their deep engagement in active learning. Deep engagement also varied by instructor (Supp. Table 5).

Discussion

LAs improve student performance (Kohlmyer et al. 2009; Otero et al. 2010; White 2016), but the mechanisms by which they do this remain unclear. We hypothesize that LAs' in-class interactions with students change how students engage with active learning. As a first step toward exploring this hypothesis, we developed an instrument (the Perception of Social Supports for Active Learning Instrument, PSSALI) to identify the types and extent of supports LAs provide to students in active learning classrooms and collected initial validity evidence supporting the use of this instrument with LAs.

Instrument quality

Social supports are often measured globally in education studies (see Table 1), but for our instrument we found stronger support for a three-factor model that separated appraisal, emotional, and informational support. CFA on our data supported the use of the three-factor model over a global one-factor model. This difference is likely due to our intended use: we focus on one classroom relationship (LA-student), and thus differences between potential sources of social supports (teachers, parents, LAs, etc.) did not overwhelm differences between the types of social supports, as occurred with other instruments (Eccles and Barber 1993; Harter 1985; Johnson et al. 1985; Malecki and Demaray 2002; Malecki and Elliott 1999; Nolten 1995).

The final instrument included five appraisal, eight emotional, and six informational support items. The five appraisal items focus on supportive feedback that emphasizes effort and students' ability to improve. The eight emotional items focus on the LA caring, being empathetic, and supporting students in their struggles. The six informational support items focus on communicating norms of the class, including a focus on engagement in in-class activities, the message that mistakes are OK and natural, and the idea that the purpose of engagement is to understand the material, not just complete the activity.

Overall, using the 3-factor solution of the PSSALI, we found that students in general chemistry classes perceived strong social supports from their LAs. As a result, responses clustered at the high end of the scale: mean values for the retained items ranged from 4.9 to 5.5 on a scale from 1 to 6. For the final appraisal factor, the mean item score was 5.3; for emotional it was 5.3; and for informational it was 5.2 out of 6. This skew and limited variation may have impacted our ability to correlate these scales with the engagement measures. In the future, this skew could be addressed by using a positively packed response scale that has more positively worded response options than negative ones (Brown 2004; Brown et al. 2017). Alternatively, items could be reworded to make them harder for a student to agree with.

One challenge we could not resolve was identifying an instrumental support scale. This fourth form of social support focuses on providing aid and services. In the classroom context, LAs do not provide this support in a material way, but rather through answering questions or helping students who are struggling. Thus, it may be unsurprising that in this context the instrumental items crossloaded with other forms of support related to feedback and providing information. Dropping the instrumental support items from our instrument limits the scope of social supports measured relative to existing, more global social support measures, but provides finer resolution for the supports we are able to include. This resolution may be important because different social supports can influence different student outcomes, and the relative importance of a given social support for an outcome can vary over time (Murray et al. 2016).

Evidence of the relationship between engagement and social supports

In the second half of this study, we demonstrated preliminary evidence that the three social supports measured by the PSSALI were each related to measures of in-class engagement. When we ran each type of social support independently, we found all three types were positively related to deep engagement and buy-in in both the correlation analyses and the Tobit regressions, as we predicted. However, in the Tobit model controlling for each other type of social support (i.e., running all three supports in the same model), we found that informational support alone influenced two of the three measures of engagement. These conflicting results could be explained by the moderate correlations between the three supports. One assumption of regression is that the predictor variables are independent of each other, which is violated by these correlations. Because of this, we have chosen to focus on the regression results where the supports are run separately. Still, the strength of the influence of informational support may not be surprising, because students experiencing informational support are learning about norms and values related to active learning. Our instrument included informational support items focused on the value of engaging in active learning, which should directly influence buy-in. Informational support items also evaluated whether students received messages about the importance of engaging in in-class activities to understand the material. This message should encourage deep engagement.

The relationship between social supports and surface engagement was less clear. In the correlation analyses, we found the predicted negative relationship between each social support and surface engagement. However, this impact was small and it was not mirrored in the Tobit regressions.

Two prior studies with college students explored the provisioning of social supports in educational contexts and their impacts. Xerri et al. (2018) measured engagement of college students in their courses and social support. They took a more global approach to social supports and measured feedback, assistance, and the relationship between student and teacher all together as a single construct. They found higher responses on this combined scale were related to reduced student perceptions of workload in the class and, similar to our findings, increased engagement in the course. Another study on social supports with college students split out the different types of social supports and their impact on student self-efficacy in undergraduate research experiences (Robnett et al. 2019). The researchers found that emotional support from research mentors had a smaller impact on self-efficacy after 1 year of research than instrumental support. They did not measure informational or appraisal support. In their qualitative study of emergent themes, however, they did find evidence of informational support for research: being transparent about the challenge of each task and pointing out that undergraduate researchers will have to work hard to be successful were both positive mentor practices for students. Together with our study, these results suggest that social supports can have positive impacts on college-age students and should be further explored in these educational contexts.

Implications

The results of this work suggest that LAs embedded in courses provide more than just help with course content: students receive at least three types of social supports from LAs. This finding is, in and of itself, interesting, especially in light of the type of training LAs receive in their pedagogy course. The pedagogy course, which is one of the key elements differentiating the LA program from other types of peer learning, focuses on the cognitive and social aspects of learning and may already be training LAs to provide appraisal, emotional, and informational support (Otero et al. 2010). For example, the informational support scale includes items about engaging in active learning to understand and about promoting the idea that mistakes are OK. The cognitive aspect of the pedagogy course provides LAs with techniques to focus students on understanding rather than memorizing and to encourage students to adopt a growth mindset. A key piece of the growth mindset is normalizing that mistakes are part of learning and that one can get better with practice. So, it is possible that LAs are buying into these ideas in their pedagogy course and then sharing them with students in the classroom. Similarly, in the section of their course on the social aspects of learning, they learn about building rapport and positive relationships with students, which aligns with the emotional support scale. Collecting evidence of a link between the pedagogy course and LAs' implementation of appraisal, emotional, and informational support in the classroom could be used to focus and further optimize this pedagogy class. In addition, faculty working with LAs could be coached to model these behaviors and to reinforce the importance of LAs providing these supports, to enhance their spread in classrooms.

Limitations and recommendations for future study

The Perception of Social Supports for Active Learning Instrument has promising properties, is brief, and covers three of the four primary types of social supports measured in educational contexts. However, validation in one context (chemistry gateway courses at a Hispanic-Serving Institution) does not mean it will work in all contexts. We recommend further studies collecting validity evidence in different disciplinary and institutional contexts and collecting additional types of validity evidence (American Educational Research Association et al. 2014) for the use of the PSSALI in active learning classes. Additional outcome variables also need to be evaluated in relation to the PSSALI, especially outcome measures related to engagement such as value (Midgley et al. 1989; Wang and Eccles 2012) or sense of belonging (Wang and Eccles 2012); it may be that social supports in the classroom work indirectly through measures like these to impact engagement in active learning. In addition to measuring additional outcomes, evidence of what predicts student responses on the PSSALI could provide further validity. For example, future studies could ask whether students who have more interactions with LAs report higher supports, or whether LAs who have more experience provide more supports. Finally, the validity evidence for response process could be strengthened with additional think-aloud interviews. Three participants reviewed each item, which is enough to capture severe problems (Virzi 1992), but further challenges could be uncovered with additional interviews, especially if the instrument is employed in a new context.

Further, this study focused on only one provider of social supports in the classroom, yet instructors are also present, and their impact on engagement is clear from our analyses: we consistently saw that instructors differed from one another in the level of student engagement in their classrooms. However, it is unclear whether the instructors' contribution acts through social supports or through some other mechanism (for example, differences in the implementation of in-class activities). The PSSALI could likely be adapted to measure social supports from instructors and LAs simultaneously to help address this question, but this use would need to be validated.
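As an illustration of how this question might be approached analytically, the sketch below extends the style of tobit (censored) regression reported in the supplementary tables with an instructor term alongside the three PSSALI scales, using the censReg package (Henningsen 2020) employed in this study. All variable names (engagement, appraisal, emotional, informational, instructor) and the 1–7 censoring bounds are hypothetical placeholders for illustration, not the study's actual analysis script.

# Minimal sketch, not the study's analysis code: a tobit regression of an
# engagement score on the three PSSALI support scales plus an instructor
# factor, fit with censReg (Henningsen 2020).
library(censReg)

# df is assumed to hold one row per student with columns:
#   engagement                          - engagement score, censored at the scale ends
#   appraisal, emotional, informational - PSSALI factor scores
#   instructor                          - factor identifying each section's instructor
fit <- censReg(engagement ~ appraisal + emotional + informational + instructor,
               left = 1, right = 7, data = df)
summary(fit)  # instructor coefficients capture between-instructor differences
              # remaining after accounting for the three social supports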

Conclusions

This study supports the use of the PSSALI to understand the social supports LAs provide to students in active learning classrooms. The PSSALI offers greater resolution than existing social support instruments for three of the four social supports commonly reported in classrooms: appraisal, emotional, and informational. We also present initial evidence that these three supports can influence student engagement in in-class activities. We encourage STEM education researchers to adapt this instrument and to collect additional validity evidence for its use with different providers of social supports and in different contexts, to gain a broad understanding of the role of social supports in classroom engagement and, thus, student performance.

Availability of data and materials

Anonymized versions of the datasets generated and analyzed during the current study are available from the corresponding author on request.

Abbreviations

CFA: Confirmatory factor analysis

CFI: Comparative fit index

EFA: Exploratory factor analysis

LA: Learning assistant

MLR: Robust maximum likelihood estimator

MLR χ²: Chi-squared value from robust maximum likelihood estimation

PSSALI: Perception of Social Supports for Active Learning Instrument

RMSEA: Root-mean-square error of approximation

SRMR: Standardized root-mean-square residual

WLS: Weighted least squares
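To make several of these abbreviations concrete, the sketch below shows how a CFA using the MLR estimator, along with the MLR χ², CFI, RMSEA, and SRMR, is typically obtained in R with the lavaan package (Rosseel 2012). The item names (ap1–ap5, em1–em8, inf1–inf6) are hypothetical placeholders mirroring the instrument's three-factor structure (five appraisal, eight emotional, and six informational items), not the actual PSSALI item labels.

# Minimal sketch, assuming hypothetical item names; not the study's script.
library(lavaan)

# Three-factor measurement model mirroring the PSSALI structure
model <- '
  appraisal     =~ ap1 + ap2 + ap3 + ap4 + ap5
  emotional     =~ em1 + em2 + em3 + em4 + em5 + em6 + em7 + em8
  informational =~ inf1 + inf2 + inf3 + inf4 + inf5 + inf6
'

fit <- cfa(model, data = responses, estimator = "MLR")  # CFA with robust ML

# Robust chi-squared and the fit indices abbreviated above
fitMeasures(fit, c("chisq.scaled", "cfi.robust", "rmsea.robust", "srmr"))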

References

  • Alzen, J. L., Langdon, L. S., & Otero, V. K. (2018). A logistic regression investigation of the relationship between the Learning Assistant model and failure rates in introductory STEM courses. International Journal of STEM Education, 5(1), 56. https://doi.org/10.1186/s40594-018-0152-1.

  • American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (2014). Standards for educational and psychological testing. American Educational Research Association.

  • Ames, C. (1992). Classrooms: Goals, structures, and student motivation. Journal of Educational Psychology, 84(3), 261–271.

  • Baeten, M., Kyndt, E., Struyven, K., & Dochy, F. (2010). Using student-centered learning environments to stimulate deep approaches to learning: Factors encouraging or discouraging their effectiveness. Educational Research Review, 5(3), 243–260. https://doi.org/10.1016/j.edurev.2010.06.001.

  • Bandalos, D. L., & Finney, S. J. (2010). Factor analysis: Exploratory and confirmatory. In G. R. Hancock & R. O. Mueller (Eds.), The reviewer’s guide to quantitative methods in the social sciences (pp. 93–114). Routledge.

  • Brazeal, K. R., Brassil, C., & Couch, B. A. (2018). FABUS: A survey measuring student buy-in toward and utilization of formative assessments. Society for the Advancement of Biology Education Research National Conference.

  • Brazeal, K. R., Brown, T. L., Brassil, C., & Couch, B. A. (2019). Cultivating active learners: How instructors can change their teaching to help students engage with formative assessments. Society for the Advancement of Biology Education Research National Conference.

  • Brazeal, K. R., Brown, T. L., & Couch, B. A. (2016). Characterizing student perceptions of and buy-in toward common formative assessment techniques. CBE - Life Sciences Education, 15(4), ar73.

  • Brazeal, K. R., & Couch, B. A. (2016). Student buy-in toward formative assessments: The influence of student factors and importance for course success. Journal of Microbiology & Biology Education, 18(1), 1–10.

  • Brewe, E., Traxler, A., De La Garza, J., & Kramer, L. H. (2013). Extending positive CLASS results across multiple instructors and multiple classes of modeling instruction. Physical Review Special Topics - Physics Education Research, 9(2), 020116. https://doi.org/10.1103/PhysRevSTPER.9.020116.

  • Brown, G. T. L. (2004). Measuring attitude with positively packed self-report ratings: Comparison of agreement and frequency scales. Psychological Reports, 94(3), 1015–1024. https://doi.org/10.2466/pr0.94.3.1015-1024.

  • Brown, G. T. L., Harris, L. R., O’Quin, C., & Lane, K. E. (2017). Using multi-group confirmatory factor analysis to evaluate cross-cultural research: Identifying and understanding non-invariance. International Journal of Research & Methods in Education, 40(1), 66–90. https://doi.org/10.1080/1743727X.2015.1070823.

  • Bryson, C., & Hand, L. (2007). The role of engagement in inspiring teaching and learning. Innovations in Education and Teaching International, 44(4), 349–362. https://doi.org/10.1080/14703290701602748.

  • Caplan, G. (1974). Support systems and community mental health. Behavioral Publications.

  • Chini, J. J., Straub, C. L., & Thomas, K. H. (2016). Learning from avatars: Learning assistants practice physics pedagogy in a classroom simulator. Physical Review Physics Education Research, 12(1), 010117. https://doi.org/10.1103/PhysRevPhysEducRes.12.010117.

  • Cohen, S., & McKay, G. (1984). Social support, stress, and the buffering hypothesis: A theoretical analysis. Handbook of Psychology and Health, 4(5–6), 253–267.

  • Cohen, S., & Wills, T. A. (1985). Stress, social support, and the buffering hypothesis. Psychological Bulletin, 98(2), 310–357. https://doi.org/10.1037/0033-2909.98.2.310.

  • Dancey, C. P., & Reidy, J. (2007). Statistics without math for psychology. Pearson Education.

  • Davidson, R. A. (2003). Relationship of study approach and exam performance. Journal of Accounting Education, 20(1), 29–44.

  • Dubow, E. F., & Ullman, D. G. (1989). Assessing social support in elementary school children: The survey of children’s social support. Journal of Clinical Child Psychology, 18(1), 52–64.

  • Eccles, J. S., & Barber, B. (1993). The Michigan study of adolescent life transitions.

  • Elias, R. Z. (2005). Students’ approaches to study in introductory accounting courses. Journal of Education for Business, 80(4), 194–199. https://doi.org/10.3200/JOEB.80.4.194-199.

  • Espinoza, R. (2011). Pivotal moments: How educators can put all students on the path to college. Harvard Education Press.

  • Estell, D. B., & Perdue, N. H. (2013). Social support and behavioral and affective school engagement: The effects of peers, parents, and teachers. Psychology in the Schools, 50(4), 325–339. https://doi.org/10.1002/pits.21681.

  • Estrada, M., Hernandez, P. R., & Schultz, P. W. (2018). A longitudinal study of how quality mentorship and research experience integrate underrepresented minorities into STEM careers. CBE Life Sciences Education, 17(1), 1–13.

  • Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences of the United States of America, 111(23), 8410–8415. https://doi.org/10.1073/pnas.1319030111.

  • Gagne, P., & Hancock, G. R. (2006). Measurement model quality, sample size, and solution propriety in confirmatory factor models. Multivariate Behavioral Research, 41(1), 65–83. https://doi.org/10.1207/s15327906mbr4101_5.

  • Garnefski, N., & Diekstra, R. (1996). Perceived social support from family, school, and peers: Relationship with emotional and behavioral problems among adolescents. Journal of the American Academy of Child and Adolescent Psychiatry, 35(12), 1657–1664. https://doi.org/10.1097/00004583-199612000-00018.

  • Gignac, G. E. (2009). Psychometrics and the measurement of emotional intelligence. In C. Stough, D. H. Saklofske, & J. D. A. Parker (Eds.), Assessing emotional intelligence: Theory, research, and applications, (pp. 9–40). Springer US. https://doi.org/10.1007/978-0-387-88370-0_2.

  • Goodenow, C. (1993). Classroom belonging among early adolescent students: Relationships to motivation and achievement. Journal of Early Adolescence, 13(1), 21–43. https://doi.org/10.1177/0272431693013001002.

  • Halamandaris, K. F., & Power, K. G. (1999). Individual differences, social support and coping with the examination stress: A study of the psychosocial and academic adjustment of first year home students. Personality and Individual Differences, 26(4), 665–685. https://doi.org/10.1016/S0191-8869(98)00172-X.

  • Haley, W., Levine, E., Brown, S., & Bartolucci, A. (1987). Stress, appraisal, coping and social support as predictors of adaptational outcome among dementia caregivers. Psychology and Aging, 2(4), 323–330. https://doi.org/10.1037/0882-7974.2.4.323.

  • Handelsman, M. M., Briggs, W. L., Sullivan, N., & Towler, A. (2005). A measure of college student course engagement. Journal of Educational Research, 98(3), 184–192. https://doi.org/10.3200/JOER.98.3.184-192.

  • Harter, S. (1985). Social support scale for children: Manual and questionnaires.

  • Henningsen, A. (2020). censReg: Censored regression (Tobit) models (R package version 0.5-32).

  • Hernandez, P. R., Hopkins, P. D., Masters, K., Holland, L., Mei, B. M., Richards-Babb, M., … Shook, N. J. (2018). Student integration into STEM careers and culture: A longitudinal examination of summer faculty mentors and project ownership. CBE Life Sciences Education, 17(3). https://doi.org/10.1187/cbe.18-02-0022.

  • Holahan, C. J., Valentiner, D. P., & Moos, R. H. (1995). Parental support, coping strategies, and psychological adjustment: An integrative model with late adolescents. Journal of Youth and Adolescence, 24(6), 633–648. https://doi.org/10.1007/BF01536948.

  • House, R. (1971). A path goal theory of leader effectiveness. Administrative Science Quarterly, 16(3), 321–339. https://doi.org/10.2307/2391905.

  • Hu, L., & Bentler, P. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1–55. https://doi.org/10.1080/10705519909540118.

  • Johnson, D. W., Johnson, R. T., Buckman, L. A., & Richards, P. S. (1985). The effect of prolonged implementation of cooperative learning on social support within the classroom. Journal of Psychology: Interdisciplinary and Applied, 119(5), 405–411. https://doi.org/10.1080/00223980.1985.10542911.

  • Kember, D., & Leung, D. (2006). Characterising a teaching and learning environment conducive to making demands on students while not making their workload excessive. Studies in Higher Education, 31(2), 185–198. https://doi.org/10.1080/03075070600572074.

  • Knight, J. K., Wise, S. B., Rentsch, J., & Furtak, E. M. (2015). Cues matter: Learning assistants influence introductory biology student interactions during clicker-questions. CBE - Life Sciences Education, 14(4), 1–14.

  • Kohlmyer, M. A., Caballero, M. D., Catrambone, R., Chabay, R. W., Ding, L., Haugan, M. P., … Schatz, M. F. (2009). Tale of two curricula: The performance of 2000 students in introductory electromagnetism. Physical Review Special Topics - Physics Education Research, 5(2), 1–10.

  • Learning Assistant Alliance. (2020). Learning assistant alliance: The general program elements.

  • Malecki, C. K., & Demaray, M. K. (2002). Measuring perceived social support: Development of the Child and Adolescent Social Support Scale (CASSS). Psychology in the Schools, 39(1), 1–18. https://doi.org/10.1002/pits.10004.

  • Malecki, C. K., & Elliott, S. N. (1999). Adolescents’ ratings of perceived social support and its importance: Validation of the student social support scale. Psychology in the Schools, 36(6), 473–483.

  • Meece, J. L. (1991). The classroom context and children’s motivational goals. In M. L. Maehr, & P. R. Pintrich (Eds.), Advances in achievement motivation research, (pp. 261–285). Academic.

  • Meece, J. L., Blumenfeld, P. C., & Hoyle, R. H. (1988). Students’ goal orientations and cognitive engagement in classroom activities. Journal of Educational Psychology, 80(4), 514–523. https://doi.org/10.1037/0022-0663.80.4.514.

  • Michael, J. (2007). Faculty perceptions about barriers to active learning. College Teaching, 55(2), 42–47. https://doi.org/10.3200/CTCH.55.2.42-47.

  • Midgley, C., Feldlaufer, H., & Eccles, J. S. (1989). Student/teacher relations and attitudes toward mathematics before and after the transition to junior high school. Child Development, 60(4), 981–992. https://doi.org/10.2307/1131038.

  • Midgley, C., Maehr, M. L., Hruda, L. Z., & Anderman, E. M. (2000). Manual for the patterns of adaptive learning scales.

  • Mulryan-Kyne, C. (2010). Teaching large classes at college and university level: Challenges and opportunities. Teaching in Higher Education, 15(2), 175–185. https://doi.org/10.1080/13562511003620001.

  • Murray, C., Kosty, D., & Hauser-McLean, K. (2016). Social support and attachment to teachers: Relative importance and specificity among low-income children and youth of color. Journal of Psychoeducational Assessment, 34(2), 119–135. https://doi.org/10.1177/0734282915592537.

  • Nolten, P. (1995). Conceptualization and measurement of social support: The development of the student social support scale. University of Wisconsin-Madison.

  • Otero, V. (2015). Nationally scaled model for leveraging course transformation with physics teacher preparation. In Recruiting and educating future physics teachers: Case studies of effective practices (pp. 107–116).

  • Otero, V., Pollock, S., & Finkelstein, N. (2010). A physics department’s role in preparing physics teachers: The Colorado learning assistant model. American Journal of Physics, 78(11), 1218–1224. https://doi.org/10.1119/1.3471291.

  • Patrick, H., Anderman, L. H., Ryan, A. M., Edelin, K. C., & Midgley, C. (2001). Teachers’ communication of goal orientations in four fifth-grade classrooms. The Elementary School Journal, 102(1), 35–58. https://doi.org/10.1086/499692.

  • Pianta, R. C. (2001). Student-teacher relationship scale: Professional manual.

  • R Core Team (2020). R: A language and environment for statistical computing. R Foundation for Statistical Computing.

  • Reid, M., Landesman, S., Treder, R., & Jaccard, J. (1989). “My family and friends”: Six- to twelve-year-old children’s perceptions of social support. Child Development, 60(4), 896–910. https://doi.org/10.2307/1131031.

  • Revelle, W. (2018). psych: Procedures for personality and psychological research (1.8.12). Northwestern University.

  • Robnett, R. D., Nelson, P. A., Zurbriggen, E. L., Crosby, F. J., & Chemers, M. M. (2019). The form and function of STEM research mentoring: A mixed-methods analysis focusing on ethnically diverse undergraduates and their mentors. Emerging Adulthood, 7(3), 180–193. https://doi.org/10.1177/2167696818758734.

  • Rosseel, Y. (2012). lavaan: An R package for structural equation modeling. Journal of Statistical Software, 48(1), 1–36.

  • Sarason, B., Sarason, I., & Pierce, G. (1990). Traditional views of social supports and their impact on assessment. In B. Sarason, I. Sarason, & G. Pierce (Eds.), Social support: An interactional view (pp. 9–25). Wiley.

  • Sellami, N., Laski, F. A., Eagan, K. M., & Sanders, E. R. (2017). Implementation of a learning assistant program improves student performance on higher-order assessments. CBE - Life Sciences Education, 16(4), ar62.

  • Semmer, N., Elfering, A., Jacobshagen, N., Perrot, T., Beehr, T., & Boos, N. (2008). The emotional meaning of instrumental social support. International Journal of Stress Management, 15(3), 235–251. https://doi.org/10.1037/1072-5245.15.3.235.

  • Smart, J. B. (2014). A mixed methods study of the relationship between student perceptions of teacher-student interactions and motivation in middle level science. RMLE Online, 38(4), 1–19. https://doi.org/10.1080/19404476.2014.11462117.

  • Stains, M., Harshman, J., Barker, M., Chasteen, S., DeChenne-Peters, S., Eagan Jr., M. K., … Young, A. M. (2018). Anatomy of STEM teaching in North American universities. Science (New York, N.Y.), 359(6383), 1468–1470.

  • Tabachnick, B. G., & Fidell, L. S. (2013). Using multivariate statistics (6th ed.). Pearson.

  • Talbot, R., Hartley, L. M., & Marzetta, K. W. (2015). Transforming undergraduate science education with learning assistants: Student satisfaction in large-enrollment courses. Journal of College Science Teaching, 44(5), 24–30.

  • Tao, S., Dong, Q., Pratt, M. W., Hunsberger, B., & Pancer, S. M. (2000). Social support: During the transition to university. Journal of Adolescent Research, 15(1), 123–144. https://doi.org/10.1177/0743558400151007.

  • Tardy, C. H. (1985). Social support measurement. American Journal of Community Psychology, 13(2), 187–202. https://doi.org/10.1007/BF00905728.

  • Theobald, E. J., Aikens, M., Eddy, S., & Jordt, H. (2019). Beyond linear regression: A reference for analyzing common data types in discipline based education research. Physical Review Physics Education Research, 15(2), 020110. https://doi.org/10.1103/PhysRevPhysEducRes.15.020110.

  • Theobald, E. J., Hill, M. J., Tran, E., Agrawal, S., Nicole Arroyo, E., Behling, S., … Freeman, S. (2020). Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math. Proceedings of the National Academy of Sciences of the United States of America, 117(12), 6476–6483. https://doi.org/10.1073/pnas.1916903117.

  • Thompson, A. N., Talbot, R. M., Doughty, L., Huvard, H., Le, P., Hartley, L., & Boyer, J. (2020). Development and application of the Action Taxonomy for Learning Assistants (ATLAs). International Journal of STEM Education, 7(1), 1. https://doi.org/10.1186/s40594-019-0200-5.

  • Turner, J. C., Midgley, C., Meyer, D. K., Gheen, M., Anderman, E. M., Kang, Y., & Patrick, H. (2002). The classroom environment and students’ reports of avoidance strategies in mathematics: A multimethod study. Journal of Educational Psychology, 94(1), 88–106. https://doi.org/10.1037/0022-0663.94.1.88.

  • Turner, R., Turner, J., & Hale, W. (2014). Social relationships and social support. In R. Johnson, R. Turner, & B. Link (Eds.), Sociology of mental health, (pp. 1–20). Springer. https://doi.org/10.1007/978-3-319-07797-0_1.

  • Van Dusen, B., White, J.-S. S., & Roualdes, E. (2016). The impact of learning assistants on inequalities in physics student outcomes. ArXiv Preprint ArXiv:1607.07121.

  • Virzi, R. A. (1992). Refining the test phase of usability evaluation: How many subjects is enough? Human Factors, 34(4), 457–468. https://doi.org/10.1177/001872089203400407.

  • Wang, M. T., & Eccles, J. S. (2012). Social support matters: Longitudinal effects of social support on three dimensions of school engagement from middle to high school. Child Development, 83(3), 877–895. https://doi.org/10.1111/j.1467-8624.2012.01745.x.

  • Wentzel, K. R. (1994). Relations of social goal pursuit to social acceptance, classroom behavior, and perceived social support. Journal of Educational Psychology, 86(2), 173–182. https://doi.org/10.1037/0022-0663.86.2.173.

  • Wentzel, K. R. (1998). Social relationships and motivation in middle school: The role of parents, teachers, and peers. Journal of Educational Psychology, 90(2), 202–209. https://doi.org/10.1037/0022-0663.90.2.202.

  • Wentzel, K. R. (2004). Understanding classroom competence: The role of social-motivational and self-processes. Advances in Child Development and Behavior, 32(C), 213–241. https://doi.org/10.1016/S0065-2407(04)80008-9.

  • White, J.-S. S. (2016). The impacts of learning assistants on student learning of physics. ArXiv:1607.07469 [Physics].

  • Whiteman, S., Barry, A., Mroczek, D., & MacDermid Wadsworth, S. (2013). The development and implications of peer emotional support for student service members/veterans and civilian college students. Journal of Counseling Psychology, 60(2), 265–278. https://doi.org/10.1037/a0031650.

  • Wilcox, P., Winn, S., & Fyvie-Gauld, M. (2005). “It was nothing to do with the university, it was just the people”: the role of social support in the first-year experience of higher education. Studies in Higher Education, 30(6), 707–722. https://doi.org/10.1080/03075070500340036.

  • Wolf, E. J., Harrington, K. M., Clark, S. L., & Miller, M. W. (2013). Sample size requirements for structural equation models: An evaluation of power, bias, and solution propriety. Educational and Psychological Measurement, 73(6), 913–934. https://doi.org/10.1177/0013164413495237.

  • Xerri, M. J., Radford, K., & Shacklock, K. (2018). Student engagement in academic activities: A social support perspective. Higher Education, 75(4), 589–605.

  • Zygmont, C., & Smith, M. R. (2014). Robust factor analysis in the presence of normality violations, missing data, and outliers: Empirical questions and possible solutions. The Quantitative Methods for Psychology, 10(1), 40–55. https://doi.org/10.20982/tqmp.10.1.p040.

Acknowledgements

We would like to thank Dr. Kathryn R. Wentzel for early helpful feedback on instrument items. We also thank Drs. Melissa Aikens, Eva Knekta, and Chris Runyon for providing feedback on the statistical methods.

Funding

No funding was received for this research.

Author information

Contributions

DH, GJ, and SLE reviewed the literature and designed the instrument. US recruited instructors into the study. SLE cleaned, analyzed, and interpreted the data with input from DH and GJ. DH, GJ, SLE, and US wrote the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Sarah L. Eddy.

Ethics declarations

Ethics approval and consent to participate

This study was approved by FIU’s Institutional Review Board (IRB-15-0175).

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1: Supplemental Table 1.

Pattern matrix for original 1-, 2-, 3-, and 4-factor models. Table 2S. Pattern matrices from the 1- and 3-factor solutions from the third and fourth rounds of EFAs. Table 3S. Regression coefficients and SEs from the tobit models predicting buy-in to active learning. Table 4S. Regression coefficients and SEs from the tobit models predicting surface engagement strategies during active learning. Table 5S. Regression coefficients and SEs from the tobit models predicting deep engagement strategies during active learning.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Hernandez, D., Jacomino, G., Swamy, U. et al. Measuring supports from learning assistants that promote engagement in active learning: evaluating a novel social support instrument. IJ STEM Ed 8, 22 (2021). https://doi.org/10.1186/s40594-021-00286-z

Keywords