A systematic review on trends in using Moodle for teaching and learning
International Journal of STEM Education volume 9, Article number: 9 (2022)
The Moodle Learning Management System (LMS) is widely used in online teaching and learning, especially in STEM education. However, educational research on using Moodle is scattered throughout the literature. Therefore, this review aims to summarise this research to assist three sets of stakeholders—educators, researchers, and software developers. It identifies: (a) how and where Moodle has been adopted; (b) what the concerns, trends, and gaps are to lead future research and software development; and (c) innovative and effective methods for improving online teaching and learning.
The review used the 4-step PRISMA-P process to identify 155 suitable journal articles from 104 journals in 55 countries published from January 2015 to June 2021. The database search was conducted with Scopus and Web of Science. Insights into the educational use of Moodle were determined through bibliometric analysis with Vosviewer outputs and thematic analysis.
This review shows that Moodle is mainly used within university STEM disciplines and effectively improves student performance, satisfaction, and engagement. Moodle is increasingly being used as a platform for adaptive and collaborative learning and to improve online assessments. The use of Moodle is developing rapidly to address academic integrity, ethics, and security issues, to enhance speed and navigation, and to incorporate artificial intelligence.
More qualitative research is required on the use of Moodle, particularly investigating educators’ perspectives. Further research is also needed on the use of Moodle in non-STEM and non-tertiary disciplines. Further studies need to incorporate educational theories when designing courses using the Moodle platform.
Various learning management systems (LMSs) are available to develop, manage, and distribute digital resources for face-to-face and online teaching. An LMS provides interaction between traditional teaching techniques and digital learning resources, and simultaneously offers students personalised e-learning opportunities (Aljawarneh, 2020). E-learning is an area that has seen considerable growth, particularly since 2020 with the onset of the COVID-19 pandemic, which has limited face-to-face teaching possibilities for many educational institutions globally (Dias et al., 2020; Raza et al., 2021). Educational institutions have had to adapt to restrictions imposed on physical interaction, which have precluded most conventional forms of education, assessment, research, and scientific discourse (Byrnes et al., 2021).
The role of LMSs has gained prominence within the context of STEM (Science, Technology, Engineering, and Mathematics) programs and courses over the last decade through improved access to broadband internet and advancements in online teaching and learning technologies. Many educational institutions have effectively used LMSs and continue to research the effectiveness of using various types of LMSs. Recent studies focussing on STEM education suggest that various LMSs and associated tools increase student engagement, motivation, collaboration (Araya & Collanqui, 2021; Campbell et al., 2020; Hwang, 2020; Jones et al., 2021), performance, retention, and critical thinking (Alkholy et al., 2015; Ardianti et al., 2020; Bernacki et al., 2020; Cadaret & Yates, 2018; Hempel et al., 2020; Oguguo et al., 2021). In addition, LMSs allow STEM educators to track learning outcomes, predict achievements (for early detection of students at risk), and then use the identified information to adapt and modify teaching practices (Dominguez et al., 2016; Hempel et al., 2020; Price et al., 2021; Sergis et al., 2017; Zakaria et al., 2019; Zheng et al., 2019). The future of STEM education can continue to be improved with innovative LMSs and technology-enhanced learning materials (Zhao et al., 2018), such as online laboratories (Henke et al., 2021), online tutorials (Rissanen & Costello, 2021), and virtual reality applications (Christopoulos et al., 2020). A recent systematic review on research trends in STEM education (Li et al., 2020) indicates that ‘learning environments’, which include LMSs, are one key area that will continue to evolve.
Currently, 561 LMSs are available worldwide for academic/educational purposes, according to Capterra (2021), an international software review and selection platform. The learning platforms that were most widely used and researched during 2015–2020 include Edmodo, Moodle, MOOC, and Google Classroom (Setiadi et al., 2021). Research comparing various LMSs is rare, but some comparisons between LMSs such as Moodle, Sakai, SumTotal, Blackboard, Canvas, and ATutor are available in the literature (Shkoukani, 2019; Xin et al., 2021). According to a recent systematic review on tendencies in the use of LMSs (Altinpulluk & Kesim, 2021), Moodle is the most popular and preferred open-source LMS. Moodle has a high rate of acceptance in the community and in many institutions and has a wide variety of active courses, available in many languages (Al-Ajlan & Zedan, 2008; Sergis et al., 2017). A recent study that determined the effect of LMSs on students’ performance in educational measurement and evaluation recommends that LMSs such as Moodle should be learnt and used by lecturers (Oguguo et al., 2021).
Currently, the world's leading open-source LMS, Moodle (Moodle Project, 2020a), is used by various disciplines within academia, including STEM education. A keyword search of “Moodle” in publications, categorised by discipline area from 2015 to 2021, indicated that more than 60% of publications containing the keyword “Moodle” are in the STEM area. Moodle is a cloud-based LMS and among the top 20 best LMSs based on user experiences in 2018 (Henrick, 2018). The number of Moodle users continues to increase, from 78 million in 2015 (Singh, 2015) to over 294 million in 2021 (Moodle Project, 2021a)—an increase of over 250%. Although Moodle is becoming increasingly popular, to date, no review has provided information on the use of Moodle across a vast number of disciplines in different educational institutions at different levels of education. This review aims to comprehensively analyse the literature on the adaptation of Moodle as an educational tool over the past 6 years to provide information for three sets of stakeholders—educators, researchers, and software developers. The review addresses two main research questions:
Where is Moodle used, adapted, and researched?
How is Moodle used in teaching and learning?
This systematic review focuses on recent research (January 2015–June 2021) in using Moodle within academic institutions. The review took a multidisciplinary approach to encompass all subjects and levels within academia. To align with the first research question, Where is Moodle used, adapted, and researched?, a bibliometric analysis was performed to identify the dissemination of the literature and summarise the bibliometrics of the publications. Then, a thematic analysis was performed to address the second research question, How is Moodle used in teaching and learning?
This study adopted a strict systematic review protocol that followed the 4-step PRISMA-P process (Moher et al., 2015). This process has the following steps: (1) Identification of the relevant literature pertaining to this study, (2) Screening using the criteria determined by the authors, (3) Classification of the screened articles in a methodical manner using codes and themes predetermined by the authors, and (4) Determining the articles for inclusion in this review.
Scopus and Web of Science (WOS) were used to perform the literature search due to their comprehensive journal coverage, ease of keyword searching, accessibility within academia, and popularity within multiple disciplines (Colares et al., 2020; Souza et al., 2019). Searching the databases with the term “Moodle” returned articles covering a wide range of Moodle topics, while an initial search of Moodle review articles suggested several additional keywords, such as “Moodle quiz” and “e-learning”. The Scopus search was limited to the selected years, with only the “Article” or “Review” document types chosen, and used the title, abstract, and keywords to identify “Moodle” articles. The WOS search was run with “Moodle” across all topics in the search parameters. Both database searches were last run on 30 June 2021.
In this phase, literature identified from both database searches was screened to exclude articles that were: (1) published before 2015, (2) written in any language other than English, (3) not peer reviewed (e.g., conference papers, book chapters, letters), or (4) not relevant to this review. An individual article's relevance was determined by examining the title, abstract, results, and methods. Any articles meeting one or more of these exclusion criteria were removed from this study.
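The four exclusion rules above can be expressed as a simple filter over bibliographic records. The sketch below is illustrative only: the field names ("year", "language", "doc_type", "relevant") are assumptions, and in practice relevance was judged by reading each article, not by a stored flag.

```python
# Screening sketch: a record survives only if it fails none of the four
# exclusion criteria. Field names here are hypothetical stand-ins for
# whatever headers the actual database exports use.

PEER_REVIEWED_TYPES = {"Article", "Review"}

def passes_screening(rec):
    """True only if the record survives all four exclusion criteria."""
    return (rec["year"] >= 2015
            and rec["language"] == "English"
            and rec["doc_type"] in PEER_REVIEWED_TYPES
            and rec["relevant"])

records = [
    {"year": 2016, "language": "English", "doc_type": "Article", "relevant": True},
    {"year": 2014, "language": "English", "doc_type": "Article", "relevant": True},   # too old
    {"year": 2018, "language": "Spanish", "doc_type": "Review", "relevant": True},    # not English
]
screened = [r for r in records if passes_screening(r)]
print(len(screened))  # 1
```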
The articles identified and screened were multidisciplinary; therefore, these articles were then classified. Initially, the classification process allocated codes to the journal articles related to the article's research discipline (see Table 1 for codes)—for example, STEM disciplines encompass subject matter of science, technology, engineering, and maths. If more than one discipline was covered in the article, the multidisciplinary (MD) code was used. The articles were then classified into specific subject matter and education levels, including undergraduate, postgraduate, and multi-level. These codes were based on categories of the International Standards Classification for Education (ISCED, 2012). The not-determined (ND) code was used, if needed, for discipline and education level.
The articles selected for review were limited to Jan 2015–June 2021 and included the word “Moodle” in either the title, abstract, or keywords. The 4-step process applied for selecting the articles included in this review is shown in Fig. 1.
The Vosviewer software, Version 1.6.15, was applied for bibliometric analysis using the Scopus and WOS database search results. Vosviewer is freely available software that automates term identification and constructs bibliometric maps based on network data (Colares et al., 2020; de Souza et al., 2019). The combined downloaded results from Scopus and WOS were used to create a CSV file. The CSV file was updated after the 4-step systematic review protocol process and articles irrelevant to this study were removed from the file. The CSV file was then loaded into Vosviewer to create a co-occurrence map of bibliographic data. The software enables the user to build co-occurrence maps in various areas, such as keywords, journal citation counts, and publication title (van Eck & Waltman, 2020). Bibliometric analysis was conducted on each article, including the year of publication, keywords, journal publication citation count, and the country of publication.
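The merge-and-deduplicate step described above can be sketched in a few lines. This is a minimal illustration, assuming hypothetical column names ("DOI", "Title"); real Scopus and WOS exports use database-specific headers that would need mapping first.

```python
# Merging Scopus and WOS exports and dropping duplicate records before the
# combined CSV is loaded into Vosviewer. The "DOI"/"Title" field names are
# illustrative assumptions, not the actual export headers.
import csv

def load_records(path):
    """Read one database export (a CSV with a header row) into dicts."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

def deduplicate(records):
    """Keep the first record per DOI, falling back to a normalised title."""
    seen, unique = set(), []
    for rec in records:
        key = rec.get("DOI") or rec.get("Title", "").strip().lower()
        if key and key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

# Demonstration with in-memory records standing in for the two exports.
sample = [
    {"Title": "Moodle in STEM", "DOI": "10.1/abc"},  # found in Scopus
    {"Title": "Moodle in STEM", "DOI": "10.1/abc"},  # same paper, from WOS
    {"Title": "E-learning trends", "DOI": ""},       # no DOI: title used as key
]
print(len(deduplicate(sample)))  # 2
```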
Thematic analysis (TA)
Following the classification of the included journal articles, further insights and trends within the articles were established by thematic analysis. This process was consistent with Braun and Clarke's (2006) thematic analysis (TA) method, which identifies and analyses patterns of meanings (themes) in qualitative data. This method can be applied within a range of theoretical frameworks and can be used to analyse almost all forms of qualitative data, both small and large data sets, to address different types of research questions (Clarke & Braun, 2014). The TA used in this review involves the generation of codes and themes. The codes capture features of each paper which have potential relevance to the research questions. The themes were constructed from the coding to capture broader patterns.
To generate the trends identified in the literature, the six-phase Braun and Clarke (2006) method was used as follows:
Familiarisation with the data: The selected articles were read to become familiar with the topics covered by each article, noting any common concepts covered by each study.
Coding: Codes were generated for important features relevant to teaching and learning covered by each article (Research Question 2). This coding is not simply a method of data reduction; it is an analytic process.
Searching for themes: A theme is a coherent and meaningful pattern in the reviewed articles which is relevant to the research questions. The themes were not necessarily in the articles but were constructed. This review constructed eight themes of interest relevant to teaching and learning (Research Question 2).
Reviewing themes: This step involved reflecting on the themes to tell a story, defining the nature of the theme, and identifying relationships between the themes and different sub-themes within the themes.
Defining and naming themes: This step involved specifying the ‘essence’ of each theme and constructing an informative name for each theme.
Writing up: Writing-up involves creating a coherent and persuasive story about the reviewed papers which includes analysis of current and future research.
The themes, sub-themes, and definitions of each theme are shown in Table 2.
Results and discussion
The initial database searches identified 538 Moodle-related articles. The literature was then screened for the period Jan 2015–June 2021, journal or review articles only, and articles published in English. This screening reduced the identified articles to 285: 167 from Scopus and 118 from WOS. These initially screened articles were downloaded from the relevant databases and checked for duplicates. After removing duplicates, the abstracts of the 211 remaining articles were reviewed, resulting in the elimination of a further 24 articles. The full text of the remaining 187 articles was then read, eliminating another 32 articles that were not directly related to this study. Thus, a total of 155 journal articles were used in this systematic review.
Journals and citations
Moodle is prevalent in various disciplines, as revealed by the 104 journals relevant to this study. Journal titles that published two or more articles are shown in Fig. 2. The journal with the most published Moodle-related articles was International Journal of Emerging Technologies in Learning (10 articles), followed by Computer Applications in Engineering Education (8 articles), and then Journal of e-Learning & Knowledge Society and the Journal of Technology and Science Education (5 articles per journal).
Scopus was used for the citation count unless the article was only available in WOS; then, the WOS citation count per article was used. The 155 journal articles reviewed in this study have a combined citation count of 608 with the most cited (71 times) being a review article comparing 17 blended courses using Moodle LMS (Conijn et al., 2017). Total citation counts of the articles by published year were 95 in 2015, 92 in 2016, 270 in 2017, 83 in 2018, 50 in 2019, and 21 in 2020. Of the top 10 cited articles (listed in Table 3), five articles were published in 2017, accounting for 198 citations of the total 270 for that year, with the remaining 72 citations across 24 papers. Of the top 10 authors, four are attributed to the top-cited paper (Conijn et al., 2017). All the top 10 cited authors have articles in the top 10 cited list (Table 4).
Vosviewer has the facility to produce a density map of co-occurrences in countries (van Eck & Waltman, 2020). Figure 3 shows the density map of countries publishing more than two articles. Fifty-five countries contributed research to the 155 articles, with 37 countries publishing more than two papers. The higher the count of publications, the brighter the yellow, with Spain contributing 17 articles, the United States of America (USA) 14, Australia 12, the Russian Federation 10, Malaysia 8, Italy 7, and Portugal 5 articles. The software positions the countries with a similar number of articles published close to each other. Therefore, Vosviewer provides the reader with an instantaneous pictorial result of countries publishing Moodle articles.
The keywords from the 155 articles were analysed in Vosviewer. In total, 926 keywords were used, of which 154 were used three times or more. Table 5 shows the top 10 keywords. The most frequently occurring keyword was Moodle (61), followed by e-learning (31), teaching (26), education (25), and learning management system (25).
Along with extracting the top keywords used within the articles, Vosviewer produced cluster graphics of keywords. Figure 4a shows the cluster graphic of keywords with more than three occurrences; keywords with higher occurrence counts are drawn with larger markers, so the largest markers are Moodle, e-Learning, and Education. The map can also be zoomed in and out to show more keywords and highlight the most frequently occurring ones. Figure 4b shows the option in Vosviewer to see the links that connect the keywords within the articles (in this instance, Education was highlighted). The keywords associated with Education in the 155 articles (with more than 3 occurrences) can be seen with linked keywords, such as Moodle, Student, and e-Learning. All keywords can be highlighted individually for their associations to be seen.
Discipline and education level of studies
Research into Moodle assessments is being published in many different subject areas, such as science, technology, engineering, and maths (STEM), health sciences (HS), and veterinary medicine (VM). Figure 5 shows the number of publications per full year (2015–2020) and the articles' discipline.
L—languages, A—arts, VM—veterinary medicine, TD—teaching degree, STEM—science, technology, engineering and maths, ND—not determined, MD—multi-discipline, HS—health sciences, CS—computer science, BS—business studies.
The number of total publications was lowest in 2015 and 2016, with 18 and 19 publications, respectively. This number increased each year after that: 2017 (n = 24), 2018 (n = 26), 2019 (n = 32), and 2020 (n = 36). The two main disciplines throughout this publication period were STEM and HS. The STEM discipline contained various subjects, with most being engineering (civil) and science (e.g., physics and chemistry). Published HS subjects include nursing, medical practice, and dentistry. Some articles that did not fit into a particular discipline (ND) covered various subjects, such as security issues identified within e-learning or articles that deal with databases (Chaparro-Peláez et al., 2019; Mudiyanselage & Pan, 2020).
Of the 155 articles, 116 evaluated Moodle within a university setting: 112 at undergraduate (UG) level, nine at postgraduate (PG) level, and seven examining both UG and PG courses. School-age students (S) were the focus in six studies, teaching staff (T) in four articles, and S and T in two articles. A total of 31 articles did not determine (ND) the level of education for the study or were focused not on individuals but on systems (Chafiq et al., 2018; Conejo et al., 2016).
Thematic analysis (TA)
The trends demonstrated in the research articles are categorised into eight main themes (see Table 2). Theme 1 compares various Moodle features explained in the study. Themes 2 to 4 highlight the trends in pedagogy, which include curriculum development, learning, and assessment processes in e-learning. Theme 5 analyses ethical aspects of e-learning, and Theme 6 highlights trends in new software development aiming to improve e-learning, particularly Moodle. Themes 7 and 8 provide an overview of research approaches, methods, and common student success indicators. Figure 6 shows the number of papers that discuss each of the eight themes, although several papers discuss multiple themes. Figure 7 shows the percentage of papers related to each theme and sub-theme.
Theme 1: Moodle features
Of the reviewed articles discussing the Moodle LMS, 23% discuss Moodle ‘Activities’. An activity, a general name for a group of Moodle features, is usually something that a student will engage in, interacting with other students or the teacher. The activities identified included: Moodle quizzes, forums, workshops, lessons, wikis, and surveys. Of these, Moodle quizzes and workshops were the most prevalent, with 16 and eight articles, respectively (see Fig. 8). Some activities, such as videos, virtual tours, and e-portfolios, are external tools that are easily embedded into the Moodle system.
None of the articles discussed Moodle activities such as Choice, Database, Feedback, Glossary, H5P activity, or SCORM (for course content). One study (Sánchez et al., 2015) recommends Moodle's “Survey” tool for anonymous surveys, yet if this tool is used along with Moodle’s “Group” option, it becomes possible to determine who responded to the survey. Therefore, the “Feedback” activity is a better anonymous survey tool than the “Survey” activity.
Except for Shkoukani (2019), who analysed features for the 20 most popular LMSs in 2018, few studies compare Moodle with other LMSs. Only 2% of papers analysed in this study have compared Moodle with other LMSs, and they only compared Moodle with Blackboard or Canvas (Aljawarneh, 2020; Shdaifat & Obeidallah, 2019). Further analysis between LMSs focusing on features, integration, cost, and security is pivotal for e-learning success.
Theme 2: curriculum development
In 53% of the reviewed articles, the Moodle LMS was used for curriculum development, including implementing learning modules and assessments for blended and online courses. While about half of the articles (45%) explain how Moodle can be used at the course level (e.g., Awofeso et al., 2016; Brateanu et al., 2019; Chootongchai & Songkram, 2018), 4% explain how it can be used for framework design (multiple courses to achieve program objectives) (e.g., Kouis et al., 2020; Saleh & Salama, 2018; Smolyaninova & Bezyzvestnykh, 2019).
Educators bear responsibility for ensuring optimal tools are utilised for the most effective computerised assessment, enabling students and teachers to address or avoid assessment-related problems (Marczak et al., 2016). However, only 4% of papers analyse teachers’ perspectives on using Moodle (Babo & Suhonen, 2018; Badia et al., 2019; García-Martín & García-Sánchez, 2020; Jackson, 2017; Marczak et al., 2016; Valero & Cárdenas, 2017). A study by Badia et al. (2019) of 132 teachers across 43 schools indicated that further research should address two questions: Why do only certain Moodle activities positively impact learning outcomes? What can technological designers and teachers do to improve the level of learning outcomes achieved through the use of Moodle activities?
Of the 155 articles reviewed, only eight used educational theoretical frameworks for their research and development (see Table 6). According to the studies shown in Table 6, online assessments can be theorised using Classical Test Theory (CTT) and Item Response Theory (IRT). Online content development, particularly adaptive content, can be theorised using Computer Adaptive Testing (CAT), the Technology Acceptance Model (TAM), Merrill's problem-centric framework, and Self-determination theory. The DeLone and McLean Information Systems (IS) theories can be used to measure the effectiveness of advanced online materials and for the implementation of e-learning systems.
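Of the theories listed, Item Response Theory lends itself to a compact illustration. The sketch below shows the one-parameter (Rasch) IRT model that underpins adaptive testing approaches such as CAT; it is a generic textbook formulation, not the implementation used in any of the reviewed studies.

```python
# A minimal sketch of the one-parameter (Rasch) Item Response Theory model.
# Ability and difficulty share one logit scale; an adaptive test selects the
# next item whose difficulty is near the learner's current ability estimate.
import math

def rasch_probability(ability, difficulty):
    """P(correct answer) for a learner of given ability on a given item."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# A learner whose ability equals the item difficulty succeeds half the time;
# easier items (relative to ability) are answered correctly more often.
print(rasch_probability(0.0, 0.0))        # 0.5
print(rasch_probability(1.5, 0.0) > 0.8)  # True
```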
Theme 3: learning focus
Adaptive, collaborative, or problem-based content developments were discussed in 20% of the articles, with only 4% considering learning styles and critical thinking.
LMSs provide large databases and fast access to systematic analysis of information. Therefore, designing adaptive or self-learning modules and automatic assessments that adapt to the learner’s preferences has become much easier. Of the articles reviewed, 8% either demonstrate or improve automated content. The areas addressed within these articles were randomly generated tests, questions with multiple possible answers, automated marking systems and rubrics, provision of positive and motivational automatic summative and formative feedback, auto-adaptive content for learners with diverse backgrounds, interactive content, and self-assessed quizzes and multimedia books for instructional design (Azevedo et al., 2019; Brateanu et al., 2019; Gutiérrez et al., 2016; Ljubimova et al., 2015; Paiva et al., 2015).
Further research has investigated integrating instructional design theories, psychological elements, and learning theories into adaptive learning (Abuhassna & Yahaya, 2018; Conejo et al., 2016; Saleh & Salama, 2018). Tlili et al. (2019) conducted a study that aimed to model learners’ personalities using a learning analytics approach called intelligent Moodle (iMoodle), with results compared to the traditional method of modelling learners' personalities using questionnaires. A further study investigated automatic detection of learning styles by analysing student learning behaviour through a mathematical model (Xiao & Rahman, 2017). Further research has been suggested in exploring the extent to which automatic feedback encourages positive motivational beliefs and self‐esteem among students (Gaona et al., 2018), improving real-time adaptive learning modules, intelligent non-human tutoring, and using educational data mining techniques to investigate and predict students' attitudes to learning.
Collaborative learning was discussed in 12% of the reviewed articles. Of these, a number focused on Moodle's peer assessment tool “workshop” and demonstrated how to use it to allow students to mark their fellow students’ work and reduce the marking load for teaching staff (ArchMiller et al., 2017; Slee & Jacobs, 2017; Strang, 2015). Peer review and feedback were generally accepted as helping to develop students’ meta-cognitive skills relating to critical reflection (Wilson et al., 2015). However, qualitative studies show that students and staff have divided opinions regarding the “workshop” tool for peer assessment (Divjak & Maretić, 2017; Dolezal et al., 2018; Wilson et al., 2015). While students agree with a limited number of peer assessments, staff experience an increase, or at best no decrease, in their marking workload (Wilson et al., 2015). Peer assessments using “workshop” also remain time-consuming for both the teacher and students and could lose their charm if overused (Dolezal et al., 2018). In studies that have used peer assessments to allow students to grade their peers, some students reported the peer assessment method as “unfair” and “unprofessional” (Divjak & Maretić, 2017; Dolezal et al., 2018; Wilson et al., 2015). The “workshop” tool in Moodle does not have a built-in measure of peer assessment validity. One study that addressed concerns about the validity of student-assigned marks reported that the grades were consistent with what faculty expected, based on t tests and reliability estimates (Strang, 2015).
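The kind of consistency check reported by Strang (2015) can be illustrated with a paired t statistic comparing peer-assigned and faculty-assigned grades. The grade data below are invented for illustration only; a |t| close to zero suggests the two sets of marks do not differ systematically.

```python
# A hedged sketch of checking peer-marking validity with a paired t statistic.
# These grades are fabricated for the example, not data from any study.
import math
import statistics

def paired_t(peer, faculty):
    """Paired-sample t statistic for two matched lists of grades."""
    diffs = [p - f for p, f in zip(peer, faculty)]
    n = len(diffs)
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

peer_grades = [72, 65, 80, 58, 90, 77]
faculty_grades = [70, 68, 78, 60, 88, 79]
t_stat = paired_t(peer_grades, faculty_grades)
# |t| well below the two-tailed 5% critical value (about 2.57 for df = 5)
# would indicate no systematic difference between peer and faculty marks.
print(abs(t_stat) < 2.57)  # True
```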
The Moodle activity “Forum” can be used to improve problem-based learning via group projects (Awofeso et al., 2016). “Forums” allowed students to maintain much more direct contact when they were not in the class and made it easier for students to meet and work on their projects even though they were in different places (Marti et al., 2015; Phungsuk et al., 2017). A further study reported that online learning systems positively influenced students' thinking and innovation skills (Chootongchai & Songkram, 2018).
Of the identified articles, 3% investigated learning styles—namely, Active vs Reflective, Sensitive vs Intuitive, Visual vs Verbal, Sequential vs Global—when implementing e-course content (Kouis et al., 2020; Ljubimova et al., 2015; Xiao & Rahman, 2017). These studies have shown that students' independent work can be guided through interactive technology and that these teaching methods would eliminate students' passivity in the classroom and enhance their cognitive activity. While some studies have proposed automatic detection of learning styles by analysing students' learning behaviour through mathematical models (Xiao & Rahman, 2017), other studies have developed simpler matrix systems that allow the teacher to carry out a manual selection of Moodle tools after considering students' learning styles (Ljubimova et al., 2015; Meza-Fernández & Sepúlveda-Sariego, 2017; Xiao & Rahman, 2017). However, identifying students' learning styles to maintain assessment quality needs further investigation (Meza-Fernández & Sepúlveda-Sariego, 2017).
Theme 4: assessment
A third (33%) of the reviewed papers focused on assessment, including summative and formative assessment, online exams, marking, and feedback (Adesemowo et al., 2016; Albano & Dello Iacono, 2019; Basol & Balgalmis, 2016; George-Williams et al., 2019). Moodle can create large data pools of various questions, including multiple-choice, open answer, generative questions, and complex tasks (Conejo et al., 2016). Nevertheless, most papers focused on summative assessment based on Moodle quizzes, investigating both teachers’ and students’ opinions when implementing multiple-choice questions (Babo & Suhonen, 2018; Cakiroglu et al., 2017; Dimic et al., 2018; McKenzie & Roodenburg, 2017; Shdaifat & Obeidallah, 2019). According to a 5-year study, the ‘luck’ factor associated with multiple-choice questions is fair (Babo et al., 2020). Studies that have investigated the students' point of view indicate that students agree that Moodle is easy to use and complements teaching, although most students still prefer classical assessment techniques (Cakiroglu et al., 2017; McVey, 2016; Popovic et al., 2018). However, one study found no direct relationship between students' preferences and academic performance (Cakiroglu et al., 2017).
Some studies that focused on the assessment process investigated the usefulness of the online environment for instructors to organise assessments, the usefulness of giving students responsibilities during assessment (mainly via peer assessments), and the use of Moodle statistics and analytics to evaluate and improve the quality of the assessment process (Cakiroglu et al., 2017; Gamage et al., 2019; Hussain & Jaeger, 2018).
Marking and feedback
Four percent of the reviewed articles focused on improving and streamlining the marking and feedback processes for both students and teachers. These studies indicate that online marking systems associated with Moodle lower long-term costs, increase the speed of providing feedback, provide greater flexibility with respect to location and timing, and reduce the space required to manage the assessment process (García López & García Mazarío, 2016; Koneru, 2017; Villa et al., 2018). A study of 57 academics conducted at Monash University, Australia, highlighted Moodle's reliability and improved impartiality of the assessment process (George-Williams et al., 2019). The study concluded that this impartiality is generally achieved through the removal of personal academic judgment, which results in more reliable, consistent marking practices.
Theme 5: ethics
The reviewed articles investigated two strands of ethics: (1) ethics relating to users' data security and privacy, and (2) academic integrity. While 4% of all reviewed articles highlighted security and privacy concerns, 6% discussed academic integrity issues caused by the increased use of LMSs for assessment purposes. Although personal data protection is governed by legal compliance requirements, such as the policies in the European Union and the Privacy Act 1988 in Australia, several articles discussed the privacy concerns of cloud-based services. The use of cloud-based services has resulted in teaching materials being stolen and instructors' or administrators' credentials being compromised (Daniels & Iwago, 2017; Kiennert et al., 2019; Mudiyanselage & Pan, 2020).
Two recurring academic integrity issues associated with online assessments were highlighted: students plagiarising and students using third parties to complete assignments (Amoako & Osunmakinde, 2020; Guillén-Gámez & García-Magariño, 2015). Although instances of these two integrity problems occur in traditional teaching and learning methods, face-to-face invigilated exam environments can help minimise their effect. One alternative to invigilated exams is the online quiz, which has become popular due to its ability to automate marking; however, cheating cannot be controlled unless the quiz is held in an invigilated room. Several studies attempted to address this issue by introducing new software and analytical tools to detect academic misconduct. These tools include: limiting the IP range for users during online exams (Adesemowo et al., 2016); using timestamps and data processing techniques to identify unauthorised users (Genci, 2014); using facial verification software (Guillén-Gámez & García-Magariño, 2015); and using plagiarism detection software (Adesemowo et al., 2016; Genci, 2014; Guillén-Gámez & García-Magariño, 2015; Kaya & Özel, 2015).
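The IP-range restriction idea (cf. Adesemowo et al., 2016) can be sketched as a simple subnet membership check; Moodle's quiz settings offer a comparable "network address" restriction. The subnet and function below are illustrative assumptions, not code from any of the reviewed systems.

```python
# A minimal sketch of restricting online-exam access to an allowed IP range.
# The subnet is a hypothetical example, e.g. the invigilated exam lab.
import ipaddress

ALLOWED_SUBNET = ipaddress.ip_network("192.168.10.0/24")

def may_start_exam(client_ip: str) -> bool:
    """Allow the attempt only from machines inside the invigilated room."""
    return ipaddress.ip_address(client_ip) in ALLOWED_SUBNET

print(may_start_exam("192.168.10.42"))  # True: inside the exam lab subnet
print(may_start_exam("203.0.113.7"))    # False: an off-campus address
```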
Theme 6: technical developments
Application of Moodle analytics
Online LMSs make it more manageable to gather and analyse student data. Ten percent of the reviewed articles discussed the in-built statistical tools, such as the facility index and the discrimination index, along with the databases available in LMSs for educational and research purposes (Fenu et al., 2017; Gamage et al., 2019; Monllaó Olivé et al., 2020). These articles used data mining and statistical tools to measure and analyse student engagement, student satisfaction, and the performance of online courses. Such tools can be beneficial for monitoring student retention rates (Monllaó Olivé et al., 2020), identifying underachieving students (Saqr et al., 2017), predicting students' trends and attitudes, and accreditation purposes (El Tantawi et al., 2015; Saleh & Salama, 2018; Strang, 2016). Data and analytics tools may also be used to automate personality assessments and create intelligent (adaptive) learning platforms (Tlili et al., 2019).
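The two quiz statistics named above can be sketched in a few lines, assuming their standard definitions: the facility index as the average mark on a question expressed as a fraction of its maximum mark, and the discrimination index as the correlation between a question's scores and students' scores on the rest of the quiz. The example data are invented for illustration.

```python
from statistics import mean
from math import sqrt

def facility_index(item_scores, max_mark):
    """Average mark on a question as a fraction of the maximum mark
    (Moodle reports this as a percentage): higher means easier."""
    return mean(item_scores) / max_mark

def discrimination_index(item_scores, rest_scores):
    """Pearson correlation between a question's scores and students'
    scores on the rest of the quiz; high values indicate the question
    separates stronger from weaker students."""
    mx, my = mean(item_scores), mean(rest_scores)
    cov = sum((x - mx) * (y - my) for x, y in zip(item_scores, rest_scores))
    sx = sqrt(sum((x - mx) ** 2 for x in item_scores))
    sy = sqrt(sum((y - my) ** 2 for y in rest_scores))
    return cov / (sx * sy)

# Five students: marks on one question (out of 2) and on the rest of the quiz.
item = [2, 2, 1, 1, 0]
rest = [18, 16, 12, 10, 6]
print(round(facility_index(item, 2), 2))            # → 0.6
print(round(discrimination_index(item, rest), 2))   # → 0.98
```

Here the question is moderately easy (facility 0.6) and discriminates very well (0.98), which is the kind of diagnosis the reviewed studies used to identify ineffective questions.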
Software development and adaptation
This review found that 24% of the articles discussed or evaluated software development and adaptations, including the use of existing software to improve the learning experience within Moodle. Software applications that can be integrated into Moodle include:
Apple's Siri and Google's GRS, cloud-based speech recognition engines for language learning (Daniels & Iwago, 2017).
The ever-increasing number of new software applications and add-ins available for Moodle indicates the interest of software developers and researchers in improving the usability of Moodle for online teaching and education. Course developers utilise plug-ins to assist with automatic essay marking, randomising questions, and identifying ineffective questions (Koneru, 2017; Schweighofer et al., 2019; Villa et al., 2018). Table 7 lists several software applications that can be integrated into LMSs and, in particular, Moodle.
To date (June 2021), Moodle has 1753 available plug-ins that can add new functions to improve administration, assessment, collaboration, communication, content, and the interface (Moodle Project, 2020b). The Moodle statistics for 2019 show that the most popular plug-ins (based on the number of downloads) were communication and content plug-ins, such as Moove, BigBlueButtonBN, Adaptable, H5P, and Eguru (Moodle Project, 2020b). The articles in this review, covering January 2015–June 2021, show that most reported advancements in new software developments for Moodle relate to improving assessment processes. These advancements include improving the security of assessment processes (Adesemowo et al., 2016; Kaya & Özel, 2015), improving the mechanisms for generating quiz questions, and improving feedback and response time (Conejo et al., 2016; Kruger et al., 2015). Security improvements include, but are not limited to, improving user data verification (Amoako & Osunmakinde, 2020), facial recognition (Guillén-Gámez & García-Magariño, 2015), limiting IP range (Adesemowo et al., 2016), and scanning students' IDs (Ross, 2017). Daniels and Iwago (2017) also reported on integrating Google speech recognition for speech assessments. Improving students' cognitive, innovative, and collaborative learning skills was a key area of development in some reported studies (Chootongchai & Songkram, 2018; Finogeev et al., 2020; García López & García Mazarío, 2016), along with the improvement of user interface evaluation (Fenu et al., 2017).
Artificial intelligence tools are a growing area of research investigating intelligent mechanisms for managing personalised learning. Gray et al. (2018) reported on software developments that aid students in their report writing and allow arguments, justification, and conclusions to be formed without any human input. Software developments also encompass the ability to direct students to relevant content and assessments after automatic analysis of the students' behaviour (Finogeev et al., 2020) and can evaluate summaries written by students using information available on websites and in online repositories (Ramírez-Noriega et al., 2018). As software advancements to assist students with their assignments are increasing, so is plagiarism. Plagiarism detection systems have been successfully integrated into Moodle via plug-ins, such as Urkund, Turnitin, Plagiarisma, and SafeAssign, which detect textual plagiarism. Source-code plagiarism detection software for programming courses is under development (Kaya & Özel, 2015).
Despite advances in software and technology for e-learning and online LMSs, numerous fundamental gaps and drawbacks still exist, the majority relating to technical issues (Adesemowo et al., 2016; Marczak et al., 2016; Rachman-Elbaum et al., 2017), such as server/browser response times, lag in resolving technical issues, lack of equipment available to students, and the potentially high cost of initial program development (Chang Chan et al., 2019; El Tantawi et al., 2015; Marczak et al., 2016; Zamalia & Porter, 2016).
Theme 7: research approach and methods
The research approaches used are categorised into quantitative analysis, qualitative analysis, mixed methods, technical, and other. Of the 155 articles reviewed, 67 papers used a quantitative (QN) research approach, which aimed to quantify a phenomenon relevant to online teaching and learning (see Fig. 9). Forty-eight papers used a qualitative (QL) research approach, which involved collecting descriptive data such as student, teacher, or other stakeholder thoughts and experiences; 28 papers used mixed methods, combining qualitative and quantitative approaches; 37 papers discussed technical (T) components of LMSs, including new software development and framework design; and 37 papers were categorised as "other", namely research that did not fall into the above four categories, e.g., applications of existing LMSs and tools, or reviewing/comparing existing LMSs or tools.
Qualitative research studies in this review mainly evaluated students' perspectives: their preferences, perceptions, satisfaction, and attitudes towards online learning, including the online tools being utilised (Botelho et al., 2020; Cakiroglu et al., 2017; García-Martín & García-Sánchez, 2020; Tsai & Tang, 2017). Only two research studies focused exclusively on teachers' opinions, perceptions, and experiences of e-assessment, Moodle activities, and their learning impacts (Babo & Suhonen, 2018; Badia et al., 2019). Four articles reported on both student and teacher perspectives and discussed attitudes towards summative and formative assessments and flexibility in e-learning (Jackson, 2017; Kamenez et al., 2018; Marczak et al., 2016; Valero & Cárdenas, 2017). Jackson (2017) reported that Moodle is a technology that enables creativity among teachers and recommended that management incorporate LMS training programs for both teachers and students into their strategic plans.
Theme 8: student success factors
The qualitative, quantitative, and mixed methods research share common indicators of student success, namely student performance, engagement, and satisfaction (as described in Table 2). Figure 7h shows the articles that discussed student success factors, with 14% using student performance, 16% student engagement, and 8% student satisfaction. Student performance and engagement indicators are found mainly in quantitative research, whereas student satisfaction indicators are found in qualitative research. Qualitative studies measuring student satisfaction are fewer than quantitative studies analysing student performance and engagement.
This comprehensive systematic review of Moodle use for online teaching and learning covers a wide range of educational institutions. The review identifies methods used and developments over the last 6 years, published in 155 journal articles across 104 journals from 55 countries and 10 disciplines. The findings have been summarised bibliographically and thematically where appropriate, providing vital information to educators, researchers, and software developers. A critical limitation of this review is that only the Scopus and Web of Science databases were searched, so papers not covered by either database are not included in this analysis.
The bibliographic analysis identified Moodle as a well-established and advanced learning platform for multiple disciplines, particularly used in STEM education. Most of the literature (75%) focuses on university settings, with the majority (96%) on undergraduate studies. The bibliographic analysis shows an increasing trend in Moodle educational research and provides information about the top journals, leading authors, keywords, and highly cited papers. The thematic analysis finds that Moodle is a powerful tool used to support learning in various ways. Both educators and students benefit from using the Moodle LMS, although currently to varying degrees. The most prevalent tools being used are Moodle "quizzes" and "workshops", and the external tools most easily embedded into the Moodle system are videos, virtual tours, and e-portfolios. Moodle enables the creativity of individual teachers to develop course-specific materials for students. In addition, Moodle saves time through randomly generated tests, questions with multiple possible answers, automated marking systems and rubrics, and positive, motivational automatic summative and formative feedback. There is strong evidence that Moodle increases student engagement, performance, and satisfaction while enhancing flexibility in their learning environments. Areas showing rapid growth in research are adaptive content and assessment development, improvements in data security, and user verification. Regardless of recent advancements in online teaching and learning, some studies report numerous fundamental gaps and drawbacks.
The gaps identified in this review are significant for future research. They include comparing Moodle with other LMSs and examining the many e-learning tools and associated plug-ins available on the market but not yet analysed in educational research. Future research could focus on aspects pivotal to e-learning success: features, integration, cost, and security. Further research is needed to outline Moodle e-learning experiences in primary and secondary education settings, with qualitative studies needed that particularly focus on teachers' perspectives in tertiary education settings. As only 5% of the studies considered educational theories, future research needs to strengthen the theoretical underpinning of studies. Existing educational theories could be used to theorise the efficiency of content developments and the effectiveness of online study materials and assignments. Data gathering and statistical tools embedded in LMSs, combined with theoretical frameworks, could lead to insightful research. As only 10% of articles discussed ethical aspects, more publications are needed to analyse the ethical issues associated with e-learning, particularly the increasing number of artificial intelligence tools. More research on these aspects will help educators utilise LMSs for successful online or blended course development. As this review is based only on published articles, more applications of Moodle might be occurring, particularly in developing countries. Therefore, future work could examine statistics of Moodle usage rather than published papers.
Availability of data and materials
The data sets used and analysed during the current study are available from the corresponding author on request.
Abuhassna, H., & Yahaya, N. (2018). Students’ Utilization of distance learning through an interventional online module based on Moore Transactional Distance theory. Eurasia Journal of Mathematics, Science and Technology Education, 14(7), 3043–3052. https://doi.org/10.29333/ejmste/91606
Adesemowo, A. K., Johannes, H., Goldstone, S., & Terblanche, K. (2016). The experience of introducing secure e-assessment in a South African university first-year foundational ICT networking course. Africa Education Review, 13(1), 67–86. https://doi.org/10.1080/18146627.2016.1186922
Al-Ajlan, A., & Zedan, H. (2008). Why Moodle? Paper presented at the 12th IEEE International Workshop on Future Trends of Distributed Computing Systems, 2008. https://doi.org/10.1109/ftdcs13956.2008
Albano, G., & Dello Iacono, U. (2019). GeoGebra in e-learning environments: A possible integration in mathematics and beyond. Journal of Ambient Intelligence and Humanized Computing, 10(11), 4331–4343. https://doi.org/10.1007/s12652-018-1111-x
Aljawarneh, S. A. (2020). Reviewing and exploring innovative ubiquitous learning tools in higher education. Journal of Computing in Higher Education, 32(1), 57–73. https://doi.org/10.1007/s12528-019-09207-0
Alkholy, S., Gendron, F., Dahms, T., & Ferreira, M. P. (2015). Assessing student perceptions of indigenous science co-educators, interest in STEM, and identity as a scientist: A pilot study. Ubiquitous Learning, 7(3–4), 41–51.
Altinpulluk, H., & Kesim, M. (2021). A systematic review of the tendencies in the use of learning management systems. The Turkish Online Journal of Distance Education, 22(3), 40–54. https://doi.org/10.17718/tojde.961812
Amoako, P. Y. O., & Osunmakinde, I. O. (2020). Emerging bimodal biometrics authentication for non-venue-based assessments in open distance e-learning (OdeL) environments. International Journal of Technology Enhanced Learning, 12(2), 218–244. https://doi.org/10.1504/IJTEL.2020.106287
Araya, R., & Collanqui, P. (2021). Are cross-border classes feasible for students to collaborate in the analysis of energy efficiency strategies for socioeconomic development while keeping CO2 concentration controlled? Sustainability (Basel, Switzerland), 13(3), 1–20. https://doi.org/10.3390/su13031584
ArchMiller, A., Fieberg, J., Walker, J. D., & Holm, N. (2017). Group peer assessment for summative evaluation in a graduate-level statistics course for ecologists. Assessment and Evaluation in Higher Education, 42(8), 1208–1220. https://doi.org/10.1080/02602938.2016.1243219
Ardianti, S., Sulisworo, D., Pramudya, Y., & Raharjo, W. (2020). The impact of the use of STEM education approach on the blended learning to improve student’s critical thinking skills. Universal Journal of Educational Research, 8(3B), 24–32. https://doi.org/10.13189/ujer.2020.081503
Awofeso, N., Hassan, M., & Hamidi, S. (2016). Individual and collaborative technology-mediated learning using question & answer online discussion forums: Perceptions of public health learners in Dubai UAE. Open Learning, 31(1), 54–63. https://doi.org/10.1080/02680513.2015.1120662
Azevedo, J. M., Oliveira, E. P., & Beites, P. D. (2019). Using learning analytics to evaluate the quality of multiple-choice questions: A perspective with Classical Test Theory and Item Response Theory. The International Journal of Information and Learning Technology, 36(4), 322–341. https://doi.org/10.1108/IJILT-02-2019-0023
Babo, R., Babo, L. V., Suhonen, J. T., & Tukiainen, M. (2020). E-assessment with multiple-choice questions: A 5-year study of students’ opinions and experience. Journal of Information Technology Education: Innovations in Practice, 19, 1–29. https://doi.org/10.28945/4491
Babo, R., & Suhonen, J. (2018). E-assessment with multiple choice questions: A qualitative study of teachers’ opinions and experience regarding the new assessment strategy. International Journal of Learning Technology, 13(3), 220–248. https://doi.org/10.1504/IJLT.2018.095964
Badia, A., Martín, D., & Gómez, M. (2019). Teachers’ perceptions of the use of Moodle activities and their learning impact in secondary education. Technology, Knowledge and Learning, 24(3), 483–499. https://doi.org/10.1007/s10758-018-9354-3
Basol, G., & Balgalmis, E. (2016). A multivariate investigation of gender differences in the number of online tests received-checking for perceived self-regulation. Computers in Human Behavior, 58, 388–397. https://doi.org/10.1016/j.chb.2016.01.010
Bernacki, M. L., Vosicka, L., & Utz, J. C. (2020). Can a brief, digital skill training intervention help undergraduates “learn to learn” and improve their STEM achievement? Journal of Educational Psychology, 112(4), 765–781. https://doi.org/10.1037/edu0000405
Botelho, M., Gao, X., & Bhuyan, S. Y. (2020). Mixed-methods analysis of videoed expert-student dialogue supporting clinical competence assessments. European Journal of Dental Education, 24(3), 398–406. https://doi.org/10.1111/eje.12515
Brateanu, A., Strang, T. M., Garber, A., Mani, S., Spencer, A., Spevak, B., Thomascik, J., Mehta, N., & Colbert, C. Y. (2019). Using an adaptive, self-directed web-based learning module to enhance residents’ medical knowledge prior to a new clinical rotation. Medical Science Educator, 29(3), 779–786. https://doi.org/10.1007/s40670-019-00772-8
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa
Byrnes, K. G., Kiely, P. A., Dunne, C. P., McDermott, K. W., & Coffey, J. C. (2021). Communication, collaboration and contagion: “Virtualisation” of anatomy during COVID-19. Clinical Anatomy, 34(1), 82–89. https://doi.org/10.1002/ca.23649
Cadaret, C. N., & Yates, D. T. (2018). Retrieval practice in the form of online homework improved information retention more when spaced 5 days rather than 1 day after class in two physiology courses. Advances in Physiology Education, 42(2), 305–310. https://doi.org/10.1152/advan.00104.2017
Cakiroglu, U., Erdogdu, F., Kokoc, M., & Atabay, M. (2017). Student’s preference in online assessment process: Influence on academic performance. The Turkish Online Journal of Distance Education, 18(1), 132–132. https://doi.org/10.17718/tojde.285721
Campbell, L. O., Heller, S., & Pulse, L. (2020). Student-created video: An active learning approach in online environments. Interactive Learning Environments. https://doi.org/10.1080/10494820.2020.1711777
Capterra (2021). LMS software. https://www.capterra.com/learning-management-system-software/?feature=%5B38347%5D&sortOrder=sponsored. Accessed 15 Oct 2021.
Chafiq, N., Talbi, M., & Ghazouani, M. (2018). Design and implementation of a risk management tool: A case study of the Moodle platform. International Journal of Advanced Computer Science and Applications, 9(8), 458–461.
Chang Chan, A.Y.-C., Custer, E. J. F. M., van Leeuwen, M. S., Bleys, R. L. A. W., & ten Cate, O. (2019). Correction to: Does an additional online anatomy course improve performance of medical students on gross anatomy examinations? Medical Science Educator, 29(3), 891–891. https://doi.org/10.1007/s40670-019-00758-6
Chaparro-Peláez, J., Iglesias-Pradas, S., Rodríguez-Sedano, F. J., & Acquila-Natale, E. (2019). Extraction, processing and visualization of peer assessment data in Moodle. Applied Sciences, 10(1), 163. https://doi.org/10.3390/app10010163
Chemsi, G., Sadiq, M., Radid, M., & Talbi, M. (2020). Study of the self-determined motivation among students in the context of online pedagogical activities. International Journal of Emerging Technologies in Learning, 15(5), 17–29. https://doi.org/10.3991/ijet.v15i05.11392
Chootongchai, S., & Songkram, N. (2018). Design and development of SECI and Moodle online learning systems to enhance thinking and innovation skills for higher education learners. International Journal of Emerging Technologies in Learning, 13(3), 154–172. https://doi.org/10.3991/ijet.v13i03.7991
Christopoulos, A., Pellas, N., & Laakso, M.-J. (2020). A learning analytics theoretical framework for STEM education virtual reality applications. Education Sciences, 10(11), 317. https://doi.org/10.3390/educsci10110317
Clarke, V., & Braun, V. (2014). Thematic analysis. In T. Teo (Ed.), Encyclopedia of critical psychology (pp. 1947–1952). Springer.
Colares, G. S., Dell’Osbel, N., Wiesel, P. G., Oliveira, G. A., Lemos, P. H. Z., da Silva, F. P., Lutterbeck, C. A., Kist, L. T., & Machado, Ê. L. (2020). Floating treatment wetlands: A review and bibliometric analysis. The Science of the Total Environment, 714, 136776–136776. https://doi.org/10.1016/j.scitotenv.2020.136776
Conejo, R., Guzmán, E., & Trella, M. (2016). The SIETTE automatic assessment environment. International Journal of Artificial Intelligence in Education, 26(1), 270–292. https://doi.org/10.1007/s40593-015-0078-4
Conijn, R., Snijders, C., Kleingeld, A., & Matzat, U. (2017). Predicting student performance from LMS data: A comparison of 17 blended courses using Moodle LMS. IEEE Transactions on Learning Technologies, 10(1), 17–29. https://doi.org/10.1109/TLT.2016.2616312
de Souza, M. P., Hoeltz, M., Brittes Benitez, L., Machado, Ê. L., & de Souza Schneider, R. C. (2019). Microalgae and clean technologies: A review. Clean: Soil, Air, Water, 47(11), 1800380. https://doi.org/10.1002/clen.201800380
Daniels, P., & Iwago, K. (2017). The suitability of cloud-based speech recognition engines for language learning. JALT CALL Journal, 13(3), 229–239.
Dias, S. B., Hadjileontiadou, S. J., Diniz, J., & Hadjileontiadis, L. J. (2020). Deep LMS: A deep learning predictive model for supporting online learning in the Covid-19 era. Scientific Reports, 10(1), 19888–19888. https://doi.org/10.1038/s41598-020-76740-9
Dimic, G., Predic, B., Rancic, D., Petrovic, V., Macek, N., & Spalevic, P. (2018). Association analysis of Moodle e-tests in blended learning educational environment. Computer Applications in Engineering Education, 26(3), 417–430. https://doi.org/10.1002/cae.21894
Divjak, B., & Maretić, M. (2017). Learning analytics for peer-assessment: (Dis)advantages, reliability and implementation. Journal of Information and Organizational Sciences, 41(1), 21–34. https://doi.org/10.31341/jios.41.1.2
Dolezal, D., Posekany, A., Roschger, C., Koppensteiner, G., Motschnig, R., & Pucher, R. (2018). Person-centered learning using peer review method: An evaluation and a concept for student-centered classrooms. International Journal of Engineering Pedagogy, 8(1), 127–147. https://doi.org/10.3991/ijep.v8i1.8099
Dominguez, M., Bernacki, M. L., & Uesbeck, P. M. (2016). Predicting STEM achievement with learning management system data: Prediction modeling and a test of an early warning system. Paper presented at the EDM.
El Tantawi, M. M. A., Abdelsalam, M. M., Mourady, A. M., & Elrifae, I. M. B. (2015). e-Assessment in a limited-resources dental school using an open-source learning management system. Journal of Dental Education, 79(5), 571–583. https://doi.org/10.1002/j.0022-0337.2015.79.5.tb05917.x
Fenu, G., Marras, M., & Meles, M. (2017). A learning analytics tool for usability assessment in Moodle environments. Journal of e-Learning and Knowledge Society, 13(3), 23–34. https://doi.org/10.20368/1971-8829/1388
Finogeev, A., Gamidullaeva, L., Bershadsky, A., Fionova, L., Deev, M., & Finogeev, A. (2020). Convergent approach to synthesis of the information learning environment for higher education. Education and Information Technologies, 25(1), 11–30. https://doi.org/10.1007/s10639-019-09903-5
Gamage, S. H. P. W., Ayres, J. R., Behrend, M. B., & Smith, E. J. (2019). Optimising Moodle quizzes for online assessments. International Journal of STEM Education, 6(1), 1–14. https://doi.org/10.1186/s40594-019-0181-4
Gaona, J., Reguant, M., Valdivia, I., Vásquez, M., & Sancho-Vinuesa, T. (2018). Feedback by automatic assessment systems used in mathematics homework in the engineering field. Computer Applications in Engineering Education, 26(4), 994–1007. https://doi.org/10.1002/cae.21950
García López, A., & García Mazarío, F. (2016). The use of technology in a model of formative assessment. Journal of Technology and Science Education, 6(2), 91–103. https://doi.org/10.3926/jotse.190
García-Martín, J., & García-Sánchez, J.-N. (2020). The effectiveness of four instructional approaches used in a MOOC promoting personal skills for success in life. Revista de Psicodidáctica (English ed.), 25(1), 36–44. https://doi.org/10.1016/j.psicoe.2019.08.001
Genci, J. (2014). About one way to discover cheating. 312, 83–90. Cham, Switzerland: Springer International Publishing.
George-Williams, S., Carroll, M.-R., Ziebell, A., Thompson, C., & Overton, T. (2019). Curtailing marking variation and enhancing feedback in large scale undergraduate chemistry courses through reducing academic judgement: A case study. Assessment and Evaluation in Higher Education, 44(6), 881–893. https://doi.org/10.1080/02602938.2018.1545897
Gray, W. G., Lado, M. J., Zhang, Z., Iskander, M. F., Garcia-Gorrostieta, J. M., Lopez-Lopez, A., & Gonzalez-Lopez, S. (2018). Automatic argument assessment of final project reports of computer engineering students. Computer Applications in Engineering Education, 26(5), 1217–1226. https://doi.org/10.1002/cae.21996
Guillén-Gámez, F. D., & García-Magariño, I. (2015). Use of facial authentication in E-learning: A study of how it affects students in different Spanish-speaking areas. International Journal of Technology Enhanced Learning, 7(3), 264–280. https://doi.org/10.1504/IJTEL.2015.072818
Guillen-Gamez, F. D., Garcia-Magarino, I., Bravo, J., & Plaza, I. (2015). Exploring the influence of facial verification software on student academic performance in online learning environments. International Journal of Engineering Education, 31(6A), 1622–1628.
Gutiérrez, I., Álvarez, V., Puerto Paule, M., Pérez-Pérez, J. R., & de Freitas, S. (2016). Adaptation in e-learning content specifications with dynamic sharable objects. Systems (Basel), 4(2), 24. https://doi.org/10.3390/systems4020024
Hempel, B., Kiehlbaugh, K., & Blowers, P. (2020). Scalable and practical teaching practices faculty can deploy to increase retention: A faculty cookbook for increasing student success. Education for Chemical Engineers, 33, 45–65. https://doi.org/10.1016/j.ece.2020.07.004
Henke, K., Nau, J., Bock, R. N., & Wuttke, H.-D. (2021). A hybrid online laboratory for basic STEM education. In V. L. Uskov, R. J. Howlett, & L. C. Jain (Eds.), Smart Education and e-Learning 2021 (Vol. 240, pp. 29–39). Springer.
Henrick, G. (2018). Moodle 2 interactive tool guide gets an interactive treatment. Moodle News. https://www.moodlenews.com/2015/moodle-2-interactive-tool-guide-gets-an-interactive-treatment/. Accessed 26 Feb 2019.
Hsiung, W. Y. (2018). The use of e-resources and innovative technology in transforming traditional teaching in chemistry and its impact on learning chemistry. International Journal of Interactive Mobile Technologies, 12(7), 86–96. https://doi.org/10.3991/ijim.v12i7.9666
Hussain, Y. A., & Jaeger, M. (2018). LMS-supported PBL assessment in an undergraduate engineering program: Case study. Computer Applications in Engineering Education, 26(5), 1915–1929. https://doi.org/10.1002/cae.22037
Hwang, C. S. (2020). Using continuous student feedback to course-correct during COVID-19 for a nonmajors chemistry course. Journal of Chemical Education, 97(9), 3400–3405. https://doi.org/10.1021/acs.jchemed.0c00808
ISCED. (2012). International Standard Classification of Education (ISCED) 2011. https://doi.org/10.15220/978-92-9189-123-8-en. Accessed 22 Jan 2021.
Jackson, E. A. (2017). Impact of MOODLE platform on the pedagogy of students and staff: Cross-curricular comparison. Education and Information Technologies, 22(1), 177–193. https://doi.org/10.1007/s10639-015-9438-9
Jones, D., Lotz, N., & Holden, G. (2021). A longitudinal study of virtual design studio (VDS) use in STEM distance design education. International Journal of Technology and Design Education, 31(4), 839–865. https://doi.org/10.1007/s10798-020-09576-z
Kamenez, N. V., Vaganova, O. I., Smirnova, Z. V., Bulayeva, M. N., Kuznetsova, E., & Maseleno, A. (2018). Experience of the use of electronic training in the educational process of the Russian higher educational institution. International Journal of Engineering and Technology (UAE), 7(4), 4085–4089.
Kaya, M., & Özel, S. A. (2015). Integrating an online compiler and a plagiarism detection tool into the Moodle distance education system for easy assessment of programming assignments. Computer Applications in Engineering Education, 23(3), 363–373. https://doi.org/10.1002/cae.21606
Kiennert, C., De Vos, N., Knockaert, M., & Garcia-Alfaro, J. (2019). The influence of conception paradigms on data protection in e-learning platforms: A case study. IEEE Access, 7, 64110–64119. https://doi.org/10.1109/ACCESS.2019.2915275
Koneru, I. (2017). Exploring Moodle functionality for managing Open Distance Learning e-assessments. The Turkish Online Journal of Distance Education, 18(4), 129–141. https://doi.org/10.17718/tojde.340402
Kouis, D., Kyprianos, K., Ermidou, P., Kaimakis, P., & Koulouris, A. (2020). A framework for assessing LMSs e-courses content type compatibility with learning styles dimensions. Journal of e-Learning and Knowledge Society, 16(2), 73–86. https://doi.org/10.20368/1971-8829/1135204
Kruger, D., Inman, S., Ding, Z., Kang, Y., Kuna, P., Liu, Y., Lu, X., Oro, S., & Wang, Y. (2015). Improving teacher effectiveness: Designing better assessment tools in learning management systems. Future Internet, 7(4), 484–499. https://doi.org/10.3390/fi7040484
Li, Y., Wang, K., Xiao, Y., & Froyd, J. E. (2020). Research and trends in STEM education: A systematic review of journal publications. International Journal of STEM Education, 7(1), 1–16. https://doi.org/10.1186/s40594-020-00207-6
Ljubimova, E. M., Galimullina, E. Z., & Ibatullin, R. R. (2015). The development of university students’ self-sufficiency based on interactive technologies by their immersion in the professional. International Education Studies, 8(4), 192. https://doi.org/10.5539/ies.v8n4p192
Marczak, M., Krajka, J., & Malec, W. (2016). Web-based assessment and language teachers-from Moodle to WebClass. International Journal of Continuing Engineering Education and Life Long Learning, 26(1), 44–59. https://doi.org/10.1504/IJCEELL.2016.075048
Marjanovic, U., Delić, M., & Lalic, B. (2016). Developing a model to assess the success of e-learning systems: Evidence from a manufacturing company in transitional economy. Information Systems and e-Business Management, 14(2), 253–272. https://doi.org/10.1007/s10257-015-0282-7
Marti, E., Gurguí, A., Gil, D., Hernández-Sabaté, A., Rocarias, J., & Poveda, F. (2015). PBL On Line: A proposal for the organization, part-time monitoring and assessment of PBL group activities. Journal of Technology and Science Education, 5(2), 87–96. https://doi.org/10.3926/jotse.145
Matazi, I., Messoussi, R., Bellmallem, S.-E., Oumaira, I., Bennane, A., & Touahni, R. (2018). Development of intelligent multi-agents system for collaborative e-learning support. Bulletin of Electrical Engineering and Informatics, 7(2), 294–305. https://doi.org/10.11591/eei.v7i2.860
McKenzie, W., & Roodenburg, J. (2017). Using PeerWise to develop a contributing student pedagogy for postgraduate psychology. Australasian Journal of Educational Technology, 33(1), 32–47. https://doi.org/10.14742/ajet.3169
McVey, M. (2016). Preservice teachers’ perception of assessment strategies in online teaching. Journal of Digital Learning in Teacher Education, 32(4), 119–127. https://doi.org/10.1080/21532974.2016.1205460
Meza-Fernández, S., & Sepúlveda-Sariego, A. (2017). Representational model on Moodle’s activity: Learning styles and navigation strategies. International Journal of Educational Technology in Higher Education, 14(1), 1–9. https://doi.org/10.1186/s41239-017-0052-3
Moher, D., Shamseer, L., Clarke, M., Ghersi, D., Liberati, A., Petticrew, M., et al. (2015). Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Systematic Reviews, 4(1), 1–9. https://doi.org/10.1186/2046-4053-4-1
Monllaó Olivé, D., Huynh, D. Q., Reynolds, M., Dougiamas, M., & Wiese, D. (2020). A supervised learning framework: Using assessment to identify students at risk of dropping out of a MOOC. Journal of Computing in Higher Education, 32(1), 9–26. https://doi.org/10.1007/s12528-019-09230-1
Moodle Project. (2020a). Moodle statistics. https://stats.moodle.org/. Accessed 20 Oct 2020.
Moodle Project. (2020b). Moodle Plug-ins. https://moodle.org/plugins/?q=. Accessed 12 Jan 2021.
Mudiyanselage, A. K., & Pan, L. (2020). Security test MOODLE: A penetration testing case study. International Journal of Computers & Applications, 42(4), 372–382. https://doi.org/10.1080/1206212X.2017.1396413
Neitola, M. T. T. (2019). Circuit theory e-assessment realized in an open-source learning environment. International Journal of Engineering Pedagogy, 9(1), 4–18. https://doi.org/10.3991/ijep.v9i1.9072
Nunes, F. B., Herpich, F., Voss, G. B., Lima, J. V. D., & Medina, R. D. (2015). An adaptive environment based on Moodle with treating of quality of context. International Journal of Knowledge and Learning, 10(2), 198–221. https://doi.org/10.1504/IJKL.2015.071618
Oguguo, B. C. E., Nannim, F. A., Agah, J. J., Ugwuanyi, C. S., Ene, C. U., & Nzeadibe, A. C. (2021). Effect of learning management system on student’s performance in educational measurement and evaluation. Education and Information Technologies, 26(2), 1471–1483. https://doi.org/10.1007/s10639-020-10318-w
Paiva, R. C., Ferreira, M. S., Mendes, A. G., & Eusébio, A. M. J. (2015). Interactive and multimedia contents associated with a system for computer-aided assessment. Journal of Educational Computing Research, 52(2), 224–256. https://doi.org/10.1177/0735633115571305
Park, Y., & Jo, I.-H. (2017). Using log variables in a learning management system to evaluate learning activity using the lens of activity theory. Assessment and Evaluation in Higher Education, 42(4), 531–547. https://doi.org/10.1080/02602938.2016.1158236
Phungsuk, R., Viriyavejakul, C., & Ratanaolarn, T. (2017). Development of a problem-based learning model via a virtual learning environment. Kasetsart Journal of Social Sciences, 38(3), 297–306. https://doi.org/10.1016/j.kjss.2017.01.001
Popovic, N., Popovic, T., Dragovic, I. R., & Cmiljanic, O. (2018). A Moodle-based blended learning solution for physiology education in Montenegro: A case study. Advances in Physiology Education, 42(1), 111–117. https://doi.org/10.1152/ADVAN.00155.2017
Price, E., Lau, A. C., Goldberg, F., Turpen, C., Smith, P. S., Dancy, M., & Robinson, S. (2021). Analyzing a faculty online learning community as a mechanism for supporting faculty implementation of a guided-inquiry curriculum. International Journal of STEM Education, 8(1), 17–17. https://doi.org/10.1186/s40594-020-00268-7
Rachman-Elbaum, S., Stark, A. H., Kachal, J., Johnson, T., & Porat-Katz, B. S. (2017). Online training introduces a novel approach to the Dietetic Care Process documentation. Nutrition & Dietetics, 74(4), 365–371. https://doi.org/10.1111/1747-0080.12331
Ramírez-Noriega, A., Juárez-Ramírez, R., Jiménez, S., Inzunza, S., & Martínez-Ramírez, Y. (2018). Ashur: Evaluation of the relation summary-content without human reference using rouge. Computing and Informatics, 37(2), 509–532. https://doi.org/10.4149/cai_2018_2_509
Raza, S. A., Qazi, W., Khan, K. A., & Salam, J. (2021). Social isolation and acceptance of the Learning Management System (LMS) in the time of COVID-19 pandemic: An expansion of the UTAUT model. Journal of Educational Computing Research, 59(2), 183–208. https://doi.org/10.1177/0735633120960421
Rissanen, A., & Costello, J. M. (2021). The effectiveness of interactive online tutorials in first-year large biology course. Journal of Applied Research in Higher Education. https://doi.org/10.1108/JARHE-09-2020-0312
Ross, R. (2017). MoodleNFC: Integrating smart student ID cards with Moodle for laboratory assessment. Australasian Journal of Engineering Education, 22(2), 73–80. https://doi.org/10.1080/22054952.2017.1414557
Saleh, M., & Salama, R. M. (2018). Recommendations for building adaptive cognition-based e-learning. International Journal of Advanced Computer Science and Applications, 9(8), 385–393.
Sancho-Vinuesa, T., Masià, R., Fuertes-Alpiste, M., & Molas-Castells, N. (2018). Exploring the effectiveness of continuous activity with automatic feedback in online calculus. Computer Applications in Engineering Education, 26(1), 62–74. https://doi.org/10.1002/cae.21861
Saqr, M., Fors, U., & Tedre, M. (2017). How learning analytics can early predict under-achieving students in a blended medical education course. Medical Teacher, 39(7), 757–767. https://doi.org/10.1080/0142159X.2017.1309376
Schweighofer, J., Taraghi, B., & Ebner, M. (2019). Development of a quiz: Implementation of a (self-) assessment tool and its integration in Moodle. International Journal of Emerging Technologies in Learning, 14(23), 141–151. https://doi.org/10.3991/ijet.v14i23.11484
Sergis, S., Vlachopoulos, P., Sampson, D. G., & Pelliccione, L. (2017). Implementing teaching model templates for supporting flipped classroom-enhanced STEM education in Moodle. In A. Marcus-Quinn & T. Hourigan (Eds.), Handbook on Digital Learning for K-12 Schools (pp. 191–215). Springer International Publishing.
Setiadi, P. M., Alia, D., Sumardi, S., Respati, R., & Nur, L. (2021). Synchronous or asynchronous? Various online learning platforms studied in Indonesia 2015–2020. Journal of Physics: Conference Series, 1987. Bristol: IOP Publishing.
Shdaifat, A. M., & Obeidallah, R. (2019). Quiz tool within Moodle and Blackboard mobile applications. International Journal of Interactive Mobile Technologies, 13(8), 32–42. https://doi.org/10.3991/ijim.v13i08.10552
Shkoukani, M. (2019). Explore the major characteristics of learning management systems and their impact on e-learning success. International Journal of Advanced Computer Science and Applications, 10(1), 296–301.
Singh, J. (2015). Moodle Statistics – Moodle now has more than 78 million users all over the world #MoodleWorld #Moodle. https://www.lmspulse.com/2015/moodle-statistics-moodle-now-has-more-than-78-million-users-all-over-the-world-moodleworld-moodle/. Accessed 10 Oct 2020.
Slee, N. J. D., & Jacobs, M. H. (2017). Trialling the use of Google Apps together with online marking to enhance collaborative learning and provide effective feedback [version 2; peer review: 2 approved with reservations]. F1000Research, 4, 177. https://doi.org/10.12688/f1000research.6520.2
Smolyaninova, O., & Bezyzvestnykh, E. (2019). Implementing teachers’ training technologies at a federal university: E-portfolio, digital laboratory, PROLog Module System. International Journal of Online and Biomedical Engineering, 15(4), 69–87. https://doi.org/10.3991/ijoe.v15i04.9288
Strang, K. D. (2015). Effectiveness of peer assessment in a professionalism course using an online workshop. Journal of Information Technology Education: Innovations in Practice, 14(1), 1–16.
Strang, K. D. (2016). Predicting student satisfaction and outcomes in online courses using learning activity indicators. Journal of Interactive Learning Research, 27(2), 125–152. https://doi.org/10.4018/IJWLTT.2017010103
Tlili, A., Denden, M., Essalmi, F., Jemni, M., Chang, M., Kinshuk, K., & Chen, N.-S. (2019). Automatic modeling learner’s personality using learning analytics approach in an intelligent Moodle learning platform. Interactive Learning Environments. https://doi.org/10.1080/10494820.2019.1636084
Tsai, M.-H., & Tang, Y.-C. (2017). Learning attitudes and problem-solving attitudes for blended problem-based learning. Library Hi Tech, 35(4), 615–628. https://doi.org/10.1108/LHT-06-2017-0102
Valero, G., & Cárdenas, P. (2017). Formative and summative assessment in veterinary pathology and other courses at a Mexican veterinary college. Journal of Veterinary Medical Education, 44(2), 331–337. https://doi.org/10.3138/jvme.1015-169R
van Eck, N. J., & Waltman, L. (2020). VOSviewer manual. http://www.vosviewer.com/documentation/Manual_VOSviewer_1.6.1.pdf. Accessed 14 July 2020.
Villa, V., Motyl, B., Paderno, D., & Baronio, G. (2018). TDEG based framework and tools for innovation in teaching technical drawing: The example of LaMoo project. Computer Applications in Engineering Education, 26(5), 1293–1305. https://doi.org/10.1002/cae.22022
Wang, F. H. (2019). On the relationships between behaviors and achievement in technology-mediated flipped classrooms: A two-phase online behavioral PLS-SEM model. Computers and Education, 142, 103653. https://doi.org/10.1016/j.compedu.2019.103653
Wilson, M. J., Diao, M. M., & Huang, L. (2015). “I’m not here to learn how to mark someone else’s stuff”: An investigation of an online peer-to-peer review workshop tool. Assessment and Evaluation in Higher Education, 40(1), 15–32. https://doi.org/10.1080/02602938.2014.881980
Xiao, L. L., & Rahman, S. S. B. A. (2017). Predicting learning styles based on students’ learning behaviour using correlation analysis. Current Science (Bangalore), 113(11), 2090–2096. https://doi.org/10.18520/cs/v113/i11/2090-2096
Xin, N. S., Shibghatullah, A. S., Subaramaniam, K. A. P., & Wahab, M. H. A. (2021). A systematic review for online learning management system. Journal of Physics: Conference Series, 1874(1), 012030. https://doi.org/10.1088/1742-6596/1874/1/012030
Zakaria, N. A., Saharudin, M. S., Yusof, R., & Abidin, Z. Z. (2019). Code pocket: Development of interactive online learning of STEM’s subject. International Journal of Recent Technology and Engineering, 8(2), 5537–5542. https://doi.org/10.35940/ijrte.B3297.078219
Zamalia, M., & Porter, A. L. (2016). Students’ perceived understanding and competency in probability concepts in an e-learning environment: An Australian experience. Pertanika Journal of Social Science and Humanities, 24, 73–82.
Zhao, D., Chis, A., Muntean, G., & Muntean, C. (2018). A large-scale pilot study on game-based learning and blended learning methodologies in undergraduate programming courses. Paper presented at the International Conference on Education and New Learning Technologies (EDULEARN).
Zheng, J., Xing, W., & Zhu, G. (2019). Examining sequential patterns of self- and socially shared regulation of STEM learning in a CSCL environment. Computers and Education, 136, 34–48. https://doi.org/10.1016/j.compedu.2019.03.005
Funding
This study is funded by NBERC Teaching & Learning (T&L) Seed Funding – University of South Australia, 2020.
Ethics approval and consent to participate
N/A. This is a metadata analysis based on published literature.
Competing interests
The authors declare that they have no competing interests.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Gamage, S.H.P.W., Ayres, J.R. & Behrend, M.B. A systematic review on trends in using Moodle for teaching and learning. IJ STEM Ed 9, 9 (2022). https://doi.org/10.1186/s40594-021-00323-x
Keywords
- Learning management systems
- Thematic analysis