

Assessing the Impact of Online-Learning Effectiveness and Benefits in Knowledge Management, the Antecedent of Online-Learning Strategies and Motivations: An Empirical Study


Article outline:

  • 1. Introduction
  • 2. Literature Review and Research Hypothesis
  • 2.1. Online-Learning Self-Efficacy Terminology
  • 2.2. Online-Learning Monitoring Terminology
  • 2.3. Online-Learning Confidence in Technology Terminology
  • 2.4. Online-Learning Willpower Terminology
  • 2.5. Online-Learning Attitude Terminology
  • 2.6. Online-Learning Motivation Terminology
  • 2.7. Online-Learning Strategies and Online-Learning Effectiveness Terminology
  • 2.8. Online-Learning Effectiveness Terminology
  • 3. Research Method
  • 3.1. Instruments
  • 3.2. Data Analysis and Results
  • 4.1. Reliability and Validity Analysis
  • 4.2. Hypothesis Result
  • 5. Discussion
  • 6. Conclusions
  • 7. Limitations and Future Directions
  • Author Contributions
  • Institutional Review Board Statement
  • Informed Consent Statement
  • Data Availability Statement
  • Conflicts of Interest



| Variables | Category | Frequency | Percentage |
| --- | --- | --- | --- |
| Gender | Male | 243 | 51.81 |
| | Female | 226 | 48.19 |
| Education program level | Undergraduate program | 210 | 44.78 |
| | Master program | 154 | 32.84 |
| | Doctoral program | 105 | 22.39 |
| Online learning tools | Smartphone | 255 | 54.37 |
| | Computer/PC | 125 | 26.65 |
| | Tablet | 89 | 18.98 |
| Online learning media | Google Meet | 132 | 28.14 |
| | Microsoft Teams | 99 | 21.11 |
| | Zoom | 196 | 41.79 |
| | Others | 42 | 8.96 |

| Construct | Measurement Item | Factor Loading/Coefficient (t-Value) | AVE | Composite Reliability | Cronbach's Alpha |
| --- | --- | --- | --- | --- | --- |
| Online Learning Benefit (LBE) | LBE1 | 0.88 | 0.68 | 0.86 | 0.75 |
| | LBE2 | 0.86 | | | |
| | LBE3 | 0.71 | | | |
| Online-learning effectiveness (LEF) | LEF1 | 0.83 | 0.76 | 0.90 | 0.84 |
| | LEF2 | 0.88 | | | |
| | LEF3 | 0.90 | | | |
| Online-learning motivation (LMT) | LMT1 | 0.86 | 0.77 | 0.91 | 0.85 |
| | LMT2 | 0.91 | | | |
| | LMT3 | 0.85 | | | |
| Online-learning strategies (LST) | LST1 | 0.90 | 0.75 | 0.90 | 0.84 |
| | LST2 | 0.87 | | | |
| | LST3 | 0.83 | | | |
| Online-learning attitude (OLA) | OLA1 | 0.89 | 0.75 | 0.90 | 0.84 |
| | OLA2 | 0.83 | | | |
| | OLA3 | 0.87 | | | |
| Online-learning confidence-in-technology (OLC) | OLC1 | 0.87 | 0.69 | 0.87 | 0.76 |
| | OLC2 | 0.71 | | | |
| | OLC3 | 0.89 | | | |
| Online-learning monitoring (OLM) | OLM1 | 0.88 | 0.75 | 0.89 | 0.83 |
| | OLM2 | 0.91 | | | |
| | OLM3 | 0.79 | | | |
| Online-learning self-efficacy (OLS) | OLS1 | 0.79 | 0.64 | 0.84 | 0.73 |
| | OLS2 | 0.81 | | | |
| | OLS3 | 0.89 | | | |
| Online-learning willpower (OLW) | OLW1 | 0.91 | 0.69 | 0.87 | 0.77 |
| | OLW2 | 0.84 | | | |
| | OLW3 | 0.73 | | | |
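
The reliability figures in the table above follow from the factor loadings by standard formulas. The sketch below is an illustrative check written for this summary, not code from the paper: it recomputes AVE and composite reliability for the LBE construct from the reported loadings (Cronbach's alpha is omitted because it requires the raw item data).

```python
# Illustrative check (not from the paper): reproduce AVE and composite
# reliability (CR) from the standardized factor loadings reported above,
# using AVE = mean(loading^2) and
# CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of (1 - loading^2)).

def ave(loadings):
    # Average variance extracted: mean of the squared standardized loadings.
    return sum(l ** 2 for l in loadings) / len(loadings)

def composite_reliability(loadings):
    # Error variance of each standardized indicator is 1 - loading^2.
    total = sum(loadings)
    error = sum(1 - l ** 2 for l in loadings)
    return total ** 2 / (total ** 2 + error)

# Loadings of the Online Learning Benefit (LBE) items from the table above.
lbe = [0.88, 0.86, 0.71]
print(round(ave(lbe), 2), round(composite_reliability(lbe), 2))
# Prints 0.67 0.86 -- close to the reported 0.68 and 0.86; the small gap in AVE
# comes from the loadings being reported to only two decimals.
```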

| | LBE | LEF | LMT | LST | OLA | OLC | OLM | OLS | OLW |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| LBE | | | | | | | | | |
| LEF | 0.82 | | | | | | | | |
| LMT | 0.81 | 0.80 | | | | | | | |
| LST | 0.80 | 0.84 | 0.86 | | | | | | |
| OLA | 0.69 | 0.63 | 0.78 | 0.81 | | | | | |
| OLC | 0.76 | 0.79 | 0.85 | 0.79 | 0.72 | | | | |
| OLM | 0.81 | 0.85 | 0.81 | 0.76 | 0.63 | 0.83 | | | |
| OLS | 0.71 | 0.59 | 0.69 | 0.57 | 0.56 | 0.69 | 0.75 | | |
| OLW | 0.75 | 0.75 | 0.80 | 0.74 | 0.64 | 0.81 | 0.80 | 0.79 | |

| Item | LBE | LEF | LMT | LST | OLA | OLC | OLM | OLS | OLW |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| LBE1 | 0.88 | 0.76 | 0.87 | 0.66 | 0.54 | 0.79 | 0.78 | 0.63 | 0.74 |
| LBE2 | 0.86 | 0.68 | 0.74 | 0.63 | 0.57 | 0.75 | 0.91 | 0.73 | 0.79 |
| LBE3 | 0.71 | 0.54 | 0.59 | 0.71 | 0.63 | 0.55 | 0.50 | 0.36 | 0.53 |
| LEF1 | 0.63 | 0.83 | 0.72 | 0.65 | 0.51 | 0.62 | 0.69 | 0.46 | 0.57 |
| LEF2 | 0.77 | 0.88 | 0.78 | 0.71 | 0.55 | 0.73 | 0.78 | 0.52 | 0.69 |
| LEF3 | 0.72 | 0.90 | 0.80 | 0.83 | 0.57 | 0.72 | 0.76 | 0.58 | 0.69 |
| LMT1 | 0.88 | 0.76 | 0.87 | 0.66 | 0.54 | 0.79 | 0.78 | 0.63 | 0.74 |
| LMT2 | 0.79 | 0.89 | 0.91 | 0.79 | 0.62 | 0.73 | 0.88 | 0.61 | 0.67 |
| LMT3 | 0.72 | 0.65 | 0.85 | 0.77 | 0.89 | 0.72 | 0.67 | 0.59 | 0.69 |
| LST1 | 0.61 | 0.63 | 0.68 | 0.90 | 0.78 | 0.64 | 0.57 | 0.39 | 0.57 |
| LST2 | 0.74 | 0.59 | 0.72 | 0.87 | 0.78 | 0.68 | 0.61 | 0.48 | 0.63 |
| LST3 | 0.72 | 0.90 | 0.80 | 0.83 | 0.57 | 0.72 | 0.76 | 0.58 | 0.69 |
| OLA1 | 0.72 | 0.65 | 0.85 | 0.79 | 0.89 | 0.72 | 0.67 | 0.59 | 0.69 |
| OLA2 | 0.51 | 0.48 | 0.55 | 0.59 | 0.83 | 0.58 | 0.47 | 0.42 | 0.43 |
| OLA3 | 0.52 | 0.44 | 0.55 | 0.70 | 0.87 | 0.55 | 0.43 | 0.39 | 0.47 |
| OLC1 | 0.78 | 0.70 | 0.73 | 0.65 | 0.53 | 0.87 | 0.77 | 0.65 | 0.91 |
| OLC2 | 0.51 | 0.53 | 0.57 | 0.62 | 0.75 | 0.71 | 0.46 | 0.39 | 0.47 |
| OLC3 | 0.81 | 0.73 | 0.78 | 0.69 | 0.55 | 0.89 | 0.80 | 0.66 | 0.75 |
| OLM1 | 0.79 | 0.89 | 0.91 | 0.79 | 0.62 | 0.73 | 0.88 | 0.61 | 0.69 |
| OLM2 | 0.86 | 0.68 | 0.74 | 0.63 | 0.57 | 0.75 | 0.91 | 0.73 | 0.79 |
| OLM3 | 0.69 | 0.55 | 0.57 | 0.47 | 0.39 | 0.67 | 0.79 | 0.61 | 0.73 |
| OLS1 | 0.41 | 0.23 | 0.35 | 0.28 | 0.39 | 0.41 | 0.40 | 0.69 | 0.49 |
| OLS2 | 0.45 | 0.41 | 0.48 | 0.38 | 0.43 | 0.48 | 0.52 | 0.81 | 0.49 |
| OLS3 | 0.75 | 0.66 | 0.72 | 0.60 | 0.49 | 0.69 | 0.77 | 0.89 | 0.82 |
| OLW1 | 0.78 | 0.70 | 0.73 | 0.65 | 0.53 | 0.87 | 0.77 | 0.65 | 0.91 |
| OLW2 | 0.75 | 0.65 | 0.71 | 0.59 | 0.51 | 0.69 | 0.77 | 0.87 | 0.84 |
| OLW3 | 0.57 | 0.49 | 0.54 | 0.59 | 0.57 | 0.57 | 0.53 | 0.39 | 0.73 |

| Hypothesis | Path | Standardized Path Coefficient | t-Value | Result |
| --- | --- | --- | --- | --- |
| H1 | OLS → LST | 0.29 *** | 2.14 | Accepted |
| H2 | OLM → LST | 0.24 *** | 2.29 | Accepted |
| H3 | OLC → LST | 0.28 *** | 1.99 | Accepted |
| H4 | OLC → LMT | 0.36 *** | 2.96 | Accepted |
| H5 | OLW → LMT | 0.26 *** | 2.55 | Accepted |
| H6 | OLA → LMT | 0.34 *** | 4.68 | Accepted |
| H7 | LMT → LST | 0.71 *** | 4.96 | Accepted |
| H8 | LMT → LEF | 0.60 *** | 5.89 | Accepted |
| H9 | LST → LEF | 0.32 *** | 3.04 | Accepted |
| H10 | LEF → LBE | 0.81 *** | 23.6 | Accepted |

Share and Cite

Hongsuchon, T.; Emary, I.M.M.E.; Hariguna, T.; Qhal, E.M.A. Assessing the Impact of Online-Learning Effectiveness and Benefits in Knowledge Management, the Antecedent of Online-Learning Strategies and Motivations: An Empirical Study. Sustainability 2022 , 14 , 2570. https://doi.org/10.3390/su14052570



The effects of online education on academic success: A meta-analysis study

  • Published: 06 September 2021
  • Volume 27, pages 429–450 (2022)



  • Hakan Ulum (ORCID: orcid.org/0000-0002-1398-6935)


The purpose of this study is to analyze the effect of online education, which has been used extensively since the beginning of the pandemic, on student achievement. In line with this purpose, a meta-analysis of the related studies focusing on the effect of online education on students' academic achievement in several countries between the years 2010 and 2021 was carried out. Furthermore, this study will provide a source to assist future studies with comparing the effect of online education on academic achievement before and after the pandemic. This meta-analysis consists of 27 studies in total, conducted in the USA, Taiwan, Turkey, China, the Philippines, Ireland, and Georgia. The studies included in the meta-analysis are experimental studies, and the total sample size is 1772. In the study, the funnel plot, Duval and Tweedie's trim and fill analysis, Orwin's fail-safe N analysis, and Egger's regression test were utilized to assess publication bias, which was found to be quite low. In addition, Hedges' g was employed to measure the effect size for the difference between means, in accordance with the random effects model. The results of the study show that the effect size of online education on academic achievement is at a medium level. The heterogeneity test results of the meta-analysis show that the effect size does not differ in terms of class level, country, online education approach, or lecture moderators.


1 Introduction

Information and communication technologies have become a powerful force in transforming educational settings around the world. The pandemic has been an important factor in moving traditional physical classroom settings toward information and communication technologies and has accelerated this transformation. The literature supports that learning environments built on information and communication technologies highly satisfy students; therefore, we need to sustain interest in technology-based learning environments. Clearly, technology has had a huge impact on young people's online lives. This digital revolution can synergize the educational ambitions and interests of digitally addicted students. In essence, COVID-19 has provided us with an opportunity to embrace online learning, as education systems have to keep up with the rapid emergence of new technologies.

Information and communication technologies, which affect all spheres of life, are also actively used in the field of education. With recent developments, using technology in education has become inevitable due to personal and social reasons (Usta, 2011a). Online education may be given as an example of using information and communication technologies as a consequence of these technological developments. It is also clear that online learning is a popular way of obtaining instruction (Demiralay et al., 2016; Pillay et al., 2007), which is defined by Horton (2000) as a form of education delivered through a web browser or an online application without requiring extra software or additional learning resources. Furthermore, online learning is described as a way of utilizing the internet to obtain the related learning resources during the learning process, to interact with the content, the teacher, and other learners, and to get support throughout the learning process (Ally, 2004). Online learning has such benefits as learning independently at any time and place (Vrasidas & McIsaac, 2000), convenience (Poole, 2000), flexibility (Chizmar & Walbert, 1999), self-regulation skills (Usta, 2011b), learning with collaboration, and the opportunity to plan one's own learning process.

Even though online education practices were not as comprehensive as they are now, the internet and computers have long been used in education as alternative learning tools in parallel with advances in technology. The first distance education attempt in the world was initiated by the 'Steno Courses' announcement published in a Boston newspaper in 1728. Furthermore, in the nineteenth century, a Swedish university started 'Correspondence Composition Courses' for women, and the University Correspondence College was afterwards founded for correspondence courses in 1843 (Arat & Bakan, 2011). More recently, distance education has been carried out through computers with the support of internet technologies, and it has since evolved into mobile education, driven by faster internet connections and the development of mobile devices.

With the emergence of the Covid-19 pandemic, face-to-face education came almost to a halt, and online education gained significant importance. Microsoft's management team reported 750 users involved in its online education activities on 10 March, just before the pandemic; by 24 March, the number of users had increased dramatically to 138,698 (OECD, 2020). This supports the view that online education should be used widely, rather than merely as an alternative to traditional education when students do not have the opportunity to receive face-to-face instruction (Geostat, 2019). The Covid-19 pandemic emerged suddenly as a period of limited opportunities, and face-to-face education stopped for a long time. The global spread of Covid-19 affected more than 850 million students all around the world and caused the suspension of face-to-face education. Different countries proposed several solutions to maintain the education process during the pandemic. Schools had to change their curricula, and many countries supported online education practices soon after the pandemic began. In other words, traditional education gave way to online education practices. At least 96 countries have been motivated to provide access to online libraries, TV broadcasts, instructions, resources, video lectures, and online channels (UNESCO, 2020). In such a painful period, educational institutions moved to online education practices with the help of large companies such as Microsoft, Google, Zoom, Skype, FaceTime, and Slack. Thus, online education has been discussed on the education agenda more intensively than ever before.

Although online education approaches were not used as comprehensively in the past as they are today, they have long served as an alternative learning approach in education, in parallel with the development of technology, the internet, and computers. Online education approaches are often employed with the aim of promoting students' academic achievement. In this regard, academics in various countries have conducted many studies on the evaluation of online education approaches and published the related results. However, the accumulation of scientific data on online education approaches creates difficulties in keeping, organizing, and synthesizing the findings. Studies in this research area are being conducted at an increasing rate, making it difficult for scientists to be aware of all the research outside their expertise. Another problem in this area is that online education studies are repetitive: studies often utilize slightly different methods, measures, and/or samples to avoid duplication, and this makes it difficult to distinguish between significant differences in the results. In other words, if there are significant differences between the results of studies, it may be difficult to state which variables explain those differences. One obvious solution to these problems is to systematically review the results of various studies and uncover their sources. One method of performing such systematic syntheses is meta-analysis, a methodological and statistical approach to drawing conclusions from the literature. At this point, how effective online education applications are in increasing academic success is an important question. Has online education, which is likely to be encountered frequently in the continuing pandemic period, been successful over the last ten years? If successful, how large was the impact? Did different variables affect this impact? What should we consider in the coming online education practices? Academics across the globe have carried out studies on the evaluation of online education platforms and published the related results (Chiao et al., 2018), and it is quite important to evaluate the results of the studies that have been published up to now and that will be published in the future. These questions have all motivated us to carry out this study. We have conducted a comprehensive meta-analysis that tries to provide a discussion platform on how to develop efficient online programs for educators and policy makers by reviewing the related studies on online education, presenting the effect size, and revealing the effect of diverse variables on the overall impact.

There have been many critical discussions and comprehensive studies on the differences between online and face-to-face learning; however, the focus of this paper is different in that it clarifies the magnitude of the effect of online education and teaching processes and examines which factors should be controlled to help increase the effect size. Indeed, the purpose here is to support conscious decisions in the implementation of the online education process.

The general impact of online education on academic achievement will be examined in this study. Therefore, this will provide an opportunity to get a general overview of online education, which has been practiced and discussed intensively during the pandemic. Moreover, the general impact of online education on academic achievement will be analyzed with respect to different variables. In other words, the current study allows the results from the related literature to be evaluated as a whole and analyzed across different cultures, lectures, and class levels. Considering all these points, this study seeks to answer the following research questions:

What is the effect size of online education on academic achievement?

How do the effect sizes of online education on academic achievement change according to the moderator variable of the country?

How do the effect sizes of online education on academic achievement change according to the moderator variable of the class level?

How do the effect sizes of online education on academic achievement change according to the moderator variable of the lecture?

How do the effect sizes of online education on academic achievement change according to the moderator variable of the online education approaches?

This study aims at determining the effect size of online education, which has been used heavily since the beginning of the pandemic, on students' academic achievement in different courses by using the meta-analysis method. Meta-analysis is a synthesis method that enables the results of several studies to be combined accurately and efficiently and an overall result to be obtained (Tsagris & Fragkos, 2018).

2.1 Selecting and coding the data (studies)

The literature required for the meta-analysis was reviewed in July 2020, and a follow-up review was conducted in September 2020. The purpose of the follow-up review was to include any studies published while this study was being conducted that met the inclusion criteria; however, no such study was found.

In order to access the studies for the meta-analysis, the Web of Science, ERIC, and SCOPUS databases were searched using the keywords 'online learning' and 'online education'. Not every database has a search engine that provides access to studies simply by entering keywords, and this obstacle had to be overcome. Therefore, a specially designed platform was used by the researcher: through the open access system of the Cukurova University Library, detailed searches were carried out using EBSCO Information Services (EBSCO), which allows the whole collection of research to be searched through a single search box. Since the fundamental variables of this study are online education and online learning, the literature was systematically reviewed in the related databases (Web of Science, ERIC, and SCOPUS) using these keywords. Within this scope, 225 articles were accessed, and the studies were entered into the coding key list formed by the researcher. The names of the researchers, the year, the database (Web of Science, ERIC, and SCOPUS), the sample group and size, the lectures in which academic achievement was tested, the country where the study was conducted, and the class levels were all included in this coding key.

The following criteria were identified for including the 225 research studies that were coded on the theoretical basis of the meta-analysis: (1) the studies should be published in refereed journals between the years 2010 and 2021, (2) the studies should be experimental studies that try to determine the effect of online education and online learning on academic achievement, (3) the values of the stated variables or the statistics required to calculate them should be reported in the results, and (4) the sample group of the study should be at the primary education level. These criteria were also used as exclusion criteria, in the sense that studies not meeting them were not included in the present study.

After the inclusion criteria were determined, a systematic review process was conducted, following the year criterion of the study by means of EBSCO. Within this scope, 290,365 studies that analyze the effect of online education and online learning on academic achievement were accordingly accessed. The database (Web of Science, ERIC, and SCOPUS) was also used as a filter by analyzing the inclusion criteria. Hence, the number of the studies that were analyzed was 58,616. Afterwards, the keyword ‘primary education’ was used as the filter and the number of studies included in the study decreased to 3152. Lastly, the literature was reviewed by using the keyword ‘academic achievement’ and 225 studies were accessed. All the information of 225 articles was included in the coding key.

It is necessary for the coders to review the related studies accurately and to check the validity, reliability, and accuracy of the studies (Stewart & Kamins, 2001). Within this scope, the studies identified on the basis of the variables used in this study were first reviewed by three researchers from the primary education field; the accessed studies were then combined and processed in the coding key by the researcher. All the studies processed in the coding key were analyzed against the inclusion criteria by all the researchers in joint meetings, and it was decided that 27 studies met the inclusion criteria (Atici & Polat, 2010; Carreon, 2018; Ceylan & Elitok Kesici, 2017; Chae & Shin, 2016; Chiang et al. 2014; Ercan, 2014; Ercan et al., 2016; Gwo-Jen et al., 2018; Hayes & Stewart, 2016; Hwang et al., 2012; Kert et al., 2017; Lai & Chen, 2010; Lai et al., 2015; Meyers et al., 2015; Ravenel et al., 2014; Sung et al., 2016; Wang & Chen, 2013; Yu, 2019; Yu & Chen, 2014; Yu & Pan, 2014; Yu et al., 2010; Zhong et al., 2017). The data from the studies meeting the inclusion criteria were independently processed in the second coding key by three researchers, and consensus meetings were arranged for further discussion; after these meetings, the researchers agreed that the data had been coded accurately and precisely. Having identified the effect sizes and heterogeneity of the study, moderator variables that could explain the differences between the effect sizes were determined. The data related to these moderator variables were added to the coding key by the three researchers, and a new consensus meeting was arranged, after which the researchers agreed that the moderator variables had been coded accurately and precisely.

2.2 Study group

27 studies are included in the meta-analysis. The total sample size of the studies that are included in the analysis is 1772. The characteristics of the studies included are given in Table 1 .

2.3 Publication bias

Publication bias is the low capability of published studies on a research subject to represent all completed studies on the same subject (Card, 2011; Littell et al., 2008). Similarly, publication bias is the existence of a relationship between the probability of a study on a subject being published and the effect size and significance it produces. Within this scope, publication bias may occur when researchers choose not to publish a study after failing to obtain the expected results, or when a study is not approved by scientific journals and is consequently not included in study syntheses (Makowski et al., 2019). A high possibility of publication bias in a meta-analysis negatively affects the accuracy of the combined effect size (Pecoraro, 2018), causing the average effect size to be reported differently than it should be (Borenstein et al., 2009). For this reason, the possibility of publication bias in the included studies was tested before determining the effect sizes of the relationships between the stated variables. The possibility of publication bias in this meta-analysis was analyzed by using the funnel plot, Orwin's fail-safe N analysis, Duval and Tweedie's trim and fill analysis, and Egger's regression test.
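
To make the asymmetry idea behind Egger's regression test concrete, the sketch below (not the authors' code; the effect sizes and standard errors are invented placeholders) regresses each study's standardized effect on its precision and asks whether the intercept differs from zero.

```python
# Illustrative sketch of Egger's regression test for funnel-plot asymmetry:
# regress the standardized effect (g / SE) on precision (1 / SE). An intercept
# that differs significantly from zero points to small-study effects and
# possible publication bias. All numbers here are hypothetical.
import numpy as np
import statsmodels.api as sm

g  = np.array([0.35, 0.52, 0.28, 0.61, 0.44])   # hypothetical Hedges' g values
se = np.array([0.12, 0.20, 0.15, 0.25, 0.10])   # hypothetical standard errors

precision = 1.0 / se
standardized = g / se

X = sm.add_constant(precision)                   # intercept column + precision
fit = sm.OLS(standardized, X).fit()

intercept, intercept_p = fit.params[0], fit.pvalues[0]
print(f"Egger intercept = {intercept:.3f}, p = {intercept_p:.3f}")
# In the paper, a non-significant intercept is read as no evidence of
# publication bias.
```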

2.4 Selecting the model

After determining the probability of publication bias of this meta-analysis study, the statistical model used to calculate the effect sizes was selected. The main approaches used in the effect size calculations according to the differentiation level of inter-study variance are fixed and random effects models (Pigott, 2012 ). Fixed effects model refers to the homogeneity of the characteristics of combined studies apart from the sample sizes, while random effects model refers to the parameter diversity between the studies (Cumming, 2012 ). While calculating the average effect size in the random effects model (Deeks et al., 2008 ) that is based on the assumption that effect predictions of different studies are only the result of a similar distribution, it is necessary to consider several situations such as the effect size apart from the sample error of combined studies, characteristics of the participants, duration, scope, and pattern of the study (Littell et al., 2008 ). While deciding the model in the meta-analysis study, the assumptions on the sample characteristics of the studies included in the analysis and the inferences that the researcher aims to make should be taken into consideration. The fact that the sample characteristics of the studies conducted in the field of social sciences are affected by various parameters shows that using random effects model is more appropriate in this sense. Besides, it is stated that the inferences made with the random effects model are beyond the studies included in the meta-analysis (Field, 2003 ; Field & Gillett, 2010 ). Therefore, using random effects model also contributes to the generalization of research data. The specified criteria for the statistical model selection show that according to the nature of the meta-analysis study, the model should be selected just before the analysis (Borenstein et al., 2007 ; Littell et al., 2008 ). Within this framework, it was decided to make use of the random effects model, considering that the students who are the samples of the studies included in the meta-analysis are from different countries and cultures, the sample characteristics of the studies differ, and the patterns and scopes of the studies vary as well.
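
As a rough illustration of how the random effects model differs from the fixed effects model in practice, the following sketch pools a handful of hypothetical Hedges' g values with the DerSimonian-Laird estimator, which adds a between-study variance (tau²) to each study's own sampling variance. This is a toy example under stated assumptions, not the computation reported in the paper.

```python
# Minimal random-effects pooling sketch (DerSimonian-Laird tau^2 estimator).
# The effect sizes and standard errors are placeholders, not the 27 studies.
import numpy as np

g  = np.array([0.35, 0.52, 0.28, 0.61, 0.44])    # per-study Hedges' g (hypothetical)
se = np.array([0.12, 0.20, 0.15, 0.25, 0.10])    # per-study standard errors (hypothetical)

w_fixed = 1.0 / se**2                             # fixed-effect (inverse-variance) weights
mean_fixed = np.sum(w_fixed * g) / np.sum(w_fixed)

# Cochran's Q and the DerSimonian-Laird estimate of the between-study variance
Q = np.sum(w_fixed * (g - mean_fixed) ** 2)
df = len(g) - 1
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - df) / c)

# Random-effects weights add tau^2 to each study's sampling variance
w_random = 1.0 / (se**2 + tau2)
mean_random = np.sum(w_random * g) / np.sum(w_random)
se_random = np.sqrt(1.0 / np.sum(w_random))

print(f"pooled g = {mean_random:.3f} "
      f"(95% CI {mean_random - 1.96*se_random:.3f} to {mean_random + 1.96*se_random:.3f}), "
      f"tau^2 = {tau2:.4f}")
```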

2.5 Heterogeneity

Meta-analysis facilitates analyzing the research subject with different parameters by showing the level of diversity between the included studies. Within this frame, whether or not there is a heterogeneous distribution between the included studies has been evaluated in the present study. The heterogeneity of the studies combined in this meta-analysis has been determined through the Q and I² tests. The Q test evaluates the probability that the differences between the observed results arise from random variation (Deeks et al., 2008). A Q value exceeding the chi-square (χ²) critical value calculated for the given degrees of freedom and significance level indicates heterogeneity of the combined effect sizes (Card, 2011). The I² test, which complements the Q test, shows the amount of heterogeneity in the effect sizes (Cleophas & Zwinderman, 2017). An I² value higher than 75% is interpreted as a high level of heterogeneity.
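
As a minimal check of these statistics, the sketch below plugs the Q value reported later in the paper (Q = 29.576 with 26 degrees of freedom) into the standard formulas; the chi-square critical value, p value, and I² it prints match the values reported in Section 3.

```python
# Heterogeneity statistics from the reported Q value and degrees of freedom.
from scipy.stats import chi2

Q, df = 29.576, 26

critical = chi2.ppf(0.95, df)                # ~38.885: Q below this -> heterogeneity not indicated
p_value = chi2.sf(Q, df)                     # ~0.285, matching the reported p value
i_squared = max(0.0, (Q - df) / Q) * 100     # ~12.1%: share of variability between studies

print(f"critical value = {critical:.3f}, p = {p_value:.3f}, I^2 = {i_squared:.1f}%")
```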

If heterogeneity is encountered among the studies included in the meta-analysis, its sources can be explored through the study characteristics. The study characteristics which may be related to the heterogeneity between the included studies can be examined through subgroup analysis or meta-regression analysis (Deeks et al., 2008). While determining the moderator variables, the sufficiency of the number of variables, the relationship between the moderators, and their ability to explain the differences between the results of the studies have all been considered in the present study. Within this scope, it was predicted that the heterogeneity in the effect of online education, which has been used heavily since the beginning of the pandemic, on students' academic achievement in different lectures could be explained by the country, class level, and lecture moderator variables. Some subgroups were evaluated and categorized together, considering that the number of effect sizes in the sub-dimensions of the specified variables (e.g. the countries where the studies were conducted) was not sufficient to perform moderator analysis.

2.6 Interpreting the effect sizes

Effect size is a value that shows how much the independent variable affects the dependent variable, positively or negatively, in each study included in the meta-analysis (Dinçer, 2014). While interpreting the effect sizes obtained from the meta-analysis, the classifications of Cohen et al. (2007) have been used. Whether the specified relationships differ according to the country, class level, and school subject variables of the studies has been determined through the Q test, the degrees of freedom, and the p significance value (Figs. 1 and 2).
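
For readers unfamiliar with the effect size used here, the short sketch below computes Hedges' g for a single hypothetical experimental-versus-control comparison: Cohen's d based on the pooled standard deviation, multiplied by the small-sample correction factor. The group means, standard deviations, and sample sizes are invented for illustration only.

```python
# Illustrative computation (not from the paper) of Hedges' g for one
# experimental vs. control comparison.
import math

def hedges_g(mean_e, sd_e, n_e, mean_c, sd_c, n_c):
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n_e - 1) * sd_e**2 + (n_c - 1) * sd_c**2) / (n_e + n_c - 2))
    d = (mean_e - mean_c) / sp               # Cohen's d
    j = 1 - 3 / (4 * (n_e + n_c) - 9)        # Hedges' small-sample correction
    return d * j

# Hypothetical achievement scores for an online-education group vs. a control group
print(round(hedges_g(78.4, 10.2, 35, 74.1, 11.0, 33), 3))
```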

3 Findings and results

The purpose of this study is to determine the effect size of online education on academic achievement. Before determining the effect sizes, the probability of publication bias in this meta-analysis was analyzed by using the funnel plot, Orwin's fail-safe N analysis, Duval and Tweedie's trim and fill analysis, and Egger's regression test.

When the funnel plots are examined, it is seen that the studies included in the analysis are distributed symmetrically on both sides of the combined effect size axis and are generally concentrated in the middle and lower sections. According to the plots, the probability of publication bias is low. However, since funnel scatter plots may lead to subjective interpretations, they have been supported by additional analyses (Littell et al., 2008). Therefore, in order to provide extra evidence on the probability of publication bias, it has been analyzed through Orwin's fail-safe N analysis, Duval and Tweedie's trim and fill analysis, and Egger's regression test (Table 2).
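
The sketch below (hypothetical data, not the 27 included studies) shows what such a funnel plot amounts to: each study's effect size is plotted against its standard error, with precise studies near the top, and a roughly symmetric spread around the pooled effect is read as low publication bias.

```python
# Illustrative funnel plot with placeholder data.
import numpy as np
import matplotlib.pyplot as plt

g  = np.array([0.35, 0.52, 0.28, 0.61, 0.44, 0.39, 0.47])   # hypothetical effects
se = np.array([0.12, 0.20, 0.15, 0.25, 0.10, 0.18, 0.22])   # hypothetical standard errors

pooled = np.sum(g / se**2) / np.sum(1 / se**2)               # inverse-variance mean

plt.scatter(g, se)
plt.axvline(pooled, linestyle="--")          # vertical line at the pooled effect
plt.gca().invert_yaxis()                     # small SE (precise studies) at the top
plt.xlabel("Hedges' g")
plt.ylabel("Standard error")
plt.title("Funnel plot (illustrative)")
plt.show()
```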

Table 2 presents the publication bias statistics computed before calculating the effect size of online education on academic achievement. According to the table, the Orwin's fail-safe N results show that it is not necessary to add new studies to the meta-analysis in order for Hedges' g to reach a value outside the range of ±0.01. The Duval and Tweedie trim and fill test shows that excluding the studies that negatively affect the symmetry of the funnel plots, or adding their exact symmetrical equivalents, does not significantly change the calculated effect size. The non-significance of Egger's regression test results indicates that there is no publication bias in the meta-analysis. Together, these results point to the high internal validity of the effect sizes and the adequacy of the included studies in representing research on the subject.

In this study, the aim was to determine the effect size of online education on academic achievement after testing for publication bias. In line with the first purpose of the study, the forest graph regarding the effect size of online education on academic achievement is shown in Fig. 3, and the statistics regarding the effect size are given in Table 3.

Figure 1. The flow chart of the scanning and selection process of the studies.

Figure 2. Funnel plot graphics representing the effect size of the effects of online education on academic success.

Figure 3. Forest graph related to the effect size of online education on academic success.

The square symbols in the forest graph in Fig. 3 represent the effect sizes, the horizontal lines show the 95% confidence intervals of the effect sizes, and the diamond symbol shows the overall effect size. When the forest graph is analyzed, it is seen that the lower and upper limits of the combined effect sizes are generally close to each other and the study weights are similar. This similarity in study weights indicates that the combined studies contribute similarly to the overall effect size.

Figure 3 clearly shows that the study by Liu et al. (2018) has the lowest effect size and the study by Ercan and Bilen (2014) has the highest. The forest graph also shows that all the combined studies and the overall effect are positive. Furthermore, it is evident from the forest graph in Fig. 3 and the effect size statistics in Table 3 that the meta-analysis of 27 studies on the effect of online education on academic achievement yields an effect at a moderate level (g = 0.409).

After the analysis of the effect size, whether the studies included in the analysis are distributed heterogeneously or not has also been examined. The heterogeneity of the combined studies was determined through the Q and I² tests. As a result of the heterogeneity test, the Q statistic was calculated as 29.576. With 26 degrees of freedom at the 95% significance level, the critical value in the chi-square table is 38.885; the Q statistic (29.576) calculated in this study is lower than this critical value. The I² value, which complements the Q statistic, is 12.100%. This value indicates that the true heterogeneity, i.e. the proportion of total variability attributable to variability between the studies, is about 12%. Moreover, the p value (0.285) is higher than 0.05. All these values [Q(26) = 29.576, p = 0.285; I² = 12.100%] indicate that there is a homogeneous distribution between the effect sizes and that the fixed effects model could be used to interpret them. However, some researchers argue that even if heterogeneity is low, results should still be evaluated under the random effects model (Borenstein et al., 2007); therefore, this study reports both models. The heterogeneity of the combined studies has also been examined in relation to the characteristics of the included studies. In this context, the final purpose of the study is to determine the effect of the country, class level, and school subject variables on the findings. Accordingly, the statistics comparing the stated relationships according to the countries where the studies were conducted are given in Table 4.

As seen in Table 4, the effect of online education on academic achievement does not differ significantly according to the countries where the studies were conducted. The Q test results concern whether the relationships between the variables differ across these countries. According to the table, the effect of online education on academic achievement was highest in the 'other countries' category and lowest in the USA. The statistics comparing the stated relationships according to class level are given in Table 5.

As seen in Table 5, the effect of online education on academic achievement does not differ according to class level. However, the effect is highest at the 4th grade level. The statistics comparing the stated relationships according to school subject are given in Table 6.

As seen in Table 6, the effect of online education on academic achievement does not differ according to the school subjects included in the studies. However, the effect of online education on academic achievement is highest for the ICT subject.

The effect size obtained in the study was formed from the findings of primary studies conducted in 7 different countries. In addition, these studies address different approaches to online education (online learning environments, social networks, blended learning, etc.). In this respect, questions may be raised about the validity and generalizability of the results. However, the moderator analyses, whether for the country variable or for the approaches covered by online education, did not reveal significant differences in effect sizes. If significant differences in effect sizes had occurred, comparisons made across countries under the umbrella of online education would have been open to doubt in terms of generalizability. Moreover, no study has been found in the literature conducted under the name of online education alone that is not based on a particular approach or technique. For instance, one of the commonly used terms is blended education, defined as an educational model in which online education is combined with the traditional education method (Colis & Moonen, 2001). Similarly, Rasmussen (2003) defines blended learning as "a distance education method that combines technology (high technology such as television and the internet, or low technology such as voice e-mail and conferences) with traditional education and training." Further, Kerres and Witt (2003) define blended learning as "combining face-to-face learning with technology-assisted learning." As is clearly observed, online education, which has a wider scope, includes many approaches.

As seen in Table 7, the effect of online education on academic achievement does not differ according to the online education approaches included in the studies. However, the effect of online education on academic achievement is highest for the web-based problem-solving approach.

4 Conclusions and discussion

Considering the developments during the pandemic, it is thought that the diversity of online education applications as an interdisciplinary, pragmatist field will increase, and that learning content and processes will be enriched with the integration of new technologies into online education. Another prediction is that more flexible and accessible learning opportunities will be created in online education processes, and that lifelong learning will thereby be strengthened. As a result, it is predicted that in the near future, online education, or digital learning under a newer name, will become the main ground of education instead of serving as an alternative to, or a support function within, face-to-face learning. The lessons learned from the early period of online learning, which was adopted rapidly due to the Covid-19 epidemic, will serve to develop this method all over the world, and in the near future online learning will become the main learning structure, increasing its functionality with the contribution of new technologies and systems. From this point of view, there is a need to strengthen online education.

In this study, the effect of online learning on academic achievement is at a moderate level. To increase this effect, the implementation of online learning requires support from teachers to prepare learning materials, to design learning appropriately, and to utilize various digital media such as websites, software technology and other tools that support the effectiveness of online learning (Rolisca & Achadiyah, 2014). According to research conducted by Rahayu et al. (2017), the use of various types of software increases the effectiveness and quality of online learning. Implementation of online learning can also affect students' ability to adapt to technological developments, in that it leads students to use various learning resources on the internet to access different types of information and accustoms them to inquiry learning and active learning (Hart et al., 2019; Prestiadi et al., 2019). There may be several reasons why the effect found in this study is not higher. The moderator variables examined here could serve as a guide for raising the practical effect; however, the effect size did not differ significantly for any of the moderator variables. Different moderator analyses could be evaluated in order to increase the impact of online education on academic success, and if confounding variables that significantly change the effect level are detected, more precise recommendations can be made. In addition to technical and financial problems, the level of impact would also increase if other difficulties were addressed, such as students' lack of interaction with the instructor, slow response times, and the absence of traditional classroom socialization.

In addition, social distancing related to the COVID-19 pandemic has posed extreme difficulties for all stakeholders in getting online, as they have had to work under time and resource constraints. Adopting the online learning environment is not just a technical issue; it is a pedagogical and instructional challenge as well. Therefore, extensive preparation of teaching materials, curriculum, and assessment is vital in online education. Technology is the delivery tool, and it requires close cross-collaboration between teaching, content and technology teams (CoSN, 2020).

Online education applications have been used for many years, but they have come to the fore during the pandemic. This necessity has prompted discussion of using online education in place of traditional education methods in the future. However, this research has shown that online education applications are only moderately effective. Replacing face-to-face education with online education will only be possible if the level of success increases, which may be achievable with the experience and knowledge gained during the pandemic. Therefore, the meta-analyses of experimental studies conducted in the coming years will guide us, and experimental studies using online education applications should be analyzed carefully. It would be useful to identify variables that can change the level of impact through different moderators; moderator analyses are valuable in meta-analysis studies (consider, for example, the role of moderators in Karl Pearson's typhoid vaccine studies). In this context, each analysis sheds light on future studies. In meta-analyses on online education, it would be beneficial to go beyond the moderators examined in this study; in this way, the contribution of similar studies to the field will increase.

The purpose of this study is to determine the effect of online education on academic achievement. In line with this purpose, studies analyzing the effect of online education approaches on academic achievement were included in the meta-analysis. The total sample size of the included studies is 1772. The included studies were conducted in the US, Taiwan, Turkey, China, the Philippines, Ireland, and Georgia; studies carried out in other European countries could not be reached. One possible reason is that quantitative research methods from a positivist perspective may be used more widely in countries with an American academic tradition. As a result of the study, the effect size of online education on academic achievement (g = 0.409) was found to be moderate. In the studies included in the present research, online education approaches were more effective than traditional ones.

However, contrary to the present study, comparisons between online and traditional education in some studies show that face-to-face learning is still considered effective relative to online learning (Ahmad et al., 2016; Hamdani & Priatna, 2020; Wei & Chou, 2020). Online education has advantages and disadvantages. The advantages of online learning over face-to-face classroom learning include the flexibility of learning time: learning is not tied to a single schedule and can be shaped according to circumstances (Lai et al., 2019). Another advantage is the ease of submitting assignments, as this can be done without having to talk to the teacher. Despite this, online education has several weaknesses, such as students having difficulty understanding the material, teachers being unable to monitor students, and students struggling to interact with teachers when the internet connection fails (Swan, 2007). According to Astuti et al. (2019), the face-to-face method is still considered better by students than e-learning because it is easier to understand the material and to interact with teachers.

The results of the study illustrate that the effect size of online education on academic achievement (g = 0.409) is of medium level, and the moderator analyses show that this effect does not differ in terms of the country, lecture, class level, and online education approach variables. A review of the literature shows that several meta-analyses on online education have been published (Bernard et al., 2004; Machtmes & Asher, 2000; Zhao et al., 2005). Typically, these meta-analyses also include studies of older-generation technologies such as audio, video, or satellite transmission. One of the most comprehensive studies on online education was conducted by Bernard et al. (2004). In that study, 699 independent effect sizes from 232 studies published between 1985 and 2001 were analyzed, and face-to-face education was compared with online education with respect to achievement criteria and the attitudes of various learners, from young children to adults. In this meta-analysis, an overall effect size close to zero was found for students' achievement (g+ = 0.01).
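To make the pooling behind a figure such as g = 0.409 concrete, the following minimal Python sketch computes Hedges' g for a handful of invented primary studies and combines them with a DerSimonian-Laird random-effects model. It illustrates the standard formulas only; it is not the author's actual computation, and all study values in it are hypothetical.

```python
import numpy as np

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference (online vs. face-to-face) with small-sample correction."""
    sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))  # pooled SD
    d = (m1 - m2) / sp                              # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)                 # Hedges' small-sample correction factor
    g = j * d
    var_g = j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))  # approximate variance
    return g, var_g

# Hypothetical primary studies: (mean, SD, n) for the online and face-to-face groups
studies = [
    ((78.0, 10.0, 30), (72.0, 11.0, 30)),
    ((65.0, 12.0, 45), (60.0, 12.5, 44)),
    ((83.0,  9.0, 25), (80.0,  9.5, 26)),
]

g_vals, v_vals = zip(*(hedges_g(*online, *f2f) for online, f2f in studies))
g, v = np.array(g_vals), np.array(v_vals)

# DerSimonian-Laird random-effects pooling
w = 1 / v
q = np.sum(w * (g - np.sum(w * g) / np.sum(w)) ** 2)   # Cochran's Q under fixed-effect weights
df = len(g) - 1
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - df) / c)                           # estimated between-study variance
w_star = 1 / (v + tau2)
g_pooled = np.sum(w_star * g) / np.sum(w_star)
se_pooled = np.sqrt(1 / np.sum(w_star))
print(f"pooled g = {g_pooled:.3f} "
      f"(95% CI {g_pooled - 1.96 * se_pooled:.3f} to {g_pooled + 1.96 * se_pooled:.3f})")
```

Under the random-effects assumption, study weights shrink toward equality as the between-study variance grows, which is one reason moderator analyses are usually run on top of such a model rather than on a fixed-effect pooling.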

In another meta-analysis, Zhao et al. (2005) examined 98 effect sizes from 51 studies on online education conducted between 1996 and 2002. Compared with the study of Bernard et al. (2004), this meta-analysis focuses more on the activities carried out in online education lectures. As a result, an overall effect size close to zero was found for online education utilizing more than one generation of technology for students at different levels. However, a salient feature of the meta-analysis of Zhao et al. is that it averages the different types of outcomes used within a study to calculate an overall effect size. This practice is problematic because the factors that develop one type of learner outcome (e.g. learner rehabilitation), particularly course characteristics and practices, may be quite different from those that develop another type of outcome (e.g. learner achievement), and may even harm the latter. By mixing studies with different types of outcomes, this practice may obscure the relationship between practices and learning.

Some meta-analytic studies have focused on the effectiveness of new-generation, internet-based distance learning courses for specific student populations. For instance, Sitzmann et al. (2006) reviewed 96 studies published from 1996 to 2005 comparing web-based instruction in job-related knowledge or skills with face-to-face instruction. The researchers found that web-based instruction was, in general, slightly more effective than face-to-face instruction, but insufficient in terms of applicability ("knowing how to apply"). In addition, Sitzmann et al. (2006) revealed that internet-based instruction had a positive effect on theoretical knowledge in quasi-experimental studies, whereas face-to-face instruction was favored in experimental studies with random assignment. This moderator analysis emphasizes the need to pay attention to the designs of the studies included in a meta-analysis. The designs of the primary studies were not considered in the present meta-analysis; this can be offered as a suggestion for future studies.

Another meta-analysis study was conducted by Cavanaugh et al. ( 2004 ), in which they focused on online education. In this study on internet-based distance education programs for students under 12 years of age, the researchers combined 116 results from 14 studies published between 1999 and 2004 to calculate an overall effect that was not statistically different from zero. The moderator analysis carried out in this study showed that there was no significant factor affecting the students' success. This meta-analysis used multiple results of the same study, ignoring the fact that different results of the same student would not be independent from each other.

In conclusion, some meta-analytic studies have analyzed the consequences of online education for a wide range of students (Bernard et al., 2004; Zhao et al., 2005), and the effect sizes in these studies were generally low. Furthermore, none of the large-scale meta-analyses considered moderators, database quality standards or class levels in selecting studies, and only some of them addressed the country and lecture moderators. Advances in internet-based learning tools, the pandemic, and the growing popularity of online learning in different contexts have made a careful meta-analysis of students' learning outcomes through online learning necessary. Previous meta-analyses were typically based on studies involving a narrow range of confounding variables. In the present study, common but significant moderators, such as class level and lecture during the pandemic, were examined. For instance, problems were experienced during the pandemic especially with regard to the suitability of online education platforms for different class levels. There is, therefore, a need to study and make suggestions on whether online education can meet the needs of teachers and students.

Furthermore, the main forms of online education in the past were watching the open lectures of well-known universities and the educational videos of institutions. During the pandemic, by contrast, online education has mainly consisted of classroom-based teaching implemented by teachers in their own schools, as an extension of the original school education. This meta-analysis will therefore stand as a source for comparing the effect size of the online education forms of the past decade with what is done today and what will be done in the future.

Lastly, the heterogeneity test results of the meta-analysis show that the effect size does not differ in terms of the class level, country, online education approach, and lecture moderators.
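For readers unfamiliar with how such heterogeneity and moderator results are obtained, the sketch below computes Cochran's Q, I², and a between-groups Q test for a made-up set of effect sizes split by a hypothetical moderator; the values and grouping are illustrative only and do not reproduce the data of this meta-analysis.

```python
import numpy as np
from scipy import stats

def q_and_i2(g, v):
    """Cochran's Q and I^2 for effect sizes g with variances v (fixed-effect weights)."""
    w = 1 / v
    g_bar = np.sum(w * g) / np.sum(w)
    q = np.sum(w * (g - g_bar) ** 2)
    df = len(g) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return q, df, i2

# Hypothetical effect sizes (Hedges' g), variances and a moderator (class level)
g = np.array([0.55, 0.30, 0.48, 0.20, 0.62, 0.35])
v = np.array([0.04, 0.05, 0.06, 0.04, 0.07, 0.05])
level = np.array(["primary", "primary", "primary", "secondary", "secondary", "secondary"])

q_total, df_total, i2 = q_and_i2(g, v)
print(f"Q_total = {q_total:.2f} (df = {df_total}), I^2 = {i2:.1f}%")

# Between-groups Q (analog to ANOVA): total Q minus the Q within each subgroup
q_within = sum(q_and_i2(g[level == grp], v[level == grp])[0] for grp in np.unique(level))
q_between = q_total - q_within
df_between = len(np.unique(level)) - 1
p_value = stats.chi2.sf(q_between, df_between)
print(f"Q_between = {q_between:.2f} (df = {df_between}), p = {p_value:.3f}")
```

A non-significant Q_between, as reported here for class level, country, subject and approach, means the subgroup mean effects are statistically indistinguishable under this test.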

*Studies included in meta-analysis

Ahmad, S., Sumardi, K., & Purnawan, P. (2016). Komparasi Peningkatan Hasil Belajar Antara Pembelajaran Menggunakan Sistem Pembelajaran Online Terpadu Dengan Pembelajaran Klasikal Pada Mata Kuliah Pneumatik Dan Hidrolik. Journal of Mechanical Engineering Education, 2 (2), 286–292.


Ally, M. (2004). Foundations of educational theory for online learning. Theory and Practice of Online Learning, 2 , 15–44. Retrieved on the 11th of September, 2020 from https://eddl.tru.ca/wp-content/uploads/2018/12/01_Anderson_2008-Theory_and_Practice_of_Online_Learning.pdf

Arat, T., & Bakan, Ö. (2011). Uzaktan eğitim ve uygulamaları. Selçuk Üniversitesi Sosyal Bilimler Meslek Yüksek Okulu Dergisi , 14 (1–2), 363–374. https://doi.org/10.29249/selcuksbmyd.540741

Astuti, C. C., Sari, H. M. K., & Azizah, N. L. (2019). Perbandingan Efektifitas Proses Pembelajaran Menggunakan Metode E-Learning dan Konvensional. Proceedings of the ICECRS, 2 (1), 35–40.

*Atici, B., & Polat, O. C. (2010). Influence of the online learning environments and tools on the student achievement and opinions. Educational Research and Reviews, 5 (8), 455–464. Retrieved on the 11th of October, 2020 from https://academicjournals.org/journal/ERR/article-full-text-pdf/4C8DD044180.pdf

Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., et al. (2004). How does distance education compare with classroom instruction? A meta- analysis of the empirical literature. Review of Educational Research, 3 (74), 379–439. https://doi.org/10.3102/00346543074003379

Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2009). Introduction to meta-analysis . Wiley.


Borenstein, M., Hedges, L., & Rothstein, H. (2007). Meta-analysis: Fixed effect vs. random effects . UK: Wiley.

Card, N. A. (2011). Applied meta-analysis for social science research: Methodology in the social sciences . Guilford.


*Carreon, J. R. (2018 ). Facebook as integrated blended learning tool in technology and livelihood education exploratory. Retrieved on the 1st of October, 2020 from https://files.eric.ed.gov/fulltext/EJ1197714.pdf

Cavanaugh, C., Gillan, K. J., Kromrey, J., Hess, M., & Blomeyer, R. (2004). The effects of distance education on K-12 student outcomes: A meta-analysis. Learning Point Associates/North Central Regional Educational Laboratory (NCREL) . Retrieved on the 11th of September, 2020 from https://files.eric.ed.gov/fulltext/ED489533.pdf

*Ceylan, V. K., & Elitok Kesici, A. (2017). Effect of blended learning to academic achievement. Journal of Human Sciences, 14 (1), 308. https://doi.org/10.14687/jhs.v14i1.4141

*Chae, S. E., & Shin, J. H. (2016). Tutoring styles that encourage learner satisfaction, academic engagement, and achievement in an online environment. Interactive Learning Environments, 24(6), 1371–1385. https://doi.org/10.1080/10494820.2015.1009472

*Chiang, T. H. C., Yang, S. J. H., & Hwang, G. J. (2014). An augmented reality-based mobile learning system to improve students’ learning achievements and motivations in natural science inquiry activities. Educational Technology and Society, 17 (4), 352–365. Retrieved on the 11th of September, 2020 from https://www.researchgate.net/profile/Gwo_Jen_Hwang/publication/287529242_An_Augmented_Reality-based_Mobile_Learning_System_to_Improve_Students'_Learning_Achievements_and_Motivations_in_Natural_Science_Inquiry_Activities/links/57198c4808ae30c3f9f2c4ac.pdf

Chiao, H. M., Chen, Y. L., & Huang, W. H. (2018). Examining the usability of an online virtual tour-guiding platform for cultural tourism education. Journal of Hospitality, Leisure, Sport & Tourism Education, 23 (29–38), 1. https://doi.org/10.1016/j.jhlste.2018.05.002

Chizmar, J. F., & Walbert, M. S. (1999). Web-based learning environments guided by principles of good teaching practice. Journal of Economic Education, 30 (3), 248–264. https://doi.org/10.2307/1183061

Cleophas, T. J., & Zwinderman, A. H. (2017). Modern meta-analysis: Review and update of methodologies . Switzerland: Springer. https://doi.org/10.1007/978-3-319-55895-0

Cohen, L., Manion, L., & Morrison, K. (2007). Observation.  Research Methods in Education, 6 , 396–412. Retrieved on the 11th of September, 2020 from https://www.researchgate.net/profile/Nabil_Ashraf2/post/How_to_get_surface_potential_Vs_Voltage_curve_from_CV_and_GV_measurements_of_MOS_capacitor/attachment/5ac6033cb53d2f63c3c405b4/AS%3A612011817844736%401522926396219/download/Very+important_C-V+characterization+Lehigh+University+thesis.pdf

Colis, B., & Moonen, J. (2001). Flexible Learning in a Digital World: Experiences and Expectations. Open & Distance Learning Series . Stylus Publishing.

CoSN. (2020). COVID-19 Response: Preparing to Take School Online. Retrieved on the 3rd of September, 2021 from https://www.cosn.org/sites/default/files/COVID-19%20Member%20Exclusive_0.pdf

Cumming, G. (2012). Understanding new statistics: Effect sizes, confidence intervals, and meta-analysis. New York, USA: Routledge. https://doi.org/10.4324/9780203807002

Deeks, J. J., Higgins, J. P. T., & Altman, D. G. (2008). Analysing data and undertaking meta-analyses . In J. P. T. Higgins & S. Green (Eds.), Cochrane handbook for systematic reviews of interventions (pp. 243–296). Sussex: John Wiley & Sons. https://doi.org/10.1002/9780470712184.ch9

Demiralay, R., Bayır, E. A., & Gelibolu, M. F. (2016). Öğrencilerin bireysel yenilikçilik özellikleri ile çevrimiçi öğrenmeye hazır bulunuşlukları ilişkisinin incelenmesi. Eğitim ve Öğretim Araştırmaları Dergisi, 5 (1), 161–168. https://doi.org/10.23891/efdyyu.2017.10

Dinçer, S. (2014). Eğitim bilimlerinde uygulamalı meta-analiz. Pegem Atıf İndeksi, 2014(1), 1–133. https://doi.org/10.14527/pegem.001

*Durak, G., Cankaya, S., Yunkul, E., & Ozturk, G. (2017). The effects of a social learning network on students’ performances and attitudes. European Journal of Education Studies, 3 (3), 312–333. 10.5281/zenodo.292951

*Ercan, O. (2014). Effect of web assisted education supported by six thinking hats on students’ academic achievement in science and technology classes . European Journal of Educational Research, 3 (1), 9–23. https://doi.org/10.12973/eu-jer.3.1.9

Ercan, O., & Bilen, K. (2014). Effect of web assisted education supported by six thinking hats on students’ academic achievement in science and technology classes. European Journal of Educational Research, 3 (1), 9–23.

*Ercan, O., Bilen, K., & Ural, E. (2016). “Earth, sun and moon”: Computer assisted instruction in secondary school science - Achievement and attitudes. Issues in Educational Research, 26 (2), 206–224. https://doi.org/10.12973/eu-jer.3.1.9

Field, A. P. (2003). The problems in using fixed-effects models of meta-analysis on real-world data. Understanding Statistics, 2 (2), 105–124. https://doi.org/10.1207/s15328031us0202_02

Field, A. P., & Gillett, R. (2010). How to do a meta-analysis. British Journal of Mathematical and Statistical Psychology, 63 (3), 665–694. https://doi.org/10.1348/00071010x502733

Geostat. (2019). ‘Share of households with internet access’, National statistics office of Georgia . Retrieved on the 2nd September 2020 from https://www.geostat.ge/en/modules/categories/106/information-and-communication-technologies-usage-in-households

*Gwo-Jen, H., Nien-Ting, T., & Xiao-Ming, W. (2018). Creating interactive e-books through learning by design: The impacts of guided peer-feedback on students’ learning achievements and project outcomes in science courses. Journal of Educational Technology & Society., 21 (1), 25–36. Retrieved on the 2nd of October, 2020 https://ae-uploads.uoregon.edu/ISTE/ISTE2019/PROGRAM_SESSION_MODEL/HANDOUTS/112172923/CreatingInteractiveeBooksthroughLearningbyDesignArticle2018.pdf

Hamdani, A. R., & Priatna, A. (2020). Efektifitas implementasi pembelajaran daring (full online) dimasa pandemi Covid-19 pada jenjang Sekolah Dasar di Kabupaten Subang. Didaktik: Jurnal Ilmiah PGSD STKIP Subang, 6 (1), 1–9.

Hart, C. M., Berger, D., Jacob, B., Loeb, S., & Hill, M. (2019). Online learning, offline outcomes: Online course taking and high school student performance. Aera Open, 5(1).

*Hayes, J., & Stewart, I. (2016). Comparing the effects of derived relational training and computer coding on intellectual potential in school-age children. The British Journal of Educational Psychology, 86 (3), 397–411. https://doi.org/10.1111/bjep.12114

Horton, W. K. (2000). Designing web-based training: How to teach anyone anything anywhere anytime (Vol. 1). Wiley Publishing.

*Hwang, G. J., Wu, P. H., & Chen, C. C. (2012). An online game approach for improving students’ learning performance in web-based problem-solving activities. Computers and Education, 59 (4), 1246–1256. https://doi.org/10.1016/j.compedu.2012.05.009

*Kert, S. B., Köşkeroğlu Büyükimdat, M., Uzun, A., & Çayiroğlu, B. (2017). Comparing active game-playing scores and academic performances of elementary school students. Education 3–13, 45 (5), 532–542. https://doi.org/10.1080/03004279.2016.1140800

*Lai, A. F., & Chen, D. J. (2010). Web-based two-tier diagnostic test and remedial learning experiment. International Journal of Distance Education Technologies, 8 (1), 31–53. https://doi.org/10.4018/jdet.2010010103

*Lai, A. F., Lai, H. Y., Chuang W. H., & Wu, Z.H. (2015). Developing a mobile learning management system for outdoors nature science activities based on 5e learning cycle. Proceedings of the International Conference on e-Learning, ICEL. Proceedings of the International Association for Development of the Information Society (IADIS) International Conference on e-Learning (Las Palmas de Gran Canaria, Spain, July 21–24, 2015). Retrieved on the 14th November 2020 from https://files.eric.ed.gov/fulltext/ED562095.pdf

Lai, C. H., Lin, H. W., Lin, R. M., & Tho, P. D. (2019). Effect of peer interaction among online learning community on learning engagement and achievement. International Journal of Distance Education Technologies (IJDET), 17 (1), 66–77.

Littell, J. H., Corcoran, J., & Pillai, V. (2008). Systematic reviews and meta-analysis . Oxford University.

*Liu, K. P., Tai, S. J. D., & Liu, C. C. (2018). Enhancing language learning through creation: the effect of digital storytelling on student learning motivation and performance in a school English course. Educational Technology Research and Development, 66 (4), 913–935. https://doi.org/10.1007/s11423-018-9592-z

Machtmes, K., & Asher, J. W. (2000). A meta-analysis of the effectiveness of telecourses in distance education. American Journal of Distance Education, 14 (1), 27–46. https://doi.org/10.1080/08923640009527043

Makowski, D., Piraux, F., & Brun, F. (2019). From experimental network to meta-analysis: Methods and applications with R for agronomic and environmental sciences. Dordrecht: Springer. https://doi.org/10.1007/978-94-024_1696-1

* Meyers, C., Molefe, A., & Brandt, C. (2015). The Impact of the" Enhancing Missouri's Instructional Networked Teaching Strategies"(eMINTS) Program on Student Achievement, 21st-Century Skills, and Academic Engagement--Second-Year Results . Society for Research on Educational Effectiveness. Retrieved on the 14 th November, 2020 from https://files.eric.ed.gov/fulltext/ED562508.pdf

OECD. (2020). ‘A framework to guide an education response to the COVID-19 Pandemic of 2020 ’. https://doi.org/10.26524/royal.37.6

Pecoraro, V. (2018). Appraising evidence . In G. Biondi-Zoccai (Ed.), Diagnostic meta-analysis: A useful tool for clinical decision-making (pp. 99–114). Cham, Switzerland: Springer. https://doi.org/10.1007/978-3-319-78966-8_9

Pigott, T. (2012). Advances in meta-analysis . Springer.

Pillay, H. , Irving, K., & Tones, M. (2007). Validation of the diagnostic tool for assessing Tertiary students’ readiness for online learning. Higher Education Research & Development, 26 (2), 217–234. https://doi.org/10.1080/07294360701310821

Prestiadi, D., Zulkarnain, W., & Sumarsono, R. B. (2019). Visionary leadership in total quality management: efforts to improve the quality of education in the industrial revolution 4.0. In the 4th International Conference on Education and Management (COEMA 2019). Atlantis Press

Poole, D. M. (2000). Student participation in a discussion-oriented online course: a case study. Journal of Research on Computing in Education, 33 (2), 162–177. https://doi.org/10.1080/08886504.2000.10782307

Rahayu, F. S., Budiyanto, D., & Palyama, D. (2017). Analisis penerimaan e-learning menggunakan technology acceptance model (Tam)(Studi Kasus: Universitas Atma Jaya Yogyakarta). Jurnal Terapan Teknologi Informasi, 1 (2), 87–98.

Rasmussen, R. C. (2003). The quantity and quality of human interaction in a synchronous blended learning environment . Brigham Young University Press.

*Ravenel, J., T. Lambeth, D., & Spires, B. (2014). Effects of computer-based programs on mathematical achievement scores for fourth-grade students. i-manager’s Journal on School Educational Technology, 10 (1), 8–21. https://doi.org/10.26634/jsch.10.1.2830

Rolisca, R. U. C., & Achadiyah, B. N. (2014). Pengembangan media evaluasi pembelajaran dalam bentuk online berbasis e-learning menggunakan software wondershare quiz creator dalam mata pelajaran akuntansi SMA Brawijaya Smart School (BSS). Jurnal Pendidikan Akuntansi Indonesia, 12(2).

Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effective- ness of Web-based and classroom instruction: A meta-analysis . Personnel Psychology, 59 (3), 623–664. https://doi.org/10.1111/j.1744-6570.2006.00049.x

Stewart, D. W., & Kamins, M. A. (2001). Developing a coding scheme and coding study reports. In M. W. Lipsey & D. B. Wilson (Eds.), Practical meta­analysis: Applied social research methods series (Vol. 49, pp. 73–90). Sage.

Swan, K. (2007). Research on online learning. Journal of Asynchronous Learning Networks, 11 (1), 55–59.

*Sung, H. Y., Hwang, G. J., & Chang, Y. C. (2016). Development of a mobile learning system based on a collaborative problem-posing strategy. Interactive Learning Environments, 24 (3), 456–471. https://doi.org/10.1080/10494820.2013.867889

Tsagris, M., & Fragkos, K. C. (2018). Meta-analyses of clinical trials versus diagnostic test accuracy studies. In G. Biondi-Zoccai (Ed.), Diagnostic meta-analysis: A useful tool for clinical decision-making (pp. 31–42). Cham, Switzerland: Springer. https://doi.org/10.1007/978-3-319-78966-8_4

UNESCO. (2020, March 13). COVID-19 educational disruption and response. Retrieved on the 14th November 2020 from https://en.unesco.org/themes/education-emergencies/coronavirus-school-closures

Usta, E. (2011a). The effect of web-based learning environments on attitudes of students regarding computer and internet. Procedia-Social and Behavioral Sciences, 28 (262–269), 1. https://doi.org/10.1016/j.sbspro.2011.11.051

Usta, E. (2011b). The examination of online self-regulated learning skills in web-based learning environments in terms of different variables. Turkish Online Journal of Educational Technology-TOJET, 10 (3), 278–286. Retrieved on the 14th November 2020 from https://files.eric.ed.gov/fulltext/EJ944994.pdf

Vrasidas, C., & McIsaac, M. S. (2000). Principles of pedagogy and evaluation for web-based learning. Educational Media International, 37(2), 105–111. https://doi.org/10.1080/095239800410405

*Wang, C. H., & Chen, C. P. (2013). Effects of facebook tutoring on learning english as a second language. Proceedings of the International Conference e-Learning 2013, (2009), 135–142. Retrieved on the 15th November 2020 from https://files.eric.ed.gov/fulltext/ED562299.pdf

Wei, H. C., & Chou, C. (2020). Online learning performance and satisfaction: Do perceptions and readiness matter? Distance Education, 41 (1), 48–69.

*Yu, F. Y. (2019). The learning potential of online student-constructed tests with citing peer-generated questions. Interactive Learning Environments, 27 (2), 226–241. https://doi.org/10.1080/10494820.2018.1458040

*Yu, F. Y., & Chen, Y. J. (2014). Effects of student-generated questions as the source of online drill-and-practice activities on learning . British Journal of Educational Technology, 45 (2), 316–329. https://doi.org/10.1111/bjet.12036

*Yu, F. Y., & Pan, K. J. (2014). The effects of student question-generation with online prompts on learning. Educational Technology and Society, 17 (3), 267–279. Retrieved on the 15th November 2020 from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.565.643&rep=rep1&type=pdf

*Yu, W. F., She, H. C., & Lee, Y. M. (2010). The effects of web-based/non-web-based problem-solving instruction and high/low achievement on students’ problem-solving ability and biology achievement. Innovations in Education and Teaching International, 47 (2), 187–199. https://doi.org/10.1080/14703291003718927

Zhao, Y., Lei, J., Yan, B, Lai, C., & Tan, S. (2005). A practical analysis of research on the effectiveness of distance education. Teachers College Record, 107 (8). https://doi.org/10.1111/j.1467-9620.2005.00544.x

*Zhong, B., Wang, Q., Chen, J., & Li, Y. (2017). Investigating the period of switching roles in pair programming in a primary school. Educational Technology and Society, 20 (3), 220–233. Retrieved on the 15th November 2020 from https://repository.nie.edu.sg/bitstream/10497/18946/1/ETS-20-3-220.pdf


Author information

Authors and affiliations.

Primary Education, Ministry of Turkish National Education, Mersin, Turkey


Corresponding author

Correspondence to Hakan Ulum .


About this article

Ulum, H. The effects of online education on academic success: A meta-analysis study. Educ Inf Technol 27 , 429–450 (2022). https://doi.org/10.1007/s10639-021-10740-8


Received : 06 December 2020

Accepted : 30 August 2021

Published : 06 September 2021

Issue Date : January 2022

DOI : https://doi.org/10.1007/s10639-021-10740-8


  • Online education
  • Student achievement
  • Academic success
  • Meta-analysis


Online and face‐to‐face learning: Evidence from students’ performance during the Covid‐19 pandemic

Carolyn Chisadza

1 Department of Economics, University of Pretoria, Hatfield, South Africa

Matthew Clance

Thulani Mthembu

2 Department of Education Innovation, University of Pretoria, Hatfield, South Africa

Nicky Nicholls

Eleni Yitbarek

This study investigates the factors that predict students' performance after transitioning from face‐to‐face to online learning as a result of the Covid‐19 pandemic. It uses students' responses from survey questions and the difference in the average assessment grades between pre‐lockdown and post‐lockdown at a South African university. We find that students' performance was positively associated with good wifi access, relative to using mobile internet data. We also observe lower academic performance for students who found transitioning to online difficult and who expressed a preference for self‐study (i.e. reading through class slides and notes) over assisted study (i.e. joining live lectures or watching recorded lectures). The findings suggest that improving digital infrastructure and reducing the cost of internet access may be necessary for mitigating the impact of the Covid‐19 pandemic on education outcomes.

1. INTRODUCTION

The Covid-19 pandemic has been a wake-up call to many countries regarding their capacity to cater for mass online education. This situation has been further complicated in developing countries, such as South Africa, which lack digital infrastructure for the majority of the population. The extended lockdown in South Africa saw most universities with mainly in-person teaching scrambling to source hardware (e.g. laptops, internet access), software (e.g. Microsoft packages, data analysis packages) and internet data for disadvantaged students in order for the semester to recommence. Not only has the pandemic revealed the already stark inequality within the tertiary student population, but it has also revealed that high internet data costs in South Africa may perpetuate this inequality, making online education relatively inaccessible for disadvantaged students. 1

The lockdown in South Africa made it possible to investigate the changes in second-year students' performance in the Economics department at the University of Pretoria. In particular, we are interested in assessing what factors predict changes in students' performance after transitioning from face-to-face (F2F) to online learning. Our main objectives in answering this research question are to establish what study materials the students were able to access (i.e. slides, recordings, or live sessions) and how students got access to these materials (i.e. the infrastructure they used).

The benefits of education on economic development are well established in the literature (Gyimah‐Brempong,  2011 ), ranging from health awareness (Glick et al.,  2009 ), improved technological innovations, to increased capacity development and employment opportunities for the youth (Anyanwu,  2013 ; Emediegwu,  2021 ). One of the ways in which inequality is perpetuated in South Africa, and Africa as a whole, is through access to education (Anyanwu,  2016 ; Coetzee,  2014 ; Tchamyou et al.,  2019 ); therefore, understanding the obstacles that students face in transitioning to online learning can be helpful in ensuring more equal access to education.

Using students' responses from survey questions and the difference in the average grades between pre‐lockdown and post‐lockdown, our findings indicate that students' performance in the online setting was positively associated with better internet access. Accessing assisted study material, such as narrated slides or recordings of the online lectures, also helped students. We also find lower academic performance for students who reported finding transitioning to online difficult and for those who expressed a preference for self‐study (i.e. reading through class slides and notes) over assisted study (i.e. joining live lectures or watching recorded lectures). The average grades between pre‐lockdown and post‐lockdown were about two points and three points lower for those who reported transitioning to online teaching difficult and for those who indicated a preference for self‐study, respectively. The findings suggest that improving the quality of internet infrastructure and providing assisted learning can be beneficial in reducing the adverse effects of the Covid‐19 pandemic on learning outcomes.

Our study contributes to the literature by examining the changes between students' online (post-lockdown) performance and their F2F (pre-lockdown) performance. This approach differs from previous studies that, in most cases, use between-subject designs in which one group of students following online learning is compared to a different group of students attending F2F lectures (Almatra et al., 2015; Brown & Liedholm, 2002). Such designs have the limitation that there may be unobserved characteristics unique to students choosing online learning that differ from those choosing F2F lectures. Our approach avoids this issue because we use a within-subject design: we compare the performance of the same students who followed F2F learning before lockdown and moved to online learning during lockdown due to the Covid-19 pandemic. Moreover, the study contributes to the limited literature that compares F2F and online learning in developing countries.

Several studies that have also compared the effectiveness of online learning and F2F classes encounter methodological weaknesses, such as small samples, not controlling for demographic characteristics, and substantial differences in course materials and assessments between online and F2F contexts. To address these shortcomings, our study is based on a relatively large sample of students and includes demographic characteristics such as age, gender and perceived family income classification. The lecturer and course materials also remained similar in the online and F2F contexts. A significant proportion of our students indicated that they never had online learning experience before. Less than 20% of the students in the sample had previous experience with online learning. This highlights the fact that online education is still relatively new to most students in our sample.

Given the global experience of the fourth industrial revolution (4IR), 2 with rapidly accelerating technological progress, South Africa needs to be prepared for the possibility of online learning becoming the new norm in the education system. To this end, policymakers may consider engaging with various organizations (schools, universities, colleges, the private sector, and research facilities) to adopt interventions that may facilitate the transition to online learning, while at the same time ensuring fair access to education for all students across different income levels. 3

1.1. Related literature

Online learning is a form of distance education which mainly involves internet‐based education where courses are offered synchronously (i.e. live sessions online) and/or asynchronously (i.e. students access course materials online in their own time, which is associated with the more traditional distance education). On the other hand, traditional F2F learning is real time or synchronous learning. In a physical classroom, instructors engage with the students in real time, while in the online format instructors can offer real time lectures through learning management systems (e.g. Blackboard Collaborate), or record the lectures for the students to watch later. Purely online courses are offered entirely over the internet, while blended learning combines traditional F2F classes with learning over the internet, and learning supported by other technologies (Nguyen,  2015 ).

Moreover, designing online courses requires several considerations. For example, the quality of the learning environment, the ease of using the learning platform, the learning outcomes to be achieved, instructor support to assist and motivate students to engage with the course material, peer interaction, class participation, type of assessments (Paechter & Maier,  2010 ), not to mention training of the instructor in adopting and introducing new teaching methods online (Lundberg et al.,  2008 ). In online learning, instructors are more facilitators of learning. On the other hand, traditional F2F classes are structured in such a way that the instructor delivers knowledge, is better able to gauge understanding and interest of students, can engage in class activities, and can provide immediate feedback on clarifying questions during the class. Additionally, the designing of traditional F2F courses can be less time consuming for instructors compared to online courses (Navarro,  2000 ).

Online learning is also particularly suited for nontraditional students who require flexibility due to work or family commitments that are not usually associated with the undergraduate student population (Arias et al.,  2018 ). Initially the nontraditional student belonged to the older adult age group, but with blended learning becoming more commonplace in high schools, colleges and universities, online learning has begun to traverse a wider range of age groups. However, traditional F2F classes are still more beneficial for learners that are not so self‐sufficient and lack discipline in working through the class material in the required time frame (Arias et al.,  2018 ).

For the purpose of this literature review, both pure online and blended learning are considered to be online learning because much of the evidence in the literature compares these two types against traditional F2F learning. The debate in the literature surrounding online learning versus F2F teaching continues to be a contentious one. A review of the literature reveals mixed findings when comparing the efficacy of online learning on student performance in relation to the traditional F2F medium of instruction (Lundberg et al., 2008; Nguyen, 2015). A number of studies conducted before the 2000s find what is known today in the empirical literature as the “No Significant Difference” phenomenon (Russell & International Distance Education Certificate Center (IDECC), 1999). The seminal work of Russell and IDECC (1999) involved over 350 comparative studies on online/distance learning versus F2F learning, dating back to 1928. The author finds no significant difference overall between online and traditional F2F classroom education outcomes. Subsequent studies find similar “no significant difference” outcomes (Arbaugh, 2000; Fallah & Ubell, 2000; Freeman & Capper, 1999; Johnson et al., 2000; Neuhauser, 2002). While Bernard et al. (2004) also find that overall there is no significant difference in achievement between online and F2F education, the study does find significant heterogeneity in student performance for different activities. The findings show that students in F2F classes outperform students participating in synchronous online classes (i.e. classes that require online students to participate in live sessions at specific times). However, asynchronous online classes (i.e. where students access class materials online in their own time) outperform F2F classes.

More recent studies find significant results for online learning outcomes in relation to F2F outcomes. On the one hand, Shachar and Yoram ( 2003 ) and Shachar and Neumann ( 2010 ) conduct a meta‐analysis of studies from 1990 to 2009 and find that in 70% of the cases, students taking courses by online education outperformed students in traditionally instructed courses (i.e. F2F lectures). In addition, Navarro and Shoemaker ( 2000 ) observe that learning outcomes for online learners are as effective as or better than outcomes for F2F learners, regardless of background characteristics. In a study on computer science students, Dutton et al. ( 2002 ) find online students perform significantly better compared to the students who take the same course on campus. A meta‐analysis conducted by the US Department of Education finds that students who took all or part of their course online performed better, on average, than those taking the same course through traditional F2F instructions. The report also finds that the effect sizes are larger for studies in which the online learning was collaborative or instructor‐driven than in those studies where online learners worked independently (Means et al.,  2010 ).

On the other hand, evidence by Brown and Liedholm ( 2002 ) based on test scores from macroeconomics students in the United States suggest that F2F students tend to outperform online students. These findings are supported by Coates et al. ( 2004 ) who base their study on macroeconomics students in the United States, and Xu and Jaggars ( 2014 ) who find negative effects for online students using a data set of about 500,000 courses taken by over 40,000 students in Washington. Furthermore, Almatra et al. ( 2015 ) compare overall course grades between online and F2F students for a Telecommunications course and find that F2F students significantly outperform online learning students. In an experimental study where students are randomly assigned to attend live lectures versus watching the same lectures online, Figlio et al. ( 2013 ) observe some evidence that the traditional format has a positive effect compared to online format. Interestingly, Callister and Love ( 2016 ) specifically compare the learning outcomes of online versus F2F skills‐based courses and find that F2F learners earned better outcomes than online learners even when using the same technology. This study highlights that some of the inconsistencies that we find in the results comparing online to F2F learning might be influenced by the nature of the course: theory‐based courses might be less impacted by in‐person interaction than skills‐based courses.

The fact that the reviewed studies on the effects of F2F versus online learning on student performance have been mainly focused in developed countries indicates the dearth of similar studies being conducted in developing countries. This gap in the literature may also highlight a salient point: online learning is still relatively underexplored in developing countries. The lockdown in South Africa therefore provides us with an opportunity to contribute to the existing literature from a developing country context.

2. CONTEXT OF STUDY

South Africa went into national lockdown in March 2020 due to the Covid‐19 pandemic. Like most universities in the country, the first semester for undergraduate courses at the University of Pretoria had already been running since the start of the academic year in February. Before the pandemic, a number of F2F lectures and assessments had already been conducted in most courses. The nationwide lockdown forced the university, which was mainly in‐person teaching, to move to full online learning for the remainder of the semester. This forced shift from F2F teaching to online learning allows us to investigate the changes in students' performance.

Before lockdown, classes were conducted on campus. During lockdown, these live classes were moved to an online platform, Blackboard Collaborate, which could be accessed by all registered students on the university intranet (“ClickUP”). However, these live online lectures involve substantial internet data costs for students. To ensure access to course content for those students who were unable to attend the live online lectures due to poor internet connections or internet data costs, several options for accessing course content were made available. These options included prerecorded narrated slides (which required less usage of internet data), recordings of the live online lectures, PowerPoint slides with explanatory notes and standard PDF lecture slides.

At the same time, the university managed to procure and loan out laptops to a number of disadvantaged students, and negotiated with major mobile internet data providers in the country for students to have free access to study material through the university's “connect” website (also referred to as the zero‐rated website). However, this free access excluded some video content and live online lectures (see Table  1 ). The university also provided between 10 and 20 gigabytes of mobile internet data per month, depending on the network provider, sent to students' mobile phones to assist with internet data costs.

Table 1. Sites available on the zero-rated website

Browser access to the university intranet (ClickUP): availability marked as zero-rated and/or paid with internet data

Content: X / X (Bb App)
Interactive videos and content: X
YouTube (only if linked in ClickUP): X
Announcements: X / X
Blackboard Collaborate—live sessions
Blackboard Collaborate—recordings: X
Discussions: X
Blogs: X
Journals: X
Assignments: X
Turnitin assignments: X
Tests: X
Gmail: X
Library: X
Google Drive (accessed via Gmail): X
Google Hangouts/Meet: X
Blackboard App (Bb App): X
Instructor App: X
UP & Library App: X
Cengage: X
Elsevier: X
IT Schools: X
Macmillan: X
McGraw Hill: X
Saping: X
Vitalsource: X
Webassign: X
Willeyplus: X

Note : The table summarizes the sites that were available on the zero‐rated website and those that incurred data costs.

High data costs continue to be a contentious issue in Africa where average incomes are low. Gilbert ( 2019 ) reports that South Africa ranked 16th of the 45 countries researched in terms of the most expensive internet data in Africa, at US$6.81 per gigabyte, in comparison to other Southern African countries such as Mozambique (US$1.97), Zambia (US$2.70), and Lesotho (US$4.09). Internet data prices have also been called into question in South Africa after the Competition Commission published a report from its Data Services Market Inquiry calling the country's internet data pricing “excessive” (Gilbert,  2019 ).

3. EMPIRICAL APPROACH

We use a sample of 395 second-year students taking a macroeconomics module in the Economics department to compare the effects of F2F and online learning on students' performance using a range of assessments. The module was an introduction to the application of theoretical economic concepts. The content was both theory-based (developing economic growth models using concepts and equations) and skill-based (application involving the collection of data from online data sources and analysis of the data using statistical software). Both individual and group assignments formed part of the assessments. Before the end of the semester, during lockdown in June 2020, we asked the students to complete a survey with questions related to the transition from F2F to online learning and the difficulties that they may have faced. For example, we asked the students: (i) how easy or difficult they found the transition from F2F to online lectures; (ii) what internet options were available to them and which they used the most to access the online prescribed work; (iii) what format of content they accessed and which they preferred the most (i.e. self-study material in the form of PDF and PowerPoint slides with notes vs. assisted study with narrated slides and lecture recordings); and (iv) what difficulties they faced accessing the live online lectures, to name a few. Figure 1 summarizes the key survey questions that we asked the students regarding their transition from F2F to online learning.

[Figure 1. Summary of survey data]

Before the lockdown, the students had already attended several F2F classes and completed three assessments. We are therefore able to create a dependent variable comprising the average grade of the three assignments taken before lockdown and the average grade of the three assignments taken after the start of the lockdown for each student. Specifically, we use the difference between the post- and pre-lockdown average grades as the dependent variable. However, the number of student observations dropped to 275 due to some students missing one or more of the assessments. The lecturer, content and format of the assessments remained similar across the module. We estimate the following equation using ordinary least squares (OLS) with robust standard errors:

Y_i = β_0 + B_i′β_1 + X_i′β_2 + ε_i

where Y_i is the student's performance measured by the difference between the post- and pre-lockdown average grades. B_i represents the vector of determinants that measure the difficulty faced by students in transitioning from F2F to online learning; this vector includes access to the internet, study material preferred, quality of the online live lecture sessions and pre-lockdown class attendance. X_i is the vector of student demographic controls such as race, gender and an indicator for whether the student's perceived family income is below average. The ε_i term captures unobserved student characteristics.
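As an illustration of this specification, the sketch below generates a synthetic data set with hypothetical variable names and fits the model by OLS with heteroskedasticity-robust standard errors using statsmodels. The paper does not state the exact software or variable coding used, so the column names, codings and data-generating values shown here are assumptions made purely for demonstration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the student survey/grade records (illustrative only)
rng = np.random.default_rng(1)
n = 275
df = pd.DataFrame({
    "transition_difficulty": rng.integers(1, 5, n),   # 1 = very easy ... 4 = impossible
    "wifi": rng.integers(0, 2, n),                    # 1 if wifi/ADSL/fibre, 0 otherwise
    "zero_rated": rng.integers(0, 2, n),
    "self_study": rng.integers(0, 2, n),
    "attendance_pre": rng.integers(0, 2, n),
    "collab_quality": rng.integers(0, 2, n),
    "male": rng.integers(0, 2, n),
    "race": rng.choice(["African", "Colored", "Indian", "White"], n),
    "below_avg_income": rng.integers(0, 2, n),
})
# Y_i: post- minus pre-lockdown average grade (arbitrary illustrative coefficients + noise)
df["diff_grade"] = (-2.0 * df["transition_difficulty"]
                    + 4.5 * df["wifi"]
                    + rng.normal(0, 13, n))

model = smf.ols(
    "diff_grade ~ transition_difficulty + wifi + zero_rated + self_study"
    " + attendance_pre + collab_quality + male + C(race) + below_avg_income",
    data=df,
)
result = model.fit(cov_type="HC1")   # heteroskedasticity-robust standard errors
print(result.summary())
```

Coding wifi and zero-rated as separate dummies leaves mobile internet data as the omitted reference category, which matches how the coefficients in Table 3 are interpreted.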

4. ANALYSIS

4.1. Descriptive statistics

Table  2 gives an overview of the sample of students. We find that among the black students, a higher proportion of students reported finding the transition to online learning more difficult. On the other hand, more white students reported finding the transition moderately easy, as did the other races. According to Coetzee ( 2014 ), the quality of schools can vary significantly between higher income and lower‐income areas, with black South Africans far more likely to live in lower‐income areas with lower quality schools than white South Africans. As such, these differences in quality of education from secondary schooling can persist at tertiary level. Furthermore, persistent income inequality between races in South Africa likely means that many poorer black students might not be able to afford wifi connections or large internet data bundles which can make the transition difficult for black students compared to their white counterparts.

Table 2. Descriptive statistics

Columns by: Transition difficulty | Very easy to moderately easy | Difficult to impossible | Total
n (%) | 169 (61.5) | 106 (38.5) | 275 (100.0)
Race, n (%) |  |  |
African | 82 (48.5) | 69 (65.1) | 151 (54.9)
Colored | 9 (5.3) | 4 (3.8) | 13 (4.7)
Indian | 15 (8.9) | 7 (6.6) | 22 (8.0)
White | 63 (37.3) | 26 (24.5) | 89 (32.4)
Gender, n (%) |  |  |
Female | 82 (48.5) | 57 (53.8) | 139 (50.5)
Male | 87 (51.5) | 49 (46.2) | 136 (49.5)
Internet access, n (%) |  |  |
Mobile internet data | 33 (19.5) | 31 (29.2) | 64 (23.3)
Wifi | 122 (72.2) | 58 (54.7) | 180 (65.5)
Zero-rated | 14 (8.3) | 17 (16.0) | 31 (11.3)
Post-lockdown quiz average, mean (SD) | 83.09 (8.50) | 79.76 (11.07) | 81.81 (9.69)
Difference pre- and post-grades, mean (SD) | 6.81 (12.35) | 3.99 (14.07) | 5.72 (13.09)
Self-study, mean (SD) | 0.61 (0.49) | 0.58 (0.50) | 0.60 (0.49)
Class attendance pre-lockdown, mean (SD) | 0.54 (0.50) | 0.57 (0.50) | 0.55 (0.50)
Quality Collaborate: picture/sound, mean (SD) | 0.24 (0.43) | 0.31 (0.47) | 0.27 (0.44)
Below average income, mean (SD) | 0.24 (0.43) | 0.06 (0.23) | 0.17 (0.38)

Notes: The transition difficulty variable was ordered 1: Very Easy; 2: Moderately Easy; 3: Difficult; and 4: Impossible. Since we have few responses at the extremes, we combined Very Easy with Moderately Easy, and Difficult with Impossible, to make the table easier to read. The table with a full breakdown is available upon request.

A higher proportion of students reported that wifi access made the transition to online learning moderately easy. However, relatively more students reported that mobile internet data and accessing the zero‐rated website made the transition difficult. Surprisingly, not many students made use of the zero‐rated website which was freely available. Figure  2 shows that students who reported difficulty transitioning to online learning did not perform as well in online learning versus F2F when compared to those that found it less difficult to transition.

[Figure 2. Transition from F2F to online learning. The graph shows the students' responses to the question “How easy did you find the transition from face-to-face lectures to online lectures?” in relation to the outcome variable for performance.]

In Figure  3 , the kernel density shows that students who had access to wifi performed better than those who used mobile internet data or the zero‐rated data.

[Figure 3. Access to online learning. The graph shows the students' responses to the question “What do you currently use the most to access most of your prescribed work?” in relation to the outcome variable for performance.]
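A figure of this kind can be reproduced with grouped kernel density estimates; the short sketch below uses synthetic data and hypothetical column names purely to illustrate the plotting step, not the study's actual data.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns

# Synthetic illustrative data: grade differences by internet-access category
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "internet_access": rng.choice(["Wifi", "Mobile internet data", "Zero-rated"], size=275),
    "diff_grade": rng.normal(loc=5, scale=13, size=275),
})

fig, ax = plt.subplots(figsize=(7, 4))
for access in ["Wifi", "Mobile internet data", "Zero-rated"]:
    # Kernel density of the grade-difference outcome for each access category
    sns.kdeplot(df.loc[df["internet_access"] == access, "diff_grade"], label=access, ax=ax)
ax.set_xlabel("Difference between post- and pre-lockdown average grades")
ax.legend(title="Access to prescribed work")
plt.show()
```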

The regression results are reported in Table  3 . We find that the change in students' performance from F2F to online is negatively associated with the difficulty they faced in transitioning from F2F to online learning. According to student survey responses, factors contributing to difficulty in transitioning included poor internet access, high internet data costs and lack of equipment such as laptops or tablets to access the study materials on the university website. Students who had access to wifi (i.e. fixed wireless broadband, Asymmetric Digital Subscriber Line (ADSL) or optic fiber) performed significantly better, with on average 4.5 points higher grade, in relation to students that had to use mobile internet data (i.e. personal mobile internet data, wifi at home using mobile internet data, or hotspot using mobile internet data) or the zero‐rated website to access the study materials. The insignificant results for the zero‐rated website are surprising given that the website was freely available and did not incur any internet data costs. However, most students in this sample complained that the internet connection on the zero‐rated website was slow, especially in uploading assignments. They also complained about being disconnected when they were in the middle of an assessment. This may have discouraged some students from making use of the zero‐rated website.

Table 3. Predictors for student performance using the difference in average assessment grades between pre‐ and post‐lockdown

Dependent variable (all columns): difference between pre‐ and post‐lockdown average grades.

Variable | (1) | (2) | (3) | (4) | (5)
Transition difficulty | −2.086 (1.220) | −2.216 (1.202) | −2.207 (1.189) | −2.020 (1.200) | −2.166 (1.198)
Wifi | 4.533 (2.153) | 4.415 (2.150) | 4.399 (2.091) | 4.662 (2.109) | 4.721 (2.116)
Zero‐rated | −0.245 (2.625) | 0.089 (2.659) | 0.214 (2.629) | 0.499 (2.652) | 1.226 (2.609)
Self‐study | | −3.649 (1.609) | −3.360 (1.588) | −3.388 (1.593) | −2.824 (1.617)
Class attendance pre‐lockdown | | | −3.403 (1.557) | −3.195 (1.571) | −3.478 (1.578)
Quality Collaborate: picture/sound | | | | −1.968 (1.603) | −1.997 (1.562)
Male | | | | | −3.038 (1.596)
Colored | 3.783 (2.421) | 3.491 (2.622) | 3.064 (2.566) | 3.500 (2.652) | 4.408 (2.652)
Indian | 4.240 (3.105) | 4.611 (3.046) | 4.700 (2.991) | 4.563 (2.991) | 4.701 (2.976)
White | −0.131 (1.829) | 0.392 (1.844) | 0.020 (1.832) | −0.061 (1.834) | 0.339 (1.856)
Below average income | −3.165 (2.008) | −3.436 (1.996) | −4.005 (1.953) | −3.685 (1.967) | −3.535 (1.959)
R²‐adj. | 0.035 | 0.050 | 0.063 | 0.064 | 0.073
Observations | 275 | 275 | 275 | 273 | 273

Notes: Coefficients reported. Robust standard errors in parentheses. ∗∗∗ p < .01.

Students who expressed a preference for self‐study approaches (i.e. reading PDF slides or PowerPoint slides with explanatory notes) did not perform as well, on average, as students who preferred assisted study (i.e. listening to recorded narrated slides or lecture recordings). This result is in line with Means et al. ( 2010 ), where student performance was better for online learning that was collaborative or instructor‐driven than in cases where online learners worked independently. Interestingly, we also observe that the performance of students who often attended in‐person classes before the lockdown decreased. Perhaps these students found the F2F lectures particularly helpful in mastering the course material. From the survey responses, we find that a significant proportion of the students (about 70%) preferred F2F to online lectures. This preference for F2F lectures may also be linked to the factors contributing to the difficulty some students faced in transitioning to online learning.

We find that the performance of low‐income students decreased post‐lockdown, which highlights another potential challenge to transitioning to online learning. The picture and sound quality of the live online lectures also contributed to lower performance. Although this result is not statistically significant, it is worth noting as the implications are linked to the quality of infrastructure currently available for students to access online learning. We find no significant effects of race on changes in students' performance, though males appeared to struggle more with the shift to online teaching than females.

For the robustness check in Table  4 , we consider the average grades of the three assignments taken after the start of the lockdown as a dependent variable (i.e. the post‐lockdown average grades for each student). We then include the pre‐lockdown average grades as an explanatory variable. The findings and overall conclusions in Table  4 are consistent with the previous results.

Table 4. Robustness check: Predictors for student performance using the average assessment grades for post‐lockdown

Dependent variable (all columns): post‐lockdown quiz average. Coefficients reported; standard errors in parentheses.

Variable | (1) | (2) | (3) | (4) | (5)
Pre‐lockdown quiz average | 0.171 (0.050) | 0.171 (0.048) | 0.177 (0.049) | 0.175 (0.049) | 0.181 (0.049)
Transition difficulty | −1.745 (0.842) | −1.875 (0.815) | −1.875 (0.816) | −1.744 (0.823) | −1.818 (0.826)
Wifi | 2.945 (1.624) | 2.827 (1.619) | 2.834 (1.599) | 2.949 (1.605) | 2.990 (1.599)
Zero‐rated | −0.590 (1.889) | −0.257 (1.924) | −0.215 (1.928) | −0.045 (1.937) | 0.318 (1.946)
Self‐study | | −3.648 (1.100) | −3.558 (1.103) | −3.606 (1.110) | −3.325 (1.155)
Class attendance pre‐lockdown | | | −1.061 (1.132) | −1.003 (1.148) | −1.158 (1.158)
Quality Collaborate: picture/sound | | | | −1.267 (1.202) | −1.286 (1.189)
Male | | | | | −1.506 (1.179)
Colored | 3.307 (2.477) | 3.015 (2.402) | 2.885 (2.394) | 3.163 (2.493) | 3.615 (2.657)
Indian | 4.147 (2.022) | 4.518 (1.981) | 4.547 (1.969) | 4.457 (1.975) | 4.526 (1.983)
White | 1.215 (1.356) | 1.738 (1.349) | 1.612 (1.346) | 1.448 (1.344) | 1.636 (1.349)
Below average income | 1.476 (1.363) | 1.204 (1.327) | 0.993 (1.344) | 1.278 (1.335) | 1.319 (1.342)
R²‐adj. | 0.111 | 0.142 | 0.142 | 0.141 | 0.143
Observations | 275 | 275 | 275 | 273 | 273

As a further robustness check in Table  5 , we create a panel for each student across the six assignment grades so we can control for individual heterogeneity. We create a post‐lockdown binary variable that takes the value of 1 for the lockdown period and 0 otherwise. We interact the post‐lockdown dummy variable with a measure for transition difficulty and internet access. The internet access variable is an indicator variable for mobile internet data, wifi, or zero‐rated access to class materials. The variable wifi is a binary variable taking the value of 1 if the student has access to wifi and 0 otherwise. The zero‐rated variable is a binary variable taking the value of 1 if the student used the university's free portal access and 0 otherwise. We also include assignment and student fixed effects. The results in Table  5 remain consistent with our previous findings that students who had wifi access performed significantly better than their peers.
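The fixed-effects interaction specification can be sketched as follows. This is a hedged illustration under assumed names rather than the authors' code: it presumes a long-format DataFrame `panel` with one row per student-assignment pair and hypothetical columns student_id, assignment_id, grade, post, transition_difficulty, wifi, and zero_rated. The post main effect is absorbed by the assignment fixed effects because each assignment is entirely pre- or post-lockdown, so only the interaction terms appear.

```python
# Hedged sketch of a two-way fixed-effects model with post-lockdown
# interactions (hypothetical column names; not the authors' code).
import statsmodels.formula.api as smf

fe_model = smf.ols(
    # main effects are absorbed by the fixed effects, so only interactions enter
    "grade ~ post:transition_difficulty + post:wifi + post:zero_rated"
    " + C(student_id) + C(assignment_id)",   # student and assignment fixed effects
    data=panel,
)
fe_results = fe_model.fit(cov_type="HC1")
print(fe_results.params.filter(like="post"))  # report only the interaction terms
```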

Table 5. Interaction model

Dependent variable (all columns): assessment grades for each student on each assignment.

Variable | (1) | (2) | (3) | (4)
Post × Transition difficulty | −1.746 (0.922) | | −1.005 (0.948) | −1.008 (0.948)
Wifi × Post | | 4.599 (1.342) | 4.199 (1.379) | 3.807 (1.618)
Zero‐rated × Post | | | | −1.138 (2.223)
Assignment FE | Yes | Yes | Yes | Yes
Student FE | Yes | Yes | Yes | Yes
R²‐adj. | 0.370 | 0.373 | 0.373 | 0.373
Observations | 2215 | 2215 | 2215 | 2215

Notes: Coefficients reported. Robust standard errors in parentheses. The dependent variable is the assessment grade for each student on each assignment. The number of observations equals the number of pre‐ and post‐lockdown assessments multiplied by the number of students.

6. CONCLUSION

The Covid‐19 pandemic left many education institutions with no option but to transition to online learning. The University of Pretoria was no exception. We examine the effect of transitioning to online learning on the academic performance of second‐year economics students. We use assessment results from F2F lectures before the lockdown and online lectures after the lockdown for the same group of students, together with responses to survey questions. We find that the main contributor to lower academic performance in the online setting was poor internet access, which made transitioning to online learning more difficult. In addition, opting to self‐study (reading notes instead of joining online classes and/or watching recordings) did not help students' performance.

The implications of the results highlight the need for improved internet infrastructure with affordable internet data pricing. Despite the university's best efforts not to leave any student behind, through the zero‐rated website and free monthly internet data, the inequality dynamics in the country are such that some students were inevitably negatively affected by this transition, not because they were struggling academically, but because of inadequate access to the internet (wifi). While the zero‐rated website is a good collaborative initiative between universities and network providers, the infrastructure is not sufficient to accommodate large numbers of students accessing it simultaneously.

This study's findings may highlight some shortcomings in the academic sector that need to be addressed by both the public and private sectors. There is potential for the digital divide to widen as a result of the inequitable distribution of digital infrastructure, which may reinforce current inequalities in access to higher education in the long term. To prepare the country for online learning, internet data tariffs may need to be made more affordable and internet access extended to all. We hope that this study's findings will provide a platform (or will at least start the conversation about remedial action) for policy engagement in this regard.

We are aware of some limitations presented by our study. The sample we have at hand makes it difficult to extrapolate our findings to either all students at the University of Pretoria or other higher education students in South Africa. Despite this limitation, our findings highlight the negative effect of the digital divide on students' educational outcomes in the country. The transition to online learning and the high internet data costs in South Africa can also have adverse learning outcomes for low‐income students. With higher education institutions, such as the University of Pretoria, integrating online teaching to overcome the effect of the Covid‐19 pandemic, access to stable internet is vital for students' academic success.

It is also important to note that the data at hand do not allow us to isolate wifi's causal effect on students' performance post‐lockdown, for two main reasons. First, wifi access is not randomly assigned; for instance, students from better‐off family backgrounds are likely to have better access to wifi and other supplementary infrastructure than their poorer counterparts. Second, due to the university's data access policy and consent requirements, we could not merge the data at hand with students' previous year's performance. Therefore, future research might examine these elements to document the causal impact of access to wifi on students' educational outcomes in the country.

ACKNOWLEDGMENT

The authors acknowledge the helpful comments received from the editor, the anonymous reviewers, and Elizabeth Asiedu.

Chisadza, C. , Clance, M. , Mthembu, T. , Nicholls, N. , & Yitbarek, E. (2021). Online and face‐to‐face learning: Evidence from students’ performance during the Covid‐19 pandemic . Afr Dev Rev , 33 , S114–S125. 10.1111/afdr.12520 [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]

1 https://mybroadband.co.za/news/cellular/309693-mobile-data-prices-south-africa-vs-the-world.html .

2 The 4IR is currently characterized by increased use of new technologies, such as advanced wireless technologies, artificial intelligence, cloud computing, robotics, among others. This era has also facilitated the use of different online learning platforms ( https://www.brookings.edu/research/the-fourth-industrialrevolution-and-digitization-will-transform-africa-into-a-global-powerhouse/ ).

3 Note that we control for income, but it is plausible to assume other unobservable factors such as parental preference and parenting style might also affect access to the internet of students.

  • Almatra, O. , Johri, A. , Nagappan, K. , & Modanlu, A. (2015). An empirical study of face‐to‐face and distance learning sections of a core telecommunication course (Conference Proceedings Paper No. 12944). 122nd ASEE Annual Conference and Exposition, Seattle, Washington State.
  • Anyanwu, J. C. (2013). Characteristics and macroeconomic determinants of youth employment in Africa . African Development Review , 25 ( 2 ), 107–129. [ Google Scholar ]
  • Anyanwu, J. C. (2016). Accounting for gender equality in secondary school enrolment in Africa: Accounting for gender equality in secondary school enrolment . African Development Review , 28 ( 2 ), 170–191. [ Google Scholar ]
  • Arbaugh, J. (2000). Virtual classroom versus physical classroom: An exploratory study of class discussion patterns and student learning in an asynchronous internet‐based MBA course . Journal of Management Education , 24 ( 2 ), 213–233. [ Google Scholar ]
  • Arias, J. J. , Swinton, J. , & Anderson, K. (2018). On‐line vs. face‐to‐face: A comparison of student outcomes with random assignment . e‐Journal of Business Education and Scholarship of Teaching, , 12 ( 2 ), 1–23. [ Google Scholar ]
  • Bernard, R. M. , Abrami, P. C. , Lou, Y. , Borokhovski, E. , Wade, A. , Wozney, L. , Wallet, P. A. , Fiset, M. , & Huang, B. (2004). How does distance education compare with classroom instruction? A meta‐analysis of the empirical literature . Review of Educational Research , 74 ( 3 ), 379–439. [ Google Scholar ]
  • Brown, B. , & Liedholm, C. (2002). Can web courses replace the classroom in principles of microeconomics? American Economic Review , 92 ( 2 ), 444–448. [ Google Scholar ]
  • Callister, R. R. , & Love, M. S. (2016). A comparison of learning outcomes in skills‐based courses: Online versus face‐to‐face formats . Decision Sciences Journal of Innovative Education , 14 ( 2 ), 243–256. [ Google Scholar ]
  • Coates, D. , Humphreys, B. R. , Kane, J. , & Vachris, M. A. (2004). “No significant distance” between face‐to‐face and online instruction: Evidence from principles of economics . Economics of Education Review , 23 ( 5 ), 533–546. [ Google Scholar ]
  • Coetzee, M. (2014). School quality and the performance of disadvantaged learners in South Africa (Working Paper No. 22). University of Stellenbosch Economics Department, Stellenbosch
  • Dutton, J. , Dutton, M. , & Perry, J. (2002). How do online students differ from lecture students? Journal of Asynchronous Learning Networks , 6 ( 1 ), 1–20. [ Google Scholar ]
  • Emediegwu, L. (2021). Does educational investment enhance capacity development for Nigerian youths? An autoregressive distributed lag approach . African Development Review , 32 ( S1 ), S45–S53. [ Google Scholar ]
  • Fallah, M. H. , & Ubell, R. (2000). Blind scores in a graduate test. Conventional compared with web‐based outcomes . ALN Magazine , 4 ( 2 ). [ Google Scholar ]
  • Figlio, D. , Rush, M. , & Yin, L. (2013). Is it live or is it internet? Experimental estimates of the effects of online instruction on student learning . Journal of Labor Economics , 31 ( 4 ), 763–784. [ Google Scholar ]
  • Freeman, M. A. , & Capper, J. M. (1999). Exploiting the web for education: An anonymous asynchronous role simulation . Australasian Journal of Educational Technology , 15 ( 1 ), 95–116. [ Google Scholar ]
  • Gilbert, P. (2019). The most expensive data prices in Africa . Connecting Africa. https://www.connectingafrica.com/author.asp?section_id=761%26doc_id=756372
  • Glick, P. , Randriamamonjy, J. , & Sahn, D. (2009). Determinants of HIV knowledge and condom use among women in Madagascar: An analysis using matched household and community data . African Development Review , 21 ( 1 ), 147–179. [ Google Scholar ]
  • Gyimah‐Brempong, K. (2011). Education and economic development in Africa . African Development Review , 23 ( 2 ), 219–236. [ Google Scholar ]
  • Johnson, S. , Aragon, S. , Shaik, N. , & Palma‐Rivas, N. (2000). Comparative analysis of learner satisfaction and learning outcomes in online and face‐to‐face learning environments . Journal of Interactive Learning Research , 11 ( 1 ), 29–49. [ Google Scholar ]
  • Lundberg, J. , Merino, D. , & Dahmani, M. (2008). Do online students perform better than face‐to‐face students? Reflections and a short review of some empirical findings . Revista de Universidad y Sociedad del Conocimiento , 5 ( 1 ), 35–44. [ Google Scholar ]
  • Means, B. , Toyama, Y. , Murphy, R. , Bakia, M. , & Jones, K. (2010). Evaluation of evidence‐based practices in online learning: A meta‐analysis and review of online learning studies (Report No. ed‐04‐co‐0040 task 0006). U.S. Department of Education, Office of Planning, Evaluation, and Policy Development, Washington DC.
  • Navarro, P. (2000). Economics in the cyber‐classroom . Journal of Economic Perspectives , 14 ( 2 ), 119–132. [ Google Scholar ]
  • Navarro, P. , & Shoemaker, J. (2000). Performance and perceptions of distance learners in cyberspace . American Journal of Distance Education , 14 ( 2 ), 15–35. [ Google Scholar ]
  • Neuhauser, C. (2002). Learning style and effectiveness of online and face‐to‐face instruction . American Journal of Distance Education , 16 ( 2 ), 99–113. [ Google Scholar ]
  • Nguyen, T. (2015). The effectiveness of online learning: Beyond no significant difference and future horizons . MERLOT Journal of Online Teaching and Learning , 11 ( 2 ), 309–319. [ Google Scholar ]
  • Paechter, M. , & Maier, B. (2010). Online or face‐to‐face? Students' experiences and preferences in e‐learning . Internet and Higher Education , 13 ( 4 ), 292–297. [ Google Scholar ]
  • Russell, T. L. , & International Distance Education Certificate Center (IDECC) (1999). The no significant difference phenomenon: A comparative research annotated bibliography on technology for distance education: As reported in 355 research reports, summaries and papers . North Carolina State University. [ Google Scholar ]
  • Shachar, M. , & Neumann, Y. (2010). Twenty years of research on the academic performance differences between traditional and distance learning: Summative meta‐analysis and trend examination . MERLOT Journal of Online Learning and Teaching , 6 ( 2 ), 318–334. [ Google Scholar ]
  • Shachar, M. , & Yoram, N. (2003). Differences between traditional and distance education academic performances: A meta‐analytic approach . International Review of Research in Open and Distance Learning , 4 ( 2 ), 1–20. [ Google Scholar ]
  • Tchamyou, V. S. , Asongu, S. , & Odhiambo, N. (2019). The role of ICT in modulating the effect of education and lifelong learning on income inequality and economic growth in Africa . African Development Review , 31 ( 3 ), 261–274. [ Google Scholar ]
  • Xu, D. , & Jaggars, S. S. (2014). Performance gaps between online and face‐to‐face courses: Differences across types of students and academic subject areas . The Journal of Higher Education , 85 ( 5 ), 633–659. [ Google Scholar ]
Published: 16 September 2021

Online learning during COVID-19 produced equivalent or better student course performance as compared with pre-pandemic: empirical evidence from a school-wide comparative study

  • Meixun Zheng 1 ,
  • Daniel Bender 1 &
  • Cindy Lyon 1  

BMC Medical Education volume  21 , Article number:  495 ( 2021 ) Cite this article


The COVID-19 pandemic forced dental schools to close their campuses and move didactic instruction online. The abrupt transition to online learning, however, has raised several issues that have not been resolved. While several studies have investigated dental students’ attitude towards online learning during the pandemic, mixed results have been reported. Additionally, little research has been conducted to identify and understand factors, especially pedagogical factors, that impacted students’ acceptance of online learning during campus closure. Furthermore, how online learning during the pandemic impacted students’ learning performance has not been empirically investigated. In March 2020, the dental school studied here moved didactic instruction online in response to government issued stay-at-home orders. This first-of-its-kind comparative study examined students’ perceived effectiveness of online courses during summer quarter 2020, explored pedagogical factors impacting their acceptance of online courses, and empirically evaluated the impact of online learning on students’ course performance, during the pandemic.

The study employed a quasi-experimental design. Participants were 482 pre-doctoral students in a U.S. dental school. Students’ perceived effectiveness of online courses during the pandemic was assessed with a survey. Students’ course grades for online courses during summer quarter 2020 were compared with those of a control group who received face-to-face instruction for the same courses before the pandemic, in summer quarter 2019.

Survey results revealed that most online courses were well accepted by the students, and 80 % of them wanted to continue with some online instruction post pandemic. Regression analyses revealed that students’ perceived engagement with faculty and classmates predicted their perceived effectiveness of the online course. More notably, Chi-square tests demonstrated that in 16 out of the 17 courses compared, the online cohort during summer quarter 2020 was equally or more likely to get an A course grade than the analogous face-to-face cohort during summer quarter 2019.

Conclusions

This is the first empirical study in dental education to demonstrate that online courses during the pandemic could achieve equivalent or better student course performance than the same pre-pandemic in-person courses. The findings fill in gaps in literature and may inform online learning design moving forward.


Introduction

Research across disciplines has demonstrated that well-designed online learning can lead to students’ enhanced motivation, satisfaction, and learning [ 1 , 2 , 3 , 4 , 5 , 6 , 7 ]. A report by the U.S. Department of Education [ 8 ], based on examinations of comparative studies of online and face-to-face versions of the same course from 1996 to 2008, concluded that online learning could produce learning outcomes equivalent to or better than face-to-face learning. The more recent systematic review by Pei and Wu [ 9 ] provided additional evidence that online learning is at least as effective as face-to-face learning for undergraduate medical students.

To take advantage of the opportunities presented by online learning, thought leaders in dental education in the U.S. have advocated for the adoption of online learning in the nation’s dental schools [ 10 , 11 , 12 ]. However, digital innovation has been a slow process in academic dentistry [ 13 , 14 , 15 ]. In March 2020, the COVID-19 pandemic brought unprecedented disruption to dental education by necessitating the need for online learning. In accordance with stay-at-home orders to prevent the spread of the virus, dental schools around the world closed their campuses and moved didactic instruction online.

The abrupt transition to online learning, however, has raised several concerns and questions. First, while several studies have examined dental students’ online learning satisfaction during the pandemic, mixed results have been reported. Some studies have reported students’ positive attitude towards online learning [ 15 , 16 , 17 , 18 , 19 , 20 ]. Sadid-Zadeh et al. [ 18 ] found that 99 % of the surveyed dental students at the University at Buffalo, in the U.S., were satisfied with live web-based lectures during the pandemic. Schlenz et al. [ 15 ] reported that students in a German dental school had a favorable attitude towards online learning and wanted to continue with online instruction in their future curriculum. Other studies, however, have reported students’ negative online learning experiences during the pandemic [ 21 , 22 , 23 , 24 , 25 , 26 ]. For instance, dental students at Harvard University felt that learning during the pandemic had worsened and engagement had decreased [ 23 , 24 ]. In a study with medical and dental students in Pakistan, Abbasi et al. [ 21 ] found that 77 % of the students had negative perceptions about online learning and 84 % reported reduced student-instructor interactions.

In addition to these mixed results, little attention has been given to factors affecting students’ acceptance of online learning during the pandemic. With the likelihood that online learning will persist post pandemic [ 27 ], research in this area is warranted to inform online course design moving forward. In particular, prior research has demonstrated that one of the most important factors influencing students’ performance in any learning environment is a sense of belonging, the feeling of being connected with and supported by the instructor and classmates [ 28 , 29 , 30 , 31 ]. Unfortunately, this aspect of the classroom experience has suffered during school closure. While educational events can be held using a video conferencing system, virtual peer interaction on such platforms has been perceived by medical trainees to be not as easy and personal as physical interaction [ 32 ]. The pandemic highlights the need to examine instructional strategies most suited to the current situation to support students’ engagement with faculty and classmates.

Furthermore, there is considerable concern from the academic community about the quality of online learning. Pre-pandemic, some faculty and students were already skeptical about the value of online learning [ 33 ]. The longer the pandemic lasts, the more they may question the value of online education, asking: Can online learning during the pandemic produce learning outcomes that are similar to face-to-face learning before the pandemic? Despite the documented benefits of online learning prior to the pandemic, the actual impact of online learning during the pandemic on students’ academic performance is still unknown due to reasons outlined below.

On one hand, several factors beyond the technology used could influence the effectiveness of online learning, one of which is the teaching context [ 34 ]. The sudden transition to online learning posed many challenges to faculty and students. Faculty may not have had adequate time to carefully design online courses to take full advantage of the possibilities of the online format. Some faculty may not have had prior online teaching experience and faced a steeper learning curve when it came to adopting online teaching methods [ 35 ]. Students may have been at risk of increased anxiety due to concerns about contracting the virus, on-time graduation, finances, and employment [ 36 , 37 ], which may have negatively impacted learning performance [ 38 ]. Therefore, whether online learning during the pandemic could produce learning outcomes similar to those of online learning implemented during more normal times remains to be determined.

Most existing studies on online learning in dental education during the pandemic have only reported students’ satisfaction. The actual impact of the online format on academic performance has not been empirically investigated. The few studies that have examined students’ learning outcomes have only used students’ self-reported data from surveys and focus groups. According to Kaczmarek et al. [ 24 ], 50 % of the participating dental faculty at Harvard University perceived student learning to have worsened during the pandemic and 70 % of the students felt the same. Abbasi et al. [ 21 ] reported that 86 % of medical and dental students in a Pakistan college felt that they learned less online. While student opinions are important, research has demonstrated a poor correlation between students’ perceived learning and actual learning gains [ 39 ]. As we continue to navigate the “new normal” in teaching, students’ learning performance needs to be empirically evaluated to help institutions gauge the impact of this grand online learning experiment.

Research purposes

In March 2020, the University of the Pacific Arthur A. Dugoni School of Dentistry, in the U.S., moved didactic instruction online to ensure the continuity of education during building closure. This study examined students’ acceptance of online learning during the pandemic and the factors impacting it, focusing on instructional practices pertaining to students’ engagement/interaction with faculty and classmates. Another purpose of this study was to empirically evaluate the impact of online learning during the pandemic on students’ actual course performance by comparing it with that of a pre-pandemic cohort. To understand the broader impact of the institution-wide online learning effort, we examined all online courses offered in summer quarter 2020 (July to September) that had a didactic component.

This is the first empirical study in dental education to evaluate students’ learning performance during the pandemic. The study aimed to answer the following three questions.

How well was online learning accepted by students during the summer quarter 2020 pandemic interruption?

How did instructional strategies, centered around students’ engagement with faculty and classmates, impact their acceptance of online learning?

How did online learning during summer quarter 2020 impact students’ course performance as compared with a previous analogous cohort who received face-to-face instruction in summer quarter 2019?

This study employed a quasi-experimental design. The study was approved by the university’s institutional review board (#2020-68).

Study context and participants

The study was conducted at the Arthur A. Dugoni School of Dentistry, University of the Pacific. The program runs on a quarter system. It offers a 3-year accelerated Doctor of Dental Surgery (DDS) program and a 2-year International Dental Studies (IDS) program for international dentists who have obtained a doctoral degree in dentistry from a country outside the U.S. and want to practice in the U.S. Students advance throughout the program in cohorts. IDS students take some courses together with their DDS peers. All three DDS classes (D1/DDS 2023, D2/DDS 2022, and D3/DDS 2021) and both IDS classes (I1/IDS 2022 and I2/IDS 2021) were invited to participate in the study. The number of students in each class was: D1 = 145, D2 = 143, D3 = 143, I1 = 26, and I2 = 25. This resulted in a total of 482 student participants.

During campus closure, faculty delivered remote instruction in various ways, including live online classes via Zoom [ 40 ], self-paced online modules on the school’s learning management system Canvas [ 41 ], or a combination of live and self-paced delivery. For self-paced modules, students studied assigned readings and/or viewings such as videos and pre-recorded slide presentations. Some faculty also developed self-paced online lessons with SoftChalk [ 42 ], a cloud-based platform that supports gamified learning through the insertion of various mini learning activities. The SoftChalk lessons were integrated with Canvas [ 41 ] and faculty could monitor students’ progress. After students completed the pre-assigned online materials, some faculty held virtual office hours or live online discussion sessions for students to ask questions and discuss key concepts.

Data collection and analysis

Student survey.

Students’ perceived effectiveness of summer quarter 2020 online courses was evaluated by the school’s Office of Academic Affairs in lieu of the regular course evaluation process. A total of 19 courses for DDS students and 10 courses for IDS students were evaluated. An 8-question survey developed by the researchers (Additional file 1 ) was administered online in the last week of summer quarter 2020. Course directors invited students to take the survey during live online classes. The survey introduction stated that taking the survey was voluntary and that their anonymous responses would be reported in aggregated form for research purposes. Students were invited to continue with the survey if they chose to participate; otherwise, they could exit the survey. The number of students in each class who took the survey was as follows: D1 ( n  = 142; 98 %), D2 ( n  = 133; 93 %), D3 ( n  = 61; 43 %), I1 ( n  = 23; 88 %), and I2 ( n  = 20; 80 %). This resulted in a total of 379 (79 %) respondents across all classes.

The survey questions were on a 4-point scale, with response options Strongly Disagree (1 point), Disagree (2 points), Agree (3 points), and Strongly Agree (4 points). Students were asked to rate each online course by responding to four statements: “I could fully engage with the instructor and classmates in this course”; “The online format of this course supported my learning”; “Overall this online course is effective”; and “I would have preferred face-to-face instruction for this course”. For the first three survey questions, a higher mean score indicated a more positive attitude toward the online course. For the fourth question, “I would have preferred face-to-face instruction for this course”, a higher mean score indicated that more students would have preferred face-to-face instruction for the course. Two additional survey questions asked students to select their preferred online delivery method for fully online courses during the pandemic from three given choices (synchronous online/live, asynchronous online/self-paced, and a combination of both), and to report whether they wanted to continue with some online instruction post pandemic. Finally, two open-ended questions at the end of the survey allowed students to comment on the aspects of the online format that they found helpful and to provide suggestions for improvement. For the purposes of this study, we focused on the quantitative data from the Likert-scale questions.

Descriptive data such as the mean scores were reported for each course. Regression analyses were conducted to examine the relationship between instructional strategies focusing on students’ engagement with faculty and classmates, and their overall perceived effectiveness of the online course. The independent variable was student responses to the question “ I could fully engage with the instructor and classmates in this course ”, and the dependent variable was their answer to the question “ Overall, this online course is effective .”
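As an illustration of this per-course analysis, the sketch below runs a simple regression of perceived effectiveness on perceived engagement within each course. The DataFrame `survey` and its column names are assumptions introduced for the example; the study's actual analysis files are not described in the paper.

```python
# Illustrative per-course simple regressions (assumed column names;
# not the study's actual code).
import statsmodels.formula.api as smf

def course_effect_sizes(survey):
    rows = []
    for course, sub in survey.groupby("course"):
        # Regress overall effectiveness rating on perceived engagement rating
        res = smf.ols("effectiveness ~ engagement", data=sub).fit()
        rows.append({
            "course": course,
            "r_squared": res.rsquared,             # effect size reported as r^2
            "p_value": res.pvalues["engagement"],  # significance of engagement
        })
    return rows
```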

Student course grades

Using Chi-square tests, student course grade distributions (A, B, C, D, and F) for summer quarter 2020 online courses were compared with that of a previous cohort who received face-to-face instruction for the same course in summer quarter 2019. Note that as a result of the school’s pre-doctoral curriculum redesign implemented in July 2019, not all courses offered in summer quarter 2020 were offered in the previous year in summer quarter 2019. In other words, some of the courses offered in summer quarter 2020 were new courses offered for the first time. Because these new courses did not have a previous face-to-face version to compare to, they were excluded from data analysis. For some other courses, while course content remained the same between 2019 and 2020, the sequence of course topics within the course had changed. These courses were also excluded from data analysis.

After excluding the aforementioned courses, a total of 17 “comparable” courses were included in data analysis (see the subsequent section). For these courses, the instructor, course content, and course goals were the same in both 2019 and 2020. The assessment methods and grading policies also remained the same through both years. For exams and quizzes, multiple-choice questions were the dominant format in both years. While some exam questions in 2020 differed from 2019, faculty reported that the overall exam difficulty level was similar. The main difference in assessment was testing conditions. The 2019 cohort took computer-based exams in the physical classroom with faculty proctoring, and the 2020 cohort took exams at home with remote proctoring to ensure exam integrity. The remote proctoring software monitored the student during the exam through a web camera on their computer/laptop, and the recorded video flagged suspicious activities for faculty review after exam completion.
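A cohort comparison of grade distributions of this kind can be run with a standard Chi-square test of independence, as in the hedged sketch below; the counts shown are invented purely for illustration and are not the study's data.

```python
# Hedged sketch of a Chi-square test comparing course letter-grade
# distributions between the 2019 face-to-face cohort and the 2020 online
# cohort. The counts below are hypothetical, for illustration only.
from scipy.stats import chi2_contingency

grade_counts = [
    [60, 55, 25],   # 2019 face-to-face cohort: A, B, C or below (hypothetical)
    [75, 45, 20],   # 2020 online cohort: A, B, C or below (hypothetical)
]
chi2, p_value, dof, expected = chi2_contingency(grade_counts)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")
```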

Students’ perceived effectiveness of online learning

Table  1 summarized data on DDS students’ perceived effectiveness of each online course during summer quarter 2020. For the survey question “ Overall, this online course is effective ”, the majority of courses received a mean score that was approaching or over 3 points on the 4-point scale, suggesting that online learning was generally well accepted by students. Despite overall positive online course experiences, for many of the courses examined, there was an equal split in student responses to the question “ I would have preferred face-to-face instruction for this course .” Additionally, for students’ preferred online delivery method for fully online courses, about half of the students in each class preferred a combination of synchronous and asynchronous online learning (see Fig.  1 ). Finally, the majority of students wanted faculty to continue with some online instruction post pandemic: D1 class (110; 78.60 %), D2 class (104; 80 %), and D3 class (49; 83.10 %).

While most online courses received favorable ratings, some variations did exist among courses. For D1 courses, “ Anatomy & Histology ” received lower ratings than others. This could be explained by its lab component, which didn’t lend itself as well to the online format. For D2 courses, several of them received lower ratings than others, especially for the survey question on students’ perceived engagement with faculty and classmates.

Figure 1. DDS students’ preferred online delivery method for fully online courses

Table  2 summarized IDS students’ perceived effectiveness of each online course during summer quarter 2020. For the survey question “ Overall, this online course is effective ”, all courses received a mean score that was approaching or over 3 points on a 4-point scale, suggesting that online learning was well accepted by students. For the survey question “ I would have preferred face-to-face instruction for this course ”, for most online courses examined, the percentage of students who would have preferred face-to-face instruction was similar to that of students who preferred online instruction for the course. Like their DDS peers, about half of the IDS students in each class also preferred a combination of synchronous and asynchronous online delivery for fully online courses (See Fig.  2 ). Finally, the majority of IDS students (I1, n = 18, 81.80 %; I2, n = 16, 84.20 %) wanted to continue with some online learning after the pandemic is over.

Figure 2. IDS students’ preferred online delivery method for fully online courses

Factors impacting students’ acceptance of online learning

For all 19 online courses taken by DDS students, regression analyses indicated that there was a significantly positive relationship between students’ perceived engagement with faculty and classmates and their perceived effectiveness of the course. The p value was 0.00 across all courses. The ranges of effect size (r²) were: D1 courses (0.26 to 0.50), D2 courses (0.39 to 0.65), and D3 courses (0.22 to 0.44), indicating moderate to high correlations across courses.

For 9 out of the 10 online courses taken by IDS students, there was a positive relationship between students’ perceived engagement with faculty and classmates and their perceived effectiveness of the course. The p value was 0.00 across courses. The ranges of effect size were: I1 courses (0.35 to 0.77) and I2 courses (0.47 to 0.63), indicating consistently high correlations across courses. The only course in which students’ perceived engagement with faculty and classmates did not predict perceived effectiveness of the course was “ Integrated Clinical Science III (ICS III) ”, which the I2 class took together with their D3 peers.

Impact of online learning on students’ course performance

Chi-square test results (Table  3 ) indicated that in 4 out of the 17 courses compared, the online cohort during summer quarter 2020 was more likely to receive an A grade than the face-to-face cohort during summer quarter 2019. In 12 of the courses, the online cohort was equally likely to receive an A grade as the face-to-face cohort. In the remaining course, the online cohort was less likely to receive an A grade than the face-to-face cohort.

Students’ acceptance of online learning during the pandemic

Survey results revealed that students had generally positive perceptions about online learning during the pandemic and that the majority of them wanted to continue with some online learning post pandemic. Overall, our findings supported several other studies in dental [ 18 , 20 ], medical [ 43 , 44 ], and nursing [ 45 ] education that have also reported students’ positive attitudes towards online learning during the pandemic. In their written comments in the survey, students cited enhanced flexibility as one of the greatest benefits of online learning. Some students also commented that typing questions in the chat box during live online classes was less intimidating than speaking in class. Others explicitly stated that not having to commute to/from school provided more time for sleep, which helped with self-care and mental health. Our findings are in line with previous studies which have also demonstrated that online learning offered higher flexibility [ 46 , 47 ]. Meanwhile, consistent with findings of other researchers [ 19 , 21 , 46 ], our students found it difficult to engage with faculty and classmates in several online courses.

There were some variations among individual courses in students’ acceptance of the online format. One factor that could partially account for the observed differences was instructional strategies. In particular, our regression analysis results demonstrated a positive correlation between students’ perceived engagement with faculty and classmates and their perceived overall effectiveness of the online course. Other aspects of course design might also have influenced students’ overall rating of the online course. For instance, some D2 students commented that the requirements of the course “ Integrated Case-based Seminars (ICS II) ” were not clear and that assessment did not align with lecture materials. It is important to remember that communicating course requirements clearly and aligning course content and assessment are principles that should be applied in any course, whether face-to-face or online. Our results highlighted the importance of providing faculty training on basic educational design principles and online learning design strategies. Furthermore, the nature of the course might also have impacted student ratings. For example, D1 course “ Anatomy and Histology ” had a lab component, which did not lend itself as well to the online format. Many students reported that it was difficult to see faculty’s live demonstration during Zoom lectures, which may have resulted in a lower student satisfaction rating.

As for students’ preferred online delivery method for fully online courses during the pandemic, about half of them preferred a combination of synchronous and asynchronous online learning. In light of this finding, as we continue with remote learning until public health directives allow a return to campus, we will encourage faculty to integrate these two online delivery modalities. Finally, in view of the result that over 80 % of the students wanted to continue with some online instruction after the pandemic, the school will advocate for blended learning in the post-pandemic world [ 48 ]. For future face-to-face courses on campus after the pandemic, faculty are encouraged to deliver some content online to reduce classroom seat time and make learning more flexible. Taken together, our findings not only add to the overall picture of the current situation but may inform learning design moving forward.

Role of online engagement and interaction

To reiterate, we found that students’ perceived engagement with faculty and classmates predicted their perceived overall effectiveness of the online course. This aligns with the larger literature on best practices in online learning design. Extensive research prior to the pandemic has confirmed that the effectiveness of online learning is determined by a number of factors beyond the tools used, including students’ interactions with the instructor and classmates [ 49 , 50 , 51 , 52 ]. Online students may feel isolated due to reduced or absent interaction [ 53 , 54 ]. Therefore, in designing online learning experiences, it is important to remember that learning is a social process [ 55 ]. Faculty’s role is not only to transmit content but also to promote the different types of interactions that are an integral part of the online learning process [ 33 ]. The online teaching model in which faculty upload materials online but teach them in the same way as in the physical classroom, without special effort to engage students, does not make the best use of the online format. Putting the “sage on the screen” during a live class meeting on a video conferencing system is no different from the “sage on the stage” in the physical classroom: both provide limited space for engagement. Such one-way monologue devalues the potential that online learning presents.

In light of the critical role that social interaction plays in online learning, faculty are encouraged to use the interactive features of online learning platforms to provide clear channels for student-instructor and student-student interactions. In the open-ended comments, students highlighted several instructional strategies that they perceived to be helpful for learning. For live online classes, these included conducting breakout room activities, using the chat box to facilitate discussions, polling, and integrating gameplay with apps such as Kahoot! [ 56 ]. For self-paced classes, students appreciated that faculty held virtual office hours or subsequent live online discussion sessions to reinforce understanding of the pre-assigned materials.

Quality of online education during the pandemic

This study provided empirical evidence in dental education that it was possible to ensure the continuity of education without sacrificing the quality of education provided to students during the forced migration to distance learning upon building closure. To reiterate, in all but one online course offered in summer quarter 2020, students were equally or more likely to get an A grade than the face-to-face cohort from summer quarter 2019. Even for courses that had less student support for the online format (e.g., the D1 course “ Anatomy and Histology ”), there was a significant increase in the number of students who earned an A grade in 2020 as compared with the previous year. The reduced capacity for technical training during the pandemic may have resulted in more study time for didactic content. Overall, our results resonate with several studies in health sciences education before the pandemic showing that the quality of learning is comparable in face-to-face and online formats [ 9 , 57 , 58 ]. For the only course ( Integrated Case-based Seminars ICS II) in which the online cohort performed worse than the face-to-face cohort, as mentioned earlier, students reported that assessment was not aligned with course materials and that course expectations were not clear. This might explain why students’ course performance was not as strong as expected.

Limitations

This study used a pre-existing control group from the previous year. There may have been individual differences between students in the online and the face-to-face cohorts, such as motivation, learning style, and prior knowledge, that could have impacted the observed outcomes. Additionally, even though course content and assessment methods were largely the same in 2019 and 2020, changes in other aspects of the course could have impacted students’ course performance. Some faculty may have been more compassionate with grading (e.g., more flexible with assignment deadlines) in summer quarter 2020 given the hardship students experienced during the pandemic. On the other hand, remote proctoring in summer quarter 2020 may have heightened some students’ exam anxiety knowing that they were being monitored through a webcam. The existence and magnitude of effect of these factors needs to be further investigated.

This present study only examined the correlation between students’ perceived online engagement and their perceived overall effectiveness of the online course. Other factors that might impact their acceptance of the online format need to be further researched in future studies. Another future direction is to examine how students’ perceived online engagement correlates with their actual course performance. Because the survey data collected for our present study are anonymous, we cannot match students’ perceived online engagement data with their course grades to run this additional analysis. It should also be noted that this study was focused on didactic online instruction. Future studies might examine how technical training was impacted during the COVID building closure. It was also out of the scope of this study to examine how student characteristics, especially high and low academic performance as reflected by individual grades, affect students’ online learning experience and performance. We plan to conduct a follow-up study to examine which group of students was most impacted by the online format. Finally, this study was conducted in a single dental school, and so the findings may not be generalizable to other schools and disciplines. Future studies could be conducted in other schools or disciplines to compare results.

This study revealed that dental students had generally favorable attitudes towards online learning during the COVID-19 pandemic and that their perceived engagement with faculty and classmates predicted their acceptance of the online course. Most notably, this is the first study in dental education to demonstrate that online learning during the pandemic could achieve similar or better learning outcomes than face-to-face learning before the pandemic. Findings of our study could contribute significantly to the literature on online learning during the COVID-19 pandemic in health sciences education. The results could also inform future online learning design as we re-envision the future of online learning.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Bello G, Pennisi MA, Maviglia R, Maggiore SM, Bocci MG, Montini L, et al. Online vs live methods for teaching difficult airway management to anesthesiology residents. Intensive Care Med. 2005; 31 (4): 547–552.


Ruiz JG, Mintzer MJ, Leipzig RM. The impact of e-learning in medical education. Acad Med. 2006; 81(3): 207–12.

Kavadella A, Tsiklakis K, Vougiouklakis G, Lionarakis A. Evaluation of a blended learning course for teaching oral radiology to undergraduate dental students. Eur J Dent Educ. 2012; 16(1): 88–95.

de Jong N, Verstegen DL, Tan FS, O’Connor SJ. A comparison of classroom and online asynchronous problem-based learning for students undertaking statistics training as part of a public health master’s degree. Adv Health Sci Educ. 2013; 18(2):245–64.

Hegeman JS. Using instructor-generated video lectures in online mathematics coursesimproves student learning. Online Learn. 2015;19(3):70–87.

Gaupp R, Körner M, Fabry G. Effects of a case-based interactive e-learning course on knowledge and attitudes about patient safety: a quasi-experimental study with third-year medical students. BMC Med Educ. 2016; 16(1):172.

Zheng M, Bender D, Reid L, Milani J. An interactive online approach to teaching evidence-based dentistry with Web 2.0 technology. J Dent Educ. 2017; 81(8): 995–1003.

Means B, Toyama Y, Murphy R, Bakia M, Jones K. Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. U.S. Department of Education, Office of Planning, Evaluation and Policy Development. Washington D.C. 2009.


Pei L, Wu H. Does online learning work better than offline learning in undergraduate medical education? A systematic review and meta-analysis. Med Educ Online. 2019; 24(1):1666538.

Andrews KG, Demps EL. Distance education in the U.S. and Canadian undergraduate dental curriculum. J Dent Educ. 2003; 67(4):427–38.

Kassebaum DK, Hendricson WD, Taft T, Haden NK. The dental curriculum at North American dental institutions in 2002–03: a survey of current structure, recent innovations, and planned changes. J Dent Educ. 2004; 68(9):914–931.

Haden NK, Hendricson WD, Kassebaum DK, Ranney RR, Weinstein G, Anderson EL, et al. Curriculum changes in dental education, 2003–09. J Dent Educ. 2010; 74(5):539–57.

DeBate RD, Cragun D, Severson HH, Shaw T, Christiansen S, Koerber A, et al. Factors for increasing adoption of e-courses among dental and dental hygiene faculty members. J Dent Educ. 2011; 75 (5): 589–597.

Saeed SG, Bain J, Khoo E, Siqueira WL. COVID-19: Finding silver linings for dental education. J Dent Educ. 2020; 84(10):1060–1063.

Schlenz MA, Schmidt A, Wöstmann B, Krämer N, Schulz-Weidner N. Students’ and lecturers’ perspective on the implementation of online learning in dental education due to SARS-CoV-2 (COVID-19): a cross-sectional study. BMC Med Educ. 2020;20(1):1–7.

Donn J, Scott JA, Binnie V, Bell A. A pilot of a virtual Objective Structured Clinical Examination in dental education. A response to COVID-19. Eur J Dent Educ. 2020; https://doi.org/10.1111/eje.12624

Hung M, Licari FW, Hon ES, Lauren E, Su S, Birmingham WC, Wadsworth LL, Lassetter JH, Graff TC, Harman W, et al. In an era of uncertainty: impact of COVID-19 on dental education. J Dent Educ. 2020; 85 (2): 148–156.

Sadid-Zadeh R, Wee A, Li R, Somogyi‐Ganss E. Audience and presenter comparison of live web‐based lectures and traditional classroom lectures during the COVID‐19 pandemic. J Prosthodont. 2020. doi: https://doi.org/10.1111/jopr.13301

Wang K, Zhang L, Ye L. A nationwide survey of online teaching strategies in dental education in China. J Dent Educ. 2020; 85 (2): 128–134.


Acknowledgements

Not applicable.

Authors’ information

MZ is an Associate Professor of Learning Sciences and Senior Instructional Designer at the School of Dentistry, University of the Pacific. She holds a PhD in Education with a specialty in learning sciences and technology, and has dedicated her career to research on online learning, learning technology, and faculty development. Her work has resulted in several peer-reviewed publications in medical, dental, and educational technology journals, and she presents regularly at national conferences.

DB is an Assistant Dean for Academic Affairs at the School of Dentistry, University of the Pacific. He holds an EdD with a concentration in learning and instruction. Over the past decades, DB has overseen and delivered pedagogical development programs for dental faculty. His research interests lie in educational leadership and instructional innovation. DB has co-authored several peer-reviewed publications in health sciences education and presents regularly at national conferences.

CL is Associate Dean of Oral Healthcare Education at the School of Dentistry, University of the Pacific. She holds a Doctor of Dental Surgery (DDS) degree and an EdD with a focus on educational leadership. Her professional interests lie in educational leadership, oral healthcare education innovation, and faculty development. CL has co-authored several publications in peer-reviewed health sciences education journals and presents regularly at national conferences.

Author information

Authors and Affiliations

Office of Academic Affairs, Arthur A. Dugoni School of Dentistry, University of the Pacific, San Francisco, CA, USA

Meixun Zheng, Daniel Bender & Cindy Lyon


Contributions

MZ analyzed the data and wrote the initial draft of the manuscript. DB and CL assisted with research design and data collection, and both reviewed and edited the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Meixun Zheng.

Ethics declarations

Ethics approval and consent to participate

The study was approved by the institutional review board at the University of the Pacific in the U.S. (#2020-68). Informed consent was obtained from all participants. All methods were carried out in accordance with relevant guidelines and regulations.

Consent for publication

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1:

Survey of online courses during COVID-19 pandemic.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article

Zheng, M., Bender, D. & Lyon, C. Online learning during COVID-19 produced equivalent or better student course performance as compared with pre-pandemic: empirical evidence from a school-wide comparative study. BMC Med Educ 21, 495 (2021). https://doi.org/10.1186/s12909-021-02909-z


Received: 31 March 2021

Accepted: 26 August 2021

Published: 16 September 2021

DOI: https://doi.org/10.1186/s12909-021-02909-z


Keywords

  • Dental education
  • Online learning
  • COVID-19 pandemic
  • Instructional strategies
  • Interaction
  • Learning performance

