Introduction

The COVID-19 pandemic began sweeping the globe in early 2020, reshaping education worldwide and preventing millions of students from attending school in person (UNESCO, 2021). The shift from in-person to remote education was massive and abrupt. Moreover, because few instructors around the world were prepared for the changes caused by the pandemic, they experienced many difficulties implementing remote courses (Hodges et al., 2020). Consequently, various remote teaching strategies that did not require instructors and students to be in the same room emerged in different educational contexts, including strategies for delivering audio-visual materials to students, facilitating synchronous interaction, and guiding student-paced assignments at home (Ramachandran & Rodriguez, 2020; Salta et al., 2021; Wut & Xu, 2021). Scholars are concerned that remote teaching may affect and likely hamper students’ learning outcomes (Reimers & Schleicher, 2020).

In this context, university science laboratory courses (LCs) are worth examining. Since the 1980s, science educators have considered hands-on LCs the most appropriate method for supporting students’ learning processes (Hofstein & Lunetta, 2004). Hands-on LCs can provide students with opportunities to apply knowledge learned in theoretical classes to experimental processes, enabling them to acquire more contextualized knowledge, improve their process skills, and obtain a better understanding of the nature of science (Domin, 1999; Hofstein & Lunetta, 2004; Lee & Hong, 2021). In LCs, students usually attend labs, conduct experiments with peers, and write lab reports describing experimental procedures and study results. Preparing for hands-on LCs is burdensome for instructors and requires many resources (Hofstein & Lunetta, 2004). As a result, institutions of higher education are generally considered a more appropriate setting for offering LCs than K-12 institutions (Lowe et al., 2013). In particular, in East Asian countries such as Korea, Japan, and China, where high school science education tends to emphasize memorizing scientific facts rather than learning through experimentation, university LCs offer students an important first introduction to science as a process (Rice et al., 2009). At universities, students studying the disciplines of science and engineering are generally required to take introductory-level science labs in physics, chemistry, biology, or earth science during their first year and then take major-specific lab courses in subsequent years. These courses are intended to help students acquire the scientific knowledge, skills, and attitudes towards science that are necessary for them to achieve their professional or vocational aspirations after graduation (Domin, 1999; Reid & Shah, 2007).
This understanding of hands-on LCs is common in most science and engineering departments at universities across the world, and it has changed little over time (Reid & Shah, 2007; Lee et al., in press).

With the outbreak of the COVID-19 pandemic in 2020 and the implementation of social distancing measures, however, existing LCs had to shift to remote laboratory courses (RLCs) (Ray & Srivastava, 2020). For this study, we define RLCs as a type of minds-on LC implemented in a remote e-learning setting without hands-on experience. In RLCs, teaching and learning are mediated by computers through videos, simulations, and augmented and virtual reality (Lee & Hong, 2021; Tho et al., 2017). During the pandemic, students at Korean universities were unable to attend laboratory classes. Instead, students suddenly found themselves in RLC learning environments where they primarily watched videos of experiments recorded by instructors and teaching assistants (TAs). Moreover, students wrote lab reports without being able to obtain firsthand experience with the apparatuses and equipment used in experiments, and they completed tasks with little or no peer interaction (Jang et al., 2020). University science instructors and students viewed this lack of hands-on experience as problematic (Lee et al., in press; Jang et al., 2020).

Subsequently, university science instructors around the world began developing solutions to these challenges (Petillion & McNeil, 2020; Blizak et al., 2020; Youssef et al., 2020; Lee et al., in press); they sought to change how RLCs were being taught by introducing more appropriate teaching methods. For example, several studies describe how university science instructors implemented RLCs that utilized enhanced media materials, provided opportunities for instructor–student and student–student interactions, and implemented modified student-paced homework assignments, including lab report writing (Kelley, 2020; Nguyen et al., 2021; Youssef et al., 2020). Other studies have reported that some RLCs incorporated cutting-edge technologies, such as augmented and virtual reality (Hu-Au & Okita, 2021), to supplement students’ science lab learning experiences.

However, the abovementioned studies were limited in their capacity to explain which RLC teaching strategies may be most effective. There are two reasons for this shortcoming. First, very few studies compared different RLC teaching strategies; they either compared just a few RLCs or focused only on introductory or major-specific courses. As university LCs are essential throughout undergraduate coursework, comparing as many RLC practices as possible is recommended. Second, almost none of these studies considered RLC issues in light of the hands-on versus minds-on debate among scholars in the field of science education (Lee et al., in press; Lee & Hong, 2021). It is necessary to reemphasize that most science educators have advocated for the cognitive benefits of engaging students in in-person LCs rather than providing only theoretical explanations of scientific concepts (Hofstein & Lunetta, 2004; Reid & Shah, 2007). To enhance the effectiveness of RLCs for future science education, researchers must examine the potential merits of minds-on learning compared to hands-on learning. Therefore, we conducted an empirical investigation of the use of RLCs at a large public university in the Republic of Korea (hereafter, “Korea”) during the COVID-19 pandemic. Specifically, we compared different methods of adapting introductory physics, chemistry, biology, earth science, and major-specific science and engineering LCs into RLCs for undergraduate students at a large public university.

This research sought to accomplish the following three goals: (a) identify and document the RLC teaching strategies implemented for science and engineering courses that emerged in response to the pandemic; (b) measure students’ perceptions of these courses and consider how the use of different strategies for preparing media, facilitating instructor–student and student-peer interactions, and assessing student performance affected students’ learning experiences in different ways; and (c) offer practical suggestions for university science instructors who may continue to use RLCs even after the pandemic.

Research questions

1. How did university students generally perceive their learning experiences in remote laboratory courses?

2. How did university students’ perceptions of remote laboratory courses differ according to the teaching strategies implemented by the instructor?

Literature review

RLCs possess characteristics of both LCs and e-learning (Lee & Hong, 2021). Therefore, it is necessary to consider both types of learning environments before examining the RLC practices implemented in 2020. In this study, we focus on how to deliver course content, promote instructor–student and student–student interactions, and conduct assessments and provide feedback in LCs and e-learning situations. We then discuss the (re)emergence of RLCs since the start of the COVID-19 pandemic.

Methods for delivering course content in science laboratory courses

LCs are commonly used in university-level science courses, and the two major categories of LCs are hands-on and minds-on LCs (Hart et al., 2000; Hofstein & Lunetta, 2004). Traditionally, LCs have emphasized hands-on, physical experiences that allow students to handle apparatuses, equipment, and materials while engaging in experiments (Hofstein & Lunetta, 2004). However, some researchers have questioned the effectiveness of hands-on learning for students, noting that physical engagement does not automatically result in positive learning outcomes. Instead, these researchers advocate for minds-on science learning, arguing that inviting students to engage mentally with learning material can be as or more effective than hands-on inquiry (Hofstein & Lunetta, 2004; Reid & Shah, 2007).

In minds-on courses, instructors may use videos and animations to provide students with demonstrations and simulations that substitute actual hands-on experiments (Ma & Nickerson, 2006). Subsequently, instructors engage students through a minds-on inquiry process that involves students interpreting experimental phenomena and data by applying scientific theories (Rice et al., 2009). Although hands-on LCs can teach students lab techniques, they require many resources and a considerable amount of work to instruct the large numbers of students typically enrolled in introductory level courses (Ma & Nickerson, 2006). If the aim of a course is to encourage student inquiry rather than teach operational hands-on skills, then the minds-on method may be more efficient. Indeed, the minds-on method utilizes fewer resources on university campuses, such as time, chemical reagents and specimens, equipment, and precious laboratory spaces (Reid & Shah, 2007).

The importance of instructor–peer interactions in science laboratories

Science education researchers have emphasized that inquiry-based experimental courses must incorporate constructive, learner-centered activities that are either cognitive or socio-cultural. That is, inquiry-based experimental courses should involve interaction and cooperation among multiple learners, with instructors guiding students to form scientific understandings based on experimental results. Informal atmospheres that encourage students to interact with instructors and peers can create a positive and collaborative learning environment for inquiry-based courses (Lunetta et al., 2007). In particular, in inquiry-based experimental courses, peer interactions, such as brainstorming sessions and decision-making discussions, can enable teachers to observe students and better understand their thinking process. This, in turn, can enable instructors to help students learn and understand scientific concepts and to assess individual students more effectively (Lunetta et al., 2007). Moreover, student-centered class configurations can increase lab participation (French & Russell, 2006), and they have been proven to engage students more actively by encouraging students to clarify and defend their claims with evidence (Okada & Simon, 1997).

Challenges when writing lab reports

University science courses often involve student lab report writing assignments as part of the inquiry process; this task helps students learn how to take effective observation notes and practice their scientific writing. Lab reports generally consist of the following sections: introduction, theoretical background, procedure, results, discussion, and references. This type of writing has a very particular structure that is similar to academic writing for scientific journals. However, students often have difficulty writing lab reports. For instance, because lab reports contain content related to specific topics within a specific domain, students can find it difficult to interpret experimental data and present their conclusions in ways that reflect the concepts as understood in that domain (Kalaskas, 2013). Students also find it difficult to write using the scientific style required for laboratory reports, as this genre differs significantly from other writing styles taught in secondary school. For example, when describing research results, students may find it difficult to construct passive sentences (Abulazain, 2019) or create appropriate visual representations of the data, such as tables, figures, and graphs. However, research has shown that establishing clear writing expectations and allowing for continuous feedback between instructors and peers can help students overcome these difficulties (Ahmad et al., 2019; Cho & MacArthur, 2011).

Teaching strategies for e-learning

In this section, we introduce studies demonstrating that the media and methods used to convey content in e-learning contexts have a considerable impact on students’ learning. E-learning often involves delivering multimedia learning materials, including photos, animations, and videos. Many studies have focused on optimizing the design principles of multimedia materials to enhance the presentation of learning content. Research on instructional video materials for science and engineering courses has identified several factors that affect how well students learn from videos. For example, Mayer et al. (2020) found that students learn better from videos filmed from a first-person rather than a third-person perspective.

The importance of promoting interactions between participants during e-learning has been well documented in the computer-supported collaborative learning field (Clark & Mayer, 2016). Regarding time, interactions can be categorized into synchronous and asynchronous interactions. Synchronous methods enable more interaction than asynchronous methods (Clark & Mayer, 2016). In terms of participants, interactions can be classified as either instructor–learner or learner–learner interactions (Wut & Xu, 2021). Common web-based tools that support interaction by providing learning materials and that offer instructional support are called learning management systems (LMSs). The interactions between e-learning participants promoted by LMSs help reduce dropout rates and result in more effective e-learning courses (Lee et al., 2019).

Since e-learning lacks face-to-face classroom interaction, learning relies heavily on assignments, typically homework. Clark and Mayer (2016) mention that using appropriately designed assignments and working in collaborative groups can maximize learning. They further argue that solving challenging tasks via collaboration can enhance constructivist learning opportunities for students, regardless of whether the learning environment is synchronous or asynchronous.

The (re)emergence of remote laboratory courses in response to the COVID-19 outbreak

RLCs have been implemented across the globe in response to COVID-19 social distancing requirements. However, RLCs have been utilized for some university science and engineering courses for over a decade (Brinson, 2015; Lowe et al., 2013). Even before the pandemic, RLCs were seen as alternatives to in-person LCs that allowed instructors and students to overcome the time, space, and resource demands of physical labs (Lee & Hong, 2021; Lee et al., in press; Youssef et al., 2020; Tho et al., 2017). Many existing studies on RLCs were conducted by analyzing carefully designed instructional materials rather than by collecting data in realistic environments; moreover, most studies are limited to the fields of engineering and physics, with few studies conducted in the natural sciences (Brinson, 2017). Currently, there are no clear standards or frameworks for evaluating and comparing the efficacy of RLCs and in-person LCs (Lowe et al., 2013; Ma & Nickerson, 2006; Tho et al., 2017).

With the outbreak of the COVID-19 pandemic, RLCs reemerged worldwide as an alternative to in-person teaching and learning. Previously, remote teaching had been perceived as inefficient by many science instructors (Salta et al., 2021), and RLCs were seen as producing undesirable outcomes (Jang et al., 2020). Studies published by university chemistry educators early on in the pandemic (Blizak et al., 2020; Petillion & McNeil, 2020) demonstrated that students generally reported negative RLC experiences in terms of motivation, engagement, study pacing, and interactions between instructors and students.

However, as the pandemic continued, researchers began exploring the potential of RLCs to utilize innovative teaching and learning methods (West et al., 2021). For example, Sung et al. (2021) developed and implemented the “Remote Labs 2.0” model, in which instructors conducted experiments while viewing students’ real-time feedback regarding the experiment, to which instructors could immediately respond. They reported that this RLC system improved student engagement and provided students with a sense of telepresence. Hu-Au and Okita (2021) explored differences in student learning and behavior in real and virtual chemistry laboratory environments. They found that students’ abilities to learn general content knowledge, laboratory skills, and procedure-related safety behaviors were comparable between the two approaches. Lee and Hong (2021), who also contributed to this paper, interviewed ten science education experts about their perceptions of RLCs. They found that the decrease in hands-on experiences and face-to-face interactions, as well as the increase in the instructor’s burden to prepare materials, were seen as the main weaknesses of RLCs. The researchers in these studies attempted to improve their RLCs in various ways by considering methods for replacing the hands-on elements of LCs with minds-on activities. This study builds on existing research by sharing the findings of an investigation of several RLCs implemented for undergraduate students at a large public university. We anticipate that the study will yield generalizable and practical findings that benefit university science educators.

Research field

The outbreak of COVID-19 and its influence on the Korean university education system

Like other countries, Korea and its education sector were significantly impacted by the COVID-19 pandemic. After the first confirmed case of COVID-19 was reported on January 20, 2020, the Korean Ministry of Education (2020) immediately initiated a response in February. For all tertiary education institutes, the government recommended postponing the start of the spring semester by four weeks; when the semester did begin, all classes had to be conducted remotely. Pursuant to this regulation, all universities in Korea were required to begin offering remote classes in the third week of March, and all laboratory classes also had to be conducted remotely.

Hankuk University

Our research was conducted at Hankuk University (this is a pseudonym) in Korea. Each year, approximately 3,000 new students begin attending the university, and nearly half of these students are enrolled in science, technology, engineering, and mathematics (STEM) majors. In order to graduate as a STEM major, students must take at least one introductory science course and its respective LC, such as physics lab, chemistry lab, biology lab, or earth science lab (Table 1), and they must continue to take a variety of labs as part of their major coursework during subsequent years. Hankuk University has a system that organizes the mass implementation of courses for first-year students. Each semester, multiple sections of a single course are offered, and the content taught is the same for all sections of the course. For each course, an instructor oversees multiple TAs who teach all the sections using the same instructional materials and syllabus. Therefore, each student’s learning experience during each introductory LC is supposed to be the same. Most of the learning content is rather traditional and uses cookbook-like experiments (Appendix 1), meaning the results of the experiments are expected to demonstrate theories that are being taught in the course. The LCs offered at Hankuk University cover approximately ten topics per semester.

Table 1 Number of students who completed an introductory LC in the 2020 academic year

Teaching strategies used in remote laboratory courses in 2020

To explore how instructors implemented RLCs at Hankuk University in 2020 in response to the emergency situation created by the COVID-19 pandemic, we interviewed ten instructors who were teaching introductory physics, chemistry, biology, earth science, and other major-level RLCs (Lee et al., in press). We also accessed and analyzed all course-related documents from before and after the pandemic, including syllabuses, assessment tools, assignment descriptions, and teaching materials. For each semester in 2020, we found that the RLCs were similar in that they (1) changed the content of the experiments every week, (2) provided no hands-on experience to students, and (3) assigned lab report writing assignments. However, instructors used different remote teaching strategies in terms of how they utilized media, promoted interaction, and assessed and guided students with their assignments, such as lab reports. We summarized the emergent teaching strategies used in each RLC in Fig. 1, with a focus on introductory courses.

Fig. 1 Emergent teaching strategies used in RLCs at Hankuk University in the spring semester of 2020

In the sections that follow, we first explicate the differences in materials and strategies used by TAs during the RLCs. Rather than focusing on why these RLCs differed, we describe how the materials and strategies impacted the course structure and student learning experiences. Following this introduction, which offers context for the differences between the courses, we describe the research methodology and discuss the study findings.

Media preparation

Most TAs teaching RLCs provided students with videos of experiments that could be viewed repeatedly. However, only physics and biology labs provided newly recorded and edited videos of experiments that featured the actual TA who was teaching the course. For the chemistry labs, the TAs only provided short introductory videos that were recorded more than ten years ago. The earth science lab TAs did not record or provide students with videos of experiments; instead, they directly instructed students in real time using the videoconferencing tool Zoom. Notably, major course labs, such as the Animal Science Lab and the Pharmaceutical Lab, were streamed live, with TAs conducting live experiments for several hours without any editing.

Aspects of interaction

Only physics and earth science labs required real-time attendance via Zoom. In physics labs, the TAs and students watched videos simultaneously and then engaged in Q&A sessions about what they had seen. In the earth science labs, the TAs conducted Zoom sessions with students and directly instructed students about the content and tasks related to weekly topics. For chemistry and biology labs, students did not attend synchronous online sessions; instead, they were asked to access videos and course materials that TAs prepared and then uploaded to the LMS. In contrast to introductory courses, TAs of several major LCs assigned group work to students. For example, the Insect Diagnostics Lab required groups of students to prepare specimen samples of insects they were asked to collect from a nearby mountain. A Materials Lab required groups of students to read articles, process experimental data, and make presentations.

Assessment and feedback

No RLCs utilized quizzes before experimentation; instead, they all relied on written lab reports for assessment. The physics and biology lab TAs provided students with experimental raw data that had been collected by the TAs. The TAs for the chemistry labs, in contrast, required students to search for additional information and solve theoretical problem sets without using experimental data. The TAs for the earth science labs required students to download relevant data from repositories and analyze it. Only the earth science lab TAs instructed students on how to write lab reports; they also provided weekly feedback to students. Meanwhile, the biology lab TAs greatly reduced the number of lab report submissions usually required and instead had students give five-minute presentations on a biology topic of interest to them. Most major LCs in this study required students to submit weekly lab reports. No other meaningful differences were noted.

Methods

We used an explanatory mixed-methods design (Fetters et al., 2013) to investigate university students’ perceptions of various RLCs at Hankuk University during the 2020 school year. A mixed-methods design can incorporate quantitative and qualitative data and enables the examination of complex processes and systems. For these reasons, it was appropriate for this paper’s object of study (Creswell, 2012). In this study, a quantitative online survey preceded the qualitative interviews. Although the qualitative data were used to complement the quantitative data, we integrated both forms of data to better illuminate university students’ perceptions of RLCs.

We recruited students who took the remote physics lab, chemistry lab, biology lab, earth science lab, and major course labs in the spring semester of 2020 when the COVID-19 pandemic first started and RLCs were first implemented. However, to secure an appropriate number of participants, we also recruited students who had taken the earth science lab in the fall semester to take the online survey (Table 1). It is important to note that the earth science lab is mainly offered in the fall semester, and its implementation method did not change between the 2020 spring and fall semesters. Therefore, we included students from both the spring and fall semester courses, as we determined that this would not negatively impact the results of the study.

Phase 1: quantitative online survey

The online survey was conducted in the fall semester from September to December 2020. We recruited participants through a bulk e-mail system sent to all students and two online communities at Hankuk University. We allowed the same student to take the survey multiple times when responding to questions about different subjects. For instance, if a student took a physics lab and a chemistry lab, they could respond to the survey twice. A total of 338 responses were collected from 308 students (Tables 2 and 3).

Table 2 Number of responses to the online survey and follow-up interview
Table 3 Number of responses to the online survey and follow-up interview by college

The survey items were designed to explore students’ general perceptions of the RLCs implemented at Hankuk University in the 2020 school year. The survey items were generated via repeated discussions among one expert, two doctoral students, and one master’s student in science education. The survey included items designed to collect demographic and course information, and it allowed participants to skip items asking about the use of video in cases where the RLC did not use videos. In total, 58 respondents skipped the video-related items (earth science labs N = 34; major course labs N = 15; chemistry labs N = 6; and physics labs N = 3). As the TAs and students in the physics and chemistry labs reported that videos had been provided in those courses, we concluded that the nine respondents who skipped the video-related items for the physics and chemistry labs had probably done so mistakenly; therefore, we excluded their responses from our analysis. There were no other items with missing values.

The survey included 30 items on a 4-point Likert scale that were divided into ten categories with three items each (Table 4). The items were adapted from a review of the literature relevant to laboratory and e-learning courses. Items in the video satisfaction category asked about the audio-visual quality and editing of the videos (Clark & Mayer, 2016). Next, learning outcome expectation items asked about the levels of knowledge, skills, and attitudes towards science the students expected to gain from each RLC (Domin, 1999; LaBay & Comm, 2004). The learning outcome satisfaction items measured whether students believed they actually achieved the outcomes that they anticipated (Lee, 2014). The class participation items asked about student engagement in the course (Russell & French, 2001; Lee et al., 2019), and the class preparation items asked students how much time they spent preparing for each weekly session (Glynn & Koballa, 2006). The experience during class items asked whether students felt they gained sufficient experience in terms of the tangible laboratory components (e.g., equipment) and the content and processes of the experiment, as well as whether they gained sufficient exposure to interpretations and discussions regarding data (Rice et al., 2009). The use of LMS items asked how much the online system helped students manage their course materials and engage in interactive discussions (McBrien et al., 2009; Rahman & Sahibuddin, 2010). The interaction with instructors and colleagues items asked to what degree instructor–student and student–student cooperation were encouraged and about the quality of those interactions (Ni, 2013). The lab report writing items asked whether students were able to access the necessary information, receive help with scientific writing, and obtain appropriate feedback (Nguyen et al., 2021; Rice et al., 2009).
Finally, the evaluation items asked whether students perceived the assessments as reasonable and whether clear assessment criteria were provided that allowed students to raise objections (Fig. 2; Appendix 2). The reliability (Cronbach’s α) of the items in each category ranged from 0.73 to 0.86, and the overall reliability of the survey was 0.92 (Table 4).
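The reliability figures above follow the standard Cronbach’s α formula, α = k/(k − 1) · (1 − Σσ²ᵢₜₑₘ / σ²ₜₒₜₐₗ). As a minimal sketch of this computation (in Python rather than the Stata used in the study, and with invented Likert responses, since the item-level data are not published), one three-item category could be checked like this:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                             # number of items
    item_vars = items.var(axis=0, ddof=1).sum()    # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Illustrative 4-point Likert responses for one three-item category
# (five hypothetical respondents; not the study's actual data).
scores = np.array([
    [4, 4, 3],
    [3, 3, 3],
    [2, 3, 2],
    [4, 3, 4],
    [1, 2, 2],
])
alpha = cronbach_alpha(scores)  # ≈ 0.86 for this invented matrix
```

Values near or above 0.7, like those reported for each category, are conventionally read as acceptable internal consistency.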

Table 4 Number of items per category in the online survey
Fig. 2 Examples of online survey items (see Appendix 2 for all items)

We also included open-ended response questions for several categories to allow students to freely describe their RLC experiences and, when possible, compare their RLC experiences with their expectations of hands-on LCs. The open-ended response items also asked students to describe the pros and cons of the RLC they experienced and offer suggestions for revising future RLCs. All respondents to the online survey answered the open-ended questions (N = 338).

Phase 2: qualitative interview

We interviewed 18 students who gave their consent when completing the online survey. The interviews were conducted from November 2020 to January 2021. As some of these students were enrolled in multiple RLCs in 2020, we were able to collect student reflections on 27 course experiences despite interviewing only 18 students (Tables 2 and 3).

The interviews were semi-structured, and the questions corresponded to the topics in the online survey to enable the interviews to complement the survey results. For example, we asked questions such as “What did you expect to learn from the course?” “How did you perceive the videos you were provided?” “How satisfied were you with the learning outcomes?” And, “How were your interactions and/or collaborations with your peers?” We asked students to elaborate on why they felt the way they did, to share their perceptions of the pros and cons of RLCs, and to offer their personal recommendations for how to improve RLCs. All participants were interviewed individually, and the interviews lasted about 40 min each. Some were interviewed in-person, and others were interviewed remotely via Zoom. All the interviews were audio-recorded and transcribed, and the texts were then analyzed.

Data analysis

For the online survey, the three items for each perception category were averaged to yield descriptive statistics. The mean perception scores of RLCs in each science discipline were compared via analysis of variance (ANOVA) to test for significant differences, followed by Bonferroni post hoc tests. The null hypothesis of equal variances for every dependent variable was retained under Bartlett’s test (p > 0.05). We used Stata 16 for all quantitative analyses.
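The analysis pipeline can be illustrated as follows. The study used Stata 16; the Python sketch below is only a schematic equivalent, and the group scores are synthetic normal draws whose means and SDs echo the values reported later in Table 5 rather than real data, so its numeric output should not be read as the study’s results.

```python
import numpy as np
from itertools import combinations
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical per-discipline overall perception scores (illustrative only).
groups = {
    "physics":       rng.normal(2.71, 0.47, 60),
    "chemistry":     rng.normal(2.45, 0.48, 60),
    "biology":       rng.normal(2.73, 0.47, 60),
    "earth science": rng.normal(2.82, 0.52, 60),
    "major course":  rng.normal(2.74, 0.50, 60),
}

# Bartlett's test for homogeneity of variance; p > 0.05 retains the
# equal-variance assumption required by one-way ANOVA.
bart_stat, bart_p = stats.bartlett(*groups.values())

# One-way ANOVA across the five disciplines.
f_stat, anova_p = stats.f_oneway(*groups.values())

# Bonferroni post hoc: pairwise t-tests, with the alpha level divided by
# the number of comparisons (10 pairs for 5 groups).
pairs = list(combinations(groups, 2))
alpha_adj = 0.05 / len(pairs)
significant = [
    (a, b) for a, b in pairs
    if stats.ttest_ind(groups[a], groups[b]).pvalue < alpha_adj
]
```

The Bonferroni division of α by the number of pairwise comparisons is what keeps the family-wise error rate at 0.05 across all ten group contrasts.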

We qualitatively analyzed the transcribed interview data. Two faculty members, two doctoral students, and one master’s student in science education participated in the analysis. We first read the transcripts. Because these data were intended to complement the quantitative findings, we used the overall scheme of the online survey as an initial coding frame while also extracting meaningful first-level codes from the data themselves. We then reduced the number of codes by repeatedly comparing their content and combining related categories, resolving disagreements among researchers through constant discussion. Finally, we used the information about the RLCs provided by the TAs’ and students’ responses to the open-ended survey questions to triangulate our analysis of the interviews.

Results

Differences in students’ perceptions of remote laboratory courses

Descriptive statistics of university students’ perceptions of RLCs, as well as the ANOVA F-test results for each science discipline, are presented in Table 5. The overall perception score was highest in the earth science labs (M = 2.82; SD = 0.52), followed by the major course labs (M = 2.74; SD = 0.50), biology labs (M = 2.73; SD = 0.47), physics labs (M = 2.71; SD = 0.47), and chemistry labs (M = 2.45; SD = 0.48). Remarkably, there were significant differences in the overall perception scores across the RLCs (F (4, 333) = 4.31, p < 0.01). The results of the Bonferroni post hoc test show that the chemistry labs had significantly lower overall perception scores than the physics labs, earth science labs, and major course labs (p < 0.05).

Table 5 University students’ perceptions of RLCs (mean, with standard deviation in parentheses) (N = 338)

Among the perception categories, class participation had the highest score (M = 3.53; SD = 0.60), followed by learning outcome expectations (M = 3.02; SD = 0.68); these were the only two categories with mean scores higher than 3. In contrast, interaction with the instructor and colleagues (M = 2.31; SD = 0.89) and use of the LMS (M = 2.46; SD = 0.81) had the lowest scores. Additionally, every perception category showed significant differences in scores across the RLCs (F = 2.53–11.58, all p < 0.05) except for the learning outcome expectation (F (4, 333) = 2.12, p > 0.05) and class participation (F (4, 333) = 1.29, p > 0.05) categories. The results of the Bonferroni post hoc tests indicate which of the RLC courses displayed significant differences in each category.

Notably, differences in perceptions by gender were found in only two categories. In learning outcome expectation, the average female student score (M = 3.12; SD = 0.67) was higher than the average male student score (M = 2.93; SD = 0.67) (F (2, 335) = 3.36, p < 0.05). In class preparation, the average male student score (M = 2.59; SD = 0.84) was higher than the average female student score (M = 2.33; SD = 0.92) (F (2, 335) = 3.90, p < 0.05).

Pearson’s correlations of the perception scores are presented in Table 6. All categories showed a highly significant correlation with the overall perception score (p < 0.001), with learning outcome satisfaction showing the highest correlation (r = 0.8092) and class participation the lowest (r = 0.346). Although most of the categories were highly significantly correlated with each other (p < 0.001), class participation had the lowest correlations with the other categories (r = 0.1055–0.2407); its correlation with the interaction with the instructor and colleagues category was not even significant (r = 0.1055, p > 0.05).

Table 6 Pearson’s correlation of university students’ perceptions of RLCs (N = 338)
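The correlation analysis summarized in Table 6 can be illustrated with SciPy’s `pearsonr`, which returns both the coefficient and its two-sided p-value. The three category score vectors below are synthetic stand-ins constructed to show a strong and a weak correlation, not the study’s data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 338  # matches the survey sample size; the scores themselves are synthetic

# Synthetic perception scores for three illustrative categories.
overall = rng.normal(2.7, 0.5, n)
satisfaction = 0.8 * overall + rng.normal(0, 0.3, n)   # built to correlate strongly
participation = 0.2 * overall + rng.normal(0, 0.6, n)  # built to correlate weakly

# Pearson r with a significance test for each pairing.
r_sat, p_sat = stats.pearsonr(overall, satisfaction)
r_part, p_part = stats.pearsonr(overall, participation)

print(f"overall vs satisfaction:  r = {r_sat:.3f}, p = {p_sat:.3g}")
print(f"overall vs participation: r = {r_part:.3f}, p = {p_part:.3g}")
```

A weakly correlated category (like class participation in Table 6) can thus be flagged by a small r even when the overall pattern of correlations is significant.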

The reasons these patterns appeared in the quantitative analysis are discussed in more detail below with reference to the qualitative data.

Students’ high expectations for hands-on laboratory courses were not satisfied

Students’ experience of RLCs during the 2020 school year can be summarized as “high expectations, low satisfaction.” Learning outcome expectation (M = 3.02; SD = 0.68) was relatively high compared to the other perception categories. This is because Korean students have few opportunities to engage in labs during their K-12 schooling and therefore eagerly anticipate the hands-on learning opportunities offered as part of a university education:

In high school … we do not get to see most things, and very precise measurement is impossible. After coming to university, I expected to engage in detailed procedures when doing experiments, such as controlling certain conditions, conducting more precise measurements, and running [processing] them through a program to analyze graphs in a practical manner. (Student_1 on chemistry lab)

Students naturally anticipated that university LCs would be hands-on. However, the COVID-19 pandemic forced introductory LCs to be held remotely, causing them to be, at best, minds-on. Therefore, learning outcome satisfaction (M = 2.55; SD = 0.55) was significantly lower than learning outcome expectation in a paired t-test (t (337) = 11.40, p < 0.001). Students were concerned about what they had “gained” from RLCs, as they were unable to engage in practical, hands-on experiments:

First, it was disappointing.… When we say “experiment-based course,” we expect to come to a [laboratory] classroom and learn something or do experiments. (Student_17 on physics lab and earth science lab)

After I had realized that I would gain nothing even after finishing this course … I almost neglected it…. I had high expectations, and I really like conducting experiments…. So, my satisfaction was low because my expectations were high. (Student_8 on chemistry lab)

Because we did not conduct hands-on experiments, I assumed that I wouldn’t be able to gain anything besides what the videos presented. (Student_14 on a major course lab [Analytical Chemistry Lab])

Video materials impacted the quality of learning experience

Different teaching strategies, however, yielded different student perceptions. As described above, students could only indirectly engage with experiment procedures via videos in most RLCs. As a result, student learning was highly dependent upon the characteristics and qualities of the video recordings.

For example, the videos in the chemistry labs (M = 2.21; SD = 0.84) were evaluated as “inconvenient” and “not very meaningful,” as their content was outdated and had already been included in the original course materials taught prior to the pandemic (Student_7 on chemistry lab). In some major course labs (M = 2.76; SD = 0.80), the TAs simply broadcast a live stream of the experiments or uploaded the entire procedure without editing the video. Students considered these videos to be of little help because their quality was quite crude. Consequently, most students responded that their lack of “firsthand” experience during class hampered their learning experience (M = 2.51; SD = 0.76).

They showed us live-streaming videos while conducting experiments. When the camera ran fast, the definition [of the image] would suddenly worsen.… If an important scene passed by quickly [it could not be seen], … I may have a question [but couldn’t ask].… My concentration decreased greatly. (Student_2 on a major course lab [Animal Science Lab])

Sometimes, the entire waiting time of three hours for the separation [process] was just presented [as is]… I feel they [the videos] are unrefined and thus are of low quality.… Not edited, too long, or too short.… (Student_11 on a major course lab [Pharmaceutical Lab])

There was a total lack of trial and error during the class. (Student_5 on physics labs and biology labs)

For a few RLCs, however, students responded that being able to review videos repeatedly helped their learning. Students responded particularly positively to cases where TAs conducted experiments and recorded and edited videos, such as in the physics labs (M = 2.89; SD = 0.72) and biology labs (M = 3.23; SD = 0.55):

The overall content of the experiment could be understood perfectly from the video: what was seen, what happened when we controlled different variables. The process and results of the experiment could be figured out in an overall sense. (Student_4 on physics lab)

So, I repeated [repeatedly watched] the video five or six times in a short amount of time and discussed [scientific terms] with friends.… The quality of the sound and the definition was quite good. (Student_16 on biology lab)

Synchronous sessions facilitated student–instructor interactions

Due to the isolated nature of attending an RLC, students’ perceptions of interaction with peers and instructors were the lowest (M = 2.31; SD = 0.89) among the categories. Students responded that they could not interact with instructors or colleagues. They also responded that use of the LMS was poorly promoted in RLCs (M = 2.46; SD = 0.81).

However, we found that the synchronicity of online sessions affected students’ perceptions of interaction in RLCs. Students who took RLCs with no synchronous sessions, such as students in the chemistry labs (M = 1.84; SD = 0.82) and biology labs (M = 1.93; SD = 0.85), showed lower perceptions of interaction:

Anyway, even disregarding all the other limitations, the fact that there was no Zoom [was even more difficult]. I mean, there was no connection at all. (Student_8 on chemistry lab)

We did not make groups and just did [work] individually.… There were almost no [interactions]. (Student_3 on biology lab)

In contrast, students who took RLCs with synchronous learning sessions involving TAs and students showed relatively higher scores on interaction (physics labs: M = 2.30; SD = 0.82; earth science labs: M = 2.58; SD = 0.82), which is also supported by the post hoc test. It is notable that the physics labs showed a significantly higher perception score (M = 2.85; SD = 0.76) in the class preparation category compared to the other labs, which is attributable to the unique synchronous sessions used in the physics labs:

The sharpest contrast was between classes that used Zoom and those that did not. When the class used Zoom … both the TA and I could show our faces and hear each other’s voices. (Student_8 on physics lab and chemistry lab)

Wait, I don’t think there has been too little interaction. Because we could send a direct message [in Zoom] if we wanted. (Student_17 on physics lab and earth science lab)

Meanwhile, students who took major course labs responded that they were better able to interact with their TAs and peers (M = 2.48; SD = 0.89) than students in the chemistry labs and biology labs. This could be attributed to the fact that the group assignments required in major course labs demanded a certain degree of synchronous peer interaction:

I think the collaboration was quite good.… My friends and I went together to a mountain to catch them [insects needed for the lab]. During this time, sharing information and knowledge went smoothly. (Student_12 on a major course lab [Insect Diagnostics Lab])

The LC required a lot of group work.… In my case, there were six members for three modules; [we] allotted two members for each module … to process data and make a presentation, and others shared what they made. (Student_13 on a major course lab [Materials Lab])

Feedback on lab reports and supportive assessments guided student learning

As mentioned above, many LCs require lab reports for student evaluation. Most students responded that it was possible, in principle, to get help from a TA with the course materials by telephone, e-mail, or the LMS. Overall, students gave neutral scores for lab report writing (M = 2.52; SD = 0.79).

Here, regular instruction and feedback from TAs appear to have been a determining factor in students’ lab report writing experiences, as reflected in the contrast between the earth science labs (M = 2.89; SD = 0.66) and the chemistry labs (M = 2.25; SD = 0.79):

In earth science lab, it was like [the TA] gave the basic report format, and we filled it in. (Student_17 on earth science lab)

I totally did not know I could ask questions to my TA. (Student_10 on physics lab and chemistry lab)

Students responded that, through lab report writing, they obtained theoretical knowledge and cultivated their data processing and lab report writing skills rather than hands-on skills. Indeed, responses from students in the earth science labs showed the highest learning outcome satisfaction (M = 2.78; SD = 0.71), implying that the remote setting presented little difficulty for this discipline compared to the others.

For the case of earth science lab, I think I would not feel much [difference] between face-to-face and non-face-to-face settings. (Student_15 on earth science lab)

For biology lab … my lab report writing ability improved to some extent.… For earth science lab, I learned graph-drawing techniques using Excel. (Student_16 on biology lab and earth science lab)

Meanwhile, in the biology labs, there were notable attempts to provide supportive assessments in addition to lab reports. Short talks about biology topics that interested students seem to have positively impacted students’ experiences of evaluation in the biology labs (M = 3.21; SD = 0.56) compared to the physics labs (M = 2.69; SD = 0.73) and chemistry labs (M = 2.72; SD = 0.57). Students in the biology labs reported that they had additional opportunities to receive feedback from TAs during remote teaching situations:

I think that [RLCs] were certainly synergistic… There was a presentation in the biology lab.… Students picked a topic, recorded a video about it, and uploaded it. Then, TAs watched and evaluated it. (Student_3 on biology lab)

It was a five-minute presentation.… Once each [student] uploaded their topic, [the TA] gave feedback on whether the topic was good and how to develop it. And finally, we [students] made videos. (Student_7 on biology lab)

Student participation remained high even during the pandemic

Finally, we found unexpectedly positive student perceptions of RLCs during the spring semester of 2020. The high mean score for class participation (M = 3.53; SD = 0.60) and its low correlations with the other categories (Table 6) suggest that most Hankuk University students diligently participated in RLCs even during the pandemic. This contrasts with the concerns about low participation and high student dropout rates described in previous research on students’ negative experiences in introductory LCs (Seymour & Hewitt, 1997), e-learning scenarios (Lee et al., 2019), and remote learning situations during the COVID-19 pandemic (Lee & Hong, 2021; Petillion & McNeil, 2020). These unexpectedly positive experiences stemmed from students’ desire to complete their mandatory courses in the first semester, for which the university had reduced the number of course requirements in an effort to help students overcome the disruption of normal academic and social life caused by the pandemic:

Chemistry lab is one of the required courses that we need to complete.… It’s a notorious course when it is held face-to-face. It was changed to a non-face-to-face class, … which was convenient. (Student_6 on chemistry lab)

Discussion

Based on the above findings, we discuss how RLCs can be understood using the hands-on versus minds-on framework; we also address how RLCs can be improved by implementing specific teaching strategies to guide university STEM instructors.

RLCs in the context of the hands-on versus minds-on debate

This study sought to determine whether there are differences among university students’ perceptions of RLCs. Students generally responded that they had high expectations for laboratory classes and that the lack of hands-on experience left them dissatisfied. This calls for further consideration of the essence of LCs before examining the specific teaching strategies used in each teaching practice. Although previous studies have problematized RLCs as depriving students of hands-on experiences (Jang et al., 2020; Kelley, 2020; Nguyen et al., 2021; Youssef et al., 2020), very few studies have analyzed RLCs in the context of the hands-on versus minds-on debate, which can provide insight regarding the potential benefits of RLCs.

Rather than understanding RLCs as inferior to hands-on courses, this study recommends discussing RLCs in terms of their potential for using minds-on lessons in e-learning environments while setting clear learning objectives (Lee & Hong, 2021). Although participants in this study worried about what they would “gain” from RLCs that lacked hands-on experiences (Reid & Shah, 2007), they mostly responded that they were able to acquire some knowledge and skills while writing lab reports. This indicates that RLCs should be designed and evaluated with reference to the learning objectives of each specific course (Ma & Nickerson, 2006). For example, if instructors aim to foster students’ practical skills in performing experiments, hands-on experiences are necessary (Reid & Shah, 2007). However, if instructors aim to foster students’ other “scientific skills” (e.g., observation, deduction, interpretation, etc.) or general skills (e.g., teamwork, reporting, presenting, discussing, etc.; Reid & Shah, 2007), then instructors should incorporate minds-on lessons into their RLCs (Lee et al., in press). Moreover, instructors should clearly inform students about the objectives of each course to minimize the gap between students’ and instructors’ expectations. This can serve to reduce student and instructor frustration and dissatisfaction.

Teaching strategies for future RLCs

This study also sought to understand whether university students’ perceptions of various RLC experiences differed according to the emergent teaching strategies used in each discipline. The results reveal that there are differences in university students’ perceptions of individual RLCs, mainly among the four introductory LCs and some major lab courses. Because each department manages a large number of students who are all required to enroll in the same LC, all sections of a given course were conducted in the same manner. Moreover, because the same students usually take several introductory LCs, the class/teacher effect and the student group effect were diminished. Therefore, the differences in perception scores can be attributed to the different teaching strategies used, which can provide lessons for future RLCs. Below, we detail the strategies derived from the quantitative and qualitative data.

Strategy 1: record and edit new video material

First, video materials proved to be of central importance, as videos strongly impacted students’ learning experiences in RLCs. Although cutting-edge technologies such as augmented and virtual reality can be used for RLCs (Hu-Au & Okita, 2021), they can be burdensome for instructors and students to develop and use. Therefore, recording and editing effective videos of experiments is a helpful and practical approach to implementing RLCs (Jang et al., 2020; Mayer et al., 2020). While preparing new video materials, an instructor can also prepare to engage students in Q&A about the concepts, processes, and materials being used. Because the psychology of “presence” should be considered significant (Brinson, 2015; Ma & Nickerson, 2006), videos of experiments should not merely show experimental procedures but also focus on specific apparatuses and equipment to provide students with indirect but authentic experiences of laboratory activities. However, live streaming the whole experimental process without editing should be avoided.

Strategy 2: promote synchronous interaction and assign group work

The loss of instructor–student and student–student feedback was highly problematic in these RLCs; therefore, this issue needs to be remedied. Synchronous video-watching and Q&A sessions similar to those used in the physics labs are strongly recommended. In addition to video, other tools that support collaborative learning could be introduced into real-time online sessions, such as systems for the real-time visualization of student discussion and e-portfolio construction; moreover, an LMS that saves collaborative processes and products could also be used (Clark & Mayer, 2016; Luchoomun et al., 2010; Youssef et al., 2020). Additionally, some major course labs reaffirmed that collaborative group work can promote student–student interaction in RLCs (see Clark & Mayer, 2016). If possible, even during the pandemic, allowing small groups of students to visit the laboratory and participate in necessary hands-on experiences, with the overall course still taught online, would provide students with additional opportunities to interact with instructors and other students (Lee & Hong, 2021).

Strategy 3: promote lab report writing with regular feedback and adopt supportive assessments

Although lab reports reflect the results of student inquiry and heavily influence evaluation, our findings show that the physics, chemistry, and biology labs did not provide timely assessment and feedback, while the instructors in the earth science labs did. The earth science labs proved to be an exemplary model for promoting lab report writing. That is, providing students with regular feedback on their lab reports is strongly recommended, and direct instruction on the structure and writing style of lab reports is also helpful, particularly for first-year students. Visualizations, portfolios, and LMS systems could also function as repositories that support lab report writing and evaluation. Additionally, offering alternative evaluation criteria, such as having students search for and present on their own topics of interest (as was done in the biology labs), is also an option to consider in RLCs. As the study interviewees suggested, allowing several students to physically attend laboratory sessions or developing and sending experiment kits to students’ homes are plausible options for evaluating minimal hands-on skills during an emergency situation (Jang et al., 2020).

High participation in RLCs: a possibility for innovation?

Finally, we seek to answer the question of why student participation in the RLC courses was high despite expectations to the contrary. Although most students expressed several negative views about their RLC experiences in the 2020 school year, some positive views were also reported. Most significantly, RLCs were seen as “convenient” and time-saving. Students did not have to attend in-person laboratory sessions and could simply watch a video synchronously and write lab reports at home. Some students did not even need to reside near the university. Additionally, lab reports could be submitted online, whereas conventional LCs required printed copies. This may have lowered the physical and psychological threshold of LCs for first-year university students, helping them remain in the course.

Here, we note that many Korean university students were likely accustomed to e-learning to some degree. As some interviewees stated, many Korean high school students take so-called “internet lectures” as part of their private education, which is geared toward preparing students for the Korean college entrance examination (Kim & Jung, 2022). These experiences likely helped students when taking RLCs. However, it is significant that the context of e-learning shifted from the realm of private education to institutional education at the university level. Students were able to experience how technologies could be used in formal learning contexts, and they considered how to improve future RLCs. Therefore, the RLC experiences during the 2020 school year somewhat ironically accelerated changes in university science teaching and learning:

Rather, I think those courses, including remote courses, were not that bad. But, they revealed things we didn’t know.… Without COVID-19, remote courses like those using Zoom … were considered to be something that would only happen in the future [but were realized already]. (Student_18 on earth science lab and a major course lab [Architectural Design Studio])

Limitations

As students at Hankuk University are high-achieving and highly engaged, they may not represent the experiences of university students in general. Students who finished their RLCs may have been more comfortable responding to our survey. Moreover, comparisons of first-year university students’ perceptions of RLCs with their expectations of hands-on LCs are unavoidably speculative, as they had not experienced the latter. Therefore, the perceptions of second-year and higher students, some of whom were included in this study, might be more informative. As the data for this study were hurriedly collected amidst the fluctuating COVID-19 situation during the 2020 school year, future research could provide more insight into how students experience and perceive RLCs at the university level.

Conclusion

This study investigated university students’ perceptions of RLCs, which were necessitated by COVID-19 social distancing measures. It is unique in its comparison of students’ perceptions of various RLCs at the university level. Our analysis revealed that RLCs can be implemented by drawing on the existing structures and content of LCs and e-learning, and it yielded specific suggestions for future teaching strategies. Currently, in 2022, we are struggling to transition to a post-COVID-19 teaching and learning environment. Rather than simply regressing to pre-COVID-19 teaching traditions, university STEM instructors should prepare for the prospective “new normal” of science education by enhancing the learning opportunities and outcomes offered via online education. In this regard, developing and validating an instructional design model for RLCs that systematically incorporates the effective teaching strategies discussed above would be beneficial (Lee & Hong, 2021).