

A study that investigates graduate students' thoughts and behaviors regarding camera usage in online classes.


          Educational institutions have offered online classes for years, driven by a new market opportunity, an alternative education delivery format, and the ability to reach students in underserved areas. Many students also embraced online learning for its flexibility, convenience, and variety (Littlefield, 2019). Before COVID-19, many online learners acknowledged the benefits and constraints of online classes and actively chose to be online learners. However, in the face of the current global pandemic, the majority of schools and institutions have switched from traditional face-to-face education to online education to maintain social distancing and thus prevent the spread of the novel coronavirus. This sudden shift turned many students into involuntary online learners, and many of them may not have been prepared for fully online education. As a result, this is a unique chance to study the essence of students’ online learning experience and provide insights about the future of learning to instructors and the entire education sector.

          In the current situation, the biggest difference is the new emphasis on synchronous learning: before the pandemic, online learning was mainly asynchronous, but synchronous formats mimic the face-to-face learning environment that most universities were built for. Common methods for implementing synchronous learning are video conferencing, teleconferencing, and live chatting (TBS STAFF, 2020). Of these, video conferencing best imitates an in-person learning environment because people can see each other in real time, which allows active discussion, immediate feedback, and dynamic exploration of materials. Using cameras during a video conference helps instructors and students build connections, community, and presence (Loyola University Chicago, n.d.); therefore, camera usage might have an impact on the quality of synchronous online learning.

          The purpose of the study is to explore how camera usage affects students’ online synchronous learning experience and to understand the variation in experience among individual students. Specifically, three research questions were developed to investigate the phenomena. First, how does the overall online class experience impact students’ decision making on camera usage? Second, what factors influence students’ decisions on camera usage during class sessions? Finally, why do those factors affect students’ decision making about keeping their cameras on or off? The present study contributes an in-depth examination of students’ overall online experience with Zoom, individual preferences on camera usage, and differing attitudes toward themselves and others regarding camera usage. The study provides insight into students’ perception of online education and the sustainability of synchronous learning for the post-COVID era. Furthermore, it also offers implications for instructors considering camera-usage policies for class sessions.

          Camera usage in online classes had not been a popular topic of study until the global pandemic hit around the start of 2020; as a result, the literature had not systematically investigated students' thoughts and behaviors about camera usage in online learning. However, online education had been examined from various angles since the rise of the Internet, so the literature review mainly focuses on online education in general and related theories. First, we examined the theoretical framework of the community of inquiry for creating an engaging and collaborative online learning environment. Social presence is part of the community of inquiry and is closely related to students’ online learning experience, as it concerns perceptions of connection, engagement, and social interaction. We then discussed social presence further through a comparison between the online and in-person learning environments. We also briefly reviewed how engagement affects social interaction in the online learning environment. Other perspectives on social interaction that we examined include social anxiety and social decision making, both of which may influence camera usage in online class sessions.

Literature Review

          Students’ experience has always been considered an essential component of learning, especially in online education. Garrison, Anderson, and Archer (1999) proposed a theoretical framework called the Community of Inquiry, which represents a process of creating collaborative-constructivist learning environments through the use of social, cognitive, and teaching presence (Garrison et al., 1999). Social presence was defined as “the ability of [students to identify with the community to] present themselves socially and emotionally, as ‘real’ people;” it includes emotional expression, open communication, and group cohesion (Garrison et al., 1999, p. 89). Cognitive presence was the students’ ability to construct meaning through “sustain[ed] communication” (Garrison et al., 1999, p. 89). Teaching presence included the design of the educational experience and the facilitation by teachers during class (Garrison et al., 1999). The psychological feeling of a sense of community may enhance the online learning experience (Conrad, 2005).

          From a student’s perspective, social presence might be the most important construct for the online education experience because it relates to students’ subjective perception of psychological connection and engagement with other students (Lyons et al., 2012). Also, students acquire knowledge not simply through access to information but also through social interaction with others (Vygotsky, 1978). More importantly, research has demonstrated the significance of students’ perception of social and emotional experience in an online education environment (Zembylas et al., 2008). With high social presence, students can freely express their opinions, perceive that they are part of a community (the class), and feel supported in emotional expression.

          Bowers and Kumar (2015) examined the phenomenon of social presence in a study comparing satisfaction ratings between an online format and a traditional format of the same class. The researchers concluded that the online format received the same or even better ratings for social presence and teacher presence than the traditional format. However, the participants in the study were people who finished the course; since completion of online courses is uncommon (Mubarak, Cao, & Zhang, 2020), the study likely suffered from a biased sample. The reasons why people drop out of online education are extremely varied; however, it is important to note that for most of the history of education, real-life interpersonal interactions were an inseparable factor in explaining the effectiveness of education. With each new concept taught, instructors gather relevant facial movement information and compare these patterns to their memory of student facial expressions (Jack & Schyns, 2017). While we cannot be sure that non-verbal communication is a prominent driver of drop-out rates, it would not be surprising if the students who drop out of online education are those who find the sterile online environment impoverished.

          Research has shown that if instructors want to improve interaction, it is essential to take engagement into consideration (Och & Cakir, 2011). Forming an interactive class setting can effectively protect students from boredom, disinterest, absenteeism, and dropping out (Appleton, Christenson, & Reschly, 2006). When students participate as members of the class and school community, they are less likely to feel neglected or undervalued during class sessions. Seeking to explain and understand engagement thus appears relevant to students’ overall online learning experience.

Another body of literature that informs our study concerns socially anxious individuals, who face special challenges in online education environments. Communicating with others from behind a screen might ease the stress of talking with someone in person, but it is also possible that seeing their self-image on the screen triggers more anxiety. Azriel, Lazarov, Segal, and Bar-Haim (2020) examined how socially anxious individuals respond to video-mediated communication (VMC) in terms of their visual attention patterns. The study shows that using online platforms can provoke notable anxiety among participants. Based on this finding, Azriel et al. (2020) concluded that the lack of direct contact with others can increase the anxiety level of participants with high social anxiety. In addition, people with social anxiety might pay more attention to their self-image on the screen and thus be more aware of their own behaviors. Although the study suggests that their anxiety level might depend on the content of the conferencing, the online environment is shown to be a distinctive setting that can change their inner perceptions and thoughts. It is therefore worth examining how students with social anxiety perceive the online learning environment and whether it adds pressure for them.

          Because turning one’s camera on or off is a decision about how much to reveal about oneself during a virtual social interaction, the psychology of social decision-making contains literature relevant to our study. Nir Halevy (2020) introduced a new framework for social decision-making. The framework contains four strategic orientations: “egocentric (thinking about how one’s actions shape one’s outcomes), impact (thinking about how one’s actions shape others’ outcomes), dependency (thinking about how others’ actions shape one’s outcomes), and altercentric (thinking about how others’ actions shape their outcomes)” (Halevy, 2020, p. 648). Our interview questions may derive inspiration from Halevy’s framework by asking about the extent to which people think about how camera usage affects their own and their peers’ outcomes.



Method

          Participants are students at Claremont Graduate University (CGU), a graduate-only university with small class sizes that had some online education offerings before COVID-19. CGU students come from a variety of demographic backgrounds and gather to attend classes together, even across different departments. We are interested in whether online learning is a distinctly different setting even for colleges with small classes like CGU. Furthermore, the study could help researchers understand students' online learning experience and provide insights into students’ perceptions of online education. Because the researchers in this study are CGU students, it was not difficult to gain access to the site, and they could build rapport with participants with fewer obstacles.

Pilot Study

          CGU students (N = 35, 71% female) were recruited through an announcement (see Appendix B) made by the researchers before class sessions. Researchers asked individual professors for permission to observe class sessions and posted the consent form (see Appendix C) along with the announcement. Convenience sampling was used because of the study aim and the time constraint. The aim of the study is to explore the essence of students’ online learning experience and the effect of camera usage on Zoom, so maximum variation sampling would have been more appropriate for documenting diversity in students’ preferences and attitudes toward camera usage. However, because the researchers had only approximately one month for data collection, they decided to study only students who were observed during class sessions. In addition, if students declined a follow-up study, researchers would reach out to students they knew outside of the observations in order to reach a minimum number of participants for the follow-up interviews.

Full Study

          Learning from the pilot study, researchers anticipate recruiting a larger sample of CGU students (N = 100) with digital flyers passed among students or announcements from the researchers before classes. Monetary compensation for participation might also be considered to attract a wider variety of participants. As mentioned before, maximum variation sampling would be utilized because this study aims to document a diverse range of students’ online education experiences regarding camera usage and the motivations and justifications behind their actions. The dimensions likely to vary among students include perception of the overall online class experience, preferences for using cameras in different situations, and attitudes toward camera usage for themselves and for other students in the class. Criterion sampling would be used to ensure that the study includes not only students who keep their cameras off the entire time but also students who always keep their cameras on, so that the study can present a complete description of the phenomenon. In addition, because the research problem is fairly narrow in nature, this study may not need in-depth information from each participant; rather, it emphasizes the breadth of information that each participant contributes. Therefore, this study needs a large sample to reach data saturation so that enough variation and themes emerge.

Assessments and Measures


          Participants first completed a consent form (see Appendix C) to indicate whether they agreed to be observed and interviewed; all of the observed class sessions happened to be in the psychology department. The consent form contained six questions, with conditions that branched based on participants’ answers. After signing the consent form, participants were observed during their class sessions without interacting with the researchers. Researchers observed an average of 7 students in each class session and took notes on students’ behaviors, such as turning their cameras on or off, eating, and walking during class. The length of observation varied with the duration of the class session, lasting approximately 2 hours per session.

          Following the observations, researchers reached out to students who gave consent for follow-up interviews and whom they were interested in studying based on the observed behaviors. Zoom was used to conduct the interviews, and each interview lasted about 30 minutes. The interview protocol (see Appendix A) included 5 sections: warm-up, overall online experience, personal preferences on camera usage, attitudes toward camera usage, and closing.

          In the warm-up section, researchers asked questions like “How much time do you spend on Zoom for classes and related activities (in hours)?” In the overall online experience section, researchers asked questions like “Do you feel the level of engagement changed between the two settings (online vs. in person)?” In the preference section, researchers asked questions like “Can you tell me more about the reasons why you turn your camera on/off?” In the attitude section, researchers asked questions like “What do you feel about students who turn their cameras off? And how about on?” In the closing section, researchers asked questions like “Is it okay if I follow up with you and ask for some clarifications, if any?”

Analytic Plan

          To investigate our research questions, we conducted 15 semi-structured interviews with current CGU students. To analyze the data, we plan to utilize conventional content analysis. In general, content analysis is often used in qualitative research as a “subjective interpretation of the content of text data through the systematic classification process of coding and identifying themes or patterns” (Hsieh & Shannon, 2005). Our research is interested in students’ preferences and attitudes about their online experience, particularly regarding their camera usage on Zoom; describing this phenomenon is therefore the aim of the current study, and conventional content analysis seems most appropriate. We ruled out directed content analysis because existing theory and research literature on camera usage in synchronous online classes is limited, and we ruled out summative content analysis because we are not trying to understand the contextual meaning of certain words or concepts.

          To apply conventional content analysis, we would first read through one or two transcripts and highlight text that appears to describe students’ online class experience. We would take notes on our impressions and initial analysis of the highlighted text, forming an initial coding scheme. We would then use this scheme to code the remaining transcripts, adding new codes whenever the data did not fit the existing scheme. The whole coding process would be iterative, meaning that we would keep revising the coding scheme as we encountered more variation in new data. The emergent categories would then be organized into a tree diagram representing different aspects of students’ online experience regarding camera usage. Even though conventional content analysis does not aim to convert qualitative interview data into quantitative data, we would also label each category with a typical or variant frequency to capture an overview of the phenomena, that is, whether a particular category is a common theme across people or an individual difference. This quantification of categories is borrowed from the consensual qualitative research method (Lippman & Greenwood, 2012). A typical frequency refers to a category endorsed by 9 to 15 participants, and a variant frequency refers to a category endorsed by 1 to 8 participants.
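As a concrete illustration, the typical/variant labeling described above reduces to a simple tally against fixed cutoffs. The sketch below assumes our sample of 15 interviewees; the category names and endorsement counts are hypothetical examples for illustration, not actual study data.

```python
# A minimal sketch of the typical/variant frequency labeling used in
# this study: "typical" = endorsed by 9-15 of 15 interviewees,
# "variant" = endorsed by 1-8. Counts below are hypothetical.

def frequency_label(count, typical_cutoff=9, n_participants=15):
    """Return 'typical' or 'variant' for a category's endorsement count."""
    if not 1 <= count <= n_participants:
        raise ValueError("count must be between 1 and n_participants")
    return "typical" if count >= typical_cutoff else "variant"

# Hypothetical endorsement counts per category (out of 15 interviewees).
endorsements = {
    "cameras on increase engagement": 11,
    "online setting is more distracting": 10,
    "camera off to multitask": 5,
    "self-consciousness on camera": 1,
}

labels = {category: frequency_label(n) for category, n in endorsements.items()}
# e.g. labels["camera off to multitask"] == "variant"
```

A different sample size would require adjusting the cutoffs, which is why they are passed as parameters rather than hard-coded.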

          With our observational data, we plan to conduct a thematic analysis with the aim of triangulating and cross-verifying our study findings. Attending classes with other students allowed us to take a closer look at what students do while they are on camera. As students ourselves, we usually do not deliberately notice which students keep their cameras on or what they do while attending online classes; observing the classes as researchers gave us the chance to investigate online classes systematically. As with our interview data, we also quantified the behavioral data collected during the observations: for example, how many students used virtual backgrounds in a class? How many were taking classes outdoors? Knowing how often these behaviors appear across the observed sample helps us understand whether a behavior is a pervasive phenomenon or a personal preference. Conducting observations also provided a way to find participants for the interviews. When we observed a student engaging in behaviors we found interesting, such as frequently turning the camera on and off, we would invite them to participate in a follow-up interview. Moreover, when schedules permitted, we alternated observations and interviews so that our triangulated findings informed iterative improvements to our research plan.

          We applied triangulation aiming to use multiple methods to develop a more comprehensive understanding of the phenomenon we are studying and enhance the validity of our findings. For example, using only interviews might limit our data to only what participants report to us. However, adding the use of observation allows us to discover the behaviors they might not be aware of, or did not get to express in the interview. Then, the use of thematic analysis helps generate and review themes across the different methods we have used in the previous stages of the study. After we have generated and defined these themes, we can then better name them and present them in our research study.

Validity Check

          One threat to validity is that the members of our research team might come into the project with a variety of personal biases. All three members of this research team have been going through their studies online since March 2020, and the online format came to us in a circumstance that is far from ideal, which can greatly influence our beliefs about online education. We minimized this threat to validity using our identity memos. When researchers write identity memos, they must analyze the ways in which their own experiences and assumptions can sway them towards an overly narrow or incorrect interpretation of the data (Maxwell, 2013).

          The second threat to validity is reactivity, the “influence of the researcher on the setting or individuals studied” (Maxwell, 2013). To prepare for observations, the team would first ask the instructor of the course for permission to observe their class. We would then show up to the class on the agreed day, announce our presence, and post our consent form (see Appendix C) in the Zoom chat. Having a captive audience seemed to be the only way to ensure that many students filled out the consent form, but it also meant that the students were aware of our presence and might have behaved differently from usual. Also, we had no prior involvement with three of the four classes we observed, so we had no earlier opportunity to hold the students’ attention and ask for their consent. Therefore, in these three classes, there may be no way to reverse the issue of reactivity; we compensated as much as we could by keeping our cameras off and our audio muted. However, in the fourth class, we found a more effective way to address reactivity: one teammate, a teaching assistant in that class, was able to send out the consent form during the lecture before the one we observed. That way, on the day we conducted observations, students who had filled out the consent form had probably forgotten that they were going to be observed.

          The third threat to validity is the possibility of a power dynamic between the researchers and the students. For example, one of the researchers is a teaching assistant in a class we observed, and our team interviewed several students from that class. During her interviews, the teaching assistant reminded participants that she was conducting the interview from the perspective of a student researcher rather than as their TA. However, if a participant had beliefs about camera usage that could have created unwanted conflict, even this reassurance might not be convincing. We decided to proceed with interviewing these students anyway because they came from a class with a large number of students, which adds useful variation to our sample.

          The fourth threat to validity is our inability to collect information from people who habitually keep their cameras off. Our team reached out to a few such people, but they did not respond. These people would provide an important perspective on our research questions, but there is no good way to mitigate this threat. We might offer compensation in the future to attract students who never had their cameras on, but for now, this remains a limitation of the study.

          The fifth threat to validity is that we are likely unable to reach data saturation, given that we observed only 34 people and interviewed 15 people for this study. We might find some patterns in people’s thoughts about camera usage, but chances are those patterns would be incomplete because we had only a limited amount of data to analyze. Ideally, we would keep gathering data until we stopped encountering new information, but we were constrained to complete the assignment by the end of the semester. Therefore, our findings may be biased due to insufficient data. We acknowledge this validity threat as a limitation of our study and try not to make strong claims in our findings. In the full study, we have decided that observing 100 people and then drawing interview participants from this sample of 100 will get us closer to data saturation.

          A final threat to validity is assuming too quickly that interviewees are perfectly capable of reporting the causes of their actions. For example, one of our research sub-questions that may be subject to this threat is whether class size had any impact on camera usage. This led us to use two process validation strategies: 1) we drew explicit comparisons by conducting observations across a variety of class sizes, and 2) we counted how many people used their cameras frequently vs. infrequently.

Emerging Findings

Basic Findings in Observations

          Across 34 observed participants, 28 kept their cameras on very consistently. We noticed 13 of them eating or drinking during class. Two students used virtual backgrounds; in a follow-up interview, one of them explained that he “[thought] a virtual background [acted] like maybe some personality and character to maybe who I am...[And] customization is the best way I would say.” As expected, the more interactive the structure of a class, the larger the percentage of students who spoke up regularly.

Basic Findings in Interviews

          We interviewed 12 females and 3 males: 7 first-year students and 8 continuing students. On average, our interviewees spent 13.97 hours per week on Zoom for classes and related activities. In our sample, it was typical for interviewees to praise the virtual learning environment for increasing accessibility and convenience, particularly by eliminating commutes. At the same time, an important variant was that people wish they could spend less time on Zoom than they currently do.

Important Themes

          Some important themes in our emerging findings were: 1) distractions, 2) engaging with a community, and 3) normative behavior.

Distractions

          One of the most prevalent in-person distractions interviewees discussed was the behavior of other students, especially when they talked during class. It was typical for our interviewees to report that the online setting is more distracting than the in-person setting, and typical for them to report that a source of these distractions is the very electronic devices they also use to attend class. For example, one student said, “Um, I would say for online distractions, if I, like, lose engagement during class, I definitely find myself doing stuff online, like shopping.” This also relates to our questions about camera usage, as one variant we found in our interview responses was that multitasking on electronic devices was a reason why people would choose to keep their cameras off. Before data collection, the research team seriously considered the ways in which competing obligations might be an important distraction; such obligations could include family members, pets, and other school assignments. All three of these potential distractions were variants that came up in our findings.

Engaging with the Community

          In our observations, we noticed that students kept their cameras on more in discussion-based breakout rooms than during lecture-based sessions, which generally aligns with our expectations. However, we also found that the most toggling between cameras on and off happened in the most interactive class that we observed.

          One of our most important expectations was that we would see strong opinions on how engaged people felt in online classes. Opinions on engagement, and the strength of those opinions, varied far more than we expected. Some interviewees found it easier to engage in interactions with other students through Zoom, while others found it easier in person. Some people found it easier to ask their professors questions over Zoom, but others found the communication stream between professors and students too unidirectional. However, this heterogeneity of opinion on whether Zoom classes are engaging did not seem to covary with beliefs about engagement as a decision-making factor in camera usage; when we asked our interviewees why they would choose to keep their cameras on, it was typical for them to say that they believed it would increase their engagement. One unanimous point about engagement was interviewees’ belief that keeping cameras on adds to the sense of community in a class. Specifically, one student said in an interview, “Um, it’s very unlikely that I reach out to somebody who doesn’t have a video on to, like, study or to ask a question from ‘cause they seem not there, right?”

Normative Behavior

          An important variant we noticed was interviewees believing that keeping cameras on is a way of showing respect to a class or professor. It was very typical for interviewees to feel a sense of obligation to keep their cameras on, with several respondents reporting that professors create that sense of obligation both implicitly and explicitly. This concurred with our observations, in which we saw one professor remind students that she wanted them to keep their videos on if they could, since she wanted to get feedback from people’s faces.

          It was typical for interviewees to feel at least a little bit of self-consciousness about camera usage, but only one person seemed to feel so strongly about it that watching her own self-image on the screen “triggers more anxiety.” We had believed that serious self-consciousness issues would be more prevalent than they actually were. However, where the self-consciousness issue did exist, the respondent behaved as we expected, where she was less likely to keep her camera on than other respondents were.

          Despite these interview responses that emphasized the need to adhere to normative behaviors, we did observe behaviors that do not demonstrate compliance with strict social norms. We noticed two people who were multitasking so visibly that it is noteworthy that they still kept their cameras on. One student spilled water during class and had to rapidly respond by shaking the water off her laptop and getting cleaning supplies from her kitchen. In our follow-up interview with this student, she stated that turning off her camera was the last thing on her mind. The second multitasker’s behavior consisted of eating, drinking, walking around, and playing with her hair; unlike the student who spilled her water, the second multitasker’s behavior suggests an idiosyncratic preference for staying highly active while listening to a lecture.

Discussion

          Through interviewing CGU students about their online class experience, we gained many insightful responses and came to understand the rationale behind their camera usage preferences. In general, most students experienced more distractions than they had in in-person classrooms, including the use of electronic devices and the presence of someone else in the house. One of the most frequently mentioned weaknesses of online classes is the lack of visual interaction with others, which decreases the level of engagement, especially when others turn their cameras off during class. Mostly, students felt obligated to turn their cameras on to show respect to peers and professors. In particular, they were more willing to use their cameras in a smaller-sized conference or class than in a large lecture-style course. Many of them stated that even though it is not required, they feel more connected to peers who keep their cameras on. However, they still found it understandable for others to turn cameras off, given that it is a personal choice. As one of our participants mentioned in the interview, forcing students to turn on their cameras does not encourage them to engage in the class; on the contrary, it might add pressure and prevent them from staying interested in the course.

          Although our study applied multiple methods to gain a comprehensive understanding of how students prefer to use their cameras and to avoid unnecessary biases, we inevitably encountered obstacles along the way. At the early stage of the study, we had difficulty obtaining permission to conduct observations in classes. Many instructors turned us down, intending to protect students' online class experience and worrying that our observation might be distracting or disturbing, especially while the online class environment was still novel to many students. We could observe only those classes that granted us permission, regardless of class size or setting. Similarly, recruiting participants was challenging because we were unable to reach many of the students who turned their cameras on and off during the observations. The classes we observed and the participants we recruited were obtained through convenience sampling, so our conclusions are limited by the resources available to us. Moving forward, we might consider conducting the study at multiple sites to compare and contrast students' experiences across educational institutions.

          During our observations, we encountered several problems that may have affected our field notes. One of the biggest challenges was that we could not predict the moment a student would turn the camera off, and thus missed the critical behaviors immediately preceding the switch. Although we later encouraged interviewees to recall the reasons for their camera usage on the observed day, their memory and self-reports might not be accurate several days after the observation, so it was impossible for us to learn in the moment why a camera was turned off. Similarly, students' behaviors could change while we were taking notes or observing others; capturing these instantaneous moments precisely was among our greatest challenges.

          In the interviews, participants were asked about their attitudes toward and preferences for virtual backgrounds on Zoom. However, two of the researchers were themselves using virtual backgrounds before we realized this could affect how participants answered this question. Although both researchers stopped using virtual backgrounds for the remaining interviews, it is still possible that earlier participants hesitated to report a dislike when they could see the researcher using one. Out of politeness, or under pressure to align with the researchers' perceived expectations, participants might have given answers that differed from their true preferences.

          In our study, we tried our best to use standardized criteria for interpreting participants' behaviors and analyzing their responses, and we maintained frequent communication and information exchange throughout the research. However, it is impossible to eliminate researcher bias entirely or to avoid unconsciously adding personal thoughts to the data. Most of our interview questions were open-ended, meaning participants could not answer simply by giving a number. We tried to avoid adding personal interpretation by confirming or paraphrasing participants' phrases during the interviews, but some imposition of our own views was almost inevitable when recording the data.

          In addition, instructors can consider other methods to help students stay concentrated and engaged in class. Wu and Xie (2018) studied the use of time pressure and a note-taking strategy to reinforce university students' focus on learning and prevent digital distraction. The authors concluded that adding appropriate time pressure to students' note-taking can reduce digital distraction, because students need that time to absorb the material and filter out the valuable information for their notes. This strategy can effectively prevent class-irrelevant online browsing and help students concentrate on the lecture.

          Another strategy is to make the class more interactive rather than lecture-heavy, thereby enhancing students' participation. Camera usage should not be a requirement or an expectation imposed by instructors; instead, it should serve as a means of visual interaction and communication among all members of the class. Instructors carry the responsibility of facilitating an interactive and friendly online class setting by encouraging camera usage, without applying the kind of pressure or expectation that would keep students from enjoying the interaction with others.


Appleton, J. J., Christenson, S. L., Kim, D., & Reschly, A. L. (2006). Measuring cognitive and psychological engagement: Validation of the student engagement instrument. Journal of School Psychology, 44, 427–445.

Azriel, O., Lazarov, A., Segal, A., & Bar-Haim, Y. (2020). Visual attention patterns during online video-mediated interaction in socially anxious individuals. Journal of Behavior Therapy and Experimental Psychiatry, 69, 101595.

Bowers, J., & Kumar, P. (2015). Students’ perceptions of teaching and social presence: A comparative analysis of face-to-face and online learning environments. International Journal of Web-Based Learning and Teaching Technologies, 10(1), 27–44.

Conrad, D. (2005). Building and maintaining community in cohort-based online learning. Journal of Distance Education, 20(1), 1–20.

Garrison, D. R., Anderson, T., & Archer, W. (1999). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2), 87–105.

Halevy, N. (2020). Strategic thinking and behavior during a pandemic. Judgment and Decision Making, 15(5), 648–659.

Hsieh, H.-F., & Shannon, S. E. (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15(9), 1277–1288.

Jack, R. E., & Schyns, P. G. (2017). Toward a social psychophysics of face communication. Annual Review of Psychology, 68, 269–297.

Lippman, J. R., & Greenwood, D. N. (2012). A song to remember: Emerging adults recall memorable music. Journal of Adolescent Research, 27(6), 751–774.

Littlefield, J. (2019, March 11). 10 Reasons to Choose an Online Education to Earn Your Degree. Retrieved December 07, 2020, from choose-online-education-1098006

Loyola University Chicago. (n.d.). Guidelines for Student Camera Usage. Retrieved December 07, 2020, from

Lyons, A., Reysen, S., & Pierce, L. (2012). Video lecture format, student technological efficacy, and social presence in online courses. Computers in Human Behavior, 28(1), 181–186.

Maxwell, J. A. (2013). Qualitative research design: An interactive approach (3rd ed). SAGE Publications.

Mubarak, A. A., Cao, H., & Zhang, W. (2020). Prediction of students’ early dropout based on their interaction logs in online learning environment. Interactive Learning Environments, 1–20.

Oncu, S., & Cakir, H. (2011). Research in online learning environments: priorities and methodologies. Computers & Education, 57(1), 1098–1108.

TBS STAFF. (2020, September 11). Synchronous Learning vs. Asynchronous Learning. Retrieved December 07, 2020, from asynchronous-education/

Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.

Wu, J.-Y., & Xie, C. (2018). Using time pressure and note-taking to prevent digital distraction behavior and enhance online search performance: Perspectives from the load theory of attention and cognitive control. Computers in Human Behavior, 88, 244–254.

Zembylas, M., Theodorou, M., & Pavlakis, A. (2008). The role of emotions in the experience of online learning: challenges and opportunities. Educational Media International, 45(2), 107–117.
