Results of remote learning and teaching surveys released
71% of respondents agreed that ‘general stress related to COVID-19’ made learning difficult
Rick Danheiser, faculty chair and chair of the Academic Policy and Regulations Team (APART), presented results of the Student Remote Learning Experience Survey and Instructor Remote Teaching Survey in an email to students June 23.
The remote experience survey collected responses May 13–31 and had a 30% response rate among 11,010 undergraduate, doctoral, and master’s students. The response rate was 39% for undergraduates, 22% for doctoral students, and 29% for master’s students.
25% of undergraduate, 23% of doctoral, and 21% of master’s respondents “strongly” agreed that they “feel like part of the MIT community.” When asked whether they agreed that “MIT administrators are genuinely concerned” about their welfare, 20% of undergraduate, 32% of doctoral, and 35% of master’s respondents “strongly” agreed.
Over 50% of respondents either “somewhat” or “strongly” agreed that MIT administrators are genuinely concerned about their welfare.
87% of student respondents indicated that their quality of engagement with friends from MIT was “worse” or “much worse” than on campus prior to the COVID-19 pandemic. Additionally, 89% reported that their quality of engagement with residents of their living communities was “worse” or “much worse” than on campus.
In addition, a majority of respondents indicated that their quality of engagement with other students in their subjects and majors or programs was “worse” or “much worse” than on campus. 65% of respondents “somewhat” or “strongly” disagreed that they were able to continue collaborating with other students as much as they did prior to the transition to remote learning.
Respondents also indicated that their quality of engagement with teaching staff was reduced, with 62% answering that engagement with TAs was “worse” or “much worse” than on campus. 60% of respondents answered similarly for faculty in departments and instructors for subjects.
45%, 58%, and 53% of student respondents indicated that their quality of engagement with UROP supervisors, academic advisors, and research advisors, respectively, was “about the same” as on campus.
32% of student respondents “strongly” agreed that they would “rather take a semester off than try to do it via remote learning.” An additional 21% of respondents “somewhat” agreed with this statement.
In addition, 71% of respondents somewhat or strongly agreed that “general stress related to COVID-19 made it difficult” for them to learn.
When asked whether they had access to adequate technology for remote learning, 92% of respondents “somewhat” or “strongly” agreed that they had adequate software, 91% “somewhat” or “strongly” agreed they had adequate hardware, and 79% “somewhat” or “strongly” agreed they had sufficient internet access.
88% of student respondents “somewhat” or “strongly” agreed that the emergency grading policy “eased some of the stress” of remote learning, and 77% “somewhat” or “strongly” agreed that the material covered in their courses “was reasonable, given the circumstances.” 29% of respondents “somewhat” or “strongly” agreed that they were concerned that the emergency grading policy would hurt their “future prospects for jobs or graduate school.”
79% of respondents “somewhat” or “strongly” disagreed with the statement that they were able to focus during online sessions “as well as” they do during in-person classes. 70% of respondents “somewhat” or “strongly” agreed that they “had a difficult time learning” in their new environment, and 64% “somewhat” or “strongly” agreed that distractions in their environment made learning difficult.
51% of respondents indicated that their stress level was “more” or “much more” than before remote learning, 22% indicated that it was “less” or “much less,” and 27% indicated that it was the same.
82% of respondents “somewhat” or “strongly” agreed that they felt supported by family and friends, and 56% “somewhat” or “strongly” agreed that they felt supported by MIT.
86% of student respondents who participated in intramural athletics while on campus indicated they were unable to participate remotely. Similarly, 79% and 78% of respondents answered that they were unable to participate in intercollegiate and club sports, respectively.
53% of student respondents in performance groups also responded that they were unable to participate in activities remotely.
8% of respondents in FSILGs answered that they were unable to participate in their activity remotely.
Student respondents also indicated that the quality of participation in activities was reduced remotely, with 75% of respondents in intercollegiate athletics, 69% of respondents in performance groups, and 66% of respondents in FSILGs reporting that their remote experiences were “much worse” than on campus.
For all student activities, over 50% of respondents reported that their quality of participation was “worse” or “much worse” than on campus.
The instructor remote teaching survey had a 53% response rate among 1,433 MIT faculty and other instructional staff. 167 respondents indicated they had previous remote teaching experience.
The average proportions of undergraduate and graduate students in the respondents’ subjects were 61% and 39%, respectively.
42% of respondents described their subject as an elective, 26% described it as part of the undergraduate core curriculum, 14% described it as part of the graduate core curriculum, and 14% described it as satisfying a General Institute Requirement.
51% of respondents indicated that their subject was about half or more discussion-based, and 41% reported that about half or more of the subject required hands-on work.
Of the respondents whose subjects had discussion, performance, or hands-on activities, 57% answered that going remote did not require them “to change the learning goals” associated with these activities, and 89% answered that it was reasonable to continue offering the subject “in a remote format if needed.”
45% of respondents answered that “about half” or more of their students were in a timezone different from MIT. 82% of respondents answered that “about half” or more of the total contact hours of their subject were offered synchronously.
65% of respondents indicated that students were expected to attend live virtual lectures or discussions, compared to 72% requiring lecture attendance and 25% requiring discussion attendance on campus. Additionally, 24% of respondents expected students to watch pre-recorded lecture videos during remote education, compared to 3% expecting students to do so while on campus.
Of the respondents who expected students to watch videos or attend live lectures, 67% recorded new videos for remote teaching, and 72% provided recordings to students after live classes.
38% of respondents indicated that they expected students to attend live virtual recitations, compared to 34% expecting recitation attendance on campus.
Of the respondents who expected students to attend live remote lectures or recitations, 85% indicated that they asked for student questions, compared to 70% of all respondents asking for student questions on campus.
Additionally, 58% of respondents had virtual office hours, 55% had one-on-one meetings with students, 34% used discussion forums, and 28% had virtual meetings with small groups of students.
To determine grades, 68% of respondents used regular assignments; 48% used final projects, papers, or performances; 34% used class activities; 17% used midterm projects, papers, or performances; 16% used quizzes; 16% used midterm written exams; and 15% used final written exams.
Danheiser also wrote in his email that Subject Specific Surveys were sent to students in late May to receive “specific and useful feedback on how subjects might be improved in remote offerings in the future.” The survey results were not released to students but have been provided to subject instructors, department leadership, “a tightly restricted number of key individuals” supporting remote learning, and departments’ “Subject Evaluation Coordinators.”
The survey differed from previous subject evaluations by not including numerical ratings and instead posing open-ended questions to students so that they could “recognize the performance of particular individuals” such as TAs, Danheiser wrote.