3.091 to return to traditional lecture format
Prof. Cima says edX trial results ‘not yet conclusive’
CLARIFICATION TO THIS ARTICLE: 3.091 is returning to a lecture format for the Spring 2014 semester. No decisions have been made on whether to continue the online assessment format for Fall 2014 and beyond. A full report on the class will be submitted in late February.
3.091 is reverting to its original lecture/recitation format as the semester-long experiment comes to an end. Findings from the experiment are not yet conclusive, 3.091 Professor Michael J. Cima stressed in an email to The Tech; the full report to the Committee on the Undergraduate Program is due at the end of February.
The experiment took elements from Cima’s 3.091 course on edX, a massive open online course platform developed by MIT and Harvard, and translated them to classroom learning. Before each lecture, students were instructed to watch online learning sequences. Instead of p-sets, midterms, and a final, students in 3.091 took a series of 37 online assessments throughout the semester.
Despite the changes, failure rates this semester did not differ much from previous years. According to Cima, 5 percent of the class failed last semester, compared to 6 percent in 2012 and 4 percent in 2011, a difference that is not statistically significant. However, “outcomes, as measured by success at solving problems, is substantially improved over the traditional format,” said Cima.
The number of fifth week flags was substantially lower this term: compared to the 29 flags given out in 3.091 in fall 2012, only two students were flagged this year. Cima said the new format of the class seemed to help the lower half of the class more.
“My impression is that the new format had greater positive impact on the less motivated of the students,” Cima said. “That is, those students who would have been in the lower half of the class substantially improved their performance. This is why we had only one tenth the number of fifth week flags as compared with past years.”
Conversely, he commented that the more motivated students did not seem to have as much of a chance to shine with this new format.
Sabrine A. Iqbal ’17 felt that the assessments shortchanged the material. “Since passing a certain number of assessments was almost the only requirement, it meant most people didn’t really care about the material,” Iqbal said. “Also, the assessments could be taken multiple times, so many people I know would not even study and would just guess the answers because they could go back and try again.”
Due to concerns that some students would try to pass the class by only taking the assessments, Cima implemented mandatory attendance: students had to attend at least 80 percent of lectures and 80 percent of recitations, or risk failing the class. Cima told The Tech in September that the policy was put in place to preserve the integrity of the experiment and to avoid breaking rules that bar GIRs from being offered completely online. “The data supports that attendance does correlate with outcomes,” Cima said.
Student Reactions
Subject evaluations did not differ considerably from previous years, with the overall subject rating going from 4.6 in Fall 2012 to 4.2 in Fall 2013.
Some students appreciated what the experiment did for the class. For Helen L. Zhou ’17, the new format made for a lower-stress environment. “Multiple attempts at each assessment allowed students to be more relaxed while taking these assessments,” Zhou said.
Stephen Guo ’17 also appreciated the lessened stress. As he pointed out, the assessment format meant that there were fewer points of concentrated stress, but “it was challenging because people essentially had to pass on a week by week basis,” he said.
Others were less enthusiastic. Anna Jungbluth ’17 said she doesn’t regret taking the class, but the new format is not something she would recommend. Compared to her other classes, she felt as if she didn’t learn much. For her, midterms were the times when she sat down and fully learned the material, and “the time when I understand the connections between several topics.”
Because of the assessments, she never really went back over the material; once she passed an assessment, she wouldn’t worry about it anymore, Jungbluth said.
Jonathan T. Morrell ’17 found that the online learning sequences gave him a lot more flexibility in learning, but he thought that the assessments should have been better at evaluating student understanding. “Currently they assess the answer you get from doing a problem, however the answer is obviously secondary to the process you take to get there. If there was a way to better assess that, the class would be much improved,” Morrell said.
Nevertheless, according to TA Zhaohong Han, the new format changed students’ approach to learning; instead of coming in right before exams, they would “pay more attention to the content in recitation now and ask more questions every time.”
TA William F. Dickson ’14 felt that students “understood connections between stuff earlier in the semester, much better than students in the past” as a result of the assessment-based format.
He acknowledged that “the only problem might be that there is no cumulative exam.” However, he feels that if a final exam had been given, the majority of students would not have needed to cram.
The course ran into some bugs early in the semester, which were fixed as it progressed. One issue many students ran into, according to Dickson, was conflicts with the original assessment times; the staff resolved it by allowing extensions in the case of conflicts. Some students also ran into problems with the assessment software itself, which the staff fixed as the semester went on.