News

No more exams in Fall 3.091 class

After teaching the edX version of the chemistry class, a professor ‘flips’ the popular on-campus GIR in a new educational experiment

Rather than take midterms and a final, freshmen in 3.091 this fall will earn their grades by answering a series of around 40 online questions spread out over the semester.

For each of around 40 topics in Introduction to Solid-State Chemistry, students will have a two-week window to head to an Athena cluster to solve a uniquely generated problem within that topic. A student who gets it wrong will be able to ask a proctor for an explanation and come back the next day to try a different problem within the same subtopic.

This latest experiment in MIT’s on-campus education is Professor Michael Cima’s twist on the ‘flipped classroom,’ in which students are first exposed to topics not during lecture but on their own time.

In 3.091 this fall, students will go through sequences of video clips and simple exercises, to be largely taken from the online version of 3.091, which 40,000 students around the world signed up for in the past year on edX, the platform for massive open online courses (MOOCs) started by Harvard and MIT.

“I’m going to take advantage of the fact that we finally have a text that’s perfect for the class,” he said, referring to the edX ‘learning sequences.’

Cima hopes that these learning sequences, which break down past lectures into shorter segments, will save time for those who remember their high-school chemistry.

“Some students have a background in the Bohr atom. They don’t need to mess with that. Moles to grams? They can just skip right by that.”

Cima also hopes the learning sequences will free up time for livelier classes.

“My lectures are going to be very different. I am not going to get up to the board and go through laboriously the kind of things I was forced to before because it was not even in the text,” he said. “Now I can take more questions. I can spend more time making sure people get the big picture. I can do more demos, which I plan to do. I can spend time putting questions to people in the audience, getting a discussion going, which I personally feel I’m much better at than talking at a chalkboard.”

As for splitting up the tests into bite-sized online assessments, Cima said the switch was born of doubts that traditional exams were working.

When grading midterms, Cima said he often came across problems left blank.

“Does that mean the student didn’t know this? Or does it mean I didn’t test this? In other words, they didn’t have time. I believe it’s that they didn’t have time.”

Students will have up to several hours to do one problem this fall; the plan is for 3.091 TAs to “take over” an Athena cluster for some fraction of each week, including from 7 to 10 p.m. each night. Students will be allowed to come in at any time within that period.

Cima said that when he looked over the final, three midterms, and 10 or 12 quizzes from semesters past, he found that there were 37 problems, corresponding to around the same number of topics.

A team has been working through the summer to create a collection of problem templates for each of these topics. Simply by randomly varying the numbers, each template can spawn any number of different problems for the student at the Athena terminal.

It’s an approach that’s been tested on the students in the edX course. The edX team wrote 15 problem templates about acids and bases, and each student was assigned a random template with random numbers within bounds. “Over the entire world, no one got the same question,” Cima said, and to his knowledge, “nobody knew this was going on.”
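As a rough sketch of how such parameterized templates might work, a few lines of Python are enough to turn one template into arbitrarily many numeric variants. The template text, parameter names, and numeric bounds below are invented for illustration and are not taken from the actual course.

    import random

    # Illustrative problem templates: each has placeholder parameters with bounds.
    TEMPLATES = [
        {
            "text": "What is the pH of a {conc:.3f} M solution of a strong monoprotic acid?",
            "params": {"conc": (0.001, 0.1)},
        },
        {
            "text": "How many grams are in {moles:.2f} mol of NaCl (58.44 g/mol)?",
            "params": {"moles": (0.5, 5.0)},
        },
    ]

    def generate_problem(seed=None):
        """Pick a template and fill it with random numbers within its bounds."""
        rng = random.Random(seed)
        template = rng.choice(TEMPLATES)
        values = {name: rng.uniform(lo, hi)
                  for name, (lo, hi) in template["params"].items()}
        return template["text"].format(**values)

    # Seeding by student ID gives each student a different but reproducible variant.
    for student_id in range(3):
        print(generate_problem(seed=student_id))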

Cima said he enjoys interacting with students on edX’s discussion forum. “The students like it when the ‘professor’ responds to a question or cheers them on in some way.”

Still, his MIT students take priority. “The only reason I got involved in this is I want to do a better job in my residence class,” he said.

In an experiment in grading open-ended responses with artificial intelligence, Cima asked edX students to answer a question about surfactants in five sentences or fewer. One hundred of the responses were graded by hand and used to train the automated grader.
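The article does not say how the automated grader was built. As a hedged sketch only, one plausible shape is a simple text classifier trained on the hand-graded responses; the example answers, the 0/1 grades, and the scikit-learn model choice below are all assumptions for illustration.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Hypothetical hand-graded responses: (student answer, grade assigned by hand).
    graded_responses = [
        ("A surfactant lowers the surface tension between two phases.", 1),
        ("Surfactants have a hydrophilic head and a hydrophobic tail.", 1),
        ("It is a kind of salt that dissolves completely in water.", 0),
    ]

    # Bag-of-words features feeding a simple classifier.
    grader = make_pipeline(TfidfVectorizer(), LogisticRegression())
    grader.fit([text for text, _ in graded_responses],
               [grade for _, grade in graded_responses])

    # Score a new, ungraded response.
    new_answer = "Soap molecules reduce surface tension because of their polar heads."
    print(grader.predict([new_answer])[0])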

Though the experiment in machine learning was unsuccessful, Cima did come out with an insight that may forever change the way he assesses students.

The tests MIT students have been handing in? “All different angles — you can’t read it.”

“What I learned about [typed] responses is that they are tremendously easier to hand-grade than a hand-written response from a student,” he said. “It took me like a fraction of the time — boom, boom, boom, boom.”



7 Comments
1
G over 11 years ago

Wow, this is VERY exciting! What an interesting idea for addressing the pitfalls of modern student assessment. Time constraints gone, memorization gone... I really hope this works out, and that we'll see this in other classes soon!

-Class of '16

2
Anonymous over 11 years ago

And there go the rigor and the quality of an MIT education...

Opportunity for cheating, everywhere. Ability to catch someone, gone. Easy A's, voila. Ability to work with a real person, gone.

3
Affi over 11 years ago

Whoa, this is amazing! I would totally take a course taught this way. Having those segments of lectures available before lecture would definitely make lectures a lot more interesting: more time to get into the material and talk about it, and you know, not just learn it and leave it.

But what about retention? Maybe it would still be useful to have a final to encourage review of material so it sticks with the student when the semester is through. I've definitely felt that way about courses with no final. I hate finals, probably more than most, but I sort of see why they can be helpful...

4
blah over 11 years ago

It seems like an interesting idea, but I think they should experiment with this idea before implementing it in a real class. I definitely share commenter #2's concerns on this...

5
Anonymous over 11 years ago

Not only does this class format sound very cool, but I am fascinated by the prof's comment that he prefers typewritten responses over handwriting. One more argument against teaching handwriting in the schools.

6
Liz over 11 years ago

How will we know if this "experiment" works? It would be great if some of the students were required to take some part of a final similar to previous 3.091 finals, to see if they have learned more or less.

When TEAL was introduced to 8.01 and 8.02, pass rates rose and the "experiment" was deemed a success, but since the exams changed and became easier, we have no way of knowing whether students actually learned more.

7
Anonymous over 11 years ago

I took 8.01 just before TEAL really got going. The lecture was the "more exciting" demonstration and learning was via textbook and recitation sections. It was a disaster: lectures were useless and people were at the mercy of the teaching ability of their TA.

Yes, we should be expected to largely teach ourselves and learn in small groups, but that last step to full understanding really needs to be interactive with an expert.

I also see #2's point. This setup seems to leave a lot of potential for cheating unless the student is filmed and all other devices are confiscated.