Marvin Minsky, pioneer in artificial intelligence, dies at 88

Marvin Minsky, who combined a scientist’s thirst for knowledge with a philosopher’s quest for truth as a pioneering explorer of artificial intelligence, work that helped inspire the creation of the personal computer and the Internet, died Sunday night in Boston. He was 88.

His family said the cause was a cerebral hemorrhage.

Well before the advent of the microprocessor and the supercomputer, Minsky, a revered computer science educator at MIT, laid the foundation for the field of artificial intelligence by demonstrating the possibilities of imparting common-sense reasoning to computers.

“Marvin was one of the very few people in computing whose visions and perspectives liberated the computer from being a glorified adding machine to start to realize its destiny as one of the most powerful amplifiers for human endeavors in history,” said Alan Kay, a computer scientist and a friend and colleague of Minsky’s.

Fascinated since his undergraduate days at Harvard by the mysteries of human intelligence and thinking, Minsky saw no difference between the thinking processes of humans and those of machines. Beginning in the early 1950s, he worked on computational ideas to characterize human psychological processes and produced theories on how to endow machines with intelligence.

Minsky, in 1959, co-founded the MIT Artificial Intelligence Project (later the Artificial Intelligence Laboratory) with his colleague John McCarthy, who is credited with coining the term “artificial intelligence.”

Beyond its artificial intelligence charter, however, the lab would have a profound impact on the modern computing industry, helping to impassion a culture of computer and software design. It planted the seed for the idea that digital information should be shared freely, a notion that would shape the open-source software movement, and it was a part of the original ARPAnet, the forerunner to the Internet.

Minsky’s scientific accomplishments spanned a variety of disciplines. He designed and built some of the first visual scanners and mechanical hands with tactile sensors, advances that influenced modern robotics. In 1951 he built the first randomly wired neural network learning machine, which he called Snarc. And in 1956, while at Harvard, he invented and built the first confocal scanning microscope, an optical instrument with superior resolution and image quality still in wide use in the biological sciences.

His own intellect was wide-ranging and his interests were eclectic. While earning a degree in mathematics at Harvard he also studied music, and, an accomplished pianist, he would later delight in sitting down at a piano and improvising complex baroque fugues.

Minsky was lavished with many honors, notably the 1969 Turing Award, computer science’s highest prize.

He went on to collaborate, in the early ’70s, with Seymour Papert, the renowned educator and computer scientist, on a theory they called “The Society of Mind,” which combined insights from developmental child psychology and artificial intelligence research.

Minsky’s book “The Society of Mind,” a seminal work published in 1985, proposed “that intelligence is not the product of any singular mechanism but comes from the managed interaction of a diverse variety of resourceful agents,” as he wrote on his website.

Underlying that hypothesis was his and Papert’s belief that there is no real difference between humans and machines. Humans, they maintained, are actually machines of a kind whose brains are made up of many semiautonomous but unintelligent “agents.” And different tasks, they said, “require fundamentally different mechanisms.”

Their theory revolutionized thinking about how the brain works and how people learn.

“Marvin was one of the people who defined what computing and computing research is all about,” Kay said. “There were four or five supremely talented characters from back then who were early and comprehensive and put their personality and stamp on the field, and Marvin was among them.”

Marvin Lee Minsky was born on Aug. 9, 1927, in New York City. He was the precocious son of Dr. Henry Minsky, an eye surgeon who was chief of ophthalmology at Mount Sinai Hospital, and Fannie Reiser, a social activist and Zionist.

Fascinated by electronics and science, the young Minsky attended the Ethical Culture School in Manhattan, a progressive private school from which J. Robert Oppenheimer, who oversaw the creation of the first atomic bomb, had graduated. (Minsky later attended the affiliated Fieldston School in Riverdale.) He went on to attend the Bronx High School of Science and later Phillips Academy in Andover, Massachusetts.

After a stint in the Navy during World War II, he studied mathematics at Harvard and received a Ph.D. in math from Princeton, where he met John McCarthy, a fellow graduate student.

Intellectually restless throughout his life, Minsky sought to move on from mathematics once he had earned his doctorate. After ruling out genetics as interesting but not profound, and physics as mildly enticing, he chose to focus on intelligence itself.

“The problem of intelligence seemed hopelessly profound,” he told The New Yorker magazine when it profiled him in 1981. “I can’t remember considering anything else worth doing.”

To further those studies he reunited with McCarthy, who had been awarded a fellowship to MIT in 1956. Minsky, who was at Harvard by then, arrived at MIT in 1958, joining the staff at its Lincoln Laboratory. A year later, he and McCarthy founded MIT’s AI Project. (McCarthy left for Stanford in 1962.)

Minsky’s courses at MIT — he insisted on holding them in the evenings — became a magnet for several generations of graduate students, many of whom went on to become computer science superstars themselves.

Among them were Ray Kurzweil, the inventor and futurist; Gerald Sussman, a prominent AI researcher and professor of electrical engineering at MIT; and Patrick Winston, who went on to run the AI Lab after Minsky stepped aside.

Another of his students, Danny Hillis, an inventor and entrepreneur, co-founded Thinking Machines, a maker of supercomputers.

Hillis said he had been so taken by Minsky’s intellect and charisma that he found a way to insinuate himself into the AI Lab and get a job there. He ended up living in the Minsky family basement in Brookline, Massachusetts.

“Marvin taught me how to think,” Hillis said in an interview. “He had a style and a playful curiosity that was a huge influence on me. He always challenged you to question the status quo. He loved it when you argued with him.”

Minsky’s prominence extended well beyond MIT. While preparing to make the 1968 science-fiction epic “2001: A Space Odyssey,” director Stanley Kubrick visited him seeking to learn about the state of computer graphics and whether Minsky believed it would be plausible for computers to be able to speak articulately by 2001.

Minsky is survived by his wife, Gloria Rudisch, a physician; two daughters, Margaret and Juliana Minsky; a son, Henry; a sister, Ruth Amster; and four grandchildren.

“In some ways, he treated his children like his students,” Hillis recalled. “They called him Marvin, and he challenged them and engaged them just as he did with his students.”

In 1989, Minsky joined MIT’s fledgling Media Lab. “He was an icon who attracted the best people,” said Nicholas Negroponte, the Media Lab’s founder and former director.

For Kay, Minsky’s legacy was his insatiable curiosity. “He used to say, ‘You don’t really understand something if you only understand it one way,’” Kay said. “He never thought he had anything completely done.”

© 2016 The New York Times