Written by Rambus Fellow Dr. David G. Stork
The recent death of MIT Professor Marvin Minsky at age 88—one of the pioneers and central figures in the field of artificial intelligence, the discipline seeking to make machines “intelligent”—brought to mind a number of memories. This is not the place for a full obituary of Minsky, but in brief: he earned degrees from Harvard and Princeton in mathematics (there was no academic computer science at the time) and spent several years in the Harvard Society of Fellows.
Image Credit: Wikipedia (Bcjordan)
He co-organized the landmark month-long workshop at Dartmouth College in 1956 where the term “artificial intelligence” was coined. His many honors included membership in the National Academy of Sciences and the National Academy of Engineering, as well as the 1969 Turing Award, generally considered the “Nobel Prize of computer science,” for his work on artificial intelligence. Minsky was a polymath: an accomplished concert pianist and an inventor whose creations included the confocal scanning microscope, an extremely important approach to imaging.
I met Minsky, briefly, as an undergraduate at MIT, and our paths crossed over the decades at conferences and on my occasional trips back to my alma mater, but we spent quite some time together around 1999, when I was working on my book, HAL’s Legacy: 2001’s Computer as Dream and Reality. The book compared the computer science visions depicted in the 1968 epic film 2001: A Space Odyssey with the actual state of computing as of the film’s namesake year.
Minsky was the scientific consultant to the film’s director, Stanley Kubrick, and its screenwriter, science fiction author Arthur C. Clarke. Kubrick asked Marvin innumerable questions about what computers might be able to do 33 years later, and especially what they might look like. More than any other director, then or even now, Kubrick sought to “get the science right.” We all remember HAL and his ability to see, speak, lip-read, and think. While computers of the time were for the most part crunching bank records or monitoring potential Russian missile trajectories, Minsky envisioned new uses for these machines and expected rapid and broad progress in making computers speak, see, reason, have common sense, and otherwise behave “intelligently.”
Minsky also appeared in my companion PBS documentary film, 2001: HAL’s Legacy. Other than Kubrick and Clarke, no one had more influence on the film’s look and its vision for computing.
Minsky graciously invited me to his home around 1999, and we spoke for hours about the development of artificial intelligence, computer vision (making computers “see,” my great interest), serious music, and of course our mutual admiration of 2001: A Space Odyssey. He told me about working with Kubrick, including the time he was on the set of 2001 in Borehamwood, UK, and was almost killed when a wrench fell from the top of the famous rotating stage set of the spaceship Discovery. In 2001, he and I and a number of computing and artificial intelligence luminaries sat on a highly publicized panel, with writer Arthur C. Clarke joining by video connection, assessing the state of artificial intelligence and the legacy of 2001: A Space Odyssey.
Like most of my colleagues in artificial intelligence and related fields such as computer vision and speech recognition, I was deeply influenced by 2001: A Space Odyssey. It wouldn’t be an exaggeration to say that 2001, and indirectly Minsky, led me to science, technology, and ultimately a career in Silicon Valley.
He will be missed.