Marvin Minsky, 1927-2016: the death of a genial skeptic

“Marvin,” as Professor Minsky was known to nearly everyone at MIT, died yesterday. I’m sad that he is gone, but happy to have spent time with him on and off since 1979.

Marvin questioned many of the assumptions of academia. He would show up to deliver a formal talk with a stack of notes and pick from them more or less at random. I wish that I could say that the results were amazing due to his dazzling intellect, but unfortunately the lack of an organized outline made these talks less than satisfying. Marvin was at his best in small groups or working one-on-one with others at MIT. What I remember most about him was his genial skepticism. If two people were arguing, rather than take one side, Marvin could show that both were operating from an assumption about the world that perhaps wasn’t necessary or true. He was a bit of a modern-day Socrates.

Marvin co-created one of the most successful university laboratories ever, the MIT Artificial Intelligence Lab (history). Though managed by others, at least for a couple of decades the style of the place was strongly influenced by Marvin. We could do anything we wanted, pretty much. We could build anything using parts from the stockroom. We could ask anyone for help at any time. Our $1000/month stipends didn’t afford a luxurious out-of-the-lab lifestyle (many trips to the “Twin Cities” strip mall McDonald’s; it took us years to realize that the “Twin Cities” were Cambridge and Somerville) but within the lab we had everything that we could have wanted. Marvin opened the lab physically and virtually to almost anyone with a sincere interest in computer nerdism. High school students came in on weekends to experiment with million-dollar mainframes, eventually becoming MIT undergraduates or founders of Boston-area software companies. As a 12-year-old I connected to the ARPAnet via a 300-baud modem in Bethesda, Maryland. I asked the administrators of the ITS mainframes at Marvin’s AI Lab if I could have an account to experiment with the Macsyma computer algebra system. Despite the fact that anyone with such an account could type a single command to crash the system, the “philg” account was created for me.

Marvin was not too interested in the dismal corners of academic computer science. He was the wrong person to talk to about operating systems, disk latency, or the relative merits of the latest programming languages and environments. Marvin was passionate about creating an artificial intelligence and was more than content to allow others, preferably in industry, to tackle the practical day-to-day computer science stuff. He expected IBM to develop System R and hoped that MIT would develop a computer that he could be friends with.

[Despite Marvin’s lack of interest in the quotidian aspects of coding, he ended up having a large indirect effect on the practical world of software. Richard Stallman wandered over to the AI Lab from Harvard and eventually founded the free software movement (not to be confused with “open source” unless you wanted a stack of AI Technical Reports thrown at your head).]

At age 62, instead of comfortably retiring in place at the lab that he’d created, Marvin jumped at the chance to be an early MIT Media Lab researcher.

Marvin was a loyal friend and family member. He would show up to a birthday party (sometimes my own!) even at the cost of a further slip to a deadline for one of his books. Marvin never used his success to try to climb the bureaucratic ladder within the university. Nor did he try to make money from his discoveries or his prominence. He supported his wife’s most questionable projects, e.g., the adoption of a fearful and periodically aggressive shelter dog (Prozac was tried; not sure if the dog read the literature regarding Prozac versus placebo).

He was a great research advisor and collaborator, as evidenced by the Turing Award of Manuel Blum, the impact of Logo (pushed mostly by Hal Abelson), the important and varied career of Gerry Sussman (Scheme, SICP, Digital Orrery), the invention of interactive computer-aided design by Ivan Sutherland, the definition of “AI” for at least a couple of decades by Patrick Winston, and the pioneering computer algebra work of Joel Moses.

Virtually everyone in American computer science has been strongly influenced by Marvin, either directly or through one of his students.

I will miss Marvin.

6 thoughts on “Marvin Minsky, 1927-2016: the death of a genial skeptic”

  1. A great scientist and a great human being. I remember loaning him a chess book and, after seeing his unbelievably extensive and disorganized home library (I am not suggesting that he didn’t know where everything was, just that no arrangement was apparent), immediately resigning myself to never seeing the book again.

    I wish he had lived long enough to see the fruits of the progress AI has been making in the last 10 years, after decades of frustratingly slow progress. We still have not reached the point where his theories about human cognition can be evaluated.

  2. Minsky always seemed to me like one of those people who are so damned smart that they exist on an entirely different plane to most of humanity. The trait you mention, of rising (or staying) above the argument instead of taking sides, only reinforces that impression. The Minsky-Sussman “AI Koan” also comes to mind.

    I just hope that his last days weren’t filled with disappointment that the singularity isn’t here yet.

  3. My favorite Minsky story: Marvin’s nephew Matt and I were videotaping his niece Hope’s wedding in the early 1980s (using a huge Panasonic consumer video camera tethered to a portable VCR). We walked around, interviewing many guests, asking for their impressions of the wedding. When we got to Marvin, he looked into the camera and, commenting on the fact that the ceremony had been jointly conducted by a rabbi and a priest, said “If there was a god, he’d be very confused today.”

  4. Not to mention his house full of corners and surprises, back when I belonged to a “computer club” hosted by one of his progeny.
