MIT Computer Science Lab 50th Anniversary Celebration

Robert Fano emigrated from Italy in 1939, got his Sc.D. from MIT in 1947, and officially started Project MAC, the Project on Mathematics and Computation, on July 1, 1963. Daniela Rus, a young faculty star and director of the successor CSAIL organization, organized a fabulous celebration of Project MAC’s 50th anniversary last week, which turned out to be a good place to catch up with what is happening in computer science, as well as an occasion to present Professor Fano, always the nicest guy in the building and still reasonably spry at age 96, with a fancy plaque.

Project MAC was important in the development of time-sharing, the modern operating system, computer networking, computer algebra, personal computing, practical robotics, etc. (see this list). In some ways the celebration was bittersweet because computing research is now so widespread that there is no way for any current university to have the kind of impact that MIT had in the 1960s and 70s.

The most information-dense talk was given by Bill Dally, an academic who is also an accomplished practical hardware designer (he and I worked together 25 years ago on circuits to encode and decode digital information stored in analog video streams). Bill came in from his dual perch at Stanford and as Chief Scientist for NVIDIA to scare us all with “The End of Moore’s Law and the Future of Computing”. It seems that the popular conception of Moore’s Law, i.e., that computers will double in speed every couple of years, stopped being true in 2000. The law as stated still applies, but it merely gives us twice as many transistors on a chip every two years. A typical computer program cannot use those extra transistors, and the new transistors don’t switch much faster than those on older chips. So the latest and greatest multi-core system might compress video a little faster than a two-year-old machine, but a standard one-instruction-at-a-time program might not speed up significantly. Processors now spend more power moving data around than computing with it. The answer, according to Dally, is specialized hardware. This dovetails with what is happening over in the Bitcoin world, where people are running custom hardware (ASICs) rather than standard computers or graphics cards.
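To make the single-threaded point concrete, here is a minimal Amdahl’s-law sketch (my illustration, not from Dally’s slides; the parallel fractions are made-up numbers). Extra cores barely help a program that is mostly serial:

```python
# Amdahl's law: the speedup from n cores when only a fraction p of a
# program's work can run in parallel. (Illustrative numbers only.)
def amdahl_speedup(p: float, n_cores: int) -> float:
    return 1.0 / ((1.0 - p) + p / n_cores)

# A "standard one-instruction-at-a-time program" (small p) barely
# benefits, no matter how many of those extra transistors become cores.
for p in (0.05, 0.50, 0.95):
    for n in (2, 8, 64):
        print(f"parallel fraction {p:.0%}, {n:2d} cores: "
              f"{amdahl_speedup(p, n):5.2f}x speedup")
```

Even with 64 cores, a program whose work is 95 percent serial speeds up by only about 5 percent.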

The talk with the most potential to change the world of academic computer science, and far and away the best-presented, was by Charles Isbell, a machine learning researcher at Georgia Tech who is running an online Master’s in CS program that covers the same material as Georgia Tech’s in-person program, but for about $6,600 instead of $42,000. There should be 2,000 students enrolled by January 2015.

The other talk where I wanted to act as cheerleader was given by Dan Huttenlocher, who is running the new project-based graduate school in New York City: Cornell Tech. A lot of the ideas behind Cornell Tech are similar to those that I wrote about in “What’s wrong with the standard undergraduate computer science curriculum” and “Teaching Software Engineering at MIT”.

Tom Leighton, the theory professor who co-founded Akamai and is now the company’s CEO, reminded us that there isn’t nearly enough bandwidth at the core of the Internet for everyone in the world to stupefy themselves with streaming 4K video. About 25,000 Tbps would be required just to keep everyone in 1080p, while capacity near the core is closer to 25 Tbps. There might be sufficient capacity, however, in the “last mile” to folks’ houses. So what the world needs is to park all of its content on servers near the edges of the Internet. And what company do you think might have thousands of such servers? …
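Leighton’s demand figure is easy to sanity-check with back-of-the-envelope arithmetic (the per-stream bitrate and viewer count below are my assumptions, not figures from his talk):

```python
# Rough check of the core-bandwidth claim. Assumed inputs (mine):
# a 1080p stream at ~5 Mbps, with ~5 billion people watching at once.
mbps_per_stream = 5
viewers = 5_000_000_000

demand_tbps = viewers * mbps_per_stream / 1_000_000  # Mbps -> Tbps
core_capacity_tbps = 25                              # figure from the talk

print(f"demand: {demand_tbps:,.0f} Tbps")            # ~25,000 Tbps
print(f"shortfall: {demand_tbps / core_capacity_tbps:,.0f}x core capacity")
```

which lands right on the ~25,000 Tbps figure and a roughly 1,000x gap at the core.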

Marc Raibert showed crowd-pleasing videos of BigDog and other robots from the company that he founded, Boston Dynamics (acquired in December 2013 by Google). Rod Brooks and Matt Mason tried to explain why robots still weren’t useful around the house. Antonio Torralba gave the funniest talk per minute, with examples of the inadequate performance of current computer vision systems. We’re a long way from a computer system that can interpret a scene as well as some of the simplest animals.

One of my old students, Manolis Kellis, gave a great talk on computational biology. I take full credit for his success, though the course he took seemingly has no relation to biology…

Amidst the “future is so bright you’ll need to wear sunglasses” mood and the awesome technological achievements chronicled, the limits of everyday personal computing were on display. The speakers who brought Apple Macintosh laptops for projecting slides had terrible difficulties getting their computers to work at all: the laptop would freeze or crash, and/or could not be made to drive the video projector. The Windows laptops brought by lecturers functioned perfectly as slide projectors, but one speaker was reminded in the middle of her talk about a canceled lunch: a massive Microsoft Outlook pop-up appeared in front of the entire audience while she was in “presentation mode” in Microsoft PowerPoint. This was made more embarrassing by the fact that her employer is Microsoft Research. Why wasn’t the computer smart enough to notice that it was in a different location than her office and that, therefore, presentation mode very likely meant a real presentation? Given those conditions, did a notification about a canceled calendar event really need to take over the screen? A 4-year-old child is smart enough not to interrupt you when you’re taking a shower or sleeping to ask if you still want that leftover apple slice on the dining room table. Why aren’t computers?
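The check the speaker’s laptop should have made is not exotic. Here is a minimal sketch of the idea (hypothetical types and names of my own invention; nothing like this is a real Outlook or PowerPoint API):

```python
from dataclasses import dataclass

@dataclass
class Context:
    presenting: bool         # e.g., PowerPoint is in full-screen mode
    at_usual_location: bool  # e.g., Wi-Fi network differs from the office's
    urgency: int             # 0 = trivia ... 10 = building on fire

def should_interrupt(ctx: Context) -> bool:
    # Presenting away from the usual office almost certainly means a
    # real audience, so only break in for something genuinely urgent.
    if ctx.presenting and not ctx.at_usual_location:
        return ctx.urgency >= 9
    return True

# A canceled-lunch notice during a conference talk would be suppressed:
print(should_interrupt(Context(presenting=True, at_usual_location=False,
                               urgency=2)))  # False
```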

I began using Project MAC/CSAIL computer systems in 1976 via the ARPAnet and then started using the MIT Lisp Machine on campus in 1980. Thus I have about 35 years of experience following a cohort of (top) computer scientists. Here are some observations:

  • most of the people whom I can remember as tenured professors in 1980 are still occupying tenured faculty slots at MIT. I.e., if the field ever stops growing, there will be almost no academic jobs for young PhDs
  • weight = age. The fifty-something-year-olds who maintained their graduate school weight look remarkably younger than those who’ve expanded.
  • the men who have had academic/research CS careers have experienced fairly standard lives as men, e.g., with wives and kids, and with divorce rates consistent with “These Boots are Made for Walking: Why Most Divorce Filers Are Women” (Margaret Brinig and Douglas Allen; 2000; the PDF version of the paper or New York Times article) and “Child Support Guidelines: The Good, the Bad, and the Ugly” (Family Law Quarterly, 45:2, Summer 2011; PDF is available for free) … i.e., the men who’d had a successful startup and lived in a winner-take-all state were much more likely to have been sued by their wives
  • the women who have been successful in academic/research CS are much more likely to be single and childless than women in the general population (see also “Women in Science”). Given that mid-career research computer scientists generally earn between $150,000 and $200,000 per year pre-tax, this means that a PhD in CS was economically damaging to a lot of women (since it would have been more profitable to have a couple of children with, e.g., two different medical doctors, and collect child support). Of course, it is possible that they enjoy their jobs much more than they would have enjoyed having kids, but the single/childless/earning-less-than-a-child-support-plaintiff life trajectory doesn’t seem to be a universally appealing advertisement for STEM careers for women.
  • the handful of folks who identified themselves as homosexual or bisexual back in the 1980s are today generally childless
  • the best job of all seemed to be university support staff. The people who were doing admin jobs back in the 1980s and 1990s are still MIT employees. They are cheerful, well-rested, and don’t seem to be aging at all.

Ray Stata, the founder of Analog Devices and donor behind the fancy Frank Gehry-designed CS building at MIT, gave a talk about how entrepreneurship is important and how MIT had gone from barely supporting this activity to having dozens of institutionalized support programs for entrepreneurs. He didn’t justify the importance of small companies and startups, however. Why aren’t GE, 3M, and Boeing more important to the U.S. than the latest batch of social networking startups? In a world that is increasingly regulated, why wouldn’t it make more sense to tell young people to “go big or go home”? And if these MIT programs to teach entrepreneurship are so effective, how come Massachusetts was more prominent (relative to Silicon Valley) in the computing/IT startup world back before MIT made any attempt to assist people with starting companies? And finally, if entrepreneurship is so important and MIT has $billions in cash, why doesn’t MIT open a satellite campus in Silicon Valley where students can go for a semester and see what is happening first-hand?

Conclusion: It was an inspiring event and much of the self-congratulation seems earned. People who are still active today as researchers and teachers built machines that completely transformed the world, e.g., Bob Metcalfe and Tom Knight, who delivered networked personal computers with bitmapped displays. At the same time, we have a long way to go. Computers have no common sense. Robots are not helpful around the house. The SABRE system that revolutionized database management and transaction processing cost $40 million in 1960 dollars, i.e., less than half as much as what should have been the banal healthcare.gov web site.
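For scale, here is the arithmetic behind that last comparison (my assumptions, not figures from the talks: a CPI multiplier of roughly 8x from 1960 to 2014, and the roughly $800 million that was widely reported in 2014 for building healthcare.gov):

```python
# SABRE vs. healthcare.gov, in comparable dollars. Both the inflation
# multiplier and the healthcare.gov figure are rough assumptions of mine.
sabre_1960_dollars = 40e6
cpi_multiplier = 8                      # approx. 1960 -> 2014 inflation
sabre_2014_dollars = sabre_1960_dollars * cpi_multiplier  # ~$320 million

healthcare_gov = 800e6                  # one widely reported 2014 figure
print(f"SABRE in 2014 dollars: ${sabre_2014_dollars/1e6:.0f}M")
print(f"fraction of healthcare.gov: {sabre_2014_dollars/healthcare_gov:.0%}")
```

Under those assumptions SABRE comes out around 40 percent of the healthcare.gov bill, i.e., “less than half”.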

6 thoughts on “MIT Computer Science Lab 50th Anniversary Celebration”

  1. “And finally, if entrepreneurship is so important and MIT has $billions in cash, why doesn’t MIT open a satellite campus in Silicon Valley where students can go for a semester and see what is happening first-hand?”

    Or MIT could spend considerably less lobbying on Beacon Hill to bring California’s legal “special sauce” back east.

    In 1980, Route 128 was reasonably competitive with Silicon Valley, but then California adopted the “Minnesota Model” on employee IP. In each decade since, the East Coast tech industry has progressively lost ground to tech hubs in states such as California, Washington, and even North Carolina (RTP), which protect individual liberty over the ownership of one’s own thoughts.

    Related California laws about noncompete clauses and moonlighting appeared around the same time; I think the laws often include “public policy” language to make it very clear to judges that these ideas are to be taken seriously. There are even notification clauses that require employers to notify new hires about these legal rights, so that no potential entrepreneur is dissuaded from innovation by ignorance (or intimidated by being presented with unenforceable terms).

    The argument for the “Massachusetts Model” seems to be, at its core, an argument that individual freedom is an impediment to innovation and economic progress. Strange that this position would be named after and used in a state that was so central to the American Revolution.

    [This is not to be misconstrued as an argument for stealing IP which legitimately belongs to the employer.]

    [1] http://www.ieeeusa.org/members/IPandtheengineer.pdf

  2. People don’t seem to recognize the degree to which the lousy academic job market is partly a function of near-zero turnover, exacerbated by age-discrimination laws that rule out mandatory retirement.

  3. How competitive is that $6,600 Georgia Tech MS program going to be to get into? Is the same thing ever going to be possible in EE? If the natural progression of technology ever wins against bureaucracy and education finally becomes massively cheaper like everything else, won’t all jobs start requiring master’s degrees to keep up with the increased supply of grad students?

  4. Jack: One of Isbell’s points was that high quality does not mean “saying no”. The goal of Georgia Tech is to admit every person who is qualified to complete the program.

  5. Hmmm… wonder if I should point female students who are considering a PhD program to this article (particularly the part on women in STEM), or rather say something along the lines of “it may also be beneficial to find and marry an MD, become a mom, and if things turn south, at least there will be no need to worry about $$ :)”. Insightful, thanks for sharing, Phillip.

  6. Natalia: As it happens, one of my grad school friends attended as well and came pretty close to executing on your plan. She is an MD/PhD, married to a successful (by Massachusetts standards) entrepreneur, and mother to three children. She works about half time. Her life has gone so smoothly and age has had so little effect on her body that another attendee asked “Are you a student?”

    [She actually met her husband at a party at my house, which means I could take credit for nearly all of her adult life success, but I think he is getting the better deal :-)]
