The New York Times published a thoughtful obituary for Ed Fredkin, an early MIT computer scientist.
I met Ed when I was an undergraduate at MIT (during the last Ice Age). He is quoted in the NYT as optimistic about artificial intelligence:
“It requires a combination of engineering and science, and we already have the engineering,” Fredkin said in a 1977 interview with The New York Times. “In order to produce a machine that thinks better than man, we don’t have to understand everything about man. We still don’t understand feathers, but we can fly.”
When I talked to him, circa 1980, the VAX 11/780 with 8 MB of RAM was the realistic dream computer (about $500,000). I took the position that AI research was pointless because computers would need to be 1,000 times more powerful before they could do anything resembling human intelligence. Ed thought that a VAX might have sufficient power to serve as a full-blown AI if someone discovered the secret to AI. “Computers and AI research should be licensed,” he said, “because a kid in Brazil might discover a way to build an artificial intelligence and would be able to predict the stock market and quickly become the richest and most powerful person on the planet.”
[The VAX could process approximately 1 million instructions per second and, as noted above, held 8 MB of RAM. I asked ChatGPT to compare a modern NVIDIA GPU:
For example, a GPU from the NVIDIA GeForce RTX 30 series, like the RTX 3080 released in 2020, is capable of up to 30 teraflops of computing power in single-precision operations. That is 30 trillion floating-point operations per second.
So if you were to compare a VAX 11/780’s 1 MIPS (million instructions per second) to an RTX 3080’s 30 teraflops (trillion floating-point operations per second), the modern GPU is many orders of magnitude more powerful. It’s important to remember that the types of operations and workloads are quite different, and it’s not quite as simple as comparing these numbers directly. But it gives you an idea of the vast increase in computational power over the past few decades.
Also note that GPUs and CPUs have very different architectures and are optimized for different types of tasks. A GPU is designed for high-throughput parallel processing, which is used heavily in graphics rendering and scientific computing, among other things. A CPU (like the VAX 11/780) is optimized for a wide range of tasks and typically excels in tasks requiring complex logic and low latency.
Those final qualifiers remind me a little bit of ChatGPT’s efforts to avoid direct comparisons between soccer players identifying as “men” and soccer players identifying as “women”. If we accept that an NVIDIA card is the minimum for intelligence, it looks as though Fredkin and I were both wrong. The NVIDIA card has roughly 1,000X the RAM, but roughly 30 million times the raw operations per second. What about NVIDIA’s DGX H100, a purpose-built AI machine selling for about the same nominal price today as the VAX 11/780? That is spec’d at 32 petaFLOPS, or about 32 billion times as many operations per second as the old VAX.]
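The orders-of-magnitude comparison above can be checked with a few lines of arithmetic. This is a deliberately crude, apples-to-oranges sketch: MIPS counts integer instructions while FLOPS counts floating-point operations, and the RTX 3080's 10 GB of RAM is its on-card memory, not system RAM. The peak figures are the ones quoted in the post.

```python
# Rough comparison of peak operation rates and memory,
# using the figures quoted above. MIPS and FLOPS measure
# different kinds of operations, so treat the ratios as
# order-of-magnitude illustrations only.

VAX_11_780_OPS = 1e6          # ~1 MIPS (million instructions/second)
RTX_3080_FLOPS = 30e12        # ~30 TFLOPS single precision
DGX_H100_FLOPS = 32e15        # ~32 petaFLOPS (NVIDIA's headline spec)

VAX_RAM_MB = 8                # 8 MB
RTX_3080_RAM_MB = 10 * 1024   # 10 GB on-card memory

print(f"RTX 3080 vs VAX, ops/sec: {RTX_3080_FLOPS / VAX_11_780_OPS:,.0f}x")
print(f"DGX H100 vs VAX, ops/sec: {DGX_H100_FLOPS / VAX_11_780_OPS:,.0f}x")
print(f"RTX 3080 vs VAX, RAM:     {RTX_3080_RAM_MB / VAX_RAM_MB:,.0f}x")
```

Run as written, this prints roughly 30,000,000x, 32,000,000,000x, and 1,280x, matching the "30 million," "32 billion," and "roughly 1,000X" figures in the aside.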
I had dropped out of high school and he out of college, so Ed used to remind me that he was one degree ahead.
“White heterosexual man flying airplane” is apparently a dog-bites-man story, so the NYT fails to mention Fredkin’s aviation activities after the Air Force. He was a champion glider pilot and, at various times, he owned at least the following aircraft: Beechcraft Baron (twin piston), Piper Malibu, Cessna Citation Jet. “The faster the plane that you own, the more hours you’ll fly every year,” he pointed out. Readers may recall that the single-engine pressurized-to-FL250 Malibu plus a letter from God promising engine reliability is my dream family airplane. Fredkin purchased one of the first Lycoming-powered Malibus, a purported solution to the engine problems experienced by owners of the earlier Continental-powered models. Fredkin’s airplane caught fire on the ferry trip from the Piper factory to Boston.
One of the things that Ed did with his planes was fly back and forth to Pittsburgh where he was an executive at a company making an early personal computer, the Three Rivers PERQ (1979).
The obit fails to mention one of Fredkin’s greatest business coups: acquiring a $100 million (in pre-pre-Biden 1982 money) TV station in Boston for less than $10 million. The FCC was stripping RKO of some licenses because it failed “to disclose that its parent, the General Tire and Rubber Company, was under investigation for foreign bribery and for illegal domestic political contributions.” (NYT 1982) Via some deft maneuvering, including bringing in a Black partner who persuaded the FCC officials appointed by Jimmy Carter that the new station would offer specialized programming for inner-city Black viewers, Fredkin managed to get the license for Channel 7. RKO demanded a substantial payment for its physical infrastructure, however, including studios and transmitters. Ed cut a deal with WGBH, the local public TV station, in which WNEV-TV, a CBS affiliate, would share facilities in exchange for a fat annual rent. Ed used this deal as leverage to negotiate a ridiculously low price with RKO. To avoid embarrassment, however, RKO asked if they could leave about $15 million in the station’s checking account and then have the purchase price be announced as $22 million (71 million Bidies adjusted via CPI) for the physical assets. The deal went through and Channel 7 never had to crowd in with WGBH.
[The Carter-appointed FCC bureaucrats felt so good about the Black-oriented programming that they’d discussed with the WNEV-TV partner that they neglected to secure any contractual commitments for this programming to be developed. Channel 7 ended up delivering conventional CBS content.]
A 1970s portrait:
A 1981 report from Fredkin and Tommaso Toffoli: