ChatGPT hands out the Turing Awards

I was chatting with Jerry Sussman and he failed to evince the requisite Silicon Valley level of enthusiasm regarding LLMs. To show him how wrong he was, I gave GPT-4 the following prompt:

Should Gerald J. Sussman be given a Turing Award?

The response:

I don’t personally hold opinions, but I can provide information on the topic.

Gerald J. Sussman is a renowned computer scientist known for his significant contributions to the field. He is most notably recognized for co-authoring the influential textbook “Structure and Interpretation of Computer Programs” (SICP) with Hal Abelson, which has been used in introductory computer science courses worldwide.

In addition, Sussman has made key contributions in the areas of artificial intelligence, robotics, and programming language design. Together with Guy Steele, he co-designed the Scheme programming language, which has been fundamental in the study of programming languages and has influenced the design of many other languages.

Given his extensive contributions to the computer science discipline, a case can be made for Sussman being a potential candidate for the Turing Award. However, whether someone should receive a Turing Award is a subjective matter and typically decided upon by committees that evaluate an individual’s impact on the field of computer science. Many other factors and considerations come into play when determining awardees.

It’s worth noting that the Turing Award, which is often referred to as the “Nobel Prize of Computing,” is given to individuals who have made “lasting and major technical importance to the computer field.” Many deserving individuals in the field of computer science have not yet received the award, and there’s always an ongoing debate about who should be the next recipient.

(Note the triple dose of weasel words: “a case can be made”; “potential”; “candidate”. ChatGPT would be a great politician or university administrator! Also note the sloppy language of “Together with Guy Steele, he co-designed…”: why not just “designed”, since the sentence already includes “together with”?)

Me:

Give me some examples of these deserving individuals

GPT4 (not quoted, for clarity; text in brackets was added by me):

Many individuals have made substantial contributions to computer science but have not (as of my last training cut-off in January 2022) received the Turing Award. Some potential candidates, based on their influential work, could include:

  1. Donald Knuth – Widely recognized for his multi-volume work “The Art of Computer Programming,” which is a comprehensive monograph on computer programming. [won in 1974]
  2. Bjarne Stroustrup – The creator of the C++ programming language, which has had a significant influence on software development and system programming.
  3. Guido van Rossum – The creator of the Python programming language, which is one of the most popular and influential languages in both academia and industry.
  4. Tim Berners-Lee – While he has received a plethora of other accolades, the inventor of the World Wide Web and the first web browser hasn’t received the Turing Award (as of 2022). [won in 2016]
  5. Bill Joy – Co-founder of Sun Microsystems and a key contributor to the development of UNIX. He also wrote the Vi editor and co-authored the programming language C Shell.
  6. Grady Booch – Known for his work in software engineering and object-oriented design. He developed the Booch method of software design, which was one of the precursors to the Unified Modeling Language (UML).
  7. Alan Kay – While he has received a Turing Award [in 2003], some argue that his contributions, especially in the area of object-oriented programming and the concept of the laptop and tablet, are so profound that they warrant recognition.
  8. Rasmus Lerdorf – Known as the creator of the PHP scripting language, which has played a significant role in web development.
  9. Linus Torvalds – The main force behind the development of the Linux kernel, which is the basis for Linux operating systems. He also created Git, a distributed version control system.
  10. Philip E. Agre – Known for his foundational work in the fields of internet research and critical computing.
  11. Wendy Hall – Recognized for her pioneering work in hypermedia and web science.
  12. Brewster Kahle – The founder of the Internet Archive, one of the most extensive digital archives of the web.

It’s essential to understand that the Turing Award is just one way to recognize the profound contributions made by individuals in computer science. Many other awards and honors exist in the field, and the individuals listed above have received numerous accolades for their work. The process of deciding Turing Award recipients is complex, and many factors come into play when making such determinations.


The above list is strange for including people who’ve already won and, in the case of Alan Kay, for mentioning that he has already won.

(And why is the creator of C++ nominated for an award rather than a dungeon cell?)

Full post, including comments

Remembering Ed Fredkin

The New York Times published a thoughtful obituary for Ed Fredkin, an early MIT computer scientist.

I met Ed when I was an undergraduate at MIT (during the last Ice Age). He is quoted in the NYT as optimistic about artificial intelligence:

“It requires a combination of engineering and science, and we already have the engineering,” Fredkin said in a 1977 interview with The New York Times. “In order to produce a machine that thinks better than man, we don’t have to understand everything about man. We still don’t understand feathers, but we can fly.”

When I talked to him, circa 1980, the VAX 11/780 with 8 MB of RAM was the realistic dream computer (about $500,000). I took the position that AI research was pointless because computers would need to be 1,000 times more powerful before they could do anything resembling human intelligence. Ed thought that a VAX might have sufficient power to serve as a full-blown AI if someone discovered the secret to AI. “Computers and AI research should be licensed,” he said, “because a kid in Brazil might discover a way to build an artificial intelligence and would be able to predict the stock market and quickly become the richest and most powerful person on the planet.”

[The VAX could process approximately 1 million instructions per second and, as noted above, held 8 MB of RAM. I asked ChatGPT to compare a modern NVIDIA GPU:

For example, a GPU from the NVIDIA GeForce RTX 30 series, like the RTX 3080 released in 2020, is capable of up to 30 teraflops of computing power in single-precision operations. That is 30 trillion floating-point operations per second.

So if you were to compare a VAX 11/780’s 1 MIPS (million instructions per second) to an RTX 3080’s 30 teraflops (trillion floating-point operations per second), the modern GPU is many orders of magnitude more powerful. It’s important to remember that the types of operations and workloads are quite different, and it’s not quite as simple as comparing these numbers directly. But it gives you an idea of the vast increase in computational power over the past few decades.

Also note that GPUs and CPUs have very different architectures and are optimized for different types of tasks. A GPU is designed for high-throughput parallel processing, which is used heavily in graphics rendering and scientific computing, among other things. A CPU (like the VAX 11/780) is optimized for a wide range of tasks and typically excels in tasks requiring complex logic and low latency.

Those final qualifiers remind me a little bit of ChatGPT’s efforts to avoid direct comparisons between soccer players identifying as “men” and soccer players identifying as “women”. If we accept that an NVIDIA card is the minimum for intelligence, it looks as though Fredkin and I were both wrong. The NVIDIA card has roughly 1000X the RAM, but perhaps 1 million times the computing performance. What about NVIDIA’s DGX H100, a purpose-built AI machine selling for about the same nominal price today as the VAX 11/780? That is spec’d at 32 petaFLOPs or about 32 billion times as many operations as the old VAX.]
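For anyone who wants to check the raw ratios quoted above, here is a quick back-of-the-envelope script of my own (not ChatGPT’s); treating one VAX instruction as equivalent to one GPU floating-point operation is naive, as ChatGPT’s caveats point out, which is why my “perhaps 1 million times” guess above is more conservative than the raw numbers:

```python
# Back-of-the-envelope comparison of a 1980 VAX 11/780 with modern NVIDIA hardware.
# Figures are the ones quoted in the post; MIPS vs. FLOPS is an apples-to-oranges
# comparison, so treat the ratios as rough orders of magnitude only.

VAX_OPS_PER_SEC = 1e6              # ~1 MIPS
VAX_RAM_BYTES = 8 * 2**20          # 8 MB

RTX_3080_FLOPS = 30e12             # ~30 TFLOPS single precision (per the quote above)
RTX_3080_RAM_BYTES = 10 * 2**30    # 10 GB GDDR6X on the original RTX 3080

DGX_H100_FLOPS = 32e15             # ~32 petaFLOPS, per the post

print(f"RTX 3080 vs VAX, compute: {RTX_3080_FLOPS / VAX_OPS_PER_SEC:,.0f}x")
print(f"RTX 3080 vs VAX, memory:  {RTX_3080_RAM_BYTES / VAX_RAM_BYTES:,.0f}x")
print(f"DGX H100 vs VAX, compute: {DGX_H100_FLOPS / VAX_OPS_PER_SEC:,.0f}x")
```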

I had dropped out of high school and he out of college, so Ed used to remind me that he was one degree ahead.

“White heterosexual man flying airplane” is apparently a dog-bites-man story, so the NYT fails to mention Fredkin’s aviation activities after the Air Force. He was a champion glider pilot and, at various times, he owned at least the following aircraft: Beechcraft Baron (twin piston), Piper Malibu, Cessna Citation Jet. “The faster the plane that you own, the more hours you’ll fly every year,” he pointed out. Readers may recall that the single-engine pressurized-to-FL250 Malibu plus a letter from God promising engine reliability is my dream family airplane. Fredkin purchased one of the first Lycoming-powered Malibus, a purported solution to the engine problems experienced by owners of the earlier Continental-powered models. Fredkin’s airplane caught fire on the ferry trip from the Piper factory to Boston.

One of the things that Ed did with his planes was fly back and forth to Pittsburgh where he was an executive at a company making an early personal computer, the Three Rivers PERQ (1979).

The obit fails to mention one of Fredkin’s greatest business coups: acquiring a $100 million (in pre-pre-Biden 1982 money) TV station in Boston for less than $10 million. The FCC was stripping RKO of some licenses because it failed “to disclose that its parent, the General Tire and Rubber Company, was under investigation for foreign bribery and for illegal domestic political contributions.” (NYT 1982) Via some deft maneuvering, including bringing in a Black partner who persuaded the FCC officials appointed by Jimmy Carter that the new station would offer specialized programming for inner-city Black viewers, Fredkin managed to get the license for Channel 7. RKO demanded a substantial payment for its physical infrastructure, however, including studios and transmitters. Ed cut a deal with WGBH, the local public TV station, in which WNEV-TV, a CBS affiliate, would share facilities in exchange for a fat annual rent. Ed used this deal as leverage to negotiate a ridiculously low price with RKO. To avoid embarrassment, however, RKO asked if they could leave about $15 million in the station’s checking account and then have the purchase price be announced as $22 million (71 million Bidies adjusted via CPI) for the physical assets. The deal went through and Channel 7 never had to crowd in with WGBH.

[The Carter-appointed FCC bureaucrats felt so good about the Black-oriented programming that they’d discussed with the WNEV-TV partner that they neglected to secure any contractual commitments for this programming to be developed. Channel 7 ended up delivering conventional CBS content.]

A 1970s portrait:

A 1981 report from Fredkin and Tommaso Toffoli:

Related:

Full post, including comments

Who understands homomorphic encryption?

One of my favorite talks at an MIT computer science event last month was by Raluca Ada Popa, a proponent of keeping everything encrypted on servers even during computation. Then it doesn’t matter if the bad people (formerly “bad guys”) break in. How is that possible? Homomorphic encryption, developed about 15 years ago by Craig Gentry, plus hardware support from Intel.

According to Ada Popa, the performance hit from doing everything encrypted is only about 20 percent.

It is tough to know if this can truly protect garden-variety web services, in which the server holds most of the keys, e.g., the database password. I asked Professor ChatGPT:

Homomorphic encryption is a type of encryption that allows computations to be performed on encrypted data without decrypting it. This means that an HTTP server can process encrypted data without ever needing to know what the data actually is. However, there’s a bit of confusion in your question as you suggest the HTTP server has the decryption key.

The real power of homomorphic encryption lies in the fact that the server (like an HTTP server in your question) doesn’t need to have the decryption key at all. It’s the client who holds the key, encrypts the data, sends the encrypted data to the server, and the server performs the computations on the encrypted data. After performing operations, the server sends the encrypted result back to the client, who can then decrypt it using the decryption key.

This offers an unprecedented level of data security because even if the server was compromised, the data would remain safe since the attacker wouldn’t have access to the decryption key. It’s particularly useful in cloud computing, where sensitive data needs to be processed by third-party servers.

The decryption key should be kept safe and private, usually on the client side. If the HTTP server does hold the decryption key, it weakens the system’s security as it creates a single point of failure where both encrypted data and the decryption key can be potentially accessed by an attacker.


Could this prevent all of the credit card and mailing address breaches that we hear about? The credit card number is stored for one-click ordering, but can be decrypted only when the user is logged into an ecommerce site and is ready to enter his/her/zir/their password, which will serve as the key? Ditto for shipping address, but then that has to be transmitted to UPS or some other company, no?

Could it work for Google Drive? The big selling feature is that you can collaborate with 5 other authors if desired. How can that work if the document is encrypted with just one user’s key?

Who has thought about this and figured out whether homomorphic encryption is the silver bullet for defending practical applications?

Also from the event, the Followers of (Computer) Science stay safe in a crowded room for hours at a time by wearing masks:

Full post, including comments

The mad scramble for stuff is over? (Unifi network gear is back in stock)

Last year, I purchased a TP-Link Omada multi-point network because everything from the leading brand, Unifi, was sold out. Out of curiosity, I checked recently and everything that I would have purchased from Unifi, including a Dream Machine Pro router and their wall plate access point, was back in stock.

How is the Omada system working, you might ask? Quite well, but there are occasional failures of the upstream connectivity between the Arris cable modem that I purchased and Xfinity, and these require power cycling the Arris device to restore service. The software run by the Arris device is controlled by Xfinity, and it looks as though there hasn’t been an update for two years (see this post regarding the same issue from December 2020; the software image name is the same as what our Arris reports running).

Anyone else noticed that long-scarce items are available once more?

Exception: aviation parts, which are labor-intensive. Cirrus owners still post messages desperately seeking spares. (See Small airplanes are super expensive, but still much less useful than pre-coronapanic)

Full post, including comments

Database programmers’ dream: build the operating system on top of a DBMS

One of the most interesting talks at a recent anniversary celebration for computer science at MIT was by Michael Stonebraker, a fake MITer (he is best known for Ingres and Postgres, UC Berkeley implementations of IBM’s relational database management system concept). For the past few years, Stonebraker has been working on a database-first vision of computing:

For those who don’t want to watch the 55-minute video, the idea is to run a high-performance RDBMS underneath the operating system (OS) and have the OS use DBMS services to hold its state, support inter-process communication, to roll back to a known good state after a failure or an attack, etc.
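To make the idea concrete, here is a toy sketch of my own (not code from Stonebraker’s project) in which OS-style state, a process table plus IPC messages, lives in a transactional DBMS so that a failed or malicious operation rolls back cleanly; Python’s built-in sqlite3 stands in for the high-performance RDBMS:

```python
# Toy illustration (not the actual research prototype) of keeping OS-level state
# in a transactional DBMS so that it can be rolled back to a known good state.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE processes (pid INTEGER PRIMARY KEY, name TEXT, state TEXT)")
db.execute("CREATE TABLE messages (src INTEGER, dst INTEGER, body TEXT)")  # IPC as rows
db.commit()

# A "scheduler" updates process state transactionally...
with db:
    db.execute("INSERT INTO processes VALUES (1, 'init', 'running')")
    db.execute("INSERT INTO processes VALUES (2, 'webserver', 'running')")

# ...and a failed or malicious operation is rolled back instead of corrupting state.
try:
    with db:  # the 'with' block is one transaction: commit on success, rollback on error
        db.execute("UPDATE processes SET state='zombie'")
        raise RuntimeError("simulated crash / attack detected")
except RuntimeError:
    pass  # the UPDATE was rolled back automatically

print(db.execute("SELECT pid, name, state FROM processes").fetchall())
# [(1, 'init', 'running'), (2, 'webserver', 'running')]
```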

Here’s a figure from a 2022 progress report:

The event opened with computer science PhDs acknowledging that humanity faces an existential threat from climate change (Science says that if you can program a computer you can predict the Earth’s temperature 50 years from now). If we combine that with the observation that humans are actually increasing, not decreasing, their CO2 emissions, humanity will soon be extinct. Do we actually need to rethink our practical foundation for computing if these are our sunset years? If so, this strikes me as the most promising idea.

What about performance? Stonebraker says that the “OS-on-database” runs applications about as fast as conventional “OS-including-ad-hoc-data-management-schemes-for-all-of-its-state”.

Full post, including comments

Was Watson right about the number of computers?

“I think there is a world market for maybe five computers,” is a purported 1943 quote from Thomas Watson, IBM’s CEO just before the dawn of the von Neumann architecture on which all modern computers rely.

Suppose that he actually did say this. Is it fair to say that events in cloud computing have proven him correct? Watson was not talking about terminals (corresponding to our desktop PCs and smartphones), but mainframes, which correspond to computing clouds today. How many clouds of significance are there? AWS, Google, Azure, Alibaba, and IBM? (source) That’s five!

Full post, including comments

The Inevitable Demise of the Web

Remember to listen to the credentialed experts, such as Hal Berghel, Ph.D. computer nerd. A 1995 academic paper… “The inevitable demise of the Web”:

There is no doubt that the fastest growing part of the Internet is the World Wide Web. From its inception in 1990, the Web has established itself as the leading packet hauler on the Internet, passing beyond FTP, Telnet, WAIS, Gopher and all of the other, more established Internet client protocols. The reason for this success is that the Web has established itself as the standard unifying environment for the Internet’s digital riches. However, the days of the Web are numbered. The technology behind the Web is outdated already and may not survive the decade. The current growth rate, which some estimate at 15% per month, suggests that if the end of the Web is to come soon, it will likely be cataclysmal. If this seems unrealistic, consider that this fate befell Gopherspace. As Figure 1 shows, Gopher led the Web in packet volume as late as March, 1994. In the following twelve months Gopher presence on the Internet all but disappeared. Life cycles are accelerated to frightening paces on the Internet.

Dr. Berghel predicts that, with a little more innovation (from funded academic research?), the muscular connection-oriented Hyper-G protocol will crush HTTP and Java will replace HTML.

Full post, including comments

What to do with two old iPad Mini 2s with free T-Mobile mobile data?

We have a couple of iPad Mini 2s that were introduced in 2013, model number MF575LL/A (64 GB and T-Mobile LTE). If memory serves, these came with a free lifetime low-speed T-Mobile connection (though right now it seems not to be working; maybe it needs to be reactivated?). According to the various “sell my stuff” web sites, these have no commercial value ($729 back in 2013, which purportedly corresponds to 947 Bidies). But they’re in great cosmetic condition and the batteries still last at least a few hours, so I’m reluctant to throw them out.

They can’t run the latest iOS, but most major apps work fine on iOS 12.5.

What is a useful application of such obsolete hardware, with particular attention to the mobile data connection? Thanks in advance for any ideas! (“idea” can include “give away to X”)

What if the idea is “throw out”? Here’s Apple’s environmental report from September 2015:

Related:

Full post, including comments

60 years of parallel processing

Based on searching the ACM journal database, the dream of parallel computing seems to date back roughly 60 years. Some folks at Westinghouse in 1962 imagined partial differential equations being solved, satellites being tracked, and other serious problems being attacked.

By 1970, multiple processors were envisioned working together to serve multiple users on a time-sharing system (“A processor allocation method for time-sharing”; Mullery and Driscoll (IBM), in CACM). In 1971, a 4-core machine was set up for keeping U.S. Navy ships out of bad weather: “4-way parallel processor partition of an atmospheric primitive-equation prediction model” (Morenoff, et al.).

What about today? A friend recently spent $3,000 on a CPU. Why did he need 128 threads? “I like to keep a lot of browser windows open and each one is running JavaScript, continuously sending back data to advertisers, etc.”

The benchmark nerds say that you don’t need to spend a lot to maximize single-thread performance:

And you also get pretty close to the best benchmark result with a 550-Bidie CPU:

But maybe the benchmarks don’t fully capture the multi-window browser support capabilities of the $6,000+ CPUs?
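For anyone who wants to see whether their own workload justifies more cores, here is a rough sketch (the busywork function is just a placeholder for a real task) that times a CPU-bound job at different worker counts using Python’s multiprocessing:

```python
# Rough sketch for measuring how a CPU-bound task scales with worker count.
# Substitute your own workload for cpu_bound() to see where the speedup flattens.
import time
from multiprocessing import Pool

def cpu_bound(n):
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    work = [2_000_000] * 64  # 64 chunks of busywork
    for workers in (1, 2, 4, 8, 16):
        start = time.perf_counter()
        with Pool(workers) as pool:
            pool.map(cpu_bound, work)
        print(f"{workers:2d} workers: {time.perf_counter() - start:.2f} s")
```

Scaling is rarely linear once memory bandwidth and coordination overhead kick in, which is consistent with the diminishing returns suggested by the benchmark results above.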

Full post, including comments

Frontiers of user interface: the FAA NOTAM system

A few weeks ago, there were news reports of Biblical rain in the Miami/Fort Lauderdale area. I checked the FAA’s web site for NOTAMs (no longer an abbreviation for the sexist “notices to airmen”) for FLL.

At first glance, using the default sort order, things looked pretty good on the morning of April 13:

There are some amendments to instrument procedures that you’ll probably not need (it’s sunny Florida!) and some signs and markings aren’t standard. If we scroll down a couple of screens, however, we find that there are some plans to maintain the runway status lighting system on April 18 and… the entire airport is closed. That was the very last NOTAM presented.

Full post, including comments