The female roots of all computer science, vol 17: Barbara Liskov

“The Architect of Modern Algorithms” (Quanta) is a recently popular link among some computer nerds on Facebook (all of the sharers, when I last checked, identified as older white males):

Barbara Liskov pioneered the modern approach to writing code.

But by the late 1960s, advances in computing power had outpaced the abilities of programmers. Many computer scientists created programs without thought for design. They wrote long, incoherent algorithms riddled with “goto” statements — instructions for the machine to leap to a new part of the program if a certain condition is satisfied. Early coders relied on these statements to fix unforeseen consequences of their code, but they made programs hard to read, unpredictable and even dangerous.

When she was still a young professor at the Massachusetts Institute of Technology, she led the team that created the first programming language that did not rely on goto statements. The language, CLU (short for “cluster”), relied on an approach she invented — data abstraction — that organized code into modules. Every important programming language used today, including Java, C++ and C#, is a descendant of CLU.

Note that in the discredited male-authored history of computer nerdism, the modern programming language dates back at least to ALGOL 60, developed when Professor Liskov was 21 years old. The public war on goto was waged not by Liskov, but by the developers of ALGOL and Edsger W. Dijkstra, a Dutch curmudgeon, who wrote “Go To Statement Considered Harmful” in 1968, pointing out that “the remark about the undesirability of the go to statement is far from new” and describing some of the history going back to at least 1959 (criticism by Heinz Zemanek). Note that Dijkstra is also known for saying “The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offense.” Liskov, for her part, was known at MIT primarily for developing and teaching the standard software engineering class, 6.170, in which the CLU language was used by students. She was a usually modest and always hard-working person who believed that properly engineered software could function perfectly: “If you find a bug in your code, you should be as embarrassed as if you found a cockroach in your kitchen,” she memorably noted (we had a lot of cockroaches in our East Campus dorm and they were regularly visible during visits to restaurants in Central Square at the time!).

[The article also notes that Liskov is concerned about the impact of the Internet:

I’m worried about the divorced couple in which the husband publishes slander about the wife, including information about where she lives. There is terrible stuff going on.

Yet the lawsuit, the most common precursor to their divorced status, along with anything said by a party during it and the mailing address where the plaintiff wants the checks sent, was already public information, available to anyone who wanted to go down to the courthouse, decades before women developed microprocessors and TCP/IP. (see this study on Massachusetts court records, though records of litigation following out-of-wedlock sex are sealed) Reporters were covering divorce litigation in newspaper stories prior to the computer age, e.g., a November 11, 1939 piece in the NYT describing an allegation of “cruelty”, and one from December 2, 1934, “a charge of extreme cruelty won a divorce here today for Mrs. Edith Crocker Sanger from Prentice Sanger of New York City.” Divorce was apparently a good business even in the Depression. From September 24, 1931: “More than $1,000,000 was handed to Mrs. Eunice Essig Brach of Winnetka today with a divorce from her husband, Frank V. Brach, president of a candy company.” Certainly someone launching a divorce lawsuit and obtaining a profitable judgment in 2019 gets a lot less publicity than he or she would have prior to the Internet.]

Readers: What will the next edition in the “female roots of all computer science” saga be? What other fundamental technologies can be plausibly attributed to a person who identified as a “woman”? My vote: find a woman to replace William Shockley as developer of the semiconductor transistor and Silicon Valley. How can it be done? Here’s a National Public Radio story that credits Hedy Lamarr with having invented frequency hopping. Wikipedia contradicts this story to some extent and the actual patent to Ms. Lamarr and George Antheil reveals that they narrowly claimed a specific piano roll-style mechanism for controlling frequency hopping, not the broad invention of frequency hopping. So we need to find an early patent on a specific application of semiconductor transistors in which one of the inventors has a female-sounding name. Then we can discover the female roots of the modern transistor and rely on the fact that reporters won’t read the patent claims to see that they narrowly cover an application of transistors, not the transistor itself.

Also, will this article on Barbara Liskov and the promotion of the article by “allies” have the desired effect of getting more people who identify as “women” into computer nerdism? The article reveals that Barbara Liskov, despite having invented essentially all of practical programming technology, was not popularly recognized until she reached the age of 80. Moreover, she describes having to struggle as a result of her identification as a “woman” (see also a 2008 interview, in which she notes that “there were a large percentage of women” at her first programming job at MITRE in the early 1960s, at which she learned FORTRAN (several years after inventing ALGOL?) and then got a PhD working with John McCarthy, credited for now at least with the development of Lisp, and then met Dijkstra in 1969 (giving him the idea to write his 1968 screed against goto?)). Compare to Britney Spears, a top-of-the-charts success at age 17 who has never described being a cisgender female as a career handicap in her industry. Why wouldn’t a cisgender female aware of both Liskov and Spears conclude that computer science should be a last resort?


Full post, including comments

Give thanks that we don’t live in the early PC age

Happy Thanksgiving! (Or National Day of Mourning, depending on your perspective/ethnicity.)

Here’s a friend’s nostalgia shelf:

I hope that we can all agree to give thanks that we’ve moved on from this phase of personal computing!

Separately, with no Thanksgiving to slow them down, China can concentrate fully on Christmas decoration weeks earlier than Americans. “There’s Snow Place Like Shanghai Disney Resort” shirts in a city where November high temps had fallen to around 70 degrees…


Full post, including comments

Why isn’t Apple Wallet smart enough to delete old boarding passes?

Software is so smart that it will soon be driving us around, recognizing and dodging pedestrians. WIRED and similar cheerleaders for technology assure us that our software pals will also diagnose us based on medical imaging and other tests.

Apple Wallet is an app from one of the world’s leading software companies. Every time I open it there is a boarding pass from a flight that occurred a month or two ago. The software is neither smart enough to delete the pass a month after the flight (and after the location subsystem shows that the flight was actually taken) nor to pop up a “would you like to delete these old boarding passes?” offer.
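The pruning rule in question would be trivial to implement. Here is a minimal sketch in Python (the function name, the pass data model, and the 30-day threshold are all hypothetical, since Wallet’s internals are not public):

```python
from datetime import datetime, timedelta

def prune_expired_passes(passes, now, max_age_days=30):
    """Keep only passes whose flight time is within max_age_days of now.

    Each pass is modeled as a dict with a 'flight_time' datetime;
    this is a made-up stand-in for whatever Wallet stores internally.
    """
    cutoff = now - timedelta(days=max_age_days)
    return [p for p in passes if p["flight_time"] >= cutoff]

passes = [
    {"label": "two months ago", "flight_time": datetime(2019, 9, 20, 8, 0)},
    {"label": "yesterday", "flight_time": datetime(2019, 11, 27, 8, 0)},
]
kept = prune_expired_passes(passes, now=datetime(2019, 11, 28))
# only the recent pass survives; the stale boarding pass is gone
```

A real implementation would presumably also check the location history mentioned above before deleting, but the point stands: this is not a hard problem.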

If this is the best that Apple can do, how can we have confidence that the hard problems will be solved?

Full post, including comments

Why would health care look to startups for answers?

Several executives speaking at a recent 25th anniversary celebration for a health care informatics lab spoke hopefully about solutions that might be forthcoming from startups yet to be founded. This struck me as odd. A hospital is a huge enterprise (I learned that Children’s, for example, has more than $2.3 billion/year in revenue). Health care is 18 percent of U.S. GDP (compare to 4.5 percent for Singapore, where people failed to realize that they needed to take opioids 24/7 and also inhale medical marijuana).

When an industry is this big, why wouldn’t it be the biggest tech companies, e.g., IBM and Amazon, that deliver solutions? The current Epic system is clunky, but why wouldn’t the Epic research lab be the place where useful innovation happens? IBM and AT&T were slow-moving, but IBM’s Watson Research Center and AT&T’s Bell Labs were the main sources of useful innovation in their day, not a handful of engineers in a garage.

What has changed that we think it will be the startups that inherit the Earth and that fix what ails us? The availability of venture capital such that nobody capable of accomplishing anything wants to work for a straight salary anymore?

Separately, a friend who is plugged into Silicon Valley told me about the latest trend. VCs will try to fund a company around a bunch of former employees of a single big company, e.g., Xooglers (former coders on the Google plantation). This protects the VC from downside risk. No matter how lame the idea is or how poor the execution, even if the startup is a complete failure it will likely still be acquired by, e.g., Google, simply because the big company wants this set of employees back and they are known quantities. The VC fund will at least get most of its money back.

Full post, including comments

Business and software blueprint for a personal portable medical record

Good news: I can share with you a complete functional and business blueprint for how to make a personal portable electronic medical record.

Bad news: The plan is from 1994 and almost no progress has been made toward any of the goals set forth 25 years ago.

A slide for the Guardian Angel system appeared during a 25th anniversary celebration for a health care informatics lab that I played a small role in starting. From the big blueprint:

Current health information systems are built for the convenience of health care providers and consequently yield fragmented patient records in which medically relevant lifelong information is sometimes incomplete, incorrect, or inaccessible. We are constructing information systems centered on the individual patient instead of the provider, in which a set of “guardian angel” (GA) software agents integrates all health-related concerns, including medically-relevant legal and financial information, about an individual (its “subject”). This personal system will help track, manage, and interpret the subject’s health history, and offer advice to both patient and provider. Minimally, the system will maintain comprehensive, cumulative, correct, and coherent medical records, accessible in a timely manner as the subject moves through life, work assignments, and health care providers.

This would be awesome to have today and yet we are as far away from it, I think, as we were in 1994. Sobering!


Full post, including comments

Why we still need Microsoft

I wanted to save a PowerPoint presentation about a recent Northwest Passage cruise to a series of HD-resolution (1920×1080) JPEGs for direct display from a USB stick to a TV.

PowerPoint will let you do this, but only at 1280×720(!) resolution.

After a brief search, I found an official Microsoft document on this subject: “How to change the export resolution of a PowerPoint slide”.

Instead of adding a dialog box to prompt the user for the desired resolution, Microsoft took the trouble to advertise a method of doing this by editing the Windows registry, complete with cautions about how “serious problems might occur” if you make any mistakes while editing said registry.

It is kind of awe-inspiring.

(How does one accomplish this goal? The advertised procedure does work: a value of 144 DPI yields 1920×1080-pixel JPEGs from the default 13.333×7.5-inch slide. The current version of PowerPoint included with Office 365 is 16.0. See this video tutorial if you want a little more handholding.)
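For reference, the Microsoft document boils down to adding a single DWORD value under the Office key (shown here as a .reg fragment for version 16.0; adjust the version number for older installs, and heed the document’s warning to back up the registry first):

```reg
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\Office\16.0\PowerPoint\Options]
; 144 DPI x the default 13.333 x 7.5 inch slide = 1920 x 1080 pixels
; (0x90 hex = 144 decimal)
"ExportBitmapResolution"=dword:00000090
```

Restart PowerPoint after the change and File > Save As > JPEG will emit full-HD images.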

Full post, including comments

Software updates before a trip now take longer than packing?

I’m heading out on a trip that will involve limited Internet connectivity. My notebook computer hadn’t been turned on for 1.5 weeks. Updating that to the latest version of Windows 10 took four tries and roughly three hours (admittedly this was a bigger update than usual and there was also a Dell BIOS update). I’m taking a Sony a7F II camera and two lenses. Software updates were available for all three of these items. Downloading them required resetting my password on the Sony web site, verifying my Sony account, downloading three Windows applications, running three separate Windows apps, connecting and disconnecting the camera via USB to the PC, following Sony instructions to remove the battery after each lens update, etc.

Thus, I think it is fair to say that updating devices took longer than packing up for a somewhat remote trip!

What if the Internet of Things became a reality? Given the security issues around completely automatic updates, will humans eventually be reduced to full-time sysadmins just for the stuff in their apartments?

Update: arrived in Copenhagen. Here’s the hotel coffee shop electronic menu screen:

Full post, including comments

Artificial Intelligence and commercial sex purportedly intersect

For anyone who knew Marvin, “AI pioneer accused of having sex with trafficking victim on Jeffrey Epstein’s island” is a surprise.

Apparently Jeffrey Epstein was using some of the money that he stole to run Templeton Foundation-style scientific gatherings in the Caribbean (funded with money that John Templeton earned and then skipped paying taxes on by renouncing his U.S. citizenship and relocating to the Bahamas). A woman now says that, as a 17-year-old, she was paid to have sex with the then-73-year-old Marvin Minsky at one of these gatherings. (There does not seem to be any evidence that Marvin ever left the mainland U.S. to hang out with Epstein, though.)

In the 40+ years that I knew Marvin, at his office, at his home, and at conferences, he never once took notice of a young woman or commented on the appearance of a woman. He was simply not very interested in matters of the flesh.

On a more practical level, if Marvin had wanted to have sex with 17-year-olds, he could have done so legally in Massachusetts, in which the age of consent is 16. (Prostitution per se is illegal in Massachusetts, but it wouldn’t be illegal for an older person to supply a young sex partner with gifts of jewelry, housing, transportation, vacation trips, etc. (though the real money would be in a pregnancy followed by harvesting the unlimited child support cash available under Massachusetts law)) There were also quite a few graduate students who had sexual relationships with successful academics and, lo and behold, found that the path to a tenure-track professorship was wide open. There was never any hint or rumor around Marvin of a sex-for-career-advancement exchange (or any other kind of affair).

Ever since Stormy Daniels dominated the mainstream media, I guess it isn’t surprising that people whose job is having sex in exchange for money are newsworthy. But when someone claims to have been paid to have sex with a person who is now deceased, and there is no evidence to support the claim, should reporters be broadcasting these tales? This is the first one about someone that I know personally and it rings false.


Full post, including comments

USB-C more durable than USB-A?

When trying to charge a phone from public charging stations and power outlets on airliners, one thing that I’ve noticed is that the USB-A (traditional rectangular) ports tend to be “loose like wizard sleeve”. Unless one is willing to hold the connector and apply pressure, therefore, they are useless for charging. I’m not sure how they get like this. I can’t remember a USB-A connector failing mechanically on a home computer or charger. Is it just that, after 1,000 different cables have been plugged in over a one- or two-year period, the socket has stretched out to the size of the largest? Compare to at home, where I might use only three or four different cables in any given socket.

What’s the prognosis for USB-C? Are the tolerances more precise such that the public connectors will remain functional?

Full post, including comments