Masters Thesis Idea: Conversational Flight Planner

In working on the slides for a flight planning section of our FAA Private Pilot Ground School at MIT (videos and slides available free online), it occurred to me that none of the fancy computer tools were as convenient or efficient as talking to a competent human.

What about a system where the input is, e.g., “I’m thinking about going from Bedford to Washington, D.C. this weekend”? (This could be entered via menus; it does not have to be natural language.)

The Conversational Flight Planner responds after looking at the voluminous official FAA briefing and some of the long-term weather forecast products, such as MOS (Model Output Statistics):

There will be a strong wind from the north so you should consider paying up to fly into Dulles and land Runway 1R rather than deal with the crosswind at KGAI.

Looks like ice is possible on Sunday evening so you’ll need to depart Sunday at noon.

It will be below freezing overnight Saturday night so you need to arrange for a hangar or a preheater plug-in.
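A student could prototype the advisory core as a handful of rules over parsed forecast data before worrying about the conversational front end. Here is a minimal sketch; the forecast fields and thresholds are invented for illustration and are not taken from any actual FAA or MOS format:

```python
# Sketch of a rule-based advisory core. HourlyForecast and its fields
# are hypothetical stand-ins for whatever a MOS/briefing parser produces.
from dataclasses import dataclass

@dataclass
class HourlyForecast:
    hours_out: int        # hours from now
    wind_dir_deg: int     # wind direction, degrees true
    wind_speed_kt: int
    temp_c: float
    icing_risk: bool

def advise(forecasts: list[HourlyForecast]) -> list[str]:
    advice = []
    icy = [f for f in forecasts if f.icing_risk]
    if icy:
        advice.append(f"Ice possible in about {icy[0].hours_out} hours; "
                      "plan to depart before then.")
    if any(f.temp_c < 0 for f in forecasts):
        advice.append("Below freezing overnight; arrange a hangar "
                      "or a preheater plug-in.")
    if any(f.wind_speed_kt >= 15 and (f.wind_dir_deg <= 45 or f.wind_dir_deg >= 315)
           for f in forecasts):
        advice.append("Strong north wind; consider an airport with a "
                      "north-facing runway rather than fighting a crosswind.")
    return advice
```

The hard part of the thesis would of course be the parsing and the judgment calls, not these toy rules, but the point is that the output can be a short list of sentences rather than a 40-page briefing.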

Interesting Master’s Thesis project for a computer science or aero/astro major?

Full post, including comments

When does the great age of machine intelligence reach our desktop computers?

Turning Google Contacts into address labels for Christmas/New Year’s cards is a task that I expected to be simple. The plan was:

  • export to “Google CSV” format
  • upload to avery.com to generate a PDF for printing

This fails because the Google export process produces a CSV file with nearly 100 columns, which is too many for the Avery system to handle.

No problem, open in Microsoft Excel and cut down to about 5 columns, right?

What happens when you combine programmers from two of the world’s smartest companies? Excel is not smart enough to recognize a column of 5- or 9-digit values as ZIP codes, even if they appear right after a column of two-character state abbreviations. The leading zeros are trimmed off, turning Massachusetts ZIP codes into four-digit values, e.g., “02138” to “2138” (the ZIP code of the great minds of Harvard and Harvard Square, who will soon be tapped by President Warren to optimize our government).

What if we keep this as a Google-only process? The people who built Contacts apparently don’t talk to the people who built Sheets. There is no way to export directly from Contacts to a Google spreadsheet.

Save to the local disk and then upload, right? The behavior is exactly the same as with Excel: leading zeroes of all of the five-digit ZIP codes are trimmed off. This is the company we’re going to trust with medical diagnoses? (“The doctor will Google you now” turning into “The Google will doctor you now.”)

As with most other challenges, if you’re a skilled user of Excel the solution is straightforward: create a blank workbook and then use the Data tab to import “From Text/CSV”. Even on the fully automatic setting, it correctly infers that the ZIP column is text, not a number. But if the fully automated import works, why doesn’t simply opening the CSV file in Excel work too?
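For anyone who would rather script this than fight the GUIs, the same fix is a one-liner in, say, pandas: force every column to be read as text so nothing gets reinterpreted as a number. (The column names below are illustrative; I haven’t verified them against the current Google export.)

```python
import pandas as pd

# dtype=str stops pandas from inferring "02138" as the number 2138
contacts = pd.read_csv("contacts.csv", dtype=str)

# Keep only the handful of columns the label service needs
# (column names assumed, not verified against the Google export)
wanted = ["First Name", "Last Name", "Address 1 - Street",
          "Address 1 - City", "Address 1 - Region",
          "Address 1 - Postal Code"]
labels = contacts[wanted]

# to_csv writes the strings back verbatim, leading zeros intact
labels.to_csv("labels.csv", index=False)
```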

(The whole process ended up taking way longer than if I’d simply addressed 180 envelopes by hand, of course.)

The particular challenge of wrestling with Google Contacts or generating addressed envelopes is not that interesting, but I think it is a good starting point for a discussion of how machine learning and AI can ever be integrated back into the computer systems we use day to day. Google Translate does some impressive stuff, but why isn’t it easy to enhance Google Sheets?

Separately, the Google Contacts software has a long way to go to reach the same level of quality as what Sharp was shipping with the Wizard organizer in 1989. A contact with a single street address, once exported, will appear in a CSV-file row without any street address. Why is it difficult for Google to do what Apple, Microsoft, Motorola, Nokia, and Sharp were doing successfully in the 1990s?

Full post, including comments

Feel better about the time you’ve invested in writing documentation

During a recent rental car excursion I became curious about the USB-C port in the front of the Nissan Maxima. Could one run a laptop from the car, for example? I decided to open the glovebox and read the specs from the owner’s manual. After one year and more than 20,000 miles of rental by perhaps 100 different drivers…

(still in its shrink wrap)

Full post, including comments

Is autocorrect location-aware?

Annals of our future self-driving overlords… I was at the Burlington Mall with the kids. Apple iMessage exchange:

  • where are you?
  • “in Arhaus” autocorrected to “in Arafat’s”

The Burlington Mall has pretty good WiFi and LTE so I think that the phone should have been able to figure out where it was. Does autocorrect not even try to adjust its behavior based on location?
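It would not take much. Here is an entirely hypothetical sketch of location-biased reranking; nothing here reflects Apple’s actual autocorrect, and the candidate scores are made up:

```python
# Hypothetical sketch: bias autocorrect candidates toward the names
# of nearby businesses fetched from a points-of-interest lookup.
def rerank(candidates, nearby_poi_names, boost=5.0):
    """candidates: list of (word, language_model_score) pairs."""
    poi_words = {w.lower() for name in nearby_poi_names for w in name.split()}
    def score(pair):
        word, lm_score = pair
        base = word.lower().removesuffix("'s")   # "Arafat's" -> "arafat"
        if word.lower() in poi_words or base in poi_words:
            lm_score += boost                    # nearby store names win
        return lm_score
    return max(candidates, key=score)[0]

# At the Burlington Mall, "Arhaus" would be on the nearby-POI list:
print(rerank([("Arafat's", 1.2), ("Arhaus", 0.9)], ["Arhaus", "Apple Store"]))
# -> "Arhaus"
```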

Full post, including comments

Why wasn’t Google Glass popular for translation?

Aside from missing family and friends and finding that wearing an air pollution mask tended to fog up my glasses, one reason that I was happy to return home from China was that it was no fun being illiterate. WeChat can be used to translate a sign or menu into English, but it is somewhat cumbersome. Same deal with Google Translate, which works to turn English text into characters to show shop and restaurant personnel.

It occurred to me to wonder why systems such as Google Glass hadn’t caught on simply for the purpose of finding text in every scene and translating into the traveler’s home language. Was there simply not enough battery power to have the thing running continuously? It would have added a lot to the trip if I could have just walked around streets and museums and, without having to take any explicit action, seen English language versions of all of the surrounding writing.
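The pieces have existed for years. A crude version of the pipeline, assuming Tesseract’s Chinese model is installed and leaving the actual translation step as a caller-supplied function (no real translation API is named here):

```python
# Hypothetical pipeline sketch; not how Glass or Google Translate work.
from PIL import Image
import pytesseract  # Tesseract OCR bindings; "chi_sim" data assumed installed

def translate_scene(frame_path: str, translate) -> list[tuple[str, str]]:
    """OCR Chinese text out of one camera frame and pair each line with
    the output of a caller-supplied translate(text) function."""
    raw = pytesseract.image_to_string(Image.open(frame_path), lang="chi_sim")
    lines = [ln.strip() for ln in raw.splitlines() if ln.strip()]
    return [(ln, translate(ln)) for ln in lines]
```

Run that on a few frames per second, overlay the results, and the illiterate traveler problem is mostly solved; the open question in the post stands, i.e., whether battery and heat made continuous operation impractical.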

Full post, including comments

The female roots of all computer science, vol 17: Barbara Liskov

“The Architect of Modern Algorithms” (Quanta) is a recently popular link among some computer nerds on Facebook (all of the sharers, when I last checked, identified as older white males):

Barbara Liskov pioneered the modern approach to writing code.

But by the late 1960s, advances in computing power had outpaced the abilities of programmers. Many computer scientists created programs without thought for design. They wrote long, incoherent algorithms riddled with “goto” statements — instructions for the machine to leap to a new part of the program if a certain condition is satisfied. Early coders relied on these statements to fix unforeseen consequences of their code, but they made programs hard to read, unpredictable and even dangerous.

When she was still a young professor at the Massachusetts Institute of Technology, she led the team that created the first programming language that did not rely on goto statements. The language, CLU (short for “cluster”), relied on an approach she invented — data abstraction — that organized code into modules. Every important programming language used today, including Java, C++ and C#, is a descendant of CLU.
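For readers who didn’t take 6.170: “data abstraction” means that callers of a module see only its operations, never its internal representation. A toy illustration in Python (CLU itself looked quite different):

```python
# A CLU-style "cluster" boiled down to a Python toy: callers get insert
# and member operations; the list inside is an implementation detail.
class IntSet:
    def __init__(self):
        self._elements = []       # representation, hidden from callers

    def insert(self, x: int) -> None:
        if x not in self._elements:
            self._elements.append(x)

    def member(self, x: int) -> bool:
        return x in self._elements

# Because no caller touches _elements, the representation could later
# become a dict or a bitmap without changing any code outside the class.
```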

Note that in the discredited male-authored history of computer nerdism, the modern programming language dates back at least to ALGOL 60, developed when Professor Liskov was 21 years old. The public war on goto was waged not by Liskov, but by the developers of ALGOL and by Edsger W. Dijkstra, a Dutch curmudgeon, who wrote “Go To Statement Considered Harmful” in 1968, pointing out that “the remark about the undesirability of the go to statement is far from new” and describing some of the history since at least 1959 (criticism by Heinz Zemanek). Note that Dijkstra is also known for saying “The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offense.” Liskov, for her part, was known at MIT primarily for developing and teaching the standard software engineering class, 6.170, in which students programmed in the CLU language. She was a usually modest and always hard-working person who believed that properly engineered software could function perfectly: “If you find a bug in your code, you should be as embarrassed as if you found a cockroach in your kitchen,” she memorably noted (we had a lot of cockroaches in our East Campus dorm, and they were regularly visible during visits to restaurants in Central Square at the time!).

[The article also notes that Liskov is concerned about the impact of the Internet:

I’m worried about the divorced couple in which the husband publishes slander about the wife, including information about where she lives. There is terrible stuff going on.

Yet if one of these two sued the other, that lawsuit (the most common precursor to their divorced status), anything said by a party during it, and the mailing address where the plaintiff wants the checks sent were already public information, available to anyone who wanted to go down to the courthouse, decades before women developed microprocessors and TCP/IP. (see this study on Massachusetts court records, though records of litigation following out-of-wedlock sex are sealed) Reporters were covering divorce litigation in newspaper stories prior to the computer age, e.g., a November 11, 1939 piece in the NYT describing an allegation of “cruelty”, and one from December 2, 1934: “a charge of extreme cruelty won a divorce here today for Mrs. Edith Crocker Sanger from Prentice Sanger of New York City.” Divorce was apparently a good business even in the Depression. From September 24, 1931: “More than $1,000,000 was handed to Mrs. Eunice Essig Brach of Winnetka today with a divorce from her husband, Frank V. Brach, president of a candy company.” Certainly someone launching a divorce lawsuit and obtaining a profitable judgment in 2019 gets a lot less publicity than he or she would have prior to the Internet.]

Readers: What will the next edition in the “female roots of all computer science” saga be? What other fundamental technologies can be plausibly attributed to a person who identified as a “woman”? My vote: find a woman to replace William Shockley as developer of the semiconductor transistor and Silicon Valley. How can it be done? Here’s a National Public Radio story that credits Hedy Lamarr with having invented frequency hopping. Wikipedia contradicts this story to some extent and the actual patent to Ms. Lamarr and George Antheil reveals that they narrowly claimed a specific piano roll-style mechanism for controlling frequency hopping, not the broad invention of frequency hopping. So we need to find an early patent on a specific application of semiconductor transistors in which one of the inventors has a female-sounding name. Then we can discover the female roots of the modern transistor and rely on the fact that reporters won’t read the patent claims to see that they narrowly cover an application of transistors, not the transistor itself.

Also, will this article on Barbara Liskov and the promotion of the article by “allies” have the desired effect of getting more people who identify as “women” into computer nerdism? The article reveals that Barbara Liskov, despite having invented essentially all of practical programming technology, was not popularly recognized until she reached the age of 80. Moreover, she describes having to struggle as a result of her identification as a “woman” (see also a 2008 interview, in which she notes that “there were a large percentage of women” at her first programming job at MITRE in the early 1960s, at which she learned FORTRAN (several years after inventing ALGOL?) and then got a PhD working with John McCarthy, credited for now at least with the development of Lisp, and then met Dijkstra in 1969 (giving him the idea to write his 1968 screed against goto?)). Compare to Britney Spears, a top-of-the-charts success at age 17 who has never described being a cisgender female as a career handicap in her industry. Why wouldn’t a cisgender female aware of both Liskov and Spears conclude that computer science should be a last resort?

Full post, including comments

Give thanks that we don’t live in the early PC age

Happy Thanksgiving! (Or National Day of Mourning, depending on your perspective/ethnicity.)

Here’s a friend’s nostalgia shelf:

I hope that we can all agree to give thanks that we’ve moved on from this phase of personal computing!

Separately, with no Thanksgiving to slow them down, China can concentrate fully on Christmas decoration weeks earlier than Americans. “There’s Snow Place Like Shanghai Disney Resort” shirts in a city where November high temps had fallen to around 70 degrees…

Full post, including comments

Why isn’t Apple Wallet smart enough to delete old boarding passes?

Software is so smart that it will soon be driving us around, recognizing and dodging pedestrians. WIRED and similar cheerleaders for technology assure us that our software pals will also diagnose us based on medical imaging and other tests.

Apple Wallet is an app from one of the world’s leading software companies. Every time I open it there is a boarding pass from a flight that occurred a month or two ago. The software is neither smart enough to delete the pass a month after the flight (and after the location subsystem shows that the flight was actually taken) nor smart enough to pop up a “would you like to delete these old boarding passes?” offer.
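The heuristic the app would need is not deep. A sketch, with field names invented for illustration (this is not Apple’s PassKit API):

```python
# Hypothetical sketch; the pass objects and fields are made up.
from datetime import datetime, timedelta

def stale_boarding_passes(passes, now=None, grace_days=30):
    """Return passes safe to offer for deletion: boarding passes whose
    departure time is more than grace_days in the past."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=grace_days)
    return [p for p in passes
            if p.kind == "boarding_pass" and p.departure < cutoff]
```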

If this is the best that Apple can do, how is it that we can have confidence in the hard problems being solved?

Full post, including comments

Why would health care look to startups for answers?

Several executives speaking at a recent 25th anniversary celebration for a health care informatics lab spoke hopefully about solutions that might be forthcoming from startups yet to be founded. This struck me as odd. A hospital is a huge enterprise (I learned that Children’s, for example, has more than $2.3 billion/year in revenue). Health care is 18 percent of U.S. GDP (compare to 4.5 percent for Singapore, where people failed to realize that they needed to take opioids 24/7 and also inhale medical marijuana).

When an industry is this big, why wouldn’t it be the biggest tech companies, e.g., IBM and Amazon, that deliver solutions? The current Epic system is clunky, but why wouldn’t the Epic research lab be the place where useful innovation happens? IBM and AT&T were slow-moving, but IBM Watson and AT&T Bell Labs were the main sources of useful innovation in their day, not a handful of engineers in a garage.

What has changed that we think it will be the startups that inherit the Earth and that fix what ails us? The availability of venture capital such that nobody capable of accomplishing anything wants to work for a straight salary anymore?

Separately, a friend who is plugged into Silicon Valley told me about the latest trend. VCs will try to fund a company around a bunch of former employees of a single big company, e.g., Xooglers (former coders on the Google plantation). This protects the VC from downside risk. No matter how lame the idea is or how poor the execution, even if the startup is a complete failure it will likely still be acquired by, e.g., Google, simply because the big company wants this set of employees back and they are known quantities. The VC fund will at least get most of its money back.

Full post, including comments

Business and software blueprint for a personal portable medical record

Good news: I can share with you a complete functional and business blueprint for how to make a personal portable electronic medical record.

Bad news: The plan is from 1994 and almost no progress has been made toward any of the goals set forth 25 years ago.

A slide for the Guardian Angel system appeared during a 25th anniversary celebration for a health care informatics lab that I played a small role in starting. From the big blueprint:

Current health information systems are built for the convenience of health care providers and consequently yield fragmented patient records in which medically relevant lifelong information is sometimes incomplete, incorrect, or inaccessible. We are constructing information systems centered on the individual patient instead of the provider, in which a set of “guardian angel” (GA) software agents integrates all health-related concerns, including medically-relevant legal and financial information, about an individual (its “subject”). This personal system will help track, manage, and interpret the subject’s health history, and offer advice to both patient and provider. Minimally, the system will maintain comprehensive, cumulative, correct, and coherent medical records, accessible in a timely manner as the subject moves through life, work assignments, and health care providers.

This would be awesome to have today and yet we are as far away from it, I think, as we were in 1994. Sobering!
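The blueprint’s core idea, one cumulative, subject-centered record that providers append to rather than own, fits in a few lines of data modeling. A minimal sketch with invented field names, taken from a plain reading of the quoted paragraph rather than from the actual Guardian Angel design documents:

```python
# Toy model of a subject-centered lifelong record. All names invented.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Entry:
    when: date
    provider: str          # who recorded it; the record belongs to the subject
    category: str          # "lab", "imaging", "legal", "financial", ...
    payload: dict

@dataclass
class GuardianAngelRecord:
    subject: str
    entries: list[Entry] = field(default_factory=list)

    def add(self, entry: Entry) -> None:
        self.entries.append(entry)     # cumulative: nothing is overwritten

    def history(self, category=None) -> list[Entry]:
        hits = [e for e in self.entries if category in (None, e.category)]
        return sorted(hits, key=lambda e: e.when)
```

The technology was never the obstacle; getting providers to write into a record they don’t control is.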

Full post, including comments