Why didn’t coronapanic and shutdown push virtual reality over the hump?

In Virtual reality and augmented reality: the technologies of the future (March 2019) I asked

Is it fair to say that “VR/AR is the technology of the future, and always will be”?

The future arrived in March 2020, with governments around the world making it illegal to interact face to face, illegal to travel, etc. If VR were ever going to catch on, shouldn’t coronapanic and associated lockdowns have been the catalyst?

If there were complete VR experiences at most of the world’s art museums, I would buy a VR headset right now, but museum web sites don’t seem to offer more than conventional image galleries. Maybe a handful of museum experiences are available, but it is certainly nothing like the freedom that we had when the physical world (beyond South Dakota and Sweden) still included freedom.

VR could also be great for mass (virtual) gatherings. Wander around in VR and form small conversation groups (but maybe this wouldn’t be as good as Zoom because you’d have to interact with avatars, unless you wanted to see pictures of people with VR goggles attached to their heads).

Who has tried the Oculus Quest 2? One of my cousins loves this, but maybe that is because he has been locked into his house with his wife and two (mostly grown) children (i.e., perhaps coronapanic did push him into the VR fold). No cumbersome cables (and therefore limited to two hours of battery-based usage). No need to configure a PC. No privacy issues, because it is tied to Facebook, which already knows everything about you.


Full post, including comments

Windows or MacOS better for restricting teenager activity online?

As noted in Coronapanic proved Greta Thunberg right, 2020 will go down in history as the year when adults stole the most from children (a whole year of their educational and social life in hopes that a handful of (mostly very old) adults might live a few additional years).

American children are now supposed to be focused computer users all day at home in “remote school” with no supervision. Adults in this situation will generally get distracted with online shopping, online chatting with friends, social media, etc. But we have set up a system in which a teenager who fails to resist all of these temptations will lose a year of education.

First, I’m wondering why there isn’t a service in which someone in India or the Philippines will remote desktop into the child’s computer and stay there all day. The remote proctor can then shout out “Hey, get back to your school browser. TikTok will not help you get into Yale.” Let the remote proctor connect to a speaker in the corner of the room to do the shouting and call the monthly service Telescreen. Perhaps for a reduced monthly fee, the folks in India/Philippines could use conventional operating system controls, alert parents on a daily or weekly basis, block new chat sites daily, etc.

For those who want to do it all themselves, but not stand over the child/teenager every day, what operating system is best? Windows has an extensive array of controls, I think, when the parent is the Admin account and the child is a User account. Some explanations:

A friend who has a history of monitoring activity within his household (see Au pair to green card) says the following:

Windows does it perfectly. There’s a browsing and search history monitor. You can restrict by host. If his chat apps are inside the browser, you can block the host name. It knows about browsers that even you don’t know about. The parent can easily see that he is spending 4 hours a day on somechat.com and then go see for herself what it is and then block it with one click. It can all be done remotely.

(Some of the protections on web activity may work only if the browser is Microsoft’s own Edge program.)
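For the do-it-yourself parent, the crudest browser-independent block is an entry in the Windows hosts file, which works no matter which obscure browser the teenager discovers. A minimal sketch in Python, run from an Administrator shell (somechat.com is the hypothetical site from the quote above):

```python
# Null-route a hypothetical time-wasting site in the Windows hosts
# file so that every browser, not just Edge, refuses to resolve it.
# Must be run with Administrator privileges.
from pathlib import Path

HOSTS = Path(r"C:\Windows\System32\drivers\etc\hosts")
BLOCKED = ["somechat.com", "www.somechat.com"]  # hypothetical hosts

with HOSTS.open("a") as f:
    for host in BLOCKED:
        f.write(f"\n0.0.0.0 {host}")
```

(A motivated teenager will eventually discover the hosts file, of course, which is an argument for paying for the professionally maintained controls.)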

How about the Macintosh? This Macworld UK article suggests that it is easy to block categories of web sites, but not individual hosts. A third-party app, bark, seems to go deeper at $100/year.

Should we ask Professor Dr. Jill Biden, Ed.D. for advice in this area?

Finally, why isn’t there a good marketplace for American parents to hire teachers/tutors from foreign countries to sit virtually with their children in the sad parody that we call “remote school”? For a higher fee, instead of a proctor who can block time-wasting activities (such as blogging!), the teenager gets a qualified teacher to look at assignments, suggest references, etc. There are markets for language tutors, right? Why not a market for a remote private teacher for one’s kids? It could also be useful for parents whose children are “homeschooled”.

Touchscreen gloves for the child who needs to be online in the snow…

From our in-house 11-year-old artist, who is not a screen-time junkie. I wonder how much paint will be coming off with the tape that she used…

Readers: What is the technical solution? Windows, Mac, Windows+App/Service, or Mac+App/Service? And why can’t we easily pay the foreigners who might be able to help our children stay focused on their schoolwork?

Full post, including comments

Alexa and Google Home have proved that home automation is useless?

Ken Olsen, the founder of Digital Equipment Corporation, a pioneer in minicomputers, disparaged microprocessors for controlling houses back in 1977:

In 1977, referring to computers used in home automation at the dawn of the home computer era, Olsen is quoted as saying “There is no reason for any individual to have a computer in his home.” Olsen admitted to making the remark, even though he says his words were taken out of context and he was referring to computers set up to control houses, not PCs. According to Snopes.com, “the out-of-context misinterpretation of Olsen’s comments is considered much more amusing and entertaining than what he really meant, so that is the version that has been promulgated for decades now”.

We’ve had 43 years of progress since then. The functions that he said there was no reason to computerize, because touching a switch did the job, can now be uselessly accomplished with our voices (are we truly so fat and lazy that we can’t get off the sofa to flick a light switch and need to ask Alexa to activate a light?).

I’m still kind of an enthusiast for a computer-controlled home, especially if we could have electrochromic windows and skylights everywhere around the house and/or motorized shades and brise soleil. But even in technologically advanced societies, such as Korea and Taiwan, the typical component of a house continues to be dumb, right?
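For fellow enthusiasts who wonder what the non-dumb version looks like in practice, here is a minimal sketch of flicking a lamp over the LAN through a Philips Hue bridge’s local REST API. The bridge address and API key below are placeholders (a real key comes from pressing the bridge’s link button and registering):

```python
# Turn on lamp #1 via a Philips Hue bridge's local HTTP API.
# BRIDGE_IP and API_KEY are placeholders for this sketch.
import requests

BRIDGE_IP = "192.168.1.2"            # hypothetical bridge address
API_KEY = "your-registered-username" # hypothetical API key

url = f"http://{BRIDGE_IP}/api/{API_KEY}/lights/1/state"
requests.put(url, json={"on": True}, timeout=5)
```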

Bonus… a picture of Ken Olsen’s former house, past peak foliage:

For folks who believe in the magic of American real estate as an investment: the Zillow link above says that the house was sold in 2007 for $1.9 million and is now worth $2 million, 13 years later. Up 5 percent, right? (Actually 0 percent if it costs 5 percent in real estate commissions to sell.) But let’s not forget that it is attracting $28,832 per year in property tax, even before ground has been broken on the nation’s most expensive (per student) school ever constructed. The S&P 500, by contrast, was at 1,455 at the time of the sale. On October 26, 2020 it was 3,465 (up 138 percent). Instead of requiring the payment of property tax, the S&P 500 has been paying a dividend every year during this period.

What if we adjust for inflation? The house cost $2.4 million in today’s mini dollars. So it has actually lost more than 20 percent in value when you consider the broker fees that will need to be paid to unload it. (Adding insult to injury: U.S. capital gains tax does not adjust for inflation, so the unlucky owner might have to pay capital gains tax on the increase in nominal value despite the fact that there was a loss in real (inflation-adjusted) terms.)
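Readers who want to check the arithmetic can run this sketch (all figures from the paragraphs above):

```python
# Back-of-the-envelope check of the real-estate-vs-S&P numbers above.
purchase, value_now = 1.9e6, 2.0e6          # 2007 sale price, 2020 Zillow value
print(value_now / purchase - 1)             # ~0.05: up 5 percent nominal
print(value_now * 0.95 / purchase - 1)      # ~0: flat after a 5% commission

sp_2007, sp_2020 = 1455, 3465               # S&P 500 levels
print(sp_2020 / sp_2007 - 1)                # ~1.38: up 138 percent

real_basis = 2.4e6                          # 2007 price in today's dollars
print(value_now * 0.95 / real_basis - 1)    # ~-0.21: down >20 percent in real terms
```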

Full post, including comments

Zoom should treat baldness electronically?

The highest-paying American employers celebrate diversity… so long as all of the diverse individuals are between 20 and 40 years of age. If interviews and work are virtual, though, could an older person slip in via the magic of image processing software?

In Achieve college student skin color diversity via image processing? I looked at whether Zoom could help colleges achieve the rainbow of skin tones that they seek. For interviews and long-term work, why not use image processing to make an older person look reasonably young? Younger men are typically slimmer and have more hair than older men. Why not use image processing to bring the hairline back down toward the eyes and to slenderize the face, neck, and torso? For the righteous Silicon Valley employers, adjust skin tone to whatever they are seeking at the moment.
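Zoom’s actual “touch up my appearance” machinery is not public, but here is a crude sketch of the flavor of transformation, using OpenCV’s edge-preserving smoothing to soften wrinkles (file names are hypothetical):

```python
# Crude de-aging sketch: a bilateral filter smooths skin texture
# while preserving edges. Real hairline-lowering or face-slimming
# would need landmark detection and warping or generative models;
# this shows only the general idea, not Zoom's method.
import cv2

frame = cv2.imread("candidate.jpg")  # hypothetical webcam frame
smoothed = cv2.bilateralFilter(frame, d=9, sigmaColor=75, sigmaSpace=75)
cv2.imwrite("candidate_age_35.jpg", smoothed)
```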

Readers: Is there any reason to show up to a job interview as a fat bald 60-year-old? Why not show up as a slim 35-year-old with luxuriant hair, like Brendan Fraser as the Colombian drug lord in Bedazzled:

Full post, including comments

PhD Computer Nerdism in 1969

“Syllabi and qualifying examinations for the Ph. D. in computer science at Stanford University” from 1969 (uncovered while doing a prior art search on a patent case):

What are the modern equivalents? A thorough knowledge of JavaScript?

Another barn find… “The debugging system AIDS” (1970):

The object of the AIDS project has been to provide a debugging system for FORTRAN and assembly language code on the Control Data 6600 … The story of AIDS may be traced back to early 1965… AIDS, the All-purpose Interactive Debugging System, is a main program with three input files… In evaluating the results of the AIDS project, it is necessary to ask two separate questions: Is such a powerful debugging system worthwhile? and Has this implementation been successful… Thus the fundamental question is, does AIDS save the programmer time in debugging?

Full post, including comments

Robot overlords versus dog with upset tummy

We are told that we will soon be replaced by robots. An account of one of our future overlords cleaning the house…

So … one [of] my dogs had a BAD accident and then my Roomba went off at 4 am as scheduled. After cleaning up for 2 hours and tossing the disgusting Roomba, I need to replace it. It was 5 years old so I am sure there are new features out there. Any specific recs? I’m not married to the brand but I did love it. Until today.

Our manual vacuum cleaner doesn’t seem so bad after reading this.

Full post, including comments

Professor Karen prefers to stay home this fall

“Expecting Students to Play It Safe if Colleges Reopen Is a Fantasy” (nytimes) is by a 67-year-old professor whose paychecks are guaranteed to keep rolling in. What does Professor Karen (a.k.a. “Laurence Steinberg”) say?

Safety plans border on delusional and could lead to outbreaks of Covid-19 among students, faculty and staff.

Most types of risky behavior — reckless driving, criminal activity, fighting, unsafe sex and binge drinking, to name just a few — peak during the late teens and early 20s.

First, this is the age at which we are most sensitive and responsive to the potential rewards of a risky choice, relative to the potential costs. College-age people are just as good as their elders at perceiving these benefits and dangers, but compared with older people, those who are college-aged give more weight to the potential gains. They are especially drawn to short-term rewards.

This is fascinating to me! The alter cocker calls the students reckless and implies that they are not properly weighing “dangers” when, in fact, healthy 18-22-year-olds don’t face any “danger” from coronaplague! Out of 7,647 people killed in Massachusetts by/with Covid-19 through June 15, exactly 15 were under 30:

Maybe those 15 were running around the soccer field the day before coronavirus struck them down? Not likely. 98.3 percent of Covid-19 deaths in Massachusetts are among those “with underlying conditions” (dashboard example).
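A quick sanity check on what those numbers mean (figures from the dashboard cited above):

```python
# Share of Massachusetts Covid-19 deaths through June 15, 2020
# that were among people under 30, per the figures above.
deaths_total, deaths_under_30 = 7647, 15
print(deaths_under_30 / deaths_total)  # ~0.002, i.e., about 0.2 percent
```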

The old cower-in-place advocate for Zoom continues to write with the assumption that young people are taking “risks” unreasonably.

Finally, college-age people show more activation of the brain’s reward regions and are more likely to take risks when they are with their peers than when they are alone. There are no such effects of peers among people who are past their mid-20s.

Not all adolescents are risk-takers, of course, and not all adults are risk-averse. But it’s hard to think of an age during which risky behavior is more common and harder to deter than between 18 and 24, and people in this age group make up about three-fourths of full-time American undergraduates. … It’s one of those perfect storms — people who are inclined to take risks in a setting that provides ample temptation to do so.

The NYT reader will infer from this that two slender undergraduates meeting for coffee are taking roughly the same level of personal risk that Andy Green took while driving ThrustSSC at 763 mph.

My pessimistic prediction is that the college and university reopening strategies under consideration will work for a few weeks before their effectiveness fizzles out. By then, many students will have become cavalier about wearing masks and sanitizing their hands. They will ignore social distancing guidelines when they want to hug old friends they run into on the way to class. They will venture out of their “families” and begin partying in their hallways with classmates from other clusters, and soon after, with those who live on other floors, in other dorms, or off campus. They will get drunk and hang out and hook up with people they don’t know well. And infections on campus — not only among students, but among the adults who come into contact with them — will begin to increase.

I look forward to a time when we are able to return to campus and in-person teaching. But a thorough discussion of whether, when and how we reopen our colleges and universities must be informed by what developmental science has taught us about how adolescents and young adults think. As someone who is well-versed in this literature, I will ask to teach remotely for the time being.

In other words, “Science (TM) tells me to stay at home and watch the direct deposits stack up in my bank account while occasionally checking in via Zoom.”

In what other universe would a national newspaper run a plea by an old guy who wants to sit at home and do almost nothing, but still get paid at 100 percent?


Full post, including comments

What I learned about teaching computer nerdism remotely

My favorite kind of computer nerdism class is the lab class. Software development is a skill and the only way to learn it is by doing. A lecture from a successful programmer will not turn beginners into successful programmers.

In mid-March we got kicked off campus. We had been teaching successfully (at least from our self-serving point of view!) in a classroom at Harvard Medical School: three groups of three students each in the same room. By walking around we could fairly quickly see what was on everyone’s screen, help as necessary, and talk either to the entire group of 9 or to one group of 3. Groups of 3 could talk amongst themselves without disturbing the others.

Using Webex and Zoom reduced productivity by at least 70 percent. We could work with only one student’s screen at a time and essentially only one team at a time. Switching from screen to screen is a cumbersome, time-consuming, heavyweight process.

Now that we’re going to stay home for the next 20-50 years (even if we cure coronavirus, we still have influenza as our mortal enemy, right?), what would the ideal infrastructure be for teaching our brand of computer nerdism?

In addition to a personal monitor or two, the teacher needs an array of 9 monitors, each one at least as large physically as a student’s screen (teachers have older eyes than do students, typically!). This will enable the teacher to see what each student is doing and interrupt with help as necessary. We need four voice chat channels: one for each student group and one for the entire class. Each student needs two physical screens. One for himself/herself/zerself/theirself to use for editing and running SQL and R code and one as a mirror of the teacher’s screen (how else will students know which sites teachers like to visit?).
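As a sketch of the plumbing this implies (the names are illustrative; no real conferencing product’s API is being invoked):

```python
# Illustrative model of the proposed classroom: 9 students in 3
# groups, one voice channel per group plus an all-hands channel,
# and a teacher's monitor wall mirroring every student screen.
from dataclasses import dataclass, field

@dataclass
class Channel:
    name: str
    members: list = field(default_factory=list)

students = [f"student{i}" for i in range(1, 10)]
groups = [students[i:i + 3] for i in range(0, 9, 3)]

channels = [Channel(f"group{n + 1}", group + ["teacher"])
            for n, group in enumerate(groups)]
channels.append(Channel("all-hands", students + ["teacher"]))

monitor_wall = {s: f"mirror-of-{s}" for s in students}  # 9 mirrors
```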

If we had had this infrastructure, I think we could have been 80 percent as productive as we had been during our physical meetings.

Readers: What else would help for hardware and software infrastructure for teaching?


Full post, including comments

iCloud for Windows creates a single folder with 44,000 items

Trigger Warning: A First World problem.

A recent example of software engineering from the best and brightest of Silicon Valley is iCloud for Windows version 11. Want to see the picture that you just took on your phone? It will be zapped automatically to \Pictures\iCloud Photos\Photos … where it is mixed in with 44,000+ additional photos and videos that you’ve taken since 2014 (thumbnails only, which load slowly even with a 1 Gbit fiber connection).

Yes, a single flat directory holding every photo and video that you’ve ever taken, however many thousands or hundreds of thousands that may be. Even worse, the software no longer converts from Apple’s unconventional choice of HEIC to JPEG. Except that if you edit the photo on the device, e.g., because the orientation sensor got it wrong, the corrected version comes through as a JPEG. So now you’ve got a directory with a mixture of HEIC and JPEG files.

Is there any way to change this behavior? The 10.x version of iCloud would take the HEIC files captured by the phone, convert them to JPEG, and actually download them into a \Pictures\iCloud Photos\Downloads folder.
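In the meantime, a do-it-yourself workaround is to batch-convert the files oneself. A sketch using the third-party pillow-heif package (pip install pillow-heif; the folder path below is a placeholder for wherever iCloud put your photos):

```python
# Convert every HEIC in the flat iCloud folder to JPEG, leaving
# the originals in place. Requires the pillow-heif package.
from pathlib import Path
from PIL import Image
from pillow_heif import register_heif_opener

register_heif_opener()  # teaches Pillow to open .heic files

folder = Path(r"C:\Users\you\Pictures\iCloud Photos\Photos")  # adjust
for heic in folder.glob("*.heic"):
    Image.open(heic).convert("RGB").save(heic.with_suffix(".jpg"), "JPEG")
```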

Stylish Macintosh users: does it work the same way on the Mac? One enormous flat folder with every photo that you’ve ever taken?

(Maybe Apple is just leading the way into a HEIC future? Apparently not. The Apple-brand silicone case for the iPhone 11 Pro Max failed and I tried to send them a picture of the failure so they’d send me a replacement. Apple support has a web-based system for uploading “files”. If you try to upload a photo that you took with Apple’s own device, from Apple’s own browser (Safari), into Apple’s own server, it fails with no further explanation. If you try to do it from Windows, you get the same unexplained failure. If you convert the HEIC to JPEG on Windows and then upload… it works.)

Full post, including comments

Facebook pay cuts for remote employees who move to Nevada or Texas prove that the labor market is rigged?

“Zuckerberg says employees moving out of Silicon Valley may face pay cuts” (CNBC):

The company will begin allowing certain employees to work remotely full time, he said. Those employees will have to notify the company if they move to a different location by Jan. 1, 2021. As a result, those employees may have their compensation adjusted based on their new locations, Zuckerberg said.

“We’ll adjust salary to your location at that point,” said Zuckerberg, citing that this is necessary for taxes and accounting. “There’ll be severe ramifications for people who are not honest about this.”

If there is a market for productivity and accomplishment, the remote worker should be able to get paid the same regardless of location, no? Where there is a functional market, we can’t say “Oh, this is of excellent quality, but it was produced in Cambodia, so I am going to pay only half as much as I would pay for the same item, same quality, made in higher-cost China,” right?

Readers: Does the fact that Facebook can unilaterally set the price it will pay for labor depending on the cost of housing from which the labor toils show that the market for Silicon Valley labor is rigged?


  • High-Tech Employee Antitrust Litigation (Wikipedia): High-Tech Employee Antitrust Litigation is a 2010 United States Department of Justice (DOJ) antitrust action and a 2013 civil class action against several Silicon Valley companies for alleged “no cold call” agreements which restrained the recruitment of high-tech employees.
  • Hacker News thread on this post (my favorite: “Supply and demand makes sense as an explanation [for why on-site workers in different locations are paid different amounts], but it doesn’t actually explain this one. If facebook were just charging a market rate determined by supply and demand, then your salary would drop when you become remote, regardless of where you actually live, as your location has nearly no bearing on your productivity or competition for the same job. The fact that Facebook wants workers to report their location, as they cannot easily see the difference, shows their motivation cannot be driven by supply and demand.” Also good: “Salary based on an individual’s needs is quite the ‘hmmmmm’ moment. It is one of the reasons Violet Newstead — Lily Tomlin’s character in 9 to 5 — is given when she furiously demands to know why she was passed over for a fair promotion. The guy who got the job instead? Well had a wife and kids to support. He needed it more.” And quoting American academia’s favorite thinker: “No, it just proves that Marx was right about the nature of the wage/salary. The value of labour power is the cost of reproducing/maintaining that worker at a particular standard of living, not some particular fraction of the value generated at work.”)
Full post, including comments