Has disk drive progress stalled?

Magnetic disk drives were supposed to get more capacious, on a per-platter basis, at a steady “Kryder rate” (40 percent per year).

Now that it is time to get a monster hard disk drive to run Windows File History and try to recover from the CrashPlan debacle, I’m trying to figure out what progress has been made since April 2015 when I purchased a 6 TB hard drive for $270. If capacity had grown at 40 percent per year, with the same number of platters and roughly the same cost, this should be a 16 TB drive for $270. One can purchase a 16 TB drive from Amazon, but it costs $580 and some of the extra capacity comes from extra platters (9 versus 5 for the WD60EFRX that I bought in 2015).
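
The compounding is easy to check; here is a quick sketch using the numbers above (6 TB in April 2015, 40 percent per year):

```python
# Quick sketch: what a steady 40 percent/year "Kryder rate" would predict,
# starting from the 6 TB drive bought in April 2015.
import math

START_TB = 6.0
RATE = 0.40  # 40 percent per year

def projected_tb(years):
    """Per-drive capacity after compounding the Kryder rate for `years` years."""
    return START_TB * (1 + RATE) ** years

print(f"doubling time: {math.log(2) / math.log(1 + RATE):.1f} years")  # ~2.1 years
for years in (2, 3, 4):
    print(f"after {years} years: {projected_tb(years):.0f} TB")  # 12, 16, 23 TB
```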

If you want to spend $270 on a 5400 RPM drive, you get 10 TB, not a huge increment over 6 TB after more than four years.

Have all of the brightest minds in storage moved to work on SSD?

Full post, including comments

What’s a good online backup service? (Crashplan can do only 10 GB per day)

I used to back up my computer with Crashplan, but the service failed after I parked some big videos from our MIT Ground School class on a secondary drive. I was able to get it started again by beefing up its RAM allocation to 8 GB (it seems to use 3-5 GB; this is why I want every computer to have 64 GB of RAM minimum!) and cutting the backup interval to once per day (an attempt to keep a newly starting backup from causing an in-progress upload to fail).

The backup is unbelievably slow. Windows says Crashplan uses 0.1 Mbps most of the time, i.e., about 1/10,000th of the provisioned Verizon FiOS 1 Gbps symmetric link. My information will be at risk of drive failure for the next 83 days (about 830 GB of stuff that Crashplan missed during its failed period).

I pinged the Crashplan folks for support. It turns out that their goal is 10 GB per day:

Looking at your recent history, I’m seeing that you’re getting above-average upload speeds to us. CrashPlan users can expect to back up about 10 GB of information per day on average if their computer is powered on and not in standby mode.

In other words, the consumer who buys a $360 laptop at Amazon with a 1 TB hard drive, fills it up with family photos and videos, and then subscribes to the service will not have a complete backup until after 3.5 months of being continuously connected (maybe not for a year if the laptop is turned on only when in use). I don’t think the consumer who captures or modifies an hour of video every day will ever get a complete backup.
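
A quick sketch of the arithmetic (the 10 GB/day figure is from the support email above; the 12 GB/day of new video is my own assumption for illustration):

```python
# Back-of-the-envelope: how long until a backup completes at CrashPlan's
# stated average of 10 GB uploaded per day?

UPLOAD_GB_PER_DAY = 10

def days_to_complete(backlog_gb, new_gb_per_day=0):
    """Days to drain the backlog; None if new data arrives faster than uploads."""
    net = UPLOAD_GB_PER_DAY - new_gb_per_day
    if net <= 0:
        return None   # the backlog never shrinks
    return backlog_gb / net

print(days_to_complete(830))        # 83.0 -> the 83 days quoted above
print(days_to_complete(1000))       # 100.0 -> over three months for a full 1 TB drive
print(days_to_complete(1000, 12))   # None -> ~1 hour/day of video (say 12 GB) never finishes
```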

[Update 5/6: Since the bandwidth used, according to Windows, is the same 24/7, I’m 99 percent sure that Crashplan is throttling to 100 kbps. The customer support emails use some careful language: “We do not apply throttling based on the size of your backup. We also do not limit upload speed based [on?] file sizes or types.”]

I started with Crashplan in 2012, according to this post on the topic:

[Update 11/15/2012: Based on the comments below, I installed CrashPlan. It is uploading 2.2 Mbps currently, maxing out the admittedly feeble Comcast cable modem upload capacity. So this makes it 22 times faster than Carbonite, throttled to 100 kbps.]

Given that the current speed is only about 1/20th of what it was back then, I wonder if Crashplan has since discovered the miracle of throttling while charging customers for “unlimited” service.

“Why I Switched to Backblaze from CrashPlan” (February 2017):

I failed to get CrashPlan to complete a single successful backup on my new machine for a full month. … After I cranked up Backblaze to the fastest possible, I was shown a transfer speed of 208.14Mbps. Remember CrashPlan? That was at 2.4mbps. So 100x the speed. But could Backblaze really do this in an actual upload? … .it only took Backblaze 18 hours to upload 641GB of data. 735 thousand files. … I’m switching over to Backblaze because of the nice interface, and because of the speed, and well, mostly because they actually can back up my computer.

How about following this guy with a switch to Backblaze? It is $60/year, half the price of Crashplan’s $120/year for a single computer, but I think Backblaze adds fees for persistent storage of older versions (6 cents per GB per year, so a 6 TB hard drive could run up a $360/year bill?). This memory usage comparison showed that Backblaze required only 1/25th as much RAM as Crashplan.
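
If that 6-cents-per-GB-per-year retention figure is right, the worry in the parenthetical works out as follows (a sketch; the drive size and rates are the numbers mentioned above):

```python
# Sketch of the version-retention worry above, assuming the quoted rate of
# $0.06 per GB per year applies to everything retained.

BACKBLAZE_BASE = 60        # $/year, basic "unlimited" backup
RETENTION_RATE = 0.06      # $/GB/year for keeping older versions (figure quoted above)

retained_gb = 6_000        # a full 6 TB drive's worth of old versions

retention_fee = retained_gb * RETENTION_RATE
print(retention_fee)                    # 360.0 -> the $360/year in the parenthetical
print(BACKBLAZE_BASE + retention_fee)   # 420.0 -> total, vs. Crashplan's flat $120
```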

Or maybe it makes sense to subscribe to both? Use Backblaze to make sure that you actually can restore your computer if it fails within one of the multi-month windows in which Crashplan is hundreds of GB behind. Use Crashplan to restore an ancient version of a file.

Full post, including comments

Black hole photo: Back to the lone genius in science?

From one of our most intelligent citizens, a salute to the scientific genius working alone:

Brian Keating’s Losing the Nobel Prize, published just last year, said that the age of a Katherine Clerk Maxwell discovering Maxwell’s Equations mostly on her own was over. The book describes a paper on the Higgs boson discovery with 6,225 co-authors.

Readers: What is the significance of this “photo” (it is all false or pseudo color since the emissions were not in the visible portion of the E-M spectrum)? What is the actual “advancement of science”? (Rare break from Trump hatred from the NY Times: article on how the experiment worked. The core article on the announcement doesn’t suggest that any in-question hypotheses were confirmed or rejected. I asked a physicist friend: “it’s not exactly the event horizon. It’s the photosphere seen on edge. With limb darkening, it appears as a torus. … eventually you might use images like this to see how general relativity plays out over time. In other words, make a movie called Event Horizon (after they see the actual horizon that is). … It confirmed part of a theoretical prediction. One that was made by a scientist other than Einstein.”)

(Separately, my Facebook friends who were energized by this example of female nerddom (a postdoc identifying as a “woman” writing software! And earning 1/8th the income of a same-age dermatologist (postdoc salary provides less after-tax spending power than obtainable by having sex with a primary care doctor in Massachusetts)) decided that they needed to add a photo of another successful female-identifying programmer. They had to reach back only half a century to find one:

In 1969 Margaret Hamilton wrote the onboard software code for Apollo 11 and coined the term “software engineering.” Now 50 years later, Dr. Katie Bouman’s algorithm enabled the connecting of telescopes around the world to take the first photo ever of a black hole. Here is a photo of Margaret Hamilton with the reams of code, and one of Dr. Bouman with the hard drives containing the 5 petabytes of data generated. Cheers to #WomeninSTEM – now imagine what we could do if they let women run the world! (Photo credit @floragraham). #blackhole #bigdata #IoT

The folks who were excited to see someone identify as female sitting at a desk typing code took their last science class in high school, would consider attending a computer science course to be a physical assault, and would flee if offered the opportunity to spend 45 minutes learning about how their smartphones work.

(There seems to be some question regarding whether Margaret Hamilton was the sole author of the big stack of assembly language code next to which she stands. In 2014, for example, the Boston Globe ran an obituary on Richard H. Battin:

Dr. Battin, who developed and led the design of the guidance, navigation, and control systems for the Apollo flights … As astronauts Neil Armstrong and Aldrin were approaching the Sea of Tranquility on that historic July 20, 1969, flight, Dr. Battin was at Mission Control in Houston with MIT Instrumentation Lab founder Charles Stark “Doc” Draper.

See also a 2016 discussion on Hacker News on the question of whether this Battin guy contributed anything significant.))

Is it safe to say that the 19th century lone genius of science is back?

Full post, including comments

Disney World shows that VR is pointless?

If you’re trying to save a few dollars, maybe a head-mounted display is a good idea. What if you don’t care about capital cost? Disney World has a lot of immersive simulators that don’t require any headgear for the park guests. They just project a virtual world on big curved screens.

What about for home use? Why not build a small room in a house with a curved screen that completely surrounds the player? Use whatever tricks they’re using at Disney to make the projection work, but with $100 LCD projectors instead of the super bright ones needed for the monster domes that hold hundreds of people simultaneously.

If you’ve got your head-mounted VR system on, you’re not going to be a great asset to the rest of the folks in an apartment or house. Why not declare that immersive gaming is an activity that happens in its own room? Maybe it costs $5,000 instead of $500 for the hardware, but people used to pay $5,000 for the then-new plasma TVs.

Readers: Would this be better or worse than the VR headsets?

Full post, including comments

What laptop for Senior Management?

Stop the Presses! My opinion has been asked for by another household member!

It is time for a new laptop for Senior Management. She is accustomed to Microsoft Windows and a 15″ screen. She does not like or want a touchscreen. She’ll be using it at home and in conference rooms at various pharma companies.

Surveying the laptop market, I’m surprised at how little improvement there has been in price or specs in the past few years. This seems like a truly stalled industry. You have to pay about $500 minimum. You get a mechanical hard drive just like the 1957 IBM RAMAC. You get 8 GB of RAM, barely enough to run a cleanly booted Windows 10 (welcome to Swap City!). How is this different from three years ago, for example?

Given that, despite a few trips back to Dell for hardware service and software reinstallation, my last laptop (Dell XPS 13) could never be made to sleep properly, I’m thinking that Dell shouldn’t be on the list of contenders.

The LG gram series seems interesting. Costco is selling one with 16 GB of RAM and a 512 GB SSD for $1300. They promise to support it for two years and take it back within 90 days if it fails the way that the Dell did. It weighs a minimal 2.4 lbs. and reviews say that the promised battery life is real (16+ hours).

Unlike Dell (and Apple?), LG does not plunge the unlucky buyer into a world of dongles. The specs include 3 legacy USB ports, one hip new USB-C port, and an HDMI output (perfect for the executive who needs to plug into a projector and doesn’t want to have to remember a dongle). Photographers will be stuck in dongle hell, however, because there is no SD card reader (only “Micro-SD”).

The LG site claims that the device has been tested for ruggedness and is stronger than its minimal weight would suggest. The only way in which this LG differs from Senior Management’s spec is in the provision of a touch screen (but she doesn’t have to use it!). And perhaps the screen resolution could be higher? But then we would say goodbye to the long battery life?

Readers: What do you think? Is there a better idea than this LG?

Full post, including comments

Virtual reality and augmented reality: the technologies of the future

Part of our Austin experience was visiting the virtual/augmented reality lab at Capital Factory. Folks there have decided that the best current VR hardware is the HTC Vive. They aren’t in love with the much-hyped Oculus, but have it available to demo.

We did a 3D drawing game, browsed around in Google Earth, and played a first-person space-themed shooting game with the Vive. With Oculus, I played Angry Birds.

The good news is that we didn’t get sick, even flying around in Google Earth. On the other hand, I would rather have just covered the walls with more TVs for a more immersive experience.

I asked the folks running the lab for their theory on why VR hasn’t caught on. They cited the cost, noting that a complete HTC Vive rig is about $600. Yet that’s nothing compared to what hardcore gamers spend.

Readers: What do you think? Is it fair to say that “VR/AR is the technology of the future, and always will be”?

Full post, including comments

Sizing a UPS for cable modem and router; market opportunity for a long duration low power UPS?

Things that our neighbors hate more than Donald Trump:

  • cell towers
  • underground power lines

Power failures are routine and, when they happen, we lose all communications capability (since a mobile phone won’t work inside the house and only barely works out in the yard).

I’m thinking it might be nice to add battery backup for our Verizon FiOS service, including the Internet connection. Then, in theory, we could at least use the landline and whichever smartphones or laptops happen to be charged.

A friend in town says that this is a fool’s errand: “when we had power failures, it turned out that the fiber switch on the street would go down.” On the other hand, this FiOS customer reports keeping Internet service through 72 power outages over a 6-year period (a great advertisement for U.S. infrastructure!).

I’m wondering how to size the UPS to run the latest ONT (corresponding to a cable modem) and VZ’s WiFi router. Verizon sells a ghetto backup battery system, just for the ONT (to run the landline for 24 hours), based on 12 D cell disposable batteries. Wikipedia says a D battery has 18 amp-hours of capacity at 1.5V, so the total of 12 would have 324 watt-hours?

If we assume that the WiFi router draws a similar amount and that both boxes will be plugged into a UPS, we therefore need a UPS with about 650 watt-hours of battery? Add another 20 percent for the efficiency losses in converting from DC up to 120V AC and back down to DC, so now we need about 800 watt-hours of battery inside the UPS to run for 24 hours?
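
The same estimate as a sketch (the 18 Ah / 1.5 V D-cell figure is from Wikipedia as noted; assuming the router draws about as much as the ONT is just a guess):

```python
# Rough UPS sizing for 24 hours of ONT + WiFi router, following the estimate above.

D_CELL_WH = 18 * 1.5             # 18 Ah at 1.5 V = 27 Wh per disposable D cell
ont_24h_wh = 12 * D_CELL_WH      # Verizon's 12-cell pack, sized for 24 hours -> 324 Wh
router_24h_wh = ont_24h_wh       # guess: the WiFi router draws about the same

load_wh = ont_24h_wh + router_24h_wh   # ~648 Wh of DC energy for 24 hours
inverter_overhead = 1.2                # ~20% lost converting DC -> 120 V AC -> DC

print(round(load_wh))                        # 648 -> "650 watt-hours"
print(round(load_wh * inverter_overhead))    # 778 -> call it ~800 Wh of UPS battery
```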

It seems to be tough to find this information. UPS vendors spec them in volt-amps or watts and then bury the battery details. Also, maybe Verizon is selling its own thing because the appropriate product does not exist in the market? To get a beefy battery one needs to invest in crazy high max VA, which is irrelevant in this application. A $200 UPS rated at 1500 VA is backed by only two feeble $20 8.5 Ah 12V batteries (204 watt-hours; less than Verizon’s 12 D cells). We bought one to try out and it supplies the ONT and router for 2.5 hours, less than half as long as expected. The higher-capacity machines seem to be marketed as “generators” (without the generator!), e.g., this 412 Wh 11 lb. box for $550.

APC makes a box with a replaceable lithium ion battery for only about $71, which they say is intended to power routers, but it stores a pathetic 41 Wh. Lithium-ion is just not a sensible way to buy watt-hours, apparently.

Readers: Is there a market opportunity here? Apparently providing even the power of 12 D cells on a trickle-out basis is crazy expensive right now. How about a device that holds 24(!) D cell batteries and, in the event of a power failure, will supply power from those batteries to a router and ONT or cable modem? A brief interruption in the power supply is acceptable. Amazon sells D cell Energizer alkaline batteries for about $1 each, delivered. Instead of buying a $500 lithium-ion battery that will be garbage after 3 years, just buy $24 of D cells every year or two.
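
On paper, a 24-cell tray covers the load estimated earlier (same 18 Ah / 1.5 V figure; the roughly 27 W combined draw is the estimate from the sizing above):

```python
# Would 24 alkaline D cells run the ~27 W ONT + router load for a day?

D_CELL_WH = 18 * 1.5        # ~27 Wh per cell (the Wikipedia figure used above)
bank_wh = 24 * D_CELL_WH    # 648 Wh in the proposed 24-cell tray
load_w = 27                 # ONT + router combined, from the sizing estimate above

print(bank_wh / load_w)     # ~24 hours of runtime, ignoring conversion losses
print(24 * 1.0)             # ~$24 to refill the tray at roughly $1 per D cell
```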

Full post, including comments

New York Times tried to teach Americans how digital computers worked…

… back in 1967: “The Electronic Digital Computer: How It Started, How It Works and What It Does”:

Many men have played pivotal roles in the evolution of technologies important to computer science. Among them are the following:

George Boole (1815-1864), a British mathematician who developed the concepts of symbolic logic.

Norbert Wiener (1894-1964) of the Massachusetts Institute of Technology, whose Cybernetics systematized concepts of control and communications.

Alan M. Turing (1912-54), a British mathematician who formulated a definition of automatic machines, worked out a mathematical model of an elementary “universal computer” and proved theoretically that it could be programmed to do any calculation that could be done by any automatic machine.

John von Neumann (1903-57), a Hungarian mathematician who came to the United States where he developed the concept of the stored program for digital computers.

Claude E. Shannon, now aged 50, of M.I.T., who defined the application of symbolic logic to electrical switching and is even better known for his basic work in information theory.

The journalists missed the Atanasoff–Berry computer (1937-1942), but it apparently didn’t become widely known until a 1967 patent lawsuit.

There is some real technical info:

The stored-program concept involves the storage of both commands and data in a dynamic memory system in which the commands as well as the data can be processed arithmetically. This gives the digital computer a high degree of flexibility that makes it distinct from Babbage’s image of the Analytical Engine.

Despite its size and complexity, a computer achieves its results by doing a relatively few basic things. It can add two numbers, multiply them, subtract one from the other or divide one by the other. It also can move or rearrange numbers and, among other things, compare two values and then take some pre-determined action in accordance with what it finds.

For all its transistor chips, magnetic cores, printed circuits, wires, lights and buttons, the computer must be told what to do and how. Once a properly functioning modern computer gets its instructions in the form of a properly detailed “program,” it controls itself automatically so that it responds accurately and at the right time in the step-by-step sequence required to solve a given problem.

Progress has enabled us to use more lines of JavaScript to format a web page than the pioneers needed in assembly code to run a company:

Developing the software is a very expensive enterprise and frequently more troublesome than designing the actual “hardware”—the computer itself. As an illustration of what the software may involve, it is often necessary to specify 60,000 instructions or more for a large centralized inventory-control system.

Hardware:

A flip-flop is an electronic switch. It usually consists of two transistors arranged so that incoming pulses cause them to switch states alternately. One flips on when the other flops off, with the latter releasing a pulse in the process. Thus, multiple flip-flops can be connected to form a register in which binary counting is accomplished by means of pulse triggers.

Stable two-state electronic devices like the flip-flop are admirably suited for processing the 0 and 1 elements of the binary-number system. This, in fact, helps to explain why the binary-number system is commonly used in computers.
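
For readers who want to see the flip-flop idea in modern terms, here is a minimal sketch (my own illustration, not from the 1967 article) of toggle flip-flops chained into a binary counter:

```python
# Minimal sketch: toggle flip-flops chained into a 4-bit binary counter.
# Each flip-flop flips on an incoming pulse; when it flips from 1 to 0 it
# "releases a pulse" that triggers the next stage, as described above.

class FlipFlop:
    def __init__(self):
        self.state = 0

    def pulse(self):
        """Toggle; return True if a carry pulse should go to the next stage."""
        self.state ^= 1
        return self.state == 0   # flipping 1 -> 0 emits a carry

register = [FlipFlop() for _ in range(4)]   # 4 stages = counts 0..15

def clock_pulse(register):
    """Feed one pulse into the lowest stage and let carries ripple upward."""
    carry = True
    for ff in register:
        if not carry:
            break
        carry = ff.pulse()

for tick in range(1, 11):
    clock_pulse(register)
    bits = "".join(str(ff.state) for ff in reversed(register))
    print(tick, bits)   # e.g. 5 -> 0101, 10 -> 1010
```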

Math:

In Boolean representation, the multiplication sign (x) means AND while the plus sign (+) means OR. A bar over any symbol means NOT. An affirmative statement like A can therefore be expressed negatively as Ā (NOT A).

Hardware again:

There are three basic types of gates: the OR, which passes data when the appropriate signal is present at any of its inputs; the AND, which passes data only when the same appropriate signals are present at all inputs; and the NOT, which turns a 1-signal into a 0-signal, and vice versa.
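
In modern terms the three gates are just the Boolean operators from the previous excerpt; a tiny sketch (mine, not the article’s):

```python
# The three basic gates from the article, written as functions on 0/1 signals.
# In the Boolean notation above: AND is written A x B, OR is A + B, NOT A is A-bar.

def AND(a, b):
    return a & b      # passes a 1 only when all inputs are 1

def OR(a, b):
    return a | b      # passes a 1 when any input is 1

def NOT(a):
    return 1 - a      # turns a 1-signal into a 0-signal and vice versa

# Truth table for A x B + (NOT C), just to exercise the gates:
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            print(a, b, c, OR(AND(a, b), NOT(c)))
```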

All operations in the computer take place in fixed time intervals measured by sections of a continuous train of pulses. These basic pulses are sometimes provided by timing marks on a rotating drum, but more frequently they are generated by a free-running electronic oscillator called the “clock.”

The clock-beat sets the fundamental machine rhythm and synchronizes the auxiliary generators inside the computer. In a way, the clock is like a spinning wheel of fortune to which has been affixed a tab that touches an outer ring of tabs in sequence as the wheel revolves. If a signal is present at an outer tab when the wheel tab gets there, the appropriate gate is opened.

Each time-interval represents a cycle during which the computer carries out part of its duties. One machine operation can be set up for the computer during an instruction cycle (I-time), for example, and then processed during an execution cycle (E-time).
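
The I-time/E-time rhythm, and the stored-program idea from earlier in the article, are easy to see in a toy machine. Here is a minimal sketch of my own (the instruction names are invented for illustration):

```python
# Toy stored-program machine: instructions and data live in the same memory,
# and each step is one I-cycle (fetch/decode) followed by one E-cycle (execute).

memory = [
    ("LOAD", 8),     # 0: accumulator <- memory[8]
    ("ADD", 9),      # 1: accumulator += memory[9]
    ("STORE", 10),   # 2: memory[10] <- accumulator
    ("HALT", None),  # 3: stop
    None, None, None, None,
    2, 3,            # 8, 9: data
    0,               # 10: result goes here
]

acc = 0   # accumulator register
pc = 0    # program counter

while True:
    op, addr = memory[pc]       # I-cycle: fetch and decode the next instruction
    pc += 1
    if op == "LOAD":            # E-cycle: carry out the instruction
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[10])   # 5
```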

Click through at the bottom to the scanned page and you will see flowcharts, circuit diagrams, and an instruction set.

The public used to be interested in this stuff, apparently!

Full post, including comments

Google shows that James Damore and Econ 101 were right?

James Damore, the Google Heretic, was cast out for saying that intelligent people who identify as “women” did not enjoy staring at a screen and typing out pages of boring C and Java code (while simultaneously wearing headphones and rubbing elbows with other nerds?).

Damore suggested that the programming job be reconfigured so that it would be more appealing to people identifying as women. Instead of doing that, Google fired him for his thoughtcrime.

If Damore were correct, Econ 101 would predict that women at Google would be getting paid more than men for doing the same job. Otherwise, why would they be there doing something that was distasteful to them?

“Google Finds It’s Underpaying Many Men as It Addresses Wage Equity” (nytimes):

When Google conducted a study recently to determine whether the company was underpaying women and members of minority groups, it found, to the surprise of just about everyone, that men were paid less money than women for doing similar work.

Doesn’t this tend to show that both Damore and Econ 101 are correct?

Full post, including comments