A recent Windows 10 Creators Update ran for roughly one hour on my desktop PC. The boot drive is an SSD, so I doubt much of the time was spent waiting on storage. I don’t think that the update process per se requires much network access, because the operating system told me that it had already downloaded the update. In any case, the computer is hard-wired to Verizon FiOS at 75 Mbps, so there can’t have been too much network delay.
The CPU is an Intel Core i7-5820K clocked at 3.3 GHz. This machine has 6 cores. If we assume that an average of 3 of the cores were busy for the entire hour, that is equivalent to all of the scientific computation done through what year?
Instructions done in one second: 3 cores × 3.3 billion cycles/second ≈ 10 billion instructions (assuming the i7 retires roughly one instruction per cycle?). Instructions done in one hour: 10 billion × 3,600 seconds ≈ 36 trillion (3.6×10^13).
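The arithmetic above can be checked in a few lines of Python. The one-instruction-per-cycle figure is a deliberately rough assumption (modern cores can retire several instructions per cycle, but much of real-world time goes to stalls and I/O):

```python
# Back-of-envelope: instructions executed during the one-hour update.
# Assumes 3 of 6 cores busy on average and ~1 instruction per cycle
# (a rough figure, as noted in the text).
cores_busy = 3
clock_hz = 3.3e9                        # 3.3 GHz
per_second = cores_busy * clock_hz      # ~10 billion instructions/second
per_hour = per_second * 3600            # ~3.6e13, i.e. 36 trillion
print(f"{per_second:.2e} instructions/second")
print(f"{per_hour:.2e} instructions in one hour")
```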
Good old days: The IBM 360/95 that was still kind of exciting when I worked on the Pioneer Venus project at NASA did 3.8 million instructions per second (Wikipedia). It would have taken 9.5 million seconds to run 36 trillion instructions on the IBM 360/95. That’s about 110 days of continuous operation. So plainly by 1968, when the IBM 360/95 was delivered to Goddard Space Flight Center, more than 36 trillion instructions had already been run worldwide.
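The 360/95 comparison works out as follows (using the rounded 36-trillion figure from above and the 3.8 MIPS rate cited from Wikipedia):

```python
# How long the IBM 360/95 (~3.8 million instructions/second) would
# need to match one hour of the i7's estimated work.
modern_per_hour = 3.6e13        # ~36 trillion instructions (rounded)
ibm_ips = 3.8e6                 # 360/95 instructions per second
seconds = modern_per_hour / ibm_ips   # ~9.5 million seconds
days = seconds / 86400                # ~110 days
print(f"{seconds:.2e} seconds, about {days:.0f} days of continuous running")
```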
So I’m going to guess that my latest Windows update took roughly as much computation as the first 11 years of modern computing (starting with EDSAC in 1949 and running through 1960). Not all of that was done for scientific purposes, though, so maybe 1949-1962 is a better estimate?
Readers: Corrections? Better ideas?