New York Times tried to teach Americans how digital computers worked…

… back in 1967: “The Electronic Digital Computer: How It Started, How It Works and What It Does”:

Many men have played pivotal roles in the evolution of technologies important to computer science. Among them are the following:

George Boole (1815-1864), a British mathematician who developed the concepts of symbolic logic.

Norbert Wiener (1894-1964) of the Massachusetts Institute of Technology, whose Cybernetics systematized concepts of control and communications.

Alan M. Turing (1912-54), a British mathematician who formulated a definition of automatic machines, worked out a mathematical model of an elementary “universal computer” and proved theoretically that it could be programmed to do any calculation that could be done by any automatic machine.

John von Neumann (1903-57), a Hungarian mathematician who came to the United States where he developed the concept of the stored program for digital computers.

Claude E. Shannon, now aged 50, of M.I.T., who defined the application of symbolic logic to electrical switching and is even better known for his basic work in information theory.

The journalists missed the Atanasoff–Berry computer (1937-1942), but it apparently didn’t become widely known until a 1967 patent lawsuit.

There is some real technical info:

The stored-program concept involves the storage of both commands and data in a dynamic memory system in which the commands as well as the data can be processed arithmetically. This gives the digital computer a high degree of flexibility that makes it distinct from Babbage’s image of the Analytical Engine.
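To make that concrete, here is a toy sketch in Python (my illustration, not the article’s machine; the instruction names are made up): a single memory array holds instructions and data side by side, so an instruction can be read, and altered arithmetically, like any other word.

    # Toy stored-program memory: instructions and data share one array.
    memory = [
        ("LOAD", 5),   # address 0: load the value at address 5
        ("ADD",  6),   # address 1: add the value at address 6
        ("HALT", 0),   # address 2: stop
        0, 0,          # addresses 3-4: unused
        2,             # address 5: data
        3,             # address 6: data
    ]

    # Because instructions live in ordinary memory, a program can compute
    # on them: here we do arithmetic on the ADD instruction's address field.
    op, addr = memory[1]
    memory[1] = (op, addr - 1)   # ADD now points at address 5 instead of 6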

Despite its size and complexity, a computer achieves its results by doing a relatively few basic things. It can add two numbers, multiply them, subtract one from the other or divide one by the other. It also can move or rearrange numbers and, among other things, compare two values and then take some pre-determined action in accordance with what it finds.

For all its transistor chips, magnetic cores, printed circuits, wires, lights and buttons, the computer must be told what to do and how. Once a properly functioning modern computer gets its instructions in the form of a properly detailed “program,” it controls itself automatically so that it responds accurately and at the right time in the step-by-step sequence required to solve a given problem.
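Those “relatively few basic things” go a long way. Here is a minimal toy interpreter (my own sketch, with invented mnemonics like ADD and JNZ) that only adds, compares, and branches, yet works through a problem in exactly that step-by-step way, summing the numbers 1 through 5:

    # A toy computer: it can add, compare a value, and take a
    # pre-determined action (a jump) depending on what it finds.
    program = [
        ("ADD",  "total", "n"),        # step 0: total = total + n
        ("ADD",  "n",     "minus1"),   # step 1: n = n - 1
        ("JNZ",  "n",     0),          # step 2: if n != 0, jump back to step 0
        ("HALT",),
    ]
    data = {"total": 0, "n": 5, "minus1": -1}

    pc = 0                              # program counter
    while True:
        inst = program[pc]
        if inst[0] == "HALT":
            break
        elif inst[0] == "ADD":
            data[inst[1]] += data[inst[2]]
            pc += 1
        elif inst[0] == "JNZ":          # compare, then act on the result
            pc = inst[2] if data[inst[1]] != 0 else pc + 1

    print(data["total"])                # -> 15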

Progress has enabled us to use more lines of JavaScript to format a web page than the pioneers needed in assembly to run a company:

Developing the software is a very expensive enterprise and frequently more troublesome than designing the actual “hardware”—the computer itself. As an illustration of what the software may involve, it is often necessary to specify 60,000 instructions or more for a large centralized inventory-control system.

Hardware:

A flip-flop is an electronic switch. It usually consists of two transistors arranged so that incoming pulses cause them to switch states alternately. One flips on when the other flops off, with the latter releasing a pulse in the process. Thus, multiple flip-flops can be connected to form a register in which binary counting is accomplished by means of pulse triggers.

Stable two-state electronic devices like the flip-flop are admirably suited for processing the 0 and 1 elements of the binary-number system. This, in fact, helps to explain why the binary-number system is commonly used in computers.
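A quick way to see the counting trick is to model it in Python (my sketch, not the article’s circuitry): each toggle stage flips on an incoming pulse and passes a carry pulse along only when it flops from 1 back to 0.

    # Toggle flip-flops chained into a 4-bit binary counter.
    class FlipFlop:
        def __init__(self):
            self.state = 0

        def pulse(self):
            self.state ^= 1          # flip on, or flop off
            return self.state == 0   # carry pulse only on the 1 -> 0 transition

    register = [FlipFlop() for _ in range(4)]   # least significant stage first

    def count_pulse(register):
        for stage in register:
            if not stage.pulse():    # no carry pulse: stop rippling
                break

    for _ in range(6):               # send six input pulses
        count_pulse(register)

    print([ff.state for ff in reversed(register)])   # -> [0, 1, 1, 0], binary for 6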

Math:

In Boolean representation, the multiplication sign (x) means AND while the plus sign (+) means OR. A bar over any symbol means NOT. An affirmative statement like A can therefore be expressed negatively as Ā (NOT A).
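The notation is easy to check mechanically. A small Python table (my example, not the article’s) shows all four input combinations:

    # The article's notation: x is AND, + is OR, a bar over a symbol is NOT.
    print(" A  B   AxB  A+B  Ā")
    for A in (0, 1):
        for B in (0, 1):
            print(f" {A}  {B}    {A & B}    {A | B}   {1 - A}")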

Hardware again:

There are three basic types of gates: the OR, which passes data when the appropriate signal is present at any of its inputs; the AND, which passes data only when the same appropriate signals are present at all inputs; and the NOT, which turns a 1-signal into a 0-signal, and vice versa.
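Those three gate types are enough to build arithmetic. As a sketch (my example, not the article’s), here is an exclusive-OR wired from AND, OR, and NOT, and then a half-adder that adds two binary digits:

    def gate_and(a, b): return a & b
    def gate_or(a, b):  return a | b
    def gate_not(a):    return 1 - a

    def xor(a, b):
        # (A OR B) AND NOT (A AND B)
        return gate_and(gate_or(a, b), gate_not(gate_and(a, b)))

    def half_adder(a, b):
        return xor(a, b), gate_and(a, b)   # (sum bit, carry bit)

    print(half_adder(1, 1))   # -> (0, 1): 1 + 1 is binary 10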

All operations in the computer take place in fixed time intervals measured by sections of a continuous train of pulses. These basic pulses are sometimes provided by timing marks on a rotating drum, but more frequently they are generated by a free-running electronic oscillator called the “clock.”

The clock-beat sets the fundamental machine rhythm and synchronizes the auxiliary generators inside the computer. In a way, the clock is like a spinning wheel of fortune to which has been affixed a tab that touches an outer ring of tabs in sequence as the wheel revolves. If a signal is present at an outer tab when the wheel tab gets there, the appropriate gate is opened.

Each time-interval represents a cycle during which the computer carries out part of its duties. One machine operation can be set up for the computer during an instruction cycle (I-time), for example, and then processed during an execution cycle (E-time).
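A rough Python sketch of that rhythm (my own; the instruction names are made up): a free-running clock alternates between I-time, when the next operation is fetched and set up, and E-time, when it is carried out.

    from itertools import cycle

    program = ["LOAD 6", "ADD 7", "STORE 8", "HALT"]
    pc, current = 0, None

    for phase in cycle(["I-time", "E-time"]):   # the free-running "clock"
        if phase == "I-time":                   # instruction cycle: fetch
            current = program[pc]
            pc += 1
            print(phase, "fetched", current)
        else:                                   # execution cycle: carry it out
            print(phase, "executing", current)
            if current == "HALT":
                break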

Click through to the scanned page at the bottom and you will see flowcharts, circuit diagrams, and an instruction set.

The public used to be interested in this stuff, apparently!

3 thoughts on “New York Times tried to teach Americans how digital computers worked…”

  1. You’re right that a technical article like this is pretty much inconceivable in today’s NY Times. The technology has become more advanced and the general public (and NY Times reporters) have become dumber and more divorced from STEM, so the gap is too wide to bridge.

    It’s interesting that the diagrams that show the details of the hardware (magnetic core memory, magnetic drum storage) are hopelessly outdated but the conceptual diagrams explaining binary math are timeless and still applicable.
