The History of Clocks & Watches — Eric Bruton (1979)
We do not know who first invented the clock. Of course, the same goes for many other venerable inventions — the originators of writing, paper, movable type and cast iron are similarly obscure — but the humble clock is so familiar that it is shocking to learn how little we know about its genesis. Happily, once the dark ages of the clock are dispensed with, Eric Bruton has a lot of recorded history to work with. There are the sundials of Babylon and Egypt; the water clocks of Greece, China and the Arab world; and the hourglasses of Italy and Germany. There is the disconcerting concept of “temporal hours,” under which the 12 hours of the day lengthened in summer and shortened in winter as daylight waxed and waned. When it comes to more recent clocks, Mr. Bruton lays bare the seductive elegance of the jewels, springs, escapements and complications that make analog timepieces tick — or simply tickle their owners’ fancy. The mechanical clock may be an anachronism, but it remains a joyful and satisfying thing to contemplate.
The Universal History of Numbers — Georges Ifrah (1998)
“There must have been a time when nobody knew how to count,” opens The Universal History of Numbers. True enough, one might have thought, but wrong: It transpires that both humans and many animals have innate senses of “moreness” and “lessness.” But if Georges Ifrah gets off on the wrong foot, the rest of his book more than makes up for it, sweeping through mathematical history in compendious and engaging style. If there was or is a way to count, it is in here: fingers, toes, genitals; pebbles, sticks, knots; pen and ink, abaci, calculators; Roman numerals, Hindu-Arabic numerals, fractions, decimal numbers. It all gets a little dizzying, with coincidence layered on coincidence and fact piled upon fact. But numbers are like that — unending in scope and mind-bending in their importance — and our ways of counting have necessarily evolved to match.
Computing Before Computers — William Aspray (ed., 1990)
This prehistory of computing is a trove of details on the earliest forms of automatic computation. We encounter Blaise Pascal in the throes of a rare intellectual misfire, building a mechanical calculator of equivocal usefulness. An aging Charles Babbage fulminates against street musicians, his foul mood prompted by government indifference to his fabled analytical engine. John Atanasoff mulls the first electronic computer at a roadhouse far from home, having driven 200 miles from dry Iowa to wet Illinois for a drink. And we watch Lt. Grace Hopper (later admiral) extract a moth from a jammed relay in one of the first big American computers. “She removed the moth and taped it in the logbook, noting that she had found the ‘bug’ that was causing the problem!” William Aspray writes that every computer, large or small, is essentially the same in design, with a memory to hold bits and bytes, a unit to process them and a program to run the show. With quantum computers just around the corner, that may not hold much longer, but, like the act of programming itself, it shows that the right level of abstraction makes almost anything comprehensible.
The Mysterious Affair at Olivetti — Meryle Secrest (2019)
Meryle Secrest takes as her subject one Adriano Olivetti, the driving force behind a company whose elegant business machines once enlivened office desks across the world. By the 1950s, having transformed the family firm into a global typewriter behemoth, Olivetti turned his sights on a new product: the computer. It would not be an easy pivot. British and American firms, egged on by their respective governments, led the world in building and selling room-filling mainframes, yet Italy boasted barely a handful of computer experts and the Italian government did not seem to care much. From this unpromising start, Olivetti built a computing empire. Even after his untimely (and, some would say, not entirely natural) death, his company pushed on to create what was arguably the world’s first desktop computer. The Programma 101 was a masterpiece of design, serious and sensuous. NASA used P101s to align Apollo antennae; the U.S. Air Force used them to direct bombers in Vietnam. In the end, there is almost too much to tell here — who stole the P101 prototype days before its public debut? Was the demise of Olivetti’s chief engineer as suspicious as that of his erstwhile boss? — but Ms. Secrest’s book remains a fascinating introduction to a man who deserves to stand alongside Steve Jobs and Bill Gates.
The Man Behind the Microchip — Leslie Berlin (2005)
Robert Noyce’s résumé is staggering. As Leslie Berlin recounts in this jaunty biography, Noyce co-founded Fairchild Semiconductor, once the second-largest electronics company in America, in 1957. Three years later, he figured out how to make the fragile, fickle microchip into a robust, practical product. In 1968, he co-founded Intel. And in 1971, almost by accident, he helped birth the modern central processing unit, or CPU. Yet for all Noyce’s drive and charisma, he was only a supporting player in this final drama. With Intel strapped for cash, Noyce visited Japan to look for business. He came away with a contract to build calculator chips, but Intel didn’t have the resources to design the myriad types of chip the client wanted. What if, an engineer named Ted Hoff asked, Intel instead built one chip — a chip that could do anything and everything if given the right instructions? Noyce agreed, the client was convinced and the rest is history. The Intel 4004 was the first microprocessor, and the world would have looked very different without it.