A Brief History of Technology
Exploration of the common narrative about the history of invention, the acceleration of technological innovation, and the driving force behind unprecedented growth in the 21st century.
Contents:
I. Hark! We Accelerate
II. Where No Man Has Gone Before
III. The Blind March of Automation
IV. Trends in Computational Power
V. How Close Are We to the Void?
I. Hark! We Accelerate
A child born today finds themselves part of a 6,000-year-old global civilization, the product of millions of years of technological invention. While that child will adapt to survive in modern society, they may never stop to wonder how things got to where they are today, and so act without the knowledge of our ancestors. For humans, the past is one of accelerating returns to scale, where the only thing guaranteed in life (other than death and taxes) is change.
The oldest technologies are stone tools (Lomekwi) dating back 3.3 million years. The three original components of technology were anvils (a surface upon which another object is struck), cores (large rocks), and flakes (fragments struck from a core), used in a process called stone-flaking. More advanced stone-flaking enabled “Oldowan” tools, 2.6 million years ago, and later still, Acheulean tools (hand-axes) were developed about 1.76 million years ago.
Prehistoric hunter-gatherers lived in caves, with the earliest huts built 1.8 million years ago. Boats were used to cross straits as far back as 900,000 years ago, around the same time as fire and cooking. Later came glue, clothing, instruments, and ceramics. The invention of bricks is considered the beginning of “urban” development, 8,000 years ago. Mancur Olson traced civilization back to the moment when prehistoric bandits realised that instead of raiding camps and moving on, they could stay put and steal from camps all the time. He sums it up with a line that describes history from here on out: “Roving banditry means anarchy, and replacing anarchy with government brings about a considerable increase in output”.
Humanity’s early civilizations developed agriculture, and the resulting surplus of food supported the first traces of economy and trade. Mesopotamians could specialise in professions other than farming, leading to an acceleration in technological invention. Antiquity began with the invention of writing systems, enabling the documentation of history from 3,500 BC, and soon after came the Egyptian pyramids, which remained the tallest structures on Earth until 1320 AD. Before 1,000 BC we had invented the plough, the wheel, papyrus, the pen, glass, and the production of bronze, copper and iron.
Since the invention of the brick, humans have created a great many things, from slavery, economy, and democracy, to the scientific process, industry, globalisation, electricity, and the internet.
A child born today is one of 7.8 billion living humans, and one of roughly 117 billion humans ever born. They are a member of the second largest mammal species on Earth by total biomass (behind cattle), and will live most – if not all – of their life with the global threats of environmental collapse, rampant pandemics, cyberterrorism, nuclear holocaust, and the rise of superintelligence.
Edward O. Wilson, the great American biologist, put it this way: “The real problem of humanity is the following: we have Paleolithic emotions, medieval institutions, and god-like technology.”
To mitigate the risks of our existence, we must update our emotions and institutions as technology advances. Without an appreciation of how civilization became the way it is, all children born today risk falling further behind the curve of technological adoption, just as it becomes critical.
II. Where No Man Has Gone Before
Arguably the most crucial difference between the human race now and at any other point in its history is the unprecedented rate of scientific progress. Set against the old ticking clock of history, humanity’s rate of invention traces the shape of a ski slope. This acceleration can be attributed to the pairing of two parallel forces:
1. The number of researchers alive at one time.
Globalisation has brought education to billions, and standardised curriculums are only a recent idea. Access to scientific resources and facilities has become widespread in a short period of time. In 1900 there were 13 universities in the UK; by 2022 there were over 160. The Future of Life Institute estimates that 90% of all PhD recipients are alive today: it is highly possible that there are more “people of science” working now than in all of accumulated history. [1]
2. The prevalence of disruptive inventions.
If “inventions” are new devices, “disruptive inventions” are devices that change the way invention itself is carried out. The microscope, invented by the teenager Zacharias Janssen around 1590, gave us radical access to cellular biology. The scientific method, formalised by Sir Francis Bacon in 1620, laid the groundwork for experimental validation. Both have had a snowball effect on how science has progressed.
Figure 1. Timeline of the acceleration of human innovation, some landmarks labelled.
More recent examples are the invention of the World Wide Web (1989) and the Mosaic Netscape browser (1994). Like the printing press some 550 years before, these inventions gave humans access to vastly greater amounts of information, and paired with new hardware, access to the internet has become near-universal across the species. [2]
III. The Blind March of Automation
While the printing press morbidly put scribes out of work, it was in hindsight nothing but a force for job creation. Where would we be without authors, scholars, libraries and written news? The gateway of disruption has been open for many hundreds of years, but widespread change only took hold in the Industrial Revolution. [3]
The First Industrial Revolution began in the 1760s, epitomised by the power loom, capable of weaving thread at around eight times the rate of the average worker. New basic materials, new energy sources, the first factories, and inventions in transport and communication (particularly by harnessing steam power) were closely involved in this period. Scottish weavers, laid off by power looms, were among the first to “riot against automation” in 1816. [4]
The Second Industrial Revolution began with assembly line production and the harnessing of electricity, enabling ‘mass production’. Economic principles of specialisation further increased factory speed and reduced costs.
The Third Industrial Revolution began with computer automation in the 1970s. Entire production processes may now be human-free, as robots can take on the roles of humans in warehouses. Broadly respectable professions such as bank clerks, switchboard operators, proof-readers, and bookkeepers became automated.
The Fourth Industrial Revolution, or “Industry 4.0”, is underway, characterised by algorithms that automate many less-routine roles that until now have been carried out by humans. Automated networks that communicate without human input make “cyber-physical production systems” and “smart factories” possible, replacing production workers, production managers and area managers across the manufacturing industry. Machines can predict failures and trigger maintenance processes automatically, and self-organised logistics can react to changes in production in real time. Automation is already changing the way we work in a significant way.
IV. Trends in Computational Power
The principal technological driver is the growth in computational power. Between 1900 and 2015, the cost of computing fell by a factor of 10^18. That’s a quintillion. As it is easy to misjudge the size of that figure, I will simplify: since 1955, computing has become 10 trillion times cheaper. If global real estate had fallen in price by the same factor over that time, you could buy modern-day New York for 10 cents, or all the gold ever mined for $1.
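As a rough sanity check on those comparisons, the sketch below divides two assumed valuations by the 10-trillion-fold cheapening factor. The dollar figures (roughly $1 trillion for New York property and $10 trillion for all gold ever mined) are my own ballpark assumptions for illustration, not figures from a cited source.

```python
# Back-of-the-envelope check of the "10 trillion times cheaper" comparison.
# The valuations below are ballpark assumptions, used for illustration only.
CHEAPENING_FACTOR = 10**13   # ~10 trillion-fold fall in computing cost since 1955

nyc_real_estate = 1.0e12     # assumed: ~$1 trillion of New York property
all_gold_mined = 1.0e13      # assumed: ~$10 trillion of gold ever mined

print(f"New York at 1955-adjusted prices: ${nyc_real_estate / CHEAPENING_FACTOR:.2f}")  # ~$0.10
print(f"All gold at 1955-adjusted prices: ${all_gold_mined / CHEAPENING_FACTOR:.2f}")   # ~$1.00
```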
Though now 25 years out of date, Moravec’s 1997 book “Robot” shows that computational power (in MIPS; millions of instructions per second) grew from virtually nothing to the level at which reptiles operate in the space of 100 years. That is about ten million times faster than evolution. [5]
Figure 2. The 21st century trend in computer power.
In reality, this graph shows us when it becomes economically viable to replace animals with computers, and it has a truly ground-breaking impact when we reach human levels: the point at which both routine and even non-routine jobs can be phased out by business owners in favour of cheaper algorithms, a waymark of the Fourth Industrial Revolution.
To spur on this point, we must look at the trends in computational acceleration. Moore’s Law drove that acceleration from the early days of the integrated circuit in the 1960s until the last decade. Gordon Moore’s observation, first made in 1965, was that computational power, in the form of the number of transistors we can fit onto an integrated circuit, doubles roughly every two years. This meant that code written by developers would get faster automatically as time went on.
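To make the compounding concrete, here is a minimal sketch of the “doubling every two years” rule, projected forward from the Intel 4004’s roughly 2,300 transistors in 1971. The fixed doubling period and single starting point are simplifying assumptions, not a reproduction of Moore’s data.

```python
# Minimal sketch of Moore's Law treated as a fixed doubling period.
# Assumes ~2,300 transistors in 1971 (Intel 4004) and a doubling every 2 years.
def projected_transistors(year: int, base_year: int = 1971,
                          base_count: float = 2_300,
                          doubling_years: float = 2.0) -> float:
    """Project transistor counts under an idealised 'double every two years' rule."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1991, 2011, 2021):
    print(year, f"~{projected_transistors(year):,.0f} transistors")
```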
Figure 3. ‘Moore’s Law’, or the exponential growth in computational power, measured in FLOPs.
When dealing with computational power, it is important to distinguish between MIPS (millions of instructions per second) and FLOPS (floating-point operations per second). Both measure the number of operations a computer performs per second (neither measures the amount of useful work done, which is a notable limitation). They are not directly comparable, but as a rough empirical rule, MFLOPS = 2.3 x MIPS^0.89. [6]
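For concreteness, here is a small helper applying that empirical relation from Sandberg and Bostrom (2008). It is an order-of-magnitude conversion only, so the outputs should be read as rough equivalents rather than exact figures.

```python
# Approximate conversion between MIPS and MFLOPS using the empirical
# relation MFLOPS ≈ 2.3 × MIPS^0.89 (Sandberg & Bostrom, 2008).
def mips_to_mflops(mips: float) -> float:
    return 2.3 * mips ** 0.89

def mflops_to_mips(mflops: float) -> float:
    # Inverting the relation: MIPS ≈ (MFLOPS / 2.3)^(1/0.89)
    return (mflops / 2.3) ** (1 / 0.89)

print(f"1 million MIPS ≈ {mips_to_mflops(1e6):.3g} MFLOPS")
```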
V. How Close Are We to the Void?
It appeared that in 2015 we were capable of computing around 2.3 million MIPS for $1000. In 2022, $1000 digital systems can compute up to 100 million MIPS, and Mythic analogue microchips can deliver 25 million MIPS on as little as 3 watts in cheap (and admittedly limited) systems. Moravec estimated in 1997 that the human brain performs at around 100 million MIPS, but Sandberg and Bostrom re-evaluated this position in 2008, giving a range of estimates based on the architecture a computational network would use.
- The low-end estimate is 2.3 billion MIPS (spiking neural network), currently achievable by supercomputers, but unlikely to be achievable by commodity computers until 2042.
- The mid-range estimate is 23 trillion MIPS (electrophysiology), expected to be achievable by supercomputers by 2033 and by commodity computers by 2068. [7]
- The high-range estimate is 23 quadrillion MIPS (23 peta-MIPS), expected to be achievable by supercomputers by 2044 and by commodity computers by 2087.
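Applying the same approximate relation to those three estimates gives a feel for the spread in FLOPS terms. These are my own rough conversions in the spirit of note [6], not figures taken from Sandberg and Bostrom directly.

```python
# Rough MFLOPS equivalents of the three brain-emulation estimates, using
# the approximate relation MFLOPS ≈ 2.3 × MIPS^0.89; order-of-magnitude only.
estimates_mips = {
    "low  (spiking neural network)": 2.3e9,   # 2.3 billion MIPS
    "mid  (electrophysiology)":      2.3e13,  # 23 trillion MIPS
    "high (23 peta-MIPS)":           2.3e16,  # 23 quadrillion MIPS
}
for label, mips in estimates_mips.items():
    print(f"{label}: ~{2.3 * mips ** 0.89:.1e} MFLOPS")
```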
Kurzweil predicted in 2006 that a $1000 computer would reach human brain capacity around 2023 (putting his estimate of human brain capacity at Moravec’s 100 million MIPS). That level is about what is currently available on the market. The remaining problems lie in:
- How we can programme an algorithm to simulate the human brain (note the array of methods outlined by Sandberg and Bostrom), and
- How we can do this cost-effectively and at scale (if it requires $1mn per year to run such an algorithm, it has little to no economic value over a $30k-salary employee). The latter problem will also involve the use of robotics.
While this range appears to put at least some distance between the average worker and absolute unemployment (the void), it would be foolish to assume that most job roles and professions cannot be automated long before then. Commodity computers capable of replicating the human mind in its entirety won’t just make truck drivers, stockbrokers and neurosurgeons economically replaceable; they’ll make artists, novelists and scientists inferior too.
References
[1] - Yes, Zacharias Janssen and even Francis Bacon had no idea what a PhD was; PhDs have only been around for about 180 years and represent the pinnacle of quality scientific research. Here’s the article.
[2] - The details about Zacharias Janssen’s inventions are awry, but he’s also said to have made the first telescope. Regardless, he (and possibly his neighbour) had quite the legacy.
[3] - Is the Gutenberg press the OG of ‘process inventions’? Other contenders include Fire, the Wheel, and the Compass. And maybe sliced bread (coming in late in 1929).
Here’s an in-depth article on its impact.
And here’s a top 10 of what’s happened since.
[4] - For insight into those new basic materials, new energy sources, first factories and exactly what inventions in transport and communication I’m alluding to, the source is here.
The power loom did to weavers what the car did to horses.
[5] - For more animal-related comparisons, you can find his other studies here.
[6] - MIPS are not directly comparable to FLOPS, but their empirical relationship is approximately MFLOPS = 2.3 x MIPS^0.89, according to Sandberg and Bostrom (2008). So these are my converted figures. Their paper is included below.