One of the first large-scale automatic digital computers, the Mark 1, was created around 1940 and boasted impressive specifications: 8ft high, 50ft long, 2ft wide, and weighing 5 tonnes[i]. By 1946 the computer age had been officially established, with the 18,000 vacuum tubes of the U.S. Army's ENIAC computer able to better calculate projectiles' paths. The Army had spent over half a million dollars (equivalent to several million in current dollars) on what was then a risky and unproven technique[ii]. For decades computing remained the preserve of militaries and big corporations. It wasn't until the 1970s that we saw the start of the digital age, or third industrial revolution, with mass personal computing taking root with the 1977 launch of the Apple II.
The incipient digital age witnessed the rise of electronics, telecommunications and computers, and saw an era of emergent automation[iii]. By the late 1980s, several threads born in the early digital age, namely networks and linked personal and mini computers, had merged to create the internet.
It is a sign of how slow our companies are to respond to technological change, in terms of culture, organisation and even business models, that close to fifty years after the digital age started we are still talking about digital transformation as a game changer for industries and businesses. That does not mean the digital era's impact will quickly dissipate, but digital is no longer the driving force of innovation. Some $1.25 trillion was forecast to be spent on digital transformation initiatives globally in 2019, a figure expected to climb to $1.97 trillion by 2022[iv]. The importance of digital cannot be downplayed, but very little of this investment will redefine industries or businesses: being digital is now table stakes, the bare minimum needed for doing business, and no longer a source of differentiation or competitive advantage[v].
It is speculated that the end of the digital era will be definitively marked by an increased level of consumer trust. 'When we start to trust technology again,' notes TechRadar, '…we'll know that we're in a post-digital world: where businesses aren't infatuated with technology, but are dedicated to people[vi].'
It seems somewhat ironic that an intelligent era should be marked by a dedication to people. Yet the technology surrounding us will enable personalisation to a degree never seen before. In many ways the intelligent age we are now entering is a natural evolution of the digital age, but the change between the two marks an important juncture in how we think about change. Some are dubbing the transition 'digital v3', but something deeper is changing than a merely enhanced digital age. Accenture has suggested that while '…technology for the last 25 years has been something we can use, we're entering a new era where technology is who we are[vii].'
At the same time there is a need to look at all changes simultaneously: we are no longer in a time where we can pick the top three or even five technologies. This represents a fundamentally different way of thinking[viii]. Distributed ledgers, artificial intelligence, extended reality and quantum computing all have the potential to bring about not just new processes and new ways of doing things, but different things altogether. However, '…as powerful as today's emerging technologies are, they will yield only incremental gains if their application is limited to existing processes[ix].'
It is an open question whether we are ready, as businesses, societies and bodies politic, for the repercussions of breakthrough technologies such as machine learning and biotechnology. Fundamental transformations will affect a range of industries, including computing, healthcare, energy and transportation[x]. The future, for all of these industries and more, promises more change in the next decade than in the previous 40 to 100 years.
The intelligent age is upon us, but the age of 'linear' technological change may be at an end. Given the range of changes across multiple domains, we are entering an era of exponential change. At the same time, the main role in bridging discovery and commercialisation is increasingly falling to private concerns and away from government. Twinned with the breadth of technological change, this promises to complicate attempts to regulate what promise to be powerful technologies. Nowhere is this more the case than with the promise of biotech.
Steve Jobs once remarked that '…the biggest innovations of the 21st century will be at the intersection of biology and technology[xi].' Biology is fast becoming the new digital, and biotechnology (broadly speaking, the combination of the two) a key driver of the future economy. Using living organisms to make products or manipulate existing processes could fuel innovation across healthcare, industry, the food sector and beyond. If DNA does indeed emerge as the new silicon, as is suggested, Wired proclaims that '…biology will be the next great computing platform[xii].' The post-human age, the intelligent age and the biotech age are all upon us; how they interact not only with each other, but with our ill-prepared systems, will help define multiple paths of change.
| Era | Revolution | Period | Key developments |
| --- | --- | --- | --- |
| Computer | Tail end of second industrial revolution | 1946-1969 | Initially unproven and costly. Benefits restricted to militaries and big businesses. |
| | | 1960 | By 1960 the transistor, invented in the 1940s, was reliable enough to replace the fragile vacuum tubes of an earlier day. By the end of the 1960s the silicon chip had become the principal device in computer processors and was beginning to replace memory cores as well. |
| Digital | Third industrial revolution | 1970 | Era of personal computing. Also called the information age. |
| | | 1979 | In 1979 a program called "Visicalc" appeared for the Apple II: it manipulated rows and columns of figures known to accountants as a "spreadsheet", only much faster and more easily than anyone had imagined possible. A person owning Visicalc and an Apple II could now do things that even a large mainframe could not do easily. |
| Internet | | 1990 | Networking is entrenched. HTML developed and the world wide web opened to the public in 1991. |
| | | 1977-2019 | Digital age sees the spread of electronics, robotics, networks and global telecoms. The internet turns 40 in 2009. |
| Intelligent | Fourth industrial revolution | 2016- | Google unveils Google Assistant, a voice-activated personal assistant program, marking the Internet giant's entry into the "smart" computerised assistant marketplace alongside Amazon's Alexa, Apple's Siri and Microsoft's Cortana. Quantum computing, AI and blockchain combine to remake industries en masse and society at large. |
| Biological | Exponential era | 2020s | Biology could be the next great computing platform and a key driver of transformations across a range of industries, from construction to health. |
| Post-human | | 2030s | The intelligent and biological eras combine to create post-humanity: nanotech, brain implants and more become mainstream advancements. |
[i] Source: Open Book Project, 2017 http://openbookproject.net/courses/intro2ict/history/history.html
[ii] Source: BBVA OpenMind, retrieved 2019 https://www.bbvaopenmind.com/en/articles/the-trajectory-of-digital-computing/
[iii] Source: Sentryo (Cisco) 2017 https://www.sentryo.net/the-4-industrial-revolutions/
[iv] Source: Accenture, 2019 https://www.accenture.com/us-en/blogs/blogs-paul-daugherty-digital-transformation
[v] Source: Accenture, 2019 https://www.accenture.com/us-en/blogs/blogs-paul-daugherty-digital-transformation
[vi] Source: TechRadar, 2019 https://www.techradar.com/news/were-already-in-a-post-digital-era-so-what-comes-next
[vii] Source: Accenture, via Wired, 2018 https://video.wired.com/watch/wired25-accenture-cto-paul-daugherty-on-reimagining-the-future-of-business
[ix] Source: BCG, 2019 https://www.bcg.com/publications/2019/company-of-the-future.aspx
[xii] Source: Wired, 2017 https://www.wired.com/story/biology-will-be-the-next-great-computing-platform/