The future of digital identity

Whether or not we like it, the nature of the internet means that pretty much all of us have a digital identity of sorts, albeit one we don’t have direct control over. At present this leads to partial or overly simplified profiles, since customers are starting to generate or present different identities across different contexts. For businesses, knowing when and how to engage becomes problematic: ‘They no longer just need to work out who the consumer is, but which of that consumer’s identities they are dealing with.’ Current segmentation models will need to become ever more complex and granular without appearing overly invasive.


It is also plausible that we will see digital identity initiatives crystallise into something more solid. The World Economic Forum suggests that by 2030, we’ll see credit scoring expand into ‘life scoring’. Identity and reputation will be digitised and analysed in minute detail, shaping a future where a personal ‘trust score’ will be the norm, with all the benefits and drawbacks that might bring. The WEF has also suggested that a ‘…person’s data should reside in an account where it would be controlled, managed, exchanged and accounted for’ by around 2028. If data does indeed become a bankable commodity, then banks and other financial services players have an opportunity to become the safe-keepers of the underlying digital identity. Digital identity systems will likely proliferate as a medium for managing personal data flows (i.e. a consolidated point of control as consumers gain increased control over how their data is used).

Not only would consumer data use shift towards a consent-based system, but whole new models would be needed to account for micropayments. The nature of payments themselves could undergo transformation, with trusted online identities easier to verify. The technological infrastructure to support such a future is already taking shape. Witness, for example, Estonia allowing its citizens to vote online during a six-day window before election day, casting 247,232 i-votes, or internet votes, using their national ID cards and PINs. Meanwhile, the Known Traveler Digital Identity (KTDI), a public-private collaboration that seeks to enable seamless and secure cross-border travel using biometrics, cryptography and trust, could one day evolve to replace the passport.
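The mechanics of such a consent-based system can be sketched as a simple data structure: each use of personal data is gated by an explicit consent record and, where agreed, accrues a micropayment to the data owner. The field names and rates below are illustrative assumptions only, not a reference to any real identity scheme.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    # Illustrative fields -- not drawn from any real digital identity standard
    owner_id: str
    data_category: str          # e.g. "location", "purchase_history"
    grantee: str                # organisation permitted to use the data
    rate_per_use: float         # micropayment owed per access, in cents
    expires: datetime

@dataclass
class DataAccount:
    """A personal data account: data use is controlled, managed and accounted for."""
    consents: list = field(default_factory=list)
    balance_cents: float = 0.0  # micropayments accrued to the owner

    def grant(self, consent: ConsentRecord) -> None:
        self.consents.append(consent)

    def request_access(self, grantee: str, category: str) -> bool:
        """Allow access only under a live consent; accrue the micropayment."""
        now = datetime.now(timezone.utc)
        for c in self.consents:
            if c.grantee == grantee and c.data_category == category and c.expires > now:
                self.balance_cents += c.rate_per_use
                return True
        return False
```

In this sketch, revoking consent is as simple as letting a record expire, and the running balance gives the ‘accounted for’ part of the WEF quote a concrete form.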

Biometrics are likely to play a role in any future digital identity program. Biometric technology is being examined by 77 percent of global airports and 71 percent of airlines as a digital identity option, while AccorHotels has a service that uses your biometrics to determine possible vacation destinations. The impact of digital identity could ripple out much further than travel, however: Juniper Research suggests that mobile biometrics could authenticate $2 trillion of sales by 2023, up from $124bn in 2018. Biological data banks will fast become the norm; with governments and businesses likely interested in owning or accessing such data, it is time we wrested back control of our data.

Through the digital age and beyond

One of the first large-scale automatic digital computers – the Mark 1 – was created around 1940 and offered some impressive specs: 8ft high, 50ft long, 2ft wide and weighing 5 tonnes[i]. By 1946 the computer age had been officially established, with the 18,000 vacuum tubes of the U.S. Army’s ENIAC computer able to better calculate projectiles’ paths. The U.S. Army had spent over half a million dollars (equivalent to several million in current dollars) on what was then a risky and unproven technique[ii]. For decades computing remained the preserve of militaries and big corporations. It wasn’t until the 1970s that we saw the start of the digital age – or third industrial revolution – with mass personal computing taking root with the 1977 launch of the Apple II.

The revolution marked by the incipient digital age witnessed the rise of electronics, telecommunications and computers. It also saw an era of emergent automation[iii]. By the late 1980s, several threads born in the early digital age – of networks and linked personal and mini computers – had merged to create the internet.

It is a sign of how slow our companies are to respond to technological change – in terms of culture, organisation and even business models – that close to fifty years after the digital age started, we are still talking about digital transformation as a game changer for industries and businesses. That does not mean that the digital era’s impact will quickly dissipate, but digital is no longer the driving force of innovation. $1.25 trillion is forecast to be spent on digital transformation initiatives globally in 2019, and that number is forecast to climb to $1.97 trillion in 2022[iv]. The importance of digital cannot be downplayed, but very little of this investment will redefine industries or businesses: being digital is now table stakes, the bare minimum needed for doing business, and no longer a source of differentiation or competitive advantage[v].

It is speculated that the end of the digital era will be definitively marked by an increased level of consumer trust. ‘When we start to trust technology again,’ notes TechRadar, ‘…we’ll know that we’re in a post-digital world: where businesses aren’t infatuated with technology, but are dedicated to people[vi].’

It seems somewhat ironic that an intelligent era should be marked by a dedication to people. The technology that surrounds us will enable personalisation to a degree never seen before. In many ways the intelligent age that we are now entering is a natural evolution of the digital age, but the change between the two marks an important juncture in how we think about change. Some are dubbing the transition digital v3, but something deeper is changing than just an enhanced digital age. Accenture has suggested that while ‘…technology for the last 25 years has been something we can use, we’re entering a new era where technology is who we are[vii].’

At the same time there is a need to look at all changes simultaneously – we are no longer in a time where we can pick the top three or even five technologies. This fundamentally represents a different way of thinking[viii]. Distributed ledgers, artificial intelligence, extended reality and quantum computing all have the potential to bring about not just new processes and new ways of doing things, but different things. However, ‘…as powerful as today’s emerging technologies are, they will yield only incremental gains if their application is limited to existing processes[ix].’

It’s an open question as to whether we are ready – as businesses, societies and bodies politic – for the repercussions of breakthrough technologies such as machine learning and biotechnology. Fundamental transformations will impact a range of industries, including computing, healthcare, energy and transportation[x]. The future, for all of these industries and more, promises more change in the next decade than in the previous 40 to 100 years.

The intelligent age is upon us, and the age of ‘linear’ tech change may be at an end. Given a range of changes across multiple domains, we are entering the era of exponential change. At the same time, the main role for bridging discovery and commercialisation is increasingly falling to private concerns and away from government. Twinned with the range of technological changes, this promises to complicate attempts at regulating what promise to be powerful technologies. Nowhere is this more the case than with the promise of biotech.

Steve Jobs once remarked that ‘…the biggest innovations of the 21st century will be at the intersection of biology and technology[xi].’ Biology is fast becoming the new digital, and biotechnology (broadly speaking, the combination of the two) a key driver of the future economy. Using living organisms to make products or manipulate existing processes could fuel innovation across healthcare, industry, the food sector and beyond. If DNA does indeed emerge as the new silicon, as has been suggested, then, as Wired proclaims, ‘…biology will be the next great computing platform[xii].’ The post-human age, the intelligent age and the biotech age are all upon us – how they interact not only with each other, but with our ill-prepared systems, will help define multiple paths of change.


| Age | Tech age | Dates | Characteristics |
|---|---|---|---|
| Computer | Tail end of second industrial revolution | 1946–1969 | Initially unproven and costly. Benefits restricted to militaries and big businesses. |
| | | 1960 | By 1960 the transistor, invented in the 1940s, was reliable enough to replace the fragile vacuum tubes of an earlier day. By the end of the 1960s the silicon chip had become the principal device in computer processors and was beginning to replace memory cores as well. |
| Digital | Third industrial revolution | 1970 | Era of personal computing. Also called the information age. |
| | | 1979 | In 1979 a program called VisiCalc appeared for the Apple II: it manipulated rows and columns of figures known to accountants as a ‘spreadsheet’, only much faster and easier than anyone had imagined possible. A person owning VisiCalc and an Apple II could now do things that even a large mainframe could not do easily. |
| Internet | | 1990 | Networking is entrenched. HTML developed and the world wide web opened to the public in 1991. |
| | | 1977–2019 | Digital age sees the spread of electronics, robotics, networks and global telecoms. Internet turns 40 in 2009. |
| Intelligent | Fourth industrial revolution | 2016– | Google unveils Google Assistant, a voice-activated personal assistant program, marking the entry of the internet giant into the ‘smart’ computerised assistant marketplace, joining Amazon’s Alexa, Apple’s Siri and Microsoft’s Cortana. Quantum computing, AI and blockchain combine to remake industries en masse and society at large. |
| Biological | Exponential era | 2020s | Biology could be the next great computing platform and a key driver of transformations across a range of industries – from construction to health. |
| Post-human | | 2030s | The intelligent and biological eras combine to create post-humanity – nanotech, brain implants and more become mainstream advancements. |

[i] Source: Open Book Project, 2017

[ii] Source: BBVA OpenMind, retrieved 2019

[iii] Source: Sentryo (Cisco) 2017

[iv] Source: Accenture, 2019

[v] Source: Accenture, 2019

[vi] Source: TechRadar, 2019

[vii] Source: Accenture, via Wired, 2018

[viii] Source: ThinkAdvisor, 2018

[ix] Source: BCG, 2019

[x] Source: Forbes, 2016

[xi] Source: O’Reilly, 2017

[xii] Source: Wired, 2017

Feeding the future

The world’s population is set to grow by 2.2 billion between now and 2050[i], and because this comes in tandem with rising prosperity, we are forecast to need 70 percent more food than we did in 2009[ii]. Despite the serious challenge of climate change, it is thought that a range of technologies, from GPS and drones to robotics, will help achieve much of the needed gains.


In order for a sustainable breakthrough to occur, however, the future of food is going to have to depart radically from its traditions. This is unavoidable if we are to meet a growing array of environmental, economic and social needs. Business as usual plus technology, as in so many other industries, will probably not meet the multiple demands being placed on food producers.

Given the water intensiveness of meat, for example, alternatives are needed. Indeed, the alternative meat industry could grow toward a $140bn market by 2030, and AT Kearney believes that by 2040, 60 percent of all meat will either be grown in vats or come from textured plant proteins[iii]. It is perhaps worth noting that 3D printing could also become a viable production method for proteins, however unappetising that sounds. It will be interesting to see how such trends collide with social trends. The global halal food market is expected to reach $739 billion by 2025, up from $436 billion in 2016, as the number of Muslims approaches one third of Earth’s total[iv].
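As a sanity check on projections like these, the implied compound annual growth rate can be computed directly. The snippet below applies the standard CAGR formula to the halal food figures above ($436bn in 2016 to $739bn by 2025), purely as a worked example.

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate: (end/start)^(1/years) - 1."""
    return (end_value / start_value) ** (1 / years) - 1

# Halal food market projection from the text: $436bn (2016) -> $739bn (2025)
growth = cagr(436, 739, 2025 - 2016)
print(f"Implied CAGR: {growth:.1%}")  # roughly 6% a year
```

A roughly 6 percent annual growth rate is brisk but hardly exponential, which puts the headline figure in context.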

Regardless, the shift from industrialised agriculture to scientific agriculture could be one of the most important changes in the last hundred years or so. UBS notes that ‘…the ability to grow food in a lab that replicates meat, fish, eggs, and dairy products — with a lower carbon footprint and without the need to slaughter animals — is likely to become a commercially viable option in the next decade[v].’ The World Economic Forum, meanwhile, suggests that food computers could be the future of agriculture[vi]. Precision fermentation and the ‘food-as-software’ trend are helping to dramatically lower the cost of manufactured proteins. One report suggests that by 2030, the U.S. market for ground beef could shrink by 70 percent, the steak market by 30 percent and the dairy market by almost 90 percent[vii].

What food producers should do about such trends remains dependent on their skills, networks, and even marketing. What they cannot do is ignore such trends. Fully autonomous farm equipment is already becoming commercially available, meaning machines will be able to completely take over a multitude of tasks[viii]. Farming is already, in places, a highly digital industry. If farmers and others in the value chain are to remain relevant, they need to build bridges between what they do today and what food production could look like tomorrow.

[i] Source: UN, 2018

[ii] Source: CNN, 2019

[iii] Source: CNBC, 2019

[iv] Source: PRNewswire, 2018

[v] Source: Live Kindly, 2019

[vi] Source: World Economic Forum, 2019

[vii] Source: Food Navigator, 2019

[viii] Source: Independent, 2019

The futures gap

The difference between identifying change and acting upon it is where many organisations wither and die, as Kodak and others stand testament. While most firms believe they’re ‘picking up on signals of change’ that might disrupt their lines of business, 42 percent admit that they’re unable to act on those signals[i]. While signals of change come from multiple directions, the technological driver is key. It is forecast that enterprises that dither on embracing changes in technology stand to lose 46 percent of their revenues by 2023[ii].


While many pieces rightly assert that digital transformation is not solely, or even primarily, a technological change, technology does have the ability to catalyse a range of other organisational changes. AI, blockchain and the edge will all accelerate the rate of change in how data is produced, transmitted, analysed and consumed, for example[iii]. Furthermore, several of these technologies, perhaps most notably machine learning, promise to redraw organisational design itself, potentially replacing silos with a ‘…powerful core of purpose and culture’ that also sees a task-driven, as opposed to job-driven, organisation[iv].

Without experimenting with new technology, using it to enable different services and products, and ultimately changing how organisations do business, the futures gap is set to grow uncomfortably wide for many organisations. 75 percent of executives, for example, agree that their organisations will need to make significant changes to keep up with rising customer expectations[v]. Concurrently, 80 percent say that their own culture has to evolve over the next five years to allow their company to succeed and grow[vi], yet shifting this – especially within large organisations – is a difficult, often iterative task not without its own significant risks.

For those stuck in the futures gap – identifying opportunities and challenges but unable or unwilling to do anything about them – a couple of likely paths open up. For the largest organisations it may even be possible to defer change until it is critical and, like Microsoft, purchase their way into new business lines, cultures and revenue streams. Few, if any, have this luxury, and most must look to rapidly close the gap between what is possible and business as usual. Tools like the Three Horizons (3H) framework can help, although the gap could become unbridgeable for those without a strategy to overhaul what they do, how they do it and why they do it. It remains plausible that several industries could fall – en masse – into the futures gap should digital and intelligent capabilities honed in emerging market segments play better than incumbent offerings.

[i]Source: Forbes, 2019

[ii] Source: MoneyControl, 2019

[iii] Source: Deloitte, 2019

[iv] Source: Forrester, 2019, via Consulting

[v] Source: EIU, 2019

[vi] Source: PwC

Changing sectors at the Edge

The coming volume of data will have a range of impacts. It will challenge current tech models and architectures – by 2025, nearly 30 percent of data generated will be real-time[i] and more than 50 percent of data is forecast to be managed autonomously[ii]. It will also force industries to collide, whether through new data models, third party data arbitrage or even more direct partnerships and collaborations.


The edge could also reshape what it is that industries do. Take insurance, for example: at present the industry adheres to its higher purpose of ‘protection’ by compensating after loss and allowing people to take on risk that is otherwise too large. Edge-powered IoT devices will likely shift the nature of the insurance industry from ‘compensation’ to ‘prevention’, and in doing so draw new players into the space. If insurers do not use their natural comparative advantages of rigour and distribution to help distinguish such a strategy, the space is likely to be explored by utilities and even big tech companies with footholds in the smart home environment.

It is not the only industry likely to undergo profound change in purpose or practice thanks to the edge. As-a-service models, exemplified by Rolls-Royce selling outcomes as opposed to engines to a range of aviation customers, are likely to spread into whole new sectors once data regarding the use of a whole range of objects becomes available in real time. Retailers, for example, could move beyond just providing augmented reality mirrors to providing a personalised selection for customers to peruse based on real-time, contextual information. Faster, better, real-time data could also enable financial services to be built into products and services or offered on an on-demand basis, while logistics could decentralise into P2P-type services.

Healthcare, like insurance, could also move into a more preventative realm, with real-time recommendations and feedback likely shifting elements of the industry into a life-coach role. How this could interplay with others – from insurers to sportswear manufacturers – will likely spread the opportunities and challenges of such changes well beyond the confines of traditional industry barriers.

This will render traditional planning obsolete – today’s strengths can rapidly morph into irrelevance once the rules of the game are changed. The edge, combined with 5G, will force every organisation to reaffirm, and perhaps readdress, the question of just what business they are in and what it is they do to achieve their higher purpose. Mental models, the nature of competition, talent requirements and the competencies needed to play in future spaces will all change.


[i] Source: ZDNet, 2018

[ii] Source: Oracle, 2019

Cybersecurity at the Edge

The future of computing lies at the edge of networks, such as with IoT-linked devices and infrastructure. While today only about 20 percent of enterprise data is produced and processed outside of centralised datacentres, by 2025 that is expected to rise to 75 percent and could reach 90 percent[i].


These forecasts, if realised, imply that even handheld devices will have AI capabilities built into them without outsourcing the heavy lifting to large servers, something that would otherwise be next to impossible. All data could therefore be processed in near real time, at the edge of networks such as the IoT. Information technology strategies, consumer behaviour and the architecture within which to operate would all shift as a result, some in unpredictable ways, as the edge economy potentially approaches $4.1Tn by 2030[ii]. As it grows in prominence, so will the cybersecurity issues associated with it.

IT ecosystems will increasingly need to exist ‘out there’ – at the edge – rather than within the organisational walls. With most IoT-using organisations having limited visibility into their networks, let alone their exposure to IoT cyber risk, new standards will likely be needed. Security and privacy controls will need to be built at the edge and be an intrinsic part of every device and network. Where possible, security will have to be built into the edge device itself, which brings up some potentially interesting collaboration models and partnership possibilities.
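One minimal illustration of device-level edge security: each sensor reading is signed with a key held on the device, so downstream services can verify both origin and integrity before trusting the data. This is a sketch under assumed key provisioning, not a production design; the device name and key are hypothetical.

```python
import hashlib
import hmac
import json

def sign_reading(device_key: bytes, reading: dict) -> dict:
    """Attach an HMAC-SHA256 tag so the payload can be verified downstream."""
    payload = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(device_key, payload, hashlib.sha256).hexdigest()
    return {"reading": reading, "tag": tag}

def verify_reading(device_key: bytes, message: dict) -> bool:
    """Recompute the tag and compare in constant time."""
    payload = json.dumps(message["reading"], sort_keys=True).encode()
    expected = hmac.new(device_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])

key = b"per-device-secret"           # provisioned at manufacture (assumption)
msg = sign_reading(key, {"sensor": "thermostat-17", "temp_c": 21.4})
assert verify_reading(key, msg)      # intact payload verifies
msg["reading"]["temp_c"] = 35.0
assert not verify_reading(key, msg)  # tampered payload is rejected
```

Even this toy version shows why key management becomes the real collaboration problem: the verifier and the device manufacturer must share or derive the same secret.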

Edge computing will also bring about significant changes to organisational IT architecture[iii]. Given the networked nature of the edge, the likely creation of ecosystems around edge data, and the increasingly intertwined nature of IT systems, approaching edge cybersecurity at the ecosystem level is increasingly necessary to protect potentially weaker links in the cybersecurity chain, such as third parties.

71 percent of CEOs already state that they see information security as a strategic function and a source of competitive advantage[iv]. However, the cost of data breaches could reach $5Tn yearly by 2024[v], complicated by edge technology ‘…as dependency on complex internet-enabled business models outpaces the ability to introduce adequate safeguards that protect critical assets[vi].’ If infosec strategy is to thrive in an edge era, new security and data architectures that span multiple organisations and even industries will need to emerge. Until then it is likely that organisations will have to impose ‘…zero trust concepts where they can’t trust the network, have to authenticate use, and have to understand what data is actually resident there[vii],’ and plan for the fact that the edge is dynamic. As with other technologies, organisational change would seem a must if the benefits of the edge are to accrue effectively.


Outside regulation, or self-regulation?

Once seen as an onerous burden, the nature, breadth and impact of regulation is set to become a key strategic facet for many organisations. Those with strategic foresight will look to pre-empt and, in conjunction with others, help shape the direction of regulation to come.


On October 3rd, 2019 it was reported that ‘…European courts can now force Facebook to scrub illegal content worldwide.’ Reuters reported that Facebook and other platforms can also be made to comply with requests to take down content globally, even in countries where it is not illegal[i]. How the governments of the United States and other powers react to a foreign law that could easily be seen as subverting freedom of speech in their own countries will likely add to what is already, technologically speaking, a patchwork of privacy laws, lack of transparency and general incoherence that protects neither competition nor customers[ii]. Either way, Europe’s gambit could have lasting geopolitical clout; MIT Sloan believes that ‘…the first country to figure out the best way to regulate the broader tech industry could become the focal point for the next chapter of the world’s digital revolution[iii].’

Indeed, the EU Commission’s digital department has already recommended a regulatory framework for AI that would set transparency obligations on automated decision-making[iv]. Could artificial intelligence be the next GDPR? Wired notes that ‘…Intelligent systems at scale need regulation because they are an unprecedented force multiplier for the promotion of the interests of an individual or a group[v].’ Digital reality will also likely need some regulatory guidelines[vi], requiring business and government to work together.

An example of how this can be done lies in the sandbox format. Already adopted by regulators of autonomous vehicles, virtual currencies and fintech, sandboxes provide a safe environment to encourage innovation while protecting consumer safety. ‘For example, the United Kingdom’s Financial Conduct Authority launched the first fintech regulatory sandbox in June 2016, allowing fintech players to test innovative products and services in a safe, live environment, with the appropriate consumer safeguards, and, when appropriate, is exempt from some regulatory environments[vii].’

While waiting, or planning, for the future, organisations of all types would do well to police themselves. A Stanford study, for example, found that companies that try to fix problems on their own may sidestep more onerous regulations in the future[viii], not to mention avoid damaging consumer trust and ceding brand value. If the future of business is trust, self-regulation is a must in developing future-proof products and services.

[i] Source: Reuters, 2019

[ii] Source: MIT Sloan Management Review, 2019

[iii] Source: MIT Sloan Management Review, 2019

[iv] Source: Politico, 2019

[v] Source: Wired, 2019

[vi] Source: Deloitte, 2019

[vii] Source: Deloitte, 2019

[viii] Source: Fast Company, 2019

Working with robots

Much has been written about the future impact of automation, with headline numbers on the percentage of jobs to be replaced the most common. Many of these numbers seem predicated on technological viability and economic rationale alone. Automation, however, remains a strategic challenge for organisations, not a strictly technological one.


Consider, for example, that 44 percent of organisations have not yet determined how their automation strategies will affect their workforce[i]. Without such an assessment, not only do organisations and their workers miss out on appropriate training and upskilling opportunities, but the challenge of transitioning norms, working practices and culture is also lost.

Over half of employees worldwide currently feel threatened by automation, with 77 percent wanting to learn new digital skills[ii] so as to be able to take on the new roles and jobs that automation will help create. In addition, 73 percent of business leaders cite company culture as the single most important contributor to corporate success[iii]. Unless ‘…executives are proactive in shaping and measuring culture, approaching it with the same rigor and discipline with which they tackle operational transformations[iv],’ it is almost impossible to see how any corporate culture survives the near-constant disruption brought about by AI.

Working with and alongside robots (and how this impacts other person-to-person contact in the workplace) will prove one of the key challenges for workers of the 21st century, from both a cultural and skills perspective. Ensuring digital and cultural readiness amongst the workforce, within an appropriate structure, will be key in addressing an area that current change management cannot sufficiently cope with.

Some have even suggested that ‘…it’s time for a C-Level role dedicated to re-skilling workers[v].’ It is certainly time to elevate organisational continuous learning to a board-level priority. Accenture, for example, developed a ‘Job Buddy’ program that helped to retrain almost 300,000 employees over the four years to January 2019. The program assesses which roles are most likely to be automated, offers advice on which adjacent roles can be learned within the company and provides relevant training. Within 18 months of launching the pilot, 85 percent of employees to whom it was made available had used the system to assess their current job and enroll in further training[vi]. With 76 percent of executives believing internal talent mobility is important, but only 6 percent of companies believing they are excellent at moving people from role to role[vii], such programmes are likely to become more popular.

Digital skills may be vital, but if automation is to yield its true potential, management and leadership skills are the ones most needed, both in crafting win-win proposals and building the core skills needed for tomorrow’s environment.


[i] Source: Deloitte, 2019

[ii] Source: Business Review, 2019

[iii] Source: HR Grapevine, 2019

[iv] Source: The Financial Brand, 2018

[v] Source: HBR, 2019

[vi] Source: Business Insider, 2019

[vii] Source: InnerMobility, 2019

A whole new industry – Insurance in the 2020s

Traditional planning and strategy that aims to project a year or two out will not cut it given the range and nature of change, especially technological change. Planning is shifting towards a five-year-plus horizon from both leadership and execution standpoints, although it remains the case that while some 75 percent of CEOs have innovation at the top of their agenda, only around 25 percent are looking to self-disrupt.


Technology changes the very nature of insurance products and services, and so the core of the industry itself. Insurance should remain an enabler for people to take on risk that is otherwise too large, yet its processes need to change. Some of the insurtechs, for example, could be viewed as the outsourced R&D functions and innovation hubs of existing players. There is plenty of historical precedent – as with Microsoft – of large firms approaching the point of no return and failure, only to renew via purchases and new partnerships. Appropriately, some insurers are setting up their own VC firms to experiment and hedge their bets.

Process innovation does not stop there. The whole supply chain needs to rethink its proposition as multiple new models for engagement emerge. Agents and brokers still have a role to play, but young talent is unlikely to want to work in current roles. Automation can help shift the role towards spending more time with customers, providing strategic advice and generally delivering a more personalised service. Many of the tasks agents and brokers currently do will be automated, yet a promising future is available should individuals be prepared to embrace it. When looking to reposition themselves, current professionals and firms should be cognisant of the key macro trends simultaneously creating opportunities and presenting challenges:

  • Consumer use of new channels (such as voice) and automation (i.e. our own personal digital assistants). As consumers we embrace what makes our lives easier, as it enables us to do more. This needs to be mirrored on the supply side.
  • Personalisation of the nature of risk/cover. Traditional annual products will be replaced by dynamic and variable cover.
  • Risk mitigation will trump compensation. This requires capabilities that can handle a flood of real-time data, often from the expanding IoT. Continuous underwriting, for example, requires both a different process and a different technological base.
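The continuous-underwriting idea in the last point can be sketched as a premium that is re-priced on each incoming batch of telemetry rather than once a year. Every name, signal and multiplier below is a hypothetical illustration, not an actual insurer’s pricing model.

```python
def reprice(base_premium: float, risk_signals: dict) -> float:
    """Hypothetical continuous underwriting: adjust the premium per telemetry batch.

    risk_signals maps a signal name to a multiplier above or below a baseline
    of 1.0, e.g. a working leak sensor lowers risk, an offline alarm raises it.
    """
    premium = base_premium
    for multiplier in risk_signals.values():
        premium *= multiplier
    return round(premium, 2)

# A smart-home policyholder whose leak sensor lowers risk but whose
# smoke alarm has gone offline (illustrative multipliers):
monthly = reprice(40.0, {"water_leak_sensor": 0.9, "smoke_alarm_offline": 1.3})
print(monthly)  # 46.8
```

The point of the sketch is the process change: pricing becomes a function that runs continuously over device data, which is a different technological base from annual renewal.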

Our assumptions of what insurance is are being challenged. This is perhaps especially dangerous for those who have had recent success, as it reduces the need they feel to change. For any player, several key questions need addressing to help chart a way forward:

  • Higher purpose is fundamental – what are we about as a company? It’s about protection and the management and mitigation of risk. Does allowing harm to come to people and then compensating them count as protection? What does protection mean now and in the future? What does that mean we do? Someone will play in this space – how do we?
  • Too many people have a future vision of doing more of what they already do. What’s driving towards the future we want? Can we acquire the skills and capability to do what we want to do when we get there?
  • Can we remain relevant? A lot of new products take advantage of digital models of engagement, but we have not yet looked at what we do differently as a result.
  • As a consumer I need someone to tell me what I need; I’m not smart enough to know what I want. I don’t care about products (regulatory requirements aside); I want answers. Life-cycle and lifestyle products will be key. Can insurers deliver?


Your short-term goal? To create long-term thinking

Whether we like to admit it or not, short-term thinking is entrenched in many of our political and economic systems, and as a result in many of our working assumptions as business people. Many policy-level decisions are taken with the next election in mind – usually a four- or five-year cycle – whilst quarterly reports dominate stock market sentiment and many businesses’ outlooks.


The focus on the short term certainly has tactical benefits, and for certain companies short-term trends assume an understandable primacy. However, short-term trends offer at best a glimpse, and at worst a misrepresentation, of deeper-seated megatrends that take longer to evolve but are more disruptive, and potentially more advantageous.

Our current climate is characterised by incredible uncertainty; political, economic and social norms are being rewritten globally. This perhaps partially accounts for the ever-shortening business horizon detected in research, since uncertainty breeds limited outlooks. Harvard Business Review analysed the ‘…extent to which the share prices of S&P 500 firms are driven by a firm’s present value of future growth options (PVGO) rather than cash flow from current operations[i].’ In the decade to 2015, firms’ degree of exploration decreased by 7 percentage points – larger firms, including Apple and IBM, were even more affected, with an average 10-percentage-point reduction. The bottom line is that the focus on the short term, and on defending business models rather than exploring new ones, represents a significant loss in future option value. HBR estimates that, collectively, investors now value the future growth options of these firms relatively less, by $1 trillion[ii]. This would seem proof enough that a myopic focus tends to generate less growth and value over the long term[iii].
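The PVGO decomposition HBR refers to splits a share price into the value of current operations (earnings capitalised as a perpetuity) and the present value of growth options: P = E/r + PVGO. The figures below are illustrative only, not taken from the study.

```python
def pvgo(price: float, eps: float, cost_of_equity: float) -> float:
    """Present value of growth options: price minus capitalised current earnings."""
    value_of_current_operations = eps / cost_of_equity
    return price - value_of_current_operations

# Illustrative firm: $100 share price, $6 earnings per share, 8% cost of equity
growth_options = pvgo(price=100.0, eps=6.0, cost_of_equity=0.08)
print(round(growth_options, 2))  # about a quarter of the price reflects future growth
```

A shrinking PVGO share of the price is exactly the ‘loss in future option value’ the paragraph describes: the market pays less for what the firm might become and more for what it already does.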

Organisations across a range of industries and a spectrum of sizes are being forced to adapt to ever changing consumers, rapidly evolving technology and a quickening of the business environment. Opportunities will increasingly need to be ‘discovered’ since technology alone does not constitute a strategy nor is it plug and play in the sense that a new tech overlay cannot compensate for a fundamental legacy infrastructure – whether mindset, technology or organisational structure. Business would do well to begin a process of alignment, using deep-seated changes that fundamentally create change as a guide. Several of the key drivers of changes are forces larger and more complex than many standard industry-level trends normally interrogated by standard strategic tools such as Five Forces.  Even three horizons has been cited as unable to keep up with the rate of change. However, taking the longer view – often beyond ten years – can often feel too abstract, and beyond the job tenure of most CEOs.

As most businesses are becoming aware, either through business model pressures, friction from grafting new technologies onto legacy systems or else organisational issues, a new level of planning is needed. A yearly competitive analysis of predefined competitors no longer suffices. New competitors, new pressures and new opportunities are emerging and cannot be ignored.