Through the digital age and beyond

One of the first large-scale automatic digital computers – the Mark 1 – was created in the early 1940s, with imposing specifications: 8ft high, 50ft long, 2ft deep, and weighing 5 tonnes[i]. By 1946 the computer age had been officially established, with the U.S. Army's ENIAC computer and its 18,000 vacuum tubes able to better calculate the paths of projectiles. The U.S. Army had spent over half a million dollars (equivalent to several million in current dollars) on what was then a risky and unproven technique[ii]. For decades computing remained the preserve of militaries and big corporations. It wasn't until the 1970s that we saw the start of the digital age – or third industrial revolution – with mass personal computing taking root with the 1977 launch of the Apple II.

The revolution marked by the incipient digital age witnessed the rise of electronics, telecommunications and computers. It also saw an era of emergent automation[iii]. By the late 1980s, several threads born in the early digital age – networks of linked personal computers and minicomputers – had merged to create the internet.

It is a sign of how slow our companies are to respond to technological change – in terms of culture, organisation and even business models – that close to fifty years after the digital age started, we are still talking about digital transformation as a game changer for industries and businesses. That does not mean that the digital era's impact will quickly dissipate, but digital is no longer the driving force of innovation. $1.25 trillion is forecast to be spent on digital transformation initiatives globally in 2019, a number expected to climb to $1.97 trillion in 2022[iv]. The importance of digital cannot be downplayed, but very little of this investment will redefine industries or businesses: being digital is now table stakes, the bare minimum needed for doing business, and no longer a source of differentiation or competitive advantage[v].

It is speculated that the end of the digital era will be definitively marked by an increased level of consumer trust. 'When we start to trust technology again,' notes TechRadar, '…we'll know that we're in a post-digital world: where businesses aren't infatuated with technology, but are dedicated to people[vi].'

It seems somewhat ironic that an intelligent era should be marked by a dedication to people. Being surrounded by technology will enable personalisation to a degree never seen before. In many ways the intelligent age that we are now entering is a natural evolution of the digital age, but the change between the two marks an important juncture in how we think about change. Some are dubbing the transition 'digital v3', but something deeper is changing than a mere enhancement of the digital age. It has been suggested by Accenture that while '…technology for the last 25 years has been something we can use, we're entering a new era where technology is who we are[vii].'

At the same time there is a need to look at all changes simultaneously – we are no longer in a time where we can pick the top three or even five technologies. This represents a fundamentally different way of thinking[viii]. Distributed ledgers, artificial intelligence, extended reality and quantum computing all have the potential to bring about not just new processes and new ways of doing things, but different things. However, '…as powerful as today's emerging technologies are, they will yield only incremental gains if their application is limited to existing processes[ix].'

It's an open question whether we are ready – as businesses, societies and bodies politic – for the repercussions of breakthrough technologies such as machine learning and biotechnology. Fundamental transformations will impact a range of industries, including computing, healthcare, energy and transportation[x]. The future, for all of these industries and more, promises more change in the next decade than in the previous 40 to 100 years.

The intelligent age is upon us, and the age of 'linear' tech change may be at an end. Given a range of changes across multiple domains, we are entering the era of exponential change. At the same time, the main role in bridging discovery and commercialisation is increasingly falling to private concerns and away from government. Twinned with the range of technological changes, this promises to complicate attempts at regulating what promise to be powerful technologies. Nowhere is this more the case than with the promise of biotech.

Steve Jobs once remarked that '…the biggest innovations of the 21st century will be at the intersection of biology and technology[xi].' Biology is fast becoming the new digital, and biotechnology (broadly speaking, the combination of the two) a key driver of the future economy. Using living organisms to make products or manipulate existing processes could fuel innovation across healthcare, industry, the food sector and beyond. If DNA does indeed emerge as the new silicon, as is suggested, then, as Wired proclaims, '…biology will be the next great computing platform[xii].' The post-human age, the intelligent age and the biotech age are all upon us – how they interact not only with each other, but with our ill-prepared systems, will help define multiple paths of change.

 

| Age | Tech age | Dates | Characteristics |
| --- | --- | --- | --- |
| Computer | Tail end of the second industrial revolution | 1946-1969 | Initially unproven and costly; benefits restricted to militaries and big businesses. By 1960 the transistor, invented in the 1940s, was reliable enough to replace the fragile vacuum tubes of an earlier day; by the end of the 1960s the silicon chip had become the principal device in computer processors and was beginning to replace memory cores as well. |
| Digital | Third industrial revolution | 1970-2019 | Era of personal computing, also called the information age; sees the spread of electronics, robotics, networks and global telecoms. In 1979 a program called "Visicalc" appeared for the Apple II: it manipulated rows and columns of figures known to accountants as a "spread sheet", only much faster and more easily than anyone had imagined possible. A person owning Visicalc and an Apple II could do things that even a large mainframe could not do easily. |
| Internet | Third industrial revolution | 1990 | Networking is entrenched; HTML is developed and the world wide web opens to the public in 1991. The internet turns 40 in 2009. |
| Intelligent | Fourth industrial revolution | 2016- | In 2016 Google unveils Google Assistant, a voice-activated personal assistant program, marking the internet giant's entry into the "smart" computerised assistant marketplace alongside Amazon's Alexa, Apple's Siri and Microsoft's Cortana. Quantum computing, AI and blockchain combine to remake industries en masse and society at large. |
| Biological | Exponential era | 2020s | Biology could be the next great computing platform and a key driver of transformations across a range of industries, from construction to health. |
| Post-human | Exponential era | 2030s | The intelligent and biological eras combine to create post-humanity; nanotech, brain implants and more become mainstream advancements. |

[i] Source: Open Book Project, 2017 http://openbookproject.net/courses/intro2ict/history/history.html

[ii] Source: BBVA Openmind, retrieved 2019 https://www.bbvaopenmind.com/en/articles/the-trajectory-of-digital-computing/

[iii] Source: Sentryo (Cisco) 2017 https://www.sentryo.net/the-4-industrial-revolutions/

[iv] Source: Accenture, 2019 https://www.accenture.com/us-en/blogs/blogs-paul-daugherty-digital-transformation

[v] Source: Accenture, 2019 https://www.accenture.com/us-en/blogs/blogs-paul-daugherty-digital-transformation

[vi] Source: TechRadar, 2019 https://www.techradar.com/news/were-already-in-a-post-digital-era-so-what-comes-next

[vii] Source: Accenture, via Wired, 2018 https://video.wired.com/watch/wired25-accenture-cto-paul-daugherty-on-reimagining-the-future-of-business

[viii] Source: ThinkAdvisor, 2018 https://www.thinkadvisor.com/2018/09/17/from-robo-advisors-to-brain-chips-the-future-of-ai/?slreturn=20190915101238

[ix] Source: BCG, 2019 https://www.bcg.com/publications/2019/company-of-the-future.aspx

[x] Source: Forbes, 2016 https://www.forbes.com/sites/gregsatell/2016/07/17/a-new-era-of-innovation/#17861dca5a4e

[xi] Source: O’Reilly, 2017 https://www.oreilly.com/ideas/how-synthetic-biology-startups-are-building-the-future-at-rebelbio

[xii] Source: Wired, 2017 https://www.wired.com/story/biology-will-be-the-next-great-computing-platform/

Feeding the future

The world's population is set to grow by 2.2 billion between now and 2050[i], and given that this growth comes in tandem with rising prosperity, we are forecast to need 70 percent more food than we did in 2009[ii]. Despite the serious challenge of climate change, it is thought that a range of technologies, from GPS and drones to robotics, will help achieve much of the needed gains.


For a sustainable breakthrough, however, the future of food is going to have to depart radically from its traditions. This is unavoidable if we are to meet a growing array of environmental, economic and social needs. Business as usual plus technology, as in so many other industries, will probably not meet the multiple demands being placed on food producers.

Given the water intensiveness of meat, for example, alternatives are needed. Indeed, the alternative meat industry could grow into a $140bn market by 2030, and AT Kearney believes that by 2040, 60 percent of all meat will either be grown in vats or come from textured plant proteins[iii]. It is perhaps worth noting that 3D printing could also become a viable production method for proteins, however unappetising that sounds. It will be interesting to see how such trends collide with social trends: the global halal food market is expected to reach $739 billion by 2025, up from $436 billion in 2016, as the number of Muslims approaches one third of Earth's total[iv].

Regardless, the shift from industrialised agriculture to scientific agriculture could be one of the most important changes of the last hundred years or so. UBS notes that '…the ability to grow food in a lab that replicates meat, fish, eggs, and dairy products — with a lower carbon footprint and without the need to slaughter animals — is likely to become a commercially viable option in the next decade[v].' The World Economic Forum, meanwhile, suggests that food computers could be the future of agriculture[vi]. Precision fermentation and the 'food-as-software' trend are helping to dramatically lower the cost of manufactured proteins. One report suggests that by 2030, the U.S. market for ground beef could shrink by 70 percent, the steak market by 30 percent and the dairy market by almost 90 percent[vii].

What food producers should do about such trends depends on their skills, networks and even marketing. What they cannot do is ignore them. Fully autonomous farm equipment is already becoming commercially available, meaning machines will be able to completely take over a multitude of tasks[viii]. Farming is already, in places, a highly digital industry. If farmers and others in the value chain are to remain relevant, they need to build bridges between what they do today and what food production could look like tomorrow.

[i] Source: UN, 2018 https://news.un.org/en/story/2018/10/1023371

[ii] Source: CNN, 2019 https://www.cnn.com/2019/04/01/business/5g-farming/index.html

[iii] Source: CNBC, 2019 https://www.cnbc.com/2019/05/23/alternative-meat-to-become-140-billion-industry-barclays-says.html

[iv] Source: PRNewswire, 2018 https://www.prnewswire.com/news-releases/the-global-halal-food-market-is-expected-to-reach-usd-73959-billion-by-2025-300621035.html

[v] Source: Live Kindly, 2019 https://www.livekindly.co/vegan-meat-market-85-billion-2030/

[vi] Source: World Economic Forum, 2019 https://www.weforum.org/agenda/2019/04/the-future-of-agriculture-is-computerized

[vii] Source: Food Navigator, 2019 https://www.foodnavigator-usa.com/Article/2019/09/17/By-2030-the-US-dairy-and-cattle-industry-will-have-collapsed-predicts-RethinkX

[viii] Source: Independent, 2019 https://www.independent.co.uk/life-style/gadgets-and-tech/robots-farming-autonomous-equipment-canada-australia-a8919836.html

The futures gap

The gap between identifying change and acting upon it is where many organisations wither and die, as Kodak and others stand testament. While most firms believe they're 'picking up on signals of change' that might disrupt their lines of business, 42 percent admit that they're unable to act on those signals[i]. While signals of change come from multiple directions, the technological driver is key. It is forecast that enterprises dithering on embracing changes in technology stand to lose 46 percent of their revenues by 2023[ii].


While many pieces rightfully assert that digital transformation is not solely, or even primarily, a technological change, technology does have the ability to catalyse a range of other organisational changes. AI, blockchain and the edge will all accelerate the rate of change in how data is produced, transmitted, analysed and consumed, for example[iii]. Furthermore, several of these technologies, perhaps most notably machine learning, promise to redraw organisational design itself, potentially replacing silos with a '…powerful core of purpose and culture' that also sees a task-driven, as opposed to job-driven, organisation[iv].

Without experimenting with new technology, using it to enable different services and products, and ultimately changing how organisations do business, the futures gap is set to grow uncomfortably wide for many organisations. 75 percent of executives, for example, agree that their organisations will need to make significant changes to keep up with rising customer expectations[v]. Concurrently, 80 percent say that their own culture has to evolve over the next five years to allow their company to succeed and grow[vi], yet shifting this – especially within large organisations – is a difficult, often iterative task, not without its own significant risks.

For those stuck in the futures gap – identifying opportunities and challenges but unable or unwilling to do anything about them – a couple of likely paths open up. For the largest organisations it may even be possible to defer change until it is critical and, like Microsoft, purchase your way into new business lines, cultures and revenue streams. Few, if any, have this luxury; the rest must look to rapidly close the gap between what is possible and business as usual. Tools like 3H can help, although the gap could become unbridgeable for those without a strategy to overhaul what they do, how they do it and why they do it. It remains plausible that several industries could fall – en masse – into the futures gap should digital and intelligent capabilities honed in emerging market segments play better than incumbent offerings.

[i] Source: Forbes, 2019 https://www.forbes.com/sites/robertbtucker/2019/10/01/seven-trends-driving-the-future-of-innovation/#5196a95f6738

[ii] Source: MoneyControl, 2019 https://www.moneycontrol.com/news/business/companies/enterprises-dithering-on-embracing-changes-in-technology-stand-to-lose-almost-50-of-their-revenues-report-4500941.html?linkId=100000008499135

[iii] Source: Deloitte, 2019 https://www2.deloitte.com/us/en/insights/topics/digital-transformation/digital-transformation-financial-services-boards.html

[iv] Source: Forrester, 2019, via Consulting https://www.consulting.us/news/2493/automation-to-push-dynamic-and-adaptive-work-says-forrester

[v] Source: EIU, 2019 https://perspectives.eiu.com/technology-innovation/integrated-transformation-how-rising-customer-expectations-are-turning-companies-outside/white-paper/integrated-transformation-how-rising-customer-expectations-are-turning-companies-outside

[vi] Source: PwC https://www.strategyand.pwc.com/gx/en/insights/global-culture-survey.html

Changing sectors at the Edge

The coming volume of data will have a range of impacts. It will challenge current tech models and architectures – by 2025, nearly 30 percent of data generated will be real-time[i] and more than 50 percent of data is forecast to be managed autonomously[ii]. It will also force industries to collide, whether through new data models, third party data arbitrage or even more direct partnerships and collaborations.


The edge could also reshape what it is that industries do. Take insurance, for example: at present the industry adheres to its higher purpose of 'protection' by compensating after loss and allowing people to take on risk that would otherwise be too large. Edge-powered IoT devices will likely shift the nature of the insurance industry from 'compensation' to 'prevention', and in doing so draw new players into the space. If insurers do not use their natural comparative advantages of rigour and distribution to distinguish such a strategy, the space is likely to be explored by utilities and even big tech companies with footholds in the smart home environment.
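
To make the compensation-to-prevention shift concrete, below is a minimal sketch in Python of the kind of edge-side rule that turns telemetry into prevention rather than a later claim. The device names, thresholds and callbacks are entirely hypothetical, for illustration only, not any insurer's or vendor's actual system.

```python
# Illustrative sketch only: hypothetical thresholds and device hooks.
from dataclasses import dataclass


@dataclass
class PressureReading:
    device_id: str
    psi: float  # mains water pressure at the sensor


LEAK_DROP_PSI = 15.0  # hypothetical: a sudden drop suggesting a burst pipe


def likely_leak(previous: PressureReading, current: PressureReading) -> bool:
    """Flag a probable leak from a sudden pressure drop between readings."""
    return (previous.psi - current.psi) >= LEAK_DROP_PSI


def on_reading(previous, current, close_valve, notify):
    # The essence of 'prevention': the decision is made locally and
    # immediately, before damage occurs, rather than compensated afterwards.
    if likely_leak(previous, current):
        close_valve()  # shut the water off at the edge device
        notify("Possible burst pipe detected; water supply closed.")
```

In a compensation model the same telemetry would, at best, speed up a claim; in a prevention model the decision happens on the device itself, which is why control of the device layer matters so much to insurers, utilities and big tech alike.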

It is not the only industry likely to undergo profound change in purpose or practice thanks to the edge. As-a-service models, exemplified by Rolls Royce selling outcomes, as opposed to engines, to a range of aviation customers, are likely to spread into whole new sectors once data on the use of a whole range of objects becomes available in real time. Retailers, for example, could move beyond providing augmented reality mirrors to offering a personalised selection for customers to peruse based on real-time, contextual information. Faster, better, real-time data could also enable financial services to be built into products and services or offered on an on-demand basis, while logistics could decentralise into P2P-type services.

Healthcare, like insurance, could also move into a more preventative realm, with real-time recommendations and feedback likely shifting elements of the industry into a life-coach role. How this could interplay with others – from insurers to sportswear manufacturers – will likely spread the opportunities and challenges of such changes far beyond the confines of traditional industry barriers.

This will render traditional planning obsolete – today's strengths can rapidly morph into irrelevance once the rules of the game are changed. The edge, combined with 5G, will force every organisation to reaffirm, and perhaps readdress, the question of just what business they are in and what it is they do to achieve their higher purpose. Mental models, the nature of competition, talent requirements and the competencies needed to play in future spaces will all change.

 

[i] Source: ZDNet, 2018 https://www.zdnet.com/article/by-2025-nearly-30-percent-of-data-generated-will-be-real-time-idc-says/

[ii] Source: Oracle, 2019 http://www.oracle.com/us/solutions/cloud/oracle-cloud-predictions-2019-5244106.pdf

Cybersecurity at the Edge

The future of computing lies at the edge of networks, such as with IoT-linked devices and infrastructure. While today only about 20 percent of enterprise data is produced and processed outside of centralised datacentres, by 2025 that is expected to rise to 75 percent, and could reach 90 percent[i].


These forecasts, if realised, would imply that even handheld devices will have AI capabilities built into them without outsourcing the heavy lifting to large servers, something that would otherwise be next to impossible. All data could therefore be processed in near real time, at the edge of networks such as the IoT. Information technology strategies, consumer behaviour and the architecture within which to operate would all shift as a result, some in unpredictable ways, as the edge economy potentially approaches $4.1 trillion by 2030[ii]. As it grows in prominence, so will the cybersecurity issues associated with it.
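
As a rough illustration of what processing at the edge means in practice, the sketch below (Python, with simulated sensor readings standing in for real drivers) summarises a window of raw samples locally so that only a compact digest ever leaves the device. It is a toy under stated assumptions, not a reference edge architecture.

```python
# Toy edge-side aggregation: raw data stays on the device; only a small
# summary travels over the network. Sensor readings are simulated here.
import random
import statistics


def read_sensor() -> float:
    """Stand-in for a real sensor driver; simulated for this sketch."""
    return 20.0 + random.gauss(0, 0.5)


def collect_window(n_samples: int = 6000) -> list[float]:
    # A real device would sample on a timer (e.g. 100 Hz for a minute);
    # here we simply draw the same number of simulated readings.
    return [read_sensor() for _ in range(n_samples)]


def summarise(samples: list[float]) -> dict:
    # Thousands of raw samples reduced to a handful of numbers: the
    # bandwidth and latency argument for the edge in miniature.
    return {
        "mean": statistics.fmean(samples),
        "max": max(samples),
        "stdev": statistics.pstdev(samples),
        "n": len(samples),
    }


if __name__ == "__main__":
    print(summarise(collect_window()))  # only this digest leaves the device
```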

IT ecosystems will increasingly need to exist 'out there' – at the edge – rather than within the organisational walls. With most IoT-using organisations having limited visibility into their networks, let alone their exposure to IoT cyber risk, new standards will likely be needed. Security and privacy controls will need to be built at the edge and be an intrinsic part of every device and network. Where possible, security will have to be built into the edge device itself, which opens up some potentially interesting collaboration models and partnership possibilities.

Edge computing will also bring about significant changes to organisational IT architecture[iii]. Given the networked nature of the edge, the likely creation of ecosystems around edge data, and the increasingly intertwined nature of IT systems, approaching edge cybersecurity at the ecosystem level is increasingly necessary to protect potentially weaker links in the cybersecurity chain, such as third parties.

71 percent of CEOs already state that they see information security as a strategic function and a source of competitive advantage[iv]. However, the cost of data breaches could reach $5 trillion yearly by 2024[v], complicated by edge technology '…as dependency on complex internet-enabled business models outpaces the ability to introduce adequate safeguards that protect critical assets[vi].' If infosec strategy is to thrive in an edge era, new security and data architectures that span multiple organisations and even industries will need to emerge. Until then, organisations will likely have to impose '…zero trust concepts where they can't trust the network, have to authenticate use, and have to understand what data is actually resident there[vii],' and plan for the fact that the edge is dynamic. As with other technologies, organisational change would seem a must if the edge's benefits are to accrue effectively.
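
What a zero-trust posture can look like at the code level is easier to see with a toy example. The Python sketch below (illustrative only, not a production design; the shared key and message format are assumptions) authenticates every message with a signed, time-limited token, so nothing is taken on trust from the network it arrived over.

```python
# Toy zero-trust check: every request must carry a valid, fresh signature;
# the network itself is never trusted. Illustrative sketch only.
import hashlib
import hmac
import time

SHARED_KEY = b"per-device-secret-provisioned-at-setup"  # hypothetical
MAX_AGE_SECONDS = 30  # reject anything older, to blunt replayed traffic


def sign_request(payload: bytes, timestamp: int, key: bytes = SHARED_KEY) -> str:
    msg = timestamp.to_bytes(8, "big") + payload
    return hmac.new(key, msg, hashlib.sha256).hexdigest()


def verify_request(payload: bytes, timestamp: int, signature: str,
                   key: bytes = SHARED_KEY) -> bool:
    if abs(time.time() - timestamp) > MAX_AGE_SECONDS:
        return False  # stale message: treat as untrusted
    expected = sign_request(payload, timestamp, key)
    return hmac.compare_digest(expected, signature)  # constant-time compare


# Usage: the edge device signs each message; the receiving service verifies
# it per request, regardless of which network segment it came from.
ts = int(time.time())
sig = sign_request(b'{"temp": 21.4}', ts)
assert verify_request(b'{"temp": 21.4}', ts, sig)
```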

 

[i] Source: TechTarget https://searchcio.techtarget.com/feature/The-shift-to-edge-computing-is-happening-fast-heres-why

[ii] Source: FierceWireless https://www.fiercewireless.com/wireless/edge-internet-economy-to-reach-4-1t-by-2030-analyst

[iii] Source: TechTarget https://searchcio.techtarget.com/feature/Edge-computing-architecture-takes-on-a-partner-ecosystem

[iv] Source: KPMG https://home.kpmg/xx/en/blogs/home/posts/2019/09/cyber-as-a-competitive-advantage.html

[v] Source: Silicon Republic https://www.siliconrepublic.com/enterprise/data-breaches-cost-2024-juniper-research

[vi] Source: Raconteur https://www.raconteur.net/technology/cybercrime-business

[vii] Source: Information Age https://www.information-age.com/cyber-security-for-iot-and-edge-computing-123485616/

 

Outside regulation, or self-regulation?

Once seen as an onerous burden, regulation – its nature, breadth and impact – is set to become a key strategic facet for many organisations. Those with strategic foresight will look to pre-empt it and, in conjunction with others, help shape the direction of regulation to come.


On October 3rd, 2019 it was reported that '…European courts can now force Facebook to scrub illegal content worldwide.' Reuters reported that Facebook and other platforms can also be made to comply with requests to take down content globally, even in countries where it is not illegal[i]. How the governments of the United States and other powers react to a foreign law that could easily be seen as subverting freedom of speech in their own countries will likely add to what is already, technologically speaking, a patchwork of privacy laws, lack of transparency and general incoherence that protects neither competition nor customers[ii]. Either way, Europe's gambit could have lasting geopolitical clout. MIT Sloan believes that '…the first country to figure out the best way to regulate the broader tech industry could become the focal point for the next chapter of the world's digital revolution[iii].'

Indeed, the European Commission's digital department has already recommended a regulatory framework for AI that would set transparency obligations on automated decision-making[iv]. Could AI regulation be the next GDPR? Wired notes that '…intelligent systems at scale need regulation because they are an unprecedented force multiplier for the promotion of the interests of an individual or a group[v].' Digital reality will also likely need regulatory guidelines[vi], requiring business and government to work together.

An example of how this can be done lies in the sandbox format. Already adopted by regulators of autonomous vehicles, virtual currencies and fintech, sandboxes provide a safe environment to encourage innovation while protecting consumer safety. 'For example, the United Kingdom's Financial Conduct Authority launched the first fintech regulatory sandbox in June 2016, allowing fintech players to test innovative products and services in a safe, live environment, with the appropriate consumer safeguards, and, when appropriate, exempt from some regulatory requirements[vii].'

While waiting, or planning, for the future, organisations of all types would do well to police themselves. A Stanford study, for example, found that companies that try to fix problems on their own may sidestep more onerous regulations in the future[viii], not to mention avoid damaging consumer trust and ceding brand value. If the future of business is trust, self-regulation is a must in developing future-proof products and services.

[i] Source: Reuters, 2019 https://uk.reuters.com/article/uk-eu-alphabet-content/facebook-can-be-forced-to-remove-content-worldwide-after-landmark-eu-court-ruling-idUKKBN1WI0R7

[ii] Source: MIT Sloan Management Review, 2019 https://sloanreview.mit.edu/article/the-right-way-to-regulate-the-tech-industry/

[iii] Source: MIT Sloan Management Review, 2019 https://sloanreview.mit.edu/article/the-right-way-to-regulate-the-tech-industry/

[iv] Source: Politico, 2019 https://www.politico.eu/article/ai-data-regulator-rules-next-european-commission-takes-aim/

[v] Source: Wired, 2019 https://www.wired.com/story/ai-algorithms-need-drug-trials/

[vi] Source: Deloitte, 2019 https://www2.deloitte.com/us/en/insights/industry/public-sector/regulating-digital-reality-augmented-spaces.html

[vii] Source: Deloitte, 2019 https://www2.deloitte.com/us/en/insights/industry/public-sector/future-of-regulation/regulating-emerging-technology.html

[viii] Source: Fast Company, 2019 https://www.fastcompany.com/90345349/why-companies-should-police-themselves-better-than-the-politicians

Working with robots

Much has been written about the future impact of automation, with headline numbers on the percentage of jobs to be replaced the most common. Many of these numbers seem predicated on technological viability and economic rationale alone. Automation, however, remains a strategic challenge for organisations, not a strictly technological one.


Consider, for example, that 44 percent of organisations have not yet determined how their automation strategies will affect their workforce[i]. Without such an assessment, not only do organisations and their workers miss out on appropriate training and upskilling opportunities, but the challenge of transitioning norms, working practices and culture also goes unaddressed.

Over half of employees worldwide currently feel threatened by automation, with 77 percent wanting to learn new digital skills[ii] so they can take on the new roles and jobs that automation will help create. In addition, 73 percent of business leaders cite company culture as the single most important contributor to corporate success[iii]. Unless '…executives are proactive in shaping and measuring culture, approaching it with the same rigor and discipline with which they tackle operational transformations[iv],' it is almost impossible to see how any corporate culture survives the near-constant disruption brought about by AI.

Working with and alongside robots (and how this impacts person-to-person contact in the workplace) will prove one of the key challenges for workers of the 21st century, from both a cultural and a skills perspective. Ensuring digital and cultural readiness amongst the workforce, within an appropriate structure, will be key to addressing an area with which current change management approaches cannot sufficiently cope.

Some have even suggested that '…it's time for a C-level role dedicated to re-skilling workers[v].' It is certainly time to elevate organisational continuous learning to a board-level priority. Accenture, for example, developed a 'Job Buddy' programme that helped to retrain almost 300,000 employees in the four years to January 2019. The programme assesses which roles are most likely to be automated, offers advice on which adjacent roles can be learned within the company and provides relevant training. Within 18 months of launching the pilot, 85 percent of employees for whom it was made available had used the system to assess their current job and enrol in further training[vi]. With 76 percent of executives believing internal talent mobility is important, but only 6 percent of companies believing they are excellent at moving people from role to role[vii], such programmes are likely to become more popular.

Digital skills may be vital, but if automation is to yield its true potential, management and leadership skills are the ones most needed, both in crafting win-win proposals and building the core skills needed for tomorrow’s environment.

 

[i] Source: Deloitte, 2019 https://www2.deloitte.com/us/en/insights/focus/technology-and-the-future-of-work/intelligent-automation-technologies-strategies.html

[ii] Source: Business Review, 2019 http://business-review.eu/business/human-resources/over-half-of-employees-worldwide-feel-threatened-by-automation-77-pct-want-to-learn-new-digital-skills-pwc-study-finds-204600

[iii] Source: HR Grapevine, 2019 https://www.hrgrapevine.com/content/article/2019-10-08-3-in-4-business-leaders-say-corporate-culture-is-central-to-success

[iv] Source: The Financial Brand, 2018 https://thefinancialbrand.com/73953/banking-culture-change-digital-transformation-cx-risk/

[v] Source: HBR, 2019 https://hbr.org/2019/09/its-time-for-a-c-level-role-dedicated-to-reskilling-workers

[vi] Source: Business Insider, 2019 https://www.businessinsider.com/training-employees-on-new-skills-and-technology-what-accenture-learned-2019-1

[vii] Source: InnerMobility, 2019 https://www.innermobility.com/why-internal-talent-mobility-is-broken/