Future media

There is little doubt that deep-seated technological disruption is introducing new distribution channels at scale and increasing the speed to market required. Consumers and prosumers remain in flux, with local factors – including demographics, content taste, and infrastructure – complicating matters.


All of this is painfully obvious to many within media companies, whose recent experiences have included significant shifts in both content creation and consumption. More concerning, from an incumbent’s point of view, is the lack of strategic response: only 18% of media company leaders strongly agree they have a clear strategy and mission for disruptive technology[i]. How can strategies for the prime agent of change – the consumer – be crafted in such an environment? Many remain caught between two hugely different business models: the current social media platform model and the traditional publisher model differ significantly, especially with regard to the required revenue per page. It has been suggested that this state of affairs is unsustainable and, because of that, offers no long-term future for traditional media.

When embarking upon digital transformation, industry executives should be asking themselves whether their plans are sufficiently joined-up, and whether they need to become real technology companies or network orchestrators to succeed in an environment defined by platforms and networks[ii]. Either way, there can be little doubt that large chunks of the media landscape will migrate to platforms, and that mastery of other platform typologies will be critical in helping incumbents get there. The overall churn and complexity of the industry conceals pockets of opportunity and clarity that only a strong data platform can reveal.

Platformising operations as well as business offerings represents more than just a technical challenge, however. The mental models that underpin the industry – indeed, the very beliefs that have enabled incumbents to thrive in the past (even the recent past) – are being upended. What will take their place is up for discussion; in all likelihood we will witness a plethora of successful models, albeit with highly uneven demographic and geographic footprints. The losers of tomorrow are altogether easier to spot: reinvention of business models, mental models and operating processes is no longer optional, and those that do not grasp these imperatives will in all likelihood fail.

[i] https://assets.kpmg.com/content/dam/kpmg/xx/pdf/2016/12/disruptive-technologies-barometer-media-report.pdf

[ii] http://knowledge.wharton.upenn.edu/article/platforms-will-disrupt-future-media-entertainment/

Genetic destiny

In a rush of parental anxiety, thousands of Chinese infants have undergone genetic testing with the aim of revealing what aptitudes and talents they may naturally possess or be more inclined to develop. The tests, based on nothing more than a saliva swab, reportedly cost hundreds of pounds[i].


The science behind this may be dubious, but the development alone confirms that we have already entered an age of human genetic mapping, enhancement and even manipulation. As the costs of DNA sequencing continue to fall significantly, an increasing number of people could benefit from complete sequencing. Consider, for example, that even healthy people with no obvious personal or family medical history consistent with genetic diseases have around a 4.5 percent chance of carrying a genetic mutation that will directly impact their healthcare needs – both current and future[ii].

As millions and, in the future, billions of people undergo genomic sequencing as part of (increasingly) standard health care, data analytics will come into its own. Genomes will be compared with life experiences and outcomes – helping uncover hitherto hidden genetic and epigenetic patterns. The potential for curing a range of conditions and diseases expands with the growth of such data. It has even been suggested that such analysis could render assisted fertility the most popular way of producing babies by 2040[iii].

Stephen Hsu, a professor at Michigan State University with an extensive background in genomics, suggests that within years we will be able to predict the future height of a human from cells taken from early-stage embryos; ascertaining intelligence from such cells may be reachable within a decade. Along with the innate ethics of such precise selection, the general spread of genomics presents us with a range of other issues.

Not all countries have explicit legal policies designed to protect consumers from corporate misuse of personal genetic information. Genetic discrimination, whether overt or covert, could flourish without regulatory safeguards – rendering health insurance potentially impossible for some people to obtain. Could such genetic information even enter the realm of work or education? Whilst personalised education could result – pinpointing areas of specific weakness and using methodologies and curricula to help address them – the flipside of abusing such information also looms large. A number of key questions remain unanswered; perhaps the most critical is how we can ensure equitable access to these technologies – and the information contained within – so that populations within a nation, and even globally, do not split into genetic haves and have-nots.

[i] https://www.geneticliteracyproject.org/2017/02/14/chinese-parents-embrace-dubious-dna-tests-uncover-childs-natural-talent/

[ii] https://www.bbvaopenmind.com/en/article/large-scale-whole-genome-sequencing-efforts/?fullscreen=true

[iii] https://qz.com/677335/by-the-year-2040-embryo-selection-will-replace-sex-as-the-way-most-of-us-make-babies/


A network of networks: the future work unit

Digital technologies including the Internet of Things (IoT), cloud, and mobile could render 40 percent of today’s companies irrelevant within a decade[i]. Automation could compound this, since the digital replacements may naturally require less human labour. This would impact a huge swathe of the labour force and further encourage already visible signs of rising freelancing, new SME formation and new forms of employment. One in four students and one in three working professionals already want to start their own business[ii]. It is more than plausible that, with most activity automated, most remaining roles will be mission critical and require a range of skills not typically found in any single example of today’s professionals. Teams, therefore, could become the dominant model.


The infrastructure for more team-based employment is already being established. PwC has created an internal marketplace for extra work tasks, whilst tech tools are evolving to facilitate better collaboration between professionals. Colony Beta is a tool for teams wishing to create their own collaboration network space – ‘a place to work with, incentivize, and track the contributions of a network of collaborators.’ It combines task management with payments and tracking[iii]. Bain predicts that by 2027, most work will be project based, with teams blending internal and external expertise to provide the required skill-set[iv]. Under this scenario, it is possible that formal mentors will appear – helping guide employees from one project to the next. The wider ‘gig economy’ talent platforms could themselves become a tool for increasing employee power, much as unions have traditionally done for manual workers. Whether membership of such ‘guilds’ is monetised or paid for via consumer data access could depend very much on the benefits on offer.

The transition to an agile, team- or network-based work model could also provide an opportunity to disrupt the traditional (and fraying) link between employment and benefits. France, for example, is considering the implementation of a new benefits system with the aim of helping prepare for a new economy. Under the proposed scheme, so-called Individual Activity Accounts (IAAs) given to each adult member of society would accumulate points in a manner not unlike airline miles. Work in both the private and public sectors would contribute points, whilst volunteer or pro-bono community service work could also be factored in. Points would then be used to finance an array of (personalised) benefits as well as lifelong education and training[v]. Such systems could be moderated or even initiated via new work platforms and networks (as, in theory, could universal basic income).
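The airline-miles mechanics described above can be sketched in a few lines of code. This is a minimal illustration only: the class name, point values and benefit costs are all hypothetical, since the French proposal does not specify them.

```python
from dataclasses import dataclass, field

@dataclass
class IndividualActivityAccount:
    """Toy model of an IAA: points accrue from work and volunteering,
    and are spent on personalised benefits or lifelong training."""
    points: int = 0
    history: list = field(default_factory=list)

    def credit(self, source: str, amount: int) -> None:
        # Private- and public-sector work, as well as volunteering,
        # all contribute points, much like airline miles.
        self.points += amount
        self.history.append((source, amount))

    def redeem(self, benefit: str, cost: int) -> bool:
        # Points finance benefits and training; fail if the balance
        # is insufficient rather than going negative.
        if cost > self.points:
            return False
        self.points -= cost
        self.history.append((benefit, -cost))
        return True

account = IndividualActivityAccount()
account.credit("private-sector work", 120)
account.credit("community volunteering", 30)
funded = account.redeem("retraining course", 100)  # leaves 50 points
```

A real scheme would of course sit on shared public infrastructure; the point here is only that the accrue-and-spend logic is simple enough to be moderated by the work platforms the text mentions.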

[i] http://www.businessinsider.com/former-cisco-ceo-500-billion-connected-devices-by-2025-2015-11

[ii] http://bit.ly/2jz95yz

[iii] https://blog.colony.io/colony-beta-product-summary-2121a357d61d#.95reikhrz

[iv]  http://www.bain.com/publications/articles/firm-of-the-future.aspx

[v] https://www.project-syndicate.org/commentary/individual-social-benefits-account-economic-efficiency-by-jean-pisani-ferry-2015-10

Future of learning: decentralised and self-guided?

There is often a lag between the emergence of disruptive factors and an industry undergoing transformational change. Fifty years passed between the emergence of canned food and the can opener, for example: for those fifty years we had the means of preserving food but no modern way of extracting it. Although today’s gaps are in many cases shorter, they can still prove hazardous, not least because we tend to discount technologies and mediums that have appeared but not yet changed the world. Technology and education stand out as one clear example of this.


The internet may have already reduced a few critical barriers to change in educational models – access, availability and cost – but quality has generally remained an obstacle. MIT has recently added an adjunct certification process onto its MOOC offerings[i], boosting the program’s financial viability and removing another barrier to the type of continuous learning advocated by many urging change in our learning and education systems.

Commentators often look for a big-bang moment to confirm a paradigm shift, but the evolution of MIT’s MOOCs suggests that several waves of evolution will be enough to bring radical change. In effect, the new model delivers value for money, is efficient in time and is more targeted than traditional-length degrees. Indeed, David Gelernter, the Yale computer scientist, suggests that ‘…over 90% of U.S. colleges will be gone within the next generation, since students demand value for their money and society demands colleges that work[ii].’ In their place, he suggests, a wave of alternative credentials certified by a range of gatekeepers could emerge, with AI presumably playing a key role in determining where gaps exist in an individual’s education or learning and suggesting optimal – and not necessarily standard – ways to fill them.

Indeed, as the pressure for continuous learning grows, both as a result of automation and of companies’ demand for ever more adaptive workforces, the need for on-demand learning experiences will grow. Thomas Frey of the DaVinci Institute even suggests that ‘…by 2030 the largest company on the internet is going to be an education-based company that we haven’t heard of yet[iii].’ Whilst many companies are struggling to integrate external learning platforms into their learning opportunities, some are at least exploring new ways of putting the employee in charge of the learning experience[iv]. This would seem prudent both for developing a learning culture and organisation and for allowing individuals to maximise their career opportunities in an era soon to be defined by white-collar automation. A personalised approach to learning could similarly inform a new public education discourse. Unlike the wait for the can opener, the key technologies for unlocking this potential are already with us and evolving quickly.

[i] http://www.forbes.com/sites/realspin/2016/09/06/learning-2-0-and-the-future-of-the-disrupted-university/#3cc81d4c2fe9

[ii] https://www.insidehighered.com/blogs/technology-and-learning/david-gelernters-scary-vision-future-higher-ed

[iii] http://www.businessinsider.com/futurist-predicts-online-school-largest-online-company-2016-12

[iv] https://dupress.deloitte.com/dup-us-en/focus/human-capital-trends/2016/fostering-culture-of-learning-for-employees.html


Robo-bosses

The rate of artificial intelligence (AI) development is quickening across a range of typologies, and its expanding set of applications now includes even management. McKinsey has already noted that between ‘…25-30 percent of current activities (at the C-suite level) can already be automated using currently developed technologies,’ let alone the rapidly evolving suite that threatens to take over further.


With AI increasingly influencing management decisions via the provision of evidence-based models, the idea of AI-generated insight being fed to workers through fully AI-driven mediums will become science fact rather than science fiction in the coming years. Corporate visions are already appearing: Bridgewater Associates, a $160 billion hedge fund, has suggested that it would like artificial intelligence software to make 75 percent of all management decisions by 2022[i].

The idea of AI bosses may seem implausible to some, but once you disassociate the notion of jobs from their constituent tasks, it makes more sense. A job continues to evolve in what it does and how it does it after automation. Bank tellers still exist in 2017 (although in reduced numbers), decades after the introduction of the ATM in 1969, but they now have a different skillset and purpose than they used to.

Looking at the chokepoints of business productivity is also instructive. Office workers are estimated to spend a third of their time in the workplace collecting and processing data, but AI could streamline and even eliminate the need to spend time on this[ii]. If management is thought of, in part, as the diffusion of tasks designed to help the company meet its objectives – arrived at by gathering and processing data and then acting upon it – then it becomes clear that AI could have a role. The role, scope and skillset of management would need to evolve as a result, as would broader organisational structures. Flatter structures may be inherently more agile, but transforming many legacy organisations into such a structure would inevitably place change management and organisation design at a premium.
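The gather-process-act loop described above is mechanical enough to sketch. The snippet below is a deliberately toy ‘robo-boss’ – the worker names, skills and the least-loaded-worker rule are all invented for illustration – but it shows how task diffusion reduces to processing data about skills and workload.

```python
def assign_tasks(tasks, workers):
    """Toy robo-boss: gather data (skills, current load), process it,
    then act by diffusing each task to a suitable worker."""
    load = {name: 0 for name in workers}
    plan = {}
    for task, skill in tasks:
        # Gather: which workers' skill sets cover this task?
        capable = [w for w, skills in workers.items() if skill in skills]
        if not capable:
            continue  # a real system would escalate to a human manager
        # Process and act: assign to the least-loaded capable worker.
        chosen = min(capable, key=lambda w: load[w])
        plan[task] = chosen
        load[chosen] += 1
    return plan

# Hypothetical team and backlog.
workers = {"ana": {"analysis", "reporting"}, "ben": {"reporting"}}
tasks = [("q1-report", "reporting"),
         ("forecast", "analysis"),
         ("q2-report", "reporting")]
plan = assign_tasks(tasks, workers)
# "ana" takes the first two tasks; "ben" picks up "q2-report"
# once "ana" is more heavily loaded.
```

Even this crude rule illustrates the point in the text: the value of the human manager shifts from routine allocation to handling the cases the rules cannot.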

Ultimately, as we suggest in our free What’s Hot in Technology 2017 paper, the move to robo-bosses would require the creation of a robust and flexible data infrastructure. AI relies on data and could, in some circumstances, accentuate and spread the negative impacts of low-quality data. Strategically implemented, however, robo-bosses could usher in new work forms, new forms of value and a boom in productivity.

[i] http://www.nextbigfuture.com/2016/12/160-billion-hedgefund-want-artificial.html

[ii] http://blogs.opentext.com/top-tech-trends-2017/

Zero UI

The notion of zero UI (user interface) is increasingly accepted as we move away from screens. The emergence of automatic and predictive technologies, coupled with ever-decreasing device sizes, means that the next step could involve humans becoming the next UI[i]. Although the early iterations of Google Glass failed to take off, they are indicative of the drive for ever more accessible, augmented and ‘natural’ data and information retrieval and creation. With early examples of making our skin the interface for computing, we are indeed on the verge of a long-lasting and profound change.


Zero UI will expand beyond voice tech such as Alexa: first into chatbots and voice biometrics, and later into face recognition technology, gesture control and haptic feedback[ii]. Haptic feedback is already evident in various guises; for example, Google’s Project Soli makes ‘your hands the only interface you need’ by using radar to detect fine movements[iii]. The impact will extend beyond the consumer sphere; the world of work could be upended – from office design and requirements to the skills needed for a range of job tasks.

Ultimately, Zero UI is both a customer experience and a data issue. Intrinsically this makes it a design issue, and one that will need to account for interoperability between many systems in the future[iv]. Design thinking can help ensure that the focus remains on the consumer experience rather than the interface.

Zero UI can also help both in leveraging data and, perhaps more fundamentally, in understanding user intent. The importance of the latter lies in the ability to design and build personalised experiences that are not only relevant but able to anticipate user needs. Zero UI can also create new experiences that help connect the digital and physical worlds – providing a new medium for media, entertainment, commerce and a host of other industries[v].

For additional information on Zero UI and what it means for organisations, check out our free What’s Hot in Technology 2017 paper. In addition, the paper features a range of other technologies that are set to impact organisations in the coming year and beyond.

[i]  http://www.fastcodesign.com/3048139/what-is-zero-ui-and-why-is-it-crucial-to-the-future-of-design

[ii]  http://www.techradar.com/news/beyond-voice-control-the-future-of-the-zero-user-interface

[iii] https://blogs.adobe.com/creativecloud/zero-ui-designing-for-screen-less-interactions/

[iv]  http://www.techradar.com/news/beyond-voice-control-the-future-of-the-zero-user-interface

[v] https://blogs.adobe.com/creativecloud/zero-ui-designing-for-screen-less-interactions/

Wearables 2.0

Wearables 1.0 are reasonably entrenched consumer goods, despite the ostensible failure of Google Glass, albeit mostly in prosaic forms such as fitness trackers. As the range of applications rises and the data generated increases, these devices are likely to become ever more interlinked and, thanks to machine learning, adaptive and personalised.


Health is one prominent area of wearables application, with bioengineers creating sweat-based sensors to monitor glucose[i]. Other wearables have been developed that monitor both biochemical and electric signals in the human body[ii] and even provide (as a skin implant) 16 years of birth control.

As the field of neuroscience develops, mental health and wellbeing could become the dominant part of the health ecosystem. Researchers have developed temporary nanotechnology ‘tattoos’ able to map emotions[iii], whilst new devices have been created that purportedly stimulate the brain to boost both academic and athletic performance[iv]. With allusions to productivity and hitherto unexplored areas of neuro-management, it is perhaps unsurprising that 81% of CIOs believe wearables will have a role in the workplace, and that retailers see wearables as forming a key part of their future immersive retail vision.

As a result, products that create a tailored experience for both enterprise users and consumers are likely to emerge, channelling and compiling input and data from multiple wearable devices and generating actionable insights. The ever-widening range of wearable utility, along with the sheer growth in the number of wearable devices and the IoT traffic they generate, could easily overwhelm individual consumers, as well as the information infrastructure and architecture of many businesses and organisations looking to capitalise on such data. Developing a hub for controlling, correlating, synthesising and extracting key takeaways is clearly important.
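What such a hub does with the raw streams can be sketched in miniature. The snippet below is an illustrative toy only – the device names, metrics and the 25% deviation threshold are invented – but it shows the correlate-synthesise-flag pattern the text describes.

```python
from statistics import mean

def summarise(readings):
    """Toy wearables hub: pool readings from multiple devices by
    metric, then emit a per-metric summary with a simple alert flag."""
    by_metric = {}
    for device, metric, value in readings:
        # Correlate: group streams from different devices by metric.
        by_metric.setdefault(metric, []).append(value)
    summary = {}
    for metric, values in by_metric.items():
        avg = mean(values)
        # Synthesise a key takeaway: flag a metric whose latest
        # reading deviates sharply (here, >25%) from its average.
        flagged = abs(values[-1] - avg) > 0.25 * avg
        summary[metric] = {"avg": round(avg, 1), "flag": flagged}
    return summary

readings = [
    ("wrist-tracker", "heart_rate", 62),
    ("chest-patch",   "heart_rate", 64),
    ("sweat-sensor",  "glucose",    90),
    ("wrist-tracker", "heart_rate", 110),  # sudden spike
]
report = summarise(readings)
# The heart-rate spike is flagged; the lone glucose reading is not.
```

A production hub would obviously need time alignment, device trust levels and clinically validated thresholds; the architectural point is simply that the synthesis layer, not any single device, is where actionable insight emerges.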


Whilst the smartphone is likely to serve as such a hub in the short term, virtual personal assistants not bound to a screen are likely to proliferate. Virtual, augmented and mixed reality combined with haptics hold the potential for displaying data in a more meaningful and personalised way than can be done with a screen. With numerous wearables – including contact lenses able to take photos and provide social media updates – already in the labs, the human body, rather than the screen, is set to become the next interface. All of which raises the question: at what point do wearables in all their diversity – implantables, ingestibles and injectables – become part of us, and no longer ‘wearable’?

[i] http://phys.org/news/2016-10-bioengineers-sweat-based-sensor-glucose.html

[ii] https://www.eurekalert.org/pub_releases/2016-05/uoc–etf051916.php

[iii] https://techxplore.com/news/2016-07-nanotech-tattoo-emotions-muscle.html

[iv] https://www.fastcompany.com/3058464/startup-report/a-new-device-stimulates-the-brain-to-boost-athletic-performance