The creative economy

Andy Haldane is Chief Economist at the Bank of England

It is creativity, and its role in improving incomes in the economy and well-being in society, that I will discuss. I hope that by analysing creativity through an economic and historical lens we can learn something about its key ingredients. Developing those raw ingredients, and mixing them appropriately, has been crucial for social and economic progress over the course of history. And what has been true of the past is likely to become even more important for driving economic and societal improvement in future.

The Bank of England is a public institution whose role is to serve the public good. We have been doing so for 325 years. The issues I will discuss – stability in our economies and societies, economic and social progress, the impact of innovation and ideas, the role of education and insurance – have remained central to public policy throughout that time. They will remain so in the period ahead.

To summarise up front my line of argument: Imagination and creativity are what set humans apart from other animals. They explain why human evolution has been jet-propelled, while other animals have proceeded on foot. Through their creativity, humans have followed a completely different evolutionary arc – so-called Life Version 2.0.

Innovation and creativity are the wellspring of improvements in economies and societies. They explain the secular, spectacular rises in global living standards and well-being now seen for several centuries. Human imagination and intelligence have been the engine of progress and growth.

Creativity and innovation can come at a cost. The US economist Joseph Schumpeter spoke of “creative destruction.” He was right. With innovation can come a loss of livelihood and a straining of the social fabric. That has been true of every industrial revolution. These disruptive side-effects do not self-heal.
They need to be managed if the fruits of innovation are not to rot in the fields. That means social safety nets to protect the livelihoods of those disrupted by innovation. And it means social institutions supporting the up-skilling of people. The GSA is a social institution created precisely to meet the latter need.

The next 50 years may bring opportunities and challenges every bit as great as any in the past. The Fourth Industrial Revolution will offer huge potential to individuals, economies and societies. But the engine of growth may change, with artificial rather than human intelligence playing a much larger role. Some have called this Life Version 3.0. The creative forces unleashed by the Fourth Industrial Revolution will need to be carefully managed to avoid a tear in the social fabric. This is likely to require new sets of skills, with creative and social skills at a greater premium. It may require new sets of social institutions and safety nets. And it may require a reorientation of our approach to education, training and work.

The ‘creative industries’ are a thriving sector of the UK economy, generating £100 billion of value-added each year. But, in future, we may need every industry and every worker to see themselves as creative. Knowledge will not be enough. We already have a knowledge economy. In future we will need a creative one. That may call for a very different model of education and training.

The Great Fire

Let me illustrate some of these points with a very simple historical example. The example is fire. The domestication of fire is, by many people’s reckoning, one of the greatest-ever human discoveries. If you type ‘greatest ever discoveries’ into a search engine, fire typically comes close to the top. It usually ranks alongside the wheel, gunpowder, the printing press, electricity and the internet in people’s all-time top ten. No-one of course knows quite when or where fire was first discovered or at least domesticated.
But historians believe the domestication of fire may have occurred around one million years ago, when hunter-gatherer communities were first being formed. Indeed, it has been conjectured, plausibly, that fire was a key factor in allowing those communities to grow and develop. It is not difficult to see why.

Fire would have been a truly transformative technology for these communities. It served as a source of heat as well as light. Equally vital then, if less so now, fire served as protection against marauding peoples and animals. Hunter-gatherer communities typically comprised around 150 people1. This meant they faced existential threat on an almost daily basis. At a stroke, fire significantly reduced that existential threat.

These benefits meant fire quickly became one of the first and most important examples of what economists would these days call a GPT – a General Purpose Technology2. This is a technology that can be used in a wide variety of settings, for a wide variety of tasks, in a wide variety of places. Fire was the formative GPT for the human species, providing many of the everyday essentials of heat, light and protection.

As with many GPTs, however, perhaps the greatest benefit of domesticating fire may have come indirectly and inadvertently. It was not a benefit primitive humans could remotely have foreseen at the time. Yet it was a benefit that, in time, would transform not just human bodies but human brains, not just human lives but human evolution. That elusive, transformational benefit was cooking.

Fire transformed what humans ate. Cooking made edible the inedible, in particular starch-rich staples such as rice, wheat and potatoes. Cooking reduced the chances of illness from eating, by killing germs and parasites commonplace in foodstuffs. And cooking made it quicker and easier for human bodies to digest all foodstuffs, causing humans’ intestinal tract (the part responsible for digestion) to shorten dramatically3.
Together, these had a transformative impact on the human body. Cooking meant far less of our bodies’ energies were devoted to gathering, eating and digesting food. The intestinal tract is one of the body’s most energy-intensive parts. As its energy-usage fell, large amounts of energy were released to support growth elsewhere in the body.

And where in the body did this unleashed energy go? It headed north to what would in time become humans’ most energy-intensive organ of all – the brain. This energy surge caused the brain to grow and its neural connections to multiply. The result was an organ which, despite being only 2-3% of our body mass, today accounts for 25% of its energy consumption.

This may have been one of the key evolutionary steps in human history. It is the brain, above all other organs, that distinguishes humans from other animals. Our closest biological cousins, the primates, have physiologies which are 99%-identical to ours. Where they differ most significantly is the brain. In primates, the brain uses only 8% of the body’s energies – a third of the human share. It is our energy-intensive brains that put humans on an entirely different evolutionary trajectory to all other animals.

To see why, note that the pace of evolutionary progress in all animals, other than humans, is determined by biology. It occurs through a sequence of slow-moving, biological mutations. This can take millions of years to bring about recognisable change. That is why almost all animals live in an almost-identical environment for an almost-identical lifespan, doing near-identical tasks to their ancient ancestors.

Human biologies are no different. The appendix is a biological artefact. It was essential for digestion at a time when humans’ diet consisted mainly of grass and vegetation in our hunter-gatherer past. But such is the sedate pace of biological evolution, the appendix is still with us, and causing us grief, hundreds of thousands of years later4.
This is evidence of slow biological evolution of the type that constrains all animals. Except, that is, humans. Many millennia ago, humans found a way of breaking free from their biological chains. The reason we know this is because the pace of human evolution has radically outstripped that of all other animals. Human environments, lifespans and tasks are unrecognisable from a hundred years ago, let alone a million. Something other than our bodies put humans on the evolutionary fast-track5. That something was the brain. In particular, the energies released from digestion allowed a particular part of the brain to expand and connect - the pre-frontal cortex. This is located where our foreheads now sit. Indeed, our bodies may have adapted to meet this neurological need with the flat brow ridges of Neanderthal skulls in time replaced by the Tefal foreheads of today’s homo sapiens. And what exactly did this newly-installed pre-frontal cortex do? Modern Magnetic Resonance Imaging (MRI) can now tell us. Among other things, this part of the brain is responsible for imagination. That might sound like a rather niche characteristic, like having a good sense of humour or an upbeat personality. But in fact imagination appears to have been foundational for humans and transformational for their evolution. How so? Imagination allows us to conceive of a future different to any seen previously. Humans’ progress was now as limitless as their imagination. But imagination alone is not sufficient. Imagination without action is day-dreaming. What set humans apart was the ability to create their imagined future. The imagined was made real. That is what is meant by creativity. Imagination with action is creativity. Knowledge and imagination are different creatures. To have knowledge is to know about things that exist. To have imagination is to conceive of things that don’t yet exist. And to be creative is to make real those imagined things. 
Knowledge is vital for school exams and pub quizzes. Imagination is vital for ideas and innovation. And creativity is vital for human progress. Einstein put it thus: “Knowledge is limited. Imagination encircles the world. Logic will get you from A to Z. Imagination will get you everywhere.”6

Animals possess plenty of knowledge, often genetically encoded, sometimes learned through experience. The squirrel in my garden is genetically encoded to gather food for winter. It has also learned from experience that an effective way of doing so is to befriend my daughter, who provides a daily supply of nuts. It is a knowledgeable squirrel, if not as knowledgeable as the squirrels in the remake of the film Charlie and the Chocolate Factory that were trained by the director Tim Burton to crack nuts on demand.

Yet squirrels live identically to their ancient ancestors. They have evolved at a snail’s pace, as have snails. That is because squirrels and snails lack imagination. Squirrels cannot imagine a world where nuts are delivered courtesy of a complex international supply chain, much less set about creating that supply chain. Nor are they likely to write a Roald Dahl novel involving Oompa Loompas or become the next Tim Burton.

Imagination is a uniquely human attribute. And it is an attribute that moved humans from the evolutionary slow lane to the neurological superhighway. Human progress was no longer constrained by sinews but by synapses, no longer tethered by biology but by neurology. As my namesake (but no relation), the evolutionary biologist JBS Haldane, put it: “the world shall not perish for lack of wonders, but for lack of wonder.”7

It is not just evolutionary biologists that have recognised the importance of imagination in powering societies. Economists have too. The heterodox British economist George Shackle placed imagination centre-stage in explaining the evolution of economies8.
This gives rise to a model of economic progress that is subject to a high degree of intrinsic, or radical, uncertainty. Shackle has his followers to this day9.

Humans are social animals. As humans grouped in larger numbers, ideas and imaginations were collectivised and socialised. Individual intelligence gave way to collective intelligence. Many minds made for light work of the world’s most complex problems. This added heat to the creative crucible, enabling humans to move at warp speed from their hunter-gatherer communities to today’s hyper-connected super-cities. Once, only around 150 people could be connected through hunter-gatherer conversations. Today, 4 billion people can be instantly connected in conversation. That number grows by 750,000 globally each day. Imaginative outpourings, good and bad, are instantly collectivised and socialised. Today, we have a societal neural network that mirrors the human brain’s. Today, imagination has fulfilled Einstein’s prediction; it encircles the world.

These days that troublesome biological relic, the appendix, is no longer quite so troublesome. Our brains imagined a different future, with surgery and medication, and humans then set about creating it. When the first appendectomy was carried out, in London in 1735, another biological barrier to human evolution was lifted. Over the same period, squirrels made precious little progress towards mastering just-in-time technologies.

Imagination and creativity are what distinguish the human from the animal brain. They help explain the very different pace of brain-propelled human evolution and body-propelled animal evolution. The brain rebooted humans from Version 1.0 to Version 2.0. The rest is (human) history. Fire, for reasons unforeseeable at the time, appeared to play an important supporting role in this extraordinary evolutionary story.

Creative destruction

Just as fire can be creative, it also has the potential to be destructive.
No-one associated with this great institution needs any reminding of that. Nor do the residents of Paradise, California, where many people tragically perished in the recent wildfires. These events are the latest in a long line of devastating and destructive fires over the millennia.

The Monument is a five-minute walk from the Bank of England and a minute’s walk from Pudding Lane, where the Great Fire of London started on 2 September 1666. As destruction goes, this one takes some beating. One-third of London was destroyed, including 80% of all churches. 100,000 people were made homeless, around 25% of the city’s population10. The cost of the damage amounted to £10 million, or almost a thousand times the annual income of London at the time11.

Yet the often untold part of this story is what happened next. The response to the destruction caused by the Great Fire of London is every bit as much its legacy as the Monument that stands by the banks of the Thames. It is a remarkable story of individual imagination and collective action – the self-same ingredients that first enabled humans to break free from their biological chains.

One response to the Great Fire was the introduction of new fire laws and regulations. The first-ever building regulations were soon put in place, specifying the materials to be used when building houses and the minimum spacing between them. This may not sound transformational now, but at the time it marked a significant rewriting of the social contract between government, businesses and households.

A second response, this time from the private sector, was to create a market previously missing entirely. In response to the devastation, a market for insuring people’s homes was for the first time created. The first fire insurance company, the Insurance Office for Houses, was set up in 1681 at the back of the Royal Exchange. It was the inspiration of (of all things) an economist, Nicholas Barbon. The idea was simple enough.
By pooling risks across a number of households, both the insurer and the insured were provided with an extra degree of financial protection against fire risk. A problem shared was a problem halved. The newly-installed rules and regulations around house-building reduced this risk further, by protecting individually-insured households, and collectively-at-risk insurance companies, from correlated conflagrations.

This risk-pooling idea caught on. The home insurance industry grew rapidly, first in London, then Edinburgh, then across the UK. In 17th century Britain, home insurance was very much a creative industry. It was the fintech of its day. On the back of this, markets for other types of insurance began to flourish, based on the same risk-pooling principle. Lloyd’s of London emerged in 1688. In the space of a decade, London became the pre-eminent insurance market in the world. And so it has remained. That first-mover advantage has locked in London’s dominance for 340 years. Within a mile’s radius of the Monument today are hundreds of insurance companies writing billions of pounds of insurance contracts each working day. The Bank of England keeps an eye on this industry in its role as regulator. The impulse for that creative agglomeration came from the destruction of the Great Fire.

A third creative response came during the 18th century. Until then, there was no public fire service in the UK. Businesses hired private companies to fight fires, or ran their own fire service. The Bank of England was in a particularly vulnerable position at the time, as one of the world’s largest paper factories. The Bank bought its own fire engines and hired its own firemen to help protect it from going up in smoke12. With time, it became clear it made no sense for everyone to self-insure against fire risk. This risk was best collectivised through a municipal fire service – a ‘public good’, like lampposts and lighthouses.
A public fire service was introduced in Scotland in 1824 and in England in 183313. The Bank’s fire engines and firemen were stood down. This particular public good also arose, with a lag, from the destruction of 1666. From the ashes of the Great Fire emerged a new public infrastructure of laws, regulations and institutions. That supported a new private infrastructure of contracts, companies and services. And this mass flourishing created one of the world’s leading financial centres, a position London retains. The regeneration spawned by the Great Fire set the economy and society – as well as the Bank - on an entirely different course.
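Barbon's risk-pooling idea can be put in simple numerical terms. The sketch below is a minimal illustration rather than a historical model: the 1% fire probability and £200 rebuild cost are made-up numbers, chosen only to show how sharing independent losses across a pool shrinks the risk each household bears.

```python
import math

def per_member_risk(p_fire: float, loss: float, pool_size: int) -> float:
    """Standard deviation of one member's share of the pool's losses.

    Each of pool_size households independently suffers a loss of `loss`
    with probability p_fire, and losses are shared equally. Pooling n
    independent risks cuts each share's standard deviation by sqrt(n).
    """
    solo = loss * math.sqrt(p_fire * (1 - p_fire))  # one household alone
    return solo / math.sqrt(pool_size)

# Illustrative, made-up numbers: 1% annual fire risk, £200 rebuild cost
solo_risk = per_member_risk(0.01, 200.0, 1)      # one uninsured household
pooled_risk = per_member_risk(0.01, 200.0, 100)  # shared across 100 homes
```

The expected loss is the same either way (£2 per household per year on these numbers); what pooling changes is the variance around that expectation, which is precisely the protection Barbon was selling. The caveat in the text matters too: the arithmetic relies on losses being independent, which is why building regulations against correlated conflagrations reinforced the insurance market.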


Enlightenment now?

That is the story of fire. But it is a story repeated for most transformative innovation over the course of human history: a spark of imagination; a flame of creativity as an imagined future is made real; periodic destructive burn-outs; and an eventual mass flourishing14. History tells us it is through this evolutionary process of creative destruction that economies grow and societies improve.

But is the creative flame still burning? Prominent economists, such as Robert Gordon in the United States, have recently argued that the world may be, in Haldane’s terms, at risk of perishing for lack of wonder15. This hypothesis has some empirical support, with returns on research and development spending falling. The low-hanging fruit of creative innovation may already have been picked, leaving slimmer pickings for future generations.

These fears do need, though, to be set in some context. One important piece of context is that, viewed over the long arc of history, humans have never had it so good. Steven Pinker and Hans Rosling have recently brought home this crucial point in clear, statistical terms. Just consider the last two hundred years16. Over that period, each generation has been a bit less than 50% better off financially than its predecessor. You are almost 50% better-off than your parents and more than twice as well-off as your grandparents. Your living standards today are around 16 times higher than those of your Glaswegian ancestors in the mid-18th century and 11 times higher than when the GSA was founded. That is mass financial flourishing by anyone’s reckoning.

This flourishing was physiological as well as financial. Over the same period, rates of infant mortality have fallen 40 percentage points. Lifespans have doubled. Although the average age of the population has never been higher, their average remaining life has also never been longer. Levels of global poverty have fallen from over 90% to single figures today.
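The generational comparisons above are simple compounding. A quick check of the arithmetic, assuming (as the text does) growth of roughly 50% per generation:

```python
import math

def wealth_multiple(growth_per_generation: float, generations: int) -> float:
    """Cumulative rise in living standards after compounding growth."""
    return (1 + growth_per_generation) ** generations

parents = wealth_multiple(0.5, 1)       # 1.5x: ~50% better off than parents
grandparents = wealth_multiple(0.5, 2)  # 2.25x: "more than twice" grandparents

# Generations needed to reach the 16-fold rise since the mid-18th century
generations_to_16x = math.ceil(math.log(16, 1.5))
```

Seven generations of roughly thirty years each is broadly consistent with the quarter-millennium since the mid-18th century, which is why growth of a bit under 50% per generation is enough to compound into a 16-fold rise in living standards.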
Societal progress has become an entrenched social norm. What explains these great leaps forward in economies and societies? As with any reading of history, there is no universally-agreed account. My reading of economic and social history suggests the secret sauce of economic and social progress has two essential ingredients – ideas and institutions. It was these two ‘I’s that were responsible for taking us from yesteryear’s hunter-gatherer clans to today’s super-cities17.

Let me start with the first ‘I’ – ideas. These have their source in another, by now familiar, ‘I’ – imagination. Great leaps forward societally have always had innovation, ideas and imagination at their hub. While fire provided the spark for early homo sapiens, this was only the first in a sequence of technological fireworks.

In the mid-18th century, the Industrial Revolution was sparked by the firing of three ideas – James Watt’s steam engine, Richard Arkwright’s water frame and James Hargreaves’ spinning jenny. These inventions occurred at almost exactly the same time (within a handful of years) and in almost exactly the same place (within a few hundred miles of each other, at latitudes north of Stockport). This was remarkable, and no coincidence.

The second Industrial Revolution of the mid-19th century was sparked by a different set of innovations. This time it was sanitation, electrification and internal combustion that lit the fuse on the mass-industrialisation of countries and continents. The third Industrial Revolution of the mid-20th century brought a further wave of innovation, with digitisation, computing and the internet generating a transformation of business and society.

Each of these inventions involved a creative leap of imagination. As adoption spread, each became in time a GPT, applicable across sectors, industries and geographies. Like fire, these GPTs then transformed industries, jobs and lifestyles in ways inconceivable to their creators.
In each Industrial Revolution, a first imaginative step resulted in a great leap forward for societal living standards, a mass flourishing18. In that sense the three industrial revolutions of the past three centuries fit the longer-run evolutionary arc of humankind. It is creativity and imagination, fuelled by big brains and nourished by cooked meals, that set humans on their jet-propelled evolutionary path. The rapid, ideas-fuelled progress made by societies over recent centuries is a continuation of that ever-upward evolutionary arc.

Except, that is, for one small detail. The evolutionary arc of humans has not been ever-upward. The historical path has not been a North-bound ascent. While human ingenuity and creativity have been ever-present, economies and societies have in fact spent protracted periods crabbing sideways. Prior to the first Industrial Revolution, living standards appear to have been essentially static for several thousand years19. Living standards in Glasgow in 1750 were little different from those of ancestors constructing Hadrian’s Wall. Levels of poverty, nutrition, infant mortality, height and longevity would also have been indistinguishable. Prior to the Industrial Revolution, societies and economies stood still, financially and physiologically. There was flat-lining, not mass flourishing. Societal progress was far from being a social norm.

What explains this great pause in living standards? It was not through lack of ideas and imagination. People did not suddenly make like monkeys for millennia. To the contrary, innovation came thick and fast in the pre-industrial era, from the windmill in the 12th century to the mechanical clock in the 13th, from the cannon in the 14th to the printing press in the 15th, from the postal service in the 16th to the telescope in the 17th20. It is clear pre-industrial innovation played an important role in fuelling subsequent growth.
Shakespeare’s imaginative genius would not have been sparked without Gutenberg’s 15th century invention. Einstein would not have transformed our understanding of the world without Lippershey’s 17th century creativity. Yet neither great invention translated into consistently higher living standards for the great mass of society at the time. What was the missing ingredient?

A number of historians believe it was a second ‘I’ – institutions21. In the words of economist Douglass North, institutions are “humanly devised constraints that structure political, economic and social interactions.”22 If ideas and imagination are the fuel and engine that drive economies forward, rules and institutions are the bolts and chassis holding societies together.

Institutions, defined broadly, play two crucial roles. First, they provide the rules of the game that allow the creative process to flourish. For example, the rule of law can help ensure property, physical and intellectual, is not stolen and contracts are honoured. This provides private individuals and companies with the foundations to flourish. Nation states without these rules of the game have been found, historically, not to flourish but to fail23. Without fire regulations and property rights, could a private market for home insurance have flourished after the Great Fire?

Second, institutions cushion the adverse side-effects of technological disruption. Innovation brings destruction for businesses and joblessness for workers. If those costs are not cushioned, the social fabric is torn and new ideas risk being strangled at birth. Institutions can help protect those made redundant or obsolescent by innovation, helping repair the social fabric. And they can retool and reskill workers to prepare them to thrive, helping loom a new fabric. As much as ideas, the three Industrial Revolutions are a story of institutions.
Institutions that provided people with social insurance, such as public healthcare and public transport, social housing and social safety nets, central banks and charities, credit unions and trade unions. And institutions that provided people with the tools and infrastructure to reskill, such as guilds and professional associations, primary and secondary schools, colleges and universities.

The Bank of England was founded before the Industrial Revolution. But it emerged as a public institution in the 19th century. The public good provided was (and still is) monetary and financial stability. Some date the point at which the Bank became a genuine central bank to 1844, with the passing of the Bank Charter Act granting the Bank a monopoly over issuing legal tender.

A five-minute walk from my office is St Paul’s Cathedral. In the churchyard of St Paul’s on 6 June 1844, the same year as the Bank Charter Act, George Williams set up a shelter for young men who had come to London in search of work as part of the first wave of industrialisation. These shelters offered a roof and food. Their role spread quickly across the UK and then globally, like home insurance after the Great Fire. The YMCA was born. 170 years on, the YMCA is still offering food and shelter. But it now operates in 119 countries and has helped millions of young men (and, through the YWCA, women) to improve their lives. It is one of the millions of civic institutions providing social insurance to those in need. Many of these institutions, including the GSA, YMCA and Bank of England, can trace their roots to the Industrial Revolution.

Institutions turned tragedy into triumph after the Great Fire and turned stagnation into success either side of the Industrial Revolution. The lesson of history is clear. For societies to grow sustainably, we need the imagination inside our heads to generate creativity, ideas, innovation.
But we also need social institutions that connect and curate these heads to generate collective intelligence and collective action. Our economies and societies will also need to reseed themselves to harness the potential of the Fourth Industrial Revolution. For mass flourishing, our knowledge economy will need to evolve into a genuinely creative one.
The Fourth Industrial Revolution

From the past to the future. A new technological wave is breaking. On some accounts, this wave could be as great as any seen previously. The so-called Fourth Industrial Revolution is associated with the emergence of a whole new class of technologies with the potential to be tomorrow’s GPT24. These include machine learning, Big Data, robotics, bio-technologies and Artificial Intelligence (AI). The rise of the robot has well and truly begun25.

Machines, and indeed robots, are not of course new. Nor are fears of them rising up and taking jobs and control. The Luddites had the same concerns in the 19th century. This time, though, seems a bit different. These machines, unlike their predecessors during the first three industrial revolutions, are capable of thinking as well as doing. Although this might sound like a small step for humankind, it is a potential game-changer.

So far in human history, humans have kept one step ahead of the machine by gravitating towards tasks out of robotic reach. That has meant cognitive tasks. This spawned the growth of educational institutions, with universal primary and then secondary education and then the rapid expansion of colleges and universities. In other words, humans used the self-same neurological advantage over machines as had earlier put them on an entirely different evolutionary arc to other animals.

Except, unlike with animals, it is not clear humans will retain their cognitive lead over machines. If current rates of machine advance were to continue, it is simply a matter of time before humans lose pole position. And, once passed, what hope of humans ever catching up? As the evolutionary arc of humans split from animals thousands of years ago, the upwards arc of machines may be about to detach from humans. Some of the writing is already on the wall. In 1997, a significant milestone was passed when the IBM supercomputer Deep Blue beat the world chess champion, Garry Kasparov.
A generation later, in 2016, another milestone was passed when an algorithm called AlphaGo, developed by the AI company DeepMind, for the first time beat the world champion at Go (a game considerably more complex than chess), Lee Sedol.

These are just board games, and two-person board games at that. The game of life is infinitely more complex, involving many more moves among many more combinations of players. Nonetheless, it is worth asking how machines compare when it comes to the everyday, but complex, tasks humans perform. Looked at function by function, some of that paranoia about the rise of the robots begins to look justified. The processing capacity of super-computers already exceeds the human brain by an order of magnitude. Courtesy of Moore’s Law, that gap will widen at an ever-increasing rate over time. For some critical faculties – seeing, hearing, learning – machines are already well ahead of humans. The sensors in a self-driving car have far better vision than any human. Alexa already does a much better job of hearing than Andy. And AlphaGo learned thousands of years of human knowledge within a matter of months.

Against that backdrop, there has been a surge of recent interest in the potential for large-scale job losses, as machines displace humans. This fear is not new. Fears about job displacement have been a recurrent theme for 300 years, as machines first displaced agricultural workers and, more recently, factory workers. These shifts were huge. In 1750, half the labour force worked in the primary sector. Today, it is 1%.

Estimates of the potential scale of future job displacement are highly uncertain. Studies suggest between 10% and 50% of the global workforce could see their jobs disrupted significantly, if not displaced entirely, over the next 10-15 years26. At the upper end, that would be almost 2 billion people globally whose livelihoods could be significantly disrupted. This is vastly more than any previous industrial revolution.
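The 2 billion figure follows directly from the percentages. A back-of-the-envelope check, in which the global workforce of roughly 4 billion is an assumption inferred from the upper-end figure quoted, not a number taken from the studies themselves:

```python
def disrupted_range(workforce: float, low_share: float, high_share: float):
    """Bounds on the number of workers facing significant disruption."""
    return workforce * low_share, workforce * high_share

# Assumed ~4 billion global workforce; 10%-50% disruption estimates
low_end, high_end = disrupted_range(4e9, 0.10, 0.50)
# low_end is roughly 400 million workers, high_end roughly 2 billion
```

Even the bottom of this range, several hundred million workers, would dwarf the labour-market shifts of earlier industrial revolutions.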
As in the past, the costs of this disruption are unlikely to be spread evenly. Recent studies have shown that those at greatest risk are likely to work in sectors and regions still reeling from earlier Industrial Revolutions. Jobs among lower-skilled workers doing routine tasks in the service sector, in post-industrial towns and cities, are ripe for automation27. The BBC website has an app which puts this probability at 83%. If that were the path followed by the Fourth Industrial Revolution, it could be a recipe for another ‘I’ – inequality – as various studies have shown28. Earlier industrial revolutions were also associated with rising levels of inequality, at least in their initial phase. Against a background of uncomfortably high starting levels of inequality in some countries, that could raise already-grave concerns about the inclusiveness of societies. Just in case I haven’t depressed you enough already, it is possible to paint a more dystopian picture still. The inventor and futurist Ray Kurzweil, among others, has speculated about the possibility of a ‘singularity’ – a point where machines surpass the functioning of the brain in every task29. Estimates vary on when this point might be reached, if ever. But a number place it within this century. The singularity, were it to arrive, would mark a second inflexion point for humankind. The course of human history beyond that singularity point is not just unknown but unknowable. It is, by definition, beyond the limits of even our imagination. At that point, human control over our own destinies could be lost forever. This is not biological extinction, in the sense of dinosaurs and dodos. But it could be neurological extinction. Were the singularity to arrive, some have argued this would take societies to their next evolutionary state30. If Life Version 1.0 was defined by biology and Version 2.0 by neurology, then Version 3.0 would be defined by technology.
The engine of societal progress would no longer be human ingenuity, imagination and intelligence. Instead it would be artificial ingenuity, imagination and intelligence. Scared yet?

Co-evolution

Don’t be. I want to argue that the loss of human jobs, and of human control over our destinies, is far from being the only possible, or even the most likely, ending to this story. Humans can remain in work and masters of their own destiny. As in the past, there are good grounds for optimism. This will, however, require some fundamental changes to human skills and human work, and in the social institutions supporting them. One reason for optimism is that, for the foreseeable future, humans are likely to retain the upper hand across a number of tasks. One such set comprises tasks requiring large doses of creativity, involving leaps of imagination or bespoke design. Super-computers have designed and created international supply chains for nuts. But they are no more likely than squirrels to replace Roald Dahl or Tim Burton any time soon. A second set of tasks not easily machine-reproducible involves interpersonal or social skills. There are robots for childcare and social care. But I doubt they will become the norm. People are social animals and value social interaction above all else. I cannot see a robot replacing GSA lecturers in my lifetime. The BBC app tells me a teacher, childminder or artist has a probability of being automated of less than 10%. Even if you are an economist, it is only 15%31. If anything, we might see the demand for these skills grow in the period ahead. It has been estimated that, between now and 2030, demand for jobs where creativity is a key skill could increase by 30-40%. Demand for jobs with high levels of social and emotional skills is forecast to increase by 25%32. For all the jobs lost, new ones will be created in a different image, mirroring the pattern in previous Industrial Revolutions. We may also need to be more creative about how we define 'creative'.
TV programme-makers are creative, but so are computer program-makers. AI is about as creative an activity as you could imagine. NESTA have tried to classify the creativity of tasks. Their estimates put the fraction of people currently in creative professions at between a fifth and a quarter33. Were jobs to evolve in line with expectations, that fraction could rise to more than a third in the next decade. For a great many tasks, it is probably wrong even to think of jobs being displaced. More likely, their nature and the skills required will evolve. Take medicine. Many aspects of clinical diagnosis and prescription are routine. Armed with Big Data on someone’s genome and health history, an algorithm could diagnose many ailments, and prescribe treatments, as well as humans if not better. Would doing so sound the death knell for doctors and surgeons? I suspect not. Medical professionals are likely to draw increasingly on data and algorithmic insights. But there will be an accompanying, increasingly important, role for human judgement, explanation and empathy. Indeed, it is already the case that these human skills are often among the most highly-valued by patients. In a world of robo-medical advice, the balance of doctors’ skills will shift further in this interpersonal direction. As with much of human history, this will be a case not so much of humans versus machines as humans with machines. The evolutionary arc of humankind will not switch, discretely, from one defined by neurology to one defined by technology. What we could see instead is a co-evolutionary arc, with minds and machines, neurology and technology, co-mingled and complementary34. In a number of tasks they already are. Take chess. A generation on from Deep Blue beating Garry Kasparov, you might expect machines to have taken an unassailable lead. In fact, they have not. The world chess championship is not waged between mainframes.
If you asked who or what was the best chess player in the world today, the answer would be a human, albeit a human working with a machine. Chess has followed a co-evolutionary arc. Perhaps this is simply a matter of time. But our brains themselves are far from static. This is not a case of a technological hare racing a neurological tortoise. Our brains are more energy-efficient than super-computers, by a factor of perhaps 1,00035. One reason is that they are hyper-connected, with around 100 billion neurons, each with 1,000-10,000 connections. No digital web comes even close to this scale of connectivity. Moreover, these connections are not static; they are hyper-flexible. Our brains are not wired once, like a plug. They are constantly re-wiring themselves, a phenomenon known as neuro-plasticity36. Unlike biological adaptation, this re-wiring takes place relatively rapidly and can be significant. It is particularly significant at times of extreme shifts in environment – for example, personal trauma or technological change. Gutenberg’s printing press did not just spark Shakespeare’s imagination. It sparked a re-wiring of all of our brains37. As communication switched from word to print, different neurological processes were needed for comprehension and deliberation. In response our brains adapted, as our Homo sapiens brains had done long before. The future AI revolution could generate a similar re-wiring of the super-computer between our ears. Perhaps it has already started. When the AlphaGo algorithm beat Lee Sedol in 2016, the decisive move came in Game 2, Move 37. With that move, the algorithm broke all previous human playing conventions, built up over thousands of years of play. If you watch the documentary of the match, Move 37 appeared personally to traumatise the World Champion, Lee Sedol38. He promptly lost the game and, in time, the match. Much less widely reported on, but for me as interesting, is what happened in Game 4, Move 78.
This move was played not by the AlphaGo algorithm, but by Lee Sedol. This, too, broke centuries-old human conventions; it was the imagined made real. Watching the film, Move 78 appeared to send the AlphaGo algorithm into meltdown. It was digitally traumatised, began playing erratically and promptly lost the game. It is difficult to know what prompted Lee Sedol to create an untried and untested move. Perhaps the personal trauma of Move 37 caused some re-wiring of his brain. Perhaps out of destruction was forged creativity, as after the Great Fire. If so, this would be an example of co-evolution in practice, with the neurological and the technological combining in a mutually beneficial cycle. Since 2016, humans playing Go have adapted their playing strategies, learning from the ever-improving algorithms. The same was true of chess champions after Deep Blue. Move 37, or its successor strategies, has become a new human convention. This creativity was machine-assisted. But it is human creativity nonetheless, just as Hamlet was the creation of Shakespeare not Gutenberg, and the theory of relativity was the creation of Einstein not Lippershey. What applies to board games applies to other spheres of human endeavour. Driverless cars are not, in the main, driverless. Most have a human override for situations where the algorithm experiences circumstances outside its sphere of knowledge, the automotive equivalent of Move 78. Alexa hears better than me and learns faster than me. But I can still deliver a better talk about creativity to students at the GSA, even if I did draw on her knowledge in putting it together. If that ever changes, I can always pull the plug.

Institutions for the 21st century

At the same time, this co-evolutionary path will not be an easy one. Even if (and it is a big if) as many jobs are created as are destroyed by the Fourth Industrial Revolution, there will be transitional costs and societal casualties to manage.
As during past waves of innovation, making a success of this revolution will require a reworking of the social infrastructure if these costs and casualties are not to tear the social fabric. There are many possible dimensions of this reformation. One of the most important is close to the GSA’s heart – education. This is also an issue close to the Bank of England’s heart. Education on economic and financial matters is one of the public goods the Bank can provide and is providing. Our educational programme, comprising school visits and curriculum materials, has developed dramatically over recent years and is now reaching around a third of schoolchildren across the UK39. This economic and financial knowledge is crucial. We hear a lot these days about our knowledge-based economy and with good reason. Like human evolution, our economies are evolving in ways which give prominence to digital over physical assets. Intellectual property (the brain) is often more important than physical property (the body), and software (neurology) is often more important than hardware (biology), in driving growth in our companies and economies40. The rapid emergence of a knowledge-based economy has important implications for educational institutions. They were designed, in the UK in the 19th century, as factories for the manufacture of knowledge on an industrial scale. At the time, they worked well. In an increasingly knowledge-based economy, this suggests these knowledge-factories will become even more important in the future than they have been in the past. That is the right answer but for the wrong reasons. What will be needed in future is not improved knowledge-factories producing more knowledgeable students. What will be needed instead are creativity-academies producing a more creative workforce. In future, we will not need people simply to get from A to Z – Alexa is faster and cheaper at doing that. We need people who can navigate everywhere.
That means creativity not knowledge, imagination rather than intelligence, EQ as well as IQ. The most-watched TED talk of all time is not on signature topics of global importance such as climate change, inequality or even robotics, as important as these are. It is about (of all things) education and given by (of all things) a British educational expert, Ken Robinson41. It has had a remarkable 56 million views on YouTube, or about 56,000 times more than my TED talk on a topic of global importance42. Robinson’s TED talk is now 13 years old, but its message could not be more topical. Robinson says that our current education system tends to teach creativity out of children, rather than into them. The audience spontaneously applauds. In standard tests of creativity, at what age do you think people’s scores peak? The answer is around age 6. That, not coincidentally, is around the age children start school. If creativity holds the key, why not teach it in rather than out? That may sound odd. We often think of creativity as somehow innate or genetic, like having red hair or a good sense of humour. On this view, teaching creativity is like teaching someone to grow a funny bone. It is not. It is perfectly possible to teach someone to be funny – there are courses aplenty on it. And it is possible too to teach them to be creative. Philip Bond teaches a course on creativity at the University of Manchester. Creativity is not the result of random lightning strikes of inspiration. It is about creating the right environment for lightning to strike in the first place. The shapes and colours of our offices are important. Eating and sleeping patterns matter. Walking helps. Meeting new people matters. New experiences work wonders (and wonder)43. Creativity does not require an apple to land coincidentally on the head of a genius, any more than it requires the coincidence of a Watt, a Hargreaves and an Arkwright in the late 18th century in latitudes north of Stockport.
Creativity is a core skill in us all – indeed, the one skill we know is uniquely human. But nurturing it requires the right environment, as Ken Robinson’s reflections on our educational systems make clear. Apples are less likely to fall on our heads if we are deskbound. This point is not confined to creativity. Today’s educational system is heavily skewed towards developing cognitive skills in the young. This made sense during the first three Industrial Revolutions, as humans sought to keep ahead of machines that were long brawn and short brain. As machines’ cognitive capacity grew, so too did the demand for institutions offering higher-level cognitive skills, such as colleges and universities. These institutions were the right response to the challenges of the first three Industrial Revolutions. They are unlikely to be the right response to the fourth. The rise of the thinking machine means the future world of work will no longer require narrowly cognitive skills alone. And in a world of 100-year lives and 70-year careers, educational institutions will need to equip old and young alike with these broader skills. Developing cognitive skills in the young was a brilliant model for the past 300 years, but not one for our educational future. Many different models are possible. Elsewhere, I have called one possibility multiversities, as distinct from universities44. The multi serves double duty. It connotes the need for these new institutions to expand their disciplinary horizons and become less subject-singular. History shows that creative breakthroughs often have their source in straddling disciplinary boundaries, in being subject-plural. Creating the right environment for creativity often means breaking free from disciplinary silos. Indeed, if we are to embed a cross-disciplinary culture, we may need to rethink how we classify subjects. Traditional domain-knowledge may make less sense in a creative rather than knowledge-based economy.
A new classification system might recognise subjects like creativity and digital literacy, emotional intelligence and empathy, entrepreneurship and design. These would, by design, straddle disciplinary boundaries. The multi also signifies the need to straddle generational, as well as disciplinary, divides. Education will need in future to cater for old and young alike, making lifelong learning a reality. Rather than the sequential model – first education, then work – we would instead have a rotational model over the course of a career. Universities currently tend to be a one-way street into work. Multiversities would operate like career roundabouts, with turn-offs into both work and study. This would be a fundamental shift, culturally and educationally. Even if we pushed our peak age of creativity into early adulthood, that would still leave a long creative downslope, during which teaching ever-older dogs ever-newer tricks becomes ever-more difficult. Lifelong learning and the re-skilling of adults have been difficult to make a reality for a reason. The right infrastructure will be needed to support this shift, and to create the incentives to sustain it through our 100-year lives. For understandable reasons, there has been an upsurge of interest in this issue recently, including in the UK. Philip Augar is leading a government review of post-18 education which is due to report this year45. Both of the main opposition parties have also initiated reviews. The Centenary Commission on Adult Education (of which I am a patron) began work just last month to assess the future needs of the educational system46. There are already models which may offer some clues on a future direction of travel. The Open University celebrates its 50th birthday this year. Since its inception, it has been a model of flexible lifelong learning, vocational and cognitive. Its students combine work and study, typically through part-time distance learning.
It is the UK’s largest university, with over 170,000 students. A future educational model will, importantly, need to embody this sort of flexibility – the flexibility to combine work and study through a career, with educational credits which accumulate over time and which are portable between institutions. The current post-18 model operates like a driving licence – obtained during the early years but then rarely if ever refreshed or augmented. The future world of work may call for a model which is more like a training schedule for a marathon, one which builds capacity over time. Creating the right incentives to engage in a career-long marathon training programme is not easy. There is a reason relatively few people run marathons, and fewer still enjoy the experience. But a number of options are being tried. In 2016, Singapore introduced a credit system for lifelong training – SkillsFuture Credit. These credits can be drawn down at any stage of an individual’s career to support approved skills-related training. Denmark has a similar system in place for those displaced from work.

Conclusion

Creativity flowed from the Great Fire. Our economies and societies will also need to reseed to harness the potential of the Fourth Industrial Revolution. For mass flourishing, our knowledge economy will need to evolve into a genuinely creative one. And our social institutions, including our educational institutions, may need to be radically reworked. It is a time to make the imagined real.

Endnotes

1. Dunbar (1993).
2. Bresnahan and Trajtenberg (1995).
3. Harari (2014).
4. Looking across all its forms across species, the appendix probably evolved more than 30 times (Smith et al (2013)).
5. Tegmark (2017).
6. ‘What Life Means to Einstein’, The Saturday Evening Post, October 26, 1929 (Viereck (1929)).
7. Clark (1969).
8. Ford, JL (1994).
9. Beckert and Bronk (2018).
10. Museum of London and Demographia.
11.
12. Murphy (2018).
13.
14. Phelps (2013).
15. Gordon (2012).
16. Pinker (2018), Rosling (2018).
17. Haldane (2018a).
18. Mokyr (2011).
19. Haldane (2015), Parliamentary Office of Science & Technology (2016).
20. Clark (2007).
21. Acemoglu and Robinson (2012).
22. North (1991).
23. Acemoglu and Robinson (2012).
24. Klingenberg (2017).
25. Ford (2015).
26. Manyika et al (2017).
27. Lawrence (2017), and Houses of Parliament, Parliamentary Office of Science & Technology (2016).
28. Baweja (2016), Lawrence (2017), and Berriman (2018).
29. Kurzweil (2005).
30. Tegmark (2017).
31.
32. Bughin et al (2018).
33. Bakhshi et al (2015).
34. Tegmark (2017).
35.
36. Berlucchi and Buchtel (2009).
37. Carr (2010).
38.
39. Carney (2018), Haldane (2018b).
40. Unger (2019).
41. Available at
42. Available, as if you were bothered, at
43. Available at
44. Haldane (2018a).
45. ‘Prime Minister launches major review of post-18 education’, 2018.
46. The 1919 Centenary Commission was launched in 2019.

References

Acemoglu, D and Robinson, J (2012), Why Nations Fail: The Origins of Power, Prosperity, and Poverty, Crown Business.
Bakhshi, H, Frey, C and Osborne, M (2015), ‘Creativity Vs. Robots: The Creative Economy and the Future of Employment’, NESTA, UK.
Baweja, B, Donovan, P, Haefele, M, Siddiqi, L and Smiles, S (2016), ‘Extreme automation and connectivity: The global, regional, and investment implications of the Fourth Industrial Revolution’, UBS White Paper for the World Economic Forum.
Beckert, J and Bronk, R (2018), Uncertain Futures: Imaginaries, Narratives, and Calculation in the Economy, Oxford University Press.
Berlucchi, G and Buchtel, H (2009), ‘Neuronal plasticity: historical roots and evolution of meaning’, Experimental Brain Research, Vol. 192, No. 3, pp. 307-319.
Berriman, R, Cameron, E and Hawksworth, J (2018), ‘Will robots really steal our jobs? An international analysis of the potential long-term impact of automation’, PwC UK.
Beveridge, W (1950), The Art of Scientific Investigation, W. W. Norton & Company.
Bresnahan, T and Trajtenberg, M (1995), ‘General purpose technologies – ‘Engines of growth’?’, Journal of Econometrics, Vol. 65, pp. 83-108.
Bughin, J, Hazan, E, Lund, S, Dahlstrom, P, Wiesinger, A and Subramaniam, A (2018), ‘Skill Shift: Automation and the Future of the Workforce’, McKinsey Global Institute.
Carney, M (2018), ‘Opening remarks at the econoME launch event’, speech available at
Carr, N (2010), The Shallows: How the Internet is Changing the Way We Think, Read and Remember, WW Norton & Company.
Clark, R (1969), JBS: The Life and Work of JBS Haldane, Coward McCann, p. 231.
Clark, G (2007), A Farewell to Alms: A Brief Economic History of the World, Princeton University Press.
Dunbar, R (1993), ‘Co-evolution of neocortical size, group size and language in humans’, Behavioral and Brain Sciences, Vol. 16, No. 4, pp. 681-735.
Ford, JL (1994), GLS Shackle: The dissenting economist’s economist, Edward Elgar.
Ford, M (2015), Rise of the Robots: Technology and the Threat of a Jobless Future, Basic Books.
Gordon, R (2012), ‘Is US Economic Growth Over? Faltering Innovation Confronts the Six Headwinds’, NBER Working Paper, No. 18315.
Haldane, A (2015), ‘Growing, Fast and Slow’, speech available at
Haldane, A (2018a), ‘Ideas and Institutions – A Growth Story’, speech available at
Haldane, A (2018b), ‘Folk Wisdom’, speech available at
Harari, Y (2014), Sapiens: A Brief History of Humankind, Harvill Secker.
House, J, Landis, K and Umberson, D (1988), ‘Social relationships and health’, Science, Vol. 241, pp. 540-545.
Klingenberg, C (2017), ‘Industry 4.0: What Makes it a Revolution’, presented at the EurOMA Conference.
Kurzweil, R (2005), The Singularity Is Near: When Humans Transcend Biology, Viking.
Lawrence, M, Roberts, C and King, L (2017), ‘Managing Automation: Employment, inequality and ethics in the digital age’, IPPR Commission on Economic Justice, UK.
Lieberman, M (2013), Social: Why our brains are wired to connect, Crown.
Manyika, J, Lund, S, Chui, M, Bughin, J, Woetzel, J, Batra, P, Ko, R and Sanghvi, S (2017), ‘Jobs lost, jobs gained: workforce transitions in a time of automation’, McKinsey Global Institute, USA.
Michalko, M (2011), Creative Thinkering: Putting Your Imagination to Work, New World Library.
Mokyr, J (2011), ‘The Rate and Direction of Invention in The British Industrial Revolution: Incentives and Institutions’, NBER Working Paper Series, No. 16993.
Murphy, A (2018), ‘How did organisations adapt to change in the 18th and 19th century: Lessons from the Bank of England Archives…’, Bank Underground, 7 November 2018.
North, D (1991), ‘Institutions’, The Journal of Economic Perspectives, Vol. 5, No. 1, pp. 97-112.
Parliamentary Office of Science & Technology (2016), ‘Automation and the Workforce’, Houses of Parliament, No. 534, UK.
Phelps, E (2013), Mass Flourishing: How Grassroots Innovation Created Jobs, Challenge, and Change, Princeton University Press.
Pinker, S (2018), Enlightenment Now: The Case for Reason, Science, Humanism, and Progress, Penguin Books Limited.
Powers, S, van Schaik, C and Lehmann, L (2016), ‘How institutions shaped the last major evolutionary transition to large-scale human societies’, Philosophical Transactions of the Royal Society B: Biological Sciences, Vol. 371(1687).
Rosling, H (2018), Factfulness: Ten Reasons We’re Wrong About The World – And Why Things Are Better Than You Think, Sceptre.
Smith, H, Parker, F, Kotze, S and Laurin, M (2013), ‘Multiple independent appearances of the cecal appendix in mammalian evolution and an investigation of related ecological and anatomical factors’, Comptes Rendus Palevol, Vol. 12, No. 6, pp. 339-354.
Tegmark, M (2017), Life 3.0: Being Human in the Age of Artificial Intelligence, Allen Lane.
Unger, R (2019), The Knowledge Economy, Verso.
Viereck, G S (1929), ‘What Life Means to Einstein: An Interview by George Sylvester Viereck’, The Saturday Evening Post.
The views expressed here are not necessarily those of the Bank of England or the Monetary Policy Committee. I would like to thank Marilena Angeli and Shiv Chowla for their help in preparing the text. I would like to thank Philip Bond, Clare Macallan and Mette Nielson for their comments and contributions. This article is based on a speech delivered at the Inaugural Glasgow School of Art Creative Engagement Lecture, The Glasgow School of Art, 22 November 2018.