The world in 2050
THE most unwittingly profound things that have been said about predicting the future were said by two sportsmen, neither renowned for his conventional intellect. The American baseball player Yogi Berra said “I never make predictions, especially about the future”; the English footballer Paul Gascoigne went one better: “I never make predictions and I never will.”
Prediction is a mug’s game. Everybody who has ever done it has proved to be terrible at it — and always will be. Yes, even the ones with reputations as great seers were hopeless most of the time. Arthur C Clarke saw geostationary satellites coming, but also said (in 1962) that hovercraft were going to dominate land transport to the point that by the 1990s a common sign would read “No wheeled vehicles on this highway”.
So, before issuing my own predictions, here are some of the reasons they will fail. First, many trends are non-linear, so things that are mere smudges of cloud on the horizon in one decade (mobile telephony and the internet in the late 1980s) can turn into hurricanes in the next.
Second, as Tim Harford argues in his book Adapt, blind trial-and-error is responsible for most of the innovations that change the world, not intelligent design or planning.
Third, as Dan Gardner argues in his book Future Babble, status quo bias means that forecasters tell you much more about their own times than about the future.
In the mid-20th century, after 50 years of little change in communications technologies, but astonishing innovations in transport, futurologists were all babbling about routine space travel, personal gyrocopters and jetpacks, but not one of them foresaw the internet or ubiquitous mobile telephony.
The next 40 years will not see the end of anything: not history, science, oil, capitalism, books or love. With remarkably few exceptions (trebuchets, antler retoucheurs), old technologies and ideas persist alongside new ones. Even trebuchets have been revived — for throwing flaming pianos at rock concerts.
The nature of human society is to accumulate ideas, more than to replace them. Likewise, it is a mistake to rule any outcome impossible, as Ernest Rutherford did atomic weapons (“moonshine”), or the British Astronomer Royal did space travel two weeks before Sputnik (“bunk”). Arthur C Clarke again — on target this time: “When a distinguished scientist... states that something is impossible, he is almost certainly wrong.” Yet to forecast the rise or fall of any specific technology is as good as impossible. If I could predict the invention of the teletransporter, then I could invent it.
By far the sharpest lesson to draw from past forecasts is that planetary pessimism is usually wrong. The field of futurology is littered with cataclysmic prognostications that failed. Go back 40 years to 1971 and recall the continual dirge of dire doom that those of us who became teenagers that year were subjected to over the next four decades.
The grown-ups told us with terrible certainty that the population explosion was unstoppable; global famine was inevitable; crop yield increases would peter out; food aid to India was futile; a cancer epidemic caused by pesticides in the environment would shorten our lives; the desert was advancing at two miles a year; nuclear fallout was a growing risk; nuclear winter was an inevitable consequence of an inevitable nuclear war; Ebola, hanta virus and swine flu pandemics were overdue; urban decay was irreversible; acid rain was going to destroy whole forests; oil spills were increasing; economic growth was ceasing; global inequalities would rise; oil and gas would soon run out; and so would copper, zinc, chrome and many other natural resources; urban air pollution was getting worse; dozens of bird and mammal species would go extinct each year; a new ice age was coming; sperm counts were falling; mad cow disease would kill hundreds of thousands of people; genetically modified weeds would devastate ecosystems; nanotechnology would run riot; computers would crash at the dawn of the Millennium, bringing down parts of civilisation with them; winter snow would become a rarity; hurricanes would increase in frequency; malaria would get worse; climate change would wipe out species; weather would kill more people and sea level rise would accelerate. All of these were trumpeted loudly in the mainstream media at one time or another.
Some quotations to show that I am not exaggerating. U Thant, secretary general of the United Nations, in 1969: “If such a global partnership is not forged within the next decade, then I very much fear that the problems I have mentioned will have reached such staggering proportions that they are beyond our capacity to control.”
The best-selling book The Limits to Growth, in 1972, on its cover: “Will this be the world that your grandchildren will thank you for? A world where industrial production has sunk to zero. Where population has suffered a catastrophic decline. Where the air, sea, and land are polluted beyond redemption. Where civilisation is a distant memory. This is the world that the computer forecasts.”
The distinguished economist Robert Heilbroner, in 1973: “The outlook for man, I believe, is painful, difficult, perhaps desperate, and the hope that can be held out for his future prospects seems to be very slim indeed.”
The celebrity ecologist Paul Ehrlich, in 1974: “The train of events leading to the dissolution of India as a viable nation is already in motion.” The New York Times, in 1980, just before oil prices fell: “There should be no such thing as optimism about energy for the foreseeable future... prices will go up and up.” Al Gore, in 2006: “Many scientists are now warning that we are moving closer to several ‘tipping points’ that could — within ten years — make it impossible for us to avoid irretrievable damage to the planet’s habitability for human civilisation.”
With the exception of a handful where the jury is still out, every single one of the predictions listed in the last paragraph but one was wrong. Not just off by a few years, but diametrically, 180-degrees, wrong. Over the 40 years after 1971 the population growth rate halved; famine became rare; average crop yields doubled; India became a food exporter; lifespan increased by 25% globally; age-adjusted cancer rates fell; the Sahel grew greener; radioactive fallout levels fell by 90%; two-thirds of nuclear weapons were dismantled; no viral pandemic occurred; many cities flourished; forest cover increased; the amount of oil spilled in the ocean fell by 90%; an unprecedented global economic boom occurred; inequality fell sharply as poor countries grew rich far faster than rich countries; oil and gas reserves ballooned and prices fell; metal prices plummeted; urban air pollution improved rapidly; the ice age never came; bird and mammal species extinction rates remained low; sperm counts failed to fall; mad cow disease killed at most 171 people in 20 years; genetically modified crops led to increased biodiversity; nanotechnology did nada; computers suffered few and minor problems at the Millennium even in countries that did nothing about the Y2K bug (like Italy and South Korea); average winter snow cover in the Northern hemisphere increased; tropical cyclone accumulated energy fell to record lows; malaria retreated; not a single species went extinct as a result of climate change (golden toads were killed by a fungus); fewer people died as a result of extreme weather; and sea-level rise did not accelerate.
You think I have cherry-picked cases to suit my argument? What doom predictions would you prefer instead? Arctic summer sea ice retreated in the first years of this century faster than some expected. True, but Antarctic summer sea ice increased slightly over the same period.
The seasonal ozone hole over Antarctica failed to recover, true, but the resulting skin cancer and cataract epidemic in Patagonia or New Zealand turned out to be pseudoscientific. As for global warming generally, the warming between 1970 and 2010 was about 0.5C and was slowest in the last decade of the four. It has significantly undershot the predictions made in the 1980s by the likes of NASA scientist James Hansen, the man who first made global warming a celebrity. He said in 1986 that within 15 years temperatures would have risen by 2C-4C to “levels not seen on earth in 100,000 years”. A quarter of a century later they have risen about one tenth as much.
Forgive me, then, if I take with a pinch of salt the dire dirge of pessimism that my teenage children are being subjected to in their textbooks and in the media today — a dirge that keeps an industry of pressure groups and subsidised corporations in funds.
The ability of this Armageddon industry to move on from its mistakes with a shrug of hindsight bias and without the media asking even the simplest of questions about what went wrong with previous predictions is apparently something that one can safely predict will continue. It is a safe bet that in 2050 the media (whatever form it takes) will be dominated by pessimists then too. Disaster will be imminent still.
There are two simple reasons that these predictions of doom were wrong in 1971 and will be wrong again in 2011 and in 2050. First, bad things will always be much more newsworthy than good things. Good news is smooth, bad news is lumpy: general improvements in human living standards are incremental, and all but invisible, while wars, recessions, earthquakes and asteroids tend to strike suddenly.
So every time a new scare comes up, the moderate or sanguine voices are drowned out by the more extreme and negative predictions. Yet year by year the world’s 2%-5% economic growth rate goes on, all but unreported.
The second reason is more fundamental. All the scare stories assume a static response. They assume that humanity is standing on a railway track watching a train coming towards it, rather than moving out of the way. The number of people killed by droughts, floods and storms in the first decade of the 2000s was 93% down on the number killed by the same causes in the 1920s, despite a much larger population. That is not because weather became less dangerous, but because technologies enabled people to cut their risk of death — technologies of shelter, transport, communication, medicine and more.
If food gets scarce, prices rise and farmers plant more crops, use more fertiliser or try more experiments to raise yields; if oil gets scarce, prices rise and new drilling techniques are invented; if whales or copper get expensive, substitutes are found; if pollution gets worse, clean air laws are enacted; and so on. The (otherwise distinguished) Victorian economist William Stanley Jevons said that “it is useless to think of substituting any other kind of fuel for coal” — six years after the first oil well was drilled.
The relationship between humans and the resources they need is not that of a man helping himself from a diminishing pile of chocolates; it is a negotiation between human ingenuity and natural constraints in which price signals divert effort into fruitful directions.
Frequently, natural substances “become” resources when they were not resources before. Iron ore was not a resource till the iron age, nor uranium ore till the nuclear age. Shale gas was known to exist but was not a resource till George Mitchell figured out how to get it out of shale in the 1990s, more than doubling world gas reserves.
Methane clathrates on the ocean floor — more abundant than all other fossil fuels put together — are not yet a resource, because nobody has yet found a way to exploit them cost-effectively. By 2050, they may be a resource. Most non-renewable resources are more abundant and cheaper now than they were 40 years ago despite there being twice as many people on the planet.
Economic growth works by reducing the time it takes for people to earn the means of fulfilling their needs and desires.
Where an entirely self-sufficient person would take hours to acquire by his or her own efforts even a basic supply of food, shelter or clothing each day, a modern participant in the global division of labour who is on the average wage takes just a few hours each day to earn the money that he or she needs for good meals, fashionable clothes and rent for a decent house.
In the 1950s it took 30 minutes to earn the price of a hamburger on the average wage; today it takes three minutes. Nobody loved them, but the robber barons of the late 19th century got rich through cutting the costs of goods.
Between 1870 and 1900 Cornelius Vanderbilt cut the price of rail freight by 90%, Andrew Carnegie slashed steel prices by 75% and John D Rockefeller cut oil prices by 80%. A century later Malcom McLean, Sam Walton and Michael Dell did roughly the same for container shipping, discount retailing and home computing, and nobody loved them. A technology impacts human living standards not when it is invented, but when, decades later, it becomes affordable.
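That hamburger arithmetic is simply a ratio of price to hourly wage. Here is a minimal sketch, in Python, of the "time-price" calculation; the prices and wages are hypothetical round numbers chosen only to reproduce the 30-minute and three-minute figures above, not historical data.

```python
def time_price_minutes(price, hourly_wage):
    """Minutes of work at a given hourly wage needed to afford one item at a given price."""
    return 60.0 * price / hourly_wage

# Hypothetical illustrative figures only (not historical data):
# a $0.25 burger on a $0.50/hour wage in the 1950s, versus
# a $1.25 burger on a $25/hour wage today.
print(time_price_minutes(0.25, 0.50))   # 30.0 minutes
print(time_price_minutes(1.25, 25.0))   # 3.0 minutes
```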
So what goods and services will get cheaper in the next 40 years? Probably energy. Thanks to new technology, natural gas and solar power could both become much cheaper to deliver to customers by 2050. Cheap gas now looks like it will last for many decades, while solar power is getting steadily cheaper and may soon be affordable without subsidies.
Because gas-fired turbines can be switched on and off relatively efficiently, these two can work well together — gas by night, photovoltaic by day. Nuclear may contribute if it can ever get the cost of its safety features under control, get away from the water-cooled uranium-fuelled designs that have dominated its early years and go for inherently safer thorium instead. Wild cards that might suddenly emerge, such as cold fusion or high-temperature superconductors, would also help. But the key point is that energy will probably get cheaper, not dearer. The old-fashioned renewables, like wind, wood and water, have not a snowball’s chance in hell of competing on price, or of generating the quantities of energy needed in the future (as now), because they need too much land, and land is not going to get cheaper given how many people want to buy it or preserve it.
Apart from a few niche applications, they will be white elephants by 2050 and our current enthusiasm for heavily subsidising them will astonish our grandchildren.
Biotechnology, too, is a good bet for cheapening. A plethora of genetic and molecular breakthroughs in the past 40 years have helped researchers a lot more than they have helped patients or consumers, but that could soon change. One of the most exciting features of stem cells is that they could prove to be affordable once the technology is working well: just hook the patient up to a machine that extracts cells, reprogrammes them and re-implants them. As a way of repairing organs, it could be less costly than surgery as well as less painful. Similarly, cancer treatment stands on the brink of breakthroughs, probably using vaccines and viral gene therapy as a way of delivering large molecules inside cells to where they are needed. (The old-fashioned pharmaceutical approach of seeking small molecules that can cross cell membranes to attack cancer cells is running out of steam.)
Again, this could be cheaper than radio- and chemotherapy, as well as more effective. One of the most hopeful trends of the past decade has been a small but noticeable downturn in the age-adjusted death rate from all cancers. By 2050 the cancer death rate could be falling as fast as the death rate from heart disease and stroke is now.
Communication, which has had a spectacular four decades of cheapening, must surely bottom out in the next four decades. Once ultra-broadband mobile video costs the square root of nothing per minute, as it almost does, there is not a lot you can do for living standards by making it cheaper still.
Transport, whose cheapening in recent decades has come through enterprise more than technology (budget airlines still use basically the same engine designs as four decades ago), will probably struggle to get more affordable in coming decades. Congestion, caused by the impossibility of improving infrastructure in crowded cities, will continue to bedevil every attempt to cut its costs.
What about government? For the most part, our political masters and their bureaucratic enforcers did little or nothing over the last 40 years to reduce the cost of the services they provide. Indeed, the time it takes the average taxpayer to earn the services of his government grew substantially, with barely discernible improvements in that service: any productivity gains were captured in higher pay and pensions in the public sector itself, as is the way with monopolies.
There is an excuse for this, which is at least partly persuasive: the things government provides — infrastructure, healthcare, social services, defence, law and order and so on — are essentially immune to cheapening. Indeed, as it gets cheaper and cheaper for each of us to clothe, feed, communicate and be entertained (by the private sector), so we are increasingly left with the high-hanging fruit, the things in our lives that cannot easily be made cheaper, like nursing and tax accountancy.
Farming and manufacturing are now absurdly cheap, and make up such tiny parts of the cost of fulfilling your needs, that making them cheaper will not make much difference. Gains must come in services. If this is true (as hinted in Tyler Cowen’s The Great Stagnation), then it implies that diminishing returns will have at last arrived.
They are long overdue. Economists have been expecting diminishing technological returns ever since Mill, Ricardo and even Adam Smith. Instead, growth just keeps on accelerating, showing increasing returns. So if global growth were to begin to slow once we had run out of technologies that can be made more productive, it would be a reversal of a trend that has held since 1800.
That is one reason I do not believe it will happen. Another is that somewhere in the world, somebody will work out how to make the quality rise and the price fall in things like medical care and house building and transport and restaurant management. Given that innovation comes from the meeting and mating of ideas, rather than from the thinking sessions of lonely genii, it is likely that the internet has accelerated the innovation rate and therefore the chance of finding cheapening solutions.
There is a feature of economic growth that often goes unremarked, namely that the larger the political entity, the steadier the growth. A country is less prone to booms and busts than a city; and a continent than a country. The growth rate of the planet is still steadier, not least because a planet cannot borrow.
In only one year since the Second World War did global GDP growth dip below zero: 2009, when it was minus 0.6%. Real world GDP per capita has doubled since 1970. If it did so again in the next 40 years, the average citizen of the planet would have an annual income of roughly €17,000 in today’s euros — more than the average citizen of several European Union countries earns now.
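As a rough check on that doubling arithmetic, here is a short sketch. The starting income of about €8,500 per head is my assumption, implied by halving the €17,000 figure in the text; the compound rate that doubles it over 40 years works out to roughly 1.75% a year.

```python
# A quick check of the doubling arithmetic in the paragraph above.
# Assumption (mine, not the article's): income per head today of about
# 8,500 euros, i.e. half of the 17,000 euro figure quoted for 2050.
years = 40
income_today = 8_500.0

implied_rate = 2 ** (1 / years) - 1                      # rate that doubles income in 40 years
income_2050 = income_today * (1 + implied_rate) ** years

print(f"implied per-head growth rate: {implied_rate:.2%}")       # ~1.75% a year
print(f"income per head in 2050: {income_2050:,.0f} euros")      # ~17,000
```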
The Malthusian assumption that higher living standards inevitably mean using more resources is wrong. Take land, for example — a limited and vital resource. The amount of land needed to feed, shelter, fuel and clothe a single person keeps going down, not up, as people get richer.
Higher farm yields, the use of synthetic fleece instead of wool, the replacement of wood fires with gas-fired central heating, even the use of new building materials like steel or breeze blocks instead of wood — they all cut the acreage footprint of supporting a human lifestyle. The gradual migration of people to cities cuts the land each needs, even allowing for the rural hinterland that supports each urban dweller’s lifestyle. The same is true of many other resources, which are more frugally used by more modern lifestyles.
The energy intensity of the world economy, in terms of joules per euro of output, has been falling for two decades. It is no accident that the greatest threats to wild habitats come in poor countries, not rich ones. Haiti, for example, mostly cannot afford fossil fuels and relies on wood for much of its energy — even bakeries use charcoal — with the result that it has lost 98% of its forest. Next door, the Dominican Republic is much richer and its forest cover is increasing.
An especially revealing measure goes under the name of HANPP — the human appropriation of net primary production. Helmut Haberl of Vienna University calculates that human beings pinch 14.2% of the growth of plants on land for themselves and their domestic animals, and destroy or prevent the growth of a further 9.6% by building roads or unleashing goats, leaving 76.2% for wild nature.
But that is only part of the story. Although on balance human beings suppress plant growth, in some places they increase plant growth by fertilising and irrigating the soil. There are even large areas where this enhancement of growth is so great that it roughly matches the human appropriation (as in much of northern Europe) or even exceeds it (the Nile delta).
That is to say, even after human appropriation, there is as much or more primary production left for nature as there would be if human beings were not there.
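To put rough numbers on that logic, here is a toy sketch using the HANPP shares quoted above. The "enhancement" figure for fertilised and irrigated land is purely hypothetical, included only to show how, under this simplified netting, the human impact on plant growth can shrink towards zero, or even turn negative, as in the Nile delta.

```python
# Toy HANPP arithmetic using the shares quoted in the text; the "enhancement"
# figure is a hypothetical illustration of fertilised and irrigated land,
# not Haberl's number.
appropriated = 14.2      # % of plant growth taken for people and their animals
prevented = 9.6          # % destroyed or prevented (roads, goats, etc.)
left_for_nature = 100.0 - appropriated - prevented    # 76.2% left for wild nature

enhancement = 12.0       # hypothetical % boost to growth from fertiliser and irrigation
net_human_impact = appropriated + prevented - enhancement   # negative means more growth than without people

print(f"left for wild nature: {left_for_nature:.1f}%")
print(f"net human impact on plant growth: {net_human_impact:.1f}%")
```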
Consequently, the net impact of HANPP is lowest in the most industrialised regions. Now imagine that the rest of the world gradually goes further in this direction, steadily raising the amount of plant material that human beings consume, but steadily raising the amount that is left for other animals, too, till eventually nine billion human beings are living high on the hog without having any net impact on the quantity of wild plant and animal life at all.
All the evidence suggests that this prospect is eminently feasible so long as energy and water are both plentiful. In other words, by getting energy (and hence water) from some source other than the land, we can spare the land, in contrast to the despoliation of the landscape caused by early, primitive agriculture that had no access to artificial fertiliser and irrigation. So here is an optimistic prediction about the world in 2050. It will be a time of extensive ecological restoration. Just as rich countries today are regrowing forests at a breakneck pace (New England, for example, is now 70% woodland, where it was once 70% farmland), so the world may be doing the same in 2050 — even while feeding a larger population.
“Re-wilded” areas in Africa, the American mid-west and central Asia may be home once again to migrating herds of wild mammals. Just as rich countries today are bringing some species back from the brink of extinction (it is 160 years since a breeding bird native to Europe went globally extinct: the great auk; the pied raven was a subspecies, and the slender-billed curlew and bald ibis are not yet extinct), so by 2050 many Asian countries and perhaps even some African ones will be doing the same.
It may be too late for some species by then, but perhaps not. Even if the tiger were to die out in the wild, its restoration from captivity by wealthy future Indians seems probable. And one thing that may well have happened by 2050 is the rebirth of an extinct species.
The mammoth will probably be the first, partly because of the excellent preservation state of frozen specimens and partly because of the effort that is already going into extracting mammoth cells of good quality. Reprogramming such a cell to become an embryo, implanting it into an Indian elephant and bringing a fetus to term will be a very tall order. But then test-tube babies seemed an impossible dream in 1970 and they were just eight years away.
* Matt Ridley is a journalist, author, biologist, and businessman. This article is taken from Megachange: The World in 2050 (The Economist).




