For cosmologists, the biggest discovery of the next decade would be finding out what our universe is made of. Astronomers who observe the movement and distribution of objects across the cosmos calculate that normal matter — all the galaxies, stars, planets, quasars, black holes and so on whose existence they can detect or deduce — accounts for just 5 per cent of the universe. The missing 95 per cent consists of “dark energy” and “dark matter”, the nature of which is a total mystery.
Dark energy seems to be an intrinsic property of space, perhaps some sort of anti-gravity accelerating the expansion of the universe. It is unlikely to be explained in the near future. There is a far better chance of understanding dark matter, whose gravitational pull holds galaxies together. Various experiments have been designed to discover the “weakly interacting massive particles” (or Wimps) that probably make up dark matter by detecting their (extremely rare) collisions with ordinary matter.
While the chances of success in elucidating the dark universe are uncertain, we can be sure that space missions to explore visible matter in the solar system and beyond will produce fascinating results. The closest planetary target is Mars, with missions planned from this year not only to land a new generation of rovers to explore the Martian surface but also, later in the decade, to bring soil and rock samples back from the red planet for analysis on Earth, including a biochemical search for signs of Martian micro-organisms. Then we can expect to be reasonably confident in answering the question: is there life on Mars? (I think not.)
The most exciting planetary mission later in the decade will be Europe’s Jupiter Icy Moons Explorer. Juice, as it is called, will look for signs of life on three Jovian moons (Ganymede, Europa and Callisto) that are thought to harbour liquid oceans beneath their icy crusts.
Meanwhile a plethora of orbiting observatories will be finding planets far beyond our solar system, circling stars many light-years away. Astronomers have already discovered more than 4,000 exoplanets since the first one in 1995, and we can expect the total to run into tens or hundreds of thousands by 2030, as they deploy more sensitive instruments and train them on planets where conditions are similar to those on Earth. The biggest achievement would be detecting a planet that carries a biochemical signature of life — a balance of gases in its atmosphere that could only be produced by living creatures.
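The article does not say how those observatories find their planets; the workhorse technique is transit photometry, in which a planet crossing its star's face causes a tiny periodic dip in measured brightness. A toy sketch in Python (all numbers invented for illustration):

```python
# Toy illustration of transit photometry, a widely used way to find
# exoplanets: a planet crossing its star dims the starlight slightly.
# All values below are made up for illustration, not real measurements.

def simulate_light_curve(n_points=200, transit_start=80, transit_len=10,
                         depth=0.01):
    """Star brightness over time, normalised to 1.0, with one transit dip."""
    flux = [1.0] * n_points
    for t in range(transit_start, transit_start + transit_len):
        flux[t] = 1.0 - depth  # planet blocks a fraction of the starlight
    return flux

def find_transit(flux, threshold=0.005):
    """Return time indices where brightness drops below the normal level."""
    return [t for t, f in enumerate(flux) if f < 1.0 - threshold]

flux = simulate_light_curve()
dip = find_transit(flux)
print(f"Transit detected from t={dip[0]} to t={dip[-1]}")
# prints: Transit detected from t=80 to t=89
```

A dip of about 1 per cent corresponds to a Jupiter-sized planet crossing a Sun-like star; an Earth-sized one blocks closer to 0.01 per cent, which is why ever more sensitive instruments matter.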
At the same time Seti, the search for extraterrestrial intelligence, will screen the heavens for radio or optical signals from an advanced civilisation elsewhere in our galaxy — a search enhanced by more powerful radio telescopes and artificial intelligence designed to trace transmissions that could not come from natural processes. If AI helps us find ET, that would undoubtedly be the discovery of the decade, whatever the content of the message. I put the chances of success at 20 per cent.
This should be the decade when artificial intelligence finally delivers. If so, its impact will be felt in every field of science and technology as it supercharges the ability of computers to process data and deduce patterns beyond human cognition. In the process, AI will transform many aspects of life, directly and indirectly, for better and for worse.
Although historians of technology point to past cycles of enthusiasm and disillusionment for AI going back to the mid-20th century, machine intelligence has only recently started to demonstrate its true potential for disruption. “Deep learning” programs enable computers to teach themselves to draw conclusions from vast volumes of data, often unstructured or unlabelled. Well-publicised examples include beating human champions in a wide range of games, guiding self-driving cars and translating between languages.
Behind the scenes, AI is beginning to help discover new drugs, diagnose disease from medical scans and spot distant planets in astronomical data.
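The core idea of learning from examples can be sketched in a few lines. This is not deep learning itself, which stacks many layers of such units, but a single artificial neuron (a perceptron) that learns the logical OR function from labelled examples rather than from hand-written rules:

```python
# A single artificial neuron (perceptron) learning logical OR from
# labelled examples -- the simplest ancestor of the deep networks the
# article describes. The program adjusts its own weights from data.

examples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

def predict(weights, bias, x):
    """Fire (output 1) if the weighted sum of inputs exceeds zero."""
    s = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if s > 0 else 0

def train(examples, epochs=20, lr=0.1):
    """Repeatedly nudge the weights toward the correct answers."""
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in examples:
            error = target - predict(weights, bias, x)
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

weights, bias = train(examples)
print([predict(weights, bias, x) for x, _ in examples])  # prints [0, 1, 1, 1]
```

Deep networks apply the same nudge-the-weights principle, via backpropagation, across millions of such units; that scale is what lets them master games, driving and translation.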
Some say that AI is currently overhyped and set for another downturn. Yes, there is hype — and I would have a nice income stream if I received a pound for every press release reaching my inbox that attributes a routine advance to AI — but the technology is advancing too fast and on too broad a front for the old cyclical pattern to repeat itself.
For instance, within 10 years we can expect to see accurate, reliable translators for travellers and companion robots that can conduct reasonably fluent conversations with people. Excellent facial recognition software will be a blessing if you want to identify a figure in an old family photo — but not if a totalitarian regime stops you going where you want when your socially undesirable face appears on a video feed.
Society will need to be on its guard against one unmitigated downside of AI: the ability to manipulate voice and video to show people saying and doing things they never did. These “deepfakes” are already plausible and by 2030 will be impossible to distinguish from the real thing without forensic analysis. They will threaten not only politicians and the democratic process but also businesses and private individuals. It will be hard to educate people not to take what they see and hear in the media at face value without destroying social trust at the same time.
As AI develops, computer scientists also face a growing “black box problem” — their inability to understand how the system works and reaches its conclusions. This leads on to one of the least predictable aspects of AI. When, if ever, will the technology move on from today’s increasingly capable but essentially specialised systems — which have learnt to carry out a defined task such as diagnosing cancer or identifying faces — and create “artificial general intelligence” as flexible and adaptable as the human brain? Few experts expect AGI as soon as 2030, but well before then society should begin preparing for its arrival, just in case.
Chemically, the computer hardware in which AI lives is utterly different from the living brain whose functions it tries to reproduce. One is made from hard inorganic materials such as silicon chips with connections fashioned from metal; the other consists of complex biological molecules in soft tissues. But both process information through electrical signals and they have enough in common for computer engineers and neuroscientists to work fruitfully together. Such collaboration offers immense scope for affecting how the brain works, in sickness and in health.
Although Steven Pinker’s essay on these pages last weekend rightly pooh-poohed the idea that the 2020s would see a “brave new world” of high-tech mind-hacking, we can expect a huge advance in two-way communications between brain and computer. Today’s brain-computer interfaces offer rudimentary one-way traffic. Some tap our intentions, for example when a paralysed patient with a neural implant drives a robotic prosthesis or when someone dons an EEG (electroencephalogram) cap to play an electronic game. Others send an electronic pulse the other way into the brain, for example to relieve symptoms of Parkinson’s disease or depression.
As Britain’s Royal Society predicted last summer, these primitive devices will develop into high-bandwidth interfaces between brain and external devices — a view boosted when Elon Musk unveiled with typical panache his Neuralink implant, designed to be inserted by a microsurgical robot weaving threads of flexible electrodes across the brain.
At the same time, in corporate and academic labs around the world, more modest neurotechnology initiatives are under way whose impact could be just as significant in the long run. To give one example, an international research team based at the University of Bath in the UK has made implantable “artificial neurons” that accurately reproduce the electrical properties of brain cells on silicon chips, while running on just one-billionth of the power of a digital microprocessor. Their first application, already tested on animals and soon to be tried in patients, will treat heart failure by supplementing neurons that co-ordinate heartbeat with breathing. The researchers’ long-term ambition is for electronic implants to replace the failing neurons of people with Alzheimer’s and other degenerative brain diseases.
Efforts to rescue failing brains through electronics and computing will complement biological and chemical approaches. The latter have failed so far to produce effective treatments (let alone cures) for most neurological disorders — though the pharmaceutical industry has spent billions of dollars trying over many years. However, there is real reason to hope that the 2020s will at last see substantial progress in tackling the underlying biological causes of neurodegeneration.
Convergent advances in several fields of bioscience — notably genetics, gene editing, stem cells and immunology — give cause for optimism, not only for treating the brain but also for tackling killer diseases from cancer to diabetes, where there has already been some progress but much more is likely over the next few years.
Although scientists finished a first draft of the human genome — the 3bn biochemical letters of DNA that store our genetic inheritance — almost 20 years ago, it has taken far longer than the optimists expected back then to disentangle the function of individual genes, how they are regulated and how they work together to keep us healthy and, when they go wrong, cause disease. Many mysteries remain but enough is known to predict which genetic interventions are likely to work.
Fortunately scientists have gained a tool for intervening in the genome: the now famous Crispr, which was unveiled only eight years ago as a way to edit genes more cleanly and efficiently than previous hit-or-miss methods of genetic engineering. The first clinical trials of Crispr began in 2019, to treat forms of cancer, blood disorders and inherited blindness, and patients with a wide range of problems can expect to benefit in the 2020s. At the same time scientists will improve the accuracy of Crispr itself; indeed a derivative called “prime editing”, published in October, seems to offer a more precise way to cut and splice DNA.
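To give a flavour of why Crispr is described as clean and efficient: the commonly used Cas9 enzyme can only cut where a roughly 20-letter target sequence sits immediately before an "NGG" motif (the PAM, where N stands for any base), so finding candidate edit sites is essentially a string search. A heavily simplified sketch with a made-up DNA sequence (real guide design also scans the opposite strand and checks for off-target matches):

```python
# Toy sketch of one step in designing a Crispr-Cas9 edit: scanning DNA
# for ~20-letter target sequences immediately followed by an NGG "PAM"
# motif. Heavily simplified; the sequence below is invented.

def find_cas9_sites(dna, guide_len=20):
    """Return (position, guide sequence) pairs followed by an NGG PAM."""
    sites = []
    for i in range(len(dna) - guide_len - 2):
        pam = dna[i + guide_len: i + guide_len + 3]
        if pam[1:] == "GG":  # the leading N can be any base
            sites.append((i, dna[i:i + guide_len]))
    return sites

dna = "ATGCGTACGTTAGCCATAGGCTTAACGGTCCATGGAATCCGTAGGATCCA"
for pos, guide in find_cas9_sites(dna):
    print(pos, guide)  # finds three candidate sites in this sequence
```

The PAM requirement is one reason Cas9 cuts where it is told rather than at random — a key advance over older, hit-or-miss genetic engineering.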
One pioneering Crispr trial edits genes in immune cells to make them fight cancers (multiple myeloma and sarcoma) more effectively. Better understanding of human immunity — and how to use it to fight disease — has been an unsung feature of recent biomedical research and this will be exploited much more extensively in years to come, as scientists learn to engineer the genetics of the immune system.
The beneficiaries will include not only cancer patients (whose prospects have already been improved through immunotherapy) but also people with other common conditions such as autoimmune disorders and infections. The most tantalising prospect, which researchers will explore thoroughly during the 2020s, is that manipulating immunity could provide the long-sought weapon against Alzheimer’s. Scientists know already that some aspects of the immune system are overactive and some underactive during the lengthy period when deposits of toxic protein are building up in the brain but symptoms of dementia have not yet developed. Rebalancing the system might stop Alzheimer’s in its tracks or even reverse it by removing unwanted proteins.
Stem-cell research is another fast-moving field set to make a big impact during the coming decade. Over the past 10 years, scientists have learnt how to create almost any living tissue in lab dishes. With clever biochemical cocktails, they turn adult cells back to an embryo-like state and then drive them to develop into other specialised cells — which in turn organise themselves into simplified replicas of human organs known as organoids. So far organoids have been used mainly for lab studies of diseases and possible treatments — but we may see them transplanted into patients by the late 2020s to replace their own failed organs such as kidneys and hearts. Organoids grown from the patients’ own cells may be more acceptable than an alternative technology under development: growing “xenotransplant” organs in pigs, genetically edited to avoid rejection by the human immune system.
The top global challenge in the 2020s will be the climate emergency — and we will have to respond primarily through political willpower and economic and industrial action, because it is hard to envisage a realistic scientific breakthrough within the next decade that would easily or quickly wean the world off fossil fuels. Nothing will match oil and natural gas as convenient, portable, compact and cheap energy sources.
Even so, we can look forward to extensive incremental innovation in non-carbon energy generation and storage. Ironically, the next really new commercial energy source will be one that scientists have been working on for 70 years: nuclear fusion. Taming fusion, the reaction that powers the Sun and stars, has been the province of government-funded big science — the international ITER project is spending $22bn to build a gigantic doughnut-shaped reactor in southern France that is scheduled to start experiments in 2025 — but the private sector is now joining in too.
Several companies in the US and Europe are taking advantage of recent advances in plasma physics, magnets and materials science to develop more compact and less expensive reactors. Fusion power is unlikely to reach the market within the next decade, but the results from experimental reactors should at least show whether it will be worth making an enormous investment during the 2030s to commercialise a safe energy source that does not contribute to global warming and produces very little radioactivity compared with nuclear fission.
Fission, which splits heavy atoms while fusion combines light ones, has generated nuclear power since the 1950s. It has been in the economic doldrums for decades for several reasons — including the substantial volumes of radioactive waste generated, the risk of a catastrophic accident and the huge capital costs of building a large power station. But the world needs non-carbon energy so badly that various suppliers will demonstrate a new generation of small modular fission reactors — safe and relatively inexpensive — over the next few years. Their commercial introduction may depend on the climate crisis inducing a change of heart among green politicians, who have tended to be automatically anti-nuclear.
The renewable sources that environmental advocates love — solar and wind power — will need technical progress, above all in ways to store their output for release when the sun isn’t shining and the wind isn’t blowing. There is considerable scope to improve lithium ion batteries, the current favourite, and many other battery types are in development around the world, but we should not expect a quantum leap in energy storage technology any time soon.
Clive Cookson is the FT’s science editor