Europe’s Quest for Technological Power

Alice Pannier heads the Geopolitics of Technology program at the French Institute of International Relations (IFRI). She previously held the position of Assistant Professor in International Relations and European Studies at the Paul H. Nitze School of Advanced International Studies (SAIS) at Johns Hopkins University. This essay is based on a longer study entitled “Strategic Calculation” published by IFRI in October 2021. You may follow her on Twitter @AlicePannier.

Computing power plays a key role in enabling data analytics and machine learning, in cybersecurity, for scientific research, and in military domains like nuclear warhead design and detonation simulation. Computing also has industrial ramifications, not least due to a relatively small number of players that hold key spots in the value chain. This leads some to argue that the contours of computational power define who has control over and access to the benefits of computer-based technologies like artificial intelligence.

This essay focuses on two complementary segments of computing: high-performance computing (HPC, also known as “supercomputing”) and quantum computing. The two are at very different stages of maturity. HPC has been widely used in scientific research, meteorology, the military, finance, and industry since the 1990s. Arguably, a nation’s ability to deploy supercomputers constitutes a form of soft power, as well as a scientific and national security imperative. Today, a few countries around the globe are engaged in a race to deploy the next level of supercomputers, known as exascale machines. But the field is also witnessing a diversification of uses, with new needs stemming from big data applications for industry.

Meanwhile, quantum computing is still at an experimental stage but has highly disruptive potential in both civilian and military domains. It seeks to exploit the properties of quantum mechanics and, as such, constitutes a whole new paradigm in computing. Quantum computing will multiply computing power exponentially, with considerable cybersecurity implications and industrial and scientific applications. The field has experienced swift progress in recent years, even if large-scale quantum computers may still be a decade away.

The race for computing power, including quantum computing, has become a key element of the U.S.-China technological competition. Yet the competition is far from being solely a U.S.-China matter. Given its applications and implications, computing power is a strategic priority for European governments as well as for the European Union. This is especially the case when it comes to quantum computing. Countries around the world have recognized that quantum science has moved from an academic field of research to a fast-growing technological sector. Consequently, they are developing strategies in the field, and current dynamics are akin to a space race.

This essay examines the state of play of technological developments and international competition in HPC and quantum computing, with a particular emphasis on where France and Europe stand in the global race for computing power. The first half of the essay presents current dynamics in the HPC sector: on the one hand, the enduring role of states in shaping supercomputing as a strategic sector, and on the other, the ongoing “democratization” of the field, with growing uses in industry. It then addresses the geopolitical considerations that supercomputing raises for Europe, as well as ongoing strategies aimed at enhancing Europe’s technological power in the field.

The second half of the essay addresses quantum computing. After introducing the main principles of quantum computing, it reviews recent progress and remaining technological hurdles, as well as the strategic and economic applications and implications of quantum computing. It then looks at the quantum strategies deployed by the U.S., China, and EU countries, with a particular emphasis on the French 2021 Quantum Plan.

This essay shows that HPC and quantum computing present both opportunities and challenges for European countries as they seek to leverage the potentials of computing power for the data economy and national security, as well as for addressing critical societal challenges in areas of health and climate change. Aside from national strategies and investments in HPC and quantum technologies, collective efforts in Europe to pool resources are ongoing. The EU is indeed striving to develop federated computing services and data infrastructure, and to secure resilient supply chains in components, technologies, and knowledge, not least to limit risks of disruption. 

Computing technologies raise challenges for Europe—from the supply of chips to energy consumption—as well as risks, from export restrictions to company takeovers. In both HPC and quantum computing, procurement choices expose Europe’s internal debates and contradictions over developing its technological power, as the line between scientific research and strategic advantage blurs. Europe also faces a well-known problem of insufficient private investment in disruptive technologies. Quantum technologies offer an opportunity to learn lessons from past developments in classical computing and to take the right actions early on. If Europe fails, the consequences will be not only economic but also a matter of security.

Trends in High-Performance Computing

High-performance computing (HPC) refers to computer systems, often called “supercomputers,” with extremely high computational power, able to solve very complex problems at high speed. The development of supercomputers has been and remains largely state-driven, as their acquisition and running costs are high and as they have uses in national security (e.g., nuclear simulation), scientific and medical research, and climate modelling. Nonetheless, the rise of the big data economy, the computing power required to train artificial intelligence (AI) algorithms and run simulations, and the growth of cloud computing have “democratized” the use of high-performance computing in industry.

Today, U.S. computer and processor companies dominate the market in Europe, while China has developed indigenous technology. Europe is seeking to catch up by supporting its industrial base and developing its processor technology but is facing internal hurdles that are industrial, technological, and political.

The current concept behind high-performance computers appeared in the late 1980s and early 1990s with the advent of massively parallel processing, whereby supercomputers started to be built with hundreds of thousands of processing cores. The chart below shows the pace at which computing power has grown from the 1990s onward as a result. In HPC jargon, computing power is measured in floating point operations per second (flop/s). While it was previously measured in gigaflop/s (one billion, or 10⁹, operations per second), computing power is now measured in petaflop/s (10¹⁵ operations per second), and the standard is about to shift to exaflop/s: 10¹⁸, or one quintillion (one billion billion), operations per second. To give an indication of scale, a petascale supercomputer is about one million times more powerful than a high-end laptop.
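
To make these orders of magnitude concrete, the short sketch below walks through the unit conversions; the figures are my own illustration rather than data from the study, and the laptop comparison assumes a machine sustaining on the order of a gigaflop/s.

```python
# Orders of magnitude for the flop/s units used above -- illustrative figures,
# not data from the study. The laptop comparison assumes a machine sustaining
# on the order of a gigaflop/s.
GIGA, PETA, EXA = 10**9, 10**15, 10**18

laptop_flops = 1 * GIGA            # assumed sustained rate for the comparison
petascale_flops = 1 * PETA
exascale_flops = 1 * EXA

print(f"{petascale_flops / laptop_flops:,.0f}")     # 1,000,000: petascale vs. laptop
print(f"{exascale_flops / petascale_flops:,.0f}")   # 1,000: exascale vs. petascale
```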

[Chart: Performance Development]

Governments have historically played an important role in the development and growth of computer technology, especially HPC technology—even if, in recent decades, computer chip development has been mainly driven by the private sector, not least by the smartphone industry. The public sector is still the main consumer of concentrated computing power today. 

In Europe in 2018, over 90 percent of HPC operating time was used by universities or academic research centers, while the remaining 10 percent served commercial purposes and end users. Chief among state-driven uses of computing power are national security applications: supercomputers can be used to design, develop, manufacture, and test weapons (including nuclear weapons) and weapons platforms; to collect, process, analyze, and disseminate intelligence; and for cryptography, combat simulation, missile defense, and more. Supercomputers are also used for weather forecasting and scientific research, including medical research. The COVID-19 pandemic is said to have created new needs for biomedical applications, new drug development, and digital twins of humans.

Since supercomputers have dual-use applications, they have been subject to export restrictions since the 1990s. Today, the U.S.-China competition clearly plays out in the realm of HPC, due to concerns about both intellectual property and the potential uses of U.S. chip technology. In April 2021, the Biden administration added several entities involved in HPC to the Entity List, including China’s National Supercomputing Center (which is involved in the simulation of hypersonic vehicles). This prevents the export of U.S. technology to those entities.

Global Distribution of Computing Power

Across the globe, a small number of countries possess significant supercomputing capabilities. China and the U.S. lead the race, followed by second-tier HPC powers: Japan, Germany, France, the Netherlands, Ireland, the UK, and Canada. In terms of companies, the top vendors in the HPC sector are Lenovo (China, the #1 vendor worldwide, with 36.8 percent market share), Inspur (China, 11.6 percent), Hewlett Packard Enterprise (HPE, U.S., 9 percent), Sugon (China, 7.8 percent), and Atos (France, 7.2 percent). If we include Cray, which HPE acquired in 2019 and which holds 6.4 percent of the market, HPE moves up to second place, with 17.4 percent. The picture changes if one looks at computing performance rather than market share: the leading player is then Fujitsu (Japan, 19.8 percent). Its Fugaku, unveiled in June 2021, is the most powerful machine in the world, with three times the power of the second most powerful. Its computing power is equivalent to 20 million smartphones combined, and it is about halfway to exascale.

Public sector institutions—whether government departments or university research laboratories—possess the most powerful machines. Nvidia and Eni (the Italian oil and gas company) are two notable exceptions in the top-ten list. Aside from the top ten, we find large digital companies, in particular cloud service providers like Microsoft Azure and Amazon Web Services, which have acquired significant computing resources of their own. Weather forecasting agencies also rank high among HPC users across the globe.

The performance of a supercomputer for a given task depends not only on the number of cores and the speed of the interconnecting network, but also on the type and architecture of its chips. One difficulty with HPC is that higher performance tends to come at the cost of flexibility: hardware built for specific purposes outpaces general-purpose computers. Thus, many countries (the U.S., Canada, Japan, the UK, Germany, and France) have recently acquired large computing facilities specifically dedicated to AI for their research communities. Examples include the RIKEN Center in Japan, which developed the Fugaku; the Joint Academic Data Science Endeavour (JADE) in the UK, in 2018; and the GENCI (Grand équipement national de calcul intensif) in France, which inaugurated a new machine for AI research, the Jean Zay, in 2020.

It is thus essential to consider the processors that power HPC machines and the central role that chip manufacturers (AMD, Intel, Nvidia) play in shaping and structuring the realm of computing power. The international landscape of processors has evolved greatly over the past few years, with the breakthrough of Nvidia, a Californian company founded in 1993. Computers rely on central processing units (CPUs), as well as, increasingly, on graphics processing units (GPUs), which enable visual computing (such as 3D, video, computer vision, and image recognition). Nvidia designs GPUs that were originally aimed at the gaming industry and now equip supercomputers around the world. GPUs turned out to significantly accelerate certain applications like machine learning, and as such constituted a real revolution in HPC. In 2017, Nvidia launched a GPU based on a new architecture (“Volta”), designed for AI and especially for self-driving cars, which now equips America’s two most powerful supercomputers, Sierra and Summit. The company also started manufacturing its own supercomputers. Nvidia hoped to acquire ARM, the world-leading British chip designer bought by the Japanese holding company SoftBank in 2016 for £24.3 billion (€29 billion). But as of late January 2022, press reports indicated that Nvidia was preparing to abandon the acquisition, having made little progress in convincing the UK government not to block the sale over competition and national security concerns, in a context of heightened geopolitical competition surrounding computer chips.

Toward an Exascale Machine

Currently, the world of high-performance computing is characterized by a race to develop exascale computers, capable of carrying out one quintillion (10¹⁸) operations per second. In other words, exascale machines will be twice as powerful as the world’s most powerful machine to date, and twenty times more powerful than the best European machine. Exascale machines will make a difference in specific areas of simulation and 3D visualization used in nuclear research (e.g., next-generation nuclear warheads), climate sciences (e.g., forecasts of the consequences of temperature changes), high-resolution meteorology and oceanography, as well as biological and medical research (e.g., cardiac physiology). But a country’s place in the world’s computing power rankings is also an expression of national sovereignty and a soft power tool, as suggested by Emmanuel Jeannot of INRIA.

Exascale machines are only now being deployed, with government programs underway in the United States, Europe, China, and Japan. Developing exascale machines is a matter of cost as well as of access to technology. The latter is a particular problem for China, due to export restrictions on U.S. technology. As for cost, it is telling that R&D funding for the Fugaku amounted to around €1 billion over ten years.

Most recently, China and the United States have brought their own exascale machines into operation. In July 2015, then U.S. President Barack Obama launched a National Strategic Computing Initiative calling for the accelerated development of an exascale computing system integrating hardware and software across a range of applications representing government needs. HPE Cray delivered its first exascale computer, Frontier, to the Oak Ridge National Laboratory (ORNL, attached to the Department of Energy) in late 2021. ORNL already hosts Summit, which ranked as the world’s most powerful machine in 2018-2020.

While China did not have a single supercomputer in 2001, it overtook the U.S. in both the performance and the number of supercomputers worldwide in 2016. Today, China owns the highest number of Top500 supercomputers worldwide, even if its machines are, on the whole, less powerful than American ones. China’s first exascale machine, based on indigenous technology, became operational in late 2021, making it the first nation to operationalize an exascale computer. In addition, Beijing has set a target of 10 national exascale supercomputing centers in its 14th Five-Year Plan (2021-2025).

Other countries around the globe are seeking to build exascale machines, but they are less advanced than China, the U.S., or Japan. This list includes South Korea, which is aiming for a national exascale computer, relying on Korean processors, by 2030. Europe, for its part, is seeking to deploy exascale machines by 2022-2023, as we shall see below. 

‘Democratization’ of HPC

Given its costs and possible uses, HPC was not a critical business need for many companies until the emergence of the big data economy. Today, the functions of computing power are changing and HPC is ‘democratizing,’ as it plays an important role in facilitating the development of the big data economy. With the proliferation of data, there is growing demand for extracting insights from big data and for near-real-time analysis. HPC is especially attractive for AI technologies such as deep learning, which require so much computing power that the amount needed worldwide to train AI programs has been doubling every 3.4 months.
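
As a rough illustration of what that doubling rate implies (my own arithmetic, not a figure from the essay):

```python
# What "doubling every 3.4 months" implies for AI training compute -- my own
# arithmetic, not a figure from the essay.
doubling_period_months = 3.4

annual_growth = 2 ** (12 / doubling_period_months)
five_year_growth = 2 ** (60 / doubling_period_months)

print(f"Per year:     ~{annual_growth:.0f}x")       # about 12x
print(f"Over 5 years: ~{five_year_growth:,.0f}x")   # on the order of 200,000x
```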

HPC-enabled big data analytics and simulation have plenty of uses in industry and in all design and manufacturing processes: digital twinning, design customization, operations management, maintenance, optimization, or assessment. HPC-based simulation is also relevant for 3D simulation of fluid dynamics, in driving simulation for autonomous vehicles, for tank simulation in the oil and gas business, in finance for the optimization of portfolio risk management, in crisis management scenarios (e.g., forest fires), etc.

Besides industry, growth in uses of HPC and digital twinning continues to be driven by research, including medical research and climate sciences. In fact, the need for computing power in science is ever growing. By way of illustration, since 2020, Japan has been using the Fugaku to simulate the spread of COVID-19. In the future, digital twinning of the human body, based on individuals’ DNA, should in theory allow medical treatments to be tailored to a person’s physiology and needs.

When it comes to climate, the EU launched its “Destination Earth” (DestinE) project in 2021, as part of its Green Deal. The aim is to create digital twins of the Earth: high-precision digital models used to monitor and simulate natural and human activity and the effects of climate change. The project will span 2021-2030; by 2025, it will include a cloud-based platform with four to five operational digital twins.

As its uses have become more widespread, the market for HPC has become quite dynamic: its growth rate for businesses has been estimated at 9.8 percent between 2017 and 2022. Market turnover reached $41 billion (€35 billion) worldwide in 2020 and is expected to reach $66.5 billion (€56.7 billion) in 2028.

Aside from greater demand, the growth of the HPC sector is fueled by sinking costs and the greater affordability of hardware, as well as the move of HPC from “on premises” to (partly) the cloud, whereby HPC users can access high-speed data processing through commercial services such as Amazon Web Services. A report suggests that the on-premises HPC market, worth $24 billion (€20.5 billion) in 2020, will grow at 7 percent a year through 2024, while the HPC cloud market, worth only $4.3 billion (€3.7 billion) in 2020, will grow at 17 percent a year until 2024. As an indication of these trends, the world-leading, Taiwan-based chip manufacturer TSMC expects that, due to “unprecedented demand for compute power in cloud datacenters and communication infrastructures, […] the main driver of its growth in the next several years [will] be high-performance computing, overtaking its current smartphone business.”


Is Europe Catching Up?


In 2018, the President of the European Investment Bank (EIB) regretted that “while a third of the global demand for HPC capabilities comes from European industry, SMEs, and researchers, […] only 5 percent of […] HPC capabilities [were] being provided by European HPC centers.” Europe hosts one main player in supercomputing—Atos—but it is seeking to host more machines on the continent through collaborative ventures, and to develop capacities in areas of the HPC value chain where Europe is largely absent, notably processors.


The French company Atos-Bull is the sole European supercomputer hardware company. It is the result of successive French governments’ objective of being sufficiently industrially autonomous to develop and maintain nuclear weapon systems. This goal drove France’s post-World War II techno-industrial planning in telecommunications, atomic energy, space capabilities, and computer science. Another motivation for having a French supercomputer company was the U.S. embargo on exports to France of state-of-the-art computer equipment, for fear that it might fall into Soviet hands. When Bull was taken over by General Electric in 1964, leaving Western Europe with no challenger to U.S. companies in a strategic sector, France launched its “Plan Calcul” (“computing plan”) to support the emergence of a French national champion in computing. The Plan failed to create a new industrial actor from scratch, but Bull was eventually nationalized in 1982 and re-privatized in 1994. Atos finally took over Bull in 2014.

For the past two decades, Bull (now Atos-Bull) has been an important player in the supercomputer business. Since 2001, five years after it launched its nuclear simulation program (“Simulation”), the French Alternative Energies and Atomic Energy Commission (CEA), which oversees the French nuclear deterrent, has endeavored to build a “sovereign” national supercomputer industry. The CEA entered into a partnership with Bull, which delivered its first machine to the CEA (the TERA-10) in 2005. By 2012, Bull had three machines in the world’s top 20. Today, Atos-Bull continues to supply the CEA for the Simulation program, and the French Ministry of the Armed Forces more broadly for other uses. The need for 3D computing for future generations of nuclear weapons is what is driving the search for ever more powerful computers and the pursuit of exascale machines.

But Atos and other European companies are not present all along the production cycle for supercomputers. Atos has its own interconnection system (BXI) but relies on non-European manufacturers for processors: the U.S.-based companies AMD, Intel, and Nvidia. As a representative of Atos put it, the company is “agnostic” when it comes to processing unit providers. French and EU authorities have, however, identified the absence of a European provider as a problem and sought to develop alternatives. In 2019, the head of the CEA’s military arm estimated that, while the CEA had been working with Intel, in the future there should be “a sovereign European processor,” because France would “not want to be subject to an inability to have these processors.”

Another problem is public procurement choices. Government procurement programs are a key determinant of HPC hardware providers’ outlooks, especially when companies have little chance of selling into foreign markets. Today, both the United States and China keep their respective markets closed to foreign vendors. In the U.S., the domestic industry is strongly supported by a “Buy-American” requirement for the purchase of supercomputers. A European company like Atos thus cannot hope to export its machines to the U.S. or China, and its market is largely located in Europe (UK included), Brazil, and India. What is more, unlike in the U.S. or China, public procurement in the EU is open to non-EU entities, a practice that does not always favor local providers. It is indeed striking that the Jean Zay, the CNRS’s largest and most recent supercomputer, is HPE-built, not Atos-built. New initiatives aim at boosting the European HPC sector (see below), but debates have arisen about whether to give precedence to EU-based options in public procurement choices. This may change with the International Procurement Instrument, a new piece of legislation currently being negotiated in Brussels, which would introduce a principle of reciprocity in the openness of public procurement markets, in response to legislation such as the “Buy American” Act.

A further limitation is the absence of European companies in cloud-based HPC services. Cloud-service suppliers in Europe remain largely American companies. For example, the French firm Atos has partnered with Google Cloud to provide a hybrid cloud solution for data analysis and machine learning. This is not without problems for data security. Consequently, the EIB has pushed for the development of European cloud offers, including HPC applications. 

Europe, finally, has a funding problem. The European Investment Bank in 2018 called for significant investments to be made in infrastructure, access to big data, and tailor-made complex software solutions. Mariya Gabriel, then Commissioner for Digital Economy and Society, identified a funding gap in European HPC of €500 million to €750 million per year, compared to the USA, China, or Japan. One conclusion of the report was that no single country in Europe by itself has the capacity to sustainably set up and maintain an exascale HPC ecosystem within a competitive timeframe.

EU Plans for More Computing Power

As its use has become more common, HPC has become a priority for most EU member states in recent years, along with attempts to address the limits of Europe’s computing power. The need became more pressing from 2015 onward, after both the U.S. and China developed their plans for HPC. There was thus pressure within the EU to aim for exascale computers too, and to make the EU not only a consumer but also a producer of computing power. Thierry Breton, then CEO of Atos and now the EU Commissioner responsible for industry, strongly supported the initiative. The result has been new HPC initiatives and funding at a European scale, as part of a broader agenda for European digital infrastructures.

Two initiatives are ongoing in the EU: a plan to build supercomputers in the EU, including exascale HPC, known as the EuroHPC Joint Undertaking (JU), and a plan to develop an EU microprocessor for extreme-scale computing, known as the European Processor Initiative (EPI). In 2017, seven EU member states—Germany, Portugal, France, Spain, Italy, Luxembourg, and the Netherlands—signed a declaration establishing the EuroHPC Joint Undertaking. The legal and funding entity was established in 2018. The EU Commission’s DG Connect did much of the initial work before EuroHPC became autonomous in September 2020. In the meantime, two private actors (the Big Data Value Association and ETP4HPC) joined the public-private partnership, as did several non-EU countries, including Norway and Turkey (but not the UK), bringing membership to 33.

The principle is one of co-funding between the EU, its member states, and private actors federated in an association. Participating countries and entities coordinate their efforts and pool their resources. During the initial phase, from 2019 to 2021, the JU had a €1 billion budget. On July 13, 2021, the European Council adopted a regulation establishing the EuroHPC JU, allowing existing activities to continue. The new regulation will expand the initiative’s budget, staffing, and missions. EuroHPC funding for 2021-2027 (to be matched by participating states) will come from Digital Europe (€2 billion), Horizon Europe (€900 million), and the Connecting Europe Facility (€200 million).

The funding will be used toward a dual goal: to deploy top-of-the-range supercomputing infrastructure across Europe to match users’ needs, and to develop a research and innovation ecosystem for HPC technologies in Europe. The JU aims to deploy two exascale machines by 2023, which France and Germany are hoping to host. In France, the partnership between the CEA and Atos-Bull, via the GENCI, is driving progress toward exascale and the post-TERA 1000 machine. In November 2020, they chose to integrate Fujitsu’s A64FX processor technology, the same that equips the Fugaku, to develop the first French exascale computer.

Before exascale machines are built, eight hosting entities around Europe have been selected to host five petascale and three pre-exascale machines. Each of the five petascale machines will be worth between €12 million and €30 million. The first computer, the Atos-built Vega, was inaugurated in March 2021 in Slovenia.

The EU’s plan also includes three pre-exascale machines (at around 10¹⁷ flop/s). Two are currently under construction: LUMI in Finland (Cray-HPE, with AMD CPUs and GPUs) and Leonardo in Italy (Atos-built with Nvidia GPUs), worth €144 million and €120 million respectively. The third one, MareNostrum 5, will be located at the Barcelona Supercomputing Center, but its procurement is currently in limbo. In this case, the shared desire to procure petascale or exascale machines to respond to users’ needs conflicts with another goal, which is to support and develop European industrial capabilities in HPC.

Two bids competed for the procurement contract: a U.S.-Chinese consortium of IBM and Lenovo, and Atos. The initial technical evaluation showed that IBM and Lenovo offered a more powerful machine at a better price. Atos’s advantage, by contrast, was a supply chain more embedded in Europe—so one question was whether that should be a deciding criterion in favor of choosing Atos. The EuroHPC criterion of “EU added value” does include an evaluation of how much a bid “reinforce[s] the digital technology supply chain in the Union.” In light of this requirement, the EuroHPC advisory board recommended opting for the Atos offer, a choice that France supported but Spain opposed. The issue became so political that it was discussed by Spanish Prime Minister Pedro Sanchez and French President Emmanuel Macron in March 2021. In May 2021, EuroHPC cancelled the tender, on the grounds that the COVID-19 pandemic had changed the specifications required for the machine.

Energy Efficiency & Tech Sovereignty

Access time to EuroHPC machines will be allocated to European scientific, industrial, and public sector users in such a way as to maximize the positive impact of these systems on research and innovation (R&I). Indeed, the second goal of EuroHPC, parallel to infrastructure deployment, is to develop an R&I ecosystem for HPC in Europe that includes hardware and software capabilities, applications, training, and skills. That ecosystem, in turn, should contribute to Europe’s double agenda: the Green Deal and technological sovereignty.

Energy consumption is becoming a major issue for the expansion of computing power and of the data economy. The huge increase in computing power, and in the energy it consumes, is in large part attributable to the training of machine learning programs. Aside from environmental considerations, this also has economic costs. The electricity bill for a supercomputer amounts to tens of millions of euros per year. The Fugaku consumes 30 to 40 megawatts, with a corresponding cost of up to €40 million per year. For a country like France, the limitation on deploying an exascale machine is not so much technical as financial: the current budget allocated to HPC in France covers only half that much energy.
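
A back-of-the-envelope check of the Fugaku figures quoted above, using an assumed electricity price of €0.10-0.15 per kWh (my assumption, not a figure from the essay):

```python
# Back-of-the-envelope check on the Fugaku figures quoted above. The electricity
# price (EUR 0.10-0.15 per kWh) is my assumption, not a figure from the essay.
HOURS_PER_YEAR = 24 * 365

for power_mw in (30, 40):
    for price_eur_per_kwh in (0.10, 0.15):
        annual_kwh = power_mw * 1_000 * HOURS_PER_YEAR      # MW -> kW -> kWh/year
        annual_cost_meur = annual_kwh * price_eur_per_kwh / 1e6
        print(f"{power_mw} MW at {price_eur_per_kwh:.2f} EUR/kWh "
              f"-> ~{annual_cost_meur:.0f} M EUR/year")

# Output ranges from ~26 to ~53 M EUR/year, bracketing the "up to EUR 40 million
# per year" cited in the text.
```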

As part of Europe’s Green Deal, any plan to develop HPC in Europe has to address the question of energy efficiency. Europe is already well positioned in the Green500, a companion ranking to the Top500 that looks at power efficiency (in gigaflop/s per watt): there are four Atos-Bull machines in its top 10, better than in the regular ranking, where only one Atos-Bull machine makes it into the top 10. The EU intends to continue down this path. As part of EuroHPC, the laboratories that apply to host machines must be exemplary in terms of energy efficiency. For instance, the LUMI, which is being installed in Finland, will be powered by hydropower from a nearby river.

Europe’s two agendas, the Green Deal and technological sovereignty, come together when it comes to processors: not only do processors play a big role in determining the power efficiency of a machine, but the past couple of years have also shown that foreign dependence on such technology creates risks of disruption.

The project for European low-power microprocessors—known as the European Processor Initiative (EPI)—was launched in 2015 and started in practical terms in 2018. It gathers 28 public and private partners, including the CEA, STMicroelectronics, BMW, and various universities and research labs, and is coordinated by Atos. The EPI will create advanced processors for HPC applications. The main guidelines for a first generation of processors were announced in June 2019, and the vision was further materialized with the operational launch of SiPearl, a start-up company responsible for designing the chips. A first processor prototype, Rhea (largely based on a design by the British company ARM), was presented in January 2020. SiPearl hopes to launch Rhea in 2022 and to deliver it in time for the European exascale supercomputers in 2023. Aside from supercomputers, SiPearl aims to develop microprocessors for other, bigger markets like autonomous vehicles, edge computing, and data centers.

The EPI’s processors aim to be extremely energy-efficient: SiPearl promises they will halve the energy consumption of supercomputers. In principle, they also fulfill a political and strategic goal, as they are “proudly designed in Europe to set out Europe’s technological sovereignty.” Yet, at this point, the enterprise faces a lack of investment: the HPC sector is presently financed largely by national or European budgets and grants, and the private investment needed to make it viable is missing. Finally, it is worth noting that while the processors will indeed be designed in Europe, the chips will most likely be manufactured by TSMC.

Quantum Computing’s Revolution

According to Moore’s Law, computing power doubles every two years, as the number of transistors on integrated circuits doubles. This trend has been sustained by the gradual reduction in the size of semiconductor components, together with other advances in digital electronics. However, it is now frequently argued that the law is reaching its end, as we approach the physical limits of nanoscale computer chips. According to a 2018 report by the European Investment Bank, “by 2025, hardware configurations could come to the end of exponential capacity increase” of classical computers.

In contrast to the physical limitations faced by classical computers, quantum computing exploits the characteristics of quantum physics and promises to multiply computing power exponentially. Consequently, quantum computing has become a strategic field in which governments, research laboratories, and technology companies are increasingly investing, with a view to reaping the benefits of the quantum advantage.


Quantum information sciences emerged from the realization that quantum physics (i.e., physics at the atomic or sub-atomic level) has implications for how systems like computers process information, and that quantum effects can in turn be leveraged for computing. Quantum computing is one segment of the field of quantum information sciences, which is vast and also includes communication and cryptography, metrology and sensing, and simulation, with wide-ranging application domains. Quantum information technologies currently stand at various levels of readiness. Compared to sensing and cybersecurity applications, quantum computing is the furthest from market readiness, but it also has the highest disruption potential.

Explanation of Its Principles

Quantum computing is a young field whose development has accelerated in recent years. In 1995, scientists understood that quantum algorithms would make it possible to perform calculations in record time, and that this posed security challenges for information systems. The following year, IBM proposed the first principles of the quantum computer and introduced the first two-quantum-bit (qubit) computer. Five years later, in 2001, IBM researchers managed to factor the number 15 using quantum bits.

Quantum computers, because they exploit the special properties of matter in quantum mechanics, constitute a whole new paradigm in computing, in terms of both hardware and software. In classical computing, the bit is the basic unit for storing information; its value is either 1 or 0. In quantum computing, rather counter-intuitively, the basic unit, the qubit, can be “more or less 0” and “more or less 1,” in varying proportions. This superposition of states is what gives quantum computers their multiplying effect over classical ones: where a classical computer provides the result of a calculation at the end of a chain of successive instructions, N qubits represent 2ᴺ combinations simultaneously and provide a solution at the moment of measurement (though without any indication of the process). Consequently, quantum machines allow certain computations to be performed exponentially faster than on classical computers.
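
For readers who want to see the 2ᴺ scaling concretely, the minimal sketch below (an illustration of the standard statevector picture, not a description of any particular machine) builds an N-qubit register in NumPy, puts it into a uniform superposition, and samples a single measurement outcome:

```python
# Minimal statevector sketch (illustration of the standard textbook picture,
# not a model of any particular machine): N qubits are described by 2**N
# complex amplitudes, and a Hadamard gate on each qubit puts the register
# into a uniform superposition of all 2**N basis states.
import numpy as np

N = 3
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # single-qubit Hadamard gate

state = np.zeros(2**N, dtype=complex)          # start in |00...0>
state[0] = 1.0

U = H
for _ in range(N - 1):                         # H applied to every qubit (Kronecker product)
    U = np.kron(U, H)
state = U @ state

print(state.size)                  # 2**N = 8 amplitudes tracked simultaneously
print(np.round(state.real, 3))     # uniform superposition: all equal to 1/sqrt(8)

# Measurement collapses the register to one classical outcome, sampled with
# probability |amplitude|**2 -- the "no indication of the process" caveat above.
probs = np.abs(state)**2
print(np.random.choice(2**N, p=probs))
```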

To exploit the properties of quantum physics, a quantum computer manipulates the states of particles using lasers or electric and magnetic fields. Different quantum processor technologies are currently being tested in quantum research laboratories. Most industrial teams, including IBM and Google, as well as the UK in collaboration with California-based Rigetti, have focused on qubits built from superconducting circuits, cooled to temperatures close to absolute zero at which certain materials conduct electricity with no resistance. These currently appear to be the most advanced qubit technologies.

Trapped ions are another very promising technology: electrically charged atoms (ions) that are cooled and “trapped” with lasers. This approach is pursued by Honeywell, IonQ (sponsored by the United Arab Emirates with the University of Maryland), and the EU Quantum Flagship project “AQTION.” Photon-based technologies are also being developed, not least by the University of Science and Technology of China. Finally, the emergence of quantum computing goes hand in hand with technological advances in many scientific and technological fields: nanotechnologies, cryogenics, materials sciences, lasers, etc.

It is unclear today which qubit technology or technologies will eventually prevail. For now, governments, technology companies, and investors need to adopt a “Darwinian” mindset: even though the field is still experimental, it would be risky to bet on the failure of quantum computers or to pick one technology over another. In the end, different types of quantum computing systems will probably coexist.

Remaining Challenges

The American physicist John Preskill developed the concept of “quantum supremacy,” which will be attained when a quantum machine solves a mathematical problem that no classical computer can solve due to the latter’s physical limitations. Examples of such limitations include a calculation that would require millions of years to solve, or a calculation that would require more particles than there are in the universe. 

Some U.S. and Chinese laboratories have successfully set out on a race for “quantum supremacy,” as I will discuss below. However, most governments, research labs, and start-ups around the globe are seeking to harness “quantum advantage” and to develop practical uses of quantum computing. Quantum advantage comes down to combining a quantum machine with a classical one to achieve an acceleration of computing significant enough to outperform classical machines alone. What remains unknown at present is both how and when the industry will reach the point where a quantum computer can solve a relevant problem faster than a classical computer for concrete commercial applications.

According to the U.S. Department of Energy, quantum computing today is at the same point as conventional computing was in the 1950s, when computers ran on vacuum tubes. The technologies behind classical computers are now sufficiently mature to create, transfer, and store information with reliability close to 100 percent, and few resources (memory, CPU) are needed to correct the errors that inevitably result from electronic components.

We are far from that point with quantum computers: they remain extremely sensitive to interactions with their environment, which affect the properties of quantum bits and the quality of the output. The mere day-to-day vibrations of the building in which a quantum machine is located can lead qubits to “decohere” and lose their programmed quantum information. Quantum hardware requires intricate wiring to control and measure qubits, and this too introduces noise, in a way that increases as the number of qubits grows. A key challenge is thus to solve the noise and errors caused by the fragility of quantum systems. Noise correction, through adaptations in code, algorithms, or hardware, is a central workstream of quantum research.

Today, noisy intermediate-scale quantum computers (NISQs) are a first generation of quantum computers that are not very precise, but which can demonstrate the validity of technologies and algorithms. They provide experimentation grounds to identify use cases and develop quantum algorithms; practical uses will come later. NISQs contain from around 50 to a few hundred qubits. Below 40 qubits, a classical computer can be faster than a quantum one; above 60 qubits, a quantum computer is always faster.

It is estimated that a quantum computer with 1,000 to 5,000 qubits will start having real-world applications, and implications for cybersecurity. But the “Holy Grail” of quantum computing science—which would constitute a real game-changer—would be the advent of large-scale, general-purpose quantum computers (known as LSQs), whose arrival remains uncertain.

While some, like Google, have already succeeded in performing quantum calculations that demonstrated quantum supremacy, error rates are so high that we are still very far from realizing the promises of quantum computing. Today, the best qubits make a mistake every 1,000 operations or so—roughly 10 billion times more errors than a classical computer. Thus, most experts argue that to achieve efficient error correction, these computers would need to have millions of qubits.

This presents another problem, which is space: today, an experimental quantum computer of a few dozen qubits occupies a full room, not least due to refrigeration and shielding requirements. Another option is to develop a qubit technology that can limit or correct errors. In either case, it is necessary to develop qubit technology that can be manufactured at scale.

Two other necessary lines of effort are classical-quantum computers integration, and software design. Quantum computers will not operate independently but, instead, for the foreseeable future, will work in conjunction with classical computers. A classical computer will be able to do bulk information management and data processing, while the quantum part of the machine could solve a specific problem. 

The realization of hybrid machines that integrate quantum and classical computers still poses many engineering challenges, from both a software and a hardware point of view. On the software side, companies are today designing software that runs on classical supercomputers and mimics perfect quantum computers, as well as quantum computer emulators. However, practical considerations severely limit the circuit sizes that can be emulated: because the memory needed doubles with every additional qubit, a classical computer can only simulate a quantum computer of up to around 40 qubits. Nonetheless, simulators and emulators help researchers experiment with quantum systems and develop algorithms that can exploit the peculiarities of quantum computers and their future applications.
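
The roughly 40-qubit ceiling follows directly from memory requirements: a full statevector holds 2ᴺ complex amplitudes. The short sketch below is my own back-of-the-envelope arithmetic, not a figure from the essay:

```python
# Back-of-the-envelope arithmetic (my own illustration) for the ~40-qubit
# ceiling: a full statevector stores 2**n complex amplitudes, at 16 bytes
# each in double precision.
def statevector_memory_gib(n_qubits: int, bytes_per_amplitude: int = 16) -> float:
    return (2 ** n_qubits) * bytes_per_amplitude / 2**30   # bytes -> GiB

for n in (30, 40, 50):
    print(f"{n} qubits -> {statevector_memory_gib(n):,.0f} GiB of memory")

# 30 qubits ->         16 GiB  (a workstation)
# 40 qubits ->     16,384 GiB  (~16 TiB: a large HPC node or small cluster)
# 50 qubits -> 16,777,216 GiB  (~16 PiB: beyond any existing machine's memory)
```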

Applications and Implications 

Conscious of the step-change that will come when quantum machines are ready, both industries and governments are examining the practical use cases of the intermediate devices that will likely be available within a decade.

To begin with, quantum machines will speed up the development of AI, as they will allow the acceleration of deep learning and neural networks, with both civilian and military applications. In the military domain, for instance, quantum AI could facilitate the development of autonomous weapon systems and of more accurate intelligence, especially when coupled with other quantum technologies.

In addition to AI, quantum computers will be especially suited to tasks of factorization, optimization, and simulation. Factorization is especially relevant for cryptography and makes the cybersecurity implications of future large-scale quantum computers a major concern for states. In 2015, the U.S. National Security Agency (NSA) announced that it would transition its encryption systems to make them “quantum resistant.” While significant advances in quantum computing are still required to break current encryption methods, a fully functioning quantum computer could allow a country or a non-state actor to break any public encryption key secured with current technology. That includes RSA, the cryptosystem currently used to secure online payments. According to some estimates, it would take a classical computer 300 trillion years to crack a 2,048-bit RSA encryption key, while a quantum computer with 4,000 stable qubits could in theory do the same in just ten seconds. Conversely, quantum technology can also be used to secure communications.
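
Taking the essay’s own figures at face value, the implied speedup can be computed directly (a simple arithmetic check, not an independent estimate):

```python
# Taking the essay's own figures at face value -- 300 trillion years classically
# versus ten seconds on a hypothetical 4,000-qubit machine -- the implied
# speedup is simple arithmetic.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

classical_seconds = 300e12 * SECONDS_PER_YEAR   # 300 trillion years, in seconds
quantum_seconds = 10

print(f"Implied speedup: ~{classical_seconds / quantum_seconds:.1e}x")  # ~9.5e+20
```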

Complex simulation will arguably form a key part of the uses of quantum computers. The simulation of molecules requires a lot of computing power, as the bonds and interactions among atoms behave probabilistically, which exhausts classical computing logic. One quantum scientist was recently quoted as saying, in this context, that quantum computing is about “simulating nature, using the laws of nature.” Simulation at the molecular scale could have applications in medicine (e.g., for creating targeted medicines), in energy (e.g., more efficient batteries), in sustainable agriculture (e.g., fertilizers), or even for developing processes to capture CO2 present in the atmosphere.

Finally, quantum computers would be very useful for the optimization tasks required for autonomous vehicles. With a fully autonomous fleet, it should theoretically be possible to optimize the individual journey of each vehicle according to its place of departure and destination. Conventional algorithms can cope with a limited number of vehicles, but beyond a few hundred vehicles and journeys, traditional calculation capacities would be largely saturated. Optimization is also key for actors in the energy sector, for developing electrical networks and managing electricity consumption as electric vehicles multiply.

Considering the enormous scientific and technological uncertainties that remain, the full business implications of quantum technologies will unfold and settle over time. According to a report by Boston Consulting Group, we should expect quantum computing to mature over three generations spanning the next 25 years. It is likely that ad hoc civilian uses of quantum machines will develop over the next 10 to 15 years, and that a quantum computer capable of breaking current encryption methods will see the light of day by 2040.

In any case, quantum computers will not replace conventional computers. They will likely be complex and fragile machines with much narrower functions than universal classical computers, and they will thus be rare: at least at first, only a few machines will exist and be accessible via the cloud. Given the complexity of the field, delegating quantum technologies to cloud providers would spare companies from having to develop extremely advanced skills that are difficult to acquire. Quantum computers will thus be not a replacement for but a complement to current HPC tools.

Despite these limitations, quantum computing will have significant business and financial implications. The Boston Consulting Group sees a potential addressable quantum computing market of £4 billion (€4.7 billion) by 2024. In a slightly less optimistic scenario, the Quantum Economic Development Consortium and Hyperion Research foresee a 27 percent yearly growth from 2020 onward, with a global market worth $830 million (€701 million) by 2024.

Europe in the Quantum Race

The 2010s saw a clear acceleration of global competition around quantum information processing technologies. Illustratively, while in 2014, the UK Ministry of Defense judged the field of quantum information processing as “too immature” for near-term defense and security application, the UK Defense Science and Technology Laboratory (DSTL) reported in June 2020 that “the progress achieved both nationally and globally has exceeded early expectations,” so that today “many regard the rush to develop quantum computing as a new ‘space race.’” 

Quantum computing has become a race not least because of the risks of lagging behind. A first risk is cybersecurity, as explained above, since quantum computers will be able to break current encryption protocols in seconds. Another risk concerns access to technologies. Quantum cryptography and quantum computers are indeed making their way onto defense and strategic goods lists, and are thus becoming subject to export restrictions. The enabling technologies needed to make quantum computers work can also be placed under control: certain qubits require extremely cold temperatures obtained thanks to cryostats, a technology whose export to China the United States is considering blocking.

U.S.-China Competition and Quantum Technologies

Currently, the most advanced countries in the field of quantum computing, in terms of both technological progress and government strategy and funding, are the United States and China—each has already claimed quantum supremacy—as well as a few EU member states (e.g., France, Germany, and the Netherlands) plus the UK.

Washington’s concerns about being overtaken by China have grown since Beijing demonstrated its capacity in satellite-based quantum communications in 2017. In 2018, President Donald Trump launched the National Quantum Initiative, with $1.2 billion (€1 billion) in public funding for an initial period of five years, until 2023. Trump set up a National Quantum Coordination Office within the White House; in August 2020, the U.S. launched its national quantum research centers, and an additional $237 million (€200 million) was voted as part of the 2021 budget. The United States Innovation and Competition Act (USICA), approved by the U.S. Senate in early June 2021 (it has not yet been taken up by the U.S. House of Representatives), proposes to allocate $150 billion (€128 billion) between 2022 and 2026 for research, innovation, and education in critical and emerging technologies, including quantum technologies.

Aside from the government, big technology firms are pouring huge amounts of money into their own research in quantum science—although their internal investment figures are not disclosed. The financial power of private investors and the attractiveness of large digital companies like IBM, Google, and Intel have given those companies a head start in quantum research. It was IBM that proposed the first principles of a quantum computer and introduced the first two-qubit computer. By 2016, the company had managed to simulate a molecular structure and reached the theoretical threshold of quantum supremacy with 50 qubits. The following year, Intel unveiled a 49-qubit calculator, and Google a 72-qubit processor. In September 2019, Google claimed to have achieved quantum supremacy with a 53-qubit quantum computer using superconductors, completing in just over three minutes a calculation that Google said would take a conventional supercomputer 10,000 years to solve. IBM downplayed this achievement, asserting that the calculation would take only 2.5 days on the most powerful supercomputers.

Since 2016, IBM has offered an online quantum programming interface, IBM Quantum Experience. This platform offers a quantum programming simulator and gives access to 22 IBM computers. To date, more than 325,000 users have registered, and more than 700 articles have been published based on work carried out on these machines. Aside from IBM, other American companies like Microsoft, Amazon, and Rigetti, as well as Canada’s Xanadu, offer online access to small-scale quantum computing chipsets with capacities of up to 65 qubits. When it comes to on-premises machines, U.S. companies, starting with IBM, have already built and exported quantum computer prototypes. IBM’s strategy is to make its technology available online so as to encourage early adoption of its product. It has exported the first ever commercial quantum computer (albeit still experimental), the 20-qubit Quantum System One, to Germany and Japan to drive quantum R&D there, and it is currently working toward a stable quantum computer capable of handling more than 1,000 qubits by 2023.

Meanwhile, in China, efforts have been ongoing since 2015, after the 2013 Snowden revelations prompted anxiety over the extent of U.S. intelligence capabilities and activities and intensified the government’s focus on quantum communications and computing. Beijing has thus sought to leverage quantum networks to secure China’s most sensitive communications. At around the same time as Obama’s initiative, Beijing listed quantum among China’s major science and technology priorities to be developed by 2030.

There is limited information about total funding on quantum technologies in China. Officially, China spent over $302 million on quantum sciences between 2013 and 2015. In 2017, Beijing announced a $10 billion investment into a new quantum computing research center. While estimates of China’s actual spending on quantum research vary, the country is leading in terms of patent holding in quantum communication and cryptography hardware as well as software.

Robust research has led to rapid progress and even leadership in other quantum technologies (cryptography and communications), as was illustrated when China launched the world’s first quantum communication satellite in 2016. When it comes to quantum computers, China’s efforts have been more recent, but Beijing has been quick to catch up. In December 2020, a group of researchers from the University of Science and Technology of China (USTC) made a credible claim to have achieved quantum supremacy, using a photonic system to complete a calculation in 200 seconds that would have taken a supercomputer 2.5 billion years. That is to say that the calculation was performed 100 trillion times faster than with a classical supercomputer. In June 2021, China again demonstrated quantum advantage, this time with a system based on superconducting circuits.

Europe’s Growing Quantum Ecosystem

There are serious players in quantum technologies in Europe, among them the UK. The head of the Government Communications Headquarters (GCHQ) suggested in April 2021 that the UK must develop “sovereign capabilities” in quantum computing, not least to respond to the cyber threat posed by China. The country is not starting from scratch—far from it: it launched its National Quantum Technologies program as early as 2013. The British government planned to invest £400 million (€467 million) in the first phase (2014-2019) and £350 million (€400 million) in the second. In the past year, the UK government has renewed its commitment to quantum and other information technologies in a series of policy documents and decisions. The March 2021 Integrated Review—the main document guiding the UK’s foreign, security, and defense policy in the post-Brexit context—placed a strong emphasis on technological power and suggested the UK should be a leader in cyber technologies (quantum technologies included) and in new forms of data transmission.

As in the U.S., new research centers and policy positions are being set up in the UK. In June 2021, Boris Johnson announced the creation of a National Science and Technology Council, chaired by the Prime Minister, as well as a new Office for Science and Technology Strategy, based in the Cabinet Office, and a new role of National Technology Adviser. A new National Quantum Computing Centre is being set up and will open in 2023. In September 2020, the UK government signed a £10 million (€11.7 million) agreement with the U.S.-based company Rigetti to build the UK’s first commercially available quantum computer. The UK also has homegrown companies: in July 2021, Oxford Quantum Circuits (OQC), a UK-based start-up, announced that it had launched the nation’s first commercial “quantum computing-as-a-service” offering, built entirely on its proprietary technology. OQC did not disclose how many qubits its machine contains, but in 2017 the company was working on a 9-qubit system.

The German government is also investing in quantum technologies. An investment of €2 billion over five years was announced in June 2020 as part of a major stimulus package, building on an initial government effort of €650 million for the period 2018-2022. To this one must add the contributions of the Länder, such as Bavaria's recent €300 million investment in a "Quantum Valley."

In June 2021, Germany’s then-Chancellor Angela Merkel reflected on the fact that quantum computing can play a key role in the country’s endeavor to “acquire technological and digital sovereignty,” as Germany and the EU find themselves in the context of a “very intense competition.” Merkel indicated a hope in promoting the development and production of quantum technologies in Germany to form a new industrial pillar, both in terms of hardware and software. First steps have already been taken: the construction of at least two quantum computers in Germany have been commissioned. The first machine was unveiled at the Fraunhofer institute for applied research, near Stuttgart, in June 2021. It is an IBM, the Quantum System One computer, the first of its type in Europe which was installed near Stuttgart, in June 2021. This will allow German researchers to work more intensively on future quantum applications. The choice of an IBM machine (and the cloud that comes with it) was justified by the fact that there are, currently, few leading European quantum companies. In the first step of its quantum computing strategy, Germany has been more focused on developing uses of quantum technologies than on supporting national quantum hardware companies. 

In February 2021, Emmanuel Macron unveiled a National Plan for Quantum Technologies that aims to make France the third-largest spender on quantum technologies in the world, behind the U.S. and Germany. The plan had been in preparation since 2018, after Thierry Breton, then the CEO of Atos, called on the French government to develop a quantum strategy; at the time, he was one of the few French industrialists to be vocal on the issue. The plan calls for a total of €1.8 billion in public-private investment (including €1 billion in public funding) between 2021 and 2025, going toward education and training, research, support for start-ups, and support for industrial deployment and innovation.
With the strategy, France’s goal is to master decisive quantum technologies, including quantum accelerators, simulators and computers, business software for quantum computing, sensors, and communication systems. The bulk of the funding will go to quantum computing, with NISQ and LSQ totaling €784 million.

The choice of which qubit technology to favor is based on an analysis of the chances of success of a given technological avenue, the presence of a critical mass of researchers in France, and the presence of an industrial base able to build the technologies. Trapped-ion technologies, developed for example by Honeywell, IonQ, and AQT (Austria), are considered very promising but difficult to develop in France because of the lack of a critical mass of researchers. Given limited funding, the objective is to gradually reduce risk, but for the time being France is treating all technological avenues equally.

The French 2021 Quantum Plan draws lessons from past failures of state planning, that is, large national investments in certain strategic sectors and infrastructure. According to Mathieu Landon, in charge of industry in the French Prime Minister's office, one lesson learned is that such state strategies must build on existing ecosystems, where research and industry are already present, rather than creating an ecosystem from scratch. The French quantum ecosystem is already rich. It builds on research institutions (notably the CNRS, Paris-Saclay University, the CEA, and INRIA) as well as large companies involved in quantum computing (Atos), telecommunications (Orange and Thales), and quantum-relevant enabling technologies such as cryogenics (Air Liquide).

Atos has become involved in quantum simulation and in testing algorithms for future quantum computers. In 2017, it began commercializing the Atos Quantum Learning Machine, a quantum computer simulator capable of processing up to 30 qubits in memory. Atos has delivered simulators to the Oak Ridge National Laboratory, which is part of the U.S. Department of Energy (DOE), to the Argonne National Laboratory in the United States, to the CEA, and to the Hartree Centre, a British research laboratory.
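That 30-qubit figure reflects a general constraint on classical simulation rather than anything specific to Atos's product: a full statevector of n qubits holds 2^n complex amplitudes, so memory demand doubles with each additional qubit. The Python sketch below is a generic back-of-envelope estimate, assuming 16 bytes per double-precision complex amplitude; it is not a description of the Quantum Learning Machine's internals, which may use other representations.

```python
# Generic memory estimate for full statevector simulation of n qubits.
# Assumes 16 bytes per amplitude (double-precision complex); real simulators
# such as the Atos QLM may use compression or other techniques not modeled here.
BYTES_PER_AMPLITUDE = 16

def statevector_bytes(n_qubits: int) -> int:
    """Memory needed to hold all 2**n complex amplitudes of an n-qubit state."""
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

for n in (30, 40, 45, 50):
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits -> {gib:,.0f} GiB")
# 30 qubits -> 16 GiB           (workstation-scale)
# 40 qubits -> 16,384 GiB       (large HPC installation)
# 45 qubits -> 524,288 GiB      (leadership-class supercomputer)
# 50 qubits -> 16,777,216 GiB   (beyond any classical machine today)
```

At 30 qubits the statevector already fills about 16 GiB of memory, and each additional ten qubits multiply that requirement by roughly a thousand, which is why emulation is useful for algorithm development but cannot substitute for physical quantum hardware at scale.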

Aside from large companies, France has seen its quantum start-ups flourish in recent years. Of the world's roughly 260 quantum technology start-ups and SMEs, almost 10 percent are thought to be in France. Pasqal, a quantum hardware company created in 2019, is developing a quantum computer based on atoms manipulated by lasers, intended for high-performance computing centers. It is backed by the Optics Institute of the University of Paris-Saclay. The company has so far built one working quantum machine on its premises, and it has received an order for two more machines, to be delivered in early 2023 to GENCI in France and to the Jülich research center in Germany. The start-up has also entered partnerships with Atos and Crédit Agricole, and it has decided to make its computer available via the cloud.

Alice & Bob is another promising start-up. Created in February 2020 as a spin-off of an ENS-INRIA team, it raised €3 million a few months later from the French funds Elaia Partners and Breega. The start-up grew out of ground-breaking work on the "cat qubit," a self-correcting qubit design. According to a May 2020 item in a leading trade publication, the start-up aims to create an error-free, or "ideal," quantum computer, "which is one of the fundamental scientific problems that has limited development of more powerful quantum computing." It plans to deploy the world's first ideal quantum processor in the cloud by 2026. Amazon is seeking to develop a quantum computer on the basis of this very technology, following scientific publications on self-correcting qubits. While this is testament to the relevance and excellence of the discovery, it places Alice & Bob in competition with a tech giant that has incomparably greater financial room for maneuver and scientific teams ten to twenty times larger than those of the French start-up.

When Science Becomes Strategic Technology

International collaboration is central to scientific research and vital for Europe to reach the scale necessary to compete globally in quantum technologies. Since 2018, the EU too has made quantum technologies a priority, committing €1 billion over ten years to co-finance collaborative research programs. The resulting Quantum Flagship is one of the EU's largest and most ambitious research initiatives and currently the largest international funding framework for quantum technology. It brings together research institutions, industry, and policymakers in a joint and collaborative initiative on an unprecedented scale.

Among the funded programs are a quantum computer (accelerator) based on trapped ions ("AQTION," based at the University of Innsbruck) and a quantum simulation platform ("PASQuanS," carried out at the Max Planck Institute in Munich); Atos leads both projects on the industry side. The EU is also funding projects in other quantum technologies, especially quantum communications. Bilateral collaborations among EU member states are also developing, partly motivated by the goal of securing EU funding, as illustrated most recently by the signing of a memorandum of understanding between France and the Netherlands covering academic cooperation as well as efforts to build synergies between French and Dutch companies and create quantum "unicorns."

The transition of quantum science from the realm of academia to concrete security and industrial applications has been creating new dilemmas for the EU's collaborative projects and its cooperation with non-EU countries. The UK, as explained above, but also Switzerland and Israel have significant quantum research ecosystems and are willing to join the EU's Horizon Europe programs in quantum and space.

The European Commission, and in particular Thierry Breton, has opposed the participation of several non-EU countries (including the three states just mentioned) in EU research programs on quantum computing, saying the goal is to "make independent European capacities in developing and producing quantum computing technologies of strategic importance," with applications in security and dual-use technologies. However, a group of EU countries, led by Germany, pushed to maintain openness to Associated Countries in quantum and space research programs, arguing that the bid for technological sovereignty should not get in the way of scientific collaboration.


The Global Race Is Also One for Capital

While it has yet to reach the scale of other emerging sectors like artificial intelligence, the ecosystem of quantum technology companies, especially start-ups, continues to grow around the world. Some estimate that there are over 260 quantum technology start-ups and SMEs globally. Many are still at the stage of applied, or sometimes even fundamental, research, and quantum computing remains an uncertain technological sector.

Investment is at the heart of the matter. As suggested above, funding is key to allowing researchers to conduct their experiments, but also to scaling up and commercializing quantum systems. Moreover, just as future import restrictions on quantum technologies are feared, so too are foreign takeovers of successful companies. Private sector investment is needed if France and the rest of the EU are to retain their talent and prevent individual researchers and promising start-ups from going overseas. Globally, the private sector's involvement in the funding of quantum start-ups has boomed: quantum computing companies landed $779.3 million (€662 million) across 77 deals in 2020, a surge from $288.3 million (€194 million) across 69 deals in 2019. Several quantum technology start-ups are now valued at several hundred million euros, and at least two are publicly traded.

The current investment boom in quantum start-ups is so far playing into the hands of American venture capital funds and large digital companies. The Canadian firm Xanadu, for instance, raised $100 million (€85 million) largely from U.S. investors, including the CIA's investment arm, In-Q-Tel. PsiQuantum, a leading British start-up established in 2016, has since settled in California; it promises to build a large-scale, general-purpose quantum computer with one million qubits by 2025. The move to Silicon Valley was partly motivated by the need to raise capital. PsiQuantum has so far raised a total of $665 million (€565 million), including, in late July 2021, $450 million (€382 million) from mostly U.S. investors such as BlackRock and Microsoft's venture fund, M12. The story recalls that of DeepMind, the British AI company that Google acquired for £400 million (€628 million) in 2014. The UK government has taken action on the issue: in July 2021, it set up a new fund for R&D-intensive firms, including quantum companies. To be eligible, businesses must have secured funding commitments from private venture capital investors.

All countries that enter the quantum race with limited private investment face the same risks as UK companies. This is especially true in the EU, where venture capital is scarce. Quantonation, a Paris-based investment fund set up in late 2018, is the first in the world to specialize in quantum technologies; it funded Pasqal in 2019. Quantonation supports quantum companies in their early stages, but at later stages other investment funds must take over and invest hundreds of millions to help seeded companies grow. This is where the risk lies for EU companies seeking to commercialize their products: the challenge is to ensure not only that they are not taken over by foreign capital, but also that they can grow within the European Union.

Truly Strategic Opportunities

The democratization of high-performance computing and new levels of conventional computing power, together with the emergence of disruptive quantum information technologies, are changing the calculations of governments, researchers, and private companies alike. Private companies are finding new ways to exploit the potential of data analysis, governments are developing strategies to gain relative technological power and secure their digital systems, and scientists can hope to make new discoveries in medicine and in the fight against climate change. Technological progress also promises to significantly reduce the energy consumption of computers, a growing concern as uses continue to expand.

The global distribution of computing power is changing. While the U.S. has long dominated conventional computing, not least through the defining role played by IBM, China's Lenovo has now become the leading HPC vendor worldwide in terms of market share. Today, the U.S. and China are also neck and neck in the race for quantum computing, with massive investments and impressive technological achievements. These dynamics raise the risk, for Europe, of developing hardware-dependent tools and technology dependencies.

But the ongoing quantum revolution is nurturing a wealth of actors, from research laboratories to start-ups and investment funds, which could further redistribute computing power across the globe. Technologies with a lower level of readiness offer the EU and its member states a chance to position themselves early in this emerging sector and develop capacities along the quantum computing value chain, in hardware and software alike. European governments, including France and Germany, as well as EU institutions, have made significant efforts in this direction. Together with private investors, they will need to remain committed throughout the life cycle of this emerging and highly disruptive technology.
