Quantum Computing
In July 2016, the National Science and Technology Council of the Executive Office of the President, in a report titled "Advancing Quantum Information Science: National Challenges and Opportunities," described Quantum Information Science (QIS) as a foundational science whose envisioned applications include sensing and metrology, communications, simulation, and high-performance computing. The report specifically noted that quantum communication, the ability to transmit information encoded in quantum states of light or matter, is currently an active area of development, and that in the longer term, quantum networks will connect distributed quantum sensors to allow long-distance transmission of quantum information.
Quantum information science combines two of the great scientific and technological revolutions of the 20th century: quantum mechanics on the one hand, and computer-based information science on the other. One of the fundamentally important research areas in quantum information science is quantum communications, which deals with the exchange of information encoded in quantum states of matter, known as quantum bits or qubits, between both nearby and distant quantum systems.
Quantum computing is based on quantum bits or qubits. Unlike traditional computers, in which bits must have a value of either zero or one, a qubit can represent a zero, a one, or both values simultaneously. Representing information in qubits allows the information to be processed in ways that have no equivalent in classical computing, taking advantage of phenomena such as quantum tunneling and quantum entanglement. As such, quantum computers may theoretically be able to solve certain problems in a few days that would take millions of years on a classical computer.
Qubits are the quantum analogue to the classical computer bits “0” and “1.” Engineering materials that can function as qubits is technically challenging. Using supercomputers, scientists from the University of Chicago and Argonne National Laboratory predicted possible new qubits built out of strained aluminum nitride. Moreover, the scientists showed that certain newly developed qubits in silicon carbide have unusually long lifetimes.
Quantum computers could break common cryptography techniques, search huge datasets, and simulate quantum systems in a fraction of the time it would take today’s computers. However, engineers first need to harness the properties of quantum bits. Engineering new qubits with less difficult methods could lower one of the significant barriers to scaling quantum computers from small prototypes into larger-scale technologies.
One of the leading methods for creating qubits involves exploiting specific structural atomic defects in diamonds. Using diamonds is both technically challenging and expensive. Now researchers from the University of Chicago and Argonne National Laboratory have suggested an analogous defect in aluminum nitride, which could reduce the difficulty and ultimate cost of manufacturing materials for quantum computing applications. Using the Edison and Mira supercomputers at DOE's National Energy Research Scientific Computing Center and Argonne National Laboratory, respectively, the researchers found that by applying strain to aluminum nitride, they can create structural defects in the material that may be harnessed as qubits similar to those seen in diamonds. They performed their calculations using different levels of theory and the Quantum Espresso and WEST codes, the latter developed at the University of Chicago. The codes allowed them to accurately predict the position of the defect levels in the band gap of semiconductors. The researchers also collaborated closely with experimentalists to understand and improve the performance of qubits in industrial materials. Recently, they showed that newly developed qubits in silicon carbide have much longer coherence times than those of the more well-established defect qubits in diamond. Their results pointed to industrially important polyatomic crystals as promising hosts for coherent qubits for scalable quantum devices.
Peter Shor’s 1994 breakthrough discovery of a polynomial time quantum algorithm for integer factorization sparked great interest in discovering additional quantum algorithms and developing hardware on which to run them. The subsequent research efforts yielded quantum algorithms offering speedups for widely varying problems, and several promising hardware platforms for quantum computation. These platforms include analog systems (usually cold atoms) used for simulating quantum lattice models from condensed-matter and high-energy physics, quantum annealers for combinatorial optimization, boson samplers, and small-scale noisy prototypes of digital gate-model quantum computers.
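The quantum part of Shor's algorithm finds the period (order) of a number modulo N; recovering the factors from that period is purely classical. The sketch below shows only that classical reduction, with the quantum subroutine replaced by a brute-force stand-in; the example numbers and function names are illustrative.

```python
# Illustrative sketch: the classical half of Shor's algorithm. The quantum
# computer's job is the period-finding step (find_order below), done here by
# brute force only to show how a factor is recovered from the period.
from math import gcd

def find_order(a, N):
    """Smallest r > 0 with a**r % N == 1 (stand-in for the quantum subroutine)."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical_postprocessing(a, N):
    r = find_order(a, N)
    if r % 2 == 1:
        return None                      # odd period: pick a different 'a'
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None                      # trivial case: pick a different 'a'
    return gcd(y - 1, N), gcd(y + 1, N)  # non-trivial factors of N

print(shor_classical_postprocessing(7, 15))   # (3, 5)
```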
In the longer term, the emergence of scalable, fault-tolerant, digital quantum computers offers a new direction for progress in high performance computing as conventional technologies reach their fundamental limitations. Quantum speedups have been discovered for a number of areas of DOE interest, including simulations for chemistry, nuclear and particle physics, and materials science, as well as data analysis and machine learning. In addition, quantum speedups have been discovered for basic primitives of applied mathematics such as linear algebra, integration, optimization, and graph theory. These demonstrate the potential of quantum computers to yield better-scaling methods (in some cases exponentially better) for performing a wide variety of scientific computing tasks. Practical realization of this potential will depend not only on advances in quantum computing hardware but also on advances in languages and optimizing compilers to translate these abstract algorithms into concrete sequences of realizable quantum gates, and in simulators to test and verify these sequences. The development of such software has recently seen rapid progress, which can be expected to continue given sufficient support.
Imagine typing a very complex query into your computer and having to wait more than a lifetime for results. Thanks to scientists like Davide Venturelli, supercomputers of the future could return those results in a fraction of a second. Davide is a quantum computer research scientist for the Universities Space Research Association. Quantum theory explains how matter acts at the tiniest levels; in applying it to computing, researchers study ways in which that behavior can advance processing power.
To test whether their theories work, quantum computer research scientists may conduct experiments or work with experimental physicists. For example, they may create a quantum environment with computer hardware, then test how particles in that environment react to different levels of laser intensity. Experiments that verify a theory may lead to improvements, such as more efficient computer design and faster, more secure communication for computer networks. But relying on theory means that scientists work with incomplete information—so they’re sometimes surprised at the outcomes. The hope is that quantum computing will vastly improve a wide range of tasks that can lead to new discoveries and technologies, and which may significantly change the way we solve real-world problems.
Quantum physics drives much of the research at the National Institute of Standards and Technology (NIST). Explaining this research is a challenge, because quantum physics—nature's rules for the smallest particles of matter and light—inspires words like weird, curious, and counter-intuitive. The quantum world is strange and invisible in the context of everyday life. And yet, quantum physics can be explained and at least partially demonstrated visually.
By its very nature, quantum science sets fundamental limits on precision measurements, so by necessity NIST is a leader in basic and applied research in quantum science. Some of the most fundamental quantum research in the world is carried out in partnerships between NIST and top universities, such as JILA, the Joint Quantum Institute (JQI), and the Joint Center for Quantum Information and Computer Science (QuICS). Scientists in these institutes leverage the combined resources of the partners to advance research in the control of atoms and molecules and the development of ultra-fast lasers capable of manipulating states of matter. The discoveries made in these institutes continue to be applied at NIST to meet new measurement challenges, such as the development of the world's best atomic clocks and lasers.
An emerging research focus at NIST is understanding the potential for quantum-based technology to transform security, computing and communications, and to develop the measurement and standards infrastructure necessary to exploit this potential. Breakthroughs at NIST enabled the first forays into real-world quantum computing and tested the limits of quantum information and security. NIST is also developing the technology to harness the power of quantum computing in the everyday world through nanotechnology.
If an exotic quantum computer is invented that could break the codes we depend on to protect confidential electronic information, what will we do to maintain our security and privacy? That's the overarching question posed by a report from the National Institute of Standards and Technology (NIST), whose cryptography specialists are beginning the long journey toward effective answers.
NIST Internal Report (NISTIR) 8105: Report on Post-Quantum Cryptography details the status of research into quantum computers, which would exploit the often counterintuitive world of quantum physics to solve problems that are intractable for conventional computers. If such devices are ever built, they will be able to defeat many of our modern cryptographic systems, such as the computer algorithms used to protect online bank transactions. NISTIR 8105 outlines a long-term approach for avoiding this vulnerability before it arises.
The report shares NIST's current understanding of the status of quantum-resistant cryptography and details what the agency is doing to mitigate risk in the future. One overall recommendation for the near term is that organizations focus on crypto agility, the ability to rapidly switch out whatever algorithms they are using for newer, safer ones.
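As one illustration of what crypto agility can look like in practice, the hedged sketch below keeps the choice of primitive in a configuration-driven registry so it can be swapped without touching callers. The hash functions from Python's hashlib are only stand-ins for whatever signature or key-exchange schemes an organization actually deploys, and the registry names are hypothetical.

```python
# A minimal sketch of "crypto agility": the application names the algorithm in
# configuration rather than hard-coding it, so it can be swapped (e.g., for a
# future quantum-resistant scheme) without rewriting callers. hashlib digests
# stand in for real signature or key-exchange primitives here.
import hashlib

ALGORITHM_REGISTRY = {
    "sha256": lambda data: hashlib.sha256(data).hexdigest(),
    "sha3_512": lambda data: hashlib.sha3_512(data).hexdigest(),
    # A post-quantum primitive would be registered here once standardized,
    # without touching any calling code.
}

def protect(data: bytes, algorithm: str = "sha256") -> str:
    """Callers depend on the registry entry named in configuration, not on a
    specific primitive, so switching algorithms is a one-line config change."""
    return ALGORITHM_REGISTRY[algorithm](data)

print(protect(b"online bank transaction", algorithm="sha3_512")[:16])
```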
Creating those newer, safer algorithms is the longer-term goal, and a key part of this effort will be an open collaboration with the public, which will be invited to devise and vet cryptographic methods that, to the best of experts' knowledge, will be resistant to quantum attack. Many current algorithms rely on the difficulty that conventional computers have with factoring very large numbers, a difficulty that a quantum computer can overcome. Defenses that rely on different mathematical approaches might stymie a quantum computer, and there is worldwide research interest in developing them. Historically, moving from the decision that a cryptographic system is sound to its dissemination as a standard in products on the market has taken a long time, often 10 to 20 years, and companies need time to respond to all the changes.

Physicists at the National Institute of Standards and Technology (NIST) have added to their collection of ingredients for future quantum computers by performing logic operations, basic computing steps, with two atoms of different elements. This hybrid design could be an advantage in large computers and networks based on quantum physics.
NIST's new mixed-atom gates could also help make better simulators to model quantum systems and could enable faster and simpler measurements in applications such as NIST's experimental quantum logic clock. The mixed-atom gates rely on NIST's technique for entangling ions demonstrated more than a decade ago. Multiple carefully tuned laser beams apply an oscillating force to a pair of ions. If the ions are in different internal states, they feel different laser forces that alter the ions' external motions. This coupling of internal states with external motions has the effect of entangling the ions.
The research was supported by the Office of the Director of National Intelligence, Intelligence Advanced Research Projects Activity, and the Office of Naval Research.
As this nation’s national metrology institute, NIST’s overall mission is to promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life. NIST does this through programs focused on national priorities from cybersecurity, advanced manufacturing and the digital economy to precision metrology, biosciences, and more.
NIST conducts basic and applied research in quantum science to advance the field of fundamental metrology as part of its core mission, by developing more precise measurement tools and technologies to address industry’s increasingly challenging requirements. This work has positioned NIST both as a global leader among national metrology institutes, and as one of the world’s leading centers of quantum research and engineering. While NIST’s work in quantum science is revolutionizing the world of metrology, it also has direct application to quantum communications and quantum computation. Today, I’ll describe in more detail some of NIST’s quantum research efforts and how they are being leveraged to positively advance the field.
NIST scientists began researching quantum information in the early 1990s in their quest to make better atomic clocks. Qubits and atomic clocks may seem worlds apart, but experimentally they are very much the same thing. By 2000, NIST had established a formal quantum information program.
Atomic clocks define the second and tell time with amazing precision. For example, the most accurate U.S. atomic clock currently used for defining the second is the NIST-F2. It keeps time to an accuracy of less than a millionth of a billionth of a second; stated another way, the NIST-F2 clock will not lose a second in at least 300 million years. And just this month, NIST published a description of a radically new atomic clock design, the three-dimensional (3-D) quantum gas atomic clock. With an error of just 3.5 parts in 10 quintillion (1 followed by 19 zeros) measured in about 2 hours, it is the first atomic clock ever to reach the 10 quintillion threshold, and it promises to usher in an era of dramatically improved measurements and technologies across many areas based on controlled quantum systems.
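The quick arithmetic below checks roughly how such fractional errors translate into years per second of drift. The NIST-F2 uncertainty is taken here as roughly 1 part in 10^16, an assumed round number chosen to be consistent with the 300-million-year figure above.

```python
# Back-of-the-envelope check of the timekeeping claims above. A clock with
# fractional frequency error f drifts by about one second after 1/f seconds.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def years_to_lose_one_second(fractional_error):
    return (1.0 / fractional_error) / SECONDS_PER_YEAR

print(f"NIST-F2 (assumed ~1e-16):  {years_to_lose_one_second(1e-16):.2e} years")
print(f"3-D quantum gas (3.5e-19): {years_to_lose_one_second(3.5e-19):.2e} years")
```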
These breakthroughs in precision timekeeping have critical real-world applications to navigation and timing. Today, commercial atomic clocks contained in GPS satellites provide the timekeeping precision that we take for granted when we use our GPS devices to pinpoint our location to within a meter almost anywhere on earth.
NIST’s most advanced atomic clocks, so precise that they will not lose a second over the life of the universe, also are being applied to make the world’s most sensitive measurements of quantities other than time. For example, NIST is actively pursuing the use of atomic clocks as quantum sensors, another application of quantum information, for a range of entirely new technologies. NIST is now able to detect the barely perceptible slowing of time in a large gravitational potential. This is the second form of time dilation predicted by Einstein in his general theory of relativity and may help scientists detect gravitational waves or prospectors find hidden oil reserves and mineral deposits. The technology might even have the potential to allow scientists to predict earthquakes days or even weeks before a cataclysmic event.
NIST’s breakthroughs in the measurement of time also have laid the technological foundations for how to manipulate quantum information. NIST’s pioneering work in the cooling and trapping of ions and atoms to improve timekeeping provided NIST researchers with the experimental platform to demonstrate the first two-qubit quantum logic gate in 1995, by controlling and entangling the energy levels of two ions. Logic gates in classical computers are used to process information. By analogy, quantum logic gates form the basic building block for quantum computing. Scaling up to experiments involving multiple logic gates provides a platform to test more complex quantum computing theory.
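To show what a two-qubit logic gate does at the mathematical level (the abstract gate model, not a model of NIST's trapped-ion hardware), the short sketch below applies a Hadamard gate followed by a controlled-NOT and ends with two entangled qubits.

```python
# A minimal mathematical sketch of a two-qubit logic operation: a Hadamard
# followed by a CNOT turns two independent qubits into an entangled Bell pair.
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start both qubits in |0>; the joint state is the tensor (Kronecker) product.
state = np.kron(np.array([1, 0], dtype=complex), np.array([1, 0], dtype=complex))

state = np.kron(H, I) @ state    # put the first qubit into superposition
state = CNOT @ state             # entangle the two qubits

print(np.round(state, 3))        # amplitudes ~0.707 on |00> and |11>: the Bell state (|00>+|11>)/sqrt(2)
```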
Atomic clocks are just one example of NIST's research focused on measurement science that has applications to quantum computing. NIST also is the world's leader in specially designed superconducting devices known as Josephson junctions. Josephson junction technology is used by NIST to realize and disseminate NIST's quantum voltage standard. The quantum voltage standard is also integral to the proposed 2019 effort to redefine the international system of units (colloquially, the metric system) to be based on fundamental constants of nature, as defined through world-leading experiments at NIST such as the electronic kilogram. This same technology is being explored as a key competitor to trapped ions and atoms as another way to manipulate and store quantum information.
Additionally, superconductors are used by researchers at NIST to make ultra-sensitive single-photon detectors used in precision photonic measurements at NIST and by external stakeholders. These specially designed sensors have become essential components in experiments at NIST to test the foundations of quantum mechanics and realize quantum teleportation. In quantum teleportation, the quantum state of one qubit is transferred to another, distant qubit by using shared entanglement together with classical communication, without the state physically traversing the intervening space. Discrete photons, like ions and atoms, can also be carefully controlled and entangled to form qubits. Prior to China's recent 1,200-kilometer demonstration, NIST had held the distance record for quantum teleportation, transmitting information between photons separated by 100 kilometers. Progress in quantum teleportation is expected to be essential for eventual commercial quantum computing, and for other forms of quantum information transfer.
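The state-vector sketch below walks through the textbook teleportation protocol with NumPy (an abstract illustration, not a model of NIST's photonic experiments): qubit 0 carries the message state, qubits 1 and 2 form the shared entangled pair, and after the sender's measurements and the receiver's corrections, qubit 2 ends up in the message state in every measurement branch.

```python
# A minimal NumPy simulation of the standard three-qubit teleportation protocol.
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]], dtype=complex)

def on(qubit, U):
    """Embed a 1-qubit gate on `qubit`, or a 2-qubit gate on (qubit, qubit+1), into 3 qubits."""
    if U.shape == (2, 2):
        ops = [I2, I2, I2]
        ops[qubit] = U
    else:
        ops = [I2] * qubit + [U] + [I2] * (1 - qubit)
    full = ops[0]
    for op in ops[1:]:
        full = np.kron(full, op)
    return full

psi = np.array([0.6, 0.8j], dtype=complex)          # arbitrary message state
state = np.kron(psi, np.kron([1, 0], [1, 0]))       # |psi> |0> |0>

state = on(1, H) @ state                            # Bell pair on qubits 1 and 2
state = on(1, CNOT) @ state
state = on(0, CNOT) @ state                         # sender's Bell-basis measurement circuit
state = on(0, H) @ state

# Check every possible measurement outcome (m0, m1) of qubits 0 and 1: after
# the corresponding X/Z corrections, qubit 2 always carries the message state.
for m0 in (0, 1):
    for m1 in (0, 1):
        branch = state.reshape(2, 2, 2)[m0, m1, :]   # post-measurement state of qubit 2
        branch = branch / np.linalg.norm(branch)
        if m1:
            branch = X @ branch
        if m0:
            branch = Z @ branch
        print(m0, m1, np.abs(np.vdot(psi, branch)))  # fidelity ~ 1.0 in every branch
```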
In the end, building a quantum computer will involve many disparate quantum technologies. Those technologies will need to be integrated to provide long-term storage and memory, transmission or teleportation, transduction, and detection of qubits while not corrupting the qubit’s extremely delicate state.
NIST supports joint centers with the University of Colorado Boulder (CU) and the University of Maryland (UMD). JILA at CU was founded in 1962, has long conducted research in quantum science and atomic clocks, and is evolving into quantum information science. Two joint centers in quantum information science at UMD were established more recently. The Joint Quantum Institute (JQI) was established in 2006 through a cooperative effort between NIST, UMD, and the Laboratory for Physical Sciences. The Joint Center for Quantum Information and Computer Science (QuICS) was established in 2014 to complement JQI's experimental and theoretical work by focusing on the use of quantum systems to process, transmit, and store quantum information. Taken together, NIST's joint institutes interact strongly to push the frontiers of quantum science, information, and computing and provide a training ground for industry's future quantum workforce.
NIST recognizes that it has an essential role to play in U.S. leadership in quantum computing and information. However, that role is not to build a quantum computer. NIST’s role, consistent with its mission, is to develop the foundational knowledge and measurement science support for U.S. leadership in quantum computing, to create the basis for characterizing quantum logic gates, to explore approaches to quantum control and error correction, to develop rudimentary quantum processors that are capable of creating the exotic quantum states that will allow improvement of our measurements beyond the standard quantum limit, and to ensure that our cybersecurity infrastructure remains resilient in the quantum era. Part of this foundational knowledge will come from using NIST’s measurement platforms to experimentally conduct quantum simulations and validate quantum computing theory. NIST also anticipates that the early adoption of the quantum technologies that emerge as NIST continues to develop the world’s most precise atomic clocks (quantum logic clocks) and quantum based sensors will ultimately provide substantial support to the effort to build a quantum computer.
NASA’s QuAIL team aims to demonstrate that quantum computing and quantum algorithms may someday dramatically improve the agency’s ability to solve difficult optimization problems for missions in aeronautics, Earth and space sciences, and space exploration.
Beginning with the D-Wave Two™ quantum computer, NASA’s QuAIL team is evaluating various quantum computing approaches to help address NASA challenges. Initial work focuses on theoretical and empirical analysis of quantum annealing approaches to difficult optimization problems.
The research team is also studying how noise, imprecision in the quantum annealing parameters, and thermal processes affect the efficacy and robustness of quantum annealing approaches to these problems. Over the next five years, the team will also develop quantum AI algorithms, problem decomposition and hardware embedding techniques, and quantum-classical hybrid algorithms.
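To make concrete the kind of optimization problem handed to a quantum annealer, the toy sketch below writes a small number-partitioning instance in Ising form and minimizes it by classical brute force. The instance data are made up, and a D-Wave machine would explore this same energy landscape by annealing rather than by enumeration.

```python
# Illustrative sketch of an annealing-style optimization problem: partition a
# list of numbers into two equal-sum groups, written as an Ising objective over
# spins in {-1, +1}. Brute force is used here only to make the objective concrete.
from itertools import product

numbers = [4, 7, 9, 1, 3, 5, 12, 9]   # toy instance (made-up data)

def energy(spins):
    """Ising objective: (signed sum of the numbers)^2 is 0 for a perfect split."""
    return sum(s * n for s, n in zip(spins, numbers)) ** 2

best = min(product((-1, +1), repeat=len(numbers)), key=energy)
group_a = [n for s, n in zip(best, numbers) if s == +1]
group_b = [n for s, n in zip(best, numbers) if s == -1]
print(group_a, group_b, energy(best))
```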
In support of NASA's Quantum Artificial Intelligence Laboratory (QuAIL), the NAS facility hosts a 1,097-qubit D-Wave 2X™ quantum computer. The QuAIL project is a collaborative effort among NASA, Google, and Universities Space Research Association (USRA) to explore the potential for quantum computers to tackle optimization problems that are difficult or impossible for traditional supercomputers to handle.
As part of its mission to address some of the most difficult challenges in the Intelligence Community by investing in high-risk, high-payoff research, IARPA sponsors several applied research programs that explore the potential and possibilities in quantum computing. Current and previous quantum computing programs include:
Quantum computing holds great promise for solving important classically intractable computational problems. Ongoing work in theoretical and experimental physics continues to make advances in a number of technologies that might one day underlie a quantum information processor. Relatively little investment has been made in exploring the computer science side of quantum information science (QIS), even though the challenges that quantum computing poses to the world of computer science are on a par with the challenges posed to the world of physics.
The Intelligence Advanced Research Projects Activity (IARPA) Quantum Computer Science (QCS) Program explores questions relating to the computational resources required to run quantum algorithms on realistic quantum computers.
Any implementation of a quantum algorithm requires not only programming the algorithm at a logical level but also the incorporation of error correction and control schemes at the physical level, and resource estimation must account for all of these factors. The QCS program is developing a tool chain to study these issues throughout the computing process.
The tools will include an integrated development environment for the quantum programming languages already developed by the program, compilers to generate logical circuits, and tools for analyzing quantum error correction and control protocols. Through its research QCS will build a foundation for measuring and reducing the resources required to program and implement complex quantum algorithms of realistic size.
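As a rough flavor of the resource-estimation questions such a tool chain addresses, the hedged sketch below estimates, under a common surface-code back-of-the-envelope model, the code distance and physical-qubit count needed to reach a target logical error rate. The threshold, prefactor, and qubit-count formula are textbook-style approximations assumed here for illustration, not the program's actual models.

```python
# A hedged back-of-the-envelope resource estimate: how many physical qubits a
# surface-code logical qubit might need at a target logical error rate. The
# constants (threshold ~1e-2, prefactor 0.1, ~2*d*d physical qubits per logical
# qubit) are rough illustrative approximations.
def surface_code_estimate(physical_error, target_logical_error,
                          threshold=1e-2, prefactor=0.1):
    d = 3
    while prefactor * (physical_error / threshold) ** ((d + 1) / 2) > target_logical_error:
        d += 2                      # code distance grows in odd steps
    return d, 2 * d * d             # (distance, approx. physical qubits per logical qubit)

for p in (1e-3, 1e-4):
    d, n = surface_code_estimate(p, target_logical_error=1e-12)
    print(f"physical error {p:.0e}: distance {d}, ~{n} physical qubits per logical qubit")
```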
Quantum computers are in theory capable of simulating the interactions of molecules at a level of detail far beyond the capabilities of even the largest supercomputers today. Such simulations could revolutionize chemistry, biology and materials science, but the development of quantum computers has been limited by the ability to increase the number of quantum bits, or qubits, that encode, store and access large amounts of data.
In a paper published in the Journal of Applied Physics, a team of researchers at the Georgia Tech Research Institute (GTRI) and Honeywell International has demonstrated a new device that allows more electrodes to be placed on a chip – an important step that could help increase qubit densities and bring us one step closer to a quantum computer that can simulate molecules or perform other algorithms of interest.
The goal of the CSQ program is to demonstrate a reproducible, ten-fold increase in coherence times in superconducting qubits. To achieve this goal, researchers are focused on developing 1) fundamental understanding of defects that currently limit coherence times (T1 and T2) and readout fidelity; 2) means to characterize, measure and definitively discriminate between separate defect mechanisms contributing to loss and dephasing; and 3) novel designs, materials and fabrication methods to eliminate these defects.
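The rough arithmetic below illustrates why the CSQ program's ten-fold coherence goal matters: in the small-error limit, the decoherence-limited error of a gate of duration t_gate scales roughly as t_gate(1/T1 + 1/T2), so ten-fold longer coherence times permit roughly ten-fold deeper circuits at the same error budget. The gate duration and coherence times used are assumed example values, not program data.

```python
# Rough illustrative arithmetic (assumed example numbers): the effect of a
# ten-fold increase in T1/T2 on decoherence-limited gate error and circuit depth.
def decoherence_limited_error(t_gate, T1, T2):
    # Small-error approximation: relaxation plus dephasing contributions.
    return t_gate * (1.0 / T1 + 1.0 / T2)

t_gate = 40e-9                       # 40 ns two-qubit gate (assumed)
for T1, T2, label in [(50e-6, 60e-6, "today (assumed)"), (500e-6, 600e-6, "10x CSQ goal")]:
    err = decoherence_limited_error(t_gate, T1, T2)
    print(f"{label}: error per gate ~ {err:.1e}, ~{int(1 / err)} gates per error budget")
```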
The LogiQ Program seeks to overcome the limitations of current multi-qubit systems by building a logical qubit from a number of imperfect physical qubits. LogiQ envisions that program success will require a multi-disciplinary approach that increases the fidelity of quantum gates, state preparation, and qubit readout; improves classical control; implements active quantum feedback; has the ability to reset and reuse qubits; and performs further system improvements.
Additionally, LogiQ seeks a modular architecture design of two coupled logical qubits that creates a flexible and feasible path to larger systems. Modular designs facilitate the incorporation of next-generation advances with minimal constraints, while maintaining or improving performance.
The Multi-Qubit Coherent Operations Program aims to resolve the technical challenges involved in fabricating and operating multiple qubits in close proximity. The main themes of the program include qubit fabrication and yield; cross talk within the multi-qubit system; incorporation of the controls necessary to operate multiple qubits; coupling qubits to generate a universal gate set for quantum operations; and minimizing the overall system footprint. The program encompasses different technologies, including atomic and solid-state qubits. The end goal of the program is to execute quantum algorithms using multiple qubits and to evaluate the performance using a metric that can scale to higher qubit numbers.
QEO seeks to harness the quantum effects required to enhance quantum annealing solutions to hard combinatorial optimization problems. The physics underlying quantum enhancement will be corroborated by the design and demonstration of research-scale annealing test beds composed of novel superconducting qubits, architectures, and operating procedures. All work will serve to demonstrate a plausible path to enhancement and a basis for the design of application-scale quantum annealers.
In 2015, IARPA stated: Quantum computing becomes viable when a quantum state can be protected from environment-induced error. If quantum bits (qubits) are sufficiently reliable, errors are sparse and quantum error correction (QEC) is capable of identifying and correcting them. Adding more qubits improves the preservation of states by guaranteeing that increasingly larger clusters of errors will not cause logical failure—a key requirement for large-scale systems. Using QEC to extend the qubit lifetime remains one of the outstanding experimental challenges in quantum computing. Relative to a single physical qubit, the failure rate in retrieving an input state is reduced by a factor of 2.7 when using five of the nine qubits and by a factor of 8.5 when using all nine qubits after eight cycles. The successful suppression of environment-induced errors will motivate further research into the many challenges associated with building a large-scale superconducting quantum computer.
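The toy Monte Carlo sketch below illustrates the principle behind those numbers with a simple bit-flip repetition code and majority-vote decoding: once the physical error rate is low enough, spreading one logical bit over more qubits makes logical failure rapidly less likely. It is an illustration of the idea only, not a simulation of the nine-qubit superconducting experiment, and the error rate used is an assumed example value.

```python
# Toy illustration of error suppression by a bit-flip repetition code with
# majority-vote decoding: more qubits make logical failure much less likely
# when the physical error rate is below threshold.
import random

def logical_failure_rate(n_qubits, physical_error, trials=200_000):
    failures = 0
    for _ in range(trials):
        flips = sum(random.random() < physical_error for _ in range(n_qubits))
        if flips > n_qubits // 2:          # majority vote decodes incorrectly
            failures += 1
    return failures / trials

p = 0.05                                   # assumed physical bit-flip probability
for n in (1, 3, 5, 9):
    print(f"{n} qubits: logical failure rate ~ {logical_failure_rate(n, p):.2e}")
```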
On February 27, 2018, the National Science Foundation (NSF) announced three new Expeditions in Computing awards, each providing $10 million in funding over five years to multi-investigator research teams pursuing large-scale, far-reaching and potentially transformative research in computer and information science and engineering. This year's awards aim to enable game-changing advances in real-time decision making, quantum computing and non-invasive biomedical imaging.
Since the inception of the program a decade ago, NSF has funded 22 Expeditions in Computing awards, including these three. Over the years, NSF-funded Expeditions awards have pursued foundational research in a range of areas spanning computing hardware and software, wireless networks, robotics, Big Data, artificial intelligence (AI), and synthetic biology and molecular programming, to name a few.
A new era is rising in which AI systems will play an increasingly central role in people’s lives by revolutionizing healthcare, transportation and the way business is conducted. This Expeditions project seeks to build AI decision systems to address these challenges by developing open source platforms, tools and algorithms for Real-time, Intelligent, Secure and Explainable (RISE) decisions. The project will also empower a large community of pioneers to build innovative applications and solutions, as well as broaden participation in research activities by allowing students and researchers across many disciplines to contribute.