University of Hawaii at Manoa
Department of Physics and Astronomy

Jeffrey Yepez, Ph.D.
Quantum Information Science





Physics research

My research is in quantum information dynamics, particularly quantum computational models of quantum field theory and classical field theory. I have been involved in quantum computing research since the early 1990's. A unifying methodology in my research has been information-preserving lattice-based models, spanning quantum to classical representations of many-body systems.

I am interested in quantum computational models of nonlinear physical systems (including quantum gases, strongly-correlated condensed matter, gauge field theories and gravity). I am currently investigating unitary lattice models to generically represent quantum dynamics on small scales that naturally manifest a transition to classical dynamics on large scales. The simplest example of this behavior is my quantum lattice gas model of a superfluid, well described by the Gross-Pitaevskii and Bogoliubov equations at small scales and the Navier-Stokes and acoustic fluid equations at large scales. In short, I study fundamental models of particle dynamics that are analytically tractable, that can be run efficiently on computers, and that naturally bridge the gap between quantum and classical effective field theories.


Quantum computing


Analog and digital quantum computers

For quantum simulation purposes, two different approaches to quantum computation are typically considered. A quantum computer that maps the continuous quantum mechanical evolution of one microscopic system with an engineered system Hamiltonian to emulate the behavior of another quantum system may be termed an analog quantum computer or quantum emulator. A quantum computer that maps a discrete qubit array (a quantum circuit network) governed by an engineered unitary evolution to efficiently simulate the behavior of another quantum system may be termed a digital quantum computer or a Feynman quantum computer. At the inception of my experimental quantum computing work, I was fascinated by both approaches, and remain so to this day.

ColdQuanta Atom Chip Vacuum Cell
The quantum engineering technology explored in my BEC Lab was an example of using atomic and quantum optical methods to form and control a spinor Bose-Einstein condensate (BEC) of an ultracold quantum gas comprised of alkali bosonic atoms. This work was conducted for the purpose of prototyping an analog quantum computer.

NMR quantum information processing was originally used to obtain the first experimental proof-of-concept of quantum information processing needed for the development of a Feynman quantum computer. The quantum engineering technology explored in my NMR Lab was an example of using spatial NMR spectroscopy to form and control a quantum pseudo-pure state of spin-1/2 atomic nuclei. This work explored the frontiers of measurement-based quantum information processing technology and was conducted for the purpose of prototyping an analog quantum computer.

Fine-grained parallel computing explored in my Dynamics Lab was an example of using reversible information processing methods to form and control an emergent Navier-Stokes fluid of a classical lattice gas comprised of fermionic bits. This work was conducted for the purpose of testing a Feynman quantum computer in its simplest application of simulating the behavior of a classical many-body system confined to a spacetime lattice. A Feynman quantum computer can efficiently simulate the behavior of a finite quantum many-body system, but it can also simulate the behavior of a finite classical many-body system. Lattice-gas information processing is time-reversible, a salient characteristic it shares with unitary quantum information processing. So a classical lattice gas is an archetype of a quantum lattice gas.
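
To make the reversible collide-stream structure concrete, here is a minimal sketch (written for this page, not taken from any of the papers) of an HPP-style classical lattice gas: one bit per direction per site, a self-inverse local collision, and a shift-based streaming step. The FHP model on a triangular lattice is needed for isotropic Navier-Stokes behavior; the simpler square-lattice HPP rule shown here only illustrates the reversible, conservative update pattern.

```python
import numpy as np

# Minimal HPP-style classical lattice gas: one bit per direction per site.
# Directions: 0 = east, 1 = north, 2 = west, 3 = south.
L = 64
rng = np.random.default_rng(0)
n = rng.random((4, L, L)) < 0.3          # boolean occupation numbers

def collide(n):
    """Reversible local collision: head-on pairs rotate by 90 degrees."""
    ew = n[0] & n[2] & ~n[1] & ~n[3]     # east-west pair, north-south empty
    ns = n[1] & n[3] & ~n[0] & ~n[2]     # north-south pair, east-west empty
    flip = ew | ns                        # sites where the pair is rotated
    for i in range(4):
        n[i] ^= flip                      # XOR makes the update its own inverse
    return n

def stream(n):
    """Shift each occupation bit one site along its direction of motion."""
    n[0] = np.roll(n[0], +1, axis=1)      # east
    n[2] = np.roll(n[2], -1, axis=1)      # west
    n[1] = np.roll(n[1], -1, axis=0)      # north
    n[3] = np.roll(n[3], +1, axis=0)      # south
    return n

for _ in range(100):                      # mass and momentum are conserved at every step
    n = stream(collide(n))
```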

Type-I and type-II quantum computers

For quantum simulation and classical simulation purposes, I considered two different types of quantum computers. Let me define the second type of quantum computer first. A quantum computer that maps the continuous quantum mechanical evolution of one microscopic system with an engineered system Hamiltonian and with a restriction of its quantum entanglement being strictly local in space is termed a type-II quantum computer. In contradistinction, a quantum computer with an engineered system Hamiltonian, but without any restriction on the spatial extent of its quantum entanglement, is a type-I quantum computer.

If quantum entanglement localization in a type-II quantum computer is induced by continual state measurement, then the type-II quantum computer is a measurement-based quantum computer. Historically, type-II quantum algorithms were the first examples of measurement-based quantum computing algorithms that employ entangled cluster states. Quantum entanglement localization can occur naturally in a many-body quantum system, and this is a remarkable way to realize a type-II quantum computer.

A Feynman quantum computer is a prototypical type-I quantum computer, while an analog quantum computer based on a zero-temperature spinor Bose-Einstein condensate (BEC) is a type-II quantum computer -- and this occurs without any externally applied state measurement. So one reason I explored type-II quantum computers was the connection to BECs -- a form of quantum matter that occurs naturally (for example in the compressed matter in white dwarfs and neutron stars) and that can be produced in a table-top setup in a lab at a repetition rate of nearly one per second. Another reason for studying type-II quantum computers is that they are easier to analyze theoretically. It is for these reasons that (when I was a lead Air Force Quantum Computing Program Manager) I helped support the development of table-top BEC-QC prototypes. The description below of the BEC Lab is an example of developing ultracold quantum gas technology toward prototyping a type-II quantum computer. The description below of the NMR Lab is another example of developing quantum information processing technology for prototyping a type-II quantum computer.

A type-I quantum computer can be used to model strongly-correlated quantum mechanical systems, while a type-II quantum computer can readily be used to model nonlinear classical systems (such as Navier-Stokes fluids) and quantum systems (such as zero-temperature BEC superfluids). A spinor BEC superfluid is interesting from an applied mathematics viewpoint because it can behave as a single macroscopically coherent quantum particle whose equation of motion admits soliton solutions such as Skyrmions.

In summary, as a practical engineering design advantage, an analog quantum system can behave as a type-II quantum computer with only local quantum entanglement. Such a quantum system is isomorphic to a large array of localized quantum processors interconnected by communication channels whose data transfers are represented by permutation-based interchange operators (orthogonal operators).


Measurement-based quantum computing

I have pursued measurement-based quantum computing with the application of efficiently representing highly nonlinear classical field theories. I have developed perhaps the earliest measurement-based (or entangled-cluster) quantum computational model. Originally used for fluid dynamics simulations, the model induces an entangled cluster at each spatial point where all possible outgoing collisional states are present, efficiently yielding an allowed outgoing many-particle state upon qubit measurement.
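
Schematically, one collide-measure-stream cycle of such a measurement-based (type-II) quantum lattice gas can be summarized as follows; the notation is generic and intended only as a sketch of the pattern described above.

```latex
% Encode the local occupation probabilities f_i into Q qubits at site x:
|\psi(\mathbf{x})\rangle \;=\; \bigotimes_{i=1}^{Q}
   \Bigl(\sqrt{1-f_i(\mathbf{x},t)}\,|0\rangle_i + \sqrt{f_i(\mathbf{x},t)}\,|1\rangle_i\Bigr),
% apply the local collision unitary U and measure the occupation numbers,
f_i'(\mathbf{x},t) \;=\; \langle\psi(\mathbf{x})|\,\hat{U}^{\dagger}\hat{n}_i\hat{U}\,|\psi(\mathbf{x})\rangle ,
% then classically stream the post-collision values to the neighboring sites:
f_i(\mathbf{x}+\mathbf{e}_i,\,t+1) \;=\; f_i'(\mathbf{x},t).
```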

I have been interested in the issue of reversibility in quantum maps at the mesoscopic scale using a quantum Boltzmann equation. This research direction, related to one-way quantum computing, led to a collaboration with Professor David Cory and members of his group at the MIT Nuclear Engineering Department, where we achieved an early proof-of-concept of measurement-based quantum computing using spatial nuclear magnetic resonance. My interest in quantum computing led me to establish the first research programs in the Air Force to explore scalable quantum information processing devices, which included qubit architectures based on nuclear magnetic resonance spectroscopy and superconductive electronics.

Here are some of my papers on quantum models of nonlinear dynamics in 1+1 dimensions developed for testing analog quantum computation:

Here is a paper on a measurement-based quantum lattice gas model for Navier-Stokes fluid dynamics in 2+1 dimensions:


Nonlinear physics

The salient characteristic of a nonlinear field theory is its soliton solutions, field configurations that are localized in space and persistent in time. There are many well known nonlinear partial differential equations that admit physically relevant soliton solutions, including the Korteweg-de Vries (KdV) equation and a large class of nonlinear Schroedinger (NLS) equations. Example NLS equations are the Manakov equations for light in optical fibers, the Gross-Pitaevskii (GP) equation for a scalar superfluid in the zero-temperature limit, and coupled GP equations for spinor Bose-Einstein condensates. A class of solitons that I am investigating are known as vortex solitons, which are topological singularities of the quantum amplitude field associated with a condensed quantum gas. I also study skyrmions in spinor condensates. I use quantum lattice gas models to investigate nonlinear quantum field theories that admit soliton solutions and study the dynamical behavior of these solitons using unitary quantum simulations.
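
For reference, two of the prototype equations mentioned above, written in standard dimensionless form together with their one-soliton solutions:

```latex
% Focusing nonlinear Schroedinger (NLS) equation and its bright soliton
i\,\partial_t\psi + \tfrac{1}{2}\,\partial_x^2\psi + |\psi|^2\psi = 0,
\qquad
\psi(x,t) = \eta\,\mathrm{sech}\!\bigl[\eta(x - vt)\bigr]\,
            e^{\,i\left(vx + \frac{1}{2}(\eta^2 - v^2)t\right)} .

% Korteweg-de Vries (KdV) equation and its single-soliton solution
\partial_t u + 6\,u\,\partial_x u + \partial_x^3 u = 0,
\qquad
u(x,t) = \frac{c}{2}\,\mathrm{sech}^2\!\Bigl[\tfrac{\sqrt{c}}{2}\,(x - ct)\Bigr].
```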

Here are papers on quantum lattice gas models for the nonlinear Schroedinger equation and the Korteweg-de Vries equation:

Optical solitons

Here are papers on quantum lattice gas models for coupled-nonlinear Schroedinger equations and the Manakov equations for optical solitons:


Quantum topology

I have studied quantum entanglement from the perspective of quantum information theory. I have investigated pairwise entanglement and have developed an analytical treatment using joint ladder operators, the sum of two single-particle fermionic ladder operators. This mathematical physics-oriented approach readily allows one to write down analytical representations of quantum algorithms and to explore quantum entanglement manifested in a large system of qubits. Using these analytical tools, I have developed a topological representation of quantum logic that views entangled qubit spacetime histories (or qubit world lines) as a generalized braid, which I have referred to as a superbraid. The crossing of world lines may be either classical or quantum mechanical in nature, and in the latter case it is most conveniently expressed with our analytical expressions for entangling quantum gates. At a quantum mechanical crossing, independent world lines can become entangled.
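
As a notational sketch only (the 1/sqrt(2) normalization is my assumption here, not a quote from the papers), a joint ladder operator built from two single-particle fermionic ladder operators behaves as a single fermionic mode:

```latex
% Joint ladder operator for fermionic modes 1 and 2 (normalization assumed)
\hat{A} \;=\; \frac{1}{\sqrt{2}}\bigl(\hat{a}_1 + \hat{a}_2\bigr),
\qquad
\{\hat{a}_i,\hat{a}_j^{\dagger}\} = \delta_{ij},\;\;
\{\hat{a}_i,\hat{a}_j\} = 0
\;\;\Longrightarrow\;\;
\{\hat{A},\hat{A}^{\dagger}\} = 1 .
```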

I found quantum skein relations that allow complicated superbraids to be recursively reduced to alternate classical histories. If the superbraid is closed, then one can decompose the resulting superlink into an entangled superposition of classical links. Also, one can compute a superlink invariant, for example the Jones polynomial for the square root of a knot. The quantum skein relations naturally transition to the well known classical skein relations as quantum entanglement vanishes.

Here are a couple papers on quantum knots:


Quantum theory


Quantum mechanics

Feynman's idea of building a quantum computer to simulate physics is one of the great ideas of the late 20th century and this has led to an entirely new subject area in physics, the now prominent area of quantum computation and quantum information. There are many proposed routes to achieving sustained and scalable quantum computation and perhaps one may eventually prove viable later in the 21st century. I explore quantum algorithms intended to run for long times on very large quantum computers.

I study many-fermion dynamics in strictly informational terms, including fundamental particle physics that is otherwise very well described at low energies by quantum field theory and the Standard Model. I develop qubit array representations of particle dynamics that are both numerically accurate and exactly unitary. These representations can be implemented and run efficiently on quantum computers. The spatial organization of the qubit array is typically a regular cubical lattice. All the qubit amplitudes, distributed over the points in space in an ordered way, offer a convenient encoding of a probability amplitude field associated with a system of Dirac particles.

I consider quantum algorithms that are ultimately congruent to the Feynman path integral representation of relativistic quantum field theory and that use the smallest number of quantum gates. There are many possible protocols, especially in models where the number of qubits at each point is small. For simplicity, I focus on quantum algorithms where all the quantum gates act locally and the resulting unitary evolution operator has a tensor-product structure. A quantum gate is applied either at a single point in space modeling particle-particle interaction (collide operation) or applied between two nearby points modeling particle motion (stream operation). Furthermore, all the quantum gate operations in a model are applied homogeneously in space (that is, all the points are treated exactly the same way). The qubit array is typically treated as a tensor-product state with local entangled clusters, where each entangled cluster represents both a point in space and the quantum field located at that point.
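
The following is a minimal illustrative sketch of the collide-stream cycle just described, restricted to the one-particle sector so the amplitudes can be stored classically in a NumPy array. A single 2x2 unitary (here the square root of swap, as seen in the one-particle sector) is applied homogeneously at every site, followed by a streaming permutation. The specific gate is illustrative rather than a reproduction of any one published model.

```python
import numpy as np

# One-particle sector of a 1D quantum lattice gas: two amplitudes per site,
# one for each local qubit (left-moving and right-moving channels).
L = 256
psi = np.zeros((2, L), dtype=complex)
psi[:, L // 2] = 1 / np.sqrt(2)                 # localized initial condition

# Local collision gate: square root of SWAP restricted to the one-particle sector.
C = 0.5 * np.array([[1 + 1j, 1 - 1j],
                    [1 - 1j, 1 + 1j]])

def collide(psi):
    """Apply the same 2x2 unitary homogeneously at every lattice site."""
    return C @ psi

def stream(psi):
    """Shift the two channels one lattice site in opposite directions."""
    out = np.empty_like(psi)
    out[0] = np.roll(psi[0], +1)
    out[1] = np.roll(psi[1], -1)
    return out

for _ in range(1000):                            # one collide-stream cycle per time step
    psi = stream(collide(psi))

# Unitarity: the total probability is conserved at every step.
assert np.isclose(np.sum(np.abs(psi) ** 2), 1.0)
```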

Quantum circuit network


Conventionally, in the standard quantum circuit model of quantum computation, a quantum circuit has an input state (usually prepared in quantum superposition) that is transformed by a sequence of quantum gates (and sometimes with an oracle too) into an output state that is available for deterministic measurement. If one draws the type of quantum algorithm that I consider as a quantum circuit diagram, then that quantum circuit would look like a closed-loop quantum circuit network. That is, the output quantum state is subsequently fed back into the quantum circuit as a new input quantum state. The unitary transformation represented by this type of quantum circuit network occurs in a homogeneous and systolic fashion (simultaneously across all the points of the space represented in the qubit array). This unitary transformation is applied to the qubit array in what is called a collide-stream cycle. The time for completing one closed-loop cycle is a single time step of the dynamical evolution of the amplitude field of the modeled quantum particle(s).

In the early 1990's, as part of a research program in computational quantum mechanics, I began exploring quantum algorithms to represent the unitary evolution of quantum particles in terms of stream and collide operators; for example see Section 3 of this technical report. Sauro Succi had developed a lattice Boltzmann equation description of quantum mechanics [Physica D, 69, 327 (1993)]. I was interested in a unitary model as a generalization of a classical lattice gas (by replacing bits with qubits) that could be implemented on a quantum computer for representing quantum mechanics. I termed this model a quantum lattice gas. Bruce Boghosian joined the Dynamics Lab in 1994 to support the research carried out in my Lattice-Gas Theory and Computation Group at Phillips Laboratory (now Air Force Research Laboratory). Here is a quantum lattice-gas model for the many-particle Schroedinger equation, the first simple quantum algorithm.

Here are a couple papers on the topic of a numerically accurate quantum lattice gas model for the many-body Schroedinger equation:


Relativistic quantum mechanics

A quantum algorithm with all the restrictions mentioned above is called a quantum lattice gas. Back in the 1940's, Feynman developed the first quantum lattice gas algorithm in 1+1 dimensions in his search for the simplest way to encode the relativistic dynamics of a Dirac particle. This is known as the Feynman checkerboard problem. If no more than one particle is modeled and the size of the Hilbert space scales with the number of points, then his modeling approach is also known as the quantum walk model. If many particles are modeled and the size of the Hilbert space scales as a binomial coefficient (the total number of points choose the total number of particles), then his model is just the many-body sector of a quantum lattice gas.
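
A minimal sketch of the 1+1 dimensional case is given below: two chiral amplitude components are streamed in opposite directions and then mixed by a mass-dependent rotation, which in the continuum limit yields free Dirac dynamics. The mass-angle identification used here is the standard quantum-walk one and is purely illustrative; it is not taken from the 3+1 dimensional solution referenced below.

```python
import numpy as np

# Discrete-time quantum walk for a free Dirac particle in 1+1 dimensions.
L, steps = 512, 400
m, dt = 0.1, 1.0                       # mass in lattice units; theta = m*dt is the mixing angle
theta = m * dt

# Mass mixing ("collision") gate acting on the two spinor components.
C = np.array([[np.cos(theta), -1j * np.sin(theta)],
              [-1j * np.sin(theta), np.cos(theta)]])

psi = np.zeros((2, L), dtype=complex)
psi[0, L // 2] = 1.0                   # right-moving component, localized start

for _ in range(steps):
    # Stream: the two chiral components move in opposite directions.
    psi[0] = np.roll(psi[0], +1)
    psi[1] = np.roll(psi[1], -1)
    # Collide: mix the components with the mass term.
    psi = C @ psi

assert np.isclose(np.sum(np.abs(psi) ** 2), 1.0)   # unitary evolution
```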

This quantum algorithm is my solution to the Feynman checkerboard for the relativistic quantum mechanical dynamics of Dirac particles in 3+1 dimensions:

I study quantum algorithms for efficient quantum simulations of gauge field theories on Feynman (qubit array and quantum gate-based) quantum computers.

Path summation rule for relativistic quantum mechanics


The work on superconducting fluid dynamics described below is based on a quantum algorithm for relativistic quantum mechanics. This quantum algorithm is a quantum lattice gas model that is derived from a simple path summation rule. Here is a preprint describing the path summation rule with applications to quantum computation and quantum simulation:


Gauge field theory

Here are a couple of papers on a quantum computing algorithm for quantum field theory that are based on a discrete representation of the Dirac-Maxwell-London equations for a superconducting fluid:

Curved space theory for superconductivity


On the topic of a quantum algorithm for gauge field theory, a new quantum computing model for a superconducting Fermi fluid is based on a curved space quantum field theory for massless spin-1/2 fermions coupled to a spin-1 gauge field. Here is a preprint describing this quantum algorithm:


Quantum matter


Bose-Einstein condensates

A spinor Bose-Einstein condensate (BEC) superfluid is a macroscopically coherent state of quantum matter that, although still rare today and technically difficult to make in the laboratory, will become ever more commonly used for new and revolutionary 21st century quantum devices. For example, some expected future applications include:

  • analog quantum computers for solid-state condensed matter simulation (strongly-correlated many-body physics achieving and surpassing exascale high performance at extremely low power)
  • quantum matter-wave interferometry (BEC interferometers and gravimeters for extremely precise sensors to detect magnetic field variations on the order of 10^(-12) Tesla and gravitational variations with sensitivity levels of 10^(-14) g, respectively).
  • quantum lattice clock and metrology (accuracy and stability to better than 1 sec in 5 billion years, at the 10^(-18) level).
Designing such quantum devices requires advanced modeling and simulation of their quantum mechanical performance, and (because of the complexity of the inherent quantum fluid dynamics) this type of advanced modeling and simulation in three space dimensions is most practical on extremely large supercomputers. My new quantum algorithm for modeling and simulating a spin-2 BEC employs a closed-form infinite-order expansion for the particle-particle interaction, even with order unity coupling strengths. The method uses an operator splitting technique that avoids the Baker-Campbell-Hausdorff catastrophe that normally occurs because the kinetic energy operator in the free part of the Hamiltonian does not commute with the potential energy operator in the interaction part of the Hamiltonian for the spinor BEC.
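
As an illustration of the operator-splitting idea, reduced here to the scalar (spin-0) GP equation and written as a classical split-step integrator rather than the quantum algorithm itself: the interaction term is diagonal in position space, so its exponential can be applied exactly, to all orders in the coupling, with no Baker-Campbell-Hausdorff expansion of the non-commuting kinetic and potential pieces.

```python
import numpy as np

# Split-step (Strang/Trotter) integrator for the 1D Gross-Pitaevskii equation
#   i dpsi/dt = [ -(1/2) d^2/dx^2 + g |psi|^2 ] psi     (hbar = m = 1).
L, N, dt, g, steps = 40.0, 1024, 0.001, 1.0, 2000
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)

psi = np.exp(-x**2) * (1.0 + 0.0j)                  # arbitrary smooth initial state
psi /= np.sqrt(np.sum(np.abs(psi)**2) * (L / N))    # normalize

kinetic_half = np.exp(-0.5j * (k**2 / 2) * dt)      # half kinetic step in Fourier space

for _ in range(steps):
    psi = np.fft.ifft(kinetic_half * np.fft.fft(psi))
    # Interaction step: diagonal in position space, hence exact in closed form
    # to all orders in g -- no BCH expansion of the non-commuting pieces needed.
    psi *= np.exp(-1j * g * np.abs(psi)**2 * dt)
    psi = np.fft.ifft(kinetic_half * np.fft.fft(psi))

# The norm is conserved by each (unitary) substep.
assert np.isclose(np.sum(np.abs(psi)**2) * (L / N), 1.0)
```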

Here is the paper on a new quantum lattice gas model for the spin-2 BEC:

Here are a few group papers on modeling quantum turbulence in a spin-0 BEC:


Non-Abelian superfluidity in a spin-2 BEC

Spin-2 BEC equation of motion


A spin-2 BEC has ferromagnetic, polar, and cyclic phases. The cyclic phase of a spin-2 BEC is analogous to a d-wave Bardeen-Cooper-Schrieffer (BCS) superfluid, and it is this phase that admits non-Abelian quantum vortex solitons. The quantum lattice gas algorithm described above, with its closed-form infinite-order treatment of the particle-particle interaction and its operator splitting technique, is used to simulate the dynamics governed by the spin-2 BEC equation of motion:

Collision of spin-2 BEC solitons


Here is a talk on the spin-2 BEC paper above. This talk includes an example quantum simulation of a spin-2 BEC demonstrating the behavior of a non-Abelian superfluid. Orthogonally oriented 1D soliton wave trains induce a rapid instability that is triggered on all Zeeman levels. The instability is characterized by oscillating and rising peaks of all five Zeeman levels at the intersection center. This is a generalization (rich in nonlinear physics) of the same instability originally observed in a spin-0 BEC and reported in QIP, Vol. 4, No. 6 (2005). Here is the talk slide. Here is a quantum simulation of a non-Abelian superfluid showing the instability due to the collision of two orthogonally directed 1D soliton wave trains; the animation cycles through all the f=2 Zeeman levels, going from m = -2, -1, 0, 1, 2. This quantum simulation was done by Jasper Taylor and Steven Smith while they were my Research Assistants.


Strongly-correlated condensed matter

I have a long-held interest in efficient representations of strongly-correlated condensed matter systems, particularly many-fermion systems that possess a high degree of quantum entanglement. I study the energy eigenstates of these systems in terms of entangled particle-particle configurations.

I have long advocated for quantum computers that exploit exponential quantum complexity, rather than making uncontrolled approximations or decimating the physics to fit numerical representations into available classical memory resources. A prime application here is to explore the phase diagram of a system of many electrons on a lattice governed by the Fermi-Hubbard Hamiltonian. I originally worked on the Fermi-Hubbard model with Professor Eric Jensen at Brandeis University. We were motivated by its connection to high-temperature superconductivity, and this connection remains an important research topic for me. An experimental quantum simulation route for which I have long advocated is large-scale quantum computation, particularly for its application to the Fermi-Hubbard model.
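
For reference, the Fermi-Hubbard Hamiltonian in standard notation, with hopping amplitude t over nearest-neighbor pairs and on-site repulsion U:

```latex
% Fermi-Hubbard Hamiltonian
\hat{H} \;=\; -t \sum_{\langle i,j\rangle,\,\sigma}
      \bigl(\hat{c}^{\dagger}_{i\sigma}\hat{c}_{j\sigma}
          + \hat{c}^{\dagger}_{j\sigma}\hat{c}_{i\sigma}\bigr)
   \;+\; U \sum_{i} \hat{n}_{i\uparrow}\hat{n}_{i\downarrow},
\qquad
\hat{n}_{i\sigma} = \hat{c}^{\dagger}_{i\sigma}\hat{c}_{i\sigma}.
```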

BCS superconductivity

Spinor form of the superconducting fluid equations


The superconducting fluid's equations of motion take the elegant form of a pair of coupled Dirac equations, one governing the 4-spinor field for superconducting Fermi matter and the other one governing a doublet field (pair of 4-spinors) for gauge matter:


Classical theory


Fluid dynamics and turbulence

Entropic method

I am interested in the turbulence problem and have over the years developed various entropy-conserving nonlinear models of high Reynolds number fluid dynamics. An early model I developed with Professor Bruce Boghosian at Tufts University is the entropic lattice Boltzmann equation model. With its distinguishing traits of entropy conservation and parallelism, it has served as a useful computational fluid dynamics model of turbulence on supercomputers.

I am interested in information conservation on the small scales (both mesoscopic and microscopic) underlying the low-energy and small-momentum flow dynamics on the large hydrodynamic (macroscopic) scale. For example, at the large scale one derives an effective hydrodynamic-level momentum equation, in the form of a viscous Navier-Stokes equation. The entropic method captures the kinetic energy spectra in turbulence. In particular, I have sought to represent fluid eddies with a highly accurate model and with sufficient spatial resolution to capture and understand the energy cascades that occur in turbulent flows.
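
Schematically, the entropic lattice Boltzmann update can be written as follows. The notation is generic; this is the standard entropy-constrained form of the collision step rather than a quote from any of the group papers below.

```latex
% Discrete H-function over the lattice velocities c_i with weights w_i
H[f] \;=\; \sum_i f_i \ln\!\frac{f_i}{w_i},
% entropic collide-and-stream update, with beta fixing the viscosity
f_i(\mathbf{x}+\mathbf{c}_i\,\delta t,\, t+\delta t)
   \;=\; f_i(\mathbf{x},t) + \alpha\beta\,\bigl[f_i^{\mathrm{eq}}(\mathbf{x},t) - f_i(\mathbf{x},t)\bigr],
% where alpha is chosen at each site and each step so that the collision
% leaves the H-function (the entropy) unchanged:
H\bigl[f + \alpha\,(f^{\mathrm{eq}} - f)\bigr] \;=\; H[f].
```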

Here are some group papers on the entropic Boltzmann equation:

Quantum computational method

My first assignment at a national laboratory was at the Air Force Geophysics Laboratory near Boston, Massachusetts. I was first assigned to the Space Particles Branch in 1987 and then to the Atmospheric Sciences Division in 1992. It is here that I learned plasma physics, atmospheric physics and climatology. I realized that the computational fluid dynamics methods and computational resources available at the time were broadly inadequate for modeling extremely high Reynolds number fluid flows (Re ~ 10^(11)), which typically occur in turbulent layers in the troposphere. So I embarked upon a new research program to develop efficient quantum algorithms for turbulent fluid modeling and built a new quantum technologies program to develop a large scale quantum computer useful for efficient computational physics modeling of the troposphere.

Some three decades later, quantum computing research is expanding at a remarkable rate. For climate change research, efficient atmospheric turbulence modeling remains a vital topic area. Hopefully, applying quantum computation to efficient computational fluid dynamics for atmospheric turbulence modeling will be a game changer for climate change research.

Here are some of my papers on quantum lattice gas models for computational fluid dynamics:


Gravity

I study gauge field theory representations of gravitational dynamics -- in particular, quantum computing models of gravity. I also study quantum particle dynamics in curved space.

Although general relativity is a classical field theory, it may be represented by unitary lattice models (e.g. quantum lattice gas models). Since quantum field theory can be represented by unitary lattice models too, such lattice models provide a new and unified way to study quantum dynamics and gravitational dynamics.

When one constructs unitary lattice models of quantum field theories, such models can be used to explore particle dynamics at the highest-energy grid scale (or Planck scale). Remarkably, if constructed correctly, Lorentz invariance in unitary lattice models is retained even near the smallest spatial scales. However, the modeled spacetime at the smallest spatial scale admits unexpected and nonlinear behavior that can be interpreted as an effect due to curved space.

The theoretical approach I use is based on Einstein's vierbein representation of general relativity that he developed in his quest for a unified field theory. Einstein was on the right track! The vierbein representation is perhaps the most natural choice for a gauge field theory representation of gravity.
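
In this vierbein (tetrad) language the Dirac equation in curved space takes the familiar form shown below (written in one common sign convention), with e_a^mu the vierbein and omega the spin connection:

```latex
% Dirac equation in curved space with vierbein and spin connection
i\,\gamma^{a}\, e_{a}{}^{\mu}\Bigl(\partial_{\mu}
      + \tfrac{1}{4}\,\omega_{\mu b c}\,\gamma^{b}\gamma^{c}\Bigr)\psi
   \;-\; m\,\psi \;=\; 0 .
```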

Here is a review paper on gauge gravity and the Dirac equation in curved space:

Here is a paper on a quantum lattice model of gauge gravity: