At the heart of quantum computing’s promise lies the ability to solve problems that are fundamentally out of reach for classical computers. One of the most powerful ways to unlock that promise is through a novel approach we call Generative Quantum AI, or GenQAI. A key element of this approach is the Generative Quantum Eigensolver (GQE).
GenQAI is based on a simple but powerful idea: combine the unique capabilities of quantum hardware with the flexibility and intelligence of AI. By using quantum systems to generate data, and then using AI to learn from and guide the generation of more data, we can create a powerful feedback loop that enables breakthroughs in diverse fields.
Crucially, our quantum processing unit (QPU) produces data that is extremely difficult, if not impossible, to generate classically. That gives us a unique edge: we're not just feeding an AI more text from the internet; we're giving it new and valuable data that can't be obtained anywhere else.
The Search for Ground State Energy
One of the most compelling challenges in quantum chemistry and materials science is computing the properties of a molecule’s ground state. For any given molecule or material, the ground state is its lowest energy configuration. Understanding this state is essential for understanding molecular behavior and designing new drugs or materials.
The problem is that accurately computing this state for anything but the simplest systems is incredibly complicated. You cannot do it by brute force, testing every possible state and measuring its energy, because the number of quantum states grows double-exponentially with system size, making exhaustive search hopeless. What is needed instead is an intelligent way to search for the ground state energy and other molecular properties.
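A guided search of this kind rests on the variational principle: the energy measured in any trial state can never fall below the true ground-state energy, so driving measured energies down is guaranteed to move toward the ground state. In standard notation (this is textbook quantum mechanics, not anything specific to our hardware),

```latex
E_0 \;=\; \min_{\langle\psi|\psi\rangle = 1} \langle \psi | H | \psi \rangle
\;\le\; \langle \psi_{\mathrm{trial}} | H | \psi_{\mathrm{trial}} \rangle
\qquad \text{for any normalized trial state } |\psi_{\mathrm{trial}}\rangle,
```

where H is the molecular Hamiltonian.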
That’s where GQE comes in. GQE is a methodology that uses data from our quantum computers to train a transformer. The transformer then proposes promising trial quantum circuits: ones likely to prepare low-energy states. You can think of it as an AI-guided search engine for ground states. The novelty is in how the transformer is trained from scratch on data generated by our hardware.
Here's how it works (a simplified code sketch of this loop follows the list):
- We start with a batch of trial quantum circuits, which are run on our QPU.
- Each circuit prepares a quantum state, and we measure that state's energy with respect to the molecular Hamiltonian.
- Those measurements are then fed back into a transformer model (the same architecture behind models like GPT-2) to improve its outputs.
- The transformer generates a new distribution of circuits, biased toward ones that are more likely to prepare lower-energy states.
- We sample a new batch from the distribution, run them on the QPU, and repeat.
- The system learns over time, zeroing in on the true ground state.
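To make the loop concrete, here is a minimal, self-contained Python sketch of the same propose-measure-update cycle. Every piece is an illustrative stand-in rather than our actual system: the gate vocabulary, the mock measure_energy_on_qpu function, and the frequency-based update_model step (used here in place of an actual transformer training step) are all hypothetical.

```python
import random

# Toy gate vocabulary; a real run would use a hardware-native gate set.
OPERATOR_POOL = ["RX(q0)", "RY(q1)", "RZ(q0)", "CX(q0,q1)", "RZZ(q0,q1)"]

def sample_circuits(weights, batch_size=8, depth=6):
    """Propose a batch of trial circuits (gate sequences), biased by current weights.
    A transformer would generate these autoregressively; random.choices stands in here."""
    return [random.choices(OPERATOR_POOL, weights=weights, k=depth)
            for _ in range(batch_size)]

def measure_energy_on_qpu(circuit):
    """Placeholder for preparing the state on hardware and estimating <H>.
    Returns a deterministic toy 'energy' so the example runs without a QPU."""
    return sum((hash(gate) % 1000) / 1000.0 for gate in circuit) / len(circuit)

def update_model(weights, circuits, energies, lr=0.5):
    """Crude surrogate for training the generative model on measured energies:
    upweight gates that appear in the lowest-energy circuit of the batch."""
    best_circuit = min(zip(energies, circuits), key=lambda pair: pair[0])[1]
    return [w + lr * best_circuit.count(gate)
            for w, gate in zip(weights, OPERATOR_POOL)]

weights = [1.0] * len(OPERATOR_POOL)
for step in range(10):
    batch = sample_circuits(weights)                       # model proposes circuits
    energies = [measure_energy_on_qpu(c) for c in batch]   # QPU evaluates each state
    weights = update_model(weights, batch, energies)       # feedback into the model
    print(f"step {step}: best energy in batch = {min(energies):.4f}")
```

In the actual GQE workflow, the sampling and update steps are handled by the transformer itself, as described in the list above; only the loop structure is what this sketch is meant to show.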
To test our system, we tackled a benchmark problem: finding the ground state energy of the hydrogen molecule (H₂). This problem has a known solution, which lets us verify that our setup works as intended. Our GQE system successfully found the ground state energy to within chemical accuracy.
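For context, "chemical accuracy" is conventionally taken to be 1 kcal/mol, roughly 1.6 millihartree. The short snippet below shows the kind of check involved. The reference value is the full-CI energy of H₂ in a minimal (STO-3G) basis near its equilibrium bond length, which depends on the chosen basis set and geometry, and the estimated energy is a hypothetical number used only for illustration.

```python
CHEMICAL_ACCURACY_HA = 1.6e-3   # ~1 kcal/mol expressed in hartree

# Illustrative numbers only: the reference is the full-CI energy of H2 in a
# minimal STO-3G basis near equilibrium geometry; the estimate is hypothetical.
reference_energy_ha = -1.1373
estimated_energy_ha = -1.1362

error = abs(estimated_energy_ha - reference_energy_ha)
print(f"absolute error: {error * 1e3:.2f} mHa, "
      f"within chemical accuracy: {error <= CHEMICAL_ACCURACY_HA}")
```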
To our knowledge, we’re the first to solve this problem using a combination of a QPU and a transformer, marking the beginning of a new era in computational chemistry.
The Future of Quantum Chemistry
The idea of using a generative model guided by quantum measurements can be extended to a whole class of problems, from materials discovery to, potentially, drug design.
By combining quantum computing and AI, we can draw on the full strength of both. Our quantum processors generate rich data that was previously unobtainable, and an AI can learn from that data. Together, they can tackle problems neither could solve alone.
This is just the beginning. We're already looking at applying GQE to more complex molecules, ones whose ground states can't currently be computed with existing methods, and we're exploring how this methodology could be extended to other real-world use cases. This opens many new doors in chemistry, and we are excited to see what comes next.