IonQ is making significant strides in applying quantum computing to artificial intelligence (AI) and machine learning. Its research has produced hybrid quantum-classical approaches that enhance large language models (LLMs) and generative AI. In two new research papers, IonQ researchers detailed how quantum computing can support advanced materials development by generating synthetic images of rare anomalies, and how it can enhance large language models by adding a quantum layer for fine-tuning. These efforts reflect IonQ’s commitment to practical, near-term commercial quantum applications in AI that deliver value in data-scarce settings and on complex tasks.
In a recent paper, IonQ introduced a hybrid quantum-classical architecture designed to enhance LLM fine-tuning, in which a pre-trained LLM is customized with a small set of training data through quantum machine learning. The researchers incorporated a parameterized quantum circuit as a new layer and compared the resulting model against purely classical fine-tuning of the same open-source large language model. The hybrid quantum approach outperformed the classical-only methods in accuracy, and the advantage grew as the number of qubits increased. This breakthrough paves the way for quantum-enhanced fine-tuning of foundational AI models in fields such as natural language processing, image processing, and property prediction in chemistry, biology, and materials science.
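As a rough illustration of what such an architecture can look like, the minimal sketch below assumes a PennyLane and PyTorch stack; the layer sizes, circuit depth, and embedding dimension are hypothetical choices, not details from IonQ's paper. It appends a trainable parameterized quantum circuit to a frozen LLM's pooled embeddings and reads out classification logits.

```python
# Hypothetical sketch only: this is not IonQ's published architecture.
# It assumes a PennyLane + PyTorch stack and an already-computed, frozen
# LLM embedding; dimensions and circuit depth are illustrative choices.
import torch
import torch.nn as nn
import pennylane as qml

n_qubits = 4  # assumed width of the quantum layer
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def pqc(inputs, weights):
    # Encode classical features as rotation angles, apply trainable
    # entangling layers, and read out one expectation value per qubit.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

weight_shapes = {"weights": (2, n_qubits, 3)}  # 2 entangling layers
quantum_layer = qml.qnn.TorchLayer(pqc, weight_shapes)

class HybridHead(nn.Module):
    """Frozen LLM embedding -> classical adapter -> quantum layer -> logits."""
    def __init__(self, embed_dim=768, n_classes=2):
        super().__init__()
        self.adapter = nn.Linear(embed_dim, n_qubits)  # compress features
        self.quantum = quantum_layer                   # trainable PQC layer
        self.classifier = nn.Linear(n_qubits, n_classes)

    def forward(self, embeddings):
        x = torch.tanh(self.adapter(embeddings))  # keep angles bounded
        x = self.quantum(x)
        return self.classifier(x)

# Usage: feed pooled sentence embeddings from a frozen open-source LLM.
head = HybridHead()
dummy_embeddings = torch.randn(8, 768)  # batch of 8 pooled embeddings
logits = head(dummy_embeddings)         # shape (8, 2)
```

In this kind of setup, only the small adapter, the quantum circuit parameters, and the classifier are trained on the scarce task data, while the large pre-trained model stays frozen.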
“This work highlights how quantum computing can strategically integrate into classical AI workflows, leveraging increased expressivity to enhance traditional AI LLMs in rare-data regimes,” said Masako Yamada, Director of Applications Development at IonQ. “We believe that hybrid quantum-classical models have the potential to unlock the next wave of AI capabilities, expanding the versatility of LLMs beyond language applications.”
In a separate research publication, IonQ collaborated with a leading automotive manufacturer to apply quantum-enhanced generative adversarial networks (GANs) to materials science. By training GANs that sample the output distribution of a quantum circuit, the researchers generated synthetic images of steel microstructures. In up to 70% of cases, these synthetic images were judged higher in quality than those produced by baseline classical generative models. This advancement is crucial for industries like manufacturing, where data scarcity and class imbalance can hinder model training.
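As a rough illustration of the general idea, the minimal sketch below (again assuming PennyLane and PyTorch, with hypothetical sizes and circuit structure not taken from the publication) uses the output probability distribution of a small parameterized circuit as the prior that a classical decoder turns into a synthetic image.

```python
# Hypothetical sketch only: not the architecture from IonQ's paper.
# A small parameterized circuit supplies a prior (its measurement
# probability distribution) that a classical network decodes into a
# synthetic image; sizes and circuit depth are illustrative assumptions.
import torch
import torch.nn as nn
import pennylane as qml

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def latent_circuit(angles, weights):
    # Seed rotations from classical noise, apply trainable entangling
    # layers, and return the probability distribution over basis states.
    qml.AngleEmbedding(angles, wires=range(n_qubits))
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return qml.probs(wires=range(n_qubits))

class QuantumGenerator(nn.Module):
    """Quantum circuit output distribution -> classical image decoder."""
    def __init__(self, img_pixels=32 * 32):
        super().__init__()
        shape = qml.StronglyEntanglingLayers.shape(n_layers=2, n_wires=n_qubits)
        self.weights = nn.Parameter(0.1 * torch.randn(shape))
        self.decoder = nn.Sequential(
            nn.Linear(2 ** n_qubits, 256), nn.ReLU(),
            nn.Linear(256, img_pixels), nn.Tanh(),
        )

    def forward(self, noise):
        # One circuit evaluation per noise vector; stack into a batch.
        priors = torch.stack([latent_circuit(z, self.weights) for z in noise])
        return self.decoder(priors.float())

# Usage: the hybrid generator would be trained adversarially against an
# ordinary classical discriminator on real micrograph patches (not shown).
gen = QuantumGenerator()
noise = torch.rand(4, n_qubits) * torch.pi  # batch of 4 noise vectors
fake_images = gen(noise)                    # shape (4, 1024)
```

The appeal of such a hybrid generator is that the quantum circuit parameters and the classical decoder can be optimized together against a standard discriminator, which is useful when the pool of real anomaly images is small.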
“The combination of IonQ’s quantum computers and classical machine learning has shown impressive results for materials science and manufacturing,” said Ariel Braunstein, SVP of Product at IonQ. “By using a quantum hybrid approach to supplement experimental data with synthetic generation, we can achieve higher quality images with less data compared to classical methods. This breakthrough could lead to new applications across industries such as materials science, medical imaging, and financial forecasting.”