How to Make the Shift from Exploring to Operationalizing Quantum Computing

Data analytics. Data velocity. Data gravity. Data…[fill in the blank]. 

I’ve been thinking a lot lately about data and how it can be most efficiently used to maximize business success. This is because I am part of the Zapata engineering team working with our new partner Andretti Autosport to help them better use data to win races.  

In professional motorsports like the NTT INDYCAR circuit that Andretti competes in, fractions of a second can mean the difference between standing on the podium and being back in the pack. And the role data plays in finding — and achieving — that competitive edge is substantial.  

Think about it – in racing, the less drag from air resistance and G-forces (i.e., fighting gravity) and the higher the velocity, the more efficiently the fuel powers the car.

It’s the same in the enterprise world. The less drag due to data (i.e., the resistance to moving datasets as they grow larger and larger) and the faster the data velocity, the more efficiently the data analytics (i.e., the “fuel”) powers business decisions that ultimately lead to successful outcomes.

Which brings me to exploration and the role it plays with data.

Beyond the foundational data infrastructure found at every enterprise (databases, data warehouses and BI tools), there’s a growing need to explore data sources and management techniques in innovative ways to gain a competitive advantage. That’s true for large enterprises and for smaller companies like Andretti Autosport in data-intensive industries like motorsports.

Forward-thinking enterprises are learning about – and investing in – techniques such as generative modeling, unsupervised machine learning (see our research here for a deeper dive) and simulation to unlock more value from their data and fuel better, faster business decisions.

Enter quantum computing. 

While quantum is still in its early days and not a replacement for classical computing solutions, it is far enough along that enterprises are actively pursuing pilots that use quantum infrastructure and software and benchmark them against purely classical approaches. Some of the use cases being explored include financial portfolio optimization and risk analysis, large-molecule simulation, and supply chain/logistics optimization. (Deeper dive on enterprise quantum computing use cases here.)

They are also incorporating computational workflows and “swappability” options for combining the best possible quantum and classical tools in their stack. Not every problem will be a fit for quantum, but the ones that are could enable new use cases beyond the scope of what’s possible with classical computing.

Lesson #1 – there’s no “add quantum computing here” switch 

Before I go too far down the quantum computing road in this post, however, there’s one important lesson to share from Zapata’s work with enterprise and government customers: you can’t just flip a switch and add quantum computing to your data infrastructure. 

When useful quantum computers come online — likely within the next decade — only the organizations that have done the prep work to create the necessary infrastructure will be able to harness that quantum computing power. 

What does this mean? It means that to truly maximize the benefits of quantum computing today, organizations need to move beyond simply exploring quantum computing pilots and proofs of concept. To achieve a quantum advantage, organizations need to start laying the groundwork to operationalize quantum applications. 

The same was once true for classical data science and machine learning. Organizations spent significant resources building AI and machine learning models, but significantly fewer resources operationalizing them. As a result, many models developed for deployment were never actually operationalized. Despite the maturity of AI, Gartner predicts that it will take until the end of 2024 for 75% of organizations to pivot from piloting to operationalizing AI, driving a five-fold increase in streaming data and analytics infrastructures.

The thing is, although quantum computing lags behind AI in maturity, it’s never too early to learn the lessons of AI’s evolution, nor to start operationalizing quantum-enhanced models like Zapata’s proprietary quantum-enhanced optimization (QEO) method.

Three Steps to Start Operationalizing Quantum Computing 
  1. Identify a Clear Use Case

The first step is to identify a clearly defined problem where quantum computing could unlock a new speed-up or performance improvement. To do this, you’ll need to engage data scientists and data consumers across business units to determine where quantum can add value. 

That said, it’s important to note that not every complex problem is best solved by quantum computing. Quantum applications are highly specialized, and most problems can still be best tackled with classical computing. Depending on your enterprise’s experience with quantum, you may need to bring in external experts who can help identify areas where quantum can actually make a difference. Generative models are a promising starting point, but there’s also a lot of potential in optimization and simulation problems. 
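To make “a clearly defined problem” concrete, here is a minimal Python sketch of one of the use cases mentioned above: a toy portfolio selection task cast as a QUBO (quadratic unconstrained binary optimization), a form that both classical heuristics and many quantum or quantum-inspired solvers accept. The numbers, the diagonal covariance, and the brute-force baseline are all illustrative assumptions for the sake of the example, not a production formulation.

```python
# A toy "clearly defined problem": portfolio selection as a QUBO.
# All figures below are illustrative, not real market data.
import itertools
import numpy as np

rng = np.random.default_rng(seed=42)
n_assets = 8
expected_return = rng.uniform(0.01, 0.10, n_assets)       # illustrative returns
covariance = np.diag(rng.uniform(0.001, 0.010, n_assets))  # illustrative risk
risk_aversion = 0.5

# QUBO matrix: maximize return minus risk, i.e., minimize the negative.
Q = risk_aversion * covariance - np.diag(expected_return)

def qubo_energy(bits: np.ndarray) -> float:
    """Objective value for a binary asset-selection vector."""
    return float(bits @ Q @ bits)

# Classical baseline: brute force over all 2^n selections, feasible for a
# toy problem. For larger instances, this is where a quantum or
# quantum-inspired solver would be benchmarked against the classical one.
best = min(
    (np.array(bits) for bits in itertools.product([0, 1], repeat=n_assets)),
    key=qubo_energy,
)
print("selected assets:", np.flatnonzero(best), "objective:", qubo_energy(best))
```

The value of writing the problem down this precisely is that the same objective can be handed to a classical solver and a quantum one, making the benchmark comparison meaningful.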

  2. Create a Workflow

Once you’ve identified your problem, the next step is to build a computational workflow. This goes beyond merely creating a model: you need to pull from real data sources, process the data using mostly classical compute resources, then feed it through one or two powerful quantum steps that can be achieved with today’s quantum devices. This requires organizations to build QuantumOps™ and AIOps infrastructure, particularly for pre- and post-processing of data.

This is important: without an automated workflow for processing data (both before and after it is fed into the quantum algorithm), any efficiency gained from the quantum algorithms will be lost to the auxiliary data management.

As we covered in a recent blog post, organizations need unified workflows to integrate the data sources (cloud-based or on-premises data warehouses and data lakes), classical data processing jobs (ETL jobs, Java frameworks, etc.), the quantum algorithms and other forms of high performance computing (HPC) resources, and the ultimate outputs — dashboards or other visualizations that can be easily interpreted to drive business decision making. 
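As a sketch of what such a unified workflow can look like in code, here is a minimal Python example: classical extract and pre-processing stages wrapped around a single stubbed quantum step, with post-processing shaping the output for a dashboard. The stage names and the run_quantum_step placeholder are hypothetical illustrations, not Orquestra’s actual API.

```python
# A minimal workflow sketch: classical pre/post-processing around one
# quantum step. All stage implementations here are illustrative stubs.
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Stage:
    name: str
    run: Callable[[Any], Any]

def extract_data(_: Any) -> list[float]:
    # Stand-in for an ETL job pulling from a warehouse or data lake.
    return [0.2, 0.8, 0.5, 0.1]

def preprocess(raw: list[float]) -> list[float]:
    # Classical feature scaling before the quantum step.
    peak = max(raw)
    return [x / peak for x in raw]

def run_quantum_step(features: list[float]) -> list[float]:
    # Placeholder for the one or two quantum (or quantum-inspired)
    # subroutines; a real workflow would dispatch to a device backend.
    return sorted(features, reverse=True)

def postprocess(results: list[float]) -> dict:
    # Shape results for a dashboard or downstream decision system.
    return {"top_score": results[0], "n_results": len(results)}

workflow = [
    Stage("extract", extract_data),
    Stage("preprocess", preprocess),
    Stage("quantum", run_quantum_step),
    Stage("postprocess", postprocess),
]

payload: Any = None
for stage in workflow:
    payload = stage.run(payload)
    print(f"{stage.name}: {payload}")
```

The point of the structure is that data movement into and out of the quantum step is automated end to end, so the quantum algorithm’s efficiency isn’t eaten up by hand-managed data plumbing.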

  3. Make Your Workflow Forward-Compatible

While there are certainly areas where quantum devices can add value today, they are far from perfect. The last thing you want is to be locked into a workflow that uses today’s quantum devices without the flexibility to swap in tomorrow’s. Or worse: your competitor launches a quantum workflow with a more powerful quantum device than yours while you’re stuck retooling your workflow to accommodate the new hardware.

That’s why any quantum or quantum-inspired workflow you create should be forward-compatible and extensible to easily incorporate new technology as it matures. With an orchestration platform like Orquestra®, our software platform for building and deploying quantum-ready applications™ at enterprise scale, you can abstract the hardware and plug and play different devices to optimize your application’s performance. 
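Here is one way hardware abstraction can look in code, as a hedged sketch: the workflow programs against a single Backend interface, and devices drop in behind it. The Backend protocol and both backend classes are hypothetical illustrations of the plug-and-play idea, not Orquestra’s real interfaces.

```python
# A sketch of forward compatibility via hardware abstraction. The
# Backend protocol and both backends are hypothetical illustrations.
from typing import Protocol

class Backend(Protocol):
    name: str
    def run_circuit(self, circuit: str, shots: int) -> dict[str, int]: ...

class SimulatorBackend:
    name = "local-simulator"
    def run_circuit(self, circuit: str, shots: int) -> dict[str, int]:
        # Deterministic stand-in for a classical simulation.
        return {"00": shots // 2, "11": shots - shots // 2}

class FutureDeviceBackend:
    name = "next-gen-qpu"
    def run_circuit(self, circuit: str, shots: int) -> dict[str, int]:
        # Tomorrow's device drops in behind the same interface.
        return {"00": shots // 2, "11": shots - shots // 2}

def run_workflow(backend: Backend) -> None:
    counts = backend.run_circuit("bell_pair", shots=1000)
    print(f"{backend.name}: {counts}")

# Swapping devices is a one-line change, not a retooling effort:
run_workflow(SimulatorBackend())
run_workflow(FutureDeviceBackend())
```

The design point is that upgrading to a more powerful device becomes a one-line change at the call site rather than a rebuild of the workflow.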

These steps have been somewhat simplified here for the sake of brevity, but they are the roadmap to producing real results with both near-term quantum devices and the more powerful, fault-tolerant quantum devices of the future. These operational steps are the difference between generating hype and generating real business impact with quantum computing. Or, as anyone in motorsports will tell you, between finishing first and finishing 0.001 seconds behind the leader.

To learn more about building forward-compatible, quantum-enabled workflows, check out Orquestra.io. And if your organization is interested in learning more about any step in the exploring-to-operationalizing journey, contact us.

Author
James Clark
Quantum Software Engineer