Why Any Quantum Advantage Depends on Unified Workflows

In the 21st century, enterprises depend on data to drive their decisions and fuel their growth. But to unlock value from that data, enterprises need a universe of complex, interlocking software systems: pipelines to transport data, ETL processes to clean it, data warehouses and lakes to store it, machine learning and SQL tools to analyze it, dashboards to present it, and security tools to protect it.

(For reference, Andreessen Horowitz outlined the common data infrastructure architectures and component vendors in a detailed blog post last year.) 

Inevitably, however, relying on a multitude of technology vendors for each data infrastructure component results in fragmented architectures that vary wildly from organization to organization. This is a problem for enterprises: the more fragmented the architecture, the more resources it takes to operate. What's more, with so many components to mix and match, chances are slim that any two enterprises will have the exact same architecture, making it harder to find talent or build internal expertise to manage it all.

Enter Quantum Computing 

The advent of quantum computing will only exacerbate this fragmentation problem. As quantum computing becomes more accessible, enterprises will need to integrate quantum hardware and software alongside their HPC clusters, ETL processes, databases, S3 buckets, and Java frameworks, to name just a few of the key components.

The research is growing increasingly clear that quantum computing has the potential to give forward-thinking enterprises an industry-leading advantage, and soon. But any quantum advantage depends on quantum components being seamlessly integrated with the rest of the technology stack. Otherwise, any efficiency gains from quantum computing could be lost to the fragmentation inefficiencies surrounding it. This is why unified workflows are so important to the success of quantum-enabled applications, not to mention to reducing the cost of managing all this complexity.

From Fragmented to Unified Workflows 

Quantum computing will always exist in hybrid architectures combining classical and quantum hardware and software. Making the two work seamlessly together requires unified workflows. The workflow’s job is to orchestrate and unify all the components — quantum and classical — to make efficient use of them in production. 

Workflows abstract away the complexities of operating quantum algorithms by codifying each step into containerized tasks that can run in parallel or serially. In other words, workflows automate the management and execution of SDKs and programming languages (each in its own environment) to support larger quantum solutions and experiments. Workflows are also highly flexible, allowing users to easily switch between libraries and backends as more powerful or useful options become available.
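To make this pattern concrete, here is a minimal sketch in Python. Everything in it is illustrative: QuantumBackend, SimulatorBackend, and the task functions are hypothetical stand-ins for the real tooling, not Orquestra's API or any vendor's SDK. The point is simply that classical and quantum steps become interchangeable tasks behind a common interface.

```python
from typing import Protocol

class QuantumBackend(Protocol):
    """Anything that can execute a circuit: real devices and simulators alike."""
    def run(self, circuit: str, shots: int) -> dict: ...

class SimulatorBackend:
    """Stand-in for a classical simulator; a real QPU client would slot in here."""
    def run(self, circuit: str, shots: int) -> dict:
        return {"00": shots}  # dummy measurement counts, for illustration only

def prepare_data(raw: list[float]) -> list[float]:
    """Classical task: normalize the input (think of an ETL stage)."""
    peak = max(raw)
    return [x / peak for x in raw]

def quantum_step(features: list[float], backend: QuantumBackend) -> dict:
    """Quantum task: encode features into a circuit and run it on whichever
    backend the workflow was configured with."""
    circuit = f"ansatz({features})"  # placeholder circuit description
    return backend.run(circuit, shots=1000)

def postprocess(counts: dict) -> float:
    """Classical task: turn raw counts into a result for downstream tools."""
    return counts.get("00", 0) / sum(counts.values())

def workflow(raw: list[float], backend: QuantumBackend) -> float:
    """Orchestrate the tasks serially; a real engine would containerize each
    task and run independent ones in parallel."""
    return postprocess(quantum_step(prepare_data(raw), backend))

print(workflow([1.0, 2.0, 4.0], backend=SimulatorBackend()))
```

Because each task only depends on the interface of the task before it, swapping a library or backend touches one task, not the whole pipeline.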

Within the next decade, we can expect the arrival of fault-tolerant quantum devices that reliably outperform their classical counterparts in complex, commercially valuable problems. But even the most game-changing quantum algorithms will not deliver a computational speed-up if they’re slowed down by fragmented, poorly constructed and unwieldy workflows. 

The time to start building workflows is now 

Building unified quantum workflows today is one of the most important things enterprises can do to prepare for the quantum future. Even if today's noisy intermediate-scale quantum (NISQ) devices can't yet deliver a quantum advantage, enterprises can still begin building unified workflows around them. That way, when more powerful quantum computers become available, they can be swapped in for the NISQ devices, giving enterprises a first-mover advantage over competitors.
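Continuing the sketch from above (again with hypothetical names, not real device identifiers), the swap can be as small as a configuration change: because every backend satisfies the same interface, upgrading hardware never requires rewriting the workflow itself.

```python
# Backends are registered behind one shared interface, so the workflow code
# never changes when better hardware arrives.
BACKENDS = {
    "nisq-simulator": SimulatorBackend,     # what we can run today
    # "fault-tolerant-qpu": FutureBackend,  # drop-in replacement later
}

def build_backend(name: str) -> QuantumBackend:
    return BACKENDS[name]()

# Swapping hardware becomes a one-line config change; the pipeline is untouched.
result = workflow([1.0, 2.0, 4.0], backend=build_backend("nisq-simulator"))
```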

At Zapata, we’ve found workflows to be an excellent solution to many of the problems we encountered with existing techniques for running quantum experiments. It’s why we now use unified workflows to power our own and our customers’ experiments. Further, it’s why we built Orquestra®, our software platform for building and deploying quantum-ready applications™ at enterprise scale, to help forward-thinking organizations build their own workflows.  

Parting advice 

Not much about quantum computing is simple, but my advice about getting started now with workflows is! 

  1. Engage data scientists and data consumers within business units to identify an AI/ML problem where quantum computing could unlock a new speed-up or performance improvement 
  2. Create a real computational workflow with real data sources that leverages mostly classical compute, with one or two powerful quantum steps that can be achieved with today's NISQ quantum devices 
  3. Deploy a quantum or quantum-inspired workflow that is forward-compatible, so its quantum-inspired steps can be swapped out for real quantum hardware as it matures (a minimal sketch of this pattern follows below) 
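Here is a hedged sketch of that third step. The names are illustrative placeholders, not a real API: one workflow step, two interchangeable implementations, with a quantum-inspired classical routine doing the work today and a quantum path reserved for when the hardware is ready.

```python
import random

def quantum_inspired_minimize(cost, candidates, iters=200, seed=0):
    """Toy random-search optimizer standing in for a quantum-inspired method."""
    rng = random.Random(seed)
    best = rng.choice(candidates)
    for _ in range(iters):
        trial = rng.choice(candidates)
        if cost(trial) < cost(best):
            best = trial
    return best

def quantum_minimize(cost, candidates):
    """Placeholder for a future run on real quantum hardware
    (e.g. a variational optimization on a QPU)."""
    raise NotImplementedError("swap in a quantum backend here")

def minimize_step(cost, candidates, use_quantum=False):
    """One workflow step, two interchangeable implementations."""
    impl = quantum_minimize if use_quantum else quantum_inspired_minimize
    return impl(cost, candidates)

# Runs the quantum-inspired path today; flip use_quantum=True once the
# hardware is ready, and nothing else in the pipeline changes.
print(minimize_step(lambda x: (x - 3) ** 2, candidates=list(range(10))))
```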

Learn more about building quantum-enabled workflows with Orquestra® at Orquestra.io, and about workflows that incorporate quantum computing in this blog post. 

 

Author
Timothy Hirzel, Director of Product