Bleximo builds its competitive advantage with an application-specific approach

The key to achieving ‘quantum advantage’ lies in the co-design of algorithms and hardware for a new generation of superconducting quantum computers

Full-stack quantum computing: Bleximo’s holistic approach to product development covers everything from the fundamental physics of superconductors to the software architecture of a deployed quantum computing system. (Courtesy: Bleximo)

California-based start-up Bleximo is betting that its application-specific approach to quantum computing is more efficient – indeed transformative – in addressing highly complex practical problems across a range of industries, from global logistics and aerospace to pharmaceuticals, advanced materials, and energy production and distribution. The company, whose tagline is “powering innovation through quantum computing”, has been developing full-stack, superconducting, application-specific computing systems since 2018, working with high-profile R&D partners such as the University of California, Berkeley, and Lawrence Berkeley National Laboratory, as well as other companies in the quantum computing ecosystem.

Strategic differentiation comes from Bleximo’s co-design methodology, in which algorithm development and hardware design inform each other, with the focus on boosting a particular algorithm’s execution speed. The underlying holistic approach to product development, covering everything from the fundamental physics of superconductors to the software architecture of a deployed quantum computing system, integrates algorithms, software and hardware into one platform.

That unified development strategy is similar for all use cases, though each computing system may require changes and enhancements specific to a given application. As such, it is fundamental to Bleximo’s business model to collaborate closely with customers and R&D partners – whether government agencies, national laboratories, academic institutions or other technology companies – to ensure a granular understanding of their respective workflow pain-points and downstream computing requirements.

Co-design and collaboration

Fabio Sanches is head of quantum engineering at Bleximo, working with customers to develop targeted quantum algorithms that address their difficult computational problems while building a software framework for algorithm and hardware co-design. The raison d’être for application-specific quantum computers, says Sanches, is to address a challenge the customer faces from a high-performance computing standpoint. “The customer will typically have a problem that takes a lot of computing power and a considerable amount of time – even with access to leading-edge computing resources,” he explains. “These problems are the best candidates for studying the efficacy and upside of quantum computing solutions.”

Fabio Sanches

Starting with a specific class of practical problems – for example, supply-chain optimization or the pricing of financial products – the task for Sanches and colleagues is to work out which quantum algorithms make sense for the problem in question. The end-game of shorter execution times hinges on tight interworking between Bleximo’s algorithms team and the hardware engineering function.
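As a rough illustration of that first step, the toy sketch below encodes a small max-cut instance – a stand-in for a supply-chain-style optimization problem – as the kind of cost function a quantum algorithm would target. The graph and the brute-force check are hypothetical and purely illustrative; they are not Bleximo’s tooling.

```python
# Illustrative only: encode a toy max-cut instance as the cost function a
# quantum optimization algorithm (e.g. QAOA) would target.
# Hypothetical example; not Bleximo's actual tooling.
import itertools

# Toy graph: edges of a 4-node network (invented data), one qubit per node
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n_qubits = 4

def cut_value(bits, edges):
    """Number of edges cut by a partition -- the quantity a cost
    Hamiltonian of the form sum over edges (1 - Z_i Z_j)/2 measures."""
    return sum(1 for i, j in edges if bits[i] != bits[j])

# Brute-forcing the 2**n assignments is feasible only for tiny n, which is
# exactly why large instances motivate quantum approaches.
best = max(itertools.product([0, 1], repeat=n_qubits),
           key=lambda b: cut_value(b, edges))
print("best partition:", best, "cut value:", cut_value(best, edges))
```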

“This is the back-and-forth where we add value for our customers,” notes Sanches. “Ultimately, because quantum computers are best suited for certain problems, we think it makes sense to build systems wholly tailored to those tasks by co-developing quantum algorithms, quantum processors and supporting hardware that are tuned for these specific applications.”

Bringing down the cost and complexity

The application-specific approach, in turn, means that Bleximo can eliminate much of the hardware overhead associated with quantum computing systems, driving down upfront investment. “Our mantra is that lower complexity translates into higher reliability and lower capital and operational costs,” explains Chiara Pelletti, the company’s director of hardware engineering. At the system sharp-end, this means qubits and couplers – as well as processor components, microwave controllers and other hardware building blocks – are added only on a “must-have” rather than “nice-to-have” basis.

“The objective is to figure out the best architecture for an application-specific quantum processor – how many qubits of different types are required and how they connect to each other – and to engineer the hardware to ensure it can run specific gate operations efficiently,” adds Pelletti. Put another way: design the simplest architecture possible in hardware to provide the necessary capabilities for the software, while also building a platform with the potential to scale to a larger number of qubits over time.
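Pelletti’s architecture question can be made concrete with a toy sketch: given a candidate coupler layout and the two-qubit gates a target algorithm needs, how much routing (SWAP) overhead does the connectivity impose? The layout, gate list and cost model below are invented assumptions for illustration, not Bleximo’s design data.

```python
# Illustrative co-design check: does a candidate coupler layout natively
# support an algorithm's two-qubit gates, and what SWAP overhead do
# missing couplings imply? Hypothetical layout and gate list.
from collections import deque

coupling = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}  # candidate chip graph
required_pairs = [(0, 3), (1, 2), (0, 1)]  # two-qubit gates the algorithm needs

def distance(graph, src, dst):
    """Shortest-path length between two qubits (breadth-first search)."""
    seen, frontier, dist = {src}, deque([src]), {src: 0}
    while frontier:
        q = frontier.popleft()
        for nb in graph[q]:
            if nb not in seen:
                seen.add(nb)
                dist[nb] = dist[q] + 1
                frontier.append(nb)
    return dist[dst]

# Each unit of extra distance costs roughly one SWAP, and every SWAP burns
# fidelity, so a layout that makes all required pairs adjacent is preferred.
for a, b in required_pairs:
    hops = distance(coupling, a, b)
    print(f"gate ({a},{b}): distance {hops}, ~{max(0, hops - 1)} SWAP(s)")
```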

Chiara Pelletti

To streamline the design of those application-specific superconducting quantum processors, Pelletti and her team have developed a software tool for quantum chip optimization – in effect, automated chip layout for the optimal placement of all processor components. “The workflow includes fine-tuning progressively larger processor areas with the goal of meeting specifications and improving the coherence time, which is the time a qubit can basically ‘stay alive’,” she adds. “Engineering couplings to speed up gate and readout processes while reducing noise levels allows us to increase the number of operations that can be executed while the processor remains coherent.”
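A drastically simplified flavour of that optimization is sketched below: choosing frequencies for coupled qubits so that neighbours stay detuned, one of the collision and cross-talk constraints a real chip-optimization tool must juggle. All parameters are invented for illustration; this is not Bleximo’s software.

```python
# Highly simplified stand-in for automated chip optimization: assign qubit
# frequencies so that coupled qubits stay detuned from one another.
# Hypothetical parameters; not Bleximo's actual tool.
import random

coupling = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
choices_ghz = [4.8, 5.0, 5.2, 5.4]   # candidate qubit frequencies (GHz)
min_detuning = 0.15                  # required neighbour spacing (GHz)

def collisions(freqs):
    """Count coupled pairs whose detuning falls below the threshold."""
    return sum(1 for q, nbs in coupling.items() for nb in nbs
               if q < nb and abs(freqs[q] - freqs[nb]) < min_detuning)

# Random restarts stand in for the iterative fine-tuning of processor areas.
random.seed(0)
best = min(({q: random.choice(choices_ghz) for q in coupling}
            for _ in range(200)), key=collisions)
print("frequency assignment:", best, "collisions:", collisions(best))
```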

Zooming in, the core technical drivers for Pelletti and her team are eliminating any interference that might affect qubit operation; engineering the platform to reduce classical cross-talk; and introducing ad hoc filtering devices on the chip itself to protect the qubits from decoherence. Careful choices around packaging technology are also essential to maximize coherence, while a data-driven approach to chip design underpins everything, including experimental verification (and iteration) of key performance metrics and their dependence on microfabrication processes, chip packaging, cryogenic operating conditions and the control electronics.

It’s all about the people

Technology differentiators notwithstanding, the Bleximo value proposition is all about its people – or more precisely, the alignment of the team’s collective domain knowledge and expertise with the fast-moving requirements of the nascent quantum computing industry. “On the hardware and software side, our strength lies in having a talented mix of engineers – focused on operational execution and issues like scalability, manufacturability and reproducibility – working alongside scientists geared up for ground-breaking applied research,” explains Sanches.

Bleximo, for its part, is also unique in having a team of mechanical engineers with specialist know-how in cryogenic science and technology – specifically, the dilution refrigerator subsystems and specialized wiring needed to achieve ultralow-temperature operation of the superconducting quantum processors. “We can take the physics of the whole quantum computing system into account because of our in-house expertise across a range of disciplines,” adds Sanches. “In this way, we’re creating solutions to address the fundamental problems with this emerging technology – solutions that will ultimately take computing power into uncharted territory.”

More broadly, argues Pelletti, quantum computing represents a compelling career pathway for talented graduate and postgraduate students in the physical sciences and engineering, especially those with an interest in continuous problem-solving at the interface between cutting-edge physics and technology development. “Joining a start-up is a great way to gain exposure to all the core disciplines in the field – processor design, testing, cryogenics and algorithm development – so you can figure out where your preference lies,” she concludes.

Making quantum computers practical

Alexei Marchenkov

Alexei Marchenkov is founder and chief executive officer of Bleximo. Here he gives Physics World the headline take on the start-up’s commercial and technology roadmap.

What sorts of customers and R&D partners are you looking to engage?

Bleximo’s focus on application-specific systems gives us a unique advantage, especially with organizations that already have an established quantum programme. We’re keen to work with teams that have a nuanced understanding of what their computational problems are, as well as what they are targeting in terms of a quantum computing solution to a given problem or pain-point in their workflow.

In this scenario, we can sit down immediately with the customer to start designing a quantum computing system to meet their requirements, working closely with them all the way through to deployment of the quantum modules. From our perspective, a pharmaceutical scientist, for example, whose main role is to design new blockbuster drugs, doesn’t need to know how to program a quantum computer. Rather, we will integrate quantum computing into their day-to-day operations.

How are you leveraging the emerging industry ecosystem in quantum science and technology?

Alongside our existing customer and partner collaborations, one planned initiative is to bring more academic and government researchers as well as quantum start-ups into the mix. We currently have several ongoing projects with research institutions and companies developing products for quantum computing, sensing and communications. These partnerships accelerate our own platform development while generating revenue.

What does success look like in the near term and over the medium term?

Right now, the priority for Bleximo is continuous product improvement and technology innovation, developing the architectures, algorithms and compilation methods that will enhance the performance of our superconducting quantum processors.

Over the next 12 months, we’ll deploy processors targeting several specific algorithms – at the level of eight to 16 qubits – and initiate beta-testing with a network of strategic partners. The context here is that connectivity, fidelity and coherence times are frequently a bottleneck for developers, much more so than a large number of qubits. Our partners will be able to run their software on our backend and compare its performance against other backends.
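A back-of-envelope sketch, using assumed and purely illustrative numbers, shows why coherence time and gate fidelity, rather than raw qubit count, so often set the practical limit for developers:

```python
# Assumed, illustrative numbers -- not measured Bleximo specifications.
t_coherence_us = 100      # qubit coherence time (microseconds)
t_gate_ns = 50            # two-qubit gate duration (nanoseconds)
gate_fidelity = 0.995     # two-qubit gate fidelity

# How many sequential gates fit inside the coherence window?
depth_budget = int(t_coherence_us * 1000 / t_gate_ns)

# Probability that a 100-gate circuit runs without a single gate error:
p_success = gate_fidelity ** 100
print(f"gate-depth budget ~{depth_budget}; 100-gate success ~{p_success:.2f}")
```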

Three years down the line, we plan to scale up to 1000-qubit processors. With this in mind, we’re working on a hybrid processor architecture that combines superconducting and photonic technologies, which we believe will unlock much cheaper and more reliable control of superconducting qubits on a high-density chip – up to 1000 qubits on a 6-inch wafer.
