Quantum computing of turbulent fluid flows?
Quantum computing changes the paradigms of solving computationally hard problems and might thus pave the way to tackling longstanding challenges in the future, such as numerical simulations of turbulent flows at very large Reynolds numbers. Researchers from TU Ilmenau, in cooperation with colleagues from the United States, have successfully ported two very simple problems to Germany’s first quantum computer, both aimed at modeling classical nonlinear dynamical systems, such as fluid flows, on current noisy quantum devices.
Motivation and first results
Quantum computing has received a lot of attention over the past years, as there are several hard problems in optimization, machine learning, and cryptography for which quantum algorithms have the potential to outperform classical computations. Public interest was consequently large when the first IBM Quantum System One outside the United States of America was launched in Ehningen near Stuttgart this summer under the auspices of the Fraunhofer-Gesellschaft (see Figure 1).
Quantum algorithms follow the laws of quantum physics, which completely change the way computations are designed. The qubit – the quantum analogue of the bit – can not only take the discrete states 0 and 1 like its classical counterpart, but any superposition of both. A collection of n qubits can thus store 2^n complex numbers, which results in an exponential increase of the encoding capability. The information contained in the human genome would thus theoretically fit into 34 qubits [1]. Elementary logical gate operations correspond to unitary transformations of one quantum state into another and therefore have to be reversible, a property that is not required on a classical computer. A further unique property of quantum computing is the entanglement of qubits, which introduces correlations in many-qubit systems – correlations between quantum particles that can even be probed experimentally over macroscopic distances.
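The exponential scaling of the state space can be illustrated with a short back-of-the-envelope calculation. The genome figures below (about 3.1 billion base pairs at 2 bits each) are rough illustrative assumptions, not numbers from the original study:

```python
import math

# An n-qubit register is described by 2**n complex amplitudes.
def amplitudes(n_qubits: int) -> int:
    return 2 ** n_qubits

# Rough assumption: ~3.1 billion base pairs, 2 bits per base pair.
genome_bits = 3_100_000_000 * 2

# Smallest register whose amplitude count reaches the genome's bit count.
n = math.ceil(math.log2(genome_bits))

print(amplitudes(8))  # a NISQ-scale register already holds 256 amplitudes
print(n)              # a genome-scale register needs only on the order of 33-34 qubits
```

Depending on how the classical information is packed into the amplitudes, such estimates land in the low thirties, consistent with the figure of 34 qubits quoted above.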
Inspired by these new opportunities, the Fluid Mechanics Group of Prof. Jörg Schumacher at TU Ilmenau asked whether quantum algorithms can also be applied to classical dynamical systems, such as fluid flows. Such systems typically require tracking a huge number of degrees of freedom, i.e., their dynamical description proceeds in a high-dimensional data space. The research efforts of the group went into two specific directions:
Quantum machine learning algorithms that predict the time evolution of dynamical systems without solving the underlying mathematical equations. More specifically, a classical recurrent neural network algorithm was transformed into a quantum reservoir computing model. In this model, a large random network of neurons – the reservoir – is encoded as a sequence of elementary quantum gates and applied to a quantum state of up to 8 qubits.
Embeddings of classical dynamical systems on a quantum computer via the so-called unitary Koopman operator framework – a joint research effort with colleagues from the United States that opens the perspective of fully data-driven modelling of complex dynamical systems [2].
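For readers unfamiliar with reservoir computing, the classical starting point of the first approach can be sketched in a few lines: a fixed random recurrent network (the reservoir) is driven by the input signal, and only a linear readout is trained; the quantum version replaces the random network by a sequence of quantum gates. The sketch below is purely classical, and all sizes, parameters, and the sine-wave toy data are illustrative assumptions, not the group's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy task: predict the next sample of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u = np.sin(t)

N = 100                                    # reservoir size (assumption)
W_in = rng.uniform(-0.5, 0.5, N)           # fixed random input weights
W = rng.normal(0, 1, (N, N))               # fixed random recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

# Drive the reservoir and collect its internal states.
states = np.zeros((len(u), N))
x = np.zeros(N)
for k in range(len(u) - 1):
    x = np.tanh(W @ x + W_in * u[k])
    states[k] = x

# Train only the linear readout (ridge regression) to predict u[k+1].
X, y = states[100:-1], u[101:]             # discard an initial transient
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)

pred = X @ W_out
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(rmse)  # the trained readout tracks the signal closely
```

The key design point carried over to the quantum variant is that the reservoir itself is never trained; only the cheap linear readout is fitted to data.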
Over the past two months, the researchers from Ilmenau had access to the quantum computer in Ehningen within a 4-week ticket to test their algorithms on the real quantum device. Compared to ideal quantum simulations in a Python software environment, actual computations still pose numerous challenges for the execution of quantum programs. All current universal quantum computers belong to the class of noisy intermediate-scale quantum (NISQ) devices, which implies that qubit states have a short lifetime and will eventually decay due to interactions with the environment – a problem that does not arise on a classical computer. This decoherence typically requires additional error correction routines, which in turn demand more qubits than originally necessary for the algorithm. Consequently, the number of operations that can be successively applied to a quantum state is limited; it is quantified by an empirical measure, the quantum volume.
Despite these limitations, the researchers managed to successfully run their simulations and to compare them with existing classical simulations. Needless to say, the solvable problems are still very small and far away from a turbulent flow. For example, the Lorenz (L63) model is a simple reduced blueprint of a two-dimensional Rayleigh-Bénard convection layer in which the velocity and temperature fields are approximated by 3 (instead of billions of) degrees of freedom, or variables, denoted by X(t), Y(t) and Z(t). Figure 2 shows a convection flow snapshot reconstructed from X, Y and Z at one time instant. Figure 3 shows the predicted dynamics of the three variables, obtained by executing the quantum reservoir computing program on the real quantum computer. It is apparent how noisy these time series are in comparison to the numerically exact solution, which can be obtained with a 20-line program on a laptop.
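The classical reference solution mentioned above really is that short. A laptop-scale sketch, using the standard L63 parameters and a simple fourth-order Runge-Kutta step (the step size and initial condition below are illustrative choices), could look as follows:

```python
import numpy as np

sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0   # standard Lorenz-63 parameters

def rhs(s):
    # The three coupled nonlinear ODEs for X, Y, Z.
    X, Y, Z = s
    return np.array([sigma * (Y - X),
                     X * (rho - Z) - Y,
                     X * Y - beta * Z])

def rk4_step(s, dt):
    # One classical fourth-order Runge-Kutta step.
    k1 = rhs(s)
    k2 = rhs(s + 0.5 * dt * k1)
    k3 = rhs(s + 0.5 * dt * k2)
    k4 = rhs(s + dt * k3)
    return s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

dt, steps = 0.01, 5000
traj = np.empty((steps, 3))
traj[0] = [1.0, 1.0, 1.0]                  # illustrative initial condition
for i in range(1, steps):
    traj[i] = rk4_step(traj[i - 1], dt)

print(traj[-1])  # a point on the Lorenz attractor
```

The columns of `traj` are the time series of X(t), Y(t) and Z(t) against which the noisy quantum predictions in Figure 3 can be compared.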
Despite these challenges and the still modest scale of the first results, the researchers from Ilmenau are optimistic that increasingly complex applications, i.e., nonlinear dynamical systems with a significantly larger number of degrees of freedom, will become processable on future quantum devices that contain many more qubits with significantly reduced error rates.
[1] C. Outeiral et al., The prospects of quantum computing in computational molecular biology, WIREs Comput. Mol. Sci. 11, e1481 (2021).
[2] D. Giannakis, A. Ourmazd, P. Pfeffer, J. Schumacher and J. Slawinska, Embedding of classical dynamics on a quantum computer, arXiv:2012.06097v3 (2021).
Prof. Jörg Schumacher
Fluid Mechanics Group
Technische Universität Ilmenau