The Laboratory for Digitalisation primarily focuses on the intersection of three research areas: Quantum Computing, Systems Engineering, and Software Engineering. Future computing systems will leverage non-classical algorithms, and their hardware and software architectures need to combine the advantages of classical and quantum processing units. Consequently, scientific progress requires interdisciplinary thinking across fields now more than ever. The group seeks cross-cutting answers to highly topical scientific questions and actively participates in transferring results into applications.
Verband der Elektro- und Digitalindustrie, Working Group on Functional Safety (ISO 26262), Software Subgroup
Consolidating multiple systems of different criticality onto a single mixed-criticality platform is an ongoing trend in various embedded industries, driven by the availability of powerful multicore processors. Isolating the different computing domains is the most crucial factor in guaranteeing freedom from interference. In this talk, Ralf Ramsauer presents the current state of Static Hardware Partitioning, a technique that leverages the virtualisation extensions of modern CPUs to strongly isolate different computing domains on SMP platforms. He shows that it is possible to virtualise embedded real-time systems with (almost) zero runtime overhead and software interaction.
International Workshop on Quantum Data Science and Management organised by Wolfgang Mauerer jointly with Sven Groppe (University of Lübeck), Jiaheng Lu (University of Helsinki), and Le Gruenwald (University of Oklahoma) at the 49th International Conference on Very Large Data Bases.
Goals of the Workshop
For most database researchers, quantum computing and quantum machine learning are still new research fields. The goal of this workshop is to bring together academic researchers and industry practitioners from multiple disciplines (e.g., databases, AI, software engineering, and physics) to discuss the challenges, solutions, and applications of quantum computing and quantum machine learning that have the potential to advance the state of the art in data science and data management technologies. Our purpose is to foster interaction between database researchers, the more traditional quantum disciplines, and industrial users. The workshop serves as a forum for the growing quantum computing community to connect with database researchers to discuss the wider questions and applications of how quantum resources can benefit data science and data management tasks, and how quantum software can support this endeavour.
Contribution at the ACM SIGMOD/PODS International Conference on Management of Data by Tobias Winker, Sven Groppe (University of Lübeck), Valter Uotila, Zhengtong Yan, Jiaheng Lu (University of Helsinki), Maja Franz and Wolfgang Mauerer.
In the last few years, the field of quantum computing has experienced remarkable progress. Prototypes of quantum computers already exist and have been made available to users through cloud services (e.g., IBM Q Experience, Google Quantum AI, or Xanadu Quantum Cloud). While fault-tolerant and large-scale quantum computers are not available yet (and may not be for a long time, if ever), the potential of this new technology is undeniable. Quantum algorithms either provably outperform classical approaches for several tasks, or cannot be simulated efficiently by classical means under reasonable complexity-theoretic assumptions. Even imperfect current-day technology is speculated to exhibit computational advantages over classical systems. Recent research uses quantum computers to solve machine learning tasks. Meanwhile, the database community has already successfully applied various machine learning algorithms to data management tasks, so combining the fields seems a promising endeavour. However, quantum machine learning is a new research field for most database researchers. In this tutorial, we provide a fundamental introduction to quantum computing and quantum machine learning, and show potential benefits and applications for database research. In addition, we demonstrate how to apply quantum machine learning to optimising the join order problem for databases.
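To give a flavour of the combinatorial problem the tutorial targets, the toy sketch below (not taken from the tutorial; the relation sizes and the uniform selectivity are invented) enumerates left-deep join orders against a simple cost model:

```python
from itertools import permutations

# Toy cost model for left-deep join trees; relation sizes and the
# uniform join selectivity are invented for illustration.
sizes = {"R": 1000, "S": 200, "T": 5000, "U": 50}
selectivity = 0.01

def cost(order):
    """Sum of intermediate result sizes along a left-deep join order."""
    intermediate = sizes[order[0]]
    total = 0.0
    for relation in order[1:]:
        intermediate = intermediate * sizes[relation] * selectivity
        total += intermediate
    return total

# Exhaustive search over all n! left-deep orders -- feasible only
# for tiny inputs, which is exactly the scalability problem.
best = min(permutations(sizes), key=cost)
```

Since the number of left-deep orders grows factorially with the number of relations, exhaustive enumeration quickly becomes infeasible, which is what motivates heuristic and, in the tutorial's setting, learning-based quantum approaches.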
Contribution to the 26th International Conference on Computing in High Energy & Nuclear Physics (CHEP) by Maja Franz, Pia Zurita (University of Regensburg), Markus Diefenthaler (Jefferson Lab) and Wolfgang Mauerer.
Quantum Computing (QC) is a promising early-stage technology that offers novel approaches to simulation and analysis in nuclear and high-energy physics (NHEP). By basing computations directly on quantum mechanical phenomena, speedups and other advantages for many computationally hard tasks are potentially achievable, although both the theoretical underpinning and the practical realisation are still subject to considerable scientific debate, which raises the question of applicability in NHEP.
In this contribution, we describe the current state of affairs in QC: currently available noisy intermediate-scale quantum (NISQ) computers offer only a very limited number of quantum bits and are subject to considerable imperfections, which narrows their practical computational capabilities. Our recent work on optimisation problems suggests that co-designing quantum hardware and algorithms is one route towards practical utility. This approach offers near-term advantages across a variety of domains, but requires interdisciplinary exchange between communities.
To this end, we identify possible classes of applications in NHEP, ranging from quantum process simulation through event classification directly at the quantum level to optimal real-time control of experiments. These applications are particularly amenable to quantum algorithms that involve variational quantum circuits, but might also benefit from more unusual special-purpose techniques like (Gaussian) boson sampling. We outline challenges and opportunities in the cross-domain cooperation between QC and NHEP, and show routes towards co-designed systems and algorithms. In particular, we aim to further the interdisciplinary exchange of ideas by establishing a joint understanding of requirements, limitations and possibilities.
Planning and control of industrial production: the Quantum Learning Machine Atos QLM38 is being deployed in the BMBF research project TAQO-PAM.
An early Christmas present for the Laboratory for Digitalisation at OTH Regensburg: a quantum simulation system worth one million euros has now been delivered and installed there. "High-tech systems like this are usually found at major institutes such as Forschungszentrum Jülich, the Leibniz Supercomputing Centre in Munich, the European Organization for Nuclear Research (CERN), and in the Munich Quantum Valley," says president Prof. Dr. Ralph Schneider, underlining the significance of the acquisition.
A new professorship under the Hightech Agenda Bayern
Visually, the Christmas present is rather unassuming. Nevertheless, with the Quantum Learning Machine Atos QLM38, OTH Regensburg joins the ranks of high-profile research institutes. This is no coincidence: the Faculty of Computer Science and Mathematics has built up expertise in quantum computing over many years. Most recently, the Free State of Bavaria announced that, under the programme to strengthen quantum professorships within the Hightech Agenda, a new professorship for algorithmics and quantum computing applications will go to OTH Regensburg.
Prof. Dr. Wolfgang Mauerer heads the Laboratory for Digitalisation and is chairman of the board of directors of the Regensburg Center for Artificial Intelligence (RCAI). He has worked on concrete application scenarios of quantum informatics for more than 15 years and is regarded as a distinguished expert in the field. His concern is not mere academic exchange, but above all closing the gap between fundamental research and industrial application.
TAQO-PAM: strong partners from research and industry
This goal is also pursued by the consortium project TAQO-PAM, initiated by Mauerer and funded by the Federal Ministry of Education and Research (BMBF) with a total of 8.2 million euros, of which 2.6 million euros go to OTH Regensburg alone. The partners are BMW (Munich), Siemens (Munich and Karlsruhe), Optware GmbH of Regensburg, Friedrich-Alexander-Universität Erlangen-Nürnberg, and Atos Scientific Computing (Tübingen).
The increasing mass production of individualised goods, and the complex logistics it requires within modern factories, demand solving large-scale optimisation problems in real time. "Classical computers cannot process such problems sufficiently well and fast; even with quantum computers, feasibility is not a given," Mauerer notes. Under his lead, the project will therefore design hybrid quantum-classical special-purpose algorithms that enable the soon-available quantum computers with a few dozen qubits to contribute to solving these problems. This is achieved by integrating tailored quantum processing units (QPUs) into existing scenarios and by extending established methods of factory automation and production planning.
Strengthening Regensburg as a high-tech location
"This is a prime example of the strong application-oriented and future-directed research at our university," president Schneider and Prof. Dr. Frank Herrmann, dean of the Faculty of Computer Science and Mathematics, agree. The million-euro Christmas parcel containing the high-performance computer is emblematic of this. Ralph Schneider also sees it as strengthening Regensburg as a high-tech location: far from every university or industrial research facility can offer the expertise and resources needed to use a Quantum Learning Machine. As a rule, not only newcomers to quantum computing have to rely on expensive purchases of computing time at large data centres.
Link to the press release of OTH Regensburg
© Photos: OTH Regensburg/Michael Hitzek
At the Critical System Summit in Yokohama, Benno Bielmeier and Wolfgang Mauerer presented, in one of six sessions, a semi-formal approach to deriving statements about the runtime behaviour of complex, mixed-criticality systems.
The event took place as part of the Open Source Summit Japan, was hosted by the Linux Foundation and its corporate members (among them AT&T, Cisco, Fujitsu, Google, Hitachi, Huawei, IBM, Intel, Meta, Microsoft, NEC and many others), and attracted more than 600 participants.
The approach links theoretical formalisms with empirically collected data from real-world applications and aims to remain interpretable and tangible. Its idea is to augment a simplified, formal model based on a priori knowledge about the system's intrinsics with empirical information from measurements on real-world scenarios, which then allows us to infer properties of interest for the certification of safety-critical systems.
Tom Krüger has joined the team as doctoral student in the field of quantum computing, contributing to the TAQO-PAM project. Welcome, Tom!
Wolfgang Mauerer, Ralf Ramsauer and Andrej Utz present results of the iDev 4.0 project at the SEMICON Europa 2022 in Munich.
Abstract: The advent of multi-core CPUs in nearly all embedded markets has prompted an architectural trend towards combining safety-critical and uncritical software on single hardware units. We present a Linux-based architecture for mixed-criticality systems that allows for the consolidation of critical and uncritical components onto a single hardware unit.
In the context of the iDev 4.0 project, the use case of this technological building block is to reduce the overall amount of distributed computational hardware components across semiconductor assembly lines in fabs.
CPU virtualisation extensions enable strict and static partitioning of hardware by direct assignment of resources, which allows us to boot additional operating systems or bare-metal applications alongside Linux. The hypervisor Jailhouse is at the core of the architecture and ensures that the resulting domains may serve workloads of different criticality and cannot interfere in unintended ways. This retains Linux's feature-richness in uncritical parts, while frugal safety- and real-time-critical applications execute in isolated domains. Architectural simplicity is a central aspect of our approach and a precondition for reliable implementability and successful certification.
In this work, we present our envisioned base system architecture, and elaborate on the implications of transitioning from existing legacy systems to a consolidated environment.
A contribution to the CORE A*-ranked conference ACM SIGMOD by Manuel Schönberger and Wolfgang Mauerer breaks new ground for quantum computing in the database community (PDF).
Abstract: The prospect of achieving computational speedups by exploiting quantum phenomena makes the use of quantum processing units (QPUs) attractive for many algorithmic database problems. Query optimisation, which concerns problems that typically need to explore large search spaces, seems like an ideal match for known quantum algorithms. We present the first quantum implementation of join ordering, one of the most investigated and fundamental query optimisation problems, based on a reformulation to quadratic unconstrained binary optimisation (QUBO) problems. We empirically characterise our method on two state-of-the-art approaches (gate-based quantum computing and quantum annealing), and identify speed-ups compared to the best known classical join ordering approaches for input sizes that can be processed with current quantum annealers. However, we also confirm that the limits of early-stage technology are quickly reached. Current QPUs are classified as noisy intermediate-scale quantum (NISQ) computers, and are restricted by a variety of limitations that reduce their capabilities compared to ideal future quantum computers, which prevents us from scaling up problem dimensions and reaching practical utility. To overcome these challenges, our formulation accounts for specific QPU properties and limitations, and allows us to trade between achievable solution quality and possible problem size. In contrast to all prior work on quantum computing for query optimisation and database-related challenges, we go beyond currently available QPUs and explicitly target the scalability limitations: using insights gained from numerical simulations and our experimental analysis, we identify key criteria for co-designing QPUs to improve their usefulness for join ordering, and show how even relatively minor physical architectural improvements can result in substantial enhancements. Finally, we outline a path towards practical utility of custom-designed QPUs.
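For readers unfamiliar with the QUBO form that quantum annealers (and gate-based approaches such as QAOA) accept, the following sketch illustrates the general pattern. It encodes a toy objective with a squared penalty constraint and solves it by brute force; all numbers are invented, and the paper's actual join-ordering encoding is far more involved:

```python
from itertools import product

# Minimal QUBO sketch: minimise x^T Q x over binary vectors x.
# Toy objective: "pick exactly two of four items, prefer cheap ones".
costs = [3, 1, 4, 2]
penalty = 10   # weight of the (sum(x) - 2)^2 constraint term
n = len(costs)

# Expand cost(x) + penalty * (sum(x) - 2)^2 into QUBO coefficients;
# the constant offset penalty * 4 is dropped, and x_i^2 == x_i for
# binary variables folds the quadratic self-terms into the diagonal.
Q = [[0.0] * n for _ in range(n)]
for i in range(n):
    Q[i][i] = costs[i] + penalty * (1 - 2 * 2)
    for j in range(i + 1, n):
        Q[i][j] = 2 * penalty

def energy(x):
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(i, n))

# Brute force stands in for the annealer / quantum circuit here.
best = min(product([0, 1], repeat=n), key=energy)
```

The minimum-energy vector selects the two cheapest items, showing how a constrained combinatorial problem becomes an unconstrained binary quadratic form that quantum hardware can sample from.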
Felix Greiwe has joined the group as doctoral student in the field of quantum computing, contributing to the TAQO-PAM project. Welcome, Felix!
Contribution to the highly competitive Open Source Summit (with acceptance rates below 20%) in Yokohama, Japan by Benno Bielmeier and Wolfgang Mauerer.
Abstract: Software for safety-critical systems must meet strict functional and temporal requirements. Since it is impossible to exhaustively test the required qualities, formal verification techniques have been devised. However, these approaches are usually only applicable to small systems, and require software architecture and development to consider verification goals from the ground up. Safety-critical systems face an increasing demand for functionality, and need to handle the associated complexity. While the desired functionalities can be satisfied by embedded Linux, established verification techniques fail for code of such magnitude. We show a semi-formal, model-based approach to derive reliable statements about the run-time characteristics of embedded Linux. Using a priori expert knowledge, we generate an effective, finite-automaton-based description of safety-relevant aspects. The real-world, system-dependent behaviour of the resulting automata, in particular timing statistics for state transitions, is obtained empirically via system instrumentation. We then show how to turn this into (statistical) guarantees on their behaviour, which allows us to draw conclusions that can be used in certifying systems in terms of reliability, latencies, and other real-time properties.
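The general idea can be sketched in a few lines. The states, transitions and latency distribution below are entirely synthetic stand-ins (the talk's actual model and measurements differ): a small automaton over safety-relevant states is instrumented, and the collected samples yield an empirical percentile bound of the kind usable in a statistical argument:

```python
import random
import statistics

# Entirely synthetic sketch: a tiny automaton over safety-relevant
# states, transition latencies "measured" via a stand-in sampler,
# and an empirical 99th-percentile bound derived from the samples.
random.seed(42)

transitions = {("idle", "sense"), ("sense", "compute"), ("compute", "actuate")}
path = ["idle", "sense", "compute", "actuate"]

def measure_latency(src, dst):
    """Stand-in for system instrumentation; returns a latency sample in ms."""
    return random.gauss(1.0, 0.1)

samples = []
for _ in range(1000):
    steps = list(zip(path, path[1:]))
    assert all(step in transitions for step in steps)  # model consistency check
    samples.append(sum(measure_latency(a, b) for a, b in steps))

# Statistical statement of the kind the approach yields:
# "the idle->actuate path completes below `bound` in ~99% of runs".
bound = statistics.quantiles(samples, n=100)[98]
```

The point of the construction is that the bound is tied to an interpretable model of the system rather than to raw end-to-end measurements alone.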
QPU-System Co-Design for Quantum HPC Accelerators (with contributions by Hila Safi and Wolfgang Mauerer) was accepted by the 35th GI/ITG International Conference on the Architecture of Computing Systems (PDF).
Abstract: The use of quantum processing units (QPUs) promises speed-ups for solving computational problems, but the quantum devices currently available possess only a very limited number of qubits and suffer from considerable imperfections. One possibility to progress towards practical utility is to use a co-design approach: Problem formulation and algorithm, but also the physical QPU properties are tailored to the specific application. Since QPUs will likely be used as accelerators for classical computers, details of systemic integration into existing architectures are another lever to influence and improve the practical utility of QPUs.
In this work, we investigate the influence of different parameters on the runtime of quantum programs on tailored hybrid CPU-QPU systems. We study the influence of communication times between CPU and QPU, how adapting QPU designs influences quantum and overall execution performance, and how these factors interact. Using a simple model that allows us to estimate which design choices should be subjected to optimisation for a given task, we provide the HPC community with an intuition of the potential and limitations of co-design approaches. We also discuss physical limitations to implementing the proposed changes on real quantum hardware devices.
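A cost model of the kind the abstract alludes to can be sketched in a few lines; the function and all parameter values below are illustrative assumptions, not the paper's calibrated model or figures:

```python
# Illustrative runtime model for a hybrid CPU-QPU system. Each
# optimisation iteration pays a classical parameter update (t_cpu),
# one CPU<->QPU round trip (2 * t_comm), and `shots` circuit
# executions (shots * t_qpu). Times are in seconds.
def hybrid_runtime(iterations, t_cpu, t_qpu, t_comm, shots):
    return iterations * (t_cpu + 2 * t_comm + shots * t_qpu)

# With the assumed numbers below, communication dominates the slow-link
# case, so reducing round-trip latency helps more than a faster QPU.
slow_link = hybrid_runtime(100, t_cpu=1e-3, t_qpu=1e-5, t_comm=5e-2, shots=1000)
fast_link = hybrid_runtime(100, t_cpu=1e-3, t_qpu=1e-5, t_comm=1e-4, shots=1000)
```

Even such a crude model makes the co-design trade-off visible: which term dominates the sum tells you which component (link, QPU, or classical optimiser) is worth improving first.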
The paper is joint work with Siemens Technology, and was performed within the BMBF-sponsored project TAQO-PAM. A reproduction package allows independent researchers to confirm our results.
Uncovering Instabilities in Variational-Quantum Deep Q-Networks (with contributions by Maja Franz, Lucas Wolf and Wolfgang Mauerer) was accepted by the Journal of the Franklin Institute (impact factor: 4.25).
Abstract: Deep Reinforcement Learning (RL) has considerably advanced over the past decade. At the same time, state-of-the-art RL algorithms require a large computational budget in terms of training time to converge. Recent work has started to approach this problem through the lens of quantum computing, which promises theoretical speed-ups for several traditionally hard tasks. In this work, we examine a class of hybrid quantum-classical RL algorithms that we collectively refer to as variational quantum deep Q-networks (VQ-DQN). We show that VQ-DQN approaches are subject to instabilities that cause the learned policy to diverge, study the extent to which this afflicts the reproducibility of established results based on classical simulation, and perform systematic experiments to identify potential explanations for the observed instabilities. Additionally, and in contrast to most existing work on quantum reinforcement learning, we execute RL algorithms on an actual quantum processing unit (an IBM Quantum Device) and investigate differences in behaviour between simulated and physical quantum systems that suffer from implementation deficiencies. Our experiments show that, contrary to claims in the literature, it cannot be conclusively decided whether known quantum approaches, even if simulated without physical imperfections, provide an advantage compared to classical approaches. Finally, we provide a robust, universal and well-tested implementation of VQ-DQN as a reproducible testbed for future experiments.
The paper is joint work with Fraunhofer IIS, and arose from the BMBF-sponsored project QLindA. Of course, the publication is accompanied by an extensive reproduction package that allows independent researchers to confirm our results.
OTH Regensburg is consortium leader in the EUR 3 million lighthouse project Q-Stream
A proposal for a quantum lighthouse project in the Munich Quantum Valley, led by OTH Regensburg, has been selected for funding following the vote of an international expert commission. The project Q-Stream: Quantum-Accelerated Data Stream Analytics will address the construction of hybrid quantum-classical special-purpose hardware specialised in the analysis of data streams.
Using a hardware-software co-design approach, the project aims to find applications for the currently available generation of quantum computers, which, owing to technical imperfections, still exhibit numerous shortcomings and disturbances that prevent their potentially enormous computational power from fully unfolding.
Of the total funding of 2.98 million euros, around 750,000 EUR go to OTH Regensburg, which additionally contributes a position from the Hightech Agenda to the project. The Laboratory for Digitalisation will focus its work on the conceptual design of problem-specifically adapted special-purpose computers. Further quantum expertise is contributed by the Fraunhofer Institute for Integrated Circuits IIS (transpilation and decomposition of quantum circuits) and by Technische Hochschule Deggendorf (advance simulation of future quantum computers on classical HPC systems).
That OTH Regensburg made it onto the list of funded institutions alongside six Bavarian universities, the Max Planck and Fraunhofer societies, and the Bavarian Academy of Sciences and Humanities also appears to confirm the long-standing quantum activities in research and teaching of Prof. Dr. Wolfgang Mauerer.
Information from the Bavarian State Ministry of Science and the Arts, from which the laboratory gratefully receives the funding, can be found in a press release.
Static Hardware Partitioning on RISC-V - Shortcomings, Limitations, and Prospects was accepted by the IEEE IoT Forum Special Session: Virtualization for IoT Devices 2022.
Abstract: On embedded processors that are increasingly equipped with multiple CPU cores, static hardware partitioning is an established means of consolidating and isolating workloads onto single chips. This architectural pattern is suitable for mixed-criticality workloads that need to satisfy both real-time and safety requirements, given suitable hardware properties.
In this work, we focus on exploiting contemporary virtualisation mechanisms to achieve freedom from interference, that is, isolation between workloads. Possibilities to achieve temporal and spatial isolation, while maintaining real-time capabilities, include statically partitioning resources, avoiding the sharing of devices, and ascertaining zero interventions of superordinate control structures.
This eliminates overhead due to hardware partitioning, but implies certain hardware capabilities that are not yet fully implemented in contemporary standard systems. To address such hardware limitations, the customisable and configurable RISC-V instruction set architecture offers the possibility of swift, unrestricted modifications.
We present findings on the current RISC-V specification and its implementations that necessitate interventions of superordinate control structures. We identify numerous issues adverse to our goal of achieving zero interventions and thus zero overhead, both at the design level and especially with regard to handling interrupts. Based on micro-benchmark measurements, we discuss the implications of our findings, and argue how they can provide a basis for future extensions and improvements of the RISC-V architecture.
Ralf Ramsauer, Stefan Huber and Wolfgang Mauerer will discuss Zero-Overhead Virtualisation: It's a Trap! at the Embedded Linux Conference in Dublin.
Abstract: Embedded processors are increasingly equipped with powerful CPU cores. For mixed-criticality scenarios with multiple independent real-time appliances, this allows us to consolidate formerly distributed systems. This requires the absence of unintended interaction between different computing domains, which can be achieved by exploiting the virtualisation extensions of modern CPUs. Though providing strong isolation guarantees, virtualisation comes with an overhead that may endanger global real-time properties of the system. The statically partitioning, Linux-based hypervisor Jailhouse addresses this challenge and strives for zero-overhead virtualisation, which maintains the real-time capabilities of the platform by design. However, limitations of current architectures counter our architectural design goal of eliminating virtualisation overheads. In this talk, we extract architecture-independent common requirements on contemporary platforms to enable zero-trap virtualisation. We explore and compare the ARM, x86, and RISC-V architectures, and inspect their architectural limitations for embedded zero-overhead virtualisation. We present common pitfalls and barriers of those platforms: issues that have already been addressed, that are currently being fixed, and that still need to be addressed in the future.
The Bavarian Research Alliance has approved guest researcher stays at the Tokyo University of Science. Prof. Dr. Wolfgang Mauerer will contribute his systems expertise to a project on the statistical analysis of real-time guarantees.
PhD student Manuel Schönberger took second place in the graduate track of this year's Student Research Competition at the CORE A* SIGMOD conference in Philadelphia!
The Student Research Competition takes place annually at various ACM conferences, including SIGMOD. In the first round, students submit an extended abstract about their research. Based on the quality of their submission, a select few students from universities around the globe, including Columbia University, University of Illinois at Urbana-Champaign, Hasso-Plattner-Institut and TUM, were invited to present their research posters at the SIGMOD conference. Three students were selected for the third round, where they gave a more detailed presentation on their research. In the graduate category, Manuel took second place, competing against Alex Yao and Sughosh Kaushik, both from Columbia University, who took first and third place respectively.
In his research, Manuel analyses the applicability of quantum computing to database query processing. The research goes beyond merely mapping problems onto quantum hardware, and also addresses the co-design of future quantum systems so that they become tailor-made for database problems. Congrats, Manuel, on achieving this international recognition!
Quantum technologies are on the verge of breaking out of their ivory tower existence and entering the general marketplace.
At the premiere of the World of Quantum, researchers and industry presented the latest findings on potential quantum applications and quantum hardware at the Laser World of Photonics in Munich. Research master's student Maja Franz and others visited the fair and explored the new platform for quantum technologies.
Exhibitors from industry and manufacturers of quantum computers gave a broad overview of current quantum technology; for instance, IBM Quantum let visitors see inside its quantum computer via augmented reality. Researchers in the field of quantum computing, such as those from Fraunhofer IKS, offered a lively exchange on hybrid quantum-classical algorithms.
Thanks to sponsors such as the Bayerisches Staatsministerium für Digitales and other partners, the World of Quantum was a success and an interesting experience.
State Minister announces extension of KI-Transfer Plus project headed by Wolfgang Mauerer
In the state-funded project "KI-Transfer Plus", AI regional centres such as the Regensburg Center for Artificial Intelligence (RCAI) support SMEs in getting started with artificial intelligence. At the closing event, Bavarian digital minister Judith Gerlach reviewed the results of the first project round. The host of the event, Horsch Maschinen GmbH from Schwandorf, showed how artificial intelligence enhances its own agricultural machinery: Horsch developed an algorithm to recognise plants and their centre points, which is important for autonomous driving in the field as well as for automated weed removal. The other five project participants from Upper Bavaria and the Upper Palatinate also presented innovative AI developments in a wide range of domains. Pleased with the results, and as a consequence of the programme's success, Gerlach announced an extension of the project for another year to prepare the Bavarian economy for the key technologies of the future. See a summary video of the impressive work engineers Nicole Höß and Matthias Melzer did together with our industry partners!