Accelerated Data Analytics and Computing Institute
- ADAC is an international consortium dedicated to advancing high-performance computing and hybrid data analytics, integrating AI and quantum technologies.
- It coordinates strategic initiatives through focused working groups, fostering collaboration and best practices for integrating classical and quantum computing.
- ADAC drives innovation by developing interoperable middleware, advancing simulation techniques, and standardizing hybrid orchestration for scientific discovery.
The Accelerated Data Analytics and Computing Institute (ADAC) is an international consortium of high-performance computing (HPC) centers and research organizations dedicated to advancing the science and engineering of accelerated data analytics and computing. Its core mission is to catalyze the development and integration of advanced computational, analytical, and AI-driven methods—often leveraging hybrid and heterogeneous architectures—to address the scale, complexity, and multidisciplinary nature of modern scientific and industrial data challenges. ADAC coordinates community-wide strategic initiatives, fosters best-practice sharing and technology transfer, and leads efforts to integrate emerging paradigms such as quantum computing into the broader HPC ecosystem.
1. Institutional Mission and Strategic Initiatives
ADAC’s principal objective is to accelerate scientific discovery and technological innovation through the integration of cutting-edge data analytics, AI, and advanced computational resources. The Institute’s agenda is shaped by a recognition that future research computing platforms will be inherently hybrid, combining traditional HPC, GPU-accelerated and FPGA-based systems, and, increasingly, quantum computing resources.
A central mechanism for advancing collaboration and innovation within ADAC is the formation of focused Working Groups. For instance, the Quantum Computing Working Group, established at the 12th ADAC workshop in 2023, is tasked with overseeing community strategy, survey data collection, and knowledge synthesis regarding quantum–classical integration (Buchs et al., 15 Aug 2025). These working groups target co-design and optimization challenges across multiple computational paradigms, establish benchmarks and best practices for hybrid environments, and coordinate the development of interoperable software infrastructure for distributed and accelerated analytics.
ADAC’s strategy encompasses (a) preparing existing simulation and analytics applications for next-generation heterogeneous architectures, (b) catalyzing the adoption and adaptation of quantum acceleration methodologies, and (c) supporting the development of new algorithms and workflows capable of utilizing both classical and quantum resources.
2. Integration of Quantum Computing with HPC
ADAC recognizes quantum computing (QC) not as a replacement but as a complement to modern HPC. The expected future is hybrid: quantum devices will serve as accelerators for specialized computational kernels, while classical HPC infrastructure remains critical for integrating and orchestrating the overall workflow (Buchs et al., 15 Aug 2025).
Two primary integration models are being explored:
- Loose integration: QPUs are accessed remotely (often as cloud resources), with classical HPC nodes acting as clients. This approach is presently the most common, given the physical and technological constraints of contemporary quantum hardware.
- Tight integration: Classical and quantum hardware are physically co-located, enabling low-latency communication and efficient hybrid scheduling. This requires bespoke infrastructure to accommodate cryogenic and noise-sensitive quantum devices.
Execution models under investigation include:
- Stateless client–server APIs, in which quantum circuits are generated classically and dispatched as jobs to quantum hardware via vendor-specific APIs.
- Accelerator models where QPUs are addressed similarly to GPUs in modern systems, permitting tighter integration, reduced communication latency, and unified scheduling.
- Emerging quantum networking scenarios, which may eventually facilitate distributed quantum–classical workflows and entanglement-based protocols.
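The stateless client–server model above can be sketched in miniature. The sketch below is illustrative only: the job format, method names, and the in-process "service" class are assumptions standing in for a vendor-specific API, not any real provider's interface.

```python
import json
import random

class MockQuantumService:
    """In-process stand-in for a remote QPU job endpoint (illustrative)."""

    def __init__(self, seed=0):
        self._jobs = {}                  # job_id -> deserialized circuit
        self._rng = random.Random(seed)
        self._next_id = 0

    def submit(self, circuit_json):
        """Accept a serialized circuit and return a job handle (stateless API)."""
        job_id = f"job-{self._next_id}"
        self._next_id += 1
        self._jobs[job_id] = json.loads(circuit_json)
        return job_id

    def result(self, job_id):
        """Return mock measurement counts for the submitted circuit."""
        circuit = self._jobs[job_id]
        shots = circuit.get("shots", 100)
        n = circuit["num_qubits"]
        counts = {}
        for _ in range(shots):
            bitstring = "".join(self._rng.choice("01") for _ in range(n))
            counts[bitstring] = counts.get(bitstring, 0) + 1
        return counts

# Classical HPC node acting as a client: generate a circuit classically,
# serialize it, dispatch it as a job, then retrieve the result.
service = MockQuantumService()
circuit = {"num_qubits": 2, "gates": [["h", 0], ["cx", 0, 1]], "shots": 100}
job_id = service.submit(json.dumps(circuit))
counts = service.result(job_id)
```

The key property of this model is that each request is self-contained: the classical side owns all workflow state, which is why it tolerates the high latency of remote, cloud-hosted QPUs.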
The integration stack relies on a combination of quantum software development kits (e.g., Qiskit, Cirq, PennyLane, CUDA-Q, TKET) and custom middleware capable of bridging scheduling, resource management, and error mitigation between classical and quantum resources.
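The scheduling role of such middleware can be reduced to a routing decision: given a job's resource requirements, select a classical or quantum backend. The dispatcher below is a minimal sketch under assumed names (`Job`, `Backend`, `route` are hypothetical, not from any real scheduler).

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    needs_qpu: bool
    qubits: int = 0        # qubit requirement, only meaningful for QPU jobs

@dataclass
class Backend:
    name: str
    is_qpu: bool
    max_qubits: int

def route(job, backends):
    """Pick the first backend that satisfies the job's requirements."""
    for b in backends:
        if job.needs_qpu and b.is_qpu and job.qubits <= b.max_qubits:
            return b.name
        if not job.needs_qpu and not b.is_qpu:
            return b.name
    raise RuntimeError(f"no backend satisfies {job.name}")

backends = [Backend("cpu-cluster", False, 0), Backend("qpu-a", True, 27)]
route(Job("vqe-kernel", True, 12), backends)   # -> "qpu-a"
route(Job("postprocess", False), backends)     # -> "cpu-cluster"
```

A production middleware layer additionally has to reconcile batch-queue semantics on the HPC side with shot-based, calibration-sensitive execution on the quantum side, which is precisely the mismatch discussed in the next section.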
3. Current Landscape and Technical Challenges
While the potential for quantum acceleration is widely acknowledged—in areas ranging from quantum chemistry and lattice gauge simulation to optimization and quantum machine learning—current quantum hardware is constrained by limited qubit counts, high error rates, and short coherence times (the so-called NISQ era). Full fault-tolerant quantum computing (FTQC) requires substantial advances in error correction, physical qubit scaling, and control stack engineering (Buchs et al., 15 Aug 2025).
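A common workaround for NISQ-era noise, short of full error correction, is error mitigation. As one concrete illustration, zero-noise extrapolation runs the same circuit at deliberately amplified noise levels and extrapolates the measured expectation value back to the zero-noise limit. The sketch below uses synthetic values and a linear noise model purely for illustration.

```python
def linear_zero_noise_extrapolation(scales, values):
    """Least-squares linear fit value ~ slope * scale + intercept;
    the intercept is the extrapolated zero-noise estimate."""
    n = len(scales)
    mean_x = sum(scales) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(scales, values))
    var = sum((x - mean_x) ** 2 for x in scales)
    slope = cov / var
    return mean_y - slope * mean_x

# Synthetic model: true expectation 1.0, degraded linearly with noise scale.
scales = [1.0, 2.0, 3.0]
noisy_values = [1.0 - 0.1 * s for s in scales]   # 0.9, 0.8, 0.7
estimate = linear_zero_noise_extrapolation(scales, noisy_values)
# estimate recovers 1.0 exactly for this idealized linear noise model
```

Real devices deviate from linearity, so practical implementations use richer extrapolation models; the point here is only the structure of the technique, not its accuracy on hardware.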
The integration of quantum acceleration into classical HPC presents additional hurdles:
- Software stack mismatches: HPC scheduling and resource management paradigms differ fundamentally from those in quantum job management; specialized middleware is needed for unified job orchestration.
- Architectural heterogeneity: The landscape is fragmented across superconducting, trapped ion, neutral atom, and other quantum device types, each with unique physical and connectivity constraints.
- Energy efficiency and benchmarking: Rigorous quantification of resource and energy costs in hybrid workflows is an active research area, demanding robust empirical analysis and new benchmarking methodologies.
- Programming and compilation: Efficient hybrid code generation requires new compiler technology, transpilers, and circuit optimization strategies (e.g., for logical-to-physical qubit mapping and error mitigation).
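The logical-to-physical qubit mapping problem mentioned above can be illustrated with a deliberately simple greedy placement: put each logical qubit next to an already-placed interaction partner on the hardware coupling graph. The linear four-qubit topology and the circuit are illustrative assumptions; real transpilers combine placement with SWAP routing and cost heuristics.

```python
def map_qubits(two_qubit_gates, coupling):
    """Greedily assign logical qubits to physical qubits, preferring
    free neighbours of an already-placed interaction partner.

    coupling: adjacency dict of the physical qubit graph.
    """
    placement = {}            # logical qubit -> physical qubit
    used = set()
    for a, b in two_qubit_gates:
        for logical, partner in ((a, b), (b, a)):
            if logical in placement:
                continue
            candidates = None
            if partner in placement:
                # prefer a free physical neighbour of the partner's location
                candidates = [p for p in coupling[placement[partner]]
                              if p not in used]
            if not candidates:
                candidates = [p for p in coupling if p not in used]
            placement[logical] = candidates[0]
            used.add(candidates[0])
    return placement

# Linear hardware topology: 0 - 1 - 2 - 3
coupling = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
gates = [(0, 1), (1, 2)]               # logical two-qubit gates
placement = map_qubits(gates, coupling)
# here each gate's qubits land on adjacent physical qubits, so no SWAPs needed
```

When no adjacency-respecting placement exists, compilers must insert SWAP gates, which inflates circuit depth and error rates; this is why mapping quality is a first-order concern for NISQ compilation.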
4. ADAC Community Activities and Collaborative Mechanisms
ADAC operates as a collaborative network, fostering cross-institutional sharing of technological expertise, infrastructure access, and research outcomes. A key focus is the development of community testbeds—ranging from dedicated in-house quantum processors to cloud-accessed and simulated platforms—and the design of middleware capable of orchestrating hybrid classical–quantum jobs.
The Institute’s Working Groups synthesize member feedback, maintain institutional engagement through surveys and focused workshops, and set forth strategic recommendations. For example, ADAC has prioritized user training initiatives to lower the steep learning curve in quantum programming, and recommended a multi-pronged approach targeting both near-term NISQ experimentation and long-horizon FTQC preparation.
5. Emerging Directions and Technical Details
The hybrid HPC+QC paradigm is likely to drive long-term progress in computational science, potentially enabling exponential or polynomial speedups in select problem classes (such as quantum simulation, cryptography, combinatorial optimization, and certain machine learning scenarios). Key technical notions featured in ADAC’s strategic discourse include:
- Quantum superposition: |ψ⟩ = α|0⟩ + β|1⟩, where |α|² + |β|² = 1, defines the state of a qubit.
- Bell-state entanglement: |Φ⁺⟩ = (|00⟩ + |11⟩)/√2, foundational for quantum communication and error correction.
- Resource scaling for quantum simulation (e.g., lattice gauge theories): qubit and gate requirements grow with lattice volume and target precision, illustrating the complexity of digital quantum simulations.
- Trotterization techniques for simulating continuous time evolution on digital quantum hardware.
- Hybrid orchestration models, where quantum subroutines are embedded within classical simulation or optimization pipelines, scheduled via extended classical job managers (e.g., SLURM or PBS with QC extensions).
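The Trotterization technique listed above can be demonstrated end to end on a single qubit. The sketch below approximates evolution under H = X + Z by alternating the analytically known exponentials of X and Z, and compares against the closed-form exact evolution; the Hamiltonian and parameters are illustrative choices, not from the source.

```python
import cmath
import math

def matmul(a, b):
    """2x2 complex matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def exp_x(theta):
    """exp(-i * X * theta) = cos(theta) I - i sin(theta) X."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -1j * s], [-1j * s, c]]

def exp_z(theta):
    """exp(-i * Z * theta), diagonal in the computational basis."""
    return [[cmath.exp(-1j * theta), 0], [0, cmath.exp(1j * theta)]]

def trotter(t, steps):
    """First-order Trotter approximation of exp(-i (X+Z) t)."""
    dt = t / steps
    step = matmul(exp_x(dt), exp_z(dt))
    u = [[1, 0], [0, 1]]
    for _ in range(steps):
        u = matmul(step, u)
    return u

def exact(t):
    """exp(-i (X+Z) t) = cos(rt) I - i sin(rt) (X+Z)/r, with r = sqrt(2)."""
    r = math.sqrt(2)
    c, s = math.cos(r * t), math.sin(r * t)
    return [[c - 1j * s / r, -1j * s / r],
            [-1j * s / r, c + 1j * s / r]]

def error(a, b):
    """Max elementwise deviation between two 2x2 matrices."""
    return max(abs(a[i][j] - b[i][j]) for i in range(2) for j in range(2))

t = 1.0
# first-order Trotter error shrinks roughly as 1/steps:
assert error(trotter(t, 100), exact(t)) < error(trotter(t, 10), exact(t))
```

The same structure, with many more qubits and non-commuting Hamiltonian terms, underlies digital quantum simulation of lattice gauge theories; the error-versus-step-count trade-off shown here is exactly what drives circuit depth requirements on real hardware.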
A plausible implication is that, as hardware and software stacks mature, the standardization and generalization of hybrid orchestration technologies—combining robust benchmarking, workflow management, and adaptive AI-driven circuit optimization—will become central topics for ADAC and its partners.
6. Long-term Implications for Accelerated Data Analytics
The integration of quantum computing into the HPC ecosystem, as coordinated by ADAC, is guided by the principle of quantum utility: identifying and exploiting those computational kernels where quantum acceleration can deliver tangible speed or quality improvements over classical computation (Buchs et al., 15 Aug 2025). This is expected to reshape scientific fields reliant on simulation and optimization, to augment complex classical models with quantum-accelerated routines, and to catalyze new applications in AI.
The emergence of robust, production-grade hybrid classical–quantum workflows—integrated with legacy HPC scheduling, benchmarking, and reproducibility instrumentation—remains a central goal. ADAC’s work accelerates this transition, ensuring that best practices, performance metrics, and architectural insights are shared widely across academia, industry, and governmental research.
In summary, the Accelerated Data Analytics and Computing Institute acts as a focal point for the integration of advanced, hybrid, and quantum approaches into the broader HPC community. Through structured initiatives, community consensus building, and technical innovation, ADAC advances the development and deployment of scalable, multidisciplinary computing platforms capable of addressing new frontiers in scientific data analysis and discovery (Buchs et al., 15 Aug 2025).