TY - GEN
T1 - ResQGRNN
T2 - 2025 IEEE International Conference on Quantum Artificial Intelligence, QAI 2025
AU - Kaldari, Jawaher
AU - Kashif, Muhammad
AU - Al-Kuwari, Saif
AU - Shafique, Muhammad
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - Quantum Graph Recurrent Neural Networks (QGRNNs) provide a powerful framework for learning sequential dependencies in quantum graph-based models. However, a significant challenge in advancing these networks lies in ensuring efficient trainability and scalability, as quantum systems inherently exhibit unique optimization difficulties. While residual learning has proven to be effective in classical deep learning for improving gradient flow and network convergence, integrating residual connections in quantum architectures is non-trivial due to the no-cloning theorem, which prevents the direct copying of quantum states. In this paper, we introduce Residual Quantum Graph Recurrent Neural Networks (ResQGRNNs), a novel approach that enhances the trainability of QGRNNs by incorporating residual connections while preserving quantum constraints. Unlike prior works that primarily focus on expressivity or classical datasets, our method directly addresses trainability concerns in quantum graph-based architectures. We propose an alternative strategy by using ancilla qubits to incorporate residual learning without violating quantum mechanical principles, ensuring effective parameter updates during training. Our results demonstrate that ResQGRNNs outperform the (plain) QGRNNs, leading to more stable optimization and enhanced performance on quantum graph learning tasks.
UR - https://www.scopus.com/pages/publications/105033456780
U2 - 10.1109/QAI63978.2025.00026
DO - 10.1109/QAI63978.2025.00026
M3 - Conference contribution
AN - SCOPUS:105033456780
T3 - Proceedings - 2025 IEEE International Conference on Quantum Artificial Intelligence, QAI 2025
SP - 120
EP - 127
BT - Proceedings - 2025 IEEE International Conference on Quantum Artificial Intelligence, QAI 2025
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 2 November 2025 through 5 November 2025
ER -