Quantum Variational Activation Functions Empower Kolmogorov-Arnold Networks

Jiun-Cheng Jiang, Morris Yu-Chao Huang, Tianlong Chen, Hsi-Sheng Goan

2025-09-18

Summary

This paper explores a new way to build better quantum machine learning models by combining ideas from two areas: variational quantum circuits and a type of neural network called Kolmogorov-Arnold Networks (KANs). The authors introduce a new building block for these models, called Quantum Variational Activation Functions, which need far fewer parameters than existing methods while remaining just as expressive.

What's the problem?

Current quantum machine learning models often require a huge number of parameters, making them difficult to train and run on today's quantum computers. Traditional ways of building flexible functions inside these models, such as Fourier-based activations, are inefficient: they need one trainable setting (parameter) for every frequency the function must represent. This limits the size and complexity of the problems such models can tackle.
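
To make the parameter count concrete, here is a schematic comparison (the notation is chosen here for illustration and is not taken from the paper). A Fourier-based activation carries one trainable coefficient per frequency, whereas, per the paper's claim, L data re-uploads with trainable encoding weights can reach an exponentially large frequency set with only O(L) parameters:

    % Fourier-based activation: one coefficient per frequency.
    f_{\mathrm{Fourier}}(x) = \sum_{k=-K}^{K} c_k \, e^{i k x},
    \qquad \#\text{parameters} = O(K).

    % L re-uploads with trainable encoding weights w_1, \dots, w_L:
    % the reachable frequencies are the signed sums of the weights,
    \Omega_L = \Bigl\{\, \sum_{l=1}^{L} s_l \, w_l \;:\; s_l \in \{-1, +1\} \,\Bigr\},
    \qquad |\Omega_L| \le 2^L \ \text{with only } O(L) \ \text{parameters}.
    % Example: w_l = 2^{l-1} yields every integer frequency up to 2^L - 1.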

What's the solution?

The researchers developed DARUANs (DatA Re-Uploading ActivatioNs), single-qubit quantum circuits that act as learnable activation functions in a neural network. Because the weights that pre-process the data in a DARUAN are trainable, each repetition of the data upload enlarges the circuit's frequency spectrum exponentially, so the circuit can represent a wide range of frequencies with far fewer parameters than traditional methods. They then used DARUANs as the learnable activations of Kolmogorov-Arnold Networks, yielding quantum-inspired KANs (QKANs). To make QKANs more practical, they introduced two further techniques, layer extension and hybrid QKANs (HQKANs), which improve scalability and efficiency and allow QKANs to serve as drop-in replacements for the feed-forward blocks of standard neural networks (see the sketch below). They tested the new models on function regression, image classification, and generative language modeling.
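
To illustrate the idea, here is a minimal NumPy sketch of a single-qubit data re-uploading activation. This is an assumption-laden toy, not the paper's implementation: the gate layout (RZ data encoding interleaved with trainable RY rotations), the parameter names w, b, and theta, and the Pauli-Z readout are all choices made here for illustration.

    import numpy as np

    def rz(a):
        # Single-qubit rotation about the Z axis.
        return np.array([[np.exp(-1j * a / 2), 0.0],
                         [0.0, np.exp(1j * a / 2)]], dtype=complex)

    def ry(a):
        # Single-qubit rotation about the Y axis.
        c, s = np.cos(a / 2), np.sin(a / 2)
        return np.array([[c, -s], [s, c]], dtype=complex)

    def daruan_activation(x, w, b, theta):
        """Toy DARUAN-style activation: L data re-uploads on one qubit.

        w, b: length-L arrays of trainable encoding weights and biases;
        theta: length-(L+1) array of trainable variational angles.
        Returns the Pauli-Z expectation value, a number in [-1, 1].
        """
        state = ry(theta[0]) @ np.array([1.0, 0.0], dtype=complex)  # start in |0>
        for l in range(len(w)):
            state = rz(w[l] * x + b[l]) @ state  # re-upload the pre-processed data
            state = ry(theta[l + 1]) @ state     # trainable variational rotation
        probs = np.abs(state) ** 2
        return float(probs[0] - probs[1])        # <Z> of the final state

    # Example: with geometric weights w_l = 2^(l-1), L = 3 re-uploads already
    # give access to every integer frequency up to 2^L - 1 = 7.
    L = 3
    w = 2.0 ** np.arange(L)
    b = np.zeros(L)
    theta = np.linspace(0.3, 1.2, L + 1)
    print(daruan_activation(0.5, w, b, theta))

The output is a trigonometric polynomial in x whose frequencies are the signed sums of the encoding weights, which is why training those weights (rather than fixing them) is what lets the spectrum grow exponentially with the number of re-uploads.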

Why it matters?

This work is important because it offers a path towards building more powerful and practical quantum machine learning models. By reducing the number of parameters needed, these models become more feasible to run on current and near-future quantum hardware, and they can potentially outperform classical machine learning models on certain tasks. The techniques presented could significantly advance the field of quantum machine learning and open up new possibilities for solving complex problems.

Abstract

Variational quantum circuits (VQCs) are central to quantum machine learning, while recent progress in Kolmogorov-Arnold networks (KANs) highlights the power of learnable activation functions. We unify these directions by introducing quantum variational activation functions (QVAFs), realized through single-qubit data re-uploading circuits called DatA Re-Uploading ActivatioNs (DARUANs). We show that DARUAN with trainable weights in data pre-processing possesses an exponentially growing frequency spectrum with data repetitions, enabling an exponential reduction in parameter size compared with Fourier-based activations without loss of expressivity. Embedding DARUAN into KANs yields quantum-inspired KANs (QKANs), which retain the interpretability of KANs while improving their parameter efficiency, expressivity, and generalization. We further introduce two novel techniques to enhance scalability, feasibility, and computational efficiency, namely layer extension and hybrid QKANs (HQKANs), which serve as drop-in replacements for multi-layer perceptrons (MLPs) in the feed-forward networks of large-scale models. We provide theoretical analysis and extensive experiments on function regression, image classification, and autoregressive generative language modeling, demonstrating the efficiency and scalability of QKANs. DARUANs and QKANs offer a promising direction for advancing quantum machine learning on both noisy intermediate-scale quantum (NISQ) hardware and classical quantum simulators.
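
The abstract's "embedding DARUAN into KANs" step can also be sketched. In a KAN, each edge between an input and an output carries its own learnable univariate activation, and each output sums its incoming edges; a QKAN, in this reading, makes each edge activation a DARUAN circuit. The shapes and names below are illustrative assumptions, reusing the daruan_activation toy (and its NumPy import) from the sketch above.

    def qkan_layer(x_vec, W, B, Theta):
        """Toy QKAN layer: one DARUAN activation per (output, input) edge.

        x_vec: input vector of length in_dim.
        W, B:  arrays of shape (out_dim, in_dim, L)   - per-edge encoding params.
        Theta: array of shape (out_dim, in_dim, L+1)  - per-edge variational angles.
        As in a KAN, each output sums its edges' univariate activations.
        """
        out_dim, in_dim = W.shape[0], W.shape[1]
        y = np.zeros(out_dim)
        for i in range(out_dim):
            for j in range(in_dim):
                y[i] += daruan_activation(x_vec[j], W[i, j], B[i, j], Theta[i, j])
        return y

    # Example: a 4-in, 2-out layer with L = 3 re-uploads per edge.
    rng = np.random.default_rng(0)
    out_dim, in_dim, L = 2, 4, 3
    y = qkan_layer(rng.normal(size=in_dim),
                   rng.normal(size=(out_dim, in_dim, L)),
                   rng.normal(size=(out_dim, in_dim, L)),
                   rng.normal(size=(out_dim, in_dim, L + 1)))
    print(y)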