WiMi Hologram Cloud Inc. (NASDAQ: WIMI) has launched a Multi-Scale Fusion Quantum Deep Convolutional Neural Network (QDCNN). The new QDCNN addresses critical limitations within the field, including high model complexity, limited embedding representations, and difficulty in scaling quantum networks. The company claims this technology achieves improvements in key dimensions such as parameter scale, computational complexity, and training efficiency, and that it realizes unified modeling of word-level and sentence-level information, obtaining performance superior to existing quantum models on standard datasets. This release is regarded as an important milestone in promoting quantum natural language processing from theory to practical application, as the industry seeks to deliver fast, scalable, low-energy-consumption models.
Multi-Scale Feature Fusion for Enhanced Text Understanding
The core innovation lies in WiMi’s implementation of Quantum Depthwise Separable Convolution, a technique adapted from classical neural networks to minimize parameter consumption within the quantum architecture. By encoding input features into quantum states and applying quantum convolution operations on a per-channel basis, the model circumvents the exponential increase in parameters typically associated with expanding quantum networks. WiMi’s research team explains that “Quantum depthwise convolution allows each qubit or qubit cluster to independently process word-level local semantics, thereby preserving the locality advantage of convolution operations.” This approach not only reduces computational load but also enhances the model’s execution efficiency on both simulators and real quantum hardware, exceeding the performance of existing quantum convolution models. Beyond structural efficiency, the QDCNN tackles the challenge of capturing both local and global semantic information within text.
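WiMi has not published its circuit design, but the parameter savings that depthwise separable convolution provides can be sketched with its classical analogue. The function names and channel sizes below are illustrative assumptions, not details from the release; the arithmetic shows why per-channel (depthwise) processing followed by a pointwise mixing step needs far fewer parameters than a standard convolution:

```python
# Illustrative parameter counts for 1-D (text) convolutions.
# All names and sizes here are hypothetical, chosen only to show the ratio.

def standard_conv_params(kernel, c_in, c_out):
    # A standard 1-D convolution learns one kernel per (input, output) channel pair.
    return kernel * c_in * c_out

def depthwise_separable_params(kernel, c_in, c_out):
    # Depthwise stage: one kernel per input channel, channels processed independently.
    # Pointwise stage: a width-1 convolution that mixes channels back together.
    return kernel * c_in + c_in * c_out

k, c_in, c_out = 3, 64, 64
std = standard_conv_params(k, c_in, c_out)        # 3 * 64 * 64 = 12288
sep = depthwise_separable_params(k, c_in, c_out)  # 3 * 64 + 64 * 64 = 4288
print(std, sep, round(sep / std, 3))
```

For these assumed sizes the separable form uses roughly a third of the parameters, which is the kind of scaling behavior the quantum adaptation is said to preserve as networks grow.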
WiMi’s multi-scale feature fusion mechanism addresses this by integrating word-level feature extraction, identifying elements like sentiment polarity, with sentence-level analysis through multi-layer quantum convolution and quantum pooling. This allows the model to understand not just individual word meanings but also the broader contextual semantics of entire sentences. Through experimentation, WiMi found that this multi-scale feature fusion mechanism contributes significantly to the model’s final performance, yielding accuracy gains of more than 6% on multiple datasets. In benchmarks against QRNN, QSAM, and QTF models, the QDCNN demonstrated accuracy improvements ranging from 4% to 10%, alongside a 30% reduction in model parameters compared to classical CNNs. This validates the feasibility of quantum convolutional structures in text processing.
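The fusion idea described above can be sketched classically: word-level features come from a sliding n-gram window over word embeddings (local semantics), sentence-level features from global pooling (overall semantics), and the two granularities are concatenated. This NumPy sketch is a hypothetical analogue; in the QDCNN these features would be encoded into quantum states, and the specific dimensions and kernel here are invented for illustration:

```python
import numpy as np

# Hypothetical classical sketch of multi-scale feature fusion.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(10, 8))   # 10 words, 8-dim embeddings (assumed sizes)
kernel = rng.normal(size=(3, 8))        # one trigram (n=3) convolution filter

# Word-level scale: convolve each window of 3 adjacent word vectors,
# capturing local semantics such as sentiment-bearing phrases.
word_feats = np.array([
    np.sum(embeddings[i:i + 3] * kernel)  # one scalar per trigram position
    for i in range(len(embeddings) - 2)
])

# Sentence-level scale: global mean pooling summarizes the whole sentence.
sent_feat = embeddings.mean(axis=0)

# Fusion: concatenate both granularities into one feature vector,
# so downstream layers see local and global information together.
fused = np.concatenate([word_feats, sent_feat])
print(fused.shape)  # 8 trigram positions + 8 pooled dims -> (16,)
```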
Quantum Depthwise Separable Convolution Reduces Model Complexity
The pursuit of increasingly sophisticated natural language processing models currently faces limitations in scalability, computational cost, and model size; the research team is attempting to address these challenges with a novel approach to quantum neural networks. The team mapped depthwise separable convolution, a technique from classical CNNs, onto quantum circuit architecture, aiming to reduce parameter consumption as quantum networks expand. A multi-scale feature fusion mechanism was implemented, extracting features at different levels of granularity: quantum convolution models n-grams, capturing sentiment and grammatical structures, while multi-layer quantum convolution and pooling capture overall sentence themes. According to the findings, “The fused quantum state possesses both local sensitivity and the ability to reflect overall semantics, exhibiting richer feature capabilities than traditional QNNs.”
Ultimately, this quantum depthwise separable convolution significantly reduces the number of controlled rotation gates required by traditional quantum convolution structures, making the model’s execution efficiency on simulators and real quantum hardware several times higher than that of existing quantum convolution models.
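The gate-count claim can be illustrated with simple counting under assumed circuit topologies (these topologies are illustrative, not WiMi's published design): a fully entangling convolution layer that applies a controlled rotation between every ordered pair of qubits grows quadratically, while a depthwise-style layer that rotates each qubit independently and adds only a nearest-neighbor entangling ring grows linearly:

```python
# Assumed topologies for illustration only; WiMi's actual circuits are not public.

def full_conv_gates(n_qubits):
    # One controlled rotation per ordered pair of distinct qubits: O(n^2).
    return n_qubits * (n_qubits - 1)

def depthwise_gates(n_qubits):
    # One single-qubit rotation per qubit plus a nearest-neighbor
    # entangling ring of n gates: O(n).
    return n_qubits + n_qubits

for n in (4, 8, 16):
    print(n, full_conv_gates(n), depthwise_gates(n))
```

At 16 qubits the assumed full layer needs 240 controlled rotations versus 32 gates for the depthwise layer, which is the kind of gap that would translate into the "several times higher" execution efficiency described above.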
QDCNN Achieves Leading Accuracy on Text Classification Benchmarks
Unlike much current quantum computing research, which focuses on theoretical advancement, WiMi's approach targets practical application, aiming to resolve specific limitations hindering progress in the field. The company states that this adaptation to a quantum circuit architecture avoids the exponential increase in parameters typically seen when scaling quantum networks, allowing each qubit to independently process local semantics, and reports that the multi-scale fusion mechanism contributed a greater than 6% accuracy gain on multiple datasets. The company asserts that “It is not only a technological innovation but also a profound breakthrough in the field of quantum natural language processing.”
Researchers “cleverly mapped this concept to the quantum circuit architecture,” achieving a reduction in parameters while maintaining expressive power. The QDCNN integrates both word-level and sentence-level information through its multi-scale feature fusion mechanism, and experimental validation on public text classification benchmarks demonstrates the model’s effectiveness.
