The increasing demand for data privacy necessitates novel approaches to machine learning, particularly for training complex deep neural networks. Current methods often require exposing sensitive data during the learning process, creating vulnerabilities. Researchers are addressing this challenge through homomorphic encryption (HE), a technique that allows computation on encrypted data without decryption, preserving confidentiality throughout the entire process, from initial training to final deployment. In ‘ReBoot: Encrypted Training of Deep Neural Networks with CKKS Bootstrapping’, Alberto Pirillo and Luca Colombo of Politecnico di Milano, alongside colleagues, present a framework for fully encrypted, non-interactive training of deep neural networks, a significant advance over previous work, which was largely confined to encrypted inference or to simpler models such as logistic regression. Their work, built on the CKKS scheme, introduces an architecture designed to minimise computational complexity and noise accumulation, enabling effective training of arbitrarily deep networks and offering potential for secure machine learning as a service.
The ReBoot framework represents a notable development in privacy-preserving machine learning, enabling complex models to be trained without direct access to raw data. Traditional training pipelines require access to sensitive, unencrypted data, creating inherent privacy vulnerabilities; ReBoot addresses this through a combination of cryptographic and computational techniques.
At its core, ReBoot employs homomorphic encryption, a form of encryption that allows computations to be performed directly on encrypted data without requiring decryption. This ensures that data remains confidential throughout the entire training process, mitigating the risk of exposure. However, homomorphic encryption is computationally intensive, and naive implementations introduce significant performance overhead. ReBoot mitigates this through architectural optimisation, specifically by minimising the multiplicative depth of its computations. Multiplicative depth is the longest chain of sequential multiplications needed to compute a result; keeping it small is critical under homomorphic encryption, because each multiplication is both expensive and consumes part of the ciphertext’s noise budget, and once that budget is exhausted the ciphertext must be refreshed with a costly bootstrapping operation, the CKKS procedure that gives the framework its name.
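To make the idea concrete, the following sketch (plain Python standing in for encrypted values, not the authors’ code) evaluates the same product of eight values in two ways: a left-to-right chain, which would cost multiplicative depth 7 under an HE scheme, and a balanced pairing, which costs only depth 3.

```python
# Illustrative only: under CKKS, the *arrangement* of multiplications,
# not just their count, determines multiplicative depth, and hence cost
# and noise growth.
from functools import reduce

xs = [1.1, 0.9, 1.2, 0.8, 1.3, 0.7, 1.4, 0.6]

# Left-to-right chain: ((((((x0*x1)*x2)*x3)*x4)*x5)*x6)*x7  -> depth 7
chain = reduce(lambda a, b: a * b, xs)

# Balanced tree: multiply in pairs, three rounds for eight values -> depth 3
def balanced_product(vals):
    while len(vals) > 1:
        vals = [vals[i] * vals[i + 1] for i in range(0, len(vals), 2)]
    return vals[0]

print(chain, balanced_product(xs))  # same result, much shallower circuit
```

ReBoot’s architecture is designed around the same principle: keeping each layer’s circuit as shallow as possible means less noise accumulates and fewer bootstrapping refreshes are needed.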
The framework further enhances performance through Single Instruction, Multiple Data (SIMD) optimisation. In the CKKS setting this takes the form of ciphertext packing: many plaintext values are encoded into the slots of a single ciphertext, so that one homomorphic operation acts on all of them simultaneously, amortising its cost and reducing processing time. This is particularly effective in machine learning, where most operations are parallel calculations over large batches of data. Benchmarking demonstrates that ReBoot achieves accuracy comparable to conventional, non-private machine learning models across a range of datasets, including MNIST, Fashion-MNIST, Kuzushiji-MNIST, and datasets relating to breast cancer and heart disease.
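As a rough illustration of packed (SIMD) arithmetic on encrypted data, here is a short sketch using the open-source TenSEAL library; the parameter values are arbitrary and the library is only an accessible stand-in, not necessarily the toolchain used by the ReBoot authors.

```python
import tenseal as ts

# CKKS context; parameters are illustrative, not tuned for real security or depth needs
ctx = ts.context(ts.SCHEME_TYPE.CKKS,
                 poly_modulus_degree=8192,
                 coeff_mod_bit_sizes=[60, 40, 40, 60])
ctx.global_scale = 2 ** 40
ctx.generate_galois_keys()

inputs  = [0.5, -1.2, 3.3, 0.0]    # several values packed into one ciphertext
weights = [0.1,  0.2, 0.3, 0.4]    # plaintext weights for an element-wise product

enc = ts.ckks_vector(ctx, inputs)  # encrypt the whole batch at once
out = enc * weights                # one homomorphic multiply touches every slot
print(out.decrypt())               # ≈ [0.05, -0.24, 0.99, 0.0]
```

Because a single ciphertext can hold thousands of slots (4,096 with these parameters), the per-value cost of each homomorphic operation drops sharply, which is what makes batched encrypted training practical at all.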
The versatility of ReBoot extends to its compatibility with diverse model architectures, further broadening its potential applications. The primary benefit lies in enhanced data privacy, allowing organisations to leverage sensitive information without compromising confidentiality. This facilitates secure data sharing and collaboration between entities, and assists in compliance with stringent data privacy regulations such as the General Data Protection Regulation (GDPR). Consequently, ReBoot has potential applicability across numerous sectors, including healthcare, finance, government, and education, enabling organisations to unlock the value of their data while upholding privacy principles.
👉 More information
🗞 ReBoot: Encrypted Training of Deep Neural Networks with CKKS Bootstrapping
🧠 DOI: https://doi.org/10.48550/arXiv.2506.19693
