Exploring Adaptive Techniques for Differentially-Private Stochastic Gradient Descent

  • Muhamad Fauzan

Student thesis: Master's Dissertation

Abstract

Differential Privacy (DP) has emerged as a cornerstone of privacy-preserving machine learning, and Differentially Private Stochastic Gradient Descent (DP-SGD) is a widely adopted algorithm for training machine learning models with DP guarantees. However, DP-SGD faces a significant challenge: its privacy-utility trade-off is highly sensitive to hyperparameters such as the learning rate, gradient clipping threshold, and noise scale. Traditional hyperparameter tuning methods are often inefficient and fail to adapt to the dynamic nature of training, resulting in suboptimal performance. Adaptive hyperparameter tuning, which continuously adjusts hyperparameters during training according to a set of criteria, offers a significant advantage over fixed-hyperparameter approaches in balancing privacy and utility. One underexplored aspect of private training, however, is that existing public datasets can substantially aid model training. We leverage such public data to perform regular (non-private) training and track the evolution of the model parameters over time; in particular, we analyze the behavior of the average gradient norms observed during public training. Using this baseline trend, we propose a novel adaptive technique that dynamically adjusts the learning rate and clipping threshold during private training without incurring additional privacy budget. We evaluate our approach on benchmark datasets, including MNIST and CIFAR-10, against non-adaptive DP-SGD baselines. Our results demonstrate that adaptive hyperparameter tuning significantly improves model accuracy without compromising privacy. For instance, on CIFAR-10, our method consistently achieves higher accuracy than fixed hyperparameter settings under the same privacy budget. These findings underscore the potential of adaptive methods to enhance the practicality and efficiency of DP-SGD in real-world applications.
In summary, this thesis makes three key contributions: (1) a comprehensive analysis of the challenges associated with hyperparameter tuning in DP-SGD, (2) a novel adaptive hyperparameter tuning algorithm, and (3) empirical evidence demonstrating the effectiveness of our approach. Our work advances the state-of-the-art in privacy-preserving machine learning and lays the groundwork for future research in adaptive DP algorithms.
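The core mechanism described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the thesis's actual algorithm: the function names, the linear scaling of the learning rate and clipping threshold by the public-run gradient-norm trend, and the noise calibration are all illustrative assumptions.

```python
import numpy as np

def dp_sgd_step(params, per_sample_grads, lr, clip, noise_multiplier, rng):
    """One DP-SGD step: clip each per-sample gradient to norm `clip`,
    average, add Gaussian noise calibrated to the clipping threshold."""
    clipped = []
    for g in per_sample_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose norm exceeds the clipping threshold.
        clipped.append(g * min(1.0, clip / max(norm, 1e-12)))
    mean_grad = np.mean(clipped, axis=0)
    # Noise standard deviation follows the usual sigma * C / batch_size calibration.
    sigma = noise_multiplier * clip / len(per_sample_grads)
    noise = rng.normal(0.0, sigma, size=mean_grad.shape)
    return params - lr * (mean_grad + noise)

def adapt_hyperparams(base_lr, base_clip, public_norm_trend, step):
    """Hypothetical schedule: scale lr and clip by how far the average
    gradient norm recorded during the public (non-private) run has decayed.
    Since the trend comes from public data, consulting it costs no privacy budget."""
    ratio = public_norm_trend[step] / public_norm_trend[0]
    return base_lr * ratio, base_clip * ratio
```

Because `public_norm_trend` is computed once from public data, the private run can read it at every step without spending any additional privacy budget, which is the key property the abstract highlights.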
Date of Award: 2025
Original language: American English
Awarding Institution
  • HBKU College of Science and Engineering

Keywords

  • Artificial Intelligence
  • Cybersecurity
  • Differential Privacy
  • Machine Learning
