Uncertainty Quantification and Inference under Differential Privacy


Metadata (field: value; language in parentheses)

dc.contributor.advisor: Molinari, Roberto
dc.contributor.author: Romanus, Ogonnaya
dc.date.accessioned: 2025-07-31T19:44:29Z
dc.date.available: 2025-07-31T19:44:29Z
dc.date.issued: 2025-07-31
dc.identifier.uri: https://etd.auburn.edu/handle/10415/9926
dc.description.abstract (en_US): Advances in technology have led to the proliferation of adversarial techniques capable of undermining traditional data security protocols designed to ensure data privacy. Such breaches have resulted in significant financial and reputational costs for companies, particularly in remediating privacy violations and paying regulatory penalties. Strict privacy regulations, including the General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA), and the Health Insurance Portability and Accountability Act (HIPAA), further necessitate the integration of privacy-preserving mechanisms throughout the data modeling pipeline to avoid legal repercussions. Differential privacy (DP) and its variants have emerged as the gold standard for mathematically quantifying and mitigating privacy loss in sensitive data analysis. However, the noise introduced to achieve DP often distorts data structure and statistical properties, rendering traditional inferential methods inapplicable. This dissertation develops efficient and robust methods for privacy-preserving data analysis under DP guarantees. Chapter 1 introduces a flexible simulation-based framework that constructs the distribution of a DP estimator by matching observed DP statistics with simulated counterparts. The method is applied to one- and two-sample confidence interval estimation, hypothesis testing, chi-square tests of independence, and logistic regression with categorical predictors. Extensive simulations demonstrate that the proposed approach performs comparably to state-of-the-art methods, and applications to real-world datasets confirm that inferences drawn from the DP statistical tests align with those obtained via conventional non-private methods. Chapter 2 presents a binary search-based DP algorithm for conformal prediction in classification tasks. Extensive empirical evaluations on benchmark datasets (CIFAR-10, ImageNet, and CoronaHack) show that the method outperforms the only existing alternative, the current state of the art, in both efficiency and predictive performance across realistic and practical scenarios.
dc.subject (en_US): Mathematics and Statistics
dc.title (en_US): Uncertainty Quantification and Inference under Differential Privacy
dc.type (en_US): PhD Dissertation
dc.embargo.status (en_US): NOT_EMBARGOED
dc.embargo.enddate (en_US): 2025-07-31
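
To make the Chapter 1 idea concrete: the framework builds the sampling distribution of a DP estimator by re-applying the same privacy mechanism to data simulated under a plug-in parameter estimate, then matching the observed DP statistic against the simulated counterparts. The sketch below illustrates this general recipe for a one-sample confidence interval on a Bernoulli proportion, assuming a Laplace mechanism on a clipped sample mean and a basic-bootstrap interval; the mechanism choice, the plug-in step, and all names are illustrative assumptions, not the dissertation's actual algorithm.

    import numpy as np

    rng = np.random.default_rng(0)

    def dp_mean(x, epsilon, lo=0.0, hi=1.0):
        # Laplace mechanism on the mean of values clipped to [lo, hi];
        # the clipped mean of n values has sensitivity (hi - lo) / n.
        x = np.clip(x, lo, hi)
        sensitivity = (hi - lo) / len(x)
        return x.mean() + rng.laplace(scale=sensitivity / epsilon)

    def simulated_ci(dp_stat, n, epsilon, alpha=0.05, n_sim=2000):
        # Approximate a (1 - alpha) CI for a Bernoulli proportion by pushing
        # simulated datasets through the *same* DP mechanism and reading off
        # quantiles of the simulated DP statistics (parametric-bootstrap style).
        p_hat = min(max(dp_stat, 0.0), 1.0)   # plug-in estimate from the DP release
        sims = np.empty(n_sim)
        for b in range(n_sim):
            x_sim = rng.binomial(1, p_hat, size=n)   # simulate data under p_hat
            sims[b] = dp_mean(x_sim, epsilon)        # identical mechanism and budget
        lo_q, hi_q = np.quantile(sims, [alpha / 2, 1 - alpha / 2])
        # Basic bootstrap interval; may be clipped back to [0, 1] if desired.
        return 2 * p_hat - hi_q, 2 * p_hat - lo_q

    # Usage: a private point estimate and CI for a proportion of synthetic data.
    x = rng.binomial(1, 0.3, size=500)
    stat = dp_mean(x, epsilon=1.0)
    print(stat, simulated_ci(stat, n=500, epsilon=1.0))

Because every simulated replicate passes through the identical mechanism and privacy budget, the Monte Carlo quantiles capture both sampling variability and the Laplace noise, which is what allows the interval to account for the distortion DP adds.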
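For Chapter 2, the key observation is that split conformal prediction reduces calibration to a quantile of nonconformity scores, and a bounded quantile can be approximated privately by binary search over Laplace-noised threshold counts. The sketch below shows one generic way to do this, splitting the budget evenly across comparisons via basic composition; the score definition, step count, and budget split are assumptions for illustration, not the dissertation's algorithm.

    import numpy as np

    rng = np.random.default_rng(1)

    def dp_quantile_binary_search(scores, q, epsilon, lo=0.0, hi=1.0, n_steps=12):
        # Privately approximate the q-th quantile of scores bounded in [lo, hi].
        # Each count query #{s_i <= t} has sensitivity 1, so the budget is
        # split evenly over the n_steps noisy comparisons (basic composition).
        eps_step = epsilon / n_steps
        target = q * len(scores)
        for _ in range(n_steps):
            mid = (lo + hi) / 2
            noisy_count = np.sum(scores <= mid) + rng.laplace(scale=1.0 / eps_step)
            if noisy_count < target:
                lo = mid   # too few scores below the threshold: move up
            else:
                hi = mid   # enough scores below the threshold: move down
        return (lo + hi) / 2

    def conformal_threshold(cal_scores, alpha, epsilon):
        # DP calibration for split conformal prediction: a private estimate of
        # the (1 - alpha) quantile of the calibration nonconformity scores.
        # (The exact finite-sample correction is omitted in this sketch.)
        return dp_quantile_binary_search(cal_scores, 1 - alpha, epsilon)

    # Usage with stand-in scores, e.g. 1 - predicted probability of the true class.
    cal_scores = rng.beta(2, 5, size=1000)
    tau = conformal_threshold(cal_scores, alpha=0.1, epsilon=1.0)
    probs = np.array([0.70, 0.20, 0.10])        # hypothetical class probabilities
    pred_set = np.where(1 - probs <= tau)[0]    # classes whose score clears tau
    print(tau, pred_set)

Only the calibration quantile touches the private scores here, so constructing the prediction set adds no further privacy cost; the injected noise and the finite number of search steps are what trade privacy against the tightness of the threshold.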
