Open Access. Powered by Scholars. Published by Universities.®

Information Security Commons

Articles 1 - 6 of 6

Full-Text Articles in Information Security

Towards Practical Differentially Private Mechanism Design And Deployment, Dan Zhang Jul 2021

Doctoral Dissertations

As the collection of personal data has increased, many institutions face an urgent need for reliable protection of sensitive data. Among the emerging privacy protection mechanisms, differential privacy offers a persuasive and provable assurance to individuals and has become the dominant model in the research community. However, despite growing adoption, the complexity of designing differentially private algorithms and effectively deploying them in real-world applications remains high. In this thesis, we address two main questions: 1) how can we aid programmers in developing private programs with high utility? and 2) how can we deploy differentially private algorithms to visual analytics systems? …
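The provable assurance the abstract refers to is typically obtained by adding calibrated noise to query answers. As a minimal sketch (not taken from the dissertation), the classic Laplace mechanism answers a sensitivity-1 counting query with ε-differential privacy; the function and parameter names below are illustrative:

```python
import math
import random

def laplace_noise(scale):
    # Sample from Laplace(0, scale) via the inverse-CDF method.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon):
    """Release a count with epsilon-differential privacy.

    A counting query changes by at most 1 when one record is added
    or removed (sensitivity 1), so Laplace(1/epsilon) noise suffices.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller ε means stronger privacy but noisier answers; the utility question the thesis raises is precisely how to keep answers useful under a tight privacy budget.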


Achieving Differential Privacy And Fairness In Machine Learning, Depeng Xu May 2021

Graduate Theses and Dissertations

Machine learning algorithms are used to make decisions in various applications, such as recruiting, lending, and policing. These algorithms rely on large amounts of sensitive individual information to work properly. Hence, there are societal concerns about machine learning algorithms regarding privacy and fairness. Currently, many studies focus only on protecting individual privacy or on ensuring algorithmic fairness, without taking their connection into account. However, new challenges are arising in privacy-preserving and fairness-aware machine learning. On one hand, there is fairness within the private model, i.e., how to meet both privacy and fairness requirements simultaneously in …


Differential Privacy Protection Over Deep Learning: An Investigation Of Its Impacted Factors, Ying Lin, Ling-Yan Bao, Ze-Minghui Li, Shu-Sheng Si, Chao-Hsien Chu Dec 2020

Research Collection School Of Computing and Information Systems

Deep learning (DL) has been widely applied to achieve promising results in many fields, but various privacy concerns and issues remain. Applying differential privacy (DP) to DL models is an effective way to ensure privacy-preserving training and classification. In this paper, we revisit the DP stochastic gradient descent (DP-SGD) method, which has been used by several algorithms and systems and has achieved good privacy protection. However, several factors, such as the order in which noise is added, the models used, etc., may affect its performance to varying degrees. We empirically show that adding noise first and clipping second will not only …
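The noise/clipping order the authors study varies the standard DP-SGD recipe, which clips each per-example gradient to a norm bound and only then adds Gaussian noise scaled to that bound. A minimal sketch of one aggregation step, assuming gradients arrive as numpy arrays; the names are illustrative and not from the paper:

```python
import numpy as np

def dp_sgd_step(per_example_grads, clip_norm, noise_multiplier, rng):
    """One DP-SGD aggregation step: clip, sum, add noise, average.

    Clipping bounds each example's influence on the sum to clip_norm,
    which is what lets the Gaussian noise give a privacy guarantee.
    """
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose L2 norm exceeds clip_norm.
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    total = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=total.shape)
    return (total + noise) / len(per_example_grads)
```

Reversing the order (noising the raw gradient, then clipping) changes what the noise must protect, which is the kind of factor whose empirical impact the paper investigates.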


Adding Differential Privacy In An Open Board Discussion Board System, Pragya Rana May 2017

Master's Projects

This project implements a privacy system for statistics generated by the Yioop search and discussion board system. Statistical data for such a system consists of various counts, sums, and averages that might be displayed for groups, threads, etc. When statistical data is made publicly available, there is no guarantee of preserving the privacy of an individual. Ideally, any data extracted should not reveal any sensitive information about an individual. In order to help achieve this, we implemented a Differential Privacy mechanism for Yioop. Differential privacy preserves privacy up to some controllable parameters of the number of items or individuals being …
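For count statistics like those a discussion board displays, a standard approach is to add Laplace noise calibrated to sensitivity 1, then clean up the result for display. This is a sketch under assumptions, not the project's actual code; a useful property it illustrates is that post-processing (rounding, clamping to zero) never weakens the differential privacy guarantee:

```python
import math
import random

def noisy_board_stat(true_count, epsilon):
    """Release a group/thread count with epsilon-differential privacy.

    A count has sensitivity 1, so Laplace(1/epsilon) noise suffices.
    Rounding and clamping afterward are pure post-processing and do
    not consume any additional privacy budget.
    """
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return max(0, round(true_count + noise))
```

The ε parameter plays the role of the "controllable parameters" the abstract mentions: it trades off how closely the published statistic tracks the true value against how much any one individual can influence it.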


Dpweka: Achieving Differential Privacy In Weka, Srinidhi Katla May 2017

Graduate Theses and Dissertations

Government, commercial, and non-profit organizations collect and store large amounts of sensitive data, including medical, financial, and personal information. They use data mining methods to formulate business strategies that yield high long-term and short-term financial benefits. While analyzing such data, the private information of the individuals present in the data must be protected for moral and legal reasons. Current practices such as redacting sensitive attributes, releasing only aggregate values, and query auditing do not provide sufficient protection against an adversary armed with auxiliary information. In the presence of additional background information, the privacy protection …


Reconstruction Privacy: Enabling Statistical Learning, Ke Wang, Chao Han, Ada Waichee Fu, Raymond C. Wong, Philip S. Yu Mar 2015

Research Collection School Of Computing and Information Systems

Non-independent reasoning (NIR) allows information about one record in the data to be learnt from the information of other records. Most posterior/prior-based privacy criteria treat NIR as a privacy violation and require smoothing the distribution of published data to avoid sensitive NIR. The drawback of this approach is that it limits the utility of learning statistical relationships. The differential privacy criterion treats NIR as a non-violation and therefore enables learning statistical relationships, but at the cost of potential disclosures through NIR. A question is whether it is possible to (1) allow learning statistical relationships, …