A Framework for Quantifying the Privacy/Utility Trade-off in Generative Model based Synthetic Data
This PhD studentship is fully funded by a grant from The Alan Turing Institute as part of the Accenture / Turing Strategic Partnership (subject to funding confirmation) and will be supervised jointly by Prof Emiliano De Cristofaro (University College London), Dr Isabel Wagner (De Montfort University), and Prof Eerke Boiten (De Montfort University). Over the course of the PhD, you will spend time at both universities. You will apply to, be formally registered at, and get your degree from De Montfort University.
The PhD Studentship, which starts 1 January 2022, comprises a tax-free stipend of £15,609 per annum (paid monthly). The bursary is renewable annually for up to 36 months in total, subject to satisfactory progress in your PhD research. The funding also includes a Full Time Home fees studentship (£4,500 for 2021-22) for up to 3 years, again subject to satisfactory progress.
This opportunity is open to UK, EU, and International applicants. International applicants will be required to meet the difference in fee costs from their own funds.
Synthetic datasets drawn from generative models (GMs) have been advertised as a silver-bullet solution to privacy-preserving data publishing. Recent work suggests that such claims do not match reality: for instance, synthetic data does not seem to prevent attribute inference. Overall, attacks against GMs allow adversaries to learn private information about training data. While researchers have studied these attacks thoroughly, they have done so: 1) in isolation, 2) on publicly available datasets, and/or 3) without evaluating, in-depth, the effectiveness of defence mechanisms or their effect on utility.
This prompts two big research challenges. First, we lack a comprehensive picture of the risks posed by these attacks and of the effectiveness of defence techniques. Second, it is hard for organizations and individuals using generative models in the wild to translate findings about the attacks to real-world settings, or to gauge the impact of defences on task accuracy. We propose to develop a research agenda and build on it to provide actionable insights and easy-to-understand risk assessments of different inference attacks against machine learning models in real-world GM applications, as well as meaningful defences.
Specifically, this project aims to develop methods and tools to assess privacy/utility trade-offs of synthetic data generation, and to build up a comprehensive picture of privacy and utility for existing generative models that offer privacy guarantees in the differential privacy model and are subject to attacks on training data.
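To make the privacy/utility trade-off mentioned above concrete, here is a toy illustration (our own sketch, not part of the project): releasing a clipped mean under differential privacy via the Laplace mechanism, where a smaller privacy budget epsilon means stronger privacy but noisier, less useful output. The function names and parameters here are ours for illustration; real DP synthetic-data pipelines inject noise inside the generative model's training procedure rather than into a single statistic.

```python
import math
import random

def laplace_noise(scale, rng):
    # Sample Laplace(0, scale) by inverse-CDF transform of a uniform draw.
    u = rng.random() - 0.5
    sign = 1 if u >= 0 else -1
    return -scale * sign * math.log(1 - 2 * abs(u))

def dp_mean(values, lower, upper, epsilon, rng):
    """Differentially private mean via the Laplace mechanism.

    Clipping each value to [lower, upper] bounds the sensitivity of the
    mean at (upper - lower) / n, which calibrates the noise scale.
    """
    n = len(values)
    clipped = [min(max(v, lower), upper) for v in values]
    true_mean = sum(clipped) / n
    scale = (upper - lower) / (n * epsilon)
    return true_mean + laplace_noise(scale, rng)

if __name__ == "__main__":
    rng = random.Random(0)
    data = [rng.uniform(0, 100) for _ in range(1000)]
    exact = sum(data) / len(data)
    # Smaller epsilon -> stronger privacy -> larger expected error (less utility).
    for eps in (0.1, 1.0, 10.0):
        noisy = dp_mean(data, 0, 100, eps, rng)
        print(f"epsilon={eps}: absolute error={abs(noisy - exact):.4f}")
```

The project's evaluation question is the same in spirit but far harder: quantifying how much utility a full generative model loses, across downstream tasks, for a given level of protection against inference attacks.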
Person specification
- MSc in a relevant field, such as Computer Science, Computer Engineering, or Cyber Security
- Strong programming skills, preferably in Python, MATLAB, or R
- Experience with privacy-enhancing technologies and/or machine learning
- Good communication skills, especially in written English
- Strong work ethic
- Ability to think creatively and work independently
How to apply
If you are interested in applying, please apply via our application portal, which can be accessed here. When applying, please ensure that you identify the Accenture-Turing Strategic Partnership studentship under the Proposed Programme of Study tab.
Please also provide a 200-word statement of interest explaining why you want to study for a PhD and how you believe you have the desired qualities and experience for this research project.
The deadline for applications is 1 November 2021. The start of the project is 1 January 2022.
De Montfort University offers alumni discounts for students who have previously studied with us at undergraduate or taught master's degree level. This discount is available to Home, EU, and International students. It is only available to new starters and will be applied automatically if you are eligible.
The Home and EU alumni discount is £500 per year (full-time equivalent).
The International alumni discount is £2,000 per year (full-time equivalent).