The rapidly growing demand to share data more openly creates a need for secure and privacy-preserving sharing technologies. However, developing a universal privacy-preserving data sharing mechanism poses multiple challenges, and existing solutions still fall short of their promises.
Acknowledgements
This work was partially funded by the Swiss National Science Foundation with grant 200021-188824 (T.S.).
Author information
Contributions
C.T. wrote the paper; T.S. wrote the paper and contributed to the analysis of synthetic data privacy properties.
Ethics declarations
Competing interests
The authors declare no competing interests.
Peer review
Peer review information
Nature Computational Science thanks the anonymous reviewers for their contribution to the peer review of this work.
Cite this article
Stadler, T., Troncoso, C. Why the search for a privacy-preserving data sharing mechanism is failing. Nat Comput Sci 2, 208–210 (2022). https://doi.org/10.1038/s43588-022-00236-x