Scientific literature on the topic "Computational Differential Privacy"
Below are thematic lists of journal articles, theses, book chapters, conference papers, and other academic sources on the topic "Computational Differential Privacy".
Journal articles on the topic "Computational Differential Privacy"
Telaprolu, Bhavani Sankar. "Privacy-Preserving Federated Learning in Healthcare - A Secure AI Framework". International Journal of Scientific Research in Computer Science, Engineering and Information Technology 10, no. 3 (July 16, 2024): 703–7. https://doi.org/10.32628/cseit2410347.
Jain, Priyank, et al. "Differentially Private Data Release: Bias Weight Perturbation Method - A Novel Approach". Turkish Journal of Computer and Mathematics Education (TURCOMAT) 12, no. 10 (April 28, 2021): 7165–73. http://dx.doi.org/10.17762/turcomat.v12i10.5607.
Kii, Masanobu, Atsunori Ichikawa, and Takayuki Miura. "Lightweight Two-Party Secure Sampling Protocol for Differential Privacy". Proceedings on Privacy Enhancing Technologies 2025, no. 1 (January 2025): 23–36. http://dx.doi.org/10.56553/popets-2025-0003.
Meisingseth, Fredrik, and Christian Rechberger. "SoK: Computational and Distributed Differential Privacy for MPC". Proceedings on Privacy Enhancing Technologies 2025, no. 1 (January 2025): 420–39. http://dx.doi.org/10.56553/popets-2025-0023.
Kim, Jongwook. "DistOD: A Hybrid Privacy-Preserving and Distributed Framework for Origin–Destination Matrix Computation". Electronics 13, no. 22 (November 19, 2024): 4545. http://dx.doi.org/10.3390/electronics13224545.
Fang, Juanru, and Ke Yi. "Privacy Amplification by Sampling under User-level Differential Privacy". Proceedings of the ACM on Management of Data 2, no. 1 (March 12, 2024): 1–26. http://dx.doi.org/10.1145/3639289.
Alborch Escobar, Ferran, Sébastien Canard, Fabien Laguillaumie, and Duong Hieu Phan. "Computational Differential Privacy for Encrypted Databases Supporting Linear Queries". Proceedings on Privacy Enhancing Technologies 2024, no. 4 (October 2024): 583–604. http://dx.doi.org/10.56553/popets-2024-0131.
Liu, Hai, Zhenqiang Wu, Yihui Zhou, Changgen Peng, Feng Tian, and Laifeng Lu. "Privacy-Preserving Monotonicity of Differential Privacy Mechanisms". Applied Sciences 8, no. 11 (October 28, 2018): 2081. http://dx.doi.org/10.3390/app8112081.
Vadrevu, Pavan Kumar. "Scalable Approaches for Enhancing Privacy in Blockchain Networks: A Comprehensive Review of Differential Privacy Techniques". Journal of Information Systems Engineering and Management 10, no. 8s (January 31, 2025): 635–48. https://doi.org/10.52783/jisem.v10i8s.1119.
Hong, Yiyang, Xingwen Zhao, Hui Zhu, and Hui Li. "A Blockchain-Integrated Divided-Block Sparse Matrix Transformation Differential Privacy Data Publishing Model". Security and Communication Networks 2021 (December 7, 2021): 1–15. http://dx.doi.org/10.1155/2021/2418539.
Texte intégralThèses sur le sujet "Computational Differential Privacy"
Alborch Escobar, Ferran. "Private Data Analysis over Encrypted Databases: Mixing Functional Encryption with Computational Differential Privacy". Electronic Thesis or Dissertation, Institut polytechnique de Paris, 2025. http://www.theses.fr/2025IPPAT003.
In our digitalized society, data rules the world. But since data most often relates to individuals, exploiting it must respect their privacy. This concern gave rise to the differential privacy paradigm, which protects individuals when databases containing their data are queried. With the emergence of cloud computing, it is increasingly necessary to also guarantee the confidentiality of such vast databases stored in the cloud, using encryption techniques. This thesis studies how to provide both privacy and confidentiality for such outsourced databases by mixing two primitives: computational differential privacy and functional encryption. First, we study the relationship between computational differential privacy and functional encryption for randomized functions in a generic way. We analyze the privacy of the setting where a malicious analyst may access the encrypted data stored on a server, either by corrupting or breaching it, and prove that a secure randomized functional encryption scheme supporting the appropriate family of functions guarantees the computational differential privacy of the system. Second, we construct efficient randomized functional encryption schemes for certain useful families of functions and prove them secure in the standard model under well-known assumptions. The families considered are linear functions, used for example in counting queries, histograms, and linear regressions, and quadratic functions, used for example in quadratic regressions and hypothesis testing. Together with the first result, these schemes yield encrypted databases for the corresponding families of queries. Finally, we implement both randomized functional encryption schemes to analyze their efficiency. The results show that our constructions are practical for databases with up to 1,000,000 entries for linear queries and up to 10,000 entries for quadratic queries.
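The noisy linear queries at the heart of this abstract can be illustrated with a plain Laplace mechanism. The following is a minimal Python sketch, assuming each database entry lies in [0, 1] (so the query's sensitivity is the largest weight magnitude); the thesis's actual construction additionally wraps such a noisy answer inside a randomized functional encryption scheme, which is not shown here.

```python
import random


def laplace_noise(scale: float) -> float:
    # Laplace(0, scale) is the difference of two i.i.d.
    # Exponential draws with mean `scale`.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)


def noisy_linear_query(db, weights, epsilon):
    """Answer the linear query sum_i weights[i] * db[i] with epsilon-DP.

    Assumes each entry of db lies in [0, 1], so changing one record
    shifts the true answer by at most max |weight| (the sensitivity).
    """
    sensitivity = max(abs(w) for w in weights)
    true_answer = sum(w * x for w, x in zip(weights, db))
    return true_answer + laplace_noise(sensitivity / epsilon)
```

A counting query is the special case where every weight is 1; histograms are obtained by running one such query per bucket.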
Leukam Lako, Franklin. "Protection des données à caractère personnel pour les services énergétiques" [Protection of personal data for energy services]. Electronic Thesis or Dissertation, Institut polytechnique de Paris, 2021. http://www.theses.fr/2021IPPAS004.
Smart grids are important building blocks in the fight against climate change: they allow the massive introduction of intermittent renewable energies while guaranteeing grid stability, i.e., a real-time balance between demand and production in the power grid. Managing grid stability is possible thanks to smart meters installed in households, which allow the distribution system operator to collect consumption and production data from consumers and producers at a time step as fine as 10 minutes in France. This near-real-time consumption data enables new energy services, such as customer consumption forecasting or demand response. Demand response services help avoid consumption peaks in a neighborhood by ensuring that, at all times, users' consumption does not exceed the maximum power of the local grid. However, collecting users' consumption data is a key privacy concern: individual consumption data reflects the use of all electric appliances in a household over time and makes it possible to deduce the behaviors, activities, age, or preferences of the inhabitants. This thesis proposes new energy services while protecting consumer privacy, through five contributions organized around two themes:
1. Transforming a demand response algorithm to make it privacy-friendly, using secure multiparty computation to compute an aggregate, such as a sum of users' consumption, without disclosing any individual consumption.
2. Publishing sums of users' consumption with both privacy and good utility, using differential privacy to ensure that the published sum does not indirectly reveal individual users' consumption. Among other energy services, these sums enable consumption forecasting.
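The secure-aggregation idea behind the first theme can be sketched with additive secret sharing: each user splits its reading into random shares that sum to the true value modulo a public prime, so no single aggregator ever sees an individual consumption, yet the recombined totals yield the exact sum. This is a minimal sketch assuming honest-but-curious aggregators and integer readings; the function and constant names are illustrative, not from the thesis.

```python
import random

MODULUS = 2**61 - 1  # public prime; all readings must be below it


def share(value: int, n_parties: int):
    """Split value into n additive shares that sum to value mod MODULUS."""
    shares = [random.randrange(MODULUS) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares


def secure_sum(consumptions):
    """Sum users' readings without any aggregator seeing an individual one.

    Each user sends one share to each aggregator; every aggregator only
    sees uniformly random-looking shares, publishes its partial sum, and
    the partial sums recombine to the true total.
    """
    n = len(consumptions)
    # column j collects the shares sent to aggregator j
    columns = list(zip(*(share(c, n) for c in consumptions)))
    partials = [sum(col) % MODULUS for col in columns]
    return sum(partials) % MODULUS
```

In the thesis's second theme, calibrated noise would additionally be added before publishing the total, so that the released sum itself is differentially private.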
Rao, Fang-Yu. "Privacy-Enhancing Techniques for Data Analytics". Thesis, 2019.
Organizations today collect and aggregate huge amounts of data from individuals under various scenarios and for different purposes. Such aggregation of individuals' data, when combined with data analytics techniques, allows organizations to make informed decisions and predictions. In many situations, however, different portions of the data associated with individuals are collected and curated by different organizations. To derive more accurate conclusions and predictions, those organizations may want to conduct the analysis on their joint data, which cannot simply be accomplished by each organization exchanging its own data with the others, given the sensitive nature of the data. Developing approaches for collaborative privacy-preserving data analytics is a nontrivial task with at least two major challenges: first, the data possessed by each organization must remain properly protected during and after the collaborative analysis; second, the cryptographic primitives used to build such privacy-preserving protocols usually carry high computational complexity.
In this dissertation, based on widely adopted cryptographic primitives, we address these challenges by developing data analytics techniques that not only allow multiple mutually distrustful parties to perform data analysis on their joint data in a privacy-preserving manner, but also reduce the time required to complete the analysis. More specifically, using three common data analytics tasks as concrete examples, we show how to construct the respective privacy-preserving protocols under two different scenarios: (1) the protocols are executed by a collaborative process involving only the participating parties; (2) the protocols are outsourced to service providers in the cloud. We also investigate two types of optimization for improving the efficiency of those protocols. The first type grants each participating party access to a statistically controlled leakage so as to reduce the amount of required computation, while the second type exploits the parallelism that can be incorporated into the task and pushes some computation to an offline phase, reducing the time needed by each participating party without any additional leakage. Extensive experiments on real-world datasets demonstrate the effectiveness of the proposed techniques.
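The "push computation to the offline phase" optimization mentioned in this abstract is commonly realized with Beaver multiplication triples in secret-sharing-based multiparty computation: a random triple c = a·b is prepared before the inputs are known, so the online multiplication of two shared values needs only cheap openings of masked differences. Below is a minimal two-party sketch over a prime field, with a trusted dealer standing in for the offline phase; this is a generic illustration of the technique, not the dissertation's actual protocol.

```python
import random

P = 2**61 - 1  # prime field for additive secret sharing


def share2(v):
    """Split v into two additive shares modulo P."""
    r = random.randrange(P)
    return r, (v - r) % P


def beaver_triple():
    # Offline phase: a trusted dealer (an assumption of this sketch)
    # prepares shares of random a, b and of c = a * b.
    a, b = random.randrange(P), random.randrange(P)
    return share2(a), share2(b), share2((a * b) % P)


def secure_multiply(x_shares, y_shares, triple):
    """Online phase: multiply shared x and y using one precomputed triple.

    Only the maskings d = x - a and e = y - b are opened; since a and b
    are uniformly random, d and e reveal nothing about x or y.
    Correctness: x*y = c + d*b + e*a + d*e.
    """
    (a0, a1), (b0, b1), (c0, c1) = triple
    d = (x_shares[0] - a0 + x_shares[1] - a1) % P
    e = (y_shares[0] - b0 + y_shares[1] - b1) % P
    z0 = (c0 + d * b0 + e * a0 + d * e) % P  # party 0 adds the public d*e term
    z1 = (c1 + d * b1 + e * a1) % P
    return z0, z1


# usage: shares of 6 and 7 recombine to their product
x, y = share2(6), share2(7)
z0, z1 = secure_multiply(x, y, beaver_triple())
assert (z0 + z1) % P == 6 * 7
```

Because triples are input-independent, they can be generated in bulk ahead of time, which is exactly what makes the online phase fast.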
Book chapters on the topic "Computational Differential Privacy"
Mironov, Ilya, Omkant Pandey, Omer Reingold, and Salil Vadhan. "Computational Differential Privacy". In Advances in Cryptology - CRYPTO 2009, 126–42. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-03356-8_8.
Pejó, Balázs, and Damien Desfontaines. "Computational Power (C)". In Guide to Differential Privacy Modifications, 55–57. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-96398-9_9.
Ben Hamida, Sana, Hichem Mrabet, and Abderrazak Jemai. "How Differential Privacy Reinforces Privacy of Machine Learning Models?" In Advances in Computational Collective Intelligence, 661–73. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-16210-7_54.
Valovich, Filipp, and Francesco Aldà. "Computational Differential Privacy from Lattice-Based Cryptography". In Number-Theoretic Methods in Cryptology, 121–41. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-76620-1_8.
Kang, Yilin, Jian Li, Yong Liu, and Weiping Wang. "Data Heterogeneity Differential Privacy: From Theory to Algorithm". In Computational Science – ICCS 2023, 119–33. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-35995-8_9.
Tao, Fuqiang, Zhe Sun, Rui Liang, Rundong Shao, Yuhan Chai, and Yangyang Wang. "FEDSET: Federated Random Forest Based on Differential Privacy". In Computational and Experimental Simulations in Engineering, 791–806. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-42987-3_55.
Podsevalov, Ivan, Alexei Podsevalov, and Vladimir Korkhov. "Differential Privacy for Statistical Data of Educational Institutions". In Computational Science and Its Applications – ICCSA 2022 Workshops, 603–15. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-10542-5_41.
Groce, Adam, Jonathan Katz, and Arkady Yerukhimovich. "Limits of Computational Differential Privacy in the Client/Server Setting". In Theory of Cryptography, 417–31. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-19571-6_25.
Qian, Jiaqing, Yanan Chen, and Sanxiu Jiao. "DPFL-AES: Differential Privacy Federated Learning Based on Adam Early Stopping". In Computational and Experimental Simulations in Engineering, 905–19. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-42515-8_64.
Bun, Mark, Yi-Hsiu Chen, and Salil Vadhan. "Separating Computational and Statistical Differential Privacy in the Client-Server Model". In Theory of Cryptography, 607–34. Berlin, Heidelberg: Springer Berlin Heidelberg, 2016. http://dx.doi.org/10.1007/978-3-662-53641-4_23.
Texte intégralActes de conférences sur le sujet "Computational Differential Privacy"
Li, Xianzhi, Ran Zmigrod, Zhiqiang Ma, Xiaomo Liu, and Xiaodan Zhu. "Fine-Tuning Language Models with Differential Privacy through Adaptive Noise Allocation". In Findings of the Association for Computational Linguistics: EMNLP 2024, 8368–75. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.findings-emnlp.491.
Gupta, Yash, Jeswin M. S, Aniruddh Mantrala, Davin Henry Monteiro, Adhithi M, and M. N. Thippeswamy. "Enhancing Differential Privacy in Federated Learning via Quantum Computation and Algorithms". In 2024 8th International Conference on Computational System and Information Technology for Sustainable Solutions (CSITSS), 1–6. IEEE, 2024. https://doi.org/10.1109/csitss64042.2024.10816807.
Vu, Doan Nam Long, Timour Igamberdiev, and Ivan Habernal. "Granularity is crucial when applying differential privacy to text: An investigation for neural machine translation". In Findings of the Association for Computational Linguistics: EMNLP 2024, 507–27. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.findings-emnlp.29.
Tajima, Arisa, Wei Jiang, Virendra Marathe, and Hamid Mozaffari. "Enhanced Private Decision Trees using Secure Multiparty Computation and Differential Privacy". In 2024 IEEE International Conference on Knowledge Graph (ICKG), 352–59. IEEE, 2024. https://doi.org/10.1109/ickg63256.2024.00051.
Flemings, James, and Murali Annavaram. "Differentially Private Knowledge Distillation via Synthetic Text Generation". In Findings of the Association for Computational Linguistics ACL 2024, 12957–68. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.findings-acl.769.
Chen, Bo, Baike She, Calvin Hawkins, Alex Benvenuti, Brandon Fallin, Philip E. Paré, and Matthew Hale. "Differentially Private Computation of Basic Reproduction Numbers in Networked Epidemic Models". In 2024 American Control Conference (ACC), 4422–27. IEEE, 2024. http://dx.doi.org/10.23919/acc60939.2024.10644264.
Ramesh, Krithika, Nupoor Gandhi, Pulkit Madaan, Lisa Bauer, Charith Peris, and Anjalie Field. "Evaluating Differentially Private Synthetic Data Generation in High-Stakes Domains". In Findings of the Association for Computational Linguistics: EMNLP 2024, 15254–69. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.findings-emnlp.894.
Meisenbacher, Stephen, Maulik Chevli, Juraj Vladika, and Florian Matthes. "DP-MLM: Differentially Private Text Rewriting Using Masked Language Models". In Findings of the Association for Computational Linguistics ACL 2024, 9314–28. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.findings-acl.554.
Ghazi, Badih, Rahul Ilango, Pritish Kamath, Ravi Kumar, and Pasin Manurangsi. "Towards Separating Computational and Statistical Differential Privacy". In 2023 IEEE 64th Annual Symposium on Foundations of Computer Science (FOCS). IEEE, 2023. http://dx.doi.org/10.1109/focs57990.2023.00042.
Kotevska, Olivera, Folami Alamudun, and Christopher Stanley. "Optimal Balance of Privacy and Utility with Differential Privacy Deep Learning Frameworks". In 2021 International Conference on Computational Science and Computational Intelligence (CSCI). IEEE, 2021. http://dx.doi.org/10.1109/csci54926.2021.00141.
Texte intégralRapports d'organisations sur le sujet "Computational Differential Privacy"
Rannenberg, Kai, Sebastian Pape, Frédéric Tronnier, and Sascha Löbner. Study on the Technical Evaluation of De-Identification Procedures for Personal Data in the Automotive Sector. Universitätsbibliothek Johann Christian Senckenberg, October 2021. http://dx.doi.org/10.21248/gups.63413.