Conference paper, Year: 2019

RDF graph anonymization robust to data linkage

Abstract

Privacy is a major concern when publishing new datasets in the context of Linked Open Data (LOD). A new dataset published in the LOD is exposed to privacy breaches because its content can be linked to objects already present in other LOD datasets. In this paper, we focus on the problem of building safe anonymizations of an RDF graph to guarantee that linking the anonymized graph with any external RDF graph will not cause privacy breaches. Given a set of privacy queries as input, we study the data-independent safety problem and the sequence of anonymization operations necessary to enforce it. We provide sufficient conditions under which an anonymization instance is safe given a set of privacy queries. Additionally, we show that our algorithms for RDF data anonymization are robust in the presence of sameAs links, whether explicit or inferred from additional knowledge.
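The paper itself ships no code here; as a rough illustration only, the following minimal Python sketch (using rdflib, with a hypothetical toy graph and a hypothetical privacy query) shows the general flavour of the anonymization operations the abstract refers to: suppressing the IRIs that a privacy query would expose by replacing them with blank nodes. This is not the authors' algorithm, only a sketch of the underlying idea.

# Illustrative sketch only (not the authors' algorithm): given a privacy
# query, replace every IRI it binds with a fresh blank node so that the
# anonymized graph no longer names those individuals.
# The toy data and the privacy query below are hypothetical examples.
from rdflib import Graph, BNode

TOY_DATA = """
@prefix ex: <http://example.org/> .
ex:alice ex:livesIn ex:Lyon ;
         ex:hasDisease ex:Flu .
ex:bob   ex:livesIn ex:Paris .
"""

# Privacy query: individuals linked to a disease must not be identifiable
# in the published graph.
PRIVACY_QUERY = """
PREFIX ex: <http://example.org/>
SELECT ?p WHERE { ?p ex:hasDisease ?d . }
"""

def anonymize(graph: Graph, privacy_query: str) -> Graph:
    """Replace every IRI bound by the privacy query with a blank node."""
    sensitive = {row[0] for row in graph.query(privacy_query)}
    mapping = {iri: BNode() for iri in sensitive}
    out = Graph()
    for s, p, o in graph:
        out.add((mapping.get(s, s), p, mapping.get(o, o)))
    return out

if __name__ == "__main__":
    g = Graph().parse(data=TOY_DATA, format="turtle")
    print(anonymize(g, PRIVACY_QUERY).serialize(format="turtle"))

In this sketch, ex:alice is replaced by a blank node everywhere she occurs, so joining the output with an external graph can no longer re-identify her through that IRI; the paper studies when such operations are provably safe for any external graph, including under sameAs links.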

Dates and versions

hal-02444752, version 1 (19-01-2020)

Identifiers

HAL Id: hal-02444752
DOI: 10.1007/978-3-030-34223-4_31

Cite

Rémy Delanaux, Angela Bonifati, Marie-Christine Rousset, Romuald Thion. RDF graph anonymization robust to data linkage. WISE 2019 - 20th International Conference on Web Information Systems Engineering, Jan 2020, Hong Kong, China. pp.491-506, ⟨10.1007/978-3-030-34223-4_31⟩. ⟨hal-02444752⟩