Reconstruction Attack on Differential Private Trajectory Protection Mechanisms
Location trajectories collected by smartphones and other sensor-equipped devices represent a valuable data source for analytics services such as transport optimisation, location-based services, and contact tracing. At the same time, trajectories have the potential to reveal sensitive information about individuals, such as religious beliefs, social connections, or sexual orientation. Accordingly, trajectory datasets require appropriate protection before publication. Due to their strong theoretical privacy guarantees, differentially private publication mechanisms have received much attention in the past. However, the large amount of noise that must be added to achieve differential privacy yields trajectories that differ significantly from the originals. These structural differences, e.g., ship trajectories passing over land or car trajectories not following roads, can be exploited to reduce the level of privacy provided by the publication mechanism. We propose a deep learning-based Reconstruction Attack on Protected Trajectories (RAoPT) that leverages these differences to partly reconstruct the original trajectory from a differentially private release. The evaluation shows that our RAoPT model can reduce the Euclidean and Hausdorff distances between released and original trajectories by over 65 % on the T-Drive dataset, even under protection with 𝜀 ≤ 1. Trained on the T-Drive dataset, the model can still reduce both distances by over 48 % when applied to GeoLife trajectories protected with a state-of-the-art protection mechanism and 𝜀 = 0.1. This work aims to highlight shortcomings of current publication mechanisms for trajectories and thus to motivate further research on privacy-preserving publication schemes.
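To illustrate why differentially private releases diverge so strongly from the original trajectories, the sketch below adds independent Laplace noise to each coordinate of a toy trajectory and measures the average point-wise Euclidean distance. This is a deliberately simplified stand-in, not the specific protection mechanisms evaluated in the paper: real trajectory mechanisms split the privacy budget across points and commonly use the planar Laplace distribution; the unit sensitivity and the coordinate grid are assumptions for the example.

```python
import math
import random

def laplace_noise(scale):
    # Sample from Laplace(0, scale) via inverse-CDF transform.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def perturb_trajectory(points, epsilon, sensitivity=1.0):
    """Add independent Laplace noise to each coordinate.

    Simplified sketch: a realistic mechanism would allocate the budget
    epsilon across all points of the trajectory rather than per point.
    """
    scale = sensitivity / epsilon
    return [(x + laplace_noise(scale), y + laplace_noise(scale))
            for x, y in points]

def avg_euclidean(a, b):
    # Mean point-wise Euclidean distance between two equal-length trajectories.
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

random.seed(0)
# Toy straight-line trajectory of 50 points spaced 0.1 units apart.
original = [(i * 0.1, 0.0) for i in range(50)]
released = perturb_trajectory(original, epsilon=0.1)
print(f"avg displacement: {avg_euclidean(original, released):.2f}")
```

At a small budget such as 𝜀 = 0.1, the noise scale (sensitivity/𝜀 = 10) dwarfs the 0.1-unit spacing between consecutive points, so the released trajectory loses the straight-line structure entirely; structural cues like this are exactly what a reconstruction attack can exploit.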