Adaptive Differential Privacy Mechanism for Aggregated Mobility Dataset
UCD-ITS-RP-21-159 | Journal Article
Suggested Citation:
Haydari, Ammar, Michael Zhang, Chen-Nee Chuah, Jane Macfarlane, Sean Peisert (2021) Adaptive Differential Privacy Mechanism for Aggregated Mobility Dataset. Institute of Transportation Studies, University of California, Davis, Journal Article UCD-ITS-RP-21-159
Location data is collected continuously from users to capture mobility patterns. Releasing raw user trajectories may compromise user privacy, so the general practice is to release aggregated location datasets. However, private information may still be inferred from an aggregated version of location trajectories. Differential privacy (DP) protects query outputs against inference attacks regardless of an adversary's background knowledge. This paper presents a DP-based privacy model that protects users' origins and destinations at the aggregated level. This is achieved by injecting Planar Laplace noise into the origin and destination GPS points of each trajectory. The noisy GPS points are then mapped to a link representation using a link-matching algorithm, and the resulting link trajectories form an aggregated mobility network. The injected noise level is selected adaptively, considering both the link density of the location and the functional category of the matched links. Compared to several baseline models, including a k-anonymity method, our DP-based aggregation model returns query responses closer to the raw data in terms of aggregate statistics at both the network and trajectory levels, with at most 4% deviation from the baseline. Beyond link aggregation and spatial noise injection, temporal aggregation can also provide a degree of privacy, and a discussion of temporal aggregation requirements is presented.
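To illustrate the spatial noise-injection step, the following is a minimal sketch of the Planar Laplace mechanism applied to a single GPS point. It is not the paper's implementation: the function name, the fixed meters-per-degree constant, and the small-offset degree conversion are assumptions made here for illustration. The sketch exploits the fact that the Planar Laplace radius follows a Gamma(2, 1/ε) distribution while the angle is uniform.

```python
import math
import random

def planar_laplace_noise(lat, lon, epsilon, meters_per_degree=111_320.0):
    """Perturb one GPS point with Planar Laplace noise.

    epsilon is the geo-indistinguishability privacy parameter in units
    of 1/meter; smaller epsilon means more noise. Hypothetical helper,
    not the authors' code.
    """
    # Polar decomposition of the planar Laplace density (eps^2/2pi) e^{-eps r}:
    # the angle is uniform and the radius is Gamma-distributed with
    # shape 2 and scale 1/eps (mean radius = 2/eps meters).
    theta = random.uniform(0.0, 2.0 * math.pi)
    r = random.gammavariate(2.0, 1.0 / epsilon)  # noise radius in meters
    dx = r * math.cos(theta)  # east-west offset, meters
    dy = r * math.sin(theta)  # north-south offset, meters
    # Convert metric offsets to degrees (valid for small offsets).
    noisy_lat = lat + dy / meters_per_degree
    noisy_lon = lon + dx / (meters_per_degree * math.cos(math.radians(lat)))
    return noisy_lat, noisy_lon
```

In the paper's pipeline, a perturbed origin or destination point like this would then be snapped to a road link by the link-matching algorithm; the adaptive element would correspond to choosing `epsilon` per location from the local link density and link functional category.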