%0 Conference Proceedings %T Practical Differentially Private Modeling of Human Movement Data %+ University of Texas at Dallas [Richardson] (UT Dallas) %+ Harvard University %A Roy, Harichandan %A Kantarcioglu, Murat %A Sweeney, Latanya %Z Part 4: Protection and Privacy of Data and Big Data %< peer-reviewed %( Lecture Notes in Computer Science %B 30th IFIP Annual Conference on Data and Applications Security and Privacy (DBSec) %C Trento, Italy %Y Silvio Ranise %Y Vipin Swarup %I Springer International Publishing %3 Data and Applications Security and Privacy XXX %V LNCS-9766 %P 170-178 %8 2016-07-18 %D 2016 %R 10.1007/978-3-319-41483-6_13 %K Differential privacy %K Human mobility %K Utility %K Hubway %Z Computer Science [cs] %Z Conference papers %X Exciting advances in big data analysis suggest that sharing personal information, such as health and location data, among multiple parties could have significant societal benefits. However, privacy concerns often hinder such data sharing. Recently, differential privacy has emerged as an important tool for preserving privacy while sharing privacy-sensitive data. The basic idea is simple: differential privacy guarantees that results learned from shared data do not change much based on the inclusion or exclusion of any single person’s data. Despite this promise, existing differential privacy techniques address specific utility goals and/or query types (e.g., count queries), so it is not clear whether they can preserve utility for arbitrary types of queries. To better understand the possible utility and privacy tradeoffs under differential privacy, we examined uses of human mobility data in a real-world competition, in which participants were asked to develop insightful ideas that leveraged a minimally protected published dataset. An obvious question is whether the contest submissions would yield the same results if performed on a dataset protected by differential privacy.
To answer this question, we studied synthetic dataset generation models for human mobility data using differential privacy. We discuss utility evaluation and the generality of the models in detail. Finally, we analyzed whether the proposed differentially private models could be used in practice by examining the contest submissions. Our results indicate that most of the competition submissions could be replicated on differentially private data with nearly the same utility and with privacy guarantees. Statistical comparisons with the original dataset demonstrate that differentially private synthetic versions of human mobility data can be widely applicable for data analysis. %G English %Z TC 11 %Z WG 11.3 %2 https://inria.hal.science/hal-01633671/document %2 https://inria.hal.science/hal-01633671/file/428203_1_En_13_Chapter.pdf %L hal-01633671 %U https://inria.hal.science/hal-01633671 %~ IFIP-LNCS %~ IFIP %~ IFIP-TC %~ IFIP-WG %~ IFIP-TC11 %~ IFIP-WG11-3 %~ IFIP-DBSEC %~ IFIP-LNCS-9766