Third-party apps that work on top of personal cloud services, such as Google Drive and Dropbox, require access to the user's data in order to provide functionality. Through detailed analysis of a hundred popular Google Drive apps from Google's Chrome store, we discover that the existing permission model is quite often misused: around two-thirds of the analyzed apps are over-privileged, i.e., they access more data than they need in order to function. In this work, we analyze three permission models that aim to discourage users from installing over-privileged apps. In experiments with 210 real users, we discover that the most successful permission model is our novel ensemble method, which we call Far-reaching Insights. Far-reaching Insights inform users about the data-driven insights that apps can derive about them (e.g., their topics of interest, collaboration patterns, and activity patterns), thus seeking to bridge the gap between what third parties can actually learn about users and users' perception of their privacy leakage. Our results demonstrate the efficacy of Far-reaching Insights in bridging this gap: on average, they prove twice as effective as the current model in discouraging users from installing over-privileged apps. In an effort to promote general privacy awareness, we deployed PrivySeal, a publicly available privacy-focused app store that uses Far-reaching Insights. Based on the knowledge extracted from the data of the store's users (over 115 gigabytes of Google Drive data from 1,440 users with 662 installed apps), we also delineate the ecosystem for third-party cloud apps from the standpoint of developers and cloud providers. Finally, we present several general recommendations that can guide future work on privacy for the cloud. To the best of our knowledge, ours is the first work to tackle the privacy risk posed by third-party apps on cloud platforms in such depth.
Keywords
- cloud computing
- usable privacy
- apps