Nina Taft, Senior Staff Research Scientist with Google
Many app developers struggle, for a broad variety of reasons, to provide good-quality privacy in their smartphone apps. The Applied Privacy Research team at Google has pursued an approach aimed at nudging developers, providing incentives for them to improve privacy. We'll briefly summarize existing nudges, then focus on recent work that leverages advances in NLP to automatically extract privacy insights from app reviews. We design a multi-stage methodology that identifies whether a review discusses any privacy-related topic, assigns fine-grained issue tags to privacy texts, clusters texts that discuss similar issues, summarizes the texts within a cluster, and assigns emotions to the texts. We'll summarize our methodology for each of these steps and then present an initial look at what this analysis pipeline uncovers when applied to 600M app reviews.
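To make the stages concrete, here is a minimal sketch of such a multi-stage pipeline. The stage names follow the abstract, but every function body is a toy stand-in of my own (keyword matching, grouping by exact tag, a tiny emotion lexicon), not the actual models or tags used in this work.

```python
# Hypothetical sketch of a multi-stage review-analysis pipeline.
# Stages mirror the abstract: (1) privacy relevance, (2) issue tagging,
# (3) clustering, (4) per-cluster summarization, (5) emotion assignment.
# All logic below is illustrative placeholder code, not the real system.
from collections import defaultdict

PRIVACY_KEYWORDS = ("privacy", "permission", "tracking", "location", "data")

def is_privacy_related(review: str) -> bool:
    """Stage 1: decide whether a review discusses any privacy topic."""
    text = review.lower()
    return any(k in text for k in PRIVACY_KEYWORDS)

def tag_issue(review: str) -> str:
    """Stage 2: assign a fine-grained issue tag (toy keyword rules)."""
    text = review.lower()
    if "permission" in text:
        return "permissions"
    if "location" in text or "tracking" in text:
        return "tracking"
    return "data-collection"

def cluster_by_tag(reviews):
    """Stage 3: group reviews discussing similar issues (here: same tag)."""
    clusters = defaultdict(list)
    for r in reviews:
        clusters[tag_issue(r)].append(r)
    return dict(clusters)

def summarize(cluster):
    """Stage 4: summarize a cluster (toy: size plus a first snippet)."""
    return f"{len(cluster)} reviews, e.g. '{cluster[0][:40]}'"

def emotion(review: str) -> str:
    """Stage 5: assign an emotion (toy two-word lexicon)."""
    text = review.lower()
    return "anger" if any(w in text for w in ("angry", "hate", "scary")) else "concern"

def run_pipeline(reviews):
    privacy_reviews = [r for r in reviews if is_privacy_related(r)]  # stage 1
    clusters = cluster_by_tag(privacy_reviews)                       # stages 2-3
    return {
        tag: {"summary": summarize(texts),                           # stage 4
              "emotions": [emotion(t) for t in texts]}               # stage 5
        for tag, texts in clusters.items()
    }
```

For example, `run_pipeline(["This app asks for too many permissions", "Great game!", "Scary location tracking"])` drops the non-privacy review and returns one summarized, emotion-annotated cluster per issue tag.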
Nina Taft is a Senior Staff Research Scientist at Google, where she leads the Applied Privacy Research group. Nina received her PhD from UC Berkeley and has since worked in industrial research labs (SRI, Sprint Labs, Intel Berkeley Labs, and Technicolor Research) before joining Google. For many years, Nina worked in the field of networking, focusing on Internet traffic modeling, traffic matrix estimation, network protocols, and intrusion detection. In 2017 she received the N2Women award as one of the top 10 women in networking. She now works on privacy-enhancing technologies, with a focus on applications of machine learning for privacy. She has chaired the SIGCOMM, IMC, and PAM conferences, has published over 80 papers, holds 10 patents, and currently serves on the Scientific Advisory Board of the Max Planck Institute.