2012_Privacy Papers for Policy Makers
  • 7/31/2019 2012_Privacy Papers for Policy Makers

    1/31

Privacy Papers for Policy Makers 2012


The publication of Privacy Papers for Policy Makers was supported by AT&T, Microsoft, and GMAC.


    November 7, 2012

We are delighted to provide you with FPF's third annual Privacy Papers for Policy Makers, representing cutting-edge research and analysis on a variety of important privacy issues.

The featured works were selected by members of the Future of Privacy Forum Advisory Board (scholars, privacy advocates, and Chief Privacy Officers) based on criteria emphasizing clarity, practicality, and overall utility. Given the excellent submissions we received, choosing was a difficult task. But we believe our Advisory Board has chosen well, and has put together a diverse and thought-provoking collection. Two of the papers were recipients of the IAPP award for best papers presented at the 2012 Privacy Law Scholars Conference.

We hope this relevant and timely scholarship helps inform and stimulate thinking among policy makers and policy influentials in the US and around the world, with whom we are sharing this compilation.

We are delighted to share new ways of thinking about privacy.

We want to thank AT&T, Microsoft, and GMAC for their special support of the Privacy Papers project.

    Sincerely yours,

Christopher Wolf, Founder and Co-chair

Jules Polonetsky, Director and Co-chair


Future of Privacy Forum Advisory Board

Alessandro Acquisti, Associate Professor of Information Technology and Public Policy at the Heinz College, Carnegie Mellon University
Jim Adler, Chief Privacy Officer & General Manager, Data Systems, Intelius
Ellen Agress, Senior Vice President and Deputy General Counsel, News Corporation
Annie I. Antón, Professor and Chair, Georgia Tech School of Interactive Computing
Stephen Balkam, CEO, Family Online Safety Institute
Kenneth A. Bamberger, Professor of Law, Berkeley School of Law
Elise Berkower, Associate General Counsel, Privacy, The Nielsen Company
Debra Berlyn, President, Consumer Policy Solutions
Joan (Jodie) Z. Bernstein, Counsel, Kelley Drye & Warren, LLP and former director of the Bureau of Consumer Protection at the Federal Trade Commission
Michael Blum, General Counsel, Quantcast
Bruce Boyden, Assistant Professor of Law, Marquette University Law School
Allen Brandt, Corporate Counsel, Data Privacy & Protection, Graduate Management Admission Council (GMAC)
Jim Brock, CEO, PrivacyChoice
Justin Brookman, Director, Consumer Privacy, Center for Democracy & Technology
Kathryn C. Brown, Senior Vice President, Public Policy Development and Corporate Responsibility, Verizon
James M. Byrne, Chief Privacy Officer, Lockheed Martin Corporation
Ryan Calo, Assistant Professor, University of Washington School of Law; Affiliate Scholar, Stanford Center for Internet and Society
Dr. Ann Cavoukian, Information and Privacy Commissioner of Ontario
Brian Chase, General Counsel, Foursquare Labs, Inc.
Danielle Citron, Professor of Law, University of Maryland Law School
Maureen Cooney, Senior Counsel and Deputy Chief Privacy Officer, Sprint Nextel
Lorrie Faith Cranor, Associate Professor of Computer Science and Engineering, Carnegie Mellon University
Mary Culnan, Professor Emeritus, Bentley University
Simon Davies, Founder, Privacy International
Michelle De Mooy, Senior Associate, National Priorities, Consumer Action
Elizabeth Denham, Information and Privacy Commissioner for British Columbia
Michelle Dennedy, Chief Privacy Officer, McAfee
Leslie Dunlap, Vice President of Privacy, Policy and Trust, Yahoo! Inc.
Benjamin Edelman, Assistant Professor, Harvard Business School
Erin Egan, Chief Privacy Officer, Policy, Facebook
Keith Enright, Senior Corporate Counsel, Google
Leigh Feldman, SVP, Senior Privacy Executive, Global Compliance Risk Enterprise Privacy, Bank of America
Eric Friedberg, Co-President, Stroz Friedberg
Rip Gerber, President and CEO, Locaid
Scott Goss, Senior Privacy Counsel, Qualcomm
Susan Gindin, Sr. Privacy Manager, Wal-Mart
Jennifer Barrett Glasgow, Chief Privacy Officer, Acxiom
Greg Goeckner, Executive Director, Merchant Risk Council
Kimberly Gray, Chief Privacy Officer, IMS Health
Sean Hanley, Director of Compliance, Zynga Game Network, Inc.
Pamela Jones Harbour, Former Federal Trade Commissioner; Partner, Fulbright & Jaworski LLP
Megan Hertzler, Director of Information Governance, Xcel Energy
Michael Ho, VP Business Development, Bering Media
David Hoffman, Director of Security Policy and Global Privacy Officer, Intel
Marcia Hofmann, Staff Attorney, Electronic Frontier Foundation
Tim Hollenbeck, Director of Global Ethics and Compliance, Procter & Gamble
Chris Hoofnagle, Director, Berkeley Center for Law & Technology's information privacy programs and senior fellow to the Samuelson Law, Technology & Public Policy Clinic
Jane Horvath, Apple, Inc.
Sandra Hughes, Sandra Hughes Strategies, Ltd.
Brian Huseman, Director, Public Policy, Amazon
Jeff Jarvis, Associate Professor; Director of the Interactive Program, Director of the Tow-Knight Center for Entrepreneurial Journalism at the City University of New York
David Kahan, General Counsel, Jumptap
Ian Kerr, Canada Research Chair in Ethics, Law & Technology, University of Ottawa, Faculty of Law
Bill Kerrigan, CEO, Abine, Inc.
Brian Knapp, Chief Privacy Officer and Vice President, Corporate Affairs, Loopt
Jerry Kovach, Senior Vice President, External Affairs, Neustar
John Kropf, Deputy Counsel, Privacy and Information Governance, Reed Elsevier
Fernando Laguarda, Vice President, External Affairs and Policy Counsel, Time Warner Cable
Manuj Lal, Vice President, Legal Affairs, Press Ganey Associates, Inc.
Barbara Lawler, Chief Privacy Officer, Intuit
Peter Lefkowitz, Chief Privacy Officer, Oracle
Adam Lehman, Chief Operating Officer and GM, Lotame Solutions
Gerard Lewis, Senior Counsel and Chief Privacy Officer, Comcast
Chris Libertelli, Head of Global Public Policy, Netflix
Chris Lin, Executive Vice President, General Counsel and Chief Privacy Officer, comScore, Inc.
Brendon Lynch, Chief Privacy Officer, Microsoft
Mark MacCarthy, Vice President of Public Policy, The Software & Information Industry Association
Siobhan M. MacDermott, Chief Privacy Officer, AVG Technologies
Fran Maier, Founder and Board Chair, TRUSTe
Jennifer Mardosz, Chief Privacy Officer, Fox Entertainment Group
William McGeveran, Associate Professor, University of Minnesota Law School
Terry McQuay, President, Nymity
Scott Meyer, CEO, Evidon
Doug Miller, Global Privacy Leader, AOL, Inc.
Saira Nayak, Director of Policy, TRUSTe


Future of Privacy Forum Advisory Board (continued)

Lina Ornelas, General Director for Privacy Self-Regulation, Federal Institute for Access to Information and Data Protection, Mexico
Kimberley Overs, Assistant General Counsel, Pfizer, Inc.
Harriet Pearson, Partner, Hogan and Lovells
George Pappachen, Chief Privacy Officer, Kantar Group
Christina Peters, Senior Counsel, Security and Privacy, IBM
Robert Quinn, Chief Privacy Officer and Senior Vice President for Federal Regulatory, AT&T
Peter Rabinowitz, Chief Privacy Counsel, American Express
MeMe Rasmussen, VP, Chief Privacy Officer, Associate General Counsel, Adobe Systems
Katie Ratt, Executive Counsel, Privacy Policy and Strategy, The Walt Disney Company
Joel R. Reidenberg, Professor of Law, Fordham University School of Law
Neil Richards, Professor of Law, Washington University Law School
Shirley Rooker, President, Call for Action
Mike Sands, President and Chief Executive Officer, BrightTag
Russell Schrader, Chief Privacy Officer and Associate General Counsel, Global Enterprise Risk, Visa Inc.
Paul Schwartz, Professor of Law, University of California-Berkeley School of Law
Cary Sherman, Chairman and CEO, The Recording Industry Association of America
Ho Shin, General Counsel, Millennial Media
Meredith Sidewater, Senior Vice President and General Counsel, LexisNexis Risk Solutions
Emery Simon, Counselor, Business Software Alliance
Dale Skivington, Chief Privacy Officer, Dell
Daniel Solove, Professor of Law, George Washington University Law School
Cindy Southworth, Vice President of Development & Innovation, National Network to End Domestic Violence (NNEDV)
JoAnn Stonier, SVP and Global Privacy & Data Protection Officer, MasterCard
Zoe Strickland, VP, Chief Privacy Officer, United Health Group
Greg Stuart, CEO, Mobile Marketing Association
Lior Jacob Strahilevitz, Sidley Austin Professor of Law, University of Chicago Law School
Peter Swire, Professor, Ohio State University Moritz College of Law
Omar Tawakol, CEO, BlueKai
Omer Tene, Associate Professor, College of Management School of Law, Rishon Le Zion, Israel
Owen Tripp, Co-Founder and Chief Operating Officer, Reputation.com
Catherine Tucker, Mark Hyman, Jr. Career Development Professor and Associate Professor of Management Science, Sloan School of Management, MIT
Steven Vine, Chief Privacy Officer, PulsePoint
Hilary Wandall, Chief Privacy Officer, Merck & Co., Inc.
Mark Weinstein, Founder and CEO, Sgrouples
Michael Zimmer, Assistant Professor in the School of Information Studies, University of Wisconsin-Milwaukee
General Electric


Table of Contents

Bridging the Gap Between Privacy and Design
Deirdre Mulligan and Jennifer King

Going Dark Versus a Golden Age of Surveillance
Peter Swire and Kenesa Ahmad

"How Come I'm Allowing Strangers to Go Through My Phone?" Smart Phones and Privacy Expectations
Jennifer King

Mobile Payments: Consumer Benefits & New Privacy Concerns
Chris Jay Hoofnagle, Jennifer M. Urban, and Su Li

Privacy by Design: A Counterfactual Analysis of Google and Facebook Privacy Incidents
Ira S. Rubinstein and Nathan Good*

The Re-identification of Governor William Weld's Medical Information: A Critical Re-examination of Health Data Identification Risks and Privacy Protections, Then and Now
Dr. Daniel Barth-Jones

Smart, Useful, Scary, Creepy: Perceptions of Online Behavioral Advertising
Blase Ur, Pedro Giovanni Leon, Lorrie Faith Cranor, Richard Shay, and Yang Wang

Will Johnny Facebook Get a Job? An Experiment in Hiring Discrimination via Online Social Networks
Alessandro Acquisti and Christina Fong*

*Recipients of the IAPP award for best papers at the 2012 Privacy Law Scholars Conference

Out of respect for copyright law and for ease of reference, this compilation is a digest of the papers selected by the Future of Privacy Forum Advisory Board and does not contain full text. The selected papers in full text are available through the referenced links.


Bridging the Gap Between Privacy and Design
Deirdre Mulligan and Jennifer King

Full paper available at: http://www.futureofprivacy.org/privacy-papers-2012/

    Executive Summary

This article explores the gap between privacy and design in the context of lateral privacy (privacy issues arising among users of a service rather than from the service provider) on social networking sites (SNSs) and other platforms by analyzing the privacy concerns lodged against the introduction of Facebook's News Feed in 2006. Our analysis reveals that the dominant theory of privacy put forth by regulators, privacy as individual control, offers little insight into the experiences of privacy violation claimed by users. More importantly, we show that this theory is ill equipped to guide the design of SNSs and platforms to avoid similar harms in the future. A rising tide of privacy blunders on social networking sites and platforms drives the search for new regulatory approaches, and privacy regulators across the globe are increasingly demanding that the Fair Information Practice Principles, the embodiment of privacy as individual control, inform the design of technical systems through Privacy By Design. The call for Privacy By Design (the practice of embedding privacy protections into products and services at the design phase, rather than after the fact) connects to growing policymaker recognition of the power of technology to not only implement, but also to settle policy through architecture, configuration, interfaces, and default settings. We argue that regulators would do well to ensure that the concept of privacy they direct companies to embed affords the desirable forms of protection for privacy.

Ideally, there would be a widely used set of methods and tools to aid in translating privacy into design. Today, neither is true. We identify three gaps in the informational self-determination approach that limit its responsiveness to lateral privacy design decisions in SNSs and platforms, and then explore three alternative theories of privacy that provide compelling explanations of the privacy harms exemplified in platform environments. Based on this descriptive utility, we argue that these theories provide more robust grounding for efforts by SNSs and platform developers to address lateral privacy concerns in the design of technical artifacts. Unlike FIPPs, which can be applied across contexts, these theories require privacy to be discovered, not just implemented. To bridge this discovery gap, we turn to the field of Human Computer Interaction (HCI) and dip into the related field of Value Sensitive Design (VSD) to identify tools and methodologies that would aid designers in discovering and ultimately embedding these contextual, socially-oriented understandings of privacy in technical artifacts. Finally, we provide some tentative thoughts on the form and substance of regulations that would prompt corporations to invest in these HCI approaches to privacy.


    Authors

Peter P. Swire is the C. William O'Neill Professor of Law at the Moritz College of Law of the Ohio State University. He is a Senior Fellow with the Future of Privacy Forum, and also a fellow with the Center for American Progress and Center for Democracy and Technology. He has been a recognized leader in privacy, cybersecurity, and the law of cyberspace for well over a decade, as a scholar, government official, and participant in numerous policy, public interest, and business settings. From 2009 until August 2010, Professor Swire was Special Assistant to the President for Economic Policy, serving in the National Economic Council under Lawrence Summers. From 1999 to early 2001, Professor Swire served as the Clinton Administration's Chief Counselor for Privacy, in the U.S. Office of Management and Budget, as the only person to date to have government-wide responsibility for privacy issues. Among his other activities when at OMB, Swire was the White House coordinator for the HIPAA Medical Privacy Rule, and chaired a White House working group on how to update wiretap laws for the Internet Age. In 2012, Professor Swire was lead author of two new books that are the official guides for Certified Information Privacy Professional examinations. Many of his writings appear at www.peterswire.net.

Kenesa Ahmad is an information privacy and cyber security attorney. She received her law degree from the Moritz College of Law of The Ohio State University, where she served as an Articles Editor of the Ohio State Law Journal. She also received her LL.M. from Northwestern University Law School. From 2011 to 2012, Ahmad completed a legal and policy fellowship with the Future of Privacy Forum. She is now an Associate in the global privacy practice of Promontory Financial Group.


"How Come I'm Allowing Strangers to Go Through My Phone?" Smart Phones and Privacy Expectations
Jennifer King

Full paper available at: http://www.futureofprivacy.org/privacy-papers-2012/

Executive Summary

This study examines the privacy expectations of smartphone users by exploring two specific dimensions to smartphone privacy: participants' concerns with other people accessing the personal data stored on their smartphones, and applications accessing this data via platform APIs. We interviewed 24 Apple iPhone and Google Android users about their smartphone usage, using Altman's theory of boundary regulation and Nissenbaum's theory of contextual integrity to guide our inquiry. We found these theories provided a strong rationale for explaining participants' privacy expectations, but there were discrepancies between their expectations, smartphone usage, and existing platform designs and data access practices by application developers. We conclude by exploring this privacy gap and recommending design improvements to both the platforms and applications to address it.

    Author

Jennifer King is a Ph.D. candidate in Information Science at UC Berkeley's School of Information, where she is advised by Professor Deirdre Mulligan. Ms. King's work uses human-computer interaction methods to examine the privacy gap between people's expectations and how technological systems actually function. Her publications include privacy-focused investigations into mobile systems, online social networks, radio-frequency identification (RFID), and digital video surveillance. Ms. King holds a professional master's degree in information management and systems, also from Berkeley's i-School. Prior to her research career, Ms. King worked in security and product management for several Internet companies, most recently Yahoo!.


Mobile Payments: Consumer Benefits & New Privacy Concerns
Chris Jay Hoofnagle, Jennifer M. Urban, and Su Li

Full paper available at: http://www.futureofprivacy.org/privacy-papers-2012/

    Executive Summary

Payment systems that allow people to pay using their mobile phones are promised to reduce transaction fees, increase convenience, and enhance payment security. New mobile payment systems also are likely to make it easier for businesses to identify consumers, to collect more information about consumers, and to share more information about consumers' purchases among more businesses. This is a radical change from the current payment system, which, by design and by legal arrangement, limits the ability of participants to fully track consumer purchases. The shift to mobile payments has large implications for consumer tracking and profiling, and because of nuances in existing anti-marketing laws, the shift could mean that individuals will receive much more spam and telemarketing.

While many studies have reported security concerns as a barrier to adoption of mobile payment technologies, the privacy implications of these technologies have been underexamined. To better understand Americans' attitudes towards privacy in new transaction systems, we commissioned a nationwide, telephonic (wireline and wireless) survey of 1,200 households, focusing upon the ways that mobile payment systems are likely to share information about consumers' purchases.

We found that Americans overwhelmingly oppose the revelation of contact information (phone number, email address, and home address) to merchants when making purchases with mobile payment systems. Furthermore, an even higher level of opposition exists to systems that track consumers' movements through their mobile phones.

This last result speaks directly to emerging business models that attempt to track individuals uniquely through signals emitted from phones. For instance, Navizon I.T.S. claims that it can track "any Wi-Fi enabled smart phone or tablet, including iPhones, iPads, Android devices, BlackBerry, Windows Mobile, Symbian and, of course, laptops." As with many other tracking technologies, it seems to be designed to operate without the knowledge of the individual. Navizon claims, "Unobtrusive surveillance / Navizon I.T.S. works in the background, quietly and unobtrusively locating Wi-Fi-enabled devices. No application is needed on the devices to be tracked. The only requirement is that their Wi-Fi radios be turned on, which is the default in most smart phones, tablets and laptops."

In this paper, we explain some advantages of mobile payment systems, some challenges to their adoption in the United States, and then turn to our main finding: Americans overwhelmingly reject mobile payment systems that track their movements or share identification information with retailers. We then suggest a possible remedy for such information sharing: adapting provisions of California's Song-Beverly Credit Card Act, which prohibits merchants from requesting personal information at the register when a consumer pays with a credit card, to mobile payments systems. Our survey results suggest that consumers would support limitations on information collection and transfer. Song-Beverly could be adapted to accommodate those who wish to share their transaction data.


    Authors

Chris Jay Hoofnagle is director of the Berkeley Center for Law & Technology's information privacy programs and senior fellow to the Samuelson Law, Technology & Public Policy Clinic. He is an expert in information privacy law. He teaches computer crime law and a seminar on the Federal Trade Commission and online advertising. Hoofnagle's research focuses on the challenges in aligning consumer privacy preferences with commercial and government uses of personal information.

Jennifer M. Urban is an Assistant Clinical Professor of Law and Director of the Samuelson Law, Technology & Public Policy Clinic at the UC Berkeley School of Law.

Broadly, her research considers how values such as free expression, freedom to innovate, and privacy are mediated by technology, the laws that govern technology, and private ordering systems. Her clinic students represent clients in numerous public interest cases and projects at the intersection of societal interests (including civil liberties, innovation, and creative expression) and technological change. Recent Clinic projects include work on individual privacy rights, copyright and free expression, artists' rights, free and open source licensing, the smart electricity grid, biometrics, and defensive patent licensing.

Professor Urban comes to Berkeley Law from the University of Southern California's Gould School of Law, where she founded and directed the USC Intellectual Property & Technology Law Clinic. Prior to joining the USC faculty in 2004, she was the Samuelson Clinic's first fellow. Prior to that, she was an attorney with the Venture Law Group in Silicon Valley. She graduated from Cornell University with a B.A. in biological science (concentration in neurobiology and behavior) and from Berkeley Law with a J.D. (intellectual property certificate). She was the Annual Review of Law and Technology editor while a student at Berkeley Law, and received the Berkeley Center for Law and Technology Distinguished Alumni Award in 2003.


Su Li received a PhD in Sociology in 2006 and a MS in Mathematical Methods for Social Science in 2002, both from Northwestern University. She received additional training in quantitative methods from the Stanford Institute for the Quantitative Study of Society (SIQSS) and the Interuniversity Consortium for Political and Social Research (ICPSR) at the University of Michigan. Su Li joined Berkeley Law in January 2010 as the statistical consultant for the school of law. She works with professors, editors of the California Law Review, J.D. and Ph.D. students, as well as affiliated researchers and scholars on research papers/projects, government reports, lawsuit cases, and dissertations. She provides consultation services on data retrieval, model construction, results interpretation, and other relevant issues.

Su Li's research interests are in quantitative methods, social network analysis, law and society, gender and social inequality, economic sociology, and organizations. Her previous research and publications focus on gender segregation and inequalities in higher education. Su Li's current research involves the development and change of the legal profession, especially in the field of white collar criminal litigation. She also participates in projects on the implications of privacy laws in the context of virtual worlds and beyond.

Before joining Berkeley Law, Su Li was an assistant professor of Sociology at Wichita State University in Wichita, Kansas.


systems infrastructure, which are generally hidden from the user but drive the heart of any system; and front-end user interfaces, which (in the privacy setting) handle tasks such as notification, consent, access, preference management, and other user experiences. We therefore analyze privacy by design from two complementary perspectives: privacy engineering, which refers to the design and implementation of software that facilitates privacy, and usable privacy design, which refers to design tasks that focus on human-computer interaction (HCI). The former focuses on building software satisfying the abstract privacy requirements embodied in the FIPs (in some cases overlapping with security engineering), the latter on ensuring that users understand and benefit from well-engineered privacy controls. Our discussion of privacy engineering draws mainly on four key papers in the technical design literature and the works cited therein. In contrast, our discussion of usable privacy design looks at a rather different body of work that finds inspiration in the writings of Irwin Altman, a social psychologist, and Helen Nissenbaum, a philosopher of technology, both of whom analyze privacy in terms of social interaction. In Part II, we offer ten case studies of Google and Facebook privacy incidents and then rely on the principles identified in Part I to discover what went wrong and what the two companies might have done differently to avoid privacy violations and consumer harms. We conclude in Part III by considering what lessons regulators might learn from this counterfactual analysis.

    Authors

Ira Rubinstein is a Senior Fellow at the Information Law Institute. His research interests include Internet privacy, electronic surveillance law, online identity, and Internet security. Rubinstein lectures and publishes widely on issues of privacy and security and has testified before Congress on these topics on several occasions. In September 2009, he organized a conference at the law school on Federal Privacy Legislation, and he participated in the December 2009 Federal Trade Commission Roundtable: Exploring Privacy. In July 2010, he testified at a hearing on a new privacy bill, H.R. 5777, the Best Practices Act, before the House Subcommittee on Commerce, Trade, and Consumer Protection. In 2011, he was awarded a research grant to explore regulatory issues related to privacy by design. In March 2011, he was an invited speaker at a Boalt Hall Law School symposium on Technology: Transforming the Regulatory Endeavor, where he discussed his paper entitled Regulating Privacy by Design. This paper has been published in the Symposium Issue of the Berkeley Technology Law Journal (2012). He also recently commented on the White House proposal to encourage a multistakeholder process for developing consumer privacy codes of conduct.

In June 2012, he co-authored a paper with Nathan Good entitled Privacy by Design: A Counterfactual Analysis of Google and Facebook Privacy Incidents, which was chosen for the IAPP Privacy Law Scholars Award at the 5th Annual Privacy Law Scholars Conference. Other recent publications include Privacy and Regulatory Innovation: Moving Beyond Voluntary Codes, 6 I/S: A Journal of Law and Policy for the Information Society 356 (2011), which was selected by the Future of Privacy Forum in their best Privacy Papers for Policy Makers competition, and Data Mining and Internet Profiling: Emerging Regulatory and Technological Approaches, co-authored with Ron Lee and Paul Schwartz, 75 U. Chi. L. Rev. 261 (2008). Prior to joining the ILI, he spent 17 years in Microsoft's Legal and Corporate Affairs department, most recently as Associate General Counsel in charge of the Regulatory Affairs and Public Policy group. Before coming to Microsoft, he was in private practice in Seattle, specializing in immigration law. He graduated from Yale Law School in 1985. From 1998 to 2001, Rubinstein served on the President's Export Council, Subcommittee on Encryption. He has also served on the Editorial Board of the IEEE Security and Privacy Magazine. In 2010, he joined the Board of Directors of the Center for Democracy and Technology.

Dr. Nathan Good is Principal and Chief Scientist of Good Research. A fundamental goal of his work is helping companies create networked systems, devices, and services that are simple, secure, and respectful of people's privacy. He is a co-author of the 2012 web privacy census, and contributing author to books on privacy and security. Prior to Good Research, Nathan was at PARC, Yahoo, and HP research labs. At Berkeley, he worked with TRUST and the Samuelson Law & Technology Clinic and was a member of the 2007 California Secretary of State Top-to-Bottom Review of Electronic Voting Systems. Nathan has published extensively on user experience studies, privacy, and security related topics, and holds patents on software technology for multimedia systems and event analysis. His research has been reported on in the New York Times, CNN, and ABC, and he has testified on his research before the House, Senate, and FTC. Nathan has a PhD in Information Science and a MS in Computer Science from the University of California at Berkeley and was a member of LifeLock's Fraud Advisory Board.


The Re-identification of Governor William Weld's Medical Information: A Critical Re-examination of Health Data Identification Risks and Privacy Protections, Then and Now
Dr. Daniel Barth-Jones

Full paper available at: http://www.futureofprivacy.org/privacy-papers-2012/

Executive Summary
The 1997 re-identification of Massachusetts Governor William Weld's medical data within an insurance data set which had been stripped of direct identifiers has had a profound impact on the development of de-identification provisions within the 2003 Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule. Weld's re-identification, purportedly achieved through the use of a voter registration list from Cambridge, MA, is frequently cited as an example that computer scientists can re-identify individuals within de-identified data with astonishing ease.

However, a careful re-examination of the population demographics in Cambridge indicates that Weld was most likely re-identifiable only because he was a public figure who experienced a highly publicized hospitalization, rather than there being any certainty underlying his re-identification using the Cambridge voter data, which had missing data for a large proportion of the population. The Cambridge population was nearly 100,000 and the voter list contained only 54,000 of these residents, so the voter linkage could not provide sufficient evidence to allege any definitive re-identification. The statistics underlying this famous re-identification attack make it clear that the purported method of voter list linkage could not have definitively re-identified Weld. While the odds were somewhat better than a coin-flip, they fell quite short of the certainty that is implied by the term "re-identification."

The complete story of Weld's re-identification exposes an important systemic barrier to accurate re-identification known as the myth of the perfect population register. Because the logic underlying re-identification depends critically on being able to demonstrate that a person within a sample data set is the only person in the larger population who has a set of combined characteristics (known as quasi-identifiers) that could potentially re-identify them, most re-identification attempts face a strong challenge in being able to create such a complete and accurate population register. Importantly, each person missing from an imperfect population register is directly protected from re-identification attempts using the register -- but these missing individuals also importantly confound attempts to re-identify others whenever such incomplete registers are used in re-identification attempts. When just a single person sharing the same quasi-identifier characteristics with a purported re-identification victim is missing from the voter register, then the probability of a correct re-identification for this target is only 50%.
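The coin-flip arithmetic can be made concrete with a short sketch (an illustrative calculation, not code from the paper): if a record matches exactly one person in the register, but some number of people sharing the same quasi-identifiers are absent from the register, the attacker's confidence in the match falls accordingly.

```python
def reid_confidence(missing_with_same_traits):
    """Probability that a unique register match is the true target,
    assuming everyone sharing the quasi-identifiers (whether in or out
    of the register) is equally likely to be the person in the record."""
    return 1 / (1 + missing_with_same_traits)

print(reid_confidence(0))  # complete register: certainty (1.0)
print(reid_confidence(1))  # one person missing: a coin flip (0.5)
```

With an incomplete register, a "unique" match can never, on its own, establish the certainty the word re-identification implies.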

This strong limitation not only underlies the entire set of famous Cambridge re-identification results but also impacts much of the existing re-identification research cited by those making claims of easy re-identification, a fact which must be understood by public policy-makers seeking to realistically assess current privacy risks posed by de-identified data.

Fortunately, HHS responded to the concerns raised by the Weld/Cambridge voter list privacy attack and, through the HIPAA Privacy Rule, acted to help prevent re-identification attempts. Re-identification risks for de-identified health data, as protected by the HIPAA Privacy Rule de-identification provisions since 2003, show dramatic (thousands-fold) reductions. Available evidence further suggests that re-identification risks under current HIPAA protections are now well-controlled.


In 2007, the National Committee on Vital and Health Statistics received testimony that 0.04 percent (4 in 10,000) of the individuals in the U.S. population within data sets de-identified using the Safe Harbor method could possibly be identified on the basis of their year of birth, gender and three-digit ZIP code.

A 2010 study estimated re-identification risks under the HIPAA Safe Harbor rule on a state-by-state basis using voter registration data. The percentage of a state's population estimated to be vulnerable (i.e., not definitively re-identified, but potentially re-identifiable) ranged from 0.01 percent to 0.25 percent.

The Office of the National Coordinator for Health Information Technology conducted a 2011 study examining an attack on HIPAA de-identified data under realistic conditions, testing whether HIPAA Safe Harbor de-identified data could be combined with external data to re-identify patients. The study was performed under practical and plausible conditions and verified the re-identifications against direct identifiers, a crucial step often missing from this sort of study. The study used 15,000 de-identified patient records and showed a match for only two of the fifteen thousand individuals (a re-identification rate of 0.013 percent). Even when maximally strong assumptions were made about the possible knowledge of a hypothetical data intruder, the re-identification risk (under the questionable assumption that re-identification would even be attempted) was likely to be less than 0.22 percent.
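As a quick arithmetic check, the headline rates follow directly from the counts reported in these studies:

```python
# NCVHS 2007 testimony: 4 in 10,000 potentially identifiable under Safe Harbor.
ncvhs_rate = 100 * 4 / 10_000
print(f"{ncvhs_rate:.2f}%")   # 0.04%

# ONC 2011 study: 2 verified matches among 15,000 de-identified records.
onc_rate = 100 * 2 / 15_000
print(f"{onc_rate:.3f}%")     # 0.013%
```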

Because a vast array of healthcare improvements and medical research critically depend on de-identified health information, the essential public policy challenge then is to accurately assess the current state of privacy protections for de-identified data, and properly balance both risks and benefits to maximum effect. While one can point to very few, if any, cases of persons who have been harmed by attacks with verified re-identifications, virtually every member of our society has routinely benefited from the use of de-identified health information.

Considerable costs come with incorrectly evaluating the true risks of re-identification under current HIPAA protections. It is essential to understand that de-identification comes at a cost to the scientific accuracy and quality of the healthcare decisions that will be made based on research using de-identified data. Balancing disclosure risks and statistical accuracy is crucial because some popular de-identification methods, such as k-anonymity methods, can unnecessarily, and often undetectably, degrade the accuracy of de-identified data for multivariate statistical analyses. This problem is well understood by statisticians and computer scientists, but not well-appreciated in the public policy arena. Poorly conducted de-identification and the overuse of de-identification methods in cases where they do not produce real privacy protections can quickly lead to incorrect scientific findings and damaging policy decisions.
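The k-anonymity trade-off can be seen in a minimal sketch (the records and quasi-identifiers below are invented for the example, not drawn from the paper): coarsening ZIP codes until every quasi-identifier combination is shared by at least k records protects against linkage, but the lost geographic detail is exactly the kind of information a multivariate analysis may have needed.

```python
from collections import Counter

def is_k_anonymous(records, quasi_ids, k):
    """True if every quasi-identifier combination appears at least k times."""
    combos = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return all(n >= k for n in combos.values())

def generalize_zip(records, digits):
    """Coarsen ZIP codes to their first `digits` digits, a common generalization."""
    return [{**r, "zip": r["zip"][:digits]} for r in records]

records = [
    {"zip": "02138", "birth_year": 1945, "diagnosis": "A"},
    {"zip": "02139", "birth_year": 1945, "diagnosis": "B"},
    {"zip": "02141", "birth_year": 1945, "diagnosis": "A"},
]
quasi = ["zip", "birth_year"]

print(is_k_anonymous(records, quasi, 2))                     # False: full ZIPs are unique
print(is_k_anonymous(generalize_zip(records, 3), quasi, 2))  # True: all rows share "021"
```

Whether that coarsening harms a downstream analysis is precisely the question the paper argues is often left unexamined.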

De-identified health data is the workhorse that supports numerous healthcare improvements and a wide variety of medical research activities. The critical role that de-identified health information plays in improving healthcare is becoming increasingly widely recognized, but properly balancing the competing goals of protecting patient privacy while also preserving the accuracy of research requires policy makers to realistically assess both sides of this coin. De-identification policy must achieve an ethical equipoise between potential privacy harms and the very real benefits that result from the advancement of science and healthcare improvements which are accomplished with de-identified data.


The paper also provides recommendations for enhancements to existing HIPAA de-identification policy, such as:

• Prohibiting the re-identification, or attempted re-identification, of individuals and their relatives, family or household members.

• Requiring parties who wish to link new data elements (which might increase re-identification risks) with de-identified data to confirm that the data remains de-identified.

• Specifying that HIPAA de-identification status would expire if, at any time, the data contains data elements specified within an evolving Safe Harbor list, which should be periodically updated by HHS.

• Formally specifying that for statistically de-identified data, anticipated data recipients must always comply with specified time limits, data use restrictions, qualifications or conditions set forth in the statistical de-identification determination associated with the data.

• Requiring those holding and using de-identified data to implement and maintain appropriate data security and privacy policies, procedures and associated physical, technical and administrative safeguards.

• Requiring those transferring de-identified data to third parties to enter into data use agreements which would oblige those receiving the data to also hold to these conditions, thus maintaining an important chain-of-trust data stewardship principle accompanying de-identified data.

Conclusion
William Weld's 1997 re-identification had an important impact on improving healthcare privacy because it led to regulations that help protect patients from re-identification risks. But the Weld saga does not reflect the privacy risks that exist under the HIPAA Privacy Rule today. We should not let today's minimal re-identification risks cause us to abandon our use of de-identified data to protect privacy, save lives and continue to improve our healthcare system.

    Author

Daniel C. Barth-Jones, MPH, PhD, is a statistical disclosure control researcher and HIV epidemiologist serving as an Assistant Professor of Clinical Epidemiology at the Mailman School of Public Health at Columbia University and an Adjunct Assistant Professor and Epidemiologist at the Wayne State University School of Medicine. Dr. Barth-Jones' work on statistical de-identification science focuses on the importance of properly balancing two public policy goals: effectively protecting individuals' privacy and preserving the scientific accuracy of statistical analyses conducted with de-identified health data.


Smart, Useful, Scary, Creepy: Perceptions of Online Behavioral Advertising
Blase Ur, Pedro Giovanni Leon, Lorrie Faith Cranor, Richard Shay, and Yang Wang

Full paper available at: http://www.futureofprivacy.org/privacy-papers-2012/

Executive Summary
In recent years, Internet advertising has become increasingly tailored to individual users. In the simplest case, contextual advertising, advertising networks choose which ads to display on a webpage based on the contents of that page. In the more complex technique of online behavioral advertising (OBA), advertising networks profile a user based on his or her online activities, such as the websites he or she visits over time. Using this profile, advertising networks show ads that are more likely to be of interest to a particular user.

OBA presents both benefits and downsides to users. If their interests have been accurately profiled, users will receive more relevant advertising. However, collecting data about users' online activities can potentially violate their privacy. Previous research has found that users have substantial privacy concerns about OBA, while marketing surveys have found that consumers like OBA and that discomfort with OBA is reduced when users are informed that non-personally identifiable information is used for OBA. Whereas past work employed surveys, which can sample a large number of individuals but are not conducive to open-ended questions exploring attitudes and motivations, we conducted interviews to learn how past experiences, knowledge, and understanding factor into users' attitudes toward online behavioral advertising.

In this paper, we report results of 48 semi-structured interviews that unpack the factors fueling users' attitudes about OBA. Beyond asking participants their opinions, we investigated their knowledge of the current practice of OBA and tools to control it, their understanding of how profiles can be created, and the extent to which the circumstances of data collection and the identity of the advertising network influence their attitudes.

Attitudes About Internet Advertising and OBA
Participants were surprised that OBA currently occurs. While a number of participants believed that browsing history could theoretically be used to target advertising, few were aware that this technique is currently used. In contrast, many participants were familiar with contextual ads on first-party sites, such as Amazon and Facebook.

After learning about OBA, many participants perceived some benefits from behavioral advertising, yet the majority of participants noted that the practice negatively impacted their privacy. Participants mentioned lack of transparency and control as well as discomfort with being monitored. Taken as a whole, participants found OBA smart, useful, scary, and creepy at the same time. They often believed that personal information is collected during OBA, potentially influencing their attitudes toward the practice. Participants varied in the types of browsing situations in which they would like data to be collected for OBA purposes, basing these decisions on both privacy and utility.


Effectiveness of Notice and Choice Mechanisms
Participants' responses suggested that current approaches for providing notice about OBA are ineffective. Only a handful of participants understood the meaning of industry-created icons intended to notify consumers about OBA. Instead, they believed that icons intended to provide notice about OBA would let them express interest in the product being advertised or purchase their own ads. Participants could not accurately determine what information is collected for OBA purposes, or by whom, and they assumed the worst, leading them to oppose a practice they expected would involve the collection of personally-identifiable and financial information.

Our results also identify disconnects between participants' mental models and current approaches for giving consumers control over OBA. Participants were unaware of existing tools for controlling OBA, and they were unsure where to turn to protect their privacy. To exercise consumer choice, participants expected that they could use familiar tools, such as their web browsers' settings, deleting their cookies, or antivirus software suites. However, mechanisms to exercise choice about OBA in browsers are limited and difficult to use. Deleting cookies, participants' most common response, would nullify their opt-outs. A Do Not Track header has been designed to allow users to set a preference in their browser that does not disappear when cookies are deleted. However, efforts to fully define the meaning of Do Not Track are still ongoing in the W3C Tracking Protection Working Group.
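Mechanically, Do Not Track is just an HTTP request header: a browser that honors the user's preference attaches DNT: 1 to each request. A minimal sketch with Python's standard library (the URL is a placeholder, and this only shows the client side):

```python
from urllib.request import Request

# Attach the Do Not Track preference header ("1" = user opts out of tracking).
req = Request("https://example.com/", headers={"DNT": "1"})
print(req.get_header("Dnt"))  # urllib capitalizes stored header names; prints: 1
```

Because the preference lives in the browser's configuration rather than in a cookie, it survives cookie deletion; whether a server actually honors it is exactly what the W3C working group was still defining.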

Furthermore, existing privacy tools ranging from opt-out pages to browser plug-ins expect consumers to express OBA preferences on a per-company basis. However, participants misunderstood the role of advertising networks in the OBA ecosystem, evaluating companies based solely on activities unrelated to advertising. Participants expressed complex OBA preferences that depended on the context of their browsing, an approach that is unsupported by current mechanisms. Future investigation is needed to test notice and choice mechanisms that better align with users' understanding of OBA, particularly by taking users' mental models of the process into consideration.

Conclusions
Participants found behavioral advertising both useful and privacy-invasive. The majority of participants were either fully or partially opposed to OBA, finding the idea smart but creepy. However, this attitude seemed to be influenced in part by beliefs that more data is collected than actually is. Participants understood neither the roles of different companies involved in OBA, nor the technologies used to profile users, contributing to their misunderstandings.

Given effective notice about the practice of tailoring ads based on users' browsing activities, participants would not need to understand the underlying technologies and business models. However, our research suggests that current notice and choice mechanisms are ineffective. Furthermore, current mechanisms focus on opting out of targeting by particular companies, yet participants displayed faulty reasoning in evaluating companies. In contrast, participants displayed complex preferences about the situations in which their browsing data could be collected; yet they currently cannot exercise these preferences. Our results suggest that rather than tools for opting out of tracking by individual companies, there is a need for easy-to-use tools that allow consumers to opt out of certain types of tracking or data practices they find objectionable, or to opt out of tracking on certain types of websites or in certain contexts (e.g., healthcare). In addition, our results suggest a need for more effective communication with users about when and how OBA occurs.


    Authors

Blase Ur is a second-year Ph.D. student in the School of Computer Science at Carnegie Mellon University. His research focuses on usable security and privacy, including passwords, online behavioral advertising, and privacy decision making. He received his undergraduate degree in computer science from Harvard University.

Pedro Giovanni Leon is a Ph.D. student in Engineering and Public Policy at Carnegie Mellon University. His research focuses on investigating strategies that protect non-expert users' privacy in today's complex Internet ecosystem. In particular, he is interested in assisting the design and implementation of both regulations and technologies that improve current transparency and control mechanisms in the context of online tracking and behavioral advertising. He received a master's degree in Information Security Technology and Management from Carnegie Mellon University and a bachelor's degree in Telecommunications Engineering from the School of Engineering at the National Autonomous University of Mexico. Before coming to CMU, he worked for the Central Bank of Mexico.

Lorrie Faith Cranor is an Associate Professor of Computer Science and of Engineering and Public Policy at Carnegie Mellon University, where she is director of the CyLab Usable Privacy and Security Laboratory (CUPS). She is also a co-founder of Wombat Security Technologies, Inc. She has authored over 100 research papers on online privacy, usable security, phishing, and other topics. She has played a key role in building the usable privacy and security research community, having co-edited the seminal book Security and Usability (O'Reilly 2005) and founded the Symposium On Usable Privacy and Security (SOUPS). She also chaired the Platform for Privacy Preferences Project (P3P) Specification Working Group at the W3C and authored the book Web Privacy with P3P (O'Reilly 2002). She has served on a number of boards, including the Electronic Frontier Foundation Board of Directors, and on the editorial boards of several journals. She was previously a researcher at AT&T Labs-Research and taught in the Stern School of Business at New York University.


Richard Shay is a fourth-year Ph.D. student in the School of Computer Science at Carnegie Mellon University. His research focuses on usable privacy and security, studying online behavioral advertising and password policy. He received an undergraduate degree in computer science and classics from Brown University, and a master's degree in computer science from Purdue University.

Yang Wang is an assistant professor in the School of Information Studies at Syracuse University. His research is centered around privacy and security, and social computing. He was a research scientist at CyLab at Carnegie Mellon University. There, he collaborated with Bell Labs on privacy enhancing technologies, and researched privacy issues in online behavioral advertising and privacy concerns of online social networks across different cultures. He has also been working on studies, models and preventive systems related to regrettable behavior in social media. He received his Ph.D. in information and computer sciences from the University of California, Irvine. In his thesis work, he built a privacy enhancing personalization system that takes into consideration privacy regulations and individuals' privacy preferences. Wang previously worked at Intel Research, Fuji Xerox Palo Alto Laboratory, and CommerceNet.


Will Johnny Facebook Get a Job? An Experiment in Hiring Discrimination via Online Social Networks
Alessandro Acquisti and Christina Fong

Full paper available at: http://www.futureofprivacy.org/privacy-papers-2012/

Executive Summary
Nowadays, many job seekers publicly disclose online personal information that is risky for employers to ask about in face-to-face interviews or to use in the official hiring process. In most of the United States, for instance, an employer who asks a job applicant questions about her religious affiliation, sexual preference, or family status may be sued for discrimination under the Equal Employment Opportunity laws. Thus, even in extensive interviews, much of that information frequently remains private.

Employers' costs of acquiring the same data online, however, are much lower: the information is often a few clicks away, and the risks of detection are substantially lower. With the rise of social networking sites, micro-blogging, and other Web 2.0 services, new opportunities for labor market discrimination have clearly arisen. Anecdotal evidence and self-report surveys suggest that U.S. firms have, in fact, started using various online services to seek information about prospective hires. According to the employers, the information sought online is benign: firms admit to searching blogs or online profiles for evidence of professional or unprofessional behaviors and traits. However, so much more can be gleaned about prospective hires from their online presences. A tweet can reveal a place of worship. A blog post can imply a person's sexual preference. A photo on LinkedIn can show her race. A comment on Facebook - or even just an image chosen as the online profile's background - can indicate her family status.

To date, however, no controlled experiment has investigated the extent to which firms use online resources to find information about job applicants, and how their hiring activities are influenced by the information they find. In particular, no experiment has established whether protected information that employers are discouraged from asking about during interviews, but which can be found on social networking sites, affects their employment decisions. We used two randomized experiments to investigate the effects of job candidates' personal information, posted on a popular social networking site, on the search activities of employers. The experiments shared a common design: we used data revealed online by actual members of popular social networking sites and job seeking sites to design resumes and online presences of prospective job candidates. We manipulated those candidates' personal information, focusing on traits that U.S. employers may not lawfully consider in the hiring process, and therefore should not inquire about during interviews, and measured individuals' and HR professionals' responses to those profiles. Our current findings suggest that information found online about prospective job candidates can, in fact, be a source of hiring discrimination.


    Authors

Alessandro Acquisti is an associate professor at the Heinz College, Carnegie Mellon University, the director of the CMU PeeX (Privacy Economics Experiments) lab, and the co-director of the CMU Center for Behavioral and Decision Research (CBDR). Alessandro has held visiting positions at the Universities of Rome, Paris, and Freiburg (visiting professor); Harvard University (visiting scholar); the University of Chicago (visiting fellow); Microsoft Research (visiting researcher); and Google (visiting scientist). He has been a member of the National Academies' Committee on public response to alerts and warnings using social media.

Alessandro's research investigates the economics of privacy. His studies have spearheaded the application of behavioral economics to the analysis of privacy and information security decision making, and the analysis of privacy and disclosure behavior in online social networks. His studies have been published in journals across several disciplines (including the Proceedings of the National Academy of Sciences, the Journal of Consumer Research, the Journal of Marketing Research, Marketing Science, Information Systems Research, Social Psychological and Personality Science, the Journal of Comparative Economics, and ACM Transactions), as well as edited books, conference proceedings, and numerous keynotes. Alessandro has been the recipient of the PET Award for Outstanding Research in Privacy Enhancing Technologies, the IBM Best Academic Privacy Faculty Award, multiple Best Paper awards, and the Heinz College School of Information's Teaching Excellence Award. His research has been supported by awards and grants from the National Science Foundation, the TransCoop Foundation, Microsoft, and Google.

Alessandro has testified before Senate and House committees on issues related to privacy policy and consumer behavior, and participated in policy-finding activities of the Federal Trade Commission, DARPA, the European Network and Information Security Agency, and various national privacy commissioner authorities. In 2009, he was the invited co-chair of the cyber-economics track at the National Cyber Leap Year Summit, as part of the NITRD Program, under guidance from the White House's Office of Science and Technology Policy.

Alessandro's findings have been featured in national and international media outlets, including the Economist, the New York Times, the Wall Street Journal, the Washington Post, the Financial Times, Wired.com, NPR, and CNN. His 2009 study on the predictability of Social Security numbers (SSNs) was featured in the "Year in Ideas" issue of the NYT Magazine (the SSNs' assignment scheme was changed by the US Social Security Administration in 2011). Following his study on face recognition and online social networks, in December 2011 Alessandro was invited to participate in the Federal Trade Commission's forum on facial recognition technology.

Alessandro holds a PhD from UC Berkeley, and Master's degrees from UC Berkeley, the London School of Economics, and Trinity College Dublin. While at Berkeley, he interned at Xerox PARC and RIACS, NASA Ames.


Christina Fong is a Senior Research Scientist in the Department of Social and Decision Sciences at Carnegie Mellon University. She has a BA in Economics from the University of Michigan, Ann Arbor and an MA and PhD in Economics from the University of Massachusetts, Amherst. Prior to earning her PhD, she acquired three years of full-time experience in the public sector working for the U.S. Bureau of Labor Statistics, the Massachusetts State Senate, and the Inter-American Development Bank. She has held visiting positions at the Economic Science Laboratory at the University of Arizona and the Department of Political Science at Washington University in St. Louis and received research funding from the National Science Foundation, the John D. and Catherine T. MacArthur Foundation, the Russell Sage Foundation, and the NSF-funded program Time-Sharing Experiments for Social Scientists (TESS).

Christina's research focuses on behavioral motives in non-market settings and in imperfectly competitive markets. Her most recent research concerns discrimination in a variety of such settings, including labor markets, interpersonal cooperation, and charitable giving. In perfectly competitive labor markets, discrimination purely on the basis of personal characteristics unrelated to job performance should not occur. If some employers discriminate because of distaste for a particular type of person, other employers could profit by hiring members of this less preferred group, driving their wages up to the competitive level. Thus we must ask: when we see income disparities between different types of people, does that represent rational, profit-seeking discrimination on the basis of different levels of human capital, or does it represent inefficient behavior stemming from distaste for different categories of people? Similar arguments apply to charitable giving and preferences for public policy. If an altruistic donor's or voter's goal is to help the poor in general, then racial and ethnic bias based purely on social group membership should not occur. Fong's research suggests that i) there is some racial bias in charitable giving to poor people but ii) it stems not from a simple distaste for people of different races but from beliefs that members of one's own racial group are more morally worthy (e.g., harder working, less eager to take advantage of handouts) than members of a different racial group. In addition to her recent research on discrimination, Fong has a long-term research agenda on the role of fairness in economic behavior. She has shown that even when we subject data to a high degree of statistical rigor, preferences for redistribution of income and wealth stem not only from economic self-interest but also to a large extent from desires for fairness.

Christina is a frequent reviewer for leading academic journals in economics. She speaks to philanthropic and other non-profit groups about practical implications of research on generosity. Her research has been featured in a variety of media outlets including the Financial Times Magazine, the Pittsburgh Post-Gazette, and the Chronicle of Philanthropy.


Privacy Papers of Notable Mention
To View the Following Papers Visit: http://www.futureofprivacy.org/privacy-papers-2012/

Differential Privacy as a Response to the Reidentification Threat: The Facebook Advertiser Case Study
By: Andrew Chin and Anne Klinefelter

Dutch Treat? Collaborative Dutch Privacy Regulation and the Lessons it Holds for U.S. Privacy Law
By: Dennis Hirsch

Internet Advertising After Sorrell v. IMS Health: A Discussion on Data Privacy & The First Amendment
By: Agatha Cole

Why Johnny Can't Opt Out: A Usability Evaluation of Tools to Limit Online Behavioral Advertising
By: Pedro Giovanni Leon, Blase Ur, Rebecca Balebako, Lorrie Faith Cranor, Richard Shay, and Yang Wang


This report was designed with the environment and cost-effectiveness in mind. It is printed on recovered fiber paper that has no ozone-layer-threatening emissions and generates no detectable amounts of sulfur, chlorine, nitrogen, or dioxide gases when properly incinerated.


About the Future of Privacy Forum
The Future of Privacy Forum (FPF) is a Washington, DC based think tank that seeks to advance responsible data practices. The Forum is led by Internet privacy experts Jules Polonetsky and Christopher Wolf and includes an advisory board comprised of leading figures from industry, academia, law and advocacy groups.

To learn more about FPF please visit www.futureofprivacy.org