January 2011 Archives

The panel is being moderated by Kim Hart of Politico and features an impressive lineup of all-star privacy geeks and wonks:



Kim's framing of the issues: the many different frameworks of privacy.  The key question: who are we actually trying to protect our privacy from?  Strangers?  Bosses? Government? Different kinds of government? Companies and corporations trying to sell us things?  What are the different frameworks, and how should they look?

Alma's frame: treating data responsibly and making sure the people who use services have clear and manageable ways to control their preferences.  Advance announcement: two-step verification for Google's products, which provides two-factor authentication through an access code delivered to a user's registered mobile device.  The second announcement is https://encrypted.google.com.  Discussed the Ads Preferences Manager as well.
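(Wonk Note: the mechanics of a second factor are simple. Below is a minimal, hypothetical sketch of a time-based one-time code check in the TOTP style -- this is not Google's actual implementation, and the function names and shared secret are illustrative only.)

    import hashlib
    import hmac
    import struct
    import time

    def totp(secret: bytes, interval: int = 30, digits: int = 6) -> str:
        """Derive a time-based one-time code (RFC 6238 style) from a shared secret."""
        counter = int(time.time()) // interval            # which 30-second window we are in
        msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
        digest = hmac.new(secret, msg, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % (10 ** digits)).zfill(digits)

    def verify_second_factor(secret: bytes, submitted_code: str) -> bool:
        """Second-factor check: does the code the user typed match the expected one?"""
        return hmac.compare_digest(totp(secret), submitted_code)

    # Hypothetical usage: the secret would be provisioned to the user's device.
    shared_secret = b"example-shared-secret"
    print(verify_second_factor(shared_secret, totp(shared_secret)))   # True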

Ed Felten, self-admitted geek with wonkish tendencies, presented from the academic perspective (vs. FTC).  Ed discussed fingerprinting a piece of paper and how a blank piece of paper can be traced.  The implication for privacy: individual pieces of paper can be tracked and fingerprinted, which makes us think about the privacy of paper-based processes such as voting by paper ballot.  He also described Princeton CITP's SPORC system: http://www.cs.princeton.edu/~mfreed/docs/sporc-osdi10.pdf
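(Wonk Note: the heart of designs like SPORC is that the server only ever stores ciphertext, so it cannot read the group's data. This is not SPORC's actual protocol -- just a minimal sketch of client-side encryption before upload, using the third-party cryptography package.)

    from cryptography.fernet import Fernet

    # Client side: encrypt before anything leaves the machine.
    key = Fernet.generate_key()          # stays with the users, never sent to the server
    box = Fernet(key)
    ciphertext = box.encrypt(b"shared document contents")

    # The (untrusted) server stores only `ciphertext`.  Later, any client
    # holding `key` can recover the plaintext:
    print(box.decrypt(ciphertext))       # b'shared document contents'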

Peter's presentation focused on defining privacy.  Privacy begins with the data.  Data describes many points about our lives: what we read and think, where we go, etc.  Who do you want to prevent from having access to your data?  Corporations? Divorce lawyers? Governments?  To what end do you wish to have privacy?  From bad laws? From law enforcement? From authoritarian governments?  There is no single technology that will give us privacy, but there are three you should know about: (1) HTTPS, aka HTTP over the Secure Sockets Layer -- see EFF's HTTPS Everywhere (Wonk Note: it introduces a risk in that it exposes the user as using HTTPS...explain later).  (2) Do Not Track -- a policy fix for a technical problem; Stanford University has proposed a technical solution at http://donottrack.us/.  (3) Tor -- based on onion routing -- a tool that bounces your communications around the world.  (Wonk Note: Tor also introduces a risk.  At exit nodes, Tor loses its encryption and your information will exit in the clear.  ONLY send encrypted data out over Tor; never send clear data over the Tor network.)
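(Wonk Note: to make (1) and (2) concrete, here is a minimal sketch of a client making a TLS-encrypted request that also sends the proposed "DNT: 1" header. The URL is just an example, and whether a site honors the header is entirely up to the site -- DNT is a preference signal, not an enforcement mechanism.)

    import urllib.request

    # Minimal sketch: an HTTPS request that also carries the proposed
    # Do Not Track preference header ("DNT: 1").  Example URL only.
    request = urllib.request.Request(
        "https://encrypted.google.com/",
        headers={"DNT": "1"},   # "please do not track me" -- honoring it is voluntary
    )
    with urllib.request.urlopen(request) as response:
        print(response.status, response.reason)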

Ari -- How do we bridge the gap between wonks wanting technology to implement the fair information principles and geeks trying to interpret those principles?  (Wonk Note: privacy engineering!)  Geeks are allergic to wonks and vice versa.  ISO created a process to interpret legal issues into standards; standards should support public policy.  There is no measurement of privacy -- in order to get to something industry can use, it has to be tied to some sort of measurement.  Policy can be turned into measurement: we can come up with measures of how things are interpreted and turn them into a standard.  In privacy we have a problem the security industry does not -- we have policies embodied in the Fair Information Practice Principles, but almost no measurement around the technologies.  He used Do Not Track and other opt-out mechanisms as an example -- Microsoft, Google, and others have opt-out technologies, but there is no standard.  Expressed concern about going in one direction without measurement and standards.

Kim: Security and privacy seem to be separate, with people focusing on one or the other but not really both.  How do we merge the discussions so that some of the points being made about privacy get integrated into security?

Alma: Missed her response
Felten: Missed
Peter: turning on HTTPS will give you more security and privacy; however, it doesn't matter if the connection is secure if the back-end is doing something insidious with the data. 
Ari -- If you don't have security, you don't have privacy.  In the late '90s we tried to separate privacy and security because security was the overarching concern.  However, after high-profile breaches (Wonk Note: the VA laptop breach was a watershed policy moment and a bad example of fire-alarm oversight) we are starting to get much better integration between security and privacy.

Kim:  What can technologists do to help inform the process you are going through -- Ari with measurements, Ed with what you're tackling at the FTC?  What can technologists do to help with the process?

Ed: What really helps is a dialogue.  We need to talk about what some of the technical trade-offs are.  Ultimately, what is happening in this space is an attempt to make the technology and the policy mesh.

Ari: There is concern about access -- for example, in the health space, doctors are concerned that individuals will change their records to something that is medically wrong or against medical advice.  Consumer groups want open access and argue individuals will make their records more accurate.  No one has done a study of whether the information becomes more or less accurate when individuals are given access.

Ed: A couple of things about measurement: it is certainly valuable, and we don't do enough of it.  It's also important to look at the limits of measurement -- it is a backward-looking process.  In technology, we need to look forward toward solutions.  We need to do both.

Ari: Agrees.  However, we need to give technologists the information to move forward. 

Question from audience: Can government help move the ball forward with Do Not Track by being a first adopter? (paraphrased)
Peter: Yes! (emphatically)
Ed: It has to do with what the consumer is opting out of...
Peter: One further remark: we do not see Do Not Track being enforceable for a first-party website; it would be for very large third-party websites.  For first parties, we feel it should be more like a helpful signal saying "this person may not want to be tracked."
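(Wonk Note: on the third-party side, honoring the signal can be as simple as checking the request header before setting a tracking cookie. A minimal, hypothetical WSGI sketch -- the cookie name and endpoint are illustrative only.)

    from wsgiref.simple_server import make_server

    def app(environ, start_response):
        """Hypothetical third-party widget endpoint that respects DNT."""
        dnt = environ.get("HTTP_DNT")                 # the "DNT" request header, if sent
        headers = [("Content-Type", "text/plain")]
        if dnt != "1":
            # Only set the (illustrative) tracking cookie when the user has
            # not expressed a Do Not Track preference.
            headers.append(("Set-Cookie", "tracking_id=example; Path=/"))
        start_response("200 OK", headers)
        return [b"widget content\n"]

    if __name__ == "__main__":
        make_server("", 8000, app).serve_forever()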

Question from audience: What would be good practices on data retention -- from the government side and the corporate side?
Ari -- Minimization is one of the Fair Information Practices.  There isn't one technology or policy that will cure it all.  Let's move away from keeping data forever and toward tying retention to a specific use.  (Wonk Note: see the retention sketch below.)
Peter: Give consumers choice and control over that.  If you're going to keep data on me for more than 12 months, I should be able to access that information.
Ed: missed his response
Alma: The question you raised is a familiar one.  Google has been engaged more in Europe than here in the US.  We've been vigorously engaged on the topic.
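(Wonk Note: as promised above, the retention sketch.  Tying retention to a use can be as simple as storing a purpose and a retention window with every record and purging on a schedule.  This is hypothetical -- the table and column names are illustrative only.)

    import sqlite3
    import time

    def purge_expired(db_path: str) -> int:
        """Delete records whose declared retention window has passed."""
        conn = sqlite3.connect(db_path)
        with conn:
            cursor = conn.execute(
                "DELETE FROM user_data WHERE collected_at + retention_seconds < ?",
                (int(time.time()),),
            )
        conn.close()
        return cursor.rowcount   # number of rows purged

    # Hypothetical schema:
    #   user_data(user_id, purpose, collected_at, retention_seconds, payload)
    # Each record carries the retention window appropriate to its declared purpose.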

Question from the audience: We've been commoditizing personal information for a long time; it's the economic reality.  Should I be able to sell my own information in some way?
Alma: We only started interest-based advertising two years ago...and we kept the lights on before that.  Transparency is going to be critical.  The people in the space between technology and policy have a challenge: to stand up and make this type of tech understandable.  (Wonk Note: hell yea!)
Ed: We sometimes see that users are surprised and offended to learn that an information use is happening.  Users and companies need to have an understanding, and users need a choice to control it.  (Wonk Note: simple privacy policies!)

Question from audience: A question about the National Strategy for Trusted Identities in Cyberspace (NSTIC).  How will it affect geeks and wonks?
Ari: It has to play out so that it does not have a major negative influence.  One of the technologies Alma showed was two-factor authentication; if we can get a technology that has better trust -- that is the goal of NSTIC.  The question is, do we build a single system (Wonk Note: no) or a marketplace for this type of technology (Wonk Note: yes)?  There is NOT going to be an ID card.  It will NOT be government run.  It will be sector run.  The NSTIC approach is to have discussions with industry to help implement the strategy successfully.






Data Privacy Day 2011

Today is Data Privacy Day and I will be attending Google's panel discussion, "The Technology of Privacy: When Geeks Meet Wonks."  The panel is being moderated by Kim Hart of Politico and features an impressive lineup of all-star privacy geeks and wonks:


In 2010 we saw a huge surge of privacy concerns in the press and potential legislative action in the U.S. to help protect end users.  A constant message over the past year was that users simply did not know the risks of information exposure or what solutions existed to help.  So, as today is Data Privacy Day, help educate those around you about privacy and how they can control their information.  Take time to explain simple concepts like user settings on various websites and how they impact information.  Discuss geo-location information and its great potential for misuse.

Strong security helps protect information, which can enhance privacy.  Protect yourself, your data, and the networks you work on.  If you're using wireless, make sure it is not only secure but  uber-secure.

For those using social networks, be careful about who you accept as a friend.  It is very easy to present yourself as someone else.  A healthy dose of skepticism is a good thing in this day and age.

Check out the awesome work that the Center for Democracy & Technology is doing with Take Back Your Privacy: http://www.cdt.org/takebackyourprivacy

Lastly, for those of you in positions to influence privacy within your company, agency, program, or system development effort: dig in your heels and stick to principles.  Make sure the data is truly needed, and stop collecting excessive personal information.  Don't allow the use of the SSN as a lazy unique ID -- it's not unique and it's a dangerous information commodity.  Don't sell information to marketing companies just to make a few bucks on the side; it abuses your users' trust.  Speaking of users, be open and transparent -- if you have to publish a 20-page privacy policy because the legal department demands it, also publish a digestible version so the general public can understand exactly what is happening with their data.  Consider becoming a supporter of Mozilla's Privacy Icons project.  If you want to be a trail-blazing and bold company, especially within the United States, start switching to an explicit opt-in mentality.  Give users access, control, and choice over their information.
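(Wonk Note: replacing the SSN-as-key habit is cheap.  A minimal, hypothetical sketch of minting a random surrogate identifier instead of keying records on the SSN -- the table and column names are illustrative only.)

    import sqlite3
    import uuid

    def create_user(conn: sqlite3.Connection, name: str) -> str:
        """Mint a random surrogate ID instead of keying the record on an SSN."""
        user_id = str(uuid.uuid4())   # opaque, unique, carries no personal meaning
        with conn:
            conn.execute(
                "INSERT INTO users (user_id, name) VALUES (?, ?)",
                (user_id, name),
            )
        return user_id

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (user_id TEXT PRIMARY KEY, name TEXT)")
    print(create_user(conn, "Example User"))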

NB: PrivacyWonk highly approves of geek & wonk interaction and gives giant kudos to Google for one of the best panel names I can remember in recent history.  I am not sure if I will be live-blogging the event, as I really want to pay attention to the content, but stay tuned...