- Joel Reidenberg, Professor of Law, Fordham University
- Timothy B. Lee, blogger at Technology Liberation Front and adjunct scholar, Cato Institute
- Marc Rotenberg, Executive Director, Electronic Privacy Information Center
The panelists presented their positions, responded to each other, and answered audience questions. I don't feel that I can do justice to the panelists and everything they said, so I'll just give a little flavor of it here. When the video recording of the session becomes available, I'll add it at the bottom of this post.
Tim Lee started things off with the position that privacy is governed by a series of trade-offs. Some data sharing is a prerequisite for any useful online service, and users are generally willing to give up some privacy in return for a valuable service; some users will be willing to share more information for more value. Tim also spoke a bit about the history of browser cookies, GMail, and the Facebook news feed. All 3 of these things were initially viewed negatively by at least some segment of the user community. In all 3 cases, users became more accepting as they learned more about how the technologies worked, what "opt-out" options existed, and what benefits they could derive from the technologies. A key point Tim made is that having private companies collect data about you is less troubling than having the government do the same. If you don't like the policies of a particular service provider, you can choose not to use that provider, as there are others around with different policies. There are no such choices available when it comes to the government.
Joel Reidenberg focused on 3 sets of implications: ownership of data, embedded values in the architecture, and irony. Data ownership is really about how you get to use the bits and bytes. Fair information practice standards provide a control here; however, if data usage is based on a user consent model and the user doesn't understand what they are consenting to, how can the model be effective? Joel also raised the question of whether data on social networking sites is public or private. Despite what many users may think, the data is generally public and can be accessed by anybody (including law enforcement). Next, Joel talked about how privacy values are embedded in the architecture of a given technology. With the Facebook Beacon fiasco, we got to see "how the data mining sausage was made," and it bothered quite a few people. We got to see what was going on behind the scenes in a way that was quite graphic when compared to GMail's ad scanning. Joel said that data privacy rules have to focus on effective transparency and proposed that a set of data usage rules should travel along with data wherever it goes. Finally, he spoke of the irony that cloud computing actually opens the door for privacy enhancement: centralized data holders are easier to find, regulate, and prosecute. However, we will need more cooperation in the future between lawmakers and standards bodies if we are to have effective data privacy standards and rules.
Marc Rotenberg gave an introduction to privacy culture. He presented the concept of fair information practices, under which the entity that collects data on individuals takes on obligations for security, accuracy, and rights of access, among others. The custodian of the data has the responsibility to prevent "bad things" from happening to the data. Privacy people by and large believe that technology can be a solution to privacy problems, but the techniques need to be evaluated: having secure encryption keys will protect your data, but having a key escrow system will erode that protection in at least some (if not all) cases. Anonymity is critical to privacy; a person's actual identity should not be required to determine whether they have the credentials to use a given service. There is also a paradox in that much of privacy is about transparency: imposing obligations on custodians to be more open and accountable about the data they collect makes it easier to ensure that the data will only be used in known ways. The greater the secrecy about how data is being collected, the greater the possibility that it can be used in negative ways without people learning about it.
Since I was trying to actually pay attention to what was being said while taking notes, I feel that I may have given short shrift to all 3 presenters, and I encourage you to watch the video of the panel (once it becomes available - please check back for an update). That way you'll hear first-hand what they had to say. You'll also get to hear the lively debate that took place during the rebuttal and audience question section.
UPDATE: The video recordings from the workshop are now available at the Princeton UChannel.