Following our Digital Footprints
9.15am Welcome and Introductions
9.30am Contextualising the Discussion: Research and policy frameworks
This forum addresses issues arising at the intersection of research on Internet technology, regulation and social science. To provide a backdrop for the day's discussions we will open with a brief recap of, and comments on, the 15 February Panel, providing a snapshot of the Horizon research programme. We will also summarise the current legal and policy framework within which the forum's discussion will take place.
- Change the question from "share everything possible" to "what is the minimal amount we can share to accomplish what we are trying to do?" (for example, letting the credit card company know that you are indeed in Tokyo at the time your card is being used to purchase a meal in Tokyo, rather than dumping your entire travel history from your mobile phone GPS to the credit card company)
- The 1995 EC data protection principles still apply (see the Ian Brown report), particularly data minimization, which is one of the 8 principles at the beginning of the Data Protection Directive
- Harmonization is important to business, because the cost of complying with a range of rules implemented by a range of states is high. Even more restrictive rules, if harmonized, would be preferable to a hodge-podge of rules on things like data export.
o Debate in the EC over whether data protection should be a regulation rather than a directive, since a directive allows member states to interpret some of it quite differently
o Uncertainty is a key issue, although making the rules very precise can also create a lot of new problems (Bill D.)
- EC consultations: the first round included the idea of a right to be forgotten, and a right to have your data protection claims, wherever in the world they originate, heard before a European court
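The data-minimization point above can be sketched in code. This is an illustrative sketch only, not any real card issuer's API: the issuer's actual question ("is the cardholder near the transaction?") is answered with a single bit computed on the user's own device, so the GPS history never leaves it. All function and variable names here are hypothetical.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def confirm_presence(phone_location, transaction_location, radius_km=50):
    """Answer the card issuer's question with one bit: is the cardholder
    within radius_km of the transaction? The computation runs locally,
    so only the boolean leaves the device, not the location history."""
    return haversine_km(*phone_location, *transaction_location) <= radius_km

# A Tokyo restaurant purchase, with the phone currently in Tokyo:
tokyo_phone = (35.6895, 139.6917)
tokyo_restaurant = (35.6812, 139.7671)
confirm_presence(tokyo_phone, tokyo_restaurant)  # True: one bit shared, not a GPS trail
```

The design choice mirrors the bullet above: the question is inverted from "what can we export?" to "what is the minimal answer the other party needs?"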
10am Technological Advances Shaping Privacy
This session will establish a shared sense of possible scenarios on advances in digital technologies that have major implications for the collection, use and storage of personal data, including a discussion of cloud computing. We will also look at technological advances in privacy enhancing technologies and related technical approaches to protecting the privacy of personal data.
- Banks originally thought (in the 1960s) that nobody would use a credit card, because people wouldn't want their purchases to be tracked. Surveys since the 1960s show that people do believe that using a credit card undermines their privacy. But people use them anyway, so you can't extrapolate directly from people's opinions to their behaviours
- What are our current expectations about the changing technologies, and how will technical changes affect access to and use of personal information?
o (Comment from Yorick that you used to have to sign 5 pound notes to use them, and they were big things with your signature on the back, at least in the working class area where he grew up)
- What about treating your data like an asset, with a monetary value, so that I can make an informed decision about whether the trade is worth it (i.e., if I get 50 pence a month, is it worth exposing my data to the people who want it?)? (Anthony House, Google)
- Scale is an issue, as vast numbers of people are involved in implementation and design, and many are doing it poorly. When a couple of people sitting in their den start a business, they may care more about cost or efficiency than about privacy protections.
- The current template we are giving businesses is "grab all the data". How can we deliver tools that provide privacy-protecting templates instead? (Derek McAuley)
- It is easy to get people to consent to giving up their data, but they don't really understand what this means for privacy (and privacy experts often don't fully understand it either)
o The idea of moving the computation to the data (that is controlled by you) rather than dumping all your data into the computation engine is a potential area for development
o So, you become your own data controller. But how is this data secured?
- Data is a function of context, and at the design stage you don't know what the context of use will be
- Design of security and privacy can’t be a one-off activity during the engineering phase, but needs constant attention as technology is implemented (Rae Harbird, UCL)
- Accuracy of data: there is an illusion of accuracy, but the inferences made from the data may be quite inaccurate
- Consent is not necessarily the answer, especially if people are only given an all-or-nothing option to engage or leave. Ian argued that people should have a range of options, with varying privacy/utility tradeoffs, rather than just being asked to say yes or no.
o This came from a comment from Vicki that much of modern life collects data about you, with no real option of opting out while still living in the modern world (she mentioned a recent novel about someone trying this)
- Bill: We don't want to be viewed merely as alarmists, whose rhetoric can be safely ignored
o We need to stop thinking about the easy cases and start thinking about the hard cases. An example is care for older people. Do we adopt the medical model of "we know best what to do for you", or, in assisted living, how far do we use technology to assist? (Bob Anderson, Nottingham)
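The "move the computation to the data" idea raised above can be illustrated with a minimal sketch, in which the raw records stay with their owner (who acts as their own data controller) and outside parties may only run vetted queries that return narrow, aggregate answers. All class, query, and field names here are hypothetical assumptions, not any real system's interface.

```python
class PersonalDataStore:
    """Sketch of moving computation to the data: the raw records never
    leave this object; callers submit a named, pre-vetted query and
    receive only its aggregate result."""

    # Queries the owner has agreed to answer; each maps a name to a
    # function over the raw records.
    ALLOWED_QUERIES = {
        "monthly_grocery_total": lambda recs: sum(
            r["amount"] for r in recs if r["category"] == "groceries"
        ),
    }

    def __init__(self, records):
        self._records = records  # kept private, never exported wholesale

    def run_query(self, name):
        if name not in self.ALLOWED_QUERIES:
            raise PermissionError(f"query {name!r} is not permitted")
        return self.ALLOWED_QUERIES[name](self._records)

store = PersonalDataStore([
    {"category": "groceries", "amount": 42.50},
    {"category": "travel", "amount": 180.00},
    {"category": "groceries", "amount": 17.25},
])
store.run_query("monthly_grocery_total")  # 59.75
```

Note the open question from the discussion remains in the sketch: making the user their own data controller solves the export problem only if the store itself (here, `_records`) is adequately secured.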
11.15am Tea and Coffee
11.45am The Economics of Personal Data
What are the economic incentives behind each scenario, and how do they shape likely outcomes? What types of markets are developing?
- Bill: shifting ecology of actors, and new actors that need to be taken into account in economic models
- Systems and policies too often presume that there is only one kind of rationality, but there are actually multiple kinds of rationality and we need to take that on board when we think about policy, technology and design. There is not just one way to think about things. (Bob Anderson)
o People are NOT just rational economic actors; they are social actors. They don't make decisions based solely on what will maximize profit. We have learned this over the last hundred years of economics. (Bob Anderson)
o The Simpsons model (bright and malicious Bart, dumb and non-malicious Homer, smart and non-malicious Lisa, etc.) describes the users you are building for
- The money to be made may not be in the transactions themselves, but in the things around the transaction (along the lines of the Gold Rush, where the people who made money were the ones selling spades and picks)
2.00pm Ethical, Legal, Policy, and Social Issues in National and Global Contexts
Privacy and data protection legal frameworks have been designed in part to enable the collection, use, and storage of (sensitive) personal data. However, ethical, legal and other policy issues have generated great uncertainty over what is appropriate now and in the future. Do future technological scenarios raise genuinely new challenges for privacy and data protection, or primarily raise the economic and personal stakes in settling old issues?
3.00pm Implications for Research, Policy and Practice
This session will consider the implications of technological developments for future research, policy, and practice. What are the priorities for research, including any new research questions that should come to the forefront of our attention? Are structural arrangements for grappling with privacy and data protection adequate? If not, what kinds of work will be required to rethink the governance of personal data?
4.00pm Points of Summary and Conclusion
What key points or observations emerged from the panel and today’s forum? What kind of summary would be useful to produce, and what follow up would be valued?