The Economics of Privacy session at FIA Budapest produced a number of key messages and lessons learned, as summarized by Martin Waldburger (session co-organizer and rapporteur):
Privacy constitutes a major socio-economic concern in the Future Internet. The economics of privacy calls for an understanding of the relevant stakeholders and their incentives in markets for personal information. Stakeholders engage in tussles as they act according to their incentives. In that sense, the identified preconditions for personal information markets are key: property rights, infrastructure, and incentives all have to be in place. From both an economic and a legal point of view, consumer perception and the incentives of consumers and providers are of central importance. This includes appropriate consumer information and the exercise of essential consumer rights, but trade-offs between differing interests also have to be considered. While signaling benefits to consumers may increase consumer acceptance and loyalty on the one hand, the costs of legal compliance for companies have to be reduced, and risks need to be known and mitigated, on the other. To that end, approaches to better risk assessment that allow for reliable cost-benefit analysis are needed.
Trinh opened the session (see slide set) by highlighting that while the issue has been discussed at previous FIAs, the economic aspect of privacy has not been thoroughly investigated. Trinh cited the examples of Google's and Facebook's success, which is rooted in their innovative business models. According to Trinh, these business models are partly based on the innovative use of users' personal information. He challenged the audience by posing the question: will these business models be sustainable in the Future Internet? Finally, Trinh positioned personal information as an economic asset. This statement was examined from multiple points of view throughout the session:
In her keynote speech, Nicola Jentzsch – after reminding the audience that privacy is a human right and that thinking about it in economic terms does not change this basic fact – differentiated personal from private information. Jentzsch introduced the corresponding economic definition of privacy: "Privacy is a state of asymmetric distribution of personal information among market participants." She explained the set of three preconditions for markets for personal data to emerge: First, property rights to personal information have to be specified. Second, the infrastructure for the transfer of personal data has to be in place. Third, incentives have to prevail for firms to collect and trade personal information, and for consumers to disclose it. She then explained the economic cost-benefit calculus conducted by consumers when disclosing personal data (the so-called privacy calculus) and how salience, social comparison and other effects impact the consumer's decision.
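The privacy calculus described above can be illustrated with a minimal sketch: a consumer discloses personal data when the perceived benefits outweigh the perceived costs, and salience effects scale how strongly costs are felt. The function name, the numbers, and the multiplicative salience discount are illustrative assumptions, not from the talk.

```python
# Toy sketch of the "privacy calculus": disclose personal data if the
# perceived benefit exceeds the perceived cost. All figures and the
# salience model are hypothetical illustrations.

def privacy_calculus(benefit: float, cost: float, salience: float = 1.0) -> bool:
    """Return True if the consumer chooses to disclose.

    `salience` in [0, 1] scales the perceived cost: when privacy risks
    are less salient, costs are discounted and disclosure is more likely.
    """
    perceived_cost = cost * salience
    return benefit > perceived_cost

# A 10 EUR discount versus a 15 EUR privacy cost:
print(privacy_calculus(benefit=10.0, cost=15.0, salience=1.0))  # False: no disclosure
print(privacy_calculus(benefit=10.0, cost=15.0, salience=0.5))  # True: low salience tips the calculus
```

The same objective cost leads to opposite decisions depending on how salient the risk is to the consumer – one way to read the salience and social-comparison effects Jentzsch mentioned.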
Jentzsch spoke about competition strategies of firms in markets for personal data: firms will conduct targeting, behavior-based pricing and social sorting, for example, with differing impacts on consumer welfare. She explained how competition intensifies if consumers choose to remain anonymous and buy only standard products. She also stated that customer lock-in can increase when personalization raises switching costs. Finally, Jentzsch explained how research on consumer lock-in can be undertaken, referring to a research project conducted for ENISA.
Estelle De Marco shed light on the legal situation for privacy and, in doing so, on the legal framework with which personal information markets must be consistent. She showed the link between privacy and personal data, which are elements of private life and protected for this reason by multiple international and European legal sources (the European Convention on Human Rights, the EU Charter of Fundamental Rights, EU Directives, ...). She emphasized that the EU legal requirements apply in many cases even to non-EU entities, and highlighted the main legal requirements (mandatory information and, in many cases, consent of the consumer, for instance). Regarding consumer perception of privacy and its exploitation, she notably highlighted that consumers show a more positive attitude when legal requirements are respected.
The actual valuation of personal information in economic terms was the main topic for Aljosa Pasic and Kai Rannenberg. Pasic asked attendees whether they would share their political opinion for 20 €. The majority would. He went on to ask about religion – where somewhat fewer attendees, but still a majority of the audience, would accept the offer. Pasic, however, asked why a company would want to pay 20 € if that information is (often) available for free. In this respect, he commented on the legal situation, which he found to be general in nature (e.g., the "proportionality" principle) and inconsistent from state to state. This may render the evaluation of privacy compliance a lengthy and costly process, he added. For companies, he found two optimization dimensions in privacy compliance to be key: costs and risks (both to be minimized). Hence, he concluded that a solid cost-benefit analysis is needed, and for that purpose a structured approach to the analysis of complex systems.
One possible example of such an economic valuation approach – for privacy-enhancing Identity Management (IdM) – was introduced by Rannenberg. This valuation approach aims to overcome the shortcomings of common decision-making processes and to yield a set of decision-relevant economic consequences of adopting, mediating or providing privacy-enhancing IdM services. Accordingly, he proposed a process model that identifies stakeholders with their costs and benefits and assesses aggregated and clustered costs and benefits under different scenarios (e.g., with and without a trusted third party). Rannenberg concluded that minimization and decentralization of data are important, as is the empowerment of users.
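The aggregation step of such a process model can be sketched in a few lines: record each stakeholder's costs and benefits per scenario, then compare aggregated net benefits across scenarios. The stakeholder names, the two scenarios, and all figures below are hypothetical placeholders, not data from the session.

```python
# Minimal sketch of the aggregation step in a stakeholder cost-benefit
# process model: sum (benefit - cost) per scenario. All entries are
# hypothetical placeholders in arbitrary units.

from collections import defaultdict

# (scenario, stakeholder, benefit, cost)
entries = [
    ("with_ttp",    "user",     8.0, 2.0),
    ("with_ttp",    "provider", 5.0, 4.0),
    ("without_ttp", "user",     6.0, 5.0),
    ("without_ttp", "provider", 7.0, 3.0),
]

def aggregate(entries):
    """Aggregate net benefit (benefits minus costs) per scenario."""
    totals = defaultdict(float)
    for scenario, _stakeholder, benefit, cost in entries:
        totals[scenario] += benefit - cost
    return dict(totals)

print(aggregate(entries))  # {'with_ttp': 7.0, 'without_ttp': 5.0}
```

In this toy setup the trusted-third-party scenario yields the higher aggregate net benefit; a real assessment would, as Rannenberg noted, also cluster costs and benefits and examine their distribution across stakeholders, not only the totals.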
In the following round of project statements, Eric Meyer (SESERV Coordination Action) positioned privacy in a socio-economic context, as privacy to him is a question of technological capacities, goals, and attitudes. Meyer asked whether there could and should be a trusted third party that can release personal information if really needed. He suggested that such a trusted agent – which, he noted, is not a new concept per se – might address privacy issues raised in the session. He concluded that potential solution approaches may exist for some problems, but they are not yet being implemented. Meyer invited the audience to participate in the respective privacy-oriented discussions within SESERV, e.g., during the Oxford workshop in June 2011.
Roger Torrenti introduced Paradiso 2 as a project that explores the future of society. The project produces recommendations to the EC, one of which is to explore the limits of the Internet. Privacy (ranging from no privacy to full privacy) has been identified as one such limit.
Jim Clarke (BIC Coordination Action) presented recent activities from around the globe specifically on economics of privacy including those within the EU, US, Canada, Korea and Australia. If there is any interest in contacting or linking with the people involved in these activities, this could certainly be facilitated by BIC, according to Clarke.
A: Jentzsch explained that this was the case for recent start-up companies such as Allow, MyID.com and Bynamite. She explained that in some cases consumers would be paid for the sale of their data.
Q: Are there models for the valuation of privacy and, if yes, have they been reliably tested thus far?
A1: Jentzsch explained that there is research on the quantification of privacy; however, much remains to be done in that area. Consumers' valuation of their personal data can be assumed to be context-dependent, yet not completely arbitrary.
A2: Rannenberg stated that no one really knows the value of privacy. One way to circumvent this issue is to involve users in the decision-making process and to aggregate their valuations afterwards.
Q: Is privacy a possible asset in public services, too?
A: Rannenberg stated that the presented assessment model could be applied in the design of a public service as well.
Q: How to address a user’s question: “Can I see all the data you are collecting about me?”
A1: Rannenberg proposed a dashboard that may help users see the data being collected. An open question, however, is how to organize the data in the dashboard in a meaningful way.
A2: De Marco emphasized that a right of access is granted in the European Directive, as well as a right to be informed. If, despite these rules, it remains difficult in practice for users to know who is collecting their data, the ongoing revision of Directive 95/46/EC notably foresees increased transparency for data subjects and strengthened user rights. Regarding profiling, informing users is mandatory; under EU law as interpreted by the Article 29 Working Party, information given only in the general terms and conditions is not sufficient when a cookie is used.
Q: How could the cost of privacy compliance be determined?
A: Pasic pointed out that different levels of compliance have to be considered. For security, for instance, there is compliance with ISO standards (ISO certification).