The blurry line between data protection and coercion

We had the pleasure of interviewing Dr. Gilad L. Rosner, privacy and information policy researcher and founder of the Internet of Things Privacy Forum, which focuses on responsible innovation in the domain of connected devices. An expert on IoT, identity management, US & EU privacy, data protection regimes, and online trust, Dr. Rosner will also be a keynote speaker at SaSeIoT 2016, the 3rd EAI International Conference on Safety and Security in Internet of Things, which will take place in Paris on October 26-28. The domain of data protection, privacy, surveillance and information law is a gray one, so let’s dive in.
How would you comment on the recent clashes between governments and tech firms regarding privacy and security?
The internet age has been a windfall for law enforcement and intelligence gathering agencies. The routine collection of personal information through social media, search, mobile phones, and web usage has created enormous databases of people’s movements, activities and communications. All of this comes from commercial endeavor – that is, the internet and social media are propelled by private companies in search of profit. They create the products and they store the data, and so those private data stores represent an irresistible target for data-hungry government entities like the NSA and others.
The ‘government’ is never one thing. Governments comprise agencies, interests, and people – overlapping powers and agendas. In the case of law enforcement, different groups have different remits and varying degrees of power. Foreign intelligence gathering is not the same as domestic law enforcement, and the rules that enable and constrain different agencies vary widely. There is a blurry line between lawful and unlawful access to private stores of personal data. The Snowden disclosures gave the world some perspective about just how blurry that line is, and how the governance of intelligence gathering may be porous or insufficient.

“Lawful and unlawful access – and who gets to say which is which? – are two sides of the same coin: the state’s desire for information about people.”

Sociologists have noted that states ‘penetrate’ their populations; that people need to be made ‘legible’ so that the state can act upon them. A strong argument can be made that intelligence gathering – for foreign or domestic purposes – is a core characteristic of the modern state. As such, lawful and unlawful access (and who gets to say which is which?) are two sides of the same coin: the state’s desire for information about people. Part of the way liberal democracies are judged is through consideration of their legitimacy. When government actors are accused of hacking into private data stores or otherwise circumventing established legal methods of obtaining access, such as search warrants and subpoenas, that legitimacy is called into question. Still, the line is blurry, and because of the secretive nature of intelligence gathering, it’s difficult to get a complete picture of when agencies are acting within their rights, when company practice facilitates or hinders the transfer of personal data to government actors, and when everyone is acting within ‘normal’ operating procedures.
What role does the EU play in protecting our digital privacy and enforcing our security online?
The EU has several roles to play. The first is the creation of ‘command and control’ data protection frameworks. These are coercive laws that permit and sanction activities relating to the collection and use of personal data – an example is the forthcoming General Data Protection Regulation. Also, the political control of the EU is highly influential on the relative power of data protection authorities. DPAs are essential ‘detection’ and enforcement bodies in the domain of privacy. The second is what is sometimes called ‘norm entrepreneurship’: deliberately changing social norms around privacy. We can see this in the continual and evolving support of the idea that privacy is a ‘fundamental right.’ Regarding security, the EU can not only create coercive laws, such as data breach notification requirements, but also encourage standards development, convene stakeholders to advance issues, fund research, and promote security through bodies like ENISA. The EU is many things. In addition to refining a harmonized internal commercial market, it seeks to make population movements easier (Schengen), create a trading bloc, and foster a sense of ‘Europeanness.’ Part of this European identity is strong support of human rights, a component of which (in the European conception) is the right to privacy. By supporting human rights, Europe also ends up supporting privacy rights.
What, in particular, is the security and privacy sector currently striving for?
Privacy professionals and academics are a heterogeneous group, so it’s difficult to say what the sector is striving for as a whole. Many voices within the community are in full-throated support of maintaining and pushing forward privacy values in new technology developments, but there is great debate about the ways to accomplish this. Europe tends to house a great number of privacy people who normatively support the view that privacy is a fundamental right, whereas in the United States there are those who are softer on this idea, saying instead that market mechanisms (rather than rights) are a more appropriate way to let privacy protections manifest.
For those who vigorously support the expansion of privacy rights and mechanisms through non-market methods, there is a general sense of wanting to help shape the market toward more respectful, user-centered uses of data. Coercive methods like sanctions, fines, lawsuits and punishments only go so far, and so there is active and far-reaching discussion about different ways of supporting consumers, enhancing their knowledge and ability to intervene, supporting pseudonymous use of technology, and attempting to incorporate the contextual nature of privacy into governance. Part of the privacy community is quite concerned with Consent; or, more accurately, with the perceived failure of Consent. Some privacy researchers are seeking ways to make Consent more meaningful, whereas others have given up on it. Connected to this is a discussion about whether and how to regulate data based on its use – in other words, regulating data differently when it is used for medical versus lending versus employment versus educational purposes. This already occurs to some degree, but part of the privacy community is actively seeking to broaden this approach.
How will user protection change once a developed IoT infrastructure is in place, compared to how it looks right now?
The IoT is not a homogeneous idea – it’s a set of trends: non-computer devices getting more sensors and network communications, low-power computing, screenless interfacing, increasing numbers of stakeholders in the collection of device data, increasing device autonomy, lower-cost manufacturing, miniaturization, the use of smartphones as a platform, and so on. User protection will, as it does now, take many forms: law, encryption, security architectures, contractual terms, market disincentives, and consumer education. Each of those areas will (hopefully) evolve to address the increase in data collection that the IoT portends. One challenge, for example, is privacy policies and other forms of user notification. When screens get smaller or disappear, how do you notify users about what data a device is collecting? We know that many, many people do not read privacy policies, so the IoT will likely amplify this problem. There is some progress in the evolution of privacy policies, but it is slow, due in part to how many policies people are told to read. My view is that this calls for more institutional controls rather than trying to make people read more.
What do you expect from the SaSeIoT 2016 conference? How do conferences like this one influence your work?
I expect to hear about interesting developments in security and identity management with regard to the IoT, and about new designs and uses of technology. Some of these will implicate personal data collection and some will not. I’m interested in both, but the greatest impact on my work comes from learning of new ways that people’s data is being collected and used. Hearing about the emergence or evolution of standards, and about how public bodies are interacting with the IoT domain, also influences my research activities.