Digital Values: Advancing Technology, Preserving Fundamental Rights

by Martin Schmalzried

On the 18th of January, Carnegie Europe, in partnership with Microsoft and in association with the Dutch Presidency, organized a conference on devising policies that help maximize the value of technology while preserving our core values. Several high-level keynote speakers took the floor, including Brad Smith, President and Chief Legal Officer of Microsoft, and Věra Jourová, European Commissioner for Justice, Consumers and Gender Equality.

The key topics discussed, from COFACE’s perspective, were: balancing privacy with security/safety; the latest developments and implications of the Internet of Things; transparency and user trust; and finally, the development of Big Data.

Balancing privacy with security/safety

Given the recent terrorist attacks in France and in many other parts of the world, the “mood” has shifted from a focus on privacy following the Snowden revelations to security and public safety. Privacy is often pitted against security and safety, in the sense that one cannot have both privacy and security/safety. To some extent, this is true: if States were allowed to limitlessly monitor and skim through all data, perhaps some terrorist attacks could be prevented. However, the assumption on which this trade-off is built is a gross over-simplification of the world we live in. Pitting privacy against security/safety fails to address more pressing issues, such as rethinking the foreign policies of our national governments and actions at international level which may directly or indirectly encourage terrorism. Should we keep selling weapons all over the world, apply very cynical “realpolitik” across the globe, fail to address the hardships that citizens in Iraq, Syria or Afghanistan are facing, fail to recognize our governments’ responsibilities in creating the mess, and at the same time implement mass surveillance as a means to prevent terrorism? There may be no dilemma or trade-off between privacy and security/safety after all. Privacy simply needs to go along with measures to curb inequalities, bring about more responsible, humanist foreign policies and eliminate discrimination, exclusion and the ghettoization of specific minorities.

The Internet of Things

While innovations in the field of the IoT (Internet of Things) seem very attractive to end users, especially in their potential to make everyday life easier (automated heating, managing your fridge’s contents…), they also give rise to a number of challenges:

– A lack of standardization in terms of communication protocols, security, privacy protection, etc. could break the very principle of openness that the Internet was built on, with each actor trying to impose its own “standard” for IoT.

– How can users access the data generated by IoT devices, and in what format should they receive it?
– How can users control the data generated/transmitted by IoT devices? Will there be a way to switch the connectivity off?

– How much of a device’s functionality should run through the Internet as opposed to running either “offline” or only on the local “house” network? For instance, should a talking pet require permanent connectivity to interact with a child, it may be very restrictive in terms of use. Also, if a toy monitors a child, should the “recordings” be uploaded to the Internet or rather stored on a local computer on the “house” network? The latter point raises the question of the intrusion of IoT into the privacy of minors. This is especially tricky from a legal point of view, as IoT devices may monitor and collect data not only about adults but also about children. This may give rise to a new form of privacy protection model, based on Privacy by Design or Privacy by Default (see the sketch after this list).

– IoT is extremely complex from a liability point of view and involves many layers of legal responsibility. Who is responsible in case something goes wrong? Often, multiple companies are involved in making IoT work. For instance, in the case of an automated heating service, a car company may be involved (to send a signal from the vehicle when the user gets close to home), along with online cloud services, the manufacturer of the IoT device itself and potentially even more actors (if there is third-party software on the IoT device…).

– The combination of Big Data and IoT may also usher in a “Premium” vs. “Freemium” era, where consumers are offered discounts based on their behaviour, at the expense of quality of service. Many examples illustrate how IoT and Big Data can be used for better or for worse. Hotels, for instance, can gather data on a consumer’s habits (does he/she usually heat the room, how many towels does he/she use, etc.) to save energy and water, but this data could also be used to propose “Ryanair”-type discounts to consumers at the expense of quality (if a consumer agrees not to use the air conditioning, or not to use any towels, he/she will get a discount…) or even discrimination (should a consumer take long showers or baths, he/she would systematically be quoted higher hotel prices).
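
Whether the “local by default” approach raised above is workable depends largely on how devices are configured. As a purely illustrative sketch (all names, paths and settings below are invented, not taken from any real product), a Privacy-by-Default design could keep a toy’s recordings on the house network and require an explicit parental opt-in before anything is uploaded:

```python
# Minimal sketch of "Privacy by Default" for a hypothetical connected toy.
# All class, path and function names here are invented for illustration;
# no real product or vendor API is implied.
from dataclasses import dataclass, field
from pathlib import Path


def upload_to_cloud(audio: bytes, filename: str) -> str:
    # Placeholder for a vendor cloud API; a real device would call the
    # manufacturer's service here.
    return f"uploaded {filename} to the vendor's cloud"


@dataclass
class ToySettings:
    # Cloud upload stays disabled unless a parent explicitly opts in.
    cloud_upload_enabled: bool = False
    local_storage_dir: Path = field(
        default_factory=lambda: Path("./house-network-storage/toy-recordings")
    )


def store_recording(audio: bytes, filename: str, settings: ToySettings) -> str:
    """Keep recordings on the local 'house' network unless uploads were opted into."""
    if settings.cloud_upload_enabled:
        # Only reached after an explicit, informed opt-in by the parent.
        return upload_to_cloud(audio, filename)
    target = settings.local_storage_dir / filename
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_bytes(audio)
    return f"stored locally at {target}"


# With the default settings, nothing leaves the house network.
print(store_recording(b"\x00\x01", "bedtime-story.wav", ToySettings()))
```

The design choice illustrated here is simply that the most protective setting is the starting point, and connectivity is something the user switches on rather than off.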

Transparency and user trust

User trust is absolutely essential if companies wish users to embrace the current revolutions in IoT, Big Data, Artificial Intelligence, etc. However, this poses a serious challenge: how can a company be transparent about highly technical and complex issues such as Big Data? To give a concrete example, credit scores, which are calculated automatically using algorithms and Big Data, are anything but transparent to consumers.
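
To make that opacity concrete, here is a deliberately simplified sketch of a data-driven score (the features, weights and base value below are invented for illustration, not drawn from any real scoring model): the consumer typically sees only the final number, while the variables and weights that produced it stay hidden.

```python
# Toy illustration of an opaque, data-driven score.
# The features, weights and base value are invented; real credit-scoring models
# combine far more variables, often with machine learning, and disclose neither
# the inputs nor the weights to the consumer.
WEIGHTS = {
    "payment_delays_last_year": -35.0,
    "credit_utilisation_ratio": -120.0,
    "years_of_credit_history": 4.5,
    "recent_credit_applications": -15.0,
}
BASE_SCORE = 650.0


def toy_credit_score(consumer: dict) -> float:
    """Weighted sum of behavioural variables; the consumer only ever sees the result."""
    return BASE_SCORE + sum(
        weight * consumer.get(feature, 0.0) for feature, weight in WEIGHTS.items()
    )


print(toy_credit_score({
    "payment_delays_last_year": 1,
    "credit_utilisation_ratio": 0.6,
    "years_of_credit_history": 7,
    "recent_credit_applications": 2,
}))  # prints a single number (544.5 here) with no explanation of how it was produced
```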

In the end, while transparency is a necessity, ensuring that a quality regulatory framework is in place will also help in securing users’ trust. As Brad Smith pointed out, regulation should be clear and predictable, people’s rights need to be respected, and those rights need to have remedies.

Big Data

Big Data is a relatively new phenomenon, and users are only just getting familiar with its implications and the possibilities such a technology offers. There are many questions that users are struggling with:

– How much is my data worth? Am I getting my “data’s worth” when I’m using a service which relies on the use of personal data in its business model?
– Is there a trade-off between sharing my personal data and the service I am receiving? Can this data be used against me?
– How can I have more control over the data I am sharing, and over how the data I share influences my online experience?
All of these questions will need to be addressed if users are to entrust companies with their data.

To finish, COFACE fully agrees with the lead statement of the conference: advancing technology, preserving fundamental rights. This balance will certainly need to be struck in the years to come, and COFACE will reflect on all the latest developments to ensure that users get the most out of technology.
