
Privacy vs. Security in France - Part 6

This is the 6th and last part of the discussion “Sécurité ou Vie Privée ?” (ed.: Security or Privacy?) moderated by Mathieu Vidard (MV). Part 1, Part 2, Part 3, Part 4 and Part 5 may be found here, here, here, here and here.

—-

Guests :

Isabelle Falque-Pierrotin (IFP) – Vice-president of the French Data Protection Authority (CNIL)

Stéphanie Lacour (SL) – CNRS researcher

Meryem Marzouki (MM) – CNRS researcher

Jean-Luc Dugelay (JLD) – EURECOM researcher

Jean-Marc Manach (JMM) - internetactu.net

—-

Man – The question is about knowing where we are heading. Since technologies move very fast, we need to think upstream, yet in computer science nothing plays the role that the ethics committee (CCNE) plays in the life sciences. The CNIL has a regulatory role, which has nothing to do with what the CCNE does. The comparison with the life sciences is relevant because there are technologies we chose not to develop: by thinking upstream, we decided we did not want reproductive human cloning, and we stop research heading in that direction. Is there anything equivalent in computer science, a technology we would not want to develop? Without that forethought, industry will shape it and calls for proposals will follow. To me, facial recognition can be compared to the artificial uterus: it is something that will radically change security and individual freedom in our society. It does not yet work perfectly, but there is money to fund it. That shows there has not been the same upstream reflection one finds in the life sciences.

MV – IFP?

IFP - Your question is one of the many reasons why the CNIL is transforming itself into something else. Google is an American company. The 30 million French users of Google services are subject to whatever terms of service the company chooses to apply. Today Google complies with Californian law; some of its services, like Google Latitude (which lets you locate people within a given area), are governed by Californian law. We might think this is hopeless, that nothing can be done because most of the big Internet companies are not French. Yet these companies are open to discussion about personal data with actors like the CNIL, because the organization took several initiatives upstream in this field to lead them toward measures that comply with French national law. For instance, with regard to Street View (street-level photographs of the main European cities, mapped district by district), the CNIL was alerted early enough that faces could be anonymised and the service brought into conformity with the 1995 law. As a result, Google agreed to apply these measures across Europe. I would answer the question by saying that the CNIL has to work more and more prospectively with service providers so that data protection truly becomes a pervasive concern. It is much more difficult than applying legal texts stricto sensu to public filing systems, but I think this activity shapes the future of the organization.

JLD – I would like to come back to the global aspects of these things. Among the criteria defining research areas there are national and European industrial calls, but there are international competitions as well. As national labs, we want to be competitive with American or Japanese laboratories, for instance, or with China, which is surging ahead in many domains, including research. By conducting competitive research in France, we would like to avoid monopolistic situations such as those around biometrics or video surveillance.

MV – Another question?

Man - We talk about personal data. I was asked to confirm my presence by signing a paper when I arrived. We talk about shopping carts and knowing what I purchase. The ethical questions are critical. With regard to information technology, we do not really know how to set limits, how to hold ourselves back from pushing further in certain research areas. Taking technology, politics and ethics as the main parameters, the real question is perhaps not RFID (which will be deployed, and these chips will provide information about what I purchase). Why do we focus on Google webmail as if it were the real issue, when incoming and outgoing messages are not safe on private servers either? Everything related to search queries and the profiling that follows is perhaps more of a concern. Aren't we dodging that issue by discussing RFID? Those are the ethical questions to ask. Moreover, we are not at all in the same context as with biotechnologies: the means of taking action are totally different in information technology.

SL - There are several players in this game: the government, private companies, the citizens. I remain convinced that no solution will come from any one of these three alone. It might come from government: the Chinese government, for instance, forced Google not to return results for keywords like democracy. I am not sure we would want to fall into a system resembling Chinese democracy, if I may put it that way. The fact is, politics cannot remain totally passive before these choices; once a political decision is made, things move forward. Besides, technology itself makes things happen. This is true of the Internet and also of RFID. Developing technologies upstream, for example in Europe and following European privacy standards, is one possible solution. Since most of these technologies are developed in countries like the United States or Korea, where privacy has a totally different meaning, we face very difficult issues because privacy protection does not even exist within the technology itself. When research institutions push for collaborations between technology inventors and ethicists, jurists or sociologists, we get closer to the solution. The third part of the solution belongs to each of us. I do not think the system will be changed in the future by only one of these three paths.

JMM - Let me take another example. In computer science, for a long time, IBM was the monster absolutely cannibalizing everything. Then came Microsoft with a fabulous business plan; today it equips more than 90 or 95 percent of personal computers. Now Google holds between 80 and 95 percent of the advertising market. Each time the same thing happens: IBM was criticized and made huge losses; Microsoft was and still is heavily criticized, and it is now in the same losing process. Looking at what is happening with the browser market and Internet Explorer is enough to be convinced. Today in Europe, more than 30 to 40 percent of users are on Firefox, no longer on Internet Explorer. I cannot see how Internet Explorer will keep developing; it is losing ground. I do not know how things will turn out for Google in ten years, but it is not unthinkable that Google will fade away because another company offers new services that better address people's needs and are more respectful of users. Microsoft's monopoly on operating systems may also come to an end. The challenge to Microsoft operating systems coming automatically with every new computer purchase, what is called compulsory bundled sales, originated in the European proceedings against the company. It is well known that, for the same privacy concerns, running Linux is safer than running a Microsoft operating system, so the bill aimed at ending compulsory sales would also help make the Internet safer.

MM – I would like to return to the original question. I was amazed to hear mentions of ethics, politics and technology, but nothing about rights. Rights should not depend on politics, at least not exclusively; they exist as rules that apply to everybody. Data and privacy protection rights fundamentally rest on two principles: the purpose principle and the proportionality principle. The first is more and more flouted, as JMM explained with the national genetic fingerprint database and the extension of personal data collection. The proportionality principle is still too fuzzy. When we invoke flaws in proportionality, such as data retention periods, as arguments in legal challenges, we are arguing that the length of those periods must be proportionate to the aims at stake. The purpose principle is equally undermined: passports and visas used to target foreign nationals, or border-control measures, will end up serving police operations with different purposes simply because the databases exist. These principles should be refined so that certain populations can be protected. I have cited the case of foreigners, but let us talk about children. All of us were offended when we learned that 13-year-old children could be found in the Edwige database. What should offend us just as much, looking at the French passport regulation, is that biometric data are collected from 6-year-old children (which is not required at the European level). It shows clearly how collected data can be extended and put to other uses, to control other people, because young people frighten us (we see it in the proposed legislation on gangs). These are not ethical debates about new topics on which no consensus has yet formed in society. Let us return to fundamental rights; that would be progress.

MV - A last question?

Woman - I would like to know what the legal reasoning of the Internet service providers was. They are supposedly commercial actors, yet in practice they are accomplices of the all-out security trend, with HADOPI and the LOPSI II law. Weren't their positions during the hearings supposed to address these issues?

JMM - It depends on which part of the industry we are talking about; the reactions are not all the same. We know, for instance, that Free goes against the grain: they launched free wifi access (allowing their customers to connect to the Internet through other customers' wireless routers) while the HADOPI law was being debated. When the main carriers are asked to implement certain measures, they usually comply and pass the cost on to their customers. Some industry figures, like Jean Michel Planche, were among the first to bring to light the problems raised by the LOPSI law and Internet filtering policies. Some individuals dare to speak out and spread information. Most of the time, the information published in the press about the government's intentions comes from the industry, whether leaking details or openly defending its customers. We do not have the same culture in France as in the United States. In the USA, where there is no equivalent of our data processing and freedoms law, the fight for customer freedoms and privacy protection is driven by the industry: they know it is not in their interest to play a Big Brother role and go against their customers' freedoms, at the risk of losing them. In France, the industry does not maintain such a policy of protecting its customers. With HADOPI, we are heading toward the possibility of Internet filtering carried out by professionals.

MV - IFP?

IFP - We should not end this debate on a caricature of the industry. Having discussed with many of their representatives, I can say that many of them are worried about these questions. Not all industry actors want to put security devices everywhere. There are cases where the technology brings advantages, for instance warranty services that could be built into the goods we buy through RFID chips. Some industries are sensitive to these questions. I meet with them, and I think they are more and more sensitive to them as they realize that this worries their consumers and that it is also part of building a sustainable corporate reputation. It is not only theoretical. We all share the responsibility of finding a balance between security and freedom. The industry has its own role to play, and it can be a positive one.

 

MV - Let me ask the three of you for a few words to conclude. SL?

SL - Indeed, the shape of an answer may emerge from the collaboration of all these actors.

MV - As citizens, can we rely on our rights today when it comes to the circulation of data?

SL – The law tells us we can, but things become more complex when we look closely at how the law can actually be applied.

MV - There are obviously some vulnerabilities in the texts. MM?

MM - Again, let us return to the fundamentals and remember the Edwige case. Citizen mobilization can have a major impact in addressing other similar questions.

JMM - More and more people should stop being paranoid and really inform themselves about the reality of the threats we face. The more we seek to inform ourselves, the more we will have the right to resist.

MV - JLD?

JLD - Indeed, people should be informed so that they can form their own opinions. To make the right decisions, it is critical to master the technology and not to be subject to monopolistic situations. I think it is important that we have strong French and European research. Finally, I think there were interesting proposals, such as the creation of an ethics committee for new technologies, something that should be broadened to include different actors and users.

MV - Does such a thing already exist for nanotechnologies, SL?

SL - Something exists already with the CCNE.

MV - Thank you to all of you.
