Security vs. Privacy in France - Part 5

This is part 5 of the discussion “Sécurité ou Vie Privée ?” (ed.: Security or Privacy?) moderated by Mathieu Vidard (MV). Part 1, Part 2, Part 3 and Part 4 may be found here, here, here and here.


Guests :
Isabelle Falque-Pierrotin (IFP) – Vice-president of the French Data Protection Authority (CNIL)
Stéphanie Lacour (SL) – CNRS researcher
Meryem Marzouki (MM) – CNRS researcher
Jean-Luc Dugelay (JLD) – EURECOM researcher
Jean-Marc Manach (JMM) -


MV – SL, would you like to take the lead?

SL – Technologies move forward very quickly. I agree with IFP when she evokes this rapid movement, but we as consumers also have to make concrete decisions about these technologies. Such considerations were fundamental in the enactment of the 1978 law. When we are asked for personal data, we can still freely decide whether to provide it in order to gain some advantage. Today it is still possible to buy plain tickets from the RATP, and an anonymous Navigo pass exists, though it costs a little more (5 €). I am not claiming this is a universal solution. In the case of RFID, however, I think a threshold has been crossed relative to what existed before (credit cards, mobile phones and other individual tracking technologies): the chip inside a Navigo pass is well known, and people may at least be aware of carrying such a device, unlike the tags that will soon be massively deployed on the mass market and that will communicate data without asking for authorization. This raises a genuinely interesting legal issue, because the 2004 revision of the 1978 law made people’s consent the central point of balance: I accept providing part of my personal data in exchange for advantages, and I make that decision myself. With technologies that people carry without being aware of their presence, or of whether they transmit our data, consenting becomes far more difficult.

MV - What does your survey reveal about how the younger generation perceives this concept of privacy? I imagine attitudes oscillate between generations, with these technologies more easily accepted by some age groups. Would some of you like to comment on that?

JMM - We’re referring here to the privacy paradox. On one hand, people are using these technologies: this is especially visible on the web with social networks, Facebook and the collaborative Web, where people don’t hesitate to disclose a lot of information. I wonder if you have heard the story of Google and Marc L. It made headlines at the beginning of the year, when a portrait of this man was published in a French magazine named “Le Tigre”. His habits were described exclusively from what could be found on the Web. They didn’t quite strip him naked, but it was close: so many photos, videos of his travels and stories had been posted across the Web that the magazine managed to retrace his life. We heard about it because it exhibited people’s propensity to disclose personal data. On the other hand, when the man discovered the published article, he was very, very scared. Before it appeared, he had not realized how easily these data could be aggregated - this is the privacy paradox. Marc L. expressed himself carelessly until his words were compiled into something like an official report, and then he got very scared. Being particularly afraid of Big Brother doesn’t prevent anyone from using these technologies (even in compulsory situations, as was mentioned earlier). Today obtaining a passport comes with biometrics, an anonymous Navigo pass is more expensive, and so on. We are more and more pushed into behaving alike. Where could this lead? We don’t exactly know. I recently wrote an article entitled “Privacy, a schmuck problem”, about a comparison an American team drew between the sexual revolution and what is going on with privacy. Forty years ago, women could not wear miniskirts without being accused of inciting rape, and declaring one’s homosexuality could get one’s face smashed in. Mentalities are no longer the same; laws were passed.
Thanks to activism, wearing a miniskirt no longer invites such accusations, and telling of one’s homosexuality does not necessarily provoke someone’s rage.

MV – Have we reached a stage of digital disinhibition?

JMM - What has to be clarified is how much control we will keep for ourselves over the network. Will all this data be made available to merchants, to the police, to public services and the administration? And won’t that slide into some self-censored, Big Brother-like totalitarian system? This is a real political issue.

MV – Do you agree? We’ll listen to your opinions and then we’ll take another question.

SL – I teach courses and give talks on these topics: privacy protection on the Web, traceability of individuals. When I start a new course, the students generally look at me as if I were a schmuck. They stare at me, wondering what I could possibly tell them - that the Internet is dangerous, especially with all the pedophiles around? By the time I close a course, students come to me and say they will delete their Facebook profile. In general, I explain to them that it is already too late. Still, this effort to inform is very fruitful. People reacted strongly against the Edvige file. Young people are currently no more willing than older people to let recruiters find photos of them drunk while they apply for a job; they don’t accept it any better than the previous generation did. This is the effort on which the CNIL is concentrating: for years it has maintained a worthwhile policy concerning the traces we leave, and the CNIL is not the only relay. In my opinion, this information has to be broadcast loudly.

MV - MM then JLD.

MM – I would like to return to the comparison with the living. The CCNE issued two outstanding opinions, the first on biometrics and RFID and the second on the use of DNA; the DNA opinion concerned paternity tests when they were introduced in the 2007 law. As for the long-term problems facing research projects, it has to be underlined that these projects come with clauses. It started with gender parity between partners working on European projects: there had to be as many women as men working on them. Beyond these immediate palliative measures comes the question of consent, and knowing whether consent is free goes well beyond communicating about it. Consent is not free, neither for administrative nor for police filing, nor is it free for private filing. I don’t really agree with the parallel between the sexual revolution and privacy on the Web, since sociologists have shown that Facebook users exercise considerable mastery over their data. They are not young children but young, active people (around thirty) who deliberately choose to exhibit themselves. Think back to the first reality TV show (Loft Story), which we equally heard of in the news: there was plenty of exhibition. The social consent behind it is the commodification of bodies and intimacy. With the living, there are debates about surrogacy; we grasp this commodification, we can fully understand it, and we can argue about selling one’s womb. I won’t answer now, but I put the following questions to my students. Who doesn’t use a free mailbox (Gmail, Yahoo!, Hotmail, whatever)? Who doesn’t use free services? Who has ever refused to give away some personal data to take advantage of a service? One immediately receives the benefit of the service while providing few details about oneself; the awareness of the data collection, of the possible interconnections and of the resulting profile comes later. Different reaction times are at stake.
Back in 1996, I already proposed a solution that was dismissed as mild insanity. The fundamental question is: can we make people happy without their consent? If we can, then we should consider a “holification” of personal data and intimacy - treating them as inviolable. We are not talking about vital data but about property and geolocation data, with bodies taken as identities through their biometrics and DNA. This comes very close to the debates about the living, and bodily intimacy might be an argument for forbidding the collection and processing of some personal data even with consent.

MV - Couldn’t we equally compare GMOs with nanotechnologies? JLD, IFP and then the public.

JLD – Just a quick point: specific clauses have appeared in multimedia so that disabled people can access the services, and we have to abide by certain terms. There are also discussions in European projects, with ethics committees delivering recommendations that are sometimes a bit surprising. We have a program that extracts the largest amount of information possible from a face - age, eye color, etc. We were asked not to distinguish men from women, as a committee considered that unacceptable. As a result, we dropped that feature.


IFP – I believe this concept of personal data has completely evolved. It used to be absolute (protected by the 1978 law) and it is now subject to negotiation. There is a big difference today in the way we speak of the capital of personal data surrounding each of us; this approach is close to a proprietary one. Some even say they own such data and that, consent or not, they should be able to do whatever they want with it. We recognize here the debate over the human body: if there is consent, then we should be able to rent out a womb. Yet one cannot even cut off one’s own arm… We know that the French laws on state-supervised reproduction, and the principle that the body belongs to nobody, were constructed against this view in the name of higher principles. I wonder whether personal data are about to be added to our patrimony, even though they belong, of course, to our intimacy. Shouldn’t we be reacting here? Worries are welling up in the polls: the first fear concerns personal data. People confusedly sense something, yet at the same time it doesn’t prevent them from consuming services; some of us will wake up sooner or later. Shouldn’t there be a stronger corpus of renovated and unalterable privacy principles, while letting individuals make separate choices rather than a single choice for everything?

MV – Sir?

Man - Hello, I’m a member of the organization “Democracy and Freedom”, which objects to the plan to install surveillance cameras in Paris. I would like to return to the technological argument. Video surveillance is clearly fueling a race for progress, as we have just heard from Mr. JLD: it is all about a succession of new devices making up for the shortcomings of the previous ones.

JLD – You see it as if applications were driving research, but that is not always true. We want to progress in image analysis; that is our first goal, and the analysis can then serve different purposes (medical imagery, video surveillance, etc.). We naturally talk about biometrics and video surveillance today, but the first goal of most researchers is not to improve specific applications. We want to improve audio, video and signal processing. I believe there is a small misunderstanding at this level.

Man – The problem is that the technological arguments are supplied by the chief constable, for instance, who emphasizes the obsolescence of the old systems and the contributions of the new devices. Following the news about surveillance cameras, I found a funny picture of a policeman from the 60s with his stick and cape. We then moved to analog cameras. As they were inefficient, we moved to digital cameras, which are magnificent.

JLD – Video surveillance didn’t trigger the digital revolution. We moved from analog devices to digital devices for many reasons.

Man – Well, what I mean is that your field amounts to a headlong rush. Speaking of digital cameras, you said that while someone is moving it is difficult to see his face, so facial-recognition technology has to be developed to compensate, and so on. The next argument could be skin color and ethnic statistics drawn from videos. We clearly discern an unstoppable dynamic here. Technology is seen as the solution, rather than humans?

JLD – This discussion is very interesting, but we can get pretty far that way. Every technological advance raises new issues, positive as well as negative ones; I agree with you. What should we do then? Should we end research? It is a problem for society.


SL – I understand the headlong-rush argument. However, I don’t think technologies are the main drivers of political decisions. For video surveillance, we know that the cameras on the market are not satisfactory, for instance at identifying individuals. Technology does not tell the chief constable that what exists today will actually find Mr. X on street Y because he attacked Mr. Z - and yet that does not prevent the political decision. It is exactly the same with RFID: citizens are told that they are being looked after, and that their privacy is too, because the chips can be disabled, while there is a big advantage for them as consumers at checkout time. The big benefit for supermarkets lies in making logistical decisions and profiling consumers in real time; the benefits are somewhat unbalanced, but let us assume people really can check out more quickly. Public authorities aware of the privacy issues could impose requirements that the chips be disabled after checkout. As things stand, the government has already authorized RFID deployment on the market, yet the technology cannot guarantee that these tags are disabled after checkout. The only way to truly disable a tag after checkout is to physically destroy it.
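For context, the EPC Gen2 RFID standard does define a password-protected “kill” command that is supposed to silence a tag permanently. The toy model below (illustrative Python, not a real RFID stack; all names are invented) sketches why such a software kill gives weaker guarantees than physical destruction: the kill depends on the reader sending the right password and the command executing correctly, and even a killed tag still physically contains its data.

```python
class Gen2Tag:
    """Toy model of an EPC Gen2 RFID tag (illustrative only)."""

    def __init__(self, epc, kill_password):
        self.epc = epc                    # identifier stored on the chip
        self.kill_password = kill_password
        self.killed = False               # software flag set by the kill command

    def respond(self):
        # A killed tag stops answering readers, but the EPC stays on the chip.
        return None if self.killed else self.epc

    def kill(self, password):
        # The kill succeeds only with the correct password.
        if password == self.kill_password:
            self.killed = True
        return self.killed


tag = Gen2Tag(epc="3034-F87-0001", kill_password=0xDEADBEEF)
assert tag.respond() == "3034-F87-0001"   # tag answers before checkout
tag.kill(0x00000000)                      # wrong password: tag keeps answering
assert tag.respond() is not None
tag.kill(0xDEADBEEF)                      # correct password: tag goes silent
assert tag.respond() is None
assert tag.epc is not None                # yet the data itself was never erased
```

The last assertion is the speaker’s point in miniature: “disabled” here means the tag stops talking, not that its contents are gone, which is why only breaking the chip removes the doubt.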


JMM – Take the example of a company called Visiowave, created by Swiss students. They wanted to broadcast TV over the Internet and developed image-compression algorithms. In 2001 the dot-com bubble burst and they wondered how to earn some money, so they started thinking about smart closed-circuit television. Visiowave was bought by TF1 - it is quite funny that TF1 might have become the world’s number one in smart surveillance - and has since been sold to General Electric. I mention it because Visiowave is the system sold to the RATP to equip its buses. What was TV became video surveillance, and it might return to TV through an inverted channel (news broadcast in buses): the same system handles video surveillance and advertising. There is indeed a headlong rush, with these technologies creating usages according to market needs. Researchers try various experiments and then wonder what to do with the results. Meeting the CNIL people, I notice they face similar issues: technology progresses fast, and political choices about regulation are driven by security or by emotions (to get re-elected) rather than by a concern for effectiveness. Aren’t we going too far? Isn’t it too dangerous? Is it already too late by the time politicians take up these questions?

MV – Sir, good evening, a new question?

Man – My question concerns the company Google. Today, thanks to their free accounts, we can very easily take advantage of email and standard searches. A panel of services, such as the calendar application, lets them know where we are and when. Advertising is contextual. YouTube lets them know what we watch. Google Books provides them with information about what we read…

MV – What is your question?

Man – Most importantly, through the analytics part, even sites visited directly, without using Google’s search engine, feed logs that can be tied to a Google account if a session is open. What do you think of the behaviour of this company, which still claims “don’t be evil” as its motto?
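The questioner’s mechanism can be illustrated with a toy model (plain Python; the class, sites and cookie value are all hypothetical). Any site embedding the same third-party analytics script reports each visit, along with a browser-wide identifier, to a single collector, which can then join browsing histories across otherwise unrelated sites:

```python
from collections import defaultdict


class AnalyticsCollector:
    """Toy third-party collector: one log keyed by a cross-site identifier."""

    def __init__(self):
        self.visits = defaultdict(list)

    def track(self, user_cookie, site, page):
        # Every embedding site reports here with the same cookie value,
        # so visits to unrelated sites accumulate under one profile.
        self.visits[user_cookie].append((site, page))


collector = AnalyticsCollector()
cookie = "uid-42"  # identifier set once in the browser, sent with every hit

# Three unrelated sites, one shared collector:
collector.track(cookie, "news.example", "/politics")
collector.track(cookie, "shop.example", "/cart")
collector.track(cookie, "health.example", "/symptoms")

profile = collector.visits[cookie]
assert len(profile) == 3
assert {site for site, _ in profile} == {
    "news.example", "shop.example", "health.example"}
```

The point of the sketch is that none of the three sites needs to know about the others; the linkage happens entirely on the collector’s side, through the shared identifier.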

