Security vs. Privacy in France – Part 4

This is part 4 of the discussion “Sécurité ou Vie Privée ?” (ed.: “Security or Privacy?”) moderated by Mathieu Vidard (MV). Part 1, Part 2 and Part 3 may be found here, here and here.

Man – My question is closely related to your last topic, and perhaps a little provocative: there is politics, there is technology and there are ethics. Ethics committees handle ethics, decision makers handle politics, and technicians determine what can be implemented. From what you have said since the debate started, we are quite far from answering all the ethical issues raised by new technologies. We can focus on epiphenomena, but overall, all of us are controlled in different ways. Mobile phone usage makes it possible to locate anybody at any time. I don’t see how this could be prevented. Besides, even when abuses can be prevented, nothing is done. For instance, I don’t know what the current status of social filing is, but I don’t see it as overused…
JMM – Ten interconnections between social databases were made last year.
Man – Yes, exactly and each database has to be a social database since it contains data about individuals. I don’t know if any policeman at any police station can get access to any piece of information about anybody.

MV – Regarding ethics, JLD, you are a researcher designing new technologies. Can you explain to us in a few words what you do with faces in biometrics?
JLD – We try to integrate dynamic parameters with faces. Today, facial recognition works well if the camerawork is kept relatively simple (frontal view, good lighting conditions, etc.), which is rarely the case when people are walking along a corridor… To improve the “scoring”, we add dynamic facial parameters, the way one smiles, the way one talks… combined with gait.
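The combination JLD describes is commonly done by score-level fusion: each modality (static face match, expression dynamics, gait) produces its own match score, and the scores are merged into one. A minimal sketch, assuming a simple weighted average; the modality names and weights below are illustrative assumptions, not taken from any system mentioned in the debate:

```python
# Hedged sketch: weighted score-level fusion of biometric matchers.
# Modality names and weights are hypothetical, for illustration only.

def fuse_scores(scores, weights):
    """Combine per-modality match scores (each in [0, 1]) into a
    single fused score using a weighted average."""
    total = sum(weights.values())
    return sum(scores[m] * weights[m] for m in scores) / total

# Example: a mediocre static frontal-face match, lifted by two
# dynamic cues (the "corridor" scenario from the discussion).
scores = {"face": 0.62, "expression_dynamics": 0.80, "gait": 0.75}
weights = {"face": 0.5, "expression_dynamics": 0.3, "gait": 0.2}

fused = fuse_scores(scores, weights)  # ≈ 0.70
```

Weighted-sum fusion is only one option; real systems may instead train a classifier over the score vector, but the principle — dynamic cues compensating for poor static imaging — is the same.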
MV – So, you design these instruments?
JLD – We try to remove these technical bottlenecks, then…
MV – You are quibbling… You design these machines serving security and surveillance.
JLD – I don’t have such a vision. I am maybe a little bit naive as I am a scientist.
MV – I’m not blaming you… It is a fact.
JLD – I don’t have such a negative picture in mind.
MV – It’s not negative…
JLD – I find all these questions very interesting, as we need to raise the possible negative aspects, and I am entirely fine with that. It is great that there are people doing it. However, I also see positive aspects. Regarding the RATP, I think they really do want to improve security. Customers are asking for it; customers who take the RER (the transit system) every day in the suburbs, rightly or wrongly, want more security.

MV – Perhaps we will return to the perception of security a bit later. Do you, as a researcher, feel concerned about ethics and when does one start wondering about the related issues?
JLD – Rather quickly as…
MV – Basically, it may look a bit like Einstein inventing the nuclear weapon and then…
JLD – No, no… First of all, I am here for the discussion. Researchers are used to discussing with other people, and it is very pleasant. The ANR (the National Research Agency) encourages us to work with people from other areas, which we do willingly. We also ask ourselves questions as citizens, but there are different levels. Research rests on mathematical foundations, signal processing, etc., which are rather independent of applications. Let me give you a single example: we examine faces to recognize skin colors. We get equivalent requests from people developing virtual makeup software and from those who want to infer someone’s ethnic origins. From a mathematical standpoint, both requests are nearly the same. This is just one example illustrating how ambiguous research can be and how different the applications are.
MV – MM?
MM – I’ll come to the rescue of my colleague, a computer scientist who risks being cast as the bad guy in this debate.
MV – Not at all!
MM – Since you referred to the ANR, we need to realize that research is being steered more and more toward contract-based activity. If we simply want to conduct research with scientific goals, we have to submit our projects with the right phrasing, so to speak, claiming they can be immediately useful: for security techniques, for biometrics, for improving video surveillance, for making it smarter. Then it works: we get money for our research, and if we don’t do this, we don’t get the money. The problem lies upstream. Above all in the following areas: in computer science, in microelectronics, and now in nanoelectronics, we are forced into public-private cooperation, otherwise nothing gets done and we fold our arms (and I say “we” out of solidarity, even though I am no longer in this area). We fold our arms: we cannot lead projects, we cannot take on Ph.D. students. All this makes sense, and it is comforting. I have myself seen projects submitted to the ANR involving a partnership with the gendarmerie to experiment with filtering techniques. These projects are always submitted with the best objectives in the world: so that our elderly don’t get lost on the bus, or for producing makeup, since white skin won’t react the way darker skin would – yet there is ethnic profiling behind it. Biometrics vendors justify the purchase of their products with the best arguments: for instance, on Asian and African markets, improving transparency and democracy during election periods, since these countries have no well-structured vital statistics and no real voter registers. So we always have the best arguments, but the root of the problem is not related to this or that technique. Though each technique has to be investigated properly, the problem resides in the massive, systematic usage of these techniques and their interconnections. This is where the problems originate.

MV – Short answer to take another question.
JMM – Another example comes from the GIXEL (an electronics industry group). A few years ago, a blue paper on what had to be done to develop the industry was addressed to the government. According to this paper, people are scared of video surveillance technologies, biometrics, RFID and control technologies. People see “Big Brother” when told about them, and this acts as a brake on business. It was explicitly written that, to develop their industry, RFID, biometrics and video surveillance devices had to be deployed in kindergartens and nurseries, so that parents and children would get used to these technologies, stop seeing “Big Brother” and no longer be scared. When this was pointed out at one of the Big Brother Awards sessions, the blue paper was published again with the sensitive parts removed. This is one of the issues the industry confronts us with, and how it tries to infiltrate our minds. Since then, we have seen nurseries with biometric devices…
MV – There are also these wristbands in maternity wards…
JMM – There are now RFID wristbands, supposedly preventing kidnapping.
MV – We’re in the same field…
IFP – I would like to answer the question, which appears absolutely fundamental to me. In the background, we notice the rise of these technologies and wonder what can be done. I think the situation is very different from the one we had in 1978, when the CNIL was founded. In some ways, everything was quite simple in 1978: there were large, scary public files, and the CNIL was mainly established to control them. Today the “threat” is totally decentralized. These public files still exist; Mr. Manach told us they keep expanding. But on top of that, there is personal information everywhere, which is not even gathered into files (this is the newest part) but is simply available. Each of us even offers it up. Processing tools make it possible to assemble it, and this is how these smaller files of a kind are formed. We understand intuitively that the solution has to be different. The CNIL as a regulator has to adapt, and this adaptation has been under way since 2004. But the whole chain has to follow the same logic. That is why I was insisting that individuals have a role to play in personal data protection, and companies as well. Each link of the chain has a part to play in the security-freedom balance.
MV – We are going to develop the topic with chip usage inside companies. JLD, a few words and another question.

JLD – I agree that Europe, and France in particular, issue calls for tender and organize research. They influence decisions by launching calls for proposals, but sometimes we respond to calls that were not really expected and suggest our own projects. For instance, I am strongly interested in system reliability, and I think it is crucial to show the general public that systems are not one hundred percent reliable. Thalès played along with a project aimed at demonstrating that spoofing can expose vulnerabilities in some biometric systems, and we will soon receive the answer to our proposal.
Man – The question is about knowing where we are heading. Since technologies are moving very fast, we need to look upstream, but nothing in computer science plays the role the ethics committee (CCNE) plays in the life sciences, for instance. The CNIL has a regulatory role, which has nothing to do with what the CCNE does. This comparison with the life sciences is relevant because there are technologies we decided not to develop. By thinking upstream, we said we did not want reproductive human cloning, and we stopped research heading in that direction. Is there anything equivalent in computer science – a technology we would not like to develop? Without thinking ahead, industry will shape it and there will be calls for proposals. To me, facial recognition can be compared to the artificial uterus: it is something that will radically change security in our society and individual freedom. It does not yet work perfectly, but there is money to fund it. That means there was not the same upstream thinking one finds in the life sciences.
