Algorithmic e-proctoring of exams: TestWe will not survive the winter

A couple of weeks ago, we wrote about the legal case, in which La Quadrature is involved, brought by students against the use of the algorithmic exam e-proctoring platform “TestWe” by the Institute of Distance Learning of the University of Paris 8. Last week, in a remarkably interesting preliminary ruling, the administrative court of Montreuil suspended the use of TestWe’s software.

TestWe, the company that develops the eponymous software, proudly claims to have built a tool that automatically proctors any student taking an exam remotely. In short, this automated surveillance consists of an algorithmic analysis of candidates taking a written test outside university facilities: their identity is automatically checked at the beginning of and during the test, their eyes are constantly tracked, and their immediate environment is constantly monitored through video and sound analysis (see our full analysis of the software). This implies the processing of personal data, much of it sensitive.

The legal procedure brought by the affected students, in which La Quadrature took part, asked the administrative court, by way of a preliminary ruling, to urgently suspend the use of the TestWe software by the Institute of Distance Learning of the University of Paris 8.

To obtain this suspension, we needed to demonstrate “serious doubt” before the administrative judges. In other words, we had to show that TestWe’s software is quite likely illegal, or that this illegality is likely enough that an urgent examination of the situation would reveal an obvious infringement of freedoms.1 This is exactly what the administrative court of Montreuil decided last week: TestWe’s software is quite likely to be disproportionate, and this doubt justifies suspending its use.

So what are our legal arguments? Any data processing such as TestWe’s must respect a number of requirements to be legal. These include the existence of a legal basis (i.e. the processing must be authorised by law), a purpose (i.e. the reason behind the processing) and proportionality (i.e. only the data necessary for that purpose, and no more, may be processed, sometimes referred to as the “data minimisation principle”). In our written submissions in support of the students before the administrative court (see our first submission, our “note en délibéré” and our reply to the shocking privacy impact assessment produced by the University), we detailed how TestWe’s software had no legal basis among those provided for by the GDPR and was disproportionate to the intended purpose, i.e. the proctoring of a remote exam.

It is this second point, the lack of proportionality, that was of particular interest to the administrative court. It considered that the “serious doubt” requiring the suspension of TestWe lies in the software’s apparent failure to comply with the GDPR’s data minimisation requirement. It is rather good news that the judge is open to this argument: it means that the scope of the student information and data TestWe collects and processes is far too broad and disproportionate for the stated purpose. In short, just because data exists or is available does not mean it is legal to use it for any purpose.

A ruling with many positive outcomes

First of all, let’s celebrate that algorithmic surveillance has been banned from the University: by raising doubts about the legality of this kind of tool, the administrative court of Montreuil sent a clear warning to all other universities and schools.

Also, beyond this case, this preliminary ruling notably creates a precedent for courts recognising the lack of proportionality of algorithmic surveillance systems. In May 2020, when we had the Paris police drones banned, the Conseil d’État (the highest administrative court in France) ruled in our favour because it found that the police had no legal basis for flying the drones, i.e. the drones were not authorised by law. In December 2020, when we won again, the Conseil d’État stated once more that there was no legal basis (it also added some considerations denouncing the very principle of this kind of device, which led us to claim a “total victory”). But the problem with victories obtained for lack of a legal basis is that they do not last forever: the law can be changed, and this is what the government rushed to do with its Global Security law (partially censored by the Constitutional Council) and then the Internal Security law (this time validated by the French Constitutional Council).

In the case of TestWe, the administrative court of Montreuil ruled that the doubt as to the legality of this data processing stems from its likely lack of proportionality. In other words, the court considers that permanent surveillance of bodies and sounds is not allowed for remote proctoring.

Since 2020, we have been fighting algorithmic video surveillance in the courts, whether in big cities like Marseille or in small towns like Moirans in the Alps. We are also fighting algorithmic sound surveillance, against the city of Orléans, which dreams of being a leader in this area. However, while this kind of surveillance is not currently authorised by law, the missing legal basis could soon be provided, at least for algorithmic video surveillance, by the upcoming law on the 2024 Olympic Games, which is about to legalise this mass video surveillance of public space. This is the real value of the TestWe ruling: the proportionality of this kind of permanent algorithmic surveillance is now seriously called into question.

The students’ victory before the administrative court in Montreuil will also allow us to better oppose the CNIL’s wish to make these algorithmic e-proctoring tools legal. Contrary to what the court just ruled, the CNIL would like to recognise some of these video and audio exam proctoring tools as proportionate, and to promote the use of algorithm-based surveillance tools.

The fight continues

The fight does not end with this preliminary ruling. The judge did not say that TestWe is illegal, only that there is serious doubt about its legality. We will therefore keep fighting alongside the students of Paris 8 in the next steps of the procedure (i.e. the normal, long, non-urgent procedure) in order to confirm this success and get the administrative court of Montreuil to definitively recognise TestWe as illegal. This final ruling, however, will take several months.

On its website, TestWe proudly lists as clients institutions such as the CNED, ESSEC, Grenoble Management School and the SESAME competition. We therefore call on the students of these institutions, and on anyone whose university or school uses software similar to TestWe, as well as on professors and teachers, to continue the movement initiated by the students of Paris 8: make noise in your universities, take up and reuse the legal arguments, and tilt the balance of power in your favour. Today’s victory is only waiting to spread and, in the end, to kick the Technopolice and all the ideas it drags along out of our universities and educational systems.

References

1 The very same “serious doubt” allowed us to obtain the suspension of the use of drones by the Paris police authorities in 2020.