1. ‘Way more than people counting’
It was hardly more than a footnote – the message that informed the University Council in October 2020 that the university wanted to switch to an ‘automated visitor management system’. The idea was that by using a ‘counting system infrastructure’ the university would be able to ‘monitor the use and occupancy of classrooms at any time of the year and at any moment of the day’, ‘gather management data on a structural basis’, and ‘respond immediately if necessary’. The usual method of carrying out manual checks at the beginning of each semester had proved to be too labour-intensive.
The matter is urgent. The board wants to approve the system through an accelerated procedure: ‘In view of the importance of this proposed decision regarding the provision of information on room usage in light of the measures that were taken due to the Covid-19 crisis.’
The University Council is concerned about privacy issues but is reassured by the Executive Board. According to the board, ‘No personal data is recorded.’ This is because the system is ‘fed by so-called “3D visualisation”’, which provides ‘key indicators that are used in the dashboard’.
And so it happens that, while the doors are still closed during the lockdown, the university purchases 350 smart cameras at 600 euros each. They are immediately placed at the entrances of university buildings to keep track of the number of visitors, and they have been hanging above the door of every lecture hall since ‘mid-2021’.
Classroom scanners, university spokesperson Caroline van Overbeeke calls them in response to questions from Mare. ‘These people counters only register numbers. No other data is recorded so no personal characteristics are registered or saved.’
The term ‘classroom scanners’ appears to have been coined by the university itself, as it cannot be found anywhere else.
The cameras are made by the Swiss manufacturer Xovis and they can do way more than just count. The camera sensors are equipped with artificial intelligence that constantly analyses the footage. Aside from counting, they can also track the walking routes of people who pass by, display their heights, estimate their age, gender or mood, and even check whether they are wearing a face mask.
This data allows for extensive analyses. Such cameras can also be found at airports, for example, where the analyses are used to reduce queuing times. In shops, they analyse whether and how long customers watch advertisements and at what point they decide to buy something. Since the pandemic, Xovis has been encouraging potential customers to use the scanners to enforce social distancing, and has also introduced face mask detection.
In short, the university may call it ‘people counting’, but the manufacturer uses an entirely different slogan: ‘Way more than people counting.’
2. Completely anonymous or a live feed
The inconspicuous boxes can currently be found at every faculty, including The Hague. Only the Leiden University Medical Centre has none. There are two round openings in the housing with a camera behind them. No one can pass under the sensor unnoticed.
One of the corridors in the Kamerlingh Onnes building, where the Faculty of Law is located, is particularly well equipped. There is a Xovis sensor every few metres, above every teaching room door. The ceiling is high, about four metres. According to the manufacturer, each camera can monitor an area of 7.50 by 4.20 metres from that height. With all the cameras together, that would be sufficient to monitor the long corridor almost in its entirety.
Contrary to what the university claims, the scanners are not configured to exclusively display numbers. This is how it works: the camera first records video footage of its surroundings, and the artificial intelligence then extracts certain data from it: man, woman, child, employee, height, walking route. The amount of data the user gets to see, and how anonymous it is, depends on the settings.
There are four levels, ranging from 0 to 3. At the highest privacy level (3), the user only sees the number of people who pass by; this is completely anonymous. At the two intermediate levels (1 and 2), the user sees moving abstract dots with, for example, information about height or age. At the lowest level (0), there are no restrictions: the sensors provide a live feed of the camera footage, in which people are clearly identifiable.
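For the technically inclined reader, the four levels can be summarised as a simple mapping from level to the data exposed. The sketch below is purely illustrative; it is not Xovis’s actual software or configuration format, only the levels as described above.

```python
# Illustrative summary of the four privacy levels described above.
# This is only a sketch for the reader, not Xovis's actual API.

PRIVACY_LEVELS = {
    3: "counts only: number of people passing the sensor (fully anonymous)",
    2: "moving abstract dots, plus attributes such as height or age",
    1: "moving abstract dots, plus attributes such as height or age",
    0: "no restrictions: live video feed in which people are identifiable",
}

def data_exposed(level: int) -> str:
    """Describe what the sensor reveals to the user at a given privacy level."""
    if level not in PRIVACY_LEVELS:
        raise ValueError("privacy level must be 0, 1, 2 or 3")
    return PRIVACY_LEVELS[level]

print(data_exposed(1))  # the level the university says its cameras are set to
```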
The university states that the cameras are set to ‘level 1’, the second-lowest level. This is ‘so that people are not identifiable and are only registered as silhouettes’, Van Overbeeke says. ‘The university guarantees that it doesn’t register any specific personal characteristics.’ According to her, this is ‘ensured at each privacy level of the scanners, from the lowest (0) to the highest (3)’.
Anyone who wants to change the system to a lower level, and thereby collect more information, needs a so-called ‘master key’. The university informs Mare that the University Services Department manages the system, but is not in possession of the master key. ‘Only the supplier can change the privacy settings and does so exclusively on the instructions of the university.’
3. Poor and outdated security
If a system is capable of collecting large amounts of data, the security has to be up to scratch, and that is not the case here. Until last week, the cameras’ login page was the first hit on Google if you searched for ‘Xovis Leiden University’. After Mare’s questions, the page was made private.
Not only did this public page, which seemed to be protected only by a password, give access to the data collected by the cameras, but possibly also to various privacy settings.
Mare asked ethical hacker Sijmen Ruwhof to take a look at the page. ‘These administrator login screens should absolutely not be accessible via the public internet. Regular security camera systems aren’t either. As the system uses http (no padlock icon – Ed.) instead of the more secure https (with padlock icon – Ed.), the password you enter is transmitted over the internet in unencrypted form and is therefore easily viewable. It travels from your computer to the university’s server via all the intermediate stops. This is extremely imprudent, especially for a professional institution; it’s simply not acceptable anymore. It looks like a poorly implemented system that hasn’t been given much thought.’
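To illustrate what Ruwhof describes: with plain http there is no encryption layer, so the login request travels as readable text. The sketch below (in Python, with a made-up host name and form fields, not taken from the actual Xovis page) shows the raw bytes an eavesdropper on the network path would see.

```python
# Minimal sketch of why a login over plain http is readable in transit.
# The host name and form field names below are hypothetical.
from urllib.parse import urlencode

def build_http_login_request(host: str, username: str, password: str) -> bytes:
    """Build the raw bytes of an HTTP POST as they travel over the network.

    With plain http there is no TLS layer, so these bytes are sent as-is and
    can be read by anyone on the path between browser and server.
    """
    body = urlencode({"username": username, "password": password})
    request = (
        f"POST /login HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"Content-Type: application/x-www-form-urlencoded\r\n"
        f"Content-Length: {len(body)}\r\n"
        f"\r\n"
        f"{body}"
    )
    return request.encode("ascii")

# The password appears verbatim in the bytes on the wire:
print(build_http_login_request("sensors.example.edu", "admin", "hunter2").decode())
```

With https, by contrast, these same bytes are wrapped in an encrypted TLS session before they leave the browser, so an eavesdropper only sees ciphertext.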
Even users who are not logged in can gather quite a lot of information via this page. For instance, they are able to view the various settings of the cameras.
Moreover, one notable line reveals the level of the privacy settings: ‘<privacy-mode> 0 </privacy-mode>’ (see box below).
4. The moral responsibilities of an overkill system
‘Every single instance of personal data processing is a violation of someone’s privacy’, says a spokesperson for the Dutch Data Protection Authority (Dutch DPA). ‘You need a lawful basis for this, often referred to as a “legal ground” in GDPR terminology. The greater the breach of privacy, the more important this basis must be. In other words: the end must justify the means. An organisation has to consider this beforehand.’
As an enforcer of privacy legislation, the DPA does not want to comment on individual cases that it has not investigated itself.
What the spokesperson does want to do, however, is give a general explanation. ‘The bar is set very high when it comes to tracking people on a large scale. And as an organisation you should always ask yourself the question: can I achieve this goal by less severe means, so without infringing on people’s privacy?’
‘In principle, counting people’s presence seems to me to be a legitimate aim’, says Bart Jacobs, professor of security, privacy & identity at Radboud University. ‘But if more functionalities are enabled for the cameras, that is illegal. The principle of the GDPR is purpose limitation, and this entails data minimisation.’
Data minimisation is a term used in privacy legislation. In short, it means that an organisation may not collect more data than is strictly necessary for the intended purpose. So, if you want to know how many people are present in a lecture hall at a particular moment, you do not need to know whether these people are men or women or whether they look happy or not. Gathering this sort of additional information is not permitted.
Text continues below.
Research by Mare reveals that digital access to the Xovis sensors poses various security risks.
Anyone who googles ‘Xovis Leiden University’ immediately finds the administrator login screen. Upon opening it, the browser warns that it has detected a security risk: the page is considered unsafe because it uses the older, unencrypted http rather than the more secure https.
‘This is extremely imprudent, especially for a professional institution’, says ethical hacker Sijmen Ruwhof.
The HTML code behind the page reveals even more weaknesses. Users who are not logged in can read various settings of the device straight from the HTML source of the login screen: the sensor’s temperature, the amount of light it receives, and the server to which the data is transferred are all openly readable.
An archived copy of the login page from August includes the line ‘<privacy-mode> 0 </privacy-mode>’. If that is indeed the current privacy level of the Xovis cameras, it would mean that the cameras were set to the lowest, unlimited, privacy level, at least at that time. According to Xovis, this setting means: ‘No restrictions. Live video image stream and tracked persons paths are shown.’
Further analysis revealed that the login page even exposes the password, in hashed form, to users who are not logged in. Anyone browsing the HTML will read ‘<password enc="true">bxJF…’ (shortened for security reasons). A little further on in the code, you can also read which algorithm was used to hash the password, making it possible to crack it and gain access to the system. The password is hashed with a so-called ‘salt’, which makes cracking harder, but this salt can also be found in the code itself. It is unclear whether the system’s master key can be retrieved in this way as well.
According to Ruwhof, this is another sign of carelessness: ‘The MD5 algorithm that’s used here to store the password has been known to be really unsafe for about fifteen years now. The fact that the server sends the encrypted password to the user is outrageous and would never have been necessary if the system had been set up in a secure manner. The combination of these two weaknesses poses a serious security risk.’
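To show why this combination is dangerous: once both the salted hash and the salt are readable in the page source, an attacker can try candidate passwords offline, and MD5 is fast enough to test enormous numbers of them per second. The sketch below is a minimal illustration with made-up values; the exact hashing scheme (salt prepended to the password) is an assumption, not a detail taken from the Xovis firmware.

```python
# Sketch of an offline dictionary attack on a salted MD5 hash.
# The hash, salt, and word list below are made up for illustration;
# they are not the values exposed on the Xovis login page.
import hashlib

def md5_salted(password: str, salt: str) -> str:
    """Hash a password with MD5 using a prepended salt (one common scheme)."""
    return hashlib.md5((salt + password).encode("utf-8")).hexdigest()

def crack(target_hash: str, salt: str, candidates: list[str]) -> str | None:
    """Try every candidate password until one matches the exposed hash."""
    for candidate in candidates:
        if md5_salted(candidate, salt) == target_hash:
            return candidate
    return None

# An attacker who has read the salt and hash from the page's HTML
# can run this entirely offline, without ever touching the server again.
salt = "x9f3"                                 # hypothetical salt from the page
leaked_hash = md5_salted("welkom2021", salt)  # hypothetical exposed hash
wordlist = ["admin", "password", "leiden", "welkom2021"]

print(crack(leaked_hash, salt, wordlist))  # -> 'welkom2021'
```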
In the spirit of responsible disclosure, Mare reported these risks to the university prior to the publication of this article. As a result, the university has shielded the login pages from the public internet. The ISSC informed Mare that it considers the security risks to be low.
Jacobs: ‘Strictly legally speaking, you’re allowed to use a system with more functionalities than just counting people, as long as you do not use the other functions. But if you purchase an overkill system like this, I think that, as a university, you at least have the moral responsibility to take as many confidence-building measures around it as possible. For example, independent audits should be carried out on a regular basis. It should also be clear who is allowed to change the privacy settings and under what circumstances.’
These rules also apply if the footage is converted directly into data, as is the case with Xovis. ‘Processing personal data “on the sensor” is still processing personal data’, says the DPA spokesperson.
‘It is possible, however, that by anonymising data on the sensor, the infringement of privacy is smaller than it would be with centralised processing, and that, when weighing the infringement against the benefits, it is therefore more likely to be permissible. It must be noted, though, that anonymisation is a very difficult matter, especially for long-term monitoring in which location and behaviour also play an important role. And if the anonymisation on the sensor can easily be switched off remotely, there may still be a major risk.’
5. Sensitive personal data
Professor Bart Jacobs agrees with the Data Protection Authority that the cameras process personal data. ‘As soon as there is identifiable footage at some point in the process, we’re talking about personal data. What is important to consider here is that, according to the law, in addition to personal data, there is also sensitive personal data. This sensitive data concerns ethnicity, religion, or health. You can often identify these things with camera footage. Therefore, this system also processes sensitive personal data. This must be covered by additional safeguards, set out in a DPIA.’
A DPIA, or ‘data protection impact assessment’, is an assessment that an organisation must carry out before processing data with a high privacy risk. According to the Data Protection Authority, there is a high privacy risk if, among other things, you ‘process special categories of personal data on a large scale’ or ‘track people in publicly accessible areas on a large scale and systematically’.
However, the university’s independent data protection officer, Ricardo Catalan, believes that a DPIA was not necessary in this case. ‘The cameras only register identifiable images for a split second before the images are converted into unidentifiable images (silhouettes). The unidentifiable images are then forwarded for further processing. This is all covered by the agreement that the university has concluded with the supplier.’
According to him, the risk of hackers stealing the images before anonymisation has taken place is very small. Besides, ‘the impact would be limited’ for the people who were filmed. ‘The university is an open institution and there is free access to its lecture hall buildings.’
He did, however, advise that a data processing agreement be drafted. Yet the university does not want to grant access to the only document that contains agreements about the data processing.
‘It’s not common practice to share such an agreement, in part because of security reasons’, spokesperson Van Overbeeke explains. ‘In essence, the agreements state that the external party processes personal data on behalf of the university and that the personal data may not be used for any other purpose except on the instructions of the university. Other agreements concern the location of the data storage (in the EU if possible), retention periods, assistance in the event of incidents, agreements in the event of data leaks, et cetera.’
6. A fine would be painful
‘If more functionalities are enabled for the cameras than necessary, it’s illegal’, says Jacobs. ‘If the Data Protection Authority were to enforce this, it could result in a large fine. In fact, any student or employee can submit an enforcement request to the DPA. A fine like that would be very painful for the university.’
Why does the university opt for an extremely smart and expensive system if most of its capabilities may not be used? According to Van Overbeeke, ‘reliability tests have been carried out in the past few years’ with regard to people counting. ‘They have tested infrared scanners and methods for translating Wi-Fi signals or CO2 values into the number of people present. The results were found to be insufficiently reliable. In the end, the current scanners proved to be the most reliable.’
The cameras have been there for a whole year. Nevertheless, the university only decided to inform employees and students this week.
A year after the cameras were installed, while Mare was waiting for reactions to this article, an announcement suddenly appeared on the university website on Tuesday evening: ‘Now that in the past weeks, all the scanners have been installed in the right place, we would like to inform you about this service.’
When asked why the cameras are not set to the most anonymous security setting, the spokesperson replied: ‘Now that all installation and calibration work has been completed, the university is going to investigate whether the next privacy level (2) does not negatively affect the reliability of the counts.’