A lesson on facial recognition, privacy, and the GDPR from the Far North




All eyes may be on Hong Kong and its use of facial recognition to identify protesters, but there is a far more illuminating case – for those lucky enough to have less antagonistic relations with their local authorities – in the small Swedish town of Skellefteå ("sche-eye-left"), located near the Arctic Circle. This sleepy mining town made headlines in the international press by becoming the recipient of Sweden's very first GDPR fine, for piloting the use of computer vision to track the attendance of high school students.

Why does a small fine levied on a remote town matter? Because it demonstrates that computer vision – and facial recognition in particular – can now be successfully implemented by anyone, regardless of resources and capabilities, and that you need to focus on meaningful consent to ensure successful adoption.

Students trained on students

Last year, Skellefteå's Anderstorp High School teamed up with the Finnish IT software and services firm Tieto to test automation technologies for monitoring and reporting student attendance. Under Swedish law, schools must report attendance each day – a time-consuming requirement over the course of the school year. Tieto reports that the school's teachers spent 17,280 hours annually on these reports.

Tieto tested two solutions to this problem: one based on RFID tags, which recorded students as present when they came within range of a receiver, and the other based on facial recognition technology. The latter proved more successful, because a large percentage of students forgot to bring their tags to class, leaving the data incomplete and inaccurate.
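Neither Tieto nor the school has published implementation details, but the core mechanics of such a system are straightforward. Below is a minimal, hypothetical sketch in Python of the face-recognition variant: a face detected in a camera frame is converted to an embedding vector by some upstream model (not shown) and matched against enrolled students by cosine similarity. All names, the embedding size, and the threshold are illustrative assumptions, not details from the pilot.

```python
import numpy as np

SIMILARITY_THRESHOLD = 0.8  # hypothetical cutoff; a real system would tune this carefully


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def mark_attendance(frame_embeddings, enrolled, attendance):
    """Match each face embedding detected in a frame against enrolled
    students; mark the best match present if it clears the threshold.

    frame_embeddings: list of embedding vectors detected in a camera frame
    enrolled:         dict of student_id -> enrolled embedding vector
                      (biometric personal data under the GDPR)
    attendance:       set of student_ids already marked present today
    """
    for emb in frame_embeddings:
        best_id, best_score = None, SIMILARITY_THRESHOLD
        for student_id, ref in enrolled.items():
            score = cosine_similarity(emb, ref)
            if score > best_score:
                best_id, best_score = student_id, score
        if best_id is not None:
            attendance.add(best_id)
    return attendance


# Toy usage: random vectors stand in for real face embeddings.
rng = np.random.default_rng(0)
enrolled = {"student_a": rng.normal(size=128), "student_b": rng.normal(size=128)}
frame = [enrolled["student_a"] + rng.normal(scale=0.05, size=128)]  # noisy re-sighting
print(mark_attendance(frame, enrolled, set()))  # {'student_a'}
```

Even in this toy form, the regulatory issue is visible: the enrolled embeddings are biometric data tied to identifiable students, which is precisely the category of processing that drew the Swedish DPA's scrutiny.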

Don't cheat on meaningful consent

So why the fine of 200,000 SEK (about 20,000 USD)? One of the key elements of the complaint filed by the Swedish Data Protection Authority (DPA) concerned the question of consent. Yes, students and parents were asked for their consent (and some even refused to participate), but the agency felt this was not enough given the power dynamics between students and the school. Thus, written consent is not enough – even when the person has a viable alternative – if pressure or coercion is applied to obtain it. This is the difference between "consent" and "meaningful consent," and it applies everywhere when it comes to artificial intelligence. You risk backlash from customers, employees, and regulators when a person feels cheated, pressured, or otherwise forced into using an AI solution.

The right way to introduce an AI solution is to offer one that genuinely benefits the individual and to convince them of those benefits – while offering alternatives, including the status quo. And individuals must be absolutely certain that choosing the status quo carries no negative repercussions and that they will remain as well off as – or, ideally, better off than – before the AI solution arrived. Otherwise, you are back to making people feel coerced.

Do your homework

Another key element that led to the fine appears to be the lack of a sufficient impact assessment. For sensitive projects such as this one, it is extremely important to engage proactively with the relevant authorities, and it seems the DPA was never contacted beforehand. Important questions needed to be examined: How would students who opted out feel? Even knowing their biometric data was not being collected, would they still be comfortable? Would students who opted in trust that the technology accurately recorded their presence? What were the risks associated with collecting and storing this type of biometric data, and how would those risks be mitigated?

Pencils down?

Does this mean that computer vision solutions are incompatible with the GDPR? Far from it. This fine is based not on the GDPR itself but on Sweden's interpretation of the GDPR – and Sweden has a long history of enforcing strict data protection legislation stretching back decades. The fine is unlikely to set a limiting precedent across the European Union. In fact, if there is a broader geographic takeaway from this story, it is a positive one. Ten years ago, this computer vision use case would have been science fiction. Today, the technology is mature enough to be deployed anywhere, even on the edge of the Arctic Circle.

This article was written by Kjell Carlsson, Ph.D., Senior Analyst, and originally appeared right here.