Privacy and data protection are two of the major issues of our times.
And for more than a few good reasons, not least among them the tendency of some companies to skirt the rules and, in some cases, outright break the law.
In the case of Clearview AI in France, the country’s privacy watchdog CNIL is alleging just that, outlining two practices in particular.
Both of these involve the company’s use of images found on the Internet in its software development. Known as scraping, this practice typically involves taking huge numbers of images off of websites, often via image search – usually without anyone’s knowledge, let alone their permission.
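To give a rough idea of how simple scraping is at a technical level, here is a minimal sketch that pulls image URLs out of a page’s HTML using only Python’s standard library. The sample HTML and URLs are hypothetical; a real scraper would fetch live pages over HTTP and download each image file, typically without the site owner or the people pictured ever knowing.

```python
# Minimal illustration of image scraping: parse HTML and collect
# the src attribute of every <img> tag. The sample page below is
# hypothetical, used only to show the principle.
from html.parser import HTMLParser


class ImageSrcCollector(HTMLParser):
    """Collects the src URL of every <img> tag encountered."""

    def __init__(self):
        super().__init__()
        self.image_urls = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            for name, value in attrs:
                if name == "src" and value:
                    self.image_urls.append(value)


sample_html = """
<html><body>
  <img src="https://example.com/photos/portrait1.jpg" alt="portrait">
  <p>Some caption text</p>
  <img src="https://example.com/photos/portrait2.jpg" alt="portrait">
</body></html>
"""

collector = ImageSrcCollector()
collector.feed(sample_html)
print(collector.image_urls)
```

The point is not the code itself but how little of it there is: once a page is fetched, harvesting every photo on it takes a dozen lines, which is exactly why regulators focus on what companies do with the images rather than on the scraping mechanics.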
TechCrunch notes that CNIL’s action, while technically only applicable to French territories, could easily be extended across the European Union through similar actions by sister agencies in other countries. This is because Clearview AI, as a US-based company without an EU footprint, is “open to regulatory action across the EU, by any of the bloc’s data protection supervisors.”
As CNIL put it: “These people, whose photographs or videos are accessible on various websites and social networks, would not reasonably expect their images to be processed by [Clearview AI] to feed a facial recognition system that can be used by states [such as for] police purposes.”
In a statement to TechCrunch published after the website’s initial report, Clearview AI’s CEO argued that, actually, the company isn’t subject to GDPR and that its practice is only to “collect public data from the open internet and comply with all standards of privacy and law.”
Of course, we’ll keep you updated on how all of this turns out.
We’d love to know what you think of companies scraping photos off of the Internet to use in their software development programs in the comments below.
We’ve got some other photography news on Light Stalking at this link right here.