Swedish police used Clearview AI on citizens and lied about it



Swedish police have used Clearview AI without first performing the legally required impact assessment ("konsekvensbedömning").

In other words, Swedish police have used a facial-recognition system without being allowed to do so.

This is a clear breach of human rights.

Swedish police have lied about this, as reported by Dagens Nyheter.

This is the type of thing that people should be talking more about, as I mentioned in regard to Women in Tech Stockholm 2020.

What will happen now?

Most probably nothing. Somebody in the Swedish police gets a slap on the wrist, while our private data has been uploaded to a US-based company, which means that the US government now has access to our data and can use it at will.

All of this is known thanks to BuzzFeed News, which disclosed that Clearview AI has customers outside of the US, Sweden among them. Quoting from that article:

When BuzzFeed News reported earlier this month that Clearview AI had used marketing materials that suggested it was pursuing a “rapid international expansion,” the company was dismissive, noting that it was focused on the US and Canada.

The company’s client list suggests otherwise. It shows that Clearview AI has expanded to at least 26 countries outside the US, engaging national law enforcement agencies, government bodies, and police forces in Australia, Belgium, Brazil, Canada, Denmark, Finland, France, Ireland, India, Italy, Latvia, Lithuania, Malta, the Netherlands, Norway, Portugal, Serbia, Slovenia, Spain, Sweden, Switzerland, and the United Kingdom.

Clearview AI scrapes—which is a fancy version of saying steals—photos from Facebook, Twitter, and other sources.
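To make "scraping" concrete: it simply means a program downloads public web pages and mechanically extracts the content it wants, image URLs in this case. The sketch below is a minimal, hypothetical illustration of that extraction step using only Python's standard library; it is not Clearview AI's actual code, and a real scraper would fetch the HTML over HTTP first.

```python
# Minimal illustration of the extraction step of image scraping.
# Hypothetical example, not Clearview AI's actual implementation.
from html.parser import HTMLParser


class ImageScraper(HTMLParser):
    """Collects the src attribute of every <img> tag in an HTML document."""

    def __init__(self):
        super().__init__()
        self.image_urls = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag's attributes
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.image_urls.append(src)


# A real scraper would fetch this HTML from a live profile page;
# a static snippet is used here to keep the example self-contained.
html = (
    '<html><body>'
    '<img src="https://example.com/a.jpg"><p>caption</p>'
    '<img src="/photos/b.png">'
    '</body></html>'
)

scraper = ImageScraper()
scraper.feed(html)
print(scraper.image_urls)  # every image URL found in the page
```

Run at scale over millions of profile pages, this trivially simple loop is all it takes to build a face database nobody consented to.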

The consequences of using tools like this one are staggering and mean far more than a lot of people know.

Bonus info: all of the above is known not thanks to Clearview AI or the Swedish authorities, but via a data breach:

Fucking buffoons.

The system has been criticised due to its method of scraping the internet to gather images and storing them in a database. Privacy activists say the people in those images never gave consent.

“Common law has never recognised a right to privacy for your face,” Clearview AI lawyer Tor Ekeland said in a recent interview with CoinDesk. “It’s kind of a bizarre argument to make because [your face is the] most public thing out there.”

“Clearview AI Lawyer Tor Ekeland Says Your Face Is Public Property - CoinDesk.” CoinDesk. March 6, 2020. https://www.coindesk.com/clearview-ai-lawyer-tor-ekeland-says-your-face-is-public-property.

Posted from my blog with SteemPress : https://niklasblog.com/?p=24356