IBM's about-face: we will no longer give facial recognition software to the police



IBM says enough: no more facial recognition solutions, and no analysis software connected to them. The company's new CEO, Arvind Krishna, explained the decision in a letter to the US Congress addressed to several Democratic lawmakers, including Kamala Harris, a former primary candidate now in the running for a spot on Joe Biden's ticket. "IBM firmly opposes and will not condone uses of any facial recognition technology, including technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose that is not consistent with our values and principles of trust and transparency," writes the 58-year-old, India-born executive, at the helm only since last April. More: Krishna, who comes from the company's cloud division, renews the call for a "national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies".



Facial recognition software has been at the center of controversy for years. Consider the disputes around Rekognition, the platform developed by Amazon and offered in the cloud to numerous US police departments. By the admission of Andy Jassy, CEO of Amazon Web Services, the Seattle giant does not even know exactly how many departments use the more than 165 services – recognition services included – available on the platform. Or the debate around Clearview AI, the New York startup that suffered a data breach at the end of last year and that has served, in addition to many private clients, US federal agencies and, again, local law enforcement units. The company – accused of scraping more than three billion images from social networks to train its neural networks – recently said it would terminate all contracts not signed with police forces or other state and local agencies. But the issue has a long history: facial recognition has been used by police in the United States for twenty years (in Florida since the nineties), and this year Facebook paid 550 million dollars to settle a case dating back to 2015, tied to the friend tag-suggestion feature in photos, which stored biometric data without consent. All of which illustrates the Wild West in which the vendors and users of these tools operate.



These tools have often come under fire on both fronts: when used by law enforcement, because of the biases baked into their design and algorithms with respect to gender, age or skin color (a 2018 study by Joy Buolamwini and Timnit Gebru documented this for IBM's solution as well), and when made available to private companies, because they operate outside any regulation and with unclear purposes of use. All of this, as always, against the backdrop of China and the abuses to which such biometric surveillance can lead. Amazon itself, for its part, has officially asked governments to step in and regulate the technology, to make sure, as Jassy put it, "that it is used appropriately".

IBM had already tried, in 2018, to rein in the assorted biases and distortions of its facial recognition solution by releasing a series of tools and datasets. But even the US giant was caught with its hand in the cookie jar last year, when it emerged that it had used nearly a million photos taken from the Flickr platform without any consent, even though the images had been uploaded under a Creative Commons license.

In any case, Krishna's letter is certainly no coincidence: it arrives at a moment that mixes the ongoing social tensions in the United States, which exploded after the killing of George Floyd in Minneapolis, with a debate that has been running for years on this specific issue. In particular, the calls to downsize city and local police forces, cutting or reviewing law enforcement budgets and their room for maneuver in the field. Krishna does not touch on that point, but argues that the culture of impunity should end, with new federal rules to hold police more accountable for misconduct, for example through the widespread use of body cameras and data analysis tools to study officers' behavior.

Returning to facial recognition, the CEO explains that "artificial intelligence is a powerful tool that can help law enforcement keep citizens safe. But vendors and users of these systems share a responsibility: to ensure that artificial intelligence is tested for bias, especially when used by law enforcement, and that any bias found is audited and reported." The letter closes with a call to "create more open and equitable pathways for all Americans to acquire marketable skills and training", a need it describes as "particularly acute in communities of color". To that end, Krishna explains, the company intends to push its own P-TECH program and the expansion of Pell Grants, the federal college grants, which should be extended and better funded.







Source link
https://www.repubblica.it/tecnologia/sicurezza/2020/06/09/news/dietrofront_ibm_non_daremo_piu_software_di_riconoscimento_facciale_alla_polizia-258784368/
