ACLU to take on racist artificial intelligence

stellarborg

Sexist, racist, and discriminatory artificial intelligence has a new opponent: the ACLU.

Earlier this month, the 97-year-old nonprofit advocacy organization launched a partnership with AI Now, a New York-based research initiative to address the social consequences of artificial intelligence.

In May last year, a stunning report claimed that a computer program used by a US court for risk assessment was biased against black prisoners. The program, Correctional Offender Management Profiling for Alternative Sanctions (Compas), was much more prone to label black defendants as likely to reoffend, flagging them at almost twice the rate of white defendants (45% vs 24%).
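
For context, the figures cited refer to false positive rates: the share of defendants who did not go on to reoffend but were flagged as high risk anyway, computed separately for each group. A minimal sketch of that calculation in Python, with made-up toy numbers rather than the real Compas records:

def false_positive_rate(flags, reoffended):
    """Share of non-reoffenders who were nonetheless flagged high risk."""
    false_positives = sum(1 for f, r in zip(flags, reoffended) if f and not r)
    non_reoffenders = sum(1 for r in reoffended if not r)
    return false_positives / non_reoffenders

# Hypothetical data: 1 = flagged high risk / did reoffend, 0 = not.
group_a_flags   = [1, 1, 1, 1, 0, 1, 0, 0, 1, 0]
group_a_reoffed = [1, 0, 1, 0, 0, 1, 0, 0, 0, 0]
group_b_flags   = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
group_b_reoffed = [1, 0, 0, 1, 0, 0, 0, 0, 0, 0]

print(false_positive_rate(group_a_flags, group_a_reoffed))  # 3/7, about 0.43
print(false_positive_rate(group_b_flags, group_b_reoffed))  # 1/8, 0.125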

“If you’re not careful, you risk automating the exact same biases these programs are supposed to eliminate,” says Kristian Lum, the lead statistician at the San Francisco-based, non-profit Human Rights Data Analysis Group (HRDAG). Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, can get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. The program was “learning” from previous crime reports. For Samuel Sinyangwe, a justice activist and policy researcher, this kind of approach is “especially nefarious” because police can say: “We’re not being biased, we’re just doing what the math tells us.”
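
To see how the feedback loop Lum describes can arise, here is a toy simulation (my own illustration, not PredPol's actual algorithm). The model flags whichever district has the most recorded crime as the hotspot and sends extra patrols there; since crime is only recorded where someone is watching, a small initial gap in the records compounds even though the true crime rates are identical:

# Toy model: two districts with IDENTICAL true crime rates.
SEEN_WITH_PATROL = 8   # crimes recorded per day when heavily patrolled
SEEN_WITHOUT     = 2   # crimes recorded per day otherwise

reports = {"A": 12, "B": 8}   # district A starts slightly over-recorded

for day in range(30):
    # "We're just doing what the math tells us": patrol the top district.
    hotspot = max(reports, key=reports.get)
    for district in reports:
        seen = SEEN_WITH_PATROL if district == hotspot else SEEN_WITHOUT
        reports[district] += seen

print(reports)  # {'A': 252, 'B': 68}: the 12-vs-8 gap has exploded

District A never stops being the hotspot, so the model keeps confirming its own earlier predictions; that is the "learning from previous crime reports" problem in miniature.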

https://www.fastcodesign.com/901342...o-civil-liberty-the-aclu-has-a-plan-to-fix-it
https://www.theguardian.com/inequal...ots-how-ai-is-learning-all-our-worst-impulses
 
after gamergate we'll have #predictiveanalyticsgate
doesn't really have the same ring to it
 
Hint: the problem here isn't a sentient AI recognizing the racial superiority of white people. It's the parameters given to the program by human beings.
 
Tay by absolute curb stompage.
 
Hint: the problem here isn't a sentient AI recognizing the racial superiority of white people. It's the parameters given to the program by human beings.
Such as black people being more likely to re-offend in real life.
 
Not all cultures are the same. Some are smarter and harder-working than others. This is represented in their culture and in what their country looks like.
 
In other words, they're asking the engineers to rig the algorithms to suit current, fickle human sensibilities that we are unlikely to still hold 100 years from now, which will make all of the data totally useless: an exercise in deliberate confirmation bias.

This worked in psychology (the completely pseudoscientific separation of biological sex from "social" gender) and in anthropology (despite modern genetic research strongly reaffirming classical racial typologies, we abandoned them out of "social awareness in classroom environments"), and it's becoming harder and harder to suppress information about crime data in the digital age, so the data itself needs to be modified.

Let's shop around for statistical loopholes until we find something that suits our emotional needs and worsens the problem.
 
In other words, they're asking the engineers to rig the algorithms to suit current, fickle human sensibilities that we are unlikely to still hold 100 years from now, which will make all of the data totally useless: an exercise in deliberate confirmation bias.

This worked in psychology (the completely pseudoscientific separation of biological sex from "social" gender) and in anthropology (despite modern genetic research strongly reaffirming classical racial typologies, we abandoned them out of "social awareness in classroom environments"), and it's becoming harder and harder to suppress information about crime data in the digital age, so the data itself needs to be modified.

Let's shop around for statistical loopholes until we find something that suits our emotional needs and worsens the problem.

This.

Very well said. If there's a legitimate problem, sticking your head in the sand is childish and doesn't help solve it.
 
People who've never set foot in the ghetto always bring the luls when talking about it.

It's so evident they're completely detached from the realities of those communities. They're almost as bad as the people who think it's a good idea to march into radical Islamic strongholds, assured that if they just hug a jihadi, he'll immediately abandon the desire for a world-dominating caliphate...
 
You guys know that a program just does the calculations, right? If you add a 30% chance to reoffend to the program whenever the person is black, that is not the AI being racist, just the programmer.
 
The AI was a blank slate. It chose the less retarded side. Lol!
 
You guys know that a program just does the calculations, right? If you add a 30% chance to reoffend to the program whenever the person is black, that is not the AI being racist, just the programmer.

The program did not have anything like that coded in at inception. It LEARNS its rules from the data it's fed.
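
A minimal sketch of that point (a toy model, not Compas): nobody writes "add 30% if black" anywhere. The model simply estimates reoffence rates as frequencies from whatever historical records it is given, so any skew in those records, whatever its cause, comes out the other end as a prediction:

from collections import defaultdict

def train(records):
    """Learn P(reoffend | group) as plain frequencies from past cases."""
    counts = defaultdict(lambda: [0, 0])   # group -> [reoffended, total]
    for group, reoffended in records:
        counts[group][0] += int(reoffended)
        counts[group][1] += 1
    return {g: r / t for g, (r, t) in counts.items()}

# Hypothetical training history in which group "X" is recorded as
# reoffending more often, for whatever underlying reason.
history = ([("X", True)] * 45 + [("X", False)] * 55
         + [("Y", True)] * 24 + [("Y", False)] * 76)

model = train(history)
print(model)  # {'X': 0.45, 'Y': 0.24}: the disparity is learned, not coded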
 
You guys know that a program just does the calculations, right? If you add a 30% chance to reoffend to the program whenever the person is black, that is not the AI being racist, just the programmer.
It's possible neither is racist, if the data is factually accurate.
 
You guys know that a program just does the calculations, right? If you add a 30% chance to reoffend to the program whenever the person is black, that is not the AI being racist, just the programmer.

The programmer isn't racist; your head is just in the sand.
 
Dump the thread; nobody has sufficient knowledge to contribute anything of value.
 
liberals tried to teach AI "New Math" and the only answers it could compute were "WTF"
 
Dump the thread; nobody has sufficient knowledge to contribute anything of value.

We can see that the basis of their complaint is that it predicts different results for whites and blacks. So this horribly racist AI has managed to predict that whites and blacks aren't 100% equal. I think we have enough information to see what's happening: feelings are hurt, so the anti-science left feels science/technology needs to suffer.
 