
How AI Could Reinforce Biases In The Criminal Justice System


Increasingly, algorithms and machine learning are being implemented at various touch points throughout the criminal justice system, from deciding where to deploy police officers to aiding in bail and sentencing decisions. The question is, will this tech make the system more fair for minorities and low-income residents, or will it simply amplify our human biases?

We all know humans are imperfect. We’re subject to biases and stereotypes, and when these come into play in the criminal justice system, the most disadvantaged communities end up suffering. It’s easy to imagine that there’s a better way, that one day we’ll find a tool that can make neutral, dispassionate decisions about policing and punishment.

Some think that day has already arrived.

Around the country, police departments and courtrooms are turning to artificial intelligence algorithms to help them decide everything from where to deploy police officers to whether to release defendants on bail.

Supporters believe that the technology will lead to increased objectivity, ultimately creating safer communities. Others, however, say that the data fed into these algorithms is encoded with human bias, meaning the tech will simply reinforce historical disparities.
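
To make the feedback-loop concern concrete, here is a minimal toy sketch, not a description of PredPol or any real department's system: the district names, rates, and patrol-allocation rule below are illustrative assumptions. It simulates two districts with the same underlying crime rate, where patrols are sent wherever the most incidents have been recorded and incidents are only recorded when a patrol is there to see them.

```python
import random

# Toy feedback-loop sketch. The district names, rates, and allocation rule are
# illustrative assumptions only, not a description of any real system.
TRUE_CRIME_RATE = 0.3          # identical underlying rate in both districts
recorded = {"A": 50, "B": 10}  # historical records: district A was policed more heavily
PATROLS_PER_DAY = 10

random.seed(0)

def allocate_patrols(records, total_patrols):
    """Send patrols in proportion to recorded incidents (the 'prediction')."""
    total_records = sum(records.values())
    return {d: round(total_patrols * n / total_records) for d, n in records.items()}

for _day in range(365):
    patrols = allocate_patrols(recorded, PATROLS_PER_DAY)
    for district, n_patrols in patrols.items():
        # An incident is only *recorded* when a patrol is present to observe it,
        # so more patrols mean more records, regardless of the true rate.
        for _ in range(n_patrols):
            if random.random() < TRUE_CRIME_RATE:
                recorded[district] += 1

print(recorded)  # district A accumulates far more records than B,
                 # even though both districts offend at the same true rate
```

The point of the sketch is the direction of the loop: whichever district starts with more records keeps receiving more patrols and therefore keeps generating more records, which is the "reinforcing historical disparities" concern described above.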

Learn more about the ways in which communities, police officers, and judges across the U.S. are using these algorithms to make decisions about public safety and people’s lives.



36 thoughts on “How AI Could Reinforce Biases In The Criminal Justice System”

  1. "From that time Jesus began to preach, and to say, Repent: for the kingdom of heaven is at hand." Matthew 4:17

    "Ye have heard that it hath been said, An eye for an eye, and a tooth for a tooth: But I say unto you, That ye resist not evil: but whosoever shall smite thee on thy right cheek, turn to him the other also." Matthew 5:38-39▪︎

  2. This sounds like a computer program that does the exact same thing that law enforcement has been doing forever: generating crime maps. If you have armed robberies mostly in one part of town, naturally you are going to focus patrols in that area. That will reduce the number of armed robberies in that area. It doesn't take a million-dollar computer program; it takes someone compiling the crime reports made to law enforcement. I have been in shift briefings where patrols are focused in specific areas because of the crimes being reported in those areas.

  3. THIS IS THE TYRANNY THAT PREVAILS WHEN THE ELECTED DO NOT FEAR THE SHEEPLE. THE SHERIFFS, DAs, AND MAYORS ARE PAID OFF BY THE OLIGARCHY TERRORISTS. THEY MUST BE PLACED IN THE SPOTLIGHT & FIRED. THAT'S THE ONLY PREDICTIVE OUTCOME.

  4. ANY SYSTEM BEING PUSHED BY THE FED GOVERNMENT MEANS THE OLIGARCHY TERRORIST LOBBYISTS HAVE PAID OFF THE DC BANDITS TO FORCE STATES TO BUY THEIR PRODUCTS THAT WILL CONTINUE TO ENSLAVE.

  5. They have to have funding to expand and grow this program. If they don't have a quota, they will create one. This is ground level for full-blown authoritarian rule.

  6. Love MSNBC: “Arresting minorities at a higher rate can only mean racial bias; AI dramatically reducing crime after its use doesn’t mean AI is effective.”

  7. Everyone is biased in one way or another; you cannot actually create something completely unbiased, you can only try to minimise it as much as possible. (IMO)

  8. They are trying to put people into a box (classifying minority people) and predict actions with OLD data… are they nuts? They should look into Facebook posts/personal conversations; that would be the gold data to look for!

  9. The data the algorithms use comes from encounters with biased police. That means the algorithm will likely tell them to go where they've been going.

  10. Putting a bunch of liberal academics with no military or law enforcement experience in charge of determining law enforcement procedure seems like a really good idea to me. I mean, statistics are clearly just as racist as putting more police on the streets of "certain high crime areas." What we really need are more cops in low crime areas so they can write tickets for people going 26 in a 25 zone and give the underprivileged a fair chance to sell dope, rob and steal, thus helping them climb the economic ladder.

  11. If you're against this, then you should also be against climate change theory.
    It's the same technology used in both.

  12. Hahaha, the data shows that burglaries dropped from 1,600 to 600 and this idiot is saying it has nothing to do with PredPol. What else could POSSIBLY explain a drop of 62.5% in burglaries, you fool?

  13. This will be abused; almost every police dept has corruption issues. Precrime will be a thing. You won't even have to be doing a crime. Just thinking about it.

  14. Assholes. If you train on a dataset formed by predatory policing on minorities, that’s what the AI will predict. You wanna have an unbiased AI? Set an example and build an unbiased database.
