Artificial Intelligence Can Oppress the Poor and People of Color

When we think of Artificial Intelligence, we often imagine intelligent robots that act and think like humans: the walking, thinking, feeling machines we see in the movies. That kind of robot is so far off in the future that we often don't recognize the AI already all around us, or the effects it's having on our lives. Courts, search engines, stores, and advertisers all use Artificial Intelligence to make decisions about our behavior: to sell us products, but also to set bail or send us to prison. We look at one kind of decision made by AI, called a risk assessment, and why it has had such an impact on the poor and people of color.

We also hear how community organizers on Skid Row fought back against the Los Angeles Police Department's use of artificial intelligence.

Featuring:

  • Joshua Kroll, Computer Scientist at the UC Berkeley School of Information
  • Jamie Garcia, Stop LAPD Spying Coalition

Music:

  • Fruitcake Hotel #5 — computer generated music (freesound.org)
  • Kaumodaki — Space Bridge Loop (freesound.org)
  • Gis Sweden — Python Winsound 9 Frequences (freesound.org)
  • Cabled Mess — Filtered Note 08-01 (freesound.org)
  • Frankum — Reward Music Track, Ambiance Guitar
  • Podington Bear — Starling
  • Deef — Nostalgia of an Ex Gangsta Rapper
  • Emily Howell — From Darkness Light (album) — Number 3 Prelude (track)