
Cathy O'Neil on Pernicious Machine Learning Algorithms and How to Audit Them


In this week's podcast, InfoQ's editor-in-chief Charles Humble talks to data scientist Cathy O'Neil. O'Neil is the author of the blog mathbabe.org, a former Director of the Lede Program in Data Practices at the Tow Center, Columbia University Graduate School of Journalism, and has worked as a Data Science Consultant at Johnson Research Labs. She earned a mathematics Ph.D. from Harvard University. Topics discussed include her book “Weapons of Math Destruction,” predictive policing models, the teacher value-added model, approaches to auditing algorithms, and whether government regulation of the field is needed.

Key Takeaways

  • There is a class of pernicious big data algorithms that are increasingly controlling society but are not open to scrutiny.
  • Flawed data can result in an algorithm that is, for instance, racist or sexist. For example, the arrest data fed into predictive policing models reflects racially biased policing. Yet people tend to be overly trusting of algorithms because they are mathematical.
  • Data scientists have to make ethical decisions even if they don’t acknowledge it. Often problems stem from an abdication of responsibility.
  • Algorithm auditing is still a very young field, with ongoing academic research exploring possible approaches.
  • Government regulation of the industry may well be required.

Weapons of math destruction

  • 0m:44s - The central thesis of the book is that whilst not all algorithms are bad, there is a class of pernicious big data algorithms that are increasingly controlling society.
  • 1m:22s - The classes of algorithm that O'Neil is concerned about - the weapons of math destruction - have three characteristics: they are widespread and affect important decisions, such as whether someone can go to college or get a job; they are somehow secret, so that the people being targeted don’t know they are being scored or don’t understand how their score is computed; and they are destructive - they ruin lives.
  • 2m:45s - These characteristics undermine the original intention of the algorithm, which is often to solve big societal problems with the help of data.

The connection between Big Data Algorithms and the rule of law

  • 4m:04s - Algorithms used in the criminal justice system - in policing, sentencing, and parole - play a part in the rule of law, but they are proprietary and not open to scrutiny.
  • 5m:00s - These algorithms are effectively digital laws, and we should have the same constitutional protections against them as we do with laws.

Predictive policing models

  • 5m:21s - Chicago and a number of other police forces use predictive policing models, and the evidence that they work as intended is patchy.
  • 6m:35s - Predictive policing algorithms all look for patterns in historical crime data, which creates a feedback loop for over-policed - predominantly black - neighbourhoods: more patrols produce more recorded crime, which attracts more patrols (see the sketch after this list).
  • 7m:24s - The data being fed into the algorithms is racist and biased, but for some reason people imagine that since it’s mathematical it must be objective.
  • 8m:01s - Flawed data can result in an algorithm that is racist and sexist. To fix the problem we have to stop including nuisance crimes in the data.
  • 8m:42s - If the police had gone to Wall Street in 2008 then these same predictive policing algorithms would be telling the police to go back to Wall Street.
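
Below is a minimal, self-contained simulation of this feedback loop - a sketch with entirely hypothetical numbers, not a model of any real police department. Two neighbourhoods have identical true crime rates, but patrols are allocated in proportion to past recorded arrests, so a small initial bias compounds over time.

    import random

    # Hypothetical setup: both neighbourhoods have the SAME true crime rate,
    # but "A" starts with slightly more recorded arrests (a biased history).
    random.seed(42)
    true_crime_rate = 0.1
    arrests = {"A": 12, "B": 10}
    patrols_per_day = 20

    for day in range(365):
        total = sum(arrests.values())
        for hood in arrests:
            # "Predictive" allocation: patrol where past arrests were recorded.
            patrols = round(patrols_per_day * arrests[hood] / total)
            # More patrols -> more crime observed -> more arrests recorded.
            for _ in range(patrols):
                if random.random() < true_crime_rate:
                    arrests[hood] += 1

    print(arrests)  # "A" pulls well ahead of "B" despite equal crime rates

The model never "learns" that the neighbourhoods are identical; it only ever sees its own arrest records, which is exactly the loop O'Neil describes.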

Approaches to dealing with this kind of problem

  • 9m:39s - One approach is de-correlating the input variables to a model so that they are not statistically correlated with a protected attribute such as race (a sketch of the idea follows this list), but on its own it’s not enough.
  • 11m:39s - Algorithms used to sort resumes for white-collar jobs look at historical patterns. Hypothetically, if, say, Fox News used such an algorithm, you might see female applicants being filtered out because women weren’t historically as successful at Fox News.
  • 13m:19s - If you automate an imperfect system you are just repeating your past mistakes.
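
Here is a minimal sketch of the de-correlation idea mentioned above, on synthetic data: regress a model input on a protected attribute and keep only the residual, so the cleaned feature is no longer linearly correlated with that attribute. The feature and numbers are hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data: a hypothetical input feature strongly correlated
    # with a protected attribute (here a binary group label).
    protected = rng.integers(0, 2, size=1000).astype(float)
    feature = 2.0 * protected + rng.normal(size=1000)

    # Fit feature ~ a * protected + b, then keep only the residual.
    X = np.column_stack([protected, np.ones_like(protected)])
    coef, *_ = np.linalg.lstsq(X, feature, rcond=None)
    residual = feature - X @ coef

    print(np.corrcoef(protected, feature)[0, 1])   # ~0.7 before
    print(np.corrcoef(protected, residual)[0, 1])  # ~0.0 after

As O'Neil notes, this is not enough by itself: it removes only the linear relationship, and other inputs (or combinations of them) can still act as proxies.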

Trust in algorithms

  • 13m:46s - As O'Neil puts it in an article for Slate, “people have too much trust [that] numbers [will] be intrinsically objective.”
  • 14m:23s - If you are given a bad score, people somehow automatically assume it’s your responsibility - they stop thinking about causation.
  • 14m:40s - People are also often afraid of mathematics, and so they just assume it’s objective.

Comparing data science and computer science

  • 15m:27s - Data scientists aren’t paid to think about these kinds of problems.
  • 15m:57s - Comparing two types of problems - one computer science, one data science.
  • 16m:22s - A computer scientist knows what they want to end up with.
  • 16m:43s - A data scientist might be asked to figure out who is going to book a hotel room for three days. The problem is not as well defined.
  • 18m:47s - Data scientists have to make ethical decisions even if they don’t acknowledge it.
  • 19m:03s - As an example, O'Neil talked to someone who builds recidivism risk models for a state prison system. He doesn't use race, but he does use zip code, and zip code is a proxy for race (see the sketch after this list).
  • 19m:47s - No one within the state government tells him what the rules are. He doesn’t see how these things are used as his responsibility, because he’s just responsible for the technical side of it.
  • 20m:48s - One of the conditions that can lead to the creation of a weapon of math destruction is an abdication of responsibility.
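
A minimal synthetic sketch of that proxy effect: the model below never sees race, only zip code, yet its scores split along racial lines because segregated housing makes zip code predictive of race. All numbers are hypothetical.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000

    race = rng.integers(0, 2, size=n)  # hidden from the "race-blind" model
    # Segregated housing: most people live in the zip dominated by their group.
    zipcode = np.where(rng.random(n) < 0.9, race, 1 - race)

    # Score built only from zip-level arrest rates, which reflect
    # historically heavier policing of zip 1.
    score = np.where(zipcode == 1, 0.20, 0.05)

    print(score[race == 0].mean())  # ~0.065
    print(score[race == 1].mean())  # ~0.185 - disparate without ever using race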

The teacher value added model

  • 21m:22s - The intention is to find bad teachers and fire them, but the method of finding them is statistically weak.
  • 21m:56s - It's almost entirely a random number generator. O'Neil talked to a teacher who scored 6/100 one year and 96/100 the following year, despite having taught for 26 years without changing the way he taught. (The sketch after this list shows why such swings are expected when noise dominates the score.)
  • 22m:40s - Another teacher was fired for getting a bad score even though her principal loved her.
  • 23m:00s - The resulting destructive feedback loop is contributing to a nationwide teacher shortage.
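
A minimal sketch of the "random number generator" claim, with hypothetical parameters: if a published score is mostly noise around a stable underlying ability, two consecutive years of scores for the same teachers are barely correlated, and 6-to-96 swings stop being surprising.

    import numpy as np

    rng = np.random.default_rng(7)
    n_teachers = 1000

    ability = rng.normal(50, 5, n_teachers)  # stable true quality
    noise_sd = 25                            # measurement noise dwarfs the signal

    year1 = np.clip(ability + rng.normal(0, noise_sd, n_teachers), 0, 100)
    year2 = np.clip(ability + rng.normal(0, noise_sd, n_teachers), 0, 100)

    # Roughly 0.04: almost no year-to-year stability in the published score.
    print(np.corrcoef(year1, year2)[0, 1])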

Approaches to auditing algorithms

  • 24m:22s - It's a very new field and we don’t have all the answers.
  • 24m:40s - You have to be able to refer to some sort of ground truth. Is there an expensive way you can work out who is a good teacher and who is not? Suppose we don’t use the inexpensive, data-driven version until it agrees with the expensive one?
  • 25m:32s - There have been some comparisons done between the expensive qualitative approaches to assessing teachers and the value-added algorithm, and they don’t agree at all - the correlation is only around 24% (see the sketch after this list).
  • 25m:52s - For something like Google Search, it’s so massive you couldn’t really audit all of it.
  • 26m:24s - Latanya Sweeney has done some of this: she Googled her own African-American name and was served an ad asking if she would like to look at the police records of Latanya Sweeney. What she found was that black-sounding names get arrest-record ads, and white-sounding names do not.
  • 27m:09s - You will get Googled when you apply for a job, so this deserves auditing.
  • 27m:34s - Auditing Google search will be a series of case studies.
  • 27m:56s - The racism here isn’t intentional - it’s a product of our society - but it does have influence so Google does have to take responsibility.
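
A minimal synthetic sketch of the ground-truth audit described above: score the same teachers with an expensive qualitative assessment and with a cheap, noisy data-driven model, then measure agreement. The noise level here is chosen so the correlation lands near the ~24% figure cited in the podcast; all data is made up.

    import numpy as np

    rng = np.random.default_rng(3)
    n = 5000

    ability = rng.normal(size=n)
    expensive = ability                              # careful qualitative review
    cheap = ability + rng.normal(scale=4.0, size=n)  # noisy algorithmic score

    r = np.corrcoef(expensive, cheap)[0, 1]
    print(f"agreement: r = {r:.2f}")  # ~0.24 - too low to trust the cheap score

By O'Neil's standard, the cheap score should not be deployed until this agreement is high.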

Government regulation vs. self-regulation

  • 28m:43s - Government regulation vs. self-regulation: Google and others are working on a self-regulatory framework.
  • 29m:30s - A public debate is needed, because the universe of ethical concerns these companies consider isn’t wide enough.




Community comments

  • Interesting - but a liberal slant.

    by Will Stevens,


    Interesting - but a liberal slant on all of this. Dah... dah... slam Fox News and Wall Street and proprietary algorithms targeting black neighborhoods. Typical of a liberal. Everybody is the enemy, anyone not a democrat or liberal. Such a tiresome argument that no one falls for anymore.

    Too bad Cathy O'Neil mixes math and science with politics. I lose interest in her and her research if she can't stick to the facts and not single out Fox, Wall Street, or algorithms trying to help the police force fight crime.

    I hope InfoQ does not continually permit this type of content or I will be unsubscribing.
