
Amazon built an AI to hire people, but reportedly had to shut it down because it was discriminating

I doubt the program was designed to check if the applicant had the correct plumbing, before any decision was actually made.

Besides this is not the 1930s, the world doesn't work like that anymore.


The AI was given a stack of resumes to learn from. Those resumes were primarily from males, so the AI's definition of a good resume was skewed toward typically male activities. Resumes heavy on activities women were more likely to do didn't fit what the AI had decided constituted a good resume.

It had nothing to do with whether the women actually had good resumes. It was all about the bias of the male-oriented data the AI was given to use to form its standards.
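For the programming-minded, here's a toy sketch of that mechanism. This is a made-up illustration, not Amazon's actual (unpublished) model: a word-weight scorer trained on a mostly-male hiring history ends up penalizing a word like "women's" without anyone writing gender into the code.

```python
# Hypothetical toy example: a resume scorer that learns word weights from
# historical hiring decisions. The history is mostly male resumes, so words
# that only appear on the few female resumes pick up penalties that have
# nothing to do with skill.

from collections import Counter

# (resume words, was hired) pairs -- invented training data, mostly male.
history = [
    ({"java", "football", "captain"}, True),
    ({"java", "chess"}, True),
    ({"python", "football"}, True),
    ({"python", "chess", "captain"}, True),
    ({"java"}, False),
    ({"python", "women's", "chess"}, False),  # one of the few female resumes
]

hired = Counter()
seen = Counter()
for words, was_hired in history:
    for w in words:
        seen[w] += 1
        if was_hired:
            hired[w] += 1

def word_weight(w):
    # Fraction of past resumes containing this word that led to a hire;
    # unseen words are treated as neutral.
    return hired[w] / seen[w] if seen[w] else 0.5

def score(words):
    # Average weight of the resume's words.
    return sum(word_weight(w) for w in words) / len(words)

# Two identical resumes, except the second mentions a "women's" club.
a = score({"python", "chess", "captain"})
b = score({"python", "chess", "captain", "women's"})
print(a > b)  # True: the "women's" token alone drags the score down
```

The scorer never sees a gender field; it only counts word co-occurrence with past hiring outcomes. Because "women's" appeared only on a rejected resume in the skewed history, it gets weight 0, and any resume containing it scores lower.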
 

The ethic of the work in question has no gender.

There is no bias here; the system was trying to pick the best worker overall. It did not consider gender in the first place.
 


Do you really not get it?


That's a rhetorical question. Either you really don't get it, in spite of many explanations, or you do get it but are willfully acting obtuse. Either way, I hope you're not in HR. And I'll leave it there.
 

I'm a psychologist; working with people is my job.

Though I did take computer courses in high school and college, neither those courses, nor any information I have learned, nor anything I've experienced has shown your presumption of what transpired here to be true. Amazon having to deal with this issue is just another obtuse run at identity politics, and for someone to claim that a computer program is sexist, racist, or even biased in general is a rather horrible leap in logic to take.

I would willingly accept that the programmer themselves could be guilty of such bias, yet we have no evidence of that being the case. A computer is a logic engine: given the necessary question, it will take the simplest routes, find the best solutions to current problems, and minimize the waste of resources. Even in the military we have programs that will suggest combat parameters even if they cost civilian lives, doing so simply because they see that course as the most logical.

Now you can actually try to debate this, or you can keep throwing slight insults at me for the rest of the night.
The choice is up to you.
 

The rest of the night? As if I would waste the effort.

The choice was already made. That's what "I'll leave it there" means. Goodbye.
 

Then why go through all of that, if you're just going to surrender?
 