When human biases are built into artificial intelligence, who are the victims?

Today, artificial intelligence has become an indispensable part of human life. But what happens if human biases are built into these machines' systems? Technologist Kriti Sharma explores how the lack of diversity in the tech industry affects artificial intelligence, and proposes three ways to fix it.

Full transcript (from the TED website)

How many decisions have been made about you today, or this week or this year, by artificial intelligence? I build AI for a living so, full disclosure, I'm kind of a nerd. And because I'm kind of a nerd, whenever a new news story comes out about artificial intelligence stealing all our jobs, or robots getting citizenship of an actual country, I'm the person my friends and followers message, freaking out about the future.

We see this everywhere. This media panic that our robot overlords are taking over. We could blame Hollywood for that. But in reality, that's not the problem we should be focusing on. There is a more pressing danger, a bigger risk with AI, that we need to fix first. So we are back to this question: How many decisions have been made about you today by AI? And how many of these were based on your gender, your race or your background?

Algorithms are being used all the time to make decisions about who we are and what we want. Some of the women in this room will know what I'm talking about if you've been made to sit through those pregnancy test adverts on YouTube like 1,000 times. Or you've scrolled past adverts of fertility clinics on your Facebook feed. Or in my case, Indian marriage bureaus.

But AI isn't just being used to make decisions about what products we want to buy or which show we want to binge watch next. I wonder how you'd feel about someone who thought things like this: "A black or Latino person is less likely than a white person to pay off their loan on time." "A person called John makes a better programmer than a person called Mary." "A black man is more likely to be a repeat offender than a white man." You're probably thinking, "Wow, that sounds like a pretty sexist, racist person," right? These are some real decisions that AI has made very recently, based on the biases it has learned from us, from the humans. AI is being used to help decide whether or not you get that job interview; how much you pay for your car insurance; how good your credit score is; and even what rating you get in your annual performance review. But these decisions are all being filtered through its assumptions about our identity, our race, our gender, our age. How is that happening?

Now, imagine an AI is helping a hiring manager find the next tech leader in the company. So far, the manager has been hiring mostly men. So the AI learns men are more likely to be programmers than women. And it's a very short leap from there to: men make better programmers than women. We have reinforced our own bias into the AI. And now, it's screening out female candidates. Hang on, if a human hiring manager did that, we'd be outraged, we wouldn't allow it. This kind of gender discrimination is not OK. And yet somehow, AI has become above the law, because a machine made the decision. That's not it.
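The feedback loop described above can be made concrete with a toy sketch. Everything here is hypothetical: made-up historical records and a deliberately naive scoring rule, not any real hiring system. The point is simply that a model fit to biased past decisions reproduces that bias.

```python
# Hypothetical historical hiring records: (gender, was_hired).
# The imbalance mirrors the talk's scenario -- the manager has
# mostly hired men, so the data itself encodes the bias.
history = [("M", True)] * 8 + [("M", False)] * 2 + \
          [("F", True)] * 1 + [("F", False)] * 4

def hire_rate(gender):
    """Naive 'model': score a candidate by the historical hire
    rate for their gender. Bias in, bias out."""
    outcomes = [hired for g, hired in history if g == gender]
    return sum(outcomes) / len(outcomes)

# The model now ranks any male candidate above any equally
# qualified female candidate, purely from past decisions.
print(round(hire_rate("M"), 2))  # 0.8
print(round(hire_rate("F"), 2))  # 0.2
```

No explicit rule ever said "prefer men"; the preference emerged entirely from the skewed training data, which is exactly the "very short leap" the talk warns about.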

We are also reinforcing our bias in how we interact with AI. How often do you use a voice assistant like Siri, Alexa or even Cortana? They all have two things in common: one, they can never get my name right, and second, they are all female. They are designed to be our obedient servants, turning your lights on and off, ordering your shopping. You get male AIs too, but they tend to be more high-powered, like IBM Watson, making business decisions, Salesforce Einstein or ROSS, the robot lawyer. So poor robots, even they suffer from sexism in the workplace.

Think about how these two things combine and affect a kid growing up in today's world around AI. So they're doing some research for a school project and they Google images of CEO. The algorithm shows them results of mostly men. And now, they Google personal assistant. As you can guess, it shows them mostly females. And then they want to put on some music, and maybe order some food, and now, they are barking orders at an obedient female voice assistant. Some of our brightest minds are creating this technology today. Technology that they could have created in any way they wanted. And yet, they have chosen to create it in the style of a 1950s "Mad Men" secretary. Yay!

But OK, don't worry, this is not going to end with me telling you that we are all heading towards sexist, racist machines running the world. The good news about AI is that it is entirely within our control. We get to teach the right values, the right ethics to AI. So there are three things we can do. One, we can be aware of our own biases and the bias in machines around us. Two, we can make sure that diverse teams are building this technology. And three, we have to give it diverse experiences to learn from. I can talk about the first two from personal experience. When you work in technology and you don't look like a Mark Zuckerberg or Elon Musk, your life is a little bit difficult, your ability gets questioned.

Here's just one example. Like most developers, I often join online tech forums and share my knowledge to help others. And I've found, when I log on as myself, with my own photo, my own name, I tend to get questions or comments like this: "What makes you think you're qualified to talk about AI?" "What makes you think you know about machine learning?" So, as you do, I made a new profile, and this time, instead of my own picture, I chose a cat with a jet pack on it. And I chose a name that did not reveal my gender. You can probably guess where this is going, right? So, this time, I didn't get any of those patronizing comments about my ability and I was able to actually get some work done. And it sucks, guys. I've been building robots since I was 15, I have a few degrees in computer science, and yet, I had to hide my gender in order for my work to be taken seriously.

So, what's going on here? Are men just better at technology than women? Another study found that when women coders on one platform hid their gender, like myself, their code was accepted four percent more often than men's. So this is not about the talent. This is about an elitism in AI that says a programmer needs to look like a certain person. What we really need to do to make AI better is bring in people from all kinds of backgrounds. We need people who can write and tell stories to help us create personalities of AI. We need people who can solve problems. We need people who face different challenges and we need people who can tell us what are the real issues that need fixing and help us find ways that technology can actually fix it. Because, when people from diverse backgrounds come together, when we build things in the right way, the possibilities are limitless.

And that's what I want to end by talking to you about. Less racist robots, less machines that are going to take our jobs -- and more about what technology can actually achieve. So, yes, some of the energy in the world of AI, in the world of technology is going to be about what ads you see on your stream. But a lot of it is going towards making the world so much better. Think about a pregnant woman in the Democratic Republic of Congo, who has to walk 17 hours to her nearest rural prenatal clinic to get a checkup. What if she could get a diagnosis on her phone, instead? Or think about what AI could do for those one in three women in South Africa who face domestic violence. If it wasn't safe to talk out loud, they could get an AI service to raise the alarm, get financial and legal advice. These are all real examples of projects that people, including myself, are working on right now, using AI.

So, I'm sure in the next couple of days there will be yet another news story about the existential risk, robots taking over and coming for your jobs.

And when something like that happens, I know I'll get the same messages worrying about the future. But I feel incredibly positive about this technology. This is our chance to remake the world into a much more equal place. But to do that, we need to build it the right way from the get go. We need people of different genders, races, sexualities and backgrounds. We need women to be the makers and not just the machines who do the makers' bidding. We need to think very carefully what we teach machines, what data we give them, so they don't just repeat our own past mistakes. So I hope I leave you thinking about two things. First, I hope you leave thinking about bias today. And that the next time you scroll past an advert that assumes you are interested in fertility clinics or online betting websites, that you think and remember that the same technology is assuming that a black man will reoffend. Or that a woman is more likely to be a personal assistant than a CEO. And I hope that reminds you that we need to do something about it.

And second, I hope you think about the fact that you don't need to look a certain way or have a certain background in engineering or technology to create AI, which is going to be a phenomenal force for our future. You don't need to look like a Mark Zuckerberg, you can look like me. And it is up to all of us in this room to convince the governments and the corporations to build AI technology for everyone, including the edge cases. And for us all to get education about this phenomenal technology in the future. Because if we do that, then we've only just scratched the surface of what we can achieve with AI.

Thank you.
