tv Going Underground RT October 18, 2021 5:30pm-6:01pm EDT
5:30 pm
it's 10 years since DNC supporter and then-chairman of Google, Eric Schmidt, went to see Julian Assange of WikiLeaks, who is now awaiting next week's court hearing in his long battle for freedom. You can watch our interview with Julian Assange, and his radically different perspective on technology compared to Google's, on our YouTube channel. Now, Dr. Timnit Gebru, a former Google employee and an expert in artificial intelligence who spoke out against AI bias, has been calling for stronger whistleblower protections against retaliation from big tech companies. She joins me now from California. Timnit, thank you so much for coming on. I think we all have you to thank for certain features on our iPads and on Google, so if we use those, your work has affected all our lives in a way. What is ethical AI? Presumably different to Squid Game; I haven't seen all of that series, but everybody's talking about it. What is ethical, and unethical, AI? I consider it to be
5:31 pm
a field that tries to ensure that when we work on technology, we're working on it with foresight, trying to understand the potential negative societal impacts and minimize those, and trying to work on something that's actually beneficial for humanity. And you started by asking what is unethical AI; there's lots of it, I think. So, for example, as part of my work, my collaboration with Joy Buolamwini showed that a lot of APIs that sell automated facial analysis tools had much higher error rates for darker-skinned women than for lighter-skinned men. This is what's referred to as facial recognition, yeah? Yeah. And that spurred a lot of movement, because a lot of people had been worried about surveillance-related technology anyway.
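What Dr. Gebru is describing is a disaggregated evaluation: instead of one overall accuracy figure, the audit computes error rates separately for each demographic subgroup, which is how the gap between darker-skinned women and lighter-skinned men becomes visible. A minimal Python sketch of that idea follows, with invented records standing in for the commercial-API outputs the real Gender Shades study collected:

```python
from collections import defaultdict

# Invented audit records for illustration: (subgroup, true_label, predicted_label).
# The real audit ran commercial facial-analysis APIs over a benchmark
# balanced across skin type and gender.
records = [
    ("darker_female", "female", "male"),
    ("darker_female", "female", "male"),
    ("darker_female", "female", "female"),
    ("lighter_male", "male", "male"),
    ("lighter_male", "male", "male"),
    ("lighter_male", "male", "male"),
]

def error_rates_by_subgroup(records):
    """Misclassification rate per subgroup, rather than one overall score."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for subgroup, truth, prediction in records:
        totals[subgroup] += 1
        if truth != prediction:
            errors[subgroup] += 1
    return {group: errors[group] / totals[group] for group in totals}

for group, rate in sorted(error_rates_by_subgroup(records).items()):
    print(f"{group:>14}: {rate:.0%} errors")
```

Aggregated over all six toy records the error rate is 33%, which sounds tolerable; disaggregated, it is 67% for one subgroup and 0% for the other, which is the pattern the audit exposed at commercial scale.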
5:32 pm
It spurred moves to ban the use of some of these technologies by law enforcement, because they're mostly used to suppress dissent and to surveil marginalized communities. And so for me, that's not ethical AI; that is unethical AI. Right, but even for capitalism it's not that useful; for law enforcement, it's not useful to have AI that wrongly identifies individuals on the basis of their skin color. And I think Amazon, the company, did initially fight against the criticism of US law enforcement's use of it. They backed down, presumably, because, well, you know, no one wants that. I don't think they saw the light. I think what happened is that there were worldwide Black Lives Matter protests, and it looked bad for them to continue using it amid those worldwide protests. So they said that they were going to have
5:33 pm
a moratorium on the use of this technology until there was a federal law. I think you're right, yeah. I have a statement here. They said, quote, this research paper, and I'm assuming they called the article this, is "misleading" and draws "false conclusions": "the research used an outdated version of Amazon Rekognition; we have made a significant set of improvements." I mean, what do you think? Why do they fight back so strongly against disinterested scholarly research? This isn't you being an activist for Black Lives Matter; this is you analyzing the algorithms. Yes. So this was actually my colleagues. Joy Buolamwini and I wrote a paper that preceded this one, and she and another Black woman, Deborah Raji, wrote this paper called Actionable Auditing. And when they put it out there, the very first thing that Amazon wanted to do was push back: a VP wrote a blog post trying to discredit them. And me and my colleague Meg
5:34 pm
Mitchell, you know, we were both later fired from Google, but we were at Google at the time, and we spent a whole week writing a point-by-point rebuttal, and then galvanized the academic community to come to their support. We had a petition that more than 70 academics, including Turing Award winners (that's kind of like the equivalent of the Nobel Prize for computer science), signed. So we came to their defense; we debunked, point by point, what they were trying to say. And then we also asked them to stop selling Rekognition to law enforcement. That was all over the news, and they weren't able to go further in their attempt to discredit them. But what you just saw there is very similar to what Google was trying to do to me right after I came out with my paper about large language models, because we were trying to show the harms of such technology. Well,
5:35 pm
let's get on to the large language models. And I have to say, these are massive companies that, some people say, own all our lives, so I'd better read their responses, because they have a lot of money. Before we go any further: Google says they didn't fire you at all. "Timnit wrote that if we didn't meet the demands, she would leave Google and work on an end date. We accept and respect her decision to resign from Google." So one of the gods of our society is saying no, you weren't fired. I think, you know, they had a quote-unquote non-apology of an apology after the backlash, because here they were just doubling down. And it's so clear I didn't resign; I can't even believe it. I don't know why they thought they could get away with saying that. Maybe they thought I'm so stupid that I would just feel like, oh, I guess I resigned. That's not how resignations work, right?
5:36 pm
With resignations, you have to submit paperwork; you have to say you're going to resign; there is a whole process. And a dispute: I mean, as I say, according to Google you resigned; according to you, you were fired. They just said that I resigned. What was that about, then? I'm not going to harp on it, but in the context of all of this, what exactly happened? Because, okay, again, you were working on scholarly research, as far as I understand it, about large language models. And people use Google Translate, love these Google translations, and that uses large language models. What could possibly be wrong with large language models? So in all of these incidents that you see, whenever you show a problem and it's inconvenient, it looks like the problem maybe is too big for what they want to admit. Or maybe it's not even that; maybe they really do think that the issues I'm raising are not that serious. One of the things they
5:37 pm
said is that my paper made it look like there were too many issues with large language models, right? They said it paints too stark a picture. And so they wanted me to retract this academic, peer-reviewed paper that was being published at an academic scientific conference, with no process whatsoever. It's one thing if your employer comes to you and says: we followed this particular process, one that you know about, to come to the conclusion that your paper should be retracted, and here's why; we can discuss it, let's have a conversation about what kind of process we used and why we want you to retract the paper. But absolutely not, right? This came way late, a long time after we had submitted our paper and gone through the internal approval processes. My manager resigned from Google, right, because he was the one who had approved it. Okay,
5:38 pm
well, if we go to the specifics of the large language model problems that you were investigating, maybe we can figure out what's so dangerous. I mean, I know you give an example of a Palestinian using words and then being wrongly interpreted; just to take a few. Yeah. So what we did was go through prior work: we surveyed a lot of prior works and added some initial analysis of our own of these large language models, what has been going wrong and what could go wrong if we focus on larger and larger language models. The first point that we started with was environmental and financial cost, right? These large language models consume a lot of compute power.
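This first objection is, at bottom, arithmetic: training cost scales with accelerator count, power draw and wall-clock time. A back-of-the-envelope sketch with purely illustrative figures (none of these numbers describe any real model), in the spirit of the energy estimates that line of work drew on:

```python
# All figures below are assumptions for illustration, not measurements.
num_accelerators = 512      # GPUs/TPUs used for one training run
power_kw_each = 0.3         # average draw per accelerator, in kW
training_days = 30
carbon_kg_per_kwh = 0.4     # assumed grid carbon intensity
usd_per_kwh = 0.10          # assumed electricity price

energy_kwh = num_accelerators * power_kw_each * training_days * 24
print(f"Energy used: {energy_kwh:,.0f} kWh")
print(f"CO2 emitted: {energy_kwh * carbon_kg_per_kwh / 1000:,.1f} tonnes")
print(f"Electricity: ${energy_kwh * usd_per_kwh:,.0f}")
```

Even these modest assumptions yield roughly 110,000 kWh for a single run, before hyperparameter searches and retraining, which is why only well-funded organizations can keep scaling.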
5:39 pm
So if you are working on larger and larger language models, only the people with these kinds of huge compute budgets are going to be able to use them. And that leads to what we talked about as environmental racism: the people who benefit from large language models are not the people who are paying their environmental and financial costs. And we give a bunch of examples there about different languages, right? It's always people in the dominant groups, whether between countries or within a specific country, who benefit from these language models. The second point we make is that there is this weird assumption that if you use the data on the internet, you'll somehow incorporate everyone's point of view, right? And those of us who've been harassed on social media know that that's not the case. But these large language models are trained with data from the internet, most of the time, like, the whole internet, even. And so we were talking about the dangers of doing that,
5:40 pm
the kinds of biases that you would encode, the kinds of hateful content that you would encode. We were saying that just because you have a large dataset, it doesn't mean that you are now incorporating everyone's point of view. So we give many examples of that. And then we give examples of what happens when these large language models, trained with data encoded with this kind of bias, are deployed. One of those examples was the one you mentioned about a Palestinian man: he wrote "good morning" on Facebook, and it was translated as "attack them". And people didn't even check to see what he had initially written; they saw the translation and they arrested him.
5:41 pm
That wasn't a Google Translate error; that was Facebook. But the underlying technology is the same: they all use large language models. So what we were saying was, sometimes when you have machine translation errors, you get cues, right? You can see that the grammar is not quite right; you can see that something is wrong. But with these large language models, you can have something that sounds so fluent and coherent, and it's completely wrong.
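Her point is that fluency is not an error signal, so a wrong translation like "good morning" becoming "attack them" gives the reader no cue. One crude mitigation sometimes used in practice is a round-trip check: translate the output back and compare it with the source. The sketch below uses a toy translate() that fabricates the failure mode; in real use it would wrap an actual MT service, and the check remains a heuristic, not a fix, since a model can be confidently wrong in both directions:

```python
from difflib import SequenceMatcher

def translate(text: str, source: str, target: str) -> str:
    """Toy stand-in for an MT API call; replace with a real service.

    The table fabricates the failure mode described above: a fluent
    output that has nothing to do with the input.
    """
    fake_mt = {
        ("good morning", "ar", "en"): "attack them",  # fluent but wrong
        ("attack them", "en", "ar"): "attack them",   # round-trips badly
    }
    return fake_mt.get((text.lower(), source, target), text)

def round_trip_is_suspicious(text: str, source: str, target: str,
                             threshold: float = 0.5) -> bool:
    """Flag a translation whose back-translation drifts far from the source."""
    forward = translate(text, source, target)
    back = translate(forward, target, source)
    similarity = SequenceMatcher(None, text.lower(), back.lower()).ratio()
    return similarity < threshold

print(round_trip_is_suspicious("good morning", "ar", "en"))  # True: flagged
```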
5:42 pm
But why would these companies, and surely there's no malice there, just a mistake in the algorithm, in the software engineering, why would they seek to minimize the publicity given to papers that showed these errors, so that they could be fixed? Well, I would disagree that there is no malice, because when you look at what happened with Joy and Deb, for example, it is two Black women talking very much about the impacts of Amazon's technology on the Black population. So you see a pattern here: it's me and other Black women and a bunch of other people on our team who are very much concerned with the impacts of these large language models on marginalized communities. And so if you're talking about something that maybe should not be used right now, that's directly going to impact their money; it's a money-making machine. I see what you mean there, but how can "good morning" translate into "attack them"? Why would the algorithm even work that out? Based on the dataset, and because of the domain the user came from. Because, you know, if you look at some languages; so, on Twitter,
5:43 pm
Twitter uses Google Translate, and I had an interview on BBC in Tigrinya, which is my mother tongue. Google Translate doesn't even support it; it's not among the languages that they offer translation for, but it uses the same alphabet as Amharic. And when people were sharing my interview, it just went haywire. It just said, you know: let's talk about the greedy people, greedy, greedy, greedy. There was nothing in my interview about greed; I talked about the military-industrial complex, the Horn of Africa and what's happening in Eritrea. Yeah, that's actually more disturbing than the first one. And it sounds to me,
5:44 pm
well, I'll stop you there: more from the former co-lead of Google's Ethical AI team and co-founder of Black in AI after this break.
5:45 pm
Welcome back. I'm still with Dr. Timnit Gebru, former co-lead of Google's Ethical AI team and co-founder of Black in AI. I mean, does that translation process, in effect, violate the First Amendment of the US Constitution? Because it's not allowing free speech, in effect. Oh, I've never thought of it that way at all. I don't know; I guess I'm not a legal scholar. But back to the question that you asked me about ethical AI:
5:46 pm
what is ethical AI? If you read the works of some critical scholars, they would say that the tech industry's strategy is to make it look like a purely technical issue, a purely algorithmic issue that needs to be fixed; as you said, a purely mathematical issue. It has nothing to do with antitrust laws, nothing to do with monopoly, nothing to do with labor issues or power dynamics; it's just purely this technical thing that we need to work out. And part of what we say in our paper, and many others say too, is that it is not necessarily this purely technical, algorithmic tweak that you want, right? We need regulation. And I think that's part of why all of these organizations and companies wanted to come down hard on me and a few other people, right? Because they think that what we're advocating for is not this simple algorithmic
5:47 pm
tweak; it's larger structural changes. So is this them realizing that the way to really answer the problem is, as you say, to do with the nature of monopoly power? Yes, I think so. There are so many whistleblowers now; since I first came forward, there's been one after another after another. I think the public is now starting to really understand that we need some sort of regulation. But what the companies would argue is exactly what you were saying earlier: oh, there is no malice; why would we want algorithms that don't work; that wouldn't be good for our business, so of course we will work really hard to fix our algorithms. So, for example, for Facebook, Mark Zuckerberg was saying: I don't know any tech company that sets out to create a product that makes people angry; all our advertisers don't want to advertise to angry people or next to bad content, they constantly tell us that, so why would we want to do that?
5:48 pm
That's the kind of argument that they're making, right? But what I'm saying is that a lot of the work in this space, interdisciplinary work, has shown that we need to look into more structural solutions to this problem. Now, of course, you were at Google for quite a while. Julian Assange, when he met Eric Schmidt, who had wanted to meet him, said, you know, Google may not be acting illegally in any of this. Presumably he was referring to the scientific element as opposed to the antitrust element; who knows, which is why the FBI don't investigate when you write a paper, I would assume. Did you come across a guy called Jared Cohen, CEO of Jigsaw? Back then it was called Google Ideas. Do you remember that? I didn't come across him, but I heard about him. I was at Google for two years, actually; two very long years. Every month I was thinking: can I survive here? Should I leave next month?
5:49 pm
And, I mean, again, a lot of people watching will think it must be amazing to work there: that amazing office and all of that, that's the future, and there are graduates watching who are desperately trying to get through the complicated application process to get in. And I should say Jared Cohen is backed by Joe Biden's Blinken, the Secretary of State, and the National Security Adviser, Jake Sullivan; they love Google Ideas, and Google is very much associated with Democrat royalty, as it were. Why was it no fun to work there? Well, again, it comes down to what your views are and the demographic that you're in. I was in a role that's called a research scientist role, and I was the first Black woman to be a research scientist at Google. And I saw why, right? I had so many issues right off the bat. I was under-leveled: people with much
5:50 pm
less experience than me were leveled way above me. There was a lot of just plain disrespect; there was a lot of harassment and all sorts of things. So it was very, very difficult for me even to concentrate on my job. And actually, when I'm talking about being fired, I'm pretty certain that it wasn't just that paper that got me fired, right? There were so many things that they didn't like that I was doing. I was speaking up about workplace issues, discrimination issues, and each time I spoke up, they weren't happy that I spoke up about it. So does that feed into the idea that there is something philosophically Ayn Rand-ish, a kind of right-wing supremacy, about it? Because obviously they say you resigned, and they deny any element of the workplace being a bad place to work in, let alone harassment or anything like that. Yeah. I mean, I don't know if you're aware of the Google walkout. Two months after I joined Google, there was
5:51 pm
a large walkout: 20,000 people demonstrated, walked out, because they were protesting against, you know, Andy Rubin leaving with $90,000,000 or something like that after, you know, he had harassed people, right. And so there were 20,000 people walking out in protest, and they pushed out two of the organizers, Claire and Meredith, after about a year or so. So they were purged. And then, a year before I got fired, they fired five people; there's an NLRB trial going on right now, a National Labor Relations Board trial, about these "fired five". And then, one year later, they fired me. And I had spoken up about all of those firings. Google denies wrongdoing in all of these things, and they're so powerful; I mean, you know, when you just mention these cases, obviously journalists will have to look them up, using Google. But as for diversity and identity politics,
5:52 pm
here they go: on the 9th of December 2020, the millionaire Sundar Pichai, saying: "It's incredibly important to me that our Black, women, and underrepresented Googlers know that we value you and you do belong at Google. We started a conversation together earlier this year when we announced a broad set of racial equity commitments to take a fresh look at all of our systems, from hiring and leveling to promotion and retention, and to address the need for leadership accountability. We are committed to continuing to make diversity, equity and inclusion part of everything we do, from how we build our products", and I presume that means some of the hardware, over-represented by people of color, who knows, "to how we build our workforce." You know, these statements coming from Google run completely against what you're saying. Yes. So maybe this was what you'd call the non-apology, and, you know, he had to apologize, because the other road they were going down, which was doubling down, saying that my work was subpar, saying that I told people to stop working on diversity and all of the other things they were
5:53 pm
saying, was creating more and more backlash each time they went down that road. So I'm sure he realized at some point that he had to do some sort of an apology. And if you read the apology closely, it was the kind of apology that says: I'm sorry for how you feel. There was a long-time, what is it called, a person who draws these cartoons, a cartoonist; there was a long-time cartoonist at Google who left after 14, 15 years because of what they did. And when they did it, you had thousands of people on your side; they signed a letter, maybe more than a thousand of them. I mean, I just want to ask you about, well, we were talking about Jared Cohen, and I don't know what you think about Jigsaw and why you think Jigsaw AI is kind of a competitor. Now, you're the co-founder of Black in
5:54 pm
AI. Tell me about what Black in AI is. Why is your AI going to be better than their AI, than Jigsaw's, certainly the innovation department of the Google conglomerate? Why is it going to be better? And I know you're going to make it more profitable, actually, because you're going to be more accurate, and you're going to get the CIA, the National Security Agency. So, the group Black in AI that I co-founded is a non-profit. It's a group for practitioners and researchers in AI: Black practitioners and researchers in AI from all over the world. So this is not a group that builds AI or anything like that. It's a group that builds community, does networking, and has mentorship programs for various things, like graduate mentorship programs, an entrepreneurship program,
5:55 pm
and we have workshops to raise the visibility of Black people in AI, et cetera. So this is different from what I'm doing right now. What I'm hoping to build right now is an interdisciplinary AI research team, and the goal is not to be an extremely profitable AI research institute, because I believe that if your number one goal is to maximize profit, then you're going to cut corners, and you're going to do things downstream that end up making you build AI that is; up against Section 172 of the 2006 Companies Act, a company's fiduciary duty to maximize profit. Well, we're non-profit, for the greater good. Yeah, the duty to maximize profit. Very quickly: what would you say to a whistleblower right now who's terrified, who works at one of these organizations, and even seeing what you're talking about feels more terrified and
5:56 pm
never wants to come clean with journalists or anyone else about the lack of ethics that they see in their daily workplace? Well, I would ask them to look at the handbook, the Tech Worker Handbook, that Ifeoma Ozoma, who is another whistleblower, just launched, like, last week. That handbook is meant to help people decide whether they want to come forward and what to expect. And I think it's a very personal thing for each person, because it's not a joke: you're going to have a lot of backlash. So you have to determine whether this is the right avenue for yourself. And so I understand the fear, because you get a lot of harassment, being thrust into that public space. But I do want to say, you were mentioning Eric Schmidt and Julian Assange, et cetera, before; Eric Schmidt in particular has a lot of influence within the US government right now. He created this national
5:57 pm
AI commission, or something like that. And he has this view that we are in a cold war with China and there's an AI race, et cetera. So I think that where we're going with this can be very, very dangerous if our voices, the whistleblowers' voices, are not heard. It's these people at the top who are having a lot of influence right now, and I think that it would make a huge difference for more people to speak up. But at the same time, we have to protect them; I can't go asking people to speak up without making sure that society at large will also protect them. Well, we invite, as we always do, Google board members and people who left for the government to come on. Dr. Timnit Gebru, thank you. Thank you. That's it for the show. We'll be back on Wednesday, 10 years to the day since the leader of Africa's richest country, Libya, was killed with the backing of the UK, the US and France. Until then, keep
5:58 pm
in touch via all our social media, and tell us if you think new protections are needed for whistleblowers.
5:59 pm
6:00 pm
Russia is to suspend its permanent mission to NATO from next month, in the most direct response yet to the military alliance recently kicking out eight Russian diplomats. Ready for action: Russia fills one section of the Nord Stream 2 pipeline with natural gas and awaits the green light from regulators to start supplying Europe; that's as an EU commissioner warns that energy poverty throughout the continent is on the rise. America's top envoy to Afghanistan tenders his resignation; it comes as the US Inspector General has reportedly launched an investigation into Biden's military withdrawal from the country. And tortured and jailed for 17 years without trial: we explore the case of a Pakistani,