tv Going Underground RT October 18, 2021 2:30am-3:01am EDT
2:30 am
affected all our lives in a way. What is ethical AI? Presumably different to the Squid Game; I haven't seen all of that series, but everybody's talking about it. What is it? Ethical AI, I consider it to be a field that tries to ensure that while we work on a technology, we're working on it with foresight, trying to understand what the potential negative societal impacts are, minimizing those, and trying to work on something that's actually beneficial for humanity. So you could start by asking: what is unethical AI? There's lots of it, sad to say. So, for example, part of my work, my collaboration with Joy Buolamwini, showed that a lot of the APIs that sell automated facial analysis tools, we showed that
2:31 am
they had much higher error rates for darker-skinned women than for lighter-skinned men. This is face recognition. Yeah, and that spurred a lot of movement, because a lot of people had been worried about surveillance-related technology anyway. So that spurred a lot of movement to ban the use of some of these technologies by law enforcement, because they're mostly used to suppress dissent and to surveil marginalized communities. So for me, that's not ethical AI; that is unethical AI. But even in capitalist terms it's not that useful: it is not useful for law enforcement to have AI that wrongly identifies individuals on the basis of their color.
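The methodology she is describing, disaggregated evaluation, is simple to state: compute error rates per demographic subgroup instead of one aggregate number, which is how gaps like "higher error rates for darker-skinned women" become visible. A minimal sketch in Python, with made-up audit data standing in for a real benchmark (this is not the Gender Shades code):

```python
# Disaggregated audit sketch: error rates per subgroup, not one overall number.
# The (predicted, actual, subgroup) triples below are invented for illustration.
from collections import defaultdict

predictions = [
    ("male", "male", "lighter-skinned male"),
    ("male", "female", "darker-skinned female"),
    ("female", "female", "darker-skinned female"),
    ("male", "male", "lighter-skinned male"),
    ("male", "female", "darker-skinned female"),
]

errors = defaultdict(int)
totals = defaultdict(int)
for predicted, actual, subgroup in predictions:
    totals[subgroup] += 1
    errors[subgroup] += predicted != actual  # True counts as 1

for subgroup in totals:
    rate = errors[subgroup] / totals[subgroup]
    print(f"{subgroup}: error rate {rate:.0%} ({errors[subgroup]}/{totals[subgroup]})")
```

An aggregate accuracy over all five examples would hide exactly the disparity the per-subgroup breakdown exposes.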
2:32 am
And Amazon did initially fight back against that criticism of its use by US law enforcement. And then they relented, so, well, you know, no one wants that. I don't think they saw the light. I think what happened is that there were worldwide Black Lives Matter protests, and it looked bad for them to keep selling it amid those protests. So they said they were going to have a moratorium on the use of this technology by police until there was a federal law. I have a statement here. Amazon said, quote, this research paper, and I'm assuming you wrote that article, is misleading and draws false conclusions; the research used an outdated version of Amazon Rekognition, and we have made significant improvements since. What do you think? Why did they fight back so strongly against disinterested, scholarly research? I mean, this isn't you being an activist for Black Lives; this is you analyzing the algorithms. Yes. So this was actually my colleagues. Joy Buolamwini and I wrote
2:33 am
a paper that preceded this one, and she and another Black woman, Deborah Raji, wrote this paper called Actionable Auditing. And when they put it out, the very first thing Amazon wanted to do was push back. So VP after VP wrote blog posts trying to discredit them. And my colleague Meg Mitchell and I, you know, we were both fired from Google since, but we were at Google at the time, spent a whole lot of time writing a point-by-point rebuttal, and we galvanized the academic community to come to their support. So we had a petition: more than 70 academics, including Turing Award winners, which is kind of like the equivalent of the Nobel Prize for computer science, signed our letter. So we came to their defense; we debunked, point by point, what Amazon was trying to say, and then we all asked them to stop selling Rekognition to law enforcement. That was all over
2:34 am
the news, and they weren't able to go any further in trying to discredit them. But what you just saw there is very similar to what Google then tried to do to me, right after I came out with my paper about large language models, because we were trying to show the harms of such technology. Well, we'll get on to the large language models, and I have to say that these are massive companies that, some people say, own all our lives, so I'd better read their responses, because they have a lot of money, before we get into the actions they took. Google says they didn't fire you at all: quote, Timnit wrote that if we didn't meet her demands, she would leave Google and work on an end date; we accept and respect her decision to resign from Google. So one of the gods of our societies is saying: no, you weren't fired, you went of your own accord. I think, you know, they had a kind of non-apology of an apology after the backlash, because at first they were just doubling down.
2:35 am
And it's so clear that I did not resign. I can't even believe it. I don't know why they thought they could get away with saying that. Maybe they thought I'm so stupid that I would just kind of feel like, oh, I guess I resigned. That's not how resignations work, right? With a resignation, you have to submit it, there's paperwork, you have to say you're going to resign; there is a whole process. And it is in dispute. I mean, as I say, according to Google you resigned; according to you, you were fired. What I said was that I was not going to... Well, in the context of all of this, what exactly happened? Because, again, you were working on scholarly research, as far as I understand, about large language models. And people who use Google Translate love these Google translations, and that uses large language models. What could possibly be wrong with large language models? So in all of these
2:36 am
incidents that you see, whenever you show a problem and it's inconvenient, it looks like the problem is bigger than they want to admit. Or maybe they really do think the issues I'm raising are not that serious. One of the things they said is that my paper made it look like there were too many issues with large language models; they said it painted too stark a picture. And so they wanted me to retract this academic, peer-reviewed paper that was being published at a scientific conference, with no process whatsoever. It's one thing if your employer comes to you and says: we followed this particular process, one that you know about, to come to the conclusion that your paper should be retracted, and here's why; we can discuss it, let's have a conversation about
2:37 am
the kind of process we used and why we want you to retract the paper. But absolutely not, there was nothing like that. This came way late, a long time after we had submitted our paper and gone through the internal approval processes. My manager, who had approved it, later resigned from Google. Okay, well, if we go to the specifics of the large language model problems that you were investigating, maybe we can figure out what's at issue. I mean, do you give an example of a Palestinian using words and then being wrongly interpreted, just to take a few? Yeah. So what we did was actually survey a lot of prior work, and add some initial analysis of our own, on these large language models: what has been going wrong, and what could go wrong, if we focus on larger and larger
2:38 am
language models. So the first point we started with was environmental and financial cost. These large language models consume a lot of compute power, and so if the field focuses on larger and larger language models, only the people with these kinds of huge compute resources are going to be able to use them. And that leads to what we talk about as environmental racism: the people who benefit from large language models are not the people who are paying their environmental and financial costs. And we give a bunch of examples there about different languages; it's always people in the dominant groups, whether between countries or within a specific country, who benefit from these large language models.
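To see why the cost point concentrates access, here is a back-of-the-envelope sketch using the common "about 6 FLOPs per parameter per training token" heuristic. The model size, token count, GPU throughput, utilization, and price figures are illustrative assumptions, not numbers from her paper:

```python
# Rough training-cost estimate for a GPT-3-scale model.
# All constants below are assumptions for illustration only.
params = 175e9          # parameters (GPT-3 scale)
tokens = 300e9          # training tokens
flops = 6 * params * tokens  # common ~6*N*D heuristic for training FLOPs

gpu_flops_per_sec = 312e12 * 0.3   # assumed A100 peak * assumed 30% utilization
gpu_hours = flops / gpu_flops_per_sec / 3600

price_per_gpu_hour = 2.0           # assumed cloud price in USD
print(f"~{flops:.1e} FLOPs, ~{gpu_hours:,.0f} GPU-hours, ~${gpu_hours * price_per_gpu_hour:,.0f}")
```

Under these assumptions the total lands around a million GPU-hours and millions of dollars, which is the scale of budget that only a handful of institutions can pay.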
2:39 am
The second point we make is that there's this assumption that if you use data from the internet, you'll somehow incorporate everyone's point of view. And those of us who've been harassed on social media know that that's not the case. But these large language models are trained on data from the internet, most of the time something like the whole internet, and so we were talking about the dangers of doing that: the kinds of biases you encode, the kinds of hateful content you encode. We were saying that just because you have a large dataset, it doesn't mean you are now incorporating everyone's point of view, and we give many examples of that. And then we give examples of what happens when these large language models, trained on data carrying this kind of bias, are deployed.
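The encoding point can be shown mechanically: whatever associations dominate the training text are what a statistical model learns, regardless of dataset size. A toy sketch, with an invented four-sentence "corpus" standing in for a web crawl:

```python
# Toy illustration that text data is not neutral: skewed co-occurrence counts
# are exactly what a language model trained on this corpus would absorb.
corpus = [
    "group_a people are brilliant and kind",
    "group_a engineers are brilliant",
    "group_b people are dangerous",
    "group_b suspects are dangerous and violent",
]

def cooccurrence(term: str, attribute: str) -> int:
    # Count sentences where the identity term and the attribute co-occur.
    return sum(term in s and attribute in s for s in corpus)

for term in ("group_a", "group_b"):
    print(term,
          "brilliant:", cooccurrence(term, "brilliant"),
          "dangerous:", cooccurrence(term, "dangerous"))
```

Scaling the corpus up does not fix the skew; it reproduces whatever is over-represented in the source text.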
2:40 am
One of those examples was the one you mentioned, about a Palestinian man. He wrote good morning on Facebook, and it was translated as attack them. People didn't even check what he had originally written; they saw the translation and they arrested him. So this was a Google Translate error? This was Facebook's translation, but the underlying technology is the same: they all use large language models. So what we were saying was, with machine translation you sometimes get cues when there are errors: you can see that the grammar is not quite right, you can see that something is wrong. But with these large language models, you can have something that sounds fluent and coherent and is completely wrong.
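One heuristic sometimes used to catch fluent-but-wrong output is a round-trip check: translate back to the source language and compare. A hedged sketch; the `translate` function here is a toy lookup table echoing the incident above, not any real translation API:

```python
# Round-trip consistency check for machine translation output.
from difflib import SequenceMatcher

def translate(text: str, src: str, dst: str) -> str:
    # Hypothetical stand-in for a real MT system, so the sketch runs end to end.
    fake_mt = {
        ("good morning", "ar", "he"): "attack them",  # the fluent-but-wrong case
        ("attack them", "he", "ar"): "attack them",
    }
    return fake_mt.get((text, src, dst), text)

def round_trip_similarity(text: str, src: str, dst: str) -> float:
    forward = translate(text, src, dst)
    back = translate(forward, dst, src)
    return SequenceMatcher(None, text, back).ratio()  # 1.0 means identical

score = round_trip_similarity("good morning", "ar", "he")
print(f"round-trip similarity: {score:.2f}")  # a low score -> flag for human review
```

The point she is making survives the sketch: fluency of the output carries no information about its correctness, so a separate consistency signal is needed before anyone acts on a translation.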
2:41 am
But why would these companies do this? Surely there's no malice; there was a mistake in the algorithm and in the software engineering. Why would they seek to minimize the publicity given to papers that showed these errors, when the papers could help them fix them? I would disagree that there is no malice, because when you look at what happened with Joy and Deb, for example, it is two Black women talking very pointedly about the impacts of Amazon's technology on the Black population. So you see a pattern here: it's me and other Black women, and a bunch of other people on our team, who are very much concerned with the impacts of these large language models on marginalized communities. And if you're talking about something that maybe should not be used right now, that's directly going to impact their money. It's a money-making machine. Yeah, I see what you mean there. But how can good morning translate into attack them?
2:42 am
Why would the algorithm even work that out from the dataset? Because of the domain the user came from? Because, if you look at some languages, it doesn't even... So, on Twitter, Twitter uses Google Translate, and I gave an interview on the BBC in Tigrinya, my mother tongue. And Google Translate doesn't even have Tigrinya; it's not among the languages they offer translation for. But it uses the same alphabet as Amharic, and when people in Eritrea were sharing my interview, the translation just went haywire. It just said, you know, let's talk about the greedy people: greedy, greedy, greedy, over and over.
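The failure she describes has a mechanical shape: Tigrinya and Amharic share the Ge'ez (Ethiopic) script, so a system that routes text by script, or that simply lacks Tigrinya, can confidently run it through the wrong language. A minimal sketch of that routing mistake; the `SUPPORTED` set and the fallback behavior are assumptions for illustration:

```python
# Sketch of script-based misrouting: same alphabet, wrong language model.
def is_ethiopic(text: str) -> bool:
    # Core Ethiopic Unicode block: U+1200 through U+137F (supplements ignored).
    return any("\u1200" <= ch <= "\u137f" for ch in text)

SUPPORTED = {"amharic"}  # hypothetical system with no Tigrinya support

def route(text: str, true_language: str) -> str:
    if is_ethiopic(text) and true_language not in SUPPORTED:
        # The unsafe behavior: silently fall back to the script's supported
        # neighbor and emit fluent-looking nonsense instead of refusing.
        return "translated as Amharic (output is confident garbage)"
    return f"translated as {true_language}"

print(route("\u1230\u120b\u121d", "tigrinya"))  # 'selam' written in Ge'ez script
```

A safer design refuses or flags unsupported languages rather than guessing from the script.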
2:43 am
There is nothing in my interview about greed at all. And that's arguably disturbing, given that the US military-industrial complex is particularly interested in the Horn of Africa and what's happening in Eritrea. Yeah, that's actually more disturbing than it first sounds. I'll stop you there. More from the former co-lead of Google's Ethical AI team and co-founder of Black in AI after this break.
2:44 am
2:45 am
In the past, thousands of newborn babies were torn from their mothers and given away in forced adoptions. To this day, mothers still search for their grown children, while adults look in hope for their parents. Welcome back. I'm still with Dr. Timnit Gebru, former co-lead of Google's Ethical AI team and co-founder of Black in AI. I mean, this faulty translation, that process, does it violate the First Amendment of the US Constitution, then? Because it's not allowing free speech, in effect. Oh,
2:46 am
I've never thought of it that way at all. I don't know; I'm not a legal scholar. But back to the question you asked me about what ethical AI is. If you read the work of some critical scholars, they would say that the tech industry's strategy is to make it all look like a purely technical issue, a purely algorithmic issue that needs to be fixed. As you said, a purely mathematical issue: it has nothing to do with antitrust law, nothing to do with monopoly, nothing to do with labor issues or power dynamics; it's just purely this technical thing that we need to work out. And part of what we say in our paper, and many others say too, is that this is not just a purely technical, algorithmic tweak. We need regulation. And I think that's part of why all of these organizations and
2:47 am
companies wanted to come down hard on me and a few other people: because what we're advocating for is not a simple algorithmic tweak but larger structural change. Do people not realize that the way to really answer the problem is, as you say, to do with the nature of monopoly power? Yes, and I think now, with all the whistleblowers, since I first came forward there's been one after another after another, the public is starting to really understand that we need some sort of regulation. But what the companies argue is exactly what you were saying earlier: oh, there is no malice, the algorithm just isn't right yet; why would we want algorithms that don't work, that's not good for our business; so of course we will work really hard to fix our algorithms. So for example, for Facebook, Mark Zuckerberg, responding
2:48 am
on safety: I don't know any tech company that sets out to build products that make people angry; our advertisers don't want to advertise next to angry people or bad content, they constantly tell us that; so why would we want to do that? That's the kind of argument they're making, right? But what I'm saying is that a lot of the interdisciplinary work in this space has shown that we need to look into more structural solutions to this problem. Now, of course, you were at Google for quite a while. Julian Assange, you know, when Eric Schmidt wanted to meet him, Julian Assange said Google may not be acting illegally in any of this; presumably he was referring to the scientific element as opposed to the antitrust element, who knows, which is why the FBI don't investigate. When you were writing papers, I would assume, did you come across a guy called Jared Cohen, CEO of Jigsaw? Back then it was called Google Ideas, under Eric Schmidt.
2:49 am
Do you remember that? I didn't come across him, but I heard about him. I was at Google for two years, actually, two very long years. Every month I was thinking: can I survive here? Should I leave next month? I mean, a lot of people would think it must be amazing to work there, the amazing offices and all of that, that's the future; there are graduates watching this desperately trying to get through the complicated application process to get in. And I should say Jared Cohen is backed by Joe Biden, by Blinken the Secretary of State, and by the National Security Advisor Jake Sullivan; they love Google Ideas, and Google is very much associated with Democrat royalty, as it were. Why was it no fun to work there? Well, again, it comes down to what your views are and the demographic that you're in. I was, you know, in a role that's called
2:50 am
a research scientist role, and I was the first Black woman to be a research scientist at Google, and I saw why. I had so many issues right off the bat. I was under-leveled: there were people with much less experience than me leveled way above me. There was a lot of just plain disrespect, a lot of harassment, all sorts of things, so it was very, very difficult for me to even concentrate on my job. And actually, when I talk about being fired, I'm pretty certain it wasn't just that paper that got me fired. There were so many things I was doing that they didn't like: I was speaking up about workplace issues, discrimination issues, and each time I spoke up, they weren't happy that I spoke up about it. So should I buy the idea that there is something philosophical around this, a kind of right-wing supremacy about it? Because obviously they say you resigned, and they deny any element of the workplace being
2:51 am
a bad place to work, let alone harassment or anything like that. Yeah. I mean, I don't know if you're aware of the Google walkout: 20,000 people walked out two months after I joined Google. They were demonstrating against, you know, Andy Rubin leaving with $90,000,000 or something like that after he had harassed people. And so there were 20,000 people walking out in protest, and Google pushed out two of the organizers, Claire and Meredith, after a year or so. So they were purged. And then, a year after that and a year before I got fired, they fired five workers; there's an NLRB trial, a National Labor Relations Board trial, going on right now about those fired five. And then one year later they fired me. And I had spoken up about all of those firings. Google
2:52 am
denies any wrongdoing in all of those cases. And they're so powerful: when you mention these cases, journalists will obviously have to look them up using Google. But as for diversity and identity politics, here is Sundar Pichai in December 2020, the multimillionaire head of Google, saying: it's incredibly important to me that all Black women and underrepresented Googlers, people who work at Google, know that we value you and you do belong at Google; we started a conversation together earlier this year when we announced a broad set of racial equity commitments; to take a fresh look at all our systems, from hiring and leveling to promotion and retention; to address the need for leadership accountability; committed to continuing to make diversity, equity and inclusion part of everything we do, from how we build our products, and I presume the workers who physically assemble the hardware are over-represented by people of color, who knows, to how we build our workforce. You know, the statements coming from Google are completely against what you're saying. Yes. So maybe this was the, quote, non-apology I mentioned. And, you know,
2:53 am
he had to apologize, because the other road they were going down, which was doubling down, you know, saying that my work was subpar, saying that I told people to stop working on diversity, and all the other things they were saying, was creating more and more backlash each time they went down it. So I'm sure he realized at some point that he had to make some sort of apology. And if you read the apology closely, it was the kind of apology that says: I'm sorry for how you feel. There was a long-time, what do you call a person who draws these cartoons, a cartoonist, a long-time cartoonist at Google who left after 14 or 15 years because of what they did and how they did it. You had thousands of people on your side signing letters.
2:54 am
I just want to ask you, I mean, we were talking about Jared Cohen, and I don't know what you think about Jigsaw, and whether Jigsaw's AI is kind of a competitor. Now, you're the co-founder of Black in AI; tell me what Black in AI is. Why is your AI going to be better than Jigsaw's AI, surely the innovation department of the Google conglomerate? Why is it going to be better? And are you going to make it more profitable, actually, because you're going to be more accurate? Are you going to get the CIA, the National Security Agency? So, the group Black in AI that I co-founded is a non-profit, and it's a group for practitioners and researchers in AI, Black practitioners and researchers, from all over the world. So this is not a group that builds
2:55 am
AI or anything like that; it's a group that builds community and networking. It has mentorship programs for various things, like a graduate mentorship program and an entrepreneurship program, and we hold workshops to raise the visibility of Black people in AI, et cetera. So that is different from what I'm doing right now. What I'm hoping to build now is an interdisciplinary AI research team, and the goal is not to be an extremely profitable AI research institute, because I believe that if your number one goal is to maximize profit, then you're going to cut corners, and you're going to do things downstream that end up making you build AI that is harmful. I think you'll find Section 172 of the 2006 Companies Act, the fiduciary duty of a company to maximize profit. Well, we're not
2:56 am
going to be that. Yeah, a duty to maximize profit. Very quickly: what would you say to a whistleblower right now who's terrified, who works at one of these organizations and, even seeing what you're talking about, feels more terrified than ever and never wants to come clean with journalists or anyone else about the lack of ethics they see in their daily workplace? Well, I would ask them to look at the Tech Worker Handbook, which Ifeoma Ozoma, who is another whistleblower, launched just last week. That handbook is meant to help people decide whether they want to come forward and what to expect. And I think it's a very personal decision for each person, because it's not a joke: you're going to get a lot of backlash. So you have to determine whether this is the right avenue for yourself. And so I understand the fear, because you get a lot of harassment being thrust into that public space. But I'd want to say that,
2:57 am
you know, you were mentioning Eric Schmidt and Julian Assange and so on before. Eric Schmidt in particular has a lot of influence with the US government right now. He created this NSCAI, this national AI commission or something like that, and he has this view that we are in a cold war with China, that there's an AI race, et cetera. So I think that where we're going with this can be very, very dangerous if our voices, or the whistleblowers' voices, are not heard; it's these people at the top who are having all the influence right now. And I think it would make a huge difference for more people to speak up, but at the same time we have to protect them. I can't go and ask people to speak up without making sure that society at large will also protect them. Well, we invite, as ever, Google board members, people who have left, people in the government, to come on. Dr. Timnit Gebru, thank
2:58 am
you. Thank you. That's it for the show. We'll be back on Wednesday, 10 years to the day since the leader of what was once one of Africa's richest countries, Libya, was killed with the backing of the UK and US. Until then, keep in touch via social media, and let us know if you think new protections are needed for whistleblowers.
2:59 am
3:00 am
This is RT. Welcome back. The first-ever space film crew, after their 12-day shoot high in the sky aboard the International Space Station: extreme pressure, flaming turbulence; the new pioneers of orbital cinema speak to us here on RT and describe that fiery descent back home. You cannot experience something like that unless you fly into space; the feeling was unforgettable, because when the parachute opens, the capsule spins. Also in the program here on RT: Germany confirms that Russia is not withholding natural gas supplies to Europe.