tv Joy Buolamwini Unmasking AI CSPAN December 2, 2024 4:08am-5:06am EST
4:08 am
you know, i did little league and all that kind of stuff. but that was a book. but it really is a special skill to write children's books, and i'm not saying that to be pandering. it really is. you have to know how to tell a very, very broad story in a very limited number of words. but i go back to this quest i'm on. history is not boring, but most history books are boring, the way history is taught in school. history should be taught as if it was, you know, a big story. yeah, great, you know, like people should come to history class saying, what's going to happen today, you know, and they should leave wanting more. i never believed that's what real history class was like. and so i'm trying to write my books that way. i love when people say, oh, your book reads like a movie, and i will say, i keep some elements of a screenplay with it, with the kind
4:09 am
of the opening of each chapter and stuff like that. but i want people to feel like they're watching a visual experience even as they're reading an actual book. good afternoon. good afternoon. are you ready for this session? awesome. i am dr. garcia vasquez, dean of students here at miami dade college. welcome. we are grateful to the miami dade college family and volunteers, and for the support of our sponsors, including the green family foundation, nicklaus children's hospital, amazon, the j.w. marriott marquis brickell, and all other sponsors. we'd also like to thank our friends of the fair. friends receive multiple benefits. please consider becoming a friend or gifting a membership as we work to ensure
4:10 am
the future of your miami book fair. please consider supporting miami book fair with a contribution. visit the friends of the fair table or our website for information. at the end of the session we will have time for q&a, and the authors will be autographing books outside. we kindly request you silence your cell phones. now it is my pleasure to introduce ms. joy buolamwini. she is the author of the national bestseller unmasking ai: my mission to protect what is human in a world of machines. she's also the founder of the algorithmic justice league, a groundbreaking researcher, and a sought-after speaker. her writing has been featured in publications such as time, the new york times, harvard business review and the atlantic. as the poet of code, she creates art to illuminate the impact of artificial intelligence on society and advises world leaders on preventing ai harms.
4:11 am
she has received numerous awards, including a rhodes scholarship, the inaugural morals and machines prize, and the technological innovation award from the martin luther king jr. center for nonviolent social change. her mit research on facial recognition technologies was featured in the emmy-nominated documentary coded bias. born in canada to ghanaian parents, she lives in cambridge, massachusetts. and our moderator, stephanie sylvester, is the founder of avatar buddy, a powerful integration of gen ai technologies with built-in evidence-based mentorship. she has worked in corporate i.t. for 30 years for companies including burger king, subway, wachovia, hp and the children's trust. welcome.
4:12 am
hello. good afternoon, everybody. thank you for spending your sunday afternoon with us. i'm honored and nearly beyond words that i got this amazing opportunity. so i'm going to start off with a poem by my avatar buddy. in the heart of miami, where cultures entwine, on november 24th, at the stroke of four, two visionaries meet, their paths align, to unravel the mysteries of ai's core. stephanie sylvester, a beacon of light, with roots in i.t., a journey profound, from burger king to subway, her insights crafted in avatar buddy where solutions abound.
4:13 am
a member of the links with a heart so wide, empowering communities within each voice, her mission to bridge the digital divide, to make technology a tool of choice. enter dr. joy with a story to tell, unmasking ai, her book, a clarion call, exposing biases where shadows dwell, for justice and equality she stands tall. together they sit under miami's sun, a dialogue, a dance of the mind, ai's power, what's lost and won, in their words a future redefined. stephanie asks with wisdom and grace about the journey, the struggles, the fight. dr. joy responds with courage to face the systemic barriers brought to light. their exchange attests to a spirit that weaves technology, algorithmic justice and dreams untold, a spoken word symphony, one that believes in a world where every voice is full. so gather round, let the
4:14 am
stories unfold at the miami book fair, where ideas ignite, stephanie and dr. joy, narratives bold, inspiring us to envision delight. in the heart of miami, where cultures entwine, on november 24th at the stroke of four, two visionaries meet, their paths align, to unravel the mysteries of ai's core. so thank you. so i am going to get right to it, and the first question i have, dr. joy, is: let's talk about this amazing cover of your book. thank you. so this cover is designed by the algerian-french illustrator malika favre. you might have seen her covers on the new yorker and others. and it shows me in this pivotal moment where i literally had to put a white mask over my dark-skinned face as a student at mit. here i was at this epicenter of innovation, working on an art
4:15 am
project that actually used face detection, and it wasn't detecting my face. so i started experimenting. i drew a face on my palm, and it detected the face on my palm. it happened to be around halloween time; that's why i had the white mask. a friend had invited me for a girls' night out and she said, let's do masks. i thought she meant halloween masks, but she actually meant korean face masks. had i known that, this whole thing might have never started, so thank you, cindy. so because of that mistake i had the white mask there, and i put the white mask over my dark-skinned face, and the white mask was detected as human while i was not. and so that's literally where my research began. and frantz fanon wrote the book black skin, white masks. i just didn't think it would be so literal. and so that's how we got to the title unmasking ai and the
4:16 am
cover as well. thank you. so i'm sure that you've encountered this as i have during my journey in educating people about ai. when i read your book, i was really moved and fascinated by how you wove everyday occurrences in with complex thoughts and broke ai down to a level that anybody can understand. so talk a little bit about your writing style and what made you choose that path. so i have a writing style now? this is fun to hear, because when i was approached to write this book, i was really fortunate. an editor at random house slid into my dms, and i looked at who she was. she had maybe four followers on twitter, so i was like, well, one of them was president obama. so i was like, all right, let me
4:17 am
return this, right. so that is to say, i kind of found my way into the world of writing. i was working on my ph.d. at the time, so i was under the fiction that my ph.d. would be the book. that's not what happened; it was very much missing voice. so as i'm going through and writing this academic piece, before the dr. joy title, we called it mission: algorithmic justice, the voice wasn't what i wanted it to be, because i wanted people to feel my humanity. i wanted to go from performance metrics, which were all of these ai tests, algorithmic audits, right, to performance arts, so you could actually know me: from being that idealistic researcher coming in, to having to face tech giants like amazon, or what it was like to get a film like coded bias made when no one believed in the story, to then have it go on to be emmy nominated, all of that, or almost dropping out of mit to start my
4:18 am
skateboarding company, which didn't happen, right. you know, and along the way, as a poet, i thought this would be a great way to drop in some poems if i'm going to be published anyway. so throughout, i added poems that reflected where i was at that particular time in the journey of the book and with ai. so it's an eclectic representation of the poetic elements of myself, and the science communicator as well, really trying to make it feel like people are there with me. so when i'm in davos and the snipers, right, are taking their positions, i want you to feel like you're there, you know. so that was how i wrote it. thank you. so, a bit more on the present day. being a black woman in ai, i feel, is emotionally wrought, and that's putting it conservatively at best. and i
4:19 am
want to ask you about getting blocked at the door, and the security guard not even taking the time to validate whether your name was on the list. that really spoke to me, because i have an ai company, and there are times i am present and i see people that have like a minuscule amount of knowledge look at me and say, so who's the brain behind this? and then get a little flustered when i say, i am the brain behind this. so i can just imagine, i mean, because what you're doing is multiple times more complex than what i'm doing. so how do you process that emotional fallout? and for the people in the room that are having to deal with that, are there
4:20 am
some tips that you can give to them? so i write about this in the book, and there's this chapter, i think it's called the gates of belgium, because i'm talking about gatekeepers. so i was invited to be part of the eu global tech panel, headed by the vice president of the european commission. one of my mentors, megan smith, who's the former chief technology officer of the united states, had invited me on. so she pulled up a seat at that table using her privilege and access. so i was still like, okay, here i am, eu global tech panel. i roll up to brussels and i tell them that i'm here for the meeting. they tell me, literally, that's a meeting for important people, right? so i look around like, okay, i might not look like the gray-clad diplomats, which was true. i was like, well, i have an
4:21 am
invitation letter, right? here's my invitation. i kid you not, and i write about this: they said anyone could have printed anything. which is true. still, you know, at this point i'm just hoping that my international plan doesn't fail me now. i managed to get hold of the secretary, and they come down, they give me the special badge, and i'm literally the only one wearing the special badge that kind of says that i'm supposed to be in this place. and then he finally looked at the list to see my name, which could have been the process from the beginning. and so that was the gatekeeping that you were just speaking to, that we experience in different ways. and i was using that in the book also to talk about the algorithmic gates that are happening, right. so it might be the hiring algorithm
4:22 am
instead of the hiring manager, right, who might or might not want to hire you, or something like this. now it's the faceless ai interfaces or systems that you don't even see as well. and in my case, i could contest it, right; there was somebody i could reach out to. but with so many of the ai systems that are being used in our lives, you can't contest the power you don't see. this is something that i at times call the coded gaze. so some people might have heard of the male gaze, the white gaze, the post-colonial gaze; this is a cousin concept, which is really about who has the power to shape technology. those are the people who shape the priorities, the preferences, and at times the prejudices. and then, to your question in terms of how you navigate in that space: it was having to stomach what just happened while doing what i came to do. so that space was actually where i first shared my poem
4:23 am
ai, ain't i a woman, and it shows some of the biggest tech companies, amazon, ibm, google, mislabeling or misgendering the faces of iconic women like oprah winfrey, serena williams, michelle obama, right. and so in that space, after that experience has happened, i'm saying: can machines ever see my queens as i view them? can machines ever see our grandmothers as we knew them? ida b. wells, data science pioneer, hanging facts, stacking stats on the lynching of humanity, teaching truths hidden in data, each entry and omission a person worthy of respect. shirley chisholm, unbought and unbossed, the first black congresswoman, but not the first to be misunderstood by machines well versed in data-driven mistakes. michelle obama, unabashed and unafraid to wear her crown of history, yet her crown seems a mystery to systems unsure of her hair: a wig, a bouffant, a toupee? maybe not. are there no words for our braids and our locks? does
4:24 am
sunny skin and relaxed hair make oprah the first lady? even for her face well known, some algorithms fault her, echoing sentiments that strong women are men. we laugh, celebrating the successes of our sisters with serena smiles. no label is worthy of our beauty. and that was literally part of the algorithmic audit that i was showing them, the one i almost was blocked from even delivering. and the head of the world economic forum was in that room, one of the co-founders of deepmind was in that room, a representative of the president of microsoft, and so forth. in that space, having that conversation, bringing it in literally, right, through poetry, to ground conversations on ai. and so later on, that very same panel engaged all of the eu defense ministers ahead of conversations on lethal
4:25 am
autonomous weapons. so the poem is a bit of how i show up, right? we celebrate our successes with serena smiles. so there's all of this pain that's happening, but i still knew i was there on assignment. i had something to do. so i had to put that aside in that particular moment, and then write a whole book to process it. okay. so for those of us that don't know how to write, i guess we have to use ai to help us get our words out. i don't know. i mean, i think i approach the world, right, with my artist hat and then also my researcher hat. so i was an undergrad at georgia tech for a while. we didn't really have to write that much, right; our currency wasn't how well we put words together. it was how we built machines
4:26 am
and such. and it wasn't until i was in the uk studying on a rhodes scholarship that i realized, okay, these people are really about words, and lots of words, right. so in a way i kind of felt, oh, i wasn't equipped or i wasn't prepared. and then i learned it was more about voice than words, right? the message you have to share, and sharing it in whatever way is true and authentic to you. yeah. so just kind of following up on that: what about the people that don't have the technical acumen to navigate the complexities of ai, or to even start understanding what's happening? what advice do you want to give to them? because as much as it's overwhelming at times, we need to take steps in order to make sure that everybody's aware of what's
4:27 am
going on, or this will get away from us. such a great question, and i love to say: if you have a face, you have a place in conversations on ai. and in this whole journey, in terms of, you know, the academic experience and the technical knowledge, what really made the algorithmic justice league resonate, i think, was storytelling. it was having the courage, as a student who experienced what i felt was a pretty embarrassing thing, like, here i am at mit and i'm putting on a white mask to be seen, and i didn't realize how far the white mask would travel in terms of story, in terms of impact. and so i think right now we all have the opportunity to tell the stories of our lived experience. that's exactly what i was doing right at that point. and so sharing your stories about how ai is impacting you,
4:28 am
what you're seeing, and also sharing the stories of how it's impacting others. for example, not too long ago there was a report about a 14-year-old boy who sadly, self-harm trigger warning, took his life after interacting with a character created on character.ai, and his mother says she believes he'd still be alive if not for the kind of emotional attachment that was co-created with this particular ai entity. and in the book i open with a man in belgium, whose widow said a similar thing, right, her husband getting involved with this kind of chatbot, and it went sideways. and in those cases, they're not ai experts per se. they're people with lived experience, and that is really important to share, because that actually informs how we create systems where there isn't
4:29 am
the emotional manipulation, or ways in which we might think we're putting out something good but it ends up having unintended consequences. that's a big way that i think a lot of people can engage now, because the tools are more accessible. when i started, you kind of had to have the technical knowledge to really dive in, but now you don't. yes, that's great. so i want to go back a little to the amazing poem you just spit out for us. obviously i had to get my ai to write my poem for me; it's definitely not as good as yours. but why ain't i a woman? i mean, that is such a powerful piece of american history, and i think that it doesn't always come up to the foreground as much as it should, especially when you talk about the women's movement, women's power, and so forth. and i just kind of want to kind
4:30 am
of unpack that a little bit more and really ask you, just to be authentic and raw if possible: what was going through your mind when you went with ain't i a woman? yes. so ai, ain't i a woman was inspired by looking at the research results i got when i tested systems from ibm, from microsoft and others. i was testing gender classification, so guessing the gender of somebody based on their face. what could go wrong, right? yes, that's what i was testing, and what i found was that, overall, the systems worked better on male-labeled faces than female-labeled faces. there was no concept of gender fluidity or anything like this; it was just binary gender classification as done by the machines. and they all, overall, worked better on lighter-labeled faces than darker-labeled faces. but then i did this intersectional
4:31 am
analysis, looking at lighter males, lighter females, darker males, darker females. and i found that the results varied, except for one group. the darker females, the women like me, always had the worst performance. so let's say, for example, in one case the performance gap was 34% in accuracy rate between lighter males and darker females. so out of that performance metric, i wanted to change performance metrics into performance arts. so that's why i go back to that phrase: can machines ever see my queens as i view them? can machines ever see our grandmothers as we knew them? that's literally from the numbers that i'm seeing, and i wanted to do something i now call an evocative audit. so you have the ai audit, the algorithmic audit that's testing the different ai systems, and yes, you get the numbers, but you don't get the emotional piece. and so when i say my heart
4:32 am
smiles as i bask in their legacies, knowing their lives have altered many destinies, in her eyes i see my mother's poise, in her face i glimpse my auntie's grace, in this case of déjà vu, a 19th-century question comes into view, in a time when sojourner truth asked, ain't i a woman: that's going back to the history, to say this is part of a much larger history and line of inquiry about what it even means to be seen on your own terms. because i was referencing sojourner truth's 19th-century talk, ain't i a woman, and in that case she was actually challenging the women's movement to say, wait, what about women of color? we're part of this conversation too. and so that was what was going through my mind. i knew that the research, even though it's literally one of the most cited papers in its field, in this category, would only reach a very small number of
4:33 am
people, so i wanted to break out of that. i wanted to break out of the lab, and poetry was a way to do that. i will say, i love the fact that you're using poetry to make ai accessible, because as you start talking about it and you use poetry, you realize that it's really not scary. it's actually very easy. and when we look at the history of ai, it's been around since the 1950s. so i want to touch a little bit on your style, not because i'm being shallow, but because you have these boots. i was like, i love the boots, love the different glasses, the glasses on the cover of the book. but one of the things, when you're in this space, people look at you as a woman and they sometimes say, well, you don't look like a computer scientist or you don't look like an it person. and whenever they tell
4:34 am
me that, i stop to think: what should a woman in computer science look like, if not like me? and you know, these are what i call little microaggressions that after a time can wear on you. how do you keep the movement going? because this is a fight, i mean, and we have to enroll so many people to get to where we need to be; you cannot leave an entire class of people behind in this revolution. so the first question, just to make it a little tighter, is: how do you intertwine your style, your lipstick, your beauty, your accessories, and how does that help you keep the momentum going in this brutal revolution that's happening, to make sure that a segment of the population doesn't get left behind?
4:35 am
yes, there are a few parts. one: by the time i started my fourth degree, the ph.d. at mit, well, you might have heard of tiger parents; there are the african versions. so it's not like you have arrived until you've stacked it up. so that's just where i'm coming from. my dad's the original dr. buolamwini; he's a professor of medicinal chemistry and pharmaceutical sciences, and my mom's father was his advisor, and he was also a professor of medicinal chemistry, right. so that's the lineage that i am coming from. and then my mom is an artist, and so i grew up with the worlds of art and science in literal companionship. if you think i have style, i learned this from my mom, raiding her closet and that kind of thing. and so for me it was kind of how i grew up, and also being from ghana, being proud of it, and wanting to represent that
4:36 am
identity; that just felt natural to me when i was younger. as i got older, it became very clear that, okay, the stem kids are over here, the art kids are over there, and you need to choose. and by the way, you're an athlete, and that's kind of messing up this whole thing; what's with the pole vaulting and the basketball? so all this to say, i was always in that case, right, like, you don't look like you should be in any of these classes, you look more like an athlete. i've been hearing all my life that i never look like the thing i'm supposed to be. and so after a while, you just kind of have to be who you want to be. and so in the book i talk about being at the media lab when the company swag came out, and it was supposed to be space-age fabrics, right, for everyday use. and they only came in men's sizes,
4:37 am
right? even though at that time women were about 30% of the media lab. and so there are just these constant reminders that you are not the norm; just like with the white mask, you are not the norm. so in that space, with those whitewashed walls, it was actually a deliberate act to be the splash of color that i was anyhow, even if it might get me blocked out of that eu global tech panel because they're looking up and trying to figure out who's there. but i feel that's been the story of my life oftentimes, and when the film coded bias came out and people saw my style and things like that, i would get all sorts of comments, like: your white glasses are distracting from your very important message. i was like, well, if you were paying attention to the message, would the glasses really distract you? or people feeling seen: in the film there's a scene where my hair is being braided, right, or there's a scene where i'm
4:38 am
with my partner at the time. those human moments people really latched on to, which i thought was telling. so that experience of how people reacted to the film, to the style in that film, made me realize, oh, this is another way to speak, right, as to who gets to do tech, who gets to be an ai expert. anybody. and you can look any way, right? and this is how i look some of the time, you know. so there was some freedom there, for sure. that's great. so just on that, on being seen: we are seeing with our research, both formal and informal, that ai is giving voice and agency to a class of people that would otherwise be discounted. so when we were building the company, we went to liberty city and overtown and west perrine and miami gardens to get people to
4:39 am
help us write the requirements and test it and give us feedback. and that really positioned us to be about two months ahead of openai; so whenever openai came out with a way to use a.i., we had already figured it out and were deploying it. and so, as much as there's a lot to be concerned about with ai, we are also finding that it's giving agency to people that wouldn't otherwise have a voice. and so i wanted to see if you could spend a few minutes talking about how to navigate that, because that's a very tight rope that you need to walk, where one misstep could cause you to careen in the wrong way and create negative unintentional biases, but positive steps get it to a point where this class of people now have a voice, can be seen, and can assert themselves in ways
4:40 am
they have never been afforded before. i'd love to speak to that, because one thing i learned while i was at the center for civic media, that was the group i was a part of at the mit media lab, is that we were always thinking about tech and society, but tech was a way to have conversations that needed to happen anyway. so, for example, we released the promise tracker app in brazil, and this allowed people to document civic issues, so maybe potholes, or school lunches, right: elected officials said they would improve the school lunches, and did that actually happen? and so the technology became a way to have documentation, but it also became a way to bring people into a conversation that otherwise wouldn't have been in that conversation in that way. when i was thinking about some of my earlier work in ethiopia, working on neglected tropical diseases and using mobile tablets to make it easier to
4:41 am
collect the information for health systems, those were all kind of like this tech-for-good kind of exploration. and then as i got further in, when it came to ai, i saw at times what was positioned as giving people voice actually stripped away their agency, right, or stereotyped their voice as well. so i'd be really curious how you're using the technology in terms of providing people with agency and so forth. well, this is your interview, so i'm only going to give two data points, then we're going to turn it back over to you. for one thing, we took about a year and a half to make sure that our ai sounded like the people we wanted to have use it, and that was a struggle, to really find voices with the appropriate accents that people could understand. so that's kind of what we're
4:42 am
doing. and the other thing that we're seeing: i did a ted talk with this woman from overtown, she runs the women's club of overtown, and what we did is, she's using ai to help with resumes. and why is that important? the ai helps to rewrite the resumes of individuals, and they're seeing themselves in a new light, and it's given them a whole new way to present themselves, giving them self-confidence. but the other piece is that they're coming back and they're talking about the jobs that they're getting. so somebody in their forties got a job for the first time with benefits and a career path, and that means a lot for a community, to be able to be gainfully employed in a way that makes you feel good about yourself. and obviously that has given lavette
4:43 am
a lot of agency as well, and she's now come in, and we're talking about all these really cool plans we're going to be doing for 2025, and we would not be having that conversation had it not been for ai giving her some kind of agency. so that, for me, is very powerful. that's something i've been trying to figure out since i was nine years old: how do we solve the problems of poverty, and why do some people get to have a voice while other people don't. now that's powerful. okay, so back to you. it's like, we can't have all this brilliance and not hear about it. well, i appreciate that. so just coming back: when you think about 2025, 2030 and beyond, what do you want to see the world look like in terms of ai and
4:44 am
making sure that there's appropriate guardrails? i love this question, and i love that you're in green, because that's something i think about a lot: green ai, sustainable ai. right now many of the ai approaches are so energy intensive. i was reading a study somewhere where they said putting a prompt through chatgpt can be like pouring out eight glasses of water, depending on the time of day. and the thing is, it doesn't have to be this way. we have different flavors of ai that come out; we're in this transformer moment, and it's gotten a lot of hype, and people are seeing that, but there are alternative ways of doing it that aren't as energy intensive. and so by the time we get to 2030, i want there to be green ai, where we can be happy, you know, about the kind of environmental
4:45 am
footprint that's associated with these technologies. i also want to think about what alternative ethical ai pipelines look like, because so many of the companies are facing lawsuits from artists, writers, authors, right. the best of what we see of ai came from the best of what was created by humanity, and that was taken without consent, without compensation, without credit. like, where did you get that nice poetic sound from? poets, you hear? it didn't just come from anywhere, you know. and then also control, right? it could be that you want to experiment with some of these a.i. systems on your own terms; we should have that, and we should have that alongside the ethical ai pipelines and the green ai. so that's the future that i want to be a part of, and we can be part of advocating towards that. that's part of why i started the
4:46 am
algorithmic justice league, because truly, we're all impacted by ai systems, and so we do have a voice and we do have a choice in shaping where it goes. that's wonderful. i'm glad to hear that there's an alignment between where we're wanting to go and where you're believing it should go. so before we open the floor for questions, can you drop one more poem? one more, okay. yes, we'll let the audience choose. so i have two poems. one is called unstable desire; it's the way the book ends, after a conversation, a roundtable, with president biden. and the other is called brooklyn tenants, which was dedicated to people on the front lines who successfully resisted an ai installation that they found harmful to their community. so, show of hands: brooklyn tenants? show of hands: unstable desire? yeah, well, it looks like unstable desire, though i saw brooklyn tenants hands too.
4:47 am
okay, with the vote, it looks like we have time for both. go. yes. yeah. right. so i'll start with unstable desire. this one is longer than i thought. okay. yes. so, unstable desire, page 289. prompted to competition, where be the guardrails now, threat in sight, will might make hallucinations taken as prophecy, destabilized on a mission to outpace, to open chase, to claim supremacy, to reign indefinitely, haste and pace, control altering deletion, and unstable desire remains undefeated, the fate of ai still uncompleted, responding with fear,
4:48 am
responsible ai, beware, where profits do snare, and people still dare to believe our humanity is more than neural nets and the transformations of collected muses, more than data and errata, more than transactional diffusions. are we not transcendent beings bound in transient forms? can power be guided with care, augmenting delight? alongside economic destitution, temporary band-aids cannot stop the wind when the task ahead is to transform the atlas of fear, of innovation, the android dreams that taste the nightmare schemes of vice. poet of code, certified human-made. thank you. we can do questions, and then i can close with the last one. okay, okay.
4:49 am
well, thank you to both of you for talking to us today. i had a question, because i have been teaching the netflix documentary coded bias in my class. it's a freshman composition class at florida university, as well as the new york times article by cade metz, where i think you were interviewed as well. and of course, it's a freshman composition class, so i walk them through: you have a thesis statement, you have a call to action. and i like to stress to them, like, this is your essay, you know. what do you really see as the point of your writing here? in the documentary, i think you're really good about presenting calls to action. my students always point out the moment that you're sitting before congress and actually trying to work on the legislation of this issue. and then i ask the students, okay, how many of you are going to now do that, or how many of you are going to be willing
4:50 am
to do something like that? they're all like, well, not me. i hope maybe it's the person next to me, but not me personally. so i wonder, what would be your advice to students in that position, who aren't realistically seeing themselves going up to congress and saying things like that, but still have an issue with what you're saying about ai and these topics at hand? what's kind of like a realistic pathway for them that you could advise on? got it. one of the things i'm proud of with the film coded bias is that it shows raji, who was an undergrad, and it shows me as a grad student, right. and it shows some of these aspirational paths, not to say that's something you have to do, but it is to say that is something that could be a path for you. so for some, seeing that matters. we've seen youth organizations like encode justice, high schoolers around the world, where they've actually put together an ai policy agenda as
4:51 am
well. i'm always surprised by the middle schoolers and elementary school students who also reach out with what they're doing on algorithmic bias as well. but i also hear your question, the deeper question, right? it can feel overwhelming to fight for algorithmic justice, right? like, what do i do on a day-to-day basis? that's why i think sharing your story and your experience is really important. we have report.ajl.org, where people can talk about how they're experiencing ai. so sometimes students will talk about the use of ai to detect if they are cheating or not, and e-proctoring, and what their personal experiences have been, because sometimes it's less daunting, right, when you know that you're going to a space where people are open to hearing those experiences. another thing we encourage everybody to do is, when it comes to airport face scans, if you're traveling, you have the right to opt out of the face scans, and most people
4:52 am
don't even know. and it's usually even faster: you just step aside and you say, i want the standard check. they look at your face, they look at it, and you're not trying to get the lighting right to go through. and each time you do that, you're actually voting for biometric rights, for people like robert williams or porcha woodruff, who were falsely arrested due to facial recognition misidentification. i think it's important to again emphasize that your experience matters. one thing we love to do with the algorithmic justice league is this workshop called drag versus ai. we developed it with the boston public library, with high schoolers. and so we have this workshop where people are actually testing ai systems and seeing how the ai reads a face. and then they can try the invisibility chamber, to see if you can have it not detect your face. or we do the infinity chamber and play with age: can you look older, can you look
4:53 am
younger. and then that starts a conversation about, well, how is ai being used within your classroom or your school? are there surveillance uses? are there uses you would like to see? and so in those spaces, i've really seen that when there is expectation, and also invitation to be part of the conversation, and also validation of the importance of their experience and their perspective, then we see more engagement in that way. because i know if you see coded bias and it's like, go do that, that might feel a little overwhelming, for sure. and part of the reason i wrote unmasking ai was to also talk about how that film almost didn't get made, because people didn't want to bet on somebody who looked like me as the lead of that sort of film. i talk about almost dropping out of mit, and what the dynamics of that were like. and so sometimes, when you see the glory but you don't see the story, it can feel less accessible. so i think, hopefully, as you're teaching this to your
4:54 am
students, you're also showing them some of the things that didn't go so well, right? because that's part of the journey. i do, i do compare for them how you're introduced in, like, the article and the film, and how, like, the legos behind you. so thank you for that advice. appreciate it. thank you. so, i enjoyed your lecture. i just got introduced to ai when i was watching 60 minutes. so, you know, i started out with a typewriter, then there was an electric one, yes, so a slow progression. but as i saw that documentary, they were talking about how it could, you know, write an essay and, you know, refer to particular authors and their concepts. so now i was just wondering, with project 2025, would it be possible for ai to analyze it and have students discuss what their future goals and
4:55 am
objectives are, and then see how it matches with project 2025 in terms of, you know, policy? oh, that's a provocation. i'd love to see that exploration happen. i would also say, one of the things we see sometimes with these ai systems is hallucination, so it's not even clear that the summaries produced about a text like project 2025 would be completely accurate. it really depends on the type of system we're assuming. and that could even be part of the assignment, right? because we've seen this so many times, where cognitive labor is assumed to be easily taken over by an ai system when it isn't. so we saw a lawyer, for example, using a chatbot for one of their cases, and it cited case law that did not exist, and they were actually sanctioned. right.
4:56 am
and so i think it's so important that we continue to have examples of where a.i. goes astray, so that we limit our expectations and also our trust. and you'll see many companies now, they'll say: we stand behind nothing that came out of this; you use it at your own risk. but as people continue to engage with it, generally, if you're not a subject matter expert in that particular topic, you're not going to know if what you're getting is actually what's in there. so i think even in an assignment like that, there would be another piece, which is: is what we're getting accurate? i was talking to the library of congress about this, and this will be ever more relevant, right? it is the library for congress, and they have congressional researchers, and one of the things they wanted to do was summarize legislation. there's a lot of legislation to read. but because they're the library of congress, they have to be right, right? it can't just be a
4:57 am
hallucination and so forth. so they actually found, in that particular instance, that it was taking them more time to fact-check the summary in the first place. so all of this is just fascinating to me, because we're still in the early days of how these systems are being developed and where their weaknesses are, but also where some of those weaknesses can give us insight into what it is to have expertise, what it is to have deep comprehension, what it is to sit with a text for not just seconds, right, but days, weeks, years, and so forth. i gave my eighth graders a project on project 2025 two years ago, and they looked at me like i was crazy, and i thought maybe if i had presented it with the computer, then they would, you know, because they check everything i say based on, you know, what wikipedia or somebody else had said, as opposed to what i had to say. but i'm hoping that that'll be the future with the a.i., that
4:58 am
they can understand that legislation and make it their thing. yeah. thank you. thank you. there is technology now called small language models that people are working on to make sure that ai is a little bit more effective and less prone to hallucination. so it is in the works, and that's something else that we do: avatar buddy is using small language models to make its responses more accurate. thank you, dr. joy, for a riveting conversation. thank you all for attending. thank you so much. i hope to see you at the book signing.
