tv [untitled] March 25, 2025 7:30pm-8:00pm AST
7:30 pm
to eliminate the use of force in the Black Sea and to prevent the use of commercial vessels for military purposes, and also to develop measures to implement the ban on strikes on energy facilities. Now, that was something that came out of previous negotiations, and there were questions about whether it was actually being adhered to or implemented. It also says all parties will work towards achieving a durable and lasting peace, and that the United States and Ukraine will work together to achieve the exchange of prisoners of war, the release of civilian detainees, and the return of forcibly transferred children. Now, going into the initial discussions and negotiations at the start of this month, Ukraine had proposed an air and sea ceasefire. They didn't get that. What the United States wanted was a complete ceasefire along different lines, and Russia hadn't agreed to that. Following the phone call between Donald Trump and Vladimir Putin, there was a proposed ceasefire, a halt
7:31 pm
a halt to attacks on energy facilities, though again, questions remained over whether that had been implemented. Now, what we have here is pretty much what Ukraine had suggested in the initial discussions, a halt to attacks in the Black Sea, but not a complete ceasefire on strikes. So we have a partial agreement, but also a commitment going forward that there will be more discussions, and that all sides will work together, as they said, to achieve a durable and lasting peace. The questions now are whether this will be implemented, how it will be implemented, and when these future negotiations to discuss the technical aspects of the agreement will take place. Now finally this bulletin: an AFP photographer is one of seven journalists who have been arrested in Turkey while covering the protests against the arrest of Istanbul's mayor,
7:32 pm
Ekrem İmamoğlu. These are live pictures of the continuing protests that have taken place in many places in Turkey; more than 1,400 protesters have been arrested in the past six days. İmamoğlu has been charged with corruption. He denies the accusations, and supporters say the charges are politically motivated, aimed at his presidential ambitions. President Recep Tayyip Erdoğan has condemned the rallies. Well, that's it from me, Elizabeth Puranam, for this half of the news hour. Stay with us on Al Jazeera: coming up next, Studio B, the AI Series, examining the impact of today's headlines and setting the agenda for tomorrow's discussions, with international filmmakers and world-class journalists. This
7:33 pm
is an opportunity to consider what our collective values bring: programs to inspire you, on Al Jazeera. Artificial intelligence. AI is already transforming our world and is expected to bring some of the most profound changes in human history. But while AI is being created in Silicon Valley, the workers that make it possible are largely in the Global South, raising concerns over who gains the benefits and who bears the costs. My name is Maria Ressa, and I'm a journalist from the Philippines. Through our investigations, I became the target of a harassment and disinformation campaign, receiving thousands of death threats on
7:34 pm
my accounts. I received the Nobel Prize in 2021 in acknowledgement of how difficult it is for journalists around the world to do our jobs today. I saw first-hand the dangers and the threat to democracy: the design of the systems of social media prioritizes the spread of lies laced with anger and hate. In this special series of Studio B on artificial intelligence, I'll be meeting some of the brightest minds working in the field today. My guest this week is Urvashi Aneja, director of a groundbreaking research collective based in India, which studies how technology is impacting economies and communities across the Global South. Is Big Tech taking the place of the old colonial empires? How do algorithms reflect bias and discrimination? And is AI a powerful tool to fight climate change, or does it contribute to it?
7:35 pm
Urvashi, you're based in Goa, India, where you founded Digital Futures Lab, which looks at the impact of AI on societies in countries of the Global South. This technology is created in Silicon Valley, a different place with different cultures; its coded bias is in there. How does all of this play out in India? At the most basic level, these technologies, or what we're calling AI, are designed without a sense of what the social and political context in these countries looks like. To take a very simple example: there's a lot of concern that AI is going to take away our jobs, that it's going to automate our jobs. But if you look at countries in the Global South,
7:36 pm
if you look at a country like India, only 20 percent of the population is actually in formal employment. So the concerns about large-scale job displacement are not quite the concerns that we have in the Global South, or in a country like India. Let me just make sure: 20 percent have formal jobs in formal sectors? That's right. Okay, continue. You know, we talk about virtual assistants or self-driving cars, but in the places where I live, people don't have access to drinking water or to health care. So there's a huge, huge disconnect; many people are not even on the internet in countries like India. Now, in many parts of the Global South, the use of AI is being positioned as a way to address very complex developmental problems. So it's being used in very critical social sectors: health care, education, welfare, and so on. But at the same time, many of these countries are young democracies; we don't have the institutional systems set up to be able to regulate the use of
7:37 pm
these technologies. The platforms, the companies that are building these technologies, have also not invested adequately in ensuring that they do not produce harm in these contexts. To give you a very simple example, take content moderation on Facebook. The bulk of Facebook's users live in the Global South, but a tiny fraction of Facebook's budget is actually spent on content moderation in those countries. So there's a complete disregard for what the impacts of these technologies might be in those areas, and it's the combination of all these factors that I think makes the use of AI in the Global South particularly worrying and risky. There's so much to pick up from what you said. In the West, in the Global North, if you are a woman, if you're Black, brown, LGBTQ+, you're further marginalized when you walk into the social media platforms, right? That's built in. How does that get compounded when you come from the Global South?
7:38 pm
Yeah. It's pretty well understood now that AI systems are only as good as the data on which they're trained. Right, back to data. And in a country like India, more than half the population is still not online, so these people are digitally uncounted in some sense, but AI systems are still impacting them. The chances of bias and exclusion are very, very high. At the same time, AI, or machine learning, is a statistical technology: it predicts the future based on the past. And what that means is that even the data that does exist already reflects historical patterns of injustice and discrimination, against women, against certain religions, against people of certain castes, and so on. So AI systems reproduce that, and it becomes harder to change, because it renders that decision-making invisible as well. So one of the solutions, the way that we talk about the solution, is that we need to build more inclusive AI systems. What's the incentive, though,
7:39 pm
for these companies, given that, you know, they're driven by a profit motive? Essentially, there are very few, and I think that's a huge problem. Sure, we can build AI systems that recognize people of color better, we can build AI systems that recognize women better, and that is sort of the solution on offer. But the problem is that it distracts from the more basic question: do we want these systems at all? Take the example of facial recognition technology. Yes, we can make it more inclusive, but do we want facial recognition technology in the first place? Do we want to be using it for credit scoring? Do we want to be using it for job applications? Do we want to be using AI to decide whether someone gets parole or not? No, we do not want to be using AI for those kinds of very critical decisions. So the question then is not how we make this more inclusive, but whether we want this at all. Let me ask you about the other items on the agenda today with AI in particular. There's a toll on the environment, and there's
7:40 pm
a toll on the content moderators, who are actually coming from our countries. Tell me about both of those. Right, let me start with the labor issue. AI systems, or machine learning, only work because you have people who are labeling data sets: they have to label this as a glass, this as a chair, this as a table, this as a phone, this as one phone on top of another phone. It's a hugely laborious activity, and all of that work of labeling machine learning data sets happens in the Global South. It happens for a pittance: workers are paid a dollar an hour, sometimes even less. So the reason we're able to see so much progress with machine learning is that it is predicated on the exploitation of labor somewhere else. Very often we think about the exploitation of labor as an outcome of AI systems, but I don't think it's an outcome; it's an enabling condition. You have people who are moderating this content, looking at image after image after image of
7:41 pm
very graphic content. It harms their mental health, but that's the invisible labor that allows the so-called progress. And I think that's why we get these comparisons to colonialism: we see the maximum value creation captured by a very small number of actors sitting in the industrialized North, because of the availability of cheap labor in the Global South. India for many years was the top BPO, business process outsourcing, center of the world; the Philippines took over several years ago, and in our country, I know that this is domestically one of the top ways that we actually earn revenue. But the people who are doing content moderation, what are they learning? How can we get our power back? Is there a way? If we want to shift the balance of power, we really have to empower workers better. There's often this sense in technology policy that the technology is moving so fast that policy can't catch up. Right,
7:42 pm
but it's only moving so fast because of exploitation somewhere else. This is not necessarily the pace it has to move at, and not necessarily the direction it has to go in. So I think if we actually started paying people what their work is worth, we would have a very, very different innovation ecosystem. And when it comes to something like generative AI, the problem becomes even worse. Something like ChatGPT has no way of knowing whether something it produces is factually correct or not. Right, it lies. It gives you lies. Well, it generates; that's what something like ChatGPT is. You need to have a human evaluating those responses: is this the right response, is that the right response, and then rating those responses. And we need higher and higher skills to be able to do that as we start going into particular domains, whether it's health care or legal advice. So what we're talking about then is not only low-skilled workers
7:43 pm
in the Global South doing the menial work of labeling tasks, but also skilled workers in various parts of the world whose job now is to train the AI, to provide feedback to the AI. On my flight here I was watching a film, and there was a scene in that film with a children's art class going on. But there are no kids running around, there's no paint on the walls; it's clean, it's pristine. And what are the kids doing? The kids are giving feedback to the AI that is making the art. So is that the future that we want, where the kinds of jobs we do amount to constantly verifying whether ChatGPT is producing the truth or not? Let me bring in one other aspect, because geopolitical power is using information operations and information warfare. In the Kenyan elections, we did
7:44 pm
a study to look at how social media played into the Kenyan elections, and what we found were troll armies that were insidiously manipulating them, and they came from the Philippines and India. So these people are getting paid to be trolls. Does this worry you? I mean, of course it's worrying, but I think in any war we should not focus on the foot soldiers; we need to focus on who is in power and what is driving that war in the first place. And I think there are two things here that are important. One is that I think we focus too much on the content, the content on social media platforms, instead of on why the content spreads. So we need to focus on the recommendation systems on the social media platforms, the AI-driven recommendation systems that are optimized for virality, for profit. For profit, exactly. Exactly. And the focus on content
7:45 pm
has actually allowed social media companies to use the argument of free speech as a way of doing nothing to regulate content on their platforms. So I think we need to focus on the recommendation engines and what they're optimized for, and create safer recommendation systems. At the same time, I think the information warfare problem has become so large that it's an industry. The data broker industry is worth billions and billions of dollars, and these data brokers sell individuals' data to companies that have been set up to manipulate us. If we're really trying to address this problem at its root cause, I think we have to regulate our data economy. We made a really bad bargain a decade and a half ago when we said that we were okay with giving up our data to get personalized services, and now we're paying the price for it.
7:46 pm
So you have a whole global economy that is based on the collection and monetization of personal data, and unless we address that, we don't address the misinformation, disinformation and information warfare problem. And of course, building media literacy and building traditional media institutions, we have to invest in that as part of the solution. I want to go back and bring you to the environment. What is the environmental cost of these large language models? Huge. If you and I were to have this same conversation via ChatGPT for just a few minutes, it would be the equivalent of pouring out a couple of bottles of water, just wasting it. To give you another example: the amount of electricity that AI systems consume is likely, by 2027 if we continue on the same trajectory, to match that of a country the size of the Netherlands. These companies say that they're getting better at not harming the environment. Do you believe them?
7:47 pm
No, I don't. I don't believe it. I think the issue is also that the people who are most impacted by the degradation of our environment, in the Global South, are extremely vulnerable, and they are not the ones benefiting from the gains of the technology. Where the benefits lie and where the harms lie are very far apart, and that's what's worrying: we're betting our entire future on a technology that is fundamentally unsustainable, bad for the environment, bad for people, bad for the planet. ChatGPT, for example: they expected the downloads to be about a hundred thousand in the first month, and it went to a hundred million. So now it is out in the wild, and we are all training it. Solutions, then, for people watching. What do we do? What can we do? I think we do have options. To me, the biggest worry with regard to AI is the concentration of power in a handful of technology companies.
7:48 pm
Without the requisite systems of public accountability, our markets, our political systems, how we engage with media, how we behave socially, all of that is being restructured at the whim of a few companies, and that's because they have this concentration of market power. So I think one of the things that we need to do is actually use the tools we have in competition policy to address the market monopolies. There's nothing in the Global South right now that actually does this, right? Our countries are working separately; it is the EU leading the way, it is still the West, still the Global North. Our institutions are weaker, often more corrupt. Yeah, exactly. And I think that in many parts of the Global South there is this narrative that we need to catch up, that we missed the boat on previous industrial revolutions and we can't miss the boat on this one. And so, rather than trying to rein in the technology, rather than trying to regulate the technology, we want to emulate Big Tech.
7:49 pm
We want to create domestic champions, and that does nothing to address the harms, in terms of oversight, in terms of accountability. So I think competition policy is key, but procurement policy and legislation are also key. Fantastic. We go to the questions, to your questions. Wow, hands up right away; we like you guys. My name is Chung, and I'm from China. For most parts of the Global South, education is a privilege that a lot of people don't have. So as journalists and people who study AI, how can we educate them about AI, something that impacts them the most? I think that's a really, really important question. If I look around globally at where the biggest moments of change have come from, they have come from the bottom up, from people coming together and pushing back against power. And that can only happen if people actually understand what the technologies are and what their implications are. I think
7:50 pm
civic education is really important. I think we need to do it in schools, and I think the media has a huge role to play in it, though even the media feeds the hype; these kinds of utopian-versus-dystopian narratives don't help us get any grounded understanding of what's happening. I think the research community has a huge role to play in actually building the evidence that we need to show what is really happening in our societies. So I think it's key that we invest more in public education around AI. Next question. Hi, my name is Aditya, and I'm from Bangalore. You spoke briefly about the Global South essentially playing catch-up when it comes to technology. How can it balance playing catch-up while still making sure that the marginalized communities in the Global South that need it the most don't get left behind? I think catching up is not the right frame through which technology policy
7:51 pm
needs to operate in the South. It's not clear to me what we're catching up to: are we catching up to Big Tech, or to the kind of business model that AI is right now? That is a fundamentally exploitative business model; it depends on the extraction and monetization of data, and we want less of that, not more of it. The other reason why catching up is really hard, with the kind of centralizing technology that we see right now, is simply the cost of building and maintaining it. Something like ChatGPT costs OpenAI around $700,000 a day to actually run. The kind of computational infrastructure required to maintain these systems is immense; Google apparently spends more on computational infrastructure than it does on people. That being said, I think many parts of the Global South are in a pre-AI stage; we are at a point of digitalization, so we have an opportunity to actually create intentional,
7:52 pm
curated, purposeful data sets that address the specific problems that face us. So rather than catching up, what we need is a different paradigm in the Global South, one focused on addressing specific problems and on building the data sets to address those problems, rather than the current model, which is to collect as much data as you can and hope for the best. Just to build off of that, I think one of the key things we need to do is look at how the West is the one that actually has legislation out the door. How do you stop the impunity? How do you deal with insidious manipulation walking into elections? One in three people will be voting in 2024. So this is where accountability comes in, and I think you cannot begin to build alternate systems until we say: this is illegal. Yes, the woman in blue. Hello, I'm Libby from Manchester. So we know that AI learns about us by taking all our data, and it manipulates us using
7:53 pm
it in ways that we just don't know, but we don't have a choice other than to be in these digital spaces now. So is there a way of making friends with AI? I know we're not going to be besties straight away, but is there a way to at least build some kind of basic trust? AI is not one thing. AI is an umbrella term for a lot of different kinds of computational techniques that try to make machines work more like humans. So it isn't an object that we can befriend, and it also isn't an object that we can so cleanly regulate. AI is, in some sense, a marketing term that packages, or re-packages, business models based on commercial surveillance, on surveillance capitalism, and presents them as innovation, as something that is pushing the frontiers of science, that is going to have
7:54 pm
human-like intelligence and is going to help all of us address very complex problems. So I think we need to break away from that hype around AI and be very, very clear on what it is and what it's not, and in that, I think, we find the answer to what we can and cannot befriend. At the same time, I also don't think we are without options; I don't think we're at a point of no turning back. Just very recently, for example, regulation in California allows people to opt out of data systems, to have a blanket opt-out of uses of their data that they do not consent to. That's a huge victory for civil society. So we have more choices and more options than we think, and we have to find those spaces and grow them, and not give in just yet to Big Tech's kind of AI, an AI that is going to be part of our lives as our companion or boss. You're more
7:55 pm
optimistic than I am. Let's just talk about what happens on Facebook, on social media. Machine learning comes in and takes every post that you've made: if you've used the dating feature, if you've used the marketplace, if you've listed relationships, all of that is data. It builds a model of you that knows you better than you know yourself; it's a clone of you. Did they ask for our permission to be cloned? No. Then AI comes in, takes all of our clones and puts them in a mother-lode database that is used to micro-target your weakest moment to a message, and that's sold to a company or a country. That insidious manipulation is our bargain with the social media companies. So, you know, I feel like we need to stop the impunity; that's the first step. But I've become more radical, and, you know, Facebook and Rappler are partners; we're partners with every tech company, including OpenAI, but with eyes wide open.
7:56 pm
A friend of mine has the last question, here. Hi, my name is Hattie, and I used to be a journalist based in China. I'm wondering, is there such a thing as an AI footprint, like a carbon footprint? And how can we, as consumers or as individuals, as voters, make more ethical choices about our use of AI? I think there's a lot of potential for us as consumers to vote with our feet on what kind of AI we want in our lives and what we don't want. But we can only do that if there's transparency from the companies on what the product actually is: what data was it trained on, has it been tested, what are the error rates, is there any cautionary advice about where we should use it? If we don't have that information, it's very hard for us to actually take action. I think the analogy to food is quite interesting, and I'm not the first one to make it. Maybe a decade or two ago, you were happily smoking cigarettes,
7:57 pm
drinking sugary drinks, eating chips, and we did not think this was bad for our health. At some point there was a realization that this is not just about individual people having health crises; this is a public health crisis. And so the food that we eat needs to be tested; there need to be labels on the food that tell you the composition of the food, the fat content, don't eat this if you have an allergy, and so on and so forth. We need that same kind of documentation and transparency when it comes to the use of an AI system. Once we have that information, it becomes possible for us to vote with our feet; without it, we're groping in the dark. If we follow what happened after the Second World War, humanity saw an existential moment and came together with the Universal Declaration of Human Rights; so many things were formed during that time. This is one of those transformative moments, and hopefully every person in this room, every person watching,
7:58 pm
will demand better, because this is still within our control now, and it won't be for very long. Urvashi Aneja from Digital Futures Lab in Goa, thank you so much, and thank you for joining me on the Studio B AI series in discussing the defining issues of our time. We are the subjects of AI; we are not the users of AI. Going beyond the artificial intelligence hype and Big Tech propaganda to explore how to build safer, ethical AI. The racial biases get worse. Those in power are the ones with the resources to decide where it's used. How do we change this paradigm? Studio B: the AI Series, on Al Jazeera.
7:59 pm
As soon as somebody steps out of the home, the animals come. We can't grow anything; the animals eat everything. Government officials and agriculture authorities say monkeys and peacocks are damaging crops. With millions of coconuts and vast swathes of paddy fields being destroyed, it's clear that crop damage is a serious issue; some blame human activity. Can the US and Russia end the war in Ukraine without Europe, or even Ukraine? Will President Trump's shake-up of the US government help or hurt the country? How did Israel's war on Gaza become one of the defining moments of the 21st century? A quizzical look at US politics: The Bottom Line.
8:00 pm
Leaked messages on US war plans spark heated debate in the US Senate here in Washington. Hello, I'm Elizabeth Puranam, and this is Al Jazeera, live from Doha. Also coming up: Gaza's refugee camps, hospitals and shelters are under attack once again; at least 62 Palestinians have been killed since Monday.
