tv [untitled] January 15, 2025 12:30pm-1:01pm AST
12:30 pm
2,000 homes remain under official warnings or at risk. Life-threatening, destructive and widespread winds are already here. For this significant wind event, we are taking an aggressive, lean-forward posture: staffing all available resources and strategically placing fire patrols and engines in the unimpacted high fire risk areas of the city. Well, tens of thousands of people fled or lost their homes this past week, so how are people attempting to recover? Rob Reynolds is in Pacific Palisades and spoke to one survivor. When your home is gone, what do you do next? How do you cope with the emotional trauma? Can you contemplate rebuilding? That's what Sarah Trepanier, an emergency room physician and mother of four, is going through right now. "I can't even deal with the grief of my community lost, and my friends dispersed, and my house gone. I have to just get to stability for our
12:31 pm
children and I. And so in my mind it's housing first: talk to the insurance and FEMA and see where we are." Even before the disaster, the rental housing market in Los Angeles was extremely tight. "Finding housing has been difficult because so many people are displaced from the Palisades. Like, 30,000 people have lost homes and are looking, so the need is so great." Many fire victims find it is too early to fully grapple with their emotional states. "You know, I'm an ER doctor, I did pediatric trauma, and I'm a Navy veteran. My coping skill is compartmentalizing, and so I have to wall away these things that are too much to bear, because it will stop my progress." It wasn't simply structures and material objects that were destroyed; it was lifetimes' worth of experiences. On top of all her trauma, Sarah Trepanier
12:32 pm
has another concern: that the memory of this place where she lived should not be erased. "This community was our happy place because it was such a warm, embracing group of people. Diverse people from all over, everywhere, coming together and working together and being kind and nice. And seeing the town destroyed, it means that it may never come back." One of the many coping with a bleak new reality and all that the fires took away. Rob Reynolds, Al Jazeera, Pacific Palisades, California. Of course, you can follow our coverage of the fires and much more on our website at aljazeera.com. The news continues here on Al Jazeera after The Stream. Bye for now.
12:33 pm
Welcome back to The Stream. Today we will be discussing artificial intelligence within the news industry and navigating its ethical implications. Stay tuned for an exciting discussion. I am Anelise Borges, and this is The Stream. Well, thank you, AI Anelise. Artificial intelligence is said to revolutionize our world and become faster and better than us at many things. Deepfakes and generative models have already had an
12:34 pm
explosive impact on journalism. The question is: can the news industry survive it? To help us answer that, and more, we have joining us today Joe Amditis, associate director at the Center for Cooperative Media at Montclair State University; Glen Mulcahy, an AI trainer, currently a consultant for several major media networks on AI; and Jad, a producer, filmmaker and researcher in digital humanities and societies. Thank you all so much for your time; thank you for being part of The Stream. I'm very, very intrigued by this conversation, obviously. And Glen, I'd like to start with you with an overview, if you don't mind, because you've been helping several major outlets navigate these uncharted AI waters. Can you tell us where things stand as we speak today? Is everyone experimenting, testing AI to a certain extent? How is that going?
12:35 pm
Well, I think we're in a really interesting situation. Among a lot of the big media organizations, at least, I think it's come down to a kind of polarization. A lot of staff, because of the accessibility of the tools on the market, are experimenting privately, while most of the big organizations I've dealt with are taking a much more pragmatic and cautious approach. So many are putting energy into developing a kind of AI ethics statement and a set of guidelines for staff, to try and make sure that they work within the parameters that are established as acceptable. So it's an interesting kind of juxtaposition. And it's also fair to say that as soon as you publish your AI ethics statement, there's a really good chance you're going to have to revisit it within just a couple of months, because AI is evolving so fast that it is very, very hard to stay on top of it. Hm. And we've been talking about how AI is moving really fast in pretty much all sectors. So, Joe, from what you've been seeing and analyzing, what specific new
12:36 pm
AI tool or innovation has caught your attention lately? Maybe give us a positive highlight, if you will, and something that is worrying you as well. Well, I appreciate the question, and I do think that is correct, that there are a lot of evolutions happening very quickly. I think the thing that people need to focus on, especially small to medium-sized organizations, is not even so much generative AI, but starting with basic automation: if-this-then-that sort of trigger-and-response automation. And I think it really comes down to this: the adoption of these tools will go best if the processes and operations of an organization are already solid. These tools are really never meant to be all-encompassing and integrated into everything that we do at an organization; they're much better served as a layer, as an additional feature or a tool. And I think that once these tools sort of start to fade into the background, the dust settles and the hype goes down a little bit, we'll start to see
12:37 pm
the evolution and the maturation of their use in organizations and companies in a much more balanced and reasonable way. One thing that's been sticking out to me is just, you know, simple tools that work only with the data that you give them. So, for instance, Google's NotebookLM flat out refuses to give any responses that are not based in the source material that you upload to it. And I think that is where you'll see a lot of growth and development and actually useful applications, instead of this sense that anything with the word AI in it is just now the new thing we have to do. Oh, very, very important points, and we're going to talk about some of them at length during the show. But Jad, I want to get your reaction here in terms of what you're observing right now. What aspects of AI are you starting to pay attention to, because in your opinion they may actually revolutionize the news industry?
12:38 pm
Definitely, AI has been transforming the way we produce news, and what is really important now is data analysis. I've seen many platforms, and many newsrooms, that have been using data analysis during major events, and I think it's transforming hours into minutes, and days into hours. So it's giving many newsrooms the tools to sift through vast amounts of data, and I think this is where it delivers. All right, AI is changing, yeah, it's already revolutionizing our sector, isn't it? It's taking out the tedious part of the work, I would say. But when we think about AI in news these days, at least what most people may actually imagine is deepfakes and misinformation. One journalist did his own deep dive into this topic. He said
12:39 pm
research shows that nearly 90 percent of people struggle to distinguish between a deepfake and a real video, with even AI detection tools failing to spot the difference on multiple occasions. Even the deepfake-detection AI that is starting to appear might not be perfect yet, but there have been quite a lot of very impressive advancements in this area. At least, Glen, how would you say news organizations should be integrating this technology, and its multiple aspects, without damaging that holy grail of a relationship, the relationship between the news organization and the viewer or the reader, which is of course trust? Yeah, trust is definitely paramount, I think, for every news organization at a time of potential mass disruption from deepfakes. I think trust is the one thing that we have to hold dearest above all else. I mean, there are lots of
12:40 pm
AI applications that don't necessarily appear on screen or go straight into the editorial output side of things, like transcription and translation tools, or captioning for accessibility. And there's a huge suite of research tools which you can use behind the scenes, so to speak. For instance, there are tools out there like PimEyes which, unlike normal reverse image search, will actually identify a face in a photograph and then track that face in every context on the internet. And tools like that are transforming journalism and research, open-source research in particular, massively behind the scenes; I mean, saving hours in the average day for a journalist. The key thing is to know the checks and balances, what's acceptable in the organization, before you hit publish. And then, ultimately, full disclosure: you say up front if you've used AI, so that the audience knows what to expect, and you can listen to the feedback, whether people appreciate that or whether they have concerns. That's a very interesting take. Joe, what is your view on this? Is it important for news outlets to be pulling back the curtain and showing people how
12:41 pm
they're going about uncovering, searching and tracing data? Is that paramount here in keeping trust? I think, generally speaking, yes, that is very important. But I think what a lot of folks hear or think of when they think of disclosure statements, at least from a cynical perspective, is that oftentimes the disclosure language on the website is serving more as a CYA, or an accountability sink, if you will. And I think if we're going to pull the curtain back, we absolutely should do that. We should, for instance, have full transparency on things like who writes the headlines, because oftentimes, especially for op-ed or opinion journalists, they're not the ones writing the headline. So I think there are many layers within journalism, and specifically not just the reporting but the publishing side, the back-office side of journalism, where we can in fact pull back the curtain. But I think if you're just putting in
12:42 pm
a disclosure statement at the bottom of an article that you publish, I think a lot of times the public can see that as a thing that reduces trust, in many ways, because studies show that when people know AI was involved in making something, trust and respect sort of go down. And so, if you are going to pull back that curtain, let's do it in all areas, or as many areas as possible. I don't think we want to get into a situation where half of the page on the article is just a breakdown of all the different filters, the different presets in Lightroom, the different prompts that you used to format the text, or which images, and so forth. I think it's very good to have an ethics and disclosure policy, but I think you have to go further than that and not just check the box, but actually go in and engage with your community and your audience and say: look, this is how these tools are used. They are just tools, but at the end of the day, as a journalist, the whole point is that what you put out into the public you have to be able to stand by and defend. And whether you get that information from a freelancer, a staff writer, a producer or one of these tools,
12:43 pm
I think it's important to be able to stand there and have the buck stop at the publication, so that regardless of what tools you use, people can trust that if it comes out from you, then you stand by it and you can defend it. Now, there is a major issue here that we've been discussing as of late when it comes to AI: the issue of bias. This is also something that we talk a lot about in media, obviously. I want to share this TED Talk by a computer science student, and then get you guys to react. Take a look. So when you and I think of bias, we're often thinking about human bias, which is when we are prejudiced towards a group of individuals or a system of beliefs. But bias in AI, even though it may propagate human biases, is completely different, because it is
12:44 pm
quantifiable: we can assign a number to it. And, more importantly, using this type of analysis we can mitigate algorithmic bias. And that's because algorithmic bias fundamentally results from imbalances in the data that are used to train artificial intelligence models. So here's an example, an oversimplified one, of a data distribution that we might use to train our system. There are peaks in this distribution, areas of high data, and areas of lower representation, places where we don't have a lot of data. When we're training models that involve human data, the data at the peaks of this distribution tends to come from, exactly as we just saw, white Western men, and the data that comes from the underrepresented regions of this type of dataset comes from women and people of color. Glen, the student there is obviously very, very bright, and I find it fascinating. The danger in any open-harvesting
12:45 pm
model is obviously copyright, but also the bias that is inherent to the sources the model is using to learn from. How are newsrooms and organizations currently addressing that? Well, it's an interesting one, actually, because, as I mentioned, at least two of the organizations I've been dealing with have decided to go for a closed learning model based on their own data only. So, as opposed to harvesting just random content and, you know, creating a massive large language model, what they're actually doing is training an internal model: using the technology, but feeding their own datasets in to customize it, so that they can stand over the quality of the data that has gone in. And the other thing I would say that I think is interesting is that you can actually, for instance, ask AI to identify potential bias in an article, and, intelligent though it may be, it will actually flag red flags where it can detect them. So it's an interesting one, to turn it back on itself: it is intelligent, but it's not so smart that it could lie. And you also mentioned, when we chatted before the show,
12:46 pm
you also mentioned a very interesting example where you asked a question about a very contentious issue, about Gaza, to a model, and you asked it in Arabic and in English and you got very different answers. Can you share that story, that example, with us? Yeah, absolutely. I mean, this is a classic case study, again, of the nature of the data inputs that are going in, and whether language alone can actually have a kind of in-built bias in it. So there was a question asked about a statement in relation to Gaza, and the English-language response came back with a very pro-Western, pro-American positioning in its answers, and yet the Arabic answer came back very, very pro-Palestinian. And in the classroom in particular, I think it actually left most people quite shocked that it could be so radically different, and so stark, without any nuance. Even the phraseology in the English-language response would cause concern because of the nature of the bias in it. And so, a very
12:47 pm
real example of how bias can, you know, creep into that kind of newsroom environment. And Joe, your take here on the biases of these models that we're seeing emerge, and how newsrooms should work to mitigate these biases and ensure fair representation? That's a great example, and I was going to use a similar one. But I take issue even with the concept of news organizations training on their own data, as if that's some kind of safeguard against bias. We know very well that media outlets have their own biases embedded in their coverage, whether it's source diversity, and the audits that show that they overwhelmingly interview white men, straight men, Western men, or a bias towards things like institutional legitimacy, a sort of Chevron deference, if you will, towards institutions, especially the intelligence communities or the allies or partners of US imperialism. And so whether those biases are coming from the training models
12:48 pm
and the training data themselves, or they're coming from the news organizations: again, I think it comes back to this idea that we need to hold ourselves accountable, and we need to investigate internally our own institutional, organizational and individual biases, and make sure that when we put something out, whether it is something we've derived from these bots or something that's coming from inside the house, so to speak, we are putting in those kinds of checks and involving many, many voices and perspectives on these issues, so that they are reflected in the output that we have. I mean, just this panel alone, there are two white dudes on it out of four. I was just going to say that, and, you know, we tend to be a lot more inclusive here at The Stream, I must say. I know, I didn't mean that as a call-out to you; we are definitely very mindful of all of that, especially with gender representation. But it turns out that you guys had some pretty interesting things to say, so we were forced to invite you. And I want to ask
12:49 pm
about some other ethical implications, as we're kind of dissecting this here, because we keep talking about the impact on our democratic models. I mean, with newsrooms already adopting AI models to predict and analyze data during elections, the dangers of AI and deepfakes for democracies cannot be understated; the 2024 London mayoral election was an example of that. Al Jazeera recently released a documentary, The Fake Mirror, that addresses this issue. Take a look. Disinformation can be created in just minutes. The barriers to entry are lower and the spread is faster. From 2016, the rush to the referendum and Facebook, to the US elections,
12:50 pm
few bodies were able to catch it, or knew what they could do to fix it. "It's important that you save your vote for the November election." That sounds like a plea from Joe Biden, except that it isn't him. I have to try and reach out to Joe. So that was obviously around the episode where they cloned the mayor's voice, I don't even know if that is a term, but let's run with it for the purposes of this conversation. They put out statements that sounded a lot like the mayor, but it wasn't the mayor. And, to be honest, I was expecting a lot of that kind of situation, similar situations, to come out of the US elections, and at least from here we didn't see that much. Why would you say deepfakes were not that popular?
12:51 pm
I would say, in this particular race for the White House at least, in one part it's because it's just not necessary to mislead people, and in most cases the technology is, as the documentary pointed out, much easier to access and use. I mean, these things are not new; they're not novel issues. We've been dealing with photo and video manipulation, and audio fakes and clones, yes, that is the right word, the right parlance these days. But in the past, it took years, or months at least, of training and practice and education on how to use the various tools that allow you to manipulate these pieces of media. Now it takes a couple of dollars, a phone, some spare time, and maybe some malicious, or, you know, various intent. But ultimately the reality is that it's just not necessary to mislead people. You don't need to come up with these elaborate schemes to tell people lies; you can just say them. And in these cases, especially in
12:52 pm
an environment where people are not looking for the truth, they're looking to confirm their own biases anyway, all you have to do is give them that red meat they're looking for in the first place, and they'll run with it. It's just not necessary. I was going to say: especially in a race where Donald Trump is running, I mean, there's so much there. All right, and I'm talking about the president-elect of the United States. Looking ahead, I'd like to take a moment now, as we're nearing the end of the show, to talk about how we go about experimenting with these tools in a controlled fashion. Some say, for example, that AI should be regulated. Take a look. Ninety-one percent of people agree that we should carefully manage AI, so let's make that happen. We need to come together to build a global organization, something like an International Agency for AI, that is global, non-profit and neutral. It is critical that we have both governance and research as
12:53 pm
part of it. It's a very big ask, but I'm pretty confident that we can get there, because I think we actually have global support for this. Our future depends on it. Joe was smiling at this statement, but before I go to Joe, I want to have Jad's take on this, on the future. Should there be a body that oversees the usage of AI, also in news, in your opinion? I think there is a need to put in place some strategy for what would be the right use of AI, and for the ethics of using AI. I would definitely agree with Joe on what he said about the elections in the United States; I think AI has been used before elections. But I think now states are, unlike in the old days, spending money on protecting audiences against the dissemination of disinformation. At the same time, there are organizations using big models, integrating ChatGPT, to give them
12:54 pm
a few points of view. I think this also aligns with what Glen has mentioned before regarding the difference between ChatGPT in English and in Arabic; but also, in English, if you ask ChatGPT something, it can give you a different point of view from what you would get from another organization. So I think there is a huge risk of using AI to manipulate audiences, and such huge potential, if there isn't something in place, you know. I think it's very scary. I mean, Joe, I believe it was you who quoted Stephen Fry in our chat before the show; he also spoke about this $100 billion plan with a 70 percent risk of killing us all. Should we be optimistic about the future of AI, particularly in news? I think, ultimately, the future of any technology, especially in the current climate, is tied to the future of capitalism and the way that these
12:55 pm
markets operate. I think there wouldn't be nearly as many complaints about copyright and theft of content if these tools were created with public value and public benefit in mind. But they're not; they're created to amass and extract surplus value for a small group of capitalists. And with regard to any kind of regulation, I completely agree that we need it. I think there should be consumer and creative protections; I don't think it should be a labyrinthine maze to go turn off AI training and scraping on every social media platform, retroactively. And I think the regulatory challenge, of course, especially at a global scale, is going to be real. I mean, we've had mass consensus that Palestinian statehood should be a thing, but there are a few detractors that prevent it from going forward. So global or mass appeal does not necessarily result in global or mass enforcement. So we need consumer protections, we need creative protections, and we need to wrest back control of the value of these tools, and to localize the externalities
12:56 pm
and the impacts of them, to the people who are creating them. And we need to reorient the way that we develop technology to benefit the members of the public and the people who create the value, and not just allow it to be hoarded and sequestered by a small group of international capitalists, which is what is happening right now. So no, we should not be optimistic. I mean... All right, Glen, where do you see this going? What do you think is going to happen with AI and journalism? Well, if I can circle back to two conversations we've had already: the first one is about how, by perception, AI didn't seem to play as much of a role in the US election, perhaps, as the expectations. Certainly that would have been my initial take, but I think the one thing that may have factored in a lot is that there was a huge amount of closed community groups, Telegram particularly, being used by the Republican Party to reach their voter base, that wouldn't necessarily have been directly within the public line of sight. And we
12:57 pm
don't know, because we're not in there, whether there was a lot of Cambridge Analytica style messaging being pushed to be able to trigger them. That's the first thing to factor in: we don't know whether this was the AI election we anticipated or not, because not all of it was in the public domain. The second thing is the outcome of the last election. With Trump taking power again, it's basically going to give carte blanche to AI in the US, with Elon Musk effectively, by the looks of it at least, taking up a government position. I fully expect that he will lobby to try and deregulate as much as is even possible around AI developments, and there are a lot of very big, vested, multi-billion-dollar companies in the US who will benefit from that. To come to your question, I'm a little bit worried. As you said at the very top of the show, Anelise, the idea of trust is probably our only USP going forward, at a time when people's behaviors are changing. People are moving away from traditional media to get their news, sometimes looking to influencers who, with the best of respect, can sometimes be very easily manipulated if there's enough coin in the deal. And I think that idea of being
12:58 pm
a single point where you can go and get verified information that is factually accurate, you know, plus the accountability as well: I think that will be our USP going forward, and hopefully we can stick to that and continue to produce quality journalism in the meantime. Thank you so, so much, Glen, Joe and Jad, for your time; thank you for being part of The Stream today. And thank you all for tuning in. If you have comments about our show, you can talk to us on social media; use the hashtag or the handle AJStream and we'll look into your feedback and suggestions. Take care, and I'll see you soon. The border between Venezuela and Colombia has become a stomping ground for trespassers, as desperate people risk an illegal passage to feed an emerging fuel trafficking market. We follow their
12:59 pm
perilous journey, undaunted, through the line of fire. Risking It All: Venezuela and Colombia, on Al Jazeera. The headline is designed to inflame and to put up defences. The way that this story is being told is not right, and it's not accurate. From social networks to legacy media, The Listening Post dissects the stories behind the headlines, on Al Jazeera. After a stunning political come-
1:00 pm
back, and a decisive election win, Donald Trump is heading back to the White House for a second time. Join us for an in-depth look at Trump's policies and how they'll affect Americans and the world: Trump's Second Term, on Al Jazeera. Hello and welcome. This is your news live from Doha. Coming up in the next 60 minutes: reports of progress in Gaza ceasefire talks, even as Israeli attacks continue throughout the night and into the early hours of Wednesday.