The Context BBC News September 19, 2024 8:30pm-9:01pm BST
8:30 pm
hello, i'm christian fraser. you're watching the context on bbc news. it is time for ai decoded. welcome to the programme. freely available, largely unregulated, the creative tools of generative ai are now amplifying the threat of disinformation. how do we tackle it? what can we trust? and how are our enemies using it to undermine our elections and our freedoms? this week governor gavin newsom signed a bill in california that makes it illegal to create and publish deepfakes related to the upcoming election. and from next year the social media giants will be required
8:31 pm
to identify and remove any deceptive material. it is the first state in the nation to pass such legislation. is it the new benchmark? some of this stuff is obviously fake, some of it designed to poke fun. but look how these ai memes of cats and ducks powered the pet-eating rumour mill in america, with dangerous consequences. it is a problem too in china. how does the communist party retain social order in a world where the message can be manipulated? beijing is pushing for all ai to be watermarked, and is putting the onus on the creators. and from politics to branding: there's no bigger brand than taylor swift, hijacked by the former president, who shared fake images of her fans endorsing him. it affects us all. with me as ever in the studio, our regular commentator on ai, stephanie hare. and from washington our good
8:32 pm
friend miles taylor, who worked in national security advising the former trump administration. we will talk to them both in a second. but before we do, we are going to show you a short film. one of the many false claims that has appeared online in recent months was a story that kamala harris had been involved in a hit-and-run accident in 2011. that story was created by a russian troll farm, and was one of the many inflammatory stories microsoft intercepted. the threat analysis unit that does that work in new york is at the very forefront of defending all our elections. our ai correspondent, marc cieslak, has been to see it. times square, new york city: an unlikely location for a secure facility which monitors attempts by foreign governments to destabilise democracy. it is, however, home to mtac, the microsoft threat analysis centre. its job is to detect, assess and disrupt cyber-enabled influence threats to democracies worldwide. the work that's carried out here is extremely sensitive. we're the very first people that have been permitted to film inside. it's also the first
8:33 pm
time russian, iranian and chinese attempts to influence a us election have all been detected at once. all three are in play, and this is the first cycle where we've had all three that we can definitely point to. individuals from this organisation serve on a special presidential committee in the kremlin. reports compiled by these analysts advise governments like the uk and us, as well as private companies, on digital threats. this team has noticed that the dramatic nature of the us election is complicating attempts at outside interference. the biggest impact of the switch from president biden to vice president harris has been that it's really thrown the russians so far off their game. they really focussed on biden as somebody they needed to remove from office to get what they wanted in ukraine. russian efforts have now pivoted to undermining the harris campaign via a series of fake videos designed to provoke controversy. these analysts were instrumental in detecting
8:34 pm
iranian election influence activity via a series of bogus websites. the fbi is now investigating this, as well as iranian hacking of the trump campaign. we found in the source code for these websites that what they were doing was using ai to rewrite content from a real place and using that for the bulk of their website. and then occasionally they would write real articles when there was a very specific political point they were trying to make. the third major player in this election interference is china, using fake social media accounts to provoke a reaction in the us public. experts are unconvinced these campaigns affect which way people actually vote, but they worry they are successful in increasing hostility on social media. marc cieslak, bbc news. that gives you an idea of how quickly this is advancing. do you think we're almost at the point where we are going to be very
8:35 pm
close, very soon, to not knowing the difference between fact and fiction? it's getting harder to detect the deepfake imagery and audio. i think we are right now possibly in the last us election where it is easy to see when you're being manipulated. the trick is, do you want to believe it? what this is all about is hijacking your emotions. watermarks? that is often the go-to solution. why would it not be the answer to all the ills of generative ai? i wonder if there would be ways of manipulating even that, but it would probably be a good start. you feel like you are playing whack-a-mole with these technologies. we would probably start with watermarks and then there would be an advance and a kickback and we would have to advance again. it's about preparing citizens to have the critical media skills that we all need to be able to
8:36 pm
construct narratives, look at who is giving us information, and ask: does it check out against reality? i was saying to stephanie, what is happening in california is a good step forward, the governor putting the onus on the social media companies and the creative companies to do something about this, particularly around the election. then stephanie said to me, american companies are regulated by american legislators, so why wouldn't they just go to china? i think that is one of the concerns always with tech regulation. you remember the debate over encryption in the united states. there was the san bernardino terrorist attack, almost ten years ago, when the fbi could not get into the shooter's phone. it led to a big debate in the states about these encrypted apps like telegram and signal and whether it should be legislated that those were forbidden in the united states. opponents of
8:37 pm
those laws said, you can outlaw them here, but someone overseas is going to create the same apps and it's going to be really difficult to prevent people from using a version of it overseas. we face the same problem with regulations about deepfakes and ai. they can only be enforced as far as us legislation and law enforcement can reach. there is a big challenge here. there is a domestic challenge about the first amendment and free-speech implications. gavin newsom signing that law has opened up that debate as well. there will be a lot of contention in the next four years about how to get this right from a legislative and regulatory standpoint. we talk about protecting children all the time; one of the issues that companies come up against is finding the material and getting rid of it. if what you have to find is very good deepfake material, that process becomes much more difficult. how do we find a metric to hold
8:38 pm
the social media companies to task? i think stephanie said something really important, which was that trying to stop all these things, if you think that watermarking, putting a sticker on this content saying this is fake, is a solution, it is going to be hard to keep up. the experts i talk to say that maybe that is a short-term solution, but in the longer term you have to re-architect what is real and not real, to your earlier point. what do i mean by that? there is a word i want listeners to remember and it is provenance. there is a big discussion in the technology community about making sure by default, when you do something like capture a picture on your phone, that it is cryptographically signed to say: i was taken at this place, at this time, and that cannot be changed. tied to a public ledger, other people can see
8:39 pm
your photos with a cryptographic signature that cannot be broken. eventually all our tech will have that provenance which says: i am real. you will know something is not real because it will not have that point-of-creation certification. in the meantime, a lot of difficult conversations are going to be had. it's also a supply-chain approach: when you have a chain of evidence you have to follow it all the way through, and you can't tamper with it. people many years ago wanted to know with mad cow disease where the meat came from, and people knew they needed traceability through the food chain. i wonder if there is a parallel to help people understand: all the things you are creating can have that encoded so you would always be able to know. like a painting: when a painting is sold it might go through many different hands before it ends up in a museum. where did it come from? was it legally bought,
8:40 pm
etc? you should be able to follow data in the same way. here in the studio with us is a senior research associate in machine learning at oxford university, researching how to identify some of these deepfakes using ai. welcome to the programme. we were just talking about how quickly things are advancing; to the naked eye it is becoming more difficult to tell, certainly with imagery. what technology are you developing to make that easier? i think the solution to our problems of establishing provenance of content will involve a lot of research but also adopting existing technologies. in terms of research, the clip brought home that ai is being used
to amplify disinformation. let's
8:41 pm
use ai to solve it. i research how to protect information with ai. you are using it to track down the deepfakes? we did some research with bbc verify: when you have a picture, explain whether it is a deepfake or not. let's look at an example. this is the pope in a jacket. it got into some news streams and so it did deceive some people. what did you do with this?
you can see the pope; from the context it's clear that it is a deepfake, and it is for entertainment. an example of an explanation: the crucifix
8:42 pm
doesn't attach to the chain. you can see it is very important to have these explanations as well: not just the percentage likelihood of a deepfake, but why it is one. we have these tools that can create these explanations. something that you could run a photograph through? potentially, yes, but these tools still have a lot of failures. where do they fail and why? they can't get things right, so you get six fingers on a hand.
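The kind of tool described here, one that reports not just a score but the reasons behind it, can be sketched as a toy rule-based checker. Everything below is invented for illustration: the named tells, the observation keys and the scoring weights are hypothetical, and real detectors learn such features from data rather than hard-coding them.

```python
# Toy "explainable" deepfake checker in the spirit discussed above:
# it reports *why* it flags an image, not just a probability.
# The tells and the 0.4-per-tell weighting are invented placeholders.

# Named tells, each a rule over hypothetical per-image observations.
TELLS = {
    "fingers_per_hand": lambda obs: obs.get("fingers", 5) != 5,
    "jewellery_chain_connects": lambda obs: not obs.get("chain_connects", True),
    "object_persists_between_frames": lambda obs: obs.get("object_vanishes", False),
}

def check(observations):
    """Return (fake_probability, list of human-readable reasons)."""
    reasons = [name for name, rule in TELLS.items() if rule(observations)]
    score = min(1.0, 0.4 * len(reasons))   # crude: each tell adds evidence
    return score, reasons

score, why = check({"fingers": 6, "chain_connects": False})
print(score)   # -> 0.8
print(why)     # -> ['fingers_per_hand', 'jewellery_chain_connects']
```

The design point mirrors the conversation: a bare percentage is hard to trust, but a list of concrete tells lets a human verify the judgment.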
this is classic on videos: an object disappears. the problem is that these tools are trained on a lot of data. they are learning so-called features, patterns, that help make these decisions. it can happen that sometimes these
patterns are present in some images that are too far away... can you explain that to people? is it a pixel difference? it's
8:43 pm
not in the way the image looks? the ai is looking deeper into the image? the ai is taking the image and projecting it into some high-dimensional space, and within this high-dimensional space you do a dimensionality reduction, and then what you
can do is form these features. then if you have an image that the model has not been trained on, these features might not generalise. you can have issues where an image evokes some impressions that are wrong. so you see reflections or something and they are not deepfake but...
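The pipeline the researcher describes, project into feature space, reduce dimensionality, then classify, can be sketched numerically. This is a toy under invented assumptions: random vectors stand in for images, a single averaging axis stands in for learned dimensionality reduction, and the numbers are chosen only to exhibit the generalisation failure discussed, where a genuine but unusual image lands on the wrong side of the boundary.

```python
# Toy feature-space deepfake classification, showing the
# generalisation failure discussed above. Random 16-dimensional
# vectors stand in for images; no real detector is this simple.
import random
random.seed(0)

def features(image, axes):
    """'Dimensionality reduction': project onto a few learned axes."""
    return [sum(p * a for p, a in zip(image, axis)) for axis in axes]

def make(center, n=50, dim=16):
    """Synthetic training images clustered around a center point."""
    return [[random.gauss(c, 1.0) for c in center] for _ in range(n)]

real_train = make([0.0] * 16)     # "real" images cluster near 0
fake_train = make([2.0] * 16)     # "fake" images cluster near 2
axes = [[1.0 / 16] * 16]          # one crude projection axis

def classify(image):
    """Nearest-mean classifier in the reduced feature space."""
    f = features(image, axes)[0]
    mu_real = sum(features(x, axes)[0] for x in real_train) / len(real_train)
    mu_fake = sum(features(x, axes)[0] for x in fake_train) / len(fake_train)
    return "fake" if abs(f - mu_fake) < abs(f - mu_real) else "real"

print(classify([0.1] * 16))   # a typical real image -> "real"
# A genuine but unusual image, far from anything in the training
# set, is misread, just as with the real photo discussed next:
print(classify([5.0] * 16))   # -> "fake", even though it is real
```

The failure is structural, not a bug: a nearest-mean rule can only say which training cluster a point is *less far* from, so anything far outside the training distribution gets a confident but meaningless answer.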
stephanie mentions the photographs they struggle with; i have one here. this is lionel messi kissing the world cup. this one is
8:44 pm
real. but the machine thought it was fake. why? the machine might think so because it hasn't seen an image that is close enough to this picture in its training set. it might think that some reflections on the trophy, or the way lionel messi holds his hand, or the skin, are not natural. these
explanations can be very convincing, but they are nevertheless wrong. do you like this idea of ai checking deepfakes? i love it. we have got to use ai against ai to protect ourselves. it is going to be our best asset. one of the things that is interesting right now is that we always focus on who is developing the technology that could be used for bad, but a lot of folks
8:45 pm
around the world are now investing time and resources and building companies focused on deepfake detection. in the united states there are companies that are exciting, venture-backed, lots of people want to work for them, and they focus solely on trying to prove what is and isn't real. one of the things that has just become possible, in the past few months, is that some of these technologies are leveraging context understanding. rather than just looking and saying it looks manipulated, the models can say: the pope in the past couple of weeks has been on vacation in italy and there is no way that this photo was just taken, and they can give you a confidence score. are you incorporating that? yes, it is
incorporating context on where and when the content was found, absolutely yes.
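The "context understanding" just described, combining what the pixels say with what is known about the world, can be sketched as a simple probability fusion. A hedged illustration only: the naive-Bayes odds combination and every number below are invented for this sketch, not any vendor's actual method.

```python
# Toy fusion of a pixel-level detector score with a contextual
# prior, in the spirit of the context-aware detection described
# above. All probabilities are invented for illustration.

def fuse(p_pixel, p_context):
    """Naive-Bayes combination of two independent fake-probabilities."""
    odds = (p_pixel / (1 - p_pixel)) * (p_context / (1 - p_context))
    return odds / (1 + odds)

# The detector is unsure from pixels alone...
p_pixel = 0.55
# ...but context (the pope was on holiday elsewhere) says the
# photo is very unlikely to be genuine.
p_context = 0.95
print(round(fuse(p_pixel, p_context), 2))   # -> 0.96
```

The point of the sketch: weak pixel evidence plus strong contextual evidence yields a confident combined score, which is exactly why context checking became such a useful addition.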
8:46 pm
the social media companies have a vested interest in this, because if you can't tell fact from fiction you get a liar's dividend. you become a disruptor. you poison the well so much that nobody believes anything, and that's not good for a social media model that makes money from spreading news. it was exciting at first, when it was this new thing and you could stay in touch with your friends and journalists would keep up with the news through it. once it starts feeling like they are just using your data, and you are looking for news but it's not reliable and the ecosystem is being flooded all the time, you might just turn off. that's without going into the mental health implications of these sites, which we know are harmful. i wonder if we might have lived through the golden age of social media and are now entering this new phase, and if it isn't cleaned up, people
8:47 pm
will leave it, or only use it in the way that people read the national enquirer for stories about aliens. one of our collaborations was with a big tech company, and so there is a lot of interest in these solutions. the interest goes further: we can now proactively look for deepfakes and disinformation on social media platforms using autonomous agents. then we can establish situational awareness on a global scale. is this the right environment to be developing in, do you get support for stuff like this in the uk? generally yes, i think the uk is a great place. one of the problems here is not so much the deepfake news as the disinformation spread by conspiracy theorists, who are creating material
8:48 pm
8:49 pm
welcome back. the moon landings that never happened, the covid microchip that was injected into your arm, the pizza paedophile ring in washington. conspiracy theories abound, often with dangerous consequences. many have tried reasoning with the conspiracy theorists, but to no avail. how do you talk to someone so convinced of what they believe, who is equally suspicious of why you would even be challenging those beliefs? well, researchers have set about creating a chatbot to do just that. it draws on a vast array of information to converse with these people, using bespoke fact-based arguments. and the 'debunk bot' is proving remarkably successful.
8:50 pm
joining us on zoom is the lead researcher, dr thomas costello, associate professor in psychology at the university of washington. welcome to the programme, what does the chatbot do? the idea is that studying conspiracy theories and trying to debunk them has been hard until now, because there are so many different conspiracy theories out there in the world, and you would need to look across all the information comprehensively to debunk all of them and study them systematically. large language models are perfect for doing just that. we did an experiment where people came in and described a conspiracy theory that they believed in and felt strongly about. the ai summarised it for them and rated it, and then they entered into a conversation with this chatbot, which was given what they believed and
8:51 pm
programmed to persuade them away from the conspiracy theory using facts and evidence. after about an eight minute conversation, people reduced their beliefs in their chosen conspiracy by about 20% on average. one in four people came out of the other end of that conversation actively uncertain about their conspiracy. they were newly sceptical. is it that they don't know where to go to get this information, and they are suspicious of anybody who might have the answers to the things that concern them? that could be part of it. i think it really is being provided with facts and information that is tailored to exactly what they believe. how do you deploy it? i can't imagine that conspiracy theorists are wandering around saying, disprove my conspiracy theory? that's a great question, i would be curious to hear other answers about that too. in the
8:52 pm
study, we paid people to come and do it. i am optimistic about the truth motivations of human beings in general, and i think people want to know what is true, so if there is a tool they trust to do that, then all the better. can you see a purpose for this in america? i can see it being incorporated into a lot of technology. we use things like chat gpt. here is an example of chat gpt disproving something for me. there is a famous winston churchill quote: a lie gets halfway around the world before the truth can get its pants on. no quote better describes the conversation we're having, how fast this disinformation spreads. i put that into chat gpt before i did a presentation, and it's not a winston churchill quote, it's from jonathan swift in the 1700s. ai helped me disprove disinformation that has been around for years. it
8:53 pm
should be integrated into these technologies. is this where the two worlds collide, because some conspiracy theorists put out ai generated material as well? maybe you can stop the prevalence of fake material? potentially. the study was done in laboratory conditions, so it would be interesting to see whether these results translate into the real world.
also, the large language models that were used were safety fine-tuned, which means they were programmed to tell the truth. if that safety tuning is not there, there could be interactive disinformation; they could be used to convince people of things that are not true. that is the risk i see. i have a question, i am
8:54 pm
curious as to how much having good information changes people's minds. for example, smoking. we have known for decades that it's bad for you. we have all the data, we put labels on it clearly, and yet people still smoke. when you talk to a smoker and try to persuade them to give up because you care about them, they will sometimes really entrench, and it is hard to break. not just because it is addictive, but because they want to smoke. i see a parallel with conspiracy theories, in that we have beliefs and information is not always enough to change them. it's not always just about facts, it's about something else. we know that drugs are bad for us when we start doing it. that is fundamentally not about information. conspiracy beliefs are often descriptive, they are accounts of what went on in the world: al-qaeda didn't put together the 9/11
8:55 pm
terrorist attacks, it was the government, etc. dealing with claims about the world is conducive to informational persuasion in a way that... we focus so much on the legislation; the question i almost asked is how far along congress is on that. what we have shown tonight is that it is the industry itself that is forcing the change. maybe it is not legislation, because legislation is always one step behind. i'm going to give you an embarrassing admission that proves that. i was at dinner last night with one of the creators of chat gpt; she worked on one of the earlier versions. i complained to her and said i was teaching a course at university and i got lazy. i was supposed to produce a list of books for my students and i looked it up on chat gpt. i sent it
8:56 pm
to my students, and they sent it back and said those books are fake. i said this to her and she said, yes, that was bad, and it gave chat gpt a bad reputation in your mind, and that's why we kept improving the models. we don't want to give you false content, because you won't want to use the product. it may not be heartening to everyone, but certainly the industry improvements move faster than legislation, because there is a business imperative to get it right. that is the vested interest that i see for a lot of the online companies and people developing it. we are out of time. just a reminder that all these episodes are on the ai decoded playlist on youtube. have a look at those. thank you to our guests, and let's do it again, same time next week. thanks for watching.
8:57 pm
good evening. it has been another day when things have felt more like summer than september. most places got to see at least some sunshine. and along the south coast today, temperatures got pretty close to 25 degrees. but in the southern areas, things are set to change as this area of low pressure pushes northwards over the next few days. that will bring some showers, some thunderstorms, and see how the rain is set to accumulate in some locations. it is possible that some places in the south could get close to a month's worth of rain, but notice, further north, not much rain at all. it is going to stay largely dry across northern england, northern ireland and scotland. warm sunshine up towards the north west, often quite cloudy and cool close to these north sea coasts, and with the chance for those showers and thunderstorms down towards the south. now, in the shorter term tonight, we are going to see a lot of cloud filtering
8:58 pm
in across many parts of the uk, and even where we keep hold of some clear skies — say, across parts of northern ireland — we could see a few fog patches. a little bit on the chilly side across northwest scotland, down into northern ireland. further south and east, though, with the cloud and the breeze, a mild start to friday morning, a lot of cloud around first thing — that tending to retreat towards the east coast, but staying pretty murky, i think, for parts of north—east england and eastern scotland. and then we start to see these showers and thunderstorms breaking out across parts of wales, southern and southwestern parts of england. we could see a lot of rain in a short space of time. and those temperatures, if anything, a little bit down on where they have been. during friday night, we continue to see this feed of cloud into parts of north—east england, eastern scotland, staying quite murky here. through saturday, sunshine for western scotland and northern ireland, but further south across england and wales, i think we'll see more of these heavy downpours and thunderstorms — perhaps the worst of those from the midlands, westwards into wales and the south—west of england. temperatures north to south around 16 to 22 degrees into sunday. it looks like we'll see
8:59 pm
the showers joining into longer spells of rain. again, though, confined to southern parts of england, the midlands, wales. further north, largely dry. best of the sunshine to the north—west. quite murky for some of these eastern coasts. and then, as we head on into next week it does look very unsettled. low pressure is set to be in charge. we'll see some rain at times. it could potentially be quite windy and it is also set to turn quite a lot colder.
9:00 pm
hello, i'm christian fraser. you're watching the context on bbc news. hezbollah has been hit hard this week. not just in terms of the dead and the wounded; the attack on its communication systems was a major humiliation. translation: in this operation, the enemy crossed all rules and red lines. it didn't care about anything at all, not humanely, not legally. israel is responding by force to hezbollah. we will use all means necessary to restore security to our northern border and safely return our citizens to their homes. the population in both northern israel and southern lebanon has had to flee their homes. we all want to see them be able to go back to their homes.