tv The Context BBCNEWS June 27, 2024 8:30pm-9:01pm BST
8:30 pm
the start was delayed by rain and they also had to go off mid—innings, but india still posted 171—7. virat kohli and rishab pant went cheaply, but captain rohit sharma and suryakumar yadav put on 73 for the third wicket. there you can see the really, really bad news for england. england in reply 62—5 off ten overs. salt, buttler, bairstow, moeen and curran the wickets to fall. axar patel has taken three of them. the defending champions are in big, big trouble. harry brooks making a go of it, but they have got to get a lot of runs in a very short period of time. south africa are waiting for the winner of that match after they made very light work of afghanistan in their semifinal. they won by nine wickets in trinidad, but the state of the pitch has come in for a lot of criticism.
8:31 pm
afghanistan were bowled out forjust 56, the lowest score in a t20 world cup semifinal. they were all—out in the 12th over. batting was a lottery with extremely inconsistent bounce with each delivery. south africa survived a few scares themselves, but reeza hendricks top—scored with 29 and they are through to theirfirst men's white—ball final. but for the afghanistan coach, the result left a bad taste in the mouth. whenever you lose a game like this, it's always going to hurt and it should hurt, because we've put so much into it, so many sacrifices made by the players, coaching staff, management, officials, all that sort of stuff. i don't want to get myself into trouble, but that's not the pitch you want to have a match, semifinal of the world cup on, plain and simple. there are a couple of rest days at the european championship before the round of 16 starts on saturday. england have to wait
8:32 pm
until sunday for their game against slovakia in gelsenkirchen. 23 of the 26 players trained today. declan rice and kieran tripper are being put through individual programmes away from the squad. phil foden is on his way back from the uk after a brief break to be with his partnerfor the birth of their third child. we heard from eberechi eze today, who gave his backing to manager gareth southgate after all the negativity directed at him and the squad. i think is a testament to his character, that of a person that he is, that he and his main focus is what can he do to get the best out of everyone? and that's what he is trying to do. you can see itjust what he is saying, kiefer playeris, kiefer the dressing room in knowing that you've got the full support. and ifeel like that you've got the full support. and i feel like of course you don't wish for that type of situation where you've got negativity around,
8:33 pm
but again it's football. people of that opinions and stuff like that so it's understandable. but as i said the main focus is what we got and we believe in an egg and the fact that we got everyone in the dressing room supporting each other. that's the main focus. some of the top stars are playing at the eastbourne international in preparation. three british players reached the quarterfinals in the women's draw, the first time that had happened in over 45 years, but they are all out. the british number one katie boulter lost to french open finallist jasmine paolini in straight sets. boulter really seemed to struggle with the blustery conditions. she lost the first set 6—1 and the second on a very one—sided tie break. harriet dart lost to another grand slam finallist. she won only three games against leylah fernandez, the canadian who lost to emma raducanu in the 2021 us open final. and raducanu is also out. she'd beaten fellow grand slam winner sloane stephens and the world number fivejessica pegula this week, but was no match for the world number 14 daria kasatkina. 6—2, 6—2, she is out.
8:34 pm
all the latest tennis news on the bbc sport website. i think england havejust bbc sport website. i think england have just lost bbc sport website. i think england havejust lost another bbc sport website. i think england have just lost another wicket, henry brooke possibly, 68—6, he is gone and england staring at the feet in that world cup 220 semifinal against india. an update on that later in the evening. i was depressing news, thank you. you are watching the context. it is time for al decoded. welcome to ai decoded. it's that time of the week when we dive deep into the most eye—catching stories in the world of artificial intelligence. this week — cyber—security. how long would it take an advanced hacker to break into the most complex, most powerful ai models that we have built? 30 minutes, as it happens. this week, the ft carried an exclusive interview with the mysterious, faceless hacker known as pliny the prompter.
8:35 pm
he is stress—testing, orjailbreaking, the large language models that microsoft, chatgpt and google have been building. an ethical online warrior who is part of the international effort to draw attention to the shortcomings of the biggest tech companies, who he argues are putting profit before safety. and if you think that is a risk somewhere in the future, think again. just two weeks ago, the russian cyber criminals qilin used sophisticated ai tools to find their way into the nhs computers. tens of thousands of patients here in the uk had names, dates of birth and private information published on the dark web when the testing firm synnovis refused to pay the ransom. the hackers encrypted vital information which made the it systems at two nhs hospitals useless. so how worried should we be? what are the implications for those businesses who are turning to ai to improve the systems?
8:36 pm
tonight, we are bringing together a former hacker, connor leahy — now the ceo of ai safety company conjecture — daryl flack — ceo of cybersecurity company blockphish — and as ever, to help guide us through it, our resident expert and presenter stephanie hare. let me start with you, stephanie. pliny it is what we call a white hat hacker, stress testing the system so what are they doing? the? hacker, stress testing the system so what are they doing?— hacker, stress testing the system so what are they doing? they are trying to stress test — what are they doing? they are trying to stress test the _ what are they doing? they are trying to stress test the system _ what are they doing? they are trying to stress test the system to - what are they doing? they are trying to stress test the system to get - to stress test the system to get what they should not do. they are coding the rule. imagine you had a teenager at home who is trying to play with the game and seeing if they could break it, break the system. lots of engineers have grown up system. lots of engineers have grown up trained that way. actors love to do it and they almost cannot help but do it. and actually quite like
8:37 pm
pliny the prompter�*s idea that he is doing work for free the company should be doing themselves. he is showing malware and showing how scammers can create scripts that my people click on links in a way you can enjoy all sorts of very nasty code into people's hospitals as we have just seen, which code into people's hospitals as we havejust seen, which is code into people's hospitals as we have just seen, which is a very soft target, a right target. but it's schools, is individuals and is only growing, only a growing problem. we just had last month the ai safety institute here in the uk published eight report showing that every single major large language model can be broken orjailbroke. that’s can be broken or “ailbroke. that's uuite can be broken or 'ailbroke. that's quite alarming. — can be broken orjailbroke. that's quite alarming. it _ can be broken orjailbroke. that's quite alarming. it is _ can be broken orjailbroke. that's quite alarming. it is quite - can be broken orjailbroke. that's. quite alarming. it is quite alarming if ou'd quite alarming. it is quite alarming if you'd like — quite alarming. it is quite alarming if you'd like the _ quite alarming. it is quite alarming if you'd like the colour _ quite alarming. it is quite alarming if you'd like the colour red - quite alarming. it is quite alarming if you'd like the colour red which i l if you'd like the colour red which i wanted to show solidarity with the researchers in this country. they are putting this out for all the people who have a feel of missing out and i need to get in with generative ai which i'm hearing with
8:38 pm
every single client could buy speak with. everybody has got real anxiety about missing out. but we are missing from the ai safety institute and the national cybersecurity centre here in the uk is a lot of this technology which continue to be in beta. it's not really ready. it's fun if you are approaching it from an engineering perspective but not if you are a ceo or general council who needs to protect your risk. but these companies, they have read team akers who are employed by the company to go in and try to find a way through the system. why are the white hats, guys like you, able to do it in the red teams are not? well, so in the hacker business, you talk about_ well, so in the hacker business, you talk about blue teams and red teams. red teams_ talk about blue teams and red teams. red teams are people who try to break things in blue teams are the ones _ break things in blue teams are the ones who— break things in blue teams are the ones who try to defend. usually the red team _ ones who try to defend. usually the red team always wins, the red team always _ red team always wins, the red team always wins — red team always wins, the red team always wins. blue team canjust minimise — always wins. blue team canjust minimise harm. with always wins. blue team canjust minimise harm. with al systems in minimise harm. with ai systems in particular— minimise harm. with al systems in particular and what we are seeing
8:39 pm
with people like pliny the prompter and me _ with people like pliny the prompter and me and my old friends in the online _ and me and my old friends in the online days, that is that al systems are very— online days, that is that al systems are very immature when it comes to safety _ are very immature when it comes to safety a _ are very immature when it comes to safety a lot — are very immature when it comes to safety. a lot of the stuff that we do for _ safety. a lot of the stuff that we do for cyber security and safety civil does — do for cyber security and safety civil does not apply to ai systems in the _ civil does not apply to ai systems in the same way in other forms of software. — in the same way in other forms of software, when a hacker finds a vulnerability, you will have your programme if you look at the code of the system, — programme if you look at the code of the system, fix the problem and then deploy— the system, fix the problem and then deploy a _ the system, fix the problem and then deploy a patch to solve the problem. the problem is that al systems are not like _ the problem is that al systems are not like normal software. they are not like normal software. they are not ridden — not like normal software. they are not ridden line by line with code. it's more — not ridden line by line with code. it's more like they are grown, they are more _ it's more like they are grown, they are more like artificial huge piles of numbers that can do great things but we _ of numbers that can do great things but we don't really know how to patch _ but we don't really know how to patch them. we can do a little bit. these _ patch them. we can do a little bit. these other— patch them. we can do a little bit. these other companies invest a lot of money— these other companies invest a lot of money into trying to treat these businesses but as pliny the prompter have shown _ businesses but as pliny the prompter have shown is wildly ineffective at this point—
8:40 pm
have shown is wildly ineffective at this point in time. so have shown is wildly ineffective at this point in time.— this point in time. so this joe rekik started _ this point in time. so this joe rekik started about - this point in time. so this joe rekik started about a - this point in time. so this joe rekik started about a year. this point in time. so this joel rekik started about a year ago this point in time. so this joe - rekik started about a year ago and the attacks as he described have evolved so as a constant game of cat and mouse, this. what is the inherent problem with that if you are a company hoping to integrate into your systems? i are a company hoping to integrate into your systems?— into your systems? i think ultimately _ into your systems? i think ultimately it's _ into your systems? i think ultimately it's always - into your systems? i think| ultimately it's always going into your systems? i think - ultimately it's always going to be an unknown— ultimately it's always going to be an unknown risk. _ ultimately it's always going to be an unknown risk. you're - ultimately it's always going to be an unknown risk. you're giving. ultimately it's always going to be | an unknown risk. you're giving up your— an unknown risk. you're giving up your data. — an unknown risk. you're giving up your data. your— an unknown risk. you're giving up your data, your information - your data, your information potential _ your data, your information potential your _ your data, your information potential your intellectual l potential your intellectual property. _ potential your intellectual property, your— potential your intellectual property, your brand - potential your intellectual - property, your brand reputation, potential your intellectual _ property, your brand reputation, and you're _ property, your brand reputation, and you're putting — property, your brand reputation, and you're putting into— property, your brand reputation, and you're putting into a _ property, your brand reputation, and you're putting into a system - property, your brand reputation, and you're putting into a system that - you're putting into a system that you're putting into a system that you are — you're putting into a system that you are hoping _ you're putting into a system that you are hoping is _ you're putting into a system that you are hoping is going - you're putting into a system that you are hoping is going to - you're putting into a system thati you are hoping is going to protect and secure — you are hoping is going to protect and secure it _ you are hoping is going to protect and secure it. and _ you are hoping is going to protect and secure it. and so— you are hoping is going to protect. and secure it. and so organisations need _ and secure it. and so organisations need to— and secure it. and so organisations need to be — and secure it. and so organisations need to be aware _ and secure it. and so organisations need to be aware that _ and secure it. and so organisations need to be aware that the - and secure it. and so organisations need to be aware that the risks - and secure it. and so organisationsl need to be aware that the risks that they are _ need to be aware that the risks that they are taking _ need to be aware that the risks that they are taking around _ need to be aware that the risks that they are taking around that - need to be aware that the risks that they are taking around that and - need to be aware that the risks that they are taking around that and like j they are taking around that and like we say, _ they are taking around that and like we say, organisations— they are taking around that and like we say, organisations don't- they are taking around that and like we say, organisations don't yet - they are taking around that and like i we say, organisations don't yet know what those _ we say, organisations don't yet know what those risks _ we say, organisations don't yet know what those risks are. _ we say, organisations don't yet know what those risks are. this _ we say, organisations don't yet know what those risks are. this is - we say, organisations don't yet know what those risks are. this is a - we say, organisations don't yet know what those risks are. this is a new. what those risks are. this is a new technology, — what those risks are. this is a new technology, there _ what those risks are. this is a new technology, there is _ what those risks are. this is a new technology, there is always - what those risks are. this is a new technology, there is always good i what those risks are. this is a new. technology, there is always good to be of _ technology, there is always good to be of an _ technology, there is always good to be of an arms — technology, there is always good to be of an arms race _ technology, there is always good to be of an arms race and _ technology, there is always good to be of an arms race and defenders. be of an arms race and defenders against _ be of an arms race and defenders against attackers _ be of an arms race and defenders against attackers and _ be of an arms race and defenders. against attackers and assessments that come — against attackers and assessments that come into _ against attackers and assessments that come into looking _ against attackers and assessments that come into looking at - against attackers and assessments that come into looking at that - against attackers and assessments| that come into looking at that arms race, _ that come into looking at that arms race, everything— that come into looking at that arms race, everything comes _ that come into looking at that arms race, everything comes down - that come into looking at that arms race, everything comes down to - that come into looking at that arms. race, everything comes down to how quickly— race, everything comes down to how quickly you _ race, everything comes down to how quickly you want _ race, everything comes down to how quickly you want to _ race, everything comes down to how quickly you want to jump _ race, everything comes down to how quickly you want to jump to - race, everything comes down to how quickly you want to jump to these i quickly you want to jump to these platforms — quickly you want to jump to these platforms and _ quickly you want to jump to these platforms and how— quickly you want to jump to these platforms and how much - quickly you want to jump to these platforms and how much data - quickly you want to jump to these - platforms and how much data you want to give _ platforms and how much data you want to give them _ platforms and how much data you want to give them as— platforms and how much data you want to give them. as we have _ platforms and how much data you want to give them. as we have seen- platforms and how much data you want to give them. as we have seen from i to give them. as we have seen from the institute, — to give them. as we have seen from the institute, it's _ to give them. as we have seen from the institute, it's very— to give them. as we have seen from the institute, it's very much - to give them. as we have seen from the institute, it's very much early. the institute, it's very much early stages, beta _ the institute, it's very much early stages, beta stages, _ the institute, it's very much early
8:41 pm
stages, beta stages, and - the institute, it's very much early stages, beta stages, and of- the institute, it's very much early stages, beta stages, and of the l stages, beta stages, and of the risks of— stages, beta stages, and of the risks of things _ stages, beta stages, and of the risks of things going _ stages, beta stages, and of the risks of things going wrong - stages, beta stages, and of the risks of things going wrong or. stages, beta stages, and of the - risks of things going wrong or quite hi-h risks of things going wrong or quite high at _ risks of things going wrong or quite high at the — risks of things going wrong or quite high at the moment. _ risks of things going wrong or quite high at the moment. so _ risks of things going wrong or quite high at the moment. so taking - risks of things going wrong or quite high at the moment. so taking a i high at the moment. so taking a pilot approach _ high at the moment. so taking a pilot approach with _ high at the moment. so taking a pilot approach with the - high at the moment. so taking a pilot approach with the size - high at the moment. so taking a pilot approach with the size of. pilot approach with the size of data with none — pilot approach with the size of data with none of— pilot approach with the size of data with none of my— pilot approach with the size of data with none of my state _ pilot approach with the size of data with none of my state it _ pilot approach with the size of data with none of my state it would - pilot approach with the size of data with none of my state it would be i with none of my state it would be a good _ with none of my state it would be a good start — with none of my state it would be a good start to — with none of my state it would be a good start to test _ with none of my state it would be a good start to test it _ with none of my state it would be a good start to test it to _ with none of my state it would be a good start to test it to make sure l good start to test it to make sure that you — good start to test it to make sure that you were _ good start to test it to make sure that you were getting _ good start to test it to make sure that you were getting the - good start to test it to make sure i that you were getting the times and outcomes— that you were getting the times and outcomes you're _ that you were getting the times and outcomes you're expecting. - that you were getting the times and outcomes you're expecting. but - that you were getting the times and outcomes you're expecting. but if. outcomes you're expecting. but if you jump — outcomes you're expecting. but if you jump straight _ outcomes you're expecting. but if you jump straight and _ outcomes you're expecting. but if you jump straight and right - outcomes you're expecting. but if you jump straight and right now. outcomes you're expecting. but if. you jump straight and right now with all of your— you jump straight and right now with all of your data — you jump straight and right now with all of your data and _ you jump straight and right now with all of your data and try— you jump straight and right now with all of your data and try to _ you jump straight and right now with all of your data and try to be - you jump straight and right now with all of your data and try to be ahead i all of your data and try to be ahead of the _ all of your data and try to be ahead of the curve, — all of your data and try to be ahead of the curve, then _ all of your data and try to be ahead of the curve, then you've _ all of your data and try to be ahead of the curve, then you've got to - all of your data and try to be ahead of the curve, then you've got to be| of the curve, then you've got to be aware _ of the curve, then you've got to be aware that— of the curve, then you've got to be aware that that _ of the curve, then you've got to be aware that that risk _ of the curve, then you've got to be aware that that risk you _ of the curve, then you've got to be aware that that risk you were - aware that that risk you were thinking _ aware that that risk you were thinking could _ aware that that risk you were thinking could come - aware that that risk you were thinking could come to - aware that that risk you were . thinking could come to fruition. what's — thinking could come to fruition. what's really _ thinking could come to fruition. what's really fun _ thinking could come to fruition. what's really fun about - thinking could come to fruition. what's really fun about this - thinking could come to fruition. i what's really fun about this article is it gives a bit of a heads up the california's legislature is going to be voting in august of the bill that will require any company in the state, so that google and meta and openai to ensure they do not develop models with a hazardous capability and pliny the prompter says all ai models will fit that criteria. just talkin: models will fit that criteria. just talking about _ models will fit that criteria. just talking about the nhs model, what is it that is weak in the system and does that mean that all our systems, given what you've just described, are at risk of this kind of malign
8:42 pm
activity? are at risk of this kind of malign activi ? , ., ., , , activity? the short answer is absolutely. _ activity? the short answer is absolutely, yes, _ activity? the short answer is absolutely, yes, it _ activity? the short answer is absolutely, yes, it is - activity? the short answer is absolutely, yes, it is a - activity? the short answer is - absolutely, yes, it is a complete disaster. — absolutely, yes, it is a complete disaster. a — absolutely, yes, it is a complete disaster, a complete mess and none of our— disaster, a complete mess and none of our methods are adequate not whatsoever. this is not ready for say production whatsoever across the board _ say production whatsoever across the board to _ say production whatsoever across the board. to give you a feeling of this. _ board. to give you a feeling of this. there _ board. to give you a feeling of this, there are systems that are as close _ this, there are systems that are as close to _ this, there are systems that are as close to unlikable as possible. these — close to unlikable as possible. these are _ close to unlikable as possible. these are usually used a military context _ these are usually used a military context or— these are usually used a military context or nuclear power plants and it's all— context or nuclear power plants and it's all formally verified software. to give _ it's all formally verified software. to give you a feeling for this, developing even a relatively simple piece _ developing even a relatively simple piece of— developing even a relatively simple piece of software, less of the embedded software in helicopter which _ embedded software in helicopter which sounds simple but usually takes _ which sounds simple but usually takes a — which sounds simple but usually takes a couple of years to develop, developing this kind of think and take over— developing this kind of think and take over a decade and can cost millions— take over a decade and can cost millions or— take over a decade and can cost millions or even hundreds of millions— millions or even hundreds of millions of dollars. this is using techniques that we understand. the best tools. — techniques that we understand. the best tools, the best researchers, tools— best tools, the best researchers, tools were — best tools, the best researchers, tools were developed over many decades — tools were developed over many decades. ai tools were developed over many decades. alas it is tools were developed over many decades. ai as it is today has only existed _ decades. ai as it is today has only existed really for like a couple of years. _ existed really for like a couple of years, maybe a decade. our
8:43 pm
understanding of what our ai models do internally is extremely inadequate compared to our understanding of other types of software — understanding of other types of software systems and even with other types of _ software systems and even with other types of software systems, our safety — types of software systems, our safety is — types of software systems, our safety is very shaky.— types of software systems, our safety is very shaky. that terrifies me. safety is very shaky. that terrifies me- because _ safety is very shaky. that terrifies me- because i _ safety is very shaky. that terrifies me. because i read _ safety is very shaky. that terrifies me. because i read a _ safety is very shaky. that terrifies me. because i read a question . me. because i read a question earlier and i thought if we are integrating this technology into our critical infrastructure systems, and this becomes really serious. it's interesting he used the term about a nuclear power plant, but maybe that's not far—fetched. maybe our critical infra structure is at risk. i think with the advent of new technology. _ i think with the advent of new technology, legislation - i think with the advent of new technology, legislation is - technology, legislation is always somewhat — technology, legislation is always somewhat behind, _ technology, legislation is always somewhat behind, and - technology, legislation is always somewhat behind, and so- somewhat behind, and so organisations— somewhat behind, and so organisations will- somewhat behind, and so organisations will start i somewhat behind, and sol organisations will start the somewhat behind, and so- organisations will start the same time. _ organisations will start the same time. to— organisations will start the same time. to speed _ organisations will start the same time. to speed up— organisations will start the same time, to speed up operations. i organisations will start the same i time, to speed up operations. and organisations will start the same - time, to speed up operations. and if it's deployed — time, to speed up operations. and if it's deployed to _ time, to speed up operations. and if it's deployed to quickly— time, to speed up operations. and if it's deployed to quickly and - time, to speed up operations. and if it's deployed to quickly and some i time, to speed up operations. and if it's deployed to quickly and some of| it's deployed to quickly and some of those _ it's deployed to quickly and some of those critical— it's deployed to quickly and some of those critical national _ those critical national infrastructure - those critical national - infrastructure environments, those critical national _ infrastructure environments, and there _ infrastructure environments, and there is— infrastructure environments, and there is always _ infrastructure environments, and there is always a _ infrastructure environments, and there is always a risk _ infrastructure environments, and there is always a risk that - infrastructure environments, and there is always a risk that it - infrastructure environments, and there is always a risk that it will. there is always a risk that it will io there is always a risk that it will go wrong — there is always a risk that it will go wrong and _ there is always a risk that it will go wrong and as— there is always a risk that it will go wrong. and as he _ there is always a risk that it will go wrong. and as he was- there is always a risk that it will go wrong. and as he was sake, | there is always a risk that it will - go wrong. and as he was sake, years and years— go wrong. and as he was sake, years and years are — go wrong. and as he was sake, years and years are spent _ go wrong. and as he was sake, years and years are spent to _ go wrong. and as he was sake, years and years are spent to assure - and years are spent to assure products. — and years are spent to assure products. to _ and years are spent to assure products, to assure _ and years are spent to assure. products, to assure platforms, and years are spent to assure - products, to assure platforms, to assure _ products, to assure platforms, to assure software _ products, to assure platforms, to assure software so _ products, to assure platforms, to assure software so that _ products, to assure platforms, to assure software so that when - products, to assure platforms, toi assure software so that when they are placed — assure software so that when they are placed in — assure software so that when they are placed in the _ assure software so that when they are placed in the critical—
8:44 pm
assure software so that when they are placed in the critical national. are placed in the critical national infrastructure _ are placed in the critical national infrastructure to _ are placed in the critical national infrastructure to perform - are placed in the critical national infrastructure to perform as - infrastructure to perform as expected _ infrastructure to perform as expected-— infrastructure to perform as exected. , ., , infrastructure to perform as exected. , . , ., expected. just really quickly, are we talkin: expected. just really quickly, are we talking al _ expected. just really quickly, are we talking ai developing - expected. just really quickly, are we talking ai developing ai - expected. just really quickly, are we talking ai developing al to . we talking ai developing al to attack the systems? ever attack the systems? absolutely. ever more sophisticated _ attack the systems? absolutely. ever more sophisticated equipment - more sophisticated equipment potentially. the more sophisticated equipment potentially-— more sophisticated equipment otentiall . , ., , potentially. the question is will it sur - ass potentially. the question is will it surpass us _ potentially. the question is will it surpass us and _ potentially. the question is will it surpass us and we _ potentially. the question is will it surpass us and we will— potentially. the question is will it surpass us and we will not - potentially. the question is will it surpass us and we will not be - potentially. the question is will it| surpass us and we will not be able to do sincerely keep up with that were happening within the machines. coming up, the former twitter boss jack dorsey warns that within the next 5—10 years, we won't know what is real any more. so sophisticated will the ai—generated content be that, "it will feel," he says, "like you're in a simulation." will those technological advancements bring us all closer to "singularity", the point in our evolution when we share this planet with entities smarter than us? we will discuss that after the break. around the world and across the uk, this is bbc news.
8:46 pm
earlier this year, during a primary election in new hampshire, thousands of voters were sent a message byjoe biden. you know the value of voting democratic, and our votes count. it's important that you save your vote for the november election. we'll need your help in electing democrats up and down the ticket. voting this tuesday only enables the republicans in their quest to elect donald trump again. it's persuasive, but anyone carefully listening to that robocall might detect some stilted language, the emphasis and pauses in the wrong places, which would tell you there is something wrong with it. in fact, it had been created by a political consultant who had using generative al to influence the vote. it shows what is possible, and it raises some really big issues. this week, the former twitter ceo and co—founderjack dorsey warned that in 5—10 years from now, it will be impossible to differentiate what is real from what is fake. don't trust, verify. so even this talk, everything
8:47 pm
i said, don't trust me. don't trust, verify. so even this talk, everything i said, don't trust me. you have to experience it yourself. and you have to learn yourself. and this is going to be so critical as we enter this time in the next 5—10 years, because the way the images are created, deep fakes, videos, you will literally not know what is real and what is fake. it will be almost impossible to tell. it will feel like you are in a simulation, because everything will look manufactured. everything will look produced. and it's very important that you shift your mindset or you attempt to shift your mindset to verifying the things that you feel you need through your experience, through your intuition. because of these devices in your bags and your pockets, they are off—loading from that image of the neuron i sent you, and because all these things on your phone now, you are not building those connections in your brain any more.
8:48 pm
this is one of those stories where you wonder how mr dorsey would like to verify by experiencing. that is not eight scalable model for a tech ceo to be recommending so it's weird. how cannot know what is being reported in china? i am weird. how cannot know what is being reported in china? iam not weird. how cannot know what is being reported in china? i am not there, i'm here in london. the whole point of news is we need a system of contributors who put forward evidence that other people peer—reviewed and then it's discussed his truth. this is taking us back to philosophy. what is reality? or the study of history. how do we treat sources? even the law. what is evidence of what is not? what is admissible in court and was not? telling everybody to go with our intuition does not feel great. with our intuition does not feel areat. �* , with our intuition does not feel areat.�* , . with our intuition does not feel areat.�* , ,., . great. any protected democracy if ou great. any protected democracy if you cannot _ great. any protected democracy if you cannot differentiate _ great. any protected democracy if you cannot differentiate what - great. any protected democracy if you cannot differentiate what is i you cannot differentiate what is real and what is fake? i you cannot differentiate what is real and what is fake?— real and what is fake? i mean, definitely _ real and what is fake? i mean, definitely not _ real and what is fake? i mean, definitely not easily, _ real and what is fake? i mean, definitely not easily, that's - definitely not easily, that's for sure — definitely not easily, that's for sure a—
8:49 pm
definitely not easily, that's for sure. a lot of what makes a society possible _ sure. a lot of what makes a society possible is — sure. a lot of what makes a society possible is that we can agree on something being true or not. if we cannot— something being true or not. if we cannot fundamentally agree on what is true _ cannot fundamentally agree on what is true or— cannot fundamentally agree on what is true or not, and especially if the things— is true or not, and especially if the things we believe are true or not and — the things we believe are true or not and may be selected that people are things _ not and may be selected that people are things and do not have our best interest— are things and do not have our best interest at— are things and do not have our best interest at heart, or the dissent of something — interest at heart, or the dissent of something or make us angry or make us do _ something or make us angry or make us do some _ something or make us angry or make us do some political thing, then you cannot— us do some political thing, then you cannot for— us do some political thing, then you cannot for a — us do some political thing, then you cannot for a state. you cannot form a civilisation — cannot for a state. you cannot form a civilisation in the limit. it�*s a civilisation in the limit. it's kind of a civilisation in the limit. it�*s kind of interesting that he is putting the onus on us because i would say if you cannot differentiate, then that is an existential risk for them, for social media companies? why would you ever tune in if it'sjust me and you ever tune in if it'sjust me and you fake news? you ever tune in if it's 'ust me and you fake newsah you fake news? absolutely. try to ut the you fake news? absolutely. try to put the burden — you fake news? absolutely. try to put the burden on _ you fake news? absolutely. try to put the burden on the _ you fake news? absolutely. try to put the burden on the user - you fake news? absolutely. try to put the burden on the user is - you fake news? absolutely. try to . put the burden on the user is moving the portion— put the burden on the user is moving the portion of— put the burden on the user is moving the portion of blame _ put the burden on the user is moving the portion of blame to _ put the burden on the user is moving the portion of blame to the - put the burden on the user is moving the portion of blame to the wrong. the portion of blame to the wrong place _ the portion of blame to the wrong place it's— the portion of blame to the wrong place it's the _ the portion of blame to the wrong place. it's the tech _ the portion of blame to the wrong place. it's the tech companies are providing — place. it's the tech companies are providing the _ place. it's the tech companies are providing the technology. - place. it's the tech companies are providing the technology. they. providing the technology. they should — providing the technology. they should have _ providing the technology. they should have the _ providing the technology. they should have the intelligence i providing the technology. they- should have the intelligence behind that technology _ should have the intelligence behind that technology to _ should have the intelligence behind that technology to be _ should have the intelligence behind that technology to be able - should have the intelligence behind that technology to be able to - that technology to be able to identify— that technology to be able to identify if— that technology to be able to identify if something - that technology to be able to identify if something is - that technology to be able to identify if something is true i that technology to be able to. identify if something is true or if it's false — identify if something is true or if it's false or— identify if something is true or if it's false or not. _ identify if something is true or if it's false or not. ultimately- it's false or not. ultimately the user— it's false or not. ultimately the user at— it's false or not. ultimately the user at the _ it's false or not. ultimately the user at the end _ it's false or not. ultimately the user at the end is _ it's false or not. ultimately the user at the end is the - it's false or not. ultimately the user at the end is the last -
8:50 pm
it's false or not. ultimately the i user at the end is the last line of defence — user at the end is the last line of defence if— user at the end is the last line of defence. if we _ user at the end is the last line of defence. if we are _ user at the end is the last line of defence. if we are having - user at the end is the last line of defence. if we are having to - user at the end is the last line of defence. if we are having to use | defence. if we are having to use our intuition— defence. if we are having to use our intuition comes _ defence. if we are having to use our intuition comes in _ defence. if we are having to use our intuition comes in the _ defence. if we are having to use our intuition comes in the system - defence. if we are having to use our intuition comes in the system is - intuition comes in the system is fouled to — intuition comes in the system is fouled to get _ intuition comes in the system is fouled to get to _ intuition comes in the system is fouled to get to that _ intuition comes in the system is fouled to get to that point. - intuition comes in the system is| fouled to get to that point. what about watermarking? _ fouled to get to that point. what about watermarking? i - fouled to get to that point. about watermarking? i thought fouled to get to that point— about watermarking? i thought openai was going to watermark everything. it's often trotted out as a solution so everyone agrees is not a silver bullet. let's discuss that. everyone will come up upon some kind of watermarking they will agree upon an use. this becomes interesting for a block chain in ways he would do a chain of evidence and how do you verify? there is cyber security and cryptographic principles we could use in this with the problem is it does not scale. it's really intensive. want going to do? everything we took a post office immediately went to do some sort of decryption to see if it's real? that's what does not work. you're right, they constantly put the onus on the user to protect their data and not to decide what's true or not based on feelings. weird. i and not to decide what's true or not based on feelings. weird.— and not to decide what's true or not based on feelings. weird. i think we keep returning _ based on feelings. weird. i think we keep returning to _ based on feelings. weird. i think we keep returning to in _ based on feelings. weird. i think we keep returning to in this _ based on feelings. weird. i think we j keep returning to in this programme and in mind of what we have
8:51 pm
discussed here... a theme we keep returning to in this programme — and how can you not in mind of what we have just discussed — is how long it will be before we reach singularity. what do i mean by that? singularity is the term given to the point in human history when the machines become smarter than those programming them, the point where they don't really need us. is that really likely to happen, or is itjust a concept created by those who are making money from the technology of the future? there is no better panel to discuss it with. connor, let me start with you. are you a believer, or are you a sceptic? i see no reason | see no reason fi’oiti i see no reason from any scientific evidence _ i see no reason from any scientific evidence i— i see no reason from any scientific evidence i have seen so far that it should _ evidence i have seen so far that it should not— evidence i have seen so far that it should not be possible. humans are the dumbest species that can create industrial— the dumbest species that can create industrial civilisation. we were the first ones— industrial civilisation. we were the first ones to evolve. we were the first ones to evolve. we were the first draft — first ones to evolve. we were the first draft. there is no reason we cannot _ first draft. there is no reason we cannot build ai systems that use or are faster. — cannot build ai systems that use or are faster, our cpus or computers switch — are faster, our cpus or computers switch about a million times faster than our— switch about a million times faster than our neurons do. so even if we 'ust than our neurons do. so even if we just get— than our neurons do. so even if we just get into — than our neurons do. so even if we just get into computers and smart as a human. _ just get into computers and smart as a human, right? let's say it runs 1000 _ a human, right? let's say it runs 1000 times _ a human, right? let's say it runs
8:52 pm
1000 times faster than a human. that means— 1000 times faster than a human. that means they— 1000 times faster than a human. that means they can do two years of thinking — means they can do two years of thinking per day. two years of labour— thinking per day. two years of labour per— thinking per day. two years of labour per day. if that exists already. _ labour per day. if that exists already, and we know that this is technically— already, and we know that this is technically possible... already, and we know that this is technically possible. . ._ already, and we know that this is technically possible... there are so many facets _ technically possible... there are so many facets that _ technically possible... there are so many facets that we _ technically possible... there are so many facets that we call _ many facets that we call intelligence, two of which computers have already mastered. one is memory and the other is calculus or how to calculate, right? what about reasoning and emotional intelligence? is that in a or is that learned? i intelligence? is that in a or is that learned?— intelligence? is that in a or is that learned? . �* , ., , ., that learned? i mean, it's a bit of both, but you _ that learned? i mean, it's a bit of both, but you open _ that learned? i mean, it's a bit of both, but you open up _ that learned? i mean, it's a bit of both, but you open up your- that learned? i mean, it's a bit of both, but you open up your brain | that learned? i mean, it's a bit of. both, but you open up your brain and there _ both, but you open up your brain and there is— both, but you open up your brain and there is much— both, but you open up your brain and there is much of goop and there may have neurons that are sending electric— have neurons that are sending electric signals around. sure, it's a pretty. — electric signals around. sure, it's a pretty, get a system. the brain is one of— a pretty, get a system. the brain is one of the — a pretty, get a system. the brain is one of the most located things in the universe, but that does not mean it's not— the universe, but that does not mean it's not something we can do. we already— it's not something we can do. we already have systems like gpt4 is much _ already have systems like gpt4 is much better than me at writing shakespearean poetry. it's better at reading _ shakespearean poetry. it's better at reading books and i ambles up it's good _ reading books and i ambles up it's good analysing emotions in social
8:53 pm
situations— good analysing emotions in social situations with him is not perfect but interim i and neitherare situations with him is not perfect but interim i and neither are you. the concerns is the machines become smarter and develop feelings of superiority and then decide that they don't want to be turned off. laughter. right? that the concern. at what point did they say actually i'm in charge? i point did they say actually i'm in charue? ~ ., v point did they say actually i'm in charue? ~ . �*, ., �*, charge? i think that's what it's really important _ charge? i think that's what it's really important to _ charge? i think that's what it's really important to the - charge? i think that's what it's i really important to the guardrails now come — really important to the guardrails now come and _ really important to the guardrails now come and understand - really important to the guardrails now come and understand the i now come and understand the principles— now come and understand the principles of— now come and understand the principles of how _ now come and understand the principles of how we - now come and understand the principles of how we should i now come and understand the | principles of how we should be now come and understand the - principles of how we should be using ai in principles of how we should be using al in the _ principles of how we should be using al in the use — principles of how we should be using ai in the use cases— principles of how we should be using ai in the use cases where _ principles of how we should be using ai in the use cases where it - principles of how we should be using ai in the use cases where it is - ai in the use cases where it is viable — ai in the use cases where it is viable. there he _ ai in the use cases where it is viable. there he places- ai in the use cases where it is viable. there he places where ai in the use cases where it is . viable. there he places where it ai in the use cases where it is i viable. there he places where it is a credibly— viable. there he places where it is a credibly valuable _ viable. there be places where it is a credibly valuable to _ viable. there be places where it is a credibly valuable to have - viable. there be places where it is a credibly valuable to have that i a credibly valuable to have that speed — a credibly valuable to have that speed up— a credibly valuable to have that speed up process _ a credibly valuable to have that speed up process and - a credibly valuable to have that speed up process and the i a credibly valuable to have thatl speed up process and the ability a credibly valuable to have that i speed up process and the ability to scan lots _ speed up process and the ability to scan lots of— speed up process and the ability to scan lots of data _ speed up process and the ability to scan lots of data and _ speed up process and the ability to scan lots of data and to _ speed up process and the ability to scan lots of data and to find - scan lots of data and to find solutions _ scan lots of data and to find solutions to _ scan lots of data and to find solutions to problems i scan lots of data and to find solutions to problems that i scan lots of data and to find . solutions to problems that we struggle — solutions to problems that we struggle with _ solutions to problems that we struggle with for— solutions to problems that we struggle with for hundreds i solutions to problems that we struggle with for hundreds of| solutions to problems that we i struggle with for hundreds of years, with an _ struggle with for hundreds of years, with an equally _ struggle with for hundreds of years, with an equally there _ struggle with for hundreds of years, with an equally there is _ struggle with for hundreds of years, with an equally there is think - struggle with for hundreds of years, with an equally there is think that i with an equally there is think that humans _ with an equally there is think that humans are — with an equally there is think that humans are very _ with an equally there is think that humans are very good _ with an equally there is think that humans are very good at - with an equally there is think that humans are very good at and i with an equally there is think that humans are very good at and the | humans are very good at and the empathy— humans are very good at and the empathy side _ humans are very good at and the empathy side of— humans are very good at and the empathy side of things, - humans are very good at and the empathy side of things, the i humans are very good at and the i empathy side of things, the looking after each _ empathy side of things, the looking after each other— empathy side of things, the looking after each other and _ empathy side of things, the looking after each other and that _ empathy side of things, the looking after each other and that closeness| after each other and that closeness and intimacy. — after each other and that closeness and intimacy. be _ after each other and that closeness and intimacy, be able _ after each other and that closeness and intimacy, be able to— after each other and that closeness and intimacy, be able to have thati and intimacy, be able to have that may be _ and intimacy, be able to have that may be that — and intimacy, be able to have that may be that we _ and intimacy, be able to have that may be that we have to _ and intimacy, be able to have that may be that we have to look i and intimacy, be able to have that may be that we have to look at i and intimacy, be able to have that may be that we have to look at a i may be that we have to look at a different — may be that we have to look at a different future _ may be that we have to look at a different future where _ may be that we have to look at a different future where the - may be that we have to look at a different future where the work. may be that we have to look at a i different future where the work load side of— different future where the work load side of the _ different future where the work load side of the world _ different future where the work load side of the world is— different future where the work load side of the world is carried - different future where the work load side of the world is carried out i different future where the work load side of the world is carried out by. side of the world is carried out by aland _ side of the world is carried out by aland we — side of the world is carried out by aland we have _ side of the world is carried out by aland we have to— side of the world is carried out by aland we have to look— side of the world is carried out by
8:54 pm
aland we have to look at other . ai and we have to look at other things— aland we have to look at other things to — aland we have to look at other things to keep— aland we have to look at other things to keep ourselves - aland we have to look at other things to keep ourselves busy i aland we have to look at other i things to keep ourselves busy and to have a _ things to keep ourselves busy and to have a view— things to keep ourselves busy and to have a view to — things to keep ourselves busy and to have a view to what _ things to keep ourselves busy and to have a view to what our _ things to keep ourselves busy and to have a view to what our top - things to keep ourselves busy and to have a view to what our top level on| have a view to what our top level on mastows _ have a view to what our top level on maslow's hierarchy _ have a view to what our top level on maslow's hierarchy of— have a view to what our top level on maslow's hierarchy of the _ maslow's hierarchy of the virtualisation should i maslow's hierarchy of the virtualisation should be. i maslow's hierarchy of the i virtualisation should be. pi virtualisation should be. philosophical virtualisation should be.- philosophicalconversation with virtualisation should be— philosophicalconversation with some philosophical conversation with some of the other day about this and i'm sure many people at home do about whether, you know, machines can feel touch or the emotion of love or hate, they cannot react to a flower or a sunset column, or actually can they? that is all learned behaviour. there is even a question of his intelligence always about embodiment. we exist in a physical body that has sensing things but also what ought to physically touch you to send out your feeling. i'm reading you, and can machines do that? really they cannot physically exist in the water left they are a robot, which does and have sensors and cameras, etc. it does not know the feeling is because feelings are biochemical productions in a human brain or if you have pets, certain animals, perhaps all animals even plants that we are still learning even with the natural world, so
8:55 pm
there is a lot of this discussion on what is thinking, what is intelligence is notjust about machines or even humans but about what it means to be alive. d0 machines or even humans but about what it means to be alive.— what it means to be alive. do you always feel _ what it means to be alive. do you always feel that _ what it means to be alive. do you always feel that we _ what it means to be alive. do you always feel that we never - what it means to be alive. do you always feel that we never have i always feel that we never have enough time to get everything in this programme? itjust sort of lives by every week. i could talk for another two hours about this. thank you forjoining us again, always great to have you on the programme and maybe you will come back and thank you also to stephanie in a bit of housekeeping. just a bit of housekeeping before we go. we will be off air next week for the election, but we will be back a week on thursday from the wimbledon championships, where we will be looking at how they are using ai to improve the whole experience of tennis, and not just tennis. dojoin us for that. for now, if you are watching us in the uk, you will be leaving us for the northern ireland leaders debate. but just another reminder.
8:56 pm
8:59 pm
taking part in the second tv debate of the election campaign. all 18 seats are up for grabs next thursday. in 2019, the dup won the most seats with eight. sinn fein came second with seven seats, the sdlp secured two, and the alliance party won a single seat. the elections being held during northern ireland's school summer holidays and parading season which could turnout. constituency boundary changes may
9:00 pm
have an impact. sarah garvin sent this update. mil have an impact. sarah garvin sent this update-— this update. all five political -a this update. all five political party leaders _ this update. all five political party leaders of _ this update. all five political party leaders of northern i this update. all five political- party leaders of northern ireland's main political parties were invited to take part in this debate. we now know that not all of them will be representing the uup here this evening, robbie butler and representing... tonight, senior leaders from the five larger parties take questions from our studio audience. i'm tara mills. this is the northern ireland leaders' debate. applause.
14 Views
IN COLLECTIONS
BBC News Television Archive Television Archive News Search ServiceUploaded by TV Archive on