The Context, BBC News, July 25, 2024, 8:30pm-9:01pm BST
8:30 pm
before that, it is sports news, and for a full round-up from the bbc sport centre, it is over to hugh ferris. the opening ceremony isn't until friday, but we're currently in day two of the action at paris 2024, with team usa attempting to win the women's football gold for the first time in 12 years. their opening match is against zambia, and it's new coach emma hayes' first competitive game in charge. they have a comfortable lead, with two goals in a minute included in that, and they are still in the first half, about half an hour gone. hosts france are playing colombia. earlier, 2016 champions germany beat australia 3-0. that's in the usa's group, while brazil started off with a 1-0 win over nigeria. spain started their attempt to add olympic gold to last year's world cup with a win over japan.
8:31 pm
they came from behind to secure a 2-1 victory. defending champions canada also fought back from a goal down. evelyne viens scored this winning goal after arsenal's cloe lacasse netted the equaliser in stoppage time of the first half. andy murray admits he "ran out of time" after announcing he's withdrawing from the singles event at the paris olympics, his final competition before retiring. he recently had an operation on his back and, like at wimbledon, is only able to compete in the doubles. murray will play alongside dan evans, and they'll face japanese pair taro daniel and kei nishikori in the first round. well, murray won't be the only former grand slam winner calling time on their career after the olympics. three-time grand slam winner angelique kerber has announced she'll be stepping away from the sport after the games. the 36-year-old is a former world number one, and won a silver at rio 2016. the first world record of paris 2024 was set earlier today in the archery.
8:32 pm
it came in the ranking round of the women's individual recurve event by south korea's lim si-hyeon. she achieved an incredible score of 694 out of a possible 720 at the iconic les invalides venue, beating the previous best of 692 set by another korean in 2019. the country's archers also broke their own olympic record in the team category with a combined effort of 2,046. so while some sports are already under way, the traditional opening ceremony will take place tomorrow night along the river seine, and the ceo of the organising committee says there's been lots of hard work to get the city prepared, but they are now ready to put on a show. it's a national project. everybody has been gathering to show what this country has to offer, to be able to, again, put the best of france for the athletes of the world,
8:33 pm
the best athletes of the world, to thrill the world and bring some emotions. england have named an unchanged side for their third and final test against west indies, which starts at edgbaston tomorrow. that means fast bowler mark wood keeps his place despite concerns over his fitness. wood was forced to leave the field for treatment during the second test, leading to speculation he might be rested and replaced by either matthew potts or the uncapped dillon pennington. england have already won the series, but captain ben stokes wants to complete a clean sweep. i think when you look at how we performed over the first two games, it is pretty hard to look past any changes. you know, i think we have had, you know, two very, very impressive all-round team performances. so, yeah, we are looking to cap off the series with another one. and haas have announced they've signed esteban ocon as their new driver for the 2025 formula one season. ocon has been driving for alpine but has now signed a multi-year contract with the us-based team.
8:34 pm
ocon's arrival completes an all-new driver line-up for next season. the frenchman joins oliver bearman, following the decision to drop kevin magnussen. and that's all the sport for now. you are watching the context. it is time for our new weekly segment, ai decoded. welcome to ai decoded, that time of the week when we look in depth at some of the most eye-catching stories in the world of artificial intelligence. last week we looked at how artificial intelligence could threaten human jobs in the future, but what about those on the battlefield? the guardian is calling it ai's "oppenheimer moment" due to the increasing appetite for combat tools that blend human and machine intelligence. this has led to an influx of money to companies and government agencies that promise they can make warfare smarter, cheaper and faster. and here in the uk, leading military
8:35 pm
contractor bae systems are ramping up efforts to become the first in their industry to create an ai-powered learning system meant to make military trainees "mission ready sooner." our bbc ai correspondent marc cieslak went to meet all those involved, and we will be showing you his piece in just a moment. with me is our regular ai contributor and presenter priya lakhani, who's ceo of ai-powered education company century tech. priya, fascinating area, but of all the uses of ai, it's perhaps the one where the moral and ethical dilemmas are the most extreme. that is absolutely right. this is using ai to potentially have unmanned military drones. what you are going to see in mark's incredible piece is unmanned military craft, potentially, and it's obviously great if there aren't
8:36 pm
humans being harmed out there on the field, but does that mean that actually war could escalate much quicker? will decisions be made by these ai systems? if both parties have ai systems, what happens then? it's sort of a race as to who can escalate further. there's all sorts of ethical situations. you can also see learning systems and how these systems are using ai to improve learning in terms of training the military and soldiers. it is a fascinating area and we will do a bit of a deep dive into the ethics a little bit later in the programme. lots to talk about. let's take a look, as we were just talking about, at this report. stay with us, we have lots to discuss afterwards. up, down, flying or hovering around. for 75 years, the airshow has shown off aircraft, civilian and military, often inviting pilots to put their aeroplanes through their
8:37 pm
paces, to the delight of the assembled attendees, including plane buffs, even new prime ministers. in recent years, farnborough has played host to a lot more of these: unmanned air vehicles, commonly known as drones. drones with military applications, with fixed wings that behave like an aeroplane or rotors capable of hovering like a helicopter, are in abundance. but all have something in common: a human being involved in the command and control of these aircraft at some stage. it's a process that's called human in the loop. it is critical from a moral and ethical point of view to ensure that there is a human judgment that is always at the heart of selection of the course of action. military application of ai is extremely controversial. images of killer robots and the idea of ai
8:38 pm
run amok are frequent additions to stories in the press about the risks the technology poses. nevertheless, militaries around the world are already using artificial intelligence. one area where it is particularly useful is training pilots to fly aircraft like these. flight simulators are an integral part of a pilot's training. they save time and money, allowing prospective pilots to gain valuable skills from the comfort and safety of terra firma. formerly with the raf, jim whitworth is a pilot instructor experienced in flying military jets like the hawk and tornado. as soon as you see that, just pull the stick back, set an altitude as we discussed. this simulator rig is for hawk jets, the royal air force's preferred trainer. what sort of feedback have you given to the team developing this in terms of its realism?
8:39 pm
so, really, it's about the feedback from the controls. i would like it to feel as much like a hawk as possible. where does the ai come into the mix? we can record everything a trainee does in this environment in the simulator. we can give some metrics with which to measure the performance and then score each performance. as we start to build up on each trainee, artificial intelligence can then start to analyse that data for us and show us where our pinch points in the syllabus are. and by that i mean where each trainee is, and perhaps might want a piece of training, either courseware material or a technique from the instructor, to try to make that training as successful as possible. the greatest advantage of learning to fly like this is that when i need to get back down on the ground, i can hit a few keys, take the headset off and i am good to go. synthetic training isn't exclusive to aircraft. nearly every aspect of the battlefield and its surrounding environment can be simulated.
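the instructor above describes recording what a trainee does in the simulator, scoring each run and letting ai surface the "pinch points" in the syllabus. as a rough illustration only, here is a minimal python sketch of that kind of analysis; the metric names, scoring formula and threshold are all invented for this example and have no connection to any real bae systems or raf software.

```python
# illustrative sketch only: a toy version of the trainee-analytics idea
# described above. metric names and thresholds are invented.
from collections import defaultdict
from dataclasses import dataclass
from statistics import mean

@dataclass
class SortieRecord:
    trainee: str
    exercise: str             # syllabus item, e.g. "circuit landing"
    altitude_error_ft: float  # deviation from the briefed altitude
    speed_error_kts: float    # deviation from the briefed speed
    checklist_misses: int     # items skipped or done out of order

def score(rec: SortieRecord) -> float:
    """crude 0-100 score: penalise deviation from the briefed profile."""
    penalty = (abs(rec.altitude_error_ft) / 10
               + abs(rec.speed_error_kts)
               + 5 * rec.checklist_misses)
    return max(0.0, 100.0 - penalty)

def pinch_points(records: list[SortieRecord], threshold: float = 60.0) -> dict[str, float]:
    """return syllabus exercises whose average score falls below the threshold."""
    by_exercise: dict[str, list[float]] = defaultdict(list)
    for rec in records:
        by_exercise[rec.exercise].append(score(rec))
    return {ex: round(mean(scores), 1)
            for ex, scores in by_exercise.items()
            if mean(scores) < threshold}

# usage: feed in simulator logs and flag where extra courseware or
# instructor time might be worth targeting.
logs = [
    SortieRecord("trainee_a", "low-level nav", 220.0, 18.0, 2),
    SortieRecord("trainee_b", "low-level nav", 180.0, 12.0, 1),
    SortieRecord("trainee_a", "circuit landing", 40.0, 4.0, 0),
]
print(pinch_points(logs))
```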
8:40 pm
the software powering these tools has evolved from the same tech as video games. the addition of ai allows them to behave in a much more realistic way, even replicating civilian activity. how does ai help in simulation? it's really difficult to replicate real-life scenarios. it's very difficult to get enough space to do the training in, and to get enough assets available, particularly if they are on operations. it can make incredibly complicated scenarios and the ai can then create the complexity that they need to train against. when it comes to aerial combat training, new ai-powered adversaries are proving to be a challenge even for experienced pilots. it definitely puts you through your paces. it puts you in positions you've not necessarily seen before. it fights a different doctrine that we are not necessarily trained against. so i think it's going to become the future. headset on, he is put through his paces. he is used to flying the raf's most advanced fighter, the typhoon.
8:41 pm
he is about to fly a virtual version of the same jet in aerial combat against a system created by developers from cranfield university. it is called the ai aided tactics engine. so if your opponent is also a human being, something is at stake for both of you, your lives are at stake. if your opponent in the real world isn't a human being, does that change things for you as a pilot? ai is learning and adapting to your reactions, so therefore it becomes quite difficult to train against. if you are fighting against other real-world aircrew, you potentially know the training they've been through. you know almost what to expect, whereas against this, you just don't know what to expect with it. the ai engine has come out on top. now it is my turn to take on an ai top gun.
8:42 pm
i lost him. can i get some altitude? outmanoeuvred at every turn, the ai made quick work of this novice pilot. just too elusive! pilots aren't just learning from the ai. in turn, it is learning from them too. it's refining skills which one day may be used to pilot drones in real-world situations, a scenario which for many presents a significant moral and ethical risk. that risk associated with technology is a critical area. it is not new. every technology that has been deployed in defence has a risk associated with it, and there is a very well-established moral, ethical and legal framework around how we evaluate the risk of any new capability alongside the operational capability and the imperative to use it. what happens if the adversary doesn't play by the rules? if they don't play by the rules of engagement, or they
8:43 pm
don't play by the ethical frameworks? we don't assume that adversaries will play by the same rules we do. but because we understand the technology, we understand how you would go about deploying autonomy outside of that framework, and when we understand the technology as such, we can understand the techniques we would use to counter that and defeat that threat. this is a glimpse of the future. it is called tempest, a joint collaboration between the uk, italy and japan. this proposed sixth generation stealth combat jet will have advanced radar and weapon systems, as well as flying with its own mini squadron of drones, the tempest acting as a flying command centre at a distance while the drones perform missions semi-autonomously. which begs the question: how long will the human being remain in the loop?
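as an aside on the report's point that an ai adversary "is learning and adapting to your reactions", here is a deliberately simple python sketch of what adaptation can mean in a training simulation: an opponent that tracks which manoeuvres a trainee favours and picks a counter to the most common one. the manoeuvre names, counters and update rule are all invented for this illustration; it bears no relation to the cranfield ai aided tactics engine or any real combat system.

```python
# toy illustration only: a bare-bones "adaptive adversary" for a
# training simulation. all manoeuvre names and counters are invented.
import random
from collections import Counter

# assume each basic manoeuvre has one counter-manoeuvre that beats it.
COUNTERS = {
    "break_left": "high_yoyo",
    "break_right": "low_yoyo",
    "climb": "lag_pursuit",
}

class AdaptiveAdversary:
    """chooses the counter to whichever manoeuvre the trainee has favoured so far."""
    def __init__(self) -> None:
        self.seen: Counter[str] = Counter()

    def choose(self) -> str:
        if not self.seen:
            return random.choice(list(COUNTERS.values()))
        most_common_move, _ = self.seen.most_common(1)[0]
        return COUNTERS[most_common_move]

    def observe(self, trainee_move: str) -> None:
        self.seen[trainee_move] += 1

# usage: a predictable trainee is countered more and more reliably,
# which is why the ai opponent in the report felt so hard to read.
adversary = AdaptiveAdversary()
for trainee_move in ["break_left", "break_left", "climb", "break_left"]:
    print(trainee_move, "->", adversary.choose())
    adversary.observe(trainee_move)
```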
8:44 pm
told you it was interesting. coming up, we will delve deeper into the issues surrounding ai on the battlefield, and we'll be speaking to mikey kay, a former senior raf officer in the british military, and dr peter asaro from the campaign to stop killer robots, a coalition of non-governmental organisations who seek to pre-emptively ban lethal autonomous weapons. join us for all that on ai decoded after this short break. stay with us here on bbc news.
8:45 pm
8:46 pm
contributor and presenter, priya lakhani, who's ceo of ai-powered education company century tech. joining us now are mikey kay, a former senior raf military officer, and dr peter asaro from the organisation stop killer robots, who's also a professor at the new school in new york, where his research focuses on artificial intelligence and robotics. thank you, both of you, for joining us here on ai decoded. perhaps if i can start with you and just ask you to give us a quick rundown of your understanding of how ai technology is being used on the battlefield at the moment. i think a good example is the process called the kill chain, which is a procedural approach to the precision guided munition process. and what that does basically is it assists with the identification and selection of a potential threat, and then the approach looks at what's called the cde, the collateral damage estimate, and will look in the vicinity of
8:47 pm
what the target is, let's say for example, two snipers on the second storey of a 30-storey building, and it will assess potentially what components within a certain radius of that target could be or could form some sort of collateral, whether that is endangering human life or endangering infrastructure. at the same time, you've got a significant amount of intelligence going into this process, whether that's human intelligence, intelligence you get from an informant, or imagery intelligence, electronic intelligence, listening to and tapping into phones, listening to radio frequencies or looking at imagery. then it will go into weapon selection and it will basically look at what type of weapon, whether it be a bomb from a platform like a fast jet, or whether it's an artillery shell, or precision guidance from a tank: what kind of weapon is most appropriate to minimise that collateral damage. various governments will have
8:48 pm
different tolerance policies on that, and it will also bring in what the rules of engagement are in terms of being able to prosecute the target. so, across all of those areas, including the battle assessments, which is effectively taking a photograph after the bombs hit, ai can inform all of those components of what is commonly called the kill chain. your organisation is called stop killer robots, which perhaps gives people an idea of where you are coming from. we've just seen ai being used to train pilots. do you have a problem with ai being used in that sort of capacity, or do you want ai not to be used at all when it comes to battlefield training and effectiveness? yeah, so i think there is a lot of valid applications for artificial intelligence across many different domains, from medicine to health care, and even in the military for logistics and training and things like that. we are really
8:49 pm
focused on autonomy in systems and ensuring that humans are ultimately making the decision to use lethal force and determining what is a valid and lawful target in armed conflict. and we have been working at the un for more than a decade, working on a treaty there. but of course there is many different kinds of applications, and there has been a lot of debate around exactly how to define these systems. as we just heard from the video and your previous speaker, there is a lot of different ways to integrate these into the complex operations of the military, which involves a lot of data, a lot of computers, a lot of people making decisions at different levels of command and control. so it is challenging to find ways to really regulate how that happens and ensure that humans remain in control. mikey, a question for you, because presumably one of the things that the military is trying to achieve here is less
8:50 pm
civilian harm, right? we know from the un that the civilian casualty ratio is about 9:1, so nine civilians to one combatant. making precision targeting theoretically more possible doesn't necessarily mean that the impact on mitigating risks to civilians is more probable, because when it comes to using artificial intelligence, it is about speed. if both parties have artificially intelligent trained weapons or drones and they are using this technology, speed is key in the process. and we saw, for example, with the system used by the idf, sources alleged that they actually increased the number of civilians they were permitted to kill when they were targeting a potential low-risk militant to 15-20 civilians. they would drop a bomb on an entire house and flatten it to try and achieve their goal. what do you think about ai making war more destructive in this sense and not helping us when it comes to reducing
8:51 pm
civilian harm? what you are talking about there is the collateral damage estimate, and the collateral damage estimate varies from government to government. and i think it's quite obvious, if you look at the tolerance policy of the idf, it is significantly different from experience of what the tolerance policy was of, say, the uk when it was operating precision guided munition strikes in iraq or afghanistan. i was part of that kill chain process in baghdad, so i am incredibly familiar with that and incredibly familiar with the collateral damage estimate and what the rules of engagement are. where ai can improve this, and you are absolutely right, when you talk about speed, speed is of the essence. so if you do have an imminent threat to life or to infrastructure, neutralising that threat through speed and accuracy is where ai can help improve things. so the collateral damage estimate, for example, ai will be able to speed up
8:52 pm
that assessment of what the potential collateral damage is. and when we talk about collateral damage, we are talking about a school within the potential radius of a certain weapon, or a bus passing by at a certain time of the day. and that is where the pattern of life comes in, which is effectively drones overhead of a target looking at what the pattern of life is of the various components surrounding that. so speed is critical. selection of the weapon, the speed at which the weapon can be selected from the rules of engagement and from the collateral damage estimate, is critical. so ai, for me, will speed up and make that process more accurate. ultimately, the top-level tier has to be the government's tolerance policy on what it is willing to accept in terms of loss of life. a moment ago we talked about human in the loop. so the slowest, and therefore the weakest, link here would arguably be the human in the loop. is there then a risk that the human will be cut out of the process? you're talking about speed and
8:53 pm
potentially the human could become the slightly slower component of that. but then a critical component to think about is ethics. ai isn't there yet. will it ever get there? i'm not sure. there are those that argue it will. there are those that argue you will always need a human in the loop to give that overlay of what the ethics are, what the rules of engagement are. the scenarios are very different: prosecuting different targets in different environments with different platforms, different weapons. i gave the example of two islamic state snipers on the second floor of a 30-storey building. the human has the ability to be able to select the weapon through technology, but also put, for example, a steel tip on the top of that weapon, called a penetrator, so it can go through 28 floors to the second floor with a delay, taking out the second floor without destroying anything
8:54 pm
else. so, it is a massively complex procedure, which ai will be learning how to do. but my advice, and certainly the way i would approach this, is that a human in the loop right now is imperative in order to minimise that collateral and minimise potential mistakes. and mistakes, sadly, there have been quite a lot. we haven't talked about the transparency of that either, in the sense that a lot of this is classified intelligence. would defence contractors be allowed... we are running out of time, but war games show that the use of ai would escalate quicker than it would otherwise. what are your thoughts about that? well, as you said, with speed, the decision-making happens in shorter time frames. the real difficulty is when more and more strategic decision-making, and decisions to engage a target or initiate an operation, you would actually have humans that would not be in control of the
8:55 pm
overall planning, the decisions to go to war, the decisions to escalate the conflict; it could all just sort of happen automatically. we've seen this already with online trading and flash crashes, where different algorithms interact with each other and lead to a stock market crash. we don't want this happening with autonomous systems and warfare. but i think, to the question you asked before about precision weapons, what we know is this is automation. automation increases speed and reduces cost. by reducing the cost of bombing each individual target, that means you can afford to bomb a lot more targets. so if you hit a certain percentage of civilians with each strike, but now you can strike many, many more things, you wind up having a much larger impact on the civilian population, even though you have increased precision. it's not automatic that these systems will improve the impact on civilians.
8:56 pm
i'm going to have to stop you there. i'm sure we could talk about this all evening. it's a fascinating subject and we appreciate your time. thank you, and here in the studio, thank you so much for joining us. that is it. we are out of time. ai decoded will be taking a well-deserved break for the month of august, but don't worry, we will be back in full force at the beginning of september, so please join us then. hello there, good evening. the air still muggy and humid throughout today, lots of moisture in it, some misty, murky conditions, lots of low cloud, particularly towards western-facing coasts and hills. some sunny spells at times, most of those to the east of high ground, but the sunshine tomorrow will be a lot more abundant, the air feeling fresher and less humid still. the chance, though, of some showers, particularly in the north and the west, and that change is going to happen overnight.
8:57 pm
tonight, we'll see these weather fronts push further southwards and eastwards. the skies will largely clear. still some showers in the far north and the west, but it's a cooler start to the day tomorrow and it should be a brighter one as well. so, not quite so much cloud around tomorrow, much more in the way of sunshine, this time from the word go for most of us. there will still be some showers pushing eastwards on that westerly wind across scotland, northern ireland, perhaps northern england too, a few more isolated showers across wales and the far southwest of england. some of those showers could potentially be heavy, but the further south and east you are, then the drier your day is likely to be. and in the best of the sunshine, of course, the temperatures will react, 19-23 celsius north to south, but just a different feel to things. and it changes again on saturday. high pressure tries to build in, but there's a weather front out towards the west that's pushing eastwards, bringing with it some cloud and some showery outbreaks of rain across northern ireland, through western wales, the north of england, some of the showers possibly sharp,
8:58 pm
and there'll also be some showers across much of scotland, too. but again, across east anglia, perhaps the southeast of england, it could stay largely dry and temperatures here will rise to 22 or 23 degrees, but of course it will feel cooler underneath the layers of cloud. we're much more likely to see a dry day across the board on sunday. that's because high pressure will be building in from the azores. always more cloud towards the north and the west, and there will be some areas of cloud, i think, bubbling up here and there as we head through the afternoon, but some decently long, sunny spells, and temperatures more widely will peak in the low 20s. i think we could get the mid 20s, perhaps, across london and the south east as we head through sunday. and those temperatures in the south and the east of england, in particular, will start to rise, perhaps to the high 20s into monday and tuesday. more sunshine, too, across wales, the mid 20s here, but further north and west, it's cooler with a chance of some showers. bye-bye.
8:59 pm
hello, i'm sarah campbell. you're watching the context on bbc news. well, welcome back, mr prime minister. we've got a lot to talk about. i think we should get to it. the floor is yours. we need to bring the war to an end, and one of the principal things that the president's going to talk to the prime minister about today is how we get there, how do we end this war. mr netanyahu will then meet kamala harris, the vice president, but also now the likely
9:00 pm
democratic presidential nominee. so her position on all of this becomes increasingly important. joining me tonight on the panel: wills robinson from daily mail us and claire ainsley, formerly policy director for keir starmer, now director of the progressive policy institute. first, the latest headlines. there have been more protests after a video emerged of a police officer kicking a man in the head while he's on the floor at manchester airport. demonstrators gathered in manchester near the mayor's office to call on the authorities to take more action. greater manchester police has suspended one officer from all duties. the police watchdog says it has another referral from greater manchester police in relation to the incident.