
Click - BBC News - May 11, 2024, 1:30am-2:01am BST

1:30 am
voice-over: this is bbc news. we'll have the headlines for you at the top of the hour, which is straight after this programme. this week, we look at the latest tech being used to combat cancer. lara hits the lab to see how ai and robotics are being harnessed to create new treatments for the condition. we uncover the new blood monitoring system aiming to give cancer patients more independence. i just fell in love with the machine. it gave me a little bit
1:31 am
of control back as well. we meet the robots with a sense of touch. so, how did you sleep last night? did you sleep well? and spencer looks at how a video game engine can make movies. we can move the shard... laughter ..which does make me feel more powerful. according to britain's biggest cancer charity, someone is diagnosed with cancer here in the uk every two minutes. despite massive progress, many treatments are still incredibly harsh and not always successful. however, we now have technologies like ai and robotics which are helping us to discover more effective drugs, with fewer side-effects, quicker. one in two of us will be diagnosed with cancer in our lifetime - one in two! for years, treatment has been centred around surgery, chemotherapy and radiotherapy, sometimes a combination. cancer therapy has evolved over the last two decades,
1:32 am
and in particular is shifting away from chemotherapy for every patient and for every cancer to personalised treatment. in some cancers, we have really made great progress. and 10-15 years ago, melanoma survival rates - median survival rates - were around six to nine months. now, patients live, survive years. standard cancer therapies can be incredibly tough to go through. and one of the great new hopes for kinder treatment is immunotherapy. that's drugs that work alongside your own immune system to kill cancer cells. scientists here at labgenius in southeast london are harnessing the power of robotics and ai. they're hoping it'll help them find new immunotherapy treatments and make existing ones better. there's a lot going on in here - some of it by automated machines, some of it by people. that's right.
1:33 am
so, the different thing about what this machine is doing, and what you might envisage or see happening in a normal lab, is all of the experiments have actually been designed by an algorithm. and all of the data you're using is your own data? that's absolutely right. and actually, that is a really, really key point. we want to build models that are predictive of certain biological features of interest, and the challenging thing for us is that the data that's required to train those models doesn't exist anywhere in the public domain. but it's not just these few machines i've seen. it's everything. eva is a smart robotic platform, synthesising, purifying and testing molecules using machine learning and the clout that comes with cloud computing. and this room is the final bit of the lab process, to make sure that the molecules are tested for purity, that they can withstand changes in temperature and, crucially, that something has
1:34 am
been created that can be produced at scale. labgenius says that with its unique approach, it's more likely to find high-performing treatments faster. and they have tested that claim. we took a molecule that's currently in clinical trials. it's an immunotherapy drug? it's an immunotherapy, that's right, currently in clinical trials. we took the one that's furthest along. and we said, "can we use this process to make molecules that are any better than this?" we were hoping for a tenfold improvement in that molecule's ability to distinguish between healthy and diseased cells. but what we actually found is that these machine learning models were able to deliver molecules with a 400 times improvement in terms of their killing selectivity - so, really, orders of magnitude better. these are molecules that, as a protein engineer, you would never have sat down and designed yourself. and i think that's something that's really quite differentiated and special around this fusion of human ingenuity and machine intelligence. but they want to take this to the next level. cancer cells are densely covered in surface markers that distinguish them from normal cells. and it's these markers that molecules developed by labgenius are targeting.
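what's described here - experiments designed by an algorithm, models trained on the lab's own assay data, and candidates judged by how well they kill diseased cells while sparing healthy ones - amounts to a design-build-test-learn loop. below is a minimal python sketch of such a loop; the sequences, the mock "assay", the nearest-neighbour surrogate model and every number are hypothetical stand-ins, not labgenius's actual pipeline or the eva platform.

# a minimal sketch (standard library only) of a closed design-build-test-learn
# loop: propose candidate molecules, let a surrogate model trained on previous
# measurements pick which ones to test, "assay" those, and feed results back.
# all names, sequences and numbers are illustrative stand-ins.
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def mock_assay(seq):
    """stand-in wet-lab assay: killing selectivity = potency on diseased cells
    divided by potency on healthy cells (higher means fewer healthy cells hit)."""
    rng = random.Random(seq)                    # deterministic per sequence
    return rng.uniform(0.1, 1.0) / rng.uniform(0.01, 1.0)

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def predict(seq, measured):
    """toy surrogate model: predict selectivity as the average of the three
    most similar sequences already measured (nearest-neighbour style)."""
    nearest = sorted(measured, key=lambda s: hamming(seq, s))[:3]
    return sum(measured[s] for s in nearest) / len(nearest)

def mutate(seq, rng):
    """propose a single-point variant of an existing candidate."""
    i = rng.randrange(len(seq))
    return seq[:i] + rng.choice(AMINO_ACIDS) + seq[i + 1:]

def design_round(measured, rng, n_proposals=200, n_assays=8):
    """one cycle: propose widely, assay only the candidates the surrogate
    model ranks highest, then fold the new measurements back in."""
    parents = sorted(measured, key=measured.get, reverse=True)[:4]
    proposals = {mutate(rng.choice(parents), rng) for _ in range(n_proposals)}
    shortlist = sorted(proposals, key=lambda s: predict(s, measured),
                       reverse=True)[:n_assays]
    for seq in shortlist:                       # the only "expensive" step
        measured[seq] = mock_assay(seq)
    return measured

if __name__ == "__main__":
    rng = random.Random(0)
    start = "MKTAYIAKQR"                        # hypothetical starting molecule
    measured = {start: mock_assay(start)}
    for _ in range(5):                          # five closed-loop rounds
        measured = design_round(measured, rng)
    best = max(measured, key=measured.get)
    print(f"starting selectivity: {measured[start]:.1f}")
    print(f"best found:           {measured[best]:.1f} "
          f"({measured[best] / measured[start]:.1f}x the starting molecule)")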
1:35 am
they're called immune cell engagers. one part of the immune therapy is targeting a very specific cancer protein, or receptor. the other part is inviting t-cells to join the party and really attack cancer cells. so this is version two of targeted immunotherapy. it's a very pinpointed approach. you can attack much harder and with much less collateral damage. i particularly believe the preclinical development, which is now through companies like labgenius, is much more condensed, much quicker, supported by the right regulatory framework, much, much quicker. and really, you know, historically, drugs took ten years to be developed. it would bring us down to five years and even quicker sometimes. will we cure cancer? i think that's still an open question, but better understanding of biology, a more targeted approach will help us get most
1:36 am
of our cancer patients surviving, with good quality of life, for many years to come. wow! it really does feel like we are getting there with these cancer treatments. i know it's always going to be slow progress, but this is another example of the power of big data. yes, and what really stuck with me from that day was the testing of drugs on healthy cells - because many experts have said to me the big challenge isn't creating drugs to kill cancer, it's developing tolerable drugs to kill cancer. right. of course. yeah. ok. well, next, shiona mccallum has been looking at some new tech which could help to monitor cancer patients at home rather than in hospital. shiona: cancer has dominated lynn's life for the last seven years. you get ct scans, you've got mri scans, you've got blood tests, seeing the doctor, you've got all these other tests and things. and it does feel, at the very beginning, that every day, there was another appointment coming through. it's exhausting
1:37 am
physically and mentally. you're just waiting and waiting and waiting. she spent a lot of time away from her beloved cat, sunny, getting chemotherapy and blood tests in hospital - something she's always found stressful. i don't like needles. i don't like having bloods taken. you can be waiting two hours for the results. if the analysers break or something goes wrong, it can be 3-4 hours before you get your results for the blood tests - sitting in that blood room, getting more and more anxious, getting hotter and hotter. by the time i would go into the chair, i would probably... you know, i do faint. but lynn has recently had access to a tech solution that she says has made a huge difference to her life. when i did the training session, it was so simple to follow and to use.
1:38 am
this is liberty, a device which allows lynn to do a quick finger prick at home instead of a full blood test at the hospital. you can do it the day before you have your treatment or the day before you go and see your consultant, when you want to do it. you've got that little bit of control over your treatment and your time. i can do my blood test and i can go and do something else whilst it's being analysed. and the one thing that i did like about it as well — i didn't have to keep going and being reminded that i had the cancer. the kit has been designed by health tech company entia and is the world's first remote patient monitoring solution for people undergoing cancer treatment. the liberty has just had regulatory approval and is in the early stages of being rolled out to various hospitals. here at the christie in manchester, they were one of the first places to try out the tech. absolutely packed —
1:39 am
an all-too-familiar sight in a hospital waiting room. if you could do the test at home, you'd be far more relaxed. you wouldn't be worried about the appointment times, the waiting rooms, particularly if you're feeling ill and tired. and it's just another worry for you, another anxiety whilst you're going through your treatment. what would it mean if you could do this process at home, using...? it'd be amazing. wouldn't have to get childcare and wouldn't have to worry. you're more comfortable at home, aren't you? you've got your home comforts. coming here can be quite scary sometimes. and reducing patients coming through the doors was one of the reasons this hospital took a punt on the tech. what we've done recently at the christie has actually positioned phlebotomy units around the region. so we now have 10 or 12 units around the region, what we call "bloods closer to home". so the patients can book in themselves to have their blood
1:40 am
tests, but it still means we have to staff those units in order for the patients to be able to have the blood tests. if the patients were able to simply get those blood tests done or do them themselves at home, that would result in significant efficiencies. and that's really important. we're not trying to distance ourselves from patients, we're trying to make their lives easier. and during the clinical trial, the medical teams found that the blood testing at home was just as good as in the lab. and then the next thing, of course, is, "how acceptable is doing the bloods at home to the patients?" and i'm very pleased to say that it's a highly acceptable approach to the patients that we've tested this in. they also found it easy to use. the results from the devices are easy to read. i can see the white blood cells, the neutrophils, the haemoglobin and the platelets all on the screen. and it meant a lot less of this - giving blood - which many patients find particularly difficult. so what is the ambition for this bit of tech? so now that we've got regulatory approval,
1:41 am
we're really actively engaged with many nhs centres across the country and really looking at new clinical use cases that this solution can actually be applied to. but it's not too onerous for them, is it? yeah, so everything that we've done at entia has been designed to offer patients a very simple journey. i mean, we are not trying to medicalise patients further. we're really trying to take them out of the hospital setting, placing them back in the community, in their homes, in the place that they really want to be, and to provide them, then, with tools that are very simple and intuitive to use. so it looks like there will be more patients like lynn who will get the chance to access the liberty. but given the volume of cancer patients across the country, mass adoption is still a long way off. i just...fell in love with the machine, to be really honest with you, because it was just... it gave me a little bit of control back as well. lara: tiktok has filed a lawsuit aiming to block a us law that
1:42 am
would ban the video app in the country unless it's sold by chinese-founded company bytedance. the company called the act an extraordinary intrusion on free-speech rights. ofcom has warned social media sites they could be banned for under-18s if they fail to comply with new online safety rules. the uk's media regulator has published draft codes of practice which require social media firms to have more robust age-checking measures. at the moment, teenagers, younger children up and down the country can experience harmful content on their social media feeds again and again, and this has become normalised. and that has to change. researchers from mit and project ceti claim they're now a little closer to understanding how sperm whales communicate, all thanks to machine learning. the team used algorithms to decode the sperm whale's phonetic alphabet, revealing sophisticated structures in their communication, similar to our own phonetic system. and finally, apple has just announced its thinnest product ever in the form of a new ipad pro series. at just over 5mm, the devices will also come with a more powerful chip,
1:43 am
as well as a stylus in the form of the apple pencil pro. robots have integrated into our lives in many different ways, taking over tasks that were once exclusively performed by humans. but one thing that has been limiting them is their inability to feel — not emotions, but touch. and a company in edinburgh is working on bridging the gap. skin is the biggest organ in the human body, and it's a vital organ. if you look at most of evolution, you'll see that almost every animal out there has some form of sense of touch. so the idea was really that at some point, touch will be the bottleneck. it will be the limiting factor in robotics, to get robots into unstructured environments
1:44 am
in the real world. to give machines the subtle capabilities of human touch, they developed electronic skin by printing sensors onto different materials that can be applied to robots. they're fully flexible and printed and extremely thin, and they provide all of those sensations to a robot. if you want to, let's say, handle something very delicate, a strawberry or, in our case, you want to hold the hand of, you know, your grandmother in a hospital, or maybe a newborn baby, something like that, then you need a very highly precise sensor. taking it a step further, these sensations can then be relayed back to humans again through haptic interfaces, such as a body suit or a glove. despite its importance, touch has been overlooked in the field of robotics, making robots clumsy and unfit for a variety of basic tasks. the precise sensing capabilities of electronic skin enable robots
1:45 am
to engage with objects and humans in a more gentle manner. by detecting changes in pressure and touch, they can navigate environments with increased safety, steering clear of collisions and avoiding unintended harm or damage. to demonstrate this, they are deploying valkky, an avatar system equipped with electronic skin on its fingertips, to a hospital in finland. there, it is operated by nurses to support them in caring for patients, especially those who are immunocompromised. after the pandemic, they have realised that there are a lot of problems that they would like to solve, they would like to prepare for when something like this happens in the future. and this is where we came in and started figuring out, "how can we build a robot, build a system that can actually be useful in a hospital?" without this specific e-skin technology, the robot would be too dangerous to be near patients. as a nurse, you come in,
1:46 am
you sit down in a chair, you put on a vr headset. you have then a choice of either a vr controller with a couple of buttons and joysticks, or you use a haptic glove. so, how did you sleep last night? did you sleep well? are you experiencing any pain as of the moment? nurses can use the robot avatar to speak to patients, deliver food and medication and take physical measurements. we have a set of laser scanners that tell us how far different obstacles are. for more spatial awareness, we have a 360-degree camera, which means that, through the virtual reality, the user can look around at any time, even see behind. we have a nice little feature of a rear-view mirror. every time there is something happening behind the robot, you get to see that as well. then we've got the heat camera as well, to be able to tell people's temperature. in a hospital, where the risk
1:47 am
potential is high, these functionalities are the only reason they are allowed to run such a project. they also plan on using machine learning and the data sets they gathered to automate some of the nurses' tasks in the future. utilising ai, their aim is to develop autonomous and dextrous robots, capable of performing practical tasks such as repositioning patients or changing catheters. you can use as many cameras as you want to, but when it comes to actually detecting whether you are causing any harm, being able to sense touch is very important. and this is where we would like to cover the whole robot with electronic skin. the era of tactile intelligence in robotics has dawned. the sensation is still fundamentally very different from human touch, but integrating electronic skin is an emerging field that robots can now put their finger on.
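the safety role the e-skin plays - sensing contact pressure so the robot eases off before it hurts anyone, while relaying that sensation back to the operator's glove - can be pictured as a simple control loop. the python sketch below is purely illustrative, assuming made-up sensor values, thresholds and interfaces rather than the real valkky control stack.

# a minimal sketch of an e-skin safety loop: read fingertip pressure every
# control tick, slow the arm as contact builds, stop above a hard limit, and
# relay the reading to a haptic glove. all values and interfaces are stand-ins.
import random
import time

SOFT_CONTACT_N = 0.5     # newtons: start easing off
HARD_LIMIT_N = 2.0       # newtons: stop immediately

def read_fingertip_pressure():
    """stand-in for sampling a printed e-skin sensor array (newtons per taxel)."""
    return [random.uniform(0.0, 2.5) for _ in range(4)]

def scale_velocity(peak_force, commanded):
    """reduce the commanded velocity smoothly between the soft and hard limits."""
    if peak_force >= HARD_LIMIT_N:
        return 0.0
    if peak_force <= SOFT_CONTACT_N:
        return commanded
    remaining = (HARD_LIMIT_N - peak_force) / (HARD_LIMIT_N - SOFT_CONTACT_N)
    return commanded * remaining

def send_to_haptic_glove(forces):
    """stand-in for relaying the sensation back to the operator's glove."""
    print("glove feedback:", [round(f, 2) for f in forces])

if __name__ == "__main__":
    commanded_velocity = 0.05                 # metres per second, hypothetical
    for tick in range(5):                     # a few control ticks
        forces = read_fingertip_pressure()
        safe_velocity = scale_velocity(max(forces), commanded_velocity)
        send_to_haptic_glove(forces)
        print(f"tick {tick}: peak {max(forces):.2f} n -> velocity {safe_velocity:.3f} m/s")
        time.sleep(0.01)                      # pretend 100 hz control loop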
1:48 am
indistinct radio chatter. spencer: you know, there's something strange... ..in this neighbourhood. music: ghostbusters by ray parker jr. but, no, this isn't the big new ghostbusters movie currently hitting cinemas. this is a short film made using video game graphics. the software behind it is unreal engine, a system that allows video games developers to build 3d worlds where the physics, lighting and non-player characters all behave in a realistic way in real time. # i ain't afraid of no ghost... # the motivation behind the project originally was to see how far we could push real-time graphics, and really to see if we could shoot what would normally be a post-visual effects shot that would potentially take weeks or months
1:49 am
to process after the fact, if we could shoot that live on a motion capture stage using current real-time gaming technology. and so sony pictures connected unreal engine up to their motion capture stage, called in ghostbusters: afterlife director jason reitman and asked him to play around. the stay puft marshmallow man was puppeted by a motion capture actor, and reitman used a vcam - a kind of virtual camera - to record the action from a variety of angles that would have been pretty expensive to do in the real new york. he could shoot from street level, he could get up to building height, he could shoot from cranes, he could do drone shots. there's a great example of a happy accident during the shoot, where jason stepped backwards into a building and actually really liked the shot through the window, so we ended up doing some shots through a building window, down onto a street. and those buildings weren't built from scratch.
1:50 am
see, unreal engine comes with 20 square kilometres of ready-made city to work from, complete with autonomous inhabitants. there's a system within the city where traffic is driving and coordinating junctions and avoiding other vehicles in a completely kind of organic way. we made some enhancements to that. very specifically, we had the challenge of driving the ecto through some busy streets and needing vehicles to pull out of the way of an emergency vehicle with sirens. so, we made some modifications for the vehicles to automatically detect when there was a siren from an emergency vehicle - if they could hear that coming, they would all pull to the side of the road, so the ecto could drive straight through. having full access to and full control of a few city blocks means that this kind of production can be achieved without a massive film crew having to shut down new york city... johnny! siren blares. ..or london, for that matter.
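the yield-to-siren behaviour described there is, at heart, a small piece of per-vehicle logic: if a siren is within earshot, pull towards the kerb and slow down; once it has passed, rejoin the lane and resume speed. the python sketch below illustrates that logic only, with made-up distances and speeds - it is not unreal engine's traffic system or the modifications made for the shoot.

# a minimal sketch of per-vehicle yield-to-siren logic. all numbers,
# names and behaviours here are hypothetical stand-ins for illustration.
from dataclasses import dataclass
import math

SIREN_EARSHOT_M = 60.0    # hypothetical audible radius

@dataclass
class Vehicle:
    x: float
    y: float
    speed: float
    lane_offset: float = 0.0   # metres from lane centre, positive = kerbside

def hears_siren(vehicle, siren_pos):
    return math.dist((vehicle.x, vehicle.y), siren_pos) <= SIREN_EARSHOT_M

def update_vehicle(vehicle, siren_pos, dt=0.1):
    """one tick of the traffic update: yield if a siren is audible, else resume."""
    if siren_pos is not None and hears_siren(vehicle, siren_pos):
        vehicle.lane_offset = min(vehicle.lane_offset + 2.0 * dt, 2.5)  # pull over
        vehicle.speed = max(vehicle.speed - 5.0 * dt, 0.0)              # brake
    else:
        vehicle.lane_offset = max(vehicle.lane_offset - 2.0 * dt, 0.0)  # rejoin lane
        vehicle.speed = min(vehicle.speed + 3.0 * dt, 13.0)             # resume ~30 mph
    vehicle.x += vehicle.speed * dt                                     # drive forward

if __name__ == "__main__":
    car = Vehicle(x=0.0, y=0.0, speed=13.0)
    ecto = (40.0, 0.0)                        # approaching emergency vehicle
    for step in range(30):
        update_vehicle(car, ecto if step < 15 else None)  # siren passes halfway
    print(f"after the siren has passed: offset {car.lane_offset:.1f} m, "
          f"speed {car.speed:.1f} m/s")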
1:51 am
yeah, we can move the shard... laughter ..which does make me feel more powerful. laughter we can move... move... yeah. let's move the tower of london. the tower of london. that's impressive, isn't it? this is flite, a full 15-minute film made in unreal engine by oscar-winning visual effects director tim webber. again, he was able to build a futuristic city, shoot from any angle and pick the weather, the time of day and lighting conditions in a way that would be impossible in the real world. we've ended up with an over-three-minute shot of a continuous action sequence, with our actress riding a hoverboard, with jet cops flying in, very, very carefully choreographed. if you wanted to shoot that for real, it would take you weeks. you couldn't shut the bridge down for weeks. you know, one long, continuous shot - i mean, it would be... even for a really, really big-budget feature film - and we weren't a big-budget film - that would be an exceedingly big challenge.
1:52 am
this technology has come so far from the days when actors and directors would have to imagine what their cgi sets, which would take months to render after filming, would look like. as well as using the virtual camera, we sometimes did virtual scouting and exploring the locations in vr. so we'd put the headset on, and we could very quickly move around, and a few of us could be in the same space virtually at the same time. so that's the equivalent of being on a real film set and walking around and going, "ok, i think we need to be over here, or i think we need to move that..." absolutely. yeah, it's totally virtual, but you can move wherever you want. and even...we put the headsets on the actors so that they could explore the environment, too, so that they could... when they were acting, they had a full understanding of the environment they were going to be in. tim is building on his work on the film gravity - the one that won him the oscar - in which, for the most part, the only real things in the shot were sandra bullock
1:53 am
and george clooney's faces. framestore pioneered the use of led walls to show the actors what they were performing against. and these days, the real-time abilities of unreal allow the scenery to be completely dynamic and react to improvised and unprepared camera moves on the day of the shoot. however, tim still thinks that the actors' performances themselves shouldn't be synthesised. the whole world is created in the computer. it's cgi except for the faces. creating virtual faces is very challenging and expensive and hard to make properly engaging. it's very easy to fall into the uncanny valley. faces aside, though, it does look like there's a new tool in the box for film-makers for whom a location shoot was previously just a pipe dream. so, next time you want to wreck or save a city, who you gonna call? ghostbusters!
1:54 am
# if ya all alone pick up the phone... # electronic whirring. engine revs. ghostbusters! why are we here on this roof?! we could have been anywhere. who's to say we really are on this roof? anyway, that's all we've got time for. thank you for watching. see you soon. bye. hello there. settled and warm again on friday, with plenty of late spring sunshine around and temperatures rising across the four nations into the low 20s in celsius. the warmth is set to last
1:55 am
as we head through the weekend. temperatures will remain above the seasonal average, warmest towards the east. and it's still dry for the vast majority of us on saturday. a scattering of showers, with the real breakdown happening on sunday. heavy, thundery showers out towards the western half of the uk. further east should stay largely dry. and here is the area of high pressure that's keeping these dry, settled conditions for the time being. it will eventually push further eastwards into scandinavia, but we've got a bit of an easterly breeze, and that's been dragging some mist and low cloud in from the north sea. but that will lift and clear across the southeast of england and east anglia through saturday morning. still maybe a hang-back of cloud towards parts of the yorkshire and lincolnshire coast, though. lots of sunshine to start the day and we'll keep the sunny skies for most through the afternoon. but a scattering of showers across scotland pushing northwards, perhaps some heavy and thundery, but they'll be fairly isolated. it's still very warm - 24 degrees celsius in glasgow. chance of a shower, too, across northern areas of northern ireland and north wales.
1:56 am
a little cooler towards these north sea-facing coasts, with some of the cloud possibly lapping onshore again at times. 25 or 26 degrees celsius in london and southeast england. so the high pressure starts to push further eastwards as we head through sunday. that allows for these low pressure systems to roll in from the west. and this weather front will bring us thickening cloud across the south-west of england, western wales, on sunday morning. some showers across the western isles and western scotland, pushing into northern ireland, and the chance of some thunderstorms developing all across the western half of the uk. but it should stay drier further east. again, there will be a lot of sunshine here, and once again we could see temperatures in the low to the mid-20s in celsius. but cooler out towards the west, of course, underneath the cloud and with the eventual rain. and here comes that low pressure system swinging in as we head through monday. it's going to give us quite widespread rain on monday, especially through the afternoon, so expect it to turn a lot more showery as we head through next week.
1:57 am
and there'll be a drop in temperature, too, so unsettled and cooler as we head through next week. bye-bye for now.
1:58 am
1:59 am
2:00 am
live from washington. this is bbc news. a us state department report criticises israel's conduct in gaza - but stops short of recommending the us halt weapons supplies. in gaza, the territory's main un aid agency warns it only has three days' worth of food remaining. and russian forces launch a surprise cross-border attack on ukrainian territory near the city of kharkiv. hello, i'm caitriona perry. you're very welcome. the us has released a report to congress, finding that israel may have used american-supplied weapons in breach of international humanitarian law
2:01 am
in some instances during the war in gaza. the document says, however, that the us government
