
Shift | Deutsche Welle | August 8, 2020, 7:15pm-7:31pm CEST

7:15 pm
conditions that are hindering the relief and recovery effort now. You're watching DW News. Don't forget, you can always get all the latest on our website at dw.com, and you can follow us on Twitter and Instagram at dwnews. More news from me in 45 minutes. From me and the entire team, see you then.

Global Ideas is on its way to bring you more conservation. How do we make cities greener? How can we protect habitats? We can make a difference. Global Ideas, the environmental series, on DW and online.

Beethoven, it is for me.
7:16 pm
Beethoven, it is for her. Beethoven, it is for him. Beethoven is for everyone. Beethoven 2020, the 250th anniversary year, here on DW.

Can you spot the fake? Deepfakes, videos faked by AI, seem deceptively real, and they're a serious threat to society. In order to expose the fakes, two experts are embroiled in a battle in the name of science. While one of them fakes, the other unmasks. We met up with both of them. Deepfakes are our topic today on Shift.
7:17 pm
Some deepfakes are funny, others are really mean. It's fascinating how researchers can manipulate videos with the help of artificial intelligence. I tried out what it was like to control someone else's face. Check me out as a handsome... or quite human. It's pretty impressive to see how easy that was with the right technology, and in real time too. I don't even want to imagine what could happen if a fake president declared war. How can we stop fake news or deepfakes? That's what scientists are asking themselves too. We met the two most renowned deepfake experts in the US. Their academic competition is advancing research on the issue. Most consider Hao Li to be the best deepfake artist in the world. The German computer
7:18 pm
scientist teaches at the University of Southern California and manipulates videos for science. "As odd as it sounds, we are also trying to advance it in order to be able to detect them, right? Because if you don't understand how they are being generated, if you don't have the capability of generating realistic fake faces, there is no way to effectively detect those either." His opponent, Hany Farid, uncovers deepfakes. A professor at the University of California, Berkeley, he's a world-leading expert in digital forensics. "I think the biggest obstacle in detection is that the field is moving so quickly. Every three months you see new advances in the creation of fake content. So our adversary, if I can call them that, is not static. They are changing, and they're changing really, really fast. So we start developing these new forensic technologies and all of a sudden there's a whole new way of creating deepfakes, and now we have to worry about that as well."
7:19 pm
It's crazy to see how quickly this technology is advancing. We're used to it from photos; technically, we're all able to alter them at home with basic software. But videos? A few years ago, moving images were proof that something had really happened. Deepfakes have taken this to another level. There's growing fear of fake information circulating without anyone noticing, especially when it comes to big events like the US presidential elections. Faking, whether it's photos or videos, isn't new, but it's getting better, as both experts warn. "We've always had fake news, from the day of the printing press being developed. But here's the difference: what's new here is not that we can create fake news, we can create fake images, we can create fake videos. What's new here is the sophistication with which we can do it, the democratization of access, so almost anybody on the internet can do it. But here's the real issue: instant global
7:20 pm
publication." The scariest thing is that it feels like you're stealing someone's identity: you can put any words you want into their mouth and you can make them do whatever you want. So far, deepfake technology is used mainly for one purpose: porn. One study shows that this amounts to 96 percent of the deepfakes online today. Stars like Emma Watson and Scarlett Johansson were some of the first prominent victims; their faces were superimposed onto pornographic videos with the help of AI. The program enabling such fakes was first uploaded to the online forum Reddit. The anonymous user, called "deepfakes", was quickly banned, but other users took over his software code and kept expanding it. Today, similar software is still freely available online. The name is a combination of "deep", as in deep learning technology, and "fake". But how exactly do deepfakes work? How hard is it to make them look real? Deepfake expert Hao Li
7:21 pm
explained it to me in Los Angeles. Many deepfakes are based on so-called GAN algorithms that are unsupervised, meaning they learn without human guidance. Instead, they're trained with raw data, real photos for example. "You're not modeling anything, you're just collecting a bunch of faces. You need to make sure they're aligned like this and you crop the faces right, and manually you would try to make sure that there's a balance of different lighting conditions, different expressions, and it just learns that by itself. This is what is so powerful about these deep learning based approaches." The novelty in this approach is that two algorithms work together: one generates and the other checks, almost like Hao Li and Hany Farid. "It's like an arms race, right? You have the generator that tries to generate some images. In the beginning, you know, it's just like some random pixels, but then what you try to do is, instead of telling it where
7:22 pm
to improve like traditional supervised learning, you just tell it: no, it's not good enough. So you improve, improve again, until you fool the discriminator, so that the discriminator can't tell the difference anymore." In the process, the algorithms improve on their own, without human assistance. This one-minute-long video took four days to make. It's a deepfake generated by Hao Li's team. To create it, they used face swap technology and videos of comedians imitating US politicians. To get the best possible fake, the algorithms were fed 5,000 images. But there's no secret recipe for the perfect deepfake. "Sometimes we don't know why we add more data and it gets worse. So there's a little bit of, like, magic and, you know, cooking in there."
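To make the generator-versus-discriminator "arms race" concrete, here is a minimal sketch of such a training loop in PyTorch. It is not the researchers' actual code: the tiny fully connected networks, the 32x32 grayscale image size, the batch size and the random stand-in data are all illustrative assumptions.

```python
# Minimal GAN sketch: a generator learns to fool a discriminator,
# while the discriminator learns to tell real data from generated data.
import torch
import torch.nn as nn

latent_dim, img_dim = 64, 32 * 32          # assumed sizes, for illustration only

generator = nn.Sequential(                 # turns random noise into a fake "image"
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, img_dim), nn.Tanh(),
)
discriminator = nn.Sequential(             # guesses: real photo or generated one?
    nn.Linear(img_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCELoss()

for step in range(1000):
    real = torch.rand(16, img_dim) * 2 - 1     # stand-in for a batch of real photos
    noise = torch.randn(16, latent_dim)
    fake = generator(noise)

    # Discriminator step: learn to label real as 1 and fake as 0.
    d_loss = loss_fn(discriminator(real), torch.ones(16, 1)) + \
             loss_fn(discriminator(fake.detach()), torch.zeros(16, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: its only feedback is the discriminator saying "not good enough".
    g_loss = loss_fn(discriminator(fake), torch.ones(16, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

The only signal the generator ever receives is the discriminator's verdict, which is exactly the back-and-forth described above: the two networks push each other to improve without human labeling.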
7:23 pm
We've all got photos on social media, right? I think it's pretty scary to think that somebody could use my pictures to make a fake video of me. That's why methods to flag the fakes quickly and reliably are becoming more and more important. Tech giants like Facebook and Microsoft have launched the Deepfake Detection Challenge, with a prize of nearly a million euros. Here at DW, we've developed a verification program, Truly Media, which can help analyze doctored content. There are also a number of specialized startups working on the issue. Hany Farid is an expert in digital forensics. He uses Hao Li's deepfakes to fine-tune his verification programs, and he's always on the lookout for the best way to use AI. "The AI approach is exactly what you think it is: you get a bunch of real videos, you get a bunch of fake videos, and you teach a machine learning system to tell the difference." But that's a sensitive subject: since deepfakes are also created with machine learning, a detection tool could just as well be used to generate better fakes.
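As a rough illustration of that "pile of real videos, pile of fake videos" approach, here is a small sketch that trains a classifier on labeled feature vectors. The feature extraction is a placeholder built from random numbers, and the 64-dimensional features and logistic regression model are assumptions made for brevity; a real detector would decode video frames and feed them to a learned network instead.

```python
# Sketch of deepfake detection framed as binary classification:
# label examples as real (0) or fake (1) and fit a classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def video_features(label: int, n: int = 200, dim: int = 64) -> np.ndarray:
    """Placeholder per-video features; real systems would use learned embeddings."""
    return rng.normal(loc=0.3 * label, size=(n, dim))

X = np.vstack([video_features(0), video_features(1)])   # 0 = real, 1 = fake
y = np.array([0] * 200 + [1] * 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

The double-edged nature mentioned in the report is visible even here: whatever features let the classifier separate the two piles are also a recipe for what a faker should try to hide.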
7:24 pm
But Hany Farid goes beyond the ordinary approach. Instead of just relying on AI, he and his team analyze videos to identify people's unique facial features. "We try to understand how people move, how the mouth moves, how the expression changes, in part because we think that those are properties of the video that machine learning can't currently analyze, and that means we are less vulnerable to counterattack. So any time you build forensic techniques, you always have to ask yourself: how will this be used to make better fakes?" The software visualizes aspects such as head movement, line of vision and facial expressions in order to spot differences. Hany Farid compares a real video with a video that was possibly altered. This method detects about 96 percent of faked footage. One possible use for this might be to integrate detection software into social networks' algorithms in the future.
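The sketch below illustrates the general idea of such a behavioral comparison: summarize how a person's head and face move over time, then measure how far a questionable clip strays from genuine reference footage. The landmark tracking is stubbed out with random data, and the correlation-based signature and the flagging threshold are assumptions for illustration, not Farid's actual method.

```python
# Sketch of behavioral ("soft biometric") comparison of two videos of the same person.
import numpy as np

rng = np.random.default_rng(1)

def facial_motion(frames: int = 300, n_features: int = 16) -> np.ndarray:
    """Placeholder: per-frame measurements such as head pose angles and mouth openness.
    In practice a face-tracking tool would supply these values."""
    return rng.normal(size=(frames, n_features))

def motion_signature(track: np.ndarray) -> np.ndarray:
    """Pairwise correlations between feature time series, as a compact 'style' vector."""
    corr = np.corrcoef(track.T)
    return corr[np.triu_indices_from(corr, k=1)]

reference = motion_signature(facial_motion())   # genuine footage of the person
suspect = motion_signature(facial_motion())     # video being checked

distance = np.linalg.norm(reference - suspect)
threshold = 3.0                                 # illustrative cutoff, not a calibrated value
print("flag as possible fake" if distance > threshold else "consistent with reference")
```

The appeal of this style of check, as described above, is that it relies on how a specific person habitually moves, something a generic face-swapping model does not explicitly copy.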
7:25 pm
Google has already provided scientists with some 3,000 manipulated videos to help improve the detectors. To make these 3,000 videos, Google hired 28 actors and then manipulated the images with algorithms. So tech giants seem to be taking deepfakes very seriously, and it's no surprise: experts predict the damage caused by manipulated audio and video files could reach nearly a quarter of a billion euros in 2020. And what's worse, there's no perfect program to help us detect the fakes yet. But every new technology has a good side too, right? Deepfakes also offer a few exciting possibilities for the future. "Imagine you can have a customized movie, where when you download a movie you can say, I want this actor or this actress to be in the movie instead of whoever it is, that sort of thing. This is a huge opportunity for the movie industry." Soon, visual effects could be much cheaper
7:26 pm
to produce. But so far, deepfake technology cannot create images in the same high resolution as movies. It's good enough for YouTube videos, though, which is how Nicolas Cage gets cast in The Godfather. Deepfake technology could also bring celebrities back to life, like the Spanish painter Salvador Dali. In the Dali Museum in Florida, visitors can even pose for a picture with him. "I think one of the most interesting uses is in Hollywood studios for creating movies. So if we have a movie that is filmed in German and we want to listen to it in English, you create a lip-sync deepfake where the person now looks like they're speaking English." Super cool, like this awareness campaign in nine languages: "Malaria isn't just any disease."
7:27 pm
The creators animated a 3D model of David Beckham's face. Wow, that's cool. Soon you might be able to watch me host Shift in Hindi or Kiswahili, completely in sync, and perhaps even live. It's just a matter of time before we're able to manipulate 4K videos in real time; it's just up to computers getting a little bit faster. But back to more serious matters: people around the world need to know about this new technology. Hao Li presented his face swap studio at the World Economic Forum in Davos. "The most important thing is actually that people know that this is possible, right? If they don't know, they can easily be fooled. If they know that there are ways to create manipulations that are extremely believable, then, you know, we have some protection." Honestly, I have to say that both
7:28 pm
experts, Hany Farid and Hao Li, have managed to calm my worries about deepfakes a little. Fake news has always been a huge problem, not just today, and deepfake technology isn't even necessary for it. The more people are aware of the possibilities to manipulate the news, the better they can deal with misinformation. Whether something is true or false, that's not a question of deepfakes. What do you think? Are you scared of manipulated videos, or is this all just a storm in a teacup? Let us know what you think. You can do this on YouTube too: there you can find a video of me getting my very own 3D avatar at Hao Li's lab in Los Angeles. I even sat in the same chair as Will Smith. The hardest part was I had to make 26 different faces. Check it out on our YouTube channel. Thanks for watching, and see you next time.
7:29 pm
On The 77 Percent, we talk about this issue. Come onto the streets of Nairobi: how has COVID-19 transformed the city at night? What happens to all those service men and women who rely on the activities of the night to fend for themselves? Well, I want to find out.

This is polar ice in thin slices, and he can interpret it: physicist John Haas. How does ice form? How does it expand in winter, and how does it melt in summer? And what does this mean for the Earth's climate? Big questions, big expedition.
7:30 pm
Hello and welcome to The 77 Percent! This edition explores how we have been dealing with the coronavirus pandemic. So fasten your seatbelts and let's drive through. I'm your host. So here's what's coming up on the program.
7:31 pm
as a culture. we explore the.
