
RIK Rossiya 24 (RUSSIA24) — March 4, 2023, 10:30pm-11:01pm MSK

10:30 pm
The number of fake videos has increased dramatically, and here is how specialists intend to deal with them. A special report by Alisa Romanova. [commercial break]
10:31 pm
[commercials continue: ice cream; gifts, discount cards and an apartment raffle at the Lenta supermarket chain; a bank advertisement]
10:32 pm
[commercial: the "Professionalitet" vocational programme — "real opportunities to operate complex digital machines; we will teach you to think with your hands; leading employers are already waiting for our graduates; you are in good company"] Politicians
10:33 pm
saying things they have never said in public in their lives; artists who have died but whose digital copies perform for them; actors who have shed 30 years on screen without plastic surgery. All of this is already possible with face-replacement technology, the deepfake. But where can it lead us? For most people it is either a joke or a threat. "If
10:34 pm
you install our technology on your TV channel, you can always determine exactly whether the image has been altered — whether it is a deepfake or not." The word "deepfake" is built from two terms, deep (machine) learning and fake, and many developers dislike it for its negative connotation: yes, fake, but this is the future. What opportunities have opened up: you can become any celebrity, any idol, and say whatever you want on their behalf. Natalie Portman, like dozens of other Hollywood stars, felt the full horror of the new technology when their faces began to be massively superimposed onto actresses in 18+ videos of the most explicit content. So what should be done so that we neither hinder the development of the technology nor allow it to be abused?
10:35 pm
"You're late — everyone to the penalty area at once!" "If you had told me that in 2027 I, Jason Statham, would celebrate my anniversary in Russia, I would have laughed." The star of Guy Ritchie's and Fast and Furious films, Jason Statham, has "moved" to a Russian village; Keanu Reeves wears a t-shirt with the Olympic bear ("We have been filming together in Russia for five years; we were friends") and buys a Zhiguli ("I believed that 200,000 for the car was an adequate price"). All of it was made with face-swapping technology. Every shot of the body doubles, the director of the world's first deepfake series
10:36 pm
coordinated with the specialists: "We sent them a screenshot and they said yes or no — whether it was too dark, and so on." Digital masks were then fitted over the doubles' faces. "You can't shoot a profile, you can't invent shots that circle fully around a person — although in some episodes that happened by accident." The masks were created by neural networks that are continuously trained on a particular person. The ideal angle is full face or, more precisely,
10:37 pm
an incomplete full face, a three-quarter view. For the author of the series it was important to create a parody: by law, the harmless use of stars' faces is not regulated anywhere in the world. But what opportunities have deepfakes opened up for world cinema? This film company has been working with the technology for several years: the more similar the actor is to the prototype, the more realistic the video will be. "I think it will soon come to this: once a year a digital copy of a person is made — high-quality video filmed from all sides. You're already old, you're already tired, you no longer want to act, but a thirty- or forty-year-old you goes on working in the cinema and earning you money. I think it's quite realistic." From a mass of videos and photos the algorithm studies everything and creates
10:38 pm
an imitation. And then a fake businessman appears who says: "I'm selling the company, we have losses, everything is very bad." The stock falls, and at that moment someone makes money on it. Unfortunately, modern technologies are very often mastered by criminals first. Protection against deepfakes is being developed by several companies in Russia. Sberbank has patented two inventions: its algorithms analyze a biological feature of a person — the change in complexion that occurs every time the heart beats and blood evenly fills the vessels of the face, then recedes. If there is no such change in the shading of the face, or if its frequency does not match a human pulse, the verdict, with almost 80% confidence, is that on the right
10:39 pm
is a synthesized person. "We cannot see this with the naked eye, so we lay the video out frame by frame and analyze each frame, which is an exceptional feature compared to other similar solutions. We can also detect several faces in the frame simultaneously, which is likewise exclusive to our solution."
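As a rough illustration of the pulse-based idea described above — a minimal sketch of the general rPPG approach, not Sberbank's patented algorithm — one can track the average green-channel brightness of the face across frames and check whether its dominant frequency falls in a plausible heart-rate band. The face detector, the band limits and the thresholds below are generic placeholder choices.

    # Minimal sketch of pulse-based liveness checking (assumes OpenCV and
    # NumPy). A live face shows a faint periodic color change in sync with
    # the heartbeat; a synthesized face usually does not.
    import cv2
    import numpy as np

    PULSE_BAND_HZ = (0.7, 3.0)  # roughly 42-180 beats per minute

    def face_green_signal(video_path):
        """Mean green-channel brightness of the first detected face, per frame."""
        detector = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        cap = cv2.VideoCapture(video_path)
        fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
        values = []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = detector.detectMultiScale(gray, 1.3, 5)
            if len(faces):
                x, y, w, h = faces[0]
                values.append(frame[y:y + h, x:x + w, 1].mean())
        cap.release()
        return np.array(values), fps

    def looks_alive(video_path):
        """True if the face color oscillates at a plausible heart rate."""
        signal, fps = face_green_signal(video_path)
        if len(signal) < 3 * fps:  # need a few seconds of face footage
            return False
        signal = signal - signal.mean()  # drop the DC component
        spectrum = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
        dominant = freqs[spectrum.argmax()]
        return PULSE_BAND_HZ[0] <= dominant <= PULSE_BAND_HZ[1]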
10:40 pm
The result looks very much like Chancellor Scholz: "Tell me, American — you have a lot of money. We wanted to give up Russian gas, but, in the words of a Russian classic, we wanted the best and it turned out as always." This video is a deepfake, and we made it with the help of an actor and our neural networks. Sber actively uses facial biometrics: employees enter the office by Face ID, and customers confirm their identity and transactions the same way, so it was important for the company to prepare for the risks in advance. "If our technology is installed on your broadcast channel, you can always determine exactly whether the image has been changed, whether this is a deepfake or not. Our goal is for this to happen automatically: no buttons to press — the warning light simply comes on wherever there is a risk." Today it is difficult for deepfakes to bypass the
10:41 pm
biometric barrier, but there are already examples of the technology being used for criminal purposes. An accountant at one company received an SMS from her director asking her to contact him urgently by video link. On the call was — she was deeply convinced of it — that very director, the big manager: "We have an urgent order: transfer 100,000 dollars." The accountant flawlessly carries out the instructions, and literally an hour or two later, with the deepest disappointment and horror, she realizes what she has done. Voice deepfakes are becoming even more dangerous. At the end of January, users of a new speech-synthesis program created audio recordings with the voices of stars; in them, clones of Tom Cruise, George Lucas and others
10:42 pm
made offensive remarks, even in Russian. The files were blocked, but an aftertaste remains. In one of the most shocking audio deepfakes, Emma Watson — the Hermione of the Harry Potter films — read an excerpt allegedly from Hitler's manifesto. Did she really? Of course not. But how should we deal with all of this? Let's ask "Emma" herself: "Hi! I think we should all know the truth. Deepfake-detection technology and special laws are still very weak. It doesn't matter at all for what purpose such videos are made or shared online — people should know the truth, and such videos should be labeled somehow. The technology itself is not to blame for anything." We played that along with the actress Yuliya Ganatskaya. Like any artist, it is important to her that her image is protected from discrediting. "As with any cool technology, probably,
10:43 pm
there are arguments for and against: you work in the frame, you are an actor, you come there..." Realistic masks helped us find a specialist: Sergey Loginov has been working with deepfakes for a long time and understands their potential very well. How great is the threat that criminal fakes will grow in scale? "Fakes made to spread lies, slander and illegal actions should of course be exposed, but usually such deepfakes do not mislead by their realism — they are often very unrealistic, yet they still work. That is neuro-programming; you could say it is simply persuasion. And it is like a virus and an antivirus: the first is always one step ahead. I
10:44 pm
never agreed to any illegal jobs, though there were very many offers — all sorts of identity confirmations on crypto exchanges through a deepfake. Firstly, that is simply impossible with today's technology; secondly, people misjudge it entirely: they imagine you pay five rubles and a second later it's all done. It's not like that at all." Yet a British journalist described and showed how easy it is to deceive a voice assistant. He called his bank and played from his computer an audio recording synthesized from his own voice by an easily accessible program. "State the reason for your call." "To check my account balance." "Okay. State your date of birth." The program did not notice the catch. "My voice is my password." The security system again failed to recognize the fake and gave out his account balance and recent transactions. These attacks are fought
10:45 pm
in the laboratory of the Speech Technology Center. "Here is what the operator sees, and now we will try to call, to see how the technology works. Hello, I would like to get into my personal account." The director of the research department is simulating a call to a call center to show how their protection system works; such solutions can be integrated by banks, government services and television companies. "The system solved two problems: first, it identified me as its client, and second, it made sure I was not trying to deceive it — the anti-spoofing chart is in the green zone. Now the exact same request, recorded in advance, which I am replaying." The same voice, but
10:46 pm
however hard you try, a person will not notice the catch by ear — yet the system hears everything: it heard that this is a hoax, that it is a recording of my voice and not a live voice. "To get past voice biometrics, attackers can go a variety of ways: they can record a voice and play it back, they can imitate a person, they can splice fragments together." Among the world's top voice-biometrics protection companies, Russian firms hold leading positions at international technology competitions. Developers of biometric systems study all of a system's vulnerabilities, so that it is impossible to get through with someone else's photo.
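For illustration only — this is not the Speech Technology Center's product — one simple defense against the exact-replay attack shown above can be sketched as follows: remember a coarse spectral fingerprint of every past utterance and flag any new call that matches a stored one almost perfectly, as a replayed recording would. The fingerprint and the threshold are arbitrary toy choices.

    # Toy replay-attack guard (assumes NumPy). Real anti-spoofing systems
    # use far richer acoustic evidence; this only catches verbatim replays.
    import numpy as np

    def fingerprint(samples, bands=32):
        """Coarse log-energy signature of an utterance's spectrum."""
        spectrum = np.abs(np.fft.rfft(samples))
        chunks = np.array_split(spectrum, bands)
        return np.log1p(np.array([chunk.mean() for chunk in chunks]))

    class ReplayGuard:
        def __init__(self, threshold=0.999):
            self.seen = []  # fingerprints of all previous calls
            self.threshold = threshold

        def is_replay(self, samples):
            fp = fingerprint(samples)
            for old in self.seen:
                similarity = np.dot(fp, old) / (
                    np.linalg.norm(fp) * np.linalg.norm(old))
                if similarity > self.threshold:
                    return True  # near-identical to an earlier call
            self.seen.append(fp)
            return False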
10:47 pm
"And now I will demonstrate the scenario behind the most common myth about facial biometrics: that your photo is enough for villains to get access to your data or, say, through a door terminal. I present a photograph — the terminal says 'look into the camera' and, realizing that it is not me, refuses to let me pass." Deepfakes and biometrics are developing in parallel all over the world, and both rely on computer vision. It grew originally out of simple person detection — identifying people by the kinds of reference points familiar to our own eyes — and then, with the development of neural networks and better ways of working with them, the systems began to show truly fantastic results. "On the positive side, it's great for making cool videos. You
10:48 pm
remember all those videos — 'Keep your money in a savings bank!' — but then again, I wouldn't want to see myself in some video in some weird context, and I'd like to have a tool to prove that it's not me, that it is specifically not me in this video. Everything is heading toward that being simple enough, and I think it will be quite accessible." At VisionLabs he develops algorithms for face detectors. They, too, cannot be fooled by photos or videos shown from a phone. "We are often asked whether some kind of special makeup could be applied, or something like that. No, that doesn't work either, because we look at a completely different plane of the face." The technology also recognizes emotions on a face and
10:49 pm
estimates age (a smile clearly rejuvenates). Fallen people, people sitting on a chair or squatting, raised hands — these systems can already determine the position of a body in the frame. "We are developing algorithms that help us identify people fighting: if somewhere in the frame two people are using force against each other, we can detect this event." The squatting detector was created at a bank's request: to break into an ATM, an attacker needs to crouch in front of it to reach the service compartment. "Accordingly, if a person squats there for more than ten seconds, we can trigger an alarm."
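A minimal sketch of that ten-second rule, assuming some off-the-shelf pose estimator already supplies, per frame, a person ID plus hip and knee heights in image coordinates; the person IDs, the 5% margin and the "hips near knee level" test are all placeholder assumptions, not the vendor's actual logic.

    # Toy squatting alarm: y grows downward in image coordinates, so a
    # standing person has hips well above (smaller y than) the knees.
    import time

    SQUAT_SECONDS = 10.0  # threshold mentioned in the report

    class SquatAlarm:
        def __init__(self):
            self.squat_since = {}  # person_id -> moment the squat began

        def update(self, person_id, hip_y, knee_y, now=None):
            """Returns True when this person has squatted long enough to alarm."""
            now = time.monotonic() if now is None else now
            # Squatting: hips dropped to roughly knee level or below.
            squatting = hip_y >= knee_y - 0.05 * abs(knee_y)
            if not squatting:
                self.squat_since.pop(person_id, None)
                return False
            started = self.squat_since.setdefault(person_id, now)
            return (now - started) >= SQUAT_SECONDS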
10:50 pm
Their platform detects deepfakes as well. We uploaded one of the most famous videos into the system: "...is a total and complete dumbass. Now, you see, I would never say that, at least not in public, but someone else would — someone like Jordan Peele. These are dangerous times. Moving forward, we need to be more vigilant about what we trust from the internet." "Algorithms have advanced a lot since then: this fragment is relatively old, and far more powerful substitution technology already exists." Deepfakes are already destroying reputations with ease. A Pennsylvania resident decided to rid her daughter of rivals on the cheerleading squad: she sent the coaches fake photos and videos of the girls showing them vaping and drinking. Generated voices and videos are already sending the innocent to jail: when a criminal case is considered, very often
10:51 pm
audio or video recordings are now used there as evidence — and how can one verify that deepfake technology was not in fact involved? That is the problem now worrying a great many judges, and in general everyone who represents this field all over the world. There are political manipulations too. An Indian candidate commissioned a video before an election: he wanted to speak English and a local dialect, though of those languages he himself knows only Hindi, so his face was superimposed onto an actor's, and neural networks handled the articulation. The politician thus decided to reach the widest possible audience. In California, for example, it is already forbidden to post fakes of politicians within two months of an election, and in China, since January, fakes that damage reputations or threaten the security of the country have been outlawed. Beyond
10:52 pm
the film industry and advertising, the technology has a great and entirely legal future: one could catch the mermaid and, together with the leshy and the learned cat, take a picture in the style of Lukomorye. Imagine coming to a literature lesson where they play you a video in which, say, Sergei Yesenin reads his own poems in his own voice. The same can be done in history class, and the same in mathematics, physics and chemistry lessons — and this gives a completely different perception, especially for modern children. The Russian social network VKontakte: "We are now working on technology that will let any VKontakte user take any content — their own photo or video, or content available on social networks or other resources — upload it to us and check it for manipulation of faces, for the presence of deepfakes in the video. We
10:53 pm
started developing this technology because we had a large database of deepfakes that we had created with our own deepfake feature, launched a few months earlier. The app was the first to introduce a technology that lets users try on the faces of celebrities, and the neural network then learned to detect deepfakes on those same videos. To be honest, in many cases even I could not tell a really real star from one of our ordinary users, and our users certainly may not be able to, so first of all we strive to give people a tool that will tell them whether a photo or video is really real."
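The training setup described above — a face-swap feature that automatically yields labeled real/fake pairs — can be sketched roughly as follows. This is a guess at the general idea, not VKontakte's actual pipeline; embed() is a dummy stand-in for a real face-feature extractor.

    # Self-labeling detector training: every source clip is a known real,
    # every generated clip a known fake (assumes NumPy and scikit-learn).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def embed(video_id):
        """Placeholder: a real system would return face features of the video."""
        rng = np.random.default_rng(abs(hash(video_id)) % 2**32)
        return rng.normal(size=128)

    def train_detector(real_videos, swapped_videos):
        X = np.array([embed(v) for v in real_videos + swapped_videos])
        y = np.array([0] * len(real_videos) + [1] * len(swapped_videos))
        return LogisticRegression(max_iter=1000).fit(X, y)

    # detector = train_detector(originals, generated)
    # detector.predict_proba(embed(clip).reshape(1, -1))[0, 1]
    # -> estimated probability that the clip contains a face swap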
10:54 pm
But there are already algorithms that can create a fake from just a single photo; the tools keep getting simpler, and the videos get frighteningly more realistic. "Is it real artificial intelligence? Not yet, of course, but in the future these systems will certainly learn to think quite well. For that, though, a large amount of information is needed." And how can we not suspect that our phones have already collected that biometric data from us? When we unlock a phone with our face, that is the same biometrics: we give a full 3D scan of the face from all angles, which means it could now, in theory, be reconstructed somehow. We upload our photos to the cloud from different angles and at different hours of the day and night, which means that hypothetically neural networks now have a better chance of modeling the lighting exactly and creating that very cast of the face. At ITMO University in St. Petersburg, researchers have developed the "Expert" service. It could help employers looking for new employees: the system evaluates a candidate's video presentation according to various parameters.
10:55 pm
"This is not a lie detector that will mark statements as completely true or completely false, but it can form a general impression of a person: that on the video he behaves insecurely, say, or that he avoided answering all the questions. We are unlikely to want to deal with a person who will not be completely sincere with us." Along the way, the development also turned out to be excellent at screening out deepfakes. We fed it popular videos, including the one in which the actress Jennifer Lawrence turns into Steve Buscemi; the system sees the mismatch between facial expressions, emotions and text across the different channels. "Often it is indicative when, say, the audio channel has all its ups and downs while the video stays flat the whole time — literally without emotion."
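A toy version of that cross-channel check — not ITMO's actual "Expert" system: given per-second emotion-intensity scores from any audio classifier and facial-activity scores from any expression tracker (both assumed inputs), flag clips where the voice is far livelier than the face. The ratio threshold is an arbitrary choice.

    # Cross-channel mismatch: "emotional speech over a frozen face".
    import numpy as np

    def channel_mismatch(voice_arousal, face_activity, ratio=4.0):
        """
        voice_arousal: per-second emotional intensity of the audio track
        face_activity: per-second facial-expression movement
        Returns True when the voice varies far more than the face does.
        """
        v = float(np.std(voice_arousal))
        f = float(np.std(face_activity)) + 1e-9
        return v / f > ratio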
10:56 pm
"Having the information that a person showed aggression, for example, spoke an emotional text and expressed emotion with his voice, yet at the same time did not move his face, we can say this is not very typical of a living person — and we already understand that attention should be paid." And not only to deepfakes: there can be many different applications. Students could run a lecturer's answers through it before an exam: "I would like to check whether it is really worth spending my precious time studying material that may turn out to be useless, given that the specialist himself is not sure of what he is saying — whether this expert is worth believing or not." Narrower applications are possible too: forensic examination, say, and so on. For players on the stock exchange it is important to know what to invest in tomorrow, so they listen to the opinions of authoritative people, and
10:57 pm
against this background so-called info-scammers appear; that people will lose money on their advice hardly matters to them, as long as they themselves earn. In principle, this technology is intended for exactly such cases: the system is designed to test expert opinion — whom to trust, and what would be wise to double-check. Artificial intelligence, good or bad, demands critical thinking more than ever, and deepfakes, even the most innocent ones, need to be labeled; the whole world, it seems, is leaning toward this.
10:58 pm
Alexander Gitsenko.
10:59 pm
11:00 pm
Others here do not deny it: the terrorist attack in the Bryansk region was coordinated with him, and one of the criminals is already busily giving up his client. What can one say? Brussels is sad: it was warned that sanctions against Russia are like stepping on a rake, and Moscow has brought imports back to almost the previous level.
