
TV — RIK Rossiya 24 (RUSSIA24) — March 7, 2023, 2:30pm–3:01pm MSK

2:30 pm
inspiration, when you want to confess your love for a whole lifetime and it is impossible to take your eyes away. Happy holiday, my love. It's me, Maxim Nikitin, the creator of a computer program, about to make an important discovery. Cultivating interest in science and technology in Russia. A new
2:31 pm
edition of the author's program.
2:32 pm
2:33 pm
2:34 pm
Politicians saying things they would never say in public. Long-deceased artists played on screen by their digital copies. An actor who has shed 30 years on screen, and without
2:35 pm
plastic surgery. All of this is already possible with face-replacement technology: the deepfake. But where can it take us? For most people it is either a joke or a threat. "If you install our technology on your TV channel, you can always determine exactly whether the image has been substituted, whether it is a deepfake or not." "Deepfake" is made up of two words: "deep", from deep machine learning, and "fake". Many developers dislike the term for its negative connotation: yes, "fake", but this is the future. What opportunities have opened up? You can
2:36 pm
become any celebrity, any idol, and those faces can then be made to say whatever you want. Natalie Portman, like dozens of other Hollywood stars, felt the full horror of the new technology when her face began to be mass-superimposed onto the faces of actresses in 18+ videos of the most explicit content. So what should be done to avoid, on the one hand, hindering the development of the technology and, on the other, allowing it to be abused? "You're late; now all at once, to the penalty area." If you had told me that in 2027 Jason Statham would celebrate his anniversary in Russia, I would have laughed. The star of Fast and Furious and of Guy Ritchie films
2:37 pm
Jason Statham moves to a Russian village; Keanu Reeves wears a T-shirt with the Olympic bear. "You and I filmed together in Russia for 5 years; we were friends." He buys a Zhiguli: "I thought 200,000 was an adequate price for the car." The doubles were made with face-replacement technology, and the director of the world's first such series coordinated every shot of a double with IT specialists. "We sent them the screenshots and, accordingly, they said yes or no. Yes, yes. Yes, of course. Wasn't it too dark there?" And then digital masks were fitted over their faces so that there were no seams. A profile
2:38 pm
shot, uh, you cannot use; you can't, uh, plan frames that circle fully around a person, though in some episodes that happened by accident, because the masks were created by neural networks that are continually trained on a particular person. Here is the perfect angle: uh, not quite three-quarters. When creating the parody, it was important to the authors of the series not to offend anyone. By law, the harmless use of stars' faces is not regulated anywhere in the world. But what opportunities have deepfakes opened for world cinema? The studio Gentle Media has been working with this technology for several years: the more similar the stand-in actor is to the prototype, the more realistic the video will be. "Well, I
2:39 pm
think, you know, artists will come to this too: once a year or once every six months, a digital copy of a person is made, that is, a high-quality video shoot from all sides. You are already old, already tired, you no longer want to act, but a thirty-year-old you has another 40 years in cinema earning money. I think it's quite realistic." When there are many videos and photos of you, the algorithm studies them all and creates an imitation. And then victims can be caught on this hook: a businessman appears who says, I don't know, "I'm selling the company, we have losses, everything is very bad," the shares fall, and at that moment someone makes money on it. Unfortunately, modern technologies are very often mastered first, in short, by scammers. Protection against deepfakes in Russia is being developed by several companies; one of them has patented two inventions: its algorithms analyze a biological feature of a person, the changing complexion of the face.
2:40 pm
With each heartbeat the complexion of a real face changes slightly in rhythm with the pulse; in a synthesized face this signal does not match a living person's. The detector flags a fake at almost 80% confidence: the person on the right is synthesized, though you cannot see it with the naked eye. "We split the video frame by frame and analyze each frame, which is an exceptional feature compared to other similar solutions. We can therefore simultaneously detect several faces in the frame, which is also an exclusive feature of our solution."
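The pulse-based check described here can be sketched in code. This is only a toy illustration of remote photoplethysmography (rPPG), not the patented algorithm mentioned in the program; the function name and the synthetic demo signal are assumptions.

```python
import math

def estimate_pulse_bpm(green_means, fps=30.0):
    """Estimate heart rate from per-frame mean green intensity of a face region.

    rPPG idea: blood flow causes tiny periodic colour changes in real skin;
    a synthesized face often lacks this rhythm, so no plausible pulse is found.
    """
    n = len(green_means)
    mean = sum(green_means) / n
    xs = [v - mean for v in green_means]          # remove the DC offset
    best_bpm, best_power = 0, -1.0
    # scan plausible human heart rates, 42..240 bpm, in 1-bpm steps
    for bpm in range(42, 241):
        f = bpm / 60.0                            # beats/min -> Hz
        re = sum(x * math.cos(2 * math.pi * f * i / fps) for i, x in enumerate(xs))
        im = sum(x * math.sin(2 * math.pi * f * i / fps) for i, x in enumerate(xs))
        power = re * re + im * im                 # DFT power at this frequency
        if power > best_power:
            best_bpm, best_power = bpm, power
    return best_bpm

# demo: a synthetic 10-second clip whose skin tone pulses at 1.2 Hz (72 bpm)
fps = 30.0
frames = [128 + 0.5 * math.sin(2 * math.pi * 1.2 * i / fps) for i in range(300)]
print(estimate_pulse_bpm(frames, fps))            # → 72
```

A real detector would first locate the face in every frame and average only skin pixels; here that stage is assumed away.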
2:41 pm
A man looking very much like Chancellor Olaf Scholz: "Tell me, America, you have a lot of money; we wanted to give up Russian gas..." But he speaks the words of a Russian classic: "We wanted the best, but it turned out as always." This video is a deepfake, and we made it with the help of an actor and our neural networks. Sber actively uses facial biometrics: employees enter the office via Face ID, and customers confirm their identity and transactions the same way, so it was important for the company
2:42 pm
to prepare in advance for the risks. "If our technology is installed in your TV channel's broadcast chain, you can always determine exactly whether the image has been substituted, whether it is a deepfake or not, because our goal is for this to happen automatically: no buttons to press, just a light that comes on immediately when there is a risk." Today it is difficult for deepfakes to bypass the biometric barrier, but there are already examples of the technology being used for criminal purposes. The accountant of one enterprise received an SMS from her director asking her to urgently join a video call. On the call is a deepfake of that very director, and she is deeply convinced it is the real one: the great leader has an urgent order to transfer 100,000 dollars. In this situation the accountant impeccably carries out
2:43 pm
the instructions. And literally an hour or two later, with the deepest disappointment and horror, she realizes what she has done. Voice deepfakes are becoming even more dangerous. At the end of January, users of a new speech-synthesizer program created audio recordings of stars' voices in which clones of Tom Cruise, George Lucas and others uttered insulting, even racist, statements. The files were blocked, but the aftertaste remains. In one of the most shocking audio deepfakes, Emma Watson, the Hermione of the Harry Potter films, read an excerpt allegedly from Hitler's manifesto. Did she say it? Of course she didn't. But how to deal with all this? Let's ask "Emma" herself: "I think we should all know the truth." There are technologies to detect deepfakes, and special laws,
2:44 pm
which are still very weak. But it matters greatly what the purpose of making such videos or sharing them online is. People should know the truth: such a video should somehow be labeled. The technology itself is not to blame for anything. We played along with the actress Yuliya Ganatskaya; like any artist, it is important to her that her image is protected from discreditation. As with probably any cool technology, there are arguments for and against: you act in the frame, and there is an actor-double at home. Realistic masks helped us find a specialist, Sergey Loginov. He has been working with deepfakes for a long time and understands their potential very well. How great is the threat that the scale of such crime will grow? Detecting fakes precisely for the purpose of exposing some kind of
2:45 pm
lie, slander or illegal action, of course, needs to be done. But usually such deepfakes mislead not by their realism: they are often very unrealistic, and they still work. Uh, that's neurolinguistic programming, you could say, simple persuasion. And it's like a virus and an antivirus: the first is always one step ahead. "I have never agreed to illegal things, though there were very many offers: all sorts of identity confirmations on a crypto exchange through a deepfake. Firstly, that is simply impossible with this technology right now. Secondly, people misjudge it entirely: they think it's done in a second, for 5 rubles there, hand it over and it's all ready. Well, it's not like that at all." But a British journalist described and showed how
2:46 pm
easy it is to deceive a voice assistant. He called his bank and played an audio recording from his computer, synthesized from his own voice by an easily accessible program. "What is the reason for your call?" "To check my account balance." "Okay. State your date of birth." The program did not notice the catch. "My voice is my password." The security system again failed to recognize the fake and gave out his account balance and recent transactions. This is being fought in the laboratory of the Speech Technology Center, where the operator can see everything. "Ah, let's try a call, uh, to see how the technology works. Hello, I would like to access my personal account." The director
2:47 pm
of the research department, imitating a call to the call center, shows how their security system works; such solutions can be integrated at a bank, a government service or a TV company. The system solved two problems: first, it identified me as its client, and second, it, uh, made sure I was not trying to deceive it, and the indicator stays in the green zone. "I recorded exactly the same request in advance and now play it back instead of my actual voice. Uh, by ear a person would not notice the catch, but the system hears at once that this is a hoax, a recording of my voice and not a real voice." To get past voice biometrics, uh, attackers can go all sorts of ways: they can record a voice and play it back, they can impersonate a person, or they can splice a voice together from fragments.
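One classic cue a replay detector can use is loss of high-frequency content: a voice re-recorded through a loudspeaker sounds duller than a live one. A minimal sketch of that heuristic, assuming nothing about the Speech Technology Center's actual system (all names and thresholds here are illustrative):

```python
import random

def zero_crossing_rate(samples):
    """Fraction of adjacent sample pairs that change sign.

    High-frequency content raises this rate; loudspeaker/microphone
    roll-off in a replayed recording lowers it.
    """
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0))
    return crossings / (len(samples) - 1)

def looks_replayed(samples, threshold=0.25):
    """Toy anti-spoofing decision; real systems model many such artifacts."""
    return zero_crossing_rate(samples) < threshold

rng = random.Random(0)
live = [rng.gauss(0, 1) for _ in range(4000)]   # broadband stand-in for live speech
# a crude moving average imitates the high-frequency loss of a replayed signal
replayed = [sum(live[i:i + 8]) / 8 for i in range(len(live) - 8)]
print(looks_replayed(live), looks_replayed(replayed))   # → False True
```

Production systems combine many spectral and temporal features with trained classifiers; a single threshold like this is easily fooled.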
2:48 pm
Top international companies work on protecting voice biometrics, and Russian companies take the lead at international technology competitions. Developers of biometric systems study every vulnerability so that it is impossible to get in with someone else's photo. "Now I will demonstrate, let's say, the scenario behind the most common myth about facial biometrics: namely, that if the villains have your photo, they will have access to your data. Well, I hold a photo up to the terminal; the terminal says 'look into the camera' and realizes that it is not me." Biometrics are developing in parallel all over the world, and everywhere they use computer vision technology.
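Why does the terminal reject a printed photo? One classic liveness cue is that a photo never blinks. A toy sketch of that idea, with an assumed per-frame eye-openness score standing in for a real landmark pipeline (the vendor's actual anti-spoofing method is not public):

```python
def looks_live(eye_openness, blink_threshold=0.2):
    """Toy liveness check: a printed photo held to the camera never blinks.

    `eye_openness` is a per-frame 0..1 score; a real pipeline would derive
    it from facial landmarks (an eye-aspect-ratio-like measure). The face is
    called live if the eyes close at least once during the challenge window.
    """
    return any(score < blink_threshold for score in eye_openness)

print(looks_live([0.80, 0.75, 0.08, 0.78]))  # blink in frame 3 → True
print(looks_live([0.80, 0.79, 0.81, 0.80]))  # static photo    → False
```

Real terminals layer several such checks (depth, texture, challenge-response), since a video replay on a phone screen does blink.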
2:49 pm
It originally grew out of the common task of detecting people, identifying them by certain points recognizable and familiar to our eye; then, with the development of neural networks and ever-improving work with them, the system came to demonstrate truly fantastic results. "On the positive side, it's great to make some cool videos; everyone remembers the viral ones: 'Keep your money in a savings bank.' But then again, I wouldn't want to see myself in some video in some weird context, and I'd like to have a tool to prove it's not me. If, well, specifically, I am not in this video and everything points to that, that will be enough. It's simple, and I think it will become quite accessible." VisionLabs
2:50 pm
develops algorithms for facial detectors. They, too, are not fooled by photos or videos shown from a phone. "People often ask whether some kind of special makeup can be applied, or something else like that. No, that doesn't work either, because we analyze the face on a completely different level." The technology also recognizes emotions on a face and estimates age; a smile clearly rejuvenates. "We can detect fallen people and sitting people, for example someone squatting or on a chair; we can detect raised hands." These systems determine the position of the body in the frame. "We are developing algorithms that help identify people fighting, that is,
2:51 pm
if somewhere in the frame two people are using force against each other, we can detect this event." As for detecting squatting people, that came from a bank's request: for attackers to break into an ATM, they need to sit down in front of it to reach the service compartment. "Accordingly, if a person squats there for more than ten seconds, we can trigger an alarm." Into their platform the system has loaded one of the most famous deepfake videos: "Donald Trump is a total dumbass. You see, I would never say that, at least not in public, but someone else would, someone like Jordan Peele. These are dangerous times. Moving forward, we need to be more vigilant about what we trust on the internet right now."
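The squatting-at-the-ATM alarm described above reduces to a simple rule over per-frame pose labels. A minimal sketch, assuming a hypothetical upstream pose classifier emits one label per video frame (the label names and parameters are illustrative, not the vendor's API):

```python
def squat_alarm(pose_per_frame, fps=25, threshold_s=10.0):
    """Raise an alarm if a 'squatting' pose persists longer than threshold_s.

    pose_per_frame: sequence of pose labels, one per video frame, e.g. the
    output of a body-pose classifier ('standing', 'squatting', ...).
    """
    run = 0                                # length of the current squatting streak
    for pose in pose_per_frame:
        run = run + 1 if pose == "squatting" else 0
        if run / fps > threshold_s:        # streak duration exceeds the limit
            return True
    return False

# 11 seconds of continuous squatting at 25 fps trips the alarm
print(squat_alarm(["squatting"] * (11 * 25)))                       # → True
# 5 seconds of squatting, then standing up, does not
print(squat_alarm(["squatting"] * (5 * 25) + ["standing"] * 50))    # → False
```

The duration gate is what keeps someone briefly tying a shoelace from triggering security.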
2:52 pm
Algorithms have advanced a lot, and this fragment, yes, is relatively old; there are now, let's say, far more powerful substitution technologies. How easy is it to destroy a reputation? A Pennsylvania resident decided to rid her daughter of rival cheerleaders: she sent the coaches faked photos and videos in which the girls smoke vapes and drink. Generated voices and videos are already sending the innocent toward prison: when a criminal case is considered, audio or video recordings are now very often used as evidence, and whether they correspond to reality, whether technology was used, is exactly the problem. It now worries a great many judges, and in general everyone in this field around the world. There are also political manipulations: this video was ordered by an Indian candidate before the elections. He wanted to address voters both in English and in a dialect
2:53 pm
he does not actually speak; his face was digitally placed on an actor's, and neural networks handled the articulation. Thus the politician decided to reach the widest possible audience. In California, for example, it is already forbidden to post fakes of politicians within 2 months of an election; in China, since January, deepfakes that damage reputations or threaten national security have been outlawed. But where, besides industry, cinema and advertising, does this technology have a great and legal future? I wanted to catch the scientist's cat for a photo in the style of Lukomorye. Imagine that you come to a literature lesson, they start a video and, for example, Sergei Yesenin reads his own poems in his own voice. The same can be done in history lessons. The same can be done
2:54 pm
in lessons of mathematics, physics and chemistry, and because of this there is a completely different level of engagement, especially among today's children. The Russian social network VKontakte is developing this too. "Yes, we are now working on technology that will let anyone on VKontakte take any content they have, photo or video, or content available on social networks or other resources, upload it to us and check it for manipulation of faces, for the presence of fakes. And in general, uh, we started to develop this detection because we had a large base of deepfakes that we had created using our own deepfake technology, launched some months ago." The application was the first to introduce a feature that lets users try on the faces of celebrities, and then the neural network learned to identify that type of fake on the same videos. "To be honest, from the moment we launched our deepfakes, for many of them I could not distinguish a really real star from our
2:55 pm
ordinary users, and our users may not manage it either, so first of all we strive to provide a tool that lets you say whether a photo or video is really real." But there are already algorithms that create a deepfake from a single photo: everything is simpler, and the videos are more frighteningly realistic. "We grandly call this artificial intelligence; in fact, of course, it is not that yet, but in the future machines will undoubtedly learn to think with sufficient quality. For that, though, a large amount of information is needed." How can we not suspect that our phone has already collected our biometric data? When we set up the same Face ID, we give a full 3D scan of the face from all angles; this means that, in theory, it can somehow be reconstructed. We upload our photos to the cloud from different angles and at different hours of the day and night; this means that a hypothetical neural network now has
2:56 pm
the opportunity to work out the lighting precisely and create that particular cast of the face. ITMO University in St. Petersburg developed the Expert service, which could help employers screen new hires: the system studies a candidate's video presentation along different dimensions. "This is not a lie detector that will mark particular statements as completely true or completely false, but it can form a general impression of a person: that, on the whole, he behaves insecurely on video, that he avoided answering all the questions. Accordingly, we are, well, unlikely to want to deal with a person who will not be completely sincere with us." While the development was under way, it turned out the system is also good at filtering out deepfakes. We upload a popular fake video of the actress Jennifer Lawrence
2:57 pm
into which Steve Buscemi's face has been inserted; the system sees a discrepancy between the emotions expressed through the different channels: facial expressions, voice and text, with all their ups and downs. Often, let's say, the audio channel points one way while the video stays flat, literally without emotion. "And precisely by having the information that there was aggression, for example, that the person spoke an emotional text and showed emotion with his voice while his face showed none, we can say this is not very characteristic of a living person, and we can already understand that attention should be paid: these are not mere defects." There can be many different applications. Students can run through the answers before an exam.
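The cross-channel consistency idea can be sketched as a simple agreement score between per-frame emotion labels from the face and voice channels. This is a toy stand-in for ITMO's multi-channel analysis, with illustrative label names and an assumed upstream emotion classifier for each channel:

```python
def modality_mismatch(face_emotions, voice_emotions, min_agreement=0.5):
    """Toy cross-modal consistency check for a talking-head video.

    Takes per-frame emotion labels from the face channel and the voice
    channel. In a live speaker the channels mostly agree; a deepfake that
    animates only the face (or only the voice) tends not to.
    Returns True when agreement is suspiciously low.
    """
    agree = sum(f == v for f, v in zip(face_emotions, voice_emotions))
    return agree / len(face_emotions) < min_agreement

face = ["neutral"] * 8                   # the face stays flat...
voice = ["angry"] * 6 + ["neutral"] * 2  # ...while the voice is angry
print(modality_mismatch(face, voice))    # → True (suspicious)
```

A real system would work on continuous emotion scores and add the text channel as a third signal, but the core test — do the channels tell the same story? — is the same.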
2:58 pm
"I would like to check whether it is really worth spending my precious time studying material that may turn out useless because the specialist himself is not sure of what he says." That, in fact, is the task of the technology: to say whether this expert is to be believed or not. A narrower application could be forensic examination, and so on. Or take players on the stock exchange: it is important for them to know where to invest, today and tomorrow, so they listen to the opinions of authoritative people. Against this background, so-called influencers quickly appear: they start speaking, the authority of their forecasts falls, they themselves earn a little, and that people lose money on it seems not to matter to them. In principle, this technology is meant for that too: the system is designed to test expert opinion, whom to trust and whom it would be wise to double-check. The age of artificial intelligence, more than ever, requires critical thinking, and deepfakes, even the most innocent, need to be labeled. Toward
2:59 pm
this, it seems, the whole world is leaning.
3:00 pm
Russia has been able to overcome the difficulties in the financial sector caused by the imposition of sanctions, Vladimir Putin said at a meeting with the head of Sberbank, Herman Gref. The president praised the professionalism of the government, the Central Bank and the country's entire banking community. "Despite the well-known, considerable difficulties and problems that arose as a result of well-known events, and the difficulties that they have tried and are trying to create for us in various fields, including


Uploaded by TV Archive on