tv RIK Rossiya 24 RUSSIA24 March 5, 2023 5:30am-6:01am MSK
5:30 am
5:32 am
saying things they have never said in their lives and doing things they would never do in public; artists who have already died, with digital copies performing in their place; actors who have shed 30 years on screen without plastic surgery. All of this is already possible with the face-replacement technology known as the deepfake. But where can it lead us? For most people it is still either a joke or a threat.
5:33 am
"If you install our technology on your TV channel's broadcast, you can always determine quite accurately whether the image has been tampered with — whether it is a deepfake or not." The word "deepfake" is formed from two terms: deep learning and fake. Many developers dislike the word for its negative connotation — yes, fake, but this is the future. What opportunities have opened up: you can become any celebrity, any idol, and say whatever you want on their behalf. Natalie Portman, like dozens of other Hollywood stars, felt the horror of the new technology when their faces began to be superimposed en masse onto actresses in 18+ videos of the most explicit content. So what should be done to avoid hindering the development of the technology on the one hand, while not allowing it to be abused on the other?
5:34 am
"You are late — now he is right on the penalty area." "If you had told me that I, Jason Statham, would celebrate my anniversary in Russia in 2027, I would have laughed." In this parody series, the star of Guy Ritchie's films and the Fast and Furious franchise, Jason Statham, has moved to a Russian village, and Keanu Reeves wears a T-shirt with the Olympic bear: "You and I have been friends for the five years we've filmed together in Russia." "I thought 200,000 for the car was an adequate price." It was all made possible by face-swapping technology.
5:35 am
The director of the world's first such series coordinated every shot of the double with deepfake specialists: "We sent screenshots and they said yes or no — 'Yes, of course, that works,' 'Stop, why is it so dark?' — and only then were the faces applied. You cannot use a profile shot, you cannot plan frames with a full turn around a person, though in a few episodes it happened by accident." That is because the masks are created by neural networks trained continuously on one specific person.
5:36 am
The ideal angle is not full-face but about three-quarters. For the author of the series it was important that the parody offend no one's feelings; under the law, harmless use of the faces of stars is not regulated anywhere in the world. But what opportunities deepfakes have opened for world cinema. This studio has been working with the technology for several years: the more the actor resembles the prototype, the more realistic the video. "I think it will soon come to this: once a year, a digital copy of a person is made — high-quality video filmed from all sides. You are already old, already tired, you no longer want to act, but there is a thirty-year-old you with another 40 years in the movies. Money is going into this; I think it's quite realistic."
5:37 am
The algorithm studies all of a person's available videos and photos and creates an imitation. And then a "businessman" appears on this hook, saying: I'm selling the company, we have losses, everything is very bad — the stock falls, and at that moment someone makes money on it. Unfortunately, modern technologies are very often mastered first of all by scammers. Protection against deepfakes in Russia is being developed by several companies. One of them has patented two inventions: its algorithms analyze a biological feature of a person — the changing color of the face. When the heart beats, blood flow changes the color of the whole face evenly, in rhythm with the pulse; in a synthesized face, the changes in color and lighting do not fall into the rhythm of a human pulse.
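The pulse check described here is, in essence, remote photoplethysmography: a live face shows a faint periodic color change at the heart rate, while a synthesized face usually does not. Below is a minimal illustrative sketch of that idea only — the function names, the synthetic data, and the 42–180 bpm band are my assumptions, not the patented algorithm:

```python
# Sketch of a pulse-rhythm liveness check (illustrative, not a production method).
import numpy as np

def dominant_frequency_hz(signal, fps):
    """Return the strongest non-DC frequency (Hz) of a signal sampled at fps."""
    signal = np.asarray(signal, dtype=float)
    signal = signal - signal.mean()                # drop the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fps)
    return float(freqs[np.argmax(spectrum)])

def looks_like_live_face(green_means, fps, lo_bpm=42, hi_bpm=180):
    """Heuristic: does the face color oscillate at a plausible human pulse rate?"""
    bpm = dominant_frequency_hz(green_means, fps) * 60.0
    return lo_bpm <= bpm <= hi_bpm

# Simulated 10-second clip at 30 fps: mean green value of the face per frame.
fps = 30
t = np.arange(300) / fps
live = 120 + 0.5 * np.sin(2 * np.pi * 1.2 * t)    # subtle 72 bpm pulse signal
fake = np.linspace(120.0, 121.0, 300)             # slow drift, no pulse rhythm
print(looks_like_live_face(live, fps), looks_like_live_face(fake, fps))  # → True False
```

A real detector would first track the face and average only skin pixels per frame; the spectral test itself is this simple.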
5:38 am
The verdict then comes with almost 80% certainty: the face is synthesized. "We cannot see this with the naked eye, so we break the video down frame by frame and analyze each frame. We can also detect several faces in the frame simultaneously, which is an exclusive feature of our solution compared to other similar ones." Here is a man who looks very much like Chancellor Scholz: "Tell me, American, you have a lot of money. We wanted to give up Russian gas, but —"
5:39 am
as a Russian classic put it: we wanted the best, but it turned out as always. Right there, as you can see, their detector flags the clip as a deepfake — it was made with the help of an actor and their neural networks. Sber actively uses facial biometrics via Face ID: employees and clients use it to enter the office and to confirm their identity and transactions, so it was important for the company to prepare for the risks in advance. "If our technologies are installed on your TV channel's broadcast, you can always determine exactly whether the image has been altered — whether it is a deepfake or not. Our goal is for this to happen automatically, so that you don't have to press any buttons: a light simply comes on to show
5:40 am
that there is some kind of risk here." Today it is difficult to bypass the biometric barrier, but there are already examples of the technology being used for criminal purposes. The accountant of one enterprise received an SMS, supposedly from her director, asking her to contact him urgently via video link. It is a deepfake, but she is deeply convinced that it really is her director — a big manager: "We have an urgent order — transfer 100,000 dollars." In this situation the accountant flawlessly carries out his instructions, and literally an hour or two later, with the deepest disappointment and horror, she realizes what she has done. Voice deepfakes are becoming even more dangerous. At the end of January, users of a new speech-synthesizer program created audio recordings with
5:41 am
the voices of stars: clones of Tom Cruise, George Lucas and others uttered offensive, even racist statements. The files were blocked, but the aftertaste remained. In one of the most shocking audio deepfakes ever, Emma Watson — the Hermione of the Harry Potter films — read an excerpt allegedly from Hitler's manifesto. Did she really? Of course she didn't. But how should this be dealt with in general? Let's ask "Emma" herself: "Hi! I think we should all know the truth. Deepfake-detection technology and special laws are still very weak, but it does not matter at all for what purpose such videos are made or shared online. People should know the truth; such videos should somehow be labeled. The technology itself is not to blame for anything." We were played along by the actress Yulia Ganatskaya. Like any artist, it is important to her that her image be protected from discreditation, because, as with probably any
5:42 am
cool technology, there are arguments both for and against. "You work in the frame, you are the actor — you just come in, and that's it." These are the specialists who helped make the realistic masks. Sergey Loginov has been working with deepfakes for a long time and understands their potential very well. How great is the threat that such criminal uses will grow in scale? "Deepfakes are great precisely for exposing lies, slander and illegal actions — that, of course, needs to be done. But usually such deepfakes do not mislead anyone with their realism; they are often very unrealistic. What works in them is neuro-programming — you could say, simple suggestion. It is like a virus and
5:43 am
an antivirus: the first is always one step ahead. "I have never agreed to illegal jobs, though there were very many offers — for example, identity confirmation on a crypto-exchange through a deepfake. First, that is simply impossible with these technologies now. Second, people completely misimagine it: as if in a second, for 5 rubles, it's all done. It is not like that at all." But a British journalist described and showed how easy it is to deceive a voice assistant. He called his bank and played from his computer an audio recording synthesized from his own voice by an easily accessible program. "What is your reason for calling?" "To check my account balance." "Okay, name your date of birth." The program did not notice the catch; "my voice is my security password" once again failed to recognize the fake
5:44 am
and gave out information about his account balance and recent transactions. Such attacks are being fought in the laboratory of the Speech Technology Center. "Let's try to call and see how the technologies work. Hello, I would like to get into my personal account." The director of the research department, imitating a call to a call center, shows how their security system works; such solutions can be integrated into banks, government services and TV companies. The system solved two problems at once: first, it identified me as its client; second, it made sure I was not trying to deceive it — the anti-spoofing indicator stayed in the green zone. Exactly the same request, recorded in advance, I now
5:45 am
play back. By ear, a person would not notice the catch, but the system immediately hears the hoax: that this is a recording of my voice and not a live voice. To get past voice biometrics, attackers can go various ways: they can record a voice and play it back, they can impersonate a person, or they can splice fragments together. Top international teams work on protecting voice biometrics, and Russian companies take leading places at international technology competitions. Developers of biometric complexes study all the attack methods together, so that it is impossible to get through with someone else's photo.
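How might a system "hear" that a call is a playback of an earlier recording? Real anti-spoofing engines use learned spectral features; the simplest illustrative version — everything below is an invented sketch, not the Speech Technology Center's method — just compares incoming audio against recordings already seen, since a near-perfect correlation means replay rather than live speech:

```python
# Naive replay-attack check via normalized correlation (illustrative sketch).
import numpy as np

def normalized_correlation(a, b):
    """Correlation coefficient of two equal-length audio buffers."""
    a = np.asarray(a, float) - np.mean(a)
    b = np.asarray(b, float) - np.mean(b)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.dot(a, b) / denom) if denom else 0.0

def is_replay(incoming, history, threshold=0.98):
    """Flag audio that matches a previously stored recording almost exactly."""
    return any(normalized_correlation(incoming, past) >= threshold for past in history)

rng = np.random.default_rng(1)
enrolled_call = rng.standard_normal(16000)                    # 1 s of "audio" at 16 kHz
live_call = rng.standard_normal(16000)                        # a fresh, different utterance
replayed = enrolled_call + 0.01 * rng.standard_normal(16000)  # noisy playback of the old call

history = [enrolled_call]
print(is_replay(replayed, history), is_replay(live_call, history))  # → True False
```

Production systems additionally look for playback artifacts (loudspeaker coloration, room reverb), which catch replays that were never seen before.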
5:46 am
"Now I will demonstrate the most common myth associated with facial biometrics: namely, that if someone has your photo, the villains will have access to your data. I bring a photo up to the terminal; the terminal says 'look into the camera' and, realizing that it is not me, denies access." Biometrics and attacks on it are developing in parallel all over the world, and both sides use computer-vision technology. It originally grew out of detecting a person by identifying points on the face familiar to our eye; then, with the development of neural networks, the systems began to show truly fantastic results. On the positive side, it is great
5:47 am
for making some cool videos. "Do you remember the old ads — 'keep your money in a savings bank'? Then again, I wouldn't want to see myself in some video in a weird context, and I would like to have a tool to prove that it is specifically not me in that video. Everything is moving toward that; it will be simple and, I think, quite accessible." At VisionLabs, he develops face-detection algorithms. Neither a photo nor a video from a phone can fool them. Can you apply some kind of special makeup, or something like that? That does not work either, because we
5:48 am
are looking at an entirely different plane of the face. The technology also recognizes emotions on the face and estimates age — a smile clearly rejuvenates. "We can detect fallen people and sitting people, for example someone squatting by a chair, and we can detect raised hands." These systems determine the position of the body in the frame. "We are developing algorithms that help identify fighting people: if two people applying force to each other appear somewhere in the frame, we can detect this event." And why was squat detection created? It was a bank's request: to break into an ATM, an attacker needs to crouch in front of it to reach the service compartment, so if a person squats there for longer than 10 seconds,
5:49 am
we can trigger an alarm." Their platform already detects deepfakes: the system analyzed one of the most famous fake clips, in which a synthetic Barack Obama calls Donald Trump "a total and complete dipshit": "You see, I would never say these things, at least not in a public address, but someone else would — someone like Jordan Peele. Moving forward in these perilous times, we need to be more vigilant about what we trust from the internet." The algorithms have advanced a lot since then; this fragment is relatively old, and far more powerful substitution technologies now exist. Deepfakes are already easily destroying reputations: a Pennsylvania resident decided to rid her daughter of rivals on the cheerleading squad by sending the coach fake photos and videos in which the girls smoke vapes and drink. Generated voices and videos are already sending innocent people to prison.
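The bank's ATM rule mentioned above — raise an alarm only when someone stays squatting in front of the machine for more than 10 seconds — reduces to a run-length check over per-frame pose labels. A minimal sketch (the pose names, frame rate and threshold are assumed for illustration, not taken from the vendor):

```python
# Loitering alarm over per-frame pose labels (illustrative sketch).
def loiter_alarm(pose_per_frame, fps, pose="squatting", max_seconds=10):
    """Return True if `pose` persists continuously for more than `max_seconds`."""
    limit = max_seconds * fps        # longest tolerated run, in frames
    run = 0
    for p in pose_per_frame:
        run = run + 1 if p == pose else 0   # reset the run on any other pose
        if run > limit:
            return True
    return False

fps = 5  # assumed analysis rate of the pose detector
short = ["standing"] * 20 + ["squatting"] * (9 * fps) + ["standing"] * 10
long_ = ["standing"] * 20 + ["squatting"] * (11 * fps)
print(loiter_alarm(short, fps), loiter_alarm(long_, fps))  # → False True
```

The pose labels themselves would come from the body-keypoint models described in the segment; the alarm logic on top of them is this simple.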
5:50 am
When a criminal case is being heard, audio or video recordings are now very often used as evidence — and whether they correspond to reality, whether deepfake technology was used, is exactly the problem. A very large number of judges, and of those who represent this field around the world, are now worried about it. There are political manipulations as well: this clip was commissioned by an Indian candidate. He wanted to address the electorate in a dialect he does not speak himself, so his face was superimposed on an actor and neural networks handled the articulation; in this way the politician decided to reach the widest possible audience. In California, for example, it is already forbidden to spread fakes of politicians in the two months before an election, and in China, since January, deepfakes that damage a reputation or
5:51 am
threaten the security of the country have been outlawed. So where, apart from the film industry and advertising, does this technology have a great and legal future? "I wanted to put myself into the picture — to take a photo in the style of Lukomorye." Imagine coming to a literature lesson where they play a video in which, for example, Sergei Yesenin himself reads his own poems in his own voice. The same can be done in history lessons, and in lessons of mathematics, physics and chemistry — and this gives a completely different perception, especially for modern children. The Russian social network VKontakte is developing detection as well: "Yes, we are now working on technologies that will let all VKontakte users take any photo or video content of theirs, or content available on social networks or other resources, upload it to us and check it for manipulation of the face —
5:52 am
— for the presence of deepfakes in the video. We started developing this technology because we had a large deepfake base, built with our own deepfake feature, which we launched a few months ago. First a feature appeared in the app that let users try on the faces of celebrities; then the neural network learned to identify fakes on those same videos. To be honest, many people could not tell a real star from one of our ordinary users, and our users may not manage it either, so first of all we strive to give them a tool that lets them say whether a photo or video is really genuine." For now, the algorithms that create a fake from a single photo are only simple tools, but the videos are frighteningly realistic. We love beautiful terms — "artificial intelligence" — when in fact, of course, it is not that yet; but in the future machines will
5:53 am
undoubtedly learn to think quite well. But that will require a large volume of information. Do we ever stop to think that our phone has already collected our biometric data? When we unlock it with our face — that is the same biometrics: we hand over a full 3D scan of the face from all angles. We also upload our photos to the cloud, taken from different angles and at different hours of the day and night, which means that, hypothetically, neural networks now have the chance to reconstruct the lighting more accurately and create that very cast of the face. ITMO University in St. Petersburg has developed an expert service that could help employers looking for new staff: the system studies a candidate's video presentation according to various parameters. It is not
5:54 am
a lie detector that will pronounce answers to questions completely true or completely false, but it can form a general impression of a person: that overall he behaves insecurely on video, that he avoided answering all the questions. Accordingly, we are unlikely to want to deal with someone who will not be completely sincere with us. While the development was under way, it turned out that the system also screens out deepfakes. We upload popular videos to it — the actress Jennifer Lawrence with someone else's face — and the system sees a mismatch in expression across the different channels: emotion and text, with all their ups and downs. Often the audio channel is the telltale one, while the video stays the same throughout — literally without emotion.
5:55 am
Having the information that there was aggression in the voice — that the speaker delivered an emotional text and showed emotion vocally — while the face showed nothing, we can say that this is not typical of a living person, and we already understand that we should pay attention: these may be deepfakes. There can be many different applications: students can run through their answers before an exam, actors through a role, and a course student can check the lecturer for competence: "I would like to check whether it is really worth spending my precious time studying material that may turn out to be useless, given that the specialist himself is not sure of what he is saying." The task of the technology, in fact, is to say: believe this expert or not. For a narrower application — forensic examination, please, and so on. And there are players on the stock exchange: it is important for them to know where money is being invested today and tomorrow.
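The cross-channel check the researchers describe — an emotional voice paired with a motionless face — can be caricatured as comparing per-window emotion labels from the audio and video channels: when they rarely agree, the clip deserves attention. A toy sketch (the labels and the 0.5 threshold are invented for illustration, not ITMO's actual model):

```python
# Toy audio-vs-face emotion agreement check (illustrative sketch).
def channel_agreement(audio_emotions, face_emotions):
    """Fraction of time windows where the two channels report the same emotion."""
    assert len(audio_emotions) == len(face_emotions)
    matches = sum(a == f for a, f in zip(audio_emotions, face_emotions))
    return matches / len(audio_emotions)

def flag_possible_deepfake(audio_emotions, face_emotions, min_agreement=0.5):
    """Low cross-channel agreement is atypical for a live recording."""
    return channel_agreement(audio_emotions, face_emotions) < min_agreement

# Angry, expressive speech paired with a flat, unmoving face: suspicious.
audio = ["angry", "angry", "angry", "neutral", "angry", "angry"]
face_fake = ["neutral"] * 6
face_live = ["angry", "angry", "neutral", "neutral", "angry", "angry"]
print(flag_possible_deepfake(audio, face_fake), flag_possible_deepfake(audio, face_live))  # → True False
```

In a real system, the per-window labels would come from separate audio and video emotion classifiers; the disagreement statistic on top is the easy part.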
5:56 am
They listen to the opinions of authoritative people, and against this background the fakes appear: quickly assembled clips in which an "authority" starts talking and forecasts are put into his mouth; the fraudsters earn, and other people lose. In principle, this technology is intended exactly for that: an expert opinion on whom to trust and what would be wise to double-check. Artificial intelligence, good and bad, demands critical thinking more than ever. Even the most innocent fakes need to be labeled — that seems to be the trend all over the world.
5:58 am
6:00 am
Gunners of the airborne troops destroyed a stronghold of the Ukrainian military, the Ministry of Defense reported. The coordinates of the target were obtained while combing the area and then passed to the artillery control point, after which the crews of Nona self-propelled guns got down to business: they marched to their firing positions and completed the combat mission assigned to them; the enemy suffered losses in manpower. "We raised the bird — the drone — identified the enemy's stronghold, and were given the coordinates."
IN COLLECTIONS
Russia-24 — Television Archive — TV Archive News Search Service. Uploaded by TV Archive on