tv RIK Rossiya 24 RUSSIA24 March 7, 2023 11:30pm-12:01am MSK
11:30 pm
11:31 pm
11:32 pm
saying things they have never said in public in their lives. deceased artists, whose digital copies perform in their place, or an actor who has shed 30 years on screen without any plastic surgery. all of this is already done by deepfake face-swap technology. but where can it lead us? for most people it is still either a joke or a threat.
11:33 pm
"if you install our technology on your tv channel, you can always determine exactly whether the image has been tampered with, whether it is a deepfake or not." "deepfake" is a combination of two words, deep machine learning and fake, but many developers dislike the term for its negative connotation: yes, it is fake, but it is also the future. what opportunities have opened up: you can become any celebrity, any idol, and say whatever you want on their behalf. natalie portman, like dozens of other hollywood stars, felt the whole horror of the new technology when their faces began to be massively superimposed onto the faces of actresses in the most explicit 18+ videos. so what should be done to, on the one hand, not hinder the development of the technology, and on the other, prevent abuse?
11:34 pm
"now you are late, everyone at once, to the penalty area." "if someone had told me that i, jason statham, would celebrate my anniversary in russia in 2027, i would have laughed." in the series, the fast and the furious star, guy ritchie's jason statham, moves to a russian village, and keanu reeves wears a t-shirt with the olympic bear: "you and i filmed together in russia for five years and were friends." he buys a zhiguli: "i thought 200,000 was an adequate price for the car." it was all made possible by face-swapping technology: every shot uses a body double, and the director of the world's first such series coordinated each scene with
11:35 pm
it specialists: "we sent them a screenshot and, accordingly, they said yes or no." digital masks are then laid over the doubles' faces so that there are no seams or joints. you cannot shoot a profile, and you cannot plan shots that circle fully around a person, although in some episodes that happened by accident, because the masks are created by neural networks that are constantly trained on a specific person. let's say the ideal angle is
11:36 pm
an incomplete three-quarter view. for the author of the series it was important to create a parody: such harmless use of the faces of stars is not legally regulated anywhere in the world. but what possibilities does this open for world cinema? the company has already filmed several shorts with this technology; the more similar the actor is to the prototype, the more realistic the video. "i think it will soon come to this: once a year or once every six months, a digital copy of a person is made, that is, a high-quality video recording from all sides. you are already old, you are already tired, you no longer want to act in films, but your 40-year-old digital copy goes to work in the cinema and earns money. i think it is quite realistic." from new videos and photos, the algorithm
11:37 pm
studies everything and creates an imitation. and then victims can be caught on this hook: a businessman appears on video saying "i don't know, i'm selling the company, we have losses, everything is very bad," the shares fall, and at that moment someone, unfortunately, makes money on it. in short, modern technologies all too often serve scammers. protection against deepfakes in russia is being developed by several companies in the field, which have patented two inventions. their algorithms analyze a biological feature of a person: the complexion changes slightly, and evenly across the whole face, with every beat of the pulse of the human heart, something a synthesized face does not have and something that cannot be seen with the naked eye.
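the pulse check described above can be sketched in a few lines. this is only a toy illustration of the general idea (remote photoplethysmography), not the patented algorithms from the report; it assumes the mean green-channel value of the face region has already been extracted for every frame, and the function name, thresholds and synthetic signals are all our own.

```python
import numpy as np

def pulse_score(green_means: np.ndarray, fps: float) -> float:
    """Fraction of signal energy inside the human heart-rate band (0.7-4 Hz).

    green_means: mean green-channel value of the face region, one per frame.
    A real face shows a faint periodic color change driven by the pulse;
    a synthesized face typically does not.
    """
    x = green_means - green_means.mean()           # drop the DC component
    spectrum = np.abs(np.fft.rfft(x)) ** 2         # power spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)         # roughly 42-240 bpm
    total = spectrum[1:].sum()                     # ignore the zero-frequency bin
    return float(spectrum[band].sum() / total) if total > 0 else 0.0

# toy demo: a "real" face with a 1.2 Hz (72 bpm) pulse versus pure noise
fps, seconds = 30.0, 10
t = np.arange(int(fps * seconds)) / fps
rng = np.random.default_rng(0)
real = 120 + 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.1 * rng.standard_normal(t.size)
fake = 120 + 0.1 * rng.standard_normal(t.size)
print(pulse_score(real, fps) > pulse_score(fake, fps))
```

a real detector would track the face, suppress motion and lighting changes, and combine this cue with many others; the point here is only that a periodic heart-rate component is a biological signal a naive face swap does not reproduce.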
11:38 pm
"we split the video frame by frame and analyze every frame, which is an exceptional feature compared to other similar solutions. we can also detect several faces in the frame simultaneously, which is another exclusive feature of our solution." at sberbank's artificial intelligence stand, the president was shown a man very similar to chancellor olaf scholz: "tell me, american, what is power? in money? you say power is in money: you have a lot of money, and so what? i think power is in truth: whoever has the truth is stronger. you deceived someone, so the truth is not behind you. we wanted to abandon russian gas, but, in the words of a russian classic, we wanted the best and it turned out as always."
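the frame-by-frame scheme quoted above can be outlined in code. this is a toy sketch, not the vendor's actual pipeline: it assumes a hypothetical per-face classifier has already assigned every detected face in every frame a probability of being fake, and simply aggregates those scores per tracked face to produce a video-level verdict.

```python
from statistics import mean

def video_verdict(frames: list[dict[str, float]], threshold: float = 0.5) -> dict[str, bool]:
    """frames: one dict per video frame, mapping a face-track id to the
    per-frame fake probability. A face is flagged when its mean score
    across all frames where it appears exceeds the threshold."""
    scores: dict[str, list[float]] = {}
    for frame in frames:
        for face_id, p in frame.items():
            scores.setdefault(face_id, []).append(p)
    return {face_id: mean(ps) > threshold for face_id, ps in scores.items()}

# two faces in the same shot: "anchor" is genuine, "guest" wears a digital mask
frames = [
    {"anchor": 0.10, "guest": 0.85},
    {"anchor": 0.05, "guest": 0.91},
    {"anchor": 0.12, "guest": 0.78},
]
print(video_verdict(frames))  # prints {'anchor': False, 'guest': True}
```

averaging across frames smooths out single-frame errors, which is exactly why analyzing every frame and every face separately, as described above, gives a detector more evidence to work with.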
11:39 pm
immediately, their detector revealed the fake, which we had made ourselves with the help of an actor and our neural networks. sber actively uses facial biometrics: employees enter the office via face id, and customers confirm their identity and their transactions the same way, so it was important for the company to prepare for the risks in advance. "if our technology is installed in a tv channel's broadcast chain, you can always determine exactly whether the image has been altered, whether it is a deepfake or not. our goal is for this to happen in automated mode: no buttons to press, a warning light simply comes on when there is some risk here." today
11:40 pm
it is difficult for deepfakes to bypass the biometric barrier, but there are already examples of the technology being used for criminal purposes. the accountant of one enterprise received an sms from her director asking her to contact him urgently via video link. she is deeply convinced that it really is her director: "i just stepped out," says the big manager, "we urgently need to transfer 100,000 dollars." in this situation the accountant dutifully carries out his instructions, and literally an hour or two later, with the deepest disappointment and horror, realizes what she has done. voice deepfakes are becoming even more dangerous. at the end of january, users of a new speech-synthesizer program created audio recordings with the voices of stars, in which clones of tom cruise, george lucas and others
11:41 pm
uttered insulting statements. the files were blocked, but a bad aftertaste remained. in one of the most shocking audio deepfakes, emma watson, the hermione of the harry potter films, read an excerpt allegedly from hitler's manifesto. did she really? of course she did not. but how to deal with all this? we ask "emma" herself: "i think we should all know the truth. there is deepfake detection technology, and there are special laws, still very weak ones. it does not matter at all for what purpose such videos are made or shared online: people should know the truth, and such videos should somehow be labeled. the technology itself is not guilty of anything." actress yulia ganatskaya played along with us; like any artist, it is important to her that her image is protected from discrediting. hmm, there are such cool technologies
11:42 pm
, both for and against. you work in the frame, you are an actor... "son!" "mom?!" we do not know how to put on these realistic masks ourselves, so specialists helped us: sergey loginov has been working with deepfakes for a long time and understands their potential very well. how great is the threat that such criminal fakes will grow in scale? "deepfakes are great precisely for the purpose of pinning some lie, slander or illegal action on someone, and this, of course, needs to be dealt with. but usually such deepfakes mislead not by their realism: they are often very unrealistic, and they still work, one might say, purely on belief. it is like a virus and an antivirus: the first is always one step ahead. i have never agreed to
11:43 pm
anything illegal, although there were plenty of offers: all sorts of identity confirmations on crypto exchanges through deepfakes. firstly, with current technology it is simply impossible to do; secondly, people imagine it takes a second, as if you hand over 5 rubles and it is all done. well, it is not like that at all." a british journalist described and showed how easy it is to fool a voice assistant: he called his bank and played, from his computer, an audio recording synthesized from his own voice by an easily accessible program. "what is your reason for calling?" "to check my account balance." "okay, name your date of birth." the program did not notice the catch: my voice is my password. the security system again failed to recognize the fake and gave out his account balance and recent transactions.
11:44 pm
in the laboratory of the speech technology center we try a call ourselves in order to see how the technology works. "hello, i would like to get into my personal account," says the director of the research department, imitating a call to a call center to show how their protection system works; such solutions can be integrated by banks, service companies and television companies. "the system solved two problems: first, it identified me as its client, and second, it made sure i was not trying to deceive it: the anti-spoofing gauge is in the green zone. now exactly the same appeal, recorded in advance, and i play back
11:45 pm
my own voice." could a person tell the difference by ear? no, but the system hears everything: that this is a hoax, a recording of my voice and not a live voice. to pass voice biometrics, attackers can go a variety of ways: they can record a voice and play it back, they can parody a person, they can splice recordings together. in this field, russian companies occupy leading places at international technology competitions, and the developers of biometric systems study every vulnerability, for example so that it is impossible to pass with someone else's photo.
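the first attack route mentioned above, record a voice and play it back, can be caught by simple physics: audio that has passed through a loudspeaker and been re-recorded loses high-frequency energy. the sketch below illustrates only that single cue on synthetic signals; real anti-spoofing systems use far richer features, and the 4 khz cutoff and the moving-average stand-in for a loudspeaker are our own assumptions.

```python
import numpy as np

def high_band_ratio(signal: np.ndarray, sr: int, cutoff: float = 4000.0) -> float:
    """Share of spectral energy above `cutoff` Hz. Replayed audio that went
    through a loudspeaker and a second microphone typically keeps much less
    high-frequency energy than a live voice captured directly."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sr)
    total = spectrum.sum()
    return float(spectrum[freqs >= cutoff].sum() / total) if total > 0 else 0.0

sr = 16000
rng = np.random.default_rng(1)
live = rng.standard_normal(sr)                     # stand-in for a live, full-band voice
kernel = np.ones(8) / 8                            # crude low-pass: loudspeaker plus room
replayed = np.convolve(live, kernel, mode="same")  # "recording played into the phone"
print(high_band_ratio(live, sr) > high_band_ratio(replayed, sr))
```

a fixed threshold on this ratio alone would be fragile in the wild; the comparison works here because the "live" and "replayed" signals are otherwise identical, which is exactly the situation the lab demonstration above sets up.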
11:46 pm
now i will demonstrate the scenario behind the most common myth associated with facial biometrics: that it is enough to show your photo, to a terminal for example, and villains will gain access to your data. i present a photograph; the terminal says "look into the camera" and, realizing that it is not me, refuses to let me pass. biometrics are developing in parallel all over the world, and everywhere they use computer vision technology. it initially grew out of simple human detection, identifying people by the kinds of reference points familiar to our own eye, and then, with the development of neural networks and better ways of working with them, the systems began to show truly fantastic results. in a positive sense it is great to make
11:47 pm
videos like those; you remember them all: "keep your money in a savings bank." but then again, i would not want to see myself in some video in some strange context, and i would like to have a tool to prove that it is not me, that i am specifically not in this video. everything is moving toward that; it will be simple and, i think, quite accessible. the company's facial detectors are embarrassed neither by photos nor by videos from a phone. people often ask whether some kind of special makeup could fool them. no, that does not work either, because we examine the face on a completely different plane. and this technology recognizes
11:48 pm
emotions on the face and estimates age; a smile clearly rejuvenates. fallen people, people sitting or squatting, and raised hands can all be detected: these systems determine the position of the body in the frame. "we are developing algorithms that help us identify fighting people, that is, if somewhere in the frame two people are using force against each other, we can detect this event." the squatting detector was created at a bank's request: for an intruder to break into an atm, he needs to crouch in front of it to gain access to the service compartment. accordingly, if a person squats there for more than ten seconds, we can trigger an alarm.
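the ten-second rule just described reduces to a duration counter over the camera feed. a minimal sketch, assuming a hypothetical pose detector that labels every frame; the label names, frame rate and firing logic are our own illustration, not the bank system itself.

```python
def alarm_frames(labels: list[str], fps: int, max_seconds: float = 10.0) -> list[int]:
    """Return the frame indices at which an alarm should be raised: the
    frame where an unbroken run of "squatting" labels first exceeds the
    allowed duration."""
    alarms, run = [], 0
    limit = int(max_seconds * fps) + 1  # first frame past the allowed duration
    for i, label in enumerate(labels):
        run = run + 1 if label == "squatting" else 0
        if run == limit:                # fire exactly once per continuous squat
            alarms.append(i)
    return alarms

fps = 5
feed = ["standing"] * 10 + ["squatting"] * (11 * fps) + ["standing"] * 5
print(alarm_frames(feed, fps))  # prints [60]: ten seconds into the squat
```

counting consecutive labels rather than total labels matters: a customer who crouches briefly several times never trips the alarm, while one continuous squat past the limit does.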
11:49 pm
and the platform raises the event. the system has loaded one of the most famous deepfake videos: "donald trump is a complete and utter dumbass. you see, i would never say that, at least not in public, but someone else would. someone like jordan peele. we are living in dangerous times; moving forward, we need to be more vigilant about what we trust from the internet." algorithms have advanced a great deal, and this fragment is already relatively old; today's substitution technologies are more powerful still, and they easily destroy reputations. so a resident of pennsylvania decided to "help" her daughter: she sent the coach of the cheerleading squad fake photos and videos of the other girls, showing them smoking vapes and drinking. generated voices and videos are already sending the innocent to prison: when a criminal
11:50 pm
case is considered, audio or video recordings are now very often used as evidence, and whether they correspond to reality, whether any technology was used, is precisely the problem. in fact, a very large number of judges, and of those who represent this field all over the world, are now concerned about it. an indian candidate ordered a video like this before the elections: he wanted to address voters in english and in local dialects, but he himself knows only hindi, so his face was put on the face of an actor and neural networks handled the articulation; the politician decided to reach as big an audience as possible. in california, for example, it is already forbidden to post deepfakes of politicians within 2 months before an election, and in china, since january, fakes that defame reputations or threaten the country's security have been outlawed. but where, apart from the film industry and advertising, does this
11:51 pm
technology have a great and legal future? imagine catching the little mermaid, finding the leshy and the learned cat, and taking a picture in the style of lukomorye. imagine coming to a literature lesson where they play you a video in which sergei yesenin reads his own poems in his own voice. the same can be done in history lessons, and in lessons of mathematics, physics and chemistry, and because of this there is a completely different perception, especially for modern children. the russian social network vkontakte is working on it too: "yes, we are now working on technologies that will allow everyone using vkontakte to take any photo or video content they have, or content available on social networks or other resources, upload it to us and check this video for deepfakes."
11:52 pm
"in general, we started developing this detection technology because we had a large data set created by our own deepfake technology, which we launched a few months ago. the application was the first to introduce a feature that lets users try on the faces of celebrities, and then the neural network learned to identify fakes on those same videos. to be honest, there were many moments when even we could not distinguish a really real star from our ordinary users, and our users certainly may not manage it, so first of all we strive to give people a tool that lets them say whether a photo or video is really real." there are already algorithms that can create a fake from just a single photo; the tools keep getting simpler and the videos more frighteningly realistic. we use beautiful terms, and in russia we say "artificial intelligence," but in fact, of course, it does not yet truly think, although in the future such systems will certainly learn to think quite well. and it is for this that
11:53 pm
a large amount of information will be needed, and we do not think about how much biometric data our phone has already collected from us. when we unlock it with our face, that is the same biometrics: we give a full 3d scan of the face from all angles, which means that in theory the face can be reconstructed from those sides. we upload our photos to the cloud from different angles and at different hours of the day and night, which means a hypothetical neural network has the opportunity to work out the lighting precisely and create that very cast of the face. at itmo university in petersburg, they developed the "expert" service, which could help employers looking for new employees: the system studies a candidate's video presentation according to various parameters.
11:54 pm
this is not a lie detector that will mark individual statements as completely true or completely false, but it can form a general impression of a person: that he behaves insecurely on video, say, or that he avoided answering every question. accordingly, we are unlikely to want to deal with a person who will not be completely sincere with us. while the development was underway, the system also turned out to be good at weeding out deepfakes. popular videos were uploaded to it, among them the one where actress jennifer lawrence turns into steve buscemi, and the system sees a discrepancy between the different channels: facial expressions, emotions and text, with all their ups and downs, fail to line up. in one example the video channel was always the same, the face literally without emotion, while the audio carried aggression; knowing that there was aggression in the voice, and what
11:55 pm
emotional text he was speaking, and that he showed emotion with his voice but not at all with his face, we can say that this is not very typical of a living person, and we can already understand that we should pay attention: these may be deepfakes. the applications can be many and varied. students could run their answers through it before an exam: "i would like to check whether it is really worth spending my precious time studying material that may turn out to be useless, given that the specialist himself is not sure of what he is talking about." to believe this or that expert or not: that, actually, is the question the technology addresses. there are narrower applications too, forensic examination for instance, or, to go further, players on the exchange: it is important for them to know whom to trust with their investments today and tomorrow, so they listen to the opinions of authoritative people, and against that background
11:56 pm
so-called info-gypsies appear: they quickly pick up the jargon, start speaking with authority, get into forecasting. they themselves earn plenty, and whether people lose money on it does not seem to matter to them. it is in principle for this that the technology is meant: the system is designed to test expert opinion, to show whom to trust and what would be good to double-check. in the age of artificial intelligence, critical thinking is required more than ever, and deepfakes, even the most innocent ones, need to be labeled; that, it seems, is the direction the whole world is leaning.
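the cross-channel check described above, voice emotion against facial emotion over time, boils down to a disagreement rate. a minimal sketch, assuming hypothetical per-second emotion labels already produced by separate audio and video models (the labels and the notion of "suspicious" are our own illustration):

```python
def mismatch_rate(audio_emotions: list[str], video_emotions: list[str]) -> float:
    """Fraction of time windows in which the audio and video channels
    disagree about the speaker's emotion."""
    pairs = list(zip(audio_emotions, video_emotions))
    if not pairs:
        return 0.0
    return sum(a != v for a, v in pairs) / len(pairs)

# the voice carries anger while the swapped-in face stays flat: suspicious
audio = ["anger", "anger", "neutral", "anger", "anger"]
video = ["neutral", "neutral", "neutral", "neutral", "neutral"]
print(mismatch_rate(audio, video))  # prints 0.8
```

a high rate does not prove a fake, as the transcript itself notes; it only flags that the clip deserves a closer look.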
12:00 am
international women's day began to be celebrated in 1913, and only more than 50 years later did march 8 begin to be celebrated at the state level in the soviet union. over time, the political overtones faded into the past, and today march 8 is associated with spring and with the femininity of russian women. president vladimir putin also offered his congratulations: "dear women, i sincerely congratulate you on international women's day. this holiday is celebrated in many states, but for russia it is always full."