
tv   RIK Rossiya 24  RUSSIA24  March 7, 2023 11:30am-12:00pm MSK

11:30 am
Kiselyov: You said that you will return; the most precious thing is holding on.
11:31 am
Now on our air, a short advertisement, and then watch a special report by Alisa Romanova: "Suspicious Faces" will find out how Russia is fighting deepfakes.
11:32 am
[Advertising] We closed a loan with the guaranteed-rate service, and that's not all: pay on time and Post Bank will refund the difference, recalculating the loan at 4.5% per annum. Come for money: Post Bank, a solution for every day. From VTB: a VTB card with cashback up to 30% is a reason to rejoice in every purchase, to please someone else, to turn an ordinary coffee into a pleasant acquaintance, and to change plans as it suits you. Get a free VTB card for life, and every day will get better. Go to VTB and everything will work out. Buy in branded stores and on the website, and we will raffle off prizes worth a million. Each one is a real work of art: light, translucent, tinted. They have a special inner strength of character and at
11:33 am
the same time a lightness, as if acquainted with eternity and inspiration. When you want to confess your love, kiss life: it is impossible to look away. Happy holiday, my love.
11:34 am
Politicians saying things they have never said in public in their lives; artists already deceased, for whom a digital copy now performs; an actor who has shed 30 years on screen without plastic surgery. All of this is already being done by face-replacement deepfake technology. But where can it lead us? For most people it is either a joke or a threat.
11:35 am
Deepfakes will be flagged automatically if you install our technology on your TV channel: you can always determine exactly whether the image has been substituted, whether this is a deepfake or not. The word itself was formed from two others, "deep learning" and "fake," though many developers dislike the term for its negative connotation. Yes, fake, but this is the future. What opportunities have opened up: you can become any celebrity, any idol, and say whatever you want on their behalf. Natalie Portman, like dozens of other Hollywood stars, felt the full horror of the new technology when the stars' faces began to be massively superimposed onto actresses in 18+ commercials, the most explicit content. So what should be done so that, on the one hand, the development of the technology is not hindered and, on the other, its abuse is not allowed?
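The automated flagging the developer describes (a warning light that comes on when broadcast frames look substituted) reduces to a smoothing rule over per-frame detector scores. A minimal sketch, assuming a hypothetical upstream detector that yields a probability per frame; the function name, window size and thresholds are all illustrative:

```python
from typing import Iterable


def monitor_stream(frame_scores: Iterable[float],
                   threshold: float = 0.8,
                   window: int = 5,
                   min_hits: int = 3) -> list:
    """Return indices of frames at which a deepfake warning is active.

    frame_scores: per-frame fake probabilities from some upstream
    detector (assumed here, not implemented).  The warning fires while
    at least `min_hits` of the last `window` frames score above
    `threshold`, which smooths out single-frame false positives.
    """
    recent = []   # rolling window of booleans: frame looked fake?
    alarms = []   # frame indices where the warning light is on
    for i, score in enumerate(frame_scores):
        recent.append(score > threshold)
        if len(recent) > window:
            recent.pop(0)
        if sum(recent) >= min_hits:
            alarms.append(i)
    return alarms
```

Requiring several hits within a sliding window keeps a single noisy frame from lighting the warning on its own.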
11:36 am
"You're late; with him it's straight to the penalty box now." "If you had told me that in 2027 I, Jason, would celebrate my anniversary in Russia, I would have laughed." The star of the Fast and Furious and Guy Ritchie films, Jason Statham, moves to a Russian village; Keanu Reeves wears a T-shirt with the Olympic bear. "We've been filming together in Russia for five years; we were friends." Thanks to face-swapping technology, every shot
11:37 am
of the world's first such series was coordinated by the director with the double. "Accordingly, well, did they say yes or no?" "Light the candles; now I will decide everything." The doubles were played by actors, and digital masks were then fitted over their faces so that there were no seams or joints. "A profile, uh, you can't take; you can't invent frames of a full circle around a person, though in some episodes that turned out to happen by accident." The masks were created by neural networks, which are continuously trained on a specific person.
11:38 am
"Here's the perfect angle; more precisely, not quite a full three-quarters." For the series' author it was important, in creating a parody, not to offend anyone's feelings; under the law, the harmless use of stars' faces is not regulated anywhere in the world. But what opportunities have deepfakes opened for world cinema? The series was filmed by the Gentlemedia company, which has been working with this technology for several years: the more similar the actor is to the prototype, the more realistic the video. In the West there is already this idea: artists, too, will soon have a digital copy made once a year or once every six months, a high-quality video filming from all sides. You are already old, already tired, you no longer want to act, but there is a thirty-year-old you, a forty-year-old you, who goes to work in the cinema and earns money for you or your heirs. I think copying celebrities is quite possible; the easiest way is with these
11:39 am
algorithms, all of these. And then on this hook a businessman appears who says: I don't know, I'm selling the company, we have losses, it's all very bad. The stock falls, and at that moment someone makes money on it. Such, unfortunately, is modern technology. Deepfake detection in Russia is being developed by several companies; one in the field has patented two inventions. Its algorithms analyze a person's biological characteristics, such as the changes in complexion produced by the pulse: in a living face the heartbeat stands out in the color of the skin, while in a fully synthesized face it is evened out. It turns out that accuracy should be
11:40 am
almost 80%; that the person on the right is synthesized cannot be discerned with the naked eye. "We split the video frame by frame and analyze each frame, which is an exceptional feature compared to other similar solutions. We can therefore detect several faces in the frame simultaneously, which is also an exclusive feature of our solution." At the Sberbank stand at the artificial-intelligence conference, the president was shown a person very similar to Chancellor Olaf Scholz. "Tell me, American, what is power? In money? You say power is in money: you have a lot of money, and what? I think power is in truth: whoever has the truth is stronger. So you
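The pulse-based cue mentioned above (a living face shows faint periodic color changes with each heartbeat; a fully synthesized face generally does not) can be sketched as a naive spectral test on the mean green-channel value of the face region over time. This illustrates the general remote-photoplethysmography idea, not the patented algorithm from the report; the function name, frame rate and thresholds are assumptions:

```python
import math


def has_pulse(green_means, fps=30.0, band=(0.7, 4.0)):
    """Crude liveness check: does the mean green-channel signal contain
    a dominant periodic component in the human heart-rate band
    (roughly 42 to 240 beats per minute)?

    green_means: per-frame mean green value of the face region, from
    some assumed face tracker.  Sketch only: real pipelines also
    detrend, band-pass filter and stabilize the face crop.
    """
    n = len(green_means)
    mean = sum(green_means) / n
    x = [v - mean for v in green_means]      # remove the DC component
    best_f, best_p = 0.0, 0.0
    # naive DFT: find the frequency bin with the most power
    for k in range(1, n // 2):
        f = k * fps / n
        re = sum(x[t] * math.cos(2 * math.pi * f * t / fps) for t in range(n))
        im = sum(x[t] * math.sin(2 * math.pi * f * t / fps) for t in range(n))
        p = re * re + im * im
        if p > best_p:
            best_f, best_p = f, p
    return band[0] <= best_f <= band[1] and best_p > 1e-6
```

A clean sine in the heart-rate band passes; a flat signal, as a pulse-free synthetic face would produce, does not.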
11:41 am
deceived someone; the truth remains, which means it is stronger. "We wanted to abandon Russian gas, but, in the words of a Russian classic, we wanted the best, and it turned out as always." Their detector immediately declared this a fake, and indeed it had been made with the help of an actor and their own neural networks. Employees and clients enter the office using Face ID and confirm their identity and transactions the same way, so it was important for the company to prepare for the risks in advance. "And if our technology is installed on your TV channel's broadcast, you can always determine exactly whether the image has been altered, whether this is a deepfake or not. Our goal is for this to happen in an automated mode, so that you don't have to press any buttons: a light immediately comes on warning that there is some kind of
11:42 am
deepfake risk." The biometric barrier is difficult to bypass, but there are already examples of the technology being used for criminal purposes. The accountant of one enterprise received an SMS from her director asking her to contact him urgently by video link. On the call is, to all appearances, her director; she is deeply convinced of it. "I've just stepped out," says the big manager; "we have an urgent order to transfer 100,000 dollars." In this situation the accountant carries out his orders without question, and literally an hour or two later, with the deepest disappointment and horror, she realizes what she has done. Voice deepfakes are becoming even more dangerous: at the end of January, users of a new program, a speech synthesizer, created audio recordings with the voices of stars. In them,
11:43 am
clones of Tom Cruise, George Lucas and others, even Russian ones, made offensive remarks. The files were blocked, but an unpleasant aftertaste remained. In one of the most shocking audio deepfakes, Emma Watson, that same Hermione from the Harry Potter films, read an excerpt allegedly from Hitler's manifesto. Of course she did no such thing. But how to deal with this in general? Let's ask "Emma" herself. "I think we should all know the truth. Fake-detection technology and special laws are still very weak. It doesn't matter at all for what purpose these videos are made and shared on the network; people should know the truth, and such a video should somehow be labeled. The technology itself is not to blame for anything." We played along here with the actress Yulia Ganatskaya: like any artist, it is important to her that her image is protected from discreditation.
11:44 am
"There are such cool technologies, both for you and against you, when you work in the frame as an actor: these realistic masks." These specialists helped us. Sergey Loginov has been working with deepfakes for a long time and understands their potential perfectly. "The scale of criminal fakes of this kind will grow. Deepfakes made precisely for the purpose of exposing some kind of lie, of slander, of illegal actions: this, of course, needs to be dealt with. But usually such deepfakes mislead not by their realism; they are often very unrealistic, and they still work, one might say, on sheer belief. It is like a virus and an antivirus: the first is always one step
11:45 am
ahead." "I have never agreed to anything illegal, although there were a lot of offers: all sorts of identity verification on crypto exchanges through a deepfake. Firstly, that is impossible to do with this technology now. Secondly, people imagine it takes a second, that 5 rubles are made just like that: hand it over and it's all done. Well, it's not like that at all." But British journalists described and showed how easy it is to deceive a voice assistant. A reporter called his bank and played from his computer an audio recording synthesized from his own voice by an easily accessible program. "What is your reason for calling?" "To check my account balance." "Fine; name your date of birth." The program did not notice the catch. "My voice is my password." The security system again failed to recognize the fake and gave out information about his
11:46 am
account balance and recent transactions. This is being fought in the laboratory of the Speech Technology Center. "Let's try to call, to see how the technologies work and what the operator sees. Hello, I would like to get into my personal account." The director of the research department, imitating a call to a call center, shows how their protection system works; such solutions can be integrated into government services, banks and television companies. "The system solved two problems: first, it identified me as its client, and second, it made sure I wasn't trying to deceive it; the anti-spoofing graph is in the green zone. Now exactly the same call, but one recorded in advance, and I am
11:47 am
replaying it." By ear a person will not notice the catch, but the system already hears everything: it knows this is a hoax, that this is a recording of my voice and not a live voice. To pass voice biometrics, attackers can go a variety of ways: they can record a voice and play it back, they can parody a person, or they can splice fragments together. Top international teams work on protecting voice biometrics, and Russian companies hold leading positions at international technology competitions. Developers of biometric systems study all of a system's vulnerabilities so that it is impossible to get through with someone else's photo. "And now I will demonstrate, let's say, the scenario of the most common myth
11:48 am
associated with facial biometrics: namely, that the villains have your photo and will get access to your data. Well, I bring a photo to the terminal. Here is the terminal: it says 'look into the camera' and, realizing that it is not me, it refuses." Biometrics are developing in parallel all over the world, and everywhere they use computer-vision technology. It initially grew from simple human detection, identifying people by points accepted and familiar to our eye, and with the development of neural networks and improved work with them, the systems show truly fantastic results. On the positive side
11:49 am
, it's great to make some cool videos. Remember all those clips: "keep your money in a savings bank." But then again, I wouldn't want to see myself in some video in some weird context, and I'd like to have a tool to prove that it's not me, that I am specifically not in this video. Everything is moving toward that, and I think such a tool will be available soon enough. At VisionLabs they develop algorithms for face detectors; these, too, are not fooled by photos or by videos shown from a phone. "We are often asked whether some kind of special makeup can be applied, or something else like that. No, that doesn't work either, because
11:50 am
we examine the face in a completely different plane." The technology also recognizes emotions on the face and estimates age; a smile clearly rejuvenates. "We can detect fallen people and sitting people, squatting, for example, on a chair; we can detect hands." These systems determine body position in the frame. "We are developing algorithms that help us identify people fighting: if somewhere in the frame two people are using force against each other, we can detect this event." Why was squatting-person detection created? It was a bank's request: to hack an ATM, an attacker needs to crouch in front of it to gain access to the service compartment; accordingly, if, for example, a person squats
11:51 am
there for more than 10 seconds, we can raise an alarm. The platform has also analyzed one of the most famous clips uploaded to the system: "Donald Trump is a complete dumbass. You see, I would never say that, at least not in public. But someone else would: someone like Jordan Peele. We are in dangerous times right now; moving forward, we need to be more vigilant about what we trust from the internet." The algorithms have advanced a lot since then; that fragment is relatively old, and far more powerful substitution technologies now exist. Destroying a reputation is already easy: a Pennsylvania resident decided to rid her daughter of rivals on the cheerleading squad by sending the coaches fake photos and videos of the girls in which they smoke vapes and drink.
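The squatting-person alarm described for ATMs reduces to a timing rule over pose-detector output. A minimal sketch, with the pose detector assumed and only the 10-second threshold taken from the report; the function name and event format are illustrative:

```python
def squat_alarms(events, min_duration=10.0):
    """Return timestamps at which an alarm fires.

    events: (timestamp_seconds, pose) pairs from an assumed pose
    detector, where pose is 'squatting' or any other label.  An alarm
    fires once per continuous episode, as soon as someone has been
    squatting in front of the ATM for at least `min_duration` seconds.
    """
    alarms = []
    start = None    # when the current squatting episode began
    fired = False   # has this episode already raised an alarm?
    for ts, pose in events:
        if pose == 'squatting':
            if start is None:
                start, fired = ts, False
            if not fired and ts - start >= min_duration:
                alarms.append(ts)
                fired = True
        else:
            start = None   # episode broken; reset the timer
    return alarms
```

Standing up resets the timer, so brief crouches near the machine do not trigger the alarm.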
11:52 am
Generated voices and videos are already sending the innocent to jail. "When a criminal case is considered, audio or video recordings are now very often used as evidence, and how far they correspond to reality, whether the technology was used: this problem now in fact worries a very large number of judges, and in general those who represent this segment all over the world." There are also political manipulations. An Indian candidate ordered such a video before the elections: he wanted to address voters in English and in their dialects, though he himself knows only one of India's languages. An actor stood in, and neural networks were responsible for the articulation; in this way the politician decided to reach as wide an audience as possible. But in California, for example, it is already forbidden to post fakes of politicians within 2 months before an election, and in China since January it has been illegal to spread fakes that defame
11:53 am
reputations or threaten the security of the country. But where, apart from the film industry and advertising, does this technology have a great and legal future? To take a picture in the style of Lukomorye, like the little mermaid, or to find the wood-goblin and the learned cat. "Imagine that you come to a literature lesson, they start a video for you, and, for example, Sergei Yesenin reads his own poems in his own voice. The same can be done in history lessons, and in lessons of mathematics, physics and chemistry, and this creates a completely different perception, especially among modern children." The Russian social network VKontakte is also developing detection. "Yes, we are now working on technology that will allow any VKontakte user to take any photo or video content, their own or available on social networks or other resources, upload it to us, and check it for any manipulations with the face and
11:54 am
for the presence of deepfakes in the video. We started developing this technology because we had a large database of deepfakes that we had created with our own deepfake feature, which we launched a few months ago. Our application was the first to introduce a technology that lets users try on the faces of celebrities, and the neural network then learned to identify fakes on those same videos. To be honest, there were moments when even we could not distinguish a really real star from our regular users, and our users certainly may not manage it, so first of all we strive to give them a tool that will tell them whether a photo or video is really real." But there are already algorithms that create a deepfake from just a single photo: the tools keep getting simpler, and the videos more frighteningly realistic. "We use beautiful terms, 'artificial intelligence,' but in fact, of course, it is nothing of the kind yet. In the future machines
11:55 am
will undoubtedly learn to think quite well, but this will require a large amount of information. We do not stop to think that our phone has already collected our biometric data: when we unlock it with face recognition, we give a full 3D scan of the face from all angles, which means that in theory the face can now be reconstructed in some way from those sides. We upload our photos to the cloud from different angles and at different hours of the day and night, which means that neural networks now hypothetically have the opportunity to work out the lighting better and create exactly this cast of the face." ITMO University in St. Petersburg has developed an expert service that could help employers screening new employees: the system examines a candidate's video presentation according to various parameters.
11:56 am
"This is not a lie detector that will determine completely true or completely false statements in answers to questions, but it can form a general impression of a person: that on the video he generally behaves insecurely, that on the whole he avoided answering the questions. Accordingly, we are unlikely to want to deal with a person who will not be completely sincere with us." While the development was under way, the system unexpectedly turned out to be good at weeding out deepfakes as well. We load popular videos into it: the actress Jennifer Lawrence has been turned into someone else entirely, and the system sees a discrepancy in facial expression across the various channels, emotions and text. "With all the ups and downs on the emotion index of, let's say, the audio channel, the video channel was always the same; that is, it was literally without emotion."
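The cross-channel check just described (emotional audio track, motionless face) can be sketched as a comparison of emotion scores estimated independently per channel. The upstream emotion estimators are assumed; the labels, score format and threshold here are illustrative:

```python
def channel_mismatch(audio_emotions, face_emotions):
    """Return a mismatch score in 0..1 between two emotion estimates.

    audio_emotions, face_emotions: dicts mapping an emotion label to a
    0..1 score, produced by assumed per-channel estimators.  A high
    value means the voice is emotional while the face stays flat (or
    vice versa), the inconsistency the system above flags as atypical
    for a living person.
    """
    keys = set(audio_emotions) | set(face_emotions)
    diffs = [abs(audio_emotions.get(k, 0.0) - face_emotions.get(k, 0.0))
             for k in keys]
    return max(diffs) if diffs else 0.0


def looks_fake(audio_emotions, face_emotions, threshold=0.5):
    """Flag a clip when the strongest per-emotion disagreement between
    the audio and video channels exceeds `threshold`."""
    return channel_mismatch(audio_emotions, face_emotions) > threshold
```

Strong anger in the voice with a blank face trips the flag; channels that roughly agree do not.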
11:57 am
And likewise, having information that in the audio there was, for example, aggression, that the person spoke an emotional text and showed emotion in his voice, yet at the same time did not move his face, we can say that this is not very typical of a living person, and we already understand what to pay attention to. The applications here can be many and varied. Students could run a lecturer through it before an exam: "I would like to check whether it is really worth spending my precious time studying material that may potentially turn out to be useless, given that the specialist himself is not sure of what he says." The task of the technology, in fact, is to say whether the expert should be believed or not. For narrower applications, please: forensic examination, to begin with, and so on. Or players on the stock exchange, for whom it is important to know where to invest today and tomorrow: they
11:58 am
listen to the opinions of authoritative people, and against that background so-called information gypsies appear, who quickly pick up the jargon, start talking, and worm their way into forecasts of returns alongside the authorities. They themselves earn little, and that other people will lose money on this does not matter to them; in principle, this technology is intended precisely for such cases. The system is designed to verify expert opinion: whom to trust, and what it would be wise to double-check. Bad artificial intelligence, more than ever, requires critical thinking, and deepfakes, even the most innocent ones, need to be labeled; it seems the whole world is leaning toward this.
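The labeling the report ends on can be sketched as attaching a minimal provenance record to a media file: a content fingerprint plus a declared "synthetic" flag. This only illustrates the idea; it is not any real standard (real provenance efforts such as C2PA manifests are far richer), and all field names are assumptions:

```python
import hashlib
import json


def label_synthetic(path_or_bytes, creator, is_synthetic=True):
    """Produce a minimal JSON provenance label for a media file.

    path_or_bytes: a file path or the raw media bytes; creator: who is
    declaring the label.  The SHA-256 fingerprint ties the declaration
    to one exact file: change a single byte of the media and the label
    no longer matches.
    """
    if isinstance(path_or_bytes, bytes):
        data = path_or_bytes
    else:
        with open(path_or_bytes, 'rb') as f:
            data = f.read()
    return json.dumps({
        "sha256": hashlib.sha256(data).hexdigest(),
        "creator": creator,
        "synthetic": bool(is_synthetic),
    }, sort_keys=True)
```

A player or platform could then refuse to show the clip as authentic when the declared hash does not match the file it accompanies.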
11:59 am
Dozens of machine guns, grenade launchers, automatic weapons and pistols were seized, and 23
12:00 pm
underground weapons workshops were shut down: the large-scale operation covered about 30 regions of the country. Who produced these weapons, and why? Leaving in small groups: the French press publishes the words of one Ukrainian soldier, who said that the Ukrainian troops are almost surrounded and there is only one way to escape. A new model of international relations: the head of the Chinese Foreign Ministry commented on relations between Moscow and Beijing and criticized the actions of the United States. What statements were made? One of the city's squares and several streets in Vladivostok were flooded after a heating-main rupture; 30,000 people were left without hot water and heating. Underground weapons workshops: the large-scale operation by the FSB, the Interior Ministry and the Russian Guard covered about 30 regions of the country.
