
tv   RIK Rossiya 24  RUSSIA24  March 5, 2023 5:30pm-6:01pm MSK

5:30 pm
for forensic experts, such is the service: today in one city, tomorrow in another, with many business trips, hardships and dangers. their relatives waited for them; today they say goodbye to faithful comrades who made a huge contribution to the triumph of truth and justice, who remained true to exact fact, impeccable logic and a clear conscience. they stood for the truth, and in truth there is strength. an everyday solution from vtb online: payments make the home comfortable in one step and without commission, paying for internet and tv makes the day more fun and without overpayment, and one click makes the home cozy. get a free lifetime card at vtb.ru, and every day will get better. switch to vtb, and that's it. living with prostatitis is difficult, and it will only get harder; this remedy helps reduce inflammation
5:31 pm
and acts against fibrosis in prostatitis. a smartphone for 1,500 rubles a month: convenient. automatic cleaning on a whole new level: the tefal robot vacuum cleaner at a 5,000-ruble discount. deposits with a maximum yield of up to 9.5%, in a bank branch right now
5:32 pm
long-deceased artists played by their digital copies, or actors who shed 30 years on screen without plastic surgery: all of this is already possible thanks to face-replacement technology, the deepfake. but where
5:33 pm
can it lead us? for most people it is either a joke or a threat. "if you install our technology on your tv channel, you can always determine exactly whether the image has been substituted, whether this is a deepfake or not." the word "deepfake" was formed from two terms, "deep learning" and "fake," though many developers dislike the word for its negative connotations. still, this is the future. what opportunities have opened up: you can become any celebrity or idol and say whatever you want on their behalf. natalie portman, like dozens of other hollywood stars, felt the full horror of the new technology when her
5:34 pm
face began to be massively superimposed on the faces of actresses in 18+ videos of the most explicit content. so what should be done so that, on the one hand, we do not hinder the development of the technology and, on the other, do not allow it to be abused? "now you're late, everyone at once, to the penalty area." "if they had told me that in 2027 i, jason statham, would celebrate my anniversary in russia, i would have laughed." in a parody series, the star of the fast and furious films moves, with director guy ritchie, to a russian village; keanu reeves wears a t-shirt with the olympic bear. thrown together in russia, they
5:35 pm
became friends; one of them buys a zhiguli. every shot with a double the director of the world's first series of this kind coordinated with specialists: "we sent screenshots and, accordingly, they said yes or no. yes, yes, of course." in the dark the fitted digital masks showed their seams; a profile, uh, you can't shoot, you can't, uh, invent frames of a full circle around a person, and then, uh, in some
5:36 pm
episodes it turned out to be possible by accident, because the masks are created by neural networks that are constantly trained on a particular person. here the ideal angle is, uh, more precisely, not full face but three-quarters. for the authors of the series it was important not to offend anyone when creating the parody; by law, the harmless use of stars' faces is not regulated in any way anywhere in the world. but what opportunities have deepfakes opened for world cinema? the company that filmed the series has been working with the technology for several years: the more similar the actor is to the prototype, the more realistic the video. "well, i think it will soon come to this: once a year or once every six months a digital copy of a person is made, that is, a high-quality video shoot from all sides. you're already old, you're already tired, you no longer
5:37 pm
want to act, but there is a thirty-year-old you, and another 40 years in cinema are in development; i think it's quite realistic." the algorithm studies a lot of a person's videos and photos and creates an imitation, and on this hook you can already catch victims: a businessman appears who says, i don't know, "i'm selling the company, we have losses, everything is very bad." the shares fall, and at that moment someone makes money on it. unfortunately, modern technologies are very often mastered first of all by scammers. protection against deepfakes in russia is being developed by several companies; one in the field has patented two inventions. its algorithms analyze a biological feature of a person, the changing complexion: when

5:38 pm

the heart beats, the skin tone of a real face pulses in time with the person's pulse, while a fake face is filled evenly. it turns out a deepfake can be caught with almost 80% reliability, even when the synthesized person on the right cannot be distinguished with the naked eye. "we split the video frame by frame and analyze each frame, which is an exceptional feature compared to other similar solutions; accordingly, we can detect several faces in the frame simultaneously, which is also an exclusive feature of our solution."
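The pulse-based check described in the segment above can be illustrated with a minimal sketch. This is not the patented algorithm itself, only the idea: average the face region's green-channel value in every frame and see whether that signal carries a peak in the human heart-rate band (roughly 0.7–3 Hz). The threshold and the synthetic signals below are assumptions for illustration.

```python
import numpy as np

def pulse_band_score(green_means, fps):
    """Fraction of the signal's non-DC spectral power that falls in
    the human heart-rate band, ~0.7-3.0 Hz (42-180 beats per minute)."""
    x = np.asarray(green_means, dtype=float)
    x = x - x.mean()                          # drop the average skin tone
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 3.0)
    total = power[1:].sum()                   # all non-DC power
    return power[band].sum() / total if total > 0 else 0.0

def looks_live(green_means, fps, threshold=0.5):
    """A live face's colour flickers with the pulse; an 'evenly
    filled' synthetic face shows no heart-rate peak."""
    return pulse_band_score(green_means, fps) > threshold

# Simulated 10-second clips at 30 fps (illustrative, not real data):
fps = 30
t = np.arange(0, 10, 1 / fps)
rng = np.random.default_rng(0)
real = 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.05 * rng.standard_normal(t.size)
fake = 0.05 * rng.standard_normal(t.size)     # flat colour, noise only
print(looks_live(real, fps), looks_live(fake, fps))   # True False
```

The ~80% figure quoted in the broadcast refers to the company's full system; a single-cue sketch like this would be far weaker on real video, where lighting changes and compression also modulate skin tone.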
5:39 pm
at the sberbank stand on artificial intelligence, the president was shown a man very similar to chancellor olaf scholz. "tell me, american, what is the power? is it in money? my brother says it's in money: you have a lot of money, and so what? i think the power is in truth: whoever has the truth is stronger. so who cheated whom? we wanted to abandon russian gas, but, in the words of a russian classic, we wanted the best, and it turned out as always." their detector immediately revealed the fake, which had been made with the help of an actor and their own neural networks. sber actively uses facial biometrics, a face id: employees and clients enter the office and confirm their identity and operations by face, so it was important for the company to prepare for the risks in advance. "and if our technology is installed on your tv channel's broadcast,
5:40 pm
you can always determine exactly whether the image has been changed, whether this is a deepfake or not. our goal is for this to happen automatically, so that there are no buttons to press: a light bulb simply lights up wherever there is a risk." it is still hard to bypass the biometric barrier today, but there are already examples of the technology being used for criminal purposes. "the accountant of one enterprise received an sms from her director: contact me urgently via video link. on the call is a deepfake, but she is deeply convinced it is her actual director: i just stepped out, and a senior manager has given us an urgent order to transfer 100,000 dollars. in this situation the accountant impeccably fulfills the instructions, and literally an hour or two later, with the deepest disappointment and horror, she realizes what she has done." even more dangerous are
5:41 pm
voice deepfakes. at the end of january, users of a new speech-synthesizer program created audio recordings with the voices of stars: in them, clones of tom cruise, george lucas and others uttered insulting, even racist statements. the files were blocked, but a bad taste remained. in one of the most shocking audio deepfakes, emma watson, that same hermione from the harry potter films, read an excerpt allegedly from hitler's manifesto. did she really? of course she didn't. how to deal with this in general, let's ask emma herself: "i think we should all know the truth. to detect fakes we need technology and special laws, which are so far very weak. it doesn't matter at all for what purpose such videos are made or shared online: people should know the truth, and such videos should somehow be labeled. the technologies themselves are not
5:42 pm
to blame for anything." we played along with the actress yuliya ganatskaya: like any artist, it is important to her that her image is protected from fakes of this kind. "hmm, there are such cool technologies, both for and against: you work in the frame, you are an actor, and at home you come and, i don't know, show mom's emotions." to put on these realistic masks we were helped by the specialist sergey loginov. he has been working with deepfakes for a long time and is well aware of their potential, and of how great the threat is that the scale of such crimes will grow. "deepfakes made precisely for the purpose of exposing some kind of lie, slander or illegal actions, this of course can be done, but usually such deepfakes are, uh, not misleading
5:43 pm
in their realism; they are often very unrealistic, but they work all the same, uh, one might say, simply on belief. it is like a virus and an antivirus: the first is always one step ahead. i have never agreed to any illegal things, though there have been plenty of offers: all sorts of identity confirmations on crypto exchanges through a deepfake. firstly, it is simply impossible to do now with this technology. secondly, people imagine it takes a second, that it costs 5 rubles there, hand it over and it's all done. well, it's not like that at all." but british journalists described and showed how easy it is to deceive a voice assistant: a reporter called his bank and played from a computer an audio recording synthesized from his own voice by an easily accessible program. "what is your reason for calling?" "to check my account balance."
5:44 pm
"okay. name your date of birth." the program did not notice the catch. "my voice is my password." the security system again did not recognize the fake and gave out information about his account balance and recent transactions. such attacks are fought in the laboratory of the speech technology center. "let's try to call there, uh, to see how the technologies work and what the operator, uh, can see. hello, i would like to get into my personal account." the director of the research department imitates a call to the call center to show how their protection system works; such solutions can be integrated into government services, banks and television companies. the system solved two problems:
5:45 pm
one, it identified me as its client, and two, it, uh, made sure i am not trying to deceive it: the anti-spoofing graph is in the green zone. "now exactly the same request, but recorded in advance and replayed. by ear a person would not notice the catch, but the system hears everything: it heard that this is a hoax, that this is a recording of my voice and not a live voice. to pass voice biometrics, uh, attackers can go various ways: they can, uh, record a voice and play it back, they can, uh, mimic a person, and they can splice recordings together." top international companies work on voice biometric protection, and russian companies occupy leading positions at international technology competitions.
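One cue for catching a replayed recording, hinted at in the demonstration above, is that a loudspeaker-and-microphone replay chain tends to dull high frequencies. The sketch below is a toy illustration of that single cue, not the Speech Technology Center's actual anti-spoofing system; the 4 kHz split, the 0.1 threshold, and the synthetic "voice" are all assumptions.

```python
import numpy as np

def high_band_ratio(signal, sr, split_hz=4000.0):
    """Fraction of spectral energy above split_hz. A loudspeaker
    replay chain usually attenuates high frequencies, so a replayed
    recording scores lower than a live voice at the microphone."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sr)
    return power[freqs >= split_hz].sum() / power.sum()

def flag_replay(signal, sr, threshold=0.1):
    """Flag a recording whose high band is suspiciously weak."""
    return high_band_ratio(signal, sr) < threshold

sr = 16000
t = np.arange(0, 1.0, 1 / sr)
# "Live" voice stand-in: equal harmonics of 200 Hz up to 7.8 kHz.
live = sum(np.sin(2 * np.pi * 200 * k * t) for k in range(1, 40))
# "Replayed" copy: a crude low-pass (moving average) dulls the top end.
replayed = np.convolve(live, np.ones(16) / 16, mode="same")
print(flag_replay(live, sr), flag_replay(replayed, sr))   # False True
```

Real anti-spoofing systems combine many such cues (channel artifacts, pop noise, phase behavior) with trained models; a single energy ratio is easily fooled.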
5:46 pm
the developers of biometric systems study all the vulnerabilities of a system so that it is impossible to get through with someone else's photo. "and now i will demonstrate, let's say, the scenario of the most common myth associated with facial biometrics: namely, that with your photo villains will get access to your data. for example, well, i bring a photo to the terminal. here is the terminal; it says look into the camera and, realizing that it is not me, forbids me to pass." biometrics are developing in parallel all over the world, and here and there they use computer-vision technology. "it initially grew out of the human habit of identifying people by certain points that are accepted and familiar to our eye. and further, with the development of neural networks, uh, the work with them improves,
5:47 pm
and the system shows really fantastic results. on the positive side, it would be great to make some cool videos. remember: keep your money in a savings bank. but then again, i would not want to see myself in some video in some strange context, for example, and i would like to have a tool to prove that it is not me, that i am specifically not in this video. everything is moving toward this, and it will be simple and, i think, quite accessible." at visionlabs
5:48 pm
they develop algorithms for face detection; these, too, are fooled by neither photos nor videos from a phone. "we are often asked whether some kind of special makeup can be applied, or something else like that. no, this does not work either, because we look at a face in a completely different plane." the technology also recognizes emotions on a face and estimates age; a smile clearly rejuvenates. "we can detect fallen people and sitting people, for example squatting or on a chair; we can detect raised hands." these systems determine the position of the body in the frame. "we are developing algorithms that help us identify fighting people, that is, if somewhere in the frame we have two people using force against each other, we can detect this event." and what was the squatting detector created for?
5:49 pm
it came from a bank's request: for an intruder to hack an atm, he needs to sit down in front of it to gain access to the service compartment. "accordingly, if, for example, a person squats there for more than ten seconds, then we can trigger an alarm." their platform also detects deepfakes; the system was given one of the most famous such videos: "donald trump is a total dumbass. you see, i would never say that, at least in public, but someone else would, someone like jordan peele. we are in dangerous times; moving forward, we need to be more vigilant about what we trust on the internet." "algorithms have advanced a lot, and this fragment, yes, it's relatively old; now we can do much more." this is how powerful spoofing technologies easily destroy reputations.
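The ATM rule quoted above ("squatting for more than ten seconds triggers an alarm") reduces to a small state machine over per-frame pose labels. The sketch assumes an upstream pose-estimation model already supplies those labels; the class and all names here are hypothetical, not VisionLabs' API.

```python
class SquatAlarm:
    """Fire an alarm when a tracked person stays in a 'squatting' pose
    longer than a threshold (the report mentions ~10 s, the use case
    being someone crouching at an ATM's service compartment)."""

    def __init__(self, threshold_s=10.0):
        self.threshold_s = threshold_s
        self.squat_started = None   # timestamp when squatting began

    def update(self, pose_label, timestamp_s):
        """Feed one per-frame pose label ('squatting', 'standing', ...)
        with its timestamp; returns True while the alarm should fire."""
        if pose_label != "squatting":
            self.squat_started = None   # pose broken, reset the timer
            return False
        if self.squat_started is None:
            self.squat_started = timestamp_s
        return timestamp_s - self.squat_started > self.threshold_s

alarm = SquatAlarm()
# A 30 fps stream: 3 s of standing, then 12 s of squatting.
events = [alarm.update("standing" if t < 3 else "squatting", t)
          for t in (i / 30 for i in range(15 * 30))]
print(events.index(True) / 30)   # first alarm just after 13 s into the stream
```

Resetting the timer on any non-squatting frame keeps the rule simple; a production system would likely tolerate a few frames of label noise before resetting.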
5:50 pm
a mother, trying to save her daughter from cheerleading rivals, sent the coach fake photos and videos of the girls in which they smoke vapes and drink. generated voices and videos can already send the innocent to prison: "when a criminal case is considered, very often now, uh, either audio recordings or video recordings are used as material evidence, and how far they correspond to reality, whether deepfake technology was used, here is the problem. a very large number of judges, and in general those who represent this field all over the world, are now worried about it." there are also political manipulations: this video was ordered by an indian candidate before elections. he wanted to address voters in english and in dialects of indian languages that he does not himself speak; his face was superimposed on the face of an actor, and neural networks handled the articulation. so
5:51 pm
the politician decided to reach as wide an audience as possible. in california, for example, it is already forbidden to post fakes of politicians within two months before elections, and in china since january deepfakes that damage reputations or threaten the security of the country are outlawed. and where, apart from the film industry and advertising, does the technology have a great and legal future? catch the little mermaid, find the learned cat, take a picture in the style of lukomorye. "imagine that you come to a literature lesson, they start a video for you and, for example, sergei yesenin himself reads his own poems in his own voice. the same can be done in history lessons, and the same in lessons of mathematics, physics and chemistry, and because of this there is a completely different perception, especially for modern children." the russian social network vkontakte is developing this too: "yes, we are now working on technologies that
5:52 pm
will allow all users of vkontakte to check any of their photo or video content, or existing content on social networks or other resources, by uploading it to us, for any manipulations with the face, for the presence of deepfakes in the video. in general, we started to develop this technology because we had a large database that we created with our own deepfake feature, which we launched a few months ago. first a feature appeared in the application that allows users to try on the faces of celebrities, and then a neural network learned to reveal deepfakes on those same videos. to be honest, from a certain point many could no longer tell our deepfake of a really real star from our ordinary users, uh, and our users may not be able to either, so we strive first of all to give a tool that will let you say whether a photo or video is really real." but there are already algorithms that
5:53 pm
allow a deepfake to be created from just one photo. the tools keep getting simpler, and the videos more frighteningly realistic. "we love beautiful terms, but in russia artificial intelligence, in fact, certainly, eh, does not exist yet; in the future machines will certainly learn to think quite well, but for that a large amount of information will be needed. and we do not stop to think that our phone has already collected biometric data from us: when we do that same biometric unlock with the face, we give a full 3d scan of the face from all angles. this means that, in theory, the face can now be reconstructed from those sides. we upload our photos to the cloud from different angles and at different, uh, hours of the day and night. this means that a hypothetical neural network now has the opportunity to recover the lighting exactly and create an exact cast of the face."
5:54 pm
itmo university in st. petersburg developed the expert service, which could help employers looking for new employees: the system studies a candidate's video presentation according to various parameters. "this is not a lie detector that will determine, question by question, which statements are completely true or completely false, but it can form a general impression of a person: that, in general, he behaves insecurely on video, that, in general, he evaded answering all the questions. accordingly, we are, well, unlikely to want to deal with a person who will not be completely sincere with us." while the development was under way, it turned out that the system also screens out deepfakes. we uploaded popular videos to it, including the famous one in which the actress jennifer lawrence wears another actor's face, and the system saw the discrepancy between the emotions in facial expressions, voice and text across the different channels. "with all the swings in the
5:55 pm
indicators of, say, the audio channel, the video channel always stayed the same; that is, he was literally without emotion. and having the information that there was aggression, for example, that he spoke an emotional text and showed emotion with his voice but not with his face, we can say that this is not very typical of a living person, and we can already understand that we should pay attention." and beyond deepfakes there can be many different applications: students can run through their answers in advance before an exam, actors can rehearse a role, a student can check a lecturer's competence. "i would like to check whether it is really worth spending my precious time studying material when, potentially, the specialist himself is not sure of what he says." that is precisely the task of the technology: to say whether the expert should be believed or not.
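The cross-channel cue described above (an emotional voice over a motionless face) can be sketched as a simple comparison of per-segment emotion scores from two independent classifiers. The scores, thresholds and function names below are illustrative assumptions, not ITMO's actual system.

```python
def mismatch_segments(voice_scores, face_scores, hot=0.7, flat=0.2):
    """Given per-segment emotion-intensity scores (0..1) from an audio
    classifier and a facial-expression classifier, return the indices
    where the voice is clearly emotional but the face stays flat --
    the 'speaks with feeling, face shows nothing' cue described above."""
    return [i for i, (v, f) in enumerate(zip(voice_scores, face_scores))
            if v >= hot and f <= flat]

# Toy scores for six segments of an interview video:
voice = [0.1, 0.8, 0.9, 0.3, 0.75, 0.2]
face  = [0.1, 0.7, 0.1, 0.3, 0.15, 0.2]
print(mismatch_segments(voice, face))   # [2, 4] -- those segments look suspicious
```

Per the broadcast, the real system also compares emotion inferred from the spoken text itself, a third channel that the same index-matching idea extends to naturally.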
5:56 pm
a narrower application, if you please: forensic examination, and also players on the stock exchange. it is important for them to know who will invest in what tomorrow, so they listen to the opinions of authoritative people, and against this background so-called info-gypsies appear: having quickly read up a little, they begin to talk their way into forecasts on the authority of others. they themselves earn little, and that people will lose money on this does not seem to matter to them; in principle, the technology is intended for this too. the system is designed to test expert opinion: whom to trust, and what it would be wise to double-check. the age of artificial intelligence more than ever requires critical thinking, and every fake, even the most innocent, needs to be labeled; to this, it seems, the whole world is inclined. this infects: tears, voice and nerves; the beats per minute cannot be counted. let there be wheezing, goosebumps, yes
5:57 pm
fever, yes tears: we are all sick with football. play football, come to the stadium, welcome the new season, get a fan card at gosuslugi.ru.
5:58 pm
5:59 pm
6:00 pm
we are used to watching: watch russian movie channels and cartoons. we continue to follow the developments: the russian military defeated forces of the armed forces of ukraine; our air defense systems shot down a ukrainian mi-8 helicopter and six rockets of himars and tornado multiple-launch systems. learn about the situation at the front: the head of the ministry of defense arrived in the special operation zone; details from denis alekseev. sergei shoigu held a meeting with the commanders of the special military operation groups. the minister of defense was informed about the current situation and plans for further action; special attention was paid to the safe deployment of personnel. it is important to organize the continuous supply of troops with equipment and ammunition; that is precisely the work of the rear units, their main task. "without delay, we work 24/7, without days off, to solve all the problems of the fighters on the front line."



Uploaded by TV Archive on