tv RIK Rossiya 24 RUSSIA24 March 4, 2023 9:30am-10:01am MSK
9:30 am
9:31 am
For most people it is either a joke or a threat: do not believe what you see. But what tools already exist for detecting fakes automatically? 'If you install our technology on your TV channel, you can always determine precisely whether the image has been substituted, whether it is a deepfake or not.' The word 'deepfake' was formed from two terms: deep learning, a branch of machine learning, and fake. Many developers dislike the term for its negative connotation, but this is the future. What opportunities have opened up: you can become any celebrity, any idol, and that face will say whatever you want. But Natalie Portman, like dozens of other
9:32 am
Hollywood stars, experienced the full horror of this new technology for herself when their faces began to be superimposed en masse onto the faces of actresses in 18+ videos of the most explicit content. So what should be done, so that on the one hand technology is not prevented from developing, and on the other hand it cannot be used to abuse people? If you had told me that Jason Statham would celebrate his anniversary in Russia in 2027, I would have laughed. Yet here is the star of the Fast and Furious films: Guy Ritchie and Jason Statham move to a Russian village, and Keanu Reeves wears a T-shirt with the Olympic bear. 'You and I filmed together for five years in Russia
9:33 am
, we were friends.' Thanks to face-swapping technology, its makers call it the first series of its kind in the world. 'When it gets dark, darling, will you light the candles? Now I'll sort everything out.' The doubles were played by actors, and digital masks were then fitted over their faces so that there were no seams or joints. Normally you cannot shoot a profile, and you cannot plan shots that circle a full 360 degrees
9:34 am
around a person, yet in some episodes that turned out to be possible by accident, because the masks were created by neural networks that are constantly retrained for a particular person. For the creators it was important, while making a parody, not to offend anyone's feelings: under the law, the harmless use of the faces of stars is not regulated in any way anywhere in the world. But what opportunities deepfakes have opened up for world cinema. The series was filmed by the Gentlemedia company, which has been working with this technology for several years: the more similar the actor is to the prototype, the more realistic the video. In the West there is already such a practice, and our artists, I think, will soon come to it too: once a year or once every six months, a digital copy
9:35 am
of a person is made, that is, high-quality video filming from all sides. You are already old, already tired, you no longer want to act, but there is a copy of you at thirty or forty years old; it goes to work in the cinema and earns money for you or your heirs. Celebrities are the easiest to copy, and then a businessman-lookalike appears to catch victims, saying, for instance, 'I'm selling the company, we have losses, everything is very bad, the stock is falling,' and at that moment someone makes money on it. Unfortunately, modern technologies are very often mastered first of all by scammers. In Russia several companies work in this field; one of them has patented two inventions whose algorithms analyze a biological feature of a person: the changing complexion.
9:36 am
If the color of the whole face changes evenly from frame to frame, that change coincides with the frequency of the human heartbeat, the pulse; the match, it turns out, is about 80 percent. That the face on the right is synthesized cannot be seen with the naked eye. 'We split the video frame by frame and analyze each frame, which is an exceptional feature compared with other similar solutions, and accordingly we can detect several faces in the frame simultaneously, which is also an exclusive feature of our solution.'
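The pulse check described above can be illustrated with a small sketch. This is only a toy illustration of the general idea of remote photoplethysmography, not the patented Russian system from the report; it assumes a face has already been located in every frame and that its mean green-channel value has been collected over time, and all function names and thresholds are invented.

```python
# Toy remote-photoplethysmography (rPPG) check: real skin shows a faint periodic
# colour change driven by the heartbeat, while a purely synthesized face often
# does not. Hypothetical sketch only; names and thresholds are illustrative.
import numpy as np

def pulse_band_energy(green_means: np.ndarray, fps: float) -> float:
    """green_means: mean green-channel value of the face region, one value per frame.
    Returns the share of spectral energy inside the human heart-rate band (0.7-4 Hz,
    roughly 42-240 beats per minute); a very low share suggests no pulse-like signal."""
    signal = green_means - green_means.mean()            # remove the constant offset
    spectrum = np.abs(np.fft.rfft(signal)) ** 2          # power spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)    # frequency of each bin, Hz
    band = (freqs >= 0.7) & (freqs <= 4.0)
    total = spectrum[1:].sum()                           # ignore the zero-frequency bin
    return float(spectrum[band].sum() / total) if total > 0 else 0.0

# Toy usage: a live face concentrates energy near its pulse rate (here 1.2 Hz, 72 bpm),
# while a synthetic face gives something closer to flat noise.
fps = 30.0
t = np.arange(0, 10, 1.0 / fps)
live_face = 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(len(t))
synthetic_face = 0.1 * np.random.randn(len(t))
print(pulse_band_energy(live_face, fps), pulse_band_energy(synthetic_face, fps))
```

A production detector would of course combine a cue like this with many others, run it per detected face, and feed everything into a trained classifier rather than rely on a single spectral ratio.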
9:37 am
A detector like this was also shown at the Sberbank stand at an artificial-intelligence exhibition, where the president was shown a person very similar to Chancellor Olaf Scholz. 'Tell me, American, where is the power? In money? They say it is in money: you have a lot of money, and so what? I think the power is in truth: whoever has the truth is stronger. You deceived someone, but the truth is behind him, so he is stronger. We wanted to give up Russian gas, but, in the words of a Russian classic, we wanted the best and it turned out as always.' Their detector immediately flagged the fake, which we made ourselves with the help of an actor and neural networks. Sberbank actively uses facial biometrics: employees enter the office via Face ID and customers confirm their identity and transactions the same way, so it was important for the company to prepare in advance for the risks. 'If you install our technology on your TV channel, on the broadcast channel,
9:38 am
you can always determine exactly whether the image has been altered, whether it is a deepfake or not, because our goal is for this to happen automatically, so that no buttons have to be pressed and a light simply comes on to show that there is a risk here.' Today it is difficult for deepfakes to bypass the biometric barrier, but there are already examples of the technology being used for criminal purposes. An accountant at one enterprise received SMS messages from her 'director' asking her to contact him urgently via video link; on the call is, to all appearances, a digital cast of the director, and she is deeply convinced it is him. He says he has just stepped out of a big meeting and that there is an urgent order to transfer $100,000. In this situation the accountant impeccably carries out his instructions, and literally within an hour or two, with the deepest disappointment and horror
9:39 am
she realizes what she has done. Voice deepfakes are becoming even more dangerous. At the end of January, users of a new speech-synthesis program created audio recordings with the voices of stars in which clones of Tom Cruise, George Lucas and others uttered insulting, even racist statements. The files were blocked, but the aftertaste remained. In one of the most shocking audio fakes, Emma Watson, the Hermione of the Harry Potter films, read an excerpt allegedly from Hitler's manifesto. Did she really? Of course not. But how should this be dealt with in general? Let's ask 'Emma' herself. 'I think we should all know the truth. Fake-detection technology and special laws are still very weak. It does not matter at all for what purpose such videos are made or shared on the network; people should know the truth, and such a video
9:40 am
should somehow be labeled.' The technology itself is not to blame for anything. We played this scene together with the actress Yulia Ganatskaya; for her, as for any artist, it is important that her image is protected. 'Such cool technologies have both pros and cons: you work in the frame, you are an actor, you come there and you do not know what emotions to play.' Specialists helped us put on these realistic masks. Sergey Loginov has been working with deepfakes for a long time and is well aware of their potential, and of how great the threat is that the scale of criminal fakes will grow. 'Fakes made precisely to expose some lie, slander or illegal actions, this, of course, should be done, but usually such deepfakes
9:41 am
mislead not by their realism; they are often very unrealistic, and yet it still works. It is, one might say, simply a matter of belief, and this is like a virus and an antivirus: the first is always one step ahead. 'I have never agreed to anything illegal, and there have been plenty of offers, all sorts of identity confirmations on crypto exchanges through a deepfake. Firstly, it is simply impossible to do that with this technology right now. Secondly, people imagine it takes a second, as if it costs five rubles, hand them over and it is all done; well, it is not like that at all.' But British journalists described and showed how easy it is to deceive a voice assistant: one of them called his bank and played from a computer an audio recording synthesized by an easily accessible
9:42 am
program from his own voice. 'What is your reason for calling?' 'To check my account balance.' 'Okay, name your date of birth.' The program did not notice the catch. 'My voice is my password.' The security system again failed to recognize the fake and gave out information about the account balance and recent transactions. Such attacks are being fought in the laboratory of the Speech Technology Center, where they are developing protection the call-center operator can see in action; let's try to call there to see how the technology works. 'Hello, I would like to get into my personal account.' The director of the research department imitates a call to the call center to show how their protection system works; such solutions can be integrated into banks, government services
9:43 am
and television companies. 'The system has solved two problems: first, it identified me as its client, and second, it made sure that I am not trying to deceive it; the anti-spoofing indicator is in the green zone. Now exactly the same request, which I recorded in advance, is played back. By ear a person would not notice the catch, but the system already hears that this is a hoax, that it is a recording of my voice and not a live voice.' To pass voice biometrics, attackers can go various ways: they can record a voice and play it back, they can imitate a person, and they can splice fragments together.
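One of the replay cues such systems can rely on is easy to sketch. The snippet below is a hypothetical illustration, not the Speech Technology Center's actual method (which the report does not detail): audio that has been played through a loudspeaker and re-recorded tends to lose energy at the top of the spectrum, so an unusually quiet high band can hint at a replay. The cutoff and threshold are invented.

```python
# Hypothetical replay-attack cue: measure how much high-frequency energy survives.
import numpy as np

def highband_energy_ratio(samples: np.ndarray, sample_rate: int, cutoff_hz: float = 6000.0) -> float:
    """Fraction of spectral energy above cutoff_hz for one utterance."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    total = spectrum.sum()
    return float(spectrum[freqs >= cutoff_hz].sum() / total) if total > 0 else 0.0

def looks_like_replay(samples: np.ndarray, sample_rate: int, threshold: float = 0.01) -> bool:
    # A real anti-spoofing system combines many such features in a trained classifier;
    # a single ratio with a fixed threshold is shown only to make the idea concrete.
    return highband_energy_ratio(samples, sample_rate) < threshold
```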
9:44 am
Top international companies work on protecting voice biometrics, and Russian companies occupy leading positions at international technology competitions. The developers of biometric systems study all the vulnerabilities of their systems so that it is impossible to get through with someone else's photo. 'Now I will demonstrate the scenario behind the most common myth associated with facial biometrics, namely that someone takes your photo and villains use it to get access to your data, for example at a terminal. I present a photograph to the terminal; it asks me to look into the camera, realizes it is not me, and refuses to let me through.' Biometrics is developing in parallel all over the world, and everywhere it relies on computer vision. The field initially grew out of simple human detection, identifying people by a set of key points familiar to the eye.
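A minimal sketch of one common liveness cue, blinking, is shown below. A printed photograph held up to the camera never blinks, so counting blinks over a few seconds is a cheap first check. This is an assumption about how such a check could work in general, not the demonstrated terminal's actual logic, and it presumes that some facial-landmark library (for example dlib or MediaPipe) already supplies a per-frame eye-openness value.

```python
# Hypothetical blink-based liveness check; thresholds are illustrative only.
import numpy as np

def count_blinks(eye_openness: np.ndarray, closed_threshold: float = 0.2) -> int:
    """eye_openness: one value per frame (e.g. an eye aspect ratio).
    A blink is counted each time the eye goes from open to below the closed threshold."""
    closed = eye_openness < closed_threshold
    return int(np.count_nonzero(closed[1:] & ~closed[:-1]))  # open -> closed transitions

def passes_simple_liveness(eye_openness: np.ndarray, min_blinks: int = 1) -> bool:
    # A static photo produces a flat signal and zero blinks; a live person
    # usually blinks at least once in a few seconds of video.
    return count_blinks(eye_openness) >= min_blinks

# Example: the live sequence dips below the threshold once, the photo never does.
live = np.array([0.35, 0.34, 0.10, 0.33, 0.36])
photo = np.full(5, 0.34)
print(passes_simple_liveness(live), passes_simple_liveness(photo))  # True False
```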
9:45 am
With the development of neural networks, work with them has improved and the systems show truly fantastic results. 'Used positively, it is great: you can make some cool video; remember all those clips with "keep your money in a savings bank." But then again, I would not want to see myself in some video in a strange context, and I would like to have a tool to prove that it is not me, that I am specifically not in this video. Everything is heading toward that, and I think it will be quite accessible and simple.'
9:46 am
At VisionLabs they develop face-detection algorithms, and these, too, are not fooled by photos or videos shown from a phone. 'People often ask whether you can apply some kind of special makeup or something like that. No, that does not work either, because we look at the face in a completely different plane.' The technology also recognizes emotions on a face and estimates age: a smile clearly makes you look younger. These systems can likewise determine the position of a body in the frame, detecting fallen people, people sitting or squatting, a person standing on a chair, raised hands. 'We are developing algorithms that help us identify people fighting, that is, if somewhere in the frame two people are using force against each other, we can
9:47 am
detect this event.' Detection of squatting people was created at a bank's request: to break into an ATM, an attacker has to crouch in front of it to reach the service compartment, so if, for example, a person squats there for more than ten seconds, the system can raise an alarm.
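The ten-second rule just described translates almost directly into code. The sketch below is a hypothetical illustration that assumes an upstream pose model has already labeled every frame as 'squatting' or not; only the ten-second figure comes from the report.

```python
# Hypothetical alarm rule for the ATM scenario described above.
SQUAT_ALARM_SECONDS = 10.0

def squat_alarm(frame_is_squatting, fps: float) -> bool:
    """frame_is_squatting: per-frame booleans from a pose model.
    Returns True once someone has been squatting continuously for longer than the limit."""
    limit = int(SQUAT_ALARM_SECONDS * fps)
    run = 0
    for squatting in frame_is_squatting:
        run = run + 1 if squatting else 0
        if run > limit:
            return True
    return False

# Example: 12 seconds of continuous squatting at 25 frames per second triggers the alarm.
print(squat_alarm([True] * (12 * 25), fps=25.0))  # True
```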
9:48 am
The same platform also detects face substitution, so one of the most famous deepfake videos was uploaded into the system. 'Donald Trump is a total and complete dipshit. You see, I would never say that, at least not in public, but someone else would, someone like Jordan Peele. We are entering dangerous times; moving forward, we need to be more vigilant about what we trust on the internet.' The algorithms have advanced a great deal since then, and that fragment is relatively old; today there are far more powerful substitution technologies. Deepfakes are already destroying reputations with ease: a Pennsylvania woman decided to rid her daughter of rivals on a cheerleading squad and sent the coach fake photos and videos in which the girls smoke vapes and drink. Generated voices and videos are already sending innocent people to prison. When a criminal case is heard, audio or video recordings are now very often used as evidence, and whether they correspond to reality, whether deepfake technology was used, is exactly the problem that now concerns a very large number of judges, and those who work in this field in general, all over the world. Another video was commissioned by an Indian politician himself: he wanted to address voters in English and in real dialects of languages he does not speak. The faces of the actors were brought to life, and
9:49 am
the neural networks took care of the articulation; in this way the politician decided to reach as wide an audience as possible. In California, for example, it is already forbidden to post fakes of politicians in the two months before an election, and in China, since January, deepfakes that damage a person's reputation or threaten the security of the country have been outlawed. Beyond the film industry and advertising, though, this technology has a great and entirely legal future: here is the mermaid, the wood goblin and the learned cat in a picture in the style of Pushkin's Lukomorye. Imagine coming to a literature lesson where a video is played and, for example, Sergei Yesenin himself reads his own poems in his own voice. The same can be done in history lessons, and the same in mathematics, physics and chemistry, and because of this the perception is completely different, especially for modern children. The Russian social network
9:50 am
VKontakte is working on detection as well. 'We are now working on technology that will allow any VKontakte user to take any photo or video content of theirs, or content available on social networks or other resources, upload it to us and check it for manipulation of the face, for the presence of deepfakes in the video. We started developing this technology because we had a large base of deepfakes that we had created with our own deepfake feature, launched a few months ago. The first application was a tool that lets users try on the faces of celebrities, and then a neural network learned to identify fakes on those same clips. To be honest, there have been moments when even we could not tell our deepfake of a real star from an ordinary user, and our users certainly cannot, so first of all we strive to give people a tool that will allow
9:51 am
them to say whether a photo or video is really genuine.' Meanwhile, the algorithms that can create a fake from a single photo keep getting simpler, and the videos more frighteningly realistic. We use the beautiful term 'artificial intelligence', but in fact, of course, it is not that yet; in the future machines will certainly learn to think properly, but that will require an enormous amount of information. We do not stop to think that our phone has already collected biometric data from us: when we set up face unlock, we hand over a full 3D scan of the face from all angles, which means that in theory it can now be reconstructed. We upload our photos to the cloud, shot from different angles and at different hours of the day and night, which means that hypothetically neural networks now have the chance to work out the lighting precisely and create that very cast of the face. ITMO University in
9:52 am
St. Petersburg has developed the Expert service, which could help employers looking for new staff: the system studies a candidate's video presentation according to various parameters. 'This is not a lie detector that will determine for a given question whether a statement is completely true or completely false, but it can form a general impression of a person: that on the video he behaves insecurely, that on the whole he evaded answering the questions. Accordingly, we are unlikely to want to deal with a person who will not be completely sincere with us.' While the development was under way, it turned out that the system also screens out deepfakes. We fed it popular videos, including one in which the actress Jennifer Lawrence has been given someone else's face; the system sees a discrepancy between facial expressions, emotions and text across the different channels. 'With all the ups and downs in,
9:53 am
say, the audio channel, the video channel stayed the same; the face was literally without emotion. And having the information that there was aggression, for example, that the person spoke an emotional text and showed emotion in the voice but at the same time did not move the face, we can say that this is not very typical for a living person, and we already understand what to pay attention to.' The technology can have many other applications: students can run through their answers before an exam, actors can rehearse a role, a student can check a lecturer for competence. 'I would like to check whether it is really worth spending my precious time studying material that may turn out to be useless, given that the specialist himself is not sure of what he is saying.'
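The cross-channel consistency idea described by the ITMO researchers can be sketched roughly as follows, assuming separate models have already produced per-segment emotional-intensity scores for the voice, the spoken text and the face, each on a 0-to-1 scale. The thresholds and the function name are illustrative, not taken from the Expert service.

```python
# Hypothetical cross-modal mismatch check: emotional voice and text, but a flat face,
# is untypical for a live speaker and is one hint that the video may be manipulated.
def inconsistent_segments(voice, text, face, high=0.6, low=0.2):
    """voice, text, face: per-segment emotion intensities in [0, 1].
    Returns indices of segments where voice and text are emotional but the face stays flat."""
    flagged = []
    for i, (v, t, f) in enumerate(zip(voice, text, face)):
        if v >= high and t >= high and f <= low:
            flagged.append(i)
    return flagged

# Example: the second segment sounds aggressive but the face barely moves.
print(inconsistent_segments([0.3, 0.8], [0.4, 0.7], [0.3, 0.1]))  # [1]
```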
9:54 am
The task of the technology is precisely to say whether this expert should be believed or not. There are narrower applications too, forensic examination and so on, or traders on the stock exchange, for whom it is important to know where to invest today and tomorrow. They listen to the opinions of authoritative people, and against that background so-called information gypsies appear: they quickly read up on something, start talking, happen to get a forecast right, and their authority grows. They earn something for themselves, and the fact that other people will lose money on it does not seem to matter to them. This technology is meant for exactly that: the system is designed to test expert opinion, whom to trust and what would be worth double-checking. The era of artificial intelligence demands critical thinking more than ever, and deepfakes, even the most innocent ones, need to be labeled; the whole world, it seems, is leaning toward this.
9:56 am
9:58 am
The Fan Card is a digital service for safe and comfortable attendance at football matches. The card number is needed to buy tickets and season tickets for Russian Premier League games and for the final of the Russian Football Cup. You can apply for one on the public services portal or in its mobile app, and at the stadiums of all RPL clubs in twelve cities. Get a Fan Card and cheer for football at the stadium.
9:59 am
The head of the Ministry of Defense inspected the Vostok grouping of troops in the South Donetsk direction of the special operation. Commanders presented reports to the minister, who paid special attention to the supply and accommodation of personnel and to the work of medical and logistics units. In addition, Sergei Shoigu presented fighters with Orders of Courage, insignia of the St. George Cross and medals For Courage.
10:00 am
A Moscow resident has been detained on suspicion of providing financial assistance to the Ukrainian armed forces; she was detained as she was preparing to leave the country. According to the FSB Public Relations Center, the operation was carried out jointly with the Ministry of Internal Affairs, and the Lefortovo court ordered two months of pre-trial detention under the article on state treason, which carries up to 20 years in prison and a large fine. The five-year-old story of the Skripal poisoning was invented by Britain to demonize Russia, the Russian embassy in London said; the British authorities are hiding the results of the investigation and stubbornly blame Moscow for everything, and a note has been sent to the British Foreign Office demanding clarification of the state of health and whereabouts of Sergei and Yulia Skripal. In the west of Moscow, on Lobachevsky Street, scaffolding collapsed; one person died and two were injured. Work is now under way to dismantle the structure, and according to the Ministry of Emergency Situations, people may still be under the rubble.