
PODKAST, 1TV, August 30, 2024, 1:00am-1:45am MSK

1:00 am
Kandinsky already does some things better than its analogues, although "analogues" is the wrong word; better than other neural networks that solve the same task of generating images from text. In what way are we cooler? For one thing, Kandinsky now generates video: we recently presented a model that generates video, called Kandinsky Video (although it probably should have been called something like Tarkovsky), and we will continue that line. What else does it do better? It knows the domestic cultural code better, of course; that is the task we actually set for the model, and we will certainly keep developing it. What is the problem with Russian data, with knowledge of the Russian domain? Simply that there is less of it. What does the model learn from? It learns from data on the internet, that is, from openly available image-description pairs.
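For readers who want to try the idea themselves, here is a minimal sketch of text-to-image generation with one of the publicly released Kandinsky checkpoints via the Hugging Face diffusers library; the model id, pipeline class, and arguments are assumptions that can differ between library versions, so treat this as illustrative rather than the team's own code.

```python
# Minimal sketch, assuming the "kandinsky-community/kandinsky-3" checkpoint on the
# Hugging Face Hub, a recent diffusers version, and a CUDA GPU are available.
import torch
from diffusers import AutoPipelineForText2Image

pipe = AutoPipelineForText2Image.from_pretrained(
    "kandinsky-community/kandinsky-3",   # assumed model id
    torch_dtype=torch.float16,
).to("cuda")

# One prompt in, one image out; the text is the only instruction the model gets.
image = pipe("a pug in a spacesuit floating in space", num_inference_steps=25).images[0]
image.save("pug_in_space.png")
```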
1:01 am
So does it grab data at the moment of the request? No. Naturally there is a team that deals with data; everything the model contains is already there, it does not search the internet at that moment, in no case. Where is all of this stored? It really is a huge amount of data: Kandinsky 3 was trained on one and a half billion image-text pairs, and you can imagine how many terabytes of storage that needs. The model is trained on a supercomputer, and the data is stored nearby in special storage systems called S3, very large data stores. For the model to see the dataset, the data has to be put there, and not just put there: you have to remove all the low-quality pairs, that is, filter them, because if you simply take what is on the internet, a good half of it is junk.
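One common way such filtering is done in practice (a hedged sketch, not necessarily the team's pipeline) is to score each image-caption pair with CLIP and keep only pairs whose similarity clears a threshold; the threshold value here is an arbitrary assumption.

```python
# Sketch: keep an image-text pair only if the caption actually matches the image.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def keep_pair(image: Image.Image, caption: str, threshold: float = 0.25) -> bool:
    """Return True if the caption describes the image well enough to train on."""
    inputs = processor(text=[caption], images=image, return_tensors="pt", padding=True)
    with torch.no_grad():
        out = model(**inputs)
    # cosine similarity between the image embedding and the text embedding
    sim = torch.nn.functional.cosine_similarity(out.image_embeds, out.text_embeds).item()
    return sim > threshold
```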
1:02 am
All of that has to be thrown out before the data is given to the model, because of course you cannot feed it everything indiscriminately; it is like showing a child everything in a row, and then it is not clear what he will learn in the end. A dedicated data team is responsible for this, and it is a hugely important activity, because not just half but more than half of the overall success depends on it. And once the data is prepared and sitting in this S3 storage,
1:03 am
you can actually train the model. Training the model is just that iterative pass over the data, changing the huge number of parameters I talked about, those billions of parameters that need to be adjusted so that when a new description appears at the input, the picture matches it. We have a video that was created by Kandinsky, but before watching it, one question: are we people too conser...
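To make "iteratively viewing data and adjusting parameters" concrete, here is one schematic training step for a text-conditioned diffusion model; unet, text_encoder, noise_scheduler and optimizer are placeholder objects, and the real Kandinsky training code is certainly more involved than this sketch.

```python
# Schematic sketch of one optimization step; all module objects are assumed placeholders.
import torch
import torch.nn.functional as F

def training_step(unet, text_encoder, noise_scheduler, images, captions, optimizer):
    """One illustrative step: learn to predict the noise that was added to an image."""
    cond = text_encoder(captions)                          # text -> conditioning vectors
    noise = torch.randn_like(images)                       # the noise we will try to predict
    timesteps = torch.randint(0, 1000, (images.shape[0],), device=images.device)
    noisy_images = noise_scheduler.add_noise(images, noise, timesteps)
    noise_pred = unet(noisy_images, timesteps, cond)       # the network's guess
    loss = F.mse_loss(noise_pred, noise)                   # drives the billions of parameters
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```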
1:04 am
Take photo editors. Lately it has become very popular to build a photo editor on top of this, or to rethink existing photo editors like Photoshop by integrating AI into them. Why is this useful? Because it is much, much easier to create and edit pictures from text than to draw everything yourself, which is what people always did before and which took a huge amount of time. Now it is enough to write it in text and the picture will be generated; if you do not like some area, you mask it out and the model redraws it.
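That "mask an area and redraw it" workflow is usually called inpainting. Below is a hedged sketch with a publicly released Kandinsky inpainting checkpoint via diffusers; the model id and the mask convention (white pixels are repainted) are assumptions that may vary with the library version, and the file names are placeholders.

```python
# Sketch: repaint only the masked region of an existing picture according to a prompt.
import torch
from PIL import Image
from diffusers import AutoPipelineForInpainting

pipe = AutoPipelineForInpainting.from_pretrained(
    "kandinsky-community/kandinsky-2-2-decoder-inpaint",   # assumed model id
    torch_dtype=torch.float16,
).to("cuda")

image = Image.open("room.png")   # placeholder: the picture being edited
mask = Image.open("mask.png")    # placeholder: white where the region should be redrawn

edited = pipe(prompt="a green armchair by the window", image=image, mask_image=mask).images[0]
edited.save("room_edited.png")
```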
1:05 am
If you automate that, it will only save people's time, and it will actually raise the quality of the content too. Take animation, for example: how much routine work there is in it. And animation is not even full video creation; think how many films are shot, and they can take years to shoot, while now it can all happen faster. To be honest, the technology has not yet grown to the point of creating full-length films, but the progress is striking. We have been working for about three years on neural networks that generate images from text, and if you look at how even the best neural networks created images three years ago, you will see it was very, very bad. Let's watch a video. Let's watch it, then your questions, yeah.
1:06 am
What is this, Denis, can you tell us? I will comment. This is still animation, not a full-fledged video. Kandinsky did this, yes. How does it do it? It draws the first frame from the text, and then you choose the camera movement. How does animation differ from video here? Animation is a camera flyby around a static object, or some other camera motion, while video is full motion of everything in the scene. This is still animation, because there is a first frame and then we choose a camera movement, a monotonous one such as a zoom-in: we draw the first picture from the prompt, pick the zoom-in, and without creating a new model, based only on the model that generates an image from text, we can produce animations like this. Shall we tell people where to find it? Yes, there is actually a Telegram bot in which you can create animations like this, so anyone can do it.
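The "first frame plus a monotonous zoom-in" idea can be faked with nothing but image cropping, which is a useful way to picture what is described above; this sketch is not Kandinsky Video, just a toy zoom over a single generated frame, and it assumes the pug_in_space.png file from the earlier sketch exists.

```python
# Sketch: simulate a camera zoom-in over one static frame and save it as a GIF.
from PIL import Image

def zoom_in_animation(frame: Image.Image, steps: int = 60, max_zoom: float = 1.8):
    """Return a list of frames that progressively crop and enlarge the center."""
    w, h = frame.size
    frames = []
    for i in range(steps):
        zoom = 1.0 + (max_zoom - 1.0) * i / (steps - 1)
        cw, ch = int(w / zoom), int(h / zoom)
        left, top = (w - cw) // 2, (h - ch) // 2
        crop = frame.crop((left, top, left + cw, top + ch))
        frames.append(crop.resize((w, h), Image.LANCZOS))
    return frames

frames = zoom_in_animation(Image.open("pug_in_space.png"))
frames[0].save("zoom.gif", save_all=True, append_images=frames[1:], duration=50, loop=0)
```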
1:07 am
And it copes with complex tasks; the task in that clip was, in my opinion, quite complex, though I honestly do not remember the exact prompts. Does it need much memory on my side? No, everything is created in the cloud, on a supercomputer. A supercomputer is an interesting topic in itself: both training a model and applying it, which is called inference, require computing power. The model is huge, after all; roughly speaking, it will not run on an iPhone or your smartphone, although progress is moving in that direction too. At the moment Kandinsky runs on the Christofari supercomputer, our supercomputer, on which we train both the language models and these generative models that turn text into pictures and video, and that is also where we serve them. When you make a request, it flies off to the supercomputer and is processed there; that is where Kandinsky lives and makes a picture out of your text.
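The user-facing side of such a setup can be very thin. The sketch below is a generic Telegram bot, not the actual Kandinsky bot: it just forwards the user's text to a backend and sends the picture back; generate_remotely is a hypothetical placeholder for the call to the GPU cluster, and the token is obviously a placeholder too.

```python
# Sketch with python-telegram-bot (v20+): prompt in, generated picture out.
from telegram import Update
from telegram.ext import ApplicationBuilder, ContextTypes, MessageHandler, filters

async def generate_remotely(prompt: str) -> bytes:
    """Hypothetical placeholder: send the prompt to the inference cluster, get PNG bytes."""
    raise NotImplementedError

async def handle_prompt(update: Update, context: ContextTypes.DEFAULT_TYPE) -> None:
    prompt = update.message.text
    png_bytes = await generate_remotely(prompt)        # the request "flies to the supercomputer"
    await update.message.reply_photo(photo=png_bytes)  # the picture comes back to the user

app = ApplicationBuilder().token("YOUR_BOT_TOKEN").build()
app.add_handler(MessageHandler(filters.TEXT & ~filters.COMMAND, handle_prompt))
app.run_polling()
```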
1:08 am
The picture then comes back to you in the Telegram bot, and in the bot it takes a certain number of iterations: I go into the bot and say what picture I want. What if a million people come at once? A supercomputer contains not one compute node but many, and we use scaling methods, but if a million people really do come at the same time there will be a queue; that is inevitable. In that sense doing inference on the device, on a smartphone, would be much better, because you do not stand in a queue, you run it right there. Still, we try to keep a lot of resources on the supercomputer so that the queue is served and there are not, say, a lot of people waiting at the same time. I wonder, do you feel that competitors are watching you? We feel it, yes.
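The queueing behaviour described here can be modelled in a few lines: a fixed pool of workers (standing in for GPU nodes) drains a shared queue, so simultaneous users simply wait their turn. This is a toy sketch, and run_inference is a stand-in for the real model call.

```python
# Toy model of request queueing in front of a fixed pool of "GPU" workers.
import asyncio

async def run_inference(prompt: str) -> bytes:
    """Placeholder for a call into the generative model on one node."""
    await asyncio.sleep(1)              # simulate generation time
    return b"...png bytes..."

async def gpu_worker(queue: asyncio.Queue) -> None:
    while True:
        prompt, reply = await queue.get()
        reply.set_result(await run_inference(prompt))
        queue.task_done()

async def submit(queue: asyncio.Queue, prompt: str) -> bytes:
    reply = asyncio.get_running_loop().create_future()
    await queue.put((prompt, reply))
    return await reply                  # the caller waits in the queue for its turn

async def main() -> None:
    queue: asyncio.Queue = asyncio.Queue()
    for _ in range(8):                  # pretend the cluster exposes 8 workers
        asyncio.create_task(gpu_worker(queue))
    images = await asyncio.gather(*(submit(queue, f"prompt {i}") for i in range(20)))
    print(f"served {len(images)} requests")

asyncio.run(main())
```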
1:09 am
You feel that attention on yourselves, and since you are being watched, it means you are doing something that apparently causes them, I don't know, concern. Everyone who does anything has competitors, frankly speaking, and artificial intelligence is no exception; the question is simply how far ahead of your competitors you are, it is a competitive struggle, so to speak. You read the news in the morning or the evening in any case, and you probably note to yourself whether anything has happened that affects who has overtaken whom at this turn? That is true, of course. The field of building large models requires a huge amount of computing resources, data and specialists who will actually train all this, but in principle the big techs certainly have such teams, and they
1:10 am
are developing these models. For example, Yandex has Shedevrum, and the guys there are making fairly similar technology; the only thing is that they do not have video generation, and today we do. About Leo Tolstoy, whom Roman mentioned at the beginning: everything by Leo Tolstoy is in there, and he did not leave it to anyone as an inheritance, it is an open story. So when you load data into the supercomputer, do you license it, or is it public domain, or is that not necessary? Well, it is not exactly public domain; rather, it is publicly available on the internet. And once we have downloaded it, we keep the whole dataset to ourselves, in a folder; we do not post it anywhere.
1:11 am
1:12 am
Well, there was some participation in the creation of the final picture, they made their contribution, but
1:13 am
so far the story is this: by and large each company handles it in its own way, but if a person pays for using a neural network and gets a picture, then most likely he is considered the author; and if someone else's motifs can be recognized in what was created, that is usually formalized separately. A model that is not published and is not freely available is one situation; with Kandinsky it is the opposite story, since one of our first
1:14 am
missions, as I already said, is to move the community as a whole and move science forward, so we post the model in open access: you can use the model without paying the company any money and deploy it on your own. And why create a model and not keep it to yourself? That is so that... Today our guest is the person who created the Kandinsky neural network, Denis Dimitrov, together with Elena Kiper and Roman Karmanov, with you as always. I wonder: if I generated a masterpiece (a masterpiece, or just a picture the way I want it; in a masterpiece you always have to put in something of your own), and then I
1:15 am
send it to someone and it ends up in their hands, can they open it and see all the data, how the iterations went, or is all of that closed off? How the generation actually took place you cannot really see; even a specialist (and I may not be one) cannot tell from the result what data it was created from, that is impossible, yes. But think of how an artist works: some white canvas, well, not necessarily white, just white paper, some empty sheet, a napkin, or a tablet screen. He surely has some idea, he starts sketching out details with a pen, a stylus, whatever, adds details, maybe removes something; and a person also has patterns in his head, things he has remembered, patterns that appear after training.
1:16 am
So step by step he simply creates a picture, and in principle you can draw an analogy with how the Kandinsky model creates one. It begins to draw a picture from so-called white noise. What is that? It is like when there is no signal on the TV, that kind of interference, just a matrix of random values. Naturally there is also your request: the idea that in a person is expressed by neuron signals is, in a neural network, expressed simply by the numbers into which the text is encoded. And from this noise matrix, step by step, in a process of removing noise in the right direction, an image is created. You can actually visualize this creation of an image out of white noise: there are, say, 100 steps, during which the final picture emerges out of that noise and gradually manifests itself, so to speak. For example, you want to make a pug in space; you always have white noise at the beginning, and step by step
1:17 am
you get a pug in space at the end of the process. If the noise is a little different (and the noise can be anything you like: swap two pixels and it is already different), you get a different picture, but still a pug in space; in some sense this matrix is responsible for the model's imagination, and all of it can be visualized. This process is called denoising, or noise removal, and for it to happen correctly we need to show the neural network those 1.5 billion image-text pairs, so that it learns to connect text and image.
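The denoising loop just described has a very compact shape in code. This is a schematic sketch: the scheduler calls follow diffusers conventions, but unet, scheduler and text_embedding are placeholder objects and the call signature of the network is an assumption.

```python
# Schematic reverse-diffusion loop: white noise in, image latents out.
import torch

@torch.no_grad()
def sample(unet, scheduler, text_embedding, steps: int = 100, shape=(1, 4, 64, 64)):
    latents = torch.randn(shape)                     # the "white noise", like TV static
    scheduler.set_timesteps(steps)                   # e.g. 100 denoising steps
    for t in scheduler.timesteps:
        noise_pred = unet(latents, t, text_embedding)                 # guess the remaining noise
        latents = scheduler.step(noise_pred, t, latents).prev_sample  # remove a little of it
    return latents                                   # decode with a VAE to get pixels
```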
1:18 am
Could it draw images, let's say, straight onto some canvas, a digital one of course, from what is in your head? In fact there is research on this. I would not say there is a lot of practice yet or that it is actively used, but there are studies that allow decoding brain signals. You know, they put a reader of electrical signals on the head, so that without words it can, roughly speaking, understand what you want; or you close your eyes and imagine something, I don't know, a palm tree for example, and the picture can be decoded purely from the brain signals at the moment you imagine it, that is, from what you actually imagined. Neural networks do this too, other ones: they make the same kind of picture not from text but from these time series of brain signals. In general, if you were connected to
1:19 am
such electrodes, you really can train a model that understands you without words and draws a picture, if a picture is what you need. The limitation of these models is that, for now, you still need to train a separate model for each person: when different people imagine a palm tree, their signals are slightly different, so one model cannot yet be trained for everyone, and training a model for each person is expensive. And you need a supercomputer for each person? You need a supercomputer, so for now there is no universal thing that understands everyone.
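A purely conceptual sketch of the "one model per person" point: fit a separate linear decoder per subject from brain-signal features to image embeddings, which a generative model could then render. The file names, feature shapes and the choice of ridge regression are all placeholder assumptions, not a description of any real study.

```python
# Conceptual sketch only: a per-subject decoder from brain signals to image embeddings.
import numpy as np
from sklearn.linear_model import Ridge

# X: (n_trials, n_signal_features) recorded while this subject imagines pictures
# Y: (n_trials, embedding_dim) embeddings of the pictures being imagined
X = np.load("subject01_signals.npy")            # placeholder data
Y = np.load("subject01_image_embeddings.npy")   # placeholder data

decoder = Ridge(alpha=1.0).fit(X, Y)            # one decoder trained per subject
predicted_embedding = decoder.predict(X[:1])    # could be fed to an image generator
```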
1:20 am
All the same, and this is probably easiest to trace with language models. You know, when GPT came out, these problems were especially visible: people asked it which is heavier, a kilogram of lead or a kilogram of fluff. Lead is associated with the word heavy and fluff with the opposite, which is what drives the answer, and of course,
1:21 am
it answers that a kilogram of lead is heavier and a kilogram of fluff is lighter, yeah, and to top it off explains why, in three whole paragraphs. Why is that dangerous? Because a schoolchild, for example, who looks at this...
1:22 am
Yes, it hallucinates. And you see, I hold you as a prompt in my head; to me you look exactly like that too. I will actually tell you how we get photorealistic portraits. We have a team that deals with fake detection; you know, this is a particularly pressing issue right now, because with the development of generative models you can produce a pile of fakes and post them on the internet, and that can bring huge losses to companies and states, and reputational costs too. So we need to distinguish what was created by a neural network from what was not, and within this fake-detection work we decided to build a model that generates fakes. Why? So that, on the dataset which
1:23 am
such a fake-generating model produces, we could finish training the detection model. So as part of this work on detecting fakes we made a model for generating fakes and applied it here; that is how the model creates your face: first there is a model that describes you. Even if we did not quite understand everything in the end, there is still nothing to be afraid of.
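The detector side of that recipe is a standard supervised fine-tune: label generated images as "fake", real photographs as "real", and train a classifier on the mix. This is a generic sketch rather than the team's pipeline, and the dataset folder name is a placeholder.

```python
# Sketch: fine-tune a standard classifier to separate generated images from real ones.
# Assumes a folder "detector_dataset/" with subfolders "fake/" and "real/".
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

tf = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
data = datasets.ImageFolder("detector_dataset", transform=tf)
loader = DataLoader(data, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)     # two classes: fake vs. real
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:                     # one illustrative epoch
    opt.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    opt.step()
```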
1:24 am
Look at what nice, smart people are doing the very things that you and I are for some reason afraid of; there is no need to be afraid, since this is where our country's competitiveness lies, and it is in the safe hands of people like Denis. Denis, thank you very much; we are waiting for the next stage, when the antennas on our heads will explain everything to us. The analog versions of the hosts were Elena Kiper, producer and music video maker, and Roman Karmanov, CEO of the Presidential Fund for Cultural Initiatives. This was a podcast about the creative industries on Channel One. See you. Hello, I'm Dmitry. We have another episode of the literary podcast Let Them Not Talk, Let Them Read. We talk about very different writers, about
1:25 am
classics and about our contemporaries, but today we are talking about a person who passed away quite a long time ago, yet his prose, his stories, his larger works are still very relevant, still debated, and their titles are still remembered: stories such as Nervous People, Banya, and many others. You have certainly guessed that we are talking about Mikhail Mikhailovich Zoshchenko; he would have turned 130 this year, on August 9. We will talk about him today with Vera Mikhailovna Zoshchenko, the great-granddaughter of Mikhail Mikhailovich, an actress and TV presenter (hello, Vera Mikhailovna; hello), and also with Maria Alexandrovna Kotova, head of the scientific...
1:26 am
Let's start the conversation with this: Zoshchenko flourished in the twenties, gained fame and popularity, and his life became overgrown with legendary stories; the twenties were such a turning point. Is there anything special about Mikhail Mikhailovich against the background of other Russian writers of that time? When I speak with students I always introduce this criterion: did the person manage to go abroad before the revolution, to study abroad, or did the world war and the revolution catch up with him already in his early youth? That is how it was with Zoshchenko against the background of, say, Bulgakov, or Pilnyak, or even Ivanov. Zoshchenko, like Bulgakov, did not manage to go abroad, and it seems that Zoshchenko, like Bulgakov, never emigrated; he never went anywhere. The revolution, it seems to me, was the determining factor in Zoshchenko's literary development, and he counted his literary path from 1921.
1:27 am
Let's note that, because this was a turning point, of course. Yes, indeed, some Russian writers managed to fight on both sides, to take part in the Civil War, but Zoshchenko, straight from the walls of the law faculty of St. Petersburg University, found himself in the thick of things, and for him, in general, there was no particular choice. So he immediately accepted the revolution? Can you say so, or is that still too harsh? I think, if we compare him with the same Bulgakov, we can say
1:28 am
that...
1:29 am
Here the question is more complicated. You mentioned, of course, the story Before Sunrise, and we will talk a lot about that tragic, dramatic side, but for now let's talk about the early Zoshchenko. Let me ask you: in your family, what did they say about Mikhail Mikhailovich, what traditions were there, what stories, perhaps, that are unknown from his texts? After all, it is not every day that you find yourself sitting opposite the great-granddaughter of Mikhail Mikhailovich, even if the answer turns out to be none, none, none. Well, let's look at the photograph: this is my dad, an officer, a submariner; this is the son of Mikhail Mikhailovich, my grandfather, Valery Mikhailovich Zoshchenko. I was named in honor of Vera Vladimirovna, the beautiful wife of Mikhail Mikhailovich, and you see, I was lucky; it could have been Christina or Ioanna. Lucky that the family quarreled and dad and mom decided to make peace in this way. Actually, I lied a little; the thing
1:30 am
is that... I thought so, I was simply sure. In general these are not exactly stories; it is something the family tried to preserve, and it is also the answer to your first question, how Mikhail Mikhailovich differs from other writers of the same time, of whom there were many: talented humorists, satirists, philosophers. Mikhail Mikhailovich is distinguished by dignity and respect. Now, the Serapion Brothers: the association, as is commonly believed, arose in February of 1921, that same year 1921, and it included very different writers: Vsevolod Ivanov, Konstantin Fedin, Mikhail Slonimsky, Lev Lunts, Veniamin Kaverin. Some had a rather happy fate; Kaverin, for example, everyone knows as the author of Two Captains, although he is much broader than just
1:31 am
the author of that novel. But still, Masha, how could this happen? I still do not have a very clear picture of it. Something is growing in the country that will later lead to that terrible Zhdanov decree of 1946; a dictatorship over literature is growing, a social order, literature must serve ideology. Many succumb to this, many perceive it as something natural; Mayakovsky founds the magazine LEF, and for him too this will end in tragedy. But the Serapion Brothers are something completely opposite, something that went against the ideology. How could that be, given that it was still only 1921?
1:32 am
That the Serapion Brothers were defined by a commitment to freedom of creativity. Freedom of creativity was steadily shrinking, right up to 1934, to the First Congress of Soviet Writers, and these people for the most part preserved it, because, despite
1:33 am
all the differences in their biographies, they became major writers: not only Mikhail Mikhailovich but also Vsevolod Ivanov, and Fedin too, especially in his earlier period. Then a completely different period came, but that is a separate, important story. Well, all right: in 1921 another important event took place, one that may sound banal but has not been worn out by repetition, the Tenth Congress of the RCP(b). Now, probably fortunately, and maybe even definitely fortunately, few of our interlocutors on the other side of the screen or headphones (our podcast comes out in different formats) know the history of the Party, which congresses there were and when. Well, I remember that one, I remember what happened there: in the language of standard formulas, it was the introduction of the New Economic Policy, and several
1:34 am
utopias were cancelled. In 1917 and 1918 it seemed there would be a world revolution, a march on Paris, on Europe; there was a civil war. It seemed there would be a new culture, Proletkult, and clearly little came of that either; it seemed there would be a new economy, and little came of that: hyperinflation, devastation. All the same, it seems to me that Mikhail Mikhailovich finds his hero at this time. What do you think, Mikhailovna: precisely after 1921, not between 1917 and 1921, when the revolution was on the rise and swept away the Constituent Assembly? In 1921 something changed, because when the NEP appeared, new types of people appeared, and Zoshchenko is very sensitive to new types of people, not to people in general. By 1921 he had found, felt, I would say, that current in which there is some kind of load-bearing
1:35 am
line. I think, yes, it started from there; but in general, it seems to me, in each story, when you read it, you do not need to...
1:36 am
A science-fiction finale tomorrow on the First. You need to study a wide range of people, and that takes time; and the agent must be neutralized as soon as possible. There is no other way out; the best chance is to meet him somewhere in a restaurant or a bar... Now this is interesting: you yourself are from the NKVD? No, from the KGB. By the way, it would be good if you managed to put him to bed. I am the only one here who knows about this, no one else. The legendary multi-part film based on the novel by Yulian Semyonov, this weekend on the First. So, in 1921, what happened? Well, there are many who are dissatisfied: what did they fight for, if around them again there are NEPmen, the slogan
1:37 am
get rich, when trading firms appear again, in Moscow for example the one that is now TsUM, and so on and so forth; that is, new types of people appear. How to put it: an aristocrat, yes? Russia was a country of six estates, an empire: the nobles, the clergy, the merchants, the peasants, serfs until 1861.
1:38 am
His stories tell about these proletarians, working people, citizens, almost like poems, and it seems to me that the NEPmen served as a kind of background for Bulgakov, because this was an era in which the new proletarian worker was completely confused between the old culture and the new. You take a lady to the theater and have to buy her a cake, but she wants four; she wants four in the story, if I am not mistaken, The Delights of Culture, yeah. The proletarian goes to the theater, he has a ticket, but
1:39 am
it turns out he needs to pay 20 kopecks to the cloakroom attendant, and he does not have them, so they will not let him in, while all around are these NEPmen who easily pull millions out of their pockets. That is the secret of the time: on the one hand you are a proletarian, relatively speaking a simple person, and there is the dictatorship of the proletariat, supposedly the main class now (not the estate of the nobility any more), and yet here is a proletarian who is broken.
1:40 am
1:41 am
one of the most beloved writers; and Dostoevsky, and everyone, came out of Gogol, yes, that is all true, but this is a different little man. Let's come back to Mikhail Mikhailovich and announce the continuation of our conversation, which will be in the second part of the program, and it will concern, well, not exactly a new Zoshchenko, not exactly another
1:42 am
Zoshchenko, but still a person who made very significant discoveries for himself, well, now, like you, of course.
1:43 am
who loves literature, is one of the pillars: a theater critic who discovered Ostrovsky, a poet and, most importantly, the creator of the concept of organic criticism. Contemporaries at this time are looking in very different directions: someone is aimed forward, someone seeks out and catches those momentary changes that signal major changes, someone looks to the past and strives, in the past, like Alexander Blok, to catch the eternal, to catch the music of the revolution, to catch epochal events. That book is absolutely important, and Apollon Grigoriev, who for everyone who loves Russian literature now stands on a certain shelf, got onto that shelf only
1:44 am
after that book was published in 1916. Well, now let's go...
1:45 am
Not a single one of today's popular figures has the scale he had then, right up to that famous, almost mythological story; put several things together: when the tram passed by his house, the stop Zodchego Rossi was often announced, with a slip of the tongue, as the stop Zoshchenko Rossi. And truly, without any internet, the whole country knew him; photographs of him were sold everywhere. That is just how it was.
