tv Shift Deutsche Welle June 11, 2023 9:15am-9:31am CEST
9:15 am
the ukrainian president has confirmed that counteroffensive operations against russian forces are underway. intense fighting is being reported in the zaporizhzhia region, though kyiv is keeping most details under wraps. coming up next on dw: scientists are training ai to interpret brain waves. will the technology soon be able to read our minds? find out next on shift. monica jones for me and the news team here in berlin, thanks so much. the guardians of truth. "my name is can dündar, and i have paid almost every price of being a journalist in a country like turkey." taking on the powers that be, they risk everything.
9:16 am
can dündar. activists, journalists and politicians. "an anxiety, too much on my shoulders. but i have to hold this weight, because i'm responsible for the future of our country, for the people who are behind bars for their mission. people need to know what is happening there." our series guardians of truth — watch now on youtube: dw documentary. will ai read our minds soon? scientists have already trained an ai to interpret your brain waves. also: experts argue that popular beauty filters are potentially dangerous, to the point of making users sick. we take a look at snapchat dysmorphia and how it comes about. and why does meta share user data with us intelligence agencies, and who else demands user information from
9:17 am
big tech companies? these are the topics that have moved the tech world this week. ai can pick your brain and recreate images you have looked at, or stories you have listened to, all by interpreting your brain waves. those are the findings of scientists in japan and the us. will ai in the future fulfill our every wish as soon as we think it? and how do we get the best results out of ai tools? at osaka university in japan, participants in a study looked at thousands of pictures and had their brain scans recorded. those scans were used to train a decoder model: it learned which scan represents which picture. later, the model was able to recreate images just by studying the brain scans of participants while they were looking at the images.
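To make the decoder idea a bit more concrete, here is a minimal Python sketch of the general technique: learn a regression from brain-scan features to image embeddings, which a generative model could then turn back into pictures. The random stand-in data, the array sizes and the simple ridge regression are illustrative assumptions, not the actual pipeline used in the Japanese or US studies.

```python
# Minimal sketch of the brain-decoding idea: learn a linear mapping from
# brain-scan features to an image-embedding space. The real pipelines are far
# more elaborate; the data below is random stand-in data.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_trials, n_voxels, embed_dim = 1200, 2000, 256      # hypothetical sizes
X = rng.normal(size=(n_trials, n_voxels))             # stand-in fMRI responses, one row per viewed picture
Y = rng.normal(size=(n_trials, embed_dim))            # stand-in embeddings of the pictures that were shown

X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.2, random_state=0)

decoder = Ridge(alpha=1000.0)   # heavy regularisation: many more voxels than trials
decoder.fit(X_train, Y_train)

predicted_embeddings = decoder.predict(X_test)
# In the published work, predicted embeddings like these condition a generative
# image model, which then renders a picture resembling what the person saw.
print(predicted_embeddings.shape)
```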
9:18 am
and it doesn't just work with pictures. researchers in the us trained a model using brain scans of people listening to podcasts, and the model then managed to generate text somewhat similar to what they had listened to. impressive. but so far it only works if the model is trained on your individual brain activity. and if you don't want to cooperate, you can stop this: think about something else and you create different brain waves that way. so this won't fulfill our dreams anytime soon. in order to communicate with an ai and make it work for us, brain waves won't help right now. but ai can already help you a lot if you feed it with the right prompts. how to prompt? there is no secret magical prompting dictionary — ai systems are constantly learning and evolving. still, here is how you get better results. prompt in english: even though a lot of ai models speak several languages, they have received
9:19 am
a lot of their training in english. ask the ai what it needs to fulfill a task. you can also ask the model to write prompts for another ai — an image generator, for example. use chain-of-thought prompting: give the ai an example of how you want it to solve a task, or a step-by-step description of the thoughts you want it to include — there's a small sketch of such a prompt after this segment. this way you might be able to get better results. but always keep in mind that a lot of models have flaws. which flaws to keep in mind? ai knows only what it's been told: chatgpt was only trained with text published before september 2021, so it can't relate to recent events. ai is biased, because its training data often is. prejudice against women or minorities like the lgbtq community finds its way from the training data into the results. when an image generator such as midjourney is asked to create pictures of a doctor, it will be male; a cleaning person, instead, will be female. so always check whether its results are offensive.
9:20 am
ai doesn't keep your data safe. samsung employees asked chatgpt to fix and improve code and, by accident, uploaded corporate secrets that could now be included in the chatbot's future responses. the same applies to your personal data: in many cases, your input is stored and used to further train these models. are you using ai tools in your everyday life already? and what's your experience with prompting?
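As a concrete illustration of the chain-of-thought tip mentioned above, here is a small Python sketch that builds such a prompt. The worked example inside the prompt and the call_model placeholder are assumptions for illustration; wire call_model up to whichever chat API you actually use.

```python
# Small sketch of chain-of-thought prompting: show the model one worked example
# with explicit intermediate reasoning, then ask the real question in the same
# format. call_model() is a placeholder, not a real API.

def call_model(prompt: str) -> str:
    """Hypothetical stand-in for a chat-completion API call."""
    raise NotImplementedError("Connect this to your preferred language model.")

def chain_of_thought_prompt(question: str) -> str:
    # One worked example whose step-by-step "Thought" the model should imitate.
    example = (
        "Q: A shop sells 3 notebooks for 4 euros each and 2 pens for 1 euro each. "
        "What is the total?\n"
        "Thought: Notebooks cost 3 * 4 = 12 euros. Pens cost 2 * 1 = 2 euros. "
        "Together that is 12 + 2 = 14 euros.\n"
        "A: 14 euros.\n\n"
    )
    return example + f"Q: {question}\nThought:"

prompt = chain_of_thought_prompt(
    "A train leaves at 9:15 and arrives at 9:31. How long is the journey?"
)
print(prompt)                   # inspect the assembled prompt before sending it
# answer = call_model(prompt)   # uncomment once call_model is wired to a real API
```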
9:21 am
smoother skin, a cuter nose: face-altering apps like facetune or faceapp are getting better and better at producing real-looking photos and even videos. but some experts argue that beauty filters are potentially dangerous, to the point of making you sick. what's the fuss about face tuning? today, all you need to enhance yourself is an app. facetune, one of the most popular beauty filter apps, counts more than 200 million downloads worldwide, and there are dozens of others: faceapp, youcam makeup, airbrush — the list goes on and on. and social apps like tiktok or snapchat have been offering integrated beauty filters for quite some time, used on a massive scale. here's what a study conducted in the us in 2021 says: 80 percent of girls under the age of 13 have already used a filter or retouching app to change the way they look in photos. okay, but how does wanting to look pretty in pictures make one sick? snapchat dysmorphia: the term describes the need to heavily edit one's own digital image, up to the point of wanting plastic surgery in real life to match that digital image. the term was coined by british cosmetic doctor tijion esho. he noticed that an increasing number of cosmetic surgery patients were bringing heavily edited selfies to their consultation
9:22 am
appointments. well, this phenomenon is a rather severe case of body dysmorphia, but it shows what the constant display of supposedly perfect human beauty on social media can do: it leads to completely unrealistic ideas of what you should look like, and that can cause anxiety and depression. cosmetic surgeons have reported that patients who bring in heavily edited selfies are often surprised that those altered photographic results cannot be replicated in real life, because it's so easy in an app. so how do they work anyway? how does face tuning work? these are essentially automated photo-editing tools that use artificial intelligence and computer vision. the software detects the face and then overlays an invisible facial template: it consists of dozens of dots that create a sort of mesh. once that has been built, all kinds of graphics can be attached to the mesh. depending on the app, this can mean adding some digital make-up, reshaping facial features or adding some sparkles.
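Here is a minimal Python sketch of the landmark-mesh idea described above, using the open MediaPipe face-mesh model as a stand-in for the proprietary pipelines in commercial apps; the input file name is hypothetical.

```python
# Sketch of the "invisible facial template": detect a face, lay a mesh of
# landmark points over it, and you have anchor points to attach graphics to.
import cv2
import mediapipe as mp

image = cv2.imread("selfie.jpg")   # hypothetical input photo
if image is None:
    raise SystemExit("Put a portrait photo named selfie.jpg next to this script.")

with mp.solutions.face_mesh.FaceMesh(static_image_mode=True, max_num_faces=1) as face_mesh:
    results = face_mesh.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

if results.multi_face_landmarks:
    h, w = image.shape[:2]
    for landmark in results.multi_face_landmarks[0].landmark:
        # Landmarks are normalised to [0, 1]; scale them to pixel coordinates
        # and draw the mesh points. A beauty filter would instead warp pixels
        # or attach make-up graphics relative to these points.
        cv2.circle(image, (int(landmark.x * w), int(landmark.y * h)), 1, (0, 255, 0), -1)
    cv2.imwrite("selfie_mesh.jpg", image)
```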
9:23 am
so what is being done about it? while there haven't been attempts to ban these apps so far, european countries are now taking steps to at least regulate the use of beauty filters. they are trying to force social media advertisers and influencers to admit when they have altered their physical image. especially influencers and celebrities present themselves in a seemingly authentic way online, and users often believe that what they see is 100 percent real. under a law introduced in norway in 2021, celebrities and influencers must indicate on social media whether a photograph has been retouched. now norway wants the law to go even further and encompass all content on social media. france is in the process of passing a similar law, but for both photos and videos. french lawmakers say the aim of the measure is to limit the destructive psychological effects of filters. and the uk is currently working on drafting
9:24 am
comparable regulations. in the end, it really comes down to us users: we need to constantly remind ourselves that what we see online is very often not a depiction of reality. what do you think of these apps? are they a threat? how do you feel about such altered faces? let us know. are you using instagram, facebook or whatsapp? well, be wary of the data you share: parent company meta could pass on personal data of european users to us authorities. that's why the social media giant has just been fined in the eu, with a record sum of some 1.2 billion euros. why would meta allow sensitive data to be accessed by us government services? what kind of data is being shipped, and who exactly is being affected? let's take a look. the record fine has to do with meta's transfers of user data to servers in the us. why? the company claims this is vital to its business model: the data
9:25 am
needs to be bundled to process it in an effective way and to serve targeted ads based on the results. but us surveillance laws allow intelligence agencies like the nsa broad access to the data of non-us users. the european union had asked meta to take precautions against these laws before, but that didn't happen. so now the eu has slapped meta with this record fine. in order to protect european citizens' data from the us, it could be an important step and force big tech companies to protect our data better, even beyond the eu. that's important, because it's not just us authorities that may be interested in your data: your own government could be, too. us big tech aiding the nsa's mass surveillance program — we've known about that since 2013, thanks to whistleblower edward snowden. now, ten years later, governments and intelligence agencies are reportedly receiving even more data from social media providers. who else is meta sharing data with? meta has been
9:26 am
publishing transparency reports since 2016. they detail the number of times governments have asked for access to their own citizens' data on its platforms. in the latest issue, the us tops the list with more than 64,000 requests, followed closely by india; next come germany and brazil, with about 7,500 total requests submitted each. in the report, meta states that it responds to government requests for data in accordance with applicable law and its terms of service, that each and every request it receives is carefully reviewed for legal sufficiency, and that it may reject or require greater specificity on requests that appear overly broad or vague. so, in other words: unless a country's law does not allow for user data to be passed on, meta may indeed hand it over. the published numbers seem to indicate that meta is actually quite willing to give out information: in more than 75 percent of all cases,
9:27 am
at least some form of data was produced. which data do governments get access to? beyond basic data like user names, addresses and contact information, tech companies like meta, google, apple, microsoft and amazon can often access a lot more: for example, users' emails, text messages, call logs, photos, videos, documents, contact lists and calendars. for a government, that kind of cloud storage offers a very intimate look into your private life. what is this data being requested and used for? in many countries, law enforcement agencies monitor social media to assist with criminal and civil investigations. if they have good reason to believe someone is involved in criminal activity, they might ask the platform providers for full access to that person's user profile. this can also happen in connection with public safety: for example, before big events, authorities might want to scan user profiles to assess risks. yet another reason is immigration and travel screening. and here we can get an idea
9:28 am
of the ethical implications that come with platforms forwarding user data to authorities. imagine those requests targeting certain activists. you might think this kind of surveillance doesn't apply to you because you've got nothing to hide, but government monitoring of social media data can and has led to people being accused of crimes, even wrongfully. and it definitely inhibits free communication. what can you do to protect yourself? the recent eu ruling that comes with the record fine says meta must return european users' data to european servers. but apart from that, there's nothing you can do to be entirely safe from being monitored, by your own government or elsewhere. you'd have to encrypt the data in your online accounts yourself to guarantee that — client-side encryption, of which a small sketch follows below. and this does not only apply to meta, but also to google, amazon and microsoft. what do you think: is it okay if your government has access to your data? that's it from me. bye, and see you next time.
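As a rough illustration of what encrypting your data yourself before it ever reaches a cloud provider can look like, here is a small Python sketch using the cryptography library's Fernet recipe. The message and the key handling are simplified assumptions, not a complete protection scheme.

```python
# Minimal sketch of client-side encryption: encrypt data locally before it is
# uploaded, so the provider (and anyone it hands data to) only sees ciphertext.
# Key management is deliberately simplified; in practice the key must be stored
# safely and never uploaded alongside the data.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # keep this key on your own device
cipher = Fernet(key)

note = b"Private note that should never be readable by the storage provider."
token = cipher.encrypt(note)         # this ciphertext is what you would upload

# Later, on your own device, the same key recovers the original note.
assert cipher.decrypt(token) == note
print(token[:40], b"...")
```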
9:29 am
a noble cause: the bonn opera gala 2023, benefiting the german aids foundation. experience young international talent, great arias and even a touch of frivolity. dw highlights. these are new islands along the river, home to the char dwellers of bangladesh. every six months, their land is swallowed by tidal waves ... a never-ending cycle of
9:30 am