
Shift, Deutsche Welle, June 10, 2023, 6:15am-6:31am CEST

6:15 am
Children who went missing in the Colombian jungle last month have been found alive. President Gustavo Petro tweeted pictures of rescue teams attending to them. The children, who belong to an indigenous community, spent 40 days in the jungle before being found. The US Department of Justice has unsealed the indictment against former President Donald Trump. It contains 37 criminal charges against him for keeping classified documents at his Florida residence and obstructing investigations into the case. The technology show Shift is coming up next. And do you belong to the 77 percent? The 77 Percent is DW's show for Africa's young majority.
6:16 am
We are not here to make up your mind; we are here to open your minds, with all the topics that matter to you. And now to Shift: large parts of our everyday life happen online these days. Will AI soon read our minds? Scientists have already trained an AI to interpret your brain waves. Also, experts argue that beauty filters are potentially dangerous, to the point of making you sick. We take a look at Snapchat dysmorphia and how it comes about. Plus: Meta shares user data with US intelligence agencies.
6:17 am
And who else uses the information from big tech companies? These are the topics that have moved the tech world this week. AI can pick your brain and recreate images you have looked at or stories you have listened to, all by interpreting your brain waves. Those are the findings of scientists in Japan and the US. Will AI in the future fulfill our every wish as soon as we think it? And how do we get the best results out of AI tools? At Osaka University in Japan, participants in a study looked at thousands of pictures while their brain scans were recorded. Those scans were used to train an AI decoder model: it learned which scan represents which picture they were shown. The model was then able to recreate images just by studying the brain scans of participants while they were looking at the images. And it doesn't just work with pictures. Researchers in the US trained a model using brain scans of people listening to podcasts, and the model managed to generate text somewhat similar to what they had listened to.
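The broadcast doesn't go into technical detail, but the core idea of such decoding studies can be sketched very roughly: fit a simple regression model that maps brain-scan features to image features, then, for a new scan, retrieve the closest known image. The sketch below is only an illustration with random stand-in arrays and ridge regression; it is not the actual pipeline used by the researchers.

```python
# Illustrative sketch only: map brain-scan features to image embeddings with
# ridge regression, then retrieve the closest image for an unseen scan.
# The arrays are random stand-ins, not real data from the studies.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

n_images, scan_dim, embed_dim = 1000, 500, 64
brain_scans = rng.normal(size=(n_images, scan_dim))        # one scan per viewed image
image_embeddings = rng.normal(size=(n_images, embed_dim))  # feature vectors of those images

# Fit a linear decoder: brain activity -> image embedding
decoder = Ridge(alpha=1.0)
decoder.fit(brain_scans[:900], image_embeddings[:900])

# Decode a held-out scan and find the most similar known image
predicted = decoder.predict(brain_scans[900:901])
similarity = image_embeddings @ predicted.ravel()
print("best matching image index:", int(similarity.argmax()))
```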
6:18 am
But so far, it only works if the model is trained on your individual brain activity. And if you wanted to, you could stop it: just think about something else and create different brain waves that way. So AI won't be fulfilling our every wish simply by reading our minds anytime soon. To communicate with an AI and make it work for us, brain waves won't help right now. But AI can already help you a lot if you feed it the right prompts. How to prompt? There is no secret, magical prompting dictionary; AI systems are constantly learning and evolving. Still, here is how you get better results. Prompt in English: even though a lot of
6:19 am
AI models speak several languages, they have received most of their training in English. Ask the AI what it needs to fulfill a task. You can also ask a model to write prompts for another AI, an image generator for example. Use chain-of-thought prompting: give the AI an example of how you want a task solved, with a step-by-step description of the reasoning you want it to include (a short example follows at the end of this segment). This way, you might be able to get better results. But always keep in mind that a lot of AI models have flaws. Which flaws to keep in mind? AI knows only what it has been told: ChatGPT was only trained with text published before September 2021, so it can't relate to recent events. AI is biased, because its training data often is. Prejudice against women or minorities like the LGBTQ community finds its way from the training data into the results. When an image generator like Midjourney is asked to create pictures of a doctor, it will be male,
6:20 am
while a caring person, in contrast, will be female. So always check whether the results are offensive. AI also doesn't keep your data safe. Samsung employees asked ChatGPT to fix and improve code; they uploaded it and, by accident, corporate secrets that could now be included in the chatbot's future responses. The same applies to your personal data: in many cases, your input is stored and used to further train these models. Are you using AI tools in your everyday life already? And what's your experience with prompting? Let us know.
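As a rough illustration of the chain-of-thought tip mentioned above: the idea is simply to include one worked, step-by-step example in the prompt before asking your real question. The sketch below uses a hypothetical ask_model() helper as a stand-in for whatever chatbot or text-generation API you use; the prompt text is the point, not the plumbing.

```python
# Sketch of chain-of-thought prompting: show the model one worked example with
# explicit reasoning steps, then pose the actual task.
def ask_model(prompt: str) -> str:
    # Placeholder: swap in a real chatbot or API call of your choice.
    return "(model response would appear here)"

prompt = """Solve the task, reasoning step by step like in the example.

Example task: A train travels 120 km in 2 hours. What is its average speed?
Example reasoning:
1. Average speed is distance divided by time.
2. 120 km / 2 h = 60 km/h.
Example answer: 60 km/h

Task: A cyclist rides 45 km in 3 hours. What is their average speed?
Reasoning:"""

print(ask_model(prompt))
```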
6:21 am
Smoothing the skin, a cuter nose, fuller lips: face-altering apps like FaceTune are getting better and better at producing realistic-looking photos and even videos. But some experts argue that beauty filters are potentially dangerous, to the point of making you sick. What's the fuss about face-tuning? All you need to enhance yourself is an app. FaceTune, one of the most popular beauty filter apps, counts more than 200 million downloads worldwide, and there are dozens of others: face swaps, virtual make-up, airbrushing, the list goes on and on. And social apps like TikTok or Snapchat have been offering integrated beauty filters for quite some time, used on a massive scale. Here's what a study conducted in the US in 2021 found: 80 percent of girls under the age of 13 had already used filter or retouching apps to change the way they look in photos. Okay, but how does wanting to look pretty in pictures make one sick? Snapchat dysmorphia: the term describes the urge to heavily edit one's own digital image, up to the point of getting plastic surgery in real life to match that digital image. The term was coined by the British cosmetic doctor Tijion Esho. He noticed that an increasing number of cosmetic surgery patients were bringing
6:22 am
heavily edited selfies to their consultation appointments. This phenomenon is a rather severe case of body dysmorphia, but it shows what damage the display of supposedly perfect human beauty on social media can do. It leads to completely unrealistic ideas of what you should look like, and that can cause anxiety and depression. Cosmetic surgeons have reported that patients who bring in heavily edited selfies are often surprised that those photographic results cannot be replicated in real life, because it's so easy in an app. So how do they work anyway? How does face-tuning work? The filters are essentially automated photo-editing tools that use artificial intelligence and computer vision. The software detects the face and then overlays an invisible facial template. It consists of dozens of points that form a sort of mesh. Once that has been built, all kinds of graphics can be attached to the mesh. Depending on the app, this can mean adding some digital make-up, reshaping certain facial features, or adding playful effects.
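To give a concrete sense of that detection step, here is a minimal sketch using the open-source MediaPipe Face Mesh model, which overlays a dense mesh of landmark points on a detected face. MediaPipe and the filename portrait.jpg are assumptions for the example; the show does not say which software any particular beauty app actually uses.

```python
import cv2
import mediapipe as mp

# Load a portrait photo (hypothetical filename) and detect the facial landmark mesh.
image = cv2.imread("portrait.jpg")
with mp.solutions.face_mesh.FaceMesh(static_image_mode=True) as face_mesh:
    results = face_mesh.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

if results.multi_face_landmarks:
    landmarks = results.multi_face_landmarks[0].landmark
    print(f"detected {len(landmarks)} mesh points")
    # Each landmark has normalized x/y coordinates; apps attach make-up,
    # reshaping or effects to regions of this mesh.
    h, w = image.shape[:2]
    for lm in landmarks:
        cv2.circle(image, (int(lm.x * w), int(lm.y * h)), 1, (0, 255, 0), -1)
    cv2.imwrite("portrait_mesh.jpg", image)
```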
6:23 am
What is being done about it? While there have been attempts to ban such apps, some European countries are now taking steps to at least regulate the use of beauty filters. They are trying to force social media advertisers and influencers to admit when they have altered their physical image. Especially Instagrammers and celebrities present themselves in a seemingly authentic way online, and users often believe that what they see is 100 percent real. Norway introduced a law in 2021: celebrities and influencers must indicate on social media whether a photograph has been retouched. Some want to go even further and encompass all content on social media. France is in the process of passing a similar law, but for both photos and videos. French lawmakers say the aim of the measure is to limit the destructive psychological effects of filters. And the UK is currently working on drafting a comparable law.
6:24 am
But apart from state regulations, it really comes down to us users. We need to constantly remind ourselves that what we see online is very often not a depiction of reality. What do you think of these apps? Are they a threat? How do you feel about such altered faces? Let us know. Are you using Instagram, Facebook or WhatsApp? Well, be wary of the data you share: the parent company Meta could pass on personal data of European users to US authorities. That's why the social media giant has just been fined in the EU, with a record sum of 1.2 billion euros. Why would Meta allow sensitive data to be accessed by US government services? What data is being shared, and who exactly is affected? Let's take a look at how the record fine came about. Meta transfers data of European users to the US. The company claims this is vital to its business model:
6:25 am
user data needs to be bundled and processed in an effective way in order to sell targeted ads based on the results. But US surveillance laws allow intelligence agencies like the NSA broad access to the data of non-US users. EU courts had told the company to take precautions against these laws before, but that didn't happen. So now the EU has slapped Meta with this record fine in order to protect European citizens' data from US access. It could be an important step and force big tech companies to protect our data better, even beyond the EU. That's important, because it's not just US authorities that may be interested in your data; your own government could be, too. Is US big tech aiding the NSA's mass surveillance program? We've known that since 2013, thanks to whistleblower Edward Snowden. Now, 10 years later, governments and intelligence agencies are reportedly receiving even more data from social media providers. Who else is Meta sharing data with? Meta has been
6:26 am
publishing transparency reports since 2016; they detail the number of times governments sought access to their own citizens' data on its platforms. In the latest issue, the US topped the list with more than 64,000 requests, followed closely by India; next comes Germany, with about 7,500 total requests submitted. In the report, Meta states that it responds to government requests for data in accordance with applicable law and its terms of service, that each and every request it receives is carefully reviewed for legal sufficiency, and that it may reject or require greater specificity on requests that appear overly broad or vague. So in other words: unless a country's law does not allow user data to be passed on, Meta may indeed hand it over. And the published numbers seem to indicate that Meta is actually quite
6:27 am
willing to give out information: in more than 75 percent of all cases, at least some form of data was produced. Which data do governments access? Beyond basic data like user names, addresses and contact information, tech companies like Meta, Google, Apple, Microsoft and Amazon can often access a lot more: for example, users' emails, text messages, call logs, photos, videos, documents, contact lists and calendars kept in their cloud storage, a very intimate look into your private life. What is this data being requested and used for? In many countries, law enforcement agencies monitor social media to assist with criminal and civil investigations. If they have good reason to believe someone is involved in criminal activity, they might ask the platform providers for access to that person's user profile. This can also happen in connection with public safety: for example, before big events, authorities might want to scan user profiles to assess risks.
6:28 am
Yet another reason is immigration and travel screening. And here we can get an idea of the ethical implications that come with platforms forwarding user data to authorities: imagine those requests targeting certain activists. You might think this kind of surveillance doesn't apply to you because you've got nothing to hide. But government monitoring of social media data can and has led to people being wrongly accused of crimes, and it certainly inhibits free communication. So what can you do to protect yourself? The recent ruling in the EU, apart from the record fine, says Meta must return European users' data to European servers. But apart from that, there is nothing you can do to be entirely safe from being monitored by your own government or others; you would have to completely delete your online accounts to guarantee that. And this does not only apply to Meta, but also to Google, Amazon and Microsoft. What do you think: is it okay if your government has access to your data? That's it from me. Bye, and see you next time.
6:29 am
It's the instrument of the year, the mandolin, and he is its ambassador: Avi Avital. He is a true virtuoso who has promoted the small string instrument on great stages the world over. We catch up with him in his adopted home of Berlin, on DW. The Bonn Opera Gala 2023, benefiting the German AIDS Foundation: experience young international talent, great arias,
6:30 am
and more, [unintelligible], on DW. Small but mighty: the mandolin is the instrument of the year, and virtuoso Avi Avital shows us why it deserves to be heard. What's the human library? It's where people take the place of books and tell their life stories to strangers.
