
Shift | Deutsche Welle | July 31, 2023, 8:15am-8:31am CEST

8:15 am
a team which had been the target of a previous attack. And don't forget, you can always get DW News on the go: just download our app. It'll give you access to the latest news from around the world, as well as push notifications for any breaking news. Up next, Shift looks at the potential dangers of AI and how the technology could be regulated. That's it from me, and thanks for watching. I'll see you again with more headlines soon.

We've got some hot tips for your bucket list: magic corners, hot spots for food and some great culture, all on DW Travel. Off we go!

It's Evelyn Sharma. Welcome to my podcast.
8:16 am
Love Matters, that's my podcast. I invite celebrities, influencers and experts to talk about all things love and dating. All these things and more in the new season of Love Matters, so make sure to tune in wherever you get your podcasts. Enjoy the conversation, because you know: love matters.

Artificial intelligence: the race is on, and the whole world is watching. AI can paint, it can code, and it can even help paralyzed people walk again. But there's more to it: it can manipulate, and it can amplify disinformation. Is AI our salvation or our damnation? Today on Shift.
8:17 am
AI technology is pretty advanced. It can write academic essays, identify cancer cells, or even replace people like me. But AI is a new gateway for misuse as well. AI crime is on the rise. WormGPT, for example, available on the darknet, can automate the perfect phishing attack. But that's not all. Pope Francis in a puffer jacket: the image went viral earlier this year and was generated by AI. While that could be seen as a joke, so-called deepfakes can cause great harm. Just recently, the Republican National Committee released this AI-generated election ad. The clip warns of a dystopian future should the current president, Joe Biden, make it to the White House again. "I am not Morgan Freeman. What you see is not real." It's a deepfake, a technology driving AI scams. A deepfake conned
8:18 am
a Chinese national out of 600,000 US dollars. A scammer used AI software to impersonate a close friend on a video call, manipulating the victim into revealing his bank details. And that's not the only way AI can be used to manipulate us. "What's even more alarming is the potential use case of large language models in providing disinformation, that is, messages that are deliberately intended to be misleading. And here I worry that large language models might enable propaganda at a scale, and in a way that's tailored, that we've never seen before." For example, the news agency Reuters tested the Chinese AI chatbot Ernie and found that it was reluctant to answer questions about Chinese President Xi Jinping or about more recent events in China's history. It's unclear whether this was intentional or a programming flaw, but it shows just how easily AI can amplify misinformation. AI has also been known to discriminate
8:19 am
against people. "Artificial intelligence, machine learning and all these, let's say, complex algorithms and data sets just replicate the biases that we humans have." AI recruitment systems, for example, have been shown to be biased against women. And it's been proven that facial recognition technology is less accurate for people of color, with people being arrested for crimes they never committed. Experts and politicians in many parts of the world want to regulate the use of AI. In July 2023, the UN Security Council met for the first time to discuss the risks. Even the head of the company behind ChatGPT, Sam Altman, is proposing more regulation for AI. At a US Senate hearing in May 2023, he pointed out the dangers of the technology. But Altman also says the opportunities outweigh the risks. Others are calling for
8:20 am
AI development to be stopped completely until regulators catch up. Google's top executive Sundar Pichai also wants AI to be regulated, globally. That's because he claims the technology is as dangerous as nuclear weapons. And he seems to be serious: the executive wants to make certain things unavailable to users. Google's text-to-video AI Phenaki will not generate clips depicting people, for example. Hundreds of researchers and entrepreneurs are calling for a pause in AI development, among them Tesla boss Elon Musk. An open letter calling for such a moratorium has been signed by more than 33,000 people. But the call could also be an elaborate corporate move for big players to catch up with the top developers. After all, Musk's new company xAI is working on its own chatbot, Truth
8:21 am
GPT. Even Google's chatbot Bard can't keep up with ChatGPT just yet. So one reason for the recent calls for a pause could be economic rivalry. A quick look at China proves there's more to it than just economic interest. The government there has realized the potential AI has to consolidate its power. Chatbots in China seem to be reluctant to say anything critical about the regime, and since 2019 the government has been using AI to monitor people and evaluate their behavior. The EU, on the other hand, is looking to ban AI surveillance. Legal standards for AI use are being drafted, but the so-called AI Act still needs to be given the green light. Was this image created by artificial intelligence? It's not always easy to tell. Content or products generated by AI are set to be labeled in the future, one of the pillars of
8:22 am
the EU's new AI Act. "We ask for a degree of transparency. We want people to know, even when they are interacting with, let's say, a chatbot, that this is not a person, it's a chatbot." The priority is the regulation of applications that might interfere with people's fundamental rights. "We have to try to bridge these two approaches: on the one hand, fundamental rights and protections, and on the other hand, the need to sustain innovation and the development of AI." AI will be regulated to varying degrees depending on where it's applied. For low-risk tools such as intelligent spam filters, there will be fewer requirements. Stricter regulation is in the works for applications that could more seriously impact users' lives, for example AI tools that pre-select job applicants or determine customers' credit ratings.
8:23 am
AI systems deemed too dangerous will be banned altogether, such as biometric surveillance systems or social scoring applications. Companies that use AI will have to register in an EU-wide database of AI systems and how they operate. It will be open to the public to ensure maximum transparency.
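To make the tiered approach just described a bit more concrete, here is a minimal sketch in Python. It is only an illustration: the tier names follow the AI Act's commonly cited risk categories, the example applications mirror the ones mentioned in the programme, and the obligation summaries are simplified assumptions, not the legal text.

from enum import Enum

class RiskTier(Enum):
    # Obligation summaries are simplified paraphrases, not legal wording.
    UNACCEPTABLE = "banned outright"
    HIGH = "strict requirements plus registration in the EU-wide database"
    LIMITED = "transparency duties, e.g. disclosing that a chatbot is not a person"
    MINIMAL = "few or no extra requirements"

# Hypothetical mapping of the applications mentioned in the programme to tiers.
EXAMPLE_APPLICATIONS = {
    "social scoring": RiskTier.UNACCEPTABLE,
    "biometric mass surveillance": RiskTier.UNACCEPTABLE,
    "job applicant pre-selection": RiskTier.HIGH,
    "credit rating": RiskTier.HIGH,
    "customer service chatbot": RiskTier.LIMITED,
    "spam filter": RiskTier.MINIMAL,
}

def obligations(application: str) -> str:
    """Return a one-line summary of what this sketch assumes applies to an application."""
    tier = EXAMPLE_APPLICATIONS.get(application, RiskTier.MINIMAL)
    return f"{application}: {tier.name} risk, {tier.value}"

if __name__ == "__main__":
    for app in EXAMPLE_APPLICATIONS:
        print(obligations(app))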
8:24 am
Is the EU leading the way to a future with AI, or is it putting the brakes on innovation? The impact of these regulations is already showing, for example in India, one of the world's largest tech markets. "Many companies that are working with European companies have to worry about the regulations, because Europe is sort of taking the lead in terms of regulating AI and regulating data." For many people in India, AI is already playing a central role in their work. "There are lots of startups coming up, both in terms of developing new algorithms and also using AI systems for various applications, from dating to finance to agriculture, you know, across the board. So that's the very exciting part about it." But for a long time, India's government wanted nothing to do with regulation, despite experts' warnings. Then the European Parliament passed its version of the AI Act, setting the tone: AI applications are to be regulated, much like blockchain-based Web3. In India, the discussion is well under way as to how such laws should look. "So one is, it needs to be bias-free. You know, it could be gender bias, it could be community bias; it should be devoid of these things. The second is accountability. You know, I can build
8:25 am
an AI system, I can use it in many applications, but then, if it fails, who is accountable? Is it me, the vendor of that system, or is it the person who deployed it?" For now, the debate in India is focused on the protection of user data; the ethical aspects of AI usage, however, have yet to take center stage. Data protection is hotly debated in the field of AI, largely due to the fact that AIs are trained with vast amounts of our data. Companies like OpenAI, Google and Meta say they won't sell our data, but ultimately users have little to no control over what really happens with it. ChatGPT collects and stores all entered requests, even if users delete their conversations with the bot. Next to that data and the user profile, the chatbot can store information about the device used, such as the IP address and the precise location. Ideally, a chat with an
8:26 am
AI should be like a conversation between real people. But this can lead us to reveal much more than intended. Companies handle user data to improve their services, and many companies use the input to train the AI itself. "The thing is, once the system has seen that data, it is in its memory. You know, just like us humans: I see a person, and then you say, erase this person from your memory. It's not happening. Same with AI. You cannot erase data from its memory." So what does that mean for us users? In any case, we can expect much more personalized advertising. Microsoft, for example, recently started using ChatGPT in its search engine Bing. The information gained could be used for tailored advertising. We should be mindful of what we tell AI systems. For instance, employees at companies like Amazon are not allowed to use ChatGPT for work, for fear of leaking trade secrets.
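One practical way to act on that advice is to strip obviously sensitive details out of a prompt before it ever reaches a chatbot. Below is a minimal sketch, assuming a placeholder send_to_chatbot() function that stands in for whichever chatbot API is actually used; the patterns are deliberately rough and would need tuning for real use.

import re

# Rough patterns for details you may not want to hand to a third-party AI service.
# Order matters: the IBAN pattern runs before the phone pattern so long digit runs
# are not swallowed as phone numbers first.
REDACTIONS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),
    (re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{10,30}\b"), "[IBAN]"),
    (re.compile(r"\+?\d[\d\s/-]{7,}\d"), "[PHONE]"),
]

def redact(text: str) -> str:
    """Replace likely personal or financial details with placeholders."""
    for pattern, placeholder in REDACTIONS:
        text = pattern.sub(placeholder, text)
    return text

def send_to_chatbot(prompt: str) -> None:
    # Placeholder: a real implementation would call an actual chatbot API here.
    print("Sending:", prompt)

if __name__ == "__main__":
    raw = ("Contact me at jane.doe@example.com or +49 170 1234567 "
           "about invoice DE44500105175407324931.")
    send_to_chatbot(redact(raw))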
8:27 am
No wonder so many countries are working on AI laws right now. In Latin America, however, there are different concerns. Argentina, for example, already uses AI for surveillance. Still, regulation isn't the top priority for Latin American governments, says one researcher in the region. "The North-South divide in the field is broad. Latin America is more than capable of training people. But individuals educated at our universities are quickly drafted away from the region, by working remotely for foreign companies or by moving abroad. It is impossible to compete with the incentives for research that exist there." On the flip side, companies in rich countries outsource a lot of the human labor behind AI. "Why not? There are many simple jobs here, like paperwork or training the
8:28 am
AI itself, jobs that don't need much schooling." It seems poorly paid, precarious work is outsourced to the Global South, while specialists trained at local universities end up working for the Global North, often without leaving home, because everything runs via the web, through an interface. International corporations have been using this strategy for decades. One-size-fits-all rules for the industry may change that in the future. Singapore's government is relying on voluntary self-regulation by companies when it comes to AI. This could make Singapore a global hotspot for innovative AI development, but it could just as easily pave the way for misuse. What this means for users will only become apparent in the future. So, is regulation harmful to innovation, or should AI be more tightly controlled? Let us know what you think. That's it from me. See you next time!
8:29 am
Eco Africa: you can look good and be sustainable. "We are trying to create new ideas and ways of doing things, when it comes to textiles, when it comes to fashion. Our design is just one of our tools to change everything." That's the message from this young designer, on Eco Africa.

The 77 Percent: West and Central Africa have the highest rates of child marriage in the world, a violation of human rights. "At 15, at this stage, would you want to be married?" "No. Because as a girl child, I believe there is more to life than getting married at a very young age."
8:30 am
More is being done to fight this cruel practice. The 77 Percent, in 60 minutes on DW.

We are on set and watching closely, so we can bring you the story behind the news. We are all about unbiased information, for free minds. DW.

We all know that it is important to make a good impression, and our clothes say a lot about what sort of person you are. Today on Eco Africa


