tv Shift Deutsche Welle July 30, 2023 11:15am-11:30am CEST
11:15 am
of the Hamburg Open, Alexander Zverev defeated his French opponent, Arthur Fils. It will be the 26-year-old German's first final since May last year, after he suffered a serious ankle injury at the previous year's French Open. He plays Serbian Laslo Djere later today. Here's a reminder of our top story: Russia's defense ministry says Moscow was attacked by three Ukrainian drones overnight. Kyiv has not claimed responsibility. You are up to date. Stay tuned for our tech show Shift after a short break.
11:16 am
DW reaches more than 150 million people around the world. Why? Because no one should have to go without independent information. Make up your own mind. DW: made for minds.

Artificial intelligence: the race is on, and the whole world is watching. AI can paint, it can code, and it can even help paralyzed people walk again. But there's more to it. It can monitor us, and it can amplify disinformation. Is AI our salvation or our damnation? That's our topic today on Shift.
11:17 am
AI technology is pretty advanced. It can write academic essays, identify cancer cells, or even replace people like me. But is AI the new gateway for misuse as well? AI crime is on the rise. WormGPT, for example, available on the dark net, can automate the perfect phishing attack. But that's not all. Pope Francis in a puffer jacket: the image went viral earlier this year and was generated by AI. While it could be seen as a joke, so-called deepfakes can cause great harm. Just recently, the Republican National Committee released this AI-generated election ad. The clip warns of a dystopian future should the current president, Joe Biden, make it to the White House again. "I am not Morgan Freeman, and what you see is not real."
11:18 am
A deepfake, the technology driving AI scams. A deepfake conned a Chinese national out of 600,000 US dollars: a scammer used AI software to impersonate a close friend on a video call and manipulated him into revealing his bank details. That's not the only way AI can be used to manipulate. "Perhaps even more alarming is the potential use of large language models in providing disinformation, that is, messages that are deliberately intended to be misleading. And here I worry that large language models might enable propaganda at a scale, and in a way that is tailored, that we've never seen before." The news agency Reuters, for example, tested the Chinese AI chatbot Ernie and found that it was reluctant to answer questions about Chinese President Xi Jinping or about more recent events in China's history. It's unclear whether this was intentional or a programming flaw, but it shows just how easily
11:19 am
AI can amplify misinformation. AI has also been known to discriminate against people. "Artificial intelligence, machine learning, and all these, let's say, complex algorithms and data sets can replicate the biases that humans have." AI recruitment systems, for example, have been shown to be biased against women, and it's been proven that facial recognition technology is less accurate for people of color, with people being arrested for crimes they never committed. Experts and politicians in many parts of the world want to regulate the use of AI. In July 2023, the UN Security Council met for the first time to discuss the risks. Even the head of the company behind ChatGPT, Sam Altman, is proposing more regulation for AI. At a US Senate hearing in May 2023, he pointed out the dangers of the technology,
11:20 am
but he also says the opportunities outweigh the risks. Others are calling for AI development to be stopped completely until regulators catch up. Google's top executive Sundar Pichai also wants AI to be regulated globally, because, he claims, the technology is as dangerous as nuclear weapons. Google seems to be serious about this: the company wants to make certain things unavailable to users. Google's text-to-video AI Phenaki will not generate clips depicting people, for example. Hundreds of researchers and entrepreneurs are calling for a pause in AI development, among them Tesla boss Elon Musk. An open letter calling for such a moratorium has been signed by more than 33,000 people. But the call could also be an elaborate corporate move for big players to catch up with the top developers. After all, Musk's new company xAI
11:21 am
is working on its own chatbot, TruthGPT, and even Google's chatbot Bard can't keep up with ChatGPT just yet. So one reason for the recent calls for a pause could be economic rivalry. But a look at China proves there's more to it than just economic interest. The government there has realized the potential AI has to consolidate its power. Chatbots in China seem to be reluctant to say anything critical about the regime, and since 2019 the government has been using AI to monitor people and evaluate their behavior. The EU, on the other hand, is looking to ban AI surveillance. Legal standards for AI use are being drafted, but the so-called AI Act still needs to be given the green light. Was this image created by artificial intelligence? It's not always easy to tell. Content or
11:22 am
products generated by AI are set to be labeled in the future, one of the pillars of the EU's new AI Act. "We ask for a degree of transparency. We want people to know, even when they are interacting with an AI chatbot, that this is not a person, it's a chatbot." The priority is the regulation of AI applications that might interfere with people's fundamental rights. "We have to try to bridge these two approaches: on one side, fundamental rights and protections, and on the other, the need to sustain innovation and development of AI." AI will be regulated to varying degrees depending on where it's applied. For low-risk tools, such as intelligent spam filters, there will be fewer requirements. Stricter regulation is in the works for applications that could more seriously impact users' lives, for example AI
11:23 am
tools that preselect job applicants or check customers' credit ratings. AI systems deemed too dangerous will be banned altogether, such as biometric surveillance systems or social scoring applications. Companies that use AI will have to register it in an EU-wide database, and AI systems and how they operate will be open to the public to ensure maximum transparency. Is the EU leading the way to a future with AI, or is it putting the brakes on innovation? The impact of these regulations is already showing, for example in India, one of the world's largest tech markets. "Many companies that are working with European companies have to watch out for regulations, because Europe is sort of taking the lead in terms of regulating AI and regulating data." For many people in India, AI
11:24 am
is already playing a central role in their work. "There's a lot that's coming up, both in terms of developing new algorithms and also using AI systems for various applications, you know, from dating to finance to agriculture, across the board. So that's the very exciting part about it." But for a long time, India's government wanted nothing to do with regulation, despite experts' warnings. The European Parliament passed its version of the AI Act, setting the tone: AI applications are to be regulated, much like the blockchain-based Web3. The discussion is well under way as to how the laws should look. "One thing is, it needs to be bias-free. You know, it could be gender bias, it could be community bias; we should minimize these things. The second is accountability. You know, I can build an
11:25 am
AI system and use it in many applications, but then if it fails, who is accountable? Is it me, the vendor of that system, or is it the person who deployed it?" For now, the debate in India is focused on the protection of user data; the ethical aspects of AI usage have yet to take center stage. Data protection is hotly debated in the field of AI, largely because AIs are trained with vast amounts of our data. Companies like OpenAI, Google, and Meta say they won't sell our data, but ultimately users have little to no control over what really happens with it. ChatGPT collects and stores all entered requests, even if users delete their conversations with the bot. Next to the data in the user profile, the chatbot can store information about the device used,
11:26 am
such as the IP address and precise geolocation. Ideally, a chat with an AI should be like a conversation between real people, but this can lead us to reveal much more than intended. AI companies handle user data to improve their services, and many use the input to train the AI itself. "The thing is, once the AI system has seen that data, it is already in its memory. You know, just like us humans: I see a person, and if you then say 'erase that person from your memory', it's not happening. It's the same with the AI: you cannot erase the data from its memory." So what does that mean for us users? In any case, we can expect more personalized advertising. Microsoft, for example, recently started using ChatGPT in its search engine Bing; information gained there could be used for tailored advertising. We should be mindful of what we tell AI systems.
11:27 am
For instance, employees at companies like Amazon are not allowed to use chatbots for work, for fear of leaking trade secrets. Many countries are working on AI laws right now. In Latin America, however, there are different concerns. Argentina, for example, uses AI for surveillance. Still, regulation isn't the top priority for Latin American governments, says one researcher. "The north-south divide in the field is broad. Latin America is more than capable of training people, but individuals educated in our universities are quickly drawn away from the region, either by working remotely for foreign firms or by moving abroad. It is impossible to compete with the research incentives that exist elsewhere." On the flip side, companies in rich countries outsource a lot of the human labor behind AI.
11:28 am
"There are many simple jobs here, like paperwork or training the AI itself, jobs you don't need much schooling for. Poorly paid, precarious work is outsourced to the Global South, while the specialists trained at our universities work in the North." Corporations have been using this strategy for decades, and one-size-fits-all rules for the industry may change that in the future. Singapore's government, meanwhile, is relying on voluntary self-regulation by companies when it comes to AI. This could make Singapore a global hotspot for innovative AI development, but it could just as easily pave the way for misuse. What this means for users will only become apparent in the future. So, is regulation harmful to innovation, or should AI be more tightly controlled? Let us know what you think. That's it for me. See you next time!
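The tiered approach of the EU AI Act discussed in the programme can be sketched in code. This is a toy illustration only: the tier names and example applications follow the transcript (banned biometric surveillance and social scoring, high-risk applicant screening and credit checks, low-risk spam filters), while the function and data structure are hypothetical and not part of the Act itself.

```python
# Toy sketch of the EU AI Act's risk tiers as described in the programme.
# Tier names and examples come from the transcript; everything else is
# invented for illustration.

RISK_TIERS = {
    "prohibited": {"biometric mass surveillance", "social scoring"},  # banned outright
    "high_risk": {"job applicant screening", "credit rating checks"},  # strict rules
    "low_risk": {"spam filter"},  # few requirements
}

def classify(application: str) -> str:
    """Return the risk tier for a named AI application, or 'unlisted'."""
    for tier, examples in RISK_TIERS.items():
        if application in examples:
            return tier
    return "unlisted"

print(classify("social scoring"))          # prohibited
print(classify("job applicant screening")) # high_risk
print(classify("spam filter"))             # low_risk
```

The point of the tiered design is that obligations scale with potential harm: a spam filter faces far fewer requirements than a system that decides who gets a job or a loan.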
11:29 am
The 77 Percent. West and Central Africa have the highest rates of child marriage in the world, a violation of human rights. "At 15, at this stage, would you want to be married?" "No." "Being a girl child, I believe there is more to life than getting married at a very young age." Let's join those fighting this cruel practice. The 77 Percent, next on DW. It's time for visionaries: for sustainability, but also for horsepower.
11:30 am
It's time for the mobility revolution, on DW. And we've got some hot tips for your bucket list: a romantic corner, a magic hot spot for fans, and some great cultural memorials. DW Travel, off we go.

Hello! Are you ready for another edition of The 77 Percent? Why am I asking? Of course you're ready. Thanks for joining the program for Africa's youth majority. I'm Eddy Micah Jr.