The Bottom Line, Al Jazeera, August 21, 2023, 9:00am-9:30am AST
9:00 am
firefighters are saying that there's very little they can do against these extremely heavy winds. While the world is focusing a lot on the tourists, there is real desperation among locals here about how much they've lost. Thirty-five square kilometres of lush green forest and woodland have been burned; it really does resemble a devastated landscape. Hello, I'm Kimberly Halkett with the top stories on Al Jazeera. Celebrations in Guatemala as congressman Bernardo Arévalo wins the presidential run-off. The centre-left candidate ran on a platform of anti-corruption. Arévalo was on the ballot against conservative candidate Sandra Torres. A run-off election to choose the next president of Ecuador looks increasingly likely. Leftist candidate Luisa González and her surprise challenger, Daniel Noboa,
9:01 am
beat six other contenders in Sunday's election, which was marred by gang violence. Tropical Storm Hilary has brought heavy rain and gusty winds to southern California after causing flash floods and at least one death in Mexico. City and county leaders are asking residents to stay at home. Thousands of people, meantime, have been ordered to evacuate in western Canada, where nearly 400 wildfires are raging. Another 30,000 people have been told to be prepared to leave their homes in the province of British Columbia. Prime Minister Justin Trudeau says every Canadian should be prepared to help in any way they can. Canadians across the country, but particularly out west, are stepping up to help their fellow citizens, their neighbours, residents who are fleeing for their lives and in many cases have lost everything,
9:02 am
and these are times when emergencies happen and Canadians pull together, and it's always comforting to see. The federal government is continuing to step up where it can, with Canadian Armed Forces resources active and coordinating with regional and local governments to make sure people are safe, and to make sure we're doing everything to protect lives, property and firefighters. On the island of Tenerife, the wildfire which devastated large areas of forest was started deliberately. Around 26,000 people have had to move to safety on the Canary Islands as firefighters continue to try to contain the blaze. The fire started on Wednesday in the north, away from the popular tourist areas in the south. The Netherlands and Denmark have pledged US-made fighter jets to Ukraine for its ongoing war with Russia. The
9:03 am
announcements were made during President Volodymyr Zelenskyy's visit to the NATO member countries. This is one more step in strengthening Ukraine's air defences. We will use F-16 jets to keep Russian terror away from Ukrainian cities and villages. Mark Rutte and I have agreed on the number of F-16s that will be provided to Ukraine after our pilots and engineers have been trained. Denmark commits to transfer F-16 aircraft to Ukraine and to the Ukrainian air force, in close cooperation with the United States and other partners, once the conditions for such a transfer have been met. Former US President Donald Trump has confirmed he'll skip Wednesday's debate with other Republican presidential contenders. Trump posted on his social media site that he believes voters know how successful his presidency was. He's the front-runner in a field of at least 11 candidates. Cambodia's new prime minister, Hun Manet, is being
9:04 am
inaugurated as the new leader. It's the nation's first transition of power in nearly 40 years. The ruling party won all but five seats in parliament in a one-sided election, after the main opposition was barred from the vote. Russia's first mission to the Moon in nearly 50 years has ended in disappointment after the Luna-25 spacecraft lost control and crashed while trying to land. Spain beat England to win the Women's World Cup for the first time. Spain secured a one-nil victory to take home the trophy in Sydney, Australia. They are the fifth nation to hold the title and have now won both the women's and men's World Cups. And those are the headlines. The news continues here on Al Jazeera. Stay with us. The Bottom Line
9:05 am
is next. Hi, I'm Steve Clemons, and I have a question: is the wild quest for advanced artificial intelligence more like a suicide race for humanity? Let's get to the bottom line. Science fiction has been obsessed with this idea for decades: robots and super-intelligent computers take over our decision-making powers and then enslave us, and we only realise what's going on when it's way too late, like when Keanu Reeves' character in The Matrix figured out that he and all of humanity were basically just batteries for a huge machine. Well, is fiction becoming reality? Governments and big corporations are locked in a frantic race to come up with more advanced applications for artificial intelligence
9:06 am
to manage everything cheaper and more competently: healthcare, education, even battlefields. In a way, artificial intelligence is the new arms race. Russian President Vladimir Putin once said that the nation that leads in AI will be the ruler of the world. So is there an existential risk to humans? And is it time to pause and take a step back to make sure these efforts are regulated and more guardrails are put in place to minimise those risks? My guest today says it's definitely that time. He is Jaan Tallinn, one of the engineers behind hit programs like Skype and Kazaa, and now a founder of the Centre for the Study of Existential Risk at Cambridge University. He's also co-founder of the Future of Life Institute, which took the lead in calling for a six-month moratorium on AI research. Jaan, it's a real pleasure to have you with us today. And I just want to start: 23 years ago I read an article that was the cover of Wired magazine by another technologist, a man named Bill Joy, and the title of that essay was "Why the Future Doesn't Need Us". I would love to hear where we are in Bill
9:07 am
Joy's predictions, and what you think we need to be wary of as we move into this new era of AI. Indeed, there are people just like Bill Joy, and some go even further. For example, Turing in 1951 said that once AI becomes smarter than humans, we should expect to lose control to it. And I think the correct position to take is that, since we cannot rule out that we will not remain in control permanently, we should take the necessary precautions to make sure that either we remain in control or, if we lose control, the things that shape the future will be good for us. Now, you have written about your concerns in this area not being something that arrives tomorrow, but down the road, as superintelligence really evolves and takes hold,
9:08 am
that mankind may be less relevant to the equation. Can you explain to our audience, our lay audience, what your concerns are about computer superintelligence, and what that looks like? There was just recently a survey result, I think by an organisation called YouGov, showing that normal people, people on the street, actually have pretty good intuitions about what could go wrong. In some ways it is other people, in academia, intellectuals, who don't see the problem, unlike both the experts, like Yoshua Bengio or Geoffrey Hinton, the godfathers of deep learning, and the people on the street. They understand it, because if you look at it, the reason why humans are in permanent control of this planet, and not chimpanzees, is because we are more capable than they are. It isn't
9:09 am
that they are not smart, and we are not stronger, but we know how to do long-term planning, et cetera. Now we, as a species, are in a race to yield that advantage to machines, which, as the intuitions of ordinary people indicate, is not a good idea. Now, you were involved with the founding of a number of institutes that are fascinating. One is looking at the study of existential risk, and the other is the Future of Life Institute, which I find fascinating, and which led with a letter, signed by many of the world's most important technologists today, asking for a moratorium. And we saw Elon Musk, who has been on the board of your centre, also co-sign that letter. I guess my question is: can Pandora be put back in the box? Can a letter like that, a moratorium, a call for a pause, actually have an impact on the global development of
9:10 am
AI today, or have things just proceeded too far at this point? There are examples that moratoriums can be helpful. For example, we have done it with other technologies; we did ban human cloning, so this is very possible. Things like bioweapons bans have been less successful, but still, the success has not been zero. And the other thing is that, again, the real experts, the top experts in the field, are concerned. So there is a vocal support base for a pause, and a growing, widening consensus that we need to take things slower, that there is good reason to take things slow, just to make sure that things go well. If we are just at the mercy of market forces, we might not have the ability to steer things sufficiently. Yeah. And you're
9:11 am
a wealthy guy, and you've invested in a lot of these companies over the years; you've been investing in AI companies. So it's sort of interesting, on one hand, to see you warning about these dangers, and yet you're a significant investor; you run with the crowd that's out there bringing this new technology forward. How do you square that with yourself? Yeah, that is a good question. My default way of looking at things is: what is the counterfactual that I am displacing? If I didn't invest, someone else would take my place. So that's one approach I've taken with companies like Anthropic, for example. The other kind of approach I take is that I just try not to be the decisive investor. For example, in DeepMind, where I invested in 2011, if I remember correctly, I was just a very small shareholder, so my investment couldn't have been decisive, but it gave me
9:12 am
the ticket to be present at the company and to talk about my concerns about how things could go. Now let's talk about this letter and President Biden's meeting with seven leading companies. I have them listed as Amazon, Anthropic, Google, Inflection, Meta, Microsoft and OpenAI. Many people believe that this meeting at the White House, President Biden with these executives, talking about voluntary measures they might take to think through the impact and dangers and risks of AI, was precipitated by the Future of Life Institute's letter. I'd love to hear your thoughts on that, but I think more important is: do you believe that these seven executives, these companies, and President Biden are sincere in what they're doing? Or is this a big performance, acting like they're doing something while behind the scenes they're just chugging along as they were before? So I don't think this is a fake performance. I mean,
9:13 am
I do have some concerns that, to many people, this might be a very new topic, so they just don't exactly know what to think at this point. But everyone now has to learn about this new situation. And when it comes to taking credit for these things, I think the biggest credit really goes to OpenAI for releasing ChatGPT, because that caused the planet to pay attention to AI in a way that it didn't before. But yes, I think the Future of Life Institute letter helped to prepare the ground, and then there was the extinction statement by the Center for AI Safety, a one-sentence declaration that AI
9:14 am
extinction risk should be treated at the same level as risks from nuclear war and biological threats. You know, it's an interesting conversation. I try not to sensationalise when we have these conversations, but sometimes, you know, overstating for effect is part of learning, and also thinking around the corner and trying to anticipate the unanticipated, if you will, things we haven't thought about, is part of the conversation. And your Future of Life Institute has actually been doing this. They've developed something, the best thing I can call them is what they call them, slaughterbots. Can you tell us about slaughterbots and, just thinking in contemporary terms about capacity and drones, what could possibly happen, sooner than people think, with slaughterbots? Yes. So there are two really big problems. One is that fully automating, putting AI in the military, makes it very hard for humans to remain in control,
9:15 am
because at this point you are in a perilous arms race, and when you are in an arms race, you don't have much manoeuvring room when it comes to thinking about how you approach this new technology; you just have to go. So that's one big worry about putting AI in the military. And as we see with cyber warfare, as things become autonomous, they become easier to abuse. So the natural evolution of fully automated warfare is what we tried to show with the Slaughterbots videos. You start with swarms of these small commodity drones, which
9:16 am
anyone with money can produce and release without attribution. So we might degrade the world to a state where it is no longer safe to be outside, because you might be chased down by a swarm of slaughterbots. One of the interesting reactions I had to the Future of Life Institute letter, when you brought all of these major technologists together to call for a moratorium, is, I wonder, how does that consortium compete if China is not involved? Or is China involved? Is there an opportunity for a truly global deal that we're not thinking enough about? How do you do that if you've got one side that's going to follow guardrails and guidelines and the other side that might not? So I'm not an expert on China, but with that caveat, I just want to point out that China already has AI regulations in a way that the US does not. So in some ways the West is falling behind when it comes to directly regulating AI, even though it has always been the follower, behind the
9:17 am
US. And also, this is a global problem. Just as we've seen with global warming, it's not correct to assume the Chinese are just the bad guys in the room. If there's a global problem, you have to get people together and discuss it with open cards, rather than trying to shift the blame to the other guys. It occurred to me, reading a Goldman Sachs report that says that extrapolating its estimates globally suggests that generative AI could expose the equivalent of 300 million full-time jobs to automation in Europe and the United States. And along this line, it just made me think: are we going to look at this as the world changed, the way it did when we went from farming, and large families of kids who were farmers helping their parents, to a much smaller rural population, when there are far fewer humans actually being
9:18 am
employed? So, yeah, if you treat AI as just another technology, then you can make the case that we shouldn't be worried, because technology up until now has been good for humans, including for employment. However, if you take the analogy of species, then it's clear that the introduction of a smarter species quite often does not go well for the less advanced species. That's a fascinating point. Now, another dimension here: I met recently with Barry Diller, and Barry Diller has started a consortium to basically sue AI operations that scrape material, intellectual property, the property of others, particularly in the news business, the news industry, and has threatened these major lawsuits. And there's now this very interesting debate in the media and publishing world about property rights,
9:19 am
and whether or not guardrails that can limit how AI learns are a smart thing to do. Have you thought about that at all? And what do you think the chances are that publishers or artists or people who create can somehow get carve-outs where they are not part of that world? I find it unlikely, perhaps, but I'd love to hear from you. So perhaps, as someone who was working on a file-sharing program 20 years ago, I'm not the right person to take a strong stance on this issue. However, one point that I think, or suspect, is that the issue with copyright is just going to be very temporary in some ways. It is more a feature of the current generation of AI, and as AI gets smarter, it might actually need less and less data, and less and less copyrighted data, while exhibiting the same or
9:20 am
even stronger capabilities. So I think in some ways the copyright holders are perhaps fighting a good fight, but eventually it is going to be a losing fight. I mean, you did this. You just mentioned it; you are a king of this. You did this with Kazaa; you brought new technology in with Skype. I think that is a very interesting tension out there. If you could put yourself back 20 years, when you were doing this, were there ways the system could have slowed you down? And I ask this as someone who, when you were 25 years old, was changing the world and not worried about these boundaries. Now you're 50 years old and you're saying, hey, we need guidelines and boundaries. Are you worried about the 25-year-old version of you that's ignoring those guardrails and concerns today? That is a fascinating question. I kind of feel that, if I was faced with my 20-year-old self, I could kind of
9:21 am
talk sense to him, but perhaps that's a deceptive thought, because indeed, my 20-year-old self, for example, really hated things like software patents and saw them as just a tax on programmers, not as another good thing to have. But my view about open source has definitely become more guarded, as I see that open-source AIs could potentially be the source of catastrophic events. So, is resistance futile? I guess, you know, how do I look at technology advancing so much, and you've been such a big driver of this? I do believe that we do make choices. I'm glad you mentioned human cloning. What I'm interested in is: how do we take the work that you're doing on the future
9:22 am
of life and the concerns about existential threats, and give it scale so that it becomes more the norm and less of a boutique topic? I think it's now time to really put forward some early regulations, at the very least to do something to exercise the muscle of technology regulation. The EU has done some of that, I think, more than the US has. But specifically, I'm thinking about things like making sure that data centres are certified. So if you want to run a big AI training experiment, you must do it in data centres, big data centres, and those data centres have to be certified. I think that's one of the steps. Perhaps an even easier step, where there seems to be a lot of consensus, is that AI output should be labelled. Nobody should be faced with a phone call,
9:23 am
or a video, or text, and be fooled into thinking that it came from a human. There should be clear indications that this is AI. And then there are things like liability. So if, for example, Facebook puts out an open-source AI, and that falls into the wrong hands and something really bad happens as a result, that responsibility should go back to Facebook. And we have a survey that was done in 2019, and I guess 4,000 people were asked and it got hundreds and hundreds of responses, asking experts who were actually working on these technology issues whether machines would be vastly better than humans at all professions, and at that time it said that within 10 years it would be 10 per cent and within 30 years it would be 60 per cent. Where do you fall on this spectrum? I'm just very uncertain. I think there is a significant bump in
9:24 am
probability in the next few years, because people have sort of discovered the gold mine and are just throwing in more compute, more people, more money. I just watched the Senate testimony by Stuart Russell, and he said that something like 10 billion dollars per month is being invested in AI start-ups, more than US science funding for all the rest of science. So there is this sort of gold rush happening in AI, in a very distinct manner when you compare it to the rest of tech and the funding crisis in start-up land and technology in general. So perhaps this might actually precipitate some sudden capability gains,
9:25 am
but I'm very concerned about that. And if that doesn't happen, then all bets are sort of off again, and it's kind of hard to estimate how much more time we'll have. I'm going to tell our audience, Jaan, that you are the real Jaan Tallinn and you're not a deepfake; we haven't conjured you to do all of this. But, you know, maybe some day we would be able to do that, maybe illegally. I've seen the fake Tom Cruise on TikTok, and at an event that I helped put together, we did a deepfake of Barry Diller, this big media titan, and he was not happy at all about it. But it was one of these things where you kind of look at the convergence of a lot of different dimensions of how things have changed. Do you think truth is actually in jeopardy? In some sense, sure, but in some sense, not really. I think we have lived with the ability to produce fake text for
9:26 am
a long time, and we have built things like secure digital signatures, website traffic encryption, things like that, to deal with it. What is new is that this now extends to content we have been used to trusting. So there will be a period during which many people will get fooled by fakes. But of course, if nothing worse happens because of AI, I wouldn't worry that much about it, because people will just start demanding that kind of authentication of sources, and, on the legal side, laws can say that if you are going to fool someone with AI-generated video, you should go to jail. Let me just ask you, finally, we've had a discussion, you and I, about this before, and that is about the fragility of
9:27 am
democracy, and whether technology is worsening the problem or enhancing democratic options down the road. You know, I'm in a country right now where a former American president just had his fourth indictment. I'm not sure how well we're exhibiting democracy today, but when you think about this, part of the question is: might AI make democracy better? I mean, AI could make everything better. Sure. I do think that the situation is complex and unstable, and there is a lot ahead. If things don't turn too quickly, I think we could potentially develop countermeasures and adjust to the new situation, as we did a bit with previous powerful technologies such as the internet, or smartphones, or cameras everywhere, et cetera. But yeah, my only worry is that every new generation of AI will just present bigger and even harder problems. Singularitarian,
9:28 am
hacker, investor and businessman Jaan Tallinn, founder of the Centre for the Study of Existential Risk and co-founder of the Future of Life Institute, thank you so much for being with us today. Thank you very much. So, what's the bottom line? We all would love AI to help doctors diagnose our ailments better, or to protect us, let's say, against fraud and identity theft. But those are just the toes in the door. Generative AI will eventually affect everything, and I mean everything. And there are significant humanless dimensions to it, where data and machines actually talk to each other, learn from each other, and evolve all without us. Some of us would like to buy a car that could drive itself. That's great. But are you willing to live in a country that has an autonomously run government? Actually, when we look around, that doesn't look like a bad idea. But add to that lethal autonomous weapon systems, or robotic killers, or cities that run themselves without any workers, and things start to look a bit more scary. We should be worried about the power of
9:29 am
a handful of people who are making the big decisions on artificial intelligence today. And we should be even more worried when that handful of people is gone and AI is making all those decisions by itself. And that's the bottom line. As temperatures hit the highest on record, environmental leaders will gather in Canada to discuss international action to combat climate change. Can the world meet the 2030 goals set out to end pollution and the loss of biodiversity? The 7th Assembly of the Global Environment Facility, on Al Jazeera. Since its inception in 1961, the Kuwait Fund has been supporting people's livelihoods in over 100 countries by funding projects in an array of sectors, ranging from infrastructure
9:30 am
to health and education. These initiatives ultimately help to eradicate poverty and promote sustainable development. Hello, I'm Kimberly Halkett with the top stories this hour on Al Jazeera. Celebrations in Guatemala as congressman Bernardo Arévalo wins the presidential run-off. The centre-left candidate ran on a platform of anti-corruption. Arévalo was on the ballot against conservative candidate Sandra Torres.