The Context, BBC News, January 23, 2025, 9:30pm-10:00pm GMT
9:30 pm
hello, i'm karin giannone. this is the context on bbc news. president trump tells the world economic forum in davos that global oil prices should be lowered to end the... the $500 billion stargate joint venture aims to build a network of data centres across the us. you're watching the context. now to artificial intelligence: it's time for our new weekly segment — ai decoded. welcome to ai decoded, that time of the week when we look in depth at some of the most eye-catching stories in the world of artificial intelligence. i'm karin giannone, sitting in for christian, who's in washington this week. this...
9:31 pm
9:32 pm
ahead in developing ai? silicon republic says president trump has revoked an executive order signed by joe biden in 2023 to reduce the risks that artificial intelligence poses to consumers, workers and national security. biden's order required developers to share the results of ai safety tests with the us government before they were released to the public. meanwhile reuters reports tiktok's chinese owner bytedance has released an update to its flagship ai model this week aimed at challenging microsoft-backed openai. this comes as the global ai arms race intensifies over who will be first to create ai models capable of tackling the most complex problems. with me in studio are bryan mccann, co-founder and cto at ai company you.com (bryan was an early pioneer in the development of ai prompt engineering), and carissa veliz, associate professor at the faculty of philosophy at the institute for ethics in ai at the university of oxford. she is also the author of privacy is power — a book exposing how our personal data is being handled.
9:33 pm
and to help guide us through everything, our regular ai decoded co-presenter dr stephanie hare is here. stephanie, an enormous week in ai. what is it? it seems like so much, and elon musk is questioning if the money is actually there. that's the first thing we know about. microsoft said they were good for their 80 billion. what looks like an american initiative is also backed by a japanese bank. i'm not sure a lot of americans know that. the second is: what is it about? we're talking data centres, eventually getting onto the energy that's going to provide the electricity that powers those centres. we have to talk about the water needed to fuel
9:34 pm
these data centres, and eventually the data! that's why we have these two experts with us. who would you like to begin with? this is huge. hearing the words stargate and data centre doesn't necessarily mean a great deal to the general public. my questions will be quite basic, but what sort of difference will this make to people's lives? i think this is primarily about stealing the limelight, and i think there is an element of optics at play. economic power, closely tied to computing power and wall power. extracting from fossil fuels and nuclear, as much as we can get. as well as american hard power and soft power. absolutely. my fear is
9:35 pm
focusing all of our eyes on this. it's clear that this has become more and more of a space race type of environment. all the rhetoric is developing in that direction. a race to see how the world will change, who's going to have control. where the space race was associated with the us and the soviet union, this ai race is us and china. that's right. it is very symbolic. something very symbolic at the inauguration: tech ceos were in front of his own cabinet. that was an extremely powerful message to send. of these 500 billion, trump has promised 100 billion to oracle, a company that deals with databases, software having
9:36 pm
to do with predicting how people act, whether we will go for a job or leave a job. larry ellison, the cto of oracle, has predicted a modern surveillance state. he argues citizens will be on their best behaviour. just like energy, they have enough data power. they combine military or political power with other kinds of power. a state in which everyone is behaving really well sounds like a country that already exists on this planet. indeed, several of them. does the united states know what it's building, and does the european union agree? we just had the prime minister
9:37 pm
announce the uk ai action plan only about ten days ago. what are we building here? is there a need for us to step in? it will be like first laying down railroad tracks, and all the things that came with it. it was also done by folks we think of as... we don't entirely know who is at the helm at this stage. people talk about agi, artificial general intelligence, but we don't really have a great demonstration. we knew who won because it... we don't really know when it's going to happen. we will come back to that in the second half. but i wanted to move this on. as a result, we've also heard donald trump talk about reversing joe
9:38 pm
biden's ai safety executive order, the one from 2023, and that gives me something that you might want to talk about, carissa. it seems like the guardrails are down now. exactly. some of these systems are very dangerous. i often give talks to people who do ai for a living, and i ask them: has ai done more good than bad, or more bad than good? at least more than half think it has done more harm than good. we have discrimination, incredible surveillance. we don't even know where we are going with these systems. there's a lot of hype, because the bubble is inflated so much that nobody dares to say something that might challenge it, which kind of keeps hyping the ai. even a few months ago, all we heard about was safety concerns about ai. that's kind of mind-blowing
9:39 pm
and it's been a long year. we are all happy to see the back of it. all the people, not long ago standing next to donald trump, were asking people to sign a petition against ai. they urged a six-month pause, which elon musk signed. none of them paused. what has changed in 18 months? have we solved ai ethics? do we not need to do ai safety any more? talk us through? did i miss a headline?
i think the underlying state of things is that probably nothing has really changed about how much we understand the ethical implications, how we can regulate these things properly, and how they handle people's data. everything is becoming so clearly digital, and some of that may be hype, but some of it is certainly real as well. the
9:40 pm
uncertainty there is generating this tension that says if you aren't moving as fast as possible and you get beaten by someone else, you might lose control of the entire situation. you don't have any say at all. talking about somebody else coming in, tiktok. this really highlights the amount of energy going into this race from china. bryan, carry on with what you were saying about going ahead, about leaving those safety concerns behind. i think that's one primary driver: you have this tension. we will come back to this. at some point, you have europe, and philosophers that i'm very partial to, acting as a conscience. that's what's happening
9:41 pm
right now. but those who are creating it and pushing it fastest are to some extent creating that whole narrative: you might lose a lot of leverage and control to china, for example. what is the end game? i don't want us to win that race if it means authoritarianism. when you turn the analogue into the digital, you turn what is untrackable into something trackable. and the european union doesn't agree with any of this. it doesn't agree with where china or the us is going, because we have the eu ai summit next month. everybody standing next to trump will be standing next to president macron and discussing a path forward. look at the agenda
9:42 pm
that the french have set. it's very european. it's looking at human rights and civil liberties, making sure ai benefits all; it names the elephant in the room. you have to acknowledge the facts about climate change and biodiversity. how are we going to make it powered in a way that's climate-friendly? there'll be some awkward questions and some awkward photo-ops. yeah, so where do you see that going forward, when you see america and china going so fast?
i think it's there to try to operate as a conscience and try to instill some of this regulatory and ethical bias, but i'm curious what you think. when we talk about innovation, we tend to assume it has to do with screens and gadgets, but actually, the most incredible innovations we ever come up with are social and political. europe is at the very cutting edge of what is the society we
9:43 pm
want to build and how can we leave the world better than we found it? we thought we were making progress, and now here comes trump and erases what biden was doing. we do need international alliances, because data doesn't respect frontiers. what's going to happen in these conversations about the boundaries of ai, when countries have such different resources? hold your thoughts there. stay with us. coming up on ai decoded — as the world races to create artificial intelligence capable of human-level reasoning, questions are being raised about our future — when machines are capable of independent thought and action, what role should human intelligence play? we'll discuss in a moment. around the world and across the uk, this is bbc news.
9:45 pm
as we discussed earlier in the programme, the us is in strategic competition with china over which country will be first to reap the benefits of next generation ai technologies — but what about artificial general intelligence? what is it? is that where we're heading, and will ai systems soon be able to reason like humans can? joining us again is bryan mccann, co-founder and cto at ai company you.com, carissa veliz, associate professor at the faculty of philosophy at the institute for ethics in ai at the university of oxford, and our regular ai decoded co-presenter dr stephanie hare. bryan, how far away are we from ai being able to think like humans? i think the key bit is thinking like humans. i think there are going to be a lot of important, impactful things that ai can do, especially as we move from language models and understanding images and other perception tasks to taking
9:46 pm
actions on our behalf. we will hear a lot about agents. that's referring to the agency to make decisions on our behalf. but agi is, i think... just explain that, please. i think it has a few definitions. i think in many ways, it's a moving goalpost. to some extent, it has something to do with superintelligence, being better than humans in general on either everything or most economic tasks, however you want to constrain the space. but right now, it is something that's being touted as 5-10 years away. that close? yeah. going beyond human intelligence? the philosopher part of me says i don't even know that we know entirely what human intelligence is. i think...