tv BBC News BBC News May 16, 2023 3:00pm-3:31pm BST
3:00 pm
live from london, this is bbc news. the rush to regulate artificial intelligence: the man behind chatgpt gives evidence to the us congress. this is the scene live from capitol hill, where that hearing starts in the next few minutes. ukraine says it shot down six of russia's most advanced hypersonic missiles during a night of intense attacks on kyiv. we'll talk live to the former president, petro poroshenko. a bbc investigation finds a uk conservative party donor is a british businessman whose companies are linked to a money laundering investigation. and the glitz and glamour of the 76th cannes film festival. we're live in the south of france.
3:01 pm
hello. welcome to bbc news. if you're just joining us, good timing, because we're about to head to washington to hear from the man leading the change to the way we all live. this is sam altman, the ceo and co—founder of openai, the people who've brought us the artificial intelligence tool chatgpt in the last few months, which has the potential to revolutionise how people work, shop and interact. he's about to be questioned by lawmakers in the us, both about the huge benefits the technology could bring and the threat artificial intelligence could pose for humanity, plus the safeguards that will be needed. in march, elon musk and other tech leaders published an open letter calling for the development of ai systems to be paused. this is the scene, all set for the start of that session. it's going to be fascinating.
3:02 pm
fascinating too is altman himself. those who know him describe him as a brilliant thinker; he's even been called a start—up yoda. one of his first employers said, within minutes of meeting him, "so this is what bill gates must have been like when he was 19." on the technology he's created, altman has admitted being a little bit scared of it, but has pledged to move forward responsibly. let's go live now to our technology reporter, shiona mccallum. it is going to be quite a session, because the man synonymous with all of this is giving testimony to those lawmakers. that's right, and i think it is an opportunity for people to understand and get to know sam altman, because he is lesser—known than meta and facebook's mark zuckerberg but no less impressive. if you have watched any interviews
3:03 pm
with him, he is quite open about the technology that openai has developed. he has addressed the potential, the excitement, the capabilities around ai, which of course he is very much at the forefront of developing, but he is also willing to look at the cons and address those issues around ethics, policy, privacy and all the concerns that people are wanting to ask him about today. i think it will be interesting to hear from him. he is quite a laid—back character; in congress i think you could get a grilling, but i think his personality lends well to answering tough questions, and he has not been shy in addressing them in the past. it will be interesting to see what he says, particularly today. he knows he is under intense scrutiny, so perhaps he'll be choosing his words even more wisely. and he is not in any sort of denial about the potential risk, is he?
3:04 pm
no, absolutely not. he has been willing to say there are pitfalls when it comes to artificial intelligence, and we know that there are these dangers that people are raising. questions over bias: how can the algorithms really get things accurate? what dangers are there if these machines perhaps start becoming even more intelligent than humans? there are lots of questions that i'm sure the senate will be asking him. there are so many different things to get into. i think that's why everyone is going to have their eyes on what he has to say. do stay with us, because we're going to return to you over the next while as we hear this testimony, but let's bring in live now nello cristianini, professor of artificial intelligence at the university of bath. what do you see as the big risks with this type of technology? the first thing to say is that artificial intelligence was not
3:05 pm
born with chatgpt in november 2022, so we know quite a lot about what we are worried about, and politicians in washington also know quite well what the main concern is. they will try to walk this fine line between not stifling innovation and yet ensuring that they can regulate artificial intelligence safely. in terms of that last phrase you used, is it clear how you regulate something like this? no, it is not clear. everyone agrees it must be done; the european union is very close to a law which will be passed in a few weeks, but it keeps on moving. the least we can do is list the concerns, and in washington i think they will focus a lot on the risk of fake news, misinformation.
3:06 pm
i wonder if you can stay there. those are the opening remarks; i am going to put the microphones up so we can hear a little of what is being said. hearing that audio played as a track, you may have found it curious or humorous, but what reverberated in my mind was: what if i had asked it, and what if it had provided an endorsement of ukraine's surrendering or of vladimir putin's leadership?
that would have been frightening, and the prospect is more than a little scary, to use the word you have used yourself, and i think you have been very constructive in calling attention to the pitfalls as well as the promise, and that is the reason why we wanted you to be here today, and we thank you and the other witnesses for joining us. for several months the public has been fascinated by chatgpt and other
3:07 pm
ai tools. examples like the homework done by chatgpt and the articles and op—eds that it can write feel like... but the underlying advancements of this era are more than just research experiments. they are no longer fantasies of science fiction. they are real. the promises of curing cancer, understanding physics, biology, modelling climate and weather, are all very encouraging and hopeful. but we also know the potential harms, and we have seen them already: weaponised disinformation, housing discrimination, harassment of women and impersonation fraud, voice cloning, deepfakes. these are the potential risks, despite the other rewards.
3:08 pm
and for me, perhaps the biggest nightmare is the looming new industrial revolution: the displacement of millions of workers, the loss of huge numbers of jobs, the need to prepare for this new industrial revolution in skill training, and the relocation that may be required. already, industry leaders are calling attention to those challenges. to quote chatgpt, this is not necessarily the future that we want. we need to maximise the good over the bad. congress has a choice now. we had the same choice when we faced social media. we failed to seize that moment. the result is predators on the internet, toxic content, exploiting children,
3:09 pm
creating dangers for them. senator blackburn and others are trying to deal with it, with the kids online safety act, but congress failed to meet the moment on social media. now we have the obligation to do it on ai before the threats and the risks become real. sensible safeguards are not in opposition to innovation. accountability is not a burden, far from it: they are the foundation of how we can move ahead while protecting public trust, they are how we can lead the world in technology and science while also promoting our democratic values. otherwise, in the absence of that trust, i think we may well lose both. these are sophisticated technologies, but there are basic expectations common in our law. we can start
3:10 pm
with transparency: ai companies should be required to test their systems, disclose known risks and allow independent researcher access, and establish scorecards to encourage competition based on safety and trustworthiness. limitations on use: where the risk of ai is so extreme, we should impose restrictions or ban their use, especially when it comes to commercial decisions that affect people's livelihoods. and accountability: when companies and their clients cause harm, they should be held liable. we should not repeat our past mistakes, for example section 230. forcing companies to think ahead and be responsible for the ramifications of their decisions can be the most powerful tool of
3:11 pm
all. garbage in, garbage out: the principle still applies. we ought to be aware of the garbage, whether it is going into these platforms or coming out of them, and the ideas that we develop in this hearing will provide a solid path forward. i look forward to discussing them with you today, and i will finish on this note: the ai industry does not have to wait for congress. i hope there are ideas and feedback from this discussion, and that the industry takes voluntary action such as we have seen lacking in many social media platforms, where the consequences have been huge.
i am hoping we will elevate rather than have a race to the bottom, and i think these hearings will be an important part of this conversation. this one is
3:12 pm
only the first. the ranking member and i have agreed there should be more, and we're going to invite other industry leaders, some of whom have committed to come, experts, academics, and the public, who we hope will participate. and with that i will turn to the ranking member, senator hawley. thanks to the witnesses for being here. i appreciate that several of you had long journeys to make; i appreciate you making the time, and i look forward to your testimony. i want to thank senator blumenthal for being a leader on this topic. a year ago we could not have had this hearing, because the technology we're talking about had not burst into public consciousness. that gives us a sense, i think, of just how rapidly this technology that we are talking about today is changing and evolving and transforming before our very eyes. i was talking with someone just last night, a researcher in the field of
3:13 pm
psychiatry, who was pointing out to me that with chatgpt and generative ai, these large language models, it is like the invention of the internet, at scale at least, and potentially far more significant than that. we could be looking at one of the most advanced technical innovations in human history, and my question is: what will it be? like the printing press, that diffused knowledge and power and learning widely across the landscape, that empowered individuals, that led to greater flourishing and greater liberty? or is it going to be more like... studio: we will come away from those lawmakers as they make their introductory remarks. we will return to hear sam altman give his testimony, but let me bring back in the professor of artificial intelligence, who was listening to that. in terms of the things you are
3:14 pm
listening out for, from sam altman himself and from the politicians, what are the key things you're keeping your eye on? my impression is that, of the very long list that we do have of concerns, today they are really focusing on misinformation. i think it may be due to the coming election campaign; they want to get it right, of course. bad memories from the past. let's remember there are many concerns: privacy, bias, the potential for persuasive technologies that you cannot quite imagine yet, and also misinformation. the list is very long. it should be addressed item by item. it looks like they are focusing on public opinion and misinformation. we will leave it there, but we will return in the next three minutes, or maybe you can hang on and stay with us, but very interesting what we have
3:15 pm
heard so far from those lawmakers, laying out the sort of concerns they have about this new wave of artificial intelligence. the key testimony is to come, and we will return straight to washington as soon as that starts. whilst we wait, let's turn to ukraine, our other main story. ukraine claims to have shot down six advanced russian hypersonic missiles during a fresh overnight barrage, described by officials as "exceptional in its density". ukraine says all 18 missiles of various types were shot down. this footage shows kyiv's air defences in action. loud explosions were heard across the city. at least three people were injured by debris. in a tweet, ukraine's defence minister says...
3:16 pm
meanwhile, russia's defence ministry has claimed that it destroyed a patriot air defence system in kyiv. ukrainian officials have refused to comment. the latest barrage came just hours after president zelensky wrapped up a european tour, in which he was promised billions of dollars' worth of military aid by western allies. in another development in the past hour, ukraine has claimed its forces have retaken some 20 square kilometres of territory around the city of bakhmut in recent days. a little earlier, the mayor of kyiv spoke to the bbc about the intense bombardment overnight. three hours long, beginning 2.30, we have alarm. the whole night we listen
3:17 pm
to huge explosions. one of the massive attacks on our home town. hypersonic missiles also. we shoot down russian weapons; three people injured from the parts of missiles falling down on our hometown. thanks for partners, nobody has died, but three people are in hospital. live now to kyiv, where i can speak to zhanna bezpiatchuk from the bbc ukrainian service. a huge barrage of missiles, but as we have reported, the ukrainian authorities are
3:18 pm
pleased about the way the defence systems coped overnight. exactly, and i can add that the residents are extremely pleased with this. people have a feeling that on these particular nights the ukrainian air defence has really saved their lives, saved the infrastructure of kyiv from damage, from that large—scale damage, because small—scale damage happened of course. for example, in the morning, people not far from my apartment building on the eastern left bank found signs of damage to their apartment buildings, and found also small fragments of the missiles in the playground of their residential block.
3:19 pm
so it is just one example. but overall, what i can say is that it was a very hard night and a sleepless night for many people, although ukrainians do trust their air defence after all the reinforcement and supply that has happened. and what actually happened, it was, according to... hypersonic as well as cruise missiles launched in the shortest time span to attack the ukrainian capital, so that is the reason it was so noisy. i personally heard multiple serious explosions, and it also sounded like the ukrainian air defence, and i thought... then it got
3:20 pm
much calmer, and then there was the alarm at that time during the attack. it is a difficult line, so we will leave it there, but thanks for the latest there. and a pointer, because in about 15 minutes' time we will be talking to ukraine's former president petro poroshenko, for his take on everything we have seen overnight and in recent weeks, with the tour of european capitals, and perhaps what lies ahead with the potential for the spring offensive. all of that coming up in 15 minutes' time. as promised, i will take you back to capitol hill. some of the introductions are being made, and i don't think we are that far from hearing from sam altman and others who are giving testimony on ai and chatgpt and all the questions and concerns that lawmakers have. let's take the
3:21 pm
microphones up and rejoin proceedings. ...development and cognitive neuroscience. thank you for being here, and as you may know, our custom on the judiciary committee is to swear in our witnesses before testimony, so if you would rise and raise your right hand. do you solemnly swear that the testimony you are going to give is the truth... thank you. mr altman, we are going to begin with you, if that is ok. thank you. thank you, senator blumenthal and members of the committee, for the opportunity to speak to you today. it is an honour to be here, even more than i expected. my name is sam altman. i am the chief executive officer of openai. openai was founded on the belief that ai has the potential to improve nearly
3:22 pm
every aspect of our lives, but also that it creates serious risks we have to work together to manage. we are here because people love this technology. we think it can be a printing press moment, and we have to work together to make it so. openai is an unusual company, and we set it up that way because ai is an unusual technology. we are governed by a nonprofit, and our activities are driven by a mission and charter which commit us to working to ensure the broad distribution of the benefits of ai and to maximising the safety of ai systems. we are working to build tools that one day can help us make new discoveries and address humanity's biggest challenges, like curing cancer and climate change.
our current systems are not yet capable of doing these things, but it has been immensely gratifying to watch many people around the world get so much value from what the systems can already do today. we love seeing people use our tools to create, learn, and be more productive. we're very optimistic they are going
3:23 pm
to be fantastic jobs in the future and that current jobs can get much better. we also have seen what developers are doing to improve lives. for example, be my eyes uses our technology to help visually impaired individuals navigate their environment. we believe the benefits of the tools we have deployed so far vastly outweigh the risks, but ensuring their safety is vital to our work, and we make significant efforts to ensure safety is built into our systems at all levels. before releasing any new system, openai conducts extensive testing, engages external experts for detailed reviews and audits, improves the model's behaviour and implements robust safety and monitoring systems. before we released gpt—4, we spent six months running tests. we are proud of the progress
3:24 pm
we made, and gpt—4 is more likely to respond helpfully and truthfully, and to refuse harmful requests, than any other widely deployed model of similar capability. however, we think regulatory intervention by governments will be critical to mitigate the risks of increasingly powerful models. for example, the us government might consider a combination of licensing and testing requirements for development and release of ai models above a threshold of capabilities. there are several areas i mention in my written testimony where i believe companies like ours can partner with governments, including ensuring the most powerful ai models adhere to a set of safety requirements, facilitating processes to develop and update safety measures, and examining opportunities for global coordination. and as you mentioned, i think it is important that companies have their own responsibility no matter what. this is a remarkable time to be working on artificial intelligence. as technology advances, we understand people are anxious
3:25 pm
about how it could change the way we live. we are, too. but we believe that we can and must work together to identify and manage the potential downsides so that we can all enjoy the tremendous upsides. it is essential that powerful ai is developed with democratic values in mind, and this means that us leadership is critical. i believe we will be able to mitigate the risks in front of us and capitalise on this technology's potential to grow the us economy and the world's, and i look forward to working with you all to meet this moment, and to answering your questions. thank you.
thank you, chairman blumenthal and members of the subcommittee, for today's opportunity to present. ai is not new, but it is certainly having a moment. recent breakthroughs in generative ai and the technology's dramatic surge in public attention
3:26 pm
have rightfully raised serious questions at the heart of today's hearing. what is ai's potential impact on society? what do we do about bias? what about misinformation, misuse, or harmful content generated by ai systems? these are the right questions,
and i applaud you for convening today's hearing to address them head—on. ai may be having its moment, but the moment for government to play a role has not passed us by.
this period of focused public attention on ai is precisely the time to define and build the right guardrails to protect people and their interests. but at its core, ai is just a tool, and tools can serve different purposes.
to that end, ibm urges congress to adopt a precision regulation approach to ai. this means establishing rules to govern the deployment of ai in specific use cases, not regulating the technology
3:27 pm
itself. such an approach would involve four things. first, different rules for different risks. the strongest regulation should be applied to use cases with the greatest risks to people and society. second, clearly defining risks.
there must be clear guidance on uses or categories of ai—supported activity that are inherently high risk. this common definition is key to enabling a clear understanding of what regulatory requirements will apply in different use cases and contexts. third, be transparent.
ai should not be hidden. consumers should know when they are interacting with an ai system, and should have recourse to engage with a real person should they desire. no person anywhere should be tricked into interacting with an ai system.
finally, showing- tricked to interacting with an ai system. finally, showing the l tricked to interacting with an ai - system. finally, showing the impact for higher— system. finally, showing the impact for higher risk— system. finally, showing the impact for higher risk use _ system. finally, showing the impact for higher risk use cases _ system. finally, showing the impact for higher risk use cases companiesl for higher risk use cases companies should _ for higher risk use cases companies should be _ for higher risk use cases companies should be required _ for higher risk use cases companies should be required to _ for higher risk use cases companies should be required to conduct - for higher risk use cases companies i should be required to conduct impact assessments — should be required to conduct impact assessments that— should be required to conduct impact assessments that show— should be required to conduct impact assessments that show how - should be required to conduct impact assessments that show how their- assessments that show how their systems— assessments that show how their systems perform _ assessments that show how their systems perform against - assessments that show how their systems perform against test - assessments that show how their systems perform against test for| systems perform against test for bias and — systems perform against test for bias and how— systems perform against test for bias and how that _ systems perform against test for bias and how that could - systems perform against test for bias and how that could impact l systems perform against test for. 
bias and how that could impact the public— bias and how that could impact the public and — bias and how that could impact the public and a — bias and how that could impact the public and a test _ bias and how that could impact the public and a test they— bias and how that could impact the public and a test they have - bias and how that could impact the public and a test they have done i bias and how that could impact the i
3:28 pm
the public, and attest that they have done so. by following a risk—based, use—case approach at the core of precision regulation, congress can mitigate the potential risks of ai without hindering innovation. but business also plays a critical role. companies active in developing or using ai must have strong internal
governance, including among other things designating a lead ethics official responsible for an organisation's trustworthy ai strategy, and standing up an ethics board or similar function as a centralised clearing house for resources to help guide implementation of that strategy.
ibm has taken both of these steps, and we continue calling on our industry peers to follow suit. our ai ethics board plays a critical
role overseeing internal processes, creating reasonable guardrails to ensure we introduce technology into the world in a responsible and safe manner.
3:29 pm
it provides centralised governance and flexibility to support decentralised initiatives across ibm's global operations. we do this because we recognise that society grants the licence to operate, and with ai, the stakes are simply too high.
we must build, not undermine, the public trust. the era of ai cannot be another one of move fast and break things, but we do not have to slam the brakes on innovation either. these systems are within our control today, as are their solutions.
what we need at this pivotal moment is clear, reasonable policy and sound guardrails. these should be matched with meaningful steps by the business community to do their part.
congress and the business community must work together to get this right. the american people deserve no less. thank you for your time, and i look forward to your questions. thank you.
3:30 pm
inaudible
thank you, senators. today's meeting is historic. i am someone who has founded ai companies, and someone who genuinely loves ai, but who is increasingly worried. there are benefits, but we do not yet know whether they will outweigh the risks. fundamentally, these systems are going to be destabilising. they can and will create persuasive lies at a scale humanity has never seen before. outsiders will use them to affect elections, insiders to manipulate markets and political systems. democracy itself is threatened. chat bots will clandestinely shape opinion, potentially exceeding what social media can do. the dataset choices ai companies make will have enormous unseen influence. those who choose the data will make the rules, shaping society in subtle but powerful ways. there are other risks. a law professor was accused by a chat bot of sexual