
Discussion on Protecting Human Rights Defenders Online | CSPAN | April 15, 2024, 8:00am-9:31am EDT

8:00 am
8:01 am
watch your favorite authors online anytime at booktv.org. you can also find us on facebook, youtube and x at booktv. >> up next, officials from the national security council and the state department talk about the importance of protecting human rights defenders who operate online. topics include authoritarian repression in myanmar and how online platforms are evaluating and protecting the speech of human rights defenders. this is hosted by the center for strategic and international studies.
8:02 am
>> [inaudible conversations] [inaudible conversations] [inaudible] i want to thank our partners, access now, the department of state and the atlantic council digital -- [inaudible] i am the director of the human rights initiative and
8:03 am
humanitarian agenda here at csis. today in 2024, according to the digital 2024 global overview report, 5 billion active social media profiles exist in the world; as a percentage of global population, that is 62% of the world engaging on social media. mobile phone users are up to almost 70% of the global population, and more than 66% of all people on earth now use the internet. these issues affect the daily fabric of all of our lives as we become increasingly reliant on social media platforms for our commerce, our entertainment, our utilities, even how we pay our bills, all the way to the networks we engage in. people are experiencing some of the positives of this and some of the negatives -- the dark underside of the internet that is being used to track, threaten and target people, including human rights defenders, the very people
8:04 am
upholding the values we all care about -- human rights, democracy and other important issues. human rights defenders include everyone from members of ngos and trade unions to environmental advocates, land rights activists, women's rights champions, anticorruption activists and representatives of indigenous peoples. really, some of the people we most rely on to fight for human rights are being impacted, positively and negatively, by this issue. so it couldn't be more important than it is today to have this discussion, and we are very glad you decided to join us. for those watching online and those in the room, we have the recommendations listed here on the screen, and you can also use your phone to scan the qr code if you want to look at the full report. for those online, you can access that as well. i'll give a couple of announcements: if there's an emergency, the emergency exits are right behind you, and the bathrooms are down the hall. again, it's my pleasure to be with you here today, and i will
8:05 am
now introduce the special guests we have here who are going to share some important thoughts with us. first we will have remarks by kelly, the national security council special assistant to the president and senior director for democracy and human rights. kelly serves with the white house national security council as special assistant to the president and senior director for democracy and human rights. prior to this she was acting chief of staff and deputy chief of staff for policy for the u.s. ambassador to the united nations, linda thomas-greenfield. from 2018 to 2020 she was director of humanitarian policy and advocacy for the international rescue committee, irc, where she represented the organization at the united nations, and she has held numerous u.s. government roles throughout her career advancing key u.s. priorities at the u.n. in new york. she also served as human rights advisor to ambassador susan rice and then as a senior policy adviser to ambassador samantha
8:06 am
power, where she led key human rights initiatives for the obama administration, including efforts to secure the release of political prisoners around the world. she has had a distinguished career as a civil servant as well, working in a variety of important state department bureaus, and she has a juris doctor from depaul university college of law, where she was a sullivan human rights law fellow. so i couldn't be more delighted to introduce kelly here to the podium to kick us off with some keynote remarks. thank you. [applause] >> thank you so much, michelle, for that introduction, and thanks to all of you for being here today. it is such an honor to be at the center for strategic and international studies. it is my first time here and i am so thrilled to be in the room with experts from civil society, from technology companies, governments, and to host this
8:07 am
launch in partnership with access now and the atlantic council's digital forensic research lab. technology, as michelle said and as a lot of you know, has fundamentally transformed the fight for human rights around the world. online platforms enable activists to mobilize and share information more quickly and more widely than ever before. but human rights defenders across the globe too often face technology-facilitated threats and attacks such as targeted surveillance, censorship, and harassment. these attacks can also move from the digital to the physical world. the misuse of commercial spyware, for example, has been linked to arbitrary detentions, disappearances, extrajudicial killings, and transnational repression. a woman was physically attacked and sexually assaulted for her advocacy highlighting the growing hate against lgbtqi+ people online.
8:08 am
a reporter focused on exposing acts of corruption was murdered at a carwash after being targeted by commercial spyware. and these are just two of the stories. last week, at the third summit for democracy in seoul, the united states convened a high-level side event focused on the misuse of commercial spyware. the reason we highlighted this at the summit for democracy is that it is both a national security and counterintelligence threat. but it's also a very real threat to democracy. my colleague maher bitar, who serves as coordinator for intelligence and defense policy at the white house, and i co-moderated a panel at the summit that brought together ministers from countries, journalists, civil society and private sector experts, like representatives from the investor community, to discuss the importance of exposing the misuse of commercial spyware and protecting human rights
8:09 am
defenders. the journalists talked about the fear for their loved ones, who faced profound risks to their safety and security. their comments reflected what i have heard over and over again as i have had the honor of meeting with heroic journalists and human rights defenders from around the world who have been victims of online attacks. they described the chilling effect that commercial spyware intrusions have had on their ability to continue their reporting and their activism. the isolation they faced from colleagues and counterparts who now fear any contact with them. one prominent russian journalist has publicly described an intrusion by commercial spyware as feeling like she was stripped naked in the street. online attacks are all too often gendered as well. around the world, an estimated 85% of women and girls have experienced or witnessed some form of online harassment and abuse.
8:10 am
indeed, gendered information manipulation and non-consensual synthetic content are frequently designed to silence and suppress women political and public figures, by forcing women to self-censor or limit their online activity. the growing access to and use of ai further exacerbates these harms by expanding the speed and scale of intimidation, manipulation and synthetic content, including nonconsensual intimate images, facilitating pervasive surveillance, and enabling enhanced and refined online censorship. the companies and civil society organizations here know these harms all too well. many of you have reported on the insidious tactics of nefarious actors that use online platforms to target members of civil society, journalists, and activists. addressing these threats is critical not just for the
8:11 am
individual survivors of attacks; it is also a global imperative for the defense of inclusive, representative democracies. the united states government remains resolute in our commitment to address these harms. as president biden said at the second summit for democracy, we must ensure that technologies are used to advance democratic governance, not to undermine it. the united states helped develop the guidance we are launching here today for online platforms, to support governments, civil society, and the private sector in coming together to fight back against these abuses. this guidance is part of a whole-of-government effort, as secretary blinken explained last week at the third summit, to build an inclusive, rights-respecting technological future that sustains democratic values and democratic institutions. last march, the president issued an executive order
8:12 am
prohibiting u.s. government use of commercial spyware that poses risks to national security or has been misused by foreign actors to enable human rights abuses around the world. over the past year, the united states has leveraged sanctions, export controls, foreign assistance programs, and visa restrictions to support victims and hold governments and firms accountable. we have built a global coalition of countries committed to this cause. in fact, last week at the summit, in addition to hosting the side event i just mentioned, we also announced that six new countries would join a joint statement on efforts to counter the proliferation and misuse of commercial spyware, adding to the inaugural 11 countries. additionally, the cybersecurity and infrastructure security agency at the department of homeland security has been partnering with many other organizations and companies in this room to protect high-risk
8:13 am
communities, including civil society organizations and human rights defenders, through their joint cyber defense collaborative. at the first summit for democracy in 2021, the biden administration launched the global partnership for action on gender-based online harassment and abuse, which brings together governments, international organizations, civil society, and the private sector to accelerate progress towards safety and accountability for women and girls online. one thing we know to be true is that we can't do any of this alone. we need the expertise of civil society actors, the private sector, and online platforms. today's launch is a starting point for the important conversations that will take place going forward as we look to continue to strengthen these partnerships. thank you for your time today, and we look forward to the conversations and work ahead to address these critical issues. [applause]
8:14 am
>> thank you so much. it is now my pleasure to introduce ambassador robert gilchrist. he is the senior bureau official in the bureau of democracy, human rights and labor at the u.s. department of state. he has served as principal deputy assistant secretary since december of 2023. he is a senior member of the foreign service, class of minister-counselor. his last position was ambassador to the republic of lithuania, from 2020 to 2023. prior to being ambassador, mr. gilchrist served as director of the department of state's operations center, deputy chief of mission at the u.s. embassy in sweden, deputy chief of mission at the u.s. embassy in estonia, and director of nordic and baltic affairs at the state department. he was also deputy political counselor at the u.s. embassy in
8:15 am
iraq, chief of the political section at the u.s. embassy in romania, and a special assistant in the office of the deputy secretary of state. join me in welcoming ambassador robert gilchrist. [applause] >> thank you to all of you for joining us this morning as we officially launch the u.s. guidance for online platforms on protecting human rights defenders online. i would like to extend my thanks to csis, to the atlantic council and to access now for hosting this event, and to many of you in this room for your insights, inputs and reflections during consultations over the last year as we developed this guidance. i'm robert gilchrist, senior bureau official in the bureau of democracy, human rights and labor at the department of state. human rights defenders, or hrd's as we call them, play an integral role in promoting and protecting human rights.
8:16 am
governments and private sector companies should take steps to protect hrd's and respect the rights and values for which they so fearlessly and tirelessly advocate. the united states developed the guidelines for online platforms we are launching today in response to the rapid growth of online threats against hrd's around the world. we remain resolute in our commitment to put human rights at the center of our foreign policy and to condemn attempts to silence human rights defenders' voices through threats, harassment, criminalization, or violence. we are grateful that the eu shares this commitment and is working with us to elevate the voices of defenders and underscore their essential role as individuals on the front lines, defending human rights. on march 11, we released a joint u.s.-eu recommendation outlining 10 steps companies can take to better identify,
8:17 am
mitigate, and provide access to remedy for digital attacks targeting hrd's. the u.s. guidance builds on those recommendations by providing specific actions and best practices that companies can take to protect hrd's who may be targeted on or through platforms, products, or services. this guidance is addressed broadly to online platforms that host third-party content. but we also hope it can support collaboration within the broader ecosystem, including with civil society organizations that directly work with human rights defenders and act as trusted-partner organizations for platforms. i also want to clarify whom this guidance is designed to protect. the united states defines human rights defenders as individuals who, working alone or in groups, nonviolently advocate for the promotion and protection of universally recognized human rights and fundamental freedoms. defenders can be of any gender, identity, age, ethnicity, sexual
8:18 am
orientation, religious belief or nonbelief, disability, or status; some identify as lawyers, researchers, or journalists. hrd's continue to face threats and attacks, including arbitrary or unlawful online surveillance, censorship, harassment, smear campaigns, disinformation -- including gendered disinformation -- internet shutdowns, and doxing. and online attacks pave the way for physical attacks, including beatings, killings, disappearances and arbitrary detention. as threats to defenders and democratic values escalate, preserving safe spaces online is more important than
8:19 am
ever. as kelly mentioned, the u.s. government takes a broad approach to democratic values online. at the state department we are committed to supporting and protecting hrd's so they can carry out their essential work without hindrance or undue restriction, and free from fear of retribution against them or their families. in washington and throughout the world, department staff maintain regular contact with human rights defenders. our embassies have dedicated human rights officers who regularly monitor human rights developments, meet with defenders and their families, and advocate for our shared principles. over the past decade, the department has provided $60 million to directly support almost 10,000 human rights defenders from more than 130 countries and territories, from journalists to anticorruption activists and environmental defenders to labor activists. this often life-saving assistance has enabled over 90% of recipients to safely return
8:20 am
to their work. while we are committed to enabling the work of cso's as a government, this is a collective responsibility. online platforms provide important tools to enable the work of hrd's, and they can do more to help them do that work safely. as companies, they have a responsibility to respect human rights in line with the u.n. guiding principles on business and human rights. we hope our guidance, which was developed in consultation with hrd's, civil society organizations and platforms -- many of you in this room -- will provide companies with a blueprint of actions they can take to better protect hrd's. we ask that you adopt and adapt the recommendations in this guidance to improve your own policies and processes. thank you to our esteemed panelists for being here today and for sharing your
8:21 am
perspectives on opportunities for continued collaboration. collective problems demand collective solutions, and i am heartened to see so many stakeholders engaging with us today. we are all partners in this effort. thank you so much. [applause] >> thank you, everyone. i'm very pleased to introduce a panel that i will
8:22 am
moderate with each of them, and then we will have about 15 minutes at the end for a question and answer session. you can raise your hand in the room if you would like to ask a question, and we have a microphone that will be coming around to you. if you're watching online, you can submit a question, and i will be able to see the question and ask it in the room. we have full participation from both online and in-room audience members. without further ado, i will introduce the distinguished panelists. starting to my left is jason pielemeier, the executive director of the global network initiative, or gni. gni is a dynamic multi-stakeholder human rights collaboration building consensus for the advancement of freedom of expression and privacy among technology companies, academics, human rights and press freedom groups, and investors. prior to joining gni he was a special advisor at the department of state, where he led the internet freedom and business and human rights section in the bureau of democracy, human rights and labor. in that role he worked with
8:23 am
colleagues across the u.s. government, counterparts in other governments, and stakeholders around the world to promote and protect human rights online. next to him is the asia pacific policy analyst at access now. she has been working in myanmar for about 10 years and has previous experience as a political advisor and in media and communications. she has been one of the leading people in myanmar organizing the myanmar digital rights forum since it was first held in 2016, and it is and was the cornerstone of the digital rights movement. then we have alex walden, who is the global head of human rights for google. alex is google's global head of human rights and free expression. she builds partnerships to promote free speech and expression and to combat internet censorship and filtering across the globe. she also coordinates policy and strategy around these issues for the company. prior to joining google, she
8:24 am
worked at the raben group, the center for american progress, and the naacp legal defense and educational fund. she also served as a law clerk for the senate committee on the judiciary and for the house of representatives judiciary subcommittee on the constitution, civil rights and civil liberties, and worked for the u.s. department of labor and bay area legal aid during law school. thank you for being here. and last but not least, i have cheryl mendez, who is the senior program manager of freedom house's emergency assistance program. she has worked for over a decade advocating for and providing emergency assistance and logistical support to human rights defenders and journalists at risk worldwide. in her role at freedom house, she provides support to human rights defenders and civil society organizations who are at risk due to their human rights work. prior to joining freedom house, she worked at the committee to protect journalists, founded the crimes of war project, and helped launch a culture of safety alliance, a collaboration between news organizations, press freedom ngo's and journalists to promote
8:25 am
journalist safety. mendez is a photojournalist who has trained documentary photographers in the middle east and north africa. thanks so much for joining this discussion today. i want to start with a couple of questions to set the stage for the participants watching. i will start with alex and jason with a question. you have worked closely with the private sector on addressing the human rights risks associated with technology for a number of years, and therefore have a very unique perspective on how companies have worked to address threats to human rights defenders over the years. so, what would be interesting, i think, for the audience to know is: how have you seen these risks change over time? are there private-sector actions that have worked really well in combating these issues, and are there any hard lessons you can share? are there common standards today that maybe you would push
8:26 am
for that once seemed unachievable? help us understand that journey of how the situation has changed over time and how you think this guidance fits into the broader efforts. alex: thanks for including us in the conversation today. it's a great question. i have been at google for almost nine years, and if i look at how the industry has changed, how technology has changed over that time, and the way the ecosystem of civil society has coalesced around these issues -- we are still dealing with permutations of the same thing, right? there are people who are trying to defend human rights and exercise their human rights, and there are bad actors, governments or otherwise, who are targeting them and trying to stop them from doing that good work. the quintessential problem remains the same, but the tools available to bad actors have
8:27 am
evolved, and the tactics the bad actors use have also evolved. so, teams within companies have over time, i think, become more resourced to look at these things and become attuned to tracking them, and we have developed intel teams inside companies that are looking at how bad actors are trying to manipulate our technology and our platforms to target our vulnerable users. and i think companies have also made more efforts to engage with civil society to understand exactly how the problems are manifesting and how they look different in different places around the world -- different and the same. and ultimately, yes, we also have a.i. and generative a.i., and these things are morphing the ways in which people are using technology. all of those things are changing, and still the fundamental problem is the same: how do we collectively work
8:28 am
together to understand how problems are evolving to fight back against them? but it is constant, right? the threats are evolving and so are our responses. >> jason, i would love to hear your thoughts. jason: thank you, michelle, and thank you to csis for hosting, to the state department, to access now, and to everyone who did the hard work to put this guidance together. i think it is really a wonderful set of principles, and i'm looking forward to it having a meaningful impact. to answer the question, and building off of what alex said, it is worth taking a step back, because i think these principles build nicely on a foundation that is in some ways 20 or more years in the making. i think for the purposes of this conversation, it's worth starting with the u.n. guiding
8:29 am
principles that were developed in a really creative way by john ruggie, the special representative on business and human rights to secretary-general kofi annan, who carried out a multi-year, multi-stakeholder series of negotiations that produced the protect, respect and remedy framework that we now know as the u.n. guiding principles, endorsed by the human rights council in 2011. and that framework really was a pivot point in that it created a foundational shared understanding of the respective duties of states and responsibilities of companies, including in terms of remedy. the gni principles were negotiated in parallel to that process, and in fact john had an advisor who was embedded in the gni negotiations, and there was a lot of crosstalk at the time.
8:30 am
the gni principles came out of a series of incidents that were the result of persecution of human rights defenders using data that had been given to a number of authoritarian governments by u.s. tech companies. and so, the concerns about human rights defenders -- i do not think that was a term that was necessarily used at that point in time, but the underlying situation of journalists and activists who were trying to use these new technologies and tools to do their work being threatened and persecuted as a result, was very much what animated the gni principles. where the u.n. guiding principles apply broadly, the gni principles apply to the tech sector, and since then, over the last 15 years, we have seen i think a tremendous growth in the amount of
8:31 am
attention and the amount of resources that are put into questions around technology and human rights. so, alex mentioned that a number of tech companies now have put more time and resources into this. at gni, we have a set of principles and application guidelines that member companies like google commit to implementing, and then we have a unique assessment process that holds them to account on those commitments. at a very high level, those involve creating a human rights policy; embedding that policy through trainings for employees across the company; having senior-level oversight of that work; creating appropriate escalation channels; putting in place the human rights due diligence processes that are needed to become aware of and be able to respond to human rights risks, including with respect to defenders; and then having appropriate remedy and transparency
8:32 am
throughout. so, that framework, if it sounds familiar, it's because it is very similar to the one that is in these guidelines, and i think that is a really helpful way to structure this, because it takes advantage of what companies have already been working on now for over a decade. it hits all of those points, so that coherence will be really useful in terms of the ability of companies to take and implement this guidance. it will also help civil society organizations and advocates on the outside who are familiar with these frameworks be able to advocate and hold companies accountable in a more consistent way. michelle: thank you both for this great overview of how we ended up here and how these guiding principles are such an important practical step forward. i want to take it now to -- you have such a unique perspective on these issues, having worked as a human rights defender in myanmar, and also for an
8:33 am
organization that is dedicated to helping defenders. i want to hear about the context you are working in. and if you can take us through a hypothetical: if there were a human rights defender today in myanmar who suddenly began receiving threats and being doxed on several platforms, what steps could they realistically take to protect themselves? and then, when we move to the platforms, what are the most urgent types of threats you see that platforms need to have processes to address, from that on-the-ground perspective? >> thank you. thank you so much. i am here not just as a team member of access now; i am also here as a human rights defender from myanmar. just to give you some background, for many of the people here, about what is going on in myanmar: basically, right
8:34 am
now, it has been over three years that people in myanmar have been tirelessly resisting the military coup of 2021. on a daily basis, what we see on the ground, for over three years, is pretty much people in myanmar facing human rights abuses across the states committed by the military. on a daily basis, what we have seen is air strikes -- dropping bombs on civilians in towns -- and many villages have literally been wiped out entirely. even this morning, on my way here, i received a message that a township has basically been set on fire, and the internet there has been shut down. those are the kinds of things, on a daily basis, that we have seen, both online and off-line as
8:35 am
well. and also, people have been killed and detained, on a daily basis. and when you look online, it is the same: internet shutdowns. the military has basically been shutting down the internet since day one. currently, we are seeing over 80 townships -- more or less one part of the country does not have the internet at all. in some of these townships, around 30 townships, it has been over two years without internet. and in some areas, not just the internet -- mobile communication has been shut down as well. and then, on the other side, in terms of websites, including all of these social media and messaging platforms, they have been blocked for almost three years now. even the use of vpns has
8:36 am
been banned in myanmar. on the other side, the military has been trying to use different tools, trying to abuse all sorts of digital tools against civilians. they have been abusing all sorts of collected data, from surveillance footage to sim card registration data and the biometric data they collected through i.d. projects, etc. so they have been abusing all of this data to identify these individuals -- where these people are exactly and what they are doing -- and to monitor their online activities and their financial transactions, and also violating
8:37 am
all of their privacy, etc. that is what we have seen on a daily basis as well. and at the same time, the military has been actively monitoring all of this social media. even though they banned social media and messaging, people are trying to find ways around it, etc., but the military is still monitoring all of these digital footprints happening across different platforms, and doxing people, abusing people. we have seen, literally, what is like a campaign of terror. that is basically what we are seeing -- people have been arrested, arbitrary detention has been happening, even killings. one of our close friends basically got killed because she was arrested. then there
8:38 am
was a trial. and right after the trial, we didn't get information about where she was taken, et cetera. a few days later, on the way back from the court, she was killed. so those are the kinds of reports we get, basically, on a daily basis. this is the level, at the grassroots level, that people in myanmar are experiencing. and then, when we use the term human rights defender -- as our keynote speaker also defined it -- the majority of the people in myanmar are currently, basically, standing up, taking key risks to be able to defend their rights, to be able to defend the rights of their generation. so when we talk about
8:39 am
the term human rights defender, particularly in myanmar, pretty much every individual is a human rights defender. the risks they are facing, the threat level they are experiencing, is pretty much at more or less the same level on a daily basis. for example, even the public school teachers and the schoolchildren who are boycotting the public school system, which is under the control of the military -- they refused to attend class, they refused to go back to the public school, and they were arrested. many of them even got killed as well, just for refusing the school system controlled by the military. that is the kind of threat level. you don't necessarily need to be a political, high-profile human rights defender; even students, schoolchildren, can basically be targeted and monitored, and their activities have been monitored, et cetera.
8:40 am
So, sorry, back to the question. I work closely with local groups that have been monitoring digital rights abuses. It is a fairly small group, but we have a network with extended civil society, labor unions, student unions, and so on. On a daily basis, the message we receive is that somebody from one of these networks has been arrested, and they all come to us, because of that person's digital footprint. They want the accounts secured or taken down. If the person has a Facebook account, they want to secure the Facebook account, the Facebook page, the Telegram channel. Even if they have a Gmail account, they want to lock down access to it.

So those are the kinds of messages we receive every day: whenever someone from the network is arrested, they alert us with the URLs and ask, can you please help us secure these accounts? We then collaborate with the other digital rights groups, with Access Now and its helpline, and with the different platforms. I would call that a first step, but it does not really mitigate the risk. These are only first steps, taken after somebody has already been arrested and we have been notified, to make sure their digital communications are secured and that the network around them is secure as well. Those are the immediate steps we can take.

The other part is that our organization, together with the other digital rights organizations we work closely with, also works on mitigation plans. That is where we engage with the different platforms, so that we can foresee what the risks are and what the mitigation plans should be, and inform human rights defenders, media, journalists, and so on. That is the day-to-day work we have to do, and a lot of it is collaboration with stakeholders, including the platforms. Thank you.

>> Thank you so much for your work, and for the work of the human rights defenders you are in contact with who are facing these conditions every day. It was so powerful hearing what you just said, because you named both things at once: the authorities block the internet, which is a lifeline to people
and means defenders can't access the tools they need. At the same time, isn't that an indication of how powerful the internet is as a tool for human rights defenders? I heard both of those in your statement. The idea that these platforms are used to track and monitor people raises a good question for you and for Cheryl about secure and accessible reporting channels. I know this gets complicated, because even when someone takes a step to report abuse, that report is itself something that could be monitored, putting them at risk. Are there lessons that Freedom House and Access Now can share with platforms on the steps they can take to develop secure and accessible reporting channels that will support human rights defenders when they are facing harms in the places where they are defending human rights? Maybe I'll start with Sheryl.

>> Sure. One of the first steps is to consult and co-create with defenders and others who are affected. That also speaks to accessibility, right? A platform needs to understand the operating environment for a defender: how they are able to access any type of channel, and also, when they have used those channels in the past to report abuse, abusive language, or threats, whether they got a response. Defenders and others are trying to understand, and there should be greater education and training about, the lifecycle of reporting. What will happen if a defender, or anybody, reports something? What are the stages? What can they expect? Because often what happens is that defenders and others experience what the American Bar Association's Center for Human Rights, and many others, have called flagging fatigue. It is overwhelming.

Also, when they are documenting these threats, it would help if there were better information on how to document threats and what to document. And across social media platforms and companies, it would help if there were an approach where, if a defender is attacked across multiple platforms, the reporting process could be streamlined so that they are not starting from scratch every single time they report something to one channel or another. This is also really important because, as we all know from defenders at risk and others, when something happens offline it goes online, and when it happens online it goes offline. And it can lead to the criminalization of defenders or different types of legal harassment, because of these orchestrated campaigns to defame them and discredit them in the public eye. When that happens, governments or other actors can use it to bring frivolous, or worse, suits against them, or to charge them, or to jail them, which also takes them out of their human rights work. It affects their families. The costs are extreme when there is legal action, and even then, over the life of a case, even if it is ultimately dismissed, the costs and the psychosocial impact are tremendous, with consequences for family members and others. So I think it is very important, with secure and accessible channels, that people know where the points of contact are, offline and online.
Because, for example, there may be defenders who don't have an online presence who are still attacked online, whether indigenous defenders or many others. So there needs to be an understanding of what the clear channels are and who the actual points of contact are. If a defender is able to report, are they then able to have contact with human beings who are trauma-informed, who can give a real-time response, or at least help them understand what the next steps are, and again, how they should be documenting things? That also includes the need for resourcing local and national networks and civil society organizations, and training of trainers. Because by the time defenders and others have gone through the entire process and effort of reporting and documenting, they have already been put under really difficult psychological strain and stress, and every type of threat that has been described.

Defenders report to us when asking for emergency assistance, or to other networks, because we don't work in isolation. Whether the defender is a land, environment, or indigenous rights defender, a journalist, et cetera, there are a variety of different networks. One of the things we and our partners do is try to bring those networks into the same spaces. Partly that is because it is very difficult when you are constantly contacting different sources for help, and because we want to work together to provide support for defenders. But it is also because, across each of these areas in the broad diversity of human rights work, it literally is cut and paste. I used to work at the Committee to Protect Journalists, and when I came to work more broadly on human rights defenders, it was the same thing: the same tactics, the same approach. As you mentioned, it is the same scenario, except the tools and everything else are more insidious, quicker, you name it.

I also think it is important for social media companies and others, all of us, to understand when defenders take preventative or precautionary measures, which includes self-censorship, which includes leaving, including into exile, and sometimes leaving the human rights field entirely. I bring this up because it should inform how you think about safe, secure, and accessible channels. It depends on how you respond, the manner of the response, how quickly and efficiently, all of that. If there is no response, then this is what happens: defenders leave. That is a problem. And so do their families, because often, as was mentioned, the families are attacked. They are threatened with kidnapping. These are not just threats; we have seen it. I have been working in this field, between CPJ and Freedom House, for about 12 years. Threats of abduction happen. I could name every single person we have dealt with, in what country, in what context, what is happening, and how it has evolved along the way. One of the other things about
accessibility is including, for example, persons with disabilities, defenders who are working in that space, and other marginalized communities when you are thinking about different reporting channels and how a person would actually access them. And then, one of the things defenders have brought up, and I wrote it down because I want to make sure I express it, is that they want guidance from social media companies and protection groups. For example, if they experience online abuse, hateful speech, et cetera, what do they do? If they block speech, or they mute speech, or they report threats, they have seen, for example, that sometimes when they block, the threats accelerate or escalate. When they mute, they may lose access to something they need to monitor in order to document and report what is happening. And when they report threats and don't receive a response, what do they do if it takes time, and they won't receive a response in the manner or the time frame that they need? There could at least be advice: here are some things you can do along the way.

It is also, again, about training and resources, because often defenders who are subjected to these types of threats and hate speech may need colleagues and other trusted people in their networks to monitor these things for them, right? So they need this kind of education, these trainings, and trainings of trainers. Almost everything that Access Now and the other groups in the protection landscape have built, whether Front Line Defenders or the American Bar Association's Justice Defenders or many others in all of these different networks, the processes that have evolved there, including working collectively, that is a road social media companies are starting to go down. There is a lot to learn from these networks, particularly at the local and national levels. And also consider defenders who may not be in the capitals but in remote situations: what are their challenges, especially if they are deep in a context where they may be at risk? I'll stop there; that was a mouthful.
>> Thank you. You presented such a rich picture of the tensions people face when they try to report, and of the networks that support them. One theme I heard was the idea of having common processes from the platforms, so that defenders can understand who the point of contact is, whether they can reach a person online, and what they should do. I want to ask Wai if you want to add anything to this picture, especially when it comes to what platforms can actually do that human rights defenders on the ground need.

>> Yes, definitely. I agree about the channels and the contact points; those are essential, especially because the issues human rights defenders face on the ground are changing constantly. And from one country to another, the risk level and the situations are quite different. That is why a platform having one global policy, one policy that fits all, doesn't really work in reality. That is why we actively encourage all of the platforms to engage with local civil society and local activists, first to identify the risk level, and then to build a mitigation plan. In Myanmar, it is not just the arrests; over the last decade we have had a number of issues, like hate speech, and that is why it got some international attention. But I would still say very few platforms regularly engage with the civil society organizations on the ground. That is something we need to change.

The other thing is that, normally, platforms only react to the threats that human rights defenders on the ground are already experiencing. We don't often see a proactive approach as part of a mitigation plan. That is definitely an area where we would like to see much more progress. And then there is the recommendation about the exchange of information, which I think is also quite key. For example, we exchange information from time to time. We try to engage with the different platforms; local organizations who have been monitoring online content flag these issues: these are the bad actors, these are the issues, these are the links, and so on. But they also have limited capacity, and sometimes the information flow is one-sided. We feed this information in, but oftentimes we don't really know where it goes. It is literally like a drop into a black hole. How will this information be taken into consideration inside the platforms? How will it be reflected in policy changes? We don't have that kind of visibility, and often we don't see that reflection either.
And at the same time, there is a lot going on. Different platforms have their own investigations and so on. These kinds of information exchanges with the civil society organizations, comparing what we see on the ground with what the platforms are seeing, I see as positive, especially for the CSOs working on the protection side, on security and physical security. At least they can see what is going on on the platform and who the bad actors are. Sometimes the bad actors are not within their own country's territory, right? They have links to Russia, links to China, links to these other countries. So for CSOs and local organizations working on the ground, having this knowledge and this information is very helpful, including for educating their networks and the public. So I would say we would like to see more exchange of information in the future, and definitely regular engagement. I think most of these issues could be solved that way.

>> Thank you. I definitely want to get to some of the really important questions you have both raised, and give Alex a chance. Maybe you can walk us through, as an example from the platform side: how do you look at these reports when they come in? What do the processes look like? What message do you want human rights defenders to hear about how Google does this? And, in that sense, what factors go into deciding how you structure protection mechanisms? That behind-the-scenes view might be really helpful for people to hear.

>> I know this is something we have long talked about. I think there is a lot more work to do among the platforms generally on what information-sharing mechanisms might look like. There are obviously lots of challenges around privacy and security, and around our ability to validate information before we share it out. But we know it is useful to those who are working on the ground, so there is certainly a lot more opportunity to think about what is possible in that space.

To back up and talk about how we think about these issues and how the work fits into the structure of the company: it is a big global company with a lot of people, a lot of teams, and a lot of product surfaces. So how do we make sure we are paying attention to the issues coming up in Myanmar, across Latin America, across all regions of the world? First, we do security by design. The threats that are coming at our platforms all day, everywhere, inform how we build all of our products, because in a perfect world, the way any user anywhere flags an issue in a product would also solve the problems of our most vulnerable communities. When you design for the most vulnerable users, you are also solving problems for everyone. That is why security by design is so important, and when we do it, it gets us a little bit closer to solving these problems. But obviously, in addition to building security into the products at their foundation, we think about what specific features some
particular vulnerable users might need. For us that looks like features such as the notices we send: anybody who may have been targeted will get a notice from us that says you may have been targeted. There is not a lot of information we can share with them, but it is a flag to someone that they might be a target, and we also provide them with links and ways to test the security on their account, and other resources, so that people can run some of those checks themselves. We also spend time thinking about specific products that might be useful to groups that are being targeted online. We have things like the Advanced Protection Program, which is a super secure version of Gmail that we offer to journalists and organizations, as well as to candidates who are running for office, because we know they are also going to be attacked. So we make sure those groups of folks have access to that tool. And it is not just individual accounts: we have a program called Project Shield where we provide additional security for sites that might be subject to attacks. We keep investing in additional tools that we know people might need.

And obviously, engaging with stakeholders is something we have to continue to do, to make sure we understand how the threats are evolving. That feeds into the structure, the programs, the processes, and the teams across the company that are working on these issues. That really runs the gamut, from engineers and product developers, to trust and safety teams who are writing and enforcing the policies, to our public policy teams who are doing advocacy on things like internet shutdowns and making sure that there are adequate protections in place around censorship and privacy. So a company needs a comprehensive approach: it has to be built into the way you build products, and then you also need processes and policies in place. And then, of course, you need particular mechanisms for rights defender groups to come directly to your human rights team. Having that doesn't make everything perfect, but if you don't have it, it means the company has no way to begin to understand its responsibility to address these issues, and to start thinking about how to do that in concert with other stakeholders.

>> Thank you. It is really fascinating to think about how big these issues are across countries, including here in the United States, and how you have to respond. I wanted to ask Cheryl a question about what platforms need to do
and what teams and human resources they need to have in place. Some of the most dangerous threats to defenders rely on coded language or images that can bypass detection, or even reviewers who lack the local language: slang, symbols, things people can't find if they don't have very localized information. So, Cheryl, can you tell us more about how you have seen this issue arise with human rights defenders around the world? And specifically, taking what Alex said about needing to communicate more with stakeholders and engage different teams, how can platforms engage civil society and others with contextual knowledge, and bring that in not just to the human rights teams but, as Alex mentioned, to the people designing products and features for these companies?

>> So, coded speech, whether it is hate speech or otherwise, is very contextual. One of the important things that has been said is that companies need a physical presence, a felt presence, and also teams working on human rights; linguistically capable teams that understand historical narratives and histories of violence; teams who also understand and monitor political language, political histories, culture, et cetera. One of the difficulties is that when speech doesn't seem direct, when it doesn't look like a direct targeted attack, then often it not only doesn't get reported, it also isn't seen as something that could lead to offline violence or other types of violence. Without understanding the local context, without understanding the language and how it is used in different expressions, or the different allusions it makes, there is no recognition that this is in fact a direct threat. And often this indirect language takes forms like talking about the longevity of a defender's life, wishing that something would befall them, or dehumanizing them. We have talked with defenders about language that, in some contexts, has been used historically during political violence, for example, and also language that may call into question
their gender. Language which accuses them of corruption and criminality, or of belonging to different criminal groups and things like that, because that then leads to threats in the legal sphere, to jailing, and so on. On the other part of your question, about how platforms can build their contextual knowledge across different types of teams: focus on an abuser-centered approach, because that will decrease the burden on defenders to flag and flag and flag content, and also because of the phenomenon we were talking about earlier, flagging fatigue.

As was mentioned, besides having human rights experts, having people who are experienced in journalism and in political contexts also helps to unearth some of these nuanced or coded threats. And it means getting away from these standard lists of hate speech, or fixed vocabularies of terms. As was said, it is not one-size-fits-all. There are different types of known speech that may be considered hate speech or otherwise, but if you stick only to those lists, then we are dealing with speech that is effectively invisible and not measurable, because we don't know it exists. Within the local context, defenders and others do know it exists, and they constantly flag it, but again it is not seen as rising to the level of a direct threat, even though, in that context, it does. That includes what defenders are called. In certain contexts, if a defender is called a criminal, or an enemy of the state, or a variety of other things, that is often the sign that an orchestrated campaign against them is being activated. And the catch-22 is that defenders and others are so overwhelmed by the sheer number of posts that no one individual can handle it, never mind report on all of it.

>> Yes, well, thank you. I think you have really raised such important points for
platforms to consider, and I'm hearing a couple of themes in that. One is the need for very specific, localized, community-level engagement with human rights defenders, perhaps, as you mentioned, with people who are themselves marginalized or vulnerable. Even in the U.S. context, a majority group may be able to list common hate speech indicators, as you said, but they don't experience it every single day, so they don't know exactly what an insinuation, a symbol, or an allegorical allusion means. The other theme is looking at the patterns in reporting to figure out what is not rising to the level of action but may be exposing an emerging trend, and that sounds like something that needs local people to help navigate. Go ahead.

>> I think that, certainly, this is something we are deeply invested in. Our safety team is global, and they do exactly this. For the hate speech policy in particular, they are looking at what the hate terms are and what the thematic narratives are that actors are using to dehumanize, to incite, et cetera. It is a huge investment for companies, and I see my colleagues in the room, and they do the same thing. One of the things we need to think about is ways to resource the diversity of the industry, because this is something the big companies are invested in, and could certainly improve on, but are deeply invested in. How do we make sure that smaller companies, which can have a really big footprint, have access to this sort of intelligence from experts about what narratives exist and how you should interpret them in a particular context? That takes people and resources that not all companies, in particular small companies, have yet. So I think that is an important ongoing challenge for the ecosystem.

>> That is a really important point, because we all know that resourcing can be the difference between actually solving a challenge and having a great framework that is never actualized. I want to turn to Jason with a question. Hearing this conversation, it is easy to feel that this is an overwhelming problem. I think seeing the guidelines and recommendations, and seeing how they align with the UN standards, is an important reminder that there are things that can be done and are being done. How do you set benchmarks to monitor success? I know GNI has established assessments for its members on their progress in implementing the GNI Principles. How do you think that companies
can start to benchmark their progress on protecting human rights defenders, and how can multi-stakeholder initiatives support those processes?

>> It has been a really rich conversation, and I think the question of how we take this forward and measure progress is a really important one. GNI has a framework for this. For 15 years we have been a multi-stakeholder organization. We have tech companies, and not only internet companies: cloud providers, infrastructure providers, all of whom interface with free expression and privacy slightly differently, depending on the products and services they provide. But all of them, by virtue of being part of GNI, have made a commitment, and that commitment is broken down in the Principles and the Implementation Guidelines. We have a periodic assessment process that each company goes through. In that process, the company brings in an independent third party, which we accredit, to review the steps the company has taken against each of the categories of obligations, and to document that. They do that by looking at the internal paper trail, meaning what systems and policies are there, not just the public-facing ones but the internal ones. They talk to key employees, from senior-level management all the way down to the people in the field who are interacting with and dealing with threats on a daily basis. And then we look at case studies, actual examples of how free expression and privacy challenges manifest, and we place particular focus on interactions with governments, because that is where a lot of the threats manifest. All of that goes into a report, which is then shared with our multi-stakeholder membership for review, discussion, and recommendations. So we evaluate companies against a benchmark, which is good-faith implementation with improvement over time. And the cycle repeats: the recommendations provided to the companies are reviewed at the next assessment, to make sure there is progress happening.

That improvement-over-time concept, which is also baked into this guidance, is really critical, but you need a concrete framework and process to drive improvement, and improvement can mean a lot of different things. What is interesting today is that we are now seeing a number of emerging laws and regulations that are effectively taking a lot of these recommendations and guidance, from the UN Guiding Principles and from guidelines like these, and baking them into hard law. So we have things
like the corporate sustainability due diligence directive which is in the process of, it seems like, passing in the eu, which applies to all companies of a certain size with certain presence in europe regardless of sector and that basically is a mandatory human rights due diligence law which will require companies to conduct human rights due diligence and demonstrate how they're doing that and addressing the risks that they uncover through that. and then we have text specific regulations like adigital services act, which apply in particular to segments of the technology eco system, and there are obligations for all kinds of different companies, but the very largest homeland service providers and search engines have to go through a rigorous risk assessment process and they have to have that audited by a third party auditor, so there are sort of these elements of the human rights business and human rights work that are being
9:18 am
codified which hopefully will mean not just the very largest providers many of which are members of gni which have been doing this through a framework of co-governance, but also, the smaller providers, the providers who haven't committed as much to human rights as a policy or as a practice will now have to sort of raise their game, at least to a sort of minimum level and over time, hopefully, that floor will rise. a lot of questions about how it's going to work in practice. we can get into that if you like. i think there's taking stock of a lot of the conversation and knowledge and experience that's shared here, i think when we think about human rights defenders specifically. it may be helpful to understand there are two types of defenses or actions that companies can take to help defend against attacks and poignantly pointed out in myanmar, basically the
9:19 am
entire civilian population has become human rights defenders in one way, shape or form. and individual account actions, the individual way to do that in a personal service in myanmar, that's where it's important to have general corporate practices that can protect against threats and i think there's none more important in that context than in encryption, which is, you know, a vital security and privacy measure that fortunately we have seen tech companies like google and meta taking steps to sort of mainstream across products and services and you cannot underscore enough how important that can be for not just the explicit human rights defenders, but one day to the next becoming a target and you know, that's not been-- it's not sort of an obvious thing because there are a lot
of governments and law enforcement agents who see encryption as an obstacle, and not just the authoritarian ones, so it's worth underscoring. In general, cybersecurity and design measures can apply across the board in addressing threat actors, and a lot of companies have been putting time and energy into understanding these threats, because they're not just bad for individual users; they're bad for the product and service if it becomes so pervasively influenced by these actors. So that's at the general level. Then we have the specific projects, programs, or mechanisms that can be made available to people who are known activists and known targets. That's what Alex was talking about: Advanced Protection and Project Shield and other programs like those. I think those are really important for the specific
actors, but they're never going to be available or useful or even really relevant to the general population. So as we think about measuring progress over time, it's worth looking at both: the systemic efforts companies can take, how those are progressing, and what we're learning across the industry and across stakeholder groups about their impact; and the specific measures, and how some of those might become not just specific to a particular company and a particular user. Can we think about ways these protections can be provided more consistently across the industry? Because, as was appropriately pointed out, the threat actors don't just play in one space, right? The threats migrate, as do the users, across different platforms. >> I appreciate how you talk about both, and the companies' realization of improvements over time, but also, from the
perspective of activists on the ground, people can feel quite impatient for that realization to happen, and you talked about concrete, proactive things. In the interest of time, we'll turn to audience Q&A. If you have a question, raise your hand; I'll start with the first one online to give people more time, since I know the first person to ask can be shy. The question we received online: is there anything specific around gendered online harms that's recognized, with respect to women defenders? In human rights discussions, people note that gender often isn't specifically called out. Obviously there are many vulnerabilities that can expose a person to outsized harm, and gender is one of them. But I'm curious if there's anything specific on gender that anyone wanted to share
about the guidance that could be applied to platforms? >> I think it's worth considering that when someone is attacked because of their gender, their family may be attacked as well. What are the protection mechanisms that already exist, for example, through social movements that work for the protection of women and other marginalized defenders who may be attacked because of their gender, or through hate speech about their gender or otherwise? So I think it's really important to look at who is already working in this area, because they have a lot of experience with what gender-specific attacks, threats,
protections, and mitigation look like. It's not something new. For women, people who identify as women, and others who are marginalized, a lot of this work is already quite far down the road, but again, the threat actors are using different means to go after them, and in many cases they go after their families and their support networks. Say someone needs to relocate because of threats: often protection measures don't work because they don't consider the person's family or others around them who may be at risk. Protection also includes cases where someone may be at risk from their own family, because of the impact of the threats on them, or from their community, or otherwise. >> Thank you. I think it's really important, as we're discussing this, that
people think about the fact that, unlike a physical threat where someone is in front of you, these threats can emanate from anywhere and affect people in the defender's family and network in addition to the defenders themselves. Those are great points. >> I'd add the psychosocial impact, because often the threats also denigrate the person and their reputation, for example, threats against a woman calling into question her womanhood or otherwise. So again, psychosocial support matters in remedy, in responding, and in understanding the impact this has on a person, beyond the fact that they may not be able to navigate daily life with threats hanging over them. Literally every sphere close to a person, themselves, their family, their friends, further
out, their society, et cetera: how this may impact those different relations and how they can navigate them, including offline in the physical world. >> Thank you. I'll turn to an audience question. Please identify yourself, and then take the microphone from Elliott. >> From USAID. Thank you so much for this panel; it's been a really wonderful, instructive, and rich conversation, and I'm excited about the guidance as well. I want to go back to something you said earlier, Cheryl, about the lack of coordination between platforms on the ground and how that affects human rights defenders. I've been traveling to different contexts over the course of the past year, and it struck me that in certain places, civil society organizations will say, we have a great
relationship with YouTube, we have direct channels, but Meta we can't get on the phone for anything. And in other places: oh, we can talk to TikTok, but we can't talk to Google, or whoever it is. So the lack of coordination in different spaces is just so apparent, across different geographies. And to Jason's point, when there isn't coordination, the threats will simply migrate to the platforms where there's the least protection. So the question, really for Alex mostly, is: to what extent are there conversations being had among different platforms about generating some kind of more streamlined approach? And also, to the point about resourcing, it seems like it would be smarter from a resourcing perspective to coordinate with one another on the ground. What are some of the barriers you see to that happening?
>> So there are a lot of conversations that happen among the companies, in particular the GNI companies, around how we learn best practices: how we're engaging with intermediary organizations on the ground in particular localities, or globally; what programs we have to do that; and best practices around implementing those internal guidelines. Some of the barriers to more coordination are that the platforms aren't necessarily similarly situated in any given country, right? The user base, the company's markets, all of those things look different, and the way people are using our platforms can look different, so we won't always have the same amount of resources or energy, or the same approach, to put into something at a given time.
And obviously, as we talked about, that can certainly evolve; things that might be a priority at one time might become less of a priority as the situation changes or as our user base changes somewhere. But that is just a reality of how and why it might look different for different companies in a particular country: our services just aren't used in the same way, so our approaches to managing the issues might look a little different. >> Well, thanks. We are unfortunately out of time, and I want to let you all get out of here. I felt a palpable sense in the room of how many questions we wished we could have asked, and I definitely have a few more here from online. I want to thank our panelists for all the work they're doing in each place they sit, and for their advocacy, and the State Department for releasing these really incredible, concrete recommendations for online platforms on protecting human
rights defenders, and thanks to the Atlantic Council for their support. These are important topics; you can contact the Human Rights Initiative here at CSIS if you want to see more of this content or if you have questions. It's certainly an important issue that affects people from the bottom to the top of our society, from decision makers to those fighting for human rights. Thank you for coming today, and join me in thanking our panelists and our keynote speakers. [applause] [inaudible conversations]