
The Bottom Line | Al Jazeera | November 21, 2022, 9:00am-9:31am AST

9:00 am
Qatar, which is now my home, is hosting the very first World Cup in the Middle East. It is a privilege; it is a hugely complex and often controversial event to cover, but once a ball is kicked, the passion and the excitement take over. Many of us are living with the effects of ecological breakdown, so what of the stories in which technology holds the promise of salvation for the planet? Millionaires, big tech and an unwavering faith in innovation: Ali Rae investigates whether techno-optimism is helping or hindering the fight against climate change. "It's a distraction." "Self-delusion." "It's just masking over it." All Hail the Planet, episode two, on Al Jazeera. Hello, I'm Darren Jordan in Doha with a quick reminder of
9:01 am
the top stories here on Al Jazeera. The first World Cup ever held in the Middle East has kicked off in Qatar with a spectacular opening ceremony. Actor Morgan Freeman gave a theatrical performance on inclusion, while K-pop superstar Jung Kook was the musical headliner. Sheikh Tamim bin Hamad Al Thani, the Amir of Qatar, welcomed the world at the end of the ceremony: "People of all origins, nationalities, religions and beliefs will meet here in Qatar and on TV screens around the world on all the continents to participate in these incredible moments. It's a beautiful thing for people to put their differences aside to celebrate their diversity and what unites us at the same time." In the opening game, hosts Qatar lost two-nil against Ecuador. The win sees the South Americans top Group A with three points. Thirty-three-year-old Enner Valencia scored both goals at Al Bayt Stadium. Our correspondent has more now from outside the ground. A great roar went up around Al Bayt Stadium as Qatar kicked off the 2022 World Cup,
9:02 am
but the crowd soon became subdued as Ecuador's Enner Valencia first scored a goal that was eventually ruled out for offside, but then scored twice more in the half to ultimately give Ecuador a two-nil win in this opening fixture, and to make Qatar the first World Cup hosts to lose their opening match of the tournament. Malaysia is facing a hung parliament for the first time in its political history. Saturday's tightly contested election left major parties unable to secure enough votes to form a new government. The palace has extended the deadline for parties to submit their nomination for prime minister until Tuesday. Florence Looi has more. There are two main contenders for the job of prime minister of Malaysia. One is Muhyiddin Yassin, the leader of the National Alliance, whose bloc won the second largest number of seats in parliament. He says he has enough support from members of parliament to be the leader of the country,
9:03 am
but he hasn't revealed names or numbers. The other contender is Anwar Ibrahim, the leader of the opposition. Now an unusual alliance may yet emerge, with the opposition Alliance of Hope teaming up with the National Front. Votes are being counted in Nepal after millions took part in a general election on Sunday. A final result could take weeks. Many hope a new government will bring political stability and help with development. The UN special rapporteur on human rights in Myanmar is in South Korea, urging more to be done to address the crisis in the country. Tom Andrews is appealing for sanctions on the junta, which seized power in a 2021 coup. Seven people, including two children, have been injured in cross-border firing between Pakistan and Afghanistan along the Durand Line, separating Afghanistan's Paktia province and Pakistan's Khyber Pakhtunkhwa province. Land disputes between people living in the areas are being blamed for the violence. A historic deal has been struck at the UN's COP27 summit. Negotiators have agreed
9:04 am
to compensate poorer nations hit hardest by the impact of climate change. But many countries are leaving the meeting in Egypt saying not enough progress has been made on limiting emissions. At least five people have been killed in a shooting at a gay nightclub in Colorado Springs in the US. Police say 18 others were wounded before customers overpowered the gunman. A 22-year-old suspect has been taken into custody. "This is why we're afraid in the entire city of Colorado Springs. What are we supposed to do? Are we supposed to go hide? Are we supposed to be fearful in our environment when we just want sanctuary?" A protest against Peru's President Pedro Castillo has turned violent. Police sprayed hundreds of demonstrators with tear gas, and officers in riot gear blocked off streets surrounding the government palace to prevent marchers from
9:05 am
advancing. New Zealand's highest court has found the voting age to be discriminatory. The ruling forces parliament to discuss whether the current voting age of 18 should be lowered to 16. Prime Minister Jacinda Ardern says parliament will consider the law within six months. So those are the headlines. The news continues here on Al Jazeera after The Bottom Line. Thanks for watching, and bye for now. Hi, I'm Steve Clemons, and I have a question: how much do social media companies know about you, and how easy is it to manipulate you, or me, or a whole country? Let's get to the bottom line. Unless you've been living totally off the grid, which means you'd never see this show, you know that your mobile phone is spying on
9:06 am
you. It's telling private companies and governments where you've been, who you've met, what you've bought, and what you think. All those free apps that we use and love: ever wonder how they're free? Maybe it's because you're the one being bought and sold. We pay by giving up our personal data and our privacy, and in return we get a service. Usually our data is used to try to sell us stuff that we may be interested in, based on our personalities and our preferences. But how about when things take a darker turn and our behavior is predicted? How about tools that can manipulate our behavior, or even what we believe? How about companies that work to turn up the intensity of hate in society? Should we be scared that our data is out there and in some cases being weaponized? Today we're talking with Brittany Kaiser, who worked for Cambridge Analytica in London before becoming a whistleblower and blowing the lid off that company's inner workings. Now she's a data rights activist and co-founder of Own Your Data. She wrote a book about her experiences, Targeted: My Inside Story of Cambridge Analytica and
9:07 am
How Trump, Brexit and Facebook Broke Democracy. And David Carroll, who teaches media design at The New School in New York and is one of the best-known crusaders for data protection rights in the United States. Both Brittany and David were featured in the Netflix film The Great Hack, about how the social media platforms that were supposed to connect us have actually divided and polarized us. Thank you both so much for joining us. David, let me just start out. You're the only person I know who set out to say, "I'm going to go sue to get the rights to my data," and that is a fascinating step that you took. Why did you take it? And just to cut to the punch line, I know you didn't get your data, or at least I think you didn't, as of the end of that film. But I just want to know, is there any chance you'll eventually win in that quest? Sure, thanks Steve. It's great to be here. The initial exploration was probably purely academic, just trying to figure out how this industry works, and also working
9:08 am
in public, posting my research to social media. And it was that way that it attracted the attention of legal minds in Europe, who urged me to seek legal counsel right away. And once I started working with my solicitor, Ravi Naik, we realized we had a very strong data protection case, and the UK Information Commissioner's Office immediately agreed that I had jurisdiction in the UK because I proved that our data was stored there. So it was the logical step to proceed, to learn more and to see if the law would allow us to exercise our rights. But we were obstructed by the company going out of business, and in the end somebody leaked the Trump 2016 database to journalists, and it took proactive journalism to actually deliver my data from that leak in the
9:09 am
end. So we also learned some of the limitations of being able to regulate these companies, and the importance of the free press when that fails. Before I bring Brittany into this, because Brittany has a foot in both sides of this story and such an interesting life in the data world, I want to tell our folks one of the coolest things about Brittany: she set up Barack Obama's Facebook page ages ago in the United States Senate, which I find, you know, sort of an innocent beginning to this that took some, you know, darker turns. But I'm interested, broadly, in this question of what we, David, should legitimately feel and think when it comes to our own data. Because I sort of feel like we're surrounded by quick check-off boxes, and we've largely, as a society, given up our rights and leverage in that process. Is that an incorrect assumption I'm making?
9:10 am
I think it's an important characteristic of the feeling that we are made to feel like we have no control, and indeed exercising control is a lot of work. So there is an inherent asymmetry at play, and it's about power. But we are learning how to better make decisions, and we are seeing people make different decisions when they're offered a fair choice. So, for example, when Apple upgraded iOS, it asked users if they want to be tracked in their apps, and surprisingly most people said no. But usually when we browse the web, we're asked to accept cookies in a manipulative way. So there's much work to be done to empower people to make informed choices, give consent, but then revoke it if they choose. Thanks. Brittany, I mentioned the Obama story because I remember when his Facebook page went up. I think I may have written something around that time. I met President Obama, well, then-Senator Obama, before he
9:11 am
was running, and there was this new kind of moment: I was a blogger, social media was important, and there were sort of new waves, new infrastructure being built, and you were part of that in the early days. And then you got into this company called Cambridge Analytica, and I won't tell people all the dimensions of the show, but you became, in a way, part of the machine that took a lot of data that, you know, was out there, basically overtly telling people: we can change behavior and we can affect elections. And I'm just interested how that process worked, because it seems like we went from innocence to darkness rather rapidly, with no speed bumps in between. Am I wrong in my impression? Over a decade, from the time that I joined Senator Obama's first presidential campaign to the time that I became a whistleblower at Cambridge Analytica, a lot developed in the data-buying field. When we first started building product for the Obama campaign, that was the first time
9:12 am
that data-driven work had ever been applied to social media and politics. And we saw that, all of a sudden, millions of people were registering to vote for the first time, turning out to community events, hosting watch parties for debates, and getting involved in issues that were important to their communities. So this seemed like an inherent good. But again, there were ethical and moral guidelines that were implemented on the Obama campaign, where we were only doing positive messaging. Now, fast forward to Cambridge Analytica, where I was excited to join this company and learn more about how to build much more advanced tools than we had used on the first Obama campaign, of course. But there were clients working with the company that did not have the same sort of ethical or moral guidelines, which meant that they were using negative campaigning. I think a lot of people are used to that, especially right ahead of an election this week. It's an amazing time to be talking
9:13 am
about that, because I'm sure that you have seen, or even been targeted with, ads that say something negative about other candidates. Now, the amount of data that you produce and your behavioral patterns will tell companies whether you are susceptible to actually being persuaded by those negative messages or not. Some people are, others are not, and a lot of money gets plowed specifically into not just negative campaigning but fear-based campaigning, based on whether or not you could be persuaded to change your vote. And that's one of the biggest problems with data collection: the more data that we are producing, the easier it is to predict our behavior, and therefore to influence it in the end. That's why I decided to become a whistleblower, and I've been working on how we actually change that, both on the educational and awareness front and on the legislative and regulatory front, as well as by developing new technologies that allow you to have more control over how your data is used, or not used at all. Well, let me add one other element here, and
9:14 am
again, please tell me, I'm a layperson in this discussion and you both have been in it a lot more deeply. We all focused on Cambridge Analytica. I wrote about Cambridge Analytica, we saw the story, we saw the issues regarding Facebook and governance, and we've had lots of discussions even right now in this country as Twitter is purchased by Elon Musk and we see all this stuff going on. I'd love to get both of your views on that. But I guess I really want to know: was Cambridge Analytica the tip of the iceberg? Are the behaviors, the capacities to in a way repeat, perhaps less flamboyantly, what Cambridge Analytica did, still embedded in our society? Are we still seeing democracies, not only in the United States, but whether it's in Trinidad or over in France or the UK, or places all around the world, are we still seeing these behaviors as part of the political ecosystem? Absolutely. Unfortunately, many countries around the world have not yet implemented national legislation that
9:15 am
protects individuals from their data being taken from them, bought, sold, traded and used to target them by any organization that has the money to purchase or license that data. And enough data is available without purchasing or licensing for you to be able to create targeted communications and campaigns for political or commercial purposes. So now, instead of one Cambridge Analytica, we have hundreds or thousands of companies that are doing something similar and are not under the purview of the kind of regulation or legislation that would be able to protect people's rights. So I think one of the biggest issues we have is not just passing that legislation, but also helping inform people so they can be more conscious consumers of digital content, understand when they are targeted with certain content that is meant to influence them, and start to take smarter steps to protect themselves from that type of influence operation. David, let me ask you the same thing. Is this
9:16 am
a Pandora's box that can't be closed again? You know, in the film The Great Hack, it's very disturbing to me to see visual images of Black Lives Matter protests encouraging, you know, almost violent gatherings, and then the Blue Lives Matter ones designed to clash and collide, funded by Russian sources, as we now know. And it just makes one wonder today, as we look at the toxicity, and I know the US political culture better than others, but the toxicity here, whether we still have this practice underway, and what your views are about somehow getting control of that again. Yeah, I think the interesting thing is the need for these national privacy laws and protection rights to be enforced and put into place. Because in the United States, we don't have the right to ask, for example, a political campaign or its vendors for our data, to be able to even know if abusive practices
9:17 am
are being used, if people are being marked and targeted for fear-based campaigns, or are being marked to demobilize them from participation. So I think what the Cambridge Analytica story taught me is that being able to get your basic voter profile seems to be an important aspect of 21st-century democracy, to even be able to know if you're the target of abusive practices. And then, if you are, what could you do about it? Is there an authority to complain to? Do you have rights in a court to prevent this from happening, or to prosecute people for engaging in unlawful practices? We had that ability because of the weirdness that Steve Bannon decided to send US voter data to the UK, and then weirdly make it
9:18 am
a part of that country's jurisdiction. If Steve Bannon had kept the Cambridge Analytica data in the United States, I would not have been able to do what I did. So fascinating. Let me just ask you both real quick: are free and fair elections in the United States and the UK really possible now? Brittany? I believe it's possible, but it's going to take generational change on the education and awareness front for people to understand what happens in elections, like the types of content they're being shown and the amount of money that is spent on disinformation, both from American political organizations as well as from foreign countries that are paying to influence and change who might be elected and who might attain certain seats. So that's one thing. Secondly, on the legislation front, we still don't have federal legislation in the United States. There are plenty of countries around the world,
9:19 am
especially with supranational legislation in Europe, that are starting to protect people and make these types of data-driven political operations a lot more difficult to do. But there's still so much of the world that has a lot to learn and a lot to work on on the legislative and regulatory front, where I think the fastest way we're going to change is by implementing new types of technologies. Interestingly enough, we're going to see over the coming months and years what Elon Musk decides to do with Twitter. One of the biggest issues we've had in elections, something that both Cambridge Analytica used and plenty of organizations around the world use, is fake accounts. When you create fake accounts on social media, a lot of those are run by bot farms; they're run by AI. And that allows hundreds or thousands or tens of thousands or even millions of messages to be pushed out and shared around the world, and those are not by real people.
9:20 am
Elon has promised two things to the public: one, to get rid of those fake accounts by verifying everyone on the platform, which is going to make a huge impact on making social media safer, and secondly, to open-source the algorithm. One of the things that a lot of legislators and activists like myself have been collaborating on is figuring out how some of these social targeting algorithms actually work, so we can ban certain uses of algorithms to make social media safer and to make our democracy stronger. David, what do we need to do to make sure that elections in this country are free and fair? I would like to see national data protection and data rights legislation passed in this country, so that individual voters can get a handle on how their personal data is being used in the political process, and then mechanisms to redress abuses and to inhibit actors from
9:21 am
engaging in abusive practices, and creating the kind of transparency that would be necessary. And I think a challenge is that politicians and their campaigns and their super PACs and their donors do not want to be regulated, do not want scrutiny, and they're the ones who make the rules. And so we are in a catch-22 position: we are asking our politicians to allow us some insight and visibility into how they target us for issues and campaigns and campaigning. It's a tall order, but other countries are doing it, so it's possible. David, from what you've seen from Elon Musk thus far, do you have confidence that he's going to make it a safer or better platform? Unfortunately, he has motivated a number of people to leave the platform and to seek new places elsewhere on the
9:22 am
web, in newly emerging decentralized social networks that cannot be purchased and controlled by a billionaire. So it remains to be seen if advertisers will come back to the platform. It remains to be seen if fellow academics like myself feel comfortable and safe there. But it is giving people reason to find alternatives, and I think he should be concerned that he is driving away people, because the power of the platform is the people who are on it. Brittany, what I found fascinating in your book, and I did read your book, was the journey you went through, but also your experiences with Daniel Ellsberg. I met Daniel Ellsberg about 30 years ago. We were going into Lawrence Livermore labs, and there was a big sign above our car that said no classified information will be discussed beyond this point. And Daniel Ellsberg said to me, "Let me think of something classified to tell you, Steve." It was this very tricky moment; I felt awkward, and I'll never forget it.
9:23 am
But, you know, it makes me wonder to some degree about the role you played at Cambridge Analytica as a whistleblower, and about others of your colleagues in that world, Chelsea Manning, Edward Snowden, where through them we learned so much more about, essentially, the world of official secrecy and about the technical capacity and the digital capacity behind a lot of that intelligence and, essentially, spying if you will. And I'm wondering what your thoughts are on how important that kind of role is. It's not something often discussed. Should we be embracing the Snowdens and the Mannings of the world, or is that, you know, a bridge too far? I do a lot of work on supporting whistleblowers. I usually have at least one new whistleblower a week reach out to me, and I introduce them to lawyers or to the government departments or agencies to which they would need to submit their
9:24 am
evidence in order to go forward with their complaints. And I believe that we need to strengthen whistleblower law, both in the United States and in Europe and in many other countries around the world that don't even have national whistleblowing laws. What's so important about what whistleblowers do and are able to provide is original evidence of wrongdoing, whether that be crime, fraud, waste, etc. Those are the types of issues that whistleblowers bring up, and that allows not just companies or government organizations to reform, but it allows everyone to be safer, both inside and outside an organization. So when you become a whistleblower, you don't know what's going to happen, you don't know who is going to retaliate against you, you don't know if you're going to be blacklisted or ever going to be able to work again, and you don't know if you are going to be targeted or if your family will be targeted. It's normally described as a crisis-of-conscience moment, which means if you decide not to do this,
9:25 am
you wouldn't be able to live with yourself. So you do it regardless of the consequences, and a lot of people lose a lot of their rights and have much bigger issues than I did. I was very lucky to be in a position where I was talking about an issue that the entire world started to care about, and so I had many more supporters than detractors, and I've been able to work with legislators and regulators around the world, and with educators and technologists, in order to help close the gap between technology and ethics. What was so amazing when I first met Daniel Ellsberg, about a month after I became a whistleblower: he sat down with me and he said, "You look very young. How old are you?" I told him I was 30, and he laughed. "I was also 30 when I became a whistleblower." There's just something about that age. And we had a talk about what it actually meant to make that final decision to go in with your evidence and hope for the best, knowing that it's possible you will face incredibly
9:26 am
negative repercussions. But, you know, you have to do it for the public good. You know, thank you for that. David, you're active both as an academic and as an activist, and you write a lot about these issues in the United States, and I imagine in the next Congress that is elected, this is going to become one of the hottest issues. And I'm just interested, as you see it, in the sort of guardrails, and what are the blind spots? We're having this conversation, and it occurs to me that I had never heard of Pegasus software, one of the ways in which, allegedly, the Saudis tracked down my friend Jamal Khashoggi, through cell phones and state-based systems that enabled an incredible degree of surveillance. But, as you know, I was sort of joking in talking about how people's cell phones are spying on them and their cars are reporting information on where they drive. We're all part of a data info system, and it just makes me wonder, are we basically hamsters inside a very complex trap now, and
9:27 am
how do we get out of it? But maybe just begin with the guardrails and blind spots you see in this discussion. Sure. The big blind spot that attracted me to the scandal initially was the very porous, almost imperceptible boundaries between the election industry and what we refer to as the military-industrial complex: the way that defense contractors and private intelligence companies operate in the same space, with the same techniques, the same people, potentially the same data sets. So the way that the parent company of Cambridge Analytica really started out in defense and then used its techniques for elections and commercial advertising, the failure for us to erect a firewall between intelligence,
9:28 am
private intelligence, and then the data ecosystem was one of the key alarming elements, and I'm not sure if we have succeeded at preventing other examples of this dangerous overlap between defense and civilian activity. Well, listen, we will have to leave it there. Data rights activist Brittany Kaiser, co-founder of Own Your Data, and I have to say, Brittany, great necklace there, as you're wearing the Own Your Data jewelry, and David Carroll, associate professor of media design at the Parsons School of Design: thank you so much for joining us. Thank you, Steve. Thank you, David. So what's the bottom line? We all go online and our data's out there. Companies figure us out and they send us ads to satisfy our desires. It sort of starts with suggesting a brand of deodorant or a vacation destination, but it winds up recommending who to hate or who to love or who to vote for. Most folks will say c'est la vie,
9:29 am
but it's very much like doping in sports: once you know the competition is rigged and cheating is rampant, the contest is undermined. That's why democracy seems like it's under threat everywhere, the feeling that everything is being manipulated, our voices don't matter, and there is nothing we can do about it. Well, that's turning people away from hope and change. I don't know if owning your data, as David Carroll and Brittany Kaiser suggest, is a workable solution, but at least being aware of the dangers that exist, as we click boxes and give away permissions all day, is a good place to start. And that's the bottom line. It's real reporting from the field: one journalist documents life beyond the headlines, and certain stories can change us. It's a crazy story, well, I mean,
9:30 am
in the history of news media, it turned into a unique journey into what it means to be human. The Things We Keep, a Witness documentary on Al Jazeera. Al Jazeera sets the stage. "Okay, now is the moment: I take my chances and I go into the field." Voices from different corners. "Welcome to The Stream, and anyone can join the community; everybody does." International filmmakers and world-class journalists ask which individuals will be held to account. "We have to resist and we have to be a match." Programs to inspire. "We're viewed as troublemakers, but what we're really looking for are real solutions." On Al Jazeera. Hello, I'm Darren Jordan in Doha with the top stories.
