
The Bottom Line : Al Jazeera : November 20, 2022 4:00am-4:31am AST

4:00 am
We understand the differences and the similarities of cultures across the world. So no matter where you call home, Al Jazeera will bring you the news and current affairs that matter to you. Al Jazeera. Hello, I'm Carrie Johnson in Doha with the top stories on Al Jazeera.
4:01 am
The stage is set for World Cup 2022, the first ever in the Middle East. Festivities are already underway in Qatar. Over the next four weeks, more than a million football fans are expected to come to the country to cheer on their teams. The first game is on Sunday, between host country Qatar and Ecuador. Brazil's national team has touched down in Qatar, the last squad to arrive. They have won the most World Cup titles and are hoping for a sixth. Neymar and his teammates hit the pitch for their first match on Thursday against Serbia. Defending world champions France have suffered a setback on the eve of the tournament: star striker Karim Benzema has been ruled out of playing in any of the upcoming matches after sustaining an injury during training. Benzema tore a muscle in his left thigh on Saturday, which doctors say will take at least three weeks to recover. Meanwhile, FIFA president Gianni Infantino is accusing Western critics of hypocrisy when
4:02 am
it comes to Qatar. He defended the treatment of migrant workers, saying Europe should not be giving what he calls moral lessons. "This doesn't mean that we shouldn't point at what doesn't work in Qatar as well. Of course, there are still things that don't work and they need to be addressed. But this one-sided moral lesson giving is just hypocritical." Leaders at the COP27 climate summit in Egypt have been working late into the night to reach a deal. Negotiations include trying to agree on a target to limit global warming; some members want to keep the goal at 1.5 degrees Celsius. Negotiators are also discussing a loss and damage fund. It would require wealthy nations to compensate the poorer
4:03 am
ones for the effects of climate change. "The world is watching. Time is not on our side. We must all rise to the occasion and show the necessary flexibility." "There is equal dissatisfaction in all quarters, but in a vast majority of the parties there was a receptiveness to the basis upon which those texts have been formulated, a feeling that they do reflect a balanced approach." Malaysia is facing a hung parliament after a general election delivered no clear winner. Opposition leader Anwar Ibrahim says his coalition has enough support to form a government; his rival, former premier Muhyiddin Yassin, has also claimed victory. Rishi Sunak has visited Ukraine for the first time since becoming Britain's prime minister. He met President Volodymyr Zelenskyy in Kyiv and pledged his government's continued support,
4:04 am
including an air defense aid package worth $60,000,000. "We will stand with you until Ukraine has won the peace and security it needs and deserves, and then we will stand with you as you rebuild your great country. And I look forward to hosting a reconstruction conference in London next summer. By then, let us hope that we have ended this barbarous war and secured a just peace." Turkey's national defense ministry says it has conducted air raid operations in northern Syria and northern Iraq. The targets are said to be the Kurdish groups the PKK and the YPG. Turkey accuses them of a bomb attack in Istanbul one week ago, in which six people were killed and more than 80 others injured. The Kurdish groups have denied involvement. At least nine people, including four children, have been killed in a gas blast in eastern Russia. The accident happened in a five-storey building on Sakhalin Island, north of Japan. Authorities say a gas
4:05 am
cylinder exploded in an apartment. Electric car company Tesla is recalling hundreds of thousands of vehicles in the US. More than 320,000 Model 3 and Model Y cars with possibly faulty tail lights are affected. Those are the headlines. The news continues here on Al Jazeera after The Bottom Line. Hi, I'm Steve Clemons, and I have a question: how much do social media companies know about you? And how easy is it to manipulate you, or me, or a whole country? Let's get to the bottom line. Unless you've been living totally off the grid, which means you'd never see this show, you know that your mobile phone is spying on you. It's telling private companies
4:06 am
and governments where you've been, who you've met, what you've bought and what you think. All those free apps that we use and love: ever wonder how they're free? Maybe it's because you're the one being bought and sold. We pay by giving up our personal data and our privacy, and in return we get a service. Usually our data is used to try to sell us stuff that we may be interested in, based on our personalities and our preferences. But how about when things take a darker turn and our behavior is predicted? How about tools that can manipulate our behavior, or even what we believe? How about companies that work to turn up the intensity of hate in society? Should we be scared that our data is out there, and in some cases being weaponized? Today we're talking with Brittany Kaiser, who worked for Cambridge Analytica in London before becoming a whistleblower and blowing the lid off that company's inner workings. Now she's a data rights activist and co-founder of Own Your Data. She wrote a book about her experiences, Targeted: My Inside Story of Cambridge Analytica and How Trump, Brexit,
4:07 am
and Facebook Broke Democracy. And David Carroll, who teaches media design at The New School in New York and is one of the best-known crusaders for data protection rights in the United States. Both Brittany and David were featured in the Netflix film The Great Hack, about how the social media platforms that were supposed to connect us have actually divided and paralyzed us. Thank you both so much for joining us. David, let me just start out. You're the only person I know who set out to say, "I'm going to go sue to get the rights to my data," and that is a fascinating step that you took. Why did you take it? And just to cut to the punch line, I know you didn't get your data early on, and I don't think you did as of the end of that film. But I just want to know, is there any chance you'll eventually win in that quest? Sure, thanks Steve. It's great to be here. The initial exploration was probably purely academic, just trying to figure out how this industry works, and also working
4:08 am
in public, posting my research to social media. And it was that way that it attracted the attention of legal minds in Europe, who urged me to seek legal counsel right away. And once I started working with my solicitor, Ravi Naik, we realized we had a very strong data protection case, and the UK Information Commissioner's Office immediately agreed that I had jurisdiction in the UK, because I proved that our data was stored there. So it was the logical step to proceed, to learn more and to see if the law would allow us to exercise our rights. But we were obstructed by the company going out of business. And in the end, somebody leaked the Trump 2016 database to journalists, and it took proactive journalism to actually deliver my data from that leak in the
4:09 am
end. So we also learned some of the limitations of being able to regulate these companies, and the importance of the free press when that fails. Before I bring Brittany into this, because Brittany has a foot in both sides of this story and has had such an interesting life in the data world, I want to tell our folks one of the coolest things about Brittany: she set up Barack Obama's Facebook page ages ago, when he was in the United States Senate, which I find, you know, sort of an innocent beginning to this story that took some darker turns. But I'm interested, broadly, in this question of what should we, David, legitimately feel and think when it comes to our own data? Because I sort of feel like we're surrounded by quick check-off boxes, and we've largely, as a society, given up our rights and leverage in that process. Is that an incorrect assumption I'm making?
4:10 am
I think it's an important characteristic of the feeling, that we are made to feel like we have no control. And indeed, exercising control is a lot of work. So there is an inherent asymmetry at play, and it's about power. But we are learning how to make better decisions, and we are seeing people make different decisions when they're offered a fair choice. So, for example, when Apple upgraded iOS, it asked users if they want to be tracked in their apps, and surprisingly, most people said no. But usually when we browse the web, we're asked to accept cookies in a manipulative way. So there is much work to be done to empower people to make informed choices, give consent, but then revoke it if they choose. Thanks. Brittany, I mentioned the Obama story because I remember when his Facebook page went up. I think I may have written something around that time. I met President Obama, well, then Senator Obama, before he
4:11 am
was running, and there was this new kind of moment. I was a blogger, and social media was important. There were sort of new waves, new infrastructure being built, and you were part of that in the early days. And then you got into this company called Cambridge Analytica, and I won't tell people all the dimensions of the show, but you became, in a way, part of the machine that took a lot of data that, you know, was out there, basically overtly telling people we can change behavior and we can affect elections. And I'm just interested in how that process worked, because it seems like we went from innocence to darkness rather rapidly, with no speed bumps. Am I wrong in my impression? Over a decade, from the time that I joined Senator Obama's first presidential campaign to the time that I became a whistleblower at Cambridge Analytica, a lot developed in the data world. When we first started building products for the Obama campaign, that was the first time
4:12 am
that data-driven work had ever been applied to social media and politics. And we saw that, all of a sudden, millions of people were registering to vote for the first time, turning out to community events, hosting watch parties for debates, and getting involved in issues that were important to their community. So that seemed like an inherent good, but again, there were ethical and moral guidelines that were implemented on the Obama campaign, where we were only doing positive messaging. Now, fast-forward to Cambridge Analytica, where I was excited to join this company and learn more about how to build much more advanced tools than we had used on the first Obama campaign, of course. But there were clients the company was working for that did not have the same sort of ethical or moral guide, which meant that they were using negative campaigning. I think a lot of people are used to that, especially right ahead of an election. This week it's an amazing time to be talking
4:13 am
about that, because I'm sure that you have seen, or even been targeted by, ads that are saying something negative about other candidates. Now, the amount of data that you produce and your behavioral patterns will tell companies whether you are susceptible to actually being persuaded by those negative messages or not. Some people are, others are not. And a lot of money gets plowed specifically into not just negative campaigning, but fear-based campaigning, based on whether or not you could be persuaded to change your vote. And that's one of the biggest problems with data collection: the more data that we are producing, the easier it is to predict our behavior and therefore to influence it in the end. That's why I decided to become a whistleblower, and I've been working on how we actually change that, both on the educational and awareness front and the legislative and regulatory front, as well as developing new technologies that allow you to have more control over how your data is used, or not used at all. Well, really, one other element here,
4:14 am
and again, please tell me, I'm a layperson in this discussion; you both have been in it a lot more. We all focused on Cambridge Analytica. I wrote about Cambridge Analytica, we saw the story, we saw the issues regarding Facebook and governance, and we've had lots of discussions even right now in this country as Twitter is purchased by Elon Musk, and we see all this stuff going on. I'd love to get both of your views on that, but I guess I really want to know: was Cambridge Analytica the tip of the iceberg? Are the behaviors, the capacities to in a way repeat, perhaps less flamboyantly, what Cambridge Analytica did still embedded in our society? Are we still seeing, in democracies not only in the United States, but whether it's in Trinidad or over in France or the UK, or places all around the world, are we still seeing these behaviors as part of the political ecosystem? Absolutely. Unfortunately, many countries around the world have not yet implemented national legislation that protects individuals from their data being taken from them, bought,
4:15 am
sold, traded and used to target them by any organization that has the money to purchase or license that data. Even enough data is available without purchasing or licensing for you to be able to create targeted communications and campaigns for political or commercial purposes. So now, instead of one Cambridge Analytica, we have hundreds or thousands of companies that are doing something similar and are not under any sort of regulation or legislation that would be able to protect people's rights. So I think one of the biggest issues we have is not just passing that legislation, but also helping inform people so they can be more conscious consumers of digital content, and understand when they are being targeted with certain content that is meant to influence them, and start to take smarter steps in order to protect themselves from that type of influence operation. David, let me ask you the same thing. Is this
4:16 am
a Pandora's box that can't be closed again? You know, in the film The Great Hack, it's very disturbing to me to see visual images of Black Lives Matter protests encouraging, you know, almost violent gatherings, and then the Blue Lives Matter side, designed to clash and collide, funded by Russian sources, we now know. And it just makes one wonder today as we look at the toxicity, and I know the US political culture better than others, but the toxicity here, whether we still have this practice underway, and what your views are about somehow getting control of that again. Yeah, I think the interesting thing is the need for these national privacy laws and protection rights to be enforced and put into place. Because in the United States, we don't necessarily, we don't have the right to ask, for example, a political campaign or its vendors for our data, to be able to even know if
4:17 am
abusive practices are being used, if people are being marked and targeted for fear-based campaigns, or are being marked to demobilize them from participation. So I think what the Cambridge Analytica story taught me is that being able to get your basic voter profile seems to be an important aspect of 21st-century democracy, to even be able to know if you're the target of abusive practices. And then, if you are, what could you do about it? Is there an authority to complain to? Do you have rights in a court to prevent this from happening or to prosecute people for engaging in unlawful practices? We had that ability because of the weirdness that Steve Bannon decided to send US voter data to the UK, and then weirdly make it
4:18 am
a part of that country's jurisdiction. If Steve Bannon had kept the Cambridge Analytica data in the United States, I would not have been able to do what I did. So fascinating. I must ask you both real quick: are free and fair elections in the United States and the UK really possible now? Brittany? I believe it's possible, but it's going to take generational change on the education and awareness front for people to understand what happens in elections, like the types of content they're being shown, the amount of money that is spent on disinformation, both from American political organizations as well as foreign countries that are paying to influence and change who might be elected and who might attain certain seats. So that's one thing. Secondly, on the legislation front, we still don't have federal legislation in the United States. There are plenty of countries around the world,
4:19 am
especially with supranational legislation in Europe, that are starting to protect people and make these types of data-driven political operations a lot more difficult to do. But there's still so much of the world that has a lot to learn and a lot of work to do on the legislative and regulatory front, where I think the fastest way we're going to see change is by implementing new types of technology. Interestingly enough, we're going to see over the coming months and years what Elon Musk decides to do with Twitter. One of the biggest issues we've had in elections, something that both Cambridge Analytica used and plenty of organizations around the world use, is fake accounts. So when you create fake accounts on social media, a lot of those are run by bot farms or are run by AI, and that allows hundreds or thousands or tens of thousands, or even millions, of messages to be pushed out and shared around the world, and those are not by real people. Elon has promised two things to the public: one, to
4:20 am
get rid of those fake accounts by KYC-ing everyone on the platform, which is going to make a huge impact on making social media safer. And secondly, to open-source the algorithm. One of the things that a lot of legislators and activists like myself have been collaborating on is figuring out how some of these social targeting algorithms actually work, so we can ban certain uses of algorithms to make social media safer and to make our democracy stronger. David, what do we need to do to make sure that elections in this country are free and fair? I would like to see national data protection and data rights legislation be passed in this country, so that individual voters can get a handle on how their personal data is being used in the political process, and then mechanisms to redress abuses and to inhibit actors from
4:21 am
engaging in abusive practices, and creating the kind of transparency that would be necessary. And I think a challenge is that politicians and their campaigns and their super PACs and their donors do not want to be regulated, do not want scrutiny, and they're the ones who make the rules. And so we are in a catch-22 position: we are asking our politicians to allow us some insight and visibility into how they target us for issues and campaigns and campaigning. And it's a tall order, but other countries are doing it, so it's possible. David, from what you've seen from Elon Musk thus far, do you have confidence that he's going to make this a better platform? Unfortunately, he has motivated a number of people to leave the platform and to seek new places elsewhere on the
4:22 am
web, in newly emerging decentralized social networks that cannot be purchased and controlled by a billionaire. So it remains to be seen if advertisers will come back to the platform. It remains to be seen if fellow academics like myself feel comfortable and safe there. But it is giving people reason to find alternatives, and I think he should be concerned that he is driving away people, because the power of the platform is the people who are on it. Brittany, what I found fascinating in your book, and I did read your book, was the journey you went through, but also your experiences with Daniel Ellsberg. I met Daniel Ellsberg about 30 years ago. We were going into Lawrence Livermore labs. There was a big sign above our car that said, "No classified information will be discussed beyond this point," and Daniel Ellsberg said to me, "Let me think of something classified to tell you, Steve." And it was this very electric moment where I felt awkward. I'll never forget
4:23 am
it. But, you know, it makes me wonder to some degree about the role you played at Cambridge Analytica as a whistleblower, and others of your colleagues there, and Chelsea Manning, Edward Snowden, where, through them, we learned so much more about essentially the world of official secrecy and about the technical capacity and the digital capacity behind a lot of that intelligence and, essentially, spying, if you will. And I'm wondering what your thoughts are on how important that kind of role is; it's not something often discussed. Should we be embracing the Snowdens and the Mannings of the world, or is that, you know, a bridge too far? I do a lot of work on supporting whistleblowers. I usually have at least one new whistleblower a week reach out to me, and I introduce them to lawyers or the government departments or agencies to which they would need to submit their evidence
4:24 am
in order to go forward with their complaints. And I believe that we need to strengthen whistleblower law, both in the United States and Europe and in many other countries around the world that don't even have national whistleblowing law. What's so important about what whistleblowers do and are able to provide is original evidence of wrongdoing, whether that be crime, fraud, waste, etc. Those are the types of issues that whistleblowers bring up, and that allows not just companies or government organizations to reform, but it allows everyone to be kept safe, both internally and externally to an organization. So when you become a whistleblower, you don't know what's going to happen and you don't know who is going to retaliate against you. You don't know if you're going to be blacklisted or ever going to be able to work again. You don't know if you are going to be targeted or if your family will be targeted. It's normally described as a crisis-of-conscience moment, which means that if you decided not to do this,
4:25 am
you wouldn't be able to live with yourself. So you do it regardless of the consequences, and a lot of people lose a lot of their rights and have had much bigger issues than I did. I was very lucky to be in a position where I was talking about an issue that the entire world started to care about, and so I had many more supporters than detractors, and I've been able to work with legislators and regulators around the world, educators, technologists, in order to help close the gap between technology and ethics. What was so amazing when I first met Daniel Ellsberg, about a month after I became a whistleblower: he sat down with me and he said, "You look very young, how old are you?" I told him I was 30, and he laughed: "I was also 30 when I became a whistleblower." There was just something about that. And we had a talk about what it actually meant to make that final decision to go in with your evidence and hope for the best, knowing that it's possible that you will face incredibly
4:26 am
negative repercussions, but, you know, you have to do it for the public good. You know, thank you for that. David, you're active both as an academic and as an activist, and you write a lot about these issues in the United States. And I imagine in the next Congress that is elected, this is going to become one of the hottest issues. And I'm just interested, as you see it, what are the sort of guardrails and what are the blind spots? We're having this conversation, and, you know, it occurs to me that I had never heard of Pegasus software, one of the ways in which, allegedly, the Saudis tracked down my friend Jamal Khashoggi, through cell phones and state-based systems that enabled an incredible degree of surveillance. But as you know, I was sort of joking in talking about how people's cell phones are spying on them, their cars are reporting information on where they drive. We're all part of a data info system, and it just makes me wonder, are we basically hamsters inside a very complex trap now? And
4:27 am
how do we get out of it? But as I just mentioned, the guardrails and blind spots you see in this discussion. Sure. The big blind spot that attracted me to the scandal initially was the very porous, almost imperceptible boundaries between the election industry and what we refer to as the military-industrial complex; the way that defense contractors and private intelligence companies operate in the same space, with the same techniques, the same people, potentially the same data sets. So the way that the parent company of Cambridge Analytica really started out in defense and then used its techniques for elections and commercial advertising. The failure for us to erect a firewall between intelligence,
4:28 am
private intelligence, and then the data ecosystem, was one of the key alarming elements, and I'm not sure if we have succeeded at preventing other examples of this dangerous overlap between defense and civilian activity. Well, listen, we will have to leave it there. Data rights activist Brittany Kaiser, co-founder of Own Your Data, and I have to say, Brittany, great necklace there, as you're wearing the Own Your Data jewelry, and David Carroll, associate professor of media design at the Parsons School of Design, thank you so much for joining us. Thank you, Steve. Thank you, David. So what's the bottom line? We all go online and our data's out there. Companies figure us out and they send us ads to satisfy our desires. It might start by suggesting a brand of deodorant or a vacation destination, but it winds up recommending who to hate or who to love or who to vote for. Most folks will say "c'est la vie,"
4:29 am
but it's very much like doping in sports: once you know the competition is rigged and cheating is rampant, the contest is undermined. That's why democracy seems like it's under threat everywhere. The feeling that everything is being manipulated, our voices don't matter and there's nothing we can do about it, well, that's turning people away from hope and change. I don't know if owning your data, like David Carroll and Brittany Kaiser suggest, is a workable solution. But at least being aware of the dangers that exist as we click boxes and give away permissions all day is a good place to start. And that's the bottom line. On Counting the Cost: FTX's empire has collapsed. What's next for the crypto industry? Why are tech companies laying off thousands of employees? Plus, we explore whether the war in Ukraine could speed up the transition to renewable energy. Counting the Cost, on Al Jazeera.
4:30 am
It's a time to be direct. There is a growing realization that rights can be taken away in this country. To cut through the rhetoric, how can we resist this dangerous narrative and demand the truth? Join me, Marc Lamont Hill, for UpFront, on Al Jazeera. I've been covering all of Latin America for most of my career, but no two countries are alike, and it's my job to shed light on how and why. Now, I'm Carrie Johnson in Doha with the top...
