tv   The Bottom Line  Al Jazeera  November 18, 2022 11:00pm-11:31pm AST

11:00 pm
i'm just masking over it. oh hey, the planet sos on al jazeera. from sports champion to bed-bound, fault lines investigates how the nfl escaped paying billions of dollars to former players who suffered devastating brain damage playing america's most popular sport. i'm sorry you didn't die before 2015. good luck to you and your family. i knew murray got c t e from football. when i looked at it, it's all about his brain being damaged because of the blows when he played that position. blood sports, on al jazeera. i care about how the u.s. engages with the rest of the world. we're really interested in taking you into a place you might not visit otherwise. it feels like you were there. i'm neave barker, in london, with the top stories on al jazeera. us attorney general,
11:01 pm
merrick garland has named a special counsel to oversee the justice department investigations into donald trump. they involve the former president's handling of sensitive documents and the aftermath of the 2020 election. garland's announcement comes 3 days after trump announced he would run for president again in 2024. based on recent developments, including the former president's announcement that he is a candidate for president in the next election, and the sitting president's stated intention to be a candidate as well, i have concluded that it is in the public interest to appoint a special counsel. such an appointment underscores the department's commitment to both independence and accountability in particularly sensitive matters. it also allows prosecutors and agents to continue their work expeditiously and to make decisions indisputably guided only by the facts and the law. our correspondent
11:02 pm
rosalind jordan has more now from washington, d.c. merrick garland has made the decision, which is not a popular one according to some on social media, to give this work to another lawyer, someone who has justice department experience, someone who has been involved in the investigation of criminal cases, someone who has been involved in matters that are very difficult to prosecute. he noted that mr. smith has experience dealing with war crimes in kosovo. and so this is a situation where the justice department is trying to make certain that this work is completed, but that there is not any way of accusing the department of trying to tilt the scales for or against donald trump. kenya's former president and rwanda's leader have agreed on the need for fighters to withdraw from captured territory in eastern democratic republic of congo. m23 fighters have made major gains in recent
11:03 pm
weeks, advancing towards goma as fighting with government forces intensifies. the congolese government has accused rwanda of supporting the group. rwanda denies this. attempts to restore peace will resume on monday in nairobi. hundreds of people have been killed and nearly 200,000 residents have been forced from their homes. malcolm webb has more. kenya's foreign affairs ministry said in a statement that former kenyan president uhuru kenyatta had a phone call with rwandan president paul kagame, in which kagame agreed on the need for an immediate ceasefire. kagame also agreed to assist the east african community facilitator, that is kenyatta, who is facilitating peace talks, in urging the m23 to cease fire and withdraw from the captured territories. twitter's troubles appear to be deepening, with reports that hundreds more employees have quit in response to an ultimatum from new chief executive elon musk.
11:04 pm
he told workers to sign up for long hours at high intensity on thursday or leave with 3 months' severance pay. twitter's office buildings around the world have been closed until next week, casting doubts on the social media platform's ability to keep operating. on friday, musk emailed staff asking to meet employees who write software code at twitter's office in san francisco. ukraine's prime minister says waves of russian missile strikes have crippled almost half the country's energy system as temperatures drop and snow continues to fall in the capital. officials in kyiv are warning that a complete shutdown of the city's power grid is possible. the authorities have been working to restore power nationwide after strikes in several regions left millions of people without power on thursday. the u.n. has warned of a humanitarian emergency because of power and water shortages. just 2 days before the world cup begins, organizers have announced that alcohol will not be sold at the stadiums. it's a reversal of the original plan to sell beer around the 8 stadium sites within the
11:05 pm
ticketing perimeter. the public consumption of alcohol and public intoxication are both illegal in qatar, but alcohol will still be available in fan parks and licensed bars. alright, you're up to date. those are the top stories. stay with us, the bottom line is coming up next here on al jazeera. hi, i'm steve clemons and i have a question. how much do social media companies know about you? and how easy is it to manipulate you, or me, or a whole country? let's get to the bottom line. unless you've been living totally off the grid,
11:06 pm
which means you'd never see this show, you know that your mobile phone is spying on you. it's telling private companies and governments where you've been, who you've met, what you've bought, and what you think. all those free apps that we use and love: ever wonder how they're free? maybe it's because you're the one being bought and sold. we pay by giving up our personal data and our privacy, and in return we get a service. usually our data is used to try to sell us stuff that we may be interested in, based on our personalities and our preferences. but how about when things take a darker turn and our behavior is predicted? how about tools that can manipulate our behavior, or even what we believe? how about companies that work to turn up the intensity of hate in society? should we be scared that our data is out there, and in some cases being weaponized? today we're talking with brittany kaiser, who worked for cambridge analytica in london before becoming a whistleblower and blowing the lid off that company's inner workings. now she's a data rights activist and co-founder of own your data. she wrote a book about her experiences, targeted: my inside story of cambridge analytica and
11:07 pm
how trump, brexit, and facebook broke democracy. and david carroll, who teaches media design at the new school in new york, and one of the best known crusaders for data protection rights in the united states. both brittany and david were featured in the netflix film the great hack, about how the social media platforms that were supposed to connect us have actually divided and paralyzed us. thank you both so much for joining us. david, let me just start out. you're the only one i know who set out to say, i'm going to go sue to get the rights to my data. and that is a fascinating step that you took. why did you take it? and just to cut to the punch line, i know you didn't get your data, or at least i think you didn't, as of the end of that film. but i just want to know, is there any chance you'll eventually win in that quest? sure, thanks steve. it's great to be here. the initial exploration was probably
11:08 pm
purely academic, just trying to figure out how this industry works, and also working in public, posting my research to social media. and it was that way that it attracted the attention of legal minds in europe, who urged me to seek legal counsel right away. and once i started working with my solicitor ravi naik, we realized we had a very strong data protection case. and the u.k. information commissioner's office immediately agreed that i had jurisdiction in the u.k., because i proved that our data was stored there. so it was the logical step to proceed, to learn more and to see if the law would allow us to exercise our rights. but we were obstructed by the company going out of business. and in the end, somebody leaked the trump 2016 database to journalists, and it took
11:09 pm
active journalism to actually deliver my data from that leak in the end. so we also learned some of the limitations of being able to regulate these companies, and the importance of the free press when that fails. before i bring brittany into this, because brittany has a foot in both sides of this story and such an interesting life in the data world, i want to tell our folks one of the coolest things about brittany: she set up barack obama's facebook page ages ago, when he was in the united states senate, which i find, you know, sort of an innocent beginning to this that took some, you know, darker turns. but i'm interested, you know, broadly in this question of what should we, david, legitimately feel and think when it comes to our own data? because i sort of feel like we're surrounded by quick check-off boxes, and we've largely, as a society, given up our rights and leverage in that process. is that
11:10 pm
an incorrect assumption i'm making? i think it's an important characteristic of the situation that we are made to feel like we have no control. and indeed, exercising control is a lot of work. so there is an inherent asymmetry at play, and it's about power. but we are learning how to better make decisions, and we are seeing people make different decisions when they're offered a fair choice. so for example, when apple upgraded ios, it asked users if they want to be tracked in their apps. and surprisingly, most people said no. but usually when we browse the web, we're asked to accept cookies in a manipulative way. so there is much work to be done to empower people to make informed choices, give consent, but then revoke it if they choose. thanks. brittany, i mentioned the obama story because i remember when his facebook page went up. i think i may have written
11:11 pm
something around that time. i met president obama, well, then-senator obama, before he was running, and there was this new kind of thing. i was a blogger, and social media was important. there were sort of new waves, new infrastructure being built, and you were part of that in the early days. and then you got into this company called cambridge analytica. and i won't tell people all the dimensions of the show, but you became in a way part of the machine that took a lot of data that, you know, was out there, basically overtly telling people we can change behavior and we can affect elections. and i'm just interested how that process worked, because it seems like we went from innocence to darkness rather rapidly, with no speed bumps. am i wrong in my impression? over a decade, from the time that i joined senator obama's first presidential campaign to the time that i became a whistleblower at cambridge analytica, a lot developed in data buying and building.
11:12 pm
when we first started building product for the obama campaign, that was the first time that data-driven work had ever been applied to social media and politics. and we saw that all of a sudden, millions of people were registering to vote for the first time, turning out to community events, hosting watch parties for debates, and getting involved in issues that were important to their communities. so this seemed like an inherent good, but again, there were ethical and moral guidelines that were implemented on the obama campaign, where we were only doing positive messaging. now, fast forward to cambridge analytica, where i was excited to join this company and learn more about how to build much more advanced tools than we had used on the first obama campaign, of course. but there were clients that were working with the company that did not have the same sort of ethical or moral guidelines, which meant that they were using negative campaigning. i think a lot of people are used to that, especially right ahead of an election this week. it's an amazing time to be talking
11:13 pm
about this, because i'm sure that you have seen, and maybe even been targeted with, ads that are saying something negative about other candidates. now, the amount of data that you produce and your behavioral patterns will tell companies whether you are susceptible to actually being persuaded by those negative messages or not. some people are, others are not. and a lot of money gets plowed specifically into not just negative campaigning, but fear-based campaigning, based on whether or not you could be persuaded to change your vote. and that's one of the biggest problems with data collection: the more data that we are producing, the easier it is to predict our behavior, and therefore to influence it in the end. that's why i decided to become a whistleblower, and i've been working on how we actually change that, on the educational and awareness fronts and the legislative and regulatory front, as well as developing new technologies that allow you to have more control over how
11:14 pm
your data is used, or not at all. well, really one other element here, and again, please tell me if i'm wrong: i'm a layperson in this discussion, and you both have been in it a lot more. we all focused on cambridge analytica. i wrote about cambridge analytica, we saw the story, we saw the issues regarding facebook and governance, and we've had lots of discussions even right now in this country as twitter is purchased by elon musk and we see all this stuff going on. i'd love to get both your views on that. but i guess i really want to know: was cambridge analytica the tip of the iceberg? are the behaviors, the capacities to in a way repeat, perhaps less flamboyantly, what cambridge analytica did, still embedded in our society? are we still seeing, in democracies not only in the united states, but whether it's in trinidad or over in france or the u.k. or places all around the world, are we still seeing these behaviors as part of the political ecosystem? absolutely. unfortunately, many countries around the world have not yet implemented national legislation that
11:15 pm
protects individuals from their data being taken from them, bought, sold, traded, and used to target them by any organization that has the money to purchase or license data. enough data is even available without purchasing or licensing for you to be able to create targeted communications and campaigns for political or commercial purposes. so now, instead of one cambridge analytica, we have hundreds or thousands of companies that are doing something similar and are not under any sort of regulation or legislation that would be able to protect people's rights. so i think one of the biggest issues we have is not just passing that legislation, but also helping inform people, for them to be more conscious consumers of digital content, to understand when they are targeted with certain content that is meant to influence them, and to start taking smarter action in order to protect themselves from that type of influence operation.
11:16 pm
david, let me ask you the same thing. is this a pandora's box that can't be closed again? you know, in the film the great hack, it's very disturbing to me to see visual images of black lives matter protests encouraging, you know, almost violent gatherings, and then the blue lives matter side designed to clash and collide, funded by russian sources, we now know. and it just makes one wonder today, as we look at the toxicity, and i know the u.s. political culture better than others, but the toxicity here, whether we still have this practice underway, and what your views are about somehow getting control of that again. yeah, i think the interesting thing is the need for these national privacy laws and protection rights to be enforced and put into place. because in the united states, we don't have the right to ask, for example, a political campaign or its vendors for our data, to be able to even know if
11:17 pm
abusive practices are being used, if people are being targeted for fear-based campaigns, or are being marked to demobilize them from participation. so i think what the cambridge analytica story taught me is that being able to get your basic voter profile seems to be an important aspect of 21st century democracy, to even be able to know if you're the target of abusive practices. and then if you are, what could you do about it? is there an authority to complain to? do you have rights in a court to prevent this from happening, or to prosecute people for engaging in unlawful practices? we had that ability because of the weirdness that steve bannon decided to send u.s. voter data to the u.k. and then weirdly make it
11:18 pm
a part of that country's jurisdiction. if steve bannon had kept the cambridge analytica data in the united states, i would not have been able to do what i did. what's so fascinating to me is to ask you both, real quick: are free and fair elections in the united states and the u.k. really possible now? brittany. i believe it's possible, but it's going to take generational change on the education and awareness front for people to understand what happens in elections, like the types of content they're being shown, the amount of money that is spent on disinformation, both from american political organizations as well as foreign countries that are paying to influence and change who might be elected and who might attain certain seats. so that's one thing. secondly, on the legislation front, we still don't have federal legislation in the united states. there are plenty of
11:19 pm
countries around the world, especially with supranational legislation in europe, that are starting to protect people and make these types of data-driven political operations a lot more difficult to do. but there's still so much of the world that has a lot to learn and a lot to work on, on the legislative and regulatory front, where i think the fastest way we're going to see change is by implementing new types of technology. interestingly enough, we're going to see over the coming months and years what elon musk decides to do with twitter. one of the biggest issues we've had in elections, something that both cambridge analytica used and plenty of organizations around the world use, are fake accounts. so when you create fake accounts on social media, a lot of those are run by bot farms or run by ai, and that allows hundreds or thousands or tens of thousands or even millions of messages to be pushed out and shared around the world. and those are not by real people.
11:20 pm
elon musk has promised 2 things to the public: one, to get rid of those fake accounts by verifying everyone on the platform. that is going to make a huge impact on making social media safer. and secondly, to open source the algorithm. one of the things that a lot of legislators and activists like myself have been collaborating on is figuring out how some of these social targeting algorithms actually work, so we can ban certain uses of algorithms to make social media safer and to make our democracy stronger. david, what do we need to do to make sure that elections in this country are free and fair? i would like to see national data protection and data rights legislation be passed in this country, so that individual voters can get a handle on how their personal data is being used in the political process, and then mechanisms to redress abuses and to inhibit actors from
11:21 pm
engaging in abusive practices, and creating the kind of transparency that would be necessary. and i think a challenge is that politicians and their campaigns and their super pacs and their donors do not want to be regulated, do not want scrutiny, and they're the ones who make the rules. and so we are in a catch-22 position. we are asking our politicians to allow us some insight and visibility into how they target us for issues and campaigns and campaigning. and it's a tall order, but other countries are doing it, so it's possible. david, from what you've seen from elon musk thus far, do you have confidence that he's going to make it a safer or better platform? unfortunately, he has motivated a number of people to leave the platform and to seek new places elsewhere on the
11:22 pm
web, in newly emerging decentralized social networks that cannot be purchased and controlled by a billionaire. so it remains to be seen if advertisers will come back to the platform. it remains to be seen if fellow academics like myself feel comfortable and safe there. but it is giving people reason to find alternatives, and i think he should be concerned that he is driving away people, because the power of the platform is the people who are on it. brittany, what i found fascinating in your book, and i did read your book, was the journey you went through, but also your experiences with daniel ellsberg. and i met daniel ellsberg about 30 years ago. we were going into lawrence livermore labs, and there was a big sign above our car that said, no classified information will be discussed beyond this point. and daniel ellsberg said to me, let me think of something classified to tell you,
11:23 pm
steve. and it was this very tricky moment. i felt awkward. i'll never forget it. but, you know, it makes me wonder to some degree whether, through the role you played at cambridge analytica as a whistleblower, and others like chelsea manning and edward snowden, we learned so much more about essentially the world of official secrecy, and about the technical capacity and the digital capacity behind a lot of that intelligence and, essentially, spying, if you will. and i'm wondering what your thoughts are on how important that kind of role is. it's not something often discussed. should we be embracing the snowdens and the mannings of the world? or is that, you know, a bridge too far? i do a lot of work on supporting whistleblowers. i usually have at least one new whistleblower a week reach out to me, and i introduce them to lawyers, or to the government departments or agencies to which they would need to submit their evidence
11:24 pm
in order to go forward with their complaints. and i believe that we need to strengthen whistleblower law, both in the united states and in europe, and in many other countries around the world that don't even have national whistleblowing laws. what's so important about what whistleblowers do and are able to provide is original evidence of wrongdoing, whether that be crime, fraud, waste, etc. those are the types of issues that whistleblowers bring up, and that allows not just companies or government organizations to reform, but it allows everyone to be kept safe, both internally and externally to an organization. so when you become a whistleblower, you don't know what's going to happen, and you don't know who is going to retaliate against you. you don't know if you're going to be blacklisted or ever going to be able to work again. you don't know if you are going to be targeted, or if your family will be targeted. it's normally described as
11:25 pm
a crisis of conscience moment, which means if you decided not to do this, you wouldn't be able to live with yourself. so you do it regardless of the consequences, and for a lot of people, they lose a lot of their rights and face much bigger issues than i did. i was very lucky to be in a position where i was talking about an issue that the entire world started to care about, and so i had many more supporters than detractors. and i've been able to work with legislators and regulators around the world, and educators in technology, in order to help close the gap between technology and ethics. what was so amazing when i first met daniel ellsberg, about a month after i became a whistleblower: he sat down with me and he said, you look very young, how old are you? i told him i was 30, and he laughed. i was also 30 when i became a whistleblower, he said. there was just something about that. and we had a talk about what it actually meant to make that final decision to go in with your
11:26 pm
evidence and hope for the best, knowing that it's possible that you will face incredibly negative repercussions, but you know you have to do it for the public good. thank you for that. david, you're active both as an academic and as an activist, and you write a lot about these issues in the united states. and i imagine in the next congress that is elected, this is going to become one of the hottest issues. and i'm just interested in what you see as the sort of guardrails, and what are the blind spots? we're having this conversation, and, you know, it occurs to me that i had never heard of pegasus software, one of the ways in which, allegedly, the saudis tracked down my friend jamal khashoggi, through cell phones and state-based systems that enabled an incredible degree of surveillance. but, as you know, i was sort of joking in talking about how people's cell phones are spying, their cars are reporting information on where they drive. we're all part of a data info system, and it just makes me wonder,
11:27 pm
are we basically hamsters inside a very complex trap now, and how do we get out of it? and i just mentioned the guardrails and blind spots you see in this discussion. sure. the big blind spot that attracted me to the scandal initially was the very porous, almost imperceptible boundaries between the election industry and what we refer to as the military-industrial complex: the way that defense contractors and private intelligence companies operate in the same space, with the same techniques, the same people, potentially the same data sets. so the way that the parent company of cambridge analytica really started out in defense, and then used its techniques for elections and commercial advertising. the failure for us to erect a firewall between intelligence,
11:28 pm
private intelligence, and then the data ecosystem was one of the key alarming elements, and i'm not sure if we have succeeded at preventing other examples of this dangerous overlap between defense and civilian activity. well, listen, we will have to leave it there. data rights activist brittany kaiser, co-founder of own your data (and i have to say, brittany, great necklace there, as you're wearing the own your data jewelry), and david carroll, associate professor of media design at the parsons school of design. thank you so much for joining us. thank you, steve. thank you, david. so what's the bottom line? we all go online and our data's out there. companies figure us out and they send us ads to satisfy our desires. so it might start by suggesting a brand of deodorant or a vacation destination, but it winds up recommending who to hate, or who to love, or who to vote for. most
11:29 pm
folks will say c'est la vie, but it's very much like doping in sports: once you know the competition is rigged and cheating is rampant, the contest is undermined. that's why democracy seems like it's under threat everywhere. the feeling that everything is being manipulated, our voices don't matter, and there's nothing we can do about it. well, that's turning people away from hope and change. i don't know if owning your data, like david carroll and brittany kaiser suggest, is a workable solution, but at least being aware of the dangers that exist as we click boxes and give away permissions all day is a good place to start. and that's the bottom line. a new series follows football fans from 6 countries hoping to make it to qatar 2022, each with a world cup dream.
11:30 pm
i will do whatever it takes for the world cup dream. iran, on al jazeera. hello, i'm neave barker in london with the top stories.
