
tv   [untitled]    September 17, 2021 11:30pm-12:00am AST

11:30 pm
A story that needed to be told from the heart of the affected area. To be there to tell the people's story was very important at the time. Now to our main story. US officials have admitted that a drone strike in the Afghan capital, Kabul, last month mistakenly killed 10 civilians and not ISIL fighters, as was claimed at the time, an investigation has found. The August 29th strike killed an innocent aid worker along with nine members of his family, including seven children. The strike was one of the US military's final acts in the country before ending its 20-year military operation. I am now convinced that as many as 10 civilians, including up to seven children, were tragically killed in that strike. Moreover,
11:31 pm
we now assess that it is unlikely that the vehicle and those who died were associated with ISIS-K or were a direct threat to US forces. I offer my profound condolences to the family and friends of those who were killed. This strike was taken in the earnest belief that it would prevent an imminent threat to our forces and the evacuees at the airport, but it was a mistake, and I offer my sincere apology. As the combatant commander, I am fully responsible for this strike and its tragic outcome. Now, France has recalled its ambassadors to the US and Australia as part of a massive diplomatic backlash against a new security pact that cancelled a $40bn French deal. The UK, US and Australia announced the defence agreement on Wednesday, and it was widely seen as an effort to counter Chinese influence in the Pacific. Under the deal, Australia would get help to develop nuclear-powered submarines, with the pact effectively
11:32 pm
cancelling the previous deal, under which France would provide the Australian government with submarines. The French foreign minister called it a stab in the back and accused the Biden administration of behaving like Donald Trump. And a panel of independent advisers to the United States Food and Drug Administration has voted against COVID vaccine booster shots produced by Pfizer-BioNTech. The FDA will take the panel's recommendation into consideration before making a decision. Many committee members were critical of the booster plan, arguing that the data presented by Pfizer and the FDA is incomplete, and that the request for approval for people as young as 16 years old is too broad. Most of them said they would support boosters for older Americans. So much more in the News Hour, coming up in 25 minutes' time with myself. I will see you then. All Hail the Algorithm is next.
11:33 pm
There is a huge group of people at work behind the screen. They're called behaviour architects,
11:34 pm
persuasive designers and user experience specialists, and the power they have is massive. That urge to keep swiping through your Twitter feed? That's design. The way we all click "I agree" to the terms and conditions? That's design. Swiping left or right on Tinder? That's design too. We live in an online world that someone else is making, and most of us never even give it a second thought. And actually, that's design as well. This is San Francisco, the mecca for tech designers, home to Silicon Valley. This place pioneered the art of constructing, optimising and enhancing a lot of the technology we use every day. It's turbocharged
11:35 pm
the speed at which we use the internet and made navigating the web more intuitive. But it's also given us a false sense of security. I've lost count of the number of times I've clicked "I agree" to get into a website. We all have to do it. As we flit around the internet, we face hundreds of these annoying pop-ups and consent forms, all demanding a response from us. And yes, I call them annoying, because that is exactly what they are. They may look like they provide us with controls, but the reality is far from it. When users click on "I agree" to the terms and conditions, or they see a privacy policy and they click on it, they may think that they're actually being given control over how their personal data is collected and how it's used. And it's advantageous to companies for them to think that, because if something bad happens, the tech company can then say: you actually agreed to this. Nobody ever reads the
11:36 pm
terms of service. No one should ever be expected to. If we actually tried to read every terms and conditions agreement that we came across, it would probably be the only thing we would do; it would have to be our day job, because they're so long and we come into contact with so many. It may have the veneer of giving control to data subjects, but ultimately it's window dressing. Woodrow Hartzog is what you'd call a scholar of design. He's been studying the psychology, ethics and legal implications of new technologies. One area he specialises in is data protection and privacy. Now, before we get into this, there's a key term you need to know: informed consent. This is a principle that comes up a lot in discussions of our rights online, but it's easier to explain in the context of medicine. A doctor explains potential risks and worst-case scenarios to the patient. Once you're fully informed, you have the option to consent to surgery or not. In the online world, informed
11:37 pm
consent is what everyone says is the ideal. But is that even possible? Consent only works under a very narrow set of conditions: when the decision is infrequent, like with surgery, which we don't have all the time; when the risks are visceral, things that we can easily conjure up in our mind; and finally, when the harm is possibly great. If things go wrong with surgery, you could get sick or you could die, so we've got an incredible incentive to take that decision seriously. But of course, none of those things are present in the data ecosystem. We make these decisions quite frequently, ten to a hundred times a day. The harms are not visceral at all; they're incredibly opaque. And finally, the harm is not even that great, because modern privacy harms usually aren't huge: it's death by a thousand cuts. The spin from Silicon Valley is that they're on our side when it comes to how we control our data. Long privacy policies are very confusing, and if you make it long and spell out all
11:38 pm
the detail, then you're probably going to reduce the percentage of people who read it. However, take a quick look at the design of the buttons and the pop-ups we're made to click, and it's clear that the tech companies have the upper hand in the data battle. Design is power. Every single design decision makes a certain reality more or less likely, and what tech companies and psychologists have known for years is that defaults are notoriously sticky. So if you design the interface so that all the defaults are set to maximise exposure, then you're going to get a lot more data in the aggregate than you would if you set all the defaults to privacy-protective, because people don't go in and change them. So until we fundamentally change the incentives, we're still going to see companies manipulating the design of these buttons, these technologies, to ensure that you keep disclosing data and they keep getting what is the lifeline of their business.
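To make "sticky defaults" concrete, here is a minimal sketch of the idea. The settings object and field names are invented for illustration and don't come from any real product.

```typescript
// Hypothetical illustration of sticky defaults. The same settings
// shape can ship with two very different default objects; whichever
// one ships determines what most users actually share, because few
// people ever open the settings screen.
interface PrivacySettings {
  personalizedAds: boolean;
  locationHistory: boolean;
  shareWithPartners: boolean;
}

// Exposure-maximising defaults: the user has to opt OUT.
const exposureMaximizing: PrivacySettings = {
  personalizedAds: true,
  locationHistory: true,
  shareWithPartners: true,
};

// Privacy-protective defaults: the user has to opt IN.
const privacyProtective: PrivacySettings = {
  personalizedAds: false,
  locationHistory: false,
  shareWithPartners: false,
};
```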
11:39 pm
Most of us assume that when we go on a website and click that "I agree" button, the site simply collects information that we voluntarily choose to share. In reality, there are many layers to data collection, and the mechanics of it are invisible, hidden by design. For starters, it isn't just the website you are on that's monitoring your clicks. There are so-called third-party advertising, marketing and analytics agencies also tracking you. Using tiny bits of software (cookies, beacons, pixels), they scoop up incredibly detailed information: everything from the computer you're using to how long you hover over a link. Honestly, it's a bit mind-boggling.
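As a rough illustration of those mechanics, here is what a third-party tracking pixel can look like under the hood: a one-pixel image request that carries details about your visit to a tracker as the page loads. The domain and parameters below are invented for the example.

```typescript
// Hypothetical tracking pixel: the browser "loads an image", but the
// real payload is the query string describing you and your visit.
const beacon = new Image(1, 1);
beacon.src =
  "https://tracker.example.com/pixel.gif?" +
  new URLSearchParams({
    page: location.href,                        // the page you are reading
    referrer: document.referrer,                // where you came from
    screen: `${screen.width}x${screen.height}`, // one fingerprinting signal
    lang: navigator.language,
  }).toString();
document.body.appendChild(beacon); // a 1x1 image no one is meant to see
```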
11:40 pm
Informed consent is a fundamentally broken regulatory mechanism for algorithmic accountability. It allows companies to continue to throw risk back onto the user and say, here's a pop-up banner that tells you about cookies, which no one reads and nobody really even cares about. Yet we push forward with this regime as though it matters to people, as though if someone clicks "I agree" then they're magically OK with all the data processing that's going to come afterwards. It's a little bit of a joke and a little bit of a legal fiction, yet it's a major component of almost every data protection framework around the world. Once you've crossed the "I agree" hurdle, you're in the clutches of the website, and this is where design takes on a whole new level of importance. The job is to keep you hooked. One of the most successful innovations in mobile and website design is something called infinite scroll. We all use it every single day: it lets you scroll endlessly through your feed without having to click.
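For the curious, the pattern can be sketched in a few lines of browser code. This is a generic reconstruction, not the inventor's original implementation; the endpoint and element IDs are made up.

```typescript
// A minimal sketch of the infinite-scroll pattern: when a sentinel
// element near the bottom of the feed scrolls into view, fetch more
// items and append them, so the page never offers a stopping point.
const feed = document.querySelector("#feed")!;         // the list of posts
const sentinel = document.querySelector("#sentinel")!; // empty marker div at the end

async function loadMoreItems(): Promise<void> {
  // Invented endpoint; a real feed would page through server-side data.
  const response = await fetch(`/api/feed?after=${feed.childElementCount}`);
  const items: string[] = await response.json();
  for (const text of items) {
    const post = document.createElement("article");
    post.textContent = text;
    feed.appendChild(post);
  }
}

// Fires as the user nears the end of the feed and silently refills it.
new IntersectionObserver((entries) => {
  if (entries.some((entry) => entry.isIntersecting)) void loadMoreItems();
}).observe(sentinel);
```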
11:41 pm
I'm on my way now to meet the creator of this function. His name is Aza Raskin, and he no longer works inside a big tech corporation. In early 2018, he co-founded the Center for Humane Technology. All of our apps, all of Silicon Valley's companies, are competing for our attention, and because it's such a cutthroat game, worth tens of billions of dollars, they have to point increasingly powerful computers at our heads to try to frack us for that attention. We've all had that experience of going to YouTube thinking, I'm just going to watch one video, and then somehow you shake your head and an hour has passed. Why? How is it that the technology has hypnotised us? This tech hypnotisation is key to what's called the attention economy. Our attention is a finite currency in the online world, and it's only as long as websites and apps have our attention that they have our business, and our data.
11:42 pm
The attention economy is just this: if we are not paying for a product, well, the company has to make its money somehow. How do they do it? They do it by selling our attention to advertisers, or to other groups that want us to do something, and they're trying to make those systems as effective as possible at influencing your decisions. They collect as much information about you as they can: who your friends are, how you spend your time, who you have lunch with, how you spend your money. They take all of this data to build a model of you. Imagine a little simulated version of you that lives in a Facebook server. Then they can put things in front of it: are you more likely to click this, this or this? Or, if we want to get you to hate immigration, what kind of message are you going to resonate with, this message or this message?
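As a toy illustration of that "simulator of you" idea (with invented feature names and weights, not Facebook's actual system), message selection can be as simple as scoring candidates against a per-user model and showing the winner:

```typescript
// Hypothetical engagement model: a per-user scorer ranks candidate
// messages, and the feed shows whichever one it predicts you will
// engage with most. All names and numbers here are made up.
type Features = { outrage: number; novelty: number; friendActivity: number };

// Stand-in for a learned model of one user: a simple linear scorer.
const userWeights: Features = { outrage: 0.7, novelty: 0.2, friendActivity: 0.9 };

function predictedEngagement(message: Features): number {
  return (
    message.outrage * userWeights.outrage +
    message.novelty * userWeights.novelty +
    message.friendActivity * userWeights.friendActivity
  );
}

const candidates: Features[] = [
  { outrage: 0.9, novelty: 0.1, friendActivity: 0.0 },
  { outrage: 0.1, novelty: 0.8, friendActivity: 0.3 },
  { outrage: 0.2, novelty: 0.2, friendActivity: 0.9 },
];

// Highest predicted engagement goes to the top of the feed.
candidates.sort((a, b) => predictedEngagement(b) - predictedEngagement(a));
console.log(candidates[0]);
```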
11:43 pm
And you can see how just this race for your attention ends up becoming an entire economy's worth of pressure, with the very smartest minds in engineering and the biggest supercomputers trying to make a model of you, to be able to influence the kinds of decisions you are going to make. A few years ago, YouTube set a company-wide objective to reach a billion hours of viewing a day. Netflix founder Reed Hastings has also said multiple times that the company's biggest competitor isn't another website; it's sleep. So what happens when you give algorithms the goal of maximising our attention and time online? They find our weaknesses and exploit them. In 2017, Sean Parker, founding member of Facebook and its first president, spelled it out. How do we consume as much of your time and conscious attention as possible? That means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or
11:44 pm
a post or whatever. And that's going to get you to contribute more content. It's a social-validation feedback loop. It's exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology. Now, it's not as though Silicon Valley pioneered the tricks and tactics of addictive or persuasive design. Many tech designers openly admit to using insights from behavioural scientists of the early 20th century. The concept of randomly scheduled rewards was studied and developed by American psychologist B.F. Skinner in the 1950s. He created what's become known as the Skinner box, a simple contraption he used to study pigeons. At the start of the process, a pigeon is given a food reward every time it pecks the word "peck" or turns a full circle when the word "turn" appears. As the experiment proceeds, the rewards become less frequent;
11:45 pm
they take place at random. By the time the behaviour has been established, the pigeon keeps pecking or turning, not knowing when it might get the reward, but in anticipation that a reward could come. Skinner boxes were pivotal in demonstrating how design has the power to modify behaviour. And if randomly scheduled rewards work for pigeons, why not humans? Skinner's concept is in fact at the heart of a lot of addictive design, from casino machines such as slot machines to social media apps. Smartphones are uncannily similar to slot machines. Think about your Facebook, Instagram or Pinterest feeds: we all swipe down, pause, and then wait to see what will appear. We're back to those randomly scheduled rewards again. Your swipe could result in a new comment on a photo, a new like, a piece of spam or a software update. We don't really know, and it's that unpredictability that makes it so addictive.
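The schedule itself is easy to simulate. Here is a minimal sketch; the reward list and probability are invented, purely to mirror the narration above.

```typescript
// A toy variable-ratio reward schedule, the pattern Skinner studied:
// each "swipe" pays off at random, so the next reward always feels
// like it could be one more action away.
function swipe(rewardProbability = 0.1): string | null {
  if (Math.random() < rewardProbability) {
    const rewards = ["new comment", "new like", "spam", "software update"];
    return rewards[Math.floor(Math.random() * rewards.length)];
  }
  return null; // nothing this time, which is exactly what keeps you swiping
}

// Simulate 100 feed refreshes: rewards arrive unpredictably, never on
// a fixed schedule, and that unpredictability sustains the behaviour.
for (let i = 0; i < 100; i++) {
  const reward = swipe();
  if (reward !== null) console.log(`swipe ${i}: ${reward}`);
}
```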
11:46 pm
Natasha Dow Schüll is a cultural anthropologist who has spent more than 15 years studying the algorithms behind persuasive design. Just like on a slot machine, when you're texting or when you're looking through the news feed, you really never know what's coming down the pipe. You never know when you're going to hit that jackpot, so to speak, when it's coming, or how much it will be. So the randomness is very important to keeping you hooked in. I think the fact that we're gambling money in a casino isn't really that different from what we're doing online, because in both cases what we're really gambling is our attention and our time, right? And across the board, we're sitting there hoping for some little reward to come, never knowing when it's going to come. In all cases, we're sitting alone with a machine; there's no natural stopping point. I think the
11:47 pm
similarities are quite striking. We check our phones over 150 times a day. We just pull them out, over and over. It's the first thing we look at when we wake up and the last thing we look at before we go to sleep. We're glued to them, and that's by design. We now have over two billion Skinner boxes in people's pockets. We are running the largest psychological experiment the world has ever seen, by orders of magnitude. One out of every four human beings on Earth has a Skinner box which is learning how to uniquely target them. Super fascinating in the abstract, but sort of terrifying when you think about what it's doing. The research has been done, and the evidence is clear: digital technology is being designed with the intention of making us addicts. It's not a secret in the tech industry, either. Two of the biggest tech figures, Bill Gates and the late Steve Jobs, admitted they consciously limited the amount of time their children were allowed to
11:48 pm
engage with the products they helped create. The problem is real, but the media can often sensationalise the issue of tech addiction and make it harder for us to understand. I think the reason people maybe reach for the metaphor of crack cocaine is because they see that as a sort of high-speed, hyper-intensive form of addiction. And while I don't use that metaphor myself, I do think that within the spectrum of media addictions there are certain ones that are more high-potency, you could say. And those are, if we're going to use the language of addiction, the ones with a higher event frequency. Think about a horse race, right? You go to the track, and you've got to really wait for that event to happen. If you are rapidly engaged in an activity, a Twitter feed, that is more high-potency: it has a higher event frequency, which means each event has more opportunity
11:49 pm
to draw us in, to reinforce that behaviour. So I think we really can apply the language of addiction to these different media. I have an ongoing frustration, which is that whenever I feel, for a second, this impulse to reach into my pocket and pull out my phone, I get angry at myself, because I say: that's not right, just enjoy this, just be with yourself for a second. And then I get angry at myself that my phone has that much power over me, right? I'm angry that I'm subject to the design of a technology in such a way that I have difficulty resisting its lure. But of course, everything about the technology is built to create that impulse, to make it feel as though it's irresistible. There's such emphasis put on free choice,
11:50 pm
on being able to be a consumer who makes decisions in the marketplace about what you want to do, on having free will. But at the same time, the very people promoting that idea of the person as a consumer sovereign are operating and designing their technology with a very different human subject in mind: somebody who can, like a rat or a pigeon or any other animal, be incentivised and motivated and hooked in and have their attention redirected. And that's really interesting to me, because it is a kind of return to Skinner. I think you wouldn't have heard that in the eighties or nineties; it would have even been creepy to think about someone designing your behaviour. But now it's become accepted that you can be a behaviour designer. And behaviour design was part of what Aza used to do in a previous life. However, he is now one of a growing number of industry insiders who are taking
11:51 pm
a more critical stance towards the industry. I really enjoyed talking to him, and it left me wondering: does he regret his part in inventing infinite scroll? I try to be humble about it. If I hadn't invented it, it would have been invented; I just happened to be in the right place at the right time, thinking about the right kind of thing. But yes, I do regret it. I think it speaks to the naivety of me: oh, here's a cool feature, and making it, even if it's great for the individual user, without thinking about the effects it will have when you scale it up to a hundred million people or a billion people. This little thing that I went around to Twitter and Google and all these other companies saying, hey, you should adopt it, has now wasted quite literally hundreds of millions of human hours. I'm sure all of us have had someone or other say: stop looking at your phone, or why are you so addicted to social media? And before I started this series,
11:52 pm
I thought maybe there was something wrong with me. I just had no idea how deliberately designed our online experiences are, and how these design algorithms are made to pull us in. So I asked everyone I spoke to: how do we change that? Can we change online design? Regulation cannot be expected to happen on its own within these corporations, who are profiting from this, because there's just too deep a conflict of interest. So the only viable kind of regulation is going to have to come from the outside. We have to have a public conversation about what is actually going on in these products. How are they working? And once we understand that as a collective, what do we want to limit and constrain? So if you start with the assumption that people are never going to fully
11:53 pm
understand the risks of algorithms and the ways companies interact with data, then we have to think about what else might work instead. Data protection regimes like the General Data Protection Regulation are a great foundation. One of the ways in which we could really improve upon them is to embrace a trust model: instead of putting all the risk on the user, the requirement of protecting people would fall on the company that is using the algorithms and the data. I think there is going to be a huge shift from just human-centred design, as we call it in our field, which was a big movement to put the human at the centre, to thinking about human-protective design. The tools that are out there are so powerful that they cause real damage to us: individually, to our mental health; to our relationships; to our children; and societally, to our democracy and to having civil discourse.
11:54 pm
And that move to human-protective design, I think, is super helpful, because then we could actually have technology which protects us, as opposed to what came first, which is to extract from us. So if you're concerned about the ways in which data is being collected and algorithms are being used to affect your life, there are three things I think you can do. One: use the tools that are given to you. Use privacy dashboards; use two-factor authentication, which is really valuable. Two: be more deliberate and critical about the ways in which companies are asking for your information, in the devices that you adopt and the services you participate in. Understand that companies are trying to tell you things through design,
11:55 pm
and that they're trying to make things easier or harder. Think about whether you want things to be easier, and what the costs of making things easier are. And understand that companies are trying to get your consent because their entire business depends on it, so think about that as you go forward. Then finally, I think that design and consent and privacy and algorithms need to be a political issue. So if someone is running for office, ask them what their stance is on algorithmic accountability. Ask them what their stance is on privacy and data collection. Because if we get better rules about how data is collected and how algorithms are used, we might all be better off. We're at this inflection point where our technology is beginning to make predictions about us that are better than the predictions we make about ourselves, and one of the best ways of inoculating ourselves is by learning about ourselves more. Just stop and really ask yourself, before you post something to Facebook or anything: why am I doing this?
11:56 pm
What are my motivations? If you slow down your thought process, often I've found that it will be: oh, I'm pulling this app out because I'm a little bit bored, or everyone else just pulled out their phones and I'm feeling a little socially awkward, and that's curious. Or maybe: I'm having this experience and I want to show off just a little bit. Just stopping and thinking about what my motivations are for doing things, I've found to be a great inoculation against spending my time in ways that I wish I hadn't later. I'll just say that I recommend having a general attitude shift, where you understand yourself as an organism, as a creature that can, like any number of other creatures and animals, be turned in certain directions, have your attention swayed, caught, captured, hooked. I find it liberating to recognise all of the forces that are
11:57 pm
exerting these different powers on me, turning my attention in one way or another. Somehow realising that helps me to disconnect a lot, because it makes me a little bit angry, and I don't feel so bad about myself; I feel bad about what's being done to me. And then I'm more able to disconnect. Hello,
11:58 pm
good to be with you. Right off the bat, we'll get you the latest on Tropical Storm Chanthu as it slices through southern portions of Japan. We've got serious weather alerts in play here: landslide warnings for that southwest corner of Shikoku. And no surprise when I show you the rainfall amounts: we're expecting upwards of 300 millimetres over in Kyushu, and Shikoku about 200, so the ground will be incredibly saturated, and that's why we've got the risk of mudslides and landslides. It's starting to dry out for that southeast portion of China, of course, with a few showers and a batch of heavy rain down through the Yangtze River valley, moving west to east on Saturday. It's starting to dry out for central portions of Vietnam, and you know, Tropical Storm Conson dropped about 900 millimetres of rain on Vietnam, so we know it will take time for those water levels to come down, with more rain in the south in the days ahead. Next up, we're going Down Under, where a vigorous system is racing across the Bight, spreading heavy rain and also gusty conditions, so we'll see wind gusts here of about
11:59 pm
70 kilometres per hour. And look at Perth, 20 degrees: your temperatures are on the rise in the days to come. We've got a day of reprieve across New Zealand before our next system, right here, races across the Bight and becomes your story on Sunday. With the government's support dwindling, Russia's parliamentary elections take place in September. But as opposition leader Alexey Navalny remains in prison and his allies are banned from taking part, could the door to the Kremlin be wide open for another clean sweep? In-depth coverage on Al Jazeera. On Counting the Cost: is this the end of China's experiment with capitalism? President Xi Jinping launches sweeping socialist reforms to address inequality in the world's second-biggest economy. Plus, why second-hand watches are selling fast. Counting the Cost, on Al Jazeera. A lot of the stories that
12:00 am
we cover are highly complex, so it's very important that we make them as understandable as we can. As Al Jazeera correspondents, that's what we strive to do. This is Al Jazeera. Hello, I'm Maryam Nemazee. Welcome to the News Hour, live from London. Coming up in the next 60 minutes: officials admit a US drone strike in Kabul last month mistakenly killed 10 civilians, seven of them children.
