tv [untitled] September 20, 2021 9:30am-10:01am AST
9:30 am
migrant farm workers are victims of vicious beatings. jo reed is helping the pakistani community to find a voice. the stories we don't often hear, told by the people who live them: undocumented and under attack. this is europe, on al jazeera. the news. hello again, i'm folly bah thibault with the headlines on al jazeera. hundreds of haitian migrants expelled from the us on repatriation flights have arrived in port-au-prince. they're among the 13000 who gathered at a border town in texas; up to 7 daily flights are planned in a massive deportation operation. they looked at us; when we didn't want to leave on the bus and board the plane, they locked us up in the bus and didn't let us
9:31 am
stand up. some women even took a beating; they beat men and women. we will be ready to receive and process these migrants deported by force in our 2 airports. but the problem is that the people don't accept the forced deportation, because that is the reason they left: they did not want to live in haiti. vote counting is under way after 3 days of parliamentary elections in russia. exit polls put the pro-kremlin united russia party on course to win with around 45 percent of the vote; that's a drop of almost 10 percent since the last election in 2016. a humanitarian crisis is looming in afghanistan, where malnourished children are filling up hospital wards. thousands of newborns are in desperate need of treatment; unicef says many of them could die if they don't get urgent care. gunmen in nigeria have released another 10 children abducted from their school in kaduna state in july. officials have confirmed the gunmen were paid a ransom 3 days earlier. 21 students
9:32 am
are still missing. the u.s. state of alabama had more deaths than births last year for the first time in recorded history. state health officials have blamed the coronavirus for the surge in fatalities. the u.s. is recording an average of 140000 new cases and 2000 deaths a day. thousands of people have been told to leave their homes after a volcanic eruption in the spanish canary islands. it follows a week of increasing seismic activity on the island of la palma, 500 kilometers off the coast of west africa. and netflix has won television's top honor for the first time. at the emmys, the streaming service took best drama series for the crown, a show about the british monarchy under queen elizabeth. it also won best limited series for the queen's gambit. the wins capped a sweep by streaming platforms of the emmys' top honors. those are the headlines. next
9:33 am
on al jazeera: all hail the algorithm. there is a huge group of people at work behind our screens. they're called behavior architects, website designers and user experience specialists, and the power they have is massive. that urge to keep swiping through your twitter feed? that's design. the way we all click 'i agree' to the terms and conditions? that's design. swiping left or right on tinder? that's design too. we live in an online world of someone else's making, and most of us never even give it a second thought. and actually, that's design as well.
9:34 am
san francisco: the mecca for tech designers, home to silicon valley. this place pioneered the art of constructing, optimizing and enhancing a lot of the technology we use every day. it's turbocharged the speed at which we use the internet and made navigating the web more intuitive. but it's also given us a false sense of security. i've lost count of the number of times i've clicked 'i agree' to get into a website. we all have to do it. as we weave around the internet, we face hundreds of these annoying pop-ups and consent forms, all demanding a response from us. and yes, i call them annoying because that is exactly what they are. they may look like they provide us with controls, but the reality is,
9:35 am
far from it. when users click on 'i agree' to the terms and conditions, or they see a privacy policy and they click on it, they may think that they're actually being given control of their personal data, what's collected, how it's used. and it's advantageous to companies for them to think that, because if something bad happens, the tech company can then say: you actually agreed to this. nobody ever reads the terms of service; no one should ever be expected to. if we actually tried to read every terms and conditions agreement that we came across, it's probably the only thing we would do; it would have to be our day job, because they're so long and we come into contact with so many. it may have the veneer of giving control to data subjects, but ultimately it's window dressing. woodrow hartzog is what you'd call a scholar of design. he's been studying the psychology, ethics and legal implications of new technologies. one area he specializes in is data protection
9:36 am
and privacy. now, before we get into this, there's a key term you need to know: informed consent. this is a principle that comes up a lot in discussions of our rights online, but it's easier to explain in the context of medicine. a doctor explains potential risks and worst-case scenarios to the patient; once you're fully informed, you have the option to consent to surgery or not. in the online world, informed consent is what everyone holds up as the ideal. but is that even possible? consent only works under a very narrow set of conditions. that's when the decision is infrequent, like with surgery; we don't have surgery all the time. it's when the risks are visceral, things that we can easily conjure up in our mind. and then finally, when the harm is possibly great: if things go wrong with surgery, you could get sick or you could die. so we've got an incredible incentive to take that decision seriously. but of course, none of those things are present in the data ecosystem. we make the decisions quite
9:37 am
frequently, 10, 100 times a day. the harms are not visceral at all; they're incredibly opaque. and finally, each individual harm is not even that great, because modern privacy harms aren't huge; they're death by a thousand cuts. the spin from silicon valley is that they're on our side when it comes to how we control our data. long privacy policies are very confusing, and if you make it long and spell out all the detail, then you're probably going to reduce the percent of people who read it. however, take a quick look at the design of the buttons and the pop-ups that we're made to click, and it's clear that the tech companies have the upper hand in the data battle. design is power, and every single design decision makes a certain reality more or less likely. and what tech companies and psychologists and other people have known for years is that defaults are notoriously sticky. and so if you design the interface so that all the defaults are set to maximize
9:38 am
exposure, then you're going to get a lot more data in the aggregate than you would if you set all the defaults to privacy-protective, because people don't go in and change them. so until we fundamentally change the incentives, we're still going to see companies manipulating the design of these buttons and these technologies to ensure that you still keep disclosing data, and that they still keep getting what's the lifeblood of their business. most of us assume that when we go on a website and click that 'i agree' button, the site simply collects information that we voluntarily choose to share. in reality, there are many layers to data collection, and the mechanics of it are invisible: hidden by design. for a start, it isn't just the website you are on that's monitoring your every move. there are so-called third parties, advertising, marketing and analytics agencies, also tracking you. tiny bits of software, cookies, beacons, pixel tags, scoop up incredibly detailed information: everything from the computer you're
9:39 am
using, to how long you hover over a link. honestly, it's a bit mind-boggling. informed consent is a fundamentally broken regulatory mechanism for algorithmic accountability. it allows companies to continue to throw risk back onto the user and say, you know, here's a pop-up banner that tells you about cookies, that no one reads and nobody really even cares about. yet we push forward this regime as though it matters to people, as though if someone clicks 'i agree', then they're magically ok with all the data processing that's going to come afterwards. which is a little bit of a joke, and it's a little bit of a legal fiction, yet it's a major component of almost every data protection framework around the world. once you've crossed the 'i agree' hurdle, you're now in the clutches of the website, and this is where design takes on a whole new level of importance. the job is to keep you there. one of the most
9:40 am
successful innovations in mobile and website design is something called infinite scroll. it's likely we all use it every single day: it lets you scroll endlessly through your feed without having to click. i'm on my way now to meet the creator of this function. his name is aza raskin. he no longer works inside a big tech corporation; in early 2018 he co-founded the center for humane technology. all of our apps, all of silicon valley companies, are competing for our attention. and because it's such a cutthroat game trying to get our attention, worth tens of billions of dollars, they have to increasingly point more powerful computers at our heads to try to frack us for that attention. and we've all had that experience of going to youtube, and you think, i'm just going to
9:41 am
watch one video, and then somehow you shake your head and an hour has passed. why? how is it that the technology has hypnotized us? this tech hypnotization is key to what's called the attention economy. our attention is a finite currency in the online world, and it's only as long as websites and apps have our attention that they have our business. and our data? the attention economy is just this: it says, if we are not paying for a product, well, the company has to make some money somehow. how do they do it? they do it by selling our attention to advertisers, or to other groups that want us to do something. they're trying to make the systems as effective as possible at influencing your decisions. they acquire as much information about you as they can: who your friends are, how you spend your time, often down to how you spend your money. they
9:42 am
take all of this data to build a model of you. imagine a little simulator of you that lives in the facebook servers. then they can put things in front of it: are you more likely to click this, this or this? or, if we want to get you to hate immigration, what kind of message are you going to resonate with, this message or this message? and you can see how what begins with just this race for your attention ends up becoming an entire economy's worth of pressure, with the very smartest minds in engineering and the biggest supercomputers trying to make a model of you, to be able to influence the kinds of decisions you are going to make. a few years ago, youtube set a company-wide objective to reach 1000000000 hours of viewing a day. netflix ceo reed hastings has also said multiple times that the company's biggest competitor isn't another website; it's sleep. so what happens when you give algorithms the goal of maximizing our attention and time online?
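the "are you more likely to click this, this or this?" loop raskin describes can be sketched as a tiny engagement-maximizing bandit. this is an illustrative toy, not any company's actual system: the post categories, click probabilities and epsilon value are all invented for the sketch.

```python
import random

# toy sketch of an engagement maximizer (epsilon-greedy bandit).
# the hidden click probabilities stand in for the "model of you";
# every name and number here is invented for illustration.
HIDDEN_CLICK_PROB = {"news": 0.05, "friends": 0.15, "outrage": 0.30}

def run(steps=10_000, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    shows = {post: 0 for post in HIDDEN_CLICK_PROB}
    clicks = {post: 0 for post in HIDDEN_CLICK_PROB}

    def estimate(post):
        # observed click-through rate so far
        return clicks[post] / shows[post] if shows[post] else 0.0

    for _ in range(steps):
        if rng.random() < epsilon:
            # explore: occasionally try a random post
            post = rng.choice(list(HIDDEN_CLICK_PROB))
        else:
            # exploit: show whatever has performed best so far
            post = max(HIDDEN_CLICK_PROB, key=estimate)
        shows[post] += 1
        if rng.random() < HIDDEN_CLICK_PROB[post]:  # simulated user reaction
            clicks[post] += 1
    return shows

shows = run()
# the algorithm never "understands" the user; it simply learns which
# item generates the most engagement and floods the feed with it.
print(shows)
```

the point of the sketch is that nothing in the loop optimizes for what the user values, only for what the user clicks: the highest-engagement item ends up dominating the feed regardless of why it gets clicked.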
9:43 am
they find our weaknesses and exploit them. in 2017 sean parker, a founding member of facebook and its first president, literally confessed to it all: how do we consume as much of your time and conscious attention as possible? that means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. and that's going to get you to contribute more content. it's a social validation feedback loop. i mean, it's exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology. now, it's not as though silicon valley pioneered the tricks and tactics of addictive or persuasive design. many tech designers openly admit using insights from behavioral scientists of the early 20th century. the concept of randomly scheduled rewards was studied and developed by american psychologist
9:44 am
b.f. skinner in the 1950s. he created what became known as the skinner box, a simple contraption he used to study pigeons. at the start of the process, a pigeon is given a food reward every time it pecks at a disc or turns a full circle when the word 'turn' appears. as the experiment proceeds, the rewards become less frequent and take place at random times; but by then the behavior has been established. the pigeon keeps pecking or turning, not knowing when it might get the reward, but in anticipation that a reward could come. skinner boxes were pivotal in demonstrating how design had the power to modify behavior. and if randomly scheduled rewards worked for pigeons, why not humans? skinner's concept is, in fact, at the heart of a lot of addictive design, from casino machines such as slot machines to social media. smartphones are eerily similar to these machines. think about
9:45 am
your facebook, instagram or pinterest feeds. we all swipe down, pause, and then wait to see what will appear. we're back to those randomly scheduled rewards again. will it be a photo you haven't seen, a comment on a photo you liked, a piece of spam, or a software update? we don't really know, and it's that unpredictability that makes it so addictive. natasha dow schüll is a cultural anthropologist who has spent more than 15 years studying the algorithms behind persuasive design. just like on a slot machine, when you're texting or when you are looking through the news feed, you really never know what's coming down the pipe. you never know when you're going to sort of hit that jackpot, so to speak, when it's coming and how much it will be. so the randomness is very important to keep you hooked in. i think the fact that we're gambling money in
9:46 am
a casino isn't really that different from what we're doing here, because in both cases what we're really gambling is our attention and our time. and across the board, we're sitting there sort of hoping for some little reward to come, never knowing when it's going to come. in all cases, we're sort of sitting alone with the machine; there's no natural stopping point. i think the similarities are quite striking. we check our phones over 150 times a day; we just pull it up, pull it up. it's the first thing we look at when we wake up and the last thing we look at before we go to sleep. it's like we're glued to it, and that's by design. we now have over 2000000000 skinner boxes in people's pockets. we are running the largest psychological experiment the world has ever seen, by orders of magnitude: one out of every 4 human beings on earth has a skinner box which is learning how to uniquely target them. super fascinating in the abstract,
9:47 am
but sort of terrifying when you think about what it's doing. the research has been done; the evidence is clear that digital technology is being designed with the intention of making us addicts. it's not a secret in the tech industry, either: 2 of the biggest tech figures, bill gates and the late steve jobs, admitted they consciously limited the amount of time their children were allowed to engage with the products they helped create. the problem is real, but the media can often sensationalize the issue of tech addiction and make it harder for us to understand. i think the reason people maybe reach for the metaphor of crack cocaine is because they see that as a sort of high-speed, hyper-intensive form of addiction. and while i don't use that metaphor myself, i do think that within the spectrum of media addictions there are certain ones that are more high-potency, you could say. and those are,
9:48 am
you know, if we're going to use the language of addiction, the ones with a higher event frequency. think about a horse race, right? you go to the track, and you've got to really wait for that event to happen. if you are rapidly engaged in an activity, a twitter thread, that is more high-potency; it has a higher event frequency, which means each event has an opportunity to draw us in more, to reinforce that behavior more. so i think we really can apply the language of addiction to the different media. i have an ongoing frustration, which is that whenever i'm still for a second, i have this impulse to reach into my pocket and pull out my phone. and then i get angry at myself, because i say: that's not right, just enjoy; just be with yourself for a second. and then i get angry at myself that my phone has that much power over me,
9:49 am
right? i'm angry that i'm subject to the design of a technology in such a way that i have difficulty, sort of, resisting its allure. but of course, everything about the technology is built to create that impulse, to make it feel as though it's irresistible. there's such emphasis put on free choice, on being able to be a consumer who makes decisions in the marketplace about what you want to do: you have free will. but at the same time, the very people who are promoting that idea of the person as a consumer sovereign are operating and designing their technology with a very different human subject in mind: somebody who can, like a rat or a pigeon or any other animal, be incentivized and motivated and hooked in and have their attention redirected. and that's really interesting to me, because it is a kind of return to skinner. i think you wouldn't have heard that in the eighties
9:50 am
or nineties; it would even have been creepy to think about someone designing your behavior. but now it's become accepted that you can be a behavior designer. and behavior design is part of what aza used to do in a previous life. however, he's now one of a growing number of industry insiders who are taking a more critical stance towards the industry. i really enjoyed talking to him, but it left me wondering: does he regret his part in inventing infinite scroll? to be humble about it, if i hadn't invented it, it would have been invented; i just happened to be in the right place at the right time, thinking about the right kind of thing. but yes, i do regret it. i do think it speaks to the naivete of, oh, here's a cool feature, and making it, even if it's great for me as a user, without thinking about the effects that'll happen if you scale it up to
9:51 am
a 100000000 people or 1000000000 people, where this little thing that i went around to twitter and google and all these other companies saying, hey, you should adopt it, has now wasted quite literally hundreds of millions of human hours. i'm sure all of us have had someone or other say: stop looking at your phone, or: why are you so addicted to social media? before i started this series, i thought maybe there was something wrong with me. i just had no idea how deliberately designed our online experiences are, and how these designs and algorithms are made to pull us in. so i asked everyone i spoke to: how do we change that? can we change online design? regulation cannot be expected to happen on its own within these corporations, right, who are profiting from this, because there's just too deep of a conflict of interest. and so the only viable kind of regulation is going to have
9:52 am
to come from the outside. we have to have a public conversation about what is actually going on in these products. how are they working? and once we understand that as a collective, what do we want to limit and constrain? so if you start with the assumption that people are never going to fully understand the risks of algorithms and the ways we interact with data, then we have to think about what else might work. and data protection regimes like the general data protection regulation are a great foundation. one of the ways in which we could really improve upon them is to embrace a trust-based model: instead of putting all the risk on the user, the requirement of protecting people would fall on the company that's using the algorithms, that's using the data. i think there is going to be
9:53 am
a huge shift from just human-centered design, as we call it in our field, with the human at the center, to thinking about human-protective design. we see that the tools are so powerful that they cause real damage to us: individually, to our mental health, to our relationships, to our children; and societally, to our democracy, to having civil discourse. and that move from human-centered to human-protective, i think, is super helpful, because then we could actually have technology which does what it was supposed to do in the first place, which is extend the best in us. so if you're concerned about the ways in which data is being collected and algorithms are being used to affect your life, there are 3 things i think you can do. one,
9:54 am
use the tools that are given to you: use a privacy dashboard, use two-factor authentication, which is really valuable. two, be more deliberate and critical about the ways in which companies are asking for your information, in the devices that you adopt and the services you participate in. understand that companies are trying to tell you things through design; they're trying to make things easier or harder. think about whether you want things to be easier, and what the costs of making things easier are. and understand that companies are trying to get your consent because their entire business depends on it, so think about that as you go forward. then finally, i think that design and consent and privacy and algorithms need to be a political issue. so if someone's running for office, ask them what their stance is on algorithmic accountability; ask them what their stance is on privacy and data collection. because if we get better rules about how data is collected
9:55 am
and algorithms are used, we might all be better off. we're at this inflection point where our technology is beginning to make predictions about us that are better than the predictions we make about ourselves. and one of the best ways of inoculating ourselves is by learning about ourselves more. just, like, stop and really ask yourself, before you post something to facebook or instagram: why am i doing this? what are my motivations? if you sort of slow down your thought process, often you'll find: oh, i'm pulling this app out because i'm a little bit bored; or, everyone else just pulled out their phones and i'm feeling a little socially awkward; that's curious. or maybe: i'm having this experience and i want to show off just a little bit. just stopping and thinking about what my motivations are for doing things, i have found to be a great inoculation against spending my time in ways that i regret later. i'll just say that i recommend having
9:56 am
a general attitude shift where you understand yourself as an organism, as a creature that can, like any number of other creatures and animals, be turned in certain directions and have your attention swayed, caught, captured, hooked. i find it liberating to recognize all of the forces that are exerting these different powers on me to turn my attention one way or another. somehow, realizing that helps me to disconnect a lot, because it makes me a little bit angry, and i don't feel so bad about myself; i feel bad about what's being done to me. and then i'm more able to disconnect.
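the randomly scheduled rewards described in the skinner-box segment can be simulated in a few lines. this is a minimal sketch of a variable-ratio schedule; the 1-in-10 reward rate and session length are invented for illustration.

```python
import random

# minimal sketch of skinner's variable-ratio schedule: a reward arrives
# at random, on average once every 10 pulls. the rate is an invented
# illustration, not a measured figure for any real app.
REWARD_RATE = 0.1

def swipe_session(pulls, seed=42):
    """simulate repeatedly refreshing a feed; count unpredictable rewards."""
    rng = random.Random(seed)
    rewards = 0
    gaps, since_last = [], 0
    for _ in range(pulls):
        since_last += 1
        if rng.random() < REWARD_RATE:  # a reward lands on a random pull
            rewards += 1
            gaps.append(since_last)     # how long the wait was this time
            since_last = 0
    return rewards, gaps

rewards, gaps = swipe_session(1000)
# the average payout is steady (~1 in 10), but the gaps between rewards
# vary wildly -- that unpredictability is what sustains the pecking,
# for pigeons and for feed refreshes alike.
print(rewards, min(gaps), max(gaps))
```

the design point the sketch makes is that the *average* reward rate matters less than its variance: a fixed 1-in-10 schedule is easy to walk away from, while a random one never offers a natural stopping point.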
9:57 am
on march 15th, 2019, new zealand's sense of security was shattered when 51 people were shot dead and another 40 wounded, as a gunman began shooting at christchurch mosques with worshippers attending the friday service. for those who lost loved ones, finding ways to deal with the trauma is crucial. she came to me and she asked me: where was mom? i told her mom was with me. 4 months later, i feel much quieter, and i feel much more calm and really focused with my life. let us love one another; love doesn't cost too much, and it makes your heart happier. hatred doesn't bring anyone back. let us practice this.
9:58 am
still, on the coast of oman you have a little bit of cloud, which really reflects what's happening with the wind direction. the southwest edge of the monsoon is still keeping salalah a cloudy coast, a bit drizzly, but that is the exception, not the rule. mostly, for iran, the levant and the arabian peninsula, it is still quite hot, though not as hot as it has been, and there's an interest in the wind direction. this orange here is dust, just picked up over iraq and kuwait and brought down to eastern saudi arabia, and it just catches everything. once again, places like bahrain and qatar are seeing it at the moment. doha's weather is influenced by that in that it's not as humid; it's still sitting at about 40 degrees, which is above average, but it feels rather different because it is not humid. now, there is still monsoon-type rain just catching part of pakistan; it is shifting away from karachi, but it is quite showery, and we've seen flooding recently. we might see it again, with showers all the way down the indus valley,
9:59 am
so the potential is still there. and there are a few showers in northern turkey in the forecast. the wind is much, much lighter through the aegean and the eastern mediterranean, so that is not really affecting the weather at all. and still, on the coasts of tanzania and kenya, there's a weak but still existing convergence bringing you a few showers. on counting the cost: is this the end of china's experiment with capitalism? president xi jinping launches sweeping socialist reforms to address inequality in the world's second biggest economy. counting the cost, on al jazeera. and in southern england, 2 farmers turned safari-park pioneers put nature in the driving seat. it was just absolutely astonishing, the life
10:00 am
that came back. companies revolutionizing the system, blending in artificial intelligence. earthrise, on al jazeera. the news. hundreds of migrants deported by the us back to haiti say they've suffered too much to stay. now they're trying to get back across the border, they say, because there isn't enough food for them there in haiti.
10:01 am