tv [untitled] September 19, 2021 4:30am-5:00am AST
4:30 am
... Morocco's Meteorite Hunters, on Al Jazeera.

Hello. These are the headlines on Al Jazeera. The diplomatic dispute between France, Australia, the UK and the US continues to escalate. France has accused Australia of a breach of trust after it scrapped a submarine contract in favor of a deal with the United States. French Foreign Minister Jean-Yves Le Drian: "There have been lies. There has been duplicity. There has been a major rupture of trust. There has been contempt. So it's not going well between us, not at all. That means that there is a crisis."
4:31 am
"And at that point there is, first, a symbolic aspect: we are recalling our ambassadors to try to understand, but also to show our former partner countries that we have very strong discontent, that there is a serious crisis between us. And second, it is a way to re-evaluate our position to defend our interests, both in Australia and in the United States."

Israeli police have captured the last two of six Palestinian prisoners who escaped from a high-security facility two weeks ago. Police arrested Munadel Infaiat and Ayham Kamamji in Jenin in the occupied West Bank, ending a two-week manhunt.

There have been protests in Tunisia's capital, the first major demonstrations since President Kais Saied seized power and dismissed parliament in July. Some protesters chanted "shut down the coup", fearing the rights won in the 2011 revolution will be lost.

At least three people have been killed and 21 injured after a vehicle belonging to Taliban forces hit a roadside mine.
4:32 am
It happened in the Afghan city of Jalalabad, the capital of Nangarhar province, which is an ISIL stronghold.

Thousands of migrants who crossed into the US are to be deported. It's estimated that more than 13,000 people, most of them from Haiti, are staying in makeshift camps under a bridge in the border town of Del Rio. The only border crossing there has been closed, and Del Rio's mayor has declared a state of emergency.

And a French daredevil has wowed crowds in Paris as he walked 600 meters on a tightrope strung between the Eiffel Tower and the theater across the Trocadero square. Nathan Paulin relied on a narrow strip of rope just two and a half centimeters wide as he walked 70 meters above the ground and over the River Seine.

And that's you up to date. Stay with us on Al Jazeera: All Hail the Algorithm is next.

There is a huge group of people at work behind our screens.
4:33 am
They are called behavior architects, persuasive designers and user experience specialists, and the power they have is massive. That urge to keep swiping through your Twitter feed? That's designed. The way we all click "I agree" to the terms and conditions? That's designed. Swiping left or right on Tinder? That's designed. We live in an online world that someone else is making, and most of us never even give it a second thought. And actually, that's design too.

This is San Francisco, the mecca for tech designers, home to Silicon Valley. This place pioneered the art of constructing, optimizing and enhancing a lot of the technology we use every day.
4:34 am
It has turbocharged the speed at which we use the internet and made navigating the web more intuitive. But it has also given us a false sense of security. I've lost count of the number of times I've clicked "I agree" to get into a website. We all have to do it. As we weave around the internet we face hundreds of these annoying pop-ups and consent forms, all demanding a response from us. And yes, I call them annoying, because that is exactly what they are. They may look like they provide us with controls, but the reality is far from it.

"When users click on 'I agree' to the terms and conditions, or they see a privacy policy and they click on it, they may think that they're actually being given control over how their personal data is collected and how it's used. And it's advantageous to companies for them to think that, because if something bad happens, the tech company can then say: you actually agreed to this."
4:35 am
"Nobody ever reads the terms of service. No one should ever be expected to. If we actually tried to read every terms-and-conditions agreement that we came across, it's probably the only thing we would do; it would have to be our day job, because they're so long and we come across so many. It may have the veneer of giving control to data subjects, but ultimately it's window dressing."

Woodrow Hartzog is what you'd call a scholar of design. He's been studying the psychology, ethics and legal implications of new technologies, and one area he specializes in is data protection and privacy. Now, before we get into this, there's a key term you need to know: informed consent. This is a principle that comes up a lot in discussions of our rights online, but it's easier to explain in a medical context. A doctor explains potential risks and worst-case scenarios to the patient; once you're fully informed, you have the option to consent to surgery or not.
4:36 am
In the online world, informed consent is what everyone says is the ideal. But is that even possible?

"It only works under a very narrow set of conditions, and that's when the decision is infrequent, like with surgery: we don't have surgery all the time. It's when the risks are visceral: they're things that we can easily conjure up in our minds. And finally, it's when the harm is possibly great: if things go wrong with surgery, you could get sick or you could die. So we've got an incredible incentive to take that decision seriously. But of course, none of those things are present in the data ecosystem. We make these decisions quite frequently, ten, a hundred times a day. The harms are not visceral at all; they're incredibly opaque. And finally, each harm is not even that great, because modern privacy harms aren't huge: they're death by a thousand cuts."

The spin from Silicon Valley is that they're on our side when it comes to how we control our data:
4:37 am
"Long privacy policies are very confusing, and if you make it long and spell out all the detail, then you're probably going to reduce the percent of people who read it."

However, take a quick look at the design of the buttons and the pop-ups we're made to click, and it's clear that the tech companies have the upper hand in the data battle.

"Design is power in everything. Every design decision makes a certain reality more or less likely. And what tech companies and psychologists and other people have known for years is that defaults are notoriously sticky. So if you design the interface so that all the defaults are set to maximize exposure, then you're going to get a lot more data in the aggregate than you would if you set all the defaults to privacy-protective, because people don't go in and change them. So until we fundamentally change the incentives, we're still going to see companies manipulating the design of these buttons, these technologies, to ensure that you keep disclosing data and they keep getting what is the lifeline of their business."
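To make the arithmetic behind "sticky defaults" concrete, here is a minimal sketch. The million-user population and the 3 percent settings-change rate are invented for illustration; they are not figures from the program or from any real platform.

```python
# Illustrative back-of-the-envelope model of why defaults matter.
# USERS and CHANGE_RATE are made-up assumptions, not measured figures.

USERS = 1_000_000
CHANGE_RATE = 0.03  # assumed fraction of users who ever touch their settings

def users_sharing(default_is_sharing: bool) -> int:
    """Number of users whose data gets collected, given the default."""
    movers = int(USERS * CHANGE_RATE)   # users who flip the default
    stayers = USERS - movers            # everyone else keeps the default
    return stayers if default_is_sharing else movers

print("default = share data:   ", users_sharing(True))   # 970000
print("default = privacy first:", users_sharing(False))  # 30000
```

Same product, same users, same tiny minority opening the settings page; under these assumptions the default alone decides whether almost everyone or almost no one is collected from.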
4:38 am
Most of us assume that when we go on a website and click that "I agree" button, the site simply collects information that we voluntarily choose to share. In reality, there are many layers to data collection, and the mechanics of it are invisible, hidden by design. For starters, it isn't just the website you're on extracting information. So-called third parties, advertising, marketing and analytics agencies, are also tracking you, using tiny bits of software: cookies, beacons, pixel tags. These scoop up incredibly detailed information, everything from the computer you're using to how long you hover over a link. Honestly, it's a bit mind-blowing. And all you really did was click "I agree".
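As an illustration of the mechanics, here is a minimal sketch of a third-party tracking pixel, written as a tiny Python server. It is a toy stand-in, not the code of any real analytics agency: the image is a one-by-one GIF, and the "payload" is simply the metadata every browser volunteers when it fetches that image.

```python
# Minimal sketch of a "tracking pixel": a 1x1 GIF whose real job is the
# metadata that rides along with the image request. Toy example only.
from http.server import BaseHTTPRequestHandler, HTTPServer

# Smallest valid transparent 1x1 GIF (43 bytes).
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\x00\x00\x00"
         b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01"
         b"\x00\x00\x02\x02D\x01\x00;")

class TrackerHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The browser sends all of this with an ordinary image request.
        print({
            "page":    self.headers.get("Referer"),     # what you were reading
            "browser": self.headers.get("User-Agent"),  # device/browser details
            "cookie":  self.headers.get("Cookie"),      # cross-visit identifier
            "ip":      self.client_address[0],
        })
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        self.end_headers()
        self.wfile.write(PIXEL)

if __name__ == "__main__":
    # Embed <img src="http://localhost:8000/pixel.gif"> in a page to test.
    HTTPServer(("localhost", 8000), TrackerHandler).serve_forever()
```

Embed that image on a thousand unrelated sites and the same cookie shows up in every log line, which is how a third party can follow one visitor across the web.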
4:39 am
"Informed consent is a fundamentally broken regulatory mechanism for algorithmic accountability. It allows companies to continue to throw risk back onto the user and say: here's a pop-up banner that tells you about cookies, which no one reads and nobody really cares about. Yet we push forward this regime as though it matters to people, as though if someone clicks 'I agree' then they're magically OK with all the data processing that's going to come afterwards. It's a little bit of a joke and a little bit of a legal fiction, yet it's a major component of almost every data protection framework around the world."

Once you've crossed the "I agree" hurdle, you're in the clutches of the website, and this is where design takes on a whole new level of importance. The job is to keep you hooked. One of the most successful innovations in mobile and website design is something called infinite scroll. We all use it every single day: it lets you seamlessly scroll through your feed without ever having to click.
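The structural trick is easy to see in code. Here is a minimal sketch, with fetch_page as a hypothetical stand-in for a real feed API: the feed is a generator with no last page, so the interface never presents a natural stopping point.

```python
# Sketch of the infinite-scroll contract: there is no final page,
# so there is never a "you've reached the end" cue.
import itertools

def fetch_page(page: int) -> list[str]:
    """Hypothetical stand-in for a network call to a feed API."""
    return [f"post {page}-{i}" for i in range(10)]

def infinite_feed():
    # itertools.count() never stops, and neither does the feed.
    for page in itertools.count():
        yield from fetch_page(page)

feed = infinite_feed()
for post in itertools.islice(feed, 25):  # the user just keeps scrolling
    print(post)
```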
4:40 am
I'm on my way now to meet the creator of this function. His name is Aza Raskin, and he no longer works inside a big tech corporation. In early 2018 he co-founded the Center for Humane Technology.

"All of our apps, all of the Silicon Valley companies, are competing for our attention. And because it's such a cut-throat game trying to get our attention, worth tens of billions of dollars, they have to point increasingly powerful computers at our heads to try to frack us for that attention. And we've all had that experience of going to YouTube thinking, I'm just going to watch one video, and then somehow you shake your head and an hour has passed. What? Why? How has the technology hypnotized us?"

This tech hypnotization is key to what's called the attention economy. Our attention is a finite currency in the online world, and it's only as long as websites and apps have our attention that they have our business, and our data.
4:41 am
"The attention economy is just this: it says that if we are not paying for a product, well, the company has to make money somehow. How do they do it? They do it by selling our attention to advertisers, or to other groups that want us to do something. They're trying to make these systems as effective as possible at influencing your decisions. They collect as much information as they can: who your friends are, how you spend your time, often they'll know how you spend your money. They take all of this data to build a model of you. Imagine a little simulated you that lives in a Facebook server. Then they can put things in front of it: are you more likely to click this, this or this? Or, if we want to get you to hate immigration, what kind of message are you going to resonate with, this message or this message?"
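A toy version of that "simulated you" could look like the sketch below. The profile, the candidate posts and the weights are invented for illustration; a real system would learn them from billions of interactions, but the selection step, scoring every candidate and showing the one with the highest predicted response, is the core idea.

```python
# Toy "model of you": score candidate posts against a per-user profile
# and show whichever one is predicted to engage the user most.
# All features and weights here are invented for illustration.

user_profile = {"likes_sports": 0.9, "likes_politics": 0.2, "night_owl": 0.7}

candidates = [
    {"id": "A", "features": {"likes_sports": 1.0, "night_owl": 0.5}},
    {"id": "B", "features": {"likes_politics": 1.0}},
    {"id": "C", "features": {"likes_sports": 0.3, "likes_politics": 0.6}},
]

def predicted_engagement(post: dict) -> float:
    # Dot product of the post's features with the user's learned profile.
    return sum(user_profile.get(k, 0.0) * v for k, v in post["features"].items())

best = max(candidates, key=predicted_engagement)
print("shown to user:", best["id"], round(predicted_engagement(best), 2))
```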
4:42 am
"And you can see how what begins as just a race for your attention ends up becoming an entire economy's worth of pressure, with the very smartest minds in engineering and the biggest supercomputers trying to make a model of you, to be able to influence the kinds of decisions you are going to make."

A few years ago, YouTube set a company-wide objective of reaching one billion hours of viewing a day. Netflix co-founder Reed Hastings has also said multiple times that the company's biggest competitor isn't another website; it's sleep. So what happens when you give algorithms the goal of maximizing our attention and time online? They find our weaknesses and exploit them. In 2017, Sean Parker, a founding member of Facebook and its first president, literally confessed to how:

"How do we consume as much of your time and conscious attention as possible? That means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever."
4:43 am
"And that's going to get you to contribute more content. It's a social-validation feedback loop. It's exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology."

Now, it's not as though Silicon Valley pioneered the tricks and tactics of addictive, persuasive design. Many tech designers openly admit to using insights from behavioral scientists of the early 20th century. The concept of randomly scheduled rewards was studied and developed by the American psychologist B.F. Skinner in the 1950s. He created what's become known as the Skinner box, a simple contraption he used to study pigeons. At the start of the process, the pigeon is given a food reward every time it pecks at a word, or turns a full circle when the word "turn" appears. As the experiment proceeds, the rewards become less frequent and arrive at random times.
4:44 am
But the behavior has been established: the pigeon keeps pecking and turning, not knowing when it might get the reward, but in anticipation that a reward could be coming. Skinner boxes were pivotal in demonstrating how design has the power to modify behavior. And if randomly scheduled rewards worked for pigeons, why not humans? Skinner's concept is in fact at the heart of a lot of addictive design, starting with casino machines such as slot machines. Social media on a smartphone is uncannily similar to a slot machine. Think about your Facebook, Instagram or Pinterest feeds: we all swipe down, pause, and then wait to see what will appear. It's the randomly scheduled rewards again. Will it be a like or a comment on a photo, a piece of spam, a software update? We don't really know, and it's that unpredictability that makes it so addictive.
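Skinner's schedule translates directly into a few lines of code. This is a minimal simulation under assumed numbers (a one-in-eight payout chance, a thousand pulls): every pull may or may not pay out, so the gap until the next reward is unpredictable, and that unpredictability is the hook.

```python
# Minimal simulation of a randomly scheduled (variable-ratio) reward:
# each "pull" (a peck, a swipe, a refresh) pays out with fixed probability.
# The 1-in-8 rate and 1000 pulls are arbitrary illustrative assumptions.
import random

random.seed(42)  # reproducible run
REWARD_PROBABILITY = 1 / 8

def pull() -> bool:
    """One peck at the lever, or one pull-to-refresh."""
    return random.random() < REWARD_PROBABILITY

gaps, since_last = [], 0
for _ in range(1000):
    since_last += 1
    if pull():
        gaps.append(since_last)  # how many pulls this reward took
        since_last = 0

print("rewards:", len(gaps))
print("pulls between rewards ranged from", min(gaps), "to", max(gaps))
```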
4:45 am
Natasha Dow Schüll is a cultural anthropologist who has spent more than 15 years studying the algorithms behind persuasive design.

"Just like on a slot machine, when you're texting or when you're looking through the news feed, you really never know what's coming down the pipe. You never know when you're going to hit that jackpot, so to speak, when it's coming and how much it'll be. So the randomness is very important to keeping you hooked in. I think the fact that we're gambling money in a casino isn't really that different from what we're doing online, because in both cases what we're really gambling is our attention and our time. Right across the board we're sitting there hoping for some little reward to come, never knowing when it's going to come, and in all cases we're sitting alone with the machine. There is no natural stopping point. I think the similarities are quite striking."
4:46 am
"We check our phones over 150 times a day. We just pull it out, pull it out, pull it out. It's the first thing we look at when we wake up and the last thing we look at before we go to sleep. We're glued to it, and that's by design. We now have over two billion Skinner boxes in people's pockets. We are running the largest psychological experiment the world has ever seen, by orders of magnitude. One out of every four human beings on Earth has a Skinner box that is learning how to uniquely target them. Super fascinating in the abstract, but sort of terrifying when you think about what it's doing."

The research has been done, and the evidence is clear: digital technology is being designed with the intention of making us addicts. It's not a secret in the tech industry, either. Two of the biggest tech figures, Bill Gates and the late Steve Jobs, admitted that they consciously limited the amount of time their children were allowed to engage with the products they helped create.
4:47 am
The problem is real, but the media can often sensationalize the issue of tech addiction and make it harder for us to understand.

"I think the reason people reach for the metaphor of crack cocaine is because they see that as a sort of high-speed, hyper-intensive form of addiction. And while I don't use that metaphor myself, I do think that within the spectrum of media addictions there are certain ones that are more high-potency, you could say. Those are the ones that, if we're going to use the language of addiction, have a higher event frequency. Think about horse racing: you go to the track and you've got to really wait for that event to happen. Whereas if you are rapidly engaged in an activity, a Twitter thread, say, that is more high-potency; it has a higher event frequency, which means each event has an opportunity to draw you in more, to reinforce that behavior more."
4:48 am
"So I think we really can apply the language of addiction to these different media."

"I have an ongoing frustration, which is that whenever I'm idle for a second, I have this impulse to reach into my pocket and pull out my phone. And then I get angry at myself, because I say: that's not right, just enjoy the moment, just be with yourself for a second. And then I get angry at myself that my phone has that much power over me. I'm angry that I'm subject to the design of a technology in such a way that I have difficulty resisting its allure. But of course, everything about the technology is built to create that impulse, to make it feel as though it's irresistible."
4:49 am
"There's such emphasis put on free choice, on being a consumer who makes decisions in the marketplace about what you want to do, on having free will. But at the same time, the very people who are promoting that idea of the person as a sovereign consumer are operating and designing their technology with a very different human subject in mind: somebody who can, like a rat or a pigeon or any other animal, be incentivized and motivated and hooked in and have their attention redirected. And that's really interesting to me, because it is a kind of return to Skinner. You wouldn't have heard that in the eighties or nineties; it would have been creepy even to think about someone designing your behavior. But now it's become accepted that you can be a behavior designer."

And behavior design was part of what Aza did in a previous life. He is now, however, one of a growing number of industry insiders taking a more critical stance towards the industry.
4:50 am
I really enjoyed talking to him, but it left me wondering: does he regret his part in inventing infinite scroll?

"I want to be humble about it. If I hadn't invented it, it would have been invented. I just happened to be in the right place at the right time, thinking about the right kind of thing. But yes, I do regret it. I do think it speaks to the naivety of saying, oh, here's a cool feature, and building it because it's great for me, the user, without thinking about the effects it will have when you scale it up to a hundred million or a billion people. This little thing that I went around pitching to Twitter and Google and all these other companies, saying, hey, you should adopt this, has now wasted quite literally hundreds of millions of human hours."

I'm sure all of us have had someone or other say to us: stop looking at your phone, or, why are you so addicted to social media? Before I started this series, I thought maybe there was something wrong with me.
4:51 am
I just had no idea how deliberately designed our online experiences are, and how these designs and algorithms are made to suck us in and hold us. So I asked everyone I spoke to: how do we change this? Can we change how online design works?

"Regulation cannot be expected to happen on its own within these corporations, who are profiting from all of this, because there's just too deep a conflict of interest. So the only viable kind of regulation is going to have to come from the outside. We have to have a public conversation about what is actually going on in these products. How are they working? And once we understand that as a collective, what do we want to limit and constrain?"
4:52 am
"If you start with the assumption that people are never going to fully understand the risks of algorithms and the ways in which they interact with data, then we have to think about what else might work. Data protection regimes like the General Data Protection Regulation are a great foundation, and one of the ways in which we could really improve upon them is to embrace a trust model: instead of putting all the risk on the user, the requirement of protecting people would fall on the companies that are using the algorithms and the data."

"I think there is going to be a huge shift from just human-centered design, as we call it in our field, which put the human at the center and was a big movement, to thinking about human-protective design. The tools out there are so powerful that they cause real damage to us: individually, to our mental health, to our relationships, to our children; and societally, to our democracy and to having civil discourse."
4:53 am
"And that move to human-protective design, I think, is super helpful, because then we could actually have technology that does what it was built for in the first place, which is to extend the best in us."

"So if you're concerned about the ways in which data is being collected and algorithms are being used to affect your life, there are three things I think you can do. One: use the tools that are given to you. Use privacy dashboards; use two-factor authentication, which is really valuable. Two: be more deliberate and critical about the ways in which companies are asking for your information, in the devices that you adopt and the services you participate in. Understand that companies are trying to tell you things through design."
4:54 am
"They're trying to make things easier or harder, so think about whether you want things to be easier, and about what the costs of making things easier are. And understand that companies are trying to get your consent because their entire business depends on it, so think about that as you go forward. Then finally, three: design and consent and privacy and algorithms need to become a political issue. If someone's running for office, ask them what their stance is on algorithmic accountability. Ask them what their stance is on privacy and data collection. Because if we get better rules about how data is collected and how algorithms are used, we might all be better off."

"We're at this inflection point where our technology is beginning to make predictions about us that are better than the predictions we make about ourselves. One of the best ways of inoculating ourselves is by learning more about ourselves. Just stop and really ask yourself, before you post something to Facebook: why am I doing this?"
4:55 am
"What are my motivations? If you slow down your thought process, often you'll find it's: oh, I'm pulling this app out because I'm a little bit bored, or everyone else just pulled out their phones and I'm feeling a little socially awkward. Oh, that's curious. Or maybe: I'm having this experience and I sort of want to show off just a little bit. Just stopping and thinking about what my motivations are, I have found to be a great inoculation against spending my time in ways I'll wish I hadn't later."

"I'll just say that I recommend a general attitude shift, where you understand yourself as an organism, as a creature that can, like any number of other creatures and animals, be turned in certain directions: have your attention swayed, caught, captured, hooked. I find it liberating to recognize all of the forces that are exerting these different powers on me, turning my attention one way or another."
4:56 am
"Somehow, realizing that helps me to disconnect a lot, because it makes me a little bit angry, and then I don't feel so bad about myself; I feel bad about what's being done to me. And then I'm more able to disconnect."

On the streets of Greece, anti-migrant violence is on the rise, and increasingly, migrant farm workers are the victims of vicious beatings. Javed Aslam is helping the Pakistani community to find a voice. The stories we don't often hear, told by the people who live them: Undocumented and Under Attack.
4:57 am
This is Europe, on Al Jazeera.

Hello, great to see you. We're going to start your weather story across the Americas, first towards South America: disturbed weather from Porto Alegre right into Uruguay. Take a look at Asunción: 39 degrees, and I think we'll get you to 40 by the time Monday rolls around; then the wind direction shifts to a southerly, we get some rain, and that cools you down. At the top end of South America, persistent rain for the Pacific coast of Colombia is spilling into the Colombian Andes on Sunday. In Central America, really aggressive rain will be falling for Mexico City, with a high of 19 degrees.
4:58 am
OK, North America. Here's what your weather story is looking like for the US Gulf states: remnants of what was Hurricane Nicholas are still spreading a lot of wet weather over Alabama and into the Florida Panhandle. And we do have a disturbance that has now formed into a tropical storm, Odette; it's taking aim at the Canadian province of Newfoundland and Labrador, with St John's looking at about 100 millimeters of rain in the days to come. Southerly winds are pumping up the temperature in Winnipeg, taking it to 28 degrees. And still wet weather for British Columbia; that wet weather is also driving down into Washington State and Oregon, which will help provide relief to the at least 20 wildfires burning there. In time, San Francisco has a high of 20.
4:59 am
"The people matter to me, and the community they live in, no matter how small; its representation matters as much as anyone else's." Ogoniland is one of the most polluted parts of the Niger Delta, and now its people say they want a cleanup. "The media love to cover you just when you suffer calamities. I don't think that's right, and that is what I want to change. I wanted to go further: to cover stories that impact the lives of people, to tell the stories I was really passionate about, stories with facts the government would try to keep hidden, to drop the fixed narrative and depend on the reality on the ground. That is why I became a journalist." In a country where journalism has become dangerous, he is one of those who refuse to be silenced.
5:00 am