
tv   [untitled]    September 18, 2021 3:30pm-4:01pm AST

3:30 pm
Last question. And I think that's what we really do: we ask the questions of people who should be accountable, and we also get people to give their view of what's going on. A look now at the top stories on Al Jazeera. Opponents of Tunisia's President Kais Saied have taken to the streets in the first demonstration after what they say was an organized coup. Tunisia's powerful labour union and the biggest party in parliament oppose any attempt by the president to suspend the constitution. Saied took control of the government and dismissed the prime minister in July. And France is recalling its ambassadors to the US and Australia after what it is calling a stab in the back. France is angry after Australia cancelled a multi-billion-dollar submarine contract to enter
3:31 pm
a military alliance with the US and the UK. Our correspondent has more from London. Recalling ambassadors, in the carefully coded language of diplomatic relations, is a pretty serious move. It's not the breaking off of diplomatic relations, but it's what you do when you want to show a country that you really are not happy with the way it's behaved. And let's not forget that France is the United States' oldest ally, a relationship that goes back to 1778, when France recognized the independence of these breakaway colonial states and helped them in their war of independence against the British. The US military has admitted a drone strike in Afghanistan's capital last month killed civilians instead of ISIL fighters. Ten people, including seven children, died in the attack in Kabul. A US commander has apologized for what he called a tragic mistake. Guinea's military leaders say they will not bow to international
3:32 pm
pressure to let the detained president, Alpha Condé, fly into exile. The West African regional bloc ECOWAS expelled Guinea after Condé was overthrown in a military takeover earlier this month. The president of Ivory Coast asked for Condé's release during a one-day trip to Conakry. US media reports say the Biden administration is planning the mass deportation of thousands of Haitian migrants who arrived in Texas. Most of the migrants are staying at a makeshift camp under a bridge in Del Rio. An expert panel in the US has rejected government plans to roll out COVID-19 booster shots to all Americans aged 16 and older, but it endorsed extra Pfizer-BioNTech shots for people above the age of 65 and those at high risk of severe disease. And those are the headlines. The news continues in about half an hour. Goodbye. There is a huge group of people at work behind our screens. They're called behaviour architects,
3:33 pm
persuasive designers and user experience specialists, and the power they have is massive. That urge to keep swiping through your Twitter feed? That's design. The way we all click "I agree" to the terms and conditions? That's design. We live in an online world someone else is making, and most of us never even give it a second thought. And actually, that's design too. This is San Francisco, the mecca for tech designers, home to Silicon Valley. This place pioneered the art of constructing, optimizing and enhancing a lot of the technology we use every day. It's turbocharged
3:34 pm
the speed at which we use the internet and made navigating the web more intuitive. But it's also given us a false sense of security. I've lost count of the number of times I've clicked "I agree" to get into a website. We all have to do it. As we move around the internet, we face hundreds of these annoying pop-ups and consent forms, all demanding a response from us. And yes, I call them annoying, because that is exactly what they are. They may look like they present us with controls, but the reality is far from it. When users click on "I agree" to the terms and conditions, or they see a privacy policy and they click on it, they may think that they're actually being given control of their personal data: what's collected and how it's used. And it's advantageous to companies for them to think that, because if something bad happens, the tech company can then say: you actually agreed to this. Nobody ever reads the
3:35 pm
terms of service. No one should ever be expected to. If we actually tried to read every terms and conditions agreement that we came across, it would probably be the only thing we did; it would have to be our day job, because they're so long and we come across so many. It may have the veneer of giving control to data subjects, but ultimately it's window dressing. Woodrow Hartzog is what you'd call a scholar of design. He's been studying the psychology, ethics and legal implications of new technologies. One area he specializes in is data protection and privacy. Now, before we get into this, here's a key term you need to know: informed consent. This is a principle that comes up a lot in discussions of our rights online, but it's easier to explain in a medical context. A doctor explains potential risks and worst-case scenarios to the patient. Once you're fully informed, you have the option to consent to surgery or not. In the online world, informed
3:36 pm
consent is what everyone says is the ideal. But is that even possible? Informed consent only works under a very narrow set of conditions: when the decision is infrequent, like with surgery (we don't have surgery all the time); when the risks are visceral, things that we can easily conjure up in our minds; and finally, when the potential harm is great. If things go wrong with surgery, you could get sick or you could die, so we've got an incredible incentive to take that decision seriously. But of course, none of those things are present in the data ecosystem. We make these decisions quite frequently, ten to a hundred times a day. The harms are not visceral at all; they're incredibly opaque. And no single harm is that great, because modern privacy harms aren't huge: it's death by a thousand cuts. The spin from Silicon Valley is that they're on our side when it comes to how we control data. Long privacy policies are very confusing, and if you make it long and
3:37 pm
spell out all the detail, then you're probably going to reduce the percentage of people who read it. However, take a closer look at the design of the buttons and the pop-ups that we're made to click, and it's clear that the tech companies have the upper hand in the data battle. Design is power, and every design decision makes a certain reality more or less likely. And what tech companies and psychologists and other people have known for years is that defaults are notoriously sticky. So if you design the interface so that all the defaults are set to maximize exposure, then you're going to get a lot more data in the aggregate than you would if you set all the defaults to be privacy-protective, because people don't go in and change them. So until we fundamentally change the incentives, we're still going to see companies manipulating the design of these buttons, these technologies, to ensure that you still keep disclosing data and that they still keep getting what's the lifeblood of their business. Most of us assume that when we go on
3:38 pm
a website and click that "I agree" button, the site simply collects information that we voluntarily choose to share. In reality, there are many layers to data collection, and the mechanics of it are invisible, hidden by design. For starters, it isn't just the website you are on that's mining information. There are so-called third parties, advertising, marketing and analytics agencies, also tracking you using tiny bits of software: cookies, beacons, pixel tags. They scoop up incredibly detailed information, everything from the computer you're using to how long you hover over a link. Honestly, it's a bit mind-boggling. And all you really did was click "I agree". Informed consent is a fundamentally broken regulatory mechanism for algorithmic accountability. It allows companies to continue to throw risk back onto the user and say, you know, here's a pop-up banner that tells you about cookies that no one reads,
3:39 pm
and nobody really even cares about. Yet we push forward this regime as though it matters to people, as though, if someone clicks "I agree", then they're magically OK with all the data processing that's going to come afterwards, which is a little bit of a joke and a little bit of a legal fiction. Yet it's a major component of almost every data protection framework around the world. Once you've crossed the "I agree" hurdle, you're in the clutches of the website, and this is where design takes on a whole new level of importance. The job is to keep you there. One of the most successful innovations in mobile and website design is something called infinite scroll. It's likely we all use it every single day: you scroll endlessly through your feed without ever needing to click. I'm on my way now to meet the creator of this function. His name is Aza Raskin, and he no longer works inside a big tech corporation. In early 2018,
3:40 pm
he co-founded the Center for Humane Technology. All of our apps, all of these trillion-dollar companies, are competing for our attention. And because it's such a cut-throat game trying to get our attention, worth tens of billions of dollars, they have to point increasingly powerful computers at our heads to try to frack us for that attention. And we've all had that experience of going to YouTube thinking, I'm just going to watch one video, and then somehow you shake your head and an hour has passed. What? Why? How is it? The technology has hypnotized us. This tech hypnotization is key to what's called the attention economy. Our attention is a finite currency in the online world, and it's only as long as websites and apps have our attention that they have our business and our data. The attention economy is just this: it says,
3:41 pm
if we are not paying for a product, well, the company has to make money somehow. How do they do it? They do it by selling our attention to advertisers, or to other groups that want us to do something. They're trying to make the systems as effective as possible at influencing your decisions. They collect as much information about you as they can: who your friends are, how you spend your time online, how you spend your money. They take all of this data to build a model of you. Imagine a little simulated version of you that lives in a Facebook server. Then they can put things in front of it: are you more likely to click this, this or this? Or, if we want to get you to hate immigration, what kind of message will work on you? Are you going to resonate with this message, or this message? So you can see how what begins as just this race for your attention ends up becoming
3:42 pm
an entire economy's worth of pressure, with the very smartest minds in engineering and the biggest supercomputers trying to make a model of you, to be able to influence the kinds of decisions you are going to make. A few years ago, YouTube set a company-wide objective to reach a billion hours of viewing a day. Netflix chief executive Reed Hastings has also said multiple times that the company's biggest competitor isn't another website; it's sleep. So what happens when you give algorithms the goal of maximizing our attention and time online? They find our weaknesses and exploit them. In 2017, Sean Parker, a founding member of Facebook and its first president, literally confessed: how do we consume as much of your time and conscious attention as possible? And that means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or
3:43 pm
a post or whatever. And that's going to get you to contribute more content. It's a social-validation feedback loop. I mean, it's exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology. Now, it's not as though Silicon Valley pioneered the tricks and tactics of addiction or persuasive design. Many tech designers openly admit using insights from behavioural scientists of the 20th century. The concept of randomly scheduled rewards was studied and developed by the American psychologist B. F. Skinner in the 1950s. He created what's become known as the Skinner box, a simple contraption he used to study pigeons. At the start of the process, the pigeon is given a food reward every time it pecks at a disc or turns a full circle. As the experiment proceeds, the rewards become less frequent. They take place at random times,
3:44 pm
but the behaviour has been established: the pigeon keeps pecking or turning, not knowing when it might get the reward, but in anticipation that a reward could be coming. Skinner boxes were pivotal in demonstrating how design had the power to modify behaviour. And if randomly scheduled rewards work for pigeons, why not humans? Skinner's concept is in fact at the heart of a lot of addictive design in casinos, such as slot machines. Social media platforms are unerringly similar to slot machines. Think about your Facebook, Instagram or Pinterest feeds. We all swipe down, pause, and then wait to see what will appear. We're back to those randomly scheduled rewards again. What appears could be a comment on a photo, a like, a piece of spam, or a software update. We don't really know, and it's that unpredictability that makes it so compelling. Natasha Schüll is
3:45 pm
a cultural anthropologist who has spent more than 15 years studying the algorithms behind persuasive design. Just like on a slot machine, when you're texting or when you're looking through the news feed, you really never know what's coming down the pipe. You never know when you're going to hit that jackpot, so to speak, when it's coming and how much it'll be. So the randomness is very important to keeping you hooked in. I think the fact that we're gambling money in a casino isn't really that different from what we're doing online, because in both cases what we're really gambling is our attention and our time, right across the board. We're sitting there hoping for some little reward to come, never knowing when it's going to come. In all cases, we're sitting alone with the machine, and there's no natural stopping point. I think the
3:46 pm
similarities are quite striking. We check our phones over 150 times a day. We just pull them out, again and again. It's the first thing we look at when we wake and the last thing we look at before we go to sleep. We're glued to them, and that's by design. We now have over two billion Skinner boxes in people's pockets. We are running the largest psychological experiment the world has ever seen, by orders of magnitude. One of every four human beings on Earth has a Skinner box which is learning how to uniquely target them. Super fascinating in the abstract, but sort of terrifying when you think about what it's doing. The research has been done, and the evidence is clear that digital technology is being designed with the intention to make us addicts. It's not a secret in the tech industry, either. Two of the biggest tech figures, Bill Gates and the late Steve Jobs, admitted they consciously limited the amount of time their children were allowed to
3:47 pm
engage with the products they helped create. The problem is real, but the media can often sensationalize the issue of tech addiction and make it harder for us to understand. I think the reason people reach for the metaphor of crack cocaine is because they see it as a sort of high-speed, hyper-intensive form of addiction. And while I don't use that metaphor myself, I do think that within the spectrum of media addictions there are certain ones that are more high-potency, you could say. And those are, if we're going to use the language of addiction, the ones with a higher event frequency. Think about a horse race, right? You go to the track and you've got to really wait for that event to happen. If you are rapidly engaged in an activity, a Twitter thread that is more high-potency has a higher event frequency, which means each event has an opportunity to draw you in more, to reinforce that
3:48 pm
behaviour more. So I think we really can apply the language of addiction to the different media. I have an ongoing frustration, which is that whenever I'm still for a second, I have this impulse to reach into my pocket and pull out my phone, and then I get angry at myself, because I say: that's not right, just enjoy the moment, just be with yourself for a second. And then I get angry at myself that my phone has that much power over me, right? I'm angry that I'm subject to the design of a technology in such a way that I have difficulty resisting its allure. But of course, everything about the technology is built to create that impulse, to make it feel as though it's irresistible. There's such emphasis put on free choice, on being able to be
3:49 pm
a consumer who makes decisions in the marketplace about what you want to do, on having free will. But at the same time, the very people who are promoting that idea of the person as a consumer sovereign are operating and designing their technology with a very different human subject in mind. It's somebody who can, like a rat or a pigeon or any other animal, be incentivized and motivated and hooked in and have their attention redirected. And that's really interesting to me, because it is a kind of return to Skinner. I think you wouldn't have heard that in the eighties or nineties; it would even have been creepy to think about someone designing your behaviour. But now it's become accepted that you can be a behaviour designer. And behaviour design was part of what he used to do in a previous life. However, he is now one of a growing number of industry insiders who are taking
3:50 pm
a more critical stance towards Silicon Valley. Talking to him left me wondering: does he regret his part in inventing infinite scroll? To be humble about it, if I hadn't invented it, it would have been invented. I just happened to be in the right place at the right time, thinking about the right kind of thing. Yes, I do regret it. But I do think it speaks to the naivety of saying, here's a cool feature, and making it, and even if it's great for me, the user, not thinking about the effects that will happen when you scale it up to 100 million people or a billion people. This little thing that I went around to Twitter and Google and all these other companies saying, hey, you should adopt this, has now wasted quite literally hundreds of millions of human hours. I'm sure all of us have had someone or other say to us: stop looking at your phone, or, why are you so addicted to social media? And before I started this series,
3:51 pm
I thought maybe there was something wrong with me. I just had no idea how deliberately designed our online experiences are, and how these design algorithms are made to pull us in. So I asked everyone I spoke to: how do we change that? Can we change online design? Regulation cannot be expected to happen on its own within these corporations, right, who are profiting from this, because there's just too deep a conflict of interest. And so the only viable kind of regulation is going to have to come from the outside. We have to have a public conversation about what is actually going on in these products. How are they working? And once we understand that, as a collective, what do we want to limit and constrain? So if you start with the assumption that people are never going to fully
3:52 pm
understand the risks of algorithms and the way in which they interact with data, then we have to think about what else might work. Data protection regimes like the General Data Protection Regulation are a great foundation, and one of the ways in which we could really improve upon that is to embrace a trust-based model. So instead of putting all the risk on the user, the requirement of protecting people would fall on the company that is using the algorithms and using the data. I think there is going to be a huge shift from just human-centred design, as we call it in our field, which is putting the human at the centre, towards thinking about human-protective design. We see that these tools are so powerful that they cause real damage to us: individually, to our mental health, to our relationships, to our children; and societally, to our democracy, to having civil discourse.
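The randomly scheduled rewards described in the Skinner-box segment earlier can be sketched as a tiny simulation. This is a toy illustration only; the one-in-ten reward probability is an invented parameter, not anything a real platform publishes.

```python
import random

def variable_ratio_rewards(n_actions: int, p_reward: float, seed: int = 0) -> list[int]:
    """Return the indices of actions (pecks, swipes, refreshes) that paid off.

    Under a variable-ratio schedule, every action has the same small chance
    of a reward, so the gaps between rewards are unpredictable -- which is
    exactly what keeps the subject acting in anticipation.
    """
    rng = random.Random(seed)
    return [i for i in range(n_actions) if rng.random() < p_reward]

# One session of 100 "pulls" with a 1-in-10 chance of a reward each time:
# the rewards cluster and spread out at random, never on a fixed rhythm.
session = variable_ratio_rewards(100, 0.1, seed=42)
print(len(session), session[:5])
```

Contrast this with a fixed-ratio schedule (a reward every tenth action): the fixed version is predictable, so there is a natural pause after each payout; the random version offers no such stopping point.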
3:53 pm
And that move from human-centred to human-protective design, I think, is super helpful, because then we could actually have technology do what it was supposed to do in the first place, which is extend our best selves. So if you're concerned about the ways in which data is being collected and algorithms are being used to affect your life, there are three things I think you can do. One, use the tools that are given to you: use privacy dashboards, use two-factor authentication, which is really valuable. Two, be more deliberate and critical about the ways in which companies are asking for your information, in the devices that you adopt and the services you participate in. Understand that companies are trying to tell you things through design, and they're
3:54 pm
trying to make things easier or harder. Think about whether you want things to be easier, and what the costs of making things easier are. Understand that companies are trying to get your consent, because their entire business depends on it, and think about that as you go forward. And finally, I think that design and consent and privacy and algorithms need to be a political issue. So if someone is running for office, ask them what their stance is on algorithmic accountability. Ask them what their stance is on privacy and data collection, because if we get better rules about how data is collected and algorithms are used, we might all be better off. We're at this inflection point where our technology is beginning to make predictions about us that are better than the predictions we make about ourselves. And one of the best ways of inoculating ourselves is by learning about ourselves more. Just stop and really ask yourself, before you post something to Facebook: why am I doing this?
3:55 pm
What are my motivations? If you slow down your thought process, often, I've found, it will be: oh, I am pulling this app out because I'm a little bit bored, or everyone else has pulled out their phones and I'm feeling a little socially awkward. Oh, that's curious. Or maybe: I'm having this experience and I want to show off just a little bit. Just stopping and thinking about what my motivations are for doing things, I have found to be a real inoculation against spending my time in ways that I wish I hadn't later. I'll just say that I recommend a general attitude shift, where you understand yourself as an organism, as a creature that can, like any number of other creatures and animals, be turned in certain directions, have your attention swayed, caught, captured, hooked. I find it liberating to recognize all of the forces that are
3:56 pm
exerting these different powers on me to turn my attention in one way or another. Somehow realizing that helps me to disconnect a lot, because it makes me a little bit angry, and I don't feel so bad about myself; I feel bad about what's being done to me. And then I'm more able to disconnect. Russell Beard in southern England, where a farm turned safari park is pioneering rewilding, putting nature in the driving seat. It was just absolutely astonishing: the wildlife came back even in the very first year. And in Santiago, the company revolutionizing the recycling system using artificial intelligence.
3:57 pm
On Al Jazeera. One day I might be covering politics; the next, reporting from Serbia or Hungary. What's most important to me is talking to people, understanding what they are going through, so that I can convey the headlines in the most human way possible. At Al Jazeera, we believe everyone has a story worth hearing. It's another beautiful sunny day at 35,000 feet. The weather, sponsored by Qatar Airways, voted world's best airline of 2021. Hi again, good to see you. Let's get going with your weather story for the Americas. This disturbance in the Atlantic has formed into a tropical storm,
3:58 pm
Odette, taking aim at the Canadian province of Newfoundland and Labrador, with St. John's looking to pick up about 100 millimetres of rain, one week after Hurricane Larry rolled through. Toward the west, a south-facing wind in Winnipeg is going to pump up your temperature to 28. We've had an atmospheric river for BC, north of the Fraser, scooping up about 130 millimetres of rain, that wet weather diving down through Washington State and Oregon, and that's going to help provide relief to nearly 20 wildfires burning there. For the US Gulf states, remnants of what was Hurricane Nicholas are still giving us wet weather, with pulses towards the south of Alabama and along the Florida Panhandle. In Central America, some aggressive rain falling in Mexico City on Sunday. We've got some wet weather across the Caribbean. It's fine for that top end of South America, with persistent rain for the Pacific coast of Colombia, falling through the Colombian Andes. Further toward the south, disturbed weather for
3:59 pm
parts of the River Plate region down to Uruguay, sitting at 39, but I think you'll get up to 40 by the time Monday rolls around, and then a big drop in temperatures on Tuesday. The weather, sponsored by Qatar Airways, voted world's best airline of 2021. Meteorites: small natural rocks from outer space that survive the journey down to us and have high market value for rock and mineral collectors. Al Jazeera World joins the Moroccan nomads in their desert search for these gifts from above. I can tell that it's a meteorite. It is, it is. Morocco's Meteorite Hunters, on Al Jazeera. With more than 200 million cases of COVID-19 worldwide, governments are battling to fight fresh waves of the virus and new variants. There has been a surge in the number of people booking vaccination appointments. From the human cost to the political and economic fallout, Al Jazeera brings you the latest on the pandemic.
4:00 pm
We have vaccinated more than 1,100 people here, all of them migrant farm workers. People aren't coming for testing because they think there is a risk. Special coverage, on Al Jazeera. This is Al Jazeera. Hello, I'm Rob Matheson. This is the news hour, live from Doha. Coming up in the next 60 minutes: Tunisia's constitutional crisis draws thousands to the streets to protest the seizure of ruling powers by President Kais Saied. We were deliberately kept in the dark: anger and frustration among allies of France.



Uploaded by TV Archive on