
Shift  Deutsche Welle  August 3, 2024, 4:02am-4:16am CEST

4:02 am
useful security measures, and in some cases there are even debates about equipping robots with weapons. But do robots really make our streets safer? Or are they actually a security threat themselves, and who can be held accountable if something goes wrong? Replacing police with robots can look like this: a man in the US claimed to have explosives in his mouth, and a robot was used to check whether he was telling the truth. Another case: the robot dog Spot on patrol. It was deployed in Singapore during the corona pandemic, making sure that everyone kept enough distance. Spot's cameras can also measure how full a park is. These are relatively typical uses for police robots. They collect data, like in Singapore, and they are used to protect police officers, especially in extremely dangerous situations. Police robots can disarm bombs,
4:03 am
and special sensors can detect dangerous chemicals or check for radioactive radiation. Microphones, lidar and infrared cameras allow the police to get a sense of a location before entering it. This helps them determine the best course of action. Thanks to advancements in hardware development, robots can take on increasingly complex tasks. This engineer heads the robotics department of Teledyne FLIR in the US, which manufactures security robots. "If you look at computers even just ten years ago, right, we've got like a 30x increase in computation power compared to what we had back then. That opens up a lot of possibilities. It's really brought ground robots to a new level." Robots like the ones made by Teledyne FLIR are used by law enforcement agencies around the world. They can assist in hostage situations, disarm bombs and scan dangerous areas. More than 4,000 robots of the PackBot model alone have been sold. This highly mobile robot can climb stairs,
4:04 am
maneuver difficult terrain, and move in tight spaces. The data collected by its integrated cameras and sensors is processed in real time and shared with other police devices via the cloud. "If you look at the commonality of data and data sharing, right, now we're able to have ground robots that, you know, talk to aerial assets and also provide a common operating picture." For now, police robots still work in tandem with human colleagues, but machine learning could enable them to learn from their own experiences and take on more tasks independently. "We're starting to see, you know, robots do some autonomous tasks today, for navigation, for instance, but also some threat identification." Autonomous robots aren't only being developed for police forces, but also for security firms and services. This work can also be dangerous and monotonous. In Switzerland, robots are being tested to take on these tough jobs. The
4:05 am
machines have one advantage: they can work 24/7 without getting tired. This robot is a guard on wheels with foldable legs. It doesn't seem fazed by adverse working conditions, and it should ideally patrol large areas like this freight yard. The developers of the Swiss startup Ascento hope to fill a gap in the chronically understaffed security industry with their robot. "We chose to go for the security market because it's a place where it's really difficult to find people nowadays. They have to work long hours and they have to work in all weather conditions. We did this job ourselves and it's really tough. And we wanted to help there, and our robots are able to make the life of these guards a lot easier." The cost of a robot depends on many different factors, but according to Ascento, the costs are lower than employing a human guard in Switzerland. A 360-degree camera allows the robot to get a full view of its surroundings. With its thermal camera, it can detect humans and
4:06 am
vehicles at night. A cloud-based AI processes the images in real time and generates a daily report. Whether robots can stop criminal behavior remains to be seen, but this technology allows human employees in the control room to have eyes and ears in multiple places simultaneously. But just imagine if these robots had permission to decide whether or not to shoot. For military purposes, so-called killer robots are already being used, usually as drones. But the idea of using them in non-military areas makes many people nervous and raises lots of ethical questions. San Francisco officials considering robots that can kill: no, this isn't science fiction, it's reality. In 2022, the city administration wanted to arm these types of robots. Protests stopped the plan from moving forward. But civil rights activists like Kate
4:07 am
Crawford warn that police use of armed robots is simply a matter of time. Crawford fears this could lead to reckless decisions and an increase in deadly force. "A police officer might resist engaging in gunfire, whereas a robot acting on orders from someone, you know, 10, 15, 20, a hundred miles away, who may not understand, you know, the full circumstances of what's going on at the scene, or even worse, a robot that has been pre-programmed and uses machine learning technology to essentially, you know, decide for itself how to behave in a given situation, may not have any compunction." Killer robots are not yet being used by US police. But in 2016, when a sniper shot five police officers in Dallas, Texas, police used a remote-controlled robot armed with explosives to kill the assailant. The robot had
4:08 am
originally been designed to disarm bombs. There aren't laws regulating the use of police robots. "In a society where there are no laws governing how the police can use or buy this technology, the police could use robots to suppress dissent. You know, you've seen incredible dissent in Iran, for example, where even members of the guard are saying this has gone too far, right? Well, that's not an issue if the people in power have an army of robots at their control. That's pretty scary, and I'm not the only one who thinks that." In a survey by the international campaign Stop Killer Robots, 66 percent of respondents stated that the use of autonomous weapons would cross a moral line, as machines should not be able to kill. Some of the leading robot producers actually agree and have spoken out against their all-purpose robots being weaponized. Data privacy also comes
4:09 am
into it, because it's not just police robots that collect data. In Los Angeles, a delivery robot gave the police the crucial hint to solve a case. A robot helped catch its own thieves. It's pretty funny, right? Actually, it's not really a laughing matter. In fact, it raises serious concerns about video surveillance. A food delivery robot was out and about in Los Angeles when suddenly thieves appeared and tried to steal what was inside. The robot managed to escape, but afterwards the manufacturing company that built the machine sent its video footage to the police without being asked to do so by the investigators. The police located the suspects and arrested them. Delivery robots usually record their surroundings, but the decision on what to do with that footage rests with the manufacturer. Serve Robotics wants to deploy 2,000 robots in several US cities, and if they continue making their recordings available to authorities, it could turn them into
4:10 am
a mobile surveillance network. Aside from robots, security cameras also collect data. Some are equipped with facial recognition software and can thereby identify people using biometric data. In some police departments, the controversial database Clearview AI is used. This AI can compare one person's face with billions of photos which have been taken from social media platforms without consent. CEO Hoan Ton-That says Clearview AI has already processed more than a million search requests by US police. Analyzing data is meant to help prevent crime by identifying potential criminal activity. This concept is called predictive policing. But what exactly does this entail? At the heart of predictive policing lies a big promise: to stop crime before it is even committed. This is done by combing through masses of data. Some programs focus on potential crime scenes. If they find
4:11 am
links between previous incidents, say that they all occurred at the same time and place, or even during the same weather conditions, they recommend the police zero in on those locations. Other programs predict what they consider to be potential offenders. They scan the criminal and personal histories of individuals for so-called risk factors, like previous arrests or even dropping out of school, and come up with a list of who is likely to break the law. Law enforcement agencies deploy these programs in over 50 countries, in Western democracies and under authoritarian governments. Proponents of predictive policing say it makes cities safer, but the approach has caused outrage among human rights advocates. They warn that it increases discrimination. The
4:12 am
programs, for instance, often flag low-income communities or minority neighborhoods as alleged hotspots, prompting the police to patrol those areas more than others. This in turn generates even more data and sets off a vicious cycle of discrimination, flagging these areas over and over again. Similarly, the programs tend to single out low-income people and minorities as potential offenders. "Existing police data, whether in New Delhi, within the US, whether in Germany, whether in Australia, embed a lot of historical discrimination. They do embed systemic problems such as racism, casteism or even sexism. Now we're taking all of this data that is, you know, seemingly objective and correct, but is in reality quite biased and discriminatory, especially towards minority populations, and we treat it as the ground truth from which the computer kind of learns what patterns exist."
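To make that vicious cycle tangible, here is a minimal thought experiment in code. It is a hypothetical sketch, not taken from the program or from any real predictive policing product: the neighborhood names and numbers are invented, and the only assumptions are that areas with more recorded incidents get more patrols, and that more patrols produce more recorded incidents.

```python
# Purely illustrative sketch of the feedback loop described above.
# Neighborhood names and all numbers are invented.
from collections import Counter

# Historical incident counts per area. Any bias in past recording
# practice is already baked into these numbers.
incidents = Counter({"Northside": 40, "Harbor": 35, "Old Town": 12, "Hillcrest": 8})

def recommend_patrols(counts, k=2):
    # Flag the k areas with the most recorded incidents as "hotspots".
    return [area for area, _ in counts.most_common(k)]

def record_new_incidents(counts, patrolled, boost=3):
    # More patrols mean more incidents get noticed and logged there,
    # even if underlying behavior is the same in every area.
    for area in counts:
        counts[area] += boost if area in patrolled else 1

for week in range(1, 6):
    hotspots = recommend_patrols(incidents)
    record_new_incidents(incidents, hotspots)
    print(f"week {week}: patrolled {hotspots} -> {dict(incidents)}")

# The areas flagged first keep pulling further ahead: the ranking
# "confirms" itself with data that its own recommendations generated.
```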
4:13 am
Public awareness of the risks of predictive policing and its biased use against some social groups is growing. In the US, amid the debate over systemic racism and calls to defund the police, some police departments stopped using facial recognition and predictive policing. But in other countries the technology is still in demand. "In countries that don't have strong data protection legislation or, you know, where governments are more open to experimenting with technologies on people, we see that predictive policing is actually on the rise." Whether it's police robots or predicting crimes using AI, we should be very careful about which tasks we'd like to delegate to machines. Machines don't have a conscience or a sense of responsibility. That's why fundamental decisions, especially when they're about life and death, should never be made by machines alone. What's more, I feel like the interpersonal aspect is missing. Personally, I would rather trust another human to help me instead of a machine. What do you think?
4:14 am
Would you feel safe if there were police robots in your neighborhood? Or would it actually make you feel more anxious? Let us know. Bye, and see you next time. This is Shadows of German Colonialism. These podcasts and videos shed light on the devastating colonial harm inflicted by Germany. Across its colonies it used scorched-earth tactics, burned farms and destroyed livestock. What is the legacy of this widespread racist oppression today? History we need to talk about. Hear the stories: Shadows of German Colonialism. So, do you think
4:15 am
and feel the same way? Do you expect more or different things from life than your parents? Or do you think your kid is too different, too risky, irresponsible? "I want my son to become a doctor." It's time to hear it from your generation. When generations clash: watch now on YouTube at DW. "This kind of podcast feels like therapy."
