
Shift, Deutsche Welle, October 21, 2023, 4:02am-4:15am CEST

4:02 am
Is this really what we want? In many cities, robots are already being used as security measures, and in some cases there are even debates about equipping robots with weapons. But do robots really make our streets safer, or are they actually a security threat themselves? And who can be held accountable if something goes wrong? Replacing police with robots can look like this: in the US, a man claimed to have explosives in his mouth, and a robot was used to check whether he was telling the truth. Another case: the robot dog Spot on patrol in a park in Singapore during the COVID pandemic. It made sure that everyone kept enough distance. Spot's cameras can also measure how full the park is. These are relatively typical uses for these robots. They collect data, like in Singapore, and they are used to protect police officers, especially in extremely dangerous situations. Police robots can disarm bombs.
4:03 am
Special sensors can detect dangerous chemicals or check for radioactive radiation. Microphones, lidar and infrared cameras allow the police to get a sense of a crime scene before entering it. This helps them determine the best course of action. According to this engineer, advancements in hardware development mean robots can take on more and more complex tasks. He heads the robotics department at Teledyne FLIR in the US, which manufactures defense and security robots. If you look at computers even just ten years ago, right, we've got something like a 30x increase in computation power compared to what we had back then. That opens up a lot of possibilities. It has really brought ground robots up to a new level. Robots like the ones made by Teledyne FLIR are used by law enforcement agencies around the world. They can assist in hostage situations, defuse bombs and scan dangerous areas. More than 4,000 units of the PackBot model alone have been sold.
4:04 am
This highly mobile robot can climb stairs, maneuver over difficult terrain and move in tight spaces. The data collected by its integrated cameras and sensors is processed in real time and shared with other police devices via the cloud. If you look at commonality of data and data sharing, right, so now we're able to have ground robots that, you know, talk to aerial assets and also provide a common operating picture. For now, police robots still work in tandem with human colleagues, but machine learning could enable them to learn from their own experiences and take on more tasks independently. We're starting to see, you know, robots do some autonomous tasks today, for navigation, for instance, but also some threat identification. Autonomous robots aren't only being developed for police forces, but for security firms and services too, because this work can also be dangerous and monotonous. In Switzerland, robots are being tested to take on these tough jobs.
4:05 am
The machines have one advantage: they can work 24/7 without getting tired. This robot is a guard on wheels with foldable legs. It doesn't seem fazed by adverse working conditions, and it should ideally patrol large areas like this yard. The developers at the Swiss start-up Ascento hope to fill a gap in the chronically understaffed security industry with their robot. We chose to go for the security market because it's a place where it's really difficult to find people nowadays. They have to work long hours and they have to work in every weather condition. We did this job ourselves and it's really hard, and that's where we wanted to help. Our robots are able to make the life of these guards a lot easier. The cost of a robot depends on many different factors, but according to Ascento, the costs are lower than employing a human guard in Switzerland. A 360-degree camera allows the robot to get a full view of its surroundings.
4:06 am
Its thermal camera can detect humans and vehicles at night. A cloud-based AI processes the images in real time and generates a daily report. Whether robots can stop criminal behavior remains to be seen, but this technology allows human employees in the control room to have eyes and ears in multiple places simultaneously. But just imagine if these robots had permission to decide whether or not to shoot. For military purposes, so-called killer robots are already being used, usually as drones. But the idea of using them in non-military areas makes many people nervous and raises lots of ethical questions. San Francisco officials considering robots that can kill: no, this isn't science fiction, it's reality. In 2022, the city administration wanted to arm these types of robots. Protests stopped the plan from moving forward. But civil rights activists like Kate Crawford warn that police use of armed robots is simply a matter of time.
4:07 am
Crawford fears this could lead to reckless decisions and an increase in deadly force. A police officer might resist engaging in fire, but a robot acting on orders from someone, you know, 10, 15, 20, a hundred miles away, who may not understand the full circumstances of what's going on at the scene, or, even worse, a robot that has been pre-programmed and uses machine learning technology to essentially decide for itself how to behave in a given situation, may not have any compunction. Killer robots are not yet being used by US police. But in 2016, when a sniper shot five police officers in Dallas, Texas, the police used a remote-controlled robot carrying explosives to kill the assailant. The robot had originally been designed to defuse bombs.
4:08 am
There aren't laws regulating the use of police robots. In a society where there are no laws governing how the police can use robots or surveillance technology, the police could use robots to suppress dissent. You know, you've seen incredible dissent in Iraq, for example, where you hear even members of the government saying this has gone too far, right? Well, that's not an issue if the people in power have an army of robots at their control. That's pretty scary, and I'm not the only one who thinks that. In a survey by the international campaign Stop Killer Robots, 66 percent stated that the use of lethal autonomous weapons would cross a moral line, as machines should not be allowed to kill. Some of the leading robot producers actually agree and have spoken out against their all-purpose robots being weaponized.
4:09 am
Data privacy also comes into it, because it's not just police robots that collect data. In Los Angeles, a delivery robot gave the police a crucial hand in solving a case: a robot helped catch its own thief. Pretty funny, right? Actually, it's not really a laughing matter. In fact, it raises serious concerns about video surveillance. A food delivery robot was out and about in Los Angeles when suddenly thieves appeared and tried to steal what was inside. The robot managed to escape, but afterwards the manufacturing company that built the machine sent its video footage to the police without being asked to do so by the investigators. The police located the suspects and arrested them. Delivery robots usually record their surroundings, but the decision on what to do with that footage rests with the manufacturer. Serve Robotics wants to deploy 2,000 robots in several US cities, and if they continue making their recordings available to authorities, that could turn them into a mobile surveillance network.
4:10 am
Aside from robots, security cameras also collect data. Some are equipped with facial recognition software and can thereby identify people using biometric data. In some police departments, the controversial database of Clearview AI is used. This AI can compare one person's face with billions of photos which have been taken from social media platforms without consent. CEO Hoan Ton-That says Clearview AI has already processed more than a million search requests by US police.
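As a rough illustration of how such a face search works in principle: systems of this kind typically turn every photo into a numerical embedding and then rank the stored photos by similarity to the embedding of the query face. The sketch below is purely hypothetical; random vectors stand in for a real face-embedding model, and no actual product or API is represented.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_embedding() -> np.ndarray:
    """Stand-in for a face-embedding model: a real system would use a
    neural network that maps a photo of a face to a fixed-length vector."""
    vec = rng.normal(size=128)
    return vec / np.linalg.norm(vec)   # unit length, so dot product = cosine similarity

# Toy "database": embeddings of photos collected from the web (random here).
photo_ids = [f"photo_{i}" for i in range(10_000)]
database = np.stack([random_embedding() for _ in photo_ids])

def search(query: np.ndarray, top_k: int = 5) -> list[tuple[str, float]]:
    """Rank stored photos by similarity to the query face."""
    scores = database @ query
    best = np.argsort(scores)[::-1][:top_k]
    return [(photo_ids[i], float(scores[i])) for i in best]

query_face = random_embedding()        # embedding of the face being searched for
for photo_id, score in search(query_face):
    print(photo_id, f"{score:.3f}")
```

At the scale of billions of photos, real systems rely on approximate nearest-neighbor indexes rather than the brute-force scan shown here, but the matching principle is the same.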
4:11 am
Analyzing data is meant to help prevent crime by identifying potential criminal activity. This concept is called predictive policing. But what exactly does this entail? At the heart of predictive policing lies a big promise: to stop crime before it is even committed. This is done by combing through masses of data. Some programs focus on potential crime scenes. If they find links between previous incidents, say that they all occurred at the same time and place, or even during the same weather conditions, they recommend that the police zero in on those locations. Other programs predict who is considered to be a potential offender. They scan the criminal and personal histories of individuals for so-called risk factors, like previous arrests or even dropping out of school, to come up with a list of who is likely to break the law. Law enforcement agencies deploy these programs in over 50 countries, in western democracies and under authoritarian governments alike. Proponents of predictive policing say it makes cities safer, but the approach has caused outrage among human rights advocates, who warn that it increases discrimination.
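The report does not name specific products or their scoring rules, but a heavily simplified, hypothetical sketch of the two approaches just described, hotspot scoring based on past incidents and a risk score built from so-called risk factors, might look like this (all data, factors and weights are invented for illustration):

```python
from collections import Counter
from dataclasses import dataclass

# Hotspot-style program: count past incidents per (place, hour) bucket and
# suggest that patrols focus on the buckets with the most recorded incidents.
# The incident log below is entirely made up.
incidents = [
    ("riverside", 22), ("riverside", 23), ("riverside", 22),
    ("old_town", 14), ("harbor", 2), ("harbor", 3), ("harbor", 2),
]
hotspots = Counter(incidents)
print("Suggested patrol focus:", hotspots.most_common(2))

# Offender-style program: a weighted sum over so-called risk factors.
RISK_WEIGHTS = {"previous_arrests": 2.0, "dropped_out_of_school": 1.0}

@dataclass
class Person:
    name: str
    previous_arrests: int
    dropped_out_of_school: bool

def risk_score(person: Person) -> float:
    return (RISK_WEIGHTS["previous_arrests"] * person.previous_arrests
            + RISK_WEIGHTS["dropped_out_of_school"] * person.dropped_out_of_school)

people = [Person("A", 3, False), Person("B", 0, True), Person("C", 1, True)]
ranked = sorted(people, key=risk_score, reverse=True)
print("Risk ranking:", [(p.name, risk_score(p)) for p in ranked])
```

Whatever the exact method, the output can only reflect the incident and personal data fed into it, which is exactly the point the critics raise next.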
4:12 am
The programs, for instance, often flag low-income communities or minority neighborhoods as alleged hotspots, prompting the police to patrol those areas more than others. This in turn generates even more data and sets off a vicious cycle of discrimination, targeting these areas over and over again. Similarly, the programs tend to single out low-income people and minorities as potential offenders. Existing police data, whether in the US, whether in Germany, whether in Australia, embed a lot of historical discrimination. They do embed systemic problems such as racism, or classism, and even sexism. So now we are taking all of this data, which is seemingly objective and correct but is in reality quite biased and discriminatory, especially towards minority populations, and we treat that as being the ground truth on which the computer kind of learns what patterns exist.
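The feedback loop described here can be made concrete with a toy simulation. Purely as an assumption for illustration, suppose two neighborhoods have exactly the same true rate of offenses, but one starts out with more recorded incidents, and patrols are allocated in proportion to what has been recorded:

```python
import random

random.seed(1)

TRUE_RATE = 0.1                      # identical true offense rate in both areas
recorded = {"A": 30, "B": 10}        # area A starts with more recorded incidents

for year in range(5):
    total = sum(recorded.values())
    # Patrol hours are allocated in proportion to past recorded incidents ...
    patrols = {area: 100 * count / total for area, count in recorded.items()}
    # ... and more patrol hours mean more offenses get observed and recorded.
    for area, hours in patrols.items():
        observed = sum(random.random() < TRUE_RATE for _ in range(round(hours)))
        recorded[area] += observed
    print("year", year, "patrol share:", {a: f"{p:.0f}%" for a, p in patrols.items()})
```

Even though both areas have the same true rate, the initial imbalance never corrects itself: area A keeps receiving most of the patrol attention, because the data the system learns from reflects where it already patrols rather than where offenses actually happen.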
4:13 am
Public awareness of the risks of predictive policing and its bias against some social groups is growing in the US amid a debate over systemic racism and how to reform policing. Some police departments have stopped using facial recognition and predictive policing. But in other countries the technology is still in demand. In countries that don't have strong data protection legislation or, you know, where governments are more open to experimenting with technologies on people, we see that predictive policing is actually on the rise. Whether it's police robots or predicting crimes using AI, we should be very careful about which tasks we would like to delegate to machines. Machines don't have a conscience or a sense of responsibility. That's why fundamental decisions, especially when they are about life and death, should never be made by machines alone. What's more, I feel like the interpersonal aspect is missing. Personally, I would rather trust a human to help me instead of a machine.
4:14 am
What do you think? Would you feel safe if there were police robots in your neighborhood? Or would it actually make you feel more anxious? Let us know. Bye, and see you next time. Young North Koreans fled to South Korea, where they realized their dreams of becoming social media stars. But then they disappeared without warning, only to resurface in a North Korean propaganda video. What happened? From North Korea with Love starts October 25th on DW. Turkey is changing. Six years ago, we said it can't get any worse, but it does.
4:15 am
Guardians of Truth. This time, an exiled journalist meets the voices of a free Turkey, others who had to flee into exile. I knew the police would search my house. Courageous people are trying to stem the Turkish government's authoritarian course. What about freedom of the press and freedom of expression? Guardians of Truth starts October 28th on DW.
