
tv   The Communicators  CSPAN  October 10, 2016 8:01am-8:31am EDT

8:01 am
>> guest: we are looking at a cadillac crossover suv from 2011 which is capable of driving itself. so it's a self-driving cadillac. >> host: well, if you look at the car, it looks pretty normal from here. >> guest: yeah. it's a car that we literally bought new from a cadillac dealership in pittsburgh, and then we outfitted it with a bunch of sensors and computers, added software to it. bam, it drives itself. >> host: well, you wanted to give us a ride. >> guest: oh, we would love to give you a ride. >> host: let's go. >> guest: hop in. >> autonomous driving. >> host: what's the main reaction you get from people riding in the car with you? >> guest: the typical reaction is one of anxiety and angst, fear and occasionally panic attacks. but then they eventually watch the car being able to drive. it's
8:02 am
actually stopping when it should stop, it's taking the curve fairly comfortably and seems to recognize -- [inaudible] they build a degree of comfort. >> we learned more about the move to self-driving cars from professor rajkumar at the lab where his experimental car is kept and worked on. >> host: so dr. rajkumar, what's your job here at carnegie mellon university? >> guest: i am a professor of robotics and engineering. >> host: how'd you get into that? >> guest: turns out that i did my postgraduate studies, my master's and ph.d., at carnegie mellon, then i joined ibm research for three years or so, and i came back because i liked cmu and pittsburgh so much.
8:03 am
>> host: what kind of things do you work on? >> guest: what are called embedded systems, that's the technical term. these are things that embed computers inside them, but are basically different devices. for example, your smartphone is basically kind of a sophisticated device with computers inside it. take a tv, for example. it has a computer inside of it but is meant to be a television; you think of it as basically that device. a toaster that you use on a daily basis that costs $10 has a teeny-weeny computer inside it. so basically these are embedded systems: they embed computers and act smart, but each is really a dedicated device of some other kind for the end user. so i've been working on embedded
8:04 am
systems since my doctoral days, if you will. and it turns out that intelligent vehicles that drive themselves are a classical example of embedded systems. a car is a car that transports people and goods from point a to point b; it just happens to have computers embedded inside it. >> host: how did you get into the business of autonomous vehicles? >> guest: great question. i have been working with general motors, the carmaker, since 2004. general motors has been working with researchers in our department since the year 2000, so when i started working with gm in 2004, i started applying my expertise in embedded systems to automobiles. and then in 2006 darpa, the defense advanced research projects agency, the research arm of the military, announced a competition. the name of the competition was the darpa urban challenge.
8:05 am
the intent was to have vehicles that drive themselves without anybody in the car; they needed to drive for about 60 miles in fewer than six hours in an urban-like setting with other self-driving vehicles as well as human-driven vehicles, following the same rules of the road that you and i have to follow on a daily basis. for that competition gm became our biggest partner and sponsor. we had about 23 other sponsors as well, but gm was the biggest of them. because we already had a very strong working relationship with gm, i became an integral part of the team from cmu that worked on our vehicle, which ended up winning the competition, and a $2 million prize. so that's how i got into automated vehicles. and when our team from carnegie mellon won the competition, gm, who was the biggest sponsor,
8:06 am
said, hey, our team actually sponsored the winning team, and because it's about urban driving, it clearly has implications for the passenger/consumer market segment. they started a second lab on campus focusing exclusively on automated driving, and i've been running that lab as well since its launch. so that's how i got to do automated vehicles and started working with gm. our relationship continues to be very strong and loyal. >> host: so does gm own the technology that you develop? >> guest: so the technology that they sponsor -- and we are very grateful for their support -- is actually owned by carnegie mellon university, and we have some licensing arrangements with gm, but they do have access to parts of the technology. >> and we learned more about the experimental cadillac as we got ready to take it for a drive. >> host: okay. now, i want to start by looking at --
8:07 am
what is this monitor up here on the top? >> guest: so that's one of six laser sensors. that's the one on the forehead of the car. there's one in the bumper here, right there. there's a third lidar on the other bumper as well. there is one behind the side back window of the car, and there's one on the other side, exactly opposite to that. >> host: what are they reading? >> guest: so what they're doing is they are actually sending out multiple laser beams, and when the laser beams hit an object, they bounce back and return to the origin, the transmitter. and because we know the speed of light, you can actually calculate how far away that object is. and because there are multiple beams that create a scanning pattern as well, you can actually get a profile of the obstacle that you just encountered. and because we have lidar all
8:08 am
over the car, the vehicle actually knows what's happening around it in real time. >> host: now, dr. rajkumar, is this car communicating with anything but itself? >> guest: the vehicle is capable of communicating with properly equipped traffic lights, traffic signs and other similar radios. we like to call it a connected and automated vehicle, or cav for short. >> host: we see some cameras inside the car. what are these? >> guest: yes. in addition to the six laser sensors that we talked about, there are three cameras and six radars as well. there are two cameras, one on either side of the rearview mirror inside the cabin. one is actually pointed downwards and looks for lane markers on the road. one of the cameras is pointing upwards and looks at traffic lights
8:09 am
so you know the status of the traffic lights as you're going along. there's a third camera at the back of the vehicle for backing purposes, so you can see what's going on. and there are also six radars. there's actually one behind this cadillac emblem; we replaced the metal logo with a plastic logo so the radar can see through the plastic. there's a radar behind this bumper as well, on top of the lidar that's also behind the bumper. the bumper is made of plastic, so the radar can see through it as well. there's a radar on the other side of the bumper, and there are two on the sides right next to the lidars as well, but you cannot see them from the outside, and you don't see them from the inside either. there's another radar at the back, behind the bumper, also. >> host: is this car seeing 360 degrees? >> guest: it is seeing 360 degrees all the time, as opposed to humans, whose
8:10 am
heads can't turn all the way around. >> host: so, professor rajkumar, when you get in this car, what's different? what's different in the look from a regular cadillac? >> guest: so we tried to make this car look normal on the outside and the inside. on the inside it pretty much operates and looks like a normal car, just like when you pick up a rental car at the airport: you pick up the keys, the layout looks slightly different, but you're still able to drive. it's basically the same thing here; you can bring your keys, get in the vehicle manually and start driving. meanwhile, if you actually look at the dashboard, there are only two things that have changed. there is a button on the dashboard; it's an emergency stop. and there's a button behind the stick shift. think of it as the autonomy equivalent of the cruise control engage button. so you would actually engage
8:11 am
this button to go into autonomous driving mode. so what you need to do is rotate the switch and then pull it upwards to get into an autonomous driving mode, and that is a very conscious action on the driver's part. >> host: is this car licensed to drive on the streets of pittsburgh? >> guest: the laws in pennsylvania allow a vehicle to drive itself as long as two -- [inaudible] are satisfied. number one, a licensed human driver is in the driver's seat and, number two, a human can take over control at any point in time. so under those two conditions, vehicles can drive themselves on public roads. >> host: but this could drive as a normal car as well, correct? >> guest: yes, of course. to disengage the autonomy, one has to reflexively push this thing down. but if you did not actually have the time to do that, you could
8:12 am
grab hold of the steering wheel and turn it, or you could press the brake pedal or the gas pedal, and the vehicle will still respond. it waits for the human to take over. and this button here is strictly for emergency purposes, because we added a bunch of sensors and computers -- [inaudible] and then, say, something totally unexpected happens, like you start smelling smoke, and you have no idea what's going on. the vehicle's already driving; you just push that button in, and it mechanically, electronically -- [inaudible] becomes a -- [inaudible] knock on wood, we haven't had to engage that while driving yet. >> host: what's the cost of all the different systems that you've added to this car? >> guest: because many of the sensors, radars and cameras are really one-off units, they tend to be expensive.
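the laser ranging the professor describes earlier -- timing a beam's round trip and using the speed of light to get a distance, then combining many beams at known angles into a profile of the obstacle -- can be sketched in a few lines. this is not cmu's actual software, just the underlying geometry, with a made-up pulse return time:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to the reflecting object: the beam travels out and
    back, so the one-way distance is half the round trip."""
    return C * t_seconds / 2.0

def beam_point(dist_m: float, angle_rad: float) -> tuple[float, float]:
    """With multiple beams at known angles, each return becomes an
    (x, y) point; many points together profile the obstacle."""
    return (dist_m * math.cos(angle_rad), dist_m * math.sin(angle_rad))

# A pulse returning after roughly half a microsecond came from an
# object about 75 meters away -- the far edge of the sensing range
# mentioned in the interview.
d = distance_from_round_trip(0.5e-6)
```

with `t = 0.5e-6` seconds the computed distance comes out just under 75 meters, matching the stated 75-to-100-meter lidar range.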
8:13 am
so it is an expensive vehicle because of that. but the real point to keep in mind is that when the volumes go up, the costs will go down significantly. our thinking is that when these vehicles are mass-produced, it would add about $5,000 on top of the -- but we think it would be very affordable to many people. >> host: all right. well, you're going to give us a driverless ride. is it truly driverless? is that a fair word to use? >> guest: it is an automated vehicle. it can drive itself under many conditions, but not under all conditions. at least not yet. >> host: okay. >> guest: let's start driving. i guess i'll take you to kind of -- [inaudible] explain a few more things. >> host: and you're driving manually. >> guest: it says manual on the screen. >> host: ah. >> guest: let's point out a few things on the screen.
8:14 am
it turns out that we just interfaced our computers to the same screen, so i can flip back and forth: this is the normal screen of the stock vehicle, and then this is a screen that we added. by flipping, we can go back and forth. >> host: and what are these images we're seeing here? >> guest: so what you see on the screen is the display which actually shows what the vehicle is doing at any point in time, so we as humans can feel comfortable that the vehicle is, indeed, doing the right thing, and a human can take over if something is not quite right. what we see here are some icons, and the icons, for example, say that i can launch, i can stop, i can automatically tell the vehicle to go to the airport or go to work. and this basically lets you zoom
8:15 am
in or zoom out. and what you see on the screen are two blue lines. they represent the lane that the vehicle can drive in -- >> host: now, there are no lines on this parking lot that we're in. >> guest: we have programmed the map of this parking lot, so the vehicle is basically allowed to drive along these lanes, and when we go to the main public roads, basically that -- [inaudible] the map that the gps navigation system has. and then you see a green line there. we have already preprogrammed it to basically take a right out of this parking lot next to where we are, so that's the route that your gps device calculates. >> host: okay. >> guest: gps has a built-in map database, and -- [inaudible] so that route is the green line that you see. you also see a very short line out there. that is basically the car
8:16 am
knowing the green line route, knowing the blue line map, using its sensory data from the lidars, radars and cameras, and basically saying, over the next 15 or 20 meters, this is where i'm going to drive. >> host: did you have to program every route ahead of time, or can it go out onto a street it's never been on before? >> guest: it needs to have a map of the roads -- >> host: a gps map. >> guest: a gps map, and then you need to basically tell the system where you want to go. it uses the map information and your current location to calculate the route, just like a garmin gps device or google maps. so that's the green line. >> host: okay. >> guest: and the red line is basically the vehicle -- >> host: this little red line right here. yes, okay. >> guest: and then, if i zoom back a little bit, you see a bunch of dots on
8:17 am
the screen -- >> host: yeah. >> guest: those dots are the laser points from the laser sensors, appearing in real time. so what you see here is basically a bunch of yellow dots -- >> host: is this that white car? >> guest: that's the white car. and these white -- [inaudible] and dark navy blue -- >> host: dumpster? >> guest: -- that dumpster there. >> host: and all these white dots, these are the trees. okay. >> guest: basically, it is able to sense the environment. we use our eyes and ears; in this case, the lasers, the cameras and the radars act as the eyes and ears of the vehicle. >> host: how far can it see? >> guest: about 75 to 100 meters. but with the connectivity technology that we discussed earlier, it has built-in wireless communication radios that can actually talk to -- that can go as far as 600 meters. >> host: all right.
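the "green line" route calculation described above -- use the map information and the current location to compute a route to the destination, just like a gps unit -- is classically a shortest-path search over a road graph. a minimal sketch using dijkstra's algorithm; the road names and distances below are invented for illustration, not from the actual map database:

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm over a graph given as
    {node: [(neighbor, distance_in_meters), ...]}."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nbr, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    # Walk predecessors back from the goal to recover the route.
    route, node = [goal], goal
    while node != start:
        node = prev[node]
        route.append(node)
    return list(reversed(route))

# Hypothetical road segments (names and meters are made up).
roads = {
    "lot_exit": [("forbes_ave", 200.0)],
    "forbes_ave": [("craig_st", 400.0), ("fifth_ave", 650.0)],
    "craig_st": [("airport_ramp", 900.0)],
    "fifth_ave": [("airport_ramp", 300.0)],
}
print(shortest_route(roads, "lot_exit", "airport_ramp"))
```

the returned node sequence plays the role of the green line on the display; the short red line is then the vehicle's local plan over the next 15 or 20 meters along that route.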
8:18 am
>> guest: so let's do the following; let's engage the vehicle in autonomous mode. the vehicle is actually in park. i'm going to basically engage the autonomous mode -- >> host: while it's in park. >> guest: while it's in park. >> autonomous driving. >> guest: it started driving. >> host: does this ever make you nervous? >> guest: i guess the normal reaction for anybody new at this is, hey, you feel a lot of anxiety -- [laughter] what happens if -- [laughter] raj is not paying attention anymore. let's see how the vehicle does. >> host: okay. dallas, you doing okay back there? >> i think so. >> host: and it turns on its turn signals because you told it where you wanted to go already, correct? >> guest: yes.
8:19 am
>> host: okay, now, all right. >> it saw -- >> host: so you did that, you hit the brakes. >> guest: i did not. >> host: oh, it hit the brakes. and it knows the speed limit. >> guest: yes. it sticks exactly to 25. the legal speed limit is 25; pretty much everybody drives 35 or 40, but the vehicle is a stickler for the rules. [laughter] so basically right now we have a vehicle behind us and, of course, the driver is saying, why are you driving so slow. >> host: uh-huh. it seems to do a little meandering in the lane. is that a fair assessment? >> guest: it could be better, yes. >> host: but it's just reading constantly. >> guest: yes. >> host: now, it's going to make some -- >> guest: this is a curvy, windy road. so i'm not controlling the steering wheel or the brake pedal or the
8:20 am
gas pedal, and it was able to shift the transmission by itself. >> host: i see that. now, that -- [inaudible] >> host: okay. it sensed that? that biker? >> guest: yes, sir. >> host: okay. all right. how far have you come in 30 years? >> guest: we have come a very long way, but there is still some way to go to, basically, completely remove the human from the driving process. >> host: is this vehicle constantly learning? >> guest: this vehicle is not constantly learning by itself. we collect data and use that data to teach the software about new features, new functions. so it is not learning as we drive; it's actually learning after the fact, when we go back. >> host: how did it know there
8:21 am
was a stop sign there? >> guest: the map basically has indications about where the stop line is. it does it all by itself. >> host: that was the car that did that. >> guest: the car did it all by itself, yes. >> host: it wasn't sure of its speed -- >> guest: basically, it moved slightly to the right and saw those parked cars -- >> host: okay. >> guest: so let me -- i do need to get back, so i'll take over manually. i just push this down, in which case the vehicle has gone back to manual mode -- >> host: okay. >> autonomous ready. >> host: and you do that on the fly? >> guest: yes. >> autonomous driving. >> guest: so we can switch back and forth seamlessly, if you want. >> host: and it's seeing all of these things -- >> guest: yes. it's not going to -- >> host: there's a baby. >> guest: this crosswalk is not
8:22 am
on the map, so it doesn't understand -- >> autonomous -- >> guest: these have just been added recently. >> host: so it's not quite ready to be sent out on a road it's never been on before. >> guest: highways we have never been on, we can do that. in urban areas we're a lot more careful because of pedestrians and bicyclists. we do it on highways but not on urban corridors. >> host: okay. bike. [inaudible]
8:23 am
>> guest: we are back on that curve, the curvy, winding road. >> host: can it read signs? >> guest: it can read some signs, yes, but not all signs. it turns out there are thousands of signs; it does not understand all of them. so there you see that red line there. the green is the path that it wants to take. it turns out that the red line is exactly on top of that. >> host: okay. how far have you driven in this car autonomously? >> guest: we have driven a total of about 20,000 miles or so autonomously. not in one continuous stretch. >> host: what's the longest trip you have ever taken? >> guest: we have done a couple of hundred miles on highways.
8:24 am
and a spin-off that i created has been used by delphi to drive from san francisco to new york city, about a 3,500-mile journey, and the vehicle drove itself on highways about 96.8% of the time. so highways are not a problem. >> autonomous ready. >> guest: so you have taken your first ride in an autonomous car. >> host: when will we do this regularly as consumers? >> guest: simple question, basic question; i'll give you a long and complex answer. you can already buy vehicles, for example a tesla with an autopilot feature, where the vehicle can drive itself, but
8:25 am
the human must be paying attention. general motors, next year, will include a similar feature that they call super cruise, where the vehicle can steer itself and apply the brakes and gas pedal as well, and that will be in the cadillac line next year. and many high-end vehicles can already park themselves today. so some of these features are already available on the market, if you will. and then three to five years from now, we expect that vehicles will be able to drive themselves, but in well-specified, well-defined, geographically constrained regions. that's called geo-fencing: geographically fenced areas where basically, for example, pedestrians aren't allowed, bicyclists are not allowed, and there is no heavy rain or heavy snow -- california, for example. so those areas can deploy some of the technologies earlier, but
8:26 am
when you ask the question about when the human will not have to drive at all, that basically implies that the technology should be able to drive a vehicle itself from any point a to any point b that you and i as experienced drivers can drive in the u.s. that capability is going to take at least ten years. we have come a very long way over the past couple of decades or so, but there is still quite some ways to go before the human can take himself or herself out of the driver's seat, go into the backseat and take a nap. >> host: have you had your kids or wife with you in the autonomous car? >> guest: sure. we have had many family members go along. we have had many guests like yourself in the car. >> host: i was a little surprised we didn't have to sign a release before we got in. [inaudible] lax, if you will.
8:27 am
that's because we're just researchers. >> host: why are we talking to you about autonomous cars in pittsburgh rather than detroit or silicon valley? >> guest: so great question. it turns out that carnegie mellon is globally well known, with a strong reputation for computer science, engineering as well as robotics. we have an entity called the robotics institute on campus which is internationally recognized and has more than a hundred researchers in it, excluding students, and they're all extremely knowledgeable about the area of robotics, and -- [inaudible] in the field have been built at cmu since the early 1980s. in fact, we at carnegie mellon believe that we are the birthplace of autonomous vehicle technology, dating back to about 1983 or so. a couple of years back, in 2014,
8:28 am
we literally celebrated the 30th birthday of this technology on campus. >> host: you said before we started this interview that computers are simultaneously very intelligent and very stupid. >> guest: yes. so computers are simultaneously very intelligent; they can do things that amaze us, right? they can actually react very quickly, and they can make decisions that to a normal person seem extremely smart. how does it know that it should be driving at this speed, slow down there and so on? they look very intelligent, and they are very intelligent, because they process this 360-degree view of the vehicle with the multiple sensory data streams from lasers, radars and cameras. very intelligent. but at the same time, they are stupid, if you will, because they don't really have common sense. for example, we know that when we fall down or when we
8:29 am
basically touch fire, it hurts, and the next time we won't do it. but computers cannot make that simple inference: hey, i crashed into somebody; next time, don't do that. it'll do the exact same thing unless it is specifically programmed by a human being to do something else. >> guest: this is the 11th generation.
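the geo-fencing idea discussed a little earlier -- allowing autonomous operation only inside well-defined geographic regions -- reduces, at its simplest, to testing whether the vehicle's position lies inside an approved polygon. a minimal sketch using the standard ray-casting point-in-polygon test; the fence coordinates are invented (plain x/y, not real latitude/longitude):

```python
def inside_fence(x, y, polygon):
    """Ray-casting point-in-polygon test over a list of (x, y)
    vertices: cast a horizontal ray to the right and count how
    many polygon edges it crosses. An odd count means inside."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge straddle the ray's height?
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses the ray's height.
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A hypothetical square deployment area.
fence = [(0, 0), (10, 0), (10, 10), (0, 10)]
```

a deployment system would consult a check like this (plus weather and map conditions) before permitting the autonomy to engage.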
8:30 am
it is being run by -- [inaudible] the vehicle on the right is the cadillac that we were able to drive today. that vehicle was created by the project that i lead, with support from general motors and the department of transportation. because of our close working relationship with gm, we are extremely sensitive to the aesthetics of the vehicle, the exterior and interior as well. it looks very normal, something gm would be proud to sell. >> host: how far along are we with this technology? >> guest: the technology has


