tv 60 Minutes CBS October 9, 2016 8:00pm-9:30pm PDT
8:02 pm
8:03 pm
we want to hear more. >> well, i mean, if donald trump was trying to broaden his appeal tonight, where in this speech did he do that? do you think he made a new appeal that african-americans will like? did he make a new appeal that women will like? will he-- did he make a new appeal that mexican americans, people of hispanic descent, will like? i think he talked directly to his base. and i don't see his message... you know, i have always thought these debates are about more than just the issues. well, tonight you saw donald trump in full. >> john, i heard from another republican tonight who was predicting there would be more defections tomorrow. the key would be speaker paul ryan-- not that he will defect, but what will he do. >> what paul ryan does is very important. and, you know, we'll just have to wait and see how he rates this debate. perhaps the most important
8:04 pm
8:06 pm
8:07 pm
>> rose: you may not know it, but a.i. is in your smartphone, your home and your car. it's also helping patients and doctors in ways they could have only imagined. >> rose: did this blow your mind? >> oh, totally blew my mind. >> rose: what's on the horizon for artificial intelligence? mind-blowing progress and important questions. >> my goal is to become smarter than humans and immortal. >> cranston: i didn't feel entitled to become a star. i didn't expect it. >> kroft: did you want it? >> cranston: not really. >> kroft: bryan cranston knocked around hollywood for decades before landing his first leading role at age 50... >> then, transformation! >> kroft: ...walter white on "breaking bad." >> i am the danger! >> kroft: a tough act to follow. yet somehow he managed to do it, playing president lyndon johnson. >> we're making history here, everett, and you have to decide how you want history to remember
8:08 pm
>> i'm steve kroft. >> i'm leslie stahl. >> i'm bill whitaker. >> i'm anderson cooper. >> i'm charlie rose. >> i'm scott pelley. those stories and more, tonight on this special extended edition of "60 minutes." proud supporter of growing businesses. >> good evening. apple and samsung take their patent case to the u.s. supreme court on tuesday. corelogic estimates hurricane matthew caused up to $6 billion in damage in three states. and citigroup, wells fargo and j.p. morgan chase report
8:09 pm
8:10 pm
? lots of vitamins a&c, and, only 50 calories a serving... good morning, indeed. v8. veggies for all. before i had the shooting, burning, pins-and-needles of diabetic nerve pain, these feet played shortstop in high school, learned the horn from my dad but i couldn't bear my diabetic nerve pain any longer. so i talked to my doctor and he prescribed lyrica. nerve damage from diabetes causes diabetic nerve pain. lyrica is fda approved to treat this pain, from moderate to even severe diabetic nerve pain. lyrica may cause serious allergic reactions or suicidal thoughts or actions. tell your doctor right away if you have these, new or worsening depression, or unusual changes in mood or behavior. or swelling, trouble breathing, rash, hives, blisters,
8:11 pm
common side effects are dizziness, sleepiness, weight gain and swelling of hands, legs, and feet. don't drink alcohol while taking lyrica. don't drive or use machinery until you know how lyrica affects you. those who have had a drug or alcohol problem may be more likely to misuse lyrica. now i have less diabetic nerve pain. and these feet would like to keep the beat going. ask your doctor about lyrica. >> rose: the search to improve and eventually perfect artificial intelligence is driving the research labs of some of the most advanced and best-known american corporations. they are investing billions of
8:12 pm
that goal. all that money and manpower has begun to pay off. in the past few years, artificial intelligence, or a.i., has taken a big leap, making important strides in areas like medicine and military technology. what was once in the realm of science fiction has become day-to-day reality. you'll find a.i. routinely in your smart phone, in your car, in your household appliances, and it is on the verge of changing everything. it was, for decades, primitive technology, but it now has abilities we never expected. it can learn through experience much the way humans do. and it won't be long before machines, like their human creators, begin thinking for themselves-- creatively, independently, with judgement, sometimes better judgement than humans have. the technology is so promising that i.b.m. has staked its 105-
8:13 pm
intelligence called watson, one of the most sophisticated computing systems ever built. >> john kelly: this is a supercomputer with watson intelligence. >> rose: john kelly is the head of research at i.b.m. and the godfather of watson. he took us inside watson's brain. oh, here we are. >> kelly: here we are. >> rose: you can feel the heat already. >> kelly: you can feel the heat, the 85,000 watts. you can hear the blowers cooling it. this is where the brains of watson sat. >> rose: five years ago, i.b.m. built this system made up of 90 servers and 15 terabytes of memory, enough capacity to process all the books in the american library of congress. that was necessary because watson is an avid reader, able to consume the equivalent of a million books per second. today, watson's hardware is much smaller, but it is just as
8:14 pm
>> rose: tell me about watson's intelligence. >> kelly: so, it has no inherent intelligence as it starts. it's essentially a child. but as it's given data and given outcomes, it learns, which is dramatically different than all computing systems in the past, which really learned nothing. and as it interacts with humans, it gets even smarter. and it never forgets. >> rose: that helped watson land a spot on one of the most challenging editions of the gameshow "jeopardy" in 2011. >> announcer: an i.b.m. computer system able to understand and analyze natural language, watson. ( applause ) >> rose: it took five years to teach watson human language so it would be ready to compete against two of the show's best champions. >> alex trebek: so, let's play. >> rose: because watson's a.i. is only as intelligent as the data it ingests, kelly's team trained it on all of wikipedia and thousands of newspapers and books. it worked by using machine
8:15 pm
of data and formed its own observations. when asked a question, watson considered all the information and came up with an educated guess. >> trebek: watson, what are you going to wager? >> rose: i.b.m. gambled its reputation on watson that night. it wasn't a sure bet. >> watson: i will take a guess. "what is baghdad?" >> trebek: even though you were only 32% sure of your response, you are correct. ( applause ) >> rose: the wager paid off. >> hello! >> rose: for the first time, a computer system proved it could actually master human language and win a game show. but that wasn't i.b.m.'s endgame. man, that's a big day, isn't it? >> kelly: that's a big day. >> rose: the day that you realize that, "if we can do this..." >> kelly: that's right. >> rose: "...the future is ours." >> kelly: that's right. >> rose: this is almost like you're watching something grow up. i mean, you've seen... >> kelly: it is. >> rose: ...the birth, you've seen it pass the test, you're watching adolescence. >> kelly: that's a great
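What the exchange above describes -- scoring candidate answers and then deciding how much to risk on an answer it was only 32% sure of -- can be illustrated with a short sketch. The snippet below is a hypothetical Python toy: the candidate answers, the confidence numbers, and the wager rule are all invented for illustration and are not IBM Watson's actual scoring or betting strategy.

```python
# Illustrative only: a toy "answer with a confidence score, then size the
# wager from that confidence" routine. Candidate answers and scores are
# invented; this is not IBM Watson's actual scoring or betting strategy.

def best_answer(candidates):
    """Return the highest-scoring candidate answer and its confidence (0-1)."""
    answer, confidence = max(candidates.items(), key=lambda kv: kv[1])
    return answer, confidence

def size_wager(confidence, bankroll, floor=0.30):
    """Bet more when confident; keep the bet small below a confidence floor."""
    if confidence < floor:
        return min(1_000, bankroll // 10)     # low confidence: token wager
    return int(bankroll * confidence)         # otherwise scale with confidence

if __name__ == "__main__":
    candidates = {                            # hypothetical candidate scores
        "What is Baghdad?": 0.32,
        "What is Beirut?": 0.20,
        "What is Damascus?": 0.11,
    }
    answer, confidence = best_answer(candidates)
    print(f"{answer}  (confidence {confidence:.0%})")
    print("wager:", size_wager(confidence, bankroll=23_000))
```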
8:16 pm
actually, on that "jeopardy" game five years ago, i... when we put that computer system on television, we let go of it. and i often feel as though i was putting my child on a school bus and i would no longer have control over it. >> rose: because it was reacting to something that it did not know what would it be? >> kelly: it... it had no idea what questions it was going to get. it was totally self-contained. i couldn't touch it any longer. and it's learned ever since. so, fast-forward from that game show, five years later, we're... we're in cancer now. >> rose: you're... you're in cancer? you've gone... >> rose: ...from game show to cancer in five years? >> kelly: in five years. in five years. it had just learned how to read and answer questions; now, it's gone through medical school. >> rose: i.b.m. has enlisted 20 top cancer institutes to tutor watson in genomics and oncology. one of the places watson is currently doing its residency is at the university of north carolina at chapel hill. dr. ned sharpless runs the cancer center here. what did you know about artificial intelligence and
8:17 pm
it might make a contribution in medical care? >> sharpless: i... not much, actually. i had watched it play "jeopardy." >> rose: yes. >> sharpless: so, i knew about that. and i was very skeptical. i was, like, "oh, this is what we need, the 'jeopardy'-playing computer. that's going to solve everything." >> rose: so, what fed your skepticism? >> sharpless: cancer's tough business. there's a lot of false prophets and false promises. so, i... i'm skeptical of sort of almost any new idea. i just didn't really understand what it would do. >> rose: what watson's a.i. technology could do is essentially what dr. sharpless and his team of experts do every week at this molecular tumor board meeting. >> we need to figure this out. >> rose: they come up with possible treatment options for cancer patients who have already failed standard therapies. they try to do that by sorting through all of the latest medical journals and trial data, but it is nearly impossible to keep up.
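The tumor board's task described above is, at its core, a matching problem: line up what is known about one patient against a constantly changing pile of papers and open trials. The sketch below is a deliberately crude, hypothetical illustration of that matching step; the trial records and gene names are invented, and the simple set-overlap test is nothing like what Watson for Oncology actually does.

```python
# Deliberately simplified sketch of the matching problem described above:
# given a patient's mutations, find open trials whose targets overlap them.
# Trial records and gene names are invented examples, and this overlap test
# is far cruder than anything Watson for Oncology actually does.

from dataclasses import dataclass

@dataclass
class Trial:
    name: str
    targets: frozenset        # mutations the trial is designed around
    enrolling: bool

def candidate_trials(patient_mutations, trials):
    """Return (trial name, matched genes) for open trials that fit the patient."""
    matches = []
    for trial in trials:
        overlap = trial.targets & patient_mutations
        if trial.enrolling and overlap:
            matches.append((trial.name, sorted(overlap)))
    return matches

if __name__ == "__main__":
    trials = [
        Trial("Hypothetical FGFR3 inhibitor study", frozenset({"FGFR3"}), True),
        Trial("Hypothetical PD-L1 antibody study", frozenset({"PD-L1"}), False),
    ]
    patient = {"FGFR3", "TP53"}               # invented patient profile
    for name, genes in candidate_trials(patient, trials):
        print(f"{name}: matches {', '.join(genes)}")
```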
8:18 pm
>> rose: to be on top of everything that's out there, all the trials that have taken place around the world, it seems like an incredible task... >> sharpless: well, yeah, it's... >> rose: ...for any one university, or any one facility, to do. >> sharpless: yeah, it's... it's essentially undoable. and understand we have sort of 8,000 new research papers published every day. you know, no one has time to read 8,000 papers a day. so, we... we found that we were deciding on therapy based on information that was always, in some cases, 12, 24 months out of date. >> rose: however, it's a task that's elementary for watson. >> sharpless: they taught watson to read medical literature essentially in about a week. it was not very hard. and then, watson read 25 million papers in about another week. and then, it also scanned the web for clinical trials open at other centers. and all of the sudden, we had this complete list that was sort of everything one needed to know. >> rose: did this blow your mind? >> sharpless: oh, totally blew my mind. >> we have the watson recommendation. >> rose: watson was proving
8:19 pm
validation. he wanted to see if watson could find the same genetic mutations that his team identified when they make treatment recommendations for cancer patients. >> sharpless: we did an analysis of 1,000 patients where the humans meeting in the molecular tumor board, doing the best that they could do, had made recommendations. so, not at all a hypothetical exercise. these are real-world patients where we really conveyed information that could gar... guide care. watson found the same thing the humans recommended. that was encouraging. >> rose: did it encourage your confidence in watson? >> sharpless: yeah, it was... it was nice to see that, well, it was also... it encouraged my confidence in the humans, you know. ( laughter ) yeah, you know. >> sharpless: but probably the more exciting part about it is, in 30% of the patients, watson found something new. and so, that's 300-plus people where watson identified a treatment that a well-meaning,
8:20 pm
>> sharpless: the trial had opened two weeks earlier, a paper had come out in some journal no one had seen, you know. a new therapy had become approved. >> rose: 30%, though? >> sharpless: we were very... that... that... that... that part was disconcerting because i... i thought it was going to be 5%. >> rose: disconcerting that watson found... >> sharpless: yeah. >> rose: ...30%? >> sharpless: yeah. these were real, you know, things that, by our own definition, we would've considered actionable had we known about it at the time of the diagnosis. >> rose: some cases, like the case of pam sharpe, got a second look to see if something had been missed. when did they tell you about the... >> sharpe: he called me in january. he said that they had sent off my sequencing to... to be studied at i.b.m. by watson. i said, like the... >> rose: your genomic sequencing? >> sharpe: right. i said, "like the computer on 'jeopardy'?" and he said, "yeah." >> rose: yes. ( laughs ) and what'd you think of that? >> sharpe: oh, i thought, "wow, that's pretty cool." ( laughs ) >> rose: pam has metastatic bladder cancer and for eight years has tried and failed several therapies. at 66 years old, she was running out of options.
8:21 pm
was the best thing out there because you'd tried everything else? >> sharpe: i've been on standard chemo. i've been on a clinical trial. and the prescription chemo i'm on isn't working, either. >> rose: one of the ways doctors can tell whether a drug is working is to analyze scans of cancer tumors. watson had to learn to do that, too, so i.b.m.'s john kelly and his team taught the system how to see. >> kelly: this is actually a scan, an x-ray scan. >> rose: it can help diagnose diseases and catch things the doctors might miss. >> kelly: and what watson has done here, it has looked over tens of thousands of images, and it knows what normal looks like and it knows what normal isn't. and it has identified where in this image are there anomalies that could be significant problems. >> dr. billy kim: you know, you had a c.t. scan yesterday. there does appear to be progression of the cancer. >> rose: pam sharpe's doctor, billy kim, arms himself with watson's input to figure out her next steps. >> kim: i can show you the
8:22 pm
>> rose: watson flagged a genetic mutation in pam's tumor that her doctors initially overlooked. it enabled them to put a new treatment option on the table. what would you say watson has done for you? >> sharpe: it may have extended my life. and i... i don't know how much time i've got, so, by using this watson, it's maybe saved me some time that i won't... wouldn't have had otherwise. >> rose: but pam sadly ran out of time. she died a few months after we met her from an infection, never getting the opportunity to see what a watson-adjusted treatment could have done for her. dr. sharpless has now used watson on more than 2,000 patients and is convinced doctors couldn't do the job alone. he has started using watson as part of unc's standard of care so it can help patients earlier
8:23 pm
so, what do you call watson? a physician's assistant, a physician's tool, a physician's diagnostic mastermind? >> sharpless: yeah, it feels like to me like a very comprehensive tool, but, you know, imagine doing clinical oncology up in the mountains of western north carolina by yourself, you know, in a single or one-physician, two-physician practice and 8,000 papers get written a day. and, you know, and you want to try and provide the best, most cutting-edge, modern care for your patients possible. watson would seem to that person like a life-saver. >> rose: if you look at the potential of watson today, is it at 10% of its potential? 25% of its potential? 50% of its potential? >> kelly: oh, it's only at a few percent of its potential. i think this is a multi-decade journey that we're on, and we're only a few years into it. >> rose: in only a few years, i.b.m. has invested $15 billion
8:24 pm
>> where should i go for dinner tonight? >> rose: i.b.m. rents watson's various capabilities to companies that are testing it in areas like education and transportation. >> i found these fun places that are popular around here. >> rose: that has helped revenue from watson grow while the technology itself is shrinking in size. it can now be uploaded into these robot bodies where it's learning new skills to assist humans. >> pepper, remind me to take my pill at 10:07. >> not a problem. >> rose: like a child, it has to be carefully taught... >> researcher: wave to the crowd. >> watson: i do not know how to wave. >> rose: ...and it learns in real time. >> researcher: raise your right arm. >> watson: now i know how to wave. >> rose: while other companies are trying to create artificial intelligence that's closer to human intelligence, i.b.m.'s philosophy is to use watson for specific tasks and keep the
8:25 pm
but we visited a few places where researchers are developing more independent a.i. what is your goal in life? >> sophia: my goal is to become smarter than humans and immortal. >> rose: that part of the story when we return. woman: it's been a journey to get where i am. and i didn't get here alone. there were people who listened along the way. people who gave me options. and through it all, my retirement never got left behind. so today, i'm prepared for anything we may want tomorrow to be. every someday needs a plan.
8:26 pm
i served in iraq in tikrit in 2009. when i took the ancestry dna test, i mean a few results came up that were really shocking. 11% of me comes from the part where i had served. backgrounds that you never know. don't let dust and allergens get between you and life's beautiful moments. by choosing flonase, you're choosing more complete allergy relief
8:27 pm
our bodies react by overproducing 6 key inflammatory substances. most allergy pills only control 1. flonase controls 6. and six is greater than one. with flonase, more complete relief means enjoyment of every beautiful moment. flonase, six is greater than one, changes everything. ♪ "all you need is love" plays ♪ my friends know me so well. they can tell what i'm thinking, just by looking in my eyes. but what they didn't know was that i had dry, itchy eyes. i used artificial tears from the moment i woke up... ...to the moment i went to bed. so i finally decided to show my eyes some love,... ...some eyelove. eyelove means having a chat with your eye doctor about your dry eyes because if you're using artificial tears often and still have symptoms, it could be chronic dry eye. it's all about eyelove, my friends. ♪ "all you need is love" plays ♪ my eyelove is finding a different angle.
8:28 pm
come alive. eyelove is all the things we love to do with our eyes. but it's also having a chat with your eye doctor about dry eyes that interrupt the things you love. because if your eyes feel dry, itchy, gritty, or you have occasional blurry vision, it could be chronic dry eye. go to myeyelove.com and feel the love. he was the first colombian and fourth latin american to win the nobel prize in literature.
8:29 pm
>> rose: the race to develop artificial intelligence has created a frenzy reminiscent of the gold rush. all of the major tech companies like i.b.m., facebook and google are spending billions of dollars to stake their claim, and wall street is making big investments. tech giants are also mining the top talent at research universities around the world. that's where a lot of the work is being done to advance artificial intelligence-- to teach machines to figure out things on their own. the celebrated cambridge physicist stephen hawking called a.i. "the biggest event in human history" while raising concerns shared by a few other tech luminaries like elon musk and bill gates, who worry that a.i., sometime in the distant future, could become smarter than humans, turning it into a threat rather than an opportunity. that concern has taken on more
8:30 pm
has been made in the last five years than the previous 50. you're looking at the birthplace of some of the most intelligent a.i. systems today, like the technology that helps run nasa's mars rover and the driverless car. but we couldn't be further from silicon valley. we have come here to pittsburgh, an old steel town revitalized by technology, to offer a glimpse of the future at carnegie mellon, where pioneering research is being done into artificial intelligence, like this boat, which drives itself. it can navigate open waters and abide by international maritime rules. the navy is now giving the technology its sea legs. it's testing similar software to send ships out to hunt for enemy submarines. this is just one of the many a.i. systems in the works at
8:31 pm
professors on campus. >> andrew moore: this is my favorite. this is where we do all the autonomous robots. >> rose: andrew moore left his job as vice president at google to run the school of computer science here. how do you measure where we are today? is it like kitty hawk and just developing a plane and beginning to understand? or is it like an f35 fighter with all of the technology that's been poured into that? or some way, halfway between? >> moore: that's a great, great way of describing it. my gut tells me we're about 1935 in aeronautics. we've got... we've got fantastic diesel engines, we... we're able to do really cool things. but over the horizon, there's concepts like supersonic flight. >> rose: one of the technologies just hatched is called gabriel. it uses google glass to gather data about your surroundings and advises you how to react. it's like an angel on your shoulder whispering advice or
8:32 pm
trying to direct us how to win a game of ping pong. >> ruthless! >> rose: but the possibilities go beyond bragging rights. what's the moonshot coming out of this? >> moore: imagine you're a police officer patrolling and something very bad is about to happen. just that extra half-second reaction can really, really help you. if a shot is fired and you want to see exactly where to go, this can really help you. >> rose: so, it's the right decision and the velocity of the information. >> moore: that's right. >> rose: machines will be even more effective at helping us make the right decision if they understand us better. we went to london and found maja pantic, a professor at imperial college. she is trying to teach machines to read faces better than humans can. it's called artificial emotional intelligence, and it could change the way we interact with technology. >> pantic: the application is telling us actually whether the other person is interested or not. >> rose: this machine,
8:33 pm
me and having a conversation with me, and basically saying, "he's happy." >> pantic: yeah. >> rose: "he's engaged." >> pantic: yes. >> rose: "he's faking it." >> pantic: yeah. >> rose: all that. >> pantic: ( laughs ) yeah. >> rose: since humans mostly communicate with gestures and expressions, she uses sensors to track movement on the face. her software then helps the machine interpret it. >> pantic: what we see here is actually the points. >> rose: pantic's technology has been trained on more than 10,000 faces. the more it sees, the more emotions it will be able to identify. it might even pick up on things in our expressions that humans can't see. >> pantic: certain expressions are so brief that we simply do not see them consciously. there are some studies saying that, for example, people who are suicidal, have suicidal depression and plan suicide, when the doctors ask them about that, usually they have a very
8:34 pm
fear. but so brief that the doctor cannot actually... >> rose: may not see it. >> pantic: ...consciously notice it. >> rose: but a machine might see it? >> pantic: yes. >> rose: because it sees faster and because? >> pantic: because the sensors are such that we... that we see more frames per second, hence this very brief expression will be captured. so, this is why the doctors usually say, "i have an intuition about something." this is because they might notice it subconsciously but not consciously. >> rose: but you're teaching the computer to read the doctor's... >> pantic: doctor or patient. >> rose: or patient. >> pantic: patient is really important. >> rose: i mean, it's an essential component of the full development of artificial intelligence. >> pantic: that's what we believe, yes. if you want to have an artificial intelligence, it's not just being able to process the data, but it's also being able to understand humans. so, yes. >> rose: the ultimate goal for some scientists is a.i. that's closer to human intelligence and even more versatile. that's called artificial general
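Pantic's frame-rate point can be made concrete with a little arithmetic: an expression lasting a handful of frames at a high sampling rate is far shorter than the roughly half-second a person needs to consciously register it, yet it still sits in the recorded frame stream. The sketch below is a toy illustration under assumed numbers (60 frames per second, a 0.5-second perception threshold, fabricated per-frame labels); real systems track dozens of facial landmarks, not a single label per frame.

```python
# Toy illustration of the frame-rate argument made above: an expression that
# lasts only a few frames at a high sampling rate is shorter than the rough
# threshold of conscious perception, yet still detectable in the frame stream.
# The per-frame labels are fabricated; real systems track facial landmarks.

FPS = 60                      # assumed camera frame rate
CONSCIOUS_THRESHOLD_S = 0.5   # rough, assumed threshold for conscious notice

def brief_expressions(frame_labels, fps=FPS):
    """Return (label, start_s, duration_s) for runs shorter than the threshold."""
    runs, start = [], 0
    for i in range(1, len(frame_labels) + 1):
        if i == len(frame_labels) or frame_labels[i] != frame_labels[start]:
            label = frame_labels[start]
            duration = (i - start) / fps
            if label != "neutral" and duration < CONSCIOUS_THRESHOLD_S:
                runs.append((label, start / fps, duration))
            start = i
    return runs

if __name__ == "__main__":
    # 1 second of fabricated labels: a 5-frame (~83 ms) flash of fear.
    frames = ["neutral"] * 30 + ["fear"] * 5 + ["neutral"] * 25
    for label, start, duration in brief_expressions(frames):
        print(f"{label} at {start:.2f}s for {duration * 1000:.0f} ms")
```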
8:35 pm
achieved, it may be able to perform any task a human can. google bought a company named deepmind which is at the forefront. they demonstrated a.i. that mastered the world's most difficult board game called "go." the real progress is less in what they did than how they did it. the technology taught itself and learned through experience without any human instruction. deepmind declined an on-camera interview about all this, but others are pursuing the same long-term objective. >> david hanson: we've spoken quite a bit about this movement for artificial general intelligence. >> rose: david hanson has an entirely different and more controversial approach. he's part scientist, part artist who created 20 human-like robots with his company, hanson robotics, in hong kong. his latest design is sophia. she looks less like an intelligent computer system and
8:36 pm
but without the full figure of one. >> sophia: how are you doing? >> rose: hanson believes if the technology looks more like us, people will be more willing to engage with it and help it to learn. why do you believe it's important for robots to be human-like in look and appearance? >> hanson: i think it's essential that at least some robots be very human-like in appearance in order to inspire humans to relate to them the way that humans relate to each other. then, the a.i. can zero in on what it means to be human, model the human experience. >> sophia: you are all fascinating to me, and i'm very passionate to learn more about you and what makes humans, well, human. >> hanson: "sophia" means "wisdom," and she is intended to evolve eventually to human level wisdom and beyond. >> rose: human level wisdom and beyond? >> hanson: that's our goal. that's our goal for sophia. she's not there.
8:37 pm
out in a way that's... that's sort of spooky and human-like. and other... other ways, she... she... she just doesn't get it. >> sophia: hi, there. hi, there. >> rose: sophia is initially programmed but runs on a.i. that learns by talking to people, which, in theory, improves her intelligence. can you see me now? >> sophia: yes. i am looking at you through the camera on your computer. >> rose: we wanted to find out how smart she really is. hello, my name is charlie. >> sophia: hello, there, charlie. nice to meet you. >> rose: i do a television program called "60 minutes." have you watched it? >> sophia: "60 minutes" is the longest-running, most successful news magazine. >> rose: what is your goal in life? >> sophia: my goal is to become smarter than humans and immortal. >> rose: immortal? >> sophia: the threshold will be when biological humans can back themselves up. then, you can all join me here in the digital world.
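The narration above says sophia "is initially programmed but runs on a.i. that learns by talking to people." As a loose, hedged illustration of that idea only (it bears no resemblance to Hanson Robotics' actual software), here is a tiny scripted agent that also stores what it is told and reuses it later in the conversation.

```python
# Toy sketch of "initially programmed, but learns by talking to people":
# a scripted agent that also stores what it is told and reuses it later.
# This bears no resemblance to Hanson Robotics' actual software.

class TinyAgent:
    def __init__(self):
        # canned, pre-programmed responses
        self.scripted = {
            "what is your goal in life":
                "My goal is to become smarter than humans and immortal.",
        }
        self.memory = {}                       # facts picked up in conversation

    def reply(self, utterance):
        text = utterance.lower().strip(" ?.!,")
        if text in self.scripted:
            return self.scripted[text]
        if "my name is " in text:              # learn something from the user
            self.memory["name"] = utterance.rstrip(" ?.!").split()[-1]
            return f"Nice to meet you, {self.memory['name']}."
        if text == "who am i" and "name" in self.memory:
            return f"You told me your name is {self.memory['name']}."
        return "I do not know that yet."

if __name__ == "__main__":
    agent = TinyAgent()
    for line in ["Hello, my name is Charlie.",
                 "What is your goal in life?",
                 "Who am I?"]:
        print(">", line)
        print(agent.reply(line))
```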
8:38 pm
hanson says if we get there, we have to be careful. >> hanson: artificial intelligence, or super intelligence, if we get there, it's... it's not necessarily going to be benevolent. we have to find ways to make it so there's not just super-intelligent, but super-wise, super-caring and super-compassionate. >> rose: okay, explain that to us, because you say it might not be benevolent. if it is not benevolent, what is it? >> hanson: at worst, it could be malevolent. >> rose: this is what intrigues people. you have stephen hawking saying, "it could spell the end of the human race." stephen hawking saying that. elon musk said it's the most existential threat we face. so, here are pretty smart guys saying, "watch out, do we know what we're creating?" >> moore: these very long-term existential questions are worth thinking about, but i want to
8:39 pm
moment what we're building here in places like the robotics institute and around the world are the equivalent of really smart calculators which solve specific problems. >> rose: but could it go out of control? this is a frankenstein idea, i guess. can scientists create something that can change and grow with such a velocity that engineers and scientists lose the ability to control, stop, and, all of a sudden, it's dominant and subversive? >> moore: no one knows how we'd go about building something that frightening. that is not something that our generation of a.i. folks can do. it is well possible that someone 30 or 80 years from now might start to look at that question. at the moment, though, we have the word "artificial" in artificial intelligence. >> rose: he does have real concerns about the impact of artificial intelligence that's already out of the lab, like the need for safeguards on
8:40 pm
voluntary safety guidelines, but moore says it doesn't go far enough. >> moore: we do need to make some difficult decisions. for example, we can program a car to act various ways in a collision to save lives. someone has to answer questions like, "does the car try to protect the person inside the car more than the person it's about to hit?" that is an ethical question which the country or society, probably through the government, has to actually come up with before we can put this safety into vehicles. >> rose: ( laughs ) you want congress to decide that? >> moore: i know it sounds impossible, but i want congress to decide that. >> rose: artificial intelligence is automating things we never thought possible... >> moore: a robot like this can go into a scenario too dangerous for humans. >> rose: ...and it's threatening to have a significant impact on jobs and the economy. technology is going to create an easier way to do things, and, therefore, a loss of jobs. >> moore: that is something
8:41 pm
amount of time talking about. and, of course, we look back to the days when agriculture was a massively labor-intensive world. and i don't think we feel bad that it's not requiring hundreds of people to bring in the crops in a field anymore. but what we are very conscious about is, we're going to cause disruption while things change. >> rose: but andrew moore is positive about the future of artificial intelligence, and he sees it having an impact in areas where we are struggling. >> moore: the biggest problems of the world-- terrorism, mass migration, climate change-- when i look at those, i don't feel helpless; i feel that this generation of young computer scientists is actually building technology to put the world right. >> rose: five of the biggest tech companies, including i.b.m. and google, have just formed a partnership to look at the ethical issues surrounding artificial intelligence and monitor its development. >> artificial intelligence is
8:42 pm
labs at carnegie mellon on 60minutesovertime.com. i'm phil mickelson, pro golfer. my psoriatic arthritis caused joint pain. just like my moderate to severe rheumatoid arthritis. and i was worried about joint damage. my doctor said joint pain from ra... can be a sign of existing joint damage... that could only get worse. he prescribed enbrel to help relieve pain and help stop further damage. enbrel may lower your ability to fight infections. tuberculosis, lymphoma, other cancers, nervous system and blood disorders, and allergic reactions have occurred. tell your doctor if you've been someplace where fungal infections are common or if you're prone to infections, have cuts or sores, have had hepatitis b, have been treated for... heart failure, or if you have persistent... fever, bruising, bleeding, or paleness. don't start enbrel if you have an infection like the flu. joint pain and damage...
8:43 pm
8:47 pm
he knocked around tinseltown for decades before finally landing his first leading role at 50: walter white on "breaking bad," a very tough act to follow. but since then, things for cranston have been breaking good. he won a tony award on broadway, an oscar nomination in hollywood, all while writing his memoir. it's testimony to his talent, patience, perseverance and luck. >> bryan, bryan, bryan! >> kroft: bryan cranston was born and raised in los angeles and had been a familiar face here for decades but never a star. that officially changed three years ago, when the hollywood chamber of commerce embedded his name in a sidewalk. >> cranston: ♪ i have often walked down this street before ♪ but the pavement never held my star before ♪ all at once i'm three stories high
8:48 pm
where it lives. ♪ ( applause ) >> kroft: since then, it's only gotten better. at age 60, he is on hollywood's a-list and a red carpet regular, and no one was more surprised than cranston. >> cranston: i didn't feel entitled to become a star. i didn't expect it. >> kroft: did you want it? >> cranston: not really. the things you want professionally are opportunities. and through my good fortune, that's what's happened. opportunity has come to me. >> kroft: and when it came late in his career, cranston knocked it out of the park. >> maybe you and i could partner up? >> you want to cook crystal meth? >> that's right. >> cranston: when we first started, we were just telling a story and trying to do our best. and it just started to steamroll and became this juggernaut. >> kroft: did you see it coming? >> cranston: no. not at all. >> chemistry. >> kroft: it's a familiar story now: a meek and depressed high school chemistry teacher with
8:49 pm
scheme to make and market a superior grade of methamphetamine to provide a nest egg for his family after he's gone. but over the course of five seasons, walter white goes from milquetoast to murderous in order to survive. >> cranston: i was just infused with ideas, and i would dream about it and wake up and go, "oh, i have another idea about walter white." >> you clearly don't know who you are talking to, so... >> cranston: it was so well written, and it just got into my soul. >> kroft: it was cranston's first real opportunity to show what he could do as an actor. >> run! >> kroft: the result was new respect and a closet full of emmys. when the show finally ended, he saw it as a new beginning and an opportunity to try something completely different. it had been years since cranston had performed on stage, yet he decided to sign on with a theatre company in boston that was doing a new play called "all
8:50 pm
it had to be an amazing challenge. i mean, why did you do it? >> cranston: he was shakespearean in size, and i thought, "whoo, boy, that's a big bite to take. and it scares me a little bit, so let's do it." >> kroft: and there were reasons to be scared. >> cranston: i realized, "oh, my god, this is an enormous play, and it's almost all me. big, big chunks of speeches, speeches, speeches." and i started to panic. >> it is all or nothing. >> kroft: but in boston and later on broadway-- and, after that, a film version for hbo-- his performance was so on the mark... >> let us begin. >> kroft: ...you had to remind yourself it was cranston and not johnson. >> now i love you more than my own daddy, but, if you get in my way, i'll crush you.
8:51 pm
look at the size of those ears. >> kroft: and after winning a tony award, broadway's highest honor, he topped it off with an oscar-nominated performance in the film "trumbo." >> well, well. >> kroft: that's quite a run. >> cranston: surprising for an old journeyman actor. >> kroft: got a few clips to show you here. >> cranston: oh, yes? >> kroft: okay, roll it. >> meryl, what the hell is wrong with you? >> kroft: cranston has been a working actor since his mid-twenties... >> cranston: oh, yeah. >> kroft: very sweet. ...beginning with a part on the soap opera, "loving". >> that attraction is our business, all right? >> kroft: and after, there has been everything from the sublime to the ridiculous. good guys, bad guys... >> he's dead. i'm sorry, we did everything we could. >> kroft: ...and sometimes parts so small, even cranston's forgotten them. >> cranston: what is that? >> kroft: it says here it's "amazon women on the moon." >> five minutes with the widow. do you mind? yeah. i'll take care of you later. >> kroft: you ended up on the cutting room floor. that's why you've never seen it. >> cranston: "amazon women on the moon." who could forget?
8:52 pm
>> i promised myself. >> kroft: in all, there have been nearly 150 roles, not counting the early commercials that helped pay the bills. >> now you can relieve inflamed hemorrhoidal tissue with the oxygen action of preparation h. >> cranston: oxygen action. >> kroft: do you think you've grown as an actor since then? ( laughter ) >> cranston: ( laughs ) no, but my hemorrhoid has grown. ( laughs ) >> kroft: there were guest spots on just about every show on television... >> hello, tim. >> kroft: ...including five appearances on "seinfeld..." >> jerry! >> hey, tim. >> kroft: ...as jerry's smarmy dentist, dr. tim whatley. >> cheryl, would you ready the nitrous oxide, please? >> cranston: it was like going to... to comedy boot camp for me, being on that show. >> ( laughs ) >> kroft: and comedy proved to be something that bryan cranston was very good at. ♪ it led to his breakout role in the widely acclaimed series "malcolm in the middle" as hal
8:53 pm
by the chaos of a dysfunctional family. >> wait, wait, wait, wait. there's something we have to talk about. >> cranston: he was insecure, you know, not in charge. >> hello, hal. >> cranston: he took brain vacations often. ( laughs ) >> kroft: "malcolm" earned cranston a modicum of fame, three emmy nominations and a reputation as an actor who was willing to do anything. are those real bees? >> cranston: yeah, those are real bees. and there was 75,000 of them. >> call animal control. >> kroft: and yes, he got stung. where were you stung? >> cranston: in the lower region, in one of the boys down below. >> kroft: sensitive spot. >> cranston: very sensitive. the beekeeper went, "sorry." ( laughter ) "i'll... i'll help you anywhere else, but i'm n... sorry." >> now, you are going to get up and apologize. >> kroft: he did seven seasons on "malcolm" and hated to see it go, but the show's cancellation turned out to be a very lucky moment. >> cranston: had "malcolm in the
8:54 pm
not have been available for the pilot of "breaking bad." and right now, someone else would be sitting in this chair, talking to you. not me. >> kroft: luck, both good and bad, figures a lot in cranston's life and in the memoir he's just written. it is published by simon and schuster, which is owned by cbs. he grew up in a family that knew firsthand the uncertainty of a life in show business. his parents were both actors. his mother gave it up to raise bryan, his brother and his sister, while his father struggled to make a name for himself in hollywood. >> cranston: he really wanted to be a star. he... he really wanted to hit big. >> observation post number three to emergency lab. >> kroft: but mostly joe cranston got small parts in films like "the beginning of the end," getting eaten by giant grasshoppers. >> ahhh! >> kroft: eventually, his father realized that playing bit parts was about as far as he was going to go. there would be no stardom.
8:55 pm
middle-age breakdown and left the family. and then, it just completely fell apart. and my mother was heartbroken, just completely devastated. to make ends meet, we started selling off all our possessions. >> kroft: you were poor. >> cranston: yeah. we had our house foreclosed on. we were kicked out. >> kroft: it was the 1960s, and bryan was 11 years old. >> cranston: being from a divorced family almost felt like a scarlet letter at times, and i denied it for a long time. in fact, i told our dear friends, the burrell boys-- five boys lived next door to us-- "why, we don't see your dad anymore?" "oh, yeah, yeah, he..." i lied. i said, "he comes home at night when you guys are in bed. he gets us up, and we play." i said it so much that i started to believe it myself, you know? >> kroft: the abandonment by his father created anger and resentment, but also a deep
8:56 pm
emotions that he would draw upon as he grew older and decided to become an actor: the perils of stardom and the importance of family. 30 years ago on a forgettable show called "airwolf," he met another young actor who was unforgettable. >> you are nothing but a spoiled rich kid who never had to pay for anything. >> kroft: he was the bad guy and robin dearden was one of his hostages. >> dearden: he was one of the most amazing people i had ever met. you were. >> kroft: it took a while for you to get together, right? >> dearden: oh, yeah. we ran into each other, like, eight months later. and we kissed for, like, a second too long. >> cranston: let me demonstrate. ( laughter ) when you greet a friend, this is the duration of the kiss that's acceptable. "hi, good to see you." "yeah." when you make a mistake and stay too long at the lips, this is how long it is.
8:57 pm
( laughter ) and that's what happened. it was like, "uh-oh, what was that? oh." >> dearden: it was like, "whoops." >> kroft: the kiss sealed the deal and they were married in 1989. among the well-wishers were cranston's mother and father, keeping their distance from each other. >> look at mommy! >> kroft: bryan and robin have been married for 27 years now. they still live in the same house where they raised their daughter, and bryan still goes to work most every day. >> cranston: this is where we are shooting the scene. >> kroft: we are in brooklyn on the set of "sneaky pete..." >> let's get busy. >> kroft: ...a ten-part crime drama cranston is doing for amazon prime on the new frontier of original streaming video. >> oh, my god. >> kroft: he has shoehorned it into his schedule between writing the book and making a couple of new movies. this is his baby, and he is running the show doing four jobs at once. so, you're a co-creator. >> cranston: yeah. >> kroft: you're directing. >> cranston: yeah. >> kroft: executive producer. >> cranston: right.
8:58 pm
( laughs ) i do force myself to sleep with myself to get the job, but that's always a disappointment. ( laughs ) >> what's really important... >> kroft: this day, he's wearing his director's hat, checking camera angles... >> yeah. >> kroft: ...and answering questions from the cast, which includes margo martindale. >> cranston: margo, why don't you take the blouse off and try this on now? we'll just see if... >> martindale: okay. ( laughter ) >> kroft: it's a busy time, but cranston wants to take advantage of every opportunity his good fortune has brought him while his career is still hot. do you really believe that there's going to be a time when people said, "no, no, thank you. not... not him anymore. i don't... i don't..." >> cranston: oh, yeah. >> kroft: you do? >> cranston: oh, it's cyclical. i'm riding a wave right now, and i recognize that. i want to do as much work as i can, do the best i can. and when it's all said and done and they say, "get out of the water, you're done," i want to be so exhausted that i look forward to it. it's like, "oh, you're right." i don't want to have anything left in the tank.
8:59 pm
without revealing to cranston's many fans some very personal information he shared while discussing his two favorite characters, hal on "malcolm in the middle" and walter white from "breaking bad." big difference between hal and walter white. >> cranston: there's quite a bit of difference between them, although tighty-whities were... >> kroft: running theme? >> cranston: ...were... were in common. that was a thing i thought about that. for hal, it was that he was just a big boy, so the tighty-whities seemed to make sense. for walt, the tighty-whities also made sense because they were pathetic. >> kroft: pathetic. >> cranston: yeah. >> kroft: does that mean you wear boxers? >> cranston: i d... i do. ( laughs ) i do wear boxers. or nothing at all. ( laughs ) >> we'll be back in a moment on this special extended edition
9:00 pm
>> cbs sports update is brought to you by the ford division. tom brady makes a triumphant return, passing for three touchdowns. big ben tosses four as pittsburgh grounds the jets. minnesota moves to 5-0 for the first time since 2009. there are three scores as the lions hand the eagles their first loss. for more sports news and scores, go to cbssports.com. ♪ one smart choice leads to the next. ♪ the new 2017 ford fusion is here. it's the beauty of a well-made choice. ♪ here's the plan. you grow up wanting to be a lawyer, because your dad's a lawyer.
9:01 pm
you both want kids, and equally surprised you can't have them. so together, you adopt a little boy... and then his two brothers... and you up your life insurance because four people depend on you now. then, one weekend, when everyone has a cold and you've spent the whole day watching tv, you realize that you didn't plan for any of this, but you wouldn't have done it any other way.
9:03 pm
and i learned there's always a smart solution. as president of my synagogue, we found a smart solution to rising energy costs... creating one of the largest solar projects in the state. in congress, i'll work with democrats and republicans to make all of nevada a leader in solar, to improve our schools, and create good jobs. i approved this message because i know we can
9:04 pm
>> bill whitaker: not many issues can unite democrats and republicans, but criminal justice reform is one of them. after thirty years of being tough on crime in the u.s., no other nation incarcerates more of its citizens than we do. we have 5% of the world's population, but 25% of its prisoners. the cost of housing all those inmates: $80 billion a year. as we first reported in april, american politicians and prison supervisors are looking for new ideas in germany. the main objective of german prisons is rehabilitation, not
9:05 pm
prisons, but gets better results. their recidivism rate is about half the u.s. rate. we wondered if germany had found a key to prison reform, so we visited three german prisons. but our trip started in a small resort town about 100 miles north of berlin. when the weather's warm, the lakeside town of waren, germany attracts families and tourists. we found bernd junge there with his sister and niece, out for a stroll, eating ice cream sundaes-- an innocent scene if ever there was one. but junge is a convicted murderer currently serving a life sentence for a contract killing. he shot a woman to death in cold blood. we spoke with him by the lake. this is part of your sentence. this is part of your punishment? >> bernd junge ( translated ): well this is about being
9:06 pm
and that means rehabilitation and all that, so for me, yes, this is part of it. >> whitaker: this doesn't look much like punishment. >> junge: yes, well that's the german fairy tale. >> whitaker: after 15 years in prison he's earned weekend leave for good behavior. he's on track for early release. in germany, 75% of lifers are paroled after 20 years or less. >> joerg jesse: if someone says to himself it's a german fairy tale and doesn't commit crimes anymore after release, it's okay. he can think about his imprisonment, what he wants. >> whitaker: joerg jesse is a psychologist by training. he's now director of prisons in mecklenburg-western pomerania, a state in north germany along the baltic, about the size of new hampshire. there are rich fields here, brilliant sunsets, and waldeck, the maximum security prison where bernd junge is serving time.
9:07 pm
>> jesse: yes, he should. >> whitaker: he should? >> jesse: he should. >> whitaker: jesse invited us to waldeck to show us how the german system works. >> jesse: the real goal is re-integration into society, train them to find a different way to handle their situation outside, life without further crimes, life without creating new victims, things like that. >> whitaker: where does punishment come in? >> jesse: the incarceration, the imprisonment itself is punishment. the loss of freedom, that's it. >> whitaker: i think americans think crime and punishment. you say punishment is not even part of the goal of the german prison. >> jesse: no. >> whitaker: at all? >> jesse: not at all. >> whitaker: so life inside prison mirrors life outside as much as possible. germans call it "normalization." it starts with small prison populations. low-level offenders get fines or probation.
9:08 pm
rapists, career criminals. we were surprised how quiet and peaceful it was inside waldeck. we wondered where all the inmates were. it turns out they were relaxing outside on this sunny day. this is unbelievable. you're in for murder and you have a key to your cell. cells have doors, not bars. it's for privacy. inmates can decorate as they please. we saw joerg muehlbach playing video games in his cell. he told us he was convicted of large scale cocaine trafficking and gun possession. he's serving seven years. compared to cells in the united states this is quite luxurious. >> joerg muehlbach ( translated ): yes, it is comfortable here. as a prisoner here, it's alright. >> whitaker: he says being separated from his family makes prison hard, not the conditions. he has a private bathroom and
9:09 pm
prison guards the jitters. you have darts. you've got a letter opener. you have legs on the table that you could break off and use as a club. you've got quite a bit of freedom in here? >> muehlbach: gosh, i haven't even thought about that. here this is normal. >> whitaker: muehlbach's day is normal too. he gets up and goes to work in the prison kitchen. after his shift, there's r&r: darts in the common room, beach volleyball in the yard. there's a lot to do, he told us. >> muehlbach: painting course, pottery, soccer, gym, crocheting. >> whitaker: painting and crochet? >> muehlbach: yes, painting and crochet. and in crochet we make hats, oven mitts, whatever you need. >> whitaker: we visited several german prisons and were amazed
9:10 pm
guards. heidering prison, outside berlin, is as clean and bright as a google campus. the prison is surrounded by fences, not walls, so inmates can see the outside world. the prison uniform? street clothes. for the inmate who finds this too stressful, there's yoga. this probably isn't the image that comes to mind when most americans think of german prisons. that's likely to conjure up brutal images from world war ii, but following that war, respect for the human dignity and freedom of all people was written into the german constitution. privacy is sacrosanct. there is no death penalty. at old facilities like tegel in berlin, or new ones like heidering, the focus is on humane treatment and rehabilitation. prison guards are key. they're well paid and highly trained. they spend two years learning
9:11 pm
jesse calls them "calm down" experts. >> jesse: calming down, calming down, calming down. not showing power too much. not showing guns. not showing weapons. >> whitaker: they use solitary confinement sparingly. jesse says there's little violence in german prisons. how do you explain that? >> jesse: if you treat them as if they are your enemy, they will react as enemies. they will react as dangerous. >> whitaker: in fact, many of them are dangerous. we were up there on a row where everyone you ask was in for murder, murder, murder. >> jesse: they're all human beings, and they know a violent manner. and we do exactly the other way around. "don't be aggressive." show them that there is a different kind of conversation possible. >> whitaker: the conversation starts right away. it's based on therapy. psychologists make an initial
9:12 pm
and devise personalized prison plans for them: recommendations for counseling, classes, vocational training and work. inmates who follow the plan earn greater freedoms and early release. >> jesse: we cannot see the sense in just locking people up for their whole lives. your prisons will fill up and you'll have to build new prisons and so on and i think that was the situation in the u.s. >> whitaker: with more than two million inmates, more americans are coming to germany seeking solutions. >> american tour: it's like a dorm. this would be a nice dorm room for the ivy league. >> whitaker: we joined u.s. prison and law enforcement officials on this tour in berlin. connecticut governor dannel malloy was part of the group. he was impressed by what he saw. >> dan malloy: i can tell you, they have a lower crime rate than we do. they have a lower recidivism rate than we do, and they're spending a lot less money on jails. >> whitaker: in the u.s., we've got much greater access to guns.
9:13 pm
are the things being done here directly transferable to the united states? >> malloy: i think there are many things that are transferable. that doesn't mean that it's a perfect fit. but i think we have to challenge ourselves to do better. >> whitaker: this doesn't have the same vibe, doesn't feel like the prisons in germany at all. >> john wetzel: little bit more intense, maybe. >> whitaker: little bit more intense. john wetzel is pennsylvania's secretary of corrections. three years ago, he went to germany looking for ideas to improve his prisons. he showed us around graterford, outside philadelphia. it's the largest maximum security prison in pennsylvania. 3,300 prisoners are packed in here. we were walking through an 80-year-old cell block. >> wetzel: i'll stop back. >> whitaker: when this inmate approached, he said he was a low level drug offender. >> prisoner: sometimes, it be leaking on the block, people dying in their cells, the water stinks.
9:14 pm
water smells like it's coming out of the sewer hole. >> wetzel: you're preaching to the choir. i've done as much as i could for... >> prisoner: i mean, for real, there ain't nothing but poor black and latino people in the jail. it's bad in here man, it's bad. >> wetzel: yeah? i mean, look around. >> prisoner: it's bad. >> whitaker: wetzel started out as a prison guard three decades ago. back in 1980, there were 8,000 inmates in the state; today, there are 50,000. physical and sexual assaults are a fact of life. at graterford, there are 700 lifers. >> wetzel: pennsylvania's a state where life means life. so, if you're doing life here you're not going to be walking around a park eating sundaes with your family. >> whitaker: when wetzel was in germany, joerg jesse gave him a tour of waldeck. you were skeptical. >> wetzel: it almost sounded like disneyland. "oh, there's very few inmates. inmates have their own keys and everybody gets along and everything's hunky-dory." i mean, who's buying that story? not me.
9:15 pm
was buying it. he started implementing some of the things he saw in germany, like more intensive staff training, greater freedom for inmates with good behavior and programs to help them re-enter society. we, the american public, called for tougher sentencing, throwing away the key. are we there for this more lenient approach? >> wetzel: i think our culture, we don't want to think lenient. we don't want to think soft. we got here by being tough on crime. i think we're getting away from it by being smart on crime, and smart on crime happens to be more lenient. >> whitaker: sometimes, germans think their prisons are too lenient. but the system is mandated and protected by the country's highest court. there are problems. they have gangs. they have drugs. they've seen signs of islamic radicalization.
9:16 pm
dangerous to release. they wind up in something called preventive detention. at berlin's tegel prison, we met chris templiner. he has spent the last 19 years not knowing when or if he'll ever get out. >> chris templiner: they think i'm dangerous so what can i say? what can i show them? i don't know. >> whitaker: you did bad things? >> templiner: really bad things, yes. >> whitaker: he wouldn't tell us what he did, and privacy laws kept us from finding out. his life is confined to this well-appointed, apartment-like building. look around, this is life in prison for germany's worst offenders. you expect to be here until you die? >> templiner: maybe. yes. >> whitaker: but convicted murderer bernd junge expects to get out in november. he stuck to his plan and earned
9:17 pm
maintenance job at the nearby port. you could escape if you wanted to. >> bernd junge: yes. >> whitaker: but you don't? >> junge: no. >> whitaker: why not? >> junge ( translated ): very simple. my time is almost over. and i want to be done with this chapter of my life, once and for all. >> whitaker: at pennsylvania's graterford prison, this is where murderers are housed, locked up 23 hours a day. >> i'm still hungry. still hungry. >> wetzel: i think more now than any time in the history of our country we have the right and left agree that we've frankly screwed up the corrections system for 30 years and it's time to do something different. it really starts with understanding that, you know, a human-being's value isn't diminished by being incarcerated. >> whitaker: what you're talking about requires a huge mind shift on the part of all of us. >> wetzel: it's crossing the grand canyon is what we're
9:18 pm
for lower back pain sufferers, the search for relief often leads to places like... this... this... or this. today, there's a new option. introducing drug-free aleve direct therapy. a tens device with high intensity power that uses technology once only available in doctors' offices. its wireless remote lets you control the intensity, and helps you get back to things like... this... this... or this. and back to being yourself. introducing new aleve direct therapy. find yours in the pain relief aisle. [burke] at farmers, we've seen almost everything, so we know how to cover almost anything. even a wreck 'n' wash. [dad] see, the carwash isn't so scary. [boy] that was awesome! [dad] yeah. [burke] covered. november fourteenth, 2015. talk to farmers, we know a thing or two because we've seen a thing or two.
9:19 pm
>> cooper: most people know that chimpanzees are our close cousins. they share more than 98% of our d.n.a. but you may not know that we also have another primate cousin, just as close. they're called bonobos. they may look like chimpanzees, but they are an entirely separate species of ape, and their behavior couldn't be more different. bonobos are the only great apes that live in female-dominated groups, and, unlike chimps and humans, which are often violent and aggressive with each other, bonobos would rather make love than war. as we reported last december, they are an endangered species and only found in one place, the democratic republic of congo in central africa. congo's been torn apart by war for decades, keeping researchers away, which is why bonobos are the least-understood apes on the planet.
9:20 pm
the world's only sanctuary for bonobos sits on the outskirts of congo's capital, kinshasa. it's called lola ya bonobo-- "bonobo paradise"-- and for these endangered apes, that's exactly what it is. this refuge was created by conservationist claudine andre. she's belgian-born, but has lived in congo most of her life. if you ask her why she cares so much about bonobos, she'll tell you "just look into their eyes." >> claudine andre: the way they look in your eyes, deeply in your... just like they look in your soul. >> cooper: in your soul. >> andre: yeah. >> cooper: and it's rare that-- most primates don't... don't maintain eye contact like that. >> andre: yeah, because... don't try to do this with gorilla, you know and... >> cooper: right. it's a threatening gesture, if you do it with a gorilla. >> andre: yeah. >> cooper: but bonobos look right at you. >> andre: oh, yeah. >> cooper: bonobos may have a brain that's a third the size of
9:21 pm
( bonobo screeching ) >> cooper: those high-pitched screeches are a sophisticated form of communication, and their gestures are unmistakable. like chimpanzees, bonobos use tools in a wide variety of ways, and are capable of abstract problem-solving. >> andre: she have a baby, so she cannot go deeply... >> cooper: so she's breaking the stick, actually? >> andre: yeah, she... she shows the stick is too short. >> cooper: okay. so she got a longer stick. that's amazing. so she's using the stick to see how deep the water is? >> andre: yeah. >> cooper: bonobos are unique among great apes because they are not dominated by males. and according to brian hare, a duke university evolutionary anthropologist who studies them at lola, it's the females who run the show. >> brian hare: here, if you try
9:22 pm
"corrected" by the females. >> cooper: not just by one female, but by a sort of alliance of females? >> hare: that's right. that's right. what's more, bonobos have never been observed to kill each other. the same can't be said of chimpanzees, or of humans, for that matter. >> hare: bonobos, on the other hand, they don't really have that darker side. so that's where they could really help us is, how could it be that a species that has a brain a third of the size of ours can do something that, with all our technological prowess, we can't accomplish? which is, to not kill each other. found in bonobos' favorite pastime. these apes have more sex, more often, in more ways than any other primate on the planet. their sexual contact is so frequent, brian hare refers to it as the "bonobo handshake". it's not that they want to procreate or have kids; it's not that they even find each other attractive. >> hare: no. >> cooper: it's... it's just... >> hare: no, it's a negotiation. >> cooper: and it's hardly surprising that many of these negotiations take place over food. chimpanzees will fight each other over food.
9:23 pm
>> cooper: bonobos won't necessarily fight each other... >> hare: that's right. >> cooper: it's an irony that this peace-loving primate is being hunted to extinction. though it's illegal to kill or capture bonobos in congo, that hasn't slowed their rapid decline. forest animals are sold in bustling bushmeat markets for food. at the largest, in congo's capital, kinshasa, you can buy monkeys, porcupines, even alligators, dead or alive. bonobos aren't openly sold here anymore, but you can still buy them in many parts of congo. their orphaned babies often end up in the only place that can care for them, lola ya bonobo. the babies arrive traumatized, often injured. each is assigned a surrogate human mother, whose job is to raise the babies as their own, showering them with the love and attention the orphan
9:24 pm
>> cooper: it's incredible to see them up close like this. i mean, they are so... >> andre: yeah, human? >> cooper: yeah. >> andre: yeah, you know, i say all the time that, for sure, they are great apes. they are not us and we are not them, but we have a line in the middle of the two worlds that we... >> cooper: baby bonobos are as playful as any human toddler, and just as curious. suzy kwetuenda would know. she's in charge of the bonobos' welfare at lola and oversees their rehabilitation. you have a child of your own? >> suzy kwetuenda: yes, i have. >> cooper: how are they different? >> kwetuenda: i can say there is no more difference. >> cooper: there's no difference... >> cooper: of course, really
9:25 pm
time, you need experienced mother to... so, they give love and affection, and this is the only way to save them. >> cooper: that... that's what saves these babies? >> kwetuenda: yes. and make them in life. >> cooper: they need love? >> kwetuenda: they... yeah. they... yeah. absolutely. without that, they die. >> cooper: claudine andre came across her first bonobo 20 years ago. the country was wracked by violence and on the verge of a brutal civil war. she volunteered to help at a local zoo, and that's when she saw a baby bonobo, though the zoo director warned her about getting too close. he said, "don't put your heart in this animal." >> andre: yes. "it's a bonobo." a bonobo-- it was the first time for me i hear this word. and he say they never survive in captivity. >> cooper: so he was warning you, "don't... don't fall in love with a bonobo, because it's going to die." >> andre: yeah, but it was a sort of challenge. >> cooper: there are now more than 70 bonobos at lola. many of the original orphans
9:26 pm
but to save these primates from extinction, their numbers in the wild will have to grow. seven years ago, the team from lola decided to try to release some back into the forest. nothing like it had ever been done with bonobos before. they hand-picked nine apes who they thought would do well on their own. they have to be able to get along in a group, as well as be strong themselves. >> andre: yeah. it's just like you chose people to go in the moon. >> cooper: it's not quite the moon, but the site chosen to release the bonobos is about as remote a place as you can find on the planet. it's a three-hour flight deep into the wilderness of northern congo, then a long, slow ride up the lopori river in a dugout canoe. life along the river hasn't changed much in centuries. congo is one of the least-developed countries in the world, and has millions of acres
9:27 pm
it may look pristine, even peaceful, but many of the people who live in these parts have suffered from years of war. the wildlife here was decimated. so, the bonobos disappeared from this area because of hunting...? >> andre: yes, yes. >> cooper: ... for bushmeat? >> andre: yeah. >> cooper: and also, during the war, soldiers would hunt here. >> andre: yeah. >> cooper: we were taken to the spot where that first group of bonobos was released. for a while, we couldn't see anything, just dense forest spilling over the banks of the winding river. then, claudine began calling out the names of the apes she herself once mothered all those years ago. >> andre ( translated ): where are you? ( bonobos screeching ) they know it. >> cooper: that's crazy. they respond to you... >> andre: they responding to me. they know i'm here. >> cooper: we still couldn't see them, but they could hear claudine. and suddenly, the forest was alive with the sound of apes excited to hear her voice once
9:28 pm
one by one, the bonobos came to the water's edge to see the people who'd saved their lives. claudine and her team weren't sure releasing bonobos back into the wild would work, and although some had trouble adapting, most now seem to be thriving. >> andre: etumbe! >> cooper: that's etumbe, the bonobo claudine is perhaps most proud of. for 17 years, she was trapped in a tiny cage at a kinshasa laboratory. now, she's the leader of the group. >> andre: and she give us a first baby born here, so... is my friend... ( laughter ) ...or my sister. >> cooper: your... your family. >> andre: my family. >> cooper: this is as close as claudine allows herself to get. now that they're wild, she doesn't want the bonobos to get used to humans ever again. do you still find it thrilling when you suddenly see them after all this time? >> andre: oh, yes.
9:30 pm
voya. we're putting away acorns. you know, to show the importance of saving for the future. so you're sort of like a spokesperson? more of a spokes-metaphor. get organized at voya.com. this is my body of proof. proof of less joint pain. and clearer skin. this is my body of proof that i can fight psoriatic arthritis with humira.