Book TV: Rigor Mortis. CSPAN, July 23, 2017, 7:30am-8:21am EDT
7:30 am
He writes all these books on sports. A lot of these types of books I read when I was growing up, and I love him as an author, so that's sort of the fun reading I'm doing right now. >> I noticed your phone; do you read from your phone? >> I do, and I've got an iPad as well. You need a break from healthcare, tax reform and trade, and to focus on things you can enjoy as well, so I just love it. >> Is that your reading time, back and forth to Texas? >> When I'm not preparing for landing, normally I'm finding some book on the phone to read. >> Kevin Brady, thank you. >> Book TV wants to know what you are reading. Send us your summer reading list via Twitter at @BookTV or Instagram at @book_tv, or post it to our Facebook page, facebook.com/booktv. Book TV on C-SPAN2: television for serious readers.
7:31 am
>> Good evening, everyone. If you want to find a seat, we are going to get started. My name is Candace; I work with some of the events in the store, and on behalf of the owners and staff I want to welcome you. I have a couple of logistics to go over before we get started. If you could make sure your cell phones are on silent; we are recording the event and we have C-SPAN with us, so having no interruptions would be excellent. Our author is going to speak first for about half an hour, and then we're going to open it up to questions. We have one microphone over here next to this pillar; if you could use that, that
7:32 am
would be helpful, so we can catch your question on the recording and we can hear you. And finally, at the end, if you wouldn't mind folding up your chairs and setting them against the shelf, that would be helpful. I'm happy to be welcoming Richard Harris here tonight to discuss his new book, Rigor Mortis: How Sloppy Science Creates Worthless Cures, Crushes Hope, and Wastes Billions. Detailing several shocking cases, and raising the stakes in both dollars and human cost, Harris shows how the science at the country's top research institutions is in urgent need of reform. From drugs being advanced despite their failure in the test phase to dramatic testing errors, this epidemic of sloppy science not only delays true medical breakthroughs but can also be fatal. Ivan Oransky, cofounder of Retraction Watch, says of the book: science remains the best way to build knowledge and
7:33 am
improve health, but as Richard Harris reminds us in Rigor Mortis, it is also carried out by humans subject to perverse incentives. Tapping into these tensions, Harris deftly weaves gripping tales of sleuthing with possible ways out of what some call a crisis. Read this book if you want to see how biomedical research is reviving itself. Richard Harris is a reporter for public radio covering science, medicine, and the environment. He's been reporting on the NPR science desk for 30 years running, with an initial focus on climate change and now a focus on biomedical research. He is a three-time winner of the American Association for the Advancement of Science journalism award and a cofounder of the D.C. Science Writers Association. This is his first book. Please join me in welcoming Richard Harris. [applause] >> Thank you very much, Candace, and thank you for coming out this evening. I do appreciate it. I'll
7:34 am
take a few minutes and talk about the book, but I'm interested in hearing your questions, so I'll try not to talk too long. As Candace mentioned, I've been at NPR for 31 years and have covered everything over those years, and after a long stretch of covering climate change in the 1990s into the 2000s, my boss said, hey, how about coming back to biomedical research? I said that sounds good, but I'd better brush up on what's been going on. So one of the first things I did was look at what was happening with the NIH budget, and I was startled to discover the budget had been undergoing some real gyrations. It actually doubled between 1998 and 2003, which seems like good news for biomedical research, a really nice infusion of cash, but after 2003 the dollars flattened out, and in terms of
7:35 am
spending power they actually decreased by about 20 percent. So I thought, this is not a formula for good things to be happening: you have this burst of money that led to a 50 percent increase in the amount of laboratory space in academic labs for biomedical research, and then basically Congress said, okay, we've done enough, we're going to flatline the budget. That's not a good thing to be happening; there must be some consequences of that. That's the first thing I noticed. Shortly thereafter I came across a paper that had been published in the journal Nature which had looked at what was coming out of some of these academic labs. It was by a researcher at Amgen, Glenn Begley, who was head of cancer research for that drug company, and he went back to review a bunch of exciting studies that he had looked at as they came across the transom, studies that had been published in scientific journals, and he discovered that when they
7:36 am
originally came across his desk, he thought, wow, if any of these are good results, I could get some really good leads for drugs. This is how pharmaceutical companies work by and large these days: they don't do a lot of their own basic research, they rely on academic science. He said, you know, I couldn't get these studies to work the first time we tried to replicate them in our lab, so let's try again. So he went through a second cycle of trying to get these studies to replicate, 53 studies in all, and he even recruited the help of a lot of the scientists who had done the work to begin with. Of those 53, only six managed to replicate, which is what, 11 percent? So pretty grim statistics, and he was raising the alarm, saying a lot of the research coming out of academic labs is not reproducible. He wasn't the only one who had done this; a few months before his study was published, a second study
7:37 am
had been done by researchers at the Bayer company in Germany, and they found only about 25 percent of their studies could be reproduced. This became the germ of something that is now known as the reproducibility crisis. I don't call it a crisis; I think it's a concern, but a crisis implies that it's a new, surprising thing that hadn't been around before, and I think a lot of these problems actually have persisted for quite a while, bubbling along at a low level. I think what has happened is that science has become acutely aware that this is an issue, something to be concerned about and something that needs to be addressed. And I'm not the only person out there thinking about these things. Last year Nature did a survey of scientists asking how many people perceive there's a reproducibility crisis, and more than half of the people who responded to the survey, these were scientists, said yes, there is a significant crisis. Another almost 40 percent said there's a slight crisis. I'm not sure what a slight crisis is, but nonetheless about 90 percent of scientists acknowledged it is a serious issue.
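To make the arithmetic behind those replication figures explicit (the count of six successes is inferred here from the 53-study total and the 11 percent rate quoted in the talk; it is not stated separately):

\[ \frac{6}{53} \approx 0.113 \approx 11\% \]

The Bayer figure quoted above works the same way: roughly one in four of the studies that company revisited could be reproduced.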
7:38 am
So when I set out to write my book, I was aware that I was stepping into a world where there were a lot of people concerned about this, and my first thought was, are people going to want to talk about this? This is uncomfortable; this is not something that people necessarily want to hang their dirty laundry out and talk about. But I was surprised how many scientists said yes, this is a concern, please talk about it, talk to me, and let's talk about what's wrong and what to do about it. So I had incredible access to scientists throughout the United States who said yes, I want to talk about this, and let's see what we can do. So let me say a word about the title, Rigor Mortis. Rigor is the pun. Rigor mortis implies death, and it's fair to say that the rigor in science has not died at all, but I think that it could use a shot in the arm; it definitely needs help. The subtitle is pretty over-the-top, as you'll see when you read the book, but it really does draw the eye to the fact that
7:39 am
these are issues that have an impact on us, as people who are turning to scientists and saying, there's a bunch of health problems out there, please help us understand what causes them and find ways to treat them. There are something like 7,000 diseases out there, and there are treatments for only about 500 of them, and many of those treatments are imperfect, so there's a huge amount of work to be done and there are lots of gaps in the knowledge base here. So I set out, talked to a bunch of people, and heard a lot of different things, not only about the bad news, the stuff in the subtitle, but also a lot of the stuff that is starting to generate some buzz and interest among scientists about how to fix these problems. So by the time you're done with the book you're not totally depressed, but you have a sense that there are people who are engaged in and care about these issues. Let me talk for just a few minutes about the bad news, and then I'll get to the good news. First of all, what we know is science is hard. We should not expect 100
7:40 am
percent perfection from scientists at all. If everything they publish is absolutely correct, maybe they're not trying hard enough; they're only running over old ground and saying things we already know. There should be failure, there should be a failure rate, absolutely, and that's not the cause for concern. What I focused on, and what some of the scientists I talked to focused on, was the fact that a lot of these errors are avoidable. There are things happening every day in scientific laboratories that are leading people in the wrong direction. We don't have to be doing that, and things can be done better. Not to perfection, nobody's expecting perfection, but you can reduce the number of unforced errors and accelerate the process. My initial thought for the title was Friction, because there is a sort of slowing down of the progress of science, but it's not stopping it altogether, so we could reduce that friction in science if we could reduce these unforced errors.
7:41 am
So what kinds of errors are there? I'll talk briefly about four different categories. One is bad materials and ingredients being used in experiments. The second is bad methods: scientists don't plan their experiments with as much care as they could. Also, scientists make bad assumptions. They use animal models and assume that what they're learning from animals applies directly to human beings, and very often that's not the case; in fact, sometimes it's misleading. And finally, there's a toxic culture, not of the scientists' own making, but of the system that has built up around biomedical research. That's largely caused by this problem of basically decreasing funding available for scientific research, for biomedical research in particular. Scientists are fighting for an ever-shrinking pool of money these days. If you are a scientist who relies on grants from the National Institutes of Health to get your funding, when you
7:42 am
put in a grant proposal, you have a one-in-five chance or less of getting approved. So if you're a scientist and you have to support your lab on the strength of this, the main source of funding for this research, you know you're going to have to write five or six grants on average in order to get funding for one grant, just to keep your lab going. That puts an incredible amount of pressure on these scientists. And there's not much of a backup for any of them; if they don't get the grants, it's possible their labs will shut down. So this is an environment that invites people to get into awkward situations, to put it mildly. Let me talk about just a couple of examples. I mentioned bad materials, and one of the prime examples is cell lines, which scientists use commonly in laboratories. These are cells that grow in plastic dishes and grow perpetually. The first of those was actually featured in the story of Henrietta Lacks, which is a wonderful
7:43 am
book and now also a TV movie that's just out. She was a woman who was diagnosed with cervical cancer, and at Johns Hopkins in 1951 they isolated some cervical cancer cells from her and created the world's first line of immortal cells. This is a useful line of cells to be used in biomedical research, and they are still used today in labs around the world, but this also turned out to be the kudzu of cell lines, if you will, because the cells grow incredibly rapidly, and if you make a small error in your lab, before you know it these HeLa cells spread through all your cultures and take over. Scientists would think they're studying liver cancer, for example, or liver cells, and ultimately, maybe, if we're lucky, they realize that they're actually studying these HeLa cells. This has been a huge problem for decades. Scientists recognized back in the early '70s that
7:44 am
the cells were taking over, and people were concerned; there was a lot of handwringing, and honestly not very much was done. Starting about 15 years ago there were actually some pretty good tests that could rapidly identify whether these cells were in fact HeLa cells or the cells scientists thought they were using, but the tests did not take off; they were not used as widely as they need to be. This is one example, and there are about 450 other examples of cell lines that are misidentified and are used all the time in biomedical research labs. Scientists have these tools to check them out, but they cost money and they're inconvenient, and so these tests don't end up being used as much as they ought to be. So that's one source of problems: these bad cell lines. The second thing I mentioned is bad methods. Scientists sometimes design experiments that don't
7:45 am
really have enough power to tell them what's going on. Some of the classic examples involve the mouse studies of Lou Gehrig's disease. These studies have led to a lot of drug candidates, and all of them have been failures as treatments for ALS. One of the problems is that scientists don't think very clearly about what they need to do, how many mice they need to use. But it's also true that these experiments are quite expensive. To do the experiment right, you might need dozens of mice for your study group and dozens more for your control group, and it could easily cost over $100,000. Many scientists don't have that kind of money, so they say, I'll just do a few mice and I'll call the results a pilot study. And you know, they're constrained by resources, but on the other hand, there have been many occasions where those kinds of results do go on to large-scale clinical trials, and they have actually led to results that have been
7:46 am
disappointing. Something looks promising when you do it in a small number of mice, and then you maybe spend tens of millions of dollars trying to expand it into humans, only to discover it doesn't work. So that's one example of a methodological problem, but it gives you a flavor for what could go wrong. I mentioned bad assumptions, and I think the big assumption about mouse work in particular is that if you study something in mice and it works in mice (you can cure cancer in mice, you can cure strokes in mice, and so on), those findings will translate to human beings. Often they don't. We are not just giant mice, and mice are not just tiny people. But the assumption is, well, they're mammals, we're mammals, it ought to work, and oftentimes scientists say, I don't have anything better; all I can do is use these rodents and hope for the best. I think we should be a little bit more modest in our expectations about what can come out of those studies.
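A minimal sketch of the kind of power calculation the speaker is alluding to for those small mouse studies. The effect size, significance level, and target power below are illustrative assumptions, not figures from the talk, and the statsmodels library is just one convenient way to do the calculation:

```python
# Rough illustration of why small "pilot" mouse studies are underpowered.
# Assumed values: a fairly large effect (Cohen's d = 0.8), alpha = 0.05,
# and a target power of 0.8; none of these numbers come from the talk.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Mice needed per group (treatment and control) to detect the effect reliably.
n_per_group = analysis.solve_power(effect_size=0.8, alpha=0.05, power=0.8)
print(f"Mice needed per group: {n_per_group:.0f}")  # roughly 26 per group

# Conversely, the power you actually get from a small pilot of 8 mice per group.
power_small = analysis.solve_power(effect_size=0.8, alpha=0.05, nobs1=8)
print(f"Power with 8 mice per group: {power_small:.2f}")  # roughly 0.3
```

In other words, under these assumptions a small pilot has well under a 50 percent chance of detecting a real effect, and any positive result it does produce is that much more likely to be a fluke.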
7:47 am
I talked to people who are thinking about ways to make better use of animals, to think more broadly about what you can expect from these animals without just assuming, and crossing your fingers, that what works in mice will work in human beings. So that's one thing as well. I want to say a couple of things about culture, and I'll read from my book, from the passage that I wrote on this topic, to give you a flavor for the prose in the book. It's a little harder to read aloud than what I write for the radio, because this is written for the page rather than the voice, but here we go. This is a passage starting with a comment from Brian Martinson, who is a social scientist. He said people who work in science are working as hard as they can and as long as they can; the hours they're putting in are pushing up against physical limits, and they're working as smart as they can. So if you are already doing all of those things, what else can you do to get ahead, to be the one who crosses the finish line first? All you can do is cut corners; that's the only option left to you. Martinson works at a place
7:48 am
called HealthPartners Institute, a nonprofit research institute in Minnesota. He's got other academic affiliations as well, but he has documented some of this behavior, and he says scientists rarely admit outright misbehavior, but roughly a third admit to less noble practices, such as dropping data that weaken their results, or changing the design, methodology, or results of a study in response to pressure from a funding source. I get why this is going on. Scientists know that if they don't get a result that looks good, they may end up not being able to get their grant renewed, and if they don't get their grant renewed, they might not be able to keep their lab moving forward. So the pressures on scientists are all wrong, unfortunately. They sometimes have to choose between doing what's best for science and doing what's best for their careers and for the people they're supporting, particularly if they are the scientists who are drawing in the money that keeps other people employed. So the incentive system is out of
7:49 am
whack in science, and I talk about that in the book, not only what's wrong but some of the things that could be done to fix it. And let's see, I just want to read a little more about Martinson from my book, because I thought he was a very thoughtful person among the many I spoke to. He said part of the problem comes down to an element of human nature we develop as children and never let go of: our notions of what's right and fair don't form in a vacuum. People look at how other people are treated and what they get away with as a clue to their own behavior. These are his ideas, but I'm paraphrasing. If you perceive you have a fair shot, you're less likely to bend the rules, but he says if you feel the principles of fairness have been violated, you will say, screw it, I'm going to do it too. If scientists perceive they're being treated unfairly, they themselves are more likely to engage in less-than-ideal behavior. That's how it goes, he says.
7:50 am
He was on a recent National Academy of Sciences committee looking at these issues of scientific integrity, and you know, they wrestled with this. This is the first time in about 25 years the academy has gone back to look at these issues, and their perception was that when they did this in 1992, they were focused on the bad apples of science, and there are people who misbehave in science. But as they reconsidered these issues of scientific integrity, the real problem they focused on more broadly was that it's not just bad apples; as one of his colleagues said, let's look at the barrel and the people who make the barrel. It's the environment that scientists have to work in that is pressuring them. I talked to somebody else, Malcolm Macleod, a Scottish scientist, who said nobody wants to go out there and do bad science. We all want to push things ahead and understand what's
7:51 am
going on, but the incentives are just so out of whack in this world that scientists don't end up being able to do what they want to do. The question is, how can we change the system, how can we align these incentives better, so that the science follows what scientists are actually pushing for? There are easy solutions and difficult solutions, so I'll mention a couple of them and then be happy to take questions. One thing that's simple to do is validate your ingredients. These tests for cell lines are now readily available, and the NIH now requires the people it funds to go out and get their cell lines authenticated if they're going to continue to get their federal grant funding. That became effective about a year ago, so we will see if that helps to resolve some of these issues with the bogus cell lines that are quite common out there. A second thing that scientists can do is make sure that their results are more transparent. One of my favorite stories was this
7:52 am
big paper in 2001 where they finally sequenced the human genome and published it. It was huge news; they published it as the cover story of the journal Nature, and as I recall, President Clinton had even gotten involved in announcing it. And in addition to making available all three billion letters of data that represent the human genome, the scientists said, we're going to pick out a dozen or so things we think we discovered when we first looked at the human genome. One of them was the observation that genes had been jumping from bacteria directly into the human genome, a totally surprising finding. Biologically it was curious; nobody expected bacterial genes to do that. So they published this as part of their lineup of exciting discoveries from the human genome, and someone in another lab took a look at this and said, this can't be true, this can't possibly be true. And because the human genome had been published and all the data had
7:53 am
been made publicly available, any other scientist could get into it, reanalyze it, and figure out what was going on. So within a few months this other lab dug in and said, guess what, you made a mistake, this is not actually what's going on, these genes are a misunderstanding, and in very short order this error was corrected. Think about it: again, science is never perfect, but if you can shorten the cycle from the time people make a false discovery to the time people realize it is false, that could help accelerate scientific progress. So one idea these scientists are pursuing is, let's just be more transparent. Let people put out their data, put out the computer code used to analyze it, and share their ingredients when they're asked to. This obviously raises competitive issues, because science is a competitive process and not all scientists are thrilled to put all their stuff out there, but to the extent that it can happen, this is good for several reasons.
7:54 am
Not least of which being that if you know you're going to have to put yourself out there in the public domain, maybe you will check it one more time, maybe run one more experiment, because it's better to find your own errors than to have someone else find them for you. I think transparency is a powerful way to help solve some of these problems. Not the entire solution, but I think it would be helpful. I think there's clearly a need for better scientific training. I was talking to Jon Lorsch, who is the director of one of the institutes at the NIH, and he said in 2014 they went looking for the best courses taught at universities to teach methodology, how to do these experiments properly. They put out a request for information to a bunch of universities and asked, who's got the best course? They got essentially no answers. People learn about these things from their mentors. And while that's the traditional approach to learning, when you think
7:55 am
about it, if your mentor isn't that hot on methodology, you're not going to learn the best methodology, and by definition half of your mentors are going to be below average. So that's an issue that people recognize, and the NIH has funded a number of courses to develop curricula for methodology. It's going to take a while for that rigor to bubble up through the entire world of science, but that's also a very positive effort that can help resolve some of these issues. The hardest problem, and the one that I think people have been scratching their heads about for a long time, is how to rebalance science: how to deal with the fact that there are far more mouths to feed in the world of biomedical research than there are federal dollars to support them. How do you solve that problem? You don't want to push people out of science; you'd rather
7:56 am
have a nice healthy world of science, but if there's that bad an imbalance, you've got to do something. You could pour more money in, which doesn't seem like it's going to happen, at least not the kind of money scientists would need to sustain this entire enterprise, which has grown quite large. And it's hard to push people out of science when they're doing work that could ultimately benefit us, work that we hope will help us understand the basic biology of disease and lead to treatments and even cures. There's been an ongoing discussion about that. I haven't heard a satisfactory solution, but a lot of people are grappling with that question and trying to sort it out. So that's the problem, and maybe you in the audience will have answers, and we can jot them down and send them off to the folks who are thinking hard about these issues. But at any rate, it's not all bad news, though there's a lot of room for improvement, and this comes at a critical
7:57 am
juncture where some of the top people, including the director of the NIH, recognize this is something we can't forget about. We need to address it, and I think there is still the will to do something, and that's the most encouraging thing I found in reporting this book. Thank you for your attention, and I'm happy to take questions. [applause] >> You've identified four types of sloppiness. Is the rate of sloppiness the same in university- and hospital-affiliated biomedical research as in big pharma labs, or does big pharma have a better track record? >> It's an excellent question. Big pharma has a strong profit motive to get things right, and I think
7:58 am
pharma spends a little more time and more money and moves more carefully. By the time an idea comes out of big pharma, it is much more likely to be correct than what comes out of academic research, absolutely. But as anybody in pharma knows, anybody who does experiments, and I talked to Glenn Begley himself, who said 90 percent of what we do fails, it doesn't work out, but we recognize that it doesn't work out. The question is, for the things that do seem to work, do they actually hold up as they move forward? Pharma does have a better track record there than what comes out of academic labs. Universities are starting to realize it might be a good idea to put an extra layer of review in place before things go out the door and they say to the pharma world, here are our great results, go run with them. Places like Johns Hopkins University and UC San Francisco have programs where, if there is a hot result, they take it to an intermediate lab first and
7:59 am
say, let's run it through, check it ourselves, and see if it works. I was talking to the woman at Johns Hopkins who runs this program, and she said about half the time we can get it to work and things go forward. But even at a major institution like Johns Hopkins, a lot of those things don't go any farther than that exciting finding, and to their credit, they say, let's put the brakes on it; we'd rather not waste anybody else's time. And the drug companies are getting more skeptical about what's coming out of universities, so they want more evidence that what they're picking up is going to work, and universities are trying to meet them halfway. >> Thanks very much for a great presentation, and thanks very much for the first question, which is, I agree, a good one; I think mine is sort of an extension of it. You described the basic problem as one where the basic research, the materials and
8:00 am
methodology and culture and so forth, ends up producing unreliable results, and that then leads, or can lead, to trials that are unsuccessful. So it seems as though you're saying the great societal damage comes about because those trials are so expensive and a lot of that expense is incurred needlessly. I'm wondering, first of all, whether that is, as far as you're concerned or as far as you're aware, the basic problem, or is the inverse also true, and is there any way to quantify the inverse? Because of those same causes, you get negative results in a lot of cases and things don't follow along from there, negative results that in at least some fraction of cases would have turned out to be positive. Does that
8:01 am
ever happen, or to what extent does that happen? >> I think that's hard to know: whether a drug would have worked if you had tested it properly, or whether you could have cleared away a false lead. What we do know is that most of that later effort, and these can be $100 million trials or more, often fails. So that's clearly part of the driving force in drug prices; obviously there's a lot more behind that. In terms of the basic science, that's the bottleneck, and pharma says if we could improve the
8:02 am
success rate coming out of those original studies, everyone would benefit. The drugs would be more likely to be beneficial, we would waste less money, and we would have faster progress. >> But to your knowledge no one has ever tried to quantify the inverse, that is, are we losing out because things have been abandoned on the basis of a negative result? >> I think that would be hard; I think it would be very difficult to chase down. The one counterexample, which is sort of an anecdote, was with statin drugs, which were developed into incredibly powerful and successful drugs to treat high cholesterol; probably many people do remarkably well on statin drugs. But what happened was, initially they did those studies in mice and rats, because the basic biology suggested these drugs really ought to work and could make a difference.
8:03 am
They understood how these things worked at a molecular level, and they did these studies in the rodents and they got nothing. They did not work at all. One of the big drug companies said, forget it, it seemed like a good idea but we're moving on. But there was a scientist in Japan who said, I'm not ready to give up yet, I'm going to keep pushing away. He had a colleague down the hall who was doing research on chickens, and he said, can I try my drug on your chickens? The guy said sure. Lo and behold, they worked in chickens. That was my lesson about why you shouldn't trust mice too much: if you assume that because something doesn't work in mice it will never work in chickens or in us, well, in this case it turns out the chickens gave us the answer. >> Thank you. >> Thank you for your question. >> I've been listening to you for years, and it's interesting to see you.
8:04 am
You look much younger than I expected you would be after all that time. [laughing] >> Am I taller or shorter? >> Younger. [laughing] [applause] >> You've been describing, apparently, the good side of pharma. I'm an attorney and a physician, and I work with a law firm that sues various large pharma companies over off-label drug promotion. Pharma sponsors some trials that are small pilot studies, which may not have controls; positive results are always published, and negative results are never published, or are suppressed to the extent possible. And now that we have the Supreme Court saying there's going to be little restriction on off-label promotion, pharma companies are going to be protected by the First Amendment for whatever they say, as long as it's reasonable.
8:05 am
So do you think something should be done about publication? The fact that small trials are accepted for publication for a different indication, then not followed up on, and negative trials are suppressed, means that somebody else may go into the same thing and get the same result, when a negative result would tell people, no, this doesn't work, try something else. >> That is a huge problem in basic research, no question about it. If you get an exciting result you publish it; if you don't, you don't publish it and nobody knows about it. That can end up wasting a huge amount of time. I think in pharma, it's not a perfect system, but people have been concerned about that, and one way to address it is that if you have a drug you're developing, you are supposed to register it at a website, clinicaltrials.gov, and you're supposed to say, here's what I'm planning to do, and I
8:06 am
would expect results at this time, and basically the FDA expects that to be done before you test your drug in people. It is still true that a lot of drug companies, or other people who do those experiments, never bother to publish the results, but at least they're supposed to report their results. It's a difficult situation, but I think you can at least go to that database and see that somebody did this study, the results are not in the literature, and they haven't reported them here on clinicaltrials.gov, which is required by law. That's a red flag. Maybe the company will tell you what happened and maybe it won't, but at least you know there's data out there that should be informative, and that should raise an issue. There are some people talking about applying this registration idea to studies in the laboratory, not just to clinical trials. That's another solution that's been knocking around. Some scientists say that sounds pretty cumbersome, but others
8:07 am
say that could really help resolve some of these methodological problems. If people state in advance what they plan to study before they do the studies, then they can't turn around, which often happens, and say, well, here are my results, I have a new hypothesis; it turns out it wasn't what I was looking for, but I found something surprising. Very frequently those turn out to be false leads, but people present them in the literature as if they had simply changed their hypothesis after the results were known. It's kind of cheating, but scientists do it. If you preregister, you can avoid that particular kind of pitfall. You've put your finger on another difficult problem that is related to that one. >> Thank you. >> Thank you. >> Hi, Richard. >> Greetings. >> So your focus was on biomedicine, and I'm wondering if there are any different
8:08 am
considerations that apply to the physical sciences, or different principles, different rates of failure to replicate? How does it differ? >> I don't think anyone knows the failure rates in the physical sciences. In the physical sciences, as a very general principle, it's easier to measure things. If you're studying animals, measurements will vary widely from one animal to another and from one experiment to another, which makes it hard to get results you can rely on. If you are measuring physical or chemical values, or whatever, you get less of that general noise in the system. But you can still have results other people can't reproduce, no question about it. My favorite counterexample is the search for the Higgs boson using the giant accelerator in Switzerland. They realized, we want to get this right, we don't want to get a result that we can't really trust. So what they did was build two completely different detectors
8:09 am
to look for this thing, based on different technologies and so on. They figured that if both detectors see the same result, then they have basically taken the care up front to validate it as they went along. That's why they had so much confidence when they found it. It cost $4.5 billion, so it's not something you could necessarily do for every single laboratory experiment, but the point is that there are ways, if you think ahead and the stakes are high enough, to apply those kinds of principles to the physical sciences as well. The data really aren't very strong for any field of science about how reproducible anything is, but I focused on biomedicine because it touches so much on our personal lives, and clearly there are a lot of documented problems with that field in particular. Thanks for the question. >> One suggestion that has been brought up, in terms of how you make way for new people, is basically
8:10 am
forced retirement: at 65, they say, out the door. I won't say who, but one of my friends, who is one of the top people in his field, says, I don't know what I'm going to do because I'm turning 65 this year. How that plays out remains to be seen. The other point, and this is a comment, is that the desire to have something be absolutely true has got to be in competition with the funding situation, so that when grants go from funding one out of five to one out of 20, you generate a tremendous amount of competition, to the point where, listen, unless Jesus himself would have taken this medicine and it does nothing less than raise the dead, it doesn't get funded. There's a tremendous pressure to show your best, and I suspect that it's only going to get worse, not better. Even if you are training people,
8:11 am
it's going to be a very, very tough road when one out of 20 grants or one out of 30 grants gets funded; that's not a sustainable system, to have that kind of success rate in funding. Something's got to give, and what exactly that will be, I don't know. >> I'm not that optimistic that a huge new influx of cash will solve this. I also think it's hard to ask people to leave science. NIH has thought about getting people to retire sooner and offering them a sort of incentive grant: if you want to make this your last grant, we will make it easier to get. There are gentler ways than saying, 65, walk, your time is up. But this is a problem, because with this funding crunch the average age of getting your first grant, the classic NIH grant to support an individual laboratory, is now in your mid-40s. So if your first grant comes in your mid-40s, you're not really ready to retire at 65; you've only been at it for, what, 20 years, and you hopefully did a bunch of productive
8:12 am
work before you got your first grant, but that's a really tough situation to be in. But yeah, forced retirement isn't pleasant, and it isn't good for people who are still very productive. Some of us feel like at 65 we'll still be hitting on all cylinders. I understand that. >> Hi, Richard. Congratulations, it's a great book. You really did a great job capturing the history of how this has evolved. One of the questions I keep thinking about: you have rightly shown how complicated the issues are and that there are many issues involved, but one of the things that's maybe most fundamental is the perverse incentives. And I see this all the time; one of the most frustrating things about the perverse incentives is really the publication issues, and I wonder
8:13 am
whether you've thought about, I don't want to say culpability, maybe that's too strong a word, but what is the role of science journalists and publishers, scientific publications, in having set the tone to where the only thing that's interesting to report in science is something that looks like it is groundbreaking? And this isn't just about the highest-tier journals, although they are the worst offenders; almost every journal is now trained this way, because the highest-tier journals sort of set the standard, if you will, and almost all journals have criteria about innovativeness and something surprising, as opposed to simply good solid work, or drawing adequate conclusions from your data, which might not extrapolate to the entire
8:14 am
universe, but being very careful about whether your results are preliminary or whether they have only been shown in a particular experiment. You know, I just can't help but think that the perversion that has come about through journalism and publishers is really what has changed how study sections look at science. >> It's a huge issue for sure, and I do talk about it in the book a fair amount. There's a lot of handwringing about this and not a lot of solutions. Yes, journalists are partly culpable, but so are the scientists themselves. I have a statistic: a couple of researchers in Europe said, let's look at the language in the scientific papers themselves. They picked out words like robust, innovative, novel, unprecedented. They looked at papers between 1974 and 2014, and there was something like a 15,000 percent increase
8:15 am
in the use of those words by the scientists themselves, because -- [inaudible] >> You can't get published unless you do that. >> So I'm wondering if the journals or the publication field are starting to think about how they might change the criteria that they use. Certainly there are checklists now; Nature has a bunch of checklists of things you have to do and report, and there have been some suggestions that maybe replication data should be published. That's kind of silly. The really important thing is to recognize when good science has been done, and to celebrate that. >> That's absolutely true. I talked to the executive editor of a journal who basically says she feels like the universities are putting her in a position she ought not to be
8:16 am
in, because if you're the dean and you can't decide who's a good scientist, you look and say, what did he publish, how many papers did they publish in the top journals? She said, I pick things that are interesting to me, not necessarily the best work or the markers of somebody's career, but the deans are overwhelmed with these applications and decisions, and so they basically farm out that picking; they say, let the journal do it for me. Journalists do the same thing. I don't read every single journal, nothing remotely close to that. I look at the top journals, because my philosophy, the philosophy of many science journalists, is that I will let the world of journals do the filtering for me. If it's a hot paper, someone will have submitted it to Science or Nature. >> That's a really, really important responsibility. >> It is, and the trouble is that many of those papers actually don't stand the test of time. There's another study, I won't go into it right now, but it just came out in mid-February in one of the journals, that basically
8:17 am
said a lot of the really exciting studies that get reported in the media turn out not to pan out. We don't learn about that right away, because we don't read the journals where people post the follow-up results. Remember that study about Parkinson's disease and pesticides? That turned out not to be true. We see the original study; we don't see the follow-up study. So yeah, there -- >> It would be great if journalists or publishers could think about what kind of efforts they could make to help turn this around. >> I think everyone has a role, and that's clearly an important role for them, to think about the ways that part of the world is messed up and needs some improvement. Thank you for your question. >> We will make this the last one. >> Hey, Dad. [laughing] I have a pretty quick one, and I hope this isn't spoiling the ending.
8:18 am
Over the course of talking to all the people you talked to for the book, was there some solution somebody was working on or looking at that struck you as kind of your favorite, as the best one, the kind that sits at the crossroads of actionable and effective in all these things, one that made you think, this is a good place to start? >> I wish there were one sort of magic bullet. The closest thing to one that sticks in my head, for sure, is from talking to Brian Nosek, who is at the University of Virginia and runs the Center for Open Science down there. He's a psychologist, and he said when he got to the University of Virginia and it was time for his tenure review, his chairman said, send me every paper you've written. He said, I've written a hundred papers for you; are you going to read 100 papers? What are you going to do, weigh them? But he did what they asked; they said, we don't care, send us everything you publish. He pulled a stack together, and he realized at that point, he tells
8:19 am
a story of imagining that when he got to the university, the department had instead told him: we don't care about the volume; when it's time for your tenure review, we want you to send us the three best things you've done. We don't care if you publish a jillion things in a bunch of journals; we want to know what you have thought of. Just focus on a couple of really strong ideas and we will judge you by that. I think if people focused much more on that kind of thing, not productivity writ large but what is the smartest thing you've thought about, those are the sorts of things that would push science forward. If more deans could take that up and do that, I think it would make a big difference. It's not an overall solution, but I think if there were a change in mindset, away from thinking about the volume of science and toward thinking about what's the best science, it could make a tremendous difference. That was, I do remember,
8:20 am
a lightbulb moment. Thank you for the final question, Riley. >> And congratulations. [applause] >> Book TV is on Twitter and Facebook, and we want to hear from you. Tweet us, twitter.com/booktv, or post a comment on our Facebook page, facebook.com/booktv. >> And Book TV today is on the campus of UCLA in Los Angeles, and we're talking with professors who are also authors. Brenda Stevenson teaches history here at UCLA. Professor, what courses do you teach? >> Guest: I teach courses on slavery and courses on women, and also on interracial dynamics. >> Host: How long have you been at UCLA? >> Guest: 26 years. >> Host: How has it changed? >> Guest: Tremendously. It's grown; the student population has grown, as have the types of students that we have.