Graduate Student Wants University of Colorado to Adopt Campus-Wide Transparent Research Policy Requirements

The student wants the university to model its transparent research policy on the Transparency and Openness Promotion, or TOP, guidelines, created by the Center for Open Science.

(TNS) -- Inspired by a new movement to improve the transparency and reproducibility of research, graduate student John Lurquin wants the University of Colorado to adopt a campus-wide transparent research policy requiring academics to publish data and information about their experiments.

Though reproducibility, or the ability to reproduce the results of an experiment, has always been on the minds of researchers, it's been getting more attention recently, thanks to several studies measuring the reliability of published research, said Lurquin, a doctoral student in the department of psychology and neuroscience and an outgoing student body president.

In 2012, researchers attempted to replicate 53 preclinical cancer drug studies published in top journals and found that they could replicate the results of just six, despite efforts to repeat the original experiments exactly.

Then in 2015, psychologist Brian Nosek published the results of his reproducibility project, in which he asked 270 researchers to repeat 100 published experiments. Nosek's team was able to replicate the findings of just 36 percent of the studies.

Nosek's study caught the eye of editors at National Public Radio, The Atlantic and Discover Magazine, which brought more attention to the issue from outside the research community.

Because many labs receive federal funding, aka taxpayer money, and because new science builds on past results, Lurquin and other academics say these findings are troubling.

"We're conducting all of this research and if the findings aren't valid, then that's a lot of wasted resources," Lurquin said. "It's also really, really bad for science. If I'm a graduate student working on a thesis and I'm trying to build on someone else's research, and I can't first replicate their work, everything I had planned disappears into thin air and I have to start over from scratch."

Incentives

Lurquin and other researchers say the problem is rooted in academia's incentive system.

At universities, including CU, researchers' careers depend on getting published in top journals — hiring, promotion and tenure all rely heavily on the number of times a researcher has had his or her work published and cited.

Academic journals tend to publish positive results, that is, experiments that uncover something new and exciting. Experiments that don't confirm a researcher's hypothesis rarely see the light of day, which is why this so-called publication bias is also often referred to as "the file drawer effect."

No one is accusing the scientific research community of outright fraud, Lurquin said. Instead, experts say researchers make tiny, subconscious decisions that can lead to unreliable results.

They may make subtle changes to an experiment halfway through, or they may analyze data in a way that produces significant results or confirms their hypothesis.

"Journals are looking for flashy findings or something that's going to get press attention — eating chocolate helps you lose weight," Lurquin said. "Which means I'm going to be more likely to only submit articles that produce a flashy finding. I'm incentivized to report the variables that confirm the hypothesis, even though that isn't the whole story... that's just a perfect recipe for irreproducibility."

Solutions

Experts say there are a number of relatively simple solutions to this reproducibility problem. Some journals are already promoting and rewarding transparent practices, such as encouraging researchers to publish their dataset along with the article.

Another idea is for all researchers to pre-register an experiment in an online database, outlining all of the study's variables and the steps they plan to take. In theory, this prevents them from tweaking an experiment or leaving out variables when they publish their findings.

Pre-registration has already shown some promising results, Lurquin pointed out. The National Heart, Lung and Blood Institute began requiring pre-registration in 2000. After the rule was put in place, the share of studies with significant results dropped from 57 percent to 8 percent, according to researchers who studied the impact of pre-registration last year.

Along with pre-registering their experiment, researchers could also publish the results of the study — no matter what the finding — in the same repository, along with the data that supported their work.

Lurquin said this wouldn't be hard to do at CU, which in 2014 launched an open access repository so that all research conducted at CU would be accessible to the public, not just to those who can afford journal subscriptions.

"It isn't just a witch hunt," he said. "Publishing your dataset allows other researchers to grab your data and do more research. If I know I'm going to have to share my dataset, and the whole world is going to look at it, I'm going to take much more time to make sure it's put together well, it's organized. Now I'm being a better researcher."

Guidelines at CU

Lurquin wants CU to model its transparent research policy on the Transparency and Openness Promotion, or TOP, guidelines, created by the Center for Open Science, which was founded by Nosek.

David Mellor, a project manager for the center, said those guidelines were initially written with academic journals in mind. But, he said, he'd love to see the guidelines adopted elsewhere, such as by funding agencies and universities.

He thinks it may take a while for universities to mandate some of the guidelines, such as pre-registration and data-sharing, as conditions for hiring and promotion.

Researchers operate in a competitive environment — for funding, for jobs, for prestige — so they may be hesitant to share their data and experiment designs with the world. But Mellor thinks a good first step is for universities to encourage, but not require, transparency within their research communities.

"They could start out with guidelines to departments saying 'We recommend adding language that supports open data-daring as one of the criteria that we're looking for in new job offers,'" Mellor said. "Really we would want that first step to lead to conversations about 'What does it mean for a university to support transparency? How can we reward that?'"

Technology changes research

So far, Lurquin has had only preliminary meetings about his ideas with Terri Fiez, CU's vice chancellor for research.

Fiez, whose background is in engineering and computer science, said she's not completely convinced by the results of the reproducibility experiments that are getting some attention lately.

"I'm sure we could find other studies that show the opposite, right?" she said.

Fiez said there are any number of reasons why experiments may be hard to repeat, chief among them variables that are hard to control, like the human body.

"Even if you set out to do a (reproducibility) study, there are things you don't necessarily control," she said. "So I don't think there's any ill-intent by anybody that's doing the research."

She also believes that rapid advances in technology have led to better, more accurate research. That's not to say that earlier scientists were wrong, though.

"Researchers are extremely conscientious and they want to do rigorous studies and we continue to learn as mankind evolves," she said. "So you look at the research that was done 40 years ago, we didn't have the same knowledge base, we didn't have the same tools that allowed us to look at things with more depth."

She also believes in the integrity of the peer review process, in which a study is vetted by a team of researchers in that field before publication.

Provost Russ Moore also believes that technological advances are important pieces of the reproducibility conversation.

"The answer to a question back in the day, before there were sophisticated electronics, would be different, perhaps, than an answer to a question with new tools or investigative processes," Moore said.

But technology may also help make research more transparent and valid.

Prior to the internet, there was no good way to publish and share the raw data collected during research, he said.

"Now with the digital world we live in and the ability to store more and more data, we're closer to achieving the reality of having more open inquiry and presentation of data that underlie specific research findings," he said. "That also includes how you design a study and experimental exclusion criteria."

Moore said he's "absolutely" open to the idea of CU forming a committee to study and discuss whether the university should adopt some sort of transparent research policy. And he said he's proud that a graduate student is bringing this issue to the fore.

"As a recipient of federal research dollars, we have an obligation to try to make things as open and transparent as possible," Moore said. "And there's always a quest for having reproducible experiments in an attempt to find scientific truth."

©2016 the Daily Camera (Boulder, Colo.). Distributed by Tribune Content Agency, LLC.