
Advertising Feature

Responsibly conducting research

This Advertising Feature has been commissioned, edited, and produced by the Science/AAAS Custom Publishing Office

High-profile retractions of papers for falsification, misrepresentation, and dishonest reviews are a blow to science. They add urgency to ongoing campaigns for responsible conduct of research (RCR). RCR is every scientist’s obligation, say researchers who have made RCR part of their scholarship. To promote high-quality science with lasting impact, these experts recommend individual actions and institutional policies that will create a culture of RCR.

Outright scientific fraud is fortunately rare. More pervasive and arguably more damaging to science is hastily conducted, poorly reported, irreproducible research. To combat this problem, prominent organizations have launched campaigns to raise awareness about it, explore its root causes, and promote RCR resources. For example, Science helped develop the Transparency and Openness Promotion (TOP) guidelines. The American Society for Cell Biology initiated the San Francisco Declaration on Research Assessment (DORA). An international collaboration formed the Enhancing the QUAlity and Transparency Of health Research (EQUATOR) Network. And an initiative from The Lancet is fighting waste in biomedical research.

These and other programs address an expanding catalog of RCR issues. The list of topics can seem overwhelming: validation of reagents, secure and transparent data handling, full reporting of studies including negative results, proper assignment of authorship, and open access to publications. But we can meet these challenges, say researchers with years of RCR experience, if we all take steps to promote best research and publication practices. We should start with our own work, then encourage trainees, peers, and our institutions.

Create a responsible culture

Virginia Barbour


It all begins with attitude, says Virginia Barbour, chair of the Committee on Publication Ethics (COPE). "The bottom line," she says, "is that the culture of the group and the institution determines how people conduct their research." From the moment you walk into the lab, she says, practice transparency: "Expect that everything you do is public. Make sure that other people can look at your work and know exactly what you did." Leaders promote transparency by regularly communicating with junior researchers, encouraging open discussions throughout the group, and taking RCR policies seriously. For example, Australia, where Barbour resides, follows the Australian Code for the Responsible Conduct of Research. Once you have that culture in place, says Barbour, layer on specific elements such as data management, recordkeeping, and publication plans.


Preparation is important. "Before you start experiments," says Barbour, "think about how you will manage your data, notebook, images, and analysis software associated with the project." Authorship is one of the biggest challenges that COPE deals with, Barbour says, and should also be addressed at the beginning of a study. Be clear about the contribution of each person considered for authorship and remember to credit junior researchers. "You and your collaborators might not agree in the beginning about who will be authors and their order," she says, "but at least agree on the process of deciding authorship." This initial investment will pay off when you write up results. When protocols and procedures are in place from the beginning, accurate reporting at the end is easier.

Barbour was a founding editor of the open-access journal PLOS Medicine and is now executive officer for the Australasian Open Access Support Group. Researchers understand that open access promotes equity by making results available to all scientists and showing taxpayers the products of public funding, says Barbour. However, early career researchers in particular can feel torn between the demand for publications in certain journals and the open access publication model.

To increase the accessibility of scientific publications, many funding agencies now require articles to be publicly archived, for example in PubMed Central. Australian funders demand at least deposition of the author-accepted manuscript in an institutional repository. Some universities, including Harvard and the University of California San Francisco, have policies and repositories for this purpose. These are all steps toward recognizing and rewarding high-quality science that is clearly documented and can be validated and verified, says Barbour: "Research that is done well and reported well. That's what we should be aiming for."

Keep up with RCR developments

Francis Macrina


Maintaining a culture of responsible research means keeping an eye on evolving RCR issues. Francis Macrina, vice president for research and innovation at Virginia Commonwealth University (VCU), has tracked RCR changes since he started an RCR course in the 1980s, before the National Institutes of Health and other funders began requiring ethics instruction for many trainees. He still teaches the course, and his experience and case studies are collected in his textbook on scientific integrity. Macrina has seen RCR grow to include a focus on data transparency, handling, and storage; verification of cell lines, antibodies, and other reagents; parameters for working with the media; and expanded publication guidelines, including "dual-use" biosecurity reporting for results that might be used for weapons development. Referring to the TOP guidelines, he says, "We'll also probably see these entering practice incrementally."

To get an overview of current RCR requirements and issues in your field, Macrina suggests starting with journal author instructions. COPE and the International Committee of Medical Journal Editors have general publishing guidelines. Discipline-specific information is available from professional organizations such as the Society for Neuroscience and the American Chemical Society.

Macrina's office at VCU oversees industry collaborations, which raise additional RCR considerations. Examples include how long a company can delay manuscript submission for intellectual property review and whether students should work on industry-sponsored projects. "VCU has a corporate-sponsored research policy," Macrina says, so academic scientists thinking about an industry partnership should check with their technology transfer or commercialization center about similar documents. Experts at these centers can provide background, guidance, and advocacy in developing partnership agreements.

Companies want to collaborate with scientists who apply best practices because RCR is critical to the science-based industry, says Christopher J. Roberts, associate director of computational biology at Biogen Idec. When developing a drug or device, he explains, everything has to be absolutely dependable. "If you can't replicate something or get reliable results from a preclinical assay that you'll be running repeatedly," he says, "you'll never get a drug that works in the clinical phase." Companies also understand that peer-reviewed articles are important for the career development of their own scientists and their university collaborators, so many have established publication policies. However, the closer you get to a product, Roberts says, the more constraints you'll find on publishing. This is why, for university collaborations, Roberts says, "We set up legal agreements in advance that spell out intellectual property considerations and a publication strategy and timeline."

Train new scientists in RCR

Semendeferi (left) and Pavlidis


To make a culture of RCR ubiquitous, early career researchers need to be trained in its principles. Scientists have always learned their craft from mentors, but the faster pace and increased complexity of research now demand more formal training in best practices. A comprehensive, multimethod, history-based approach to RCR training is underway at the University of Houston, where Associate Instructional and Research Professor Ioanna Semendeferi of the Department of Physics led development of a three-credit course, supported by the National Science Foundation, with validated evaluation methods. The core principle of the course is that best practices follow when scientists internalize ethical values. "Just knowing RCR rules does not guarantee ethical behavior," says Semendeferi. "Wanting to be ethical is the key."

Ioannis Pavlidis, Computational Physiology Lab director, co-teaches the course and helped develop it. "Semendeferi's approach is [to] lecture with visual elements like movies and documentaries that add emotional richness and cultivate empathy," he says. Course activities include in-class debates, a peer-review exercise supervised by a senior scientist, observing research with animals and human participants, and interpreting science ethics commentaries in national newspapers. The project also organizes a public seminar series on science history and ethics.

Semendeferi and Pavlidis recommend general science ethics courses that mix students from engineering, the humanities, and the social and natural sciences. "Everyone coming together is an education by itself," says Semendeferi. Hearing different viewpoints, she says, makes students aware of the decisions involved in doing ethical science and of their personal responsibility for their work. Stephanie Watts, professor of pharmacology and toxicology and assistant dean of the graduate school at Michigan State University (MSU), has had the same experience in a workshop series she coordinates that takes a practical approach to RCR issues. When students from multiple disciplines and countries hold discussions, Watts says, they often spontaneously raise questions about expectations and norms in other fields and cultures. Examples are honorary authorship, which might be expected in some countries but is against journal guidelines, or an engineer's impulse to precisely duplicate published text about methods, which can result in self-plagiarism. "We recognize that different parts of the world have different rules," says Watts. "So we talk about having conversations at the beginning of a collaboration about data sharing, authorship, and other publication issues. We talk about how plagiarism is stealing someone else's work, and no society allows stealing."

Watts and Pavlidis say teaching RCR has influenced their own research. "Reading and thinking about these issues has made me a better mentor," says Watts. "I talk with lab members often so I know what they're doing. I see raw data from the start to the end of a project and we all interrogate each other in lab meetings about how we got our data and what they mean." Watts says keeping close track of everyone's work is just part of her job. "I tell them it's not because I don't trust them but to make sure we agree on the approach and what we see in the data. It's an RCR issue."

Teaching RCR has also affected Pavlidis's research on methods to measure physiological variables, such as during sleep, exercise, and dexterity tasks. "We handle a lot of data," he says, "so we try to be transparent about it." For a project on measuring drivers' responses under stress, data are posted online as they accumulate. "Open data sharing lets everyone trace our conclusions from A to Z," he says. His group practices team science, cultivating a culture of mutual respect and credit sharing that recognizes both intellectual and technical contributions. Everyone understands the arrangement from the beginning, he says: For a given project, technical contributors get first credit in methods papers, while theoretical publications highlight other team members.

Promote RCR in departments and institutions

Formal courses mean that students, postdocs, and faculty who teach RCR are well versed in current issues in best research practices. For senior faculty educated before RCR training requirements came about, Watts says getting involved in an RCR course is a good introduction and can be fun. For the MSU workshops, Watts recruits colleagues as speakers, and students choose faculty members to be their research integrity consultants—their sounding board for RCR discussions. At VCU, Macrina recruits two or three faculty members per session to facilitate case discussions and gives them a one-hour training session on the basics of the course material.

However, even this time commitment might seem like a burden to overworked faculty. Watts leads her own research group, so she sympathizes with scientists who say they already spend up to 40 percent of their time on regulatory work. To handle the paperwork, she recommends taking advantage of institutional offices that help with Institutional Review Board proposals and radiation safety requirements, for example. But for scientists to stay motivated in the face of increasing regulations, widespread changes in culture and attitudes are needed. When Watts feels overwhelmed by regulations, she tells herself: "It's a privilege to have a lab and I'm lucky to be doing this work supported by taxpayer dollars. You want to do your science right, so your colleagues trust your work and you trust theirs."

Macrina agrees, and he questions the value of data that can't be reproduced. Although every new requirement adds bureaucracy, he says, "It's all about public trust." Trust is critical for the reputation of the researcher, the institution, and science in general. "It sounds like a cliché," he says, "but if we want research to have an impact, people need to trust researchers."

For culture change at the institutional level, we need to align hiring and promotion practices with RCR. Pavlidis and Semendeferi recommend rewarding scientists for mentoring, especially in RCR. Semendeferi says, "Just demanding particular behaviors without eliminating the conditions that lead to unethical practices will not solve the issues." We can all contribute to change, she says: "Individual scientists have the ability and power to make a difference."

The message that paper counts and journal impact factors don't represent true research value is reaching the academy. Macrina warns that moving away from these simple measures will take time. "Determining research quality isn't easy," he says. "Hiring committees have to seek and gather evidence to evaluate each publication, instead of just counting papers in high-impact journals." However, he is encouraged by initiatives like DORA, and by scientific leaders like National Medal of Science awardee Bruce Alberts who publicly criticize using impact factors to assess research productivity. Watts notes that these discussions are already having an effect. "I understand that young scientists feel pressured to move quickly and produce high-impact publications," she says. "But I'm involved in a job search right now and I'm looking for people who do their work with integrity, can finish what they start, and do solid science that others can build on—that's who I want to hire."
