IMPORTANT: This page incorporates Creative Commons licensed content from Wikipedia, whether refactored, modified, abridged, expanded, built upon, or reproduced verbatim.

Scientific misconduct is the violation of the standard codes of scholarly conduct and ethical behavior in professional scientific research. A Lancet review on the handling of scientific misconduct in the Nordic countries provides the following sample definitions:[1] (reproduced in The COPE Report 1999[2])

  • Danish Definition: "Intention(al) or gross negligence leading to fabrication of the scientific message or a false credit or emphasis given to a scientist"
  • Swedish Definition: "Intention(al) distortion of the research process by fabrication of data, text, hypothesis, or methods from another researcher's manuscript form or publication; or distortion of the research process in other ways."

The consequences of scientific misconduct can be severe at a personal level for both perpetrators and any individual who exposes it. In addition there are public health implications attached to the promotion of medical or other interventions based on dubious research findings.

Motivation to commit scientific misconduct

According to David Goodstein of Caltech, there are motivators for scientists to commit misconduct, which are briefly summarised here.[3]

Career pressure
Science is still a very strongly career-driven discipline. Scientists depend on a good reputation to receive ongoing support and funding; and a good reputation relies largely on the publication of high-profile scientific papers. Hence, there is a strong imperative to "publish or perish". Clearly, this may motivate desperate (or fame-hungry) scientists to fabricate results.
To this category may also be added a paranoia that other scientists are close to success in the same experiment, which puts extra pressure on being the first one. This has been suggested as a cause of the fraud of Hwang Woo-Suk.[4] A major source of detection arises when other research teams fail to reproduce the results or obtain different ones.
Even on the rare occasions when scientists do falsify data, they almost never do so with the active intent to introduce false information into the body of scientific knowledge. Rather, they intend to introduce a fact that they believe is true, without going to the trouble and difficulty of actually performing the experiments required.
Ease of fabrication
In many scientific fields, results are often difficult to reproduce accurately, being obscured by noise, artifacts and other extraneous data. That means that even if a scientist does falsify data, they can expect to get away with it, or at least to claim innocence if their results conflict with others in the same field. There are no "scientific police" trained to fight scientific crimes; all investigations are made by experts in science who are amateurs in dealing with criminals. It is relatively easy to cheat.

Forms of scientific misconduct

Forms of scientific misconduct include:

  • fabrication – the publication of deliberately false or misleading research, often subdivided into:
    • Obfuscation – the omission of critical data or results. Example: reporting only positive outcomes and not adverse outcomes.
    • Fabrication – the actual making up of research data and (the intent of) publishing them, sometimes referred to as "drylabbing".[5]
    • Falsification – the manipulation of research data and processes in order to produce or prevent a certain result.[6]
    • Bare assertion – making entirely unsubstantiated claims.

Another form of fabrication is where references are included to give arguments the appearance of widespread acceptance, but are actually fake, and/or do not support the argument.[7]

  • plagiarism – the act of taking credit (or attempting to take credit) for the work of another.[8] A subset is citation plagiarism – willful or negligent failure to appropriately credit other or prior discoverers, so as to give an improper impression of priority. This is also known as "citation amnesia", the "disregard syndrome" and "bibliographic negligence".[9] Arguably, this is the most common type of scientific misconduct. Sometimes it is difficult to tell whether authors intentionally ignored a highly relevant citation or simply lacked knowledge of the prior work. Discovery credit can also be inadvertently reassigned from the original discoverer to a better-known researcher, a special case of the Matthew effect.[10]
  • self-plagiarism – the multiple publication of the same content with different titles and/or in different journals, sometimes also considered misconduct; scientific journals explicitly ask authors not to do this. It is referred to as "salami" publication (i.e., many identical slices) in the jargon of medical journal editors. According to some medical journal editors, this includes publishing the same article in a different language.[11]
  • the violation of ethical standards regarding human and animal experiments – such as the standard that a human subject of the experiment must give informed consent to the experiment.[12]
  • ghostwriting – the phenomenon where someone other than the named author(s) makes a major contribution. Typically, this is done to mask contributions from drug companies. It incorporates plagiarism and has an additional element of financial fraud.
  • Conversely, research misconduct is not limited to omitting authorship; it also includes conferring authorship on those who have not made substantial contributions to the research.[13][14] This is done by senior researchers who muscle their way onto the papers of inexperienced junior researchers,[15] as well as by others who stack authorship in an effort to guarantee publication. This is much harder to prove because of a lack of consistency in defining "authorship" or "substantial contribution".[16][17][18]
  • Misappropriation of data – literally stealing the work and results of others and publishing them so as to make it appear that the author performed all the work from which the data were obtained.

In addition, some academics consider suppression—the failure to publish significant findings due to the results being adverse to the interests of the researcher or his/her sponsor(s)—to be a form of misconduct as well; this is discussed below.

In some cases, scientific misconduct may also constitute a violation of the law. Being accused of the activities described in this article is a serious matter for a practicing scientist, with severe consequences should it be determined that a researcher intentionally or carelessly engaged in misconduct.

Three percent of the 3,475 research institutions that report to the US Department of Health and Human Services' Office of Research Integrity indicate some form of scientific misconduct (source: Wired Magazine, March 2004). However, the ORI will only investigate allegations of impropriety where the research was funded by federal grants. It routinely monitors such research publications for red flags.

Other private organizations, such as the International Committee of Medical Journal Editors (ICMJE), can only police their own members.

The validity of the methods and results of scientific papers is often scrutinized in journal clubs. In this venue, members can decide among themselves, with the help of peers, whether a scientific paper meets ethical standards.

The U.S. National Science Foundation defines three types of research misconduct: fabrication, falsification, and plagiarism.[19][20] Fabrication is making up results and recording or reporting them. Falsification is manipulating research materials, equipment, or processes or changing or omitting data or results such that the research is not accurately represented in the research record. Plagiarism is the appropriation of another person’s ideas, processes, results, or words without giving appropriate credit.

Responsibility of authors and of coauthors

Authors and coauthors of scientific publications have a variety of responsibilities. Contravention of the rules of scientific authorship may lead to a charge of scientific misconduct. All authors, including coauthors, are expected to have made reasonable attempts to check findings submitted to academic journals for publication. Simultaneous submission of scientific findings to more than one journal, or duplicate publication of findings, is usually regarded as misconduct under what is known as the Ingelfinger rule, named after Franz Ingelfinger, editor of the New England Journal of Medicine from 1967 to 1977.[21]

Guest authorship (where there is stated authorship in the absence of involvement, also known as gift authorship) and ghost authorship (where the real author is not listed as an author) are commonly regarded as forms of research misconduct. In some cases coauthors of faked research have been accused of inappropriate behavior or research misconduct for failing to verify reports authored by others or by a commercial sponsor. Examples include the case of Gerald Schatten, who co-authored with Hwang Woo-Suk; the case of Professor Geoffrey Chamberlain, who co-authored fabricated papers with Malcolm Pearce;[22] and the coauthors of Jan Hendrik Schön at Bell Laboratories. More recent cases include that of Charles Nemeroff, then editor-in-chief of Neuropsychopharmacology, and a well-documented case involving the drug Actonel, known as the Sheffield Actonel affair.

Authors are expected to keep all study data for later examination even after publication. The failure to keep data may be regarded as misconduct. Some scientific journals require that authors provide information to allow readers to determine whether the authors might have commercial or non-commercial conflicts of interest. Authors are also commonly required to provide information about ethical aspects of research, particularly where research involves human or animal participants or use of biological material. Provision of incorrect information to journals may be regarded as misconduct. Financial pressures on universities have encouraged this type of misconduct. The majority of recent cases of alleged misconduct involving undisclosed conflicts of interest or failure of the authors to have seen scientific data involve collaborative research between scientists and biotechnology companies (Nemeroff, Blumsohn).

Responsibilities of Research Institutions

In many countries there is no regulator to oversee the investigation of allegations of research misconduct. For example, in the UK, unlike the United States, even the acquisition of funds on the basis of fraudulent data is not a criminal offence. Universities therefore have no incentive, and every disincentive, to investigate allegations or to act on the findings of such investigations if they vindicate the allegation. In one notable example, King's College London performed an internal investigation which showed the research findings of one of its researchers to be 'at best unreliable, and in many cases spurious'.[23] King's College took no action to retract the relevant published research or to prevent further episodes from occurring. It was only 10 years later, when an entirely separate form of misconduct by the same individual was being investigated by the General Medical Council, that the internal report came to light.

This case and another [22] suggest that the execution of scientific misconduct should not be simply considered as an individual acting alone. In both these cases senior academics supported an individual engaging in scientific misconduct. The role of the institution in tolerating and supporting scientific misconduct has probably been underestimated.

Responsibilities of Scientific Colleagues who are "Bystanders"

Some academics believe that scientific colleagues who suspect scientific misconduct should consider taking informal action themselves, or reporting their concerns. (See Gerald Koocher and Patricia Keith-Spiegel, "Peers Nip Misconduct in the Bud", Nature, vol. 466, 22 July 2010; and, with Joan Sieber, Responding to Research Wrongdoing: A User-Friendly Guide, July 2010.) This question is of great importance, since much research suggests that it is very difficult for people to act or come forward when they see unacceptable behavior unless they have help from their organizations. A user-friendly guide, and the existence of a confidential organizational ombudsman, may help people who are uncertain about what to do or afraid of bad consequences for speaking up. (See Mary Rowe, Linda Wilcox and Howard Gadlin, "Dealing with—or Reporting—'Unacceptable' Behavior—with additional thoughts about the 'Bystander Effect'", JIOA, vol. 2, no. 1, pp. 52–62.)

Photo manipulation

In 2006, the Journal of Cell Biology gained publicity for instituting tests to detect photo manipulation in papers being considered for publication.[24] This was in response to the increased use by scientists of image-editing software such as Adobe Photoshop, which facilitates photo manipulation. Since then, more publishers, including the Nature Publishing Group, have instituted similar tests and require authors to minimize and specify the extent of photo manipulation when a manuscript is submitted for publication.

Although the type of manipulation that is allowed can depend greatly on the type of experiment that is presented and also differ from one journal to another, in general the following manipulations are not allowed:

  • splicing together different images to represent a single experiment
  • changing brightness and contrast of only a part of the image
  • any change that conceals information, even when it is considered to be nonspecific, which includes:
    • changing brightness and contrast to leave only the most intense signal
    • using clone tools to hide information
  • showing only a very small part of the photograph so that additional information is not visible

More generally, most journals now urge authors to use photo manipulation with restraint and great care.

Suppression/non-publication of data

A related issue concerns the deliberate suppression, failure to publish, or selective release of the findings of scientific studies. Such cases may not be strictly definable as scientific misconduct as the deliberate falsification of results is not present. However, in such cases the intent may nevertheless be to deliberately deceive. Studies may be suppressed or remain unpublished because the findings are perceived to undermine the commercial, political or other interests of the sponsoring agent or because they fail to support the ideological goals of the researcher. Examples include the failure to publish studies if they demonstrate the harm of a new drug, or truthfully publishing the benefits of a treatment while omitting harmful side-effects.

This is distinguishable from other concepts such as bad science, junk science or pseudoscience where the criticism centres on the methodology or underlying assumptions. It may be possible in some cases to use statistical methods to show that the datasets offered in relation to a given field are incomplete. However this may simply reflect the existence of real-world restrictions on researchers without justifying more sinister conclusions.
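One simple statistical screen of the kind alluded to above is a "caliper test": under unbiased reporting, p-values falling just below and just above a significance threshold should be roughly equally common, so a strong excess just below the threshold suggests selective reporting. The sketch below is a generic illustration with invented p-values, not a tool attributed to any investigation described here:

```python
import math

def binom_sf(k, n, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p), computed from exact terms."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def caliper_test(p_values, threshold=0.05, width=0.01):
    """Compare counts of p-values just below vs. just above a significance
    threshold.  Under no selective publication the two narrow bins should be
    roughly balanced; a strong excess just below the threshold is a red flag.
    Returns (count below, count above, one-sided exact binomial p-value)."""
    below = sum(1 for p in p_values if threshold - width <= p < threshold)
    above = sum(1 for p in p_values if threshold <= p < threshold + width)
    n = below + above
    return below, above, (binom_sf(below, n) if n else 1.0)

# Illustrative (invented) literature: many results land just under 0.05.
reported = [0.041, 0.043, 0.044, 0.046, 0.048, 0.049, 0.049, 0.052, 0.058]
below, above, p = caliper_test(reported)
print(below, above, round(p, 3))  # → 7 2 0.09
```

With so few studies the imbalance is only suggestive, which mirrors the caveat in the text: such a screen flags a pattern for scrutiny but cannot by itself justify a conclusion of misconduct.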

Some cases go beyond the failure to publish complete reports of all findings, with researchers knowingly making false claims based on falsified data. This falls clearly under the definition of scientific misconduct, even if the result was achieved by suppressing data. In the case of Raphael B. Stricker, M.D.,[25] for instance, the U.S. Office of Research Integrity found that the removal of samples from a data set in order to reach a desired conclusion was grounds for debarment from funding.

Consequences for science

The consequences of scientific fraud vary based on the severity of the fraud, the level of notice it receives, and how long it goes undetected. For cases of fabricated evidence, the consequences can be wide-ranging, with others working to confirm (or refute) the false finding, or with research agendas being distorted to address the fraudulent evidence. The Piltdown Man fraud is a case in point: the significance of the bona fide fossils being found was muted for decades because they disagreed with Piltdown Man and the preconceived notions that those faked fossils supported. In addition, the prominent paleontologist Arthur Smith Woodward spent time at Piltdown each year until he died, trying to find more Piltdown Man remains. The misdirection of resources kept others from taking the real fossils more seriously and delayed the reaching of a correct understanding of human evolution. (The Taung Child, which should have been the death knell for the view that the human brain evolved first, was instead treated very critically because of its disagreement with the Piltdown Man evidence.)

In the case of Dr Albert Steinschneider, two decades and tens of millions of research dollars were lost trying to find the elusive link between infant sleep apnea, which Steinschneider said he had observed and recorded in his laboratory, and sudden infant death syndrome (SIDS), of which he claimed it was a precursor. The cover was blown in 1994, 22 years after Steinschneider's 1972 Pediatrics paper claiming such an association,[26] when Waneta Hoyt, the mother of the patients in the paper, was arrested, indicted and convicted on five counts of second-degree manslaughter for the smothering deaths of her five children.[27] The paper, presumably written in an attempt to save infants' lives, was ironically later used as a defense in cases where parents were suspected in the multiple deaths of their own children, in cases of Munchausen syndrome by proxy. The 1972 Pediatrics paper has been cited by 404 papers in the interim and is still listed on PubMed without comment.[28]

Consequences for those who expose misconduct

The potentially severe consequences for individuals who are found to have engaged in misconduct also reflect on the institutions that host or employ them, and on the participants in any peer review process that allowed the publication of questionable research. This means that a range of actors in any case may have a motivation to suppress any evidence or suggestion of misconduct. Persons who expose such cases, commonly called whistleblowers, can find themselves open to retaliation by a number of different means.[22] These negative consequences for exposers of misconduct have driven the development of whistleblower charters, designed to protect those who raise concerns. A whistleblower is almost always alone in the fight, and their career becomes completely dependent on the decision about the alleged misconduct. If the accusations prove false, their career is completely destroyed; but even after a favorable decision, the whistleblower's career can remain in question, since a reputation as a "troublemaker" will prevent many employers from hiring them. There is no international body to which a whistleblower can bring concerns. If a university fails to investigate suspected fraud, or conducts a sham investigation to save its reputation, the whistleblower has no right of appeal. High-profile journals such as Nature and Science usually forward all allegations to the university where the authors are employed, or may do nothing at all.

Exposure of falsified data

With the advance of the internet, there are now several tools available to aid in the detection of plagiarism and multiple publication within the biomedical literature. One tool, developed in 2006 by researchers in Dr. Harold Garner's laboratory at the University of Texas Southwestern Medical Center at Dallas, is Déjà Vu, an open-access database containing several thousand instances of duplicate publication. All of the entries in the database were discovered through the use of the text-mining algorithm eTBLAST, also created in Dr. Garner's laboratory. The creation of Déjà Vu and the subsequent classification of several hundred articles contained therein have ignited much discussion in the scientific community concerning issues such as ethical behavior, journal standards, and copyright. Studies on this database have been published in journals such as Nature and Science, among others.[29][30]
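The core idea behind such duplicate-detection tools can be sketched with word shingling and set overlap. This is a toy illustration of text similarity (not the actual eTBLAST algorithm), and the sample sentences are invented:

```python
def shingles(text, k=3):
    """Set of k-word shingles (overlapping word windows) from a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b, k=3):
    """Jaccard similarity of the two texts' shingle sets, in [0, 1]."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Invented example abstracts: the second lightly reworks the first.
original  = "we measured the binding affinity of the receptor under varying ph"
suspect   = "we measured the binding affinity of the receptor at constant ph"
unrelated = "survey respondents preferred the redesigned interface overall"

print(round(jaccard(original, suspect), 2))    # → 0.5 (high overlap)
print(round(jaccard(original, unrelated), 2))  # → 0.0
```

A production system would add normalization, efficient indexing (e.g., minhashing) and human review, but the ranking principle, flagging pairs of documents whose shingle overlap is far above chance, is the same.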

Other tools which may be used to detect falsified data include error analysis. Measurements generally have a small amount of error, and repeated measurements of the same item will generally result in slight differences in readings. These differences can be analyzed, and follow certain known mathematical and statistical properties. Should a set of data appear to be too faithful to the hypothesis, i.e., the amount of error that would normally be in such measurements does not appear, a conclusion can be drawn that the data may have been forged. Error analysis alone is typically not sufficient to prove that data have been falsified, but it may provide the supporting evidence necessary to confirm suspicions of misconduct.
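A minimal version of such an error-analysis check can be sketched as a Monte Carlo test: given the instrument's known measurement error, simulate many honest replicate sets and ask how often their scatter is as small as the reported one. The replicate values and error size below are invented for illustration; as the text notes, this is a screening heuristic, not proof of misconduct:

```python
import random
import statistics

def too_clean_p_value(data, sigma, trials=20000, seed=0):
    """Monte Carlo check of whether replicate measurements scatter *less*
    than the known measurement error sigma predicts.  Returns the fraction
    of simulated honest datasets whose sample variance is as small as the
    observed one; a tiny value means the scatter is suspiciously low."""
    rng = random.Random(seed)
    observed = statistics.variance(data)
    n = len(data)
    hits = 0
    for _ in range(trials):
        sim = [rng.gauss(0.0, sigma) for _ in range(n)]
        if statistics.variance(sim) <= observed:
            hits += 1
    return hits / trials

# Ten invented "replicates" that agree far more closely than a measurement
# error of sigma = 0.5 would allow.
suspicious = [9.98, 10.01, 10.00, 9.99, 10.02, 10.00, 9.99, 10.01, 10.00, 10.00]
print(too_clean_p_value(suspicious, sigma=0.5))  # prints a value near 0
```

This is essentially the reasoning Fisher applied to Mendel's pea data: results can be "too good" relative to the sampling error the experiment should exhibit.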

Data sharing

Kirby Lee and Lisa Bero suggest, "Although reviewing raw data can be difficult, time-consuming and expensive, having such a policy would hold authors more accountable for the accuracy of their data and potentially reduce scientific fraud or misconduct."[31]

Alleged cases

Below are some cases of alleged scientific misconduct.

See also





  1. Nylenna, Magne; Andersen, Daniel; Dahlquist, Gisela; Sarvas, Matti; Aakvaag, Asbjørn (July 3, 1999). "Handling of scientific dishonesty in the Nordic countries". The Lancet 354. "Definition of dishonesty".
  2. "Coping with fraud" (PDF). The COPE Report 1999: 11–18. Archived from the original on 2007-09-28. Retrieved 2006-09-02. "It is 10 years, to the month, since Stephen Lock ... Reproduced with kind permission of the Editor, The Lancet.".[dead link]
  3. Goodstein, David (January-February 2002). "Scientific misconduct". Academe (AAUP).
  4. Phenomenon of Collective Nobel-Prize Paranoia vs Phenomenon of Nobel-Prize Hunger
  5. Shapiro, M.F. (1992). "Data audit by a regulatory agency: Its effect and implication for others" (PDF). Accountability in Research 2 (3): 219–229. doi:10.1080/08989629208573818. PMID 11653981.
  6. Emmeche, slide 3
  7. Emmeche, slide 5
  8. "Publication Ethics Policies for Medical Journals — The World Association of Medical Editors". Retrieved 2009-07-30.
  9. "Demand Citation Vigilance". The Scientist 16(2):6, January 21, 2002. Retrieved 2009-07-30.
  10. Emmeche, slide 3, who refers to the phenomenon as Dulbecco's law
  11. "Publication Ethics Policies for Medical Journals — The World Association of Medical Editors". Retrieved 2009-07-30.
  12. "Publication Ethics Policies for Medical Journals — The World Association of Medical Editors". Retrieved 2009-07-30.
  14. "Publication Ethics Policies for Medical Journals — The World Association of Medical Editors". Retrieved 2009-07-30.
  15. Kwok LS (September 2005). "The White Bull effect: abusive coauthorship and publication parasitism". J Med Ethics 31 (9): 554–6. doi:10.1136/jme.2004.010553. PMC 1734216. PMID 16131560.
  16. Bates T, Anić A, Marusić M, Marusić A (July 2004). "Authorship criteria and disclosure of contributions: comparison of 3 general medical journals with different author contribution forms". JAMA 292 (1): 86–8. doi:10.1001/jama.292.1.86. PMID 15238595.
  17. Bhopal R, Rankin J, McColl E, et al. (April 1997). "The vexed question of authorship: views of researchers in a British medical faculty". BMJ 314 (7086): 1009–12. PMC 2126416. PMID 9112845.
  18. Wager E (2007). "Do medical journals provide clear and consistent guidelines on authorship?". MedGenMed 9 (3): 16. PMC 2100079. PMID 18092023.
  19. New Research Misconduct Policies, NSF
  20. 45 CFR Part 689 [1]
  21. [2][dead link]
  22. "Lessons from the Pearce affair: handling scientific fraud". BMJ (310): 1547. June 17, 1995. Retrieved 2009-09-08. "in Britain ... brushed under the carpet, and the whistleblower would probably have been hounded out of his or her job.". (requires free registration)
  23. Wilmshurst P. Institutional corruption in medicine (2002). British Medical Journal; 325:1232-5
  24. Nicholas Wade (2006-01-24). "It May Look Authentic; Here's How to Tell It Isn't". New York Times. Retrieved 2010-04-01.
  25. "Nih Guide: Final Findings Of Scientific Misconduct". Retrieved 2009-07-30.
  26. Steinschneider A (October 1972). "Prolonged apnea and the sudden infant death syndrome: clinical and laboratory observations". Pediatrics 50 (4): 646–54. PMID 4342142.
  27. Talan, Jamie; Firstman, Richard (1997). The death of innocents. New York: Bantam Books. ISBN 0-553-10013-0.
  28. "Prolonged apnea and the sudden infant death syndrome" [Pediatrics, 1972] – PubMed result
  29. Errami M, Garner HR (2008-01-23). "A tale of two citations". Nature. doi:10.1038/451397a.
  30. Long TC, Errami M, George AC, Sun Z, Garner HR (2009-03-06). "SCIENTIFIC INTEGRITY: Responding to Possible Plagiarism". Science. doi:10.1126/science.1167408.
  31. Ethics: Increasing accountability Nature (2006) | doi:10.1038/nature05007
  32. Weiss RB, Rifkin RM, Stewart FM, Theriault RL, Williams LA, Herman AA, Beveridge RA. (2000-03-18). "High-dose chemotherapy for high-risk primary breast cancer: an on-site review of the Bezwoda study". The Lancet 355: 999–1003. doi:10.1016/S0140-6736(00)90024-2.
  33. Jo Revill (2006-01-15). "Doctor in drug research row quits NHS post". London: Observer. Retrieved 2009-09-08.
  34. Phil Baty (2006-01-06). "Drugs trial row scientist resigns". Times Higher Education. Retrieved 2009-09-08. "did not indicate that initial inquiries uncovered wrongdoing"
  35. Actonel Case Media Reports - Scientific Misconduct Wiki
  36. Altman, Larry (May 2, 2006). "For Science's Gatekeepers, a Credibility Gap". New York Times. Retrieved 2008-03-26. "Recent disclosures of fraudulent or flawed studies in medical and scientific journals have called into question as never before the merits of their peer-review system. The system is based on journals inviting independent experts to critique submitted manuscripts. The stated aim is to weed out sloppy and bad research, ensuring the integrity of what it has published."
  37. "ScienceWeek". ScienceWeek. 1998-03-20. Retrieved 2009-09-08. "agreed to a legally binding settlement, in which Paquette excludes himself from receiving any federal funding for the next 2 years, while NSF agrees not to issue a finding of scientific misconduct. ... university's chemistry department, however, considers the plagiarism charge insignificant, saying that Paquette's actions "could be considered sloppy, but do not constitute plagiarism by most definitions.""
  38. "C:\Documents and Settings\JEgbert\Local Settings\Temporary Internet Files\Content.IE5\D1I6JYFV\Vol1no3[1].wpd" (PDF). Retrieved 2009-07-30.
  39. "Top Pain Scientist Fabricated Data in Studies, Hospital Says", Wall Street Journal, 11 March 2009.
  40. "Nih Guide: Findings Of Scientific Misconduct". United States Department of Health and Human Services. 2001-12-13. Retrieved 2009-09-08. "NOTICE: NOT-OD-02-020"
  41. Bridget Murray (February 2002). "Research fraud needn't happen at all". Monitor on Psychology (American Psychological Association) 33 (2). Retrieved 2009-09-08.
  42. "Scandal Rocks Scientific Community". Deutsche Welle. 30 September 2002.
  43. Hick JF (July 1973). "Letter to editor: Sudden Infant Death Syndrome and Child Abuse". Pediatrics 52 (1): 147. PMID 4724436.
  44. Steinschneider A (June 1994). "Prolonged apnea and the sudden infant death syndrome: clinical and laboratory observations". Pediatrics 93 (6): 944.
  45. Lucey JF (June 1994). "Woman Confesses in Deaths of Children". Pediatrics 93 (6): 944.
  46. Nature. "Access : Doubts over biochemist's data expose holes in Japanese fraud laws". Nature. Retrieved 2009-07-30.
  47. William T. A. Harrison, Jim Simpson and Matthias Weil (January 2010). "Editorial". Acta Crystallographica Section E 66: e1–e2. doi:10.1107/S1600536809051757. ISSN 1600-5368.
  48. Doreen Walton (8 January 2010). Lancet urges China to tackle scientific fraud. BBC.


External links

