We all like to think of scientists as embodying the ideals of carrying out research with the highest integrity, moral standards and dedication. The overwhelming majority fit within this mould, but there are some who engage in what can only be described as scientific misconduct, even fraud. The question is what happens when there is deliberate misconduct?
The initial answer is to rely on the scientific process itself to weed out bad science. Theories based on spurious or false data will soon die a death as further experiments and observations unravel any misbehaviour. The problem is that it takes enormous effort and time to uncover such misconduct, which in some cases may have continued for many years, the perpetrators having gained senior positions or even retired.
The peer review process will also filter out suspicious or incompetent research practices, although this is becoming more difficult with the advent of research that relies on complex computational calculations. Sometimes the only way to check results is to duplicate the computations oneself, which is rarely practical.
Some help may be available from those working in the field of linguistics. Markowitz and Hancock at Stanford University, US, have identified patterns in fraudulent papers that may serve as markers of misconduct; negations and ambiguity, for example, are a notable part of deceptive communication. Such linguistic sleuthing is still in its infancy, but may in future be a worthwhile tool for the detection of fraud.
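To make the idea of linguistic markers concrete, here is a minimal sketch that tallies negations and hedging words in a text sample. This is not the Markowitz–Hancock obfuscation measure itself, merely an illustration of the kind of surface feature such work examines; the word lists are illustrative assumptions, not a validated lexicon.

```python
import re

# Illustrative word lists only -- a real study would use validated lexicons.
NEGATIONS = {"not", "no", "never", "cannot", "without", "nor"}
HEDGES = {"may", "might", "could", "possibly", "perhaps", "somewhat", "appears"}

def marker_rates(text: str) -> dict:
    """Return negation and hedge frequencies per 100 words of the input."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words) or 1  # avoid division by zero on empty input
    return {
        "negations_per_100_words": 100 * sum(w in NEGATIONS for w in words) / total,
        "hedges_per_100_words": 100 * sum(w in HEDGES for w in words) / total,
    }

sample = "The effect could not be replicated, and results may possibly vary."
print(marker_rates(sample))
```

On its own, such a count proves nothing about any single paper; the research compares rates across large corpora of retracted and unretracted papers to look for statistical differences.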
The first step is establishing what good scientific practice is in practical terms. Medicine, because of the close relationship between doctor and patient, has always been subject to rigorous rules of professional behaviour, overseen by medical associations in different countries, which have legal powers to sanction, expel or, in more serious cases, pass files on to the authorities for criminal investigation. Moreover, since ancient times physicians have taken the Hippocratic Oath, which has endured as the basis for ethical conduct in medicine.
Outside of medicine the picture is less clear. In most areas of science, apart from those covered by specific legislation, a mixture of government organisations, scientific institutions and individual universities and departments publish, disseminate and promote guidelines on ethical and professional conduct. In their application to scientific research, these guidelines are usually based on what are known as the Mertonian norms.
In 1942 Robert K. Merton (1910–2003) published a set of four ethical norms that set out the minimum standards for scientific research: universalism, communism, disinterestedness and organised scepticism.
Universalism refers to the equality of scientists. Science applies to all and should be shared universally. Science should not depend on the individual characteristics of scientists. This ideal is the one that scientists today would rate as the most crucial for science.
Communism, or communalism (so as not to confuse it with Marx's political theory), relates to the ownership of intellectual property. Authorship of scientific papers should be fairly expressed. This can be problematic where large collaborations result in hundreds of names on a single paper, but fair credit should nevertheless be given; one may also add that the work must be original.
Disinterestedness means that scientists should declare any special interests in their work and ideally should not have any, although this is not always possible. Even so, data should be interpreted objectively and, to borrow a legal phrase, there must be “the truth, the whole truth and nothing but the truth”.
Finally, organised scepticism: plain scepticism in everything scientific is the order of the day. It is only by putting scientific pronouncements to the test in the most rigorous way that science can progress.
We have the ethical tools to handle moral decisions in science and guidelines on what makes good research practice, but how do we combat fraud in science?
There are two distinct areas of misconduct: academic and scientific. The first is when an academic advantage is gained by cheating in exams, plagiarising essays or any other falsification in order to obtain higher grades or a better degree. This is normally dealt with by universities.
The second is scientific misconduct, where a scientist deliberately distorts the research process. The four main categories of wrongdoing are: fabrication, falsification, plagiarism and misrepresentation. In addition, misconduct may involve professional malpractice and also the deliberate covering up of misconduct.
Fortunately, serious misconduct is rare, but when it occurs it causes great damage to the institutions that fund science, to other scientists and to the general public.
The first objective of universities and other scientific institutions is to lay down in clear terms what kinds of behaviour constitute misconduct and are therefore unacceptable.
In the UK there are guidelines which universities include in their terms of employment for scientists. These are mirrored in other countries, together with organisations that oversee good practice in scientific research. For international collaborative projects there is an OECD code.
As far as the investigation of misconduct is concerned, universities, following published guidelines, set up structures under the aegis of their human resources departments to consider and investigate allegations of scientific fraud.
The problem is what happens after serious misconduct is identified. Universities, as employers, may rely on employment legislation to sanction perpetrators. In addition, false reports will be corrected, funding may be withdrawn, the public and regulatory bodies may be informed, training regimes may be amended and, in some cases, those involved may be dismissed.
In spite of these measures some feel that in serious cases internal inquiries and professional displeasure are insufficient sanctions. As an eminent lawyer once told me, lying per se is not a specific criminal offence in England – everybody does it. Perhaps this is a somewhat cynical observation, but it is the case that there is no specific legislation directed at scientific fraud under English law.
That is not to say that there are no sanctions under the criminal law. The Fraud Act 2006 creates offences of dishonestly obtaining services and of fraud by false representation: a service could be publication in a journal, and a false representation could cover obtaining a research grant. The Act looks at the dishonest behaviour of the accused rather than relying on proving that someone was deceived.
In the US the first case of a scientist accused, convicted and sentenced for scientific misconduct occurred in 1988. Stephen Breuning, a psychologist, was indicted in a federal court in Baltimore, Maryland, for falsifying his research results. He had obtained significant funding for his work, which was spurious and fraudulent.
Breuning pleaded guilty and was sentenced to 60 days in a halfway house, five years' probation and 250 hours of community service. In addition, he had to pay back just over 11,000 dollars.
Generally speaking, scientific misconduct does not result in criminal charges, although there have been calls to change this. The problem is that we would need a police squad with specialist knowledge of scientific research, and universities would have to accept the idea of “internal affairs” tramping about their laboratories.
On the other hand, if the problem of misconduct becomes too widespread, the damage to the reputation of science and scientists may be such that more rigorous enforcement measures become necessary.
David M. Markowitz and Jeffrey T. Hancock, “Linguistic Obfuscation in Fraudulent Science”, Journal of Language and Social Psychology, 8 November 2015.
Magda Osman, “How rational is deception?”, Pantaneto Forum, Issue 3, July 2001.
Robert K. Merton, The Sociology of Science: Theoretical and Empirical Investigations (University of Chicago Press, Chicago, 1973).
Bruce Macfarlane and Ming Cheng, “Communism, Universalism and Disinterestedness: Re-examining Contemporary Support among Academics for Merton’s Scientific Norms”, Journal of Academic Ethics, 2008, 6: pp. 67–68.
Policy and Guidelines on Governance of Good Research Conduct, Research Councils UK, February 2013 (updated July 2015).
For example, the Office of Research Integrity in the US and the UK Research Integrity Office.
Investigating Research Misconduct Allegations in International Collaborative Research Projects: A Practical Guide, OECD Global Science Forum, April 2009.
“It’s time to criminalise serious scientific misconduct”, interview with Richard Smith by Rachel Nuwer, New Scientist, 10 September 2014.