Where is the dividing line between a simple mistake and falsification?
So in the title I have suggested that there are three types of data integrity deviation: fat finger, falsification, and fraud. Here are my definitions of the terms:
I have drawn a distinction in the definitions of falsification and fraud: Falsification is perpetrated by an individual and fraud by two or more people. However, the impact of both is the same: the intent to deceive. In writing this column, I have made the assumptions that each spectroscopist has a minimum level of scientific and professional training and will follow the documented analytical methods and laboratory SOPs. In addition, the organization the individual works for also has stated what ethical and professional standards it expects of its staff at their induction and via regular training sessions thereafter.
Mistakes and fat finger moments? If we are honest, we all make them. That is why any quality system for laboratories (for example, ISO 17025, GLP, and GMP) has the four eyes principle: one individual to perform the work and a second one to review the data produced to see that the procedure was carried out correctly and that there are no typographical errors or mistakes with calculations. Errors are easy to make; you should see how many I am making as I type this column on a new PC that has a slightly larger keyboard than I am used to.
Many of the errors and mistakes we make are self-corrected. For example, as you enter a number into a spreadsheet cell or database field, you will often notice that while your brain tells you to enter "12.3," your fingers actually enter "13.2." This is a fat finger moment, but before committing the number to the cell or database you can correct it, because you have seen and realized your error. The equivalent moment on paper is when you write the wrong numbers down in your laboratory notebook and then correct them by striking through the original entry so as not to obscure it, entering the correct value along with your initials, the date, and possibly the reason for the change. This is the paper version of an audit trail.
Some other mistakes that go unnoticed by the spectroscopist can be detected by the software application you are using, either by a spell checker or by data verification that flags entries failing to meet certain criteria, such as falling within a predefined range or matching a specific format. So, with our example above, if the data verification range were 11.0–13.0, the software would have picked up the problem even if you had not.
However, that still leaves the mistakes you don't realize you have made. For example, if the entry in the case above was 11.3, data verification would be useless and the error would have been entered without you or the software realizing that there was a problem.
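To make the point concrete, here is a minimal sketch of the kind of range and format verification described above. The function name, the 11.0–13.0 limits, and the expected format are illustrative assumptions taken from the example in the text, not from any particular laboratory system:

```python
import re

def verify_entry(value_text, low=11.0, high=13.0, pattern=r"^\d{1,3}\.\d$"):
    """Return a list of problems found with a manually entered result.

    Hypothetical check: limits and format mirror the 11.0-13.0 example above.
    """
    problems = []
    if not re.match(pattern, value_text):
        problems.append("format: expected e.g. '12.3'")
        return problems
    value = float(value_text)
    if not (low <= value <= high):
        problems.append(f"range: {value} outside {low}-{high}")
    return problems

# The fat-finger transposition 12.3 -> 13.2 is caught by the range check...
print(verify_entry("13.2"))  # ['range: 13.2 outside 11.0-13.0']
# ...but a plausible wrong value such as 11.3 passes silently,
# which is why the second pair of eyes is still needed.
print(verify_entry("11.3"))  # []
```

The second call illustrates the limitation discussed above: automated verification only catches values that violate an explicit rule, so an in-range transcription error sails straight through.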
Don't assume that you will spot all of your mistakes — they are human mistakes, which is why we need the second pair of eyes to check the analytical data and calculated results. From my experience as a laboratory manager and an auditor, supervisors know which members of their staff are diligent about their work and how well they check it and which members are slapdash, and the supervisor will adjust the review accordingly. So if you don't want a dubious reputation to precede you, be diligent and try your best to find and correct your own errors before passing your work to be checked.
To a certain extent, this column is about airing dirty laundry, which may not be a comfortable subject, but it lies at the heart of any good quality management system: self-audits coupled with effective corrective and preventive action planning. Quality is everybody's problem; it is not the sole responsibility of the quality assurance group to pick up the errors that the analytical laboratory has made. However, finding papers on how often we make mistakes in an analytical laboratory is difficult, probably because we do not really want to go there. This, however, is the wrong approach, and we should encourage studies that investigate it.
Luckily help is at hand from clinical chemists working in hospitals who have published many studies on error rates in laboratories. For those that do not know, clinical chemistry is involved in the analysis of blood, urine, and other bodily outputs to help the diagnosis and management of diseases. Mistakes in this area can have a critical impact on the health of a patient and therefore the reduction in errors is essential.
One paper, entitled "The Blunder Rate in Clinical Chemistry," measured the rate of detected analytical errors before and after the introduction of a laboratory information management system (LIMS) and found that they were reduced from about 5% to less than 0.3% following the introduction of the computer system (1).
Manual transcription errors in patient records were assessed for blood results recorded in a critical care setting by comparing the handwritten and printed laboratory results for 100 consecutive patients in the intensive care unit of a UK hospital. Out of 4664 individual values, 67.6% were complete and accurate, 23.6% were not transcribed at all, and 8.8% were inaccurate transcriptions of the results. Interestingly, this study found that transcription accuracy was significantly better in the morning (2).
An Australian study of transcribing hand-written pathology request forms into a computer system, and of the chemical analysis of the samples, found that error rates for both were in the 1–3% range in the best laboratories. The worst laboratories, however, had error rates of up to 39% in transcription and 26% in analytical results (3).
So let us extrapolate from the clinical chemistry laboratory and suggest that error rates in an analytical laboratory are in the range of 0.3–3%, depending on the degree of automation you have. The more manual input and transcription checking required, the greater the number of errors that need to be detected and captured. Therefore, external quality auditors and regulatory inspectors expect to find laboratory errors; not finding these detectable errors raises suspicion of problems, with the result that they delve further into the laboratory records.
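The arithmetic behind that expectation is simple. Assuming, purely for illustration, a laboratory that records 5000 manual values a month, the extrapolated 0.3–3% error rates translate into a number of detectable mistakes that an auditor should reasonably expect to see:

```python
# Hypothetical workload; the 0.3-3% rates are the range extrapolated
# from the clinical chemistry studies cited above.
entries_per_month = 5000

expected = {rate: entries_per_month * rate for rate in (0.003, 0.03)}
for rate, n in expected.items():
    print(f"{rate:.1%} error rate -> {n:.0f} expected errors per month")
```

Even at the low end of the range, dozens of correctable mistakes should surface each month; a notebook or data set with none at all is the anomaly, not the clean record it appears to be.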
This brings us to a common issue that we all have experience with: the humble laboratory notebook. Typically this is a bound book with prenumbered pages (there to prevent you from tearing out a page to write down a shopping list or make a paper airplane); it is the first stage of ensuring data integrity in the laboratory. At the bottom of each page is space for you to sign and, afterward, for a reviewer (supervisor, witness, or peer) to sign after checking your work and accepting it as accurate.
OK, here's the situation: You are a supervisor and you are checking a laboratory notebook for some current work, and in turning the page you notice that your signature is missing from when you reviewed some earlier work. Three out of four pages of the old work are signed and dated but you have neglected to sign one of the pages — so what do you do? Temptation time! You have the following options:
1. Ignore the problem and wait for somebody else to discover it.
2. Sign the page and date it the same as the other pages.
3. Sign the page but date it with the current date and add a note that you have just noticed the problem.
So what are you going to do? It is a pity that the paper and electronic versions of this magazine do not come fitted with a large hammer to hit you over the head if you pick the wrong option. Most spectroscopists should reject the first option, especially if you are working in a research environment where product development and especially patent protection can be crucially dependent on the date of discovery. So we're down to options 2 and 3. Option 2 is a little voice whispering in your ear, "nobody will know if you put the same date that the other pages were signed on." You can never find a hammer when you want one! You are now on the brink of the abyss: on the plateau are ethics and integrity, and down the slippery slope are falsification and fraud. May I suggest that option 3 is the only option worth considering that will establish credibility for you and the laboratory? Reiterating the point in the section above and putting my auditor's hat on: I expect to see mistakes, and if I don't find any I become suspicious.
Now let us move into the murkier world of falsification and fraud with the intent to deceive. A place that is very rich for finding examples dealing with both these issues is the FDA warning letters section found at www.fda.gov. The agency posts warning letters on its web site under the US Freedom of Information Act, with the intent of a name-and-shame approach. I quote these examples from the pharmaceutical industry; the FDA openly publishes this information, whereas the European regulators or ISO 17025 accreditation agencies usually keep it confidential.
Four of the warning letters and regulatory issues that have emerged recently in this arena follow:
You can see from these few examples that by being diligent, honest, and professional you can avoid the problems faced by these companies. The fourth example also illustrates that if a company outsources to a third party, the first company is still accountable for the quality of the material going into its own supply chain. Proactive auditing will help prevent these issues.
There are a number of ways that we can avoid the problems of fraud and falsification. The first is to develop clear written policies and procedures of what is expected when work is carried out in any laboratory; the integrity of the data generated in the laboratory is paramount and must not be compromised. Coupled with this is the need to provide initial and on-going training in this area. The training should start when new spectroscopists join the laboratory and should continue as part of their ongoing training over the course of their careers.
To help train staff, we need to know the basics of laboratory data integrity. The main criteria are listed below. Data must be:
Spectroscopists need to understand these criteria and apply them in their respective analytical methods.
To support human work, we should also provide automation in the form of integrated laboratory instrumentation with data handling systems and LIMS as necessary to perform the work. In any laboratory this integration needs to include effective audit trails to help maintain data integrity and monitor changes to data. Supervisors and quality personnel need to monitor these audit trails to assess the quality of data being produced in a laboratory; if necessary a key performance indicator (KPI) or measurable metric could be produced. Finally, if all else fails, disciplinary procedures need to be in place and should be used to resolve any problem, because the reputation of the laboratory is of prime importance.
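As an illustration of the kind of measurable metric suggested above, here is a minimal sketch of turning audit-trail entries into a simple KPI: the fraction of each analyst's entries that were later corrected. The record format, analyst names, and action labels are all hypothetical; a real LIMS audit trail would carry far more detail:

```python
from collections import Counter

# Illustrative audit-trail records only: (analyst, action) pairs.
audit_trail = [
    ("analyst_a", "create"), ("analyst_a", "create"), ("analyst_a", "correct"),
    ("analyst_b", "create"), ("analyst_b", "create"), ("analyst_b", "create"),
]

def correction_rate(records):
    """Corrections per created entry, by analyst: a candidate KPI."""
    created = Counter(a for a, act in records if act == "create")
    corrected = Counter(a for a, act in records if act == "correct")
    return {a: corrected[a] / created[a] for a in created}

print(correction_rate(audit_trail))
```

A supervisor reviewing such a metric over time should remember the earlier point: a moderate, stable correction rate is the sign of a healthy laboratory, whereas a rate of zero may mean errors are not being recorded at all.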
In this column I have looked at errors caused by fat finger moments that are normal and why we need a second person to check our data and ensure that they are correct. These errors can be reduced by using automation to transfer data automatically and eliminate the need for manual entry of data followed by transcription error checking. We have also looked at falsification and fraud with ways of ensuring that none occur in your laboratory.
R.D. McDowall is principal of McDowall Consulting and director of R.D. McDowall Limited, and the editor of the "Questions of Quality" column for LCGC Europe, Spectroscopy's sister magazine. Address correspondence to him at 73 Murray Avenue, Bromley, Kent, BR1 3DJ, UK.
(1) A.M. Chambers, J. Elder, and D. St. J. O'Reilly, Annals of Clinical Biochemistry 23, 470–473 (1986).
(2) R. Black, P. Woolman, and J. Kinsella, presented at the American Society of Anesthesiologists Annual Meeting, New Orleans, LA, October 2001.
(3) M. Khoury, L. Burnett, and M.A. Mackay, The Medical Journal of Australia 165, 128–130 (1996) (http://www.mja.com.au/).
(4) R.D. McDowall, Quality Assurance Journal 10, 15–20 (2006).
(5) Ohm Laboratories, Warning Letter (December 2009).
(6) Xian Libang Pharmaceutical Company, Warning Letter (January 2010).
(7) J.-F. Tremblay, Chemical & Engineering News 88(34), 23 (August 23, 2010).