Fat Finger, Falsification, or Fraud?

December 1, 2010

Where is the dividing line between a simple mistake and falsification?

So in the title I have suggested that there are three types of data integrity deviation: fat finger, falsification, and fraud. Here are my definitions of the terms:

  • Fat Finger: An inadvertent mistake made by an analyst during the course of his or her work that can be made either on paper or electronically.

  • Falsification: An action by an individual who deliberately writes or enters data or results with the intention to deceive.

  • Fraud: Collusion between two or more individuals who deliberately write or enter data or results with the intention to deceive.

I have drawn a distinction in the definitions of falsification and fraud: Falsification is perpetrated by an individual and fraud by two or more people. However, the impact of both is the same: the intent to deceive. In writing this column, I have made the assumptions that each spectroscopist has a minimum level of scientific and professional training and will follow the documented analytical methods and laboratory SOPs. In addition, the organization the individual works for also has stated what ethical and professional standards it expects of its staff at their induction and via regular training sessions thereafter.

R.D. McDowall

To Err Is Human

Mistakes and fat finger moments? If we are honest, we all make them. That is why any quality system for laboratories (for example, ISO 17025, GLP, and GMP) has the four eyes principle: One individual to perform the work and a second one to review the data produced to see that the procedure was carried out correctly and that there are no typographical errors or mistakes with calculations. Errors are easy to make; you should see the number I'm making as I type this column using a new PC that has a slightly larger keyboard than I am used to.

Many of the errors and mistakes we make are self-corrected. For example, as you enter a number into a spreadsheet cell or database field, you will often notice that while your brain tells you to enter "12.3," your fingers actually enter "13.2." This is a fat finger moment, but before committing the number to the cell or database you can correct it, because you have seen and realized your error. The equivalent moment on paper is when you write the wrong numbers down in your laboratory notebook and then correct them by striking through the original entry so as not to obscure it, entering the correct value along with your initials, the date, and possibly the reason for the change. This is the paper version of an audit trail.

Some other mistakes that are not noticed by the spectroscopist can be detected by the software application you are using, such as a spell checker, or by verification that the entered data meet certain criteria, such as falling within a predefined range or matching a specific format. So with our example above, if the data verification range was 11.0–13.0, the software would have picked up the problem even if you had not.

However, that still leaves the mistakes you don't realize you have made. For example, if the entry in the case above was 11.3, data verification would be useless and the error would have been entered without you or the software realizing that there was a problem.
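A range check of the sort described above can be sketched in a few lines (a minimal illustration, not any particular data system's implementation; the 11.0–13.0 range and the entered values are taken from the example in the text):

```python
def verify_entry(value, low=11.0, high=13.0):
    """Flag entries that fall outside the predefined acceptance range."""
    return low <= value <= high

# The transposed fat-finger entry 13.2 falls outside 11.0-13.0,
# so the range check catches it...
print(verify_entry(13.2))  # False
# ...but the in-range mistake 11.3 slips through undetected.
print(verify_entry(11.3))  # True
```

The check is only as good as its limits: any plausible-looking value inside the range passes, which is why the second pair of eyes is still needed.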

Don't assume that you will spot all of your mistakes: to err is human, which is why we need the second pair of eyes to check the analytical data and calculated results. From my experience as a laboratory manager and an auditor, supervisors know which members of their staff are diligent about their work and how well they check it, which members are slapdash, and they will adjust the review accordingly. So if you don't want a dubious reputation to precede you, be diligent and try your best to find and correct your own errors before passing your work on to be checked.

What Is the Fat Finger Rate in a Laboratory?

To a certain extent, this column is about airing dirty laundry, which may not appeal to everyone, but self-audits coupled with effective corrective and preventive action planning are at the heart of any good quality management system. Quality is everybody's problem; it is not the sole responsibility of the quality assurance group to pick up the errors that the analytical laboratory has made. However, finding papers on how often we make mistakes in an analytical laboratory is difficult, probably because we don't really want to go there. This, however, is the wrong approach to take, and we should encourage studies that investigate the question.

Luckily, help is at hand from clinical chemists working in hospitals, who have published many studies on error rates in laboratories. For those who do not know, clinical chemistry is the analysis of blood, urine, and other bodily fluids to help in the diagnosis and management of disease. Mistakes in this area can have a critical impact on the health of a patient, and therefore the reduction of errors is essential.

One paper, entitled "The Blunder Rate in Clinical Chemistry," measured the rate of detected analytical errors before and after the introduction of a laboratory information management system (LIMS) and found that they were reduced from about 5% to less than 0.3% following the introduction of the computer system (1).

Manual transcription errors in patient records were assessed for blood results recorded in a critical care setting by comparing the handwritten and printed laboratory results for 100 consecutive patients in the intensive care unit of a UK hospital. Out of 4664 individual values, 67.6% were complete and accurate, 23.6% were not transcribed at all, and 8.8% were inaccurate transcriptions of the results. Interestingly, this study found that transcription accuracy was significantly better in the morning (2).

An Australian study of transcribing handwritten pathology request forms into a computer system, and of the chemical analysis of the samples, found that error rates for both were in the 1–3% range in the best laboratories. The worst laboratories, however, had error rates of up to 39% in transcription and 26% in analytical results (3).

So let us extrapolate from the clinical chemistry laboratory and suggest that error rates in an analytical laboratory are in the range of 0.3–3%, depending on the degree of automation you have. The more manual input and transcription checking required, the greater the number of errors that need to be detected and corrected. Therefore, external quality audits and regulatory inspections expect to find laboratory errors. Not finding these detectable errors raises suspicion of problems, with the result that the auditors delve further into the laboratory records.
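To make the extrapolation concrete, here is a back-of-envelope calculation using the 0.3–3% range quoted above (the 1000-entry workload is an assumed figure for illustration only):

```python
def expected_errors(entries, error_rate):
    """Expected number of errors for a given number of manual entries."""
    return entries * error_rate

entries = 1000  # assumed number of manual data entries in a review period
# Highly automated laboratory (~0.3%) versus a largely manual one (~3%)
print(expected_errors(entries, 0.003))  # about 3 errors to find and correct
print(expected_errors(entries, 0.03))   # about 30 errors to find and correct
```

Even at the low end of the range, a reviewer should expect to find a few errors; finding none at all is itself a warning sign.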

The Laboratory Notebook — Integrity or Falsification?

This brings us to a common issue that we all have experience with: the humble laboratory notebook. Typically this is a bound book with prenumbered pages (the numbering is there to prevent you from tearing out a page to write down a shopping list or make a paper airplane); it is the first stage of ensuring data integrity in the laboratory. At the bottom of each page is space for you to sign and, afterward, for a reviewer–supervisor–witness–peer to sign after checking your work and accepting it as accurate.

OK, here's the situation: You are a supervisor and you are checking a laboratory notebook for some current work, and in turning the page you notice that your signature is missing from when you reviewed some earlier work. Three out of four pages of the old work are signed and dated but you have neglected to sign one of the pages — so what do you do? Temptation time! You have the following options:

1. Ignore the problem and wait for somebody else to discover it.

2. Sign the page and date it the same as the other pages.

3. Sign the page but date it with the current date and add a note that you have just noticed the problem.

So what are you going to do? It is a pity that the paper and electronic versions of this magazine do not come with a large hammer that will hit you over the head if you pick the wrong option. Most spectroscopists should reject the first option, especially if you are working in a research environment where product development and especially patent protection can be crucially dependent on the date of discovery. So we're down to options 2 and 3. Option 2 is a little voice whispering in your ear, "nobody will know if you put the same date that the other pages were signed on." You can never find a hammer when you want one! You are now on the brink of the abyss: on the plateau are ethics and integrity, and down the slippery slope are falsification and fraud. May I suggest that option 3 is the only option worth considering that will establish credibility for you and the laboratory? Reiterating the point in the section above and putting my auditor's hat on: I expect to see mistakes, and if I don't find any I become suspicious.

Laboratory Fraud and Falsification

Now let us move into the murkier world of falsification and fraud with the intent to deceive. A rich source of examples dealing with both of these issues is the FDA warning letters section at www.fda.gov. The agency posts warning letters on its web site under the US Freedom of Information Act, with the intent of a name-and-shame approach. I quote these examples from the pharmaceutical industry because the FDA openly publishes this information, whereas the European regulators and ISO 17025 accreditation agencies usually keep it confidential.

Four of the warning letters and regulatory issues that have emerged recently in this arena follow:

  • The classic fraud case involving laboratory data is that of Able Laboratories from 2005 (4). The company was engaged in a systematic laboratory fraud to pass batches of drug product that failed to meet specifications by changing weights and conversion factors and even cutting and pasting chromatograms. Results that failed were manipulated and faked until they passed — an original result for dissolution testing was ~30% versus a specification of >85% but after the magic fingers were applied the final result was ~89%! The company had passed several regulatory inspections until a whistleblower alerted the FDA to these practices. After a detailed inspection, the company withdrew several drug applications, recalled over 3100 batches of product and eventually went bankrupt. There was a subsequent criminal prosecution of four members of the company for fraud.

  • During an inspection of Ohm Laboratories in 2009, suspicion was aroused in the stability testing laboratory about material that had been taken out of the stability chambers for analysis. The material had been signed out by the stability coordinator but, as the warning letter noted, the attendance record showed that the stability coordinator was absent from the firm during those dates in which the coordinator recorded the withdrawal of samples from the stability chambers (5). This is very similar to the laboratory notebook example we discussed above.

  • A Chinese company, Xian Libang Pharmaceutical Co. (6), was found to have used the IR spectra from one batch of material to support the release of two subsequent batches. The warning letter noted that this practice "is unacceptable and raises serious concerns regarding the integrity and reliability of the laboratory analyses conducted by your firm. It is essential that at least one test be conducted to verify the identity of each lot of incoming material. In addition, the laboratory control records should include complete documentation of all raw data generated during each test, including graphs, charts, and spectra from laboratory instrumentation. These records should be properly identified to demonstrate that each raw material batch was tested and met the release specification before its use in production. . . . A cursory review of records is not sufficient to ensure that other personnel did not manipulate or inaccurately report test data." It is interesting to note that after finding falsification in one analysis, the agency, quite rightly, casts doubt on the whole laboratory.

  • There was a further citation about the lack of controls to prevent manipulation of raw data during routine analytical testing and how measures would be put in place to stop unauthorized changes being made to data in the future. The agency wanted to see a process to prevent omissions in data, but also for recording any changes made to existing data, which should include the date of change, the identity of the person who made the change, and an explanation or reason for the change. All changes to existing data should be made in accordance with an established procedure.

  • My last example is recent and cost a European generic drug manufacturer $3.3 million earlier this year (7). Acino, a Swiss generic drug manufacturer, contracted Glochem, an Indian company, to supply clopidogrel, the active ingredient of Plavix. During a visit to the Indian company, European inspectors found more than 70 original batch records in a dumpster at the site; all the records had been rewritten to be perfect, with no errors, in total contradiction of Good Manufacturing Practice (GMP). The inspectors, again quite rightly, classified this as fraud, which triggered a recall of the material. In response, the company thought that the inspectors' response was excessive and commissioned an extensive third-party analysis to demonstrate that the material met specifications. However, as the batch records had been copied and the originals were in the process of being destroyed, the inspectors held to their original view.

You can see from these few examples that by being diligent, honest, and professional you can avoid the problems faced by these companies. The fourth example also illustrates that if a company outsources to a third party, the first company is still accountable for the quality of the material going into its own supply chain. Proactive auditing will help prevent these issues.
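The change-record requirements cited above (the identity of the person, the date, and the reason for each change, with the original value preserved) can be sketched as a minimal audit trail entry. The structure and field names below are illustrative only, not any vendor's actual implementation:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditTrailEntry:
    """One change to an existing value: what changed, who, why, and when."""
    field_name: str
    old_value: str   # the original entry is preserved, never overwritten
    new_value: str
    changed_by: str  # identity of the person who made the change
    reason: str      # explanation or reason for the change
    changed_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

# Correcting a transcribed result while keeping the original visible:
# the electronic analog of striking through a notebook entry.
entry = AuditTrailEntry("assay_result", "13.2", "12.3",
                        changed_by="analyst_01",
                        reason="transcription error")
print(entry.old_value, "->", entry.new_value)
```

Making the entry immutable (`frozen=True`) mirrors the paper rule that a correction never obscures or replaces the original record.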

How Should We Prevent Fraud and Falsification?

There are a number of ways that we can avoid the problems of fraud and falsification. The first is to develop clear written policies and procedures for what is expected when work is carried out in any laboratory; the integrity of the data generated in the laboratory is paramount and must not be compromised. Coupled with this is the need to provide initial and ongoing training in this area. The training should start when new spectroscopists join the laboratory and should continue over the course of their careers.

To help train staff, we need to know the basics of laboratory data integrity. The main criteria are listed below. Data must be:

  • Attributable — Who acquired the data or performed an action and when?

  • Legible — Can you read the data and any laboratory notebook entries?

  • Contemporaneous — Documented at the time of the activity.

  • Original — The written observation or printout, or a certified copy thereof.

  • Accurate — No errors or editing without documented amendments.

  • Complete — All data including any repeat or reanalysis performed on the sample.

  • Consistent — All elements of the analysis such as the sequence of events follow on and are date or time stamped in the expected sequence.

  • Enduring — Not recorded on the back of envelopes, cigarette packets, sticky notes, or the sleeves of a laboratory coat but in laboratory notebooks or electronic media in the data systems of instruments and LIMS.

  • Available — Can be accessed for review and audit or inspection over the lifetime of the record.
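As one illustration of how the Consistent criterion can be checked mechanically, the sketch below verifies that the time stamps of a run follow the expected sequence of events (the step names and times are invented for the example):

```python
from datetime import datetime

def is_consistent(events):
    """Check the 'Consistent' criterion: the recorded steps must carry
    time stamps in the expected sequence."""
    stamps = [datetime.fromisoformat(ts) for _, ts in events]
    return all(a <= b for a, b in zip(stamps, stamps[1:]))

# A plausible run: preparation, injection, then integration.
run = [("sample prep", "2010-12-01T09:00"),
       ("injection",   "2010-12-01T09:35"),
       ("integration", "2010-12-01T10:10")]
print(is_consistent(run))  # True

# An injection stamped before sample preparation warrants a closer look.
suspect = [("sample prep", "2010-12-01T09:00"),
           ("injection",   "2010-12-01T08:40")]
print(is_consistent(suspect))  # False
```

A check like this does not prove integrity, but an out-of-sequence time stamp is exactly the kind of detectable anomaly a reviewer or auditor should follow up on.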

Spectroscopists need to understand these criteria and apply them in their respective analytical methods.

To support human work, we should also provide automation in the form of integrated laboratory instrumentation with data handling systems and LIMS as necessary to perform the work. In any laboratory this integration needs to include effective audit trails to help maintain data integrity and monitor changes to data. Supervisors and quality personnel need to monitor these audit trails to assess the quality of data being produced in a laboratory; if necessary a key performance indicator (KPI) or measurable metric could be produced. Finally, if all else fails, disciplinary procedures need to be in place and should be used to resolve any problem, because the reputation of the laboratory is of prime importance.

Conclusions

In this column I have looked at errors caused by fat finger moments that are normal and why we need a second person to check our data and ensure that they are correct. These errors can be reduced by using automation to transfer data automatically and eliminate the need for manual entry of data followed by transcription error checking. We have also looked at falsification and fraud with ways of ensuring that none occur in your laboratory.

R.D. McDowall is principal of McDowall Consulting and director of R.D. McDowall Limited, and the editor of the "Questions of Quality" column for LCGC Europe, Spectroscopy's sister magazine. Address correspondence to him at 73 Murray Avenue, Bromley, Kent, BR1 3DJ, UK.

References

(1) A.M. Chambers, J. Elder, and D. St. J. O'Reilly, Annals of Clinical Biochemistry 23, 470–473 (1986).

(2) R. Black, P. Woolman, and J. Kinsella, presented at the American Society of Anesthesiologists Annual Meeting, New Orleans, LA, October 2001.

(3) M. Khoury, L. Burnett, and M.A. Mackay, The Medical Journal of Australia 165, 128–130 (1996) (http://www.mja.com.au/).

(4) R.D. McDowall, Quality Assurance Journal 10, 15–20 (2006).

(5) Ohm Laboratories, Warning Letter (December 2009).

(6) Xian Libang Pharmaceutical Company, Warning Letter (January 2010).

(7) J.-F. Tremblay, Chemical & Engineering News 88(34), 23 (August 23, 2010).