Knowledge Management as a tool to reduce errors

Efforts to reduce human error in medicine include the establishment of error management programs. Robert L. Helmreich(1) argues:

“Available data, including analyses of adverse events, suggest that aviation's strategies for enhancing teamwork and safety can be applied to medicine. I am not suggesting the mindless import of existing programmes; rather, aviation experience should be used as a template for developing data driven actions reflecting the unique situation of each organization”.  

Helmreich(1) suggests the following six-step approach to establishing error management in healthcare organizations:

“As in the treatment of disease, action should begin with

1. History and examination; and
2. Diagnosis.

The history must include detailed knowledge of the organization, its norms, and its staff. Diagnosis should include data from confidential incident reporting systems and surveys, systematic observations of team performance, and details of adverse events and near misses.

Further steps are:

3. Dealing with latent factors that have been detected, changing the organizational and professional cultures, providing clear performance standards, and adopting a non-punitive approach to error (but not to violations of safety procedures);
4. Providing formal training in teamwork, the nature of error, and in limitations of human performance;
5. Providing feedback and reinforcement on both interpersonal and technical performance; and
6. Making error management an ongoing organizational commitment through recurrent training and data collection.

Some might conclude that such programs may add bureaucratic layers and burden to an already overtaxed system. But in aviation, one of the strongest proponents and practitioners of these measures is an airline that eschews anything bureaucratic, learns from everyday mistakes, and enjoys an enviable safety record.”

Judged by its features, Helmreich’s six-step plan clearly resembles a Knowledge Management approach.

First of all, it is analytical: the problem should be acknowledged through examination, and the response should be knowledge and data driven (steps 1 and 2). The plan also recognizes that the problem is multidisciplinary and addresses it as such. Scientific studies in human factors engineering, organizational psychology, operations research, and other disciplines show that in complex systems, safety depends not on exhortation, but rather on proper design of equipment, jobs, support systems, and organizations(2;3).

Secondly, it builds on an open culture and recognizes that a cultural change toward openness is mandatory (step 3).

If we can take any lesson from the stunning progress in safety in aviation and other high-risk industries, it is that fear, reprisal, and punishment produce not safety, but rather defensiveness, secrecy, and enormous human anguish(2). Error management must therefore promote a greater climate of openness and move away from finger pointing and the routine assignment of blame(4;5). It is important to realize that the necessary changes are as much cultural as technical(3).

Some report that error management reaffirms the provider’s reputation for putting the patient first. Patients were much more accepting of the inevitability of human error than the healthcare personnel were, and furthermore the patients were impressed that something was being done to reduce errors(6).

The necessary change toward openness can also be an uphill struggle, because medical staff generally do not report many errors. They neither acknowledge nor discuss them, for several reasons: personal reputation (76%), the threat of malpractice (71%), high expectations of the patient’s family or society (68%), possible disciplinary action by licensing boards (64%), threat to job security (63%), and the expectations and egos of other team members (61% and 60%)(7).

Other barriers are skepticism that reporting generates extra workload, lack of trust, fear of reprisals, and doubts about the effectiveness of presenting the results(5). But of course healthcare workers can be an immense reservoir of creativity and motivation once these barriers have been broken down(3).

Thirdly, the plan implies a learning organization (step 4). Clinicians are trained to identify problems at the bedside (i.e. the unsafe acts), but they are unfamiliar with identifying contributing factors (i.e. preconditions for unsafe acts, unsafe supervision, and organizational influences). A systematic error management program pays dividends when exploring the contributing factors, as a greater awareness of these issues is translated into preventive measures(4).
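Purely as an illustration (the schema, field names, and helper below are a hypothetical sketch, not taken from any of the cited reporting systems), a confidential incident report that records both the unsafe act and its contributing factors is what makes this kind of systematic exploration possible:

```python
from dataclasses import dataclass, field
from enum import Enum


class ContributingFactor(Enum):
    """Contributing-factor categories named in the text."""
    PRECONDITION_FOR_UNSAFE_ACT = "precondition for unsafe act"
    UNSAFE_SUPERVISION = "unsafe supervision"
    ORGANIZATIONAL_INFLUENCE = "organizational influence"


@dataclass
class IncidentReport:
    """One confidential, non-punitive incident record (hypothetical schema)."""
    description: str   # the unsafe act observed at the bedside
    near_miss: bool    # near misses are data, not non-events
    factors: list[ContributingFactor] = field(default_factory=list)


def factor_frequencies(reports: list[IncidentReport]) -> dict[ContributingFactor, int]:
    """Count how often each contributing factor recurs across reports;
    the most frequent ones are candidates for preventive measures."""
    counts: dict[ContributingFactor, int] = {}
    for report in reports:
        for factor in report.factors:
            counts[factor] = counts.get(factor, 0) + 1
    return counts
```

The point of aggregating is that no single report singles out an individual; only recurring patterns across many reports drive the preventive measures.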

Fourthly, it builds on open two-way communication; otherwise reinforcement cannot take place (step 5). The problem is not fundamentally due to a lack of knowledge; we already know a lot more than we put into practice(3). But we have to implement this knowledge through an ongoing learning process. Organizational learning without reinforcement ("single loop" learning) fuels and sustains the "vulnerable system syndrome", while "double loop" learning is necessary to start breaking free from the burden of the "vulnerable system syndrome"(8).

Finally, the plan is an ongoing, continuous process with an explicit strategy (step 6).

It is generally believed that the introduction of IT-based clinical decision support and better linkage within and among systems will result in substantial improvements in patient safety(9). IT-based decision support software has proven to be of great value in reducing errors in the prescribing and administration of medicine(10-12).
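As a minimal sketch of how such rules-based checking works (the drug name, dose limit, and renal threshold below are invented for illustration, not taken from the cited studies), each order is evaluated against explicit rules before it reaches the patient:

```python
from dataclasses import dataclass


@dataclass
class Prescription:
    drug: str
    daily_dose_mg: float
    creatinine_clearance: float  # patient's renal function, ml/min


# Hypothetical rule table: maximum daily dose, reduced for renal impairment.
MAX_DAILY_DOSE_MG = {"drugx": 300.0}
RENAL_THRESHOLD_ML_MIN = 30.0
RENAL_DOSE_FACTOR = 0.5  # halve the limit below the threshold


def check_prescription(rx: Prescription) -> list[str]:
    """Return alert messages for the prescriber; an empty list means
    the order passed every rule it matched."""
    alerts: list[str] = []
    limit = MAX_DAILY_DOSE_MG.get(rx.drug.lower())
    if limit is None:
        return alerts  # no rule defined for this drug
    if rx.creatinine_clearance < RENAL_THRESHOLD_ML_MIN:
        limit *= RENAL_DOSE_FACTOR
    if rx.daily_dose_mg > limit:
        alerts.append(
            f"{rx.drug}: {rx.daily_dose_mg} mg/day exceeds the limit of {limit} mg/day"
        )
    return alerts
```

For example, check_prescription(Prescription("drugx", 400.0, 25.0)) returns one alert, because the renal rule halves the 300 mg limit to 150 mg; the error is caught at the moment of prescribing rather than at the bedside.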

To declare that adding an error management program to an organization accustomed to Knowledge Management strategies is a piece of cake is perhaps pushing the argument to extremes, but it is hard to deny that there are striking similarities between error management and Knowledge Management.

 Reference List

(1) Helmreich RL. On error management: lessons from aviation. BMJ 2000; 320(7237):781-785.

(2) Berwick DM, Leape LL. Reducing errors in medicine. BMJ 1999; 319(7203):136-137.

(3) Leape LL, Berwick DM. Safe health care: are we up to it? BMJ 2000; 320(7237):725-726.

(4) Vincent C, Taylor-Adams S, Chapman EJ et al. How to investigate and analyse clinical incidents: clinical risk unit and association of litigation and risk management protocol. BMJ 2000; 320(7237):777-781.

(5) Barach P, Small SD. Reporting and preventing medical mishaps: lessons from non-medical near miss reporting systems. BMJ 2000; 320(7237):759-763.

(6) Pietro DA, Shyavitz LJ, Smith RA, Auerbach BS. Detecting and reporting medical errors: why the dilemma? BMJ 2000; 320(7237):794-796.

(7) Sexton JB, Thomas EJ, Helmreich RL. Error, stress, and teamwork in medicine and aviation: cross sectional surveys. BMJ 2000; 320(7237):745-749.

(8) Reason JT, Carthey J, de Leval MR. Diagnosing "vulnerable system syndrome": an essential prerequisite to effective risk management. Qual Health Care 2001; 10 Suppl 2:II21-II25.

(9) Bates DW, Cohen M, Leape LL, Overhage JM, Shabot MM, Sheridan T. Reducing the frequency of errors in medicine using information technology. J Am Med Inform Assoc 2001; 8(4):299-308.

(10) Nightingale PG, Adu D, Richards NT, Peters M. Implementation of rules based computerised bedside prescribing and administration: intervention study. BMJ 2000; 320(7237):750-753.

(11) Rind DM, Safran C, Phillips RS et al. Effect of computer-based alerts on the treatment and outcomes of hospitalized patients. Arch Intern Med 1994; 154(13):1511-1517.

(12) Bates DW. Using information technology to reduce rates of medication errors in hospitals. BMJ 2000; 320(7237):788-791.

 

Anders Lassen Nielsen

     @ct Consult  

Mail to: admin@act-consult.com

