Healthcare Consulting Services
April 7, 2009
Project RED
Rehospitalization is a problem our healthcare system must address in order to improve not only patient safety and quality outcomes but also the economic viability of our entire healthcare system. This issue has now floated to the top of the priority charts for CMS and other third-party payors. Even though high rehospitalization rates are an indictment of the fragmentation and inefficiency of the whole healthcare system, future changes in hospital reimbursement make it imperative that hospitals begin to take the lead in re-engineering and coordinating post-discharge care.
Last week's New England Journal of Medicine (Jencks et al. 2009) includes a study of Medicare data showing that 19.6% of all patients discharged from an acute care hospital are rehospitalized within 30 days. Since this study excluded some patients, such as Medicare managed care patients, that figure is in keeping with the more widely used rate of 18% in the literature. Thirty-four percent were rehospitalized within 90 days, and almost two-thirds either died or were rehospitalized within a year. Probably less than 10% of the rehospitalizations were planned, and most rehospitalizations after surgical discharges were for primarily medical conditions. The rehospitalizations also had an average LOS longer than that for comparable first hospitalizations for the same diagnosis. The cost to Medicare is probably over $20 billion annually for rehospitalizations.
Significantly, over half of those patients rehospitalized after a discharge to the community for a non-surgical diagnosis were not seen by a physician between hospitalizations (based on physician billing data).
Though there are some predictors of rehospitalizations (certain DRGs, presence of ESRD, long LOS, many prior hospitalizations), the data strongly support the need to re-engineer the process for all patients, not just targeted ones.
Our February 24, 2009 Patient Safety Tip of the Week Discharge Planning: Finally Something That Works! began to address the rehospitalization problem and focus on the hospital discharge process. In that column, we discussed the recent randomized study (Jack et al. 2009) from Project RED that documented considerable improvement in rehospitalization rates using a structured hospital discharge program. Recall that the study showed about a 30% reduction in rehospitalizations or ER visits after hospital discharge and savings of about $412 per patient. Last week, many of you had the opportunity to learn about Project RED from a webinar sponsored by AHRQ featuring Brian Jack, M.D., the lead author of the above article. AHRQ has announced that webinar will be made available free on its website for a year. Believe us, this is one you don't want to miss!
But even while you are waiting to view the webinar, go to the Project RED website and learn about the program. There you can download a detailed description of the project and the idealized discharge process. You can also download a training manual plus sample after hospital care plans (AHCPs). And make sure you watch the videos to meet Louise and the other animated virtual nurse discharge advocates! (More on that later.)
Some of the key components to the structured discharge planning process are not new. Everyone has heard discharge planning begins on admission! While the words are somewhat trite, you really do need to begin planning for discharge on admission (or prior to admission if the admission is not emergent). Discharge planning should be a truly interdisciplinary process, not one managed in traditional silos. Unfortunately, the discharge planning participants from the medical team are often the most inexperienced medical team members (interns and first year residents), who are also least familiar with the physicians and medical resources in the community outside the hospital. So a senior member of the medical team, preferably the attending, should be intimately involved in the process.
The nurse discharge advocate still remains the key player and coordinator of the process. Using a discharge planning checklist is a good idea (add another thing to our list of things we should all be using checklists for!). The Project RED team actually also uses a computerized database to help coordinate the discharge process. That is a good way to help ensure that you have all the necessary components in place. But it should be clear that a paper-based system can be as good as a computer-based one.
Scheduling followup visits and tests for the patient is critical. The discharge team needs to work with that patient and family to make sure they will be able to keep those appointments. A color-coded calendar is given to the patient with all key follow-ups and appointments highlighted in color. (Note also from the Jencks study in NEJM that most of the rehospitalizations after a surgical admission are for medical conditions. That stresses the importance of coordinating not just surgical followup but also medical followup after such admissions.)
Getting a pertinent and timely discharge summary to all who need it is critical. That summary, in addition to outlining the reason(s) for hospitalization, test results, care provided, condition and diagnoses, also needs to emphasize what needs to be done after hospitalization. We've spoken often in the past about the need to highlight in the discharge summary test results that are pending so that the physician providing after-hospital care knows to follow up on them. Explaining those pending tests to the patient (or family) is also important. Expediting transmission of that summary to the next provider in the care chain (whether it is a physician, SNF, home care organization, etc.) is very important. No longer can we tolerate waiting weeks to get discharge summaries dictated and signed. Just as importantly, you need to know whom to send the discharge summary to. Most hospitals get a copy to the hospital attending but do a poor job of identifying the primary care physician or specialist who will be providing care to the patient after discharge. (See a recent timely article from ACP Hospitalist, Creating a Better Discharge Summary. Is Standardization the Answer?, for relevant insight and a good review of the literature on discharge summaries.)
Medication reconciliation at discharge is as important as during any other phase of the hospitalization process. It needs to be explained to the patient which of the drugs he was on prior to hospitalization should be continued and which should be discontinued. Any new drugs or changes in dosages of old drugs must be explained in detail. Care must be taken to avoid duplication of therapy. It is not uncommon for a patient to be given a prescription at discharge with, for example, the generic name of a drug when the patient knows the drug at home by its brand name, so the patient takes both the generic and the brand-name drug. Make sure that drugs intended to be prophylactic only while hospitalized are not inadvertently continued after discharge. It must also be carefully explained to the patient what each drug is for and what side effects to watch for (and what to do if those side effects occur). And making sure the patient can easily get his medications is important.
Medication reconciliation is so important that the next key feature of the re-engineered discharge is added: the post-discharge phone call. In Project RED that phone call was done by a clinical pharmacist, but theoretically it could be done by another type of clinician. We've stressed before that most patients on the day of discharge just want to get out of the hospital, so discharge is not your optimal teaching moment. Reinforcing medication management a few days after discharge is very helpful and can surface unanticipated problems that would otherwise be missed.
In our February 24, 2009 Patient Safety Tip of the Week Discharge Planning: Finally Something That Works! we also mentioned that a prior demonstration project by the Colorado Foundation for Medical Care (a Medicare QIO) had demonstrated that a coaching model was successful in reducing readmission rates by almost 50%. In that model, an RN coach visits the patient once in the hospital and once within 48 hours after discharge and also calls the patient by phone three additional times. They discuss medication management, followup visits with physicians, a patient-centered record, and knowledge of red flags the patient should be aware of. And we've also stressed that post-discharge phone calls to patients can do wonders for the public relations of your organization.
The written discharge plan given to the patient is useful both for the patient and for the hospital team to document what was done at discharge. Every effort should be made to establish that the patient understands all the elements in the discharge plan.
Making it clear to the patient who to call for emergencies or other questions is crucial. This should be clearly spelled out, providing not only the name of the physician to call but also the phone contact information. The sample Project RED after hospital care plans provide great examples of how to communicate such issues.
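The components above lend themselves to a simple checklist. As a rough sketch (the item names and structure below are our own illustration, not the official Project RED checklist or database), a discharge-readiness check might look like:

```python
# Illustrative discharge-readiness checklist covering the components
# discussed above. Item names are hypothetical, not Project RED's.
REQUIRED_ITEMS = [
    "followup_appointments_scheduled",
    "color_coded_calendar_given",
    "discharge_summary_sent_to_next_provider",
    "pending_test_results_flagged",
    "medication_reconciliation_done",
    "new_and_changed_meds_explained",
    "written_discharge_plan_reviewed_with_patient",
    "emergency_contact_info_provided",
    "post_discharge_phone_call_scheduled",
]

def discharge_ready(completed):
    """Return (ready, missing): ready is True only when every
    required item has been documented as complete."""
    missing = [item for item in REQUIRED_ITEMS if item not in completed]
    return (len(missing) == 0, missing)

# Example: two items still outstanding for this patient
done = {
    "followup_appointments_scheduled",
    "color_coded_calendar_given",
    "discharge_summary_sent_to_next_provider",
    "pending_test_results_flagged",
    "medication_reconciliation_done",
    "new_and_changed_meds_explained",
    "written_discharge_plan_reviewed_with_patient",
}
ready, missing = discharge_ready(done)
```

Whether this lives on paper, in a spreadsheet, or in a computerized database matters less than making every item explicit and checkable before the patient leaves.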
Lastly, communication, communication, communication. While written materials are a necessity, there is still no substitute for verbal or face-to-face communication. The old saw that 80-90% of communication is non-verbal holds true. The best hospitalist programs are successful because the hospitalist is in frequent communication with the outside physician(s).
That gets us to Louise. Louise is a virtual nurse discharge advocate in a pilot program being used by the Project RED team. Timothy Bickmore, PhD, explains her role in the AHRQ Project RED webinar. The virtual nurses are very realistic and the communication is two-way. The patient (or their family) can interact with the virtual nurse as often as they like. It is a good way of ensuring that the patient truly understands the elements and importance of the after-hospital care plan. Take a look at the video clips of the virtual nurse on the Project RED website. You'll be impressed! And the Project RED team tells us that as many as 75% of respondents actually prefer the virtual nurse to a real discharge nurse! They like her because they can ask all the questions they want, are not constrained by time, and often ask questions they are too embarrassed to ask a live nurse. Stay tuned to the wave of the future!
Jencks SF, Williams MV, Coleman EA. Rehospitalizations among Patients in the Medicare Fee-for-Service Program. NEJM 2009; 360: 1418-1428
Jack BW, Chetty VK, Anthony D et al. A Reengineered Hospital Discharge Program to Decrease Rehospitalization: A Randomized Trial. Annals of Internal Medicine 2009; 150(3): 178-187
AHRQ webcast. Improving Patient Safety: Implementing Re-Engineered Hospital Discharges. Web conference originally aired Tuesday, March 31, 2009, 2:00-3:15 PM EDT
AHRQ Project RED website
Louden K. Creating a Better Discharge Summary. Is standardization the answer? ACP Hospitalist March 2009
Atlantic Information Services. CMS Targets Readmission Through Payment, Audits; Coaching Model Reduces Rates. Report on Medicare Compliance 2008; 17(24): 1-2 (June 30, 2008)
April 14, 2009
More on Rehospitalization After Discharge
Last week our Patient Safety Tip of the Week Project RED focused on rehospitalization after discharge from the hospital. We're continuing that discussion this week because of the tremendous potential impact of this problem on both our healthcare system and your individual facilities.
In the olden days of quality improvement we used to focus on readmissions within a short timeframe (typically 7 days or 15 days). We did that because most studies had shown that readmissions within that timeframe were most often associated with quality problems and that readmissions beyond it were not. We, of course, were very parochial in our thinking! Essentially, what we were saying was anything else is someone else's fault! We were not thinking about our role in the much larger healthcare continuum.
It is very clear that quality and patient safety issues are a major reason for rehospitalizations. So if a patient is rehospitalized because of a failed handoff (eg. inadequate information getting to the PCP after discharge), it doesn't matter whether that rehospitalization occurred within 15 days or 30 days or even longer.
A recent paper (Friedman 2009) looking at the relationship between the AHRQ Patient Safety Indicators (PSIs) and rehospitalization showed that the rate of readmission increased by 44% within 1 month after a safety event and by 50% within 3 months. Theirs was a primarily surgical population but we anticipate that the percentages would be similar or even higher in a primarily nonsurgical population.
So how does your facility approach the issue of rehospitalizations? If you are like most facilities, you probably track them (by either the 15-day or 30-day interval) and report them monthly in a manner not likely to point to actionable steps. We advocate, on the other hand, doing a mini-RCA (root cause analysis) at the time of the readmission. That is when you are likely to have available the most pertinent information about the patient and the circumstances leading to readmission. Not only do you have the admission H&P and ER notes available, but hopefully there has been some communication with those providing care outside the hospital, and you have probably accessed the old hospital chart as well. So you have at hand all the tools you need to identify the factors contributing to the need for rehospitalization.
We actually recommend that you establish a database (could be a simple spreadsheet) having categories for the common reasons leading to readmission. Keep in mind that there may be, in fact likely will be, more than one factor contributing to that rehospitalization. Many of you keep such databases now for looking at avoidable days or delay days when doing utilization management on your inpatients. Well, develop a similar list for reasons for readmission, for example:
Discharge summary did not get to caregiver
No visit with PCP (or other scheduled provider)
No home care visit
Patient did not understand discharge instructions
Medication reconciliation failure
Patient did not take recommended medications
Patient dietary indiscretion
Pending test result not followed up on
Delayed complication from hospitalization
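A minimal version of such a database really can be a few lines of code or a spreadsheet. The sketch below (the factor names and functions are our own illustration, not a product or standard) logs each mini-RCA and tallies the contributing factors, keeping in mind that one readmission will often list several:

```python
from collections import Counter

# Contributing-factor categories drawn from the list above; extend as needed.
FACTORS = {
    "summary_not_received",        # discharge summary did not get to caregiver
    "no_pcp_visit",                # no visit with PCP or other scheduled provider
    "no_home_care_visit",
    "instructions_not_understood",
    "med_reconciliation_failure",
    "meds_not_taken",
    "dietary_indiscretion",
    "pending_test_not_followed",
    "delayed_complication",
}

def log_readmission(db, patient_id, factors):
    """Record one mini-RCA; a single readmission may have several factors."""
    unknown = set(factors) - FACTORS
    if unknown:
        raise ValueError(f"unrecognized factor(s): {unknown}")
    db.setdefault(patient_id, []).extend(factors)

def factor_counts(db):
    """Tally factors across all readmissions to surface recurring themes."""
    return Counter(f for factors in db.values() for f in factors)

db = {}
log_readmission(db, "pt-001", ["summary_not_received", "no_pcp_visit"])
log_readmission(db, "pt-002", ["med_reconciliation_failure"])
log_readmission(db, "pt-003", ["no_pcp_visit"])
counts = factor_counts(db)
```

Reviewed monthly, a tally like this points you toward the handful of process failures that account for most of your potentially avoidable readmissions.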
Another huge consideration is patients returning from long-term care (LTC) facilities or skilled nursing facilities (SNFs). The study we mentioned in last week's column (Jencks et al 2009) on Medicare data showing 19.6% of all patients discharged from an acute care hospital are rehospitalized within 30 days did not include patients who had been discharged to long-term care. Including them would undoubtedly have increased the percentages even more, since residents of long-term care have much higher hospitalization rates in general.
In the past, that was more of a concern for managed care organizations, who were concerned about the cost of acute care hospitalizations. However, now that Medicare is considering bundling payment to include not only the acute hospital stay but also all aftercare, acute care facilities must begin to look at where their patients are being discharged to and where their readmissions are coming from.
Many managed care organizations have utilized the Evercare model developed by United Health. In that model, nurse practitioners working in conjunction with geriatricians visit patients in long-term care facilities frequently. They identify and treat deteriorating conditions early, before acute hospitalization becomes necessary. They are critical in both coordinating care and triaging patients to the most appropriate level of care. One study (Kane et al 2003) showed the program reduced hospital admissions and ER visits by 45-50%, with considerable resulting financial savings.
This may require establishment of new partnerships. Some hospitals already have long-term care facilities, rehab facilities, substance abuse programs, and home health care as part of their system. But many hospitals get those readmissions from other institutions that are not part of their current systems. They must be able to develop creative methods to help ensure that patients discharged to those facilities will not need rehospitalization soon. That may be scary for some but it is clear that in the healthcare system of the future we all need to get out of our silo thinking and think about how to deliver for our patients and society the safest and most effective care at a reasonable cost.
Friedman B, Encinosa W, Jiang HJ, Mutter R. Do Patient Safety Events Increase Readmissions? Medical Care 2009; published ahead of print 23 March 2009
Jencks SF, Williams MV, Coleman EA. Rehospitalizations among Patients in the Medicare Fee-for-Service Program. NEJM 2009; 360: 1418-1428
Kane RL, Keckhafer G, Flood S, Bershadsky B, Siadaty MS. The Effect of Evercare on Hospital Use. Journal of the American Geriatrics Society 2003; 51(10): 1427-1434
April 21, 2009
Still Futzing with Foleys?
Here we are six-plus months into the era in which CMS no longer pays extra for catheter-associated urinary tract infections (CAUTIs), and many hospitals are still struggling to eliminate them.
At the 2009 Annual Scientific Meeting of SHEA (the Society for Healthcare Epidemiology of America) there were several presentations on CAUTIs, as summarized in a Medscape Medical News article. One study by Mark E. Rupp, M.D. and others at the University of Nebraska found that almost a third of Foley catheter days were unnecessary. That is similar to a study done by Raffaele et al in Italy last year. Rupp is quoted in that Medscape article as noting that Foleys are frequently used for convenience in incontinent patients. He points out the alternatives to indwelling catheters in the incontinent patient, including diapers, scheduled voiding, intermittent catheterization, and condom catheters. Also in that Medscape article, Dr. Jennifer Meddings from the University of Michigan Health System (where many of our previously mentioned studies on bladder bundles were done) notes that new evidence-based guidelines on indwelling catheter use will be forthcoming within months from the Healthcare Infection Control Practices Advisory Committee (HICPAC) of the US Centers for Disease Control and Prevention. Those guidelines will emphasize that use of indwelling catheters for convenience or incontinence is not indicated.
So why are hospitals still struggling? At multiple hospitals we have found one consistent, recurring theme: the problem areas are the ER and the OR. Most med/surg floors have done a reasonably good job at eliminating or minimizing use of indwelling urinary catheters. However, they keep popping up in patients arriving from either the ER or the OR. Those two areas have historically been sites where Foleys are frequently inserted (note that in the Rupp studies the indication for Foley catheter in 75% of cases was for surgery or postoperative management). They often have legitimate indications in those areas but many times they are inserted almost reflexively.
But we have found one other key and perhaps more important reason for this being a special problem for the ER and OR: these two areas are often not integrated with the rest of the hospital clinical information systems. Many IT vendors sell ER and OR modules separately. So as hospitals have begun implementing CPOE and EMR systems, they either begin on the med/surg units or they have separate ER/OR IT systems that are not integrated with the system installed on the med/surg units. (Yes, we'll add this to our list of unintended consequences of healthcare IT!)
So what should you do? As in our previous columns, you can either go hi-tech or low-tech. The low-tech solution, you will recall, is simply using a brightly colored sticker that requests a reason for the Foley and/or prompts for a reason for continuation of the Foley. A little more sophisticated sticker would have checkboxes for the legitimate reasons for using a Foley catheter. We actually now recommend that you package your Foley catheters with a card or sticker on the outside that must be filled out prior to opening the Foley tray/package.
But your second solution is more hi-tech. Though many OR and ER areas are not integrated with the HIS, they often do use or interface with the medication ordering system. So the solution is: treat the Foley catheter like a drug! Have it ordered through your CPOE or medication ordering system. That way you can require a documented indication at the time of ordering, prompt for reassessment or continuation, and capture data on catheter use for tracking.
Just be sure to carefully develop the CPOE screens for ordering a Foley in this manner. Test them first in test mode before moving to live. And, just as you would with any new drug, be observant for any potential unintended consequences.
Most CPOE programs allow the above capabilities in modules other than your medication ordering module. However, the above solution can work well for those settings where other aspects of CPOE are not yet available, such as the OR or ER. There are probably several other non-medications that could be programmed for ordering through the medication ordering module as well.
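To make the treat-the-Foley-like-a-drug idea concrete, here is a rough sketch of the order-entry logic. The indication list, field names, and 48-hour review interval below are our own illustrative assumptions; real CPOE products and your local policies will differ:

```python
from datetime import date, timedelta

# Illustrative accepted indications; convenience and incontinence are
# deliberately absent, consistent with the guidance discussed above.
ACCEPTED_INDICATIONS = {
    "acute_urinary_retention",
    "accurate_output_monitoring_critically_ill",
    "perioperative_selected_procedures",
    "comfort_care_end_of_life",
}

REVIEW_INTERVAL_DAYS = 2  # prompt reassessment every 48 hours (assumption)

def order_foley(indication, order_date):
    """Return an order record, or raise if the indication is not accepted,
    mimicking a hard stop on the CPOE ordering screen."""
    if indication not in ACCEPTED_INDICATIONS:
        raise ValueError(
            f"Foley order rejected: '{indication}' is not an accepted indication"
        )
    return {
        "item": "indwelling_urinary_catheter",
        "indication": indication,
        "ordered": order_date,
        "review_due": order_date + timedelta(days=REVIEW_INTERVAL_DAYS),
    }

order = order_foley("acute_urinary_retention", date(2009, 4, 21))
# order_foley("convenience", ...) would be rejected with a ValueError
```

The same pattern, an order that cannot be placed without an indication and that carries an automatic review date, is exactly what a medication ordering module already does for drugs, which is why piggybacking on it works even where the ER and OR lack full CPOE.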
And don't forget our previous columns on urinary catheter-associated UTIs.
Rebelo K. Medscape Medical News article: SHEA 2009: Inappropriate Catheterization Is Common. March 24, 2009
Raffaele G, Bianco A, Aiello M, Pavia M. Appropriateness of Use of Indwelling Urinary Tract Catheters in Hospitalized Patients in Italy. Infect Control Hosp Epidemiol 2008; 29: 279-281 http://www.journals.uchicago.edu/doi/abs/10.1086/528814?prevSearch=(raffaele)+AND+[journal:+iche]
April 28, 2009
Ticket Home and Other Tools to Facilitate Discharge
While you are all busy out there developing the discharge planning checklists we talked about in our Patient Safety Tips of the Week for April 7, 2009 Project RED and April 14, 2009 More on Rehospitalization After Discharge, we've come across another good checklist tool to facilitate hospital discharge. Last week's Nursing Times had an article on the Ticket Home project (Webber-Maybank and Luton 2009).
Ticket Home is another checklist-like tool that facilitates communication among many disciplines, somewhat akin to the Ticket to Ride tool we've described previously for in-hospital transports. It is a laminated card placed at the patient's bedside, easily visible, with sections for multidisciplinary input (eg. PT, OT instructions), information about whether the patient requires transportation home, whether medication reconciliation has been done and followup appointments scheduled, and a section for the planned date of discharge. The latter, of course, is estimated early in the admission and has to be updated regularly. The Webber-Maybank study, done at an orthopedic hospital, was associated with about a 20% sustained reduction in length of stay.
The concept is actually a throwback to the 1990s, when clinical pathways were all the rage. Many of us spent hours and hours in the 90s developing clinical pathways for the clinical problems most commonly seen in our facilities. We later concluded that they were successful primarily because of the standardized order sets we attached to them and the other things we implemented to facilitate them (such as nurse case manager programs). However, one of the good things about clinical pathways was that we usually also gave a modified copy of the pathway to the patient. That helped both the patient and the family better understand what to expect during the hospital stay and anticipate what goals needed to be met to move on. We found that these often brought out the competitive spirit in our patients. They strove to meet those goals or be ahead of them. As early as the day of admission (or pre-admission for elective surgery cases), we would show the patient when and to where we anticipated they'd be discharged. Of course, Ticket Home is doing the same thing. Using that anticipated date of discharge is also very helpful for the patient's family or other caregivers. They can plan their availability around that anticipated date.
The Ticket Home concept works well for discharges after certain types of admission. You can all readily see its potential for many orthopedic admissions. It is obviously much more difficult to anticipate the date of discharge for many medical admissions. But that should not dissuade you from adopting the Ticket Home concept because you will continuously be updating the anticipated discharge date.
The Ticket Home project had two very desirable offshoots as well: weekend discharges increased, and the number of patients being discharged prior to 12 noon increased.
We often see hospitals with lots of discharges happening in late afternoon. We say "That's great! It must mean your doctors are doing a second set of rounds and getting patients discharged a day early?" (Laughter ensues.) Wrong: they should have been discharged this morning. Now they are being discharged later in the day, often around the time of nursing change of shift or physician signout, when staffing levels may be lower. Those are factors that may contribute to fumbled handoffs at discharge. And now your housekeeping staff may also be increasingly taxed to turn around that patient room. And then there is the cascade effect, where admissions from the emergency department or transfers from the ICUs are delayed because of bed unavailability on the floors, creating bottlenecks throughout the system.
So enter another 90s tool gone by the wayside: the discharge lounge. The discharge lounge takes its origin from the hotel industry. You all know the rules: hotel checkout time is 11 AM or 12 noon. That's how they maximize use of their rooms. But they add amenities to help you. If you are attending a conference, for example, and won't be leaving until later in the day, the hotel usually provides concierge service to at least check your luggage and provide a comfortable place to wait.
So the discharge lounge concept is the same. Patients who are awaiting transportation home can be formally discharged and then wait in a designated area that is appropriately staffed and provides amenities. The area should be quiet and provide privacy and be convenient to the person who will be providing transportation to the patient. The chairs should be comfortable and there must be ready availability and easy access to bathrooms. Provision of beverages and light snacks is typical. There need to be activities for the patients (eg. TV, computer access, magazines, etc.). When you eventually have your own Louise virtual nurse discharge advocate that we noted in our Project RED columns, your patients could play those interactions over and over as many times as they want while waiting in the discharge lounge, too. Your brochure should include a phone number for the family or caregiver to call though your hospital operator should also be able to connect them to the discharge lounge (maintaining all HIPAA requirements as if the patient were an inpatient).
Staffing of a discharge lounge is variable, both by facility size and type and in some cases by state regulations. This is a great place to utilize your hospital volunteers. They can provide most of the necessary services. Since many of the patients may have difficulty ambulating, you must have some staff who are trained and competent in helping patients ambulate or moving them in wheelchairs. Many discharge lounges will help the patient get their medications by faxing copies of the discharge prescriptions to the pharmacy the patient requests. In those that have an inhouse outpatient pharmacy, those prescriptions may be filled while the patient waits in the discharge lounge. Some discharge lounges help patients schedule followup appointments. However, if you are a Project RED believer you will have already done all that before the patient leaves the floor! Whether you need an RN or not depends on your circumstances. Certainly, larger hospitals that will expect multiple patients waiting in the discharge lounge will want to have RN staffing or at least ready availability of RNs. This is also where specific state requirements may mandate RN coverage. However, remember that these are patients who are already formally discharged from the hospital. They are usually expected to be able to manage things like their medications themselves at home. But not all are capable of that and that is where an RN may be helpful in the discharge lounge. Either way, you must anticipate that a patient in the discharge lounge is likely to have some medication needs while waiting so you must either ensure they have brought their own medications or ensure that they were provided before the patient was discharged.
The discharge lounge must be desirable in terms of location and aesthetics and staffing so that physicians and nurses feel comfortable in doing morning discharges. Your biggest sales challenge is usually not to the patient and family but rather to your physicians and nurses. But remember the two biggest factors in patient/family satisfaction with a hospitalization are (1) what happens on entering the hospital and (2) what happens on leaving the hospital. You clearly want to make a good impression on both the patient and their family when they are going home. And a good discharge lounge program may also improve satisfaction on the front end by relieving ED bottlenecks. Discharge lounges aren't for everyone but you should do a cost-effectiveness analysis and see whether they make sense for your organization.
Webber-Maybank M, Luton H. Making effective use of predicted discharge dates to reduce the length of stay in hospital. Nursing Times 2009; 105: 15 (early online publication)
May 5, 2009
Adverse Drug Events in the ICU
Medication errors and adverse drug events, of course, occur in every part of the healthcare system. But there are some unique adverse drug events that occur in special areas. We have previously discussed medication errors and adverse drug events occurring in the radiology suite and in the operating room and ambulatory surgery suite. Recently there have been several good papers about adverse drug events occurring in the ICU.
In many respects, the ICU environment combines many of the crucial factors that can create a perfect storm. Patients are critically ill, often with multiple organ failure. As such, they are more vulnerable to adverse drug events than healthier patients. The number of drugs patients in the ICU receive is higher than in most other areas of the hospital, and the drugs used more often fall into the category of high-risk drugs. And often side effects of drugs are not immediately recognized because they are masked by or confused with changes related to the underlying medical conditions.
The ICU environment is arguably the most complex of any healthcare environment. It tends to have the most expansive human-technology interface, further enhancing the risk of errors. Because many of the patients are coming from other areas of the hospital or even from other hospitals, the potential for fumbled handoffs is very high. And we have previously discussed the many problems that may arise when ICU patients must be transported to other areas of the hospital for diagnostic testing, etc.
When the Harvard Work Hours and Health Study was performed, one offshoot was the Critical Care Safety Study (Rothschild et al 2005). This study looked prospectively at adverse events and serious errors occurring in the coronary and medical ICUs of a large academic teaching hospital. Data were collected over a year, using multiple different methods to identify adverse events and serious errors. They identified adverse events at a rate of 80.5 per 1000 patient-days and serious errors at a rate of 149.7 per 1000 patient-days. Around 12% of both types were potentially life-threatening. Significantly, almost half of the events were considered potentially preventable. About half of the adverse events and 78% of the serious errors involved medications. This included not only drugs used as treatments but also drugs used in prophylaxis and diagnosis and the monitoring of medications. Cardiovascular drugs, anticoagulants, and antiinfective drugs were the categories most often involved.
A new paper in the BMJ (Valentin et al 2009) looked at errors in administration of parenteral drugs in intensive care units, covering 113 units in 23 countries. One-third of the 1328 patients had at least one error. They found 74.5 events per 100 patient-days, and in 0.9% of patients an error caused permanent harm or death. Cardiovascular drugs, anticoagulants, insulin, sedative/analgesic drugs, and antimicrobial drugs were the categories most often involved. Wrong time and missed dose were the two most common types of error seen. Note that more than half of the errors resulting in serious harm were errors of omission.
In analyzing the risk factors for such errors, they found increased risk with severity of illness, the number of organ failures present, the number of parenteral administrations, higher level of care, and the number of patients per nurse. Trainees were involved in a substantial number of cases with errors. Increased workload, stress and fatigue were contributory factors in a third of all events. Interestingly, there was a strong protective relationship with the existence of a clinical incident reporting system.
However, one very interesting finding, which tends to run counter to prevailing thinking, was that error rates were higher for medications previously prepared by a pharmacist and reduced when nurses labeled syringes of medications they themselves had prepared.
Both the Critical Care Safety Study and the BMJ papers noted that the majority of errors occurred during routine care rather than during emergencies.
A recent excellent review of medication errors in critical care appeared in the Canadian Medical Association Journal (Camiré et al 2009). They point out that the number of preventable and potential adverse drug events in ICUs is double that seen on general care units, but this is likely because ICU patients receive twice as many medications. Medication errors are also considerably more frequent in medical ICUs than in surgical ICUs. They noted risk factors similar to those described above and also included problems with medication reconciliation (failure to document a patient's usual medication list) and housestaff hours worked.
We'd like to emphasize the need for medication reconciliation. Not only are some important drugs inadvertently omitted when patients are transferred to ICUs, but many medications begun in ICUs are never discontinued. The classic examples are proton pump inhibitors and other drugs used as prophylaxis against GI bleeding. While there is evidence such prophylaxis is effective in at-risk patients in the ICU, there is little evidence to support its use in non-ICU patients. These agents are frequently continued not only on transfer to the general care units but may be continued past discharge as well. In fact, we recommend regular surveillance for such occurrences, either manual or electronic.
We'd also like to speculate that the type of ICU coverage may be an important risk factor for medication errors. Though most hospitals today have dedicated ICU coverage (if not by intensivists, at least by a dedicated ICU team each month), there are still vestiges of older systems in which ICU coverage may be provided by housestaff who have responsibilities in other areas. We all know that verbal orders are high risk for errors. If staff are busy elsewhere and have to give verbal orders to the ICU staff, risks are high. While CPOE may help minimize those risks, we have also given examples of how remote order entry may be dangerous.
The biggest value of the CMAJ paper is its bibliography and its list of strategies demonstrated to prevent medication errors in the ICU. These include CPOE with clinical decision support, use of barcoding/bedside medication verification, and smart pumps. Medication reconciliation, as noted above, is extremely important. Having a clinical pharmacist round with the medical team is especially valuable in ICUs. Standardized orders or standardized medication protocols are also useful.
Lastly, the CMAJ article also has good recommendations about disclosure to patients and families about medication errors that have occurred.
Rothschild JM, Landrigan CP, Cronin JW, et al. The Critical Care Safety Study: The incidence and nature of adverse events and serious medical errors in intensive care. Critical Care Medicine 2005; 33(8): 1694-1700
Valentin A, Capuzzo M, Guidet B, Moreno R, Metnitz B, Bauer P, Metnitz P, on behalf of the Research Group on Quality Improvement of the European Society of Intensive Care Medicine (ESICM) and the Sentinel Events Evaluation (SEE) Study Investigators. Errors in administration of parenteral drugs in intensive care units: multinational prospective study. BMJ 2009; 338: b814, doi: 10.1136/bmj.b814 (published 12 March 2009)
Camiré E, Moyen E, Stelfox HT. Medication errors in critical care: risk factors, prevention and disclosure. CMAJ 2009; 180: 936-943
May 12, 2009
Errors With PCA Pumps
In a recent discussion after a naloxone reversal of excess opioid therapy, the question was raised as to why a patient had not been on patient-controlled analgesia (PCA). PCA has been a major development in pain management and, for the most part, has resulted in improved pain control and patient satisfaction, shorter lengths of stay, and better utilization of resources. Though PCA has a theoretical built-in safeguard that should prevent inadvertent narcotic overdose (i.e. as the patient becomes drowsy, he/she can no longer press the button to infuse more narcotic), use of PCA pumps has been associated with multiple problems of its own. PCA is another example where a new technology effectively eliminates some problems but introduces new ones of its own.
In our September 9, 2008 Patient Safety Tip of the Week "Less is More...and Do You Really Need that Decimal?" we discussed many of the safety considerations related to use of PCA pumps. We referred to ISMP's outstanding monograph about patient safety issues involved in PCA, published in 2006.
So just how frequent are errors and safety issues associated with use of PCA? An analysis of one year's worth of reports to the MEDMARX database (Hicks et al 2008) found that 1% of all reports related to PCA. However, 6.5% of these were associated with harmful outcomes, compared to 1.5% for all other errors in the MEDMARX database. That makes sense, because all the medications delivered via PCA are high-risk medications. They also noted that a greater percentage of harm occurred in older patients.
One of the problems in determining the frequency of adverse events related to PCA is that the MEDMARX database is a voluntary reporting database and likely considerably underestimates the true frequency. A new paper (Meissner et al 2009) has actually quantified both the frequency and costs of errors attributable to PCA. They used data from both the MEDMARX database and the MAUDE database (an FDA database that collects mandatory reports on device-related errors), then used a correction factor to adjust for the likely underreporting in both databases. They estimated the rates of PCA-related errors and device-related errors, respectively, at 407 per 10,000 people and 17 per 10,000 people. The average cost per error was $733 for errors in the MEDMARX database and $552 for those in the MAUDE database. Given that about 30% of surgical patients use PCA (the most common situation for PCA use), they estimate that errors related to PCA may cost the entire United States healthcare system about $400 million annually.
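As a rough sanity check on that $400 million figure, the arithmetic can be sketched as follows. The per-error rates and average costs are the figures quoted from the Meissner paper above; the annual US surgical volume is our own assumption, used purely for illustration.

```python
# Back-of-envelope reproduction of the ~$400 million annual cost estimate.
# Rates and per-error costs come from the Meissner et al figures quoted above;
# the annual US surgical volume is OUR assumption, not from the paper.

errors_per_10k_pca_patients = 407      # PCA-related errors (MEDMARX-derived)
device_errors_per_10k = 17             # device-related errors (MAUDE-derived)
cost_per_pca_error = 733               # USD per MEDMARX-type error
cost_per_device_error = 552            # USD per MAUDE-type error

annual_surgical_patients = 45_000_000  # ASSUMED annual US surgical volume
pca_use_rate = 0.30                    # ~30% of surgical patients use PCA

pca_patients = annual_surgical_patients * pca_use_rate

# Expected cost per PCA patient, summed over both error categories
annual_cost = pca_patients * (
    errors_per_10k_pca_patients / 10_000 * cost_per_pca_error
    + device_errors_per_10k / 10_000 * cost_per_device_error
)
print(f"Estimated annual cost: ${annual_cost / 1e6:.0f} million")
```

With those assumed volumes the sketch lands in the same ballpark as the paper's estimate, which is all a back-of-envelope like this can claim.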
The Hicks paper categorized the types of errors seen. Improper dosage or quantity was the most frequent type, but omission errors, unauthorized or wrong drug errors, and prescribing errors were also frequent, as were errors in the monitoring phase. Human factors were the leading cause, accounting for almost 70% of the causes listed, but faulty equipment, communication issues, problems with storage/handling/packaging, documentation, and name confusion were also seen. Errors were seen with the whole gamut of opioid drugs.
Other contributing factors were distraction, workload, inexperienced or temporary staff, shift changes, and cross-coverage. Many errors were also noted to occur at the time of patient transfer from one clinical area to another.
They provide actual clinical examples of the errors in the various phases of the medication cycle. Of particular note, they highlighted problems associated with dosage conversions (when converting from one narcotic to another), confusion of LASA (look-alike sound-alike) drug pairs (e.g. morphine-hydromorphone and meperidine-morphine), and retrieving the wrong medication from automated dispensing machines. The latter issue will hopefully be minimized as more facilities move to barcoding/bedside medication verification systems, but it still suggests that healthcare organizations need to pay careful attention to overrides of their automated dispensing systems. Restricting the number of medications used for PCA may also help minimize some of these errors. Tall-man lettering may help avoid some of the LASA drug-pair confusion.
Dose miscalculations and device programming errors are frequent. In our September 9, 2008 Patient Safety Tip of the Week "Less is More...and Do You Really Need that Decimal?" we discussed some specific examples of such errors. We noted the ISMP Medication Safety Alert "Misprogramming PCA concentration leads to dosing errors", which highlighted a paradoxical problem with PCA pump programming. If one programs in too high a concentration, the patient tends to get underdosed (so may suffer continued pain). If one programs in too low a concentration, the patient actually gets overdosed! This seems counterintuitive. But think about it: the patient asks for a certain dose of the narcotic and the pump delivers the volume it calculates, from the programmed concentration, to meet that request. If the programmed concentration was erroneously too low, the pump delivers a higher volume and, hence, a higher actual narcotic dose. And a warning on the pump that the concentration is too low may often be overridden because the nurse or physician feels less concerned about "too low" than "too high". Also, in our March 12, 2007 Patient Safety Tip of the Week "10x Overdoses" we pointed out another potential problem with misprogramming PCA (or other infusion) pumps. The data entry person may double-press a key (or the key may become stuck), resulting in, for example, "88" instead of "8". Also, during data entry it is possible to think one hit the decimal point key when in fact it failed to register. These types of data entry error have recently been noted in programmable intravenous infusion pumps and there have been several occurrences of 10x overdoses with those pumps.
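The arithmetic behind the concentration paradox is worth working through once. A minimal sketch (the drug, doses, and concentrations here are illustrative numbers of our own, not figures from the ISMP alert):

```python
# Why programming too LOW a concentration causes an OVERDOSE:
# the pump converts the requested dose into a volume using the
# PROGRAMMED concentration, but the syringe delivers drug at its
# ACTUAL concentration.

def delivered_dose_mg(requested_dose_mg, programmed_conc_mg_per_ml,
                      actual_conc_mg_per_ml):
    """Dose the patient actually receives for one PCA demand."""
    volume_ml = requested_dose_mg / programmed_conc_mg_per_ml  # the pump's math
    return volume_ml * actual_conc_mg_per_ml                   # the syringe's reality

# Illustrative: syringe actually contains morphine 5 mg/mL; patient requests 1 mg.
print(delivered_dose_mg(1.0, 5.0, 5.0))   # correct programming -> 1.0 mg
print(delivered_dose_mg(1.0, 1.0, 5.0))   # programmed too LOW  -> 5.0 mg (overdose)
print(delivered_dose_mg(1.0, 10.0, 5.0))  # programmed too HIGH -> 0.5 mg (underdose)
```

The delivered dose scales with the ratio of actual to programmed concentration, which is exactly why the "too low" direction is the dangerous one.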
For a variety of reasons, using independent double checks is very important when dealing with high-risk drugs and PCA. But keep in mind that the error rate for the person doing the second check may be as high as 10% (based on data on double checks in almost any industry). Such double checks should take place not only at the time the preparation is made and when it is first set up on the PCA pump, but also any time there is a change in dosage or rate, when a patient is moved from one site to another, at change of shift, etc.
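A quick back-of-envelope shows both why independent double checks matter and why they are not foolproof. The 10% checker miss rate is the figure cited above; the preparation error rate is an assumed, purely illustrative number:

```python
# If the second checker misses ~10% of errors, an independent double
# check cuts the residual error rate roughly tenfold, but not to zero.

preparation_error_rate = 0.05   # ASSUMED: 5% of preparations contain an error
checker_miss_rate = 0.10        # ~10% of errors slip past the second check

# An error reaches the patient only if it occurs AND the checker misses it
residual = preparation_error_rate * checker_miss_rate
print(f"Residual error rate after double check: {residual:.3f}")  # 0.005
```

Note this multiplication only holds if the two checks are truly independent; a checker who anchors on the preparer's work erodes much of the benefit.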
The Hicks paper also gave an example of PCA by proxy, perhaps the best known safety issue with PCA. This, of course, means the pressing of the infusion administration button by someone other than the patient, most often a friend or family member but sometimes a member of the healthcare team. In most cases, the person pressing the PCA button thinks they are helping the patient avoid pain and may not recognize the risk of narcotic overdose. One of the built-in safety features of PCA is that when patients become sedated from too much analgesic, they can no longer press the PCA button to get more. That safety measure is bypassed in PCA by proxy. In fact, the occurrence of incidents involving PCA by proxy was significant enough for the Joint Commission to issue a Sentinel Event Alert in 2004.
The 2006 ISMP monograph on patient safety issues in PCA also discusses other aspects of misprogramming PCA pumps, as well as selecting appropriate patients for PCA, monitoring patients, setting up quality indicators, and performing FMEA.
Recently, ISMP issued a Medication Safety Alert, "Beware of basal opioid infusions with PCA therapy", which included both a case presentation and an excellent discussion of many of the problems that can be associated with PCA. In particular, the patient had both obesity and sleep apnea, two conditions that predispose to significant risk during PCA, and monitoring for respiratory depression was inadequate. Use of a basal infusion in this opioid-naïve patient also predisposed to the respiratory depression that ultimately occurred. The ISMP article, and several before it, point out that basal infusions are seldom necessary in PCA and may be dangerous. That article also has a nice table of the risk factors for respiratory depression in PCA patients.
Actually, all the risk factors for hypercapnic respiratory failure we noted in our June 10, 2008 Patient Safety Tip of the Week "Monitoring the Postoperative COPD Patient" and our January 27, 2009 Patient Safety Tip of the Week "Oxygen Therapy: Everything You Wanted to Know and More!" should also be considered relative contraindications for PCA. These include COPD, certain neuromuscular disorders, chest wall deformities, massive obesity, and obstructive sleep apnea. They are not absolute contraindications to PCA, but they do require that such patients receive more intensive monitoring. The same applies to patients requiring high doses of opiates, regardless of whether they are receiving them via PCA or otherwise. After identifying high-risk patients, one must use sedation scales properly and consider using capnography in addition to pulse oximetry in such high-risk patients.
Pasero and McCaffery (2002) discuss risk factors for opioid-induced respiratory depression and keys to monitoring such patients. Infants less than 6 months old, opioid-naïve elderly patients, and patients with coexisting conditions such as COPD, sleep apnea, or major organ failure are at increased risk of respiratory depression. In addition, drugs such as intramuscular opioids, muscle relaxants and anxiolytics, benzodiazepines, sedating antihistamines, and some antiemetics may increase the risk for opioid-induced respiratory depression. In their discussion of monitoring, they provide the following very practical sedation scale:
S = Sleep, easy to arouse (acceptable; no action necessary)
1 = Awake and alert (acceptable; no action necessary)
2 = Slightly drowsy, easily arousable (acceptable; no action necessary)
3 = Frequently drowsy, arousable, drifts off to sleep during conversation (unacceptable; decrease opioid dose by 25-50%, add an opioid-sparing analgesic, and monitor the patient's level of sedation and respiratory status closely)
4 = Somnolent, minimal or no response to physical stimulation (unacceptable; stop opioid, consider administering naloxone)
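The Pasero-McCaffery scale above is essentially a small decision table. Purely as an illustration of how it maps level to action (this structuring is ours, not part of the published scale):

```python
# The sedation scale as a lookup table: each level maps to a
# (description, action) pair, paraphrasing the scale quoted above.

SEDATION_SCALE = {
    "S": ("Sleep, easy to arouse", "acceptable; no action necessary"),
    "1": ("Awake and alert", "acceptable; no action necessary"),
    "2": ("Slightly drowsy, easily arousable", "acceptable; no action necessary"),
    "3": ("Frequently drowsy, arousable, drifts off during conversation",
          "unacceptable; decrease opioid dose by 25-50%, add an opioid-sparing "
          "analgesic, and monitor sedation and respiratory status closely"),
    "4": ("Somnolent, minimal or no response to physical stimulation",
          "unacceptable; stop opioid, consider administering naloxone"),
}

def action_for(level: str) -> str:
    """Return the recommended action for a documented sedation level."""
    _, action = SEDATION_SCALE[level]
    return action

print(action_for("3"))
```

The point of the table form is that the unacceptable levels (3 and 4) carry explicit actions rather than leaving the response to individual judgment.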
PCA and other judicious uses of opiates may be very important for patient care. However, one must recognize the potential dangers and ensure proper patient and drug selection, appropriate training of staff, patients, and families, proper safeguards, and good monitoring techniques.
Cohen MR, Weber RJ, Moss J (Institute for Safe Medication Practices). Patient-Controlled Analgesia: Making it Safer for Patients. A continuing education program for pharmacists and nurses. ISMP. April 2006 http://www.ismp.org/profdevelopment/PCAMonograph.pdf
Hicks RW, Sikirica V, Nelson W, Schein JR, Cousins DD. Medication errors involving patient-controlled analgesia. American Journal of Health-System Pharmacy 2008; 65: 429-440
Meissner B, Nelson W, Hicks R, Sikirica V, Gagne J, Schein J. The rate and costs attributable to intravenous patient-controlled analgesia errors. Hosp Pharm 2009; 44: 312-324
ISMP. Misprogramming PCA concentration leads to dosing errors. Medication Safety Alert Newsletter (Acute Care Edition). August 28, 2008 http://www.ismp.org/Newsletters/acutecare/articles/20080828.asp
Joint Commission. Sentinel Event Alert. Patient controlled analgesia by proxy. Issue 33. December 20, 2004
Institute for Safe Medication Practices (ISMP). Beware of basal opioid infusions with PCA therapy. Medication Safety Alert. Acute Care Edition. March 12, 2009.
Pasero C, McCaffery M. Monitoring Sedation: It's the key to preventing opioid-induced respiratory depression. American Journal of Nursing 2002; 102(2): 67-69
May 19, 2009
Learning from Tragedies
Those of you who are regular readers of this column know we often use lessons learned from NTSB (National Transportation Safety Board) accident reports and try to apply them to healthcare. While no one wants to see tragedies, whether in aviation (or other transportation) or in healthcare, a bigger tragedy is when we fail to apply the lessons learned to prevent further tragedies. So this month we are looking at lessons learned from two major fatal aviation accidents. One was the crash of Continental Flight 3407, which happened near where we are based in Western New York. The other was the mid-air collision of two airplanes in Brazil in 2006.
We'll start with the latter. This incident is chronicled superbly, and the chain of events recreated with humanization of the key players, by William Langewiesche in the January 2009 edition of Vanity Fair (Langewiesche 2009). Pilots were flying a brand new luxury Legacy 600 private jet back to the United States from Brazil when it collided at 37,000 feet with a Brazilian Boeing 737. All 154 people aboard the Boeing 737 died, but all passengers and crew aboard the smaller Legacy 600 survived. The investigation showed the typical cascade of errors and circumstances that allowed the fatal collision to occur. Just as in medical sentinel events, breakdowns in communication were significant factors in causing the disaster.
The key elements of the crash are as follows. The Legacy 600's transponder was inadvertently turned off. The transponder is the device that allows air traffic controllers to accurately identify planes and get an accurate reading of their altitude. It is also necessary for operation of the TCAS (traffic alert and collision avoidance system) that prevents two planes from colliding. In addition, communication with air traffic control was lost for a considerable period of time. As a result, air traffic control did not realize that two planes were coursing toward each other at the same altitude, and the TCAS never alerted the planes that they were about to collide.
There are many obvious analogies to adverse incidents in medicine. Just as we have noted a particularly risky time is when equipment comes out of maintenance (see our August 7, 2007 Patient Safety Tip of the Week on the Role of Maintenance in Incidents), new equipment poses especially risky circumstances. Unfamiliarity with the equipment can lead to many sorts of errors. The new Legacy 600 had numerous new high technology features, many of them nested in various modes on the dials in the cockpit. Even though they were given training and had several test runs on this plane, the pilots were still relatively new to this plane and they did not have ready knowledge of all its features. At one point they even had to go through multiple computer screens just to find time to destination.
Sound familiar to healthcare? How often do we trot out the newest and fanciest equipment to our ORs and ICUs and expect that with minimal training staff will be able to perform without glitches? This is another great argument for standardization. Airlines that have proven success stories, like Southwest, have used one or just a few plane designs so that their pilots have little difficulty moving from one plane to another.
But what about all the high tech bells and whistles here? Lessons from other industries have provided ample warning and examples that introduction of new technologies creates opportunities for new types of errors and other unintended consequences. In our December 16, 2008 Patient Safety Tip of the Week "Joint Commission Sentinel Event Alert on Hazards of Healthcare IT" we noted that Charles Perrow, in his classic book Normal Accidents (Perrow 1999), talks about how new technologies often simply push the envelope, citing as an example how the introduction of maritime radar simply encouraged boats to travel faster and did little to reduce the occurrence of maritime accidents. Aviation is similar. In the not so distant past, the standard vertical clearance for planes was 2000 feet. However, when newer, more sophisticated instruments became available, that vertical clearance was reduced to 1000 feet. Ironically, as pointed out by Langewiesche in the current incident, the new equipment may have allowed the midair collision to occur. When equipment was less sophisticated, there would be enough variation just by chance that two planes assigned 37,000 feet would be unlikely to actually be flying at exactly the same altitude. However, with the newest altimeters, GPS devices, and autopilot systems the accuracy is so good that both planes can fly at that exact altitude, thereby increasing the chance of collision.
You are all well aware that about three-quarters of Sentinel Events reported to Joint Commission involve problems with communication. This aviation disaster had multiple problems with communication. One is the language barrier. Though English is the universal language used in aviation, there are problems that arise regarding accents, pronunciation, nuances of words and phrases, and cultural differences. In the Brazil plane collision, there were several times when the Legacy 600 pilots did not understand what the air traffic controller had said. And, for whatever reason, they did not seek clarification. In healthcare, we emphasize the need for both readback and hearback. Readback is typically used when someone is taking verbal or telephone orders and reads back the orders to the person who gave them, often spelling out specific terms. Verbal or telephone orders should be avoided whenever possible. But there are times when they are unavoidable and that is when readback is critical to minimize the chance of error. Hearback, on the other hand, is a little more complex. While it does involve to a degree repeating something that has been said, it also is a statement of understanding about what was said or intended. Particularly in healthcare we often use terminologies that are not well standardized and we need to convey back our level of understanding.
While communication is a problematic area, a more subtle one is recognizing a lack of communication. It is, of course, more difficult to recognize something negative than something positive. In this incident, the pilots and the air traffic controllers went through long periods with no radio communication. Part of situational awareness should include being attuned to what is not happening. Gary Klein (see our May 27, 2008 Patient Safety Tip of the Week "If You Do RCAs or Design Healthcare Processes...Read Gary Klein's Work") graphically describes how fire captains responding to a fire often sense that a disaster is imminent not by what they see (or hear or smell or feel) but rather by what they do not see. Similarly, in healthcare we need to pay constant attention to the variety of cues in our environment and actively seek out those things that should be there but are missing.
Just as in healthcare, there are handoffs and changes of shift that occur in the airline industry. During the flights of both the Legacy 600 and the Boeing 737 there were handoffs as the air traffic controllers changed shifts and as the planes passed from one air traffic control territory to another. The article contains only a few details about these handoffs. It is not clear whether those handoffs are done with a structured format, such as SBAR (see our September 30, 2008 Patient Safety Tip of the Week Hot Topic: Handoffs). But clearly some vital information was omitted or overlooked during these handoffs and some erroneous information (most notably the altitude of the Legacy 600) was passed on. In healthcare, handoffs should not only follow a structure to help ensure completeness and relevance but should also be carried out under optimal conditions in an interactive fashion where all parties can and do ask questions.
Problems with alarms are an issue identified in many of our healthcare root cause analyses. In this aviation incident there was no evidence of alarms purposefully being turned off, as we often see in healthcare. Rather, the problem with alarms here seems to lie in the design of the systems. When the transponder in the Legacy 600 turned off (it is not clear whether it was inadvertently turned off by one of the pilots or turned off somehow by the high technology of the aircraft), a small warning appeared on the two radio management units. That warning was simply an abbreviation for "standby" and it appeared silently, without any audible warning. Also, a small warning appeared on each pilot's Primary Flight Display indicating the TCAS (traffic collision avoidance system) was off. The latter can only be on when the transponder is active. The change in TCAS status also appeared without an audible alert. With TCAS and the transponder off, other planes cannot sense the presence of this plane, so they would not get a collision avoidance warning if they neared it, just as the Legacy 600 would not get such a warning. And with the transponder off, the actual altitude of the Legacy 600 was not accurately conveyed to the air traffic controllers on the ground. We don't know enough about aviation to understand the design of such systems, but we'd certainly wonder why you would ever want the transponder and TCAS to be off once the plane is airborne and in motion. But even if there is such a reason for an "off" option, there certainly should be an audible alert that gets the pilot to focus on the status of the transponder and TCAS systems. We've addressed alarm issues previously in our Patient Safety Tips of the Week for March 5, 2007, March 26, 2007, April 2, 2007, and April 1, 2008.
Air safety is also plagued by what are called automation surprises. These are incidents where computers and other high tech instruments are working in the background to control or correct various factors and conditions. In the past, pilots would have been aware of such conditions but now are unaware while the computers are in the background. Autopilots are now so sophisticated that changes are made very subtly and pilots are unaware until they turn off the autopilot. In the Legacy 600, the autopilot at one point appropriately changed the course of the airplane as per the flight plan. The new course should have been accompanied by a change in altitude to 36,000 feet (by convention planes cruising in a westerly direction are assigned odd altitudes and those cruising in an easterly direction are assigned even altitudes). But the altitude is not to be changed until directed by the air traffic controller and in this case communication with air traffic control had broken down so the Legacy 600 remained at 37,000 feet. In healthcare, the design of systems must take into account the possibility of automation surprises (see our November 6, 2007 Patient Safety Tip of the Week Don Norman Does It Again!).
There are multiple other conditions that contributed to the adverse outcome here. Though there do not appear to have been any significant time pressures, there were other types of pressure. One of the passengers on the Legacy 600 was a writer for the New York Times. Both the company that bought this jet and the company that made it (who also had a representative onboard) obviously wanted to impress that writer. So some of the conversations and cockpit intrusions occurring during the flight occurred because of the presence of these passengers. One is reminded of the serious incident where the US submarine Greeneville, showing off the submarine's capabilities to a group of important civilian visitors, accidentally struck a Japanese fishing trawler, the Ehime Maru, killing nine people aboard the trawler. Though the exact role of the civilian visitors in that incident is unclear, the possibilities were raised that they might have served as a distraction or may have indirectly led to the captain and crew taking some of the actions taken. Does this sort of thing happen in healthcare? Yes, there are times when there are visitors or media attending or witnessing live medical interventions. One must recognize such events as having inherent dangers and be extremely wary not to cut corners, get distracted, or be overly aggressive in attempting to get a point across. Similarly, there are times when intrusions into the OR or the pharmacy may cause distractions that facilitate errors.
Assumptions may also have played a role. It is speculated that the Legacy 600 pilots may have assumed that the air traffic controllers were doing them a favor by allowing them to stay at 37,000 feet. Apparently, the convention of odd vs. even altitudes for westbound and eastbound flights is sometimes purposefully not followed by some air traffic controllers under certain circumstances, perhaps leading to the false assumption in this case. In healthcare, a cardinal rule is to never assume that someone else has done something without verifying it.
So this accident, despite its tragic outcome, has numerous lessons not only for aviation but also for healthcare.
Next week we will discuss some of the lessons learned from the investigation of the crash of Continental Flight 3407 in Western New York. Our regular readers also know we often do a book review around the time of holidays. We had intended to review Nudge by Richard Thaler and Cass Sunstein but, in view of these new columns on lessons learned from aviation disasters, we may instead review John Nance's new book Why Hospitals Should Fly. In any event, we will review both books at some point because each has important lessons for healthcare and patient safety.
Langewiesche W. The Devil at 37,000 Feet. Vanity Fair; January 2009
Perrow C. Normal Accidents: Living with high-risk technologies. Princeton, New Jersey: Princeton University Press, 1999
Wikipedia. Ehime Maru and USS Greeneville collision
Klein G. Sources of Power. How People Make Decisions. (1999) Cambridge: MIT Press
May 26, 2009
Learning from Tragedies.
Last week's Tip of the Week dealt with lessons learned from the tragic mid-air collision of two planes in Brazil and showed numerous analogies to healthcare. This week we discuss the investigation of the tragic crash of Continental Flight 3407 near Buffalo, New York in early 2009. Flight 3407 was a Bombardier Dash8-Q400 twin-engine turboprop aircraft that took off from Newark, New Jersey. Only a few miles short of the Buffalo airport the plane entered an aerodynamic stall and plummeted to the ground, landing on a house. All 49 passengers and crew on the plane died, as did one person in the house. The NTSB public hearing on this accident took place this month and there is a wealth of information about it both on the NTSB website and in the lay press.
Details about the NTSB public hearing on this crash can be found on the NTSB website. These include the agenda, presentations, an animated video of the crash and other links. The NTSB documents related to this investigation can also be found at this link. The Buffalo News has an extensive collection of articles related to the accident as well.
On descent and approach to the Buffalo-Niagara International Airport, the Dash8-Q400 experienced a rapid loss of airspeed. Shortly after the landing gear was lowered, the plane neared stall speed. When the stick shaker alert went off, signifying imminent aerodynamic stall, the pilot pulled back on the yoke in an attempt to raise the nose of the plane. The correct maneuver to recover from such a stall is the opposite: the pilot should push the yoke down to lower the nose and accelerate the plane. The copilot had also raised the flaps, which was also not a correct response in this situation. The plane rolled and pitched, then plummeted almost straight downward. The animated video on the NTSB website provides a good reconstruction of the likely events.
Icing on the wings is the first thing you think about in Buffalo in February. And while it is hard to say icing was not a contributory factor, the NTSB investigators apparently felt the moderate ice buildup was not significant enough to cause the reduction to stall speed. The type of plane involved in this accident apparently spends more time in icing zones than do bigger jet aircraft. Crashes in the past have reportedly led some carriers to stop flying this type of plane in the Northeast and to use them only in their Caribbean fleets.
But the testimony about icing brings up several relevant points. First, the transcript of the cockpit voice recorder clearly shows a discussion about the icing that was occurring on the plane, and the copilot remarked that she had never really been in icing conditions before and commented on her fear of those conditions. That conversation may have been one of several things that distracted the pilots during the last minute or so of the fatal flight. Second, in the interviews with the air traffic controllers, the issue of how icing is monitored came up. Ordinarily, the air traffic controllers check with pilots of incoming flights to see what sort of icing conditions are prevalent. When the handoff occurred between air traffic controllers that evening, the offgoing controller told the oncoming controller there was negative icing. However, that was basically an assumption, because no pilots had specifically complained about icing (that controller had been briefed about one report of icing when he first came on his shift, but none of the 20 or so planes that had landed during his shift had complained about icing). In healthcare, assumptions are always dangerous. Almost every time we do a root cause analysis (RCA) we find one or more instances where someone assumed something that proved incorrect and was one of multiple factors leading to the adverse outcome. In fact, when we do a patient safety presentation for our students and housestaff, one of our slides says "Never Assume: it will make an ass out of u and me". Try the following sometime. Sit in on (or record) any handoff at your facility (nurse-to-nurse, housestaff-to-housestaff, attending-to-attending, etc.). Then, when it is over, review the facts with the participants and find out what is factual and what is assumed. You (and they) will be amazed at how many of the handoff "facts" are really assumptions.
Maintenance and Physical Factors
The numerous reports did not seem to uncover any significant contributory factors related to airplane maintenance or the physical condition of the aircraft. We were somewhat surprised by the lack of discussion about fuel here. Though the aircraft was below its maximum weight allowance, it does appear that it had considerably more fuel aboard than needed for the relatively short trip from Newark to Buffalo (is fuel more expensive in Buffalo than in Newark?). One wonders whether events would have unfolded the same way had the weight of that excess fuel not been on board.
Sometimes doing more than you need to can give rise to problems. We've previously talked about incidents where dosage calculations carried to the second decimal place may be clinically irrelevant for large numbers yet give rise to errors when the decimal point is missed.
We've discussed the role of maintenance in both aviation and healthcare in the past (see our August 7, 2007 Patient Safety Tip of the Week Role of Maintenance in Incidents).
Automation surprises: Autopilot
A discussion of automation surprises logically follows the discussion on icing. Several of our previous columns have discussed automation surprises, whether a computer changing things in the background or a system operating in several modes controlled via a single switch. When we first heard about this crash, the first thing we suspected was that the plane was on autopilot, which was correcting for icing, and that when the autopilot was turned off the crew encountered an automation surprise. That is, they had not been aware the autopilot was compensating for the icing, and they suddenly had to do so manually. It is not clear what role the use of the autopilot actually played in this crash. The NTSB public hearing did not emphasize it (perhaps because the investigators apparently did not feel that the icing was a major contributor). However, the plane was flying on autopilot, and the autopilot disengaged shortly before the stall and fatal descent (it is not clear whether that disengagement was automatic or manual). Use of autopilot in icing conditions is well recognized as hazardous. However, there seems to be no single industry standard about its use in those conditions. Some say autopilot should not be used at all during icing conditions; others say it may be used but should be disengaged every 10 minutes or so to assess conditions. And that assumes everyone concurs on what icing conditions are. While graded criteria for icing do exist, our bet is that there is poor agreement among pilots about the definition of significant icing.
Most of you have already personally experienced automation surprises. Have you ever been driving on a highway and started slowing down, only to find that the cruise control on your car (you forgot you had set it!) speeds you up?
In healthcare, much of our equipment in ICUs (and other areas) is high tech and computers are often doing things in the background that we are not immediately aware of. Similarly, we often do have equipment that runs several different modes off a single switch. We gave an example of such a surprise back in 2007 when we described a ventilator operating on battery power when all thought it was operating on AC current from a wall socket.
Preparation for rare/unexpected circumstances: Simulation and Rehearsal
There are certain potentially fatal circumstances that may be encountered in many industries. Most are rare and may never be encountered in an entire career. Yet you need to be prepared to deal with them if they do occur. The aerodynamic stall is one of those: the speed of the plane becomes too slow for the air moving over the wings to generate enough lift to keep the plane airborne. Pilots do receive training in stalls, mostly during simulation exercises. But it is not clear how much training is done on stalls and, perhaps more important, how often that training is updated. Something you learned 3 or more years ago and have never seen again is probably not something you will remember tomorrow.
In healthcare there are some rare events akin to the aerodynamic stall for which only simulation or rehearsal could reasonably prepare one. Surgical fires are a great example. They occur instantaneously, and most healthcare workers are ill prepared to deal with them once they occur (that is why prevention is so critical). Yet when they do occur there are responses expected from each individual in the OR, both to minimize injury to the patient and to prevent injury to other staff. Even if you don't have fancy tools to do an actual simulation, you can and should do surgical fire drills where the surgeon, nurse and anesthesiologist (and others typically present in the OR) learn and rehearse what they should do in the event of a surgical fire. And those drills should be done regularly (once or twice a year).
There is also an evolving body of literature supporting the use of simulation to improve other aspects of healthcare. A recent prospective randomized controlled trial reported in the British Medical Journal (Larsen et al 2009) showed that a group of young surgeons who received virtual reality training on laparoscopic surgery were able to perform at the level of intermediately experienced laparoscopists (20-25 cases), whereas the control group performed at the novice level (five or fewer cases); the trained group also took half the time on average to complete those cases. The accompanying editorial (Kneebone 2009) urges caution in generalizing these findings to other circumstances and notes that the study dealt with a fairly straightforward laparoscopic procedure not likely to be associated with many complications. Simulation exercises have also been very helpful in developing teamwork and improving communication among team members and have been used to simulate many emergencies and unexpected circumstances, though not in rigorously controlled trials.
One of the experts at the NTSB hearing, Robert Dismukes of NASA, noted that there are problems with the current type of simulation provided: the simulated stalls are anticipated. One's reactions when a stall comes as a surprise are likely to be quite different.
We have our own analogy for that. Fellow kayakers learned the first time they got in a kayak how to do a wet exit. That is, someone tipped their kayak over and they had to get out of it while upside down under water, tightly packed into the kayak with a rubber skirt covering their lower half and the opening of the kayak. Of course, they were instructed what to do first, and their instructor was there to ensure they did not drown trying. So you anticipate what is going to happen and plan for it. However, the first time you unexpectedly flip your kayak is a different story! You have no time to plan what you are going to do and no one to make sure you won't drown. You are upside down, under water, and you know you will drown if you cannot exit that kayak. If you panic, you lose valuable time. What if I can't get the rubber skirt to pry loose? What if I can't slide out of that tight kayak?! A wet exit might be disastrous without the prior training under more controlled conditions. But the surprise wet exit is a confidence builder. Once it has happened to you, you are no longer fearful that you won't be able to do it.
So if you are fortunate enough to be able to avail your staff of formal simulation training, make sure you program in emergencies and unexpected circumstances that truly come as surprises.
Lastly on the issue of simulation, the training for the pilots in the current crash included viewing a NASA video on stalls that also included a section on tail stalls. In a tail stall, one must do just the opposite of what one does for a wing stall. In fact, the pilot of the current crash did what one would do in a tail stall. We will never know what was going through this pilot's mind at the time. However, since tail stalls apparently do not occur in this particular type of plane, it was questioned during the public hearings whether inclusion of the tail stall in the training video was wise. Could this be an unintended consequence of information overload?
When the plane begins to nosedive, the natural tendency is to pull back on the yoke in an attempt to aim the nose upward. Unfortunately, in the aerodynamic stall described here the proper response is to push the yoke forward, lowering the nose while accelerating. That is the only way to restore enough airflow to lift the wings again. You might remember the correct maneuver if you knew the stall was coming, but it is hard to fight your natural tendencies when taken by surprise.
Again, kayakers know about counterintuitive maneuvers. When you do an Eskimo roll in a kayak (righting yourself from upside down without exiting the kayak), you have a natural tendency to try to lift your head first as you near the surface. The correct maneuver is actually to lay your head down on the surface of the water while your other movements upright your body. Your head should come up last. This is a maneuver that requires practice, practice, practice. It's almost impossible to do without proper instruction and multiple rehearsals.
Are there counterintuitive maneuvers in medicine? Think about diabetic ketoacidosis, where the serum potassium is typically elevated. Yet the serum potassium will fall precipitously with insulin therapy. You have to anticipate this drop and actually begin potassium replacement early, sometimes while the serum potassium is still relatively high. Again, such counterintuitive maneuvers must be learned.
We've discussed the sterile cockpit concept several times in the past. In the current public hearings on the crash of Flight 3407, violation of the sterile cockpit rule received considerable attention. Sterile cockpit procedures mandate that conversations not relevant to the safe operation of the plane not take place during certain critical phases of flight (such as taxi, takeoff, landing and all operations below 10,000 feet). The entire conversation between pilot and copilot appears in the transcript of the cockpit voice recorder. It demonstrates considerable conversation during the descent that was not directly related to the operation of the plane. Did that distract them from the duties at hand?
How do airlines monitor adherence to the sterile cockpit mandate? Some do line operations safety audits (LOSA), in which an independent observer sits in the cockpit, monitors and assesses multiple operations and procedures, then critiques the crew. This airline apparently did do LOSA audits but said they never showed very much. Keep in mind, too, that the cockpit is much more likely to be sterile when the crew knows their behavior is being assessed. We wonder how many airlines routinely review randomly selected cockpit voice recordings to assess cockpit sterility.
In our October 2, 2007 Patient Safety Tip of the Week Taking Off From the Wrong Runway we discussed sterile cockpit analogies in healthcare facilities. The sterile cockpit concept applies to the surgical timeout/final verification process. It also applies in those central pharmacies where the pharmacist is expected to do certain work without interruptions. And one could make a case that it should apply to any healthcare worker tasked with doing a double check or second independent verification (e.g., for a blood product or a chemotherapy infusion rate). There are probably many other circumstances where the sterile cockpit concept applies.
How many healthcare organizations actually audit or monitor those processes to see how often the sterile cockpit process is indeed sterile? We recommend that periodic audits of at least the surgical timeout be done via a sampling methodology. We actually favor videotaping OR cases for use in performance improvement activities. Letting the OR team view and critique their own performance and the performance of the team as a whole is a great way to improve coordination and teamwork and identify issues that would have otherwise been overlooked. The biggest issue is getting your physicians and legal counsel to be comfortable with such videotaping. Very few facilities currently do this.
Of interest, the NTSB report in our October 2, 2007 column mentioned that a LOSA Collaborative showed that flight crewmembers who intentionally deviated from standard operating procedures were three times more likely to commit other types of errors, mismanage more errors, and find themselves in more undesired aircraft situations compared with those flight crewmembers who did not intentionally deviate from procedures. We suspect the numbers in healthcare would be similar. So auditing as above might identify risk for other situations.
The Learning Curve
Pilot experience is obviously an important safety consideration. Generally, one sees a correlation between safe, efficient operation of an aircraft and the number of hours spent flying, particularly the number of hours spent on that particular type of aircraft. (But keep in mind that seniority and experience do not prevent errors. In fact, in our November 25, 2008 Patient Safety Tip of the Week Wrong-Site Neurosurgery we noted that certain types of error, such as wrong-site surgery, may be more likely with experienced surgeons.)
In this case, both the pilot and the copilot were relatively new to this particular type of airplane (and they also had relatively few total flight hours). Sound familiar? Last week we discussed the crash involving pilots flying a brand new airplane home from Brazil. They obviously had the same unfamiliarity with some aspects of that plane.
When one reads through the assessments of the performance of the pilot in the current crash, there is somewhat of a surprise. While the assessments were usually good, it was mentioned several times that he had some difficulties with programming the FMS (Flight Management System), but that "all the pilots transitioning to this airplane have trouble with that". That's reassuring!!!
Switching to the new plane also means that some functions you used on your previous plane(s) work differently on the new one. Several examples are given where switch locations or configurations on the Dash 8-Q400 were the opposite of those on this pilot's previous plane.
So think about healthcare. Yes, we all know that there is a learning curve for surgeons for certain types of procedures. But there is a learning curve every time a new piece of equipment is introduced almost anywhere in our healthcare facilities. Staff are often confronted with an unfamiliar new piece of equipment without proper training in how to use it. We've all seen the case where a nurse floats from one ICU to another and encounters a different type of ventilator with dials, switches and settings that are totally different from those on the ventilators in the first ICU. That is the biggest reason that standardization is so important. When a team gets together to consider purchase of new items such as ventilators, it clearly needs to include frontline staff (those who use the equipment) and actually needs to let them work with the equipment before deciding to purchase it. Where possible, the same look and feel ought to apply to the equipment in all locations. And one must be especially careful when temporary staff (e.g., float nurses, agency nurses) are brought on board. They need to be educated and oriented to all the types of equipment and the policies and procedures used at your facility or unit.
Fatigue causes a deterioration of performance in almost all work situations. Aviation was one of the first industries to institute strict work hour limitations. The NTSB investigations go into great detail about not only the work hours but also what was going on in the lives of the involved crew for a longer period of time, looking for activities that might have led to fatigue. They look at things like where the pilots slept the night prior, etc. Fatigue and other factors, like distractions and the unsterile cockpit, are things that interfere with situational awareness of the pilots.
We, of course, in healthcare have now instituted strict work hour limitations for housestaff. However, we have no limits to avoid fatigue in other members of the healthcare team (nurses, attending physicians, extenders, technicians, etc.). And we know of no tools or systems in widespread use at this time to identify fatigue so that we could remove fatigued healthcare workers from harm's way. We need to do a better job at that.
One interesting aspect of the Flight 3407 crash was that the crew's first two flights that day were cancelled due to bad weather in the Newark region, so they had considerable idle time prior to the fatal flight. We are unaware of any studies on the effect of idle time on performance. However, if it does impact performance, that could be relevant in healthcare. Think of all the times you might have an OR team waiting because of a delay in a case already in the OR. Sounds like a good research project for a human factors investigation!
Alarm problems are one of our big three findings in most root cause analyses of medical incidents (along with communication issues and problems with hierarchy/authority gradients). In this crash, there does not appear to be any tampering with alarm settings or misuse of alarms. But there may be an alarm design issue here.
The stick shaker alarm was the first clue these pilots had of the impending stall. But the investigation panel observed that the precipitous drop in airspeed had taken place over 20 seconds. One of the NTSB board members, Deborah Hersman (who always seems to make astute observations in the investigations we've read), questioned whether it would make more sense to have some other sort of alarm that would alert the crew to the dropping airspeed well before the catastrophic stick shaker alarm. That certainly makes sense. We have had multiple columns in the past focusing not only on abuse or misuse of alarms but also on faulty alarm design. Alarms must be designed in labs that actually observe how humans interact with the equipment and respond to the alarms.
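Hersman's point, that a graded warning should precede the catastrophic one, can be illustrated with a simple sketch. The thresholds below are entirely hypothetical, not actual Q400 parameters; the idea is simply that an advisory cue fires while there is still ample time to correct, well before a last-resort alarm like the stick shaker.

```python
def airspeed_alert(airspeed_knots, stall_speed_knots):
    """Return an escalating alert level as airspeed decays toward stall.

    The margins used here are illustrative only; real aircraft use
    thresholds defined by the manufacturer and certification rules.
    """
    margin = airspeed_knots - stall_speed_knots
    if margin <= 0:
        return "STALL"      # stick-shaker territory: may already be too late
    elif margin <= 10:
        return "WARNING"    # immediate action required
    elif margin <= 25:
        return "ADVISORY"   # early cue: airspeed decaying, add power now
    return "NORMAL"

# A 20-second decay from a comfortable speed toward stall speed would pass
# through ADVISORY and WARNING first, giving the crew time to react.
```

The same tiered-alert principle applies to physiologic monitors: a gentle early cue for a drifting parameter is far more useful than a single alarm that fires only at a critical value.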
The Vermont Incident
One piece of evidence in the hearings dealt with an incident in Vermont in which a similar plane developed the stick shaker alert when airspeed fell below the set limit. The crew was able to recover in that event. However, reading that transcript provides great insight into the complexity of landing an airplane and the structured processes the crew goes through. They describe at least three separate checklists: a descent checklist, an approach checklist, and a before-landing checklist. Sound familiar? Recall that the WHO Safe Surgery Checklist is actually three separate checklists as well.
The latter transcript also brings out another important point about checklists: what happens when you get distracted? It is distractions that often cause checklist items to be overlooked. After the stick shaker alarm went off, one of the pilots had to backtrack on the checklist because he forgot where he had left off. That is good practice: any time you are using a checklist and get interrupted, backtrack so you can be sure you have not omitted any steps.
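The backtracking advice can be made concrete with a minimal sketch (the class and item names below are our own illustration, not any actual aviation or clinical system): a checklist that records each confirmed item, so that after an interruption you resume from what is actually unconfirmed rather than trusting memory.

```python
class Checklist:
    """A checklist that tracks completion, so an interrupted user can
    safely backtrack instead of guessing where they left off."""

    def __init__(self, items):
        self.items = list(items)
        self.completed = set()

    def complete(self, item):
        self.completed.add(item)

    def remaining(self):
        """Items not yet confirmed; re-verify these after any interruption."""
        return [i for i in self.items if i not in self.completed]

# Hypothetical approach checklist:
approach = Checklist(["gear down", "flaps set", "speeds checked", "landing clearance"])
approach.complete("gear down")
approach.complete("flaps set")
# ... interruption (alarm, phone call, question from a colleague) ...
# Rather than resuming from memory, list what is still unconfirmed:
print(approach.remaining())   # ['speeds checked', 'landing clearance']
```

The point is not the code but the discipline it encodes: completion is recorded explicitly, so the resume point is never an assumption.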
The Vermont transcript also provides insight into the use of alerts under varying circumstances. The airspeed alert system is set at different speeds based upon whether there are icing conditions (the speed at which you approach and land is higher in icing conditions than in non-icing ones). So the crew typically sets the alert level based on the conditions and then adds a few more miles per hour so that it goes off before the critical speed is reached. Is there an equivalent in healthcare safety? We can think of a few examples where it might be important. Consider the OR. If a case is running smoothly and on time, there may be no need for certain types of alert. But suppose there are complications or other delays so that your surgical case is now running longer than normal. Is there a point in time when you need to consider an extra dose of prophylactic antibiotic? Is there a point when you need to consider repositioning the patient to avoid a nerve pressure injury or a decubitus ulcer? Is there a point where the risk of DVT has risen high enough that you'd consider beginning intraoperative DVT prophylaxis? Consider setting some sort of time-based alert to raise these questions in your surgical cases.
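The time-based alerts suggested above could be sketched as follows. The triggers and intervals here are illustrative examples only, not clinical recommendations; actual redosing intervals depend on the specific antibiotic and institutional protocol.

```python
def due_alerts(elapsed_minutes, alerts):
    """Return the alerts whose elapsed-time threshold has been reached.

    `alerts` maps a reminder message to the elapsed time (in minutes)
    at which it should fire. All values below are hypothetical.
    """
    return [msg for msg, fire_at in alerts.items() if elapsed_minutes >= fire_at]

# Hypothetical prompts for a surgical case running long:
or_alerts = {
    "Consider redosing prophylactic antibiotic": 240,
    "Reassess patient positioning (pressure injury risk)": 180,
    "Consider intraoperative DVT prophylaxis": 300,
}

print(due_alerts(250, or_alerts))
```

As with the icing-adjusted airspeed alert, the thresholds would be set conservatively (a margin before the true deadline) so the question is raised while there is still time to act.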
Lingo and Abbreviations
We thought we had a dizzying array of special words and abbreviations that have led to many errors and adverse outcomes in healthcare. But when you read these NTSB reports it is mindboggling how many terms must be recognized by all sorts of people in the aviation field. And the abbreviations used, both verbally and on paper or display screens, are incredible. Aviation clearly needs a "do not use" abbreviation list similar to what we use in healthcare.
We've identified above many of the conditions and events at the sharp end and some root causes of this unfortunate event. But the deeper root causes will be discussed for years to come. The low pay scales of pilots at the regional airlines lead to recruiting less experienced pilots, who often have to commute long distances because they can't afford housing near their airlines' bases. The low profit margins may also have impacts on training and retraining.
It is all too easy here to blame this accident on pilot error. However, when you do a root cause analysis you must insert yourself into the situation as it played out. You need to ask "Could this have happened to two other pilots?" or "Could this have happened to me?". The answer here is probably yes. There were clearly multiple system issues that need to be addressed so that an accident of this sort is not repeated. The same applies in almost every RCA we do in healthcare. The primary goal of an RCA is not to affix blame but rather to learn how to avert similar disasters in the future. Let's hope we can proactively use some of the many lessons learned from this tragic event. Not to do so would make the loss of lives even more tragic.
NTSB Public Hearing
Colgan Air, Inc. Flight 3407, Bombardier DHC8-400, N200WQ
Clarence Center, New York, February 12, 2009
May 12-14, 2009
NTSB Docket Management System
Documents related to the NTSB investigation of Flight 3407 crash
The Buffalo News
The Tragedy of Flight 3407
(contains multiple articles and links on the accident and NTSB investigation)
NTSB Reports on the crash of Flight 3407
Air Traffic Control Group Chairman's
Larsen CR, Soerensen JL, Grantcharov TP, et al. Effect of virtual reality training on laparoscopic surgery: randomised controlled trial. BMJ 2009; 338: b1802
Kneebone R, Aggarwal R. Surgical training using simulation. Early evidence is promising, but integration with existing systems is key (editorial). BMJ 2009; 338: b1001
NTSB Sterile Cockpit Procedures
NTSB Reports on the crash of Flight 3407
Cockpit Voice Recorder Group Chairman
NTSB Reports on the crash of Flight 3407
Operations Group Chairman
Interview Summaries during Field Investigation Buffalo
'From complacency to catastrophe in 20 seconds'
By Michael Beebe and Jerry Zremski
News Staff Reporters
NTSB Reports on the crash of Flight 3407
Operations Group Chairman
Interview Summaries of Vermont Incident Crew
SPECIAL REPORT: INVESTIGATING FLIGHT 3407
Low pay, fatigue 'recipe' for crash; Flight 3407 families outraged
By Jerry Zremski and Michael Beebe
News Staff Reporters
June 2, 2009
Why Hospitals Should Fly...
John Nance Nails It!
We've spent the last two columns (and several columns over the past three years) discussing valuable lessons learned from aviation tragedies that can be applied to healthcare. In fact, since the late 1980s we have used the aviation industry as a model for patient safety. Everyone involved in patient safety recognizes the rapidly growing number of effective measures that have been demonstrated to minimize errors and improve outcomes. But we have all been disappointed at our overall inability to significantly reduce the number of patients harmed each year by potentially avoidable adverse outcomes.
John Nance's new book Why Hospitals Should Fly really hits the target. While Nance masterfully weaves many best practices from IHI, NQF, AHRQ, Joint Commission, ISMP and others into his fictional St. Michael's, his theme is that we have failed because we have failed to convert to a culture of safety. The basic message: culture kills strategy every time.
Nance describes the sentinel event that really led to cultural change in aviation: the 1977 Tenerife accident, in which a KLM jumbo jet attempting to take off crashed into a Pan Am jumbo jet that had not yet cleared the runway, resulting in 583 deaths. He describes in detail many of the contributing factors, including time and monetary pressures, bad weather, poor visibility, language problems, excess fuel, and communication problems. But most importantly, the pilot of the plane attempting to take off was one of the best and brightest. Captain Jacob van Zanten was the Chief Pilot for KLM, a vice president, head of their safety program, a veteran with over 30 years of experience, and the poster child in their ad campaign. Two other members of the crew had concerns about the attempted takeoff, but each acquiesced to the captain's wish to take off. The fact that this accident could happen to people with outstanding records forced everyone in the industry over the next decade to focus on systems of safety, develop crew resource management (CRM), dismantle the powerful hierarchical structure in the cockpit, and make other changes that helped mold a totally new culture of safety and teamwork in which all parties on a flight have an equal voice.
Nance then demonstrates through the eyes of a physician visitor to St. Michael's all the lessons learned from aviation that were applied to healthcare and changed the culture to one of total, egoless support of the common goal (taking safe, effective care of our patients) in a system in which the whole team acknowledges that errors will occur and team members will catch each other's mistakes before harm comes to patients. It is a high reliability organization and a learning organization in which all members take as much pride in learning from their mistakes as in celebrating their successes. It is a culture in which teams are empowered and encouraged to do a root cause analysis on the spot and make changes to the system immediately (à la Toyota/lean thinking concepts).
It is a culture in which every member really looks at each thing they do, asking "could what I am about to do be wrong?". Essentially, in the fictional hospital a 50-50 rule is applied to everything but medications. That rule is to expect that what you are about to do has a 50% chance of being in error and harming a patient. He uses the striking analogy of a see-saw: something weighted 90-10 (i.e. a 90% chance of doing something correctly) leads to a perception that nothing is likely to go wrong, but applying the 50-50 rule instead makes you question virtually every process you are involved in. For medications, he argues, you must assume that every process is going to result in error. He also uses the "last chance, best chance" concept from the law on avoiding accident or injury, using the surgical timeout as an example of how this last chance may be the best chance to avoid wrong-site surgery.
He provides lots of examples of good team communication and removal of the hierarchical structure. He talks about how one bad apple can spoil the communication culture and notes the studies on disruptive behaviors (by both physicians and nurses). And though he gives some vivid descriptions of such bad apples, he makes a great point: it is not just the 5% at the bad end of the curve that is the problem. Rather, he stresses that it is the large, silent middle of the curve that most needs shifting to new attitudes and a new culture of safety. He also uses an amusing "bucket of crabs" analogy, in which a crab cannot escape a bucket because all the other crabs will pull it back in!
He stresses barrierless communication and the fact that physician autonomy must be subjugated in order for barrierless communication to take place. In his discussion of standardization and practice variation, he makes the point that patients do not grant physicians the right to gamble with their welfare just so a physician can demonstrate his autonomy. And he nicely describes the distinction between being a leader and being a commander (using the Star Trek analogies he favors throughout the book!). It is a culture where everyone is encouraged to speak up and is rewarded for speaking up, even if they turn out to be wrong.
He, of course, covers good patient safety practices and concepts such as the use of checklists and bundles, standardization, a fearless error reporting system, structured handoffs, briefings and debriefings, readback, shadow-a-colleague-for-a-day, staff empowerment, normalization of deviance, Just Culture, the James Reason Swiss cheese model of error defenses, and many other things we talk about in patient safety circles.
He discusses failures in perception, assumption, and communication as the three tiers that a patient safety culture must address to be successful. His discussion on assumptions (see also our discussion on assumptions in last week's Patient Safety Tip of the Week Learning from Tragedies. Part II) is excellent. In a safety culture it boils down to this: if you need to assume something, always assume the negative.
In the communication discussion, he points out that studies show 12.5% of the time we do not understand what someone is actually saying, even when we speak the same language and dialect and have the same education and profession. As we have advocated in several of our columns, his St. Michael's videotapes OR cases so that they can be used to dissect problem areas in communication and to improve teamwork and communication in a constructive manner. The CEO of St. Michael's does a nice job demolishing the arguments typically used to avoid videotaping. Videotapes are great for showing what actually went on, not what you think went on. And videotapes are great for demonstrating the old adage that 90% of communication is non-verbal.
There is, of course, discussion of sterile cockpit analogies in medicine. In addition to some of the examples we have given in the past, St. Michael's has some offshoots: the zero-exceptions rule for bedside barcoding and the no-interruptions zone when a nurse is preparing, dispensing, administering, or otherwise handling medications. The latter even includes a signal - a red towel over the nurse's left shoulder - that tells everyone not to interrupt him/her.
And, speaking of nursing, St. Michael's has created an atmosphere that promotes nurses as full equals with strong voices, empowered to make changes as needed, and freed up to provide care where it is needed most: at the bedside.
So do we think Nance's St. Michael's is achievable? Can it be done overnight? Do we have to wait for a new generation of healthcare workers? The question should not be "can it be done?" We simply must do it. Nance has nailed it when he says our failure to change the culture is central to our failure to impact patient safety. Will it happen overnight? No. In the airline industry it took a decade or more. Will it require a new generation? Not necessarily, but there will be some casualties along the way. At St. Michael's they had to let several of their top surgeons go because they could not fit into the new culture.

But we need to be developing that next generation anyway. Our professional school curricula need dramatic changes. While we teach patient safety concepts to our medical students and residents, we don't yet have any significant curricular interactions between medical, nursing, and pharmacy students. Our simulation training usually focuses on interaction directly with patients. We need to develop simulation training that brings our students together with students in the other healthcare professions they will work with for the rest of their careers. Instead of demanding perfection of all our students, we need to train them to understand that they will make errors and that the best way to prevent those errors from producing patient harm is to have teams that work together collectively to recognize and mitigate those errors. No longer can we focus our teaching efforts solely on helping our students and residents get top scores on the National Boards or their specialty board exams. Just as importantly, we need to restructure the finance side of healthcare so that incentives are aligned for all healthcare workers to produce different outcomes than we see today, and so that the behaviors needed to achieve those outcomes are rewarded.
We probably overuse the "must read" label. But we have no qualms about using that label for this masterful book by John Nance. You won't be sorry you read it. It is easy reading, thanks to his style and its quasi-fictional nature. But you'll readily recognize all the characters in the book (they all have a counterpart in your hospital!) and you'll realize why you will never impact medical error and patient safety without changing the way those characters interact.
Nance JJ. Why Hospitals Should Fly: The Ultimate Flight Plan to Patient Safety and Quality Care. Bozeman, MT: Second River Healthcare Press; 2008.
June 9, 2009
CDC Update to the Guideline for Prevention of CAUTI
In our April 21, 2009 Patient Safety Tip of the Week "Still Futzing with Foleys?" we noted that the legitimate indications for indwelling urethral catheters would be clarified in the upcoming release of new CDC/HICPAC guidelines. Well, a draft of the 322-page Guideline for Prevention of Catheter-Associated Urinary Tract Infection 2008 has been made available through the Premier Patient Safety Institute website. But don't worry, you don't have to read the entire document! It comes with an excellent executive summary which contains all the key elements you'll need. This is a great guideline that provides useful recommendations on the indications for indwelling urethral catheters, their proper insertion and care, alternatives to indwelling urethral catheters, and help in setting up the quality improvement systems your facilities need to avoid CAUTIs. The bulk of the 322 pages is made up of appendices related to the evidence base, a great resource for those who need further insight into the recommendations in the guideline.
You'll recall that the original CDC guideline was published in 1981 and had not been updated officially since that time.
Most importantly, the guideline stresses that indwelling urethral catheters should not be used for management of incontinence except in rare circumstances. Nor should they be used merely to obtain specimens for culture or diagnostic testing. The guideline provides a table listing appropriate indications for indwelling urethral catheters, such as acute urinary retention or obstruction, the need for accurate measurement of urinary output in critically ill patients, selected perioperative uses, assistance in healing of open sacral or perineal wounds in incontinent patients, prolonged immobilization, and comfort at the end of life.
The recommendations on indications for indwelling urethral catheters are particularly useful when it comes to the OR. In our April 21 column we noted that perioperative use of indwelling urethral catheters remains a significant problem in most hospitals. The guideline stresses that indwelling urethral catheters should not be used routinely in operative patients, but only when necessary. Legitimate perioperative indications include cases where prolonged duration is anticipated, cases where intraoperative monitoring of urinary output is needed, and cases where large volumes of fluid or use of diuretics are anticipated. They may also be used in incontinent patients during the perioperative period. And, of course, they may be used during urologic surgery or surgery on structures contiguous with the G-U tract. If there is a legitimate indication for perioperative use of an indwelling urethral catheter, the catheter should be removed as soon as possible (preferably within 24 hours). If catheters are used perioperatively, specific protocols should be established to guide appropriate evaluation and consideration for their removal.
In our April 21 column we noted that one of the problems with failure to remove indwelling urethral catheters postoperatively may be that OR IT systems are often poorly integrated with other hospital IT systems. The other problem is that multiple handoffs occur in the perioperative period. Patients typically go from a med/surg floor (or pre-op intake area) to the OR, then to the PACU or recovery room, then back to a med/surg floor or ICU. We strongly recommend that your structured handoff tools include a specific item related to indwelling urethral catheters.
The section on alternatives to indwelling urethral catheter use is also excellent. It discusses use of condom catheters in male patients and has an extensive discussion about use of intermittent catheterization, both clean and sterile types. Those of us who are neurologists have used clean intermittent catheterization in many patients for many years and continue to be amazed at how infrequently UTIs occur when used properly. In the hospital, however, sterile technique and equipment should be used. The guideline recommends use of a portable ultrasound device (by appropriately trained personnel) to assess urine volume so as to avoid unnecessary intermittent catheter insertions.
The section on proper techniques for urinary catheter insertion includes discussions on proper training of personnel, hand hygiene before and after catheter insertion, sterile equipment, aseptic technique, use of lubricant jelly, proper securing of the catheter to prevent movement and urethral traction, and use of the smallest-bore catheter possible. Our comment: use a checklist, just as you would if you were inserting a central line.
Not included in the guideline, but worth noting if it applies to your facilities, is a recent UK National Patient Safety Agency Rapid Response Report "Female urinary catheters causing trauma to adult males." This report notes numerous instances in which shorter female catheters, when used in male patients, resulted in pain, hematuria, retention, and penile swelling.
The section on urinary catheter maintenance focuses on maintaining sterile continuously closed drainage and unobstructed flow of urine (in the correct direction, of course). And they detail how to appropriately get samples for culture or diagnostic testing.
Some of the best advice in this guideline concerns the things not to do. This includes advice that you should not arbitrarily change catheters or drainage bags at fixed intervals and should not use systemic antibiotics routinely to prevent CAUTIs. Bladder irrigation should be avoided unless obstruction is suspected. Antimicrobials need not be instilled in either the bladder or the drainage bags, and antiseptic lubricants need not be used routinely. The guideline also notes that silver-alloy catheters or antibiotic-coated catheters need not be used routinely. And catheterized patients should not be screened routinely for asymptomatic bacteriuria. They also discuss considerations that are still subject to future research.
The section on monitoring, surveillance, and quality improvement is excellent. They suggest use of a system of alerts and reminders to identify patients with indwelling urethral catheters and assess the need for continued catheterization, and development of protocols for nurse-directed removal of unnecessary catheters. They also suggest procedure-specific protocols for perioperative management and protocols for postoperative urinary retention. Lastly, they discuss the role of administration in providing appropriate guidance and leadership, education and training, supplies, and a system for documentation and surveillance. Use of feedback, both to individual providers and to units, is highly recommended.
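To make the alert-and-reminder idea concrete, here is a minimal sketch of what a catheter-review reminder rule might look like. This is purely illustrative: the field names, the 48-hour review threshold, and the notion of a "documented indication" flag are our own assumptions, not specifications from the CDC/HICPAC guideline.

```python
from datetime import datetime, timedelta

# Hypothetical sketch of a nurse-directed catheter-review reminder.
# Field names and the 48-hour review threshold are illustrative assumptions,
# not taken from the CDC/HICPAC guideline.

REVIEW_AFTER = timedelta(hours=48)

def patients_needing_catheter_review(patients, now):
    """Return MRNs of patients with an indwelling urethral catheter in
    place longer than the review threshold and no documented indication."""
    due = []
    for p in patients:
        if not p.get("catheter_inserted_at"):
            continue  # no indwelling catheter recorded
        dwell = now - p["catheter_inserted_at"]
        if dwell >= REVIEW_AFTER and not p.get("documented_indication"):
            due.append(p["mrn"])
    return due

now = datetime(2009, 6, 9, 8, 0)
patients = [
    {"mrn": "A1", "catheter_inserted_at": now - timedelta(hours=72),
     "documented_indication": None},
    {"mrn": "A2", "catheter_inserted_at": now - timedelta(hours=12),
     "documented_indication": None},
    {"mrn": "A3", "catheter_inserted_at": now - timedelta(hours=96),
     "documented_indication": "urologic surgery"},
]
print(patients_needing_catheter_review(patients, now))  # ['A1']
```

In practice such a rule would run inside your clinical information system and route the alert to the responsible nurse, per whatever nurse-directed removal protocol your facility adopts.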
Going back to our April 21, 2009 Patient Safety Tip of the Week "Still Futzing with Foleys?" we again offer the following suggestion: treat the Foley catheter like a drug! Have it ordered through your CPOE or medication ordering system. That column lists some of the potential benefits of using such a system.
And don't forget our other columns on urinary catheter-associated UTIs:
Gould CV, Umscheid CA, Agarwal RK, Kuntz G, Pegues DA, and the Healthcare Infection Control Practices Advisory Committee (HICPAC). DRAFT Guideline for Prevention of Catheter-Associated Urinary Tract Infection 2008. CDC. http://www.premierinc.com/quality-safety/tools-services/safety/topics/guidelines/downloads/cauti_GuidelineApx_June09.pdf
National Patient Safety Agency (UK). Rapid Response Report. Female urinary catheters causing trauma to adult males. NPSA Reference: NPSA/2009/RRR002. 30 April 2009
June 16, 2009
Disclosing Errors That Affect Multiple Patients
Given that the U.S. Congress is holding hearings this week into the possible exposure of multiple Veterans Administration patients to infectious contamination during endoscopies and other procedures, it is fitting that an article appeared in the Canadian Medical Association Journal, "Disclosing Errors That Affect Multiple Patients" (Chase et al 2009).
Most of you have followed in the news the events in the VA system, in which cases of possible contamination were discovered in late 2008, affecting patients from at least 3 different VA sites and going back possibly as far as 2003. Cases involved endoscopies, colonoscopies, and ENT procedures and involved patients in Murfreesboro TN, Augusta GA, and Miami FL. In February 2009 the VA began sending notices of possible exposure to more than 10,000 patients who received care at these facilities, and testing for possible infectious complications was provided. Of those tested, 34 patients have tested positive for hepatitis C, 13 for hepatitis B, and 6 for HIV.
Frankly, given the relatively low prevalence of positivity in that large tested population, it may never be known whether any patients were infected as a result of exposure during those procedures. Nevertheless, the VA has embarked on an investigation of its infection control practices for such equipment in all its facilities. Non-VA facilities would also be wise to look at their own procedures for decontamination of such equipment.
And though events of this type generate lots of negative press, the VA appears to have handled this in an honest, transparent manner with the best interests of its patients in mind.
We have long been advocates of disclosure and apology (see our July 24, 2007 Patient Safety Tip of the Week "Serious Incident Response Checklist"). One of the items in our serious incident response checklist, actually developed back in 1991, is identifying who should talk to the patient or family after an incident in which medical error and/or patient harm has taken place. That is usually the attending physician or, in some circumstances, the medical director.
The National Quality Forum's Safe Practice 7 (Disclosure) states: "Following serious unanticipated outcomes, including those that are clearly caused by systems failures, the patient and, as appropriate, the family should receive timely, transparent, and clear communication concerning what is known about the event."
Disclosure of medical errors has been the trend in medicine over the past decade or so (Gallagher et al 2007). However, most such disclosures involve events isolated to a single patient, and the disclosures have thus been to that single patient and, where appropriate, his/her family. Disclosure of events potentially involving multiple patients is much more complex, and balancing individual patient confidentiality against the need to inform a much larger patient population and the public is problematic.
The CMAJ paper nicely spells out the steps necessary in an investigation of multiple-patient events and outlines the many issues in setting up an effective communication strategy. The authors describe the problems involved in the first step, timely identification of the error, because a representative sample of records to review requires an estimate of the potential number of patients affected. This step requires identifying the time period to review, the scope of the review, training teams to do the reviews, and developing reliability checks. It must then be determined whether the error affected clinical decision making. Finally, once the review is done, physicians need to follow up with patients who may require changes in management.
The decision about whom to notify, and when, is complex, and even harder when the number of patients to be notified is large. The authors voice their opinion that disclosure on this large a scale should be led by a physician other than the one(s) directly involved in the error. A formal disclosure plan should be developed, complete with dates and plans for specific disclosure to all potential stakeholders, including the public and the media. They point out that sensitivity must be used, particularly since some patients may have died in the interim (whether or not that was related to the incident). They provide some specific examples where timely disclosure to the public helped mitigate the response to errors and stress the need for continued honest updates to the public.
Their guidelines for public disclosure are well thought out. They stress early disclosure and note that it is best if the public hears about the event from you, not the media or other parties. As such, disclosure even before all the details are known may be appropriate, though initial disclosure should avoid making assumptions or identifying specific individuals involved. Patients need to be contacted individually. But using a website or dedicated phone line for keeping everyone up to date may be useful. The results of the investigation of the event need to be made public along with a description of the steps that were taken to prevent similar occurrences in the future. Overall, this is an excellent guideline to help any organization that must deal with an untoward event affecting or possibly affecting multiple patients.
We can also thank our Canadian colleagues from the Canadian Patient Safety Institute, who produced the Canadian Disclosure Guidelines in 2008. These deal more with disclosure to one patient and/or family. But they are extremely useful in recommending what to disclose, the setting for disclosure, who should participate, how to disclose, and how to express regret. They include a very practical checklist for the entire disclosure process.
National Quality Forum. Safe Practices for Better Healthcare: 2009 Update. A Consensus Report.
Canadian Patient Safety Institute. Canadian Disclosure Guidelines. 2008
June 23, 2009
More on Delirium in the ICU
We've done a series of columns on delirium, including our Patient Safety Tips of the Week for October 21, 2008 "Preventing Delirium," October 14, 2008 "Managing Delirium," February 10, 2009 "Sedation in the ICU: The Dexmedetomidine Study," and March 31, 2009 "Screening Patients for Risk of Delirium." Delirium is prevalent, costly (in both human and financial terms), difficult to recognize, and potentially preventable.
Most outcome studies done on delirium in the ICU have focused primarily on medical rather than surgical populations. Now a new study (Lat et al. 2009) looks at mechanically ventilated patients in surgical and trauma ICUs. Expectations were that the prevalence of delirium in these ICU patients would be lower because they were, in general, younger and lacking many of the comorbid conditions typically seen in medical ICU patients. Yet, using the CAM-ICU tool and Richmond Agitation and Sedation Scale tool daily, they found delirium at some point in the ICU stay for 63% of these patients. Patients who developed delirium had more ventilator days and longer ICU and total hospital lengths of stay independent of the illness severity or injury severity. These patients also had cumulatively larger doses of lorazepam and fentanyl. They did not find an association with mortality as seen in prior studies but did not have long-term followup on their patients. The study highlights the high prevalence of delirium in all ICU patients and especially the importance of optimizing the use of both sedatives and analgesics in the ICU population.
Use of sedation and analgesics in mechanically ventilated patients is universal in ICUs. However, the problem of oversedation is widespread, and misperceptions about the level of sedation are striking. One study (Weinert et al 2007) showed that only 2.6% of nursing staff felt their patients were oversedated, even though objective criteria documented oversedation in a third of these ICU patients. They also noted that the time of day may influence one's interpretation of sedation level.
One of the biggest causes of oversedation is the use of continuous infusions of sedating agents, especially in the elderly and those with hepatic disease (Devlin 2008). This is especially problematic when sedatives with long half-lives (e.g., lorazepam) are being used. Yet we continue to be surprised at how many ICUs still use continuous infusion of sedating agents.
Ventilator weaning protocols that include spontaneous breathing trials have been supported by numerous clinical trials. A number of clinical trials have also shown that either intermittent (as opposed to continuous) sedation or daily interruption of sedation reduces the duration of mechanical ventilation. The ABC trial (Awakening and Breathing Controlled trial) (Girard et al 2008) was a multicenter prospective controlled trial that paired spontaneous breathing trials (SBTs) with spontaneous awakening trials (SATs), compared with a usual care group, in mechanically ventilated ICU patients. Validated tools, including the Richmond Agitation-Sedation Scale (RASS) and the CAM-ICU, were used in the assessments. Patients in the intervention group had more ventilator-free days, shorter ICU and total hospital lengths of stay, and 32% better survival at one year.
A second recent paper (van Eijk et al 2009) on delirium in the ICU compared two commonly used tools to detect delirium. They found the CAM-ICU tool had higher sensitivity (64%) and negative predictive value, whereas the ICDSC (Intensive Care Delirium Screening Checklist) had higher specificity and positive predictive value. Overall, the CAM-ICU tool picked up more cases of delirium. The most important point of the paper, however, was that the physicians providing most of the care to the ICU patients (i.e., the residents) had a sensitivity of only 14%; fellows and intensivists had sensitivities of 63%. The study shows the importance of using structured tools to look for delirium, since physicians miss most cases.
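As a refresher on how these screening statistics relate to one another, here is a small worked calculation. The confusion-matrix counts below are hypothetical, chosen only to illustrate the formulas; they are not the data from the van Eijk study.

```python
# Illustrative screening-test arithmetic for a delirium detection tool.
# The counts are hypothetical, NOT data from the van Eijk study.

def screening_stats(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV, and NPV for a 2x2 table."""
    sensitivity = tp / (tp + fn)  # fraction of true delirium cases detected
    specificity = tn / (tn + fp)  # fraction of non-delirious correctly negative
    ppv = tp / (tp + fp)          # positive predictive value
    npv = tn / (tn + fn)          # negative predictive value
    return sensitivity, specificity, ppv, npv

# Hypothetical cohort: 100 delirious and 100 non-delirious patients,
# screened with a tool that flags 64 of the delirious and 5 of the others.
sens, spec, ppv, npv = screening_stats(tp=64, fp=5, fn=36, tn=95)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} ppv={ppv:.2f} npv={npv:.2f}")
```

Note how a tool with 64% sensitivity still misses more than a third of cases, which is why a low-sensitivity bedside assessment (like the residents' 14%) leaves most delirium undetected.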
To complicate matters, two papers (Kilbride et al 2009; Oddo et al 2009) recently appeared in the neurology and critical care journals pointing out the frequent occurrence of nonconvulsive seizures in ICU patients, as detected by continuous EEG monitoring. Up to 10-20% of such ICU patients, particularly those admitted with sepsis, may have electroencephalographic seizure activity. Obviously, such activity may account for variable level of consciousness or attention and might easily be confused with delirium in this patient population.
The lessons learned from this week's group of studies:
We would strongly recommend that hospitals review their current sedation management strategies and protocols in their ICUs. While it may be easier in the short run to care for the patient who is oversedated, that oversedation in the long term increases the likelihood of prolonged ventilator therapy, prolonged ICU and hospital lengths of stay, ventilator-associated pneumonia (VAP), delirium, and death. Sedation protocols should be tailored to the specific patient, using validated assessment tools, and include regular assessment to determine whether continued use of sedatives and analgesics is necessary. A good discussion of goal-directed sedation is available at the Vanderbilt University ICU Delirium and Cognitive Impairment Study Group site and includes a copy of their sedation protocol.
Lat I, McMillian W, Taylor S, et al. The impact of delirium on clinical outcomes in mechanically ventilated surgical and trauma patients. Critical Care Medicine 2009; 37(6): 1898-1905.
Weinert CR, Calvin AD. Epidemiology of sedation and sedation adequacy for mechanically ventilated patients in a medical and surgical intensive care unit. Critical Care Medicine 2007; 35(2): 393-401.
Devlin JW. The pharmacology of oversedation in mechanically ventilated adults. Current Opinion in Critical Care 2008; 14(4): 403-407.
Girard TD, Kress JP, Fuchs BD, et al. Efficacy and safety of a paired sedation and ventilator weaning protocol for mechanically ventilated patients in intensive care (Awakening and Breathing Controlled trial): a randomised controlled trial. The Lancet 2008; 371: 126-134.
Richmond Agitation-Sedation Scale
van Eijk MMJ, van Marum RJ, Klijn IAM, et al. Comparison of delirium assessment tools in a mixed intensive care unit. Critical Care Medicine 2009; 37(6): 1881-1885
Kilbride RD, Costello DJ, Chiappa KH. How seizure detection by continuous electroencephalographic monitoring affects the prescribing of antiepileptic medications. Arch Neurol 2009; 66(6): 723-728.
Oddo M, Carrera E, Claassen J, Mayer SA, Hirsch LJ. Continuous electroencephalography in the medical intensive care unit. Critical Care Medicine 2009; 37(6): 2051-2056
Vanderbilt University ICU Delirium and Cognitive Impairment Study Group. Patient-Oriented Goal-Directed Sedation Delivery.
June 30, 2009
iSoBAR: Australian Clinical Handoffs/Handovers
Most academic healthcare systems have been welcoming their new incoming residents and fellows in the past 2 weeks. Fortunately, most programs now incorporate patient safety training into their orientation programs for these new members of the healthcare team. And many have begun to incorporate specific sessions on the importance of handoffs and to introduce them to the structured formats we have been using at our individual organizations.
So it is timely that the June 1, 2009 issue of the Medical Journal of Australia contains an entire supplement dedicated to clinical handoffs. It covers handoffs from offices, from long-term care units to emergency rooms, using TeamSTEPPS training to improve communication in a mental health unit, use of video recordings of handoffs in emergency departments to improve communication, use of whiteboards, use of a standardized communication tool for maternity settings, a framework for communication in a post-anesthetic care situation, plus your more common in-hospital and inter-hospital handoffs.
They have articles on two modifications of the well-known SBAR format. One is the SHARED (situation, history, assessment, risk, expectation, documentation) format. The second is the iSoBAR (identify, situation, observations, background, agreed plan, read back) format, which they have used as a clinical handover checklist.
If you don't have subscription access to the Medical Journal of Australia (it is not included in most of our academic electronic journal collections, but you can purchase 1-week online access for about $25), you can still read about the iSoBAR program at the Australian Commission on Safety and Quality in Health Care website and download, for free, some really great materials about the tools developed and how to use them.
Communication problems underlie at least 75% of sentinel events and can be found as an important contributing factor in almost every root cause analysis (RCA) that we do. It is well-recognized that handoffs are a high-risk activity from a patient safety perspective. Development of structured handoff formats has been an important milestone in patient safety. The SBAR (Situation-Background-Assessment-Recommendation) format was originally developed in the Kaiser-Permanente health system and has been widely adopted as a handoff tool.
Perhaps the biggest problem with handoffs is that in practice they all too often tend to be one-way. Good handoffs, on the other hand, are clearly two-way communication sessions. They should be conducted in an environment that is conducive to interaction and not subject to interruptions. The most important things are allowing adequate time for the recipient to ask questions and seek clarification and for both parties to clearly understand and agree on what needs to be done. Therefore, a regional and national collaborative in Australia modified certain elements of the SBAR format to better recognize these needs (Porteous et al 2009). Involving teams with clinicians, nurses, and others, they analyzed data from multiple sources and developed both standardized operating protocols (SOPs) and minimum datasets (MDSs) that could be integrated into checklists and/or forms to be used for handoffs and transfers of care.

However, one concept they emphasized was that, even though they ultimately want a structured and standardized format, flexibility and the ability to adapt the process to meet needs at the local level are essential. That flexibility, plus engagement of clinicians to apply the concept within the context of their local healthcare environment, is important in influencing both ownership and adoption of the processes and tools. So they both added to the SBAR format and made some subtle but important changes to it. The additions are the "i" for identify (identify yourself and the patient) and the "o" for observations (factual information about the patient's condition, diagnostic studies, etc.). But it is the changes that are the most important features. The "a" has been changed from assessment to agreed plan, and the "r" from recommendation to read back. The latter is important for clarifying, for all parties, a shared understanding of the plans and of who is responsible for what. The resulting acronym iSoBAR thus stands for:
I - Identify (yourself and the patient)
S - Situation
o - Observations
B - Background
A - Agreed Plan, Accountability
R - Read Back
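One appeal of a minimum dataset is that it can be enforced in software, so that no handover element is silently skipped. Here is a minimal sketch of how an iSoBAR handover might be captured as a structured record. The field set mirrors the acronym; everything else (the class name, types, and the completeness check) is our own illustrative assumption, not part of the Australian tools.

```python
from dataclasses import dataclass

# Illustrative sketch: an iSoBAR handover as a structured record.
# The fields mirror the acronym; the rest is our own assumption.

@dataclass
class ISoBARHandover:
    identify: str      # who is handing over, and which patient
    situation: str     # why the handover is happening now
    observations: str  # vitals, test results, factual findings
    background: str    # relevant history
    agreed_plan: str   # what both parties have agreed will happen
    read_back: str     # recipient's restatement, confirming shared understanding

    def is_complete(self) -> bool:
        """A handover is complete only when every element is filled in."""
        return all(getattr(self, f).strip() for f in (
            "identify", "situation", "observations",
            "background", "agreed_plan", "read_back"))

h = ISoBARHandover(
    identify="Dr. A handing over Mr. B, bed 12",
    situation="Post-op day 1, febrile overnight",
    observations="Temp 38.4, HR 102, WBC pending",
    background="Laparoscopic cholecystectomy, type 2 diabetes",
    agreed_plan="Blood cultures, review WBC at 14:00",
    read_back="",  # not yet confirmed by the recipient
)
print(h.is_complete())  # False
```

The point of the read-back field is exactly the two-way character discussed above: the record is not complete until the recipient has restated the plan.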
While one of the first tools developed using iSoBAR was an inter-hospital transfer form, the concept and checklist can be readily adapted to multiple other types of handoff, including the typical change-of-shift handoffs that nurses and physicians do daily in hospitals. One group (Yee et al 2009) refined the process even further. Their team conducted extensive interview and observation sessions and analyzed not only verbal and written handoff content but also factors such as body language and frequency of interruptions. Through an iterative process they arrived at a new SOP and MDS with the catchy acronym HAND ME AN iSoBAR. The iSoBAR part is unchanged from above, but the additions are helpful in operationalizing and preparing for your handoffs.
The HAND part stands for:
H - Hey, it's handover time!
A - Allocate staff for continuity of care
N - Nominate participants, time and venue
D - Document on written sheets and patient notes
This ensures that all the pertinent parties are available for an effective handoff and that sufficient arrangements have been made for others to provide patient care so that minimal interruptions occur during the handoff.
The ME part stands for:
M - Make sure all participants have arrived
E - Elect a leader
This means ensuring protected time for all parties, that sessions will be prompt, and that one person ensures the agenda items are covered in a timely fashion.
The AN part stands for:
A - Alerts, attention, and safety
N - Notice (anticipated patient movements)
This part is really about prioritizing and anticipating. The alerts and attention part makes clear to everyone the items that need to be addressed first (e.g., patients who are deteriorating, test results that need to be checked soon, etc.) and other items that may be important for the safety of patients, staff, and others. The notice part deals with anticipating all potential patient movements so that arrangements can be made for changes in workflow.
In hospital pilot projects, the Yee group was able to adapt the HAND ME AN iSoBAR approach to shift-to-shift handoffs for both medical and nursing staffs in general medicine, general surgery, and emergency medicine. The concept has been well accepted in those settings. They are now in the process of testing the concept in multiple other clinical settings and collecting data about the effects on clinical outcomes.
They stress the importance of a culture change in recognizing the importance of the clinical handoff and making it a priority.
Those of you who are regular readers of our columns know we are proponents of using videotapes with feedback to help improve communications in multiple venues (see the discussions of the sterile cockpit in our Patient Safety Tips of the Week for October 2, 2007 "Taking Off From the Wrong Runway" and May 26, 2009 "Learning from Tragedies. Part II"). One of the articles in the MJA supplement (Iedema et al 2009) used such videotaping to improve the handoff process. They used a video-reflexive tool called HELiCS (Handover-Enabling Learning in Communication for Safety) to redesign handoff processes in emergency department and ICU settings. They began by discussing handover processes, observing some actual handovers, and then watching videotapes of themselves during actual handoffs. This led the clinicians to improve both their intra- and inter-disciplinary communications and to redesign multiple facets of their handoffs. Videotapes are powerful tools for promoting change. Just as we use storytelling and personalization as powerful tools to effect change in multiple aspects of patient safety, use of videotapes personalizes whatever process is being reviewed, making one aware of his/her interactions and helping one remember them.
There are multiple other great lessons in this MJA supplement. They are well worth your time. Also, take some time to play with the downloadable educational toolkit on iSoBar from the Australian Commission on Safety and Quality in Health Care website.
Kudos to our Australian colleagues who have taken the lead in developing, validating, and piloting a number of tools to improve the handoff process. It should be emphasized that measurement of the impact of implementing these tools is not yet complete, but publication of those data is expected in the near future.
Clinical handover: critical communications
Med J Aust 2009; 190 (11 supplement): S108-S160
Western Australia Country Health Service and Royal Perth Hospital. iSoBAR for Inter-hospital Patient Transfers. Australian Commission on Safety and Quality in Health Care website.
Porteous JM, Stewart-Wynne EG, Connolly M, Crommelin PF. iSoBAR: a concept and handover checklist: the National Clinical Handover Initiative. Med J Aust 2009; 190 (11): S152-S156.
Inter-Hospital Patient Transfer Form (from the WACHS Clinical Handover Initiative)
Yee KC, Wong MC, Turner P. HAND ME AN ISOBAR: a pilot study of an evidence-based approach to improving shift-to-shift clinical handover. Med J Aust 2009; 190 (11): S121-S124.
Iedema R, Merrick ET, Kerridge R, Herkes R, Lee B, Anscombe M, Rajbhandari D, Lucey M, White L. Handover Enabling Learning in Communication for Safety (HELiCS): a report on achievements at two hospital sites. Med J Aust 2009; 190 (11): S133-S136.
See also some of our prior columns on handoffs:
February 26, 2008 Nightmares… The Hospital at Night
September 30, 2008 Hot Topic: Handoffs
December 2008 Another Good Paper on Handoffs
Click here to leave a comment on any of these tips.
Click on the "Contact Us" button at the left to send us your comments on our "Patient Safety Tip of the Week" cases.
To get "Patient Safety Tip of the Week "emailed to you, click here and enter "subscribe" in the subject field.