Quality Improvement & Patient Safety Section Newsletter - June 2007, Vol 8, #3
From the Chair
Jack Kelly, DO, FACEP
ACEP Quality Course
We have a lot to celebrate: our ACEP Quality and Patient Safety Section Quality Course, just held in San Diego at the ACEP Spring Congress, was a solid success! The course was well planned by David John and the core group, who met on weekly phone conferences for months beforehand, and it really showed. The course topics were well thought out; drafts of each talk were improved upon and then handed to the "expert" lecturers for their final touch.
I'll leave it for Dave John to fill in all of the details (See article below). Kudos for a GREAT idea, which turned into a success!
CMS: Data Element Caveat for Pneumonia
PLEASE remember to tell all of your colleagues in your state, city, and local EDs that CMS has added a new Data Element "caveat" for the Pneumonia Core Measure. They now recognize that there are cases where "the patient's initial clinical picture was questionable or unclear and not suggestive of Pneumonia." The Data Element Name is "Diagnostic Uncertainty." Please document this in cases where there is a "reasonable delay to diagnosis" due to "diagnostic uncertainty". (See article below)
CMS: Surprise JCAHO Validation Survey
Finally, in the spring, my Medical Center (which had been surveyed by the Joint Commission about six weeks earlier) received a surprise visit from CMS, which performed a "Sample Validation Survey" as a means of validating the JCAHO survey process. The CMS surveyors were even more exacting than the Joint Commission, and they found several life safety and nursing survey deficiencies that the Joint Commission had passed by. Has anyone else had this "double scrutiny" in the last year? In any case, my ED still sailed through without a single deficiency; we are really happy about that!
Final finally, the Chief Quality Officer of my Medical Center presented an algorithmic approach to patient safety using the "Just Culture" strategies. I saw it briefly about two months ago, and it is really something worthy of your attention. As I gain more insight, I will bring it to this newsletter; check out the website as a start.
Back to Top
ED Quality Course - April 25, 2007
David John, MD, FACEP
Our Quality course was offered for the first time at the Spring Congress last month. In all there were around seventy attendees, not bad considering it was a free afternoon on a beautiful day in San Diego. The feedback and evaluations were excellent. A consistent comment was that this course ought to be offered several times around the country.
The faculty included Drs. Jim Augustine, Shari Welch, Azita Hamadani, Chris Beach, Helmut Meisl, Kevin Klauer, and Annie Gerhard from ENA. Some guy named Dave John mumbled into the microphone for the first hour. We even had a Veterinarian, Dr. Irina Miles in the audience who participated in the grant. Angela Franklin deserves most of the credit for getting us all together and in San Diego.
The course consisted of:
- The Case Review
- Data Collection
- System Fixes
- Expert Panel Discussion
The surprise was the panel discussion. It was animated and the questions just kept coming from the audience. The panel had prepared for almost any topic and our experts held up well. In the future, we may pose questions to the experts to touch on important topics that might otherwise be missed.
Many of us met for the first time over lunch before the course and a wrap up event at a local watering hole. About twenty-five of us brainstormed right after the course on future directions for the ED Quality Course. A future conference call in September will be open to any and all members of the QIPS Section and beyond.
The course was successful, well attended, and an important educational offering for the College. A PowerPoint presentation of the Case Review lecture is available online which includes all of the important points from the presentation for you to share. It is also available by mail. There will be a shipping and handling fee, but it should be minimal. It will be a good educational piece for a seventy-minute drive.
I hope all of you get a chance to attend a future course offering, and please keep us in mind for future state chapter meetings. The course lasts about four hours, and we have plenty of faculty; about twenty-five of us participated in the grant.
I want to personally thank the entire faculty who flew to San Diego at their own expense, all of you who participated in the grant, and especially Angela Franklin who made it work.
Back to Top
Joint Commission to Extend Time to Antibiotic Administration
Marilyn Bromley, RN
Angela Franklin, JD
The Joint Commission/CMS performance measure for emergency departments' treatment of pneumonia is being modified, and will now align with Infectious Diseases Society of America/American Thoracic Society (IDSA-ATS) consensus guidelines.
In 2004, the Joint Commission issued standard PN-5b, which requires giving patients an antibiotic within four hours of presentation if they present to an emergency department and are discharged with a diagnosis of pneumonia. Now, the Joint Commission plans to extend the time to administer an antibiotic to six hours (PN-5c).
The Joint Commission will also allow emergency physicians to document "diagnostic uncertainty" to indicate that the diagnosis of pneumonia was not clear at the time of the patient's arrival in the ED. Such cases will be excluded from the denominator when determining a hospital's performance on the measure. The Joint Commission stresses, however, that for the present data collection period (April 1, 2007 through September 30, 2007) all three timing measures are in effect: the four- and eight-hour measures, and the six-hour test measure. The change will be reflected in the Joint Commission Specification Manual version 2.3, effective for October 1, 2007 discharges.
Specifications for the six-hour antibiotic timing measure (PN-5c) were posted on the Joint Commission website in December 2006 and were incorporated into the Specifications Manual for National Hospital Quality Measures version 2.2 to be implemented with April 1, 2007 discharges. Because the measure was pending National Quality Forum (NQF) endorsement at that time, implementation was as a test measure. On April 20, 2007 NQF announced their endorsement of the PN-5c measure (Initial Antibiotic Received Within 6 Hours of Hospital Arrival). NQF's endorsement of PN-5c replaces their past endorsement of the PN-5b measure (Initial Antibiotic Received within 4 Hours of Hospital Arrival). The Specification Manual version 2.3 effective for October 1, 2007 discharges removes the "test measure" designation for PN-5c.
The IDSA-ATS guidelines support the measure, stating clearly in the section on suggested performance measures that initiation of treatment would be expected within 6 to 8 hours of presentation whenever the admission diagnosis is likely community-acquired bacterial pneumonia (CAP). The Technical Expert Panel for the CMS National Pneumonia Project recommended the six-hour antibiotic timing measure in light of the measure modifications listed below.
In addition to extending the window for initial empiric antibiotic administration from four to six hours after hospital arrival, the Joint Commission and CMS have made the following revisions to the pneumonia antibiotic timing performance measure in response to concerns about potential unintended consequences:
In July 2006, the Joint Commission added the data element "Chest X-ray," requiring a positive chest x-ray or CT scan during the hospitalization to confirm the diagnosis of pneumonia; cases without a positive radiographic test are excluded from the measure.
Effective October 2006, the Joint Commission revised the data element "Pneumonia Diagnosis: ED/Direct Admit" and has given hospitals the following guidance regarding abstraction:
For pneumonia patients admitted through the ED:
- If the ED physician documents pneumonia/infiltrate/pneumonitis (probable/suspected) as the ED final diagnosis/impression on the ED form or in the ED dictation, answer "yes" to pneumonia diagnosis (regardless of who else sees the patient).
- If the ED physician does NOT document pneumonia/infiltrate/pneumonitis as the ED final diagnosis/impression, but that SAME ED physician then writes the admit note or admit orders with a diagnosis of pneumonia, answer "yes" to pneumonia diagnosis.
- If the ED physician does NOT document pneumonia/infiltrate/pneumonitis as the ED final diagnosis/impression, but a hospitalist/attending/consultant later writes orders or an admit note with a diagnosis of pneumonia (whether the patient is still in the ED or not), answer "no" to pneumonia diagnosis. The rationale for answering "no" here was the complaint, heard from around the country, that hospitals were being held accountable for timely antibiotic delivery in patients for whom the ED physician never considered a diagnosis of pneumonia; an attending or hospitalist would come down several hours later and diagnose pneumonia, long after the time frame for the antibiotic timing measure had passed.
Essentially, this data element ensures that a case is not included in the denominator of the measure if the ED physician's diagnosis is not pneumonia at the time the patient is discharged from the ED.
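For departments that abstract these cases electronically, the three rules above amount to a short decision procedure. Here is a minimal sketch in Python; the function and parameter names are illustrative assumptions, not part of any CMS or Joint Commission specification:

```python
def pneumonia_diagnosis_element(ed_final_dx_is_pneumonia: bool,
                                same_ed_md_admits_with_pneumonia: bool,
                                other_md_admits_with_pneumonia: bool) -> str:
    """Sketch of the "Pneumonia Diagnosis: ED/Direct Admit" abstraction
    rules (illustrative only; defer to the Specifications Manual)."""
    # Rule 1: ED final diagnosis/impression is pneumonia -> "yes"
    if ed_final_dx_is_pneumonia:
        return "yes"
    # Rule 2: the SAME ED physician writes the admit note/orders
    # with a pneumonia diagnosis -> still "yes"
    if same_ed_md_admits_with_pneumonia:
        return "yes"
    # Rule 3: only a hospitalist/attending/consultant later diagnoses
    # pneumonia -> "no" (the case drops out of the measure denominator)
    return "no"

# Example: pneumonia first appears in the admitting hospitalist's note
print(pneumonia_diagnosis_element(False, False, True))  # prints "no"
```

The point of the sketch is simply that the answer turns entirely on what the ED physician documented, not on who ultimately carries the pneumonia diagnosis.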
The Joint Commission also added the data element "Diagnostic Uncertainty" and has given the following guidance: the primary intent of this data element is to determine whether the physician identified clinical circumstances that would delay the diagnosis of pneumonia. The physician must specifically document that the diagnostic picture was questionable or unclear and not suggestive of pneumonia, as in the following examples:
- Clinical picture not clear
- Diagnostic picture unclear
- Not suggestive of pneumonia
- No obvious signs of pneumonia
- No overt evidence of pneumonia
- Atypical presentation
- Poor patient cooperation because of impaired mental status
Because this is a specific data element, there is the ability to audit how frequently the data element is used to exclude patients from the denominator of the antibiotic timing performance measure in an effort to address potential "gaming" of the system.
References: The Joint Commission; "JCAHO Tweaks Emergency Departments' Pneumonia Treatment Standards," JAMA, April 25, 2007, Vol. 297, No. 16, pp. 1758-1759.
Back to Top
CMS PQRI Question of the Week
Angela Franklin, Staff Liaison
Question: Why should I participate in Physician Quality Reporting Initiative (PQRI)?
Answer: Eligible professionals can use participation in the PQRI program to improve the care of the patients they serve through evidence-based measures grounded in clinical guidelines.
Participating in PQRI is a way to prepare for future pay-for-performance programs.
Finally, the 1.5% bonus incentive is new money being made available to reward participating professionals.
June 11, 2007
The 2007 Physician Quality Reporting Initiative (PQRI) kicks off on July 1st. To help providers prepare, CMS will host three National Provider Calls on the following dates:
- June 13, 2007, 3:00 - 5:00pm EDT. Technical discussion of quality measures, with a question and answer session.
- June 20, 2007, 3:30 - 5:30pm EDT. One of two open question and answer sessions.
- June 27, 2007, 3:00 - 5:00pm EDT. The second of two open question and answer sessions.
CMS has finalized specifications for the PQRI measures, and has designated the code "G8300" as the code providers may use to test their systems with $0 or $0.01 charges prior to the July 1 start date for reporting under PQRI. More information on how to use the code to test reporting may be found under the Reporting Link on the CMS Website.
CMS has provided a handbook, "Coding for Quality"; a Code Master that provides a sequential list of all ICD-9-CM (I9) and CPT® (CPT4) codes with associated CPT II exclusion modifiers included in the 2007 PQRI; and a PQRI Fact Sheet, which provides a general overview.
CMS also plans to provide individual coding worksheets for each measure prior to July 1. These AMA-developed worksheets are designed to assist physicians and their staff in selecting measures and reporting appropriate codes for the PQRI.
For more information, FAQs, and updates on the PQRI, please see the CMS website and ACEP's website.
Emergency Medicine Measures
There are nine (9) measures on the PQRI list that include CPT II E/M Codes used by emergency physicians. Seven (7) were developed with ACEP input:
#28 Aspirin at Arrival for Acute Myocardial Infarction (AMI)
#54 ECG Performed for Non-Traumatic Chest Pain
#55 ECG Performed for Syncope
#56 Vital Signs for Community-Acquired Bacterial Pneumonia
#57 Assessment of Oxygen Saturation for Community-Acquired Bacterial Pneumonia
#58 Assessment of Mental Status for Community-Acquired Bacterial Pneumonia, and
#59 Empiric Antibiotic for Community-Acquired Bacterial Pneumonia.
Following revisions in the measure specifications, other measures that emergency physicians may choose to report are: #29 Beta-Blocker at Time of Arrival for Acute Myocardial Infarction (AMI); and # 47 Advance Care Plan.
Tips for successful reporting: begin reporting on July 1 or as soon as possible thereafter; focus on measures your department or practice sees most, and report on as many of these as practical.
Back to Top
Recent Conference: Knowledge Translation and Emergency Medicine
David Meyers, MD, FACEP
Just prior to the 2007 annual meeting of the Society for Academic Emergency Medicine (SAEM), a Consensus Conference on Knowledge Translation (KT) was held in Chicago on May 15th. The 100 or so attendees, mostly researchers and teachers of emergency medicine from the USA, Canada, and other countries, discussed how to facilitate the use of evidence-based medical research at the bedside, so-called Knowledge Translation.
The conference organizers, Drs. Peter Wyer, Eddy Lang, and Barney Eskin, are all well-known leaders in evidence-based medicine and in the SAEM Evidence-Based Medicine Interest Group.
The concept of, and need for, linking research to the patient is not new. Over the past 20 years, Dr. Lawrence Weed, creator of the Problem Oriented Medical Record (the SOAP note system), and many others have published approaches to solving this problem. Unfortunately, progress has been glacially slow, though it has taken on more urgency as the IOM, JCAHO, CMS, and other payers seek ways to bring the fruits of research to the patient's experience, thereby making care safer and more effective.
Examples abound of the slow rates of penetration into daily practice of clinically proven interventions or treatments (aspirin in MI, various X-ray decision rules, timely antibiotic in CAP) and of the continued use of certain treatments and interventions with demonstrated lack of benefit (antibiotics in URIs, "GI cocktail" in differentiating coronary from GI related mid-epigastric pain). Certainly, this is a driving force behind Medicare's Core Measures and PQRI initiatives, hospitals' and insurers' use of InterQual and M&R "criteria", many JCAHO standards and other approaches to improving care by getting docs to change their practices.
Among the many goals and objectives for the conference were these:
- Raise awareness of barriers to incorporating research into clinical practice;
- Review and evaluate strengths and weaknesses of various methodologies and domains for translating evidence to bedside decision-making, including:
  - Integrated evidence awareness systems
  - Decision support technology
  - Clinical pathways
  - Inter-departmental protocols
- Create a research agenda for further study of knowledge translation;
- Identify effective strategies for implementation;
- Prepare a publication summarizing the main findings of the Conference.
The wide range of topics under the Conference umbrella included how to change physician behavior and the use of decision-support technology and clinical pathways, both in EM and across disciplines.
At the opening session, Dr. Eddy Lang elucidated the magnitude of the gap between research and bedside practice, pointing out two areas of focus: 1) the gap between basic science and clinical research, and 2) the gap between clinical research and the bedside provider. Two tracks were presented: a "research" track, where discussants addressed research methodologies in KT, funding, and model development; and an "application" track, where discussants focused on theoretical underpinnings, decision support technology, implementation in a busy ED, and developing evidence summaries for use by providers.
The lunch speaker, Dr. Carolyn Clancy, Director of the Agency for Healthcare Research and Quality, made some interesting observations in support of the efforts of the conference.
Afternoon break-out sessions were organized around various large themes and sub-elements, namely:
- Evidence Implementation
  - Guidelines, pathways; evidence synthesis
- The EM practitioner and KT
  - CME, self-improvement; cognitive, social, and behavioral issues
  - The Clinical Teaching Unit and Informatics
- The Macro View
  - Health policy and KT; medicolegal and ethical considerations
- Context-specific challenges
  - International EM
  - Public health
- Science of Evidence Implementation and Dissemination of Innovation
  - Research principles and methodology
  - Capacity development and research networks
The organization of the conference with numerous simultaneous sessions made it impossible for an individual to participate in most of them. I chose to attend the "CME & Self Improvement" session where we discussed the limits of CME in changing physician behavior and the value of "just-in-time" information to the bedside for immediate decision-making by the practitioner. I also attended the "Medicolegal and ethics" session where we held a vigorous discussion of the potential for KT to improve care and reduce bad outcomes and of the need for more and better patient education. Though little exchange took place on the ethical implications of KT, the group elected to focus further efforts on that subject.
It is expected that each of the group leaders will host continuing listserv discussions on their respective topics over the coming months in preparation for publication in the November issue of the journal "Academic Emergency Medicine" dedicated to the proceedings of this conference. The discussion is just getting under way with lots of opportunity for participation and input. It should be interesting.
Back to Top
Article Review: The Swiss Cheese Theory of Error
Elaine Thallner, MD, MS, FACEP
The usual reaction to an error is to seek individuals at fault and then to punish them. The underlying assumption is that the error-maker had some sort of mental lapse: carelessness, lack of attention, poor motivation, lack of knowledge, recklessness, or negligence. We hope that punishing that individual will reduce the number of errors through "teaching him/her a lesson" and that holding the threat of punishment as an example will instill fear of committing an error in everyone else (presumably by encouraging individuals to be more careful, more attentive, less reckless, etc). This notion is strongly supported by our legal system. Unfortunately, this tradition encourages the hiding of errors, hinders learning from errors, and makes improvement difficult.
In a 2000 article in the British Medical Journal ("Human Error: Models and Management"), James Reason proposed an alternative model for thinking about the causation of errors (http://www.bmj.com/cgi/content/full/320/7237/768). Instead of the underlying assumption about error being caused by individual fault, he offers a model that highlights a system approach, in which humans are acknowledged as fallible, errors are to be expected, and ‘error traps' in the workplace need to be addressed.
He proposed a "Swiss Cheese Model" in which each piece of cheese represents a safeguard or barrier to error. For example, for conscious sedation in the ED, the slices of cheese may represent the identification of the patient, the electronic medical record, conscious sedation forms, the protocol for checking drug dosages, mandatory "time-out" procedures, a non-harried nurse, and other factors. Most often, the holes in the cheese do not line up and an error is averted. When the holes do line up, the opportunity for a significant error exists.
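The arithmetic behind the model is worth making explicit: if the barriers fail independently, the chance that every hole lines up at once is the product of the individual failure probabilities, which is why even several imperfect safeguards can make a serious error rare. A small illustrative calculation in Python (the probabilities here are invented for the example, not drawn from the article):

```python
# Hypothetical per-barrier failure probabilities for an ED conscious
# sedation: patient ID check, EMR, sedation form, dosage protocol.
barrier_failure_probs = [0.1, 0.05, 0.2, 0.1]

# An error reaches the patient only if every barrier fails at once.
p_all_fail = 1.0
for p in barrier_failure_probs:
    p_all_fail *= p

print(f"Chance all four barriers fail together: {p_all_fail:.4%}")
# prints: Chance all four barriers fail together: 0.0100%
```

Even with each individual barrier failing fairly often, the combined chance of all four failing together is one in ten thousand, which is the intuition behind layering defenses rather than relying on any single one.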
Dr. Reason briefly discusses the organizational characteristics of three high reliability organizations (systems operating in hazardous conditions that have fewer than expected adverse events: nuclear aircraft carriers, air traffic control systems, and nuclear power plants). These high reliability organizations anticipate that errors will occur, and train personnel to recognize and handle errors. They rehearse scenarios of error and look for system fixes.
Although medical care is unique, we have much to learn from other industries about changing our culture to support learning from error and to improve our systems to reduce the occurrence of errors.
Back to Top
Call for Essays for Longevity and Tenure Awards
Angela Franklin, Staff Liaison
The ACEP Section of Careers in Emergency Medicine is soliciting nominations for an award for emergency physicians in the following two categories:
- A Longevity Award for the physician with the longest active career in emergency medicine.
- A Tenure Award for the physician with the longest active career in the same emergency department.
Recognition is also given to those physicians who are still actively practicing emergency medicine after 20, 25, 30, and 35 years. Further details are available online. Members do not have to belong to the Careers section.
Nominations are due by July 9 to Tracy Napper c/o email@example.com, fax: 972-580-2816, or by mail to:
Section of Careers in Emergency Medicine
PO Box 619911
Dallas, TX 75261-9911
Welcome New QIPS Members!
Alan T Forstater, MD, FACEP, Philadelphia, PA
David Aaron Baker, Clarksville, TN
David E Custodio, MD, FACEP, Akron, OH
David Josef Amin, MD, Torrance, CA
E Scott Isbell, MD, FACEP, Glendora, CA
Frances Jensen, MD, Baltimore, MD
Gregg A Miller, MD, Torrance, CA
Ian W Cummings, MD, PhD, Fort Pierce, FL
James Malcolm Schmidt, MD, Norfolk, VA
John Albert Vozenilek, III, MD, FACEP, Evanston, IL
John J Parker, MD, FACEP, Flat Rock, NC
Kevin Michael Klauer, DO, FACEP, Canton, OH
Kristine Thompson, St. Augustine, FL
Maureen A Gang, MD, FACEP, New York, NY
Maya R Heinert, Sacramento, CA
Robert M Hutton, MD, Nashville, TN
Samuel H Ko, Rochester, NY
Sang Do Shin, Seoul, South Korea
Stuart Gary Kessler, MD, FACEP, Queens, NY
Susan Gail Vanpelt, MD, Buffalo, MN
Suzanne K Elliott, MD, FACEP, Plattsburgh, NY
Teresa Sullivan Dolan, MD, State College, PA
Thomas W Lukens, MD PhD FACEP, Cleveland, OH
Timothy William Jahn, MD, FACEP, Green Bay, WI
Todd W Zaayer, MD, FACEP, Oceanside, CA
The ACEP Section on Quality Improvement and Patient Safety (QIPS) is always looking for prospective new members. If you know of anyone in your group who has an interest in QI/PS (or has had it thrust upon them), please encourage them to join our section. We endeavor to keep our membership updated on the latest in the field – and the $35 per year cost is significantly less than ANY publication on the market. Please click here for information on membership.
Back to Top
Article Review: Easy Ways to Resist Change in Medicine
Helmut Meisl, MD, FACEP
Here is a summary of an article from the BMJ that I found very amusing, but also very true and wise. It highlights issues that all of us in QI face regarding physicians, and often in ourselves. The authors are A. Shaughnessy and David Slawson (BMJ, Dec 2004; 329(7480): 1473-1474), and the piece is so well written that I will quote liberally. They also provide a table of Levels of Belief:
- Class 0 - Things I believe
- Class 0a - Things I believe despite the available data
- Class 1 - Randomized controlled clinical trials that agree with what I believe
- Class 2 - Other prospectively collected data
- Class 3 - Expert opinion
- Class 4 - Randomized controlled clinical trials that don't agree with what I believe
- Class 5 - What you believe that I don't
The abstract states that there are numerous forces being imposed on physicians to change behavior. "Even continuing medical education, previously a form of intellectual entertainment or a forum for much needed sleep, has refocused its efforts towards improving the care of patients."
Techniques to Resist Change
- Don't Pay Attention - "Get so busy with your practice that you do not have the time to read, attend meetings..."
- Attack the Data - Dismiss the source and question "the validity of the information." Also, "question the applicability to your patients". "This technique is especially useful when data from large studies contradict our impressions from personal experience with a few patients."
- Maintain Absolute Confidence - Everything necessary to practice medicine was taught in medical school. "Instead of worrying about this newfangled 'evidence-based medicine', stick with 'belief-based medicine'."
- Follow the Pack - "Stay far back, waiting for all your colleagues to change before you (reluctantly) join them."
- Defer to Experts - "An expert is always available to support your death grip on the status quo."
- Bring in the Lawyers - "A good defense against change is to assert that you will get sued if you start doing something new."
- Blame Patients - One cannot change, because the patients will not like it. "Everyone will understand why you still give..... antibiotics for colds if you tell them your patients don't want you to stop."
- Show How Much You Changed - "Point to all the new drugs you use as a result of information provided solely by pharmaceutical representatives. After all, it's more important to feel up to date than to actually be up to date."
- Pull Rank - When someone else such as a nurse or patient makes suggestions, be sure to ask them where they received their medical degree.
- Simply Refuse - "I wouldn't believe this information even if it were true."
"Using these time honoured techniques will allow you to practise with the assurance that little thinking will be required that might distract you from the matter in hand - taking care of patients as you see fit - and will keep you in total control without any nagging feelings that there might be a better way to practise."
Back to Top
Article Review: How Doctors Think
David Meyers, MD, FACEP
The January 29, 2007 issue of The New Yorker contained an article by Dr. Jerome Groopman, entitled "What's the Trouble – How Doctors Think." Dr. Groopman is an internist in Boston and a regular contributor to the magazine on medical topics. This essay summarizes his new book, How Doctors Think, recently published by Houghton Mifflin and already on the New York Times Best Seller list. The article is available on-line.
The article opens with a case vignette. The patient, a "trim and extremely fit" man in his early forties, developed "sharp" chest pain while hiking in the woods. Although he had some milder discomfort for a few days preceding, this sharp pain was different and worried him. This pain hurt worse when he took a deep breath. So he went to an ED in Halifax, Nova Scotia, where Dr. Pat Croskerry was on duty.
Dr. Croskerry, as many of you may know, is an experienced emergency physician with a special interest and expertise in medical errors in the ED, and one of the leaders in our field. He performed a history and physical exam, which revealed no risk factors for coronary artery disease. Diagnostic studies were performed, including an EKG, chest X-ray, and cardiac enzymes, and all were normal. The young man was discharged after Dr. Croskerry told him his condition was likely due to a muscle strain, and was sent on his way with a reassuring comment: "My suspicion that this is coming from your heart is about zero."
You can probably predict what happened when Dr. C arrived for his shift the next day. He heard those chilling words guaranteed to strike terror into the heart of every ER doctor, "You know that patient you saw last night?" The patient had returned with an acute myocardial infarction (AMI).
Dr. Groopman continues with a discussion of Dr. Croskerry's interest in the cognitive dimensions of clinical decision-making: how doctors collect and interpret information from the H & P and diagnostic tests to arrive at a diagnosis and treatment plan. In particular, he recognizes that ER doctors face great uncertainty due to numerous factors: the lack of prior clinical relationships with our patients; the limited information available during our encounters; and, not mentioned but no less important, the production pressure to keep patients moving from waiting room to treatment area, the need to avoid over- or under-utilization of resources, and the relatively chaotic and distracting environment.
Whether because of these factors and our medical training, or just because humans have evolved tools and behaviors to assist in making decisions, we use heuristics or rules of thumb as shortcuts to get to the end result more effectively. We start using these rules (with biases) before we even see our patients, like when we pick up a chart and see the name of a "frequent flyer" with headache or watch a patient wheeled in on a gurney with the pain of a kidney stone.
Perhaps the heuristic used by Dr. Croskerry in his chest pain patient could be summarized as: a trim, fit man in his early forties with no cardiac risk factors, pain that worsens with a deep breath, and a completely normal workup is very unlikely to have heart disease.
In fact, these rules of thumb are often successful, as demonstrated in Malcolm Gladwell's book, Blink, which I discussed in a prior issue of this newsletter. But there is a darker side to their use: doctors are often inappropriately influenced to make the wrong diagnosis because the problem or patient they are dealing with strongly resembles another patient who had the benign index condition; in Croskerry's patient, muscle strain. With this bias, the "representativeness error," all the factors he evaluated pointed toward a non-cardiac condition. A wiser Dr. C, after reckoning with his error, is quoted in the article as saying, "You have to be prepared in your mind for the atypical and not be too quick to reassure yourself, and your patient, that everything is OK." Ahhhh, if only it were so simple.
A second ER case is described in the article, in completely different circumstances: a case of aspirin poisoning misdiagnosed by the ER doc as pneumonia. The doc was very self-critical when the internist to whom he admitted the patient told him what the diagnosis turned out to be. With the benefit of hindsight, the ED doc, reflecting on his colleague's revelation, realized that the patient had exhibited a number of classic features that were misinterpreted because of another common cognitive error, the "availability" heuristic, which encourages us to "judge the likelihood of an event by the ease with which relevant examples come to mind." The doc's response after examining his mistake was, "This [the features of aspirin poisoning] was something that was drilled into me throughout my training. She was an absolute classic case… and I missed it. I got cavalier." This attitude again reflects our essentially self-critical and perfectionist view of ourselves as physicians. It leads us to say and believe, "I should have been better; if only I had tried harder, remembered something I forgot, avoided biases, etc., I could have gotten it right." Ahhhh, if only it were so simple.
Another influence that plays a role in our thinking is confirmation bias, a cognitive strategy by which we accept information that confirms our suspicions while rejecting information that runs counter to our expectations. We do it all the time, rejecting an abnormal lab result because it just doesn't fit (or matter?). This is certainly a common problem in medicine; those of us who review charts for quality improvement or legal consultations see it all the time.
But it is not only a problem for physicians. Confirmation bias also played a major role in the Challenger Space Shuttle disaster, when engineers rejected information about the performance of the O-rings in cold temperatures, as described by Diane Vaughan in her excellent book on the subject, The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. She called this attitude the "normalization of deviance."
In the final case vignette, Dr. Groopman focuses on a patient from his own internal medicine practice, a young man with bone cancer who developed septic shock during a long hospitalization for chemotherapy. Dr. G had come to like this patient, who had developed a low-grade fever in the hospital. During rounds one day, Dr. G was examining the patient with his residents, but when the patient complained of being too tired to sit up, the exam was stopped. Had he sat the patient up, he likely would have found a large abscess on the man's lower back, which later that day caused the patient's near-fatal sepsis. Dr. G attributed this "affective error" to his liking the patient, which resulted in his wanting to minimize the discomfort he caused the young man; perhaps, he speculates, he had an "unconscious" hope that the patient did not have a serious cause for his fever. Ahhhh, if only it were so simple.
Clinical practice, especially in the ED, is very vulnerable to cognitive errors such as those described above, as well as other system and situational factors. Our response to this knowledge of our human limitations can continue to be one of simply trying harder to overcome or avoid these biases (blame and train, as the cynical aphorism goes). Or perhaps we can find truly effective methods to supplement and complement these human limitations and approach the ideal of truly safe, error-free and patient-focused health care.
Back to Top
Pneumonia Core Measures Update
Jack Kelly, DO, FACEP
Our combined voices are being heard by CMS!
CMS and Premier now agree that in cases where "diagnostic uncertainty" delays the time to diagnosis of pneumonia, these can be excluded from our Core Measure Dataset.
We have all had a case where nothing in the history, physical exam, or CXR points to Pneumonia: the "equivocal CXR" or the "not sure if this is a PE, and the CXR doesn't help" type of case. You have no clinical mandate to treat the case as Pneumonia, and no Antibiotics are given. Then a Chest CT is performed hours later, which finds a Pneumonia! Now you are late and have missed the 4-hour Door-to-Antibiotic CMS Core Measure.
Example #1: The ED reads the CXR as negative... later, Radiology calls back and says, "we think there may be an early infiltrate." Take the moment to document your diagnostic uncertainty using the terms listed below.
Example #2: Your patient's CXR is negative, and the constellation of severe pleuritic chest pain and hypoxia without fever/sputum really pushes you to get a CT angio to rule out PE. The CT is read as Pneumonia 6 hours later.
CMS has taken our collective feedback and recognizes that these cases should have "documentation of a reason that, despite being seen by the physician, the patient's initial clinical picture was questionable or unclear and not suggestive of Pneumonia." The Data Element Name is "Diagnostic Uncertainty."
It is now permissible to document the cases of "Diagnostic Uncertainty for Pneumonia" using the following possible terms:
- "Clinical picture not clear"
- "Diagnostic picture unclear"
- "Not suggestive of pneumonia"
- "No obvious signs of pneumonia"
- "Atypical presentation"
- "Poor patient cooperation because of impaired mental status"
I encourage you to write a caveat in the chart using the above terms. Only then can the case be excluded from your data. Please document these cases where a reasonable delay to diagnosis happens because of "diagnostic uncertainty." Write that term into your caveat sentence so that the Chart Abstractor sees it clearly, and add in the sentence that the "clinical picture was questionable and not suggestive of pneumonia."
NOTE: CMS will not allow a "long delay in seeing the Physician" or delays in "Diagnostic Testing" as permissible reasons for exclusion in this category. If you do not see the patient within 4 hours (door-to-doc), then you cannot use this caveat.
Our biggest challenge will be getting that caveat note written on the chart hours after we have discharged (or admitted) the patient.
Please remember to write that diagnostic uncertainty Data Element caveat. Your Core Measure scores will benefit!
Back to Top
Quality and Safety Articles
Helmut Meisl, MD, FACEP
Here again is a list of recent articles that may interest you. These are compiled by AHRQ PSNet at http://psnet.ahrq.gov/.
Hospital workload and adverse events.
Weissman JS, Rothschild JM, Bendavid E, et al. Med Care. 2007;45:448-455.
Eliminating preventable death at Ascension Health.
Tolchin S, Brush R, Lange P, Bates P, Garbo JJ. Jt Comm J Qual Patient Saf. 2007;33:145-154.
Information technology cannot guarantee patient safety.
de Wildt SN, Verzijden R, van den Anker JN, de Hoog M. BMJ. 2007;334:851-852.
Embedding quality improvement and patient safety - the UCLA value analysis experience.
Gambone JC, Broder MS. Best Pract Res Clin Obstet Gynaecol. 2007 Mar 30; [Epub ahead of print].
Integrating patient safety into curriculum.
Rapala K, Novak JC. Patient Saf Qual Healthc. March/April 2007;4:16-18, 20-23.
Should patients have a role in patient safety? A safety engineering view.
Lyons M. Qual Saf Health Care. 2007;16:140-142.
Doctors are more dangerous than gun owners: a rejoinder to error counting.
Dekker S. Hum Factors. 2007;49:177-184.
The new patient safety officer: a lifeline for patients, a life jacket for CEOs.
Denham CR. J Patient Saf. 2007;3:43-54.
The problem of engaging hospital doctors in promoting safety and quality in clinical care.
Neale G, Vincent C, Darzi SA. J R Soc Promot Health. 2007;127:87-94.
Cognitive processes involved in blame and blame-like judgments and in forgiveness and forgiveness-like judgments.
Mullet E, Riviere S, Sastre MT. Am J Psychol. 2007;120:25-46.
Evaluating teamwork in a simulated obstetric environment.
Morgan PJ, Pittini R, Regehr G, Marrs C, Haley MF. Anesthesiology. 2007;106:907-915.
Healthcare climate: a framework for measuring and improving patient safety.
Zohar D, Livne Y, Tenne-Gazit O, Admi H, Donchin Y. Crit Care Med. 2007 Mar 19; [Epub ahead of print].
Making use of mortality data to improve quality and safety in general practice: a review of current approaches.
Baker R, Sullivan E, Camosso-Stefinovic J, et al. Qual Saf Health Care. 2007;16:84-89.
Ambulatory care adverse events and preventable adverse events leading to a hospital admission.
Woods DM, Thomas EJ, Holl JL, Weiss KB, Brennan TA. Qual Saf Health Care. 2007;16:127-131.
Patient safety event reporting in critical care: a study of three intensive care units.
Harris CB, Krauss MJ, Coopersmith CM, et al. Crit Care Med. 2007 Feb 26; [Epub ahead of print].
Confronting medical errors in oncology and disclosing them to cancer patients.
Surbone A, Rowe M, Gallagher TH. J Clin Oncol. 2007;25:1463-1467.
The many faces of error disclosure: a common set of elements and a definition.
Fein SP, Hilborne LH, Spiritus EM, et al. J Gen Intern Med. 2007 Mar 20; [Epub ahead of print].
Barriers and motivators for making error reports from family medicine offices: a report from the American Academy of Family Physicians National Research Network (AAFP NRN).
Elder NC, Graham D, Brandt E, Hickner J. J Am Board Fam Med. 2007;20:115-123.
Causes of errors in the electrocardiographic diagnosis of atrial fibrillation by physicians.
Davidenko JM, Snyder LS. J Electrocardiol. 2007 Feb 21; [Epub ahead of print].
The American College of Surgeons' closed claims study: new insights for improving care.
Griffen FD, Stephens LS, Alexander JB, et al. J Am Coll Surg. 2007;204:561-569.
An alternative to the clinical negligence system.
Furniss R, Ormond-Walshe S. BMJ. 2007;334:400-402.
Using medical malpractice closed claims data to reduce surgical risk and improve patient safety.
Manuel BM, Greenwald LM. Bull Am Coll Surg. March 2007;92:27-30.
Medication errors among acutely ill and injured children treated in rural emergency departments.
Marcin JP, Dharmar M, Cho M, et al. Ann Emerg Med. 2007 Apr 10; [Epub ahead of print].
A prospective hazard and improvement analytic approach to predicting the effectiveness of medication error interventions.
Karnon J, McIntosh A, Dean J, et al. Saf Sci. 2007;45:523-539.
Pharmacist workload and pharmacy characteristics associated with the dispensing of potentially clinically important drug-drug interactions.
Malone DC, Abarca J, Skrepnek GH, et al. Med Care. 2007;45:456-462.
Medication error reduction and the use of PDA technology.
Greenfield S. J Nurs Educ. 2007;46:127-131.
Nurses relate the contributing factors involved in medication errors.
Tang FI, Sheu SJ, Yu S, Wei IL, Chen CH. J Clin Nurs. 2007;16:447-457.
Medication-error reporting and pharmacy resident experience during implementation of computerized prescriber order entry.
Weant KA, Cook AM, Armitstead JA. Am J Health Syst Pharm. 2007;64:526-530.
Medication errors in the outpatient setting: classification and root cause analysis.
Friedman AL, Geoghegan SR, Sowers NM, Kulkarni S, Formica RN Jr. Arch Surg. 2007;142:278-283.
ISMP medication error report analysis.
Cohen MR. Hosp Pharm. 2007;42:181–182.
Medication errors in paediatric care: a systematic review of epidemiology and an evaluation of evidence supporting reduction strategy recommendations.
Miller MR, Robinson KA, Lubomski LH, Rinke ML, Pronovost PJ. Qual Saf Health Care. 2007;16:116-126.
Direct observation approach for detecting medication errors and adverse drug events in a pediatric intensive care unit.
Buckley MS, Erstad BL, Kopp BJ, Theodorou AA, Priestley G. Pediatr Crit Care Med. 2007;8:145-152.
Multidisciplinary approach to inpatient medication reconciliation in an academic setting.
Varkey P, Cunningham J, O'Meara J, Bonacci R, Desai N, Sheeler R. Am J Health Syst Pharm. 2007;64:850-854.
Improving medication reconciliation in the outpatient setting.
Varkey P, Cunningham J, Bisping S. Jt Comm J Qual Patient Saf. 2007;33:286-292.
Achieving the National Quality Forum's "Never Events": prevention of wrong site, wrong procedure, and wrong patient operations.
Michaels RK, Makary MA, Dahab Y, et al. Ann Surg. 2007;245:526-532.
Patterns of communication breakdowns resulting in injury to surgical patients.
Greenberg CC, Regenbogen SE, Studdert DM, et al. J Am Coll Surg. 2007;204:533-540.
Staying safe: simple tools for safe surgery.
Karl RC. Bull Am Coll Surg. April 2007;92:16-22.
Expanded surgical time out: a key to real-time data collection and quality improvement.
Altpeter T, Luckhardt K, Lewis JN, Harken AH, Polk HC Jr. J Am Coll Surg. 2007;204:527-532.
Application of the human factors analysis and classification system methodology to the cardiovascular surgery operating room.
ElBardissi AW, Wiegmann DA, Dearani JA, Daly RC, Sundt TM 3rd. Ann Thorac Surg. 2007;83:1412-1418; discussion 1418-1419.
Patient safety curriculum for surgical residency programs: results of a national consensus conference.
Sachdeva AK, Philibert I, Leach DC, et al. Surgery. 2007;141:427-441.
Patient harm in general surgery--a prospective study.
Kaul AK, McCulloch PG. J Patient Saf. 2007;3:22-26.
Back to Top
This publication is designed to promote communication among emergency physicians of a basic informational nature only. While ACEP provides the support necessary for these newsletters to be produced, the content is provided by volunteers and is in no way an official ACEP communication. ACEP makes no representations as to the content of this newsletter and does not necessarily endorse the specific content or positions contained therein. ACEP does not purport to provide medical, legal, business, or any other professional guidance in this publication. If expert assistance is needed, the services of a competent professional should be sought. ACEP expressly disclaims all liability in respect to the content, positions, or actions taken or not taken based on any or all the contents of this newsletter.