
Patient Safety Lessons Learned from the Aviation Industry’s Missteps

If you attended a healthcare conference 20 years ago and heard a presentation on patient safety, you may have heard the speaker compare the number of people who died annually due to medical mistakes to the number of people who would die in a plane crash each year – if a plane crashed every day for a year.

The message, sparked by the release of the Institute of Medicine’s eye-opening patient safety report To Err is Human in 1999, was clear. Society would never tolerate passengers dying in plane crashes every day, but somehow, we tolerate the same number of people dying from medical errors.

Ever since then, experts have used aviation safety as a model for patient safety. Healthcare has made care safer by implementing some healthcare-specific interventions. However, true counterparts to aviation practices, such as airplane manufacturing excellence, preflight checklists, flight simulations and teamwork between pilots and air traffic controllers, have yet to be implemented systematically or at scale.

While aviation has been a model of safety over the past two decades, there has been some recent turbulence in the form of two fatal crashes and the subsequent grounding of the Boeing 737 Max airliner. These tragedies highlight the perils of complacency – mostly due to the unintended consequences of cost cutting. While the final chapter is yet to be written, some of the missteps that have already surfaced include:

  • Deficits in pilot training and education. Pilots were not adequately trained on how to use the plane’s Maneuvering Characteristics Augmentation System, or MCAS, which helps keep the plane from stalling, or losing lift.
  • Lax maintenance and maintenance records. The fuel tanks on 60 percent of the planes inspected had debris in them, including shoe covers and tools.
  • Design flaws and construction shortcuts. The malfunction of a single sensor on the plane led to the two crashes, as there were no redundant and independent sensors. A second backup sensor was deemed an optional purchase, again to reduce the “base price.”
  • Discarding internal warning signs from staff. Senior managers ignored emails from frontline workers and others warning of dangerous design and construction problems.
  • Lack of independent regulatory oversight. A cozy relationship between the manufacturer and the Federal Aviation Administration diluted independent and rigorous safety inspections.

Patient safety advocates would be wise to buckle up and learn from the aviation industry’s missteps. What can this new aviation saga – much of which can be traced back to Boeing’s effort to reduce costs – teach those in the healthcare industry as they look to cut costs in a world of value-based payment?

Six value-based healthcare tips for avoiding safety missteps

1. Culture starts at the top. Building a culture of patient safety starts in the C-suite and in the boardroom. Employees will prioritize patient safety when leadership has first embraced it and communicated its importance to the entire organization.
2. Workforce patient safety training and education are essential. No excuses.
3. Patient safety technology, health IT and medical technologies are a double-edged sword. Chosen wisely, they can save lives. But poor design, improper implementation and a lack of training can lead to unsafe care from the very systems designed to improve safety.
4. Objective and effective continuous measurement of processes and outcomes, even at peak performance, is critical. This measurement often holds the early warning signs that things may be slipping. Paying attention to those signals and taking timely, corrective action can avert a catastrophe.
5. Redundancy, even at a higher cost, is sometimes necessary when patients’ lives are at stake. That includes accurate and timely documentation.
6. Robust and independent regulation is key. If nothing else, the Boeing example clearly shows that self-policing (by even the most prominent and well-intentioned provider organizations) can fail under the pressure of reducing expenses and maximizing revenue. Unbiased and effective external oversight is always required.

Ironically, the aviation industry could potentially save lives in healthcare by showing the industry how easy it is to fall off the safety wagon as value-based care looms and margin pressures continue to increase. What airlines and healthcare providers can never do is sacrifice the safety of their passengers and their patients. That’s a lesson neither industry can afford to ignore.


To learn more on this topic, please read “Dear Medisolv: How Can I Reignite Our Passion for Patient Safety?” on our blog.


STAY AHEAD OF THE QUALITY CURVE 

Medisolv’s Value Maximizer software uses machine learning and predictive modeling to forecast your future years’ payments in the CMS hospital quality programs (HAC, HVBP, HRRP). Our simulation guides your team in optimizing performance to maximize your reimbursements.

Here are some resources you may find useful.

Blog: "How to Use CMS' Value-Based Programs' Data"
Blog: "Should We Still Abstract Core Measures?"
Blog: "Changes to Quality Reporting in Response to COVID-19"
Dr. Zahid Butt, FACG

Medisolv CEO, Dr. Zahid Butt, is a senior executive with 30 years of experience in health care delivery and health information technology (HIT). Prior to his current role at Medisolv, he was a Senior Attending Gastroenterologist and Director of Clinical Informatics at St. Agnes Healthcare, a member of Ascension Health. He has served on several government and private sector Health IT task forces. He is currently the Chair of the HIMSS Quality and Safety Taskforce. As a nationally recognized expert in electronic clinical quality measures (eCQMs), he has served on several CMS and NQF Technical Expert Panels to develop and maintain quality measures for national quality reporting programs.

