
Six Root Causes of Poor Data Fitness

What causes your organization’s data to go bad? And what steps can you take to maintain data integrity?

Welcome to the final part of our three-part series on data quality. In a previous post, I discussed what it means to have fit data and reviewed six dimensions of data fitness: completeness, correctness, timeliness, uniqueness, format validity and acceptability. Remember, data fitness relates to how fit each of your organization’s data elements is for its ultimate purpose.

Today, to wrap up our data quality discussion, I will go over six common root causes of poor data fitness.

Also See: The Five Stages of Health Care Data Quality Maturity [INFOGRAPHIC]

Also See: Six Dimensions of Data Fitness

1. IT system processes

System updates and consolidations

There’s a lot to think about when your organization is going through system updates and consolidations. Say your organization is being merged with or acquired by a larger health system and your team will transition to a new EHR or a new Quality Management System (QMS). To transition successfully, you’ll need to understand how rule logic in your new source system may differ from the rules in your old system. Rules govern how data is handled in an interface feed or populated into a system dictionary. For example, your old system may have required procedure codes without decimal points, while the new system requires them.
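
As a rough illustration of what such a format rule looks like in practice, here is a minimal Python sketch of an interface step that normalizes procedure codes between two systems. The format conventions below are illustrative assumptions, not the specifications of any particular EHR or QMS.

```python
# Minimal sketch of an interface "rule" that normalizes procedure code
# formats between two systems. The formats are illustrative assumptions.

def normalize_procedure_code(code: str, target_requires_decimal: bool) -> str:
    """Return the code in the format the target system expects."""
    digits = code.replace(".", "").strip()
    if not target_requires_decimal:
        return digits                      # e.g., "3606"
    # Assume a two-digit category followed by subcategory digits, e.g., "36.06"
    return digits if len(digits) <= 2 else f"{digits[:2]}.{digits[2:]}"

# Old system sent codes without decimals; new system requires them.
assert normalize_procedure_code("3606", target_requires_decimal=True) == "36.06"
assert normalize_procedure_code("36.06", target_requires_decimal=False) == "3606"
```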

Failure to update code formats or interface rules could result in measures not receiving some of the data they need to compile results correctly. While your IT team is responsible for ensuring the integrity of the data fed into your new systems, Quality professionals should confirm that all measures are compiling correctly. Running before-and-after reports that examine both the numerators and denominators of your measures is a quick way to verify that measure integrity has not been jeopardized.
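
A before-and-after check can be as simple as comparing measure counts exported from the old and new systems for the same reporting period. The sketch below uses made-up measure names and counts purely for illustration.

```python
# Compare numerator/denominator counts for the same period from both systems.
before = {"SEP-1": (412, 520), "PC-02": (88, 310)}
after  = {"SEP-1": (409, 520), "PC-02": (88, 310)}

for measure in sorted(before):
    num_b, den_b = before[measure]
    num_a, den_a = after.get(measure, (None, None))
    if (num_a, den_a) != (num_b, den_b):
        print(f"REVIEW {measure}: numerator {num_b}->{num_a}, denominator {den_b}->{den_a}")
    else:
        print(f"OK     {measure}: counts match ({num_b}/{den_b})")
```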

Mapping errors

Mapping errors are particularly problematic for regulatory reporting. They usually happen when system dictionaries have been modified and mapping hasn’t been completed or was done incorrectly. Even though a review of dictionaries and their associated mappings typically occurs during the initial implementation of a new system, it is important to review dictionaries and mapping terms on an ongoing basis. While mapping tasks are typically handled by the IT team, Quality professionals should be aware of any changes to dictionaries or mappings in order to evaluate their potential impact on measure results. Even small changes, such as the addition of terms to a discharge disposition dictionary, can have a dramatic impact on your data and measure results.

Take for example one hospital that added three new terms for the concept of “expired” to differentiate between deaths in the OR, deaths in critical care and coroner cases. Failure to map these three new terms could impact your mortality rates across the board. To avoid these mapping errors, consider scheduling a collaborative review with your IT department and your quality measurement vendor several times a year. This is especially important following the release of new or modified technical specifications for measures required by CMS or The Joint Commission (TJC). 
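
One simple safeguard is an audit that lists every active dictionary term with no entry in the mapping table, so new terms like the three “expired” values above surface before they can skew mortality rates. The sketch below is hypothetical; the terms and mapped concepts are made up for illustration.

```python
# Hypothetical mapping audit: report dictionary terms with no mapping entry.
discharge_disposition_dictionary = {
    "Home", "Expired", "Expired in OR", "Expired in Critical Care",
    "Expired - Coroner Case", "Skilled Nursing Facility",
}
mapped_terms = {
    "Home": "Home",
    "Expired": "Expired",
    "Skilled Nursing Facility": "Other Non-Acute Care",
}

unmapped = sorted(discharge_disposition_dictionary - mapped_terms.keys())
for term in unmapped:
    print(f"Unmapped dictionary term: {term!r}")
# The three new "expired" terms appear here for review and remapping.
```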

Data processing

With data processing, timing is everything. When changes are made in your IT systems, consider how far back in time your measures need to be resummarized in order to accommodate new codes, mapping updates or new interface data. Keep in mind that interface data is sent one of two ways. The first is a “batch feed,” in which data is sent in scheduled batches; batch feeds can run nightly, weekly, monthly, quarterly or even semi-annually. The second is a transactional, or “real- or near-real-time,” feed, in which data arrives in your QMS moments after it is entered into the source system. Quality professionals need to understand the timing of their interface data and its impact on measure results in order to ensure their reports are accurate and up to date.

The ideal velocity of data into your measurement system largely depends on the ultimate purpose of the data. While the implementation and testing of an interface feed is the responsibility of your IT team, Quality professionals can influence the type of interface that is selected by describing the use cases for their data. For instance, analytics used to identify patients at risk for unplanned readmission may require transactional data so that the proper interventions may be implemented prior to discharge. Conversely, mortality measures compiled from the discharge abstract after a patient has been discharged and the record has gone through the medical records coding process may be received in a nightly or weekly batch file. 
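
To make the “velocity depends on purpose” idea concrete, here is a small, entirely hypothetical sketch that checks whether a feed’s cadence is fast enough for the use case it serves. The use cases, latency targets and cadences are assumptions for illustration only.

```python
# Sanity-check feed velocity against the latency each use case can tolerate.
FEED_LATENCY_HOURS = {"transactional": 0.1, "nightly batch": 24, "weekly batch": 168}

use_cases = {
    "readmission-risk alerts before discharge": ("transactional", 1),   # needs data within ~1 hour
    "mortality measures from coded abstracts":  ("nightly batch", 48),  # 48-hour latency is acceptable
}

for use_case, (feed, max_latency_hours) in use_cases.items():
    ok = FEED_LATENCY_HOURS[feed] <= max_latency_hours
    print(f"{'OK' if ok else 'TOO SLOW'}: {use_case} on a {feed} feed")
```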

2. Manual data entry

Manual data entry issues typically occur with chart abstraction, which requires the end user to review information in the EHR, interpret the results and then enter the correct information into an electronic form. On the surface this may seem like a simple task, but a wide range of factors can affect the accuracy of manually collected data.

Missing data is the first common type of error in manually collected data. Quality professionals need to understand the reasons for missing data and, if necessary, work with their measurement vendor to minimize it. Take for example the question, “Serum lactate within 2 hours of arrival.” A missing data element could mean the lactate wasn’t ordered, it wasn’t drawn, it was drawn outside the two-hour window or the abstractor missed the question entirely. Some data elements that are critical to the compilation of a measure may be marked as mandatory fields; however, there may be cases where additional allowable values or comments need to be added to the form.
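
A quick way to see whether blanks are hiding one of those reasons is a frequency tally of the abstracted answers for a single question. The sketch below uses sample answer values, not actual abstraction-form options.

```python
# Tally abstracted answers for one question, separating true blanks
# from explicit allowable values. Answer values are illustrative.
from collections import Counter

answers = ["Yes", "", "Not ordered", "Yes", "", "Drawn outside window", "Yes", ""]

counts = Counter(a if a else "(left blank)" for a in answers)
total = len(answers)
for value, n in counts.most_common():
    print(f"{value:<22} {n:>3}  ({n / total:.0%})")
```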

In addition to missing data, here are a few things to keep in mind if your organization is planning to design its own forms for data collection:

  • The first entry in any drop-down list of allowable values is often selected more than any other entry. Periodically review how frequently each value is selected to confirm the form itself isn’t creating measurement bias.
  • Pay attention to unclear or poorly organized fields. Help fields that are easily accessed during data entry can reduce confusion. Also consider the workflow for your reviewers so that they don’t have to move back and forth between different sections of the EHR to answer the questions in the form; for example, keep all the medication-related questions together on the form.
  • Data entry repetitions and redundancies increase the chance of incongruent and conflicting data and should be considered in the form design. For example, a question about a patient’s smoking history may be found in the history and physical, a doctor’s progress note, a surgical consultation or a nursing assessment form. Quality professionals should identify and document the primary source of allowable information, as well as any allowable secondary sources in the event the information is missing from the primary source.
  • A periodic inter-rater reliability (IRR) study should be conducted to ensure reviewers are correctly and consistently following the agreed-upon data abstraction guidelines for manually collected data. This is a requirement for regulatory measures submitted to TJC; however, if resources permit, IRR studies should also be conducted on registry and home-grown measures. (A minimal agreement check is sketched just after this list.)
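
As a minimal sketch of what an IRR check computes, the example below calculates raw percent agreement and Cohen’s kappa for two abstractors who re-reviewed the same cases. The answers are sample data, and most teams would use a statistics package for anything formal.

```python
# Percent agreement and Cohen's kappa for two abstractors on the same cases.
from collections import Counter

rater_a = ["Yes", "Yes", "No",  "Yes", "No", "Yes", "Yes", "No"]
rater_b = ["Yes", "No",  "No",  "Yes", "No", "Yes", "Yes", "Yes"]

n = len(rater_a)
observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Expected agreement by chance, from each rater's marginal frequencies
freq_a, freq_b = Counter(rater_a), Counter(rater_b)
expected = sum(freq_a[v] / n * freq_b[v] / n for v in set(rater_a) | set(rater_b))

kappa = (observed - expected) / (1 - expected)
print(f"Percent agreement: {observed:.0%}, Cohen's kappa: {kappa:.2f}")
```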


3. Process automation

In today’s technologically advanced world, more tasks are being automated than ever before, including billing, discharge instructions, case management referrals, appointments and reminders, eCQMs, financial and operational measures and even predictive and prescriptive analytics.

All of these automated systems that help health care organizations manage their patients’ care rely heavily on successful process automation and the quality of the underlying data. When things go wrong, the result can be significant errors, such as manual forms being auto-populated with bad interface data from other source systems. That’s why it’s crucial to ensure that your source system is properly updated with the correct information and formatting. When errors occur, data may need to be historically recompiled following a correction. Quality professionals, who understand the reporting time periods and submission deadlines, are optimally positioned to instruct their IT teams about how far back data needs to be reprocessed in order to correct the errors.
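
The “how far back do we reprocess?” question can be framed very simply: every reporting period that overlaps the error window and has not yet been submitted needs to be recompiled. The dates and deadlines in this sketch are illustrative only.

```python
# List reporting quarters that overlap the error window and are still open.
from datetime import date

error_introduced = date(2023, 2, 10)
today = date(2023, 7, 1)

quarters = [  # (period start, period end, submission deadline)
    (date(2023, 1, 1), date(2023, 3, 31), date(2023, 8, 15)),
    (date(2023, 4, 1), date(2023, 6, 30), date(2023, 11, 15)),
]

for start, end, deadline in quarters:
    affected = end >= error_introduced          # the error touched this period
    still_open = deadline >= today              # submission has not passed
    if affected and still_open:
        print(f"Recompile {start}..{end} (submission due {deadline})")
```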

4. Changes not captured

In any complex and dynamic system such as an EHR, communication among stakeholders is needed to ensure stability. In an ideal world, a metadata warehouse would exist, listing every data element in the system and describing how it is used in measures, reports, worklists, clinical decision support alerts or other analytics. The stakeholders for these data outputs would also be listed so that when changes are made in the source or downstream systems, everyone could be advised of the changes and have the opportunity for input. However, few organizations have such state-of-the-art data governance in place. In the meantime, Quality professionals can guide the communication process to ensure they are kept in the loop when changes are made to the clinical, financial and medical record coding systems that feed their quality measurement systems.

To better understand the impact of failed communication, consider the following example. One hospital’s clinical stakeholders in the neonatal ICU worked with the IT department to redesign the capture of vital signs and newborn assessments in order to better align with their workflow at the bedside. The new fields were established and an in-service was conducted to instruct the clinicians on the new documentation process. The IT department documented the new workflow and fields, and the change was implemented and tested successfully. The only problem was that no one told the Quality department. Several months after the change, the Quality team noted that performance for newborn hearing assessment suddenly dipped from 96% compliance to 2%. New scripts had to be written so that the quality vendor could extract the required information from the new fields in the EHR. Once the scripts were completed, performance was restored to its usual high level and the correct data could be submitted just in time to meet mandatory regulatory requirements. This failure to communicate created a “near miss” with the organization’s regulatory submission requirements and could have been avoided by including the Quality department in the change control process. Remember, communicate, communicate, communicate in order to stay in a “proactive” rather than a “reactive” state of data readiness.
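
Simple monthly surveillance of measure rates can also catch this kind of problem quickly, even when the communication loop fails. The sketch below flags a large relative drop against the trailing average; the 20% threshold and the sample rates are assumptions for illustration.

```python
# Flag a sudden drop in a monthly measure rate versus the trailing average.
rates = {"2023-01": 0.96, "2023-02": 0.95, "2023-03": 0.97, "2023-04": 0.02}

months = sorted(rates)
latest = rates[months[-1]]
baseline = sum(rates[m] for m in months[:-1]) / (len(months) - 1)

if baseline > 0 and (baseline - latest) / baseline > 0.20:   # >20% relative drop
    print(f"ALERT: {months[-1]} rate {latest:.0%} vs. trailing average {baseline:.0%}")
```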

5. New data uses

Measure stewards and organizations themselves are constantly striving to measure performance in new ways that can help us understand our outcomes and improve our care delivery systems. With the emerging emphasis on electronic clinical quality measurement, in which value sets and other “buckets of data” can be reused and applied to a wider range of measures, extra vigilance is needed to be sure that individual data elements retain their meaning within the context in which they were designed.

Certain data uses can change the meanings of even the simplest concepts. Take for example the data elements “Medical Leave of Absence” and “Personal Leave of Absence.” Both LOA terms might be fine to use together in a report on productivity from your Human Resources department, but they may have very different meanings in a report on employee benefits, in which employees on a medical LOA retain their benefits while employees on a personal LOA may not.

In another example, consider the four discharge disposition terms: Discharge to Rehab, Discharge to SNF, Discharge to Hospice and Discharge to Inpatient Psychiatric Facility. All four of these terms might be mapped to a concept of “other non-acute care.” When this “other non-acute care” bucket is used to exclude certain encounters from an unplanned 30-day readmission measure denominator, for example, all four terms are appropriate. However, when it is used in a different way, such as in a 30-day post-discharge mortality measure, only hospice patients should be excluded. Quality professionals are the domain experts on measurement concepts and should be consulted whenever individual data elements or value sets are to be repurposed within the organization.
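
The sketch below shows the same dispositions reused in two measures with different exclusion logic. The measure names and disposition concepts mirror the example above; the structure is an illustrative sketch, not an actual CMS or TJC specification.

```python
# Context-dependent exclusion sets: same terms, different measures.
EXCLUDED_DISPOSITIONS = {
    "30-day unplanned readmission": {
        "Discharge to Rehab", "Discharge to SNF",
        "Discharge to Hospice", "Discharge to Inpatient Psychiatric Facility",
    },
    "30-day post-discharge mortality": {
        "Discharge to Hospice",
    },
}

def excluded_from_denominator(measure: str, disposition: str) -> bool:
    return disposition in EXCLUDED_DISPOSITIONS[measure]

assert excluded_from_denominator("30-day unplanned readmission", "Discharge to SNF")
assert not excluded_from_denominator("30-day post-discharge mortality", "Discharge to SNF")
```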

6. Loss of expertise

Perhaps the most pervasive root cause of data decay occurs when domain experts in IT and Quality leave the organization. So much data knowledge (the history of data changes, why data changed at some point in time, in which facilities it changed, and the differences between old and new source systems) exists only in the minds of data and Quality experts rather than in clearly documented metadata that describes the data, its purpose and its stakeholders.

As previously discussed, this would ideally be prevented with a metadata warehouse, but not all health care systems are ready to implement one. Until then, you’ll need to be prepared for a degree of data decay when any of your data experts leave the organization. Protecting your “tribal knowledge” when staff retire, resign, are reassigned or have their positions eliminated begins with forming a succession plan and ensuring each expert on the team is partnered with another team member for mentoring and coaching. Don’t depend on one person to know everything. Along the way, documentation should be created and maintained. A well-organized set of metadata documents can then become a living resource for onboarding new staff, as well as a firm foundation for your organization’s evolving and maturing data governance structure.
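
Even without a full metadata warehouse, that documentation can start as something as lightweight as one record per data element. The sketch below is a hypothetical stand-in; the field names and sample entry are assumptions, not a prescribed schema.

```python
# A lightweight metadata record: source, uses, and who to notify on changes.
from dataclasses import dataclass, field

@dataclass
class DataElementRecord:
    name: str
    source_system: str
    used_in: list = field(default_factory=list)       # measures, reports, alerts
    stakeholders: list = field(default_factory=list)  # who to loop in on changes
    notes: str = ""

catalog = [
    DataElementRecord(
        name="Discharge Disposition",
        source_system="EHR ADT feed",
        used_in=["30-day readmission", "mortality measures"],
        stakeholders=["Quality", "IT Interfaces", "HIM Coding"],
        notes="Three 'expired' terms added; remapped to mortality concepts.",
    ),
]

# Who needs to hear about a proposed change to this element?
for record in catalog:
    if record.name == "Discharge Disposition":
        print("Notify:", ", ".join(record.stakeholders))
```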


Medisolv can help improve your organization’s data quality and create highly reliable, accurate and actionable information for improving clinical outcomes. In addition, our clinical experts can work closely with you and your team to assist with data validation and performance improvement.

Ready to conquer poor data fitness? Send us a note today.

 


Vicky Mahn-DiNicola, RN, MS, CPHQ

Vicky is the Vice President of Clinical Analytics and Research at Medisolv. She has over 20 years of clinical analytics and product management experience, as well as a strong clinical background in Cardiovascular and Critical Care Nursing, Case Management and Quality Improvement. She has been successful at partnering with innovative thought leaders and executing strategy for new models of care delivery, case and quality management programs, performance measurement and benchmarking.

