When it comes to population health management, the purpose of data aggregation and normalization is to blend data from different sources into a single destination that aids providers and medical groups in their clinical and financial decision-making. To represent patient data properly, it’s crucial that the extraction process from source system to data warehouse be as smooth and efficient as possible. However, when source systems lack data governance, major roadblocks can hinder accurate analytics and factual reports. Complications arise with the following scenarios:
- Entering custom codes into data capture systems
- Customizing source systems to store data in non-standard places
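The aggregation and normalization described above can be sketched in code. This is a minimal illustration, not Lightbeam's implementation: records from two hypothetical feeds (an EHR extract and a claims feed) are reshaped into one common schema before loading into the warehouse. All field names are assumptions for the example.

```python
def normalize_ehr_record(rec):
    """Map a record from a hypothetical EHR extract to the common schema."""
    return {
        "patient_id": rec["PatID"],
        "code": rec["ProcCode"],
        "service_date": rec["SvcDate"],
        "source": "ehr",
    }

def normalize_claims_record(rec):
    """Map a record from a hypothetical claims feed to the common schema."""
    return {
        "patient_id": rec["member_id"],
        "code": rec["procedure"],
        "service_date": rec["dos"],
        "source": "claims",
    }

def aggregate(ehr_records, claims_records):
    """Blend both feeds into a single, uniformly shaped list of records."""
    return ([normalize_ehr_record(r) for r in ehr_records] +
            [normalize_claims_record(r) for r in claims_records])
```

Once every source lands in the same shape, downstream measure calculations can query one table instead of reconciling each system's quirks at report time.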
Dangers of Custom Coding
Take Electronic Health Records (EHRs), for example. While EHR configurations can be set up to work around custom coding, medical groups are typically not focused on satisfying quality measure programs when configuring their systems. If patient details or services are not properly coded, the codes that come through at extraction time may not align with quality measure program standards. The measure program, be it HEDIS, GPRO, or PQRS, will not recognize the procedure because it hasn’t been recorded in the standard format, whether due to custom notes or special codes being entered in the EHR.
Another example is the complication of calculating compliance for diabetic eye exams. The Group Practice Reporting Option (GPRO) for accountable care organizations (ACOs) and the new Merit-based Incentive Payment System (MIPS) program measures are strict about the code used to calculate diabetic eye exam compliance: 2022F is a valid procedure code for a Diabetes Mellitus exam, yet for this measure we commonly receive a nonstandard code like ‘DMEYE’. If a physician documents a custom code and there is no mapping logic in the back end of the EHR to translate it to the standard code, the GPRO measurement system will not recognize that the exam was completed. While the patient and physician are in fact compliant, they will appear non-compliant. This can lead to physician distrust of the measurements pulled from the EHR system.
If custom codes are being used, the source system should have the correct coding pathways in place so when data is extracted, the measure outcomes are accurate.
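One way such a coding pathway might look in the back end is a hand-maintained lookup table that translates custom codes to standard ones at extraction time. The ‘DMEYE’ → 2022F entry comes from the diabetic eye exam example above; everything else here is an illustrative sketch, not a real EHR's mapping logic.

```python
# Lookup table translating known custom codes to their standard equivalents.
CUSTOM_TO_STANDARD = {
    "DMEYE": "2022F",  # diabetic eye exam documented with a custom code
}

def map_code(raw_code):
    """Return (standard_code, was_mapped).

    Unmapped codes pass through unchanged with was_mapped=False, so they
    can be flagged for governance review instead of silently failing a
    quality measure calculation.
    """
    code = raw_code.strip().upper()
    if code in CUSTOM_TO_STANDARD:
        return CUSTOM_TO_STANDARD[code], True
    return code, False
```

The key design choice is that unrecognized codes are surfaced rather than dropped: a physician who documented the exam should never silently appear non-compliant because a mapping entry was missing.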
Dangers of Custom Storing
Another roadblock related to a lack of data governance is the customization of source systems (such as EHRs) to store data in non-standard places. When templates or pages have been created to insert data into custom tables, it can be difficult to locate or modify that data without guidance from the individual who created the template. Complicated data storage can lead to confusion and inaccurate measure calculations.
How to Establish Clear and Quality Data for Population Health Management
To prevent potential issues down the road, it’s important to ensure that custom codes are properly converted to standard ones stored in the EHR’s back-end database. Though setting up these pathways that automatically pull data involves more work on the front end, it is well worth the long-term benefit when the data can be easily and accurately retrieved and used in a meaningful way.
In searching for a vendor to partner with on population health or value-based care, make sure to examine their ability to consume and transform large amounts of data from disparate systems, custom codes, and non-standard storage locations. Questioning your vendor up front will eliminate headaches and physician distrust down the road. Lightbeam’s approach to this vetting, combined with our extensive clinical and claims data experience, allows us to configure interfaces that convert custom codes and pull data from nonstandard places, making life easier for clients while accurately displaying data insights on the patient population. Without data accuracy, the foundation of all population health efforts is in jeopardy.
Read more from Mike Hoxter, Lightbeam’s Chief Technology Officer.