Interoperability of health data is intended to bring together healthcare stakeholders, but it has proven difficult to implement. Moving data between health IT systems, whether internal or external, is a policy and commercial decision as much as a technological one.
Everyone appears to have an opinion on how to achieve health data interoperability, what it entails, and how much money should be spent on it.
As new payment models drive providers to strengthen relationships across the care continuum, no provider can afford to overlook the importance of health data interoperability to their patients, practices, and finances.
But what exactly is interoperability, and why is it mentioned in every patient care discussion?
Today’s healthcare data interoperability
True data interoperability requires both syntactic and semantic exchange. Syntactic interoperability refers to the ability to transfer data from one EHR to another via a uniform transport mechanism.
Consider the postal service in the United States. We can deliver a letter from point A to point B, but delivery alone does not guarantee the letter can be read. What we do have is a standard envelope and address label. Semantic interoperability means we can not only receive the letter but also interpret its content, because that content is communicated in a common language.
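The envelope analogy can be made concrete in code. The sketch below is a minimal, hypothetical illustration: two EHRs use different local lab codes for the same test, and a shared standard vocabulary (here, the real LOINC code for hemoglobin; the local codes are invented) lets the receiver interpret what arrived, not just receive it.

```python
# Minimal sketch: semantic interoperability as shared-code translation.
# The local codes below are invented; "718-7" is the actual LOINC code
# for hemoglobin, used here purely for illustration.

# Each EHR's private, local code for the same hemoglobin test.
EHR_A_LOCAL_CODES = {"LAB_HGB_01": "718-7"}   # local code -> LOINC
EHR_B_LOCAL_CODES = {"hgb-panel-3": "718-7"}

def to_standard(local_code: str, local_map: dict) -> str:
    """Translate a sender's local code into the shared standard vocabulary."""
    return local_map[local_code]

# Syntactic interoperability: the message arrives intact.
message_from_a = {"code": "LAB_HGB_01", "value": 13.2, "unit": "g/dL"}

# Semantic interoperability: the receiver can interpret the message,
# because both sides agree on the standard code the local code maps to.
standard_code = to_standard(message_from_a["code"], EHR_A_LOCAL_CODES)
print(standard_code)  # "718-7": hemoglobin, in LOINC terms
```

Two systems with entirely different internal codes agree on meaning only because both map into the same shared vocabulary; that mapping layer is what the postal analogy's "common language" amounts to in practice.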
After a 10-year implementation period, true semantic interoperability in healthcare has been projected to be worth around $78 billion per year.
Recently enacted legislation and policies are also promoting healthcare data interoperability. The 21st Century Cures Act resulted in final ONC and CMS guidelines on information blocking, interoperability, and patient access.
Fast Healthcare Interoperability Resources (FHIR) is being advocated and legislated. Beyond bidirectional (read/write) data interchange, FHIR is extensible for plug-and-play apps (SMART on FHIR). It also supports eCase Reports, computable knowledge artifacts (EBM on FHIR), and computable practice guidelines (CPG on FHIR).
We have a clear path toward required semantic and syntactic data interoperability – true data fluidity at scale – with the forthcoming USCDI (US Core Data for Interoperability) standard.
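Part of FHIR's appeal is that resources are ordinary JSON (or XML) with a published structure, so any system can parse them with standard tooling. The sketch below, using only Python's standard library, parses a minimal hand-written Patient resource; the field names follow the FHIR R4 Patient structure, but the patient data itself is invented for illustration.

```python
import json

# A minimal, invented FHIR Patient resource (field names per FHIR R4).
raw = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Rivera", "given": ["Ana"]}],
  "birthDate": "1960-05-14"
}
"""

patient = json.loads(raw)
assert patient["resourceType"] == "Patient"

# Extract a display name the way a consuming app might.
name = patient["name"][0]
display = f'{" ".join(name["given"])} {name["family"]}'
print(display, patient["birthDate"])  # Ana Rivera 1960-05-14
```

Because the structure is standardized, the same few lines work against a Patient resource from any conformant EHR; that is the syntactic half of interoperability, with USCDI defining which of these data elements must be present.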
What steps can hospitals take to improve data interoperability?
While I recently discussed advancements in data transfer and interoperability, there is still work to be done in knowledge interoperability. How do we make the world’s best sepsis detection algorithm and care management pathway or guideline available to a small, rural hospital?
Clearly, this necessitates data interoperability from the outset. The lack of shareable, computable biomedical knowledge across disparate EHRs is presently preventing us from realizing the full value proposition that many expected from EHR deployment in the US and the projected transformation of US healthcare.
If every EHR implementation necessitates re-implementation or, worse, re-discovery of an evidence-based computable best practice guideline, our healthcare system will never be transformed. The “learning health system” is based on interchange of knowledge.
It’s also worth mentioning that current clinical practice and cognitive support with EHRs entail a diverse set of knowledge components, ranging from billing and coding standardization to clinical data standards for quality measurement.
All of these components are part of a “knowledge ecosystem” or “knowledge supply chain” that should complement one another.
An analogy would be the automobile components supply chain, which enables manufacturers to combine their parts to produce cars in Detroit or anywhere else.
It guarantees that all components of a computable practice guideline, also known as an e-pathway, operate together, or, conversely, helps us recognize when one component or another is not working properly, learn from it, and revise it for use at scale.
The HL7 Computable Practice Guideline on FHIR and the Clinical Quality Language are exciting modern standards for staying current with best practices within EHRs as well as exchanging them across many divergent EHRs.
Common data formats such as QDM or FHIR, common (standardized) value sets, common expression logic, and finally common presentation techniques in the clinical workflow are all now feasible.
Together, this enables any physician end user to optimize her clinical practice by drawing on the most recent, best-evidence computable practice guideline.
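A tiny slice of what a shared value set plus shared logic makes possible can be sketched as follows. The value set and trigger logic here are invented for illustration (real guidelines are expressed in standards such as CQL and distributed as versioned FHIR artifacts, not ad hoc Python), but the pattern, standard codes in and shared logic over them, is the point.

```python
# Hypothetical value set using real SNOMED CT codes for illustration:
# 44054006 = "Diabetes mellitus type 2", 38341003 = "Hypertension".
# A real value set would come from a published, versioned artifact
# (e.g., VSAC), not be hard-coded like this.
DIABETES_T2_VALUE_SET = {"44054006"}

def guideline_applies(condition_codes: set) -> bool:
    """Trigger the (hypothetical) diabetes care pathway if any of the
    patient's coded conditions fall inside the shared value set."""
    return bool(condition_codes & DIABETES_T2_VALUE_SET)

# Two patients coded by two different EHRs: because both use the same
# standard vocabulary, the same logic runs unchanged against both.
print(guideline_applies({"44054006", "38341003"}))  # True
print(guideline_applies({"38341003"}))              # False (hypertension only)
```

The key design point is that the logic never sees a local code: once every EHR maps its data into the shared vocabulary, one guideline definition can run anywhere, which is exactly the reuse the "knowledge supply chain" analogy describes.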
Why isn’t health data interchange functioning properly?
Because all electronic data is ultimately zeros and ones, EHR adoption across the whole health system was expected to be straightforward. Electronic data is less susceptible to human errors such as poor handwriting or copying mistakes, and it can be crunched by powerful computers to generate insights rapidly and comprehensively.
However, technology is constantly evolving. Those ones and zeros turned into proprietary technologies written in languages that locked data into silos built for distinct purposes at specific points in time. After all, under a fee-for-service system, patient data is collected for two reasons: diagnosis and billing.
Until the EHR Incentive Programs upended the health IT sector, clinicians who used these locked-down systems were actually well ahead of the curve. Now that the federal government’s budget is being depleted and care quality has surpassed quantity as a gauge of success, the meaning and usefulness of health data interoperability has shifted.
Enforcing meaningful use has been a long, tough, and expensive process for providers, who must recognize that data is no longer just for billing purposes.
The EHR Incentive Programs require thorough, accurate health data that is collected in a precise manner for clinical quality reporting, standardized for cross-system exchange, and tailored to support population health management programs. Because health data is the foundation of personalized care, it must be easily available, clear, and meaningful to patients.
Nevertheless, many EHR systems that were first certified for meaningful use treated electronic documentation as a billing concern. During Stage 1, providers spent millions of dollars on health IT systems to meet their urgent reporting needs, but they underestimated the value of health information sharing.
Healthcare practitioners had access to petabytes of data, but the system could not learn from it. Because data could not be exchanged between organizations, patients were forced to do much of the work of coordinating their own care.
While many providers are still making rookie mistakes, an industry-wide push for health data interoperability may be able to assist.
Why is data exchange critical in hospitals?
Why should healthcare providers care about a problem of health IT design? Because it costs them money. As health IT capabilities progress toward interoperability, reimbursement models are requiring clinicians to rethink patient care. Providers are now paid based on their capacity to promote optimal health, rather than how many blood tests or MRIs they can fit into an afternoon.
This is a major concern because patients are getting older and living with more chronic conditions, necessitating more ongoing care than an annual exam or an occasional antibiotic. A diabetic patient requires a multidisciplinary strategy encompassing specialists, home health care, long-term care, hospitals, and community resources, all coordinated by the primary care provider.
Information must accompany the patient between care settings in this new kind of patient-centered care delivery. Care coordination today necessitates a continuous, interoperable data stream to minimize costly ER visits, 30-day readmissions, and other high-intensity events that damage a provider’s quality measures.
Increasing evidence suggests that providers who use tools to promote health data interoperability can reduce spending, improve population health management, reduce preventable readmissions, reduce chronic disease management burden, and provide better patient-centered care, all of which are goals of the Triple Aim.
What are industry leaders doing to address the issue?
Vendors, meanwhile, are now collaborating with other healthcare stakeholders to address the early flaws of an immature industry. Innovators from several industries are working together to make global health data interoperability a reality.
Data standards organizations: Because so much incompatible data exists, interoperability requires agreed-upon standards that allow all types of health IT systems to extract useful information from every transaction. Multi-stakeholder projects such as HL7 International and the Argonaut Project are disseminating protocols such as FHIR to enhance data flow between organizations.
Interoperability organizations: The CommonWell Health Alliance, Carequality, and dozens of other regional health information exchange organizations are enticing providers to join the interoperability circle. These organizations have received widespread support from the EHR vendor community and are rapidly growing as the industry addresses the issue.
Federal government plans and decisions: Meanwhile, CMS and the ONC have agreed to provide guidance to healthcare providers and developers interested in long-term health data interoperability initiatives. The ONC has issued a ten-year strategy for large-scale data interchange, and Congress is modifying reimbursement structures and technical requirements.
Healthcare providers can improve care and fine-tune computable practice guidelines as a result of the learning health system’s built-in feedback loops, machine learning algorithms that generate cognitive assistants (AI) in the form of computable practice guidelines, and interoperable healthcare data.
This continuous learning system has been a major focus of the AHRQ evidence-based Care Transformation Support program (ACTS).
No patient, doctor, health system, or EHR exists in isolation. To be as effective as possible, the patient journey must span the whole healthcare continuum, just as our patients do. The exchange of data and knowledge is a pillar of this concept.
We can better understand optimal practices and public health at the neighborhood, community, and national levels through a learning health system.
What works and what does not work
Six years after the HITECH Act was enacted, the industry remains divided on whether meaningful use and the expansion of health IT have genuinely aided the establishment of a learning health system.
According to a recent blog post on Health Affairs, “Despite $28 billion in total public investment, true progress toward interoperability has proven elusive.” Providers are struggling to meet even the most basic benchmarks in Stage 2 of the Meaningful Use program.
Another JAMIA survey respondent concurred that progress has been slow. “I don’t believe we have put enough pressure on firms to make their systems interoperable,” she said. “And I believe that the combination of poor technical solutions and a lack of client motivation… It simply feels hopeless, and everyone I ask about where we are with HIT today says it’s number one… Not even close.”
The ONC, for its part, has chastised EHR vendors and self-interested providers who may be purposely limiting data transfer for financial gain.
Charging large fees to organizations that participate in health information sharing is clearly counterproductive. Any commitment to interoperability can quickly die in the absence of an incentive for providers to invest more money in EHRs and associated health IT infrastructure.
“Technology can sometimes impede providers’ ability to practice as they wish,” said Brett Jakovac, Senior Vice President and Managing Director of Government Healthcare Solutions at Xerox. “The benefits do not always outweigh the costs. They can’t afford to integrate all of their systems for the benefit of health programs, private and commercial organizations, and so forth.”
As meaningful use incentive payments shrink, Congress is searching for financial incentives for data-driven care. The government recently scrapped the Sustainable Growth Rate, streamlined quality reporting mechanisms that were wreaking havoc on health IT, and secured sustained Medicare reimbursement increases, giving providers a little more confidence in future cash flows.
These initiatives, in addition to fostering big data analytics, population health management, and patient involvement, may assist practitioners who are burdened by health data interoperability.
Providers at Stage 2 of meaningful use are already reporting more public health data, and the proposed Stage 3 rule stresses healthcare analytics and information exchange as critical competencies.
While the industry has yet to hit the jackpot in terms of health data interoperability, small victories can snowball into larger ones.
In a data-driven healthcare system, what does interoperability imply?
It’s difficult to say whether all of these efforts to increase information flow will be fruitful. Those who oppose meaningful use may also oppose health data interoperability, accusing the industry of not moving fast enough for government regulators and idealists. As technology permeates more and more aspects of patient care, the sector is growing at a breakneck pace.
Providers will notice the financial impact of accountable care, and vendors will be forced to design systems that treat health data as a valuable commercial commodity rather than a dry, static list of diagnosis and treatment codes.
Population health management and proactive preventative treatment will promote innovation in health IT, driving down prices and fostering competition in order to make health information exchange seamless for providers.
Micky Tripathi, CEO of the Massachusetts eHealth Collaborative, stated that, despite what many in the industry, Congress, and the ONC say, we are genuinely in a good position. “That is because the industry is maturing,” he said.
“I believe value-based purchasing has created sufficient demand for that type of interoperability,” he said. Value-based purchasing and accountable care organizations have created a previously unseen need for interoperability. It’s taken a long time, but interoperability in healthcare is finally becoming a reality.
Integration of AI and ML
It is now possible to incorporate the most recent, best-evidence computable practice guidelines (derived from real-world data and experience) into clinical workflows, and thus bring them to patients and their clinicians.
We have EHRs in place, the secure cloud is widely used, and data and knowledge representation standards are available. Both traditional knowledge writers, such as those producing best practice guidelines, and those employing machine learning to construct AI/cognitive assist systems, can benefit from a feedback loop.
Diverse computable practice guidelines and other knowledge artifacts can now be deployed and tested across various EHRs. Machine learning can be used to improve predictive analytics, quality measures, and e-pathways in disease surveillance.
As we gain a greater understanding of patient phenotypes, pharmacogenomics (correlations between genetic polymorphisms and effective therapeutic techniques), socioeconomic determinants of health, and other factors, e-pathways, measurements, and even value-based contracts can become more personalized.
Understanding how each component of a composite knowledge artifact works in practice, providing feedback (both quantitative and qualitative), and rapidly improving the tool are all crucial to achieving this goal.
Never before have we been able to construct and test a composite knowledge artifact using standard data models, nomenclature, value sets, logic, workflow, and implementation options. This knowledge ecosystem, together with frictionless data and knowledge exchange, is how we truly impact patient health and healthcare at scale.