Interoperability has to move from data exchange to data management

The journey toward interoperability in healthcare now needs to move beyond data exchange and focus instead on data management.

This is the opinion of a panel of experts who gathered at the all-virtual Health Datapalooza and National Health Policy Conference Thursday to discuss one of healthcare’s most hotly debated concepts: interoperability.

The healthcare industry has come a long way with regard to interoperability, especially with the new rules issued by the Department of Health and Human Services, set to take effect April 5. These rules aim to provide patients with unprecedented access to their data.

But the healthcare problems of today require solutions that support data management, and not just data exchange, said Claudia Williams, CEO of California-based Manifest MedEX, at the Health Datapalooza conference.

The industry has focused on enabling the basic exchange of health records between providers and made great progress, but the connective tissue that enables data management — including matching and cleaning data — is lacking, she said. And it’s the smaller providers that are being left behind.
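To make that "connective tissue" concrete, the sketch below shows, in a few lines of Python, the kind of matching and cleaning an intermediary performs before records arriving from different providers can be treated as one patient's history. The field names and the matching rule are illustrative assumptions, not any particular organization's actual logic.

```python
# Illustrative only: a toy version of the "matching and cleaning" work
# described above. Field names and the match rule are simplified assumptions.

def normalize(record: dict) -> dict:
    """Clean a raw patient record: trim whitespace, standardize case and dates."""
    return {
        "first": record.get("first", "").strip().lower(),
        "last": record.get("last", "").strip().lower(),
        "dob": record.get("dob", "").replace("/", "-"),  # 1980/01/02 -> 1980-01-02
    }

def is_same_patient(a: dict, b: dict) -> bool:
    """Naive deterministic match on last name and date of birth."""
    a, b = normalize(a), normalize(b)
    return a["last"] == b["last"] and a["dob"] == b["dob"]

# The same person, as recorded by two different providers
from_clinic = {"first": "Maria ", "last": "Lopez", "dob": "1980/01/02"}
from_hospital = {"first": "MARIA", "last": "lopez ", "dob": "1980-01-02"}

print(is_same_patient(from_clinic, from_hospital))  # True
```

Real record-matching systems use far more sophisticated probabilistic rules, but even this toy example shows why the work falls on whoever receives the data rather than on the exchange pipes themselves.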

“In California, a slim share of the health delivery systems, mostly the big health systems, have hundreds of people doing this work, and safety net providers and small Medicaid plans and others really are stuck with just being able to pile up CCDAs [consolidated clinical document architecture] or maybe not even process CCDAs at all,” she said.

Policies and strategies need to be enacted at the state and federal levels to create an infrastructure that is concerned more with the management of data than with health record exchange, Williams said.

While policy action is necessary, there also needs to be better alignment between the needs of providers on the ground and the health IT capabilities available today, said Dr. Farzad Mostashari, CEO of Bethesda, Maryland-based Aledade, during the panel discussion.

Through Aledade, which operates accountable care organizations in partnership with more than 800 primary care practices, Mostashari has experienced that disconnect firsthand.

“What EHRs do today have nothing to do with what I need,” he said. “Well, not nothing, but they really don’t fill the thermometer of what I need to do for population health.”

It is costing Aledade millions to map, match and translate EHR data, he added.
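As a rough illustration of what "map, match and translate" can mean in practice, the hypothetical sketch below maps local lab names from an EHR export to a standard vocabulary, with LOINC used as the example standard. The lookup table and local names are invented for illustration; they are not Aledade's actual pipeline.

```python
# Hypothetical example: translating local EHR lab names to a standard code.
# The mapping table and local names are illustrative, not a real data set.

LOCAL_TO_LOINC = {
    "HBA1C": "4548-4",             # Hemoglobin A1c
    "A1C (POC)": "4548-4",
    "GLUCOSE, FASTING": "1558-6",  # Fasting glucose
}

def translate(local_name: str) -> str | None:
    """Return the standard code for a local lab name, if a mapping exists."""
    return LOCAL_TO_LOINC.get(local_name.strip().upper())

for name in ["HbA1c", "Glucose, fasting", "Unmapped panel"]:
    print(name, "->", translate(name))
```

Maintaining tables like this for every practice's local vocabulary, and keeping them current, is the kind of effort Mostashari says providers should not have to pay for when the data could arrive already mapped to standards.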

Looking ahead, care quality measurement needs to be automated within the EHR and providers should get free access to information that is already mapped in accordance with data standards, Mostashari said.

The technical tools needed to push interoperability forward already exist, but the regulatory landscape needs to catch up, said Donald Trigg, president of North Kansas City, Missouri-based Cerner, during the session.

The government is now both the biggest healthcare regulator in the country and the biggest payer. This means it is in a unique position to use health IT certification and provider reimbursement to help create the interoperability architecture that is necessary for the coming decade, Trigg said.

“I’m still an optimist,” he said. “And I think that Covid and this administration will be an accelerant for the next wave of meaningful data exchange.”

Trigg’s advice for the new administration’s HHS is to tackle the inter-agency complexity that exists at the federal level.

Coordination among agencies such as HHS, the Federal Trade Commission and the Food and Drug Administration is necessary. That means creating clarity around the inter-agency landscape and data management so that the healthcare industry can innovate and do more, Trigg said.

The original article can be found at: MedCity News