
We Can’t Afford To Be Vague About Population Health Challenges

Posted on June 19, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Today, I looked over a recent press release from Black Book Research touting its conclusions on the role of EMR vendors in the population health technology market. Buried in the release were some observations by Alan Hutchison, vice president of Connect & Population Health at Epic.

As part of the text, the release observes that “the shift from quantity-based healthcare to quality-based patient-centric care is clearly the impetus” for population health technology demand. This sets up some thoughts from Hutchison.

The Epic exec’s quote rambles a bit, but in summary, he argues that existing systems are geared to tracking units of care under fee-for-service reimbursement schemes, which makes them dinosaurs.

And what’s the solution to this problem? Why, health systems need to invest in new (Epic) technology geared to tracking patients across their path of care. “Single-solution systems and systems built through acquisition [are] less able to effectively understand the total cost of care and where the greatest opportunities are to reduce variation, improve outcomes and lower costs,” Hutchison says.

Yes, I know that press releases generally summarize things in broad terms, but these words are particularly self-serving and empty, mashing together hot air and jargon into an unappetizing patty. Not only that, the release states as fact a few too many things that are clearly still up for grabs.

Let’s break some of these issues down, shall we?

  • First, I call shenanigans on the notion that the shift to “value-based care” means that providers will deliver quality care over quantity. If nothing else, the shifts in our system can’t be described so easily. Yeah, I know, don’t expect much from a press release, but words matter.
  • Second, though I’m not surprised Hutchison made the argument, I challenge the notion that you must invest in entirely new systems to manage population health.
  • Also, nobody is mentioning that while buying a new system to manage pop health data may be cleaner in some respects, it could make it harder to integrate existing data. Having to do that integration work undercuts the value of the new system, and the cost may even outweigh its benefits.

I don’t know about you, but I’m pretty tired of reading low-calorie vendor quotes about the misty future of population health technology, particularly when a vendor rep claims to have The Answer.  And I’m done with seeing clichéd generalizations about value-based care pass for insight.

Actually, I get a lot more out of analyses that break down what we *don’t* know about the future of population health management.

I want to know what hasn’t worked in transitioning to value-based reimbursement. I hope to see stories describing how health systems identified their care management weaknesses. And I definitely want to find out what worries senior executives about supporting necessary changes to their care delivery models.

It's time to admit that we don't yet know how this population health management thing is going to work and abandon the use of terminally vague generalizations. After all, once we do, we can focus on answering our toughest questions — and that's when we'll begin to make real progress.

UCHealth Adds Claims Data To Population Health Dataset

Posted on April 24, 2017 | Written By Anne Zieger


A Colorado-based health system is implementing a new big data strategy which incorporates not only data from clinics, hospitals and pharmacies, but also a broad base of payer claim data.

UCHealth, which is based in Aurora, includes a network of seven hospitals and more than 100 clinics, caring collectively for more than 1.2 million unique patients in 2016. Its facilities include the University of Colorado Hospital, the principal teaching hospital for the University of Colorado School of Medicine.

Leaders at UCHealth are working to improve their population health efforts by integrating data from seven state insurers, including Anthem Blue Cross and Blue Shield, Cigna, Colorado Access, Colorado Choice Health Plans, Colorado Medicaid, Rocky Mountain Health Plans and United Healthcare.

UCHealth already has an Epic EMR in place system-wide which, as readers might expect, offers a comprehensive view of all patient treatment taking place at its clinics and hospitals.

That being said, the Epic database suffers from the same limitations as any other locally-based EMR. As UCHealth notes, its existing EMR data doesn’t track whether a patient changes insurers, ages into Medicare, changes doctors or moves out of the region.

To close the gaps in its EMR data, UCHealth is using technology from software vendor Stratus, which offers a healthcare data intelligence application. According to the vendor, UCHealth will use Stratus technology to support its accountable care organizations as well as its provider clinical integration strategy.

While health system execs expect to benefit from integrating payer claims data, the effort doesn’t satisfy every item on their wish list. One major challenge they’re facing is that while Epic data is available to all the instant it’s added, the payer data is not. In fact, it can take as much as 90 days before the payer data is available to UCHealth.

That being said, UCHealth’s leaders expect to be able to do a great deal with the new dataset. For example, by using Stratus, physicians may be able to figure out why a patient is visiting emergency departments more than might be expected.

Rather than guessing, physicians will be able to request the diagnoses associated with those visits. If a doctor concludes that a patient's conditions could be treated in one of the system's primary care clinics, he or she can reach out to the patient and explain how clinic-based care can keep them in better health.
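To make the claims-plus-EMR idea concrete, here is a rough Python sketch of how an analyst might combine EMR encounter data with lagged payer claims to surface frequent ED utilizers and the claim diagnoses behind their visits. The file names, column names, visit threshold and 90-day settlement window are my own illustrative assumptions, not anything from UCHealth or Stratus.

```python
# Hypothetical sketch: flag frequent ED utilizers by combining EMR encounters
# with lagged payer claims, then list the claim diagnoses behind their visits.
import pandas as pd

emr_encounters = pd.read_csv("emr_ed_encounters.csv", parse_dates=["visit_date"])
claims = pd.read_csv("payer_claims.csv", parse_dates=["service_date"])

# Payer claims can lag by up to ~90 days, so only trust claims old enough
# to be reasonably complete.
cutoff = pd.Timestamp.today() - pd.Timedelta(days=90)
settled_claims = claims[claims["service_date"] <= cutoff]

# Count ED visits per patient over the past year and flag heavy utilizers.
recent = emr_encounters[
    emr_encounters["visit_date"] >= pd.Timestamp.today() - pd.Timedelta(days=365)
]
visit_counts = recent.groupby("patient_id").size().rename("ed_visits")
frequent = visit_counts[visit_counts >= 4]  # illustrative threshold

# Pull the claim diagnoses tied to those patients' ED visits so a clinician
# can see *why* they keep coming back.
ed_claims = settled_claims[settled_claims["place_of_service"] == "ED"]
diagnoses = (
    ed_claims[ed_claims["patient_id"].isin(frequent.index)]
    .groupby(["patient_id", "diagnosis_code"]).size()
    .rename("claim_count").reset_index()
)
print(diagnoses.sort_values(["patient_id", "claim_count"], ascending=[True, False]))
```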

And of course, the health system will conduct other increasingly standard population health efforts, including spotting health trends across their community and better understanding each patient’s medical needs.

Over the next several months, 36 of UCHealth's primary care clinics will begin using the Stratus tool. While the system hasn't announced a formal pilot test of how Stratus works in a production setting, rolling the technology out to just 36 clinics is clearly a modest start. But if it works, look for other health systems to scoop up claims data too!

Database Linked With Hospital EMR To Encourage Drug Monitoring

Posted on March 31, 2017 | Written By Anne Zieger


According to state officials, Colorado occupies the unenviable position of second worst in the US for prescription drug misuse, with more than 255,000 Coloradans misusing prescribed medications.

One way the state is fighting back is by running the Colorado Prescription Drug Monitoring Program which, like comparable efforts in other states, tracks prescriptions for controlled medications. Every regular business day, the state’s pharmacists upload prescription data for medications listed in Schedules II through V.

While this effort may have value, many physicians haven’t been using the database, largely because it can be difficult to access. In fact, historically physicians have been using the system only about 30 percent of the time when prescribing controlled substances, according to a story appearing in HealthLeaders Media.

As things stand, it can take physicians up to three minutes to access the data, given that they have to sign out of their EMR, visit the PDMP site, log in using separate credentials, click through to the right page, enter patient information and sort through possible matches before they get to the patient's aggregated prescription history. Given the ugliness of this workflow, it's no surprise that clinicians aren't searching out PDMP data, especially if they don't regard a patient as being at high risk for drug abuse or diversion.

But perhaps taking some needless steps out of the process can make a difference, a theory which one of the state’s hospitals is testing. Colorado officials are hoping a new pilot program linking the PDMP database to an EMR will foster higher use of the data by physicians. The pilot, funded by a federal grant through the Bureau of Justice Assistance, connects the drug database directly to the University of Colorado Hospital’s Epic EMR.

The project began with a year-long build-out phase, during which IT leaders created a gateway connecting the PDMP database and the Epic installation. Several months ago, the team followed up with a launch at the school of medicine's emergency medicine department. Eventually, the PDMP database will be available in five EDs which have a combined total of 270,000 visits per year, HealthLeaders notes.

Under the pilot program, physicians can access the drug database with a single click, directly from within the Epic EMR system. Once the PDMP database was made available, the pilot brought physicians on board gradually: first evaluating their baseline use, then giving clinicians raw data, then presenting the data through a risk-stratification tool, and eventually requiring that they use the tool.
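For the curious, the single-click experience boils down to passing the patient context the EMR already knows to a gateway and getting back a dispensation history, with no separate login. The sketch below shows the general shape of such a call; the endpoint, parameters and response format are stand-ins I made up for illustration, not the actual Colorado gateway API.

```python
# Hypothetical sketch of a one-click PDMP lookup made from within the EMR.
import requests

def fetch_pdmp_history(gateway_url: str, token: str,
                       first: str, last: str, dob: str) -> list:
    """Query a PDMP gateway using patient context already open in the EMR,
    so the prescriber never re-enters demographics or separate credentials."""
    resp = requests.get(
        f"{gateway_url}/dispensations",
        params={"given": first, "family": last, "birthdate": dob},
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("dispensations", [])

# Example: wired to a button in the chart, using the open patient's context.
# history = fetch_pdmp_history("https://pdmp-gateway.example.org", token,
#                              "Jane", "Doe", "1980-04-02")
```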

Researchers guiding the pilot are evaluating whether providers use the PDMP more and whether it has an impact on high-risk patients. Researchers will also analyze what happened to patients a year before, during and a year after their ED visits, using de-identified patient data.

It’s worth pointing out that people outside of Colorado are well aware of the PDMP access issue. In fact, the ONC has been paying fairly close attention to the problem of making PDMP data more accessible. That being said, the agency notes that integrating PDMPs with other health IT systems won’t come easily, given that no uniform standards exist for linking prescription drug data with health IT systems. ONC staffers have apparently been working to develop a standard approach for delivering PDMP data to EMRs, pharmacy systems and health information exchanges.

However, at present it looks like custom integration will be necessary. Perhaps pilots like this one will lead by example.

National Health Service Hospitals Use Data Integration Apps

Posted on February 20, 2017 | Written By Anne Zieger


While many providers in the US are still struggling with selecting and deploying apps, the UK National Health Service trusts are ready to use them to collect vital data.

According to the New Scientist, the four National Health Services serving the United Kingdom are rolling out two apps which help patients monitor their health at home. Both of the apps, which are being tested at four hospitals in Oxfordshire, UK, focus on management of a disease state.

One, called GDm-health, helps manage the treatment of gestational diabetes, which affects one in 10 pregnant women. Women use the app to send each of their blood glucose readings to the clinician monitoring their diabetes. The Oxford University Institute of Biomedical Engineering led development of the app, which has allowed patients to avoid needless in-person visits. In fact, the number of patient visits has dropped by 25%, the article notes.

The other app, which was also developed by the Institute, helps patients manage chronic obstructive pulmonary disease, which affects between 1 million and 1.5 million UK patients. COPD patients check their heart rate and blood oxygen saturation every day, entering each result into the app.

After collecting three months of measurements, the app "learns" to recognize what a normal oxygen saturation level is for that patient. Because it has data on what is normal for that patient, it will neither alert clinicians too often nor ignore potential problems. During initial use, the app, which has already been through a 12-month clinical trial, cut hospital admissions among this population by 17% and general practitioner visits by 40%.
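The "learning" here is conceptually simple: build a per-patient baseline from roughly three months of daily readings, then alert only when a new reading falls well outside that baseline. The toy version below illustrates the idea; the 90-day window and alert threshold are assumptions on my part, not the published algorithm behind the Oxford app.

```python
# Toy per-patient baseline: alert only on unusually low SpO2 for *this* patient.
from statistics import mean, stdev

def should_alert(history: list, today: float,
                 min_days: int = 90, z_threshold: float = 2.5) -> bool:
    if len(history) < min_days:
        return False                      # still learning the patient's baseline
    baseline = history[-min_days:]
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return today < mu - 2             # flat baseline: fall back to an absolute drop
    return (mu - today) / sigma > z_threshold

# should_alert(past_readings, 88.0) is True only if 88% is abnormal for this
# particular patient, not for patients in general.
```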

NHS leaders are also preparing to launch a third app soon. The technology, which is known as SEND, is an iPad app designed to collect information on hospital patients. As they make their rounds, nurses will use the app to input data on patients’ vital signs. The system then automatically produces an early warning score for each patient, and provides an alert if the patient’s health may be deteriorating.
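Early warning scores of this kind are typically just weighted bands applied to each vital sign, summed into a single number. The sketch below is a simplified, NEWS-flavored illustration of the concept; the bands and weights are ones I chose for illustration, not the actual SEND scoring logic.

```python
# Simplified early-warning score: higher totals suggest possible deterioration.
def early_warning_score(resp_rate: int, spo2: int, heart_rate: int,
                        systolic_bp: int, temp_c: float) -> int:
    score = 0
    score += 3 if resp_rate <= 8 or resp_rate >= 25 else (2 if resp_rate >= 21 else 0)
    score += 3 if spo2 <= 91 else (2 if spo2 <= 93 else (1 if spo2 <= 95 else 0))
    score += 3 if heart_rate <= 40 or heart_rate >= 131 else \
             (1 if heart_rate <= 50 or heart_rate >= 111 else 0)
    score += 3 if systolic_bp <= 90 or systolic_bp >= 220 else \
             (2 if systolic_bp <= 100 else 0)
    score += 2 if temp_c <= 35.0 else (1 if temp_c >= 38.1 else 0)
    return score

# e.g. early_warning_score(24, 92, 118, 98, 38.4) returns 8, which would
# trigger the "patient may be deteriorating" alert in this toy scheme.
```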

One might think that because UK healthcare is delivered by centralized trusts, providers there don't face data-sharing problems when integrating data from apps like these. But apparently, we would be wrong. According to Rury Holman of the Oxford Biomedical Research Centre, who spoke with New Scientist, few apps are designed to work with the NHS's existing IT systems.

“It’s a bit like the Wild West out there with lots of keen and very motivated people producing these apps,” he told the publication. “What we need are consistent standards and an interface with electronic patient records, particularly with the NHS, so that information, with permission from the patients, can be put to use centrally.”

In other words, even in a system providing government-delivered, ostensibly integrated healthcare, it’s still hard to manage data sharing effectively. Guess we shouldn’t feel too bad about the issues we face here in the US.

UCSF Partners With Intel On Deep Learning Analytics For Health

Posted on January 30, 2017 | Written By Anne Zieger


UC San Francisco’s Center for Digital Health Innovation has agreed to work with Intel to deploy and validate a deep learning analytics platform. The new platform is designed to help clinicians make better treatment decisions, predict patient outcomes and respond quickly in acute situations.

The Center’s existing projects include CareWeb, a team-based collaborative care platform built on Salesforce.com social and mobile communications tech; Tidepool, which is building infrastructure for next-gen smart diabetes management apps; Health eHeart, a clinical trials platform using social media, mobile and realtime sensors to change heart disease treatment; and Trinity, which offers “precision team care” by integrating patient data with evidence and multi-disciplinary data.

These projects seem to be a good fit with Intel’s healthcare efforts, which are aimed at helping providers succeed at distributed care communication across desktop and mobile platforms.

As the two note in their joint press release, creating a deep learning platform for healthcare is extremely challenging, given that the relevant data is complex and stored in multiple incompatible systems. Intel and UCSF say the next-generation platform will address these issues, allowing them to integrate not only data collected during clinical care but also inputs from genomic sequencing, monitors, sensors and wearables.

Supporting all of this activity obviously calls for a lot of computing power. The partners will run deep learning use cases in a distributed fashion on a CPU-based cluster designed to crunch through very large datasets. Intel is rolling out the computing environment on its Xeon processor-based platform, which supports data management and the algorithm development lifecycle.

As the deployment moves forward, Intel leaders plan to study how deep learning analytics and machine-driven workflows can optimize clinical care and patient outcomes, and leverage what they learn when they create new platforms for the healthcare industry. Both partners believe that this model will scale for future use cases, such as larger convolutional neural network models, artificial neural networks patterned after living organisms and very large multidimensional datasets.
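For readers who haven't worked with them, the convolutional neural network models mentioned here belong to a family that can be sketched in a few lines of PyTorch. To be clear, the snippet below is only an illustration of that model family; it is not part of the UCSF/Intel platform, and the input size and class count are placeholders.

```python
# Illustrative-only convolutional network for single-channel medical images.
import torch
import torch.nn as nn

class TinyImagingCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 56 * 56, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)               # (N, 32, 56, 56) for 224x224 inputs
        return self.classifier(x.flatten(1))

model = TinyImagingCNN()
logits = model(torch.randn(4, 1, 224, 224))   # batch of 4 single-channel images
```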

Once implemented, the platform will allow users to conduct advanced analytics on all of this disparate data, using machine learning and deep learning algorithms. And if all performs as expected, clinicians should be able to draw on these advanced capabilities on the fly.

This looks like a productive collaboration. If nothing else, it appears that in this case the technology platform UCSF and Intel are developing may be productized and made available to other providers, which could be very valuable. After all, while individual health systems (such as Geisinger) have the resources to kick off big data analytics projects on their own, it’s possible a standardized platform could make such technology available to smaller players. Let’s see how this goes.

Do Health IT Certificate Of Need Requirements Make Sense?

Posted on January 23, 2017 | Written By Anne Zieger


The other day, I read an interesting piece about the University of Vermont Medical Center's plans to create an integrated EMR connecting its four network hospitals. The article noted that unlike its peers in some other states, UVMC was required to file a Certificate of Need (CON) application with the state before proceeding with the work. And that struck me as deserving some analysis.

According to a story appearing in Healthcare Informatics,  UVMC plans to invest an initial $112.4 million in the project, which includes an upgrade to informatics, billing and scheduling systems used by UVMC and network facilities Central Vermont Medical Center, Champlain Valley Physicians Hospital and Elizabethtown Community Hospital. The total costs of implementing and operating the integrated system should hit $151.6 million over the first six years. (For all of you vendor-watchers, UVMC is an Epic shop.)

In its CON application, UVMC noted that some of the systems maintained by network hospitals are 20 years old and in dire need of replacement. It also asserted that if the four hospitals made upgrades independently rather than in concert, it would cost $200 million and still leave the facilities without a connection to each other.

Given the broad outline provided in the article, these numbers seem reasonable, perhaps even modest given what execs are trying to accomplish. And that would be all most hospital executives would need to win the approval of their board and steam ahead with the project, particularly if they were gunning for value-based contracts.

But of course, none of this means that such investments aren't risky, or that they can't trigger a financial meltdown. There are countless examples of health systems that have faced major financial problems (like this and this), operational problems (particularly in this case) or have been forced to make difficult tradeoffs (such as this). And their health IT decisions can have a major impact on the rest of the marketplace, which sometimes bears the indirect costs of any mistakes they make.

Given these concerns, I think there's an argument to be made for requiring hospitals to get CONs for major health IT investments. If there's any case to be made for CON programs at all, I can't see why it wouldn't apply here. After all, the idea behind them is to look at the big picture rather than the incremental successes of one organization. If investment in, say, MRIs can increase costs needlessly, the big bucks dropped on health IT systems certainly could.

Part of the reason I sympathize with these requirements is that I believe healthcare IS fundamentally different from any other industry, and that, as a public good, it should face oversight that other industries do not. Simply put, healthcare costs are everybody's costs, and that's unique.

What’s more, I’m all too familiar with the bubble in which hospital execs and board members often live. Because they are compelled to generate the maximum profit (or excess) they can, there’s little room for analyzing how such investments impact their communities over the long term. Yes, the trend toward ACOs and population health may mitigate this effect to some degree, but probably not enough.

Of course, there are lots of arguments against CONs, and against government intervention in the marketplace generally. If nothing else, it's obvious that CON board members aren't necessarily impartial arbiters of truth. (I once knew a consultant who pushed CONs through for a healthcare chain; according to this consultant, whichever competitor presented the last – not the best – statistics to the room almost always won.)

Regardless, I'd be interested in seeing the results of health IT CON requirements studied in five or ten years, to learn whether they had any measurable impact on healthcare competition and costs. We'd learn a lot about health IT market dynamics, don't you think?

“Learning Health System” Pilot Cuts Care Costs While Improving Quality

Posted on January 11, 2017 | Written By Anne Zieger


As some of you will know, the goal of the ONC's Shared Nationwide Interoperability Roadmap is to create a "nationwide learning health system." In this system, individuals, providers and organizations will freely share health information, but more importantly, will share that information in "closed loops" which allow for continuous learning and care improvement.

When I read about this model – which is backed by the Institute of Medicine — I thought it sounded interesting, but didn’t think it terribly practical. Recently, though, I stumbled upon an experiment which attempts to bring this approach to life. And it’s more than just unusual — it seems to be successful.

What I’m talking about is a pilot study, done by a team from Nationwide Children’s Hospital and The Ohio State University, which involved implementing a “local” learning health system. During the pilot, team members used EHR data to create personalized treatments for patients based on data from others with similar conditions and risk factors.

To date, building a learning health system has been very difficult, largely because integrating EHRs across multiple hospital systems is so hard. For that reason, researchers with the two organizations decided to implement a "local" learning health system, according to a press statement from Nationwide Children's.

To build the local learning health system, the team from Nationwide Children’s and Ohio State optimized the EHR to support their efforts. They also relied on a “robust” care coordination system which sat at the core of the EHR. The pilot subjects were a group of 131 children treated through the hospital’s cerebral palsy program.

Children treated in the 12-month program, named “Learn From Every Patient,” experienced a 43% reduction in total inpatient days, a 27% reduction in inpatient admissions, a 30% reduction in emergency department visits and a 29% reduction in urgent care visits.

The two institutions spent $225,000 to implement the pilot during the first year. However, the return on this investment was dramatic.  Researchers concluded that the program cut healthcare costs by $1.36 million. This represented a savings of about $6 for each dollar invested.

An added benefit of the program was that clinicians working in the CP clinic found that this approach to care simplified documentation, which saved time and made it possible for them to see more patients during each session.

Not surprisingly, the research team thinks this approach has a lot of potential. “This method has the potential to be an effective complementary or alternative strategy to the top-down approach of learning health systems,” the release said. In other words, maybe bottom-up, incremental efforts are worth a try.

Given these results, it'd be nice to think that we'll have full interoperability someday, and that we'll be able to scale the learning health system approach up to the whole US. In the meantime, it's good to see at least one health system make some headway with it.

Bringing EHR Data to Radiologists

Posted on December 2, 2016 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

One of the most interesting things I saw at RSNA 2016 in Chicago this week was Philips' Illumeo. Besides offering the slick radiology interface Philips has been refining for years, Illumeo adds a kind of "war room" dashboard for the patient, populated with data pulled from the EHR using FHIR.

When I talked with Yair Briman, General Manager for Healthcare Informatics Solutions and Services at Philips, he described the various algorithms and machine learning that go into the interface a radiologist sees in Illumeo. As has become an issue in much of healthcare IT, the amount of health data that's available for a patient is overwhelming. In Illumeo, Philips is working to present only the information that's needed for the patient at the time it's needed.

For example, if I'm reading a head injury study, do I want to see an old X-ray from a knee issue the patient had 20 years ago? Probably not, so that information can be hidden. I may be interested in the problem list from the EHR, but do I really need to know about a cold from 10 years ago? Probably not. Notice the "probably." The radiologist can still drill down into the rest of the medical history if they want, but this type of smart interface, which understands context and hides irrelevant info, is something we're seeing across all of healthcare IT. It's great to see Philips working on it for radiologists.

While creating a relevant, adaptive interface for radiologists is great, I was fascinated by Philips' work pulling EHR data into the radiologist's native interface. Far too often we only talk about the exchange happening in the other direction. It's great to see third-party applications utilizing data from the EHR.
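To give a sense of what pulling EHR data into the radiologist's interface can look like in practice, here is a rough sketch of fetching a patient's active problem list over FHIR and filtering it to the current study's context. The server URL, the keyword-matching approach and the keywords themselves are my own illustrative assumptions, not how Illumeo actually decides relevance.

```python
# Hypothetical sketch: fetch active Conditions over FHIR, keep context-relevant ones.
import requests

FHIR_BASE = "https://ehr.example.org/fhir"   # placeholder endpoint

def relevant_conditions(patient_id: str, context_keywords: set) -> list:
    resp = requests.get(
        f"{FHIR_BASE}/Condition",
        params={"patient": patient_id, "clinical-status": "active"},
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    resp.raise_for_status()
    names = []
    for entry in resp.json().get("entry", []):
        text = entry["resource"].get("code", {}).get("text", "").lower()
        if any(keyword in text for keyword in context_keywords):
            names.append(text)
    return names

# For a head CT, surface neuro/head-related problems and hide the rest; the
# full history stays one click away rather than cluttering the screen.
# relevant_conditions("12345", {"head", "concussion", "stroke", "migraine"})
```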

In my discussion with Yair Briman, he pointed out some interesting data. He commented that Philips manages 135 billion images. For those keeping track at home, that amounts to more than 25 petabytes of data. I don’t think most reading this understand how large a petabyte of data really is. Check out this article to get an idea. Long story short: that’s a lot of data.

How much data is in every EHR? Maybe one petabyte? This is just a guess, but it's significantly smaller than imaging, since most EHR data is text. OK, so the EHR data is probably 100 terabytes of text and 900 terabytes of scanned faxes. (Sorry, I couldn't help but take a swipe at faxes.) Regardless, this pales in comparison to the size of radiology data. With this difference in mind, should we stop thinking about trying to pull the radiology data into the EHR and start spending more time on how to pull the EHR data into a PACS viewer?
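If you want to sanity-check those round numbers yourself, the arithmetic is quick (decimal units, and both figures are the rough estimates quoted above, not exact measurements):

```python
# Back-of-the-envelope math on the figures quoted above.
images = 135e9            # 135 billion images
imaging_bytes = 25e15     # ~25 PB of imaging data
ehr_bytes = 1e15          # the ~1 PB guess for a typical EHR

print(f"avg per image: {imaging_bytes / images / 1e3:.0f} KB")        # ~185 KB
print(f"imaging vs. EHR: {imaging_bytes / ehr_bytes:.0f}x larger")    # ~25x
```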

What was also great about the Philips product I saw was its really slick browser-based HTML5 viewer for radiology images. Certainly this is a great way to send radiology images to a referring physician, but it also points to the opportunity to link to all of these radiology images from the EHR. The reality is that most doctors don't need all the radiology images in the EHR. However, if they had an easy link to access the radiology images in a browser when they did need them, that would be a powerful thing. In fact, I think many of the advanced EHR implementations have, or are working on, this type of integration.

Of course, we shouldn't just stop with physicians. How about linking to all your radiology images from the patient portal as well? It's nice when they hand you a DVD of your radiology images. It would be much nicer to be able to access them easily, anytime and from anywhere, through the patient portal. The great part is, the technology to make this happen already exists. Now we just need to implement it and open this data up to patients.

All in all, I love that Philips is bringing the EHR data to the radiologists. That context can really improve healthcare. I also love that they’re working to make the interface smarter by removing data that’s irrelevant to the specific context being worked on. I also can’t wait until they make all of this imaging data available to patients.

Does Clinical Integration Call For New Leaders?

Posted on October 10, 2016 | Written By Anne Zieger


For quite some time now, U.S. healthcare reforms have been built around the idea that we need to achieve clinical integration between key care partners. Under these emerging models of integration, it isn’t good enough for physicians and hospitals to have a general sense of what care the other is delivering. Instead, the idea is for independent entities to function as much as possible as though they were part of the same organization.

Of course, for these partners in an integrated system to work together, they have to share a great deal of data on a patient, if not necessarily every scrap of their lifetime medical record. In other words, some degree of data integration isn’t “nice to have,” it’s a “must have.”  In fact, I wouldn’t be the first to suggest that without data integration, effective clinical integration is basically impossible.

However, while readers of this publication aren't ignorant of this fact, my sense is that some participants in such schemes are hoping to jump in with both feet and figure out data sharing models later. This is mostly a hunch, but I'm pretty sure it's happening, and moreover, I'm convinced that the mediocre performance of most ACOs is due to a leap-before-you-look approach to data sharing.

I don’t know if any models exist that emerging integrated clinical entities can use to lay out data pathways before they’re under the gun. But my sense is that we spend too little time figuring this out in advance.

Generally speaking, my guess is that these ACO partnerships and other integrated care projects are being driven by old-school healthcare execs. By this I mean folks who understand very well how to build for cross-referrals between entities, forge partnerships that help all the hospitals and doctors involved do better in insurance negotiations, negotiate with healthcare purchasers such as large employers, and the like.

Having followed such folks for some 25 years, I have nothing but respect for their strategic skills. However, I sort of doubt that they are the right people to guide larger healthcare organizations into the age of clinical and technical integration. While they might be very smart, their intuition tells them to hold back data as a proprietary asset, not share it with partners who might be competitors again in the future. And while it’s understandable why they think this way, it’s not constructive today.

Don't get me wrong, I'm not suggesting that CXOs with decades of experience have suddenly become dinosaurs. There's still plenty of work for them to do, and most of it is vitally important to the future of the health system. But if they want to be successful, they'll have to turn their thinking around regarding data integration with partners. And if they can't do that, it may very well be time to bring in some fresh blood.

Hospitals, Groups Come Together To Create Terminology For Interoperability

Posted on August 5, 2016 | Written By Anne Zieger


A health IT trade coalition dedicated to supporting data interoperability has kicked off an effort intended to fuel the development of shareable health IT apps.

The Healthcare Services Platform Consortium, whose members include Intermountain Healthcare, the American Medical Association, Louisiana State University, the Veterans Health Administration and the Regenstrief Institute, is working to increase interoperability by defining open, standards-based specifications for enterprise clinical services and clinical applications.

Its members came together to create a services-oriented architecture platform that supports a new marketplace for interoperable healthcare applications, according to Healthcare Informatics. Stan Huff, MD, CMIO of Intermountain, has said that he'd like to see more shareable clinical decision support modules developed.

Now, in furtherance of these goals, HSPC members are throwing their support behind an initiative known as SOLOR, which calls for integrating SNOMED CT and Laboratory LOINC, as well as selected components of RxNorm. According to the group, SOLOR will provide a terminology foundation for CIMI (Clinical Information Modeling Initiative) efforts, as well as FHIR profile development.
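To see how these terminologies typically divide the labor, here is an illustrative FHIR-style lab result: LOINC names the test, SNOMED CT codes the interpretation, and RxNorm codes a related medication. The specific codes are examples I picked for illustration, not content defined by SOLOR.

```python
# Illustrative FHIR-style Observation showing where each terminology fits.
observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {   # what was measured -> LOINC
        "coding": [{"system": "http://loinc.org", "code": "4548-4",
                    "display": "Hemoglobin A1c/Hemoglobin.total in Blood"}]
    },
    "valueQuantity": {"value": 7.2, "unit": "%"},
    "interpretation": [{   # clinical meaning -> SNOMED CT
        "coding": [{"system": "http://snomed.info/sct", "code": "281302008",
                    "display": "Above reference range"}]
    }],
}

related_medication = {   # a related therapy -> RxNorm
    "system": "http://www.nlm.nih.gov/research/umls/rxnorm",
    "code": "860975",
    "display": "metformin hydrochloride 500 MG Oral Tablet",
}
```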

“We hope SOLOR can serve as a foundation to deliver sharable clinical decision-support capability both within the VA and ultimately throughout the nation’s healthcare system,” said Veterans Health Administration deputy CMIO for strategy and functional design Jonathan Nebeker, M.S., M.D., in a prepared statement.

Ultimately, HSPC hopes to create an “app store” model for plug-and-play healthcare applications. As HSPC envisions it, the app store will support common services and models that vendors can use to shorten software development lifecycles.

Not only that, the evolving standards-oriented architecture will allow multiple providers and other organizations to each deliver different parts of a solution set. This solution set will be designed to address care coordination, gaps in workflow between systems and workflows that cut across acute care, ambulatory care and patient-centered medical home models.

Industry players have already created a small selection of apps built on the SMART technology platform, roughly three dozen to date. The apps, some of which are experimental, include a tool estimating a patient’s cardiac risk, a SMART patient portal, a tool for accessing the Cerner HIE on SMART and an app called RxCheck offering real-time formulary outcomes, adherence data, clinical protocols and predictive analytics for individual patients.
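All of these apps share the same basic SMART-on-FHIR pattern: launch against a FHIR server, read the patient in context, and do something useful with the data. A minimal sketch using the open-source fhirclient Python package might look like the following; the sandbox URL, app id and patient id are placeholders rather than details from any of the apps listed above.

```python
# Minimal SMART-on-FHIR style read using the fhirclient package.
from fhirclient import client
import fhirclient.models.patient as p
import fhirclient.models.observation as obs

settings = {
    "app_id": "demo_risk_app",                      # placeholder app id
    "api_base": "https://r4.smarthealthit.org",     # placeholder/sandbox endpoint
}
smart = client.FHIRClient(settings=settings)

patient = p.Patient.read("example-patient-id", smart.server)
print(smart.human_name(patient.name[0]))

# Pull a few recent observations, e.g. to feed a risk calculator.
search = obs.Observation.where(struct={"patient": "example-patient-id", "_count": "10"})
for o in search.perform_resources(smart.server):
    label = o.code.text if o.code and o.code.text else "unknown"
    print(label, getattr(o, "valueQuantity", None))
```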

Now, leaders of the HSPC – notably Intermountain’s Huff – would like to scale up the process of interoperable app development substantially. According to Healthcare Informatics, Huff told an audience that while his organization already has 150 such apps, he’d like to see many more. “With the budget we have and other constraints, we’ll never get from 150 to 5,000,” Huff said. “We realized that we needed to change the paradigm.”