
Is There a Case to Be Made that Interoperability Saves Hospitals Money?

Posted on April 17, 2017 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Back in 2013 I argued that we needed a lot less talk and a lot more action when it came to interoperability in healthcare. It seemed clear to me then, and still does, that sharing health data is the right thing to do for the patient. I have yet to meet someone who thinks that sharing a person’s health data with their providers is not the right thing to do for the patient. No doubt we shouldn’t be reckless with how we share the data, but patient care would improve if we shared data more than we do today.

While the case for sharing health data seems clear from the patient perspective, there were obvious business reasons why many organizations didn’t want to share their patients’ health data. From a business perspective, sharing was often seen as an expense they’d incur that could actually cause them to lose money.

The tension between these two perspectives is what makes healthcare interoperability so challenging. We all know it’s the right thing to do, but there are business reasons why it doesn’t make sense to invest in it.

While I understand both sides of the argument, I wondered if we could make the financial case for why a hospital or healthcare organization should invest in interoperability.

The easy argument is that value-based care is going to require you to share data to be successful. The repeat X-ray that was once seen as a great revenue source will become a cost center in a value-based reimbursement world. At least that’s the idea, and healthcare organizations should prepare for it. That’s all well and good, but the value-based reimbursement stats show that we’re not there yet.

What are the other cases we can make for interoperability actually saving hospitals money?

I recently saw a stat that 70% of accidental deaths and injuries in hospitals are caused by communication issues. Accidental deaths and injuries are very expensive for a hospital. How many lives could be saved, hospital readmissions avoided, or accidental injuries prevented if providers had the right health data at the right place and the right time?

My guess is that not having the right healthcare data to treat a patient correctly is a big problem that causes a lot of patients to suffer needlessly. I wonder how many malpractice lawsuits could be avoided if providers had the patient’s full health record available to them. Should malpractice insurance companies start offering healthcare organizations and doctors a discount if they have high-quality interoperability solutions in their organization?

Obviously, I’m just exploring this idea. I’d love to hear your thoughts on it. Can interoperability solutions help a hospital save money? Are there financial reasons why interoperability should be implemented now?

While I still think we should make health data interoperability a reality because it’s the right thing to do for patients, it seems like we need to dive deeper into the financial reasons why we should be sharing patients’ health data. Otherwise, we’ll likely never see the needle move when it comes to health data sharing.

Database Linked With Hospital EMR To Encourage Drug Monitoring

Posted on March 31, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

According to state officials, Colorado occupies the unenviable position of second worst in the US for prescription drug misuse, with more than 255,000 Coloradans misusing prescribed medications.

One way the state is fighting back is by running the Colorado Prescription Drug Monitoring Program which, like comparable efforts in other states, tracks prescriptions for controlled medications. Every regular business day, the state’s pharmacists upload prescription data for medications listed in Schedules II through V.

While this effort may have value, many physicians haven’t been using the database, largely because it can be difficult to access. In fact, historically physicians have been using the system only about 30 percent of the time when prescribing controlled substances, according to a story appearing in HealthLeaders Media.

As things stand, it can take physicians up to three minutes to access the data, given that they have to sign out of their EMR, visit the PDMP site, log in using separate credentials, click through to the right page, enter patient information and sort through possible matches before they get to the patient’s aggregated prescription history. Given the ugliness of this workflow, it’s no surprise that clinicians aren’t searching out PDMP data, especially if they don’t regard a patient as being at a high risk for drug abuse or diversion.

But perhaps taking some needless steps out of the process can make a difference, a theory which one of the state’s hospitals is testing. Colorado officials are hoping a new pilot program linking the PDMP database to an EMR will foster higher use of the data by physicians. The pilot, funded by a federal grant through the Bureau of Justice Assistance, connects the drug database directly to the University of Colorado Hospital’s Epic EMR.

The project began with a year-long build-out phase, during which IT leaders created a gateway connecting the PDMP database and the Epic installation. Several months ago, the team followed up with a launch at the school of medicine’s emergency medicine department. Eventually, the PDMP database will be available in five EDs, which have a combined total of 270,000 visits per year, HealthLeaders notes.

Under the pilot program, physicians can access the drug database with a single click, directly from within the Epic EMR system. Once the PDMP database was made available, the pilot brought physicians on board gradually, moving from evaluating their baseline use, to giving clinicians the raw data, to presenting that data through a risk-stratification tool, and eventually to requiring that they use the tool.
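
To make the single-click idea concrete, here is a minimal sketch of what an EMR-embedded PDMP lookup might look like. The gateway URL, field names, and Python client below are hypothetical illustrations for this post, not the actual Colorado gateway or Epic integration.

```python
import requests

# Hypothetical gateway endpoint -- the real Colorado PDMP/Epic integration details
# are not public, so this URL and the field names below are illustrative only.
PDMP_GATEWAY_URL = "https://pdmp-gateway.example.org/api/v1/prescription-history"


def fetch_pdmp_history(patient: dict, session_token: str) -> list:
    """Query a (hypothetical) PDMP gateway for a patient's aggregated
    controlled-substance prescription history, reusing demographics the EMR
    already holds so the clinician never re-enters them or logs in twice."""
    params = {
        "first_name": patient["first_name"],
        "last_name": patient["last_name"],
        "dob": patient["dob"],  # ISO 8601 date, e.g. "1980-05-17"
    }
    headers = {"Authorization": f"Bearer {session_token}"}
    response = requests.get(PDMP_GATEWAY_URL, params=params, headers=headers, timeout=5)
    response.raise_for_status()
    # Expected shape (hypothetical): a list of Schedule II-V fills, each with
    # drug name, fill date, prescriber and dispensing pharmacy.
    return response.json()


# Example use from an EMR-embedded app -- one call replaces the manual workflow
# of signing out, visiting the PDMP site and re-keying patient demographics:
#
#   history = fetch_pdmp_history(
#       {"first_name": "Jane", "last_name": "Doe", "dob": "1980-05-17"},
#       session_token="emr-issued-token",
#   )
#   for fill in history:
#       print(fill["drug"], fill["fill_date"], fill["prescriber"])
```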

Researchers guiding the pilot are evaluating whether providers use the PDMP more and whether it has an impact on high-risk patients. Researchers will also analyze what happened to patients a year before, during and a year after their ED visits, using de-identified patient data.

It’s worth pointing out that people outside of Colorado are well aware of the PDMP access issue. In fact, the ONC has been paying fairly close attention to the problem of making PDMP data more accessible. That being said, the agency notes that integrating PDMPs with other health IT systems won’t come easily, given that no uniform standards exist for linking prescription drug data with health IT systems. ONC staffers have apparently been working to develop a standard approach for delivering PDMP data to EMRs, pharmacy systems and health information exchanges.

However, at present it looks like custom integration will be necessary. Perhaps pilots like this one will lead by example.

The Distributed Hospital On The Horizon

Posted on February 24, 2017 | Written By Anne Zieger

If you’re reading this blog, you already know that distributed, connected devices and networks are the future of healthcare. Connected monitoring devices are growing more mature by the day, network architectures are becoming amazingly fluid, and with the growth of the IoT, we’re adding huge numbers of smart devices to an already-diverse array of endpoints. While we may not know what all of this will look like when it’s fully mature, we’ve already made amazing progress in connecting care.

But how will these trends play out? One nice look at where all this is headed comes from Jeroen Tas, chief innovation and strategy officer at Philips. In a recent article, Tas describes a world in which even major brick-and-mortar players like hospitals go almost completely virtual.  Certainly, there are other takes out there on this subject, but I really like how Tas explains things.

He starts with the assertion that the hospital of the future “is not a physical location with waiting rooms, beds and labs.” Instead, a hospital will become an abstract network overlay connecting nodes. It’s worth noting that this isn’t just a concept. For an example, Tas points to the Mercy Virtual Care Center, a $54 million “hospital without beds” dedicated to telehealth and connected care.  The Center, which has over 300 employees, cares for patients at home and in beds across 38 hospitals in seven states.

While the virtual hospital may not rely on a single, central campus, physical care locations will still matter – they’ll just be distributed differently. According to Tas, the connected health network will work best if care is provided as needed through retail-type outlets near where people live, specialist hubs, inpatient facilities and outpatient clinics. Yes, of course, we already have all of these things in place, but in the new connected world, they’ll all be on a single network.

Ultimately, even if brick-and-mortar hospitals never disappear, virtual care should make it possible to cut down dramatically on hospital admissions, he suggests.  For example, Tas notes that Philips partner Banner Health has slashed hospital admissions almost 50% by using telehealth and advanced analytics for patients with multiple chronic conditions. (We’ve also reported on a related pilot by Partners HealthCare Brigham and Women’s Hospital, the “Home Hospital,” which sends patients home with remote monitoring devices as an alternative to admissions.)

Of course, the broad connected care outline Tas offers can only take us so far. It’s all well and good to have a vision, but there are still some major problems we’ll have to solve before connected care becomes practical as a backbone for healthcare delivery.

After all, to cite one major challenge, community-wide connected health won’t be very practical until interoperable data sharing becomes easier – and we really don’t know when that will happen. Also, until big data analytics tools are widely accessible (rather than the province of the biggest, best-funded institutions) it will be hard for providers to manage the data generated by millions of virtual care endpoints.

Still, if Tas’s piece is any indication, consensus is building on what next-gen care networks can and should be, and there are certainly plenty of ways to lay the groundwork for the future. Even small-scale, preliminary connected health efforts seem to be fostering meaningful changes in how care is delivered. And there’s little doubt that over time, connected health will turn many brick-and-mortar care models on their heads, becoming a large – or even dominant – part of care delivery.

Getting there may be tricky, but if providers keep working at connected care, it should offer an immense payoff.

Many Providers Still Struggle With Basic Data Sharing

Posted on February 15, 2017 | Written By Anne Zieger

One might assume that by this point, virtually every provider with a shred of IT in place is doing some form of patient data exchange. After all, many studies tout the number of healthcare data send and receive transactions a given vendor network or HIE has seen, and it sure sounds like a lot. But if a new survey is any indication, such assumptions are wrong.

According to a study by Black Book Research, which surveyed 3,391 current hospital EMR users, 41% of responding medical record administrators find it hard to exchange patient health records with other providers, especially if the physicians involved aren’t on their EMR platform. Worse, 25% said they still can’t use any patient information that comes in from outside sources.

The problem isn’t a lack of interest in data sharing. In fact, Black Book found that 81% of network physicians hoped that their key health system partners’ EMR would provide interoperability among the providers in the system. Moreover, the respondents say they’re looking forward to working on initiatives that depend on shared patient data, such as value-based payment, population health and precision medicine.

The problem, as we all know, is that most hospitals are at an impasse and can’t find ways to make interoperability happen. According to the survey, 70% of hospitals that responded weren’t using information outside of their EMR.  Respondents told Black Book that they aren’t connecting clinicians because external provider data won’t integrate with their EMR’s workflow.

Even if the data flows are connected, that may not be enough. Researchers found that 22% of surveyed medical record administrators felt that transferred patient information wasn’t presented in a useful format. Meanwhile, 21% of hospital-based physicians contended that shared data couldn’t be trusted as accurate when it was transmitted between different systems.

Meanwhile, the survey found, technology issues may be a key breaking point for independent physicians, many of whom fear that they can’t make it on their own anymore.  Black Book found that 63% of independent docs are now mulling a merger with a big healthcare delivery system to both boost their tech capabilities and improve their revenue cycle results. Once they have the funds from an acquisition, they’re cleaning house; the survey found that EMR replacement activities climbed 52% in 2017 for acquired physician practices.

Time for a comment here. I wish I agreed with medical practice leaders that being acquired by a major health system would solve all of their technical problems. But I don’t, really. While being acquired may give them an early leg up, allowing them to dump their arguably flawed EMR, I’d wager that they won’t have the attention of senior IT people for long.

My sense is that hospital and health system leaders are focused externally rather than internally. Most of the big threats and opportunities – like ACO integration – are coming at leaders from the outside.

True, while a practice is a valuable ally but still independent of the health system, CIOs and VPs may spend lots of time and money to link arms with it technically. But once the practice is in house, it’s more of a “get in line” situation, from what I’ve seen. Readers, what is your experience?

Boston Children’s Benefits From the Carequality and CommonWell Agreement

Posted on February 3, 2017 | Written By Anne Zieger

Recently two of the bigger players working on health data interoperability – Carequality and the CommonWell Health Alliance – agreed to share data with each other. The two, which were fierce competitors, agreed that CommonWell would share data with any Carequality participant, and that Carequality users would be able to use the CommonWell record locator service.

That is all well and good, but at first I wasn’t sure if it would pan out. Being the cranky skeptic that I am, I assumed it would take quite a while for the two to get their act together, and that we’d hear little more of their agreement for a year or two.

But apparently, I was wrong. In fact, a story by Scott Mace of HealthLeaders suggests that Boston Children’s Hospital and its physicians are likely to benefit right away. According to the story, the hospital and its affiliated Pediatric Physicians Organization at Children’s Hospital (PPOC) will be able to swap data nicely despite their using different EMRs.

According to Mace, Boston Children’s runs a Cerner EMR, as well as an Epic installation to manage its revenue cycle. Meanwhile, PPOC is going live with Epic across its 80 practices and 400 providers. On the surface, the mix doesn’t sound too promising.

To add even more challenges to the mix, Boston Children’s also expects an exponential jump in the number of patients it will be caring for via its Medicaid ACO, the article notes.

Without some form of data sharing compatibility, the hospital and practice would have faced huge challenges, but now they have an option. Boston Children’s is joining CommonWell, and PPOC is joining Carequality, solving a problem the two have struggled with for a long time, Mace writes.

Previously, the story notes, the hospital tried unsuccessfully to work with a local HIE, the Mass Health Information HIway. According to hospital CIO Dan Nigrin, MD, who spoke with Mace, providers using Mass Health were usually asked to push patient data to their peers via Direct protocol, rather than pull data from other providers when they needed it.

Under the new regime, however, providers will have much more extensive access to data. Also, the two entities will face fewer data-sharing hassles, such as establishing point-to-point or bilateral exchange agreements with other providers, PPOC CIO Nael Hafez told HealthLeaders.

Even this step upwards does not perfect interoperability make. According to Micky Tripathi, president and CEO of the Massachusetts eHealth Collaborative, providers leveraging the CommonWell/Carequality data will probably customize their experience. He contends that even those who are big fans of the joint network may add, for example, additional record locator services such as one provided by Surescripts. But it does seem that Boston Children’s and PPOC are, well, pretty psyched to get started with data sharing as is.

Now, back to me as Queen Grump again. I have to admit that Mace paints a pretty attractive picture here, and I wish Boston Children’s and PPOC much success. But my guess is that there will still be plenty of difficult issues to work out before they have even the basic interoperability they’re after. Regardless, some hope of data sharing is better than none at all. Let’s just hope this new data sharing agreement between CommonWell and Carequality lives up to its billing.

Some Projections For 2017 Hospital IT Spending

Posted on January 4, 2017 | Written By Anne Zieger

A couple of months ago, HIMSS released some statistics from its survey on US hospitals’ plans for IT investment over the next 12 months. The results contain a couple of data points that I found particularly interesting:

  • While I had expected the most common type of planned spending to be focused on population health or related solutions, HIMSS found that pharmacy was the most active category. In fact, 51% of hospitals were planning to invest in at least one pharmacy technology, largely to improve tracking of medication dispensing in additional patient care environments. Researchers also found that 6% of hospitals were planning to add carousels or packagers in their pharmacies.
  • Eight percent of hospitals said that they plan to invest in EMR components, which I hadn’t anticipated (though it makes sense in retrospect). HIMSS reported that 14% of hospitals at Stages 1-4 of its Electronic Medical Record Adoption Model are investing in pharmacy tech for closed-loop med administration, and 17% in auto-ID tech. Four percent of Stage 6 hospitals plan to support or expand information exchange capabilities. Meanwhile, 60% of Stage 7 hospitals are investing in hardware infrastructure “for the post-EMR world.”

Other data from the HIMSS report included news of new analytics and telecom plans:

  • Researchers say that recent mergers and acquisitions are triggering new investments around telephony. They found that 12% of hospitals with inpatient revenues between $25 million and $125 million – and 6% of hospitals with more than $500 million in inpatient revenues — are investing in VOIP and telemedicine. FWIW, I’m not sure how mergers and acquisitions would trigger telemedicine rollouts, as they’re already well underway at many hospitals — maybe these deals foster new thinking and innovation?
  • As readers know, hospitals are increasingly spending on analytics solutions to improve care and make use of big data. However (and this surprised me), only 8% of hospitals reported plans to buy at least one analytics technology. My guess is that this number is small because a) hospitals may not have collected their big data assets in easily-analyzed form yet and b) they’re still hoping to make better use of their legacy analytics tools.

Looking at these stats as a whole, I get the sense that the hospitals surveyed are expecting to play catch-up and shore up their infrastructure next year, rather than sink big dollars into future-looking solutions.

Without a doubt, hospital leaders are likely to invest soon in game-changing technologies, such as cutting-edge patient engagement and population health platforms, to prepare for the shift to value-based care. It’s inevitable.

But in the meantime it probably makes sense for them to focus on internal cost drivers like pharmacy departments, whose average annual inpatient drug spending shot up by more than 23% between 2013 and 2015. Without stanching that kind of bleeding, hospitals are unlikely to get as much value as they’d like from big-idea investments in the future.

A Look At Geisinger’s Big Data Efforts

Posted on December 28, 2016 | Written By Anne Zieger

This week I got a look at a story appearing in a recent issue of Harvard Business Review which offers a description of Geisinger Health System’s recent big data initiatives. The ambitious project is designed not only to track and analyze patient outcomes, but also to visualize healthcare data across cohorts of patients and networks of providers and even correlate genomic sequences with clinical care. Particularly given that Geisinger has stayed on the cutting edge of HIT for many years, I think it’s worth a look.

As the article’s authors note, Geisinger rolled out a full-featured EMR in 1996, well ahead of most of its peers. Like many other health systems, Geisinger has struggled to aggregate and make use of data. That’s particularly the case because, as with other systems, the legacy analytics systems Geisinger still has in place can’t accommodate the growing flood of new data types emerging today.

Last year, Geisinger decided to create a new infrastructure which could bring this data together. It implemented a Unified Data Architecture (UDA), allowing it to integrate big data into its existing data analytics and management. According to the article, Geisinger’s UDA rollout is the largest practical application of point-of-care big data in the industry. Of particular note, Geisinger is crunching not only enterprise healthcare data (including HIE inputs, clinical departmental systems and patient satisfaction surveys) and data from consumer health tools (like smartphone apps), but even grocery store and loyalty program info.

Though all of its data hasn’t yet been moved to the UDA, Geisinger has already seen some big data successes, including:

* “Close the Loop” program: Using natural language processing, the UDA analyzes clinical and diagnostic imaging reports, including free text. Sometimes it detects findings that aren’t relevant to the initial issue (such as injuries from a car crash) but that can themselves cause serious harm. The program has already saved patient lives.

* Early sepsis detection/treatment: Geisinger uses the UDA to bring all sepsis-patient information into one place as patients travel through the hospital. The system alerts providers to real-time physiologic data in patients with life-threatening septic shock, and it also tracks when antibiotics are prescribed and administered (a simplified sketch of this kind of alert rule follows this list). Ninety percent of providers who use this tool consistently adhere to sepsis treatment protocols, as opposed to 40% of those who don’t.

* Surgery costs/outcomes: The Geisinger UDA tracks and integrates surgical supply-chain data, plus clinical data by surgery type and provider, which offers a comprehensive view of performance by provider and surgery type.  In addition to offering performance insight, this approach has also helped generate insights about supply use patterns which allow the health system to negotiate better vendor deals.
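
For readers curious what rule-based alerting of this kind typically involves, below is a minimal sketch of a sepsis screening check built on the widely used SIRS criteria. It is an illustration only; the article doesn’t describe Geisinger’s actual UDA logic or thresholds, and the data fields here are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class Vitals:
    temperature_c: float    # body temperature in degrees Celsius
    heart_rate: int         # beats per minute
    respiratory_rate: int   # breaths per minute
    wbc_count: float        # white blood cell count, thousands per microliter


def sirs_criteria_met(v: Vitals) -> int:
    """Count how many SIRS (systemic inflammatory response syndrome) criteria are met."""
    criteria = [
        v.temperature_c > 38.0 or v.temperature_c < 36.0,
        v.heart_rate > 90,
        v.respiratory_rate > 20,
        v.wbc_count > 12.0 or v.wbc_count < 4.0,
    ]
    return sum(criteria)


def sepsis_alert(v: Vitals, suspected_infection: bool) -> bool:
    """Fire an alert when two or more SIRS criteria are met in a patient with a
    suspected infection -- a common, deliberately simplified screening rule."""
    return suspected_infection and sirs_criteria_met(v) >= 2


# Example: a febrile, tachycardic, tachypneic patient with an elevated white count
# and a suspected infection would trigger the alert.
print(sepsis_alert(Vitals(38.9, 112, 24, 15.2), suspected_infection=True))  # True
```

In a production system the inputs would stream from the EMR in real time and the thresholds would be tuned and risk-stratified, but the basic pattern of evaluating physiologic data against criteria and alerting the care team is the same.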

To me, one of the most interesting things about this story is that while Geisinger is at a relatively early stage of its big data efforts, it has already managed to generate meaningful benefits from its efforts. My guess is that its early successes are more due to smart planning – which includes worthwhile goals from day one of the rollout — than the technology per se. Regardless, let’s hope other hospital big data projects fare so well. (Meanwhile, for a look at another interesting hospital big data project, check out this story.)

Access To Electronic Health Data Saves Money In Emergency Department

Posted on October 24, 2016 | Written By Anne Zieger

A new research study has found that emergency department patients benefit from having their electronic health records available when they’re being treated. Researchers found that when health information was available electronically, the patient’s care was sped up, and the electronic availability also generated substantial cost savings.

Researchers with the University of Michigan School of Public Health reviewed the emergency department summaries from 4,451 adult and pediatric ED visits for about one year, examining how different forms of health data accessibility affected patients.

In 80% of the cases, the emergency department had to have all or part of the patient’s medical records faxed to the hospital where the patient was being treated. In the other 20% of cases, however, where the ED staff had access to a patient’s complete electronic health record, patients were seen much more quickly and treatment was often more efficient.

Specifically, the researchers found that when information requests from outside organizations were returned electronically instead of by fax, doctors saw that information an hour faster, which cut a patient’s time in the ED by almost 53 minutes.

This, in turn, seems to have reduced physicians’ use of MRIs, x-rays and CT scans by 1.6% to 2.5%, as well as lowering the likelihood of hospital admission by 2.4%. The researchers also found that average costs for care were $1,187 lower when information was delivered electronically.

An interesting side note to the study is that when information was made available electronically on patients, it was supplied through Epic’s Care Everywhere platform, which is reportedly used in about 20% of healthcare systems nationwide. Apparently, the University of Michigan Health System (which hosted the study) doesn’t belong to an HIE.

While I’m not saying that there’s anything untoward about this, I wasn’t surprised to find principal author Jordan Everson, a doctoral candidate in health services at the school, is a former Epic employee. He would know better than most how Epic’s health data sharing technology works.

From direct experience, I can state that Care Everywhere isn’t necessarily used or even understood by employees of some major health systems in my geographic location, and perhaps not configured right even when health systems attempt to use it. This continues to frustrate leaders at Epic, who emphasize time and again that this platform exists, and that it is used quite actively by many of its customers.

But the implications of the study go well beyond the information sharing tools U-M Health System uses. The more important takeaway from the study is that this is quantitative evidence that having electronic data immediately available makes clinical and financial sense (at least from the patient perspective). If that premise was ever in question, this study does a lot to support it. Clearly, making it quick and easy for ED doctors to get up to speed makes a concrete difference in patient care.

Hospitals Using Market-Leading EHR Have Higher HIE Use

Posted on July 29, 2016 | Written By Anne Zieger

A new study concludes that hospital engagement with HIEs is tied to the level of dominance their EHR vendor has in their marketplace. The study, which appeared in Health Affairs, used national data from 2012 and 2013 to examine how vendor dominance related to hospitals’ HIE involvement level. The analysis suggests that the more market power a given vendor has, the more it may stifle hospitals’ HIE participation.

As researchers note, federal policymakers have expressed concern that some EHR vendors may be hampering the free flow of data between providers, in part by making cross-vendor HIE implementation difficult. To address this concern, the study looked at hospitals’ behavior in differently-structured EHR marketplaces.

Researchers concluded that hospitals using the EHR which dominated their marketplace engaged in an average of 45% more HIE activities than facilities using non-dominant vendors. On the other hand, in markets where the leading vendor was less dominant, controlling 20% of the market, hospitals using the dominant vendor engaged in 59% more HIE activities than hospitals using a different vendor.

Meanwhile, if the dominant EHR vendor controlled 80% of the market, hospitals using the leading vendor engaged in only 25% more HIE activities than those using a different vendor. In other words, high levels of local market dominance by a single vendor seemed to be associated with relatively low levels of HIE involvement.

According to the study’s authors, the data suggests that to promote cross-vendor HIE use, policymakers may need to take local market competition between EHR vendors into consideration. And though they don’t say this directly, they also seem to imply that both high and low vendor dominance can slow HIE engagement, and that moderate dominance may foster such participation.

While this is interesting stuff, it may be moot. What the study doesn’t address is that the entire HIE model comes with handicaps that go beyond what it takes to integrate disparate EHR systems. Even if two hospital systems in a market are using, say, Cerner systems, how does it benefit them to work on sharing data that will help their rival deliver better care? I’ve heard this question asked by hospital financial types, and while it’s a brutal sentiment, it gets to something important.

Nonetheless, I’d argue that studying the dynamics of how EHR vendors compete is quite worthwhile. When a single vendor dominates a marketplace, it has to have an impact on everyone in that market’s healthcare system, including patients. Understanding just what that impact is makes a great deal of sense.

Data Sharing Largely Isn’t Informing Hospital Clinical Decisions

Posted on July 6, 2016 | Written By Anne Zieger

Some new data released by ONC suggests that while healthcare data is being shared far more frequently between hospitals than in the past, few hospital clinicians use such data regularly as part of providing patient care.

The ONC report, which is based on a supplement to the 2015 edition of an annual survey by the American Hospital Association, concluded that 96% of hospitals had an EHR in place which was federally tested and certified for the Meaningful Use program. That’s an enormous leap from 2009, the year the federal economic stimulus law creating the program was signed, when only 12.2% of hospitals had even a basic EHR in place.

Also, hospitals have improved dramatically in their ability to share data with other facilities outside their system, according to an AHA article from February. While just 22% of hospitals shared data with peer facilities in 2011, that number had shot up to 57% in 2014. Also, the share of hospitals exchanging data with ambulatory care providers outside the system climbed from 37% to 60% during the same period.

On the other hand, hospitals are not meeting federal goals for data use, particularly the use of data not created within their institution. While 82% of hospitals shared lab results, radiology reports, clinical care summaries or medication lists with hospitals or ambulatory care centers outside of their orbit — up from 45% in 2009 — the data isn’t having as much of an impact as it could.

Only 18% of those surveyed by the AHA said that hospital clinicians often used patient information gathered electronically from outside sources. Another 35% reported that clinicians used such information “sometimes,” 20% used it “rarely” and 16% “never” used such data. (The remaining 11% said that they didn’t know how such data was used.)

So what’s holding hospital clinicians back? More than half of AHA respondents (53%) said that the biggest barrier to using interoperable data was integrating that data into physician routines. They noted that since shared information usually wasn’t available to clinicians in their EMRs, they had to step outside their regular workflows to review the data.

Another major barrier, cited by 45% of survey respondents, was difficulty integrating exchanged information into their EHR. According to the AHA survey, only 4 in 10 hospitals had the ability to integrate data into their EHRs without manual data entry.

Other barriers to clinician use of shared data included information not always being available when needed (40%), information not being presented in a useful format (29%) and clinicians not trusting the accuracy of the information (11%). Also, 31% of survey respondents said that many recipients of care summaries felt that the data itself was not useful, up from 26% in 2014.

What’s more, some technical problems in sharing data between EHRs seem to have gotten slightly worse between the 2014 and 2015 surveys. For example, 24% of respondents to the 2014 survey said that matching or identifying patients was a concern in data exchange. That number jumped to 33% in the 2015 results.

By the way, you might want to check out this related chart, which suggests that paper-based data exchange remains wildly popular. Given the challenges that still exist in sharing such data digitally, I guess we shouldn’t be surprised.