
What’s the Glue Holding EHR Migration and Conversion Projects Together? – Optimize Healthcare Integration Series

Posted on August 13, 2015 | Written By

The following is a guest blog post by Stephane Vigot, President of Caristix, a leading provider of healthcare integration products and services. This post is part of the Optimize Healthcare Integration series.
Are you considering migrating from an older EHR to a newer one, or are you already in the middle of that conversion? If so, you are well aware of the complexity of the process. Many factors drive the decision, but the primary reason organizations undertake an EHR conversion is simple: to improve patient care and safety by providing clinicians and caregivers with the right information at the right time.

It’s easy to think that this is all about the technology, but EHR conversion is far more than an IT project. It is a central business issue that needs to be strategically sponsored and backed by upper-level management. In our previous post, we addressed aligning integration goals for business and technology. In a project of this magnitude, that alignment becomes critical. Implementation takes hard work and time, and it is very expensive. Effectively managing scope, budget, and schedule creep, and handling change management in line with the stated business goals, is the key to success. The complex planning required is only one part of the story; the actual execution can be extremely problematic.

Since the primary reason for undertaking EHR conversion is to improve patient care and safety, clinical workflow is top-of-mind and coupled to data exchange and flow through your systems. On the IT side, your analysts define the project requirements and your developers build the interfaces based on those requirements. But the team that plays the most critical role is your quality team. Think of them as your project’s glue.

QA has layers of responsibility. They are the ones who hold the requirements as the project blueprint and make sure that those requirements, driven by the pre-identified business needs, are being met. They also make sure that all defined processes are being followed. Where processes are not followed, QA defines the resulting risks that must be accounted for in the system. QA also owns the final gatekeeping of a project: the testing and validation processes that address its functionality and metrics.

Analysts work to build the interfaces and provide QA with the expected workflows. If those workflows are not correctly defined, QA steps in to clarify them and the expected data exchange, and builds test cases that best represent that evolving knowledge. Identifying workflows is often done blindly, with little or no existing documentation. Once the interface is built, those test cases become the basis for testing. QA also plays an important role in maintenance and in building the library of artifacts that helps guarantee interoperability over time.

Though it is difficult to estimate the actual cost of interfacing, given the variance inherent in such projects, functional and integrated testing is often up to three times more time consuming than development. It’s important to note that this most likely reflects defects in the process: in traditional software development those numbers are reversed, with QA taking about one third of development time. It’s quite common that requirements are not complete by the time the project lands in QA’s lap. New requirements are continually discovered during testing; these are usually treated as bugs but should have been identified before the development phase started. Another major reason for the lengthy timelines is that testing is commonly done entirely by hand. A 25-minute fix may require hours of testing when done manually.
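
To make the automation point concrete, here is a minimal sketch, in Python, of what turning one of those manual checks into a repeatable test might look like. The message content, field positions, and expected code set are invented for illustration and are not drawn from any particular interface specification.

```python
# A repeatable check on an outbound ADT (admit) message, replacing one of the
# manual spot checks described above. The sample message, field positions, and
# expected code set are illustrative assumptions only.
import unittest

SAMPLE_ADT = (
    "MSH|^~\\&|SENDING_APP|HOSP|RECEIVING_APP|HOSP|202001011200||ADT^A01|42|P|2.3\r"
    "PID|1||123456^^^HOSP^MR||DOE^JANE||19800101|F\r"
)

def get_field(message: str, segment: str, index: int) -> str:
    """Naive pipe-split accessor (ignores the MSH-1 numbering quirk)."""
    for seg in message.split("\r"):
        parts = seg.split("|")
        if parts and parts[0] == segment:
            return parts[index] if index < len(parts) else ""
    return ""

class AdmitMessageTests(unittest.TestCase):
    def test_event_type_is_admit(self):
        self.assertEqual(get_field(SAMPLE_ADT, "MSH", 8), "ADT^A01")

    def test_patient_identifier_is_present(self):
        self.assertNotEqual(get_field(SAMPLE_ADT, "PID", 3), "")

    def test_sex_uses_expected_code_set(self):
        self.assertIn(get_field(SAMPLE_ADT, "PID", 8), {"M", "F", "O", "U"})

if __name__ == "__main__":
    unittest.main()
```

Once checks like these exist, they can be re-run in seconds after every fix instead of repeating the entire manual pass.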

In technology projects, risk is always present. QA teams continuously work to confine and evaluate risk against a predefined process and to report the issues they find. The questions being asked, over and over, are: what are the odds that X will be a problem, and how big is the impact if it is? Here the devil is in the details, and QA is constantly dancing with that devil. Risk is not all or nothing; if you tried to eliminate every risk, projects would never be completed. QA adds order and definition to projects, but there are always blind alleys and unanticipated consequences that even the best-defined requirements cannot catch. Dealing with the unknown unknowns is a constant for QA teams. The question becomes how much risk can be tolerated while creating the cleanest and most efficient exchange of data on an ongoing basis.

If QA is your glue, what are you doing to increase the quality of that glue and turn it into super glue? What you can do is provide tools that offset the challenges your QA team faces. At the same time, these tools help contain scope, time, and budget creep and maintain continual alignment with business goals. The right tools should help identify requirements before interface development begins and throughout the process, identify the necessary workflows, and support QA in building test cases. De-identification of PHI should be included so that production data can be used in testing. Tools should automate the testing and validation process and be able to run tests repeatedly. In addition, they should provide easily shared traceability of the entire QA process through a central repository for all assets and documentation, providing continuity for the interoperability goals defined for the entire ecosystem.
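
As one illustration of that tooling, here is a minimal sketch, in Python, of scrubbing direct identifiers from a production HL7 message before it becomes a test case. The field positions and surrogate scheme are assumptions for the example, and real de-identification for HIPAA purposes has to cover far more than this.

```python
# Replace direct identifiers (MRN, name, full date of birth) in a PID segment
# with surrogate values so a production message can safely drive tests.
# Field positions and the surrogate scheme are illustrative assumptions; this
# is NOT a complete HIPAA Safe Harbor de-identification.
import hashlib

def deidentify(message: str) -> str:
    out_segments = []
    for seg in message.split("\r"):
        parts = seg.split("|")
        if parts and parts[0] == "PID":
            original_mrn = parts[3] if len(parts) > 3 else ""
            surrogate = hashlib.sha256(original_mrn.encode()).hexdigest()[:8].upper()
            if len(parts) > 3:
                parts[3] = f"{surrogate}^^^FACILITY^MR"   # pseudonymized MRN
            if len(parts) > 5:
                parts[5] = f"TESTPATIENT^{surrogate}"      # scrubbed name
            if len(parts) > 7:
                parts[7] = parts[7][:4] + "0101"           # keep birth year only
        out_segments.append("|".join(parts))
    return "\r".join(out_segments)

production_msg = (
    "MSH|^~\\&|ADT|HOSP|EHR|HOSP|202001011200||ADT^A01|1|P|2.3\r"
    "PID|1||123456^^^HOSP^MR||DOE^JANE||19800612|F\r"
)
print(deidentify(production_msg))
# The PID name, MRN, and DOB now carry surrogate values; the message structure is intact.
```

Real tooling layers many more rules on top of this, but the principle is the same: tests run against realistic data without exposing PHI.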

What is your organization experiencing in your conversion projects? We’d love to hear your thoughts in the comments.

Caristix, a leading healthcare integration company, is the sponsor of the Optimize Healthcare Integration blog post series.  Check out this free online demo of the Caristix Workgroup product, which helps you test your interfaces and speed up HL7 interface development.

About Stéphane Vigot
Stéphane Vigot, President of Caristix, has over 20 years of experience in product management and business development leadership roles in technology and healthcare IT. Formerly with CareFusion and Cardinal Health, his experience spans from major enterprises to startups. Caristix is one of the few companies in the health IT ecosystem that is uniquely focused on integrating, connecting, and exchanging data between systems. He can be reached at stephane.vigot@caristix.com

The Top 3 Reasons Your Health IT Systems Take So Long To Integrate – Optimize Healthcare Integration Series

Posted on July 1, 2015 | Written By

The following is a guest blog post by Stephane Vigot, President of Caristix, a leading provider of healthcare integration products and services. This post is part of the Optimize Healthcare Integration series.
The push for interoperability is on. What’s at the core of interoperability that truly supports next generation analytics, real patient engagement, true care coordination, and high value population health? Data exchange through interfacing. And that means HL7.

HL7 accounts for roughly 95% of the interfacing in hospital environments, for clinical systems and many other healthcare information systems. Many people make the mistake of thinking HL7 is just simple strings, but it’s a lot more than that. It’s a system of data organization, a dynamic framework that establishes the basis of data exchange through its specifications, syntax, and structure. Yet despite the standard, if you take two identical systems from the same vendor, deployed in two different environments, you’ll find discrepancies in how data is managed 100% of the time.
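
To make that concrete, here is a minimal sketch, in Python, of how two deployments might emit the "same" ADT admit message differently. The messages, facility names, and field values are invented for illustration; the second site's date and sex formats deliberately stray from the standard to show the kind of drift integrators run into.

```python
# Two ADT^A01 (admit) messages describing the same event, as two different
# sites might actually emit them. All sample data is invented.
SITE_A = (
    "MSH|^~\\&|ADT_A|HOSP_A|EHR|HOSP_A|202001011200||ADT^A01|0001|P|2.3\r"
    "PID|1||123456^^^HOSP_A^MR||DOE^JANE||19800101|F\r"
    "PV1|1|I|ICU^01^A\r"
)
SITE_B = (
    "MSH|^~\\&|ADT_B|HOSP_B|EHR|HOSP_B|20200101120000||ADT^A01|0001|P|2.5.1\r"
    "PID|1||00123456^^^HOSP_B^MRN||DOE^JANE^M||1980-01-01|Female\r"
    "PV1|1|I|3WEST^12^B\r"
)

def field(message: str, segment: str, index: int) -> str:
    """Return the pipe-delimited field at `index` of the first matching segment."""
    for seg in message.split("\r"):
        parts = seg.split("|")
        if parts and parts[0] == segment:
            return parts[index] if index < len(parts) else ""
    return ""

# Same standard, same trigger event, different representations:
for label, msg in (("Site A", SITE_A), ("Site B", SITE_B)):
    print(label, "DOB:", field(msg, "PID", 7), "Sex:", field(msg, "PID", 8))
# Site A DOB: 19800101 Sex: F
# Site B DOB: 1980-01-01 Sex: Female
```

Neither site is "wrong" in its own context, but a receiving system has to reconcile both, and that reconciliation is where the time goes.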

What’s the result? It takes too long to get systems live. And that means time, money, resource drain, and headaches for integrators and for maintenance and quality teams. The most critical impact is on essential clinical care; beyond that, it undermines your short- and long-term business goals. That impact will only grow with the increasing demands of interoperability, particularly the push for automation, easy data access, and analytics.

There are three primary challenges that feed into this problem of getting a system live. These are:

Aligning the integration goals for business and technology users – This step alone will differentiate success or failure. Without a clear picture of your goals and environment from day one, you can’t measure the required investment and resources. Project planning becomes a wild guess. How do you get everyone involved on deck with a common understanding of the overall project?  Is it crystal clear how your new system fits into your existing ecosystem in the context of your data flow and structure? Do you know what information you need from whom, when? Is all documentation readily available? Are the business impacts of the interface understood?

Complete and clear data transformation requirements – It’s common to manually compare outdated specs, list the differences, and jump straight into the code. That makes it virtually impossible to quickly come up with a complete list, so requirements are not fully identified until too late in the project, sometimes not until it’s in production. Are all data flows and system workflows identified? Are the system’s data semantics clear? Are the documented system specs accurate? Has customized data been included? Are all the transformations and mappings defined (a minimal mapping sketch follows this list)? Have you automated the processes that define requirements?

Testing/Verification – Your QA team knows this is about a lot more than making sure all the dots are connected. You want to get this right before your go live and avoid handling data related emergencies in production with constant break-fix repairs. Are you doing enough testing before go live so your caregivers can count on applications being completely functional for their critical patient care? Are your test cases based on your requirements? Are you testing against your clinical workflows? Do you include edge cases and performance in your testing? Are you testing with de-identified production data that accurately represents your system’s data flow and needs? Is your testing HIPAA compliant? Are you prepared for ongoing maintenance and updating with reusable test cases backed by reliable and repeatable quality measures? Is your testing automated?
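
As an illustration of the kind of transformation requirement that tends to surface late, here is a minimal sketch, in Python, of mapping one system's local code set to another's and surfacing the gaps explicitly. The code values and mapping table are invented for the example.

```python
# Map a sending system's local patient-class codes to the values the receiving
# system expects, and fail loudly on anything unmapped so the gap becomes a
# documented requirement rather than a production surprise. The code sets
# below are invented for illustration.
LOCAL_TO_TARGET_PATIENT_CLASS = {
    "IN": "I",    # inpatient
    "OUT": "O",   # outpatient
    "ER": "E",    # emergency
}

def map_patient_class(local_code: str) -> str:
    try:
        return LOCAL_TO_TARGET_PATIENT_CLASS[local_code]
    except KeyError:
        raise ValueError(f"No mapping defined for patient class {local_code!r}")

for code in ("ER", "OBS"):
    try:
        print(code, "->", map_patient_class(code))
    except ValueError as gap:
        print("Requirement gap found before go-live:", gap)
```

The point is less the mapping itself than forcing each gap to surface before go-live, when it is still a requirements conversation rather than a production incident.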

What’s the most efficient solution to these three challenges?  Productivity software that supports your integration and workflow process from start to finish. With the right solution, you understand the big picture before you start, with complete requirements built upon your specifications that set you up for robust system testing and maintenance. The right solution will cut your project timelines in half, reduce your resource drain and costs, and deliver predictable results while streamlining your teams’ repetitive tasks. In addition, gap analysis, automatic specification management, HL7 message comparison and editing, debugging tools, PHI de-identification, documentation building, and collaborative team repositories should be included. As seen in the charts below, savings of up to 52% can be realized through optimization with productivity software.
[Chart: Healthcare Integration Project Time]
Do these healthcare integration challenges resonate with you? What is your organization experiencing? We’d love to hear your thoughts in the comments.

Caristix, a leading healthcare integration company, is the sponsor of the Optimize Healthcare Integration blog post series.  If you’d like to learn more about how you can simplify your healthcare integration process, download this Free Whitepaper.

About Stéphane Vigot
Stéphane Vigot, President of Caristix, has over 20 years of experience in product management and business development leadership roles in technology and healthcare IT. Formerly with CareFusion and Cardinal Health, his experience spans from major enterprises to startups. Caristix is one of the few companies in the health IT ecosystem that is uniquely focused on integrating, connecting, and exchanging data between systems. He can be reached at stephane.vigot@caristix.com

Interoperability Becoming Important To Consumers

Posted on June 26, 2015 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

The other day, I was talking with my mother about her recent primary care visit — and she was pretty po’d. “I can’t understand why my cardiologist didn’t just send the information to my family doctor,” she said. “Can’t they do that online these days? Why isn’t my doctor part of it?”

Now, to understand why this matters you need to know that my mother, who’s extremely bright, is nonetheless such a technophobe that she literally won’t touch my father’s desktop PC. She’s never opened a browser and has sent perhaps two or three e-mails in her life. She doesn’t even know how to use the text function on her basic “dumb” phone.

But she understands what interoperability is — even if the term would be foreign — and has little patience for care providers that don’t have it in place.

If this was just about my 74-year-old mom, who’s never really cared for technology generally, it would just be a blip. But research suggests that she’s far from alone.

In fact, a study recently released by the Society for Participatory Medicine and conducted by ORC International suggests that most U.S. residents are in my mother’s camp. Nearly 75% of Americans surveyed by SPM said that it was very important that critical health information be shared between hospitals, doctors and other providers.

What’s more, respondents expect these transfers to be free. Eighty-seven percent were dead-set against any fees being charged to either providers or patients for health data transfers. That flies in the face of current business practices, in which doctors may pay between $5,000 and $50,000 to connect with laboratories, HIEs or government agencies, sometimes also paying fees each time they send or receive data.

There are many things to think about here, but a couple stand out in my mind.

For one thing, providers should definitely be on notice that consumers have lost patience with cumbersome paper record transfers in the digital era. If my mom is demanding frictionless data sharing, then I can only imagine what Millennials are thinking. Doctors and hospitals may actually gain a marketing advantage by advertising how connected they are!

One other important issue to consider is that interoperability, arguably a fevered dream for many providers today, may eventually become the standard of care. You don’t want to be the hospital that stands out as having set patients adrift without adequate data sharing, and I’d argue that the day is coming sooner rather than later when that will mean electronic data sharing.

Admittedly, some consumers may remain exercised only as long as health data sharing is being discussed on Good Morning America. But others have gotten it in their heads that they deserve to have their doctors on the same page, with no hassles, and I can’t say I blame them. As we all know, it’s about time.

Working to Understand FHIR

Posted on April 9, 2015 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Ever since I’d heard so many good things about FHIR, I’ve been slowly trying to learn more about it, how it will be implemented, what challenges it faces, and what’s the pathway for FHIR to have widespread adoption.

So, it was no surprise that the Corepoint Health sessions on FHIR caught my eye and will be part of my HIMSS 2015. As part of that education they sent me their FHIR whitepaper which they’ll be handing out at their booth along with their sessions on FHIR. As with most things, the more I learn about FHIR, the more I realize I need to learn.

One example of this comes from the FHIR whitepaper linked above. It talks about defining resources for FHIR:

Resources are small, logically discrete units of exchange. Resources define behavior and meaning, have a known identity and location, are the smallest possible unit of transaction, and provide meaningful data that is of interest to healthcare. The plan is to limit resources to 100 to 150 in total. They are sometimes compared to an HL7 V2 segment.

The resources can be extended and adapted to provide a more manageable solution to the healthcare demand for optionality and customization.
Source: Corepoint Health

This section reminded me of a comment Greg Meyer tweeted during an #HITsm chat about FHIR’s biggest challenge being to define profiles. When he said that, I made a note to myself to learn more about what makes up profiles. What Greg called profiles appears closely related to what Corepoint Health is calling resources; in FHIR, a profile constrains or extends a base resource for a particular use case. This chart from the whitepaper does a great job summarizing why creating these resources (and the profiles built on them) is so challenging:

[Chart: FHIR Resource Examples]
Source: Corepoint Health
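
For readers who, like me, are still getting oriented, here is a minimal sketch of what a single resource instance can look like, written as a Python dictionary mirroring its JSON form. The identifier system and extension URL are invented for illustration and aren't from the whitepaper.

```python
# A minimal FHIR Patient resource expressed as the Python equivalent of its
# JSON form. The identifier system and extension URL are invented examples;
# extensions are one way a base resource gets adapted to local needs.
import json

patient = {
    "resourceType": "Patient",
    "id": "example-001",
    "identifier": [
        {"system": "http://hospital.example.org/mrn", "value": "123456"}
    ],
    "name": [{"family": "Doe", "given": ["Jane"]}],
    "gender": "female",
    "birthDate": "1980-01-01",
    "extension": [
        {
            "url": "http://hospital.example.org/fhir/StructureDefinition/preferred-pharmacy",
            "valueString": "Main Street Pharmacy",
        }
    ],
}

print(json.dumps(patient, indent=2))
```

Extensions like the one above are how a relatively small set of base resources can still accommodate local customization.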

I still have a lot more to learn about FHIR, but it seems to have really good founding principles. We’ll see if the powers that be can keep it pure or whether they corrupt and modify its core principles, not to mention make it so complex that it’s no longer usable. I’ll be learning more about FHIR at HIMSS and I’ll be sure to report back. Until then, this FHIR whitepaper provides a pretty good historical overview of FHIR versus the other healthcare IT standards.

New Federal Health IT Strategic Plan for 2015-2020

Posted on December 8, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

The big news came out today that HHS has released its Health IT Strategic Plan for 2015-2020. You can find more details about the plan and read the 28-page Federal Health IT Strategic Plan online. Unlike many of the regulations, this strategic plan is very readable and gives a pretty good idea of where ONC wants to take healthcare IT (hint: interoperability). The document is open for comment, too, so your feedback could help improve the proposed plan.

I think this image from the document really does a nice job summarizing the plan’s goals:
[Image: Federal Health IT Strategic Plan Summary]

When I see a plan like this, the goals are noble and appropriate. No doubt we could argue about some of the details, but I think this is directionally good. What I’m not so sure about is how this plan will really help healthcare reach the specified goals. I need to dive into the specific strategies offered in the document to know if they really have the ability to reach these goals. I might have to take each goal and strategy and make a series out of it.

What do you think of this new health IT strategic plan?

John Glaser to Stay on as Senior VP of Cerner Upon Close of Acquisition

Posted on November 19, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

In case you’re living under a rock (or, more affectionately, you’re too busy working to follow the inside baseball of EHR company acquisitions), Cerner is set to acquire Siemens Health Services in late winter or early spring, pending all the approvals needed for companies this size. Watching the merging of these two companies is going to be very interesting indeed.

Neil Versel just reported that John Glaser, current CEO of Siemens Health Services, has announced that upon close of acquisition he’ll be joining the Cerner team as a Senior VP. I also love that John Glaser made this announcement on the Cerner blog.

I think this is a big deal since I believe John Glaser is at the point in his career that he could do just about anything (or nothing) if that’s what he desired. The few times I’ve interacted with John Glaser, he was sincerely interested in moving healthcare forward through the use of advanced IT. I imagine that’s what’s motivating him to stay with Cerner. No doubt, Cerner is sitting on a huge opportunity.

In John Glaser’s blog post, he provided an interesting insight into Neal Patterson’s comments at the Cerner user conference:

In his CHC keynote address, Cerner CEO Neal Patterson did a masterful job of conveying Cerner’s commitment to patient-centered care. Before he spoke, a patient and her nurse were introduced with explanation that the woman’s life was saved by a Cerner sepsis alerting system. Neal then shared the incredible challenges he and his wife have faced in her battle with cancer because of limited interoperability.

Neal’s keynote was very personal – about how we can make a loved one’s care journey easier by ensuring that all records – every detail – are available electronically and accurately wherever the patient receives care. It was the case for interoperability but also the case for making a patient’s life easier and the care better.

It’s hard for me to say how much of this was theatrics, but I’m glad they are at least talking the right talk. I really do hope that Neal’s personal experience will drive interoperability forward. Neil Versel suggested that interoperability would be John Glaser’s focus at Cerner. I hope he’s successful.

While at CHIME, I talked with Judy Faulkner, CEO of Epic, and we talked briefly about interoperability. At one point in our conversation I asked Judy, “Do you know the opportunity that you have available to you?” She looked at me with a bit of a blank stare (admittedly we were both getting our lunch). I then said, “You are big enough and have enough clout that you (Epic) could set the standard for interoperability and the masses would follow.” I’m not sure she’s processed this opportunity, but it’s a huge one that they have yet to capitalize on for the benefit of healthcare as we know it.

The same opportunity is available for Cerner as well. I really hope that both companies embrace open data, open APIs, and interoperability in a big way. Both have stated their interest in these areas, but I’d like to see a little less talk…a lot more action. They’re both well positioned to be able to make interoperability a reality. They just need to understand what that really means and go to work on it.

I’m hopeful that both companies are making progress on this. Having John Glaser focused on it should help that as well. The key will be that both companies have to realize that interoperability is what’s best for healthcare in general and in the end that will be what’s best for their customers as well.

Do Hospitals Want Interoperability?

Posted on November 17, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

This question has come up over and over again today in a series of discussions I’ve had at NYeC’s Digital Health Conference in NYC. Many people blame the EHR vendors for not being interoperable. Others blame standards. Some like to blame HIPAA (which is ironic, since it was passed to make health data portable). People give many more reasons why healthcare isn’t exchanging data and interoperability isn’t a reality.

In all of these discussions, though, I keep going back to the core question of whether hospitals and healthcare organizations really want that healthcare data to be interoperable. Looking back, I can think of some doctors who’ve wanted it for a while, but I think the healthcare industry as a whole didn’t really want interoperability to happen. They would never admit this in public, because we all know on its face that interoperability benefits the healthcare system and the patient. However, interoperability would have been a bad thing financially for many healthcare organizations.

It’s one of the dirty little secrets of healthcare. Sure, the EHR vendors never provided the interoperability functionality, but that’s largely because healthcare providers never asked for it and, for the most part, didn’t want it. They were all a little complicit in hiding the dirty little secret that healthcare organizations were benefiting from the inefficiency of the system.

I’m extremely hopeful that we’re starting to see a shift away from the above approach. I think the wheels are turning where hospitals are starting to see why their organization is going to need to be interoperable or their reimbursement will be affected. ACOs are leading this charge as the hospitals are going to need the data from other providers in order to improve the care they provide and lower costs.

Now, I think the biggest barrier to interoperability for most hospitals is figuring out the right way to approach it. Will their EHR vendor handle it? Do they need to create their own solution? Are CCDs enough? Should they use Direct? Should they use a local HIE? Should they build a private HIE? And that doesn’t even touch on the complexities of the hospital system and outside providers. Plus, there’s no one catch-all answer.

I hope that we’re entering a new era of healthcare interoperability. I certainly think we’re heading in that direction. What are you seeing in your organizations?

More Epic Interoperability Discussion

Posted on October 7, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Looks like Epic is starting to open up and join the conversation about healthcare interoperability. The latest is an article in the New York Times which includes a few comments from Judy Faulkner, CEO of Epic. Here are the main comments from Judy:

In 2005, when it became clear to her [Judy] that the government was not prepared to create a set of rules around interoperability, Ms. Faulkner said, her team began writing the code for Care Everywhere. Initially seen as a health information exchange for its own customers, Care Everywhere today connects hospitals all over the country as well as to various public health agencies and registries.

“Let’s say a patient is coming from U.C.L.A. and going to the University of Chicago, an Epic-to-Epic hospital. Boom. That’s easy,” Ms. Faulkner said. “These are hospitals that have agreed to the Rules of the Road, a legal contract, that says the other organization is going to take good care of the data.”

This is a really interesting approach. Blame the government for not applying a standard. Talk about how you’ve had to do it yourself and that’s why you built Care Everywhere. I wish that Judy would come out with the heart of the matter. Epic’s customers never asked for it and so they never did it. I believe that’s the simple reality. Remember that interoperability might be a big negative for many healthcare systems. If they’re interoperable, that could be a hit to revenue. Hopefully ACOs and other value based reimbursement will change this.

The key to coming clean like this, though, is to follow it with a deep set of initiatives showing that, while it wasn’t something you worked on in the past, you’re going all in on interoperability now. We’re a very forgiving people, and if Epic (or any other large EHR vendor, for that matter) came out with a plan to be interoperable, many would jump on board and forgive them for past transgressions (wherever the blame may lie).

Unfortunately, we don’t yet see this. I’d love to catch up with Judy Faulkner at CHIME and talk to her about it. The key will be a full-spectrum interoperability plan, not just a Care Everywhere that doesn’t work everywhere. Remember that Epic has charts for about 50% of the US patient population, but that’s still only 50%. Plus, of the patients they do have, only a very small percentage are stored in the same Epic system. My guess would be that 99+% of patients who have a record in Epic have medical records in other places as well. This means that Epic will need data from non-Epic systems.

As I’ve said before, Epic wouldn’t need to wait for the government to do this. They are more than large enough to set the standard for the industry. In fact, doing so puts them in a real position of power. Plus, it’s the right thing to do for the US healthcare system.

Will the interoperability be perfect? No. It will take years and years to get everything right, but that’s okay. Progress will be better than what we have now. I love this quote from the NY Times article linked above:

“We’ve spent half a million dollars on an electronic health record system about three years ago, and I’m faxing all day long. I can’t send anything electronically over it,” said Dr. William L. Rich III, a member of a nine-person ophthalmology practice in Northern Virginia and medical director of health policy for the American Academy of Ophthalmology.

I hope that Epic continues down the path to interoperability and becomes even more aggressive. I think the climate’s right for them to make it happen. They’re in a really unique position to be able to really change the way we think and talk about interoperability. I’m interested to see if they seize the opportunity or just talk about it.

Of course, we’ve focused this article talking about Epic. That’s what happens when you’re the A list celebrity on the red carpet. People want to talk about you. The NY Times article pretty aptly points out that the other EHR vendors aren’t much more or less interoperable than Epic. Feel free to replace Epic with another large EHR vendor’s name and the story will likely read the same.

My hope is that EHR vendors won’t wait for customers to demand interoperability, but will instead make interoperability so easy that their customers will love taking part. Watch for a future series of posts on Healthcare Interoperability and why this is much easier said than done.

EMR Change Cuts Cardiac Telemetry Use Substantially

Posted on September 25, 2014 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Changing styles of medical practice can be really tough, even if a major trade organization sticks its oar in to encourage new behavior from docs.

Such is the situation with cardiac telemetry, which is listed by the American Board of Internal Medicine Foundation as either unnecessary or overused in most cases. But a recent piece of research demonstrated that configuring an EMR to help doctors comply with the guideline can help hospitals lower needless cardiac monitoring substantially.

Often, it takes a very long time to get doctors to embrace new guidelines like these, despite pressure from payers, employers and even peers. (Physicians may turn on a dime and try out a new drug when the right pharmaceutical rep shows up, but that’s another story.) Doctors say they stick to their habits because of patient, institutional or personal preferences, as well as fear of lawsuits.

But according to a recent study appearing in JAMA Internal Medicine, reprogramming its Centricity EMR did the trick for Wilmington, Del.-based Christiana Care Health System.

To curb unnecessary cardiac telemetry, Christiana Care removed the standard option for doctors to order cardiac monitoring outside of AHA guidelines, and required them to take an extra step to order this type of monitoring.

Meanwhile, when the cardiac monitoring order did fall within AHA guidelines, Christiana Care added an AHA-recommended time frame for the monitoring. After that time passed, the EMR notified nurses to stop the monitoring or ask physicians if they believed it would be unsafe to stop.
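
To picture the workflow, here is a minimal sketch, in Python, of the order-and-expiry logic described above. The indication categories, durations, and prompt wording are assumptions for illustration, not Christiana Care's actual Centricity configuration.

```python
# Sketch of the ordering and expiry logic described above: in-guideline
# telemetry orders get a recommended duration; once it lapses, nursing is
# prompted to stop monitoring or confirm with the physician that stopping
# would be unsafe. Categories, durations, and wording are illustrative
# assumptions, not the actual Centricity build.
from datetime import datetime, timedelta

GUIDELINE_DURATION_HOURS = {
    "post_mi": 48,                 # example values only
    "arrhythmia_monitoring": 24,
}

def order_telemetry(indication: str, ordered_at: datetime) -> dict:
    if indication not in GUIDELINE_DURATION_HOURS:
        # Outside guidelines: no default pathway; the physician must take an
        # extra, explicit step to place the order.
        return {"status": "requires_explicit_override", "expires_at": None}
    expires = ordered_at + timedelta(hours=GUIDELINE_DURATION_HOURS[indication])
    return {"status": "active", "expires_at": expires}

def nursing_prompt(order: dict, now: datetime) -> str:
    if order["status"] == "active" and now >= order["expires_at"]:
        return ("Recommended duration reached: discontinue monitoring, or ask "
                "the physician to confirm it would be unsafe to stop.")
    return "No action needed."

placed = order_telemetry("post_mi", datetime(2014, 9, 1, 8, 0))
print(nursing_prompt(placed, datetime(2014, 9, 3, 9, 0)))
```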

The results were striking. After the EMR changes were implemented, the health system’s average daily number of non-intensive-care-unit patients on cardiac monitoring fell by 70%. What’s more, Christiana Care’s average daily cost of administering non-ICU cardiac monitoring also fell by 70%, from $18,971 to $5,772.

Christiana Care’s health IT presence is already well ahead of many hospitals — it’s reached Stage 6 of the HIMSS EMRAM scale — so it’s not surprising to see it leading the way in shaping physician behavior.

The question now is how the system builds on what it’s learned. Having survived a politically-sensitive transition without creating a revolution in its ranks, I’d argue the time is now to jump in and work on compliance with other clinical guidelines. With pressure mounting to deliver efficient care, it’d be smart to keep the ball rolling.

Epic Wants to Be Known for Interoperability – Are They Interoperable?

Posted on September 19, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Epic has been fighting the stigma of being a closed system for a while now. It seems that Epic isn’t happy about this characterization and they’re coming out guns blazing to try and show how Epic is interoperable. They’re so interested in changing this perception that Epic recently hired a lobbyist to change how they’re viewed by the people in DC.

A recent tweet highlighted a slide from the Epic user conference (Epic UGM) that shows how many Epic patient records they’re exchanging per month. Here’s the tweet and graph below:

Farzad Mostashari asks a very good question, “Does that graph help?” I find Farzad’s tweet also interesting because just over a year ago Farzad tweeted another Epic interoperability chart when he was still National Coordinator at ONC. I’ll embed the previous chart below so you can easily compare the two graphs side by side:
[Chart: Epic Data Sharing]

I think Farzad is right to be skeptical about Epic’s claims to interoperability. First, it seems Epic is finally making some progress with Epic to Epic interoperability, but Epic to Non-Epic systems is still far behind. Second, Epic loves to claim how they have charts for some huge percentage of the US population (currently about 314 million people). I bet if we looked at the percentage of total Epic charts that have been exchanged, it would be an extremely small number. I also wonder if the charts above count a full patient chart or something simple like a lab result or prescription.

I don’t want to harp on this too much, because this is a step forward for Epic. Even if they’re not as interoperable as they could be and as we’d like them to be, I’m excited that they’re now at least open to the idea of interoperability.

With that said, I wish that Epic would spend more time and effort on actually being interoperable and not just trying to say that they’re interoperable. This includes committing the resources required to support connections outside of Epic. I’ve heard over and over from health IT vendor after health IT vendor about how hard it is to get Epic to work with them in any form or fashion. There’s a way that Epic could scale their effort to hundreds of other health IT vendors, but they haven’t made the commitment to do so.

Think about the opportunity that Epic has available to them. They have enough scale, reach and clout that they could by force of size establish a standard for interoperability. Many health IT vendors would bend over backwards to meet whatever standard Epic chose. That’s a powerful position to be in if they would just embrace it. I imagine the reason they haven’t done so yet is because the market’s never demanded it. Sometimes companies like Epic need to embrace something even if it doesn’t drive short term sales. I think this is one of those choices Epic should make.

I’m sure that lobbyists can be an effective solution to change perceptions in Washington. However, a far more effective strategy would be to actually fully embrace interoperability at every level. If they did so, you can be sure that every news outlet would be more than excited to write about the change.