
Surescripts Deal Connects EMR Vendors And PBMs To Improve Price Transparency

Posted on November 22, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

I’m no expert on the pharmacy business, but from where I sit as a consumer it’s always looked to me as though pharmaceutical pricing is something of a shell game. It makes predicting what your airline ticket will cost seem like child’s play.

Yes, in theory, the airlines engage in demand-oriented pricing, while pharma pricing is based on negotiated prices spread among multiple contracted parties, but in either case end-users such as myself have very little visibility into where these numbers are coming from. And in my opinion, at least, that’s not good for anyone involved. You can say “blah blah blah skin in the game” all you want, but co-pays are a poor proxy for the information patients need to weigh the benefits they’ll accrue and the problems they face when buying a drug.

Apparently, Surescripts hopes to change the rules to some degree. It just announced that it has come together with two other interest groups within the pharmacy supply chain to offer patient-specific benefit and price information to providers at the point of care.

Its partners in the venture include a group of EMR vendors (Cerner, Epic, Practice Fusion and Aprima Medical Software) which it says represent 53% of the U.S. physician base. It’s also working with two pharmacy benefit managers, CVS Health and Express Scripts, which together cover almost two-thirds of US patients.

The new Surescripts effort actually has two parts, a Real-Time Prescription Benefit tool and an expanded version of its Prior Authorization solution.  Used together, and integrated with an EHR, these tools will clarify whether the patient’s health insurance will cover the drug suggested by the provider and offer therapeutic alternatives that might come at a lower price.
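
Surescripts hasn’t published the technical interface in this announcement, so here is a purely illustrative sketch, in Python, of the kind of request and response shapes a real-time benefit check implies. Every field name below is a hypothetical, not the actual Surescripts spec:

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical shapes only -- the announcement doesn't describe the API,
# so these fields are illustrative, not the real Surescripts spec.

@dataclass
class BenefitCheckRequest:
    patient_id: str          # patient identifier from the EHR
    plan_member_id: str      # PBM member ID, resolved via eligibility check
    ndc_code: str            # National Drug Code of the prescribed drug
    pharmacy_id: str         # dispensing pharmacy (prices vary by pharmacy)

@dataclass
class TherapeuticAlternative:
    ndc_code: str
    drug_name: str
    patient_pay: float       # estimated out-of-pocket cost

@dataclass
class BenefitCheckResponse:
    covered: bool                    # is the drug on the plan's formulary?
    prior_auth_required: bool        # would trigger the Prior Authorization flow
    patient_pay: Optional[float]     # patient-specific price, if covered
    alternatives: List[TherapeuticAlternative]  # cheaper covered options
```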

If you ask me, this is clever but fails to put pressure on the right parties. You don’t have to be a pharmaceutical industry expert to know that middlemen like PBMs and pharmacies use a number of less-than-visible stratagems to jack up drug prices. Patients are forced to just cope with whatever deal these parties strike among themselves.

If you really want to build a network which helps consumers keep prices down, go for some real disclosure. Create a network which gathers and shares price information every time the drug changes hands, up to and including when the patient pays for that drug. This could have a massive effect on drug pricing overall.
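
Here’s a minimal sketch of what one record in such a disclosure network might look like. Since no such network exists, the record shape and the markup calculation below are entirely hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

# Hypothetical record for the disclosure network proposed above: one entry
# per change of hands, from manufacturer to wholesaler to pharmacy to patient.

@dataclass
class PriceEvent:
    ndc_code: str        # which drug
    seller: str          # e.g. "manufacturer", "wholesaler", "PBM", "pharmacy"
    buyer: str           # next party in the chain, or "patient" at the end
    unit_price: float    # actual negotiated price for this transaction
    rebate: float        # any rebate or discount applied, usually invisible today
    timestamp: datetime

def markup_chain(events: List[PriceEvent]) -> List[float]:
    """Percent markup at each hop -- the number consumers never see."""
    ordered = sorted(events, key=lambda e: e.timestamp)
    return [
        100.0 * (b.unit_price - a.unit_price) / a.unit_price
        for a, b in zip(ordered, ordered[1:])
    ]
```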

Hey, look at what Amazon did just by making shipping costs low and relatively transparent to end-users. They sucked a lot of the transaction costs out of the process of shipping products, then gave consumers tools allowing them to watch that benefit in action.

Give consumers even one-tenth of that visibility into their pharmacy supply chain, and prices would fall like a hot rock. Gee, I wonder why nobody’s ever tried that. Could it be that pharmaceutical manufacturers don’t want us to know the real costs of making and shipping their product?

CHIME Suspends the $1 Million National Patient ID Challenge

Posted on November 17, 2017 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network, which currently consists of 10 blogs containing over 8000 articles, with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

CHIME just announced that they’ve suspended their National Patient ID Challenge. For those not familiar with it, almost 2 years ago CHIME announced a $1 million prize for companies to solve the patient identification and matching problem in healthcare. Here’s the description from the HeroX website that hosted the challenge:

The CHIME National Patient ID Challenge is a global competition aimed at incentivizing new, early-stage, and experienced innovators to accelerate the creation and adoption of a solution for ensuring 100 percent accuracy in identifying patients in the U.S. Patients want the right treatment and providers want information about the right patient to provide the right treatment. Patients also want to protect their privacy and feel secure that their identity is safe.

And here’s the “Challenge Breakthrough” criteria:

CHIME Healthcare Innovation Trust is looking for the best plan, strategies and methodologies that will accomplish the following:

  • Easily and quickly identify patients
  • Achieve 100% accuracy in patient identification
  • Protect patient privacy
  • Protect patient identity
  • Achieve adoption by the vast majority of patients, providers, insurers, and other stakeholders
  • Scale to handle all patients in the U.S.

When you look at the fine print, it says CHIME (or the Healthcare Innovation Trust it started to host the challenge) could cancel the challenge at any time without warning or explanation, including removing the Prize completely:

5. Changes and Cancellation. Healthcare Innovation Trust reserves the right to make updates and/or make any changes to, or to modify the scope of the Challenge Guidelines and Challenge schedule at any time during the Challenge. Innovators are responsible for regularly reviewing the Challenge site to ensure they are meeting all rules and requirements of and schedule for the Challenge. Healthcare Innovation Trust has the right to cancel the Challenge at any time, without warning or explanation, and to subsequently remove the Prize completely.

It seems that CHIME is legally allowed to suspend the challenge. However, that doesn’t mean the decision won’t burn the trust of the community that saw them put out the $1 million challenge. The challenge created a lot of fanfare, including promotion by ONC on their website, which is a pretty amazing thing to even consider. CHIME invested a lot in this challenge, so it must hurt for them to suspend it.

To be fair, when the challenge was announced I hosted a discussion where I asked the question “Is this even solvable?” Does demanding 100% accuracy mean that no one could ever win the challenge? With that in mind, the challenge always felt a bit like fool’s gold to me, and I’m sure to many others. I thought, “CHIME could always come back and make the case that no one could ever reach 100%, and so they’d never have to pay the money.” Those who participated had to feel this as well, and they participated anyway.

The shameful part to me is how suspending the competition is leaving those who did participate high and dry. I asked CHIME about this and they said that the Healthcare Innovation Trust is still in touch with the finalists and that they’re encouraging them to participate in the newly created “Patient Identification Task Force.” Plus, the participants received an honorarium.

Participation in a CHIME Task Force and the honorarium seems like a pretty weak consolation prize. In fact, I can’t imagine any of the vendors that participated in the challenge would trust working with CHIME going forward. Maybe some of them will swallow hard and join the task force, but that would be a hard choice after getting burnt like this. It’s possible CHIME is offering them some other things in the background as well.

What’s surprising to me is that CHIME didn’t reach out to the challenge participants and say that none of them were going to win, but that it still wanted to promote their efforts and offerings to provide some solid benefit to those who participated. CHIME could present the lessons learned from the challenge, share the solutions that were submitted, and detail where they fell short and where they succeeded. At least this type of promotion and exposure would be a nice consolation prize for those who spent a lot of time and money participating in the challenge. Plus, the CIOs could still benefit from something that solved 95% of their problems.

Maybe the new Patient Identification Task Force will do this and I hope they do. CHIME did it for their new Opioid Task Force at the Fall Forum when they featured it on the main stage. How about doing the same for the Patient Identification Challenge participants? I think using the chance to share the lessons learned would be a huge win for CHIME and its members. I imagine it’s hard for CHIME to admit “failure” for something they worked on and promoted so much. However, admitting the failure and sharing what was learned from it would be valuable for everyone involved.

While I expect CHIME has burnt at least some of the challenge participants, the CHIME CIO members probably knew the challenge was unlikely to succeed and won’t be burnt by this decision. Plus, the challenge did help to call national attention to the issue which is a good thing and as they noted will help continue to push forward the national patient identifier efforts in Washington. Maybe now CHIME will do as Andy Aroditis, Founder and CEO of NextGate, suggested in this article where Shaun Sutner first reported on issues with the CHIME National Patient ID Challenge:

Aroditis complained that rather than plunging into a contest, CHIME should have convened existing patient matching vendors, like his company, to collaborate on a project to advance the technology.

“Instead they try to do these gimmicks,” Aroditis said.

I imagine that’s what CHIME would say the Patient Identification Task Force they created will now do. The question is whether CHIME burnt bridges they’ll need to cross to make that task force effective.

The reality is that Patient Identification and Patient Matching is a real problem that’s experienced by every healthcare organization. It’s one that CHIME members feel in their organizations and many of them need better solutions. As Beth Just from Just Associates noted in my discussion when the challenge was announced, $1 million is a drop in the bucket compared to what’s already been invested to solve the problem.

Plus, many healthcare organizations are in denial when it comes to this problem. They may say they have an accuracy of 98%, but the reality is very different once a vendor goes in and wakes them up to what’s really happening in their organization. This is not an easy problem to solve and CHIME now understands this more fully. I hope their new task force is successful in addressing the problem since it is an important priority.

Waiting For The Perfect “Standard” Is Not The Answer To Healthcare’s Interoperability Problem

Posted on October 16, 2017 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

Have you bought into the “standards will solve healthcare’s interoperability woes” train of thought? Everyone understands that standards are necessary to enable disparate systems to communicate with each other, but as new applications and new uses for data continually appear, healthcare organizations that are waiting for universal standards are not maximizing the value of their data. More importantly, they will be waiting a long time to realize its full potential.

Healthcare interoperability is not just a matter of transferring data as an entire file from one user to another. Instead, effective exchange of information allows each user to select which elements of a patient’s chart are needed, and then access them in a format that enables analysis of different data sets to provide a holistic picture of the patient’s medical history or clinical trends in a population of patients. Healthcare’s interoperability challenge is further exacerbated by different contextual interpretations of the words within those fields. For instance, how many different ways are there to say heart attack?
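
As a toy illustration of the “heart attack” problem, consider normalizing free-text diagnosis terms to a standard vocabulary. This is a deliberately simplified sketch; real systems use full terminology services rather than a hand-built dictionary, and the SNOMED CT codes shown, while commonly cited, should be verified against a terminology server:

```python
from typing import Optional

# Toy mapping of free-text variants to one standard concept. Production
# systems use terminology services (SNOMED CT, ICD-10, LOINC), not a dict.

SYNONYMS_TO_SNOMED = {
    "heart attack": "22298006",            # SNOMED CT: myocardial infarction
    "myocardial infarction": "22298006",
    "mi": "22298006",
    "acute mi": "57054005",                # acute myocardial infarction
    "stemi": "401303003",                  # acute ST segment elevation MI
}

def normalize_diagnosis(free_text: str) -> Optional[str]:
    """Return a standard code, or None if the term needs human review."""
    return SYNONYMS_TO_SNOMED.get(free_text.strip().lower())

print(normalize_diagnosis("Heart Attack"))  # -> "22298006"
```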

The development of Health Level Seven (HL7®) FHIR®, which stands for Fast Healthcare Interoperability Resources, represents a significant step toward interoperability. While the data exchange draft being developed and published by HL7 eliminates many of the complexities of earlier HL7 versions and facilitates real-time data exchange via web technology, publication of release 4, the first normative version of the standard, is not anticipated until October 2018.

As these standards are further developed, the key to universal adoption will be simplicity, according to John Lynn, founder of HealthcareScene.com. However, he suggests that CIOs stop waiting for “perfect standards” and focus on how they can best achieve interoperability now.

Even with standards that can be implemented in all organizations, the complexity and diversity of the healthcare environment means that it will take time to move everyone to the same standards. This is complicated by the number of legacy systems and the patchwork of applications that have been added to healthcare IT systems in an effort to meet quickly changing needs throughout the organization. Shrinking financial resources for capital investment and increasing competition for IT professionals limit a health system’s ability to make the overall changes necessary for interoperability, no matter which standards are adopted.

Some organizations are turning to cloud-based, managed service platforms to perform the integration, aggregation and harmonization that makes data available to all users – regardless of the system or application in which the information was originally collected. This approach solves the financial and human resource challenges by making it possible to budget integration and data management requirements as an operational rather than a capital investment. This strategy also relieves the burden on in-house IT staff by relying on the expertise of professionals who focus on emerging technologies, standards and regulations that enable safe, compliant data exchange.

How are you planning to scale your interoperability and integration efforts?  If you're waiting for standards, why are you waiting?

As a leading provider of healthcare interoperability solutions, Liaison is a proud sponsor of Healthcare Scene. While the conversation about interoperability has been ongoing for many years, the ideas, new technologies and new strategies discussed and shared by IT professionals will lead to successful data exchange that transforms healthcare and results in better patient care.

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

AHA Asks Congress To Reduce Health IT Regulations for Medicare Providers

Posted on September 22, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

The American Hospital Association has sent a letter to Congress asking members to reduce regulatory burdens for Medicare providers, including mandates affecting a wide range of health IT services.

The letter, which is addressed to the House Ways and Means Health Subcommittee, notes that in 2016, CMS and other HHS agencies released 49 rules impacting hospitals and health systems, totaling nearly 24,000 pages of text.

“In addition to the sheer volume, the scope of changes required by the new regulations is beginning to outstrip the field’s ability to absorb them,” says the letter, which was signed by Thomas Nickels, executive vice president of government relations and public policy for the AHA. The letter came with a list of specific changes AHA is proposing.

Proposals of potential interest to health IT leaders include the following. The AHA is asking Congress to:

  • Expand Medicare coverage of telehealth to patients outside of rural areas and expand the types of technology that can be used. It also suggests that CMS should automatically reimburse for Medicare-covered services when delivered via telehealth unless there’s an individual exception.
  • Remove HIPAA barriers to sharing patient medical information with providers that don’t have a direct relationship with that patient, in the interests of improving care coordination and outcomes in a clinically-integrated setting.
  • Cancel Stage 3 of the Meaningful Use program, institute a 90-day reporting period for future program years and eliminate the all-or-nothing approach to compliance.
  • Suspend eCQM reporting requirements, given how difficult it is at present to pull outside data into certified EHRs for quality reporting.
  • Remove requirements that hospitals attest that they have bought technology which supports health data interoperability, and that they have responded quickly and in good faith to requests to exchange data with others. At present, hospitals could face penalties for technical issues outside their control.
  • Refocus the ONC to address a narrower scope of issues, largely EMR standards and certification, including testing products to assure health data interoperability.

I am actually somewhat surprised to say that these proposals seem largely reasonable. Typically, proposals developed by trade groups tend to be a bit too stacked in favor of that group’s narrow concerns. (By the way, I’m not taking a position on the rest of the regulatory ideas the AHA put forth.)

For example, expanding Medicare telehealth coverage seems prudent. Given beneficiaries’ age, levels of chronic illness and attendant mobility issues, telehealth could potentially do great things for the Medicare population.

Though it should be done carefully, tweaking HIPAA rules to address the realities of clinical integration could be a good thing. Certainly, no one is suggesting that we ought to throw the rulebook out the window, but it probably makes sense to square it with today’s clinical realities.

Also, the idea of torquing down MU 3 makes some sense to me as well, given the uncertainties around the entirety of MU. I don’t know if limiting future reporting to 90-day intervals is wise, but I wouldn’t take it off the table.

In other words, despite spending much of my career ripping apart trade groups’ legislative proposals, I find myself in the unusual position of supporting the majority of the ones I list above. I hope Congress gives these suggestions some serious consideration.

Interoperability: Is Your Aging Healthcare Integration Engine the Problem?

Posted on September 18, 2017 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

There is no shortage of data collected by healthcare organizations that can be used to improve clinical as well as business decisions. Announcements of new technology that collects patient information, clinical outcome data and operational metrics that will make a physician or hospital provide better, more cost-effective care bombard us on a regular basis.

The problem today is not the amount of data available to help us make better decisions; the problem is the inaccessibility of the data. When different users – physicians, allied health professionals, administrators and financial managers – turn to data for decision support, they find themselves limited to their own silos of information. The inability to access and share data across different disciplines within the healthcare organization prevents the user from making a decision based on a holistic view of the patient or operational process.

In a recent article, Alan Portela points out that precision medicine, which requires “the ability to collect real-time data from medical devices at the moment of care,” cannot happen easily without interoperability – the ability to access data across disparate systems and applications. He also points out that interoperability does not exist yet in healthcare.

Why are healthcare IT departments struggling to achieve interoperability?

Although new and improved applications are adopted on a regular basis, healthcare organizations are just now realizing that their integration middleware is no longer able to handle new types of data such as social media, today’s data volumes and the increasing number of methods to connect on a real-time basis. Their integration platforms also cannot handle the exchange of information from disparate data systems and applications beyond the four walls of hospitals. In fact, hospitals of 500 beds or more average 25 unique data sources with six electronic medical records systems in use. Those numbers will only move up over time, not down.

Integration engines in place throughout healthcare today were designed well before the explosion of the data-collection tools and digital information that exist today. Although updates and additions to integration platforms have enabled some interoperability, the need for complete interoperability is creating a movement to replace integration middleware with cloud-based managed services.

A study by the Aberdeen Group reveals that 76 percent of organizations will be replacing their integration middleware, and 70 percent of those organizations will adopt cloud-based integration solutions in the next three years.

The report also points out that as healthcare organizations move from an on-premises solution to a cloud-based platform, business leaders see migration to the cloud and managed services as a way to better manage operational expenses on a monthly basis versus large, up-front capital investments. An additional benefit is better use of in-house IT staff members, who are tasked with mission-critical, day-to-day responsibilities and may not be able to focus on continuous improvements to the platform to ensure its ability to handle future needs.

Healthcare has come a long way in the adoption of technology that can collect essential information and put it in the hands of clinical and operational decision makers. Taking that next step to effective, meaningful interoperability is critical.

As a leading provider of healthcare interoperability solutions, Liaison is a proud sponsor of Healthcare Scene. It is only through discussions and information-sharing among Health IT professionals that healthcare will achieve the organizational support for the steps required for interoperability.

Join John Lynn and Liaison for an insightful webinar on October 5, titled: The Future of Interoperability & Integration in Healthcare: How can your organization prepare?

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

Rush Sues Patient Monitoring Vendor, Says System Didn’t Work

Posted on August 25, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Rush University Medical Center has filed suit against one of its health IT vendors, claiming that its patient monitoring system didn’t work as promised and may have put patients in danger.

According to a story in the Chicago Tribune, Rush spent $18 million installing the Infinity Acute Monitoring Solution system from Telford, PA-based Draeger Inc. between 2012 and early 2016.  The Infinity system included bedside monitors, larger data aggregating monitors at central nursing stations, battery-powered portable monitors and M300 wireless patient-worn monitors.

However, Rush contends in the filing that despite years of attempts to fix the system, its patient alarms were still unreliable and inaccurate. The suit accuses Draeger of breach of contract, unjust enrichment and fraud.

In the suit, the 664-bed hospital and academic medical center says that the system was dogged by many issues which could have had an impact on patient safety. For example, it says, the portable monitors stopped collecting data when moved to wireless networks and sometimes stole IP addresses from bedside monitors, knocking the bedside monitor offline and leaving the patient unmonitored.

In addition, the system allegedly sent out false alarms for heart arrhythmia patients with pacemakers, distracting clinicians from performing their jobs, and failed to monitor apnea until 2015, according to the complaint. Even then, the system wasn’t monitoring some sets of apnea patients accurately, it said. Near the end, the system erased some patient records as well, it contends.

Not only that, Draeger didn’t deliver everything it was supposed to provide, the suit alleges, including wired-to-wireless monitoring and monitoring for desaturation of neonatal patients’ blood oxygen.

As if that weren’t enough, Draeger didn’t respond effectively when Rush executives told it about the problems it was having, according to the suit. “Rather than effectively remediating these problems, Draeger largely, and inaccurately, blamed them on Rush,” it contends.

While Draeger provided a software upgrade for the system, it was extremely difficult to implement, didn’t fix the original issues and created new problems, the suit says.

According to Rush, the Draeger system was supposed to last 10 years. However, because of technical problems it observed, the medical center replaced the system after only five years, spending $30 million on the new software, it says.

Rush is asking the court to make Draeger repay the $18 million it spent on the system, along with punitive damages and legal fees.

It’s hard to predict the outcome of such a case, particularly given that the system’s performance has to have depended in part on how Rush managed the implementation. Plus, we’re only seeing the allegations made by Rush in the suit and not Draeger’s perspective, which could be very different and offer other details. Regardless, it seems likely these proceedings will be watched closely in the industry. Whether or not a vendor is at fault, it can’t afford to get a reputation for endangering patient safety, and no hospital can afford to buy from one that does.

Is It Time To Put FHIR-Based Development Front And Center?

Posted on August 9, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

I like to look at questions other people in the #HIT world wonder about, and see whether I have a different way of looking at the subject, or something to contribute to the discussion. This time I was provoked by one asked by Chad Johnson (@OchoTex), editor of HealthStandards.com and senior marketing manager with Corepoint Health.

In a recent HealthStandards.com article, Chad asks: “What do CIOs need to know about the future of data exchange?” I thought it was an interesting question; after all, everyone in HIT, including CIOs, would like to know the answer!

In his discussion, Chad argues that #FHIR could create significant change in healthcare infrastructure. He notes that if vendors like Cerner or Epic publish a capabilities-based API, providers’ technical, clinical and workflow teams will be able to develop custom solutions that connect to those systems.

As he rightfully points out, today IT departments have to invest a lot of time doing rework. Without an interface like FHIR in place, IT staffers need to develop workflows for one application at a time, rather than creating them once and moving on. That’s just nuts. It’s hard to argue with the conclusion that if FHIR APIs offer uniform data access, everyone wins.

Far be it from me to argue with a good man like @OchoTex. He makes a good point about FHIR, one which can’t be emphasized enough – that FHIR has the potential to make vendor-specific workflow rewrites a thing of the past. Without a doubt, healthcare CIOs need to keep that in mind.
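
To make that uniformity concrete, here’s a minimal sketch of what a FHIR REST call looks like in practice, assuming Python and the requests library. The base URL points at HAPI’s public FHIR test server; a production integration would use the EHR vendor’s FHIR endpoint and authenticate via SMART on FHIR:

```python
import requests

# The point of FHIR's RESTful design: the same call shape works against
# any conformant server, instead of a per-vendor custom interface.

BASE = "https://hapi.fhir.org/baseR4"  # public HAPI test server

def get_patient(patient_id: str) -> dict:
    resp = requests.get(
        f"{BASE}/Patient/{patient_id}",
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # a FHIR Patient resource as JSON

def search_observations(patient_id: str, loinc_code: str) -> dict:
    # Searching is uniform too: resource type plus standard search parameters.
    resp = requests.get(
        f"{BASE}/Observation",
        params={"patient": patient_id, "code": loinc_code},
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # a FHIR Bundle of matching Observations
```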

As for me, I have a couple of responses to bring to the table, and some additional questions of my own.

Since I’m an HIT trend analyst rather than an actual tech pro, I can’t say whether FHIR APIs can or can’t do what Chad is describing, though I have little doubt that Chad is right about their potential uses.

Still, I’d point out that since none other than FHIR project director Grahame Grieve has cautioned us about its current limitations, we probably want to temper our enthusiasm a bit. (I know I’ve made this point a few times here, perhaps ad nauseam, but I still think it bears repeating.)

So, given that FHIR hasn’t reached its full potential, it may be that health IT leaders should invest added time on solving other important interoperability problems.

One example that leaps to mind immediately is solving patient matching problems. This is a big deal: after all, if you can’t match patient records accurately across providers, it’s likely to lead to wrong-patient medical errors.

In fact, according to a study released by AHIMA last year, 72 percent of HIM professionals who responded work on mitigating possible patient record duplicates every week. I have no reason to think things have gotten better. We must find an approach that will scale if we want interoperable data to be worth using.
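
For a sense of what matching involves, here’s a deliberately simplified sketch of probabilistic record matching. Production master patient index (MPI) systems use far richer models and tuned weights; the weights and thresholds below are illustrative only:

```python
from difflib import SequenceMatcher

# Core idea: score agreement across several demographic fields rather
# than requiring an exact match on any single field.

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_score(rec_a: dict, rec_b: dict) -> float:
    # Field weights are illustrative, not tuned values.
    weights = {"last_name": 0.3, "first_name": 0.2, "dob": 0.35, "zip": 0.15}
    return sum(
        weight * similarity(rec_a.get(field, ""), rec_b.get(field, ""))
        for field, weight in weights.items()
    )

a = {"last_name": "Smyth", "first_name": "Jon", "dob": "1954-03-02", "zip": "80045"}
b = {"last_name": "Smith", "first_name": "John", "dob": "1954-03-02", "zip": "80045"}
print(match_score(a, b))  # high score -> flag as a probable duplicate for review
```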

And patient data matching is just one item on a long list of health data interoperability concerns. I’m sure you’re aware of other pressing problems which could undercut the value of sharing patient records. The question is, are we going to address those problems before we begin full-scale health data exchange? Or does it make more sense to pave the road to data exchange and address bumps in the road later?

We Can’t Afford To Be Vague About Population Health Challenges

Posted on June 19, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Today, I looked over a recent press release from Black Book Research touting its conclusions on the role of EMR vendors in the population health technology market. Buried in the release were some observations by Alan Hutchison, vice president of Connect & Population Health at Epic.

As part of the text, the release observes that “the shift from quantity-based healthcare to quality-based patient-centric care is clearly the impetus” for population health technology demand. This sets up some thoughts from Hutchison.

The Epic exec’s quote rambles a bit, but in summary, he argues that existing systems are geared to tracking units of care under fee-for-service reimbursement schemes, which makes them dinosaurs.

And what’s the solution to this problem? Why, health systems need to invest in new (Epic) technology geared to tracking patients across their path of care. “Single-solution systems and systems built through acquisition [are] less able to effectively understand the total cost of care and where the greatest opportunities are to reduce variation, improve outcomes and lower costs,” Hutchison says.

Yes, I know that press releases generally summarize things in broad terms, but these words are particularly self-serving and empty, mashing together hot air and jargon into an unappetizing patty. Not only that, I see a little too much stating as fact things that are clearly up for grabs.

Let’s break some of these issues down, shall we?

  • First, I call shenanigans on the notion that the shift to “value-based care” means that providers will deliver quality care over quantity. If nothing else, the shifts in our system can’t be described so easily. Yeah, I know, don’t expect much from a press release, but words matter.
  • Second, though I’m not surprised Hutchison made the argument, I challenge the notion that you must invest in entirely new systems to manage population health.
  • Also, nobody is mentioning that while buying a new system to manage pop health data may be cleaner in some respects, it could make it more difficult to integrate existing data. That extra integration work undercuts the value of the new system, and may even outweigh its benefits.

I don’t know about you, but I’m pretty tired of reading low-calorie vendor quotes about the misty future of population health technology, particularly when a vendor rep claims to have The Answer.  And I’m done with seeing clichéd generalizations about value-based care pass for insight.

Actually, I get a lot more out of analyses that break down what we *don’t* know about the future of population health management.

I want to know what hasn’t worked in transitioning to value-based reimbursement. I hope to see stories describing how health systems identified their care management weaknesses. And I definitely want to find out what worries senior executives about supporting necessary changes to their care delivery models.

It’s time to admit that we don’t yet know how this population health management thing is going to work and abandon the use of terminally vague generalizations. After all, once we do, we can focus on answering our toughest questions, and that’s when we’ll begin to make real progress.

UCHealth Adds Claims Data To Population Health Dataset

Posted on April 24, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

A Colorado-based health system is implementing a new big data strategy which incorporates not only data from clinics, hospitals and pharmacies, but also a broad base of payer claim data.

UCHealth, which is based in Aurora, includes a network of seven hospitals and more than 100 clinics, caring collectively for more than 1.2 million unique patients in 2016. Its facilities include the University of Colorado Hospital, the principal teaching hospital for the University of Colorado School of Medicine.

Leaders at UCHealth are working to improve their population health efforts by integrating data from seven state insurers, including Anthem Blue Cross and Blue Shield, Cigna, Colorado Access, Colorado Choice Health Plans, Colorado Medicaid, Rocky Mountain Health Plans and United Healthcare.

The health system already has an Epic EMR in place across the system which, as readers might expect, offers a comprehensive view of all patient treatment taking place at the system’s clinics and hospitals.

That being said, the Epic database suffers from the same limitations as any other locally-based EMR. As UCHealth notes, its existing EMR data doesn’t track whether a patient changes insurers, ages into Medicare, changes doctors or moves out of the region.

To close the gaps in its EMR data, UCHealth is using technology from software vendor Stratus, which offers a healthcare data intelligence application. According to the vendor, UCHealth will use Stratus technology to support its accountable care organizations as well as its provider clinical integration strategy.

While health system execs expect to benefit from integrating payer claims data, the effort doesn’t satisfy every item on their wish list. One major challenge they’re facing is that while Epic data is available to all the instant it’s added, the payer data is not. In fact, it can take as much as 90 days before the payer data is available to UCHealth.
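
One way to handle that lag, sketched below under assumed record shapes, is to label every merged record with its source and expected latency, so a gap in claims isn’t mistaken for a gap in utilization:

```python
from datetime import date, timedelta

# Sketch of the freshness problem described above: EMR events are visible
# immediately, while claims can lag by up to ~90 days. Record shapes here
# are assumptions for illustration, not UCHealth's or Stratus's schema.

CLAIMS_LAG = timedelta(days=90)

def label_record(record: dict, today: date) -> dict:
    if record["source"] == "emr":
        record["status"] = "current"
    elif today - record["service_date"] < CLAIMS_LAG:
        # Claims for this window may simply not have arrived yet.
        record["status"] = "possibly incomplete"
    else:
        record["status"] = "settled"
    return record

rec = {"source": "claims", "service_date": date(2017, 3, 1)}
print(label_record(rec, date(2017, 4, 15)))  # -> status: possibly incomplete
```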

That being said, UCHealth’s leaders expect to be able to do a great deal with the new dataset. For example, by using Stratus, physicians may be able to figure out why a patient is visiting emergency departments more than might be expected.

Rather than guessing, the physicians will be able to request the diagnoses associated with those visits. If the doctor concludes that their conditions can be treated in one of the system’s primary care clinics, he or she can reach out to these patients and explain how clinic-based care can keep them in better health.

And of course, the health system will conduct other increasingly standard population health efforts, including spotting health trends across their community and better understanding each patient’s medical needs.

Over the next several months, 36 of UCHealth’s primary care clinics will begin using the Stratus tool. While the system hasn’t announced a formal pilot test of how Stratus works out in a production setting, rolling this technology out to just 36 clinics is clearly a modest start. But if it works, look for other health systems to scoop up claims data too!

Database Linked With Hospital EMR To Encourage Drug Monitoring

Posted on March 31, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

According to state officials, Colorado occupies the unenviable position of second worst in the US for prescription drug misuse, with more than 255,000 Coloradans misusing prescribed medications.

One way the state is fighting back is by running the Colorado Prescription Drug Monitoring Program which, like comparable efforts in other states, tracks prescriptions for controlled medications. Every regular business day, the state’s pharmacists upload prescription data for medications listed in Schedules II through V.

While this effort may have value, many physicians haven’t been using the database, largely because it can be difficult to access. In fact, historically physicians have been using the system only about 30 percent of the time when prescribing controlled substances, according to a story appearing in HealthLeaders Media.

As things stand, it can take physicians up to three minutes to access the data: they have to sign out of their EMR, visit the PDMP site, log in using separate credentials, click through to the right page, enter patient information and sort through possible matches before they get to the patient’s aggregated prescription history. Given the ugliness of this workflow, it’s no surprise that clinicians aren’t searching out PDMP data, especially if they don’t regard a patient as being at high risk for drug abuse or diversion.

But perhaps taking some needless steps out of the process can make a difference, a theory which one of the state’s hospitals is testing. Colorado officials are hoping a new pilot program linking the PDMP database to an EMR will foster higher use of the data by physicians. The pilot, funded by a federal grant through the Bureau of Justice Assistance, connects the drug database directly to the University of Colorado Hospital’s Epic EMR.

The project began with a year-long build-out phase, during which IT leaders created a gateway connecting the PDMP database and the Epic installation. Several months ago, the team followed up with a launch at the school of medicine’s emergency medicine department. Eventually, the PDMP database will be available in five EDs which have a combined total of 270,000 visits per year, HealthLeaders notes.

Under the pilot program, physicians can access the drug database with a single click, directly from within the Epic EMR system. Once the PDMP database was made available, the pilot brought physicians on board gradually: first evaluating their baseline use, then giving clinicians raw data, then presenting the data through a risk-stratification tool, and eventually requiring that they use the tool.
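
The interface of Colorado’s custom gateway isn’t public, but conceptually the “single click” works because the EMR already holds the patient context. Here’s a hypothetical sketch of that pattern; the endpoint, payload and token handling are all assumptions, not the actual integration:

```python
import requests

# Hypothetical sketch of the "one click" pattern: the EMR passes the
# patient context it already has to an integration gateway, which queries
# the PDMP and returns the aggregated prescription history. The URL and
# payload are illustrative; the real Colorado gateway isn't public.

GATEWAY = "https://pdmp-gateway.example.org/v1/history"  # hypothetical URL

def pdmp_history(patient: dict, clinician_token: str) -> dict:
    resp = requests.post(
        GATEWAY,
        json={
            # Demographics come straight from the EMR context, so the
            # clinician never re-enters them or logs in separately.
            "first_name": patient["first_name"],
            "last_name": patient["last_name"],
            "dob": patient["dob"],
        },
        headers={"Authorization": f"Bearer {clinician_token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # aggregated Schedule II-V prescription history
```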

Researchers guiding the pilot are evaluating whether providers use the PDMP more and whether it has an impact on high-risk patients. Researchers will also analyze what happened to patients a year before, during and a year after their ED visits, using de-identified patient data.

It’s worth pointing out that people outside of Colorado are well aware of the PDMP access issue. In fact, the ONC has been paying fairly close attention to the problem of making PDMP data more accessible. That being said, the agency notes that integrating PDMPs with other health IT systems won’t come easily, given that no uniform standards exist for linking prescription drug data with health IT systems. ONC staffers have apparently been working to develop a standard approach for delivering PDMP data to EMRs, pharmacy systems and health information exchanges.

However, at present it looks like custom integration will be necessary. Perhaps pilots like this one will lead by example.