Some Projections For 2017 Hospital IT Spending

Posted on January 4, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com, and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

A couple of months ago, HIMSS released some statistics from its survey on US hospitals’ plans for IT investment over the next 12 months. The results contain a couple of data points that I found particularly interesting:

  • While I had expected the most common type of planned spending to be focused on population health or related solutions, HIMSS found that pharmacy was the most active category. In fact, 51% of hospitals were planning to invest in at least one pharmacy technology, largely to improve tracking of medication dispensing in additional patient care environments. Researchers also found that 6% of hospitals were planning to add carousels or packagers in their pharmacies.
  • Eight percent of hospitals said that they plan to invest in EMR components, which I hadn’t anticipated (though it makes sense in retrospect). HIMSS reported that 14% of hospitals at Stage 1-4 of its Electronic Medical Record Adoption Model are investing in pharmacy tech for closed-loop med administration, and 17% in auto-ID tech. Four percent of Stage 6 hospitals plan to support or expand information exchange capabilities. Meanwhile, 60% of Stage 7 hospitals are investing in hardware infrastructure “for the post-EMR world.”

Other data from the HIMSS report included news of new analytics and telecom plans:

  • Researchers say that recent mergers and acquisitions are triggering new investments around telephony. They found that 12% of hospitals with inpatient revenues between $25 million and $125 million – and 6% of hospitals with more than $500 million in inpatient revenues — are investing in VOIP and telemedicine. FWIW, I’m not sure how mergers and acquisitions would trigger telemedicine rollouts, as they’re already well underway at many hospitals — maybe these deals foster new thinking and innovation?
  • As readers know, hospitals are increasingly spending on analytics solutions to improve care and make use of big data. However (and this surprised me), only 8% of hospitals reported plans to buy at least one analytics technology. My guess is that this number is small because a) hospitals may not have collected their big data assets in easily-analyzed form yet and b) they’re still hoping to make better use of their legacy analytics tools.

Looking at these stats as a whole, I get the sense that the hospitals surveyed are expecting to play catch-up and shore up their infrastructure next year, rather than sink big dollars into future-looking solutions.

Without a doubt, hospital leaders are likely to invest soon in game-changing technologies, such as cutting-edge patient engagement and population health platforms, to prepare for the shift to value-based care. It’s inevitable.

But in the meantime it probably makes sense for them to focus on internal cost drivers like pharmacy departments, whose average annual inpatient drug spending shot up by more than 23% between 2013 and 2015. Without stanching that kind of bleeding, hospitals are unlikely to get as much value as they’d like from big-idea investments in the future.

Paris Hospitals Use Big Data To Predict Admissions

Posted on December 19, 2016 | Written By Anne Zieger

Here’s a fascinating story from Paris (or par-ee, if you’re a Francophile), courtesy of Forbes. The article details how a group of top hospitals there are running a trial of big data and machine learning tech designed to predict admission rates. The predictive model, which is being tested at four of the hospitals that make up the Assistance Publique-Hôpitaux de Paris (AP-HP), is designed to forecast admission rates as much as 15 days in advance.

The four hospitals participating in the project have pulled together a massive trove of data from both internal and external sources, including 10 years’ worth of hospital admission records. The goal is to forecast admissions by the day and even by the hour for the four facilities participating in the test.

According to Forbes contributor Bernard Marr, the project involves time series analysis techniques that can detect patterns in the data useful for predicting admission rates at different times. The hospitals are also using machine learning to determine which algorithms are likely to make good predictions from historical hospital data.
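
To make the idea concrete, here is a minimal sketch of the kind of 15-day forecast being described, not the AP-HP system itself: it fits a seasonal ARIMA model to a (synthetic) daily admissions series and projects two weeks ahead. The data and model orders are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic stand-in for years of daily admission counts (a real pilot would
# load the hospital's historical admissions here).
dates = pd.date_range("2014-01-01", periods=3 * 365, freq="D")
rng = np.random.default_rng(0)
admissions = 120 + 15 * (dates.dayofweek < 5) + rng.poisson(10, len(dates))
history = pd.Series(admissions, index=dates)

# Weekly seasonality (period=7) is the obvious pattern in admissions data;
# the (p, d, q) orders are placeholders you would tune by backtesting.
model = SARIMAX(history, order=(1, 1, 1), seasonal_order=(1, 1, 1, 7))
fitted = model.fit(disp=False)

# Project the next 15 days, the same horizon as the AP-HP pilot.
forecast = fitted.get_forecast(steps=15)
print(forecast.predicted_mean.round(1))
print(forecast.conf_int())   # uncertainty bands are what staffing decisions need
```

A production system would also fold in the external data sources Marr mentions and compare several candidate algorithms, which is the model-selection step described above.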

The system the hospitals are using is built on the open source Trusted Analytics Platform. According to Marr, the partners felt that the platform offered a particularly strong capacity for ingesting and crunching large amounts of data. They also built on TAP because it was geared towards open, collaborative development environments.

The pilot system is accessible via a browser-based interface, designed to be simple enough that data science novices like doctors, nurses and hospital administration staff could use the tool to forecast visit and admission rates. Armed with this knowledge, hospital leaders can then pull in extra staffers when increased levels of traffic are expected.

Being able to work in a distributed environment will be key if AP-HP decides to roll the pilot out to all of its 44 hospitals, so developers built with that in mind. To be prepared for the future, which might call for adding a great deal of storage and processing power, they designed a distributed, cloud-based system.

“There are many analytical solutions for these type of problems, [but] none of them have been implemented in a distributed fashion,” said Kyle Ambert, an Intel data scientist and TAP contributor who spoke with Marr. “Because we’re interested in scalability, we wanted to make sure we could implement these well-understood algorithms in such a way that they work over distributed systems.”

To make this happen, however, Ambert and the development team have had to build their own tools, an effort which resulted in the first contribution to an open-source framework of code designed to carry out analysis over a scalable, distributed framework, one which is already being deployed in other healthcare environments, Marr reports.

My feeling is that there’s no reason American hospitals can’t experiment with this approach. In fact, maybe they already are. Readers, are you aware of any US facilities which are doing something similar? (Or are most still focused on “skinny” data?)

Easing The Transition To Big Data

Posted on December 16, 2016 | Written By Anne Zieger

Tapping the capabilities of big data has become increasingly important for healthcare organizations in recent years. But as HIT expert Adheet Gogate notes, the transition is not an easy one, forcing these organizations to migrate from legacy data management systems to new systems designed specifically for use with new types of data.

Gogate, who serves as vice president of consulting at Citius Tech, rightly points out that even when hospitals and health systems spend big bucks on new technology, they may not see any concrete benefits. But if they move through the big data rollout process correctly, their efforts are more likely to bear fruit, he suggests. And he offers four steps organizations can take to ease this transition. They include:

  • Have the right mindset:  Historically, many healthcare leaders came up through the business in environments where retrieving patient data was difficult and prone to delays, so their expectations may be low. But if they hope to lead successful big data efforts, they need to embrace the new data-rich environment, understand big data’s potential and ask insightful questions. This will help to create a data-oriented culture in their organization, Gogate writes.
  • Learn from other industries: Bear in mind that other industries have already grappled with big data models, and that many have seen significant successes already. Healthcare leaders should learn from these industries, which include civil aviation, retail and logistics, and consider adopting their approaches. In some cases, they might want to consider bringing an executive from one of these industries on board at a leadership level, Gogate suggests.
  • Employ the skills of data scientists: To tame the floods of data coming into their organization, healthcare leaders should actively recruit data scientists, whose job is to translate business questions into the methods, approaches and processes for developing analytics that will answer them. Once they hire such scientists, leaders should ensure they have the active support of frontline staffers and operations leaders, so that the analyses they provide are useful to the team, Gogate recommends.
  • Think like a startup: It helps when leaders adopt an entrepreneurial mindset toward big data rollouts. These efforts should be led by senior leaders comfortable with this space, who let key players operate as an enterprise of their own at first and invest in building critical mass in data science. Then, assign a group of core team members and frontline managers to areas where analytics capabilities are most needed. Rotate these teams across the organization to wherever business problems reside, and let them generate valuable improvement insights. Over time, these insights will help the whole organization improve its big data capabilities, Gogate says.

Of course, taking an agile, entrepreneurial approach to big data will only work if it has widespread support, from the C-suite on down. Also, healthcare organizations will face some concrete barriers in building out big data capabilities, such as recruiting the right data scientists and identifying and paying for the right next-gen technology. Other issues include falling reimbursements and the need to personalize care, according to healthcare CIO David Chou.

But assuming these other challenges are met, embracing big data with a willing-to-learn attitude is more likely to work than treating it as just another development project. And the more you learn, the more successful you’ll be in the future.

Using NLP with Machine Learning for Predictive Analytics in Healthcare

Posted on December 12, 2016 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

There are a lot of elements involved in doing predictive analytics in healthcare effectively. In most cases I’ve seen, organizations working on predictive analytics do some but not all that’s needed to really make predictive analytics as effective as possible. This was highlighted to me when I recently talked with Frank Stearns, Executive Vice President from HBI Solutions at the Digital Health Conference in NYC.

Here’s a great overview of the HBI Solutions approach to patient risk scores:

[Image: HBI Solutions healthcare predictive analytics model]

This process will look familiar to most people in the predictive analytics space. You take all the patient data you can find, put it into a machine learning engine and output a patient risk score. One of the biggest trends here is the real-time nature of the process. Plus, I also love the way the output includes the attributes that influenced a patient’s risk score. Both of these are incredibly important when trying to make this data actionable.
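
As a rough illustration of that pattern (a hedged sketch, not HBI Solutions’ actual engine), the snippet below trains a simple model on made-up patient features, produces a risk score, and reports how much each attribute pushed the score up or down.

```python
# Illustrative only: feature names, data and model are stand-ins for whatever
# a real risk engine would use; the point is scoring plus attribution.
import numpy as np
from sklearn.linear_model import LogisticRegression

features = ["age_over_65", "prior_admissions", "diabetes", "smoker"]
X = np.array([[1, 3, 1, 0],
              [0, 0, 0, 1],
              [1, 1, 1, 1],
              [0, 2, 0, 0]])
y = np.array([1, 0, 1, 0])  # 1 = adverse outcome in the training data

model = LogisticRegression().fit(X, y)

patient = np.array([[1, 2, 1, 0]])
risk = model.predict_proba(patient)[0, 1]

# Per-attribute contribution: coefficient * feature value (log-odds scale).
contributions = dict(zip(features, model.coef_[0] * patient[0]))
print(f"risk score: {risk:.2f}")
for name, value in sorted(contributions.items(), key=lambda kv: -abs(kv[1])):
    print(f"  {name}: {value:+.2f}")
```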

However, the thing that stood out for me in HBI Solutions’ approach is the inclusion of natural language processing (NLP) in their analysis of the unstructured patient data. I’d seen NLP being used in EHR software before, but I think the implementation of NLP is even more powerful in doing predictive analytics.

In the EHR world, you have to be absolutely precise. If you’re not precise with the way you code a visit, you won’t get paid. If you’re not precise with how the diagnosis is entered into the EHR, that can have long-term consequences. This has posed a real challenge for NLP, since NLP is not 100% accurate. It’s gotten astoundingly good, but it still has shortcomings that require human review when it’s used in an EHR.

The same isn’t true when applying NLP to unstructured data for predictive analytics. Predictive analytics by its very nature incorporates some degree of variation and error. It’s understood that a prediction could be wrong; it’s an indication of risk, not a definitive finding. Certainly, a failure in NLP’s recognition of certain data could throw off a prediction. That’s unfortunate, but the predictions aren’t relied on the same way documentation in an EHR is relied upon. So it’s not nearly as big of a deal.

Plus, the value of applying NLP to pull out the nuggets of information that exist in the unstructured narrative sections of healthcare data is well worth the small risk of the NLP being incorrect. As Frank Stearns from HBI Solutions pointed out to me, the unstructured data is often where the really valuable information about a patient’s risk resides.

I’d be interested in having HBI Solutions do a study of the whole list of findings that are often available in the unstructured data but weren’t available otherwise. However, it’s not hard to imagine a doctor documenting patient observations in the unstructured EHR narrative that they didn’t want to include as a formal diagnosis. Not the least of these are behavioral health observations that the doctor saw and documented but didn’t want to fully diagnose. NLP can pull these out of the narrative and include them in the patient’s risk score.
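
To illustrate the idea (a toy sketch, not HBI Solutions’ NLP), here is a crude extractor that flags behavioral health terms in narrative text while skipping negated mentions. Real clinical NLP uses full pipelines with concept mapping and negation detection; the term list and regex here are purely illustrative.

```python
import re

# Illustrative term list; a real system would map to clinical concept codes.
BEHAVIORAL_TERMS = ["anxiety", "depressed mood", "alcohol use", "insomnia"]
NEGATIONS = re.compile(r"\b(no|denies|without)\b[^.]*$", re.IGNORECASE)

def extract_observations(note: str) -> list[str]:
    """Return behavioral-health mentions that are not negated in their clause."""
    found = []
    for sentence in re.split(r"(?<=[.!?])\s+", note):
        for term in BEHAVIORAL_TERMS:
            if term in sentence.lower():
                prefix = sentence.lower().split(term)[0]
                if not NEGATIONS.search(prefix):
                    found.append(term)
    return found

note = ("Patient reports insomnia and depressed mood over the past month. "
        "Denies alcohol use. No formal psychiatric diagnosis recorded.")
print(extract_observations(note))  # ['depressed mood', 'insomnia']
```

Features like these can then be fed into the risk model alongside the structured data, which is the combination the HBI Solutions diagram describes.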

Given this perspective, it’s hard to imagine we’ll ever be able to get away from using NLP or related technology to pull out the valuable insights in the unstructured data. Plus, it’s easy to see how predictive analytics that don’t use NLP are going to be deficient when trying to use machine learning to analyze patients. What’s amazing is that HBI Solutions has been applying machine learning to healthcare for five years. That’s a long time, but it also explains why they’ve implemented advanced capabilities like NLP in their predictive analytics solutions.

Steps Hospitals Should Consider When Migrating EMR Data

Posted on November 2, 2016 | Written By Anne Zieger

When your organization decides to convert to a new EMR, the problems it faces extend beyond having to put the right technical architecture in place. Deciding which data to migrate and how much data to migrate from the previous EMR poses additional challenges, and they’re not trivial.

On the one hand, moving over all of your data is expensive (and probably not necessary). On the other, if you migrate too little data, clinicians won’t have an adequate patient history to work from, and what’s more, may not be in compliance with legal requirements.

But there are methods for determining how to make the transition successfully. HCI Group Data Technical Lead Mustafa Raja argues that there are three key factors hospitals should consider when planning to migrate legacy EMR data into a new system:

  • Decide which data you will archive and which you will migrate. While many organizations fall back on moving six months of acute care data and a year’s worth of ambulatory data, Raja recommends looking deeper. Specifically, while ambulatory transitions may just include medications the patients are on and diagnostic codes in the past year, acute care data encompasses many different data types, including allergies, medications, orders, labs and radiology reports. So deciding what should transition isn’t a one-size-fits-all decision. Once you’ve made the decision as to what data will be transitioned, see that whatever archival storage system you decide upon is easily accessible and not too costly, Raja suggests. You’ll want to have the data available, in part, to respond to security audits.
  • Consider how complex the data is before you choose it for transition to the new EMR. Bear in mind that data types will vary, and that storage methods within the new system may vary from the old. If you are migrating from a nonstandard legacy system to an EMR with data standards in place — which is often the case — you’ll need to decide whether you are willing to go through the standardization process to make the old data available. If not, bear in mind that the nonstandard data won’t be easily accessible or usable, which can generate headaches.
  • Be prepared for the effect of changes in clinical rules and workflow. When upgrading from your legacy system, you’ll probably find that some of its functionality doesn’t work well with the new system, as the new system’s better-optimized workflows won’t be compatible with the old system’s, Raja notes. What kind of problems will you encounter? Raja offers the example of a legacy system which includes non-required fields in one of its forms, transitioning to a system that does require those fields. Since the data for the newly-required fields doesn’t exist, how do you handle the problem? (One way to approach it is sketched after this list.)
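
Here is a minimal sketch of one way to handle the newly-required-field problem, assuming a per-field policy that either fills an agreed default or flags the record for manual review. The field names and policies are illustrative assumptions, not HCI Group’s method.

```python
# Illustrative migration helper: for each field the target EMR requires but the
# legacy record may lack, either supply a documented default agreed with
# clinicians or route the record to a manual review queue.
REQUIRED_DEFAULTS = {
    "allergy_status": "not assessed",   # explicit "unknown" value agreed in advance
    "smoking_status": None,             # None = no safe default; needs manual review
}

def prepare_for_migration(record: dict) -> tuple[dict, list[str]]:
    """Return the migrated record plus the fields that need manual abstraction."""
    migrated = dict(record)
    needs_review = []
    for field, default in REQUIRED_DEFAULTS.items():
        if migrated.get(field) in (None, ""):
            if default is not None:
                migrated[field] = default
            else:
                needs_review.append(field)
    return migrated, needs_review

legacy_record = {"patient_id": "12345", "allergy_status": "", "smoking_status": None}
migrated, review = prepare_for_migration(legacy_record)
print(migrated)   # allergy_status filled with the agreed default
print(review)     # ['smoking_status'] flagged for manual review
```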

Of course, your plans for data migration will be governed by many other considerations, including the speed at which you have to transition, the purposes to which you plan to put your new EMR, your budget, staffing levels and more. But these guidelines should offer a useful look at how to begin thinking about the data migration process.

Are Your Health Data Efforts a Foundation for the Future?

Posted on June 10, 2016 | Written By John Lynn

I recently was talking with Jonathan Sheldon from Oracle and I was inspired by the idea that today’s data projects could be the essential foundation for future healthcare analytics and care that form what we now call Precision Medicine. Chew on that idea for a minute. There’s a lot of power in the idea of building blocks that open up new avenues for innovation.

How many healthcare ideas have been shot down because “that’s impossible”? Lots of them. Why are so many of these things “impossible”? They’re impossible because there are usually 10-15 things that need to be accomplished to be able to make the impossible possible.

Take healthcare analytics as an example. I once worked with a clinician to do a study on obesity in our patient population. As we started to put together the study it required us to pull all of the charts for patients whose BMI was over a certain level. Since we were on an EHR, I ran the report and the clinician researching the study easily had a list of every patient that met her criteria. Imagine trying to do that study before EHR. Someone would have had to manually go through thousands of paper charts to identify which ones met the criteria. No doubt that study would have been met with the complaint “That’s impossible.” (Remember that too expensive or time consuming is considered impossible for most organizations.)
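
For what it’s worth, once the data is discrete and electronic, the “impossible” study above reduces to a one-line filter. A trivial sketch, assuming the patient data has already been extracted from the EHR into a table with a BMI column:

```python
# Illustrative data; in practice this would come from an EHR report or export.
import pandas as pd

patients = pd.DataFrame({
    "patient_id": [101, 102, 103],
    "bmi": [24.5, 31.2, 38.7],
})

# One line replaces a manual review of thousands of paper charts.
study_cohort = patients[patients["bmi"] > 30]
print(study_cohort)
```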

What I just described was a super simple study. Now take that same concept and apply it beyond studies into things like real time analytics displayed to the provider at the point of care. How do you do that in a paper chart world? That’s right. You don’t even think about it because it’s impossible.

Sometimes we have to take a step back and imagine the building blocks that will be necessary for future innovation. Clean, trusted data is a good foundational building block for that innovation. The future of healthcare is going to be built on the back of health data. Your ability to trust your data is going to be an essential step to ensuring your organization can do the “impossible”.

The Current State Of “Big Data” In Healthcare – Health Care CXO Scene

Posted on November 2, 2015 | Written By

David is a global digital healthcare leader who is focused on the next era of healthcare IT. Most recently, David served as the CIO at an academic medical center, where he was responsible for all technology related to the three missions of education, research and patient care. David has worked for a variety of healthcare providers, ranging from academic medical centers to non-profit and for-profit organizations. Subscribe to David's latest CXO Scene posts here.

Editor’s Note: A big welcome to David Chou, the newest member of the Healthcare Scene family of bloggers. David has a great background as a hospital CIO and will bring a wealth of knowledge to Hospital EMR and EHR readers. We’re calling David’s series of blog posts the Healthcare CXO Scene. You can receive the CXO Scene blogs by email as well. Welcome David!

Healthcare is finally evolving towards using data in our decision-making. The landscape has changed dramatically with the adoption of electronic medical records across the nation. Healthcare used to be a predominantly paper-based vertical, and there are still lots of areas where it is dominated by paper. The fax is also still alive as a communication channel, but the industry has transformed dramatically in the last few years.

According to the Office of the National Coordinator, in 2013 nearly six in ten (59%) hospitals had adopted at least a basic EHR system. This represents an increase of 34% from 2012 to 2013 and a five-fold increase since 2008. I am sure that percentage is even higher in 2015 as we continue our journey towards an electronic world.

The workflow for the clinician and physician documentation does take a little longer now that they have to type instead of write their notes, but the advantages of having discrete data elements to run analytics will transform the decision making of every organization. If you Google the definition of “big data” the consensus definition is the wealth of structured, semi-structured and unstructured data that has the potential to be mined for information.

Unfortunately, the healthcare vertical is still playing catch-up, and the majority of organizations still use only electronic medical record (EMR) data for decision-making. Healthcare used to be similar to the airline industry, where the key to success was keeping hospital beds occupied, much as airlines want to keep every seat on the airplane filled. The new model of care is figuring out a mechanism to keep patients out of hospital beds and focus on keeping them healthy through preventative measures. We have to do all of this while figuring out the right financial model to be profitable.

As we transition from a fee-for-service payment model to a value-based payment model, it is critical for every organization to transform its business processes. Analytics will be key in making that change. Now let’s focus on the two key challenges that will force healthcare providers to use data to drive the decisions affecting their operations, both internally and externally.

Challenge #1: Healthcare reimbursements from Medicare and Medicaid have declined year after year

This has a huge financial impact on health care since the Medicare expenditures have been growing as the baby boomer population ages. There has also been a steady increase of Medicaid expenditures, so the trend of lower reimbursements for taking care of a growing population will be what lies ahead for us in health care. Effective, quality delivery of care while reducing waste will be the main driver of success in the future.

Healthcare providers must understand the cost of delivering care down to the unit level. You will be surprised by the variation in cost across procedures: the cost of the same procedure can vary by as much as 15-25% based on the products used, so one of the key elements of cost containment is standardization. As we transition to a value-based payment model, there will also be value-based contracts structured around a shared savings model. The contractual terms will vary, but the general theme will be to incentivize providers to reduce the cost of providing quality care to a population by offering them a percentage of the net savings. We are seeing this trend in the Medicare Shared Savings Program, and leveraging data analytics will be the key driving tool for making this successful.
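
As a back-of-the-envelope illustration of the shared savings theme (all numbers invented, and real contracts add quality gates and minimum savings thresholds):

```python
# Toy shared-savings calculation: provider keeps a share of spend below benchmark.
benchmark_spend = 100_000_000      # payer's expected spend for the population
actual_spend = 94_000_000          # what the provider actually spent
shared_savings_rate = 0.50         # provider's contractual share of net savings

net_savings = max(benchmark_spend - actual_spend, 0)
provider_payment = net_savings * shared_savings_rate
print(f"Net savings: ${net_savings:,.0f}; provider share: ${provider_payment:,.0f}")
```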

Challenge #2: The Move Towards Personalized Care

Consumers/patients have different expectations now. We are living in an on-demand personalized world where every industry vertical is moving towards a predictive environment including healthcare. The ideal scenario would be to consume data from the social platforms, wearables/sensors, mobile, public data, and other sources so that we can really understand in real time the current state of the consumer/patient.

Let’s assume the scenario of a digital consumer who is a diabetic patient prescribed a low-calorie diet. The patient wears a Fitbit and has a smartphone app that tracks his heart rate. The heart rate is a bit higher than normal and the patient feels a little off. The wearable and mobile app are integrated with a central monitoring system at the hospital, and an alarm alerts a clinician, who checks the patient’s profile and history and takes the proactive measure of making a video call to the patient.

The patient answers the video call, and in the video interaction the clinician can see the patient’s facial color and ask a few questions. Fortunately, the patient finished an intense workout about an hour ago, so the elevated heart rate is nothing to worry about at the moment, and the video interaction also alleviates the patient’s anxiety. It is about 7pm, so the patient decides to get something to eat; he is craving a burger, so he pulls into a drive-through. The patient has GPS turned on on his smartphone and also posts on Facebook that he is at a fast food chain’s drive-through. This data is picked up by the hospital’s CRM application, which sends the patient an automated text reminding him of the low-calorie diet and making a few recommendations from the menu. The patient can now make an informed decision, and instead of ordering a burger he orders a grilled chicken sandwich.
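
A hedged sketch of the kind of event-driven rule that scenario implies is shown below. Everything here (the event shape, the care-plan lookup, the send_text placeholder) is hypothetical; a real deployment would sit behind a CRM or engagement platform with proper patient consent.

```python
from dataclasses import dataclass

@dataclass
class LocationEvent:
    patient_id: str
    venue_category: str   # e.g. "fast_food", derived from GPS or check-in data

# Illustrative care-plan store; a real system would query the CRM or EHR.
CARE_PLANS = {"pt-42": {"low_calorie_diet": True}}

def send_text(patient_id: str, message: str) -> None:
    print(f"[SMS to {patient_id}] {message}")   # placeholder for a messaging API

def handle_event(event: LocationEvent) -> None:
    plan = CARE_PLANS.get(event.patient_id, {})
    if event.venue_category == "fast_food" and plan.get("low_calorie_diet"):
        send_text(event.patient_id,
                  "Reminder: you're on a low-calorie plan. "
                  "The grilled options on the menu are the best fit.")

handle_event(LocationEvent(patient_id="pt-42", venue_category="fast_food"))
```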

The technology I have described is already in place, and it is similar to the retail sector, where you walk into a store and they already know your behavior. There is a trigger to create an action, which hopefully equates to a sale.

Healthcare must move towards this culture of living in an on-demand world where we can predict or influence patient behavior. The challenge I see is that the majority of healthcare providers are still focused on their internal operations leveraging EMR data, and we have not focused on the digital consumer yet. There is a lot of great work being done by enterprise vendors and healthcare providers, but as we move down the journey of managing population health, we can really learn from other verticals and how they leverage big data technology to improve consumer/patient engagement. All of this will ultimately lead to a healthier population.


Key Big Data Challenges Providers Must Face

Posted on July 17, 2015 | Written By Anne Zieger

Everybody likes to talk about the promise of big data, but managing it is another story. Taming big data will take new strategies and new IT skills, neither of which is a no-brainer, according to new research by the BPI Network.

While BPI Network has identified seven big data pain points, I’d argue that they boil down to just a few key issues:

  • Data storage and management: While providers may prefer to host their massive data stores in-house, this approach is beginning to wear out, at least as the only strategy in town. Over time, hospitals have begun moving to cloud-based solutions, at least in hybrid models that offload some of their data. As they cautiously explore outsourcing some of their data management and storage, meanwhile, they have to make sure that they have security locked down well enough to comply with HIPAA and repel hackers.
  • Staffing: Health IT leaders may need to look for a new breed of IT hire, as the skills associated with running datacenters have shifted to the application level rather than the data transmission and security levels. And this has changed hiring patterns in many IT shops. When BPI queried IT leaders, 41% said they’d be looking for application development pros, compared with 24% seeking security skills. Ultimately, health IT departments will need staffers with a different mindset than those who maintained datasets over the long term, as these days providers need IT teams that solve emerging problems.
  • Data and application availability: Health IT execs may finally be comfortable moving at least some of their data into the cloud, probably because they’ve come to believe that their cloud vendor offers good enough security to meet regulatory requirements. But that’s only a part of what they need to consider. Whether their data is based in the cloud or in a data center, health IT departments need to be sure they can offer high data availability, even if a datacenter is destroyed. What’s more, they also need to offer very high availability to EMRs and other clinical data-wrangling apps, something that gets even more complicated if the app is hosted in the cloud.

Now, the reality is that these problems aren’t big issues for every provider just yet. In fact, according to an analysis by KPMG, only 10% of providers are currently using big data to its fullest potential. The 271 healthcare professionals surveyed by KPMG said that there were several major barriers to leveraging big data in their organization, including having unstandardized data in silos (37%), lacking the right technology infrastructure (17%) and failing to have data and analytics experts on board (15%).  Perhaps due to these roadblocks, a full 21% of healthcare respondents had no data analytics initiatives in place yet, though they were at the planning stages.

Still, it’s good to look at the obstacles health IT departments will face when they do take on more advanced data management and analytics efforts. After all, while ensuring high data and app availability, stocking the IT department with the right skillsets and implementing a wise data management strategy aren’t trivial, they’re doable for CIOs that plan ahead. And it’s not as if health leaders have a choice. Going from maintaining an enterprise data warehouse to leveraging health data analytics may be challenging, but it’s critical to make it happen.

Interview with Dana Sellers: Encore Pay for Performance (P4P) Managed Services

Posted on February 20, 2014 | Written By John Lynn

The following is an interview with Dana Sellers, CEO of Encore Health Resources about their new Pay for Performance Managed Services offering which they’ll be sharing at HIMSS 2014.
Tell us your vision for how your new P4P Managed Services will work.
Our vision is to help our clients manage performance and data components against payer contracts to maximize quality, obtain incentives, and avoid penalties. Our offering uses a combination of Encore subject-matter experts (SMEs), software tools, and methodologies that we’ve already tested and proven in large healthcare systems. P4P Managed Services lifts the burden of meeting these value-based demands off our clients’ shoulders and into Encore’s hands. As part of this innovative offering, we also share risks and rewards via multi-year partnerships. We work with clients to ensure that they have the trusted data they need to support performance improvement and obtain incentives.

Our service begins with the P4P Value-driven Roadmap, which identifies the dollars and associated measures at stake in clients’ at-risk, value-based contracts — including projections for the next few years. As input for the roadmap, we perform a data assessment of a client’s EHR and other source systems to determine if they are capturing the right data for the targeted measures. The roadmap defines the multi-year data and process program required to obtain the desired incentives.

Next, we establish this required program along with data governance, technology, and data tools. We also build the value components of the program, including EHR remediation, workflow redesign, change management, data profiling, ETL, and dashboards required to monitor performance.

Once the program foundation is established, the value-management cycle begins. Encore monitors each client’s performance, providing insight through performance analysis and suggesting needed performance improvements to meet all targeted incentives and enhance the quality of care. Also, as new contracts emerge, we work with clients to incorporate new eMeasures into the program.

By creating trusted, transparent data, Encore helps health systems transform and meet new payment-model requirements by using eMeasures to adhere to evidence-based standards. The result is better patient care and an improved bottom line. We provide the consulting expertise, unique methodologies, and our own, in-house developed software tools to help our clients succeed — as we’ve proven by our results in helping other large clients accelerate their achievements through eMeasures.

Why did you choose to offer a service like this?
We know EHR data! Our methodologies and software tools are built around EHR data and eMeasures.

Encore was founded to provide consulting services with a focus on analytics fueled by clinical data. In the broad spectrum of consulting services that we provide – from HIT and clinical advisory to implementation, go-live services, and analytics — our focus is trained on identifying and gathering the data that our clients need to improve healthcare and operational performance. Therefore, our P4P Managed Services offering is a natural extension of our mission. At-risk contracts require the ability to track eMeasures, which has been an Encore strength – and differentiator — since our founding five years ago.

Our vision for P4P Managed Services is also supported by our clients – especially CIOs and CFOs. They have told us that they need assistance with all aspects of data capture, analytics, and performance improvement. When we lift that burden from our clients’ shoulders, it frees them to focus on other critical issues, such as cost reduction, while we leverage our unique expertise and proven experience to manage the value side of the equation.

Which P4P programs do you see Encore supporting?
We support measures — quality measures that go back to incentives. These include Medicare, Medicaid, commercial P4P/Fee-for-Value type contracts, IQR, PQRS, ACO, ACAs, ACCs, NQF/CQMs, PCMH, PCQUS, clinically integrated networks, and the like.

Our methodology and tools tie the eMeasures directly to workflow, so we know how to change each client’s workflow to get better results. Our knowledge bases include over 350 eMeasures.

How much of this offering is technical and how much of it is services?
This is an important question. Encore is first and foremost a services company – a services company that is strongly differentiated by unmatched, in-house-developed software solutions that are uniquely designed to support the services we provide. So our new offering is precisely that: services supported by innovative technology and processes on a flexible, as-needed basis.

What does the cost structure look like for this service?
As described earlier, the P4P Managed Services cost structure is based upon a roadmap we define with each client to quantify the value-based, at-risk dollars and the client’s capabilities to manage the quality-performance components of their at-risk contracts. Contract details, therefore, will vary with each client’s situation. The bottom line is that Encore is willing to manage our P4P Managed Services contracts while working with clients to define a risk-sharing arrangement that incents everyone to achieve.

Why would an organization choose to outsource the P4P to Encore as opposed to doing it in-house?
The process of managing performance against eMeasures across a health system is complex, and many clients have not put together a disciplined approach to performance improvement. Further, many of our clients are telling us that they simply do not have the full complement of expertise, resources, technology, and program-management disciplines available to move fast enough against a dizzying array of government and commercial at-risk contracts. But we do, and we – especially our skilled eMeasures experts — have a track record that proves it.

Also, an increasing number of health systems are recognizing that they’ll have to enter a world of eMeasures that is growing every year. With P4P Managed Services, we bring the expertise, skills, tools, and methodology that can take this eMeasure world and our clients under our wing. Our new service provides clients the breathing room to focus on multiple fronts simultaneously – and not leave any dollars on the table as a result.

A third reason for choosing our new offering is because it’s a cost-conscious solution. We eliminate the need for clients to hire more architects, eMeasures specialists, analysts, and report builders.

Finally, P4P Managed Services can preserve endangered species. That is, we supplement our clients’ existing IT department with some of the hardest resources to find: clinicians and operational SMEs with an understanding of data; eMeasures experts; and, technical SMEs with an understanding of the clinical and the operational worlds.

How much accountability is Encore taking on with these P4P Managed Services? Where do you draw the line?
Our new offering is a full life-cycle solution that we approach as a partnership. We nail down the amount of accountability – the risk that we’re able to share – on a case-by-case basis through the roadmap. Depending upon what we learn, we then determine the degree of accountability that both we and our clients can share to incent the highest levels of achievement.

Is there some risk on Encore’s part that the client will fall short on what they need to accomplish for Encore to provide the P4P services? Encore can’t go in and do the documentation for the doctor.
This is precisely what our new service is in place to define. As with every engagement, we use a thorough, careful assessment process to ascertain the nature of the challenges involved. With P4P Managed Services, that means understanding:
• The incentives involved
• The risk involved if our clients can’t achieve optimal revenue reimbursement – say with Medicare and Medicaid contracts
• The risk involved for Encore if those contractual incentives are not earned
Bottom line: we both win, or we both lose. With P4P managed services, we are convinced that we can define on a case-by-case basis the mix of Encore services, solutions, and client resources that Encore will manage to produce a win or multiple wins for both sides.

This feels similar to revenue cycle management (RCM) applied to P4P programs. Can you apply some of the RCM learnings to this type of offering?
Yes, similarities do exist between RCM and the management of quality performance components of at-risk contracts. The way we see it, RCM has been responsible for collecting patient data and getting claims ready for a long time. It remains fairly unchanged and encompasses the management of people, processes, and technology across health systems to improve revenue collection. By tying eMeasures to clinical rather than latent claims data, performance issues can be corrected within a few days. That is because the use of EHR data literally “moves the needle” in real time. Beyond claims data, we use EHR clinical data to effect change that meets the required quality measure thresholds.

At present, there is an increased focus on traditional cost monitoring, which informs RCM. This typically happens at the service line and department level, not at the episode-of-care level. Although direct, indirect, fixed, and ad-hoc costs are certainly important and are included, value-based cost control and reduction efforts must focus on the clinical processes, just like the quality performance components. Both will require tracking costs and quality across the entire continuum of care, constantly analyzing performance and applying adjustments. And the revenue cycle is a significant piece of this. So the discipline and techniques needed for RCM can certainly inform a health system’s approach to fee-for-value focused management.

Do you see this as the start of offering even more Managed Services offerings?
Yes. We are now working on another offering – it’s in the packaging stages – around Meta-Data Management. Stay tuned for more details later this year.

Can Big Data Do What Vendors Claim?

Posted on December 6, 2013 | Written By Anne Zieger

There’s no doubt about it — the air is ringing with the sounds of vendors promising big things from big data, from population health to clinical support to management of bundled payments. But can they really offer these blessings? According to enterprise health IT architect Michael Planchart (known to many as @theEHRGuy), there’s a lot of snake oil being sold.

In his experience, many of the experts on what he calls Big Bad Data either weren’t in healthcare or had never touched healthcare IT until the big data trend hit the industry. And they’re pitching the big data concept to providers that aren’t ready, he says:

  • Most healthcare providers haven’t been collecting data in a consistent way with a sound data governance model.
  • Most hospitals have paper charts that collect data in unstructured and disorganized ways.
  • Most hospitals — he asserts — have spent millions or even billions of dollars on EMRs but have been unable to implement them properly. (And those that have succeeded have done so in “partial and mediocre ways,” he says.)

Given these obstacles,  where is big data going to come from today? Probably not the right place, he writes:

Well, some geniuses from major software vendors thought they could get this data from the HL7 transactions that had been moving back and forth between systems.  Yes, indeed.  They used some sort of “aggregation” software to extract this data out of HL7 v2.x messages.  What a disaster!  Who in their sane mind would think that transactional near real time data could be used as the source for aggregated data?
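
For readers who haven’t worked with HL7 v2.x, here is a minimal sketch of the pattern being criticized: pulling “analytics” fields out of pipe-delimited transactional messages by position. The message and field positions are illustrative, and the fragility of hard-coding them across senders and versions is exactly the problem.

```python
# Illustrative ADT message; real feeds vary by sender, version and interface.
hl7_message = "\r".join([
    "MSH|^~\\&|SENDING_APP|HOSP|RECEIVER|HIE|201312061200||ADT^A01|123|P|2.3",
    "PID|1||12345^^^HOSP^MR||DOE^JANE||19700101|F",
    "PV1|1|I|ICU^101^A",
])

segments = {line.split("|")[0]: line.split("|") for line in hl7_message.split("\r")}

# Hard-coding field positions, as below, is where these "aggregation"
# projects tend to go wrong: positions and vocabularies drift between systems.
patient_id = segments["PID"][3].split("^")[0]
patient_class = segments["PV1"][2]      # "I" = inpatient in this particular feed
print(patient_id, patient_class)
```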

As Planchart sees it, institutions need quality, pertinent, relevant and accurate data, not coarsely aggregated data from any of the sources hospitals and providers have. Instead of rushing into big data deals, he suggests that CIOs start collecting discrete, relevant and pertinent data within their EMRs, a move which will pay off over the next several years.

In the meantime, my colleague John Lynn suggests, it’s probably best to focus on “skinny data” — a big challenge in itself given how hard it can be to filter out data “noise” — rather than aggregate a bunch of high-volume data from all directions.