
Making Healthcare Data Useful

Posted on May 14, 2018 | Written By

The following is a guest blog by Monica Stout from MedicaSoft

At HIMSS18, we spoke about making health data useful to patients with the Delaware Health Information Network (DHIN). Useful data for patients is one piece of the complete healthcare puzzle. Providers also need useful data to provide more precise care to patients and to reach patient populations who would benefit directly from the insights they gain. Payers want access to clinical data, beyond just claims data, to aggregate data historically. This helps payers define which patients should be included in care coordination programs or who should receive additional disease management assistance or outreach.

When you’re a provider, hospital, health system, health information exchange, or insurance provider and have the data available, where do you start? It’s important to start at the source of the data to organize it in a way that makes insights and actions possible. Having the data is only half of the solution for patients, clinicians or payers. It’s what you do with the data that matters and how you organize it to be usable. Just because you may have years of data available doesn’t mean you can do anything with it.

Historically, healthcare has seen many barriers to marrying clinical and claims data. Things like system incompatibility, poor data quality, or siloed data can all impact organizations’ ability to access, organize, and analyze data stores. One way to increase the usability of your data is to start with the right technology platform. But what does that actually mean?

The right platform starts with a data model that is flexible enough to support a wide variety of use models. It makes data available via open, standards-based APIs. It organizes raw data into longitudinal records. It includes services, such as patient matching and terminology mapping, that make it easy to use the data in real-world applications. The right platform transforms raw data into information that helps providers and payers improve outcomes and manage risk, and gives patients a more complete view of their overall health and wellness.
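
To make that a little more concrete, here is a minimal sketch of what consuming such a platform could look like: an application pulling a patient’s longitudinal record through an open, FHIR-style API. The base URL, patient ID, and resource choices are illustrative assumptions, not any particular vendor’s interface.

```python
# A minimal sketch of pulling a longitudinal record through an open,
# FHIR-style API. The base URL, patient ID, and resource choices are
# illustrative assumptions, not any specific vendor's interface.
import requests

BASE_URL = "https://platform.example.org/fhir"  # hypothetical endpoint
PATIENT_ID = "12345"                            # hypothetical patient

def fetch_resources(resource_type):
    """Return all resources of one type linked to the patient."""
    resp = requests.get(
        f"{BASE_URL}/{resource_type}",
        params={"patient": PATIENT_ID},
        timeout=10,
    )
    resp.raise_for_status()
    bundle = resp.json()
    return [entry["resource"] for entry in bundle.get("entry", [])]

# Assemble a simple longitudinal view from several source feeds.
longitudinal_record = {
    "conditions": fetch_resources("Condition"),
    "medications": fetch_resources("MedicationRequest"),
    "observations": fetch_resources("Observation"),
}
print({section: len(items) for section, items in longitudinal_record.items()})
```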

Do you struggle with making your data insightful and actionable? What are you doing to transform your data? Share your insights, experiences, challenges, and thoughts in the comments or with us on Twitter @MedicaSoftLLC.

About Monica Stout
Monica is a HIT teleworker in Grand Rapids, Michigan by way of Washington, D.C., who has consulted at several government agencies, including the National Aeronautics and Space Administration (NASA) and the U.S. Department of Veterans Affairs (VA). She’s currently the Marketing Director at MedicaSoft. Monica can be found on Twitter @MI_turnaround or @MedicaSoftLLC.

About MedicaSoft
MedicaSoft designs, develops, delivers, and maintains EHR, PHR, and UHR software solutions and HISP services for healthcare providers and patients around the world. MedicaSoft is a proud sponsor of Healthcare Scene. For more information, visit www.medicasoft.us or connect with us on Twitter @MedicaSoftLLC, Facebook, or LinkedIn.

Improving Data Outcomes: Just What The Doctor Ordered

Posted on May 8, 2018 | Written By

The following is a guest blog post by Dave Corbin, CEO of HULFT.

Health care has a data problem. Vast quantities of data are generated, but inefficiencies around sharing, retrieval, and integration have acute repercussions in an environment of squeezed budgets and growing patient demands.

The sensitive nature of much of the data being processed is a core issue. Confidential patient information has traditionally encouraged a ‘closed door’ approach to data management and an unease over hyper-accessibility to this information.

Compounding the challenge is the sheer scale and scope of the typical health care environment and its myriad departmental layers. The mix of new and legacy IT systems used for everything from billing records to patient tracking often means deep silos and poor data connections, the cumulative effect of which undermines decision-making. As delays become commonplace, this ongoing battle to coordinate disparate information manifests itself in many different ways in a busy hospital.

Optimizing bed occupancies – a data issue?

One example involves managing bed occupancy, a complex task which needs multiple players to be in the loop on the latest developments in a patient’s admission or discharge status. Anecdotal evidence points to a process often managed manually, with competing information fed back from different sources. Nurses at the end of their shift may report that a patient is about to be discharged, unaware that a doctor has since requested more tests for that patient. As everyone is left waiting for the results from the laboratory, the planned changeover of beds is delayed with many knock-on effects, increasing congestion and costs and frustrating staff and patients in equal measure.

How data is managed becomes a critical factor in tackling the variations that creep into critical processes and resource utilization. In the example above, harnessing predictive modelling and data mining to forecast patient discharges, so that the number of beds available for the coming weeks can be estimated more accurately, will no doubt become an increasingly mainstream option for the sector.
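
As a rough illustration of the idea, the sketch below forecasts daily discharges from historical data using a simple day-of-week average. The file name, column names, and model choice are assumptions made for the example; a production system would use richer features and a proper forecasting model.

```python
# A rough sketch of forecasting daily discharges so bed availability for the
# coming week can be estimated. File name, column names, and the simple
# day-of-week model are assumptions made for illustration only.
import pandas as pd

# Historical discharge events: one row per discharge with a timestamp.
events = pd.read_csv("discharges.csv", parse_dates=["discharge_time"])

# Count discharges per calendar day.
daily = (
    events.set_index("discharge_time")
          .resample("D")
          .size()
          .rename("discharges")
          .to_frame()
)

# Very simple seasonal model: average discharges per weekday over the
# last eight weeks becomes the forecast for the coming week.
recent = daily.iloc[-56:]
weekday_avg = recent.groupby(recent.index.dayofweek)["discharges"].mean()

next_week = pd.date_range(daily.index.max() + pd.Timedelta(days=1), periods=7)
forecast = {day.date(): round(weekday_avg[day.dayofweek], 1) for day in next_week}

# Beds expected to free up each day (before counting new admissions).
print(forecast)
```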

Predictive analytics is great and all, but first….

Before any of this can happen, health care organizations need a solid foundation of accessible and visible data which is centralized, intuitive, and easy to manage.

By providing a holistic approach to data transfer and integration, data logistics can help deliver security, compliance, and seamless connectivity, speeding up the processing of large volumes of sensitive material such as electronic health records, the kind of data that simply cannot be lost. It can also ensure the reliable and secure exchange of information with outside health care vendors and partners.

For better data outcomes, we’re calling for a new breed of data logistics that’s intuitive and easy to use. Monitoring interfaces that let anyone with permission to access the network see which integrations and transfers are running in real time, with no programming or coding required, open up data management to a far wider section of an organization.

Collecting data across a network of multiple transfer and integration activities and putting it in a place where people can use, manage, and manipulate it becomes central to breaking down the barriers that have long compromised efficiency in the health care sector.
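
The sketch below shows what that kind of read-only monitoring view could look like in code. The endpoint, token, and JSON shape are hypothetical placeholders for illustration, not HULFT’s actual interface.

```python
# A minimal sketch of the monitoring view described above: a read-only look
# at which transfers and integrations are running right now. The endpoint,
# token, and JSON shape are hypothetical placeholders, not HULFT's actual API.
import requests

MONITOR_URL = "https://data-logistics.example.org/api/jobs"  # hypothetical

def list_running_jobs(token):
    """Print every transfer or integration job currently in flight."""
    resp = requests.get(
        MONITOR_URL,
        params={"status": "running"},
        headers={"Authorization": f"Bearer {token}"},  # permission-scoped access
        timeout=10,
    )
    resp.raise_for_status()
    for job in resp.json().get("jobs", []):
        print(f'{job["name"]:<30} {job["source"]} -> {job["target"]} '
              f'(started {job["started_at"]})')

list_running_jobs(token="read-only-token")
```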

HULFT works with health care organizations of all sizes to establish a strong back-end data infrastructure that makes front-end advances possible. Learn how one medical technology pioneer used HULFT to drive operational efficiencies and improve quality assurance in this case study.

Dave Corbin is CEO of HULFT, a comprehensive data logistics platform that allows IT to find, secure, transform and move information at scale. HULFT is a proud sponsor of Health IT Expo, a practical innovation conference organized by Healthcare Scene. Find out more at hulftinc.com.

Health Leaders Say Automating Patient Engagement Efforts Will Have Major Impact

Posted on March 12, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Over the last few years, many vendors have rolled out products designed to engage patients further in their care. According to a new study, these solutions may be just the tip of the iceberg. In fact, many healthcare executives see patient-facing, engagement-enhancing technology as critical to the future of healthcare.

The study, by the World Business Group, focuses on technology that can link patients with care in between visits to their primary care center. Patient engagement technologies, which they call “automated care,” have the potential to bridge such gaps by providing AI-based assistance to consumers.

The survey, which was also backed by Conversa Health, drew on direct interviews and survey responses from 134 health execs. The researchers looked at how those execs viewed automated healthcare technologies, how these technologies might be used and whether respondents plan to adopt them.

Respondents were clearly very enthusiastic about these tools. Nearly all (98%) said they believed automated healthcare will be important in creating a continuous, collaborative relationship with providers. The survey also found that 87% of respondents felt that automated healthcare will be helpful in getting patients to engage with their own care.

The next step, of course, is throwing resources at the problem — and it’s happening. Seventy-nine percent of survey respondents said they expected to work on integrating automated healthcare in their organization within the next two years. Also, 44% said that they had a chief patient experience officer or other executive with an equivalent title on board within their organization. This development is fairly new, however, as 40% of these organizations said that the role has existed for two years or less.

Meanwhile, several respondents felt that automating patient healthcare could generate a positive feedback loop. Forty-nine percent of those surveyed reported that they were either integrating or had already integrated patient-generated health data, which they expect, in turn, to feed into their patient experience efforts.

Among organizations working with patient-generated health data, 73% were gathering patient health histories, 64% treatment histories, 59% lifestyle or social data, 52% symptoms data, and 32% biometric data.

Thirty percent said they were beginning to integrate such data and collect it effectively, 28% said they could collect some PGHD but had trouble integrating it with their systems, 14% said they were just beginning to collect such data, and 9% said they were not able to collect this data at all. Just 19% reported they were fully able to collect and integrate PGHD and use it to improve patient experiences.

All told, it appears we’re on the cusp of a major change in the role patient services play in provider outreach. It will probably be a few more years before we have a good idea of where all this is headed, but my guess is that it’s heading somewhere useful.

Yale New Haven Hospital Partners With Epic On Centralized Operations Center

Posted on February 5, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Info, info, all around, and not a place to manage it all. That’s the dilemma faced by most hospitals as they work to leverage the massive data stores they’re accumulating in their health IT systems.

Yale New Haven Hospital’s solution to the problem is to create a centralized operations center which connects the right people to real-time data analytics. Its Capacity Command Center (nifty alliteration, folks!) was created by YNHH, Epic and the YNHH Clinical Redesign Initiative.

The Command Center project comes five years into YNHH’s long-term High Reliability project, which is designed to prepare the institution for future challenges. These efforts are focused not only on care quality and patient safety but also on managing what YNHH says are the highest patient volumes in Connecticut. Its statement also notes that with transfers from other hospitals increasing, the hospital is seeing a growth in patient acuity, which is obviously another challenge it must address.

The Capacity Command Center’s functions are fairly straightforward, though they must have been a beast to develop.

On the one hand, the Center offers technology which sorts through the flood of operational data generated by and stored in its Epic system, generating dashboards which change in real time and drive process changes. These dashboards present real-time metrics such as bed capacity, delays for procedures and tests and ambulatory utilization, which are made available on Center screens as well as within Epic.
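
To give a flavor of the kind of metric such dashboards surface, here’s a minimal sketch that replays a stream of admit/discharge/transfer events into a per-unit bed-occupancy count. The event shape, unit names, and bed totals are hypothetical; this is not Epic’s or YNHH’s actual data model.

```python
# A minimal sketch of one dashboard metric: replaying a stream of ADT
# (admit/discharge/transfer) events into per-unit bed occupancy. The event
# shape, unit names, and bed counts are hypothetical, not Epic's data model.
from collections import defaultdict

TOTAL_BEDS = {"MICU": 36, "SICU": 24, "MedSurg-7W": 32}  # hypothetical units

def occupied_beds(events):
    """Replay admit/transfer/discharge events and count occupied beds per unit."""
    location = {}  # patient_id -> current unit
    for e in sorted(events, key=lambda ev: ev["time"]):
        if e["type"] in ("admit", "transfer"):
            location[e["patient_id"]] = e["unit"]
        elif e["type"] == "discharge":
            location.pop(e["patient_id"], None)
    counts = defaultdict(int)
    for unit in location.values():
        counts[unit] += 1
    return counts

def print_capacity(events):
    occ = occupied_beds(events)
    for unit, total in TOTAL_BEDS.items():
        used = occ.get(unit, 0)
        print(f"{unit:<12} {used}/{total} beds occupied ({used / total:.0%})")

# Example: two admissions and one discharge leave one bed occupied in MICU.
print_capacity([
    {"time": 1, "type": "admit", "patient_id": "A", "unit": "MICU"},
    {"time": 2, "type": "admit", "patient_id": "B", "unit": "MICU"},
    {"time": 3, "type": "discharge", "patient_id": "B", "unit": "MICU"},
])
```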

In addition, YNHH has brought representatives from all of the relevant operational areas into a single physical location, including bed management, the Emergency Department, nursing staffing, environmental services and patient transport. Not only is this a good approach overall, it’s particularly helpful when patient admissions levels climb precipitously, the hospital notes.

This model is already having a positive impact on the care process, according to YNHH’s statement. For example, it notes, infection prevention staffers can now identify all patients with Foley catheters and review their charts. With this knowledge in hand, these staffers can discuss whether the patient is ready to have the catheter removed and avoid the urinary tract infections associated with prolonged use.
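
A simple worklist like that could be generated with logic along these lines. The record shape, field names, and two-day threshold are illustrative assumptions, not the hospital’s actual flowsheet structure.

```python
# A minimal sketch of the worklist behind that workflow: flag patients whose
# Foley catheter has been in place beyond a review threshold. The record
# shape, field names, and two-day threshold are illustrative assumptions.
from datetime import date, timedelta

device_records = [
    {"patient_id": "A1", "device": "Foley catheter",
     "inserted": date(2018, 1, 28), "removed": None},
    {"patient_id": "B2", "device": "Foley catheter",
     "inserted": date(2018, 1, 20), "removed": date(2018, 1, 25)},
]

def foley_review_list(records, today, threshold_days=2):
    """Return patients with an active Foley in place longer than the threshold."""
    return [
        r for r in records
        if r["device"] == "Foley catheter"
        and r["removed"] is None
        and (today - r["inserted"]) > timedelta(days=threshold_days)
    ]

for r in foley_review_list(device_records, today=date(2018, 2, 1)):
    print(r["patient_id"], "has had a Foley since", r["inserted"], "- review for removal")
```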

I don’t know about you, but I was excited to read about this initiative. It sounds like YNHH is doing exactly what it should do to get more out of patient data. For example, I was glad to read that the dashboard offered real-time analytics options rather than one-off projections from old data. Bringing key operational players together in one place makes great sense as well.

Of course, not all hospitals will have the resources to pull off something like this. YNHH is a 1,541-bed giant which had the cash to take on a command center project. Few community hospitals would have the staff or money to make such a thing happen. Still, it’s good to see somebody at the cutting edge.

The 4 P’s of Innovation in Health Science

Posted on January 31, 2018 | Written By

Sunny is a serial entrepreneur on a mission to improve quality of care through data science. Sunny’s last venture, docBeat, a healthcare care coordination platform, was successfully acquired by Vocera Communications. Sunny has an impressive track record of Strategy, Business Development, Innovation and Execution in the Healthcare, Casino Entertainment, Retail and Gaming verticals. Sunny has been the Co-Chair of the Las Vegas Chapter of the Akshaya Patra Foundation (www.foodforeducation.org) since 2010.

You’ll never meet anyone who loves health data science more than Prashant Natarajan. He literally wrote the book on the subject (Check out Demystifying Big Data and Machine Learning for Healthcare to see why I mean literally). He recently gave a presentation on the 4 P’s of Innovation in Health Science which included a slide laying out all four P’s.

Sadly, I couldn’t find a recording of his presentation. However, this slide puts health data science in perspective. Prashant boiled it down to 4 simple points. The problem is that too many healthcare organizations are unable to really execute all 4 P’s in their health science innovation efforts.

No doubt each of these 4 P’s is challenging, but the most challenging one I see today is the first P: People.

I’m not sure of all the ways that Prashant addresses the people problem, but it’s somewhat ironic that people are the biggest problem with health science innovation. I see the challenge as twofold. First, people who have the health science mindset are hard to find. Competition for people with these skills is fierce, and many of them don’t want to get into healthcare, which is complex, regulated, and often behind.

The second major health science challenge revolves around the people who collect, aggregate, and enter the data. It’s easy for a front-line person not to care about the downstream effects of entering poor-quality data, not to mention the importance of being consistent in what is entered and how it is entered.

It’s somewhat a part of human nature for us to jury-rig a solution to the problem we face. Those workaround solutions wreak havoc downstream on your data science efforts. I recently heard the example of a hospital where staff always chose Mongolian for some field because it was a value that would never be used otherwise. The culture of the hospital just knew this is what you did. Once the data scientists started looking at the data, they wondered why this Mongolian population kept coming up in their results. Every healthcare organization has its own “Mongolian” workaround that wreaks havoc on data science.
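
Detecting that kind of workaround is often straightforward once someone looks. As a sketch, the check below flags any value that dominates a categorical field more than expected; the file, column name, and 20% threshold are assumptions made for illustration.

```python
# A small sketch of how a data scientist might surface "Mongolian-style"
# workaround defaults: flag any value that dominates a categorical field far
# more than expected. The file, column name, and 20% threshold are assumptions.
import pandas as pd

def flag_suspect_defaults(df, column, max_share=0.20):
    """Return values whose share of a categorical column exceeds max_share."""
    shares = df[column].value_counts(normalize=True)
    suspects = shares[shares > max_share]
    for value, share in suspects.items():
        print(f"'{value}' accounts for {share:.0%} of '{column}' -- "
              "possible workaround default worth investigating")
    return suspects

# Hypothetical registration extract with a 'preferred_language' column.
registrations = pd.read_csv("registrations.csv")
flag_suspect_defaults(registrations, "preferred_language")
```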

What do you think of these 4 Ps of Innovation in Health Science? Is there something missing? Do you see one of these as more important than another?

Merged Health Systems Face Major EHR Integration Issues

Posted on January 2, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Pity the IT departments of Advocate Health Care and Aurora Health Care. When the two health systems complete their merger, IT leaders will face either a lengthy integration process cutting across systems from three different EHR vendors or a forklift upgrade of at least one of them.

It’s tough enough to integrate different instances of systems from the same vendor, which, despite their common origin, are often configured in significantly different ways. In this case, the task is exponentially more difficult. According to Fierce Healthcare, when the two organizations come together, they’ll have to integrate Aurora’s Epic EHR with the Cerner and Allscripts systems used by Advocate.

As part of his research, the reporter asked an Aurora spokesperson whether the health systems would attempt to pull the three platforms together into a single EHR. Of course, as we know, that is unlikely to ever happen. While full interoperability is obviously elusive, getting some decent data flow between two affiliated organizations is probably far more realistic.

Instead, depending on what happens, the new CIO might or might not decide to migrate all three EHRs onto one from a single vendor. While this could turn out to be a hellish job, it certainly is the ideal situation if you can afford to get there. However, that doesn’t mean it’s always the best option, especially as health system mergers and acquisitions get bigger and bigger.

To me, however, the big question around all of this is how much the two organizations would spend to bring the same platform to everyone. As we know, acquiring and rolling out Epic for even one health system is fiendishly expensive, to the point where some have been forced to report losses or have had their bond ratings reduced.

My guess is that the leaders of the two organizations are counting on often-cited merger benefits such as organizational synergies, improved efficiency and staff attrition to meet the cost of health IT investments like these. If academic studies prove this works, please feel free to slap me with a dead fish, but for now I doubt it will happen.

No, to me this offers an object lesson in how mergers in the health IT-centered world can be more costly, take longer to achieve, and possibly have a negative impact on patient care if things aren’t done right (which often seems to be the case).

Given the other pressures health systems face, I doubt these new expenses will hold them back from striking merger deals. Generally speaking, most health systems face little choice but to partner and merge as they can. But there’s no point minimizing how much complexity and expense EHRs bring to such agreements today.

Pennsylvania Health Orgs Agree to Joint $1 Billion Network Dev Effort

Posted on December 27, 2017 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

If the essence of deal-making is putting your money where your mouth is, a new agreement between Pennsylvania healthcare giants fits the description. They’ve certainly bitten off a mouthful.

Penn State Health and Highmark Health have agreed to make a collective investment of more than $1 billion. That is a pretty big number to swallow, even for two large organizations, though it very well may take even more to develop the kind of network they have in mind.

The two are building out what they describe as a “community-based healthcare network,” which they’re designing to foster collaboration with community doctors and keep care local across their service areas. Makes sense, though the initial press release doesn’t do much to explain how the two are going to make that happen.

The agreement between Penn State and Highmark includes efforts to support population health, the next step in accepting value-based payment. The investors’ plans include the development of population health management capabilities and the use of analytics to manage chronic conditions. Again, pretty much to be expected these days, though their goals are more likely to actually be met given the money being thrown at the problem.

That being said, one aspect of this deal that may be of particular interest is its inclusion of a regionally-focused academic medical center. Penn State plans to center its efforts around teaching hospital Milton S. Hershey Medical Center, a 548-bed hospital affiliated with more than 1,100 clinicians. In my experience, too few agreements take full advantage of such hospitals’ skills in their zeal to spread their arms around large areas, so involving the Medical Center might offer extra benefits to the agreement.

Highmark Health, for its part, is an ACO which encompasses healthcare businesses serving almost 50 million consumers across all 50 states. Clearly, an ACO with national reach has every reason in the world to make this kind of investment.

I don’t know what the demographics of the Penn State market are, but one can assume a few things about them, given the big bucks the pair are throwing at the deal:

  • That there are a lot of well-insured consumers in the region, which will help generate a return on the huge investment the players are making
  • That community doctors are substantially independent, but the two allies are hoping to buy a bunch of practices and solidify their network
  • That prospective participants in the network are lacking the IT tools they need to make value-based schemes work, which is why, in part, the two players need to spend so heavily

I know that ACOs and healthcare systems are already striking deals like this one. If you’re part of a health system hoping to survive the next generation of reimbursement, big budgets are necessary, as are new strategies better adapted to value-based reimbursement.

Still, this is a pretty large deal by just about any measure. If it works out, we might end up with new benchmarks for building better-distributed healthcare networks.

Breaking Bad: Why Poor Patient Identification is Rooted in Integration, Interoperability

Posted on December 20, 2017 | Written By

The following is a guest blog post by Dan Cidon, Chief Technology Officer, NextGate.

The difficulty surrounding accurate patient ID matching is rooted in interoperability and integration.

Coordinated, accountable, patient-centered care is reliant on access to quality patient data. Yet healthcare continues to be daunted by software applications and IT systems that don’t communicate or share information effectively. Health data, spread across multiple source systems and settings, breeds encumbrances in the reconciliation and de-duplication of patient records, leading to suboptimal outcomes and avoidable costs of care. For organizations held prisoner by their legacy systems, isolation and silo inefficiencies worsen as IT environments become increasingly complex and the volume and speed at which health data is generated continue to grow.

A panoramic view of individuals across the enterprise is a critical component for value-based care and population health initiatives. Accurately identifying patients, and consistently matching them with their data, is the foundation for informed clinical decision-making, collaborative care, and healthier, happier populations. As such, the industry has seen a number of high-profile initiatives in the last few years attempting to address the issue of poor patient identification.

The premature end of CHIME’s National Patient ID Challenge last month should be a sobering industry reminder that a universal solution may never be within reach. However, the important lesson emanating in the wake of the CHIME challenge is that technology alone will not solve the problem. Ultimately, the real challenge of identity management and piecing together a longitudinal health record has to do with integration and interoperability. More specifically, it revolves around the demographics and associated identifiers dispersed across multiple systems.

Because these systems often have little reason to communicate with one another, and because they store their data through fragmented architecture, an excessive proliferation of identifiers occurs. The result is unreliable demographic information, triggering further harm in data synchronization and integrity.

Clearly, keeping these identifiers and demographics as localized silos of data is an undesirable model for healthcare that will never function properly. While secondary information such as clinical data should remain local, the core identity of a patient and basic demographics including name, gender, date of birth, address and contact information shouldn’t be in the control of any single system. This information must be externalized from these insulated applications to maintain accuracy and consistency across all connected systems within the delivery network.

Fortunately, there are long-standing and relatively simple standards in place, such as HL7 PIX/PDQ, that allow systems to feed a central demographic repository and query that repository for data. Every year for the past eight years, NextGate has participated in the annual IHE North American Connectathon, the healthcare industry’s largest interoperability testing event. Year after year, we see hundreds of other participating vendors demonstrating that with effective standards, it is indeed possible to externalize patient identity.
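
For readers unfamiliar with what such a query looks like in practice, here is a minimal sketch in the spirit of IHE’s PDQm profile, the FHIR-based flavor of PDQ: a demographic search against a central registry. The registry URL and patient details are hypothetical placeholders, not a specific NextGate endpoint.

```python
# A minimal sketch in the spirit of IHE PDQm (the FHIR-based profile of PDQ):
# search a central demographic registry by basic demographics. The registry
# URL and patient details are hypothetical, not a specific NextGate endpoint.
import requests

REGISTRY_URL = "https://empi.example-hin.org/fhir/Patient"  # hypothetical

def query_demographics(family, given, birthdate):
    """Return Patient resources in the registry matching the demographics."""
    resp = requests.get(
        REGISTRY_URL,
        params={"family": family, "given": given, "birthdate": birthdate},
        timeout=10,
    )
    resp.raise_for_status()
    bundle = resp.json()
    return [entry["resource"] for entry in bundle.get("entry", [])]

# A registration system would run this at intake and link the returned
# enterprise identifier to its local record instead of minting a new one.
matches = query_demographics("Rivera", "Ana", "1984-07-02")
print(f"{len(matches)} candidate record(s) found in the registry")
```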

In the United Kingdom, for example, there has been slow but steady success with the Patient Demographic Service, a relatively similar concept of querying a central repository for demographics and maintaining a global identifier. While implementation of such a national-scale service in the U.S. is unlikely in the near term, the concept of smaller-scale regional registries is clearly an achievable goal. And every deployment of our Enterprise Master Patient Index (EMPI) is a confirmation that such systems can work and do provide value.

What is disappointing is that very few systems in actual practice today will query the EMPI as part of the patient intake process. Many, if not most, of the systems we integrate with will only fulfill half of the bargain: they will feed the EMPI with demographic data and identifiers. This is because many systems have already been designed to produce this outbound communication for purposes other than the management of demographic data. Querying the EMPI for patient identity, by contrast, requires a fundamental paradigm shift for many vendors and a modest investment to enhance their software. Rather than solely relying on their limited view of patient identity, they are expected to query an outside source and integrate that data into their local repository.

This isn’t rocket science, and yet there are so few systems in production today that initiate this simple step. Worse yet, we see many healthcare providers applying band-aids to remedy the deficiency, such as ineffective screen-scraping technology that manually transfers data from the EMPI to their local systems.

With years of health IT standards in place that yield a centralized and uniform way of managing demographic data, the meager pace and progress of vendors to adopt them is troubling. It is indefensible that a modern registration system, for instance, wouldn’t have this querying capability as a default module. Yet, that is what we see in the field time and time again.

With other verticals such as banking and manufacturing leveraging standards-based exchange at a much faster pace, the question becomes: how can healthcare accelerate this type of adoption? As we prepare for the upcoming IHE Connectathon in January, we place our own challenge to the industry to engage in an open and frank dialogue to identify what the barriers are and how vendors can be incentivized, so that patients can benefit from the free flow of accurate, real-time data from provider to provider.

Ultimately, accurate patient identification is a fundamental component to leveraging IT for the best possible outcomes. Identification of each and every individual in the enterprise helps to ensure better care coordination, informed clinical decision making, and improved quality and safety.

Dan Cidon is CTO and co-founder of NextGate, a leader in healthcare identity management, managing nearly 250 million lives for health systems and HIEs in the U.S. and around the globe.

Surescripts Deal Connects EMR Vendors And PBMs To Improve Price Transparency

Posted on November 22, 2017 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

I’m no expert on the pharmacy business, but from where I sit as a consumer it’s always looked to me as though pharmaceutical pricing is something of a shell game. It makes predicting what your airline ticket will cost seem like child’s play.

Yes, in theory, the airlines engage in demand-oriented pricing, while pharma pricing is based on negotiated prices spread among multiple contracted parties, but in either case end-users such as myself have very little visibility into where these numbers are coming from. And in my opinion, at least, that’s not good for anyone involved. You can say “blah blah blah skin in the game” all you want, but co-pays are a poor proxy for making informed decisions as a patient about what benefits you’ll accrue and what problems you’ll face when buying a drug.

Apparently, Surescripts hopes to change the rules to some degree. It just announced that it has come together with two other interest groups within the pharmacy supply chain to offer patient-specific benefit and price information to providers at the point of care.

Its partners in the venture include a group of EMR companies, including Cerner, Epic, Practice Fusion and Aprima Medical Software, which it says represent 53% of the U.S. physician base. It’s also working with two pharmacy benefit managers (CVS Health and Express Scripts) which together cover almost two-thirds of U.S. patients.

The new Surescripts effort actually has two parts: a Real-Time Prescription Benefit tool and an expanded version of its Prior Authorization solution. Used together, and integrated with an EHR, these tools will clarify whether the patient’s health insurance will cover the drug suggested by the provider and offer therapeutic alternatives that might come at a lower price.
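
To picture what a prescriber might see, here is a purely illustrative sketch of a patient-specific benefit check run at the point of prescribing. The function, fields, and values are hypothetical and do not describe the actual Surescripts interface.

```python
# A purely illustrative sketch of the kind of patient-specific benefit check
# an EHR might run at the point of prescribing. The fields and values are
# hypothetical and do not describe the actual Surescripts interface.
def check_benefit(patient_id, ndc, pharmacy_id):
    """Stand-in benefit check; a real integration would query the PBM network."""
    return {
        "covered": True,
        "patient_pay": 45.00,            # estimated out-of-pocket cost
        "prior_auth_required": False,
        "alternatives": [                # therapeutically similar, cheaper options
            {"ndc": "00000-1111-22", "name": "generic equivalent", "patient_pay": 8.00},
        ],
    }

result = check_benefit(patient_id="12345", ndc="99999-8888-77", pharmacy_id="RX001")
if result["alternatives"]:
    cheapest = min(result["alternatives"], key=lambda alt: alt["patient_pay"])
    print(f"Covered: {result['covered']}, est. patient cost ${result['patient_pay']:.2f}; "
          f"cheaper alternative available at ${cheapest['patient_pay']:.2f}")
```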

If you ask me, this is clever but fails to put pressure on the right parties. You don’t have to be a pharmaceutical industry expert to know that middlemen like PBMs and pharmacies use a number of less-than-visible stratagems to jack up drug prices. Patients are forced to just cope with whatever deal these parties strike among themselves.

If you really want to build a network which helps consumers keep prices down, go for some real disclosure. Create a network which gathers and shares price information every time the drug changes hands, up to and including when the patient pays for that drug. This could have a massive effect on drug pricing overall.
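
In data terms, that proposal amounts to recording a price event at every hop in the supply chain and letting anyone compute the markups. The sketch below is purely illustrative; the fields and dollar figures are invented, and no such network or standard is implied to exist.

```python
# A purely illustrative sketch of the price-transparency idea above: record a
# price event every time the drug changes hands, then compute each markup.
# The fields and dollar figures are invented; no existing network is implied.
from dataclasses import dataclass
from datetime import date

@dataclass
class PriceEvent:
    ndc: str          # drug identifier
    seller: str       # manufacturer, wholesaler, PBM, pharmacy...
    buyer: str
    price_paid: float # actual transaction price at this hop
    event_date: date

chain = [
    PriceEvent("99999-8888-77", "manufacturer", "wholesaler", 210.00, date(2017, 11, 1)),
    PriceEvent("99999-8888-77", "wholesaler", "pharmacy", 240.00, date(2017, 11, 8)),
    PriceEvent("99999-8888-77", "pharmacy", "patient", 310.00, date(2017, 11, 20)),
]

# With every hop visible, the markup added at each step is trivial to compute.
for prev, nxt in zip(chain, chain[1:]):
    print(f"{nxt.seller} markup: ${nxt.price_paid - prev.price_paid:.2f}")
```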

Hey, look at what Amazon did just by making costs of shipping low and relatively transparent to end-users. They sucked a lot of the transaction costs out of the process of shipping products, then gave consumers tools allowing them to watch that benefit in action.

Give consumers even one-tenth of that visibility into their pharmacy supply chain, and prices would fall like a hot rock. Gee, I wonder why nobody’s ever tried that. Could it be that pharmaceutical manufacturers don’t want us to know the real costs of making and shipping their product?

CHIME Suspends the $1 Million Dollar National Patient ID Challenge

Posted on November 17, 2017 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

CHIME just announced that they’ve suspended their National Patient ID Challenge. For those not familiar with the challenge, almost 2 years ago CHIME announced a $1 million prize for companies to solve the patient identification and matching problem in healthcare. Here’s the description of the challenge from the HeroX website that hosted the challenge:

The CHIME National Patient ID Challenge is a global competition aimed at incentivizing new, early-stage, and experienced innovators to accelerate the creation and adoption of a solution for ensuring 100 percent accuracy in identifying patients in the U.S. Patients want the right treatment and providers want information about the right patient to provide the right treatment. Patients also want to protect their privacy and feel secure that their identity is safe.

And here’s the “Challenge Breakthrough” criteria:

CHIME Healthcare Innovation Trust is looking for the best plan, strategies and methodologies that will accomplish the following:

  • Easily and quickly identify patients
  • Achieve 100% accuracy in patient identification
  • Protect patient privacy
  • Protect patient identity
  • Achieve adoption by the vast majority of patients, providers, insurers, and other stakeholders
  • Scale to handle all patients in the U.S.

When you look at the fine print, it says CHIME (or the Healthcare Innovation Trust that they started to host the challenge) could cancel the challenge at any time without warning or explanation, including removing the Prize completely:

5. Changes and Cancellation. Healthcare Innovation Trust reserves the right to make updates and/or make any changes to, or to modify the scope of the Challenge Guidelines and Challenge schedule at any time during the Challenge. Innovators are responsible for regularly reviewing the Challenge site to ensure they are meeting all rules and requirements of and schedule for the Challenge. Healthcare Innovation Trust has the right to cancel the Challenge at any time, without warning or explanation, and to subsequently remove the Prize completely.

It seems that CHIME is legally allowed to suspend the challenge. However, that doesn’t mean it doesn’t burn the trust of the community that saw them put out the $1 million challenge. The challenge created a lot of fanfare, including promotion by ONC on their website, which is a pretty amazing thing to even consider. CHIME invested a lot in this challenge, so it must hurt for them to suspend it.

To be fair, when the challenge was announced I hosted a discussion where I asked the question “Is this even solvable?” Does requiring 100% accuracy mean that no one could ever win the challenge? With that in mind, the challenge always felt a bit like Fool’s Gold to me, and I’m sure to many others. I thought, “CHIME could always come back and make the case that no one could ever reach 100% and so they’d never have to pay the money.” Those who participated had to feel this as well, and they participated anyway.

The shameful part to me is how suspending the competition is leaving those who did participate high and dry. I asked CHIME about this and they said that the Healthcare Innovation Trust is still in touch with the finalists and that they’re encouraging them to participate in the newly created “Patient Identification Task Force.” Plus, the participants received an honorarium.

Participation in a CHIME Task Force and the honorarium seems like a pretty weak consolation prize. In fact, I can’t imagine any of the vendors that participated in the challenge would trust working with CHIME going forward. Maybe some of them will swallow hard and join the task force, but that would be a hard choice after getting burnt like this. It’s possible CHIME is offering them some other things in the background as well.

What’s surprising to me is why CHIME didn’t reach out to the challenge participants and say that none of them were going to win, but that CHIME still wanted to promote their efforts and offerings to provide a solid benefit to those that participated. CHIME could present the lessons learned from the challenge and share all the solutions that were submitted and the details of where they fell short and where they succeeded. At least this type of promotion and exposure would be a nice consolation prize for those who spent a lot of time and money participating in the challenge. Plus, the CIOs could still benefit from something that solved 95% of their problems.

Maybe the new Patient Identification Task Force will do this and I hope they do. CHIME did it for their new Opioid Task Force at the Fall Forum when they featured it on the main stage. How about doing the same for the Patient Identification Challenge participants? I think using the chance to share the lessons learned would be a huge win for CHIME and its members. I imagine it’s hard for CHIME to admit “failure” for something they worked on and promoted so much. However, admitting the failure and sharing what was learned from it would be valuable for everyone involved.

While I expect CHIME has burnt at least some of the challenge participants, the CHIME CIO members probably knew the challenge was unlikely to succeed and won’t be burnt by this decision. Plus, the challenge did help call national attention to the issue, which is a good thing and, as they noted, will help continue to push forward national patient identifier efforts in Washington. Maybe now CHIME will do as Andy Aroditis, Founder and CEO of NextGate, suggested in this article where Shaun Sutner first reported on issues with the CHIME National Patient ID Challenge:

Aroditis complained that rather than plunging into a contest, CHIME should have convened existing patient matching vendors, like his company, to collaborate on a project to advance the technology.

“Instead they try to do these gimmicks,” Aroditis said.

I imagine that’s what CHIME would say the Patient Identification Task Force they created will now do. The question is whether CHIME burnt bridges they’ll need to cross to make that task force effective.

The reality is that Patient Identification and Patient Matching is a real problem that’s experienced by every healthcare organization. It’s one that CHIME members feel in their organizations and many of them need better solutions. As Beth Just from Just Associates noted in my discussion when the challenge was announced, $1 million is a drop in the bucket compared to what’s already been invested to solve the problem.

Plus, many healthcare organizations are in denial when it comes to this problem. They may say they have an accuracy of 98%, but the reality is very different when a vendor goes in and wakes them up to what’s really happening in their organization. This is not an easy problem to solve and CHIME now understands this more fully. I hope their new task force is successful in addressing the problem since it is an important priority.