
PointClickCare Tackling Readmissions from Long-Term and Post-Acute Care Facilities Head-On

Posted on January 12, 2018 | Written By

Colin Hung is the co-founder of the #hcldr (healthcare leadership) tweetchat, one of the most popular and active healthcare social media communities on Twitter. Colin speaks, tweets and blogs regularly about healthcare, technology, marketing and leadership. He is currently an independent marketing consultant working with leading healthIT companies. Colin is a member of #TheWalkingGallery. His Twitter handle is @Colin_Hung.

Transitioning from an acute care facility to a long-term/post-acute care (LTPAC) facility can be dangerous.

According to one study, nearly 23% of patients discharged from a hospital to an LTPAC facility had at least one readmission. Research indicates that the leading cause of readmission is harm caused by medication (called an adverse drug event). Studies have shown that as much as 56% of all medication errors happen at a transitional point of care.

By the year 2050, more than 27 million Americans will be using LTPAC services. The majority of these LTPAC patients will transition from an acute care facility at least once each year. With this many transitions, the number of medication errors each year would balloon into the millions. The impact on patients and on the healthcare system itself would be astronomical.

Thankfully, there is a solution: medication reconciliation.

The Agency for Healthcare Research and Quality (AHRQ) states: “Patients frequently receive new medications or have medications changed during hospitalizations. Lack of medication reconciliation results in the potential for inadvertent medication discrepancies and adverse drug events—particularly for patients with low health literacy, or those prescribed high-risk medications or complex medication regimens.”

Medication reconciliation is the process of maintaining an accurate list of the medications a patient is taking at all times. That list is compared against admission, transfer and/or discharge orders at every transition point, both within a facility and between facilities. By comparing orders against existing medications, clinicians and caregivers can prevent drug interactions and complications caused by omissions or dosage discrepancies.
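To make the comparison step concrete, here is a minimal Python sketch of the reconciliation logic. The field names and the simple dose check are illustrative assumptions, not any vendor's actual data model:

```python
# Minimal sketch: compare a patient's current medication list against
# incoming transfer/discharge orders and flag items for clinician review.
def reconcile(current_meds, incoming_orders):
    current = {m["name"].lower(): m for m in current_meds}
    incoming = {o["name"].lower(): o for o in incoming_orders}

    issues = []
    for name, med in current.items():
        if name not in incoming:
            issues.append(("possible omission", med["name"]))
        elif incoming[name]["dose"] != med["dose"]:
            issues.append(("dosage discrepancy", med["name"]))
    for name, order in incoming.items():
        if name not in current:
            issues.append(("new medication - check interactions", order["name"]))
    return issues

home_list = [{"name": "Metformin", "dose": "500 mg"},
             {"name": "Lisinopril", "dose": "10 mg"}]
discharge_orders = [{"name": "Metformin", "dose": "1000 mg"}]

for issue in reconcile(home_list, discharge_orders):
    print(issue)  # flags the dose change and the omitted Lisinopril
```

A real system would match on coded drug identifiers (RxNorm, for example) rather than names, but the compare-and-flag loop is the heart of the process.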

What is surprising is the lack of progress in this area.

We have been talking about interoperability for years in HealthIT. Hundreds of vendors make announcements at the annual HIMSS conference about their ability to share data. Significant investments have been made in Health Information Exchanges (HIEs). Yet despite all of this, there has been relatively little progress made or coverage given to this problem of data exchange between hospitals and LTPAC facilities.

One company in the LTPAC space is working to change that. PointClickCare, one of the largest EHR providers to skilled nursing facilities, home care providers and senior living centers in North America, is dedicating resources and energy to overcoming the challenge of data sharing – specifically for medication reconciliation.

“We are tackling the interoperability problem head-on,” says Dave Wessinger, co-founder and Chief Operating Officer at PointClickCare. “The way we see it, there is absolutely no reason why it can take up to three days for an updated list of medications to arrive at our customer’s facility from a hospital. In that time patients are unnecessarily exposed to potential harm. That’s unacceptable and we are working with our customers and partners to address it.”

Over the past 12 months, the PointClickCare team has made significant progress integrating their platform with other players in the healthcare ecosystem – hospitals, pharmacies, HIEs, ACOs, physician practices and labs. According to Wessinger, PointClickCare is now at a point where they have “FHIR-ready” APIs and web-services.
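Wessinger doesn't spell out what those interfaces look like, but for readers unfamiliar with FHIR, a medication query over a FHIR REST API has roughly this shape. The base URL and patient ID below are placeholders; the search parameters (patient, status) are standard FHIR R4 MedicationRequest parameters:

```python
# Sketch of pulling a patient's active medication orders over FHIR.
import requests

FHIR_BASE = "https://fhir.example-hospital.org/r4"  # hypothetical endpoint

def active_medication_requests(patient_id):
    resp = requests.get(
        f"{FHIR_BASE}/MedicationRequest",
        params={"patient": patient_id, "status": "active"},
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    resp.raise_for_status()
    # The server returns a Bundle; each entry holds one MedicationRequest.
    return [e["resource"] for e in resp.json().get("entry", [])]
```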

“We believe that medication reconciliation is the key to getting everyone in the ecosystem to unlock their data,” continues Wessinger. “There is such a tremendous opportunity for all of us in the healthcare vendor community to work together to solve one of the biggest causes of hospital readmissions.”

Amie Downs, Senior Director of ISTS Info & App Services at Good Samaritan Society, a PointClickCare customer that operates 165 skilled nursing facilities in 24 states, strongly agrees with Wessinger: “We have the opportunity to make medication reconciliation our first big interoperability win as an industry. We need a use-case that shows benefit. I can’t think of a better one than reducing harm to patients while simultaneously preventing costly readmissions. I think this can be the first domino so to speak.”

Having the technology infrastructure in place is just part of the challenge. Getting organizations to agree to share data is a significant hurdle, and once organizations do sit down together, the next challenge is resisting the temptation to simply dump data on each other. Downs summed it up this way:

“What is really needed is for local acute care facilities to partner with local long-term and post-acute care facilities. We need to sit down together and pick the data that we each want/need to provide the best care for patients. We need to stop just sending everything to each other through a direct connection, on some sort of encrypted media that travels with the patient, via fax or physically printed on a piece of paper and then expecting the other party to sort it out.”

Downs goes on to explain how narrowing the scope of data exchange is beneficial: “I definitely see a strong future for CCDA data exchange to help in medication reconciliation. Right now medication information is just appended to the file we receive from acute care facilities. We need to agree on what medication information we really need. Right now, we get the entire medication history of the patient. What we really need is just the active medications that the patient is on.”
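Downs' request is simple to express in code, which is part of her point. A hedged sketch, assuming each record in the received history carries a status field:

```python
# Keep only what the receiving LTPAC facility actually needs:
# the active medications, not the patient's entire drug history.
def active_only(medication_history):
    return [m for m in medication_history if m.get("status") == "active"]

history = [
    {"drug": "Warfarin",    "status": "stopped"},
    {"drug": "Amoxicillin", "status": "completed"},
    {"drug": "Metoprolol",  "status": "active"},
]
print(active_only(history))  # -> only the Metoprolol entry survives
```

The hard part, as she notes, isn't the filter itself; it's getting both sides to agree on which fields matter and to send structured data in the first place.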

In addition to working on FHIR and APIs, BJ Boyle, Director of Product Management at PointClickCare, is also leading a data sharing initiative for those instances when there is no fellow EHR platform to connect to. “We are working towards something that is best described as a ‘Post-Acute Care Cloud’ or ‘PAC Cloud’,” explains Boyle. “We’re designing it so that hospital case managers can go to a single place and get all the information they need from the various SNFs they refer patients to. Today, when HL7 integration isn’t possible, case managers have to be given authorized access to the SNF’s system. That’s not ideal.”

PointClickCare has already taken an initial step towards this vision with an offering called eINTERACT. According to the company’s website, eINTERACT allows for the “early identification of changes in condition…and the sooner a change in condition is identified, the quicker interventions can be implemented to prevent decline and avoid potential transfers,” which is key to managing patient/resident health.

It’s worth noting that John Lynn blogged about LTPAC readmissions in 2014. Unfortunately, at the macro/industry level, not much has changed. Dealing with readmissions from LTPAC facilities is not particularly exciting. Much of the attention remains on consumer monitoring devices, apps and gadgets around the home.

Having said that, I do find it encouraging to see real progress being made by companies like PointClickCare and Good Samaritan Society. I hope to find more examples of practical interoperability that impacts patient care while touring the HIMSS18 exhibit floor in early March. In the meantime, I will be keeping my eye on PointClickCare and the LTPAC space to see how these interoperability initiatives progress.

When It Comes To Meaningful Use, Some Vendors May Have An Edge

Posted on December 1, 2017 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

A new article appearing in the Journal of the American Medical Informatics Association has concluded that while EHRs certified under the meaningful use program should perform more or less equally, they don’t.

After conducting an analysis, researchers found that there were significant associations between specific vendors and level of hospital performance for all six meaningful use criteria they were using as a yardstick. Epic came out on top by this measure, demonstrating significantly higher performance on five of the six criteria.

However, it’s also worth noting that EHR vendor choice by hospitals accounted for anywhere between 7% and 34% of performance variation across the six meaningful use criteria. In other words, researchers found that at least in some cases, EHR performance was influenced as much by the fit between platform and hospital as by the platform itself.

To conduct the study, researchers used recent national data on the certified EHR vendors hospitals had implemented, along with hospital performance on six meaningful use criteria. They sought to find out:

  • Whether certain vendors were found more frequently among the highest performing hospitals, as measured by performance on Stage 2 meaningful use criteria;
  • Whether the relationship between vendor and hospital performance was consistent across the meaningful use criteria, or whether vendors specialized in certain areas; and
  • What proportion of variation in performance across hospitals could be explained by vendor characteristics.

To measure the performance of various vendors, the researchers chose six core Stage 2 meaningful use criteria, including:

  • 60% of medication orders entered using CPOE;
  • 50% of patients provided with the ability to view/download/transmit their health information;
  • medication reconciliation performed for 50% of patients received from another setting or care provider;
  • a summary of care record provided for 50% of patient transitions to another setting or care provider; and
  • a summary of care record electronically transmitted for 10% of patient transitions to another setting or care provider.
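Mechanically, each criterion is a simple threshold test. Here is a small sketch using hypothetical hospital rates against the thresholds listed above (only the five criteria named in the study summary are shown):

```python
# Pass/fail check of Stage 2 thresholds; the rates are made-up examples.
THRESHOLDS = {
    "cpoe_medication_orders":              0.60,
    "patients_view_download_transmit":     0.50,
    "medication_reconciliation_performed": 0.50,
    "summary_of_care_provided":            0.50,
    "summary_of_care_sent_electronically": 0.10,
}

hospital_rates = {
    "cpoe_medication_orders":              0.82,
    "patients_view_download_transmit":     0.41,
    "medication_reconciliation_performed": 0.67,
    "summary_of_care_provided":            0.73,
    "summary_of_care_sent_electronically": 0.12,
}

for criterion, threshold in THRESHOLDS.items():
    status = "met" if hospital_rates[criterion] >= threshold else "NOT met"
    print(f"{criterion}: {status} "
          f"({hospital_rates[criterion]:.0%} vs {threshold:.0%} required)")
```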

After completing their analysis, researchers found that three hospitals were in the top performance quartile for all six meaningful use criteria, and all three used Epic. Of the 17 hospitals in the top performance quartile for five criteria, 15 used Epic, one used MEDITECH and one used another, smaller vendor. Among the 68 hospitals in the top quartile for four criteria, 64.7% used Epic, 11.8% used Cerner and 8.8% used MEDITECH.

When it came to hospitals that were not in the top quartile for any of the criteria, there was no overwhelming connection between vendor and results. For the 355 hospitals in this category, 28.7% used MEDITECH, 25.1% used McKesson, 20.3% used Cerner, 14.4% used MEDHOST and 6.8% used Epic.

All of this being said, the researchers noted that neither hospital characteristics nor vendor choice explained more than a small amount of the performance variation they saw. This won’t surprise anybody who’s seen firsthand how much other issues, notably human factors, can change the outcome of processes like these.

It’s also worth noting that there might be other causes for these differences. For example, a hospital or health system that can afford the notably expensive Epic systems can likely also afford to invest in meaningful use compliance. That added investment could explain hospitals’ meaningful use performance as much as EHR choice does.

Surescripts Deal Connects EMR Vendors And PBMs To Improve Price Transparency

Posted on November 22, 2017 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

I’m no expert on the pharmacy business, but from where I sit as a consumer it’s always looked to me as though pharmaceutical pricing is something of a shell game. It makes predicting what your airline ticket will cost seem like child’s play.

Yes, in theory, the airlines engage in demand-oriented pricing, while pharma pricing is based on negotiated prices spread among multiple contracted parties, but in either case end-users like me have very little visibility into where these numbers come from. And in my opinion, at least, that’s not good for anyone involved. You can say “blah blah blah skin in the game” all you want, but co-pays are a poor proxy for the information patients need to make informed decisions about the benefits they’ll accrue and the problems they’ll face when buying a drug.

Apparently, Surescripts hopes to change the rules to some degree. It just announced that it has come together with two other interest groups within the pharmacy supply chain to offer patient-specific benefit and price information to providers at the point of care.

Its partners in the venture include a group of EMR companies – Cerner, Epic, Practice Fusion and Aprima Medical Software – which it says represent 53% of the U.S. physician base. It’s also working with two pharmacy benefit managers (CVS Health and Express Scripts) which together cover almost two-thirds of U.S. patients.

The new Surescripts effort actually has two parts, a Real-Time Prescription Benefit tool and an expanded version of its Prior Authorization solution.  Used together, and integrated with an EHR, these tools will clarify whether the patient’s health insurance will cover the drug suggested by the provider and offer therapeutic alternatives that might come at a lower price.
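Surescripts hasn’t published the interface itself, so the sketch below is purely illustrative of what a point-of-care benefit check does conceptually: given a proposed drug and the patient’s formulary, return coverage, expected cost and cheaper therapeutic alternatives. Every name and number here is invented:

```python
# Conceptual sketch of a real-time benefit check - not Surescripts' API.
def benefit_check(drug, formulary):
    entry = formulary.get(drug)
    if entry is None:
        return {"covered": False, "alternatives": []}
    # Alternatives: same therapeutic class, lower out-of-pocket cost.
    alternatives = sorted(
        ((name, info["copay"]) for name, info in formulary.items()
         if info["class"] == entry["class"] and info["copay"] < entry["copay"]),
        key=lambda a: a[1],
    )
    return {"covered": True, "copay": entry["copay"], "alternatives": alternatives}

formulary = {
    "Crestor":      {"class": "statin", "copay": 45.00},
    "rosuvastatin": {"class": "statin", "copay": 10.00},
}
print(benefit_check("Crestor", formulary))
# -> covered at $45.00, with rosuvastatin offered at $10.00
```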

If you ask me, this is clever but fails to put pressure on the right parties. You don’t have to be a pharmaceutical industry expert to know that middlemen like PBMs and pharmacies use a number of less-than-visible stratagems to jack up drug prices. Patients are forced to simply cope with whatever deal these parties strike among themselves.

If you really want to build a network which helps consumers keep prices down, go for some real disclosure. Create a network which gathers and shares price information every time the drug changes hands, up to and including when the patient pays for that drug. This could have a massive effect on drug pricing overall.

Hey, look at what Amazon did just by making costs of shipping low and relatively transparent to end-users. They sucked a lot of the transaction costs out of the process of shipping products, then gave consumers tools allowing them to watch that benefit in action.

Give consumers even one-tenth of that visibility into their pharmacy supply chain, and prices would fall like a hot rock. Gee, I wonder why nobody’s ever tried that. Could it be that pharmaceutical manufacturers don’t want us to know the real costs of making and shipping their product?

Waiting For The Perfect “Standard” Is Not The Answer To Healthcare’s Interoperability Problem

Posted on October 16, 2017 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

Have you bought into the “standards will solve healthcare’s interoperability woes” train of thought? Everyone understands that standards are necessary to enable disparate systems to communicate with each other, but as new applications and new uses for data continually appear, healthcare organizations that are waiting for universal standards are not maximizing the value of their data. More importantly, they will be waiting a long time to realize the full potential of their data.

Healthcare interoperability is not just a matter of transferring data as an entire file from one user to another. Instead, effective exchange of information allows each user to select which elements of a patient’s chart are needed, and then access them in a format that enables analysis of different data sets to provide a holistic picture of the patient’s medical history or clinical trends in a population of patients. Healthcare’s interoperability challenge is further exacerbated by different contextual interpretations of the words within those fields. For instance, how many different ways are there to say heart attack?
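That last problem — normalizing many surface forms to one concept — is worth a concrete sketch. The snippet below maps free-text synonyms to a single canonical code; 22298006 is the SNOMED CT concept for myocardial infarction, while the synonym list and lookup function are my own illustration of the idea:

```python
# Map free-text diagnosis strings to one canonical coded concept.
HEART_ATTACK_SYNONYMS = {
    "heart attack", "myocardial infarction", "mi", "acute mi", "ami",
}

def normalize_diagnosis(text):
    if text.strip().lower() in HEART_ATTACK_SYNONYMS:
        return {"system": "http://snomed.info/sct",
                "code": "22298006",
                "display": "Myocardial infarction"}
    return None  # unmapped - route to a terminology service for review

print(normalize_diagnosis("Heart attack"))
```

Production systems lean on full terminology services rather than hand-built synonym sets, but the normalization step itself is exactly this: many strings in, one concept out.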

The development of Health Level Seven’s (HL7®) FHIR® standard, which stands for Fast Healthcare Interoperability Resources, represents a significant step toward interoperability. While the data exchange draft being developed and published by HL7 eliminates many of the complexities of earlier HL7 versions and facilitates real-time data exchange via web technology, publication of Release 4 – the first normative version of the standard – is not anticipated until October 2018.

As these standards are further developed, the key to universal adoption will be simplicity, according to John Lynn, founder of HealthcareScene.com. However, he suggests that CIOs stop waiting for “perfect standards” and focus on how they can best achieve interoperability now.

Even with standards that can be implemented in all organizations, the complexity and diversity of the healthcare environment means that it will take time to move everyone to the same standards. This is complicated by the number of legacy systems and patchwork of applications that have been added to healthcare IT systems in an effort to meet quickly changing needs throughout the organization. Shrinking financial resources for capital investment and increasing competition for IT professionals limits a health system’s ability to make the overall changes necessary for interoperability – no matter which standards are adopted.

Some organizations are turning to cloud-based, managed service platforms to perform the integration, aggregation and harmonization that makes data available to all users – regardless of the system or application in which the information was originally collected. This approach solves the financial and human resource challenges by making it possible to budget integration and data management requirements as an operational rather than a capital investment. This strategy also relieves the burden on in-house IT staff by relying on the expertise of professionals who focus on emerging technologies, standards and regulations that enable safe, compliant data exchange.

How are you planning to scale your interoperability and integration efforts?  If you're waiting for standards, why are you waiting?

As a leading provider of healthcare interoperability solutions, Liaison is a proud sponsor of Healthcare Scene. While the conversation about interoperability has been ongoing for many years, ideas, new technology and new strategies discussed and shared by IT professionals will lead to successful healthcare data exchange that will transform healthcare and result in better patient care.

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

Interoperability: Is Your Aging Healthcare Integration Engine the Problem?

Posted on September 18, 2017 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

There is no shortage of data collected by healthcare organizations that can be used to improve clinical as well as business decisions. We are regularly bombarded with announcements of new technologies that collect patient information, clinical outcome data and operational metrics, all promising to help physicians and hospitals provide better, more cost-effective care.

The problem today is not the amount of data available to help us make better decisions; the problem is the inaccessibility of the data. When different users – physicians, allied health professionals, administrators and financial managers – turn to data for decision support, they find themselves limited to their own silos of information. The inability to access and share data across different disciplines within the healthcare organization prevents the user from making a decision based on a holistic view of the patient or operational process.

In a recent article, Alan Portela points out that precision medicine, which requires “the ability to collect real-time data from medical devices at the moment of care,” cannot happen easily without interoperability – the ability to access data across disparate systems and applications. He also points out that interoperability does not exist yet in healthcare.

Why are healthcare IT departments struggling to achieve interoperability?

Although new and improved applications are adopted on a regular basis, healthcare organizations are just now realizing that their integration middleware can no longer handle new types of data such as social media, today’s data volumes, or the increasing number of real-time connection methods. Their integration platforms also cannot handle the exchange of information with disparate data systems and applications beyond the four walls of the hospital. In fact, hospitals of 500 beds or more average 25 unique data sources, with six electronic medical records systems in use. Those numbers will only move up over time, not down.

Integration engines in place throughout healthcare today were designed well before the explosion of the data-collection tools and digital information that exist today. Although updates and additions to integration platforms have enabled some interoperability, the need for complete interoperability is creating a movement to replace integration middleware with cloud-based managed services.

A study by the Aberdeen Group reveals that 76 percent of organizations will be replacing their integration middleware, and 70 percent of those organizations will adopt cloud-based integration solutions in the next three years.

The report also points out that as healthcare organizations move from an on-premises solution to a cloud-based platform, business leaders see migration to the cloud and managed services as a way to better manage operational expenses on a monthly basis versus large, up-front capital investments. An additional benefit is better use of in-house IT staff members who are tasked with mission critical, day-to-day responsibilities and may not be able to focus on continuous improvements to the platform to ensure its ability to handle future needs.

Healthcare has come a long way in the adoption of technology that can collect essential information and put it in the hands of clinical and operational decision makers. Taking that next step to effective, meaningful interoperability is critical.

As a leading provider of healthcare interoperability solutions, Liaison is a proud sponsor of Healthcare Scene. It is only through discussions and information-sharing among Health IT professionals that healthcare will achieve the organizational support for the steps required for interoperability.

Join John Lynn and Liaison for an insightful webinar on October 5, titled: The Future of Interoperability & Integration in Healthcare: How can your organization prepare?

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

Open Source Tool Offers “Synthetic” Patients For Hospital Big Data Projects

Posted on September 13, 2017 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

As readers will know, using big data in healthcare comes with a host of security and privacy problems, many of which are thorny.

For one thing, the more patient data you accumulate, the bigger the disaster when and if the database is hacked. Another important concern is that if you decide to share the data, there’s always the chance that your partner will use it inappropriately, violating the terms of whatever consent to disclose you had in mind. Then, there’s the issue of working with incomplete or corrupted data which, if extensive enough, can interfere with your analysis or even lead to inaccurate results.

But now, there may be a realistic alternative, one which allows you to experiment with big data models without taking all of these risks. A unique software project is underway which gives healthcare organizations a chance to scope out big data projects without using real patient data.

The software, Synthea, is an open source synthetic patient generator that models the medical history of synthetic patients. It seems to have been built by The MITRE Corporation, a not-for-profit research and development organization sponsored by the U.S. federal government. (This page offers a list of other open source projects in which MITRE is or has been involved.)

Synthea is built on a Generic Module Framework which allows it to model varied diseases and conditions that play a role in the medical history of these patients. The Synthea modules create synthetic patients using not only clinical data, but also real-world statistics collected by agencies like the CDC and NIH. MITRE kicked off the project using models based on the top ten reasons patients see primary care physicians and the top ten conditions that shorten years of life.

Its makers were so thorough that each patient’s medical experiences are simulated independently from their “birth” to the present day. The profiles include a full medical history, which includes medication lists, allergies, physician encounters and social determinants of health. The data can be shared using C-CDA, HL7 FHIR, CSV and other formats.
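As a taste of how a data team might consume that output, here is a hedged Python sketch that tallies resource types across Synthea’s FHIR bundles. The output/fhir directory and one-JSON-bundle-per-patient layout reflect Synthea’s defaults as I understand them, so treat the paths as assumptions:

```python
# Count FHIR resource types across a directory of Synthea patient bundles.
import json
from collections import Counter
from pathlib import Path

resource_counts = Counter()
for bundle_file in Path("output/fhir").glob("*.json"):
    bundle = json.loads(bundle_file.read_text())
    for entry in bundle.get("entry", []):
        resource_counts[entry["resource"]["resourceType"]] += 1

# Typical types: Patient, Encounter, Condition, MedicationRequest, ...
print(resource_counts.most_common(10))
```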

On its site, MITRE says its intent in creating Synthea is to provide “high-quality, synthetic, realistic but not real patient data and associated health records covering every aspect of healthcare.” As MITRE notes, having a batch of synthetic patient data on hand can be pretty, well, handy in evaluating new treatment models, care management systems, clinical support tools and more. It’s also a convenient way to predict the impact of public health decisions quickly.

This is such a good idea that I’m surprised nobody else has done something comparable. (Well, at least as far as I know no one has.) Not only that, it’s great to see the software being made available freely via the open source distribution model.

Of course, in the final analysis, healthcare organizations want to work with their own data, not synthetic substitutes. But at least in some cases, Synthea may offer hospitals and health systems a nice head start.

Healthcare Interoperability and Standards Rules

Posted on September 11, 2017 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Dave Winer is a true expert on standards. I remember coming across him in the early days of social media when every platform was considering some sort of API. To illustrate his early involvement in standards, Dave was one of the early developers of the RSS standard that is now available on every blog and many other places.

With this background in mind, I was extremely fascinated by a manifesto that Dave Winer published earlier this year that he calls “Rules for Standards-Makers.” Sounds like something we really need in healthcare, no?

You should really go and read the full manifesto if you’re someone involved in healthcare standards. However, here’s the list of rules Dave offers standards makers:

  1. There are tradeoffs in standards
  2. Software matters more than formats (much)
  3. Users matter even more than software
  4. One way is better than two
  5. Fewer formats is better
  6. Fewer format features is better
  7. Perfection is a waste of time
  8. Write specs in plain English
  9. Explain the curiosities
  10. If practice deviates from the spec, change the spec
  11. No breakage
  12. Freeze the spec
  13. Keep it simple
  14. Developers are busy
  15. Mail lists don’t rule
  16. Praise developers who make it easy to interop

If you’ve never had to program to a standard, then you might not understand these. However, those who are deep into standards will understand the pitfalls. Plus, you’ll have horror stories about when you didn’t follow these rules and what challenges that caused for you going forward.

The thing I love most about Dave’s rules is that they focus on simplicity and function. Unfortunately, many standards in healthcare are focused on complexity and perfection. Healthcare has nailed the complexity part, and as Dave’s rules highlight, perfection is impossible with standards.

In fact, I skipped over Dave’s first rule for standards makers which highlights the above really well:

Rule #1: Interop is all that matters

As I briefly mentioned in the last CXO Scene podcast, many healthcare CIOs are waiting until the standards are perfect before they worry about interoperability. It’s as if they think that waiting for the perfect standard is going to solve healthcare interoperability. It won’t.

I hope that those building out standards in healthcare will take a deep look at the rules Dave Winer outlines above. We need better standards in healthcare and we need healthcare data to be interoperable.

Is It Time To Put FHIR-Based Development Front And Center?

Posted on August 9, 2017 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

I like to look at questions other people in the #HIT world wonder about, and see whether I have a different way of looking at the subject, or something to contribute to the discussion. This time I was provoked by one asked by Chad Johnson (@OchoTex), editor of HealthStandards.com and senior marketing manager with Corepoint Health.

In a recent HealthStandards.com article, Chad asks: “What do CIOs need to know about the future of data exchange?” I thought it was an interesting question; after all, everyone in HIT, including CIOs, would like to know the answer!

In his discussion, Chad argues that #FHIR could create significant change in healthcare infrastructure. He notes that if vendors like Cerner or Epic publish a capabilities-based API, providers’ technical, clinical and workflow teams will be able to develop custom solutions that connect to those systems.

As he rightfully points out, today IT departments have to invest a lot of time doing rework. Without an interface like FHIR in place, IT staffers need to develop workflows for one application at a time, rather than creating them once and moving on. That’s just nuts. It’s hard to argue with the idea that if FHIR APIs offer uniform data access, everyone wins.
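Chad’s point can be shown in miniature: with a uniform FHIR interface, one function serves every vendor’s system, and only the base URL changes. The endpoints below are placeholders, not real vendor URLs:

```python
# One code path for any FHIR-speaking EHR - no per-vendor workflow rewrite.
import requests

VENDOR_ENDPOINTS = {
    "epic":   "https://epic.example.org/fhir/r4",    # hypothetical
    "cerner": "https://cerner.example.org/fhir/r4",  # hypothetical
}

def fetch_patient(vendor, patient_id):
    resp = requests.get(
        f"{VENDOR_ENDPOINTS[vendor]}/Patient/{patient_id}",
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # a FHIR Patient resource, same shape either way
```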

Far be it from me to argue with a good man like @OchoTex. He makes a good point about FHIR, one which can’t be emphasized enough – that FHIR has the potential to make vendor-specific workflow rewrites a thing of the past. Without a doubt, healthcare CIOs need to keep that in mind.

As for me, I have a couple of responses to bring to the table, and some additional questions of my own.

Since I’m an HIT trend analyst rather than an actual tech pro, I can’t say whether FHIR APIs can or can’t do what Chad is describing, though I have little doubt that Chad is right about their potential uses.

Still, I’d point out that since none other than FHIR project director Grahame Grieve has cautioned us about its current limitations, we probably want to temper our enthusiasm a bit. (I know I’ve made this point a few times here, perhaps ad nauseam, but I still think it bears repeating.)

So, given that FHIR hasn’t reached its full potential, it may be that health IT leaders should invest added time on solving other important interoperability problems.

One example that leaps to mind immediately is solving patient matching problems. This is a big deal: after all, if you can’t match patient records accurately across providers, wrong-patient medical errors are likely to follow.

In fact, according to a study released by AHIMA last year, 72 percent of HIM professionals who responded work on mitigating possible patient record duplicates every week. I have no reason to think things have gotten better. We must find an approach that will scale if we want interoperable data to be worth using.
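For readers who haven’t seen matching logic up close, here is a deliberately simple sketch of deterministic duplicate detection: normalize a few identifying fields into a match key and group records that collide. Real systems also weigh partial agreement probabilistically; the fields and normalization here are illustrative only:

```python
# Group patient records that share a normalized (last name, first
# initial, date of birth) key - candidates for manual reconciliation.
from collections import defaultdict

def match_key(record):
    return (record["last_name"].strip().lower(),
            record["first_name"].strip().lower()[:1],
            record["dob"])

def possible_duplicates(records):
    groups = defaultdict(list)
    for r in records:
        groups[match_key(r)].append(r)
    return [g for g in groups.values() if len(g) > 1]

records = [
    {"last_name": "Smith", "first_name": "John", "dob": "1970-02-01"},
    {"last_name": "SMITH", "first_name": "Jon",  "dob": "1970-02-01"},
]
print(possible_duplicates(records))  # both Smiths land in one group
```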

And patient data matching is just one item on a long list of health data interoperability concerns. I’m sure you’re aware of other pressing problems which could undercut the value of sharing patient records. The question is, are we going to address those problems before we begin full-scale health data exchange? Or does it make more sense to pave the road to data exchange and address bumps in the road later?

The More Hospital IT Changes, The More It Remains The Same

Posted on June 23, 2017 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Once every year or two, some technical development leads the HIT buzzword list, and at least at first it’s very hard to tell whether it will stick. But over time, the technologies that actually work well are subsumed into the industry as it exists, lose their buzzworthy quality and just do their job.

Once in a while, the hot new thing sparks real change — such as the use of mobile health applications — but more often the ideas are mined for whatever value they offer and discarded.  That’s because in many cases, the “new thing” isn’t actually novel, but rather a slightly different take on existing technology.

I’d argue that this is particularly true when it comes to hospital IT, given the exceptionally high cost of making large shifts and the industry’s conservative bent. In fact, other than the (admittedly huge) changes fostered by the adoption of EMRs, hospital technology deployments are much the same as they were ten years ago.

Of course, I’d be undercutting my thesis dramatically if I didn’t stipulate that EMR adoption has been a very big deal. Things have certainly changed dramatically since 2007, when an American Hospital Association study reported that 32% of hospitals had no EMR in place and 57% had only partially implemented their EMR, with only the remaining 11% having implemented the platform fully.

Today, as we know, virtually every hospital has implemented an EMR and integrated it with ancillary systems (some more tightly than others). Not only that, some hospitals with more mature deployments have used EMRs and connected tools to make major changes in how they deliver care.

That being said, the industry is still struggling with many of the same problems it did a decade ago.

The most obvious example of this is the extent to which health data interoperability efforts have stagnated. While hospitals within a health system typically share data with their sister facilities, I’d argue that efforts to share data with outside organizations have made little material progress.

Another major stagnation point is data analytics. Even organizations that spent hundreds of millions of dollars on their EMR are still struggling to squeeze the full value of this data out of their systems. I’m not suggesting that we’ve made no progress on this issue (certainly, many of the best-funded, most innovative systems are getting there), but such successes are still far from common.

Over the longer term, I suspect the shifts in consciousness fostered by EMRs and digital health will gradually reshape the industry. But don’t expect those technology lightning bolts to speed up the evolution of hospital IT. It’s going to take some time for that giant ship to turn.

Google’s DeepMind Rolling Out Bitcoin-Like Health Record Tracking To Hospitals

Posted on May 8, 2017 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Blockchain technology is gradually becoming part of how we think about healthcare data. Even government entities like the ONC and FDA – typically not early adopters – are throwing their hat into the blockchain ring.

In fact, according to recent research by Deloitte, healthcare and life sciences companies are planning the most aggressive blockchain deployments of any industry. Thirty-five percent of Deloitte’s respondents told the consulting firm that they expected to put blockchain into production this year.

Many companies are tackling practical uses of blockchain tech in healthcare. But to me, few are more interesting than DeepMind, a hot AI firm based in the UK that Google acquired a few years ago.

DeepMind has already signed an agreement with a trust within Britain’s National Health Service, under which it accesses patient data for the development of a healthcare app named Streams. Now, it’s launching a new project in partnership with the NHS, in which it will use a new technology based on bitcoin to let hospitals, the NHS and, over time, patients track what happens to personal health data.

The new technology, known as “Verifiable Data Audit,” will create a specialized digital ledger which automatically records every time someone touches patient data, according to British newspaper The Guardian.

In a blog entry, DeepMind co-founder Mustafa Suleyman notes that the system will track not only that the data was used, but also why. In addition, the ledger supporting the audit will be set to append-only, so once the system records an activity, that record can’t be erased.

The technology differs from existing blockchain models in some important ways, however. For one thing, unlike in other blockchain models, Verifiable Data Audit won’t rely on decentralized ledger verification of a broad set of participants. The developers have assumed that trusted institutions like hospitals can be relied on to verify ledger records.

Another way in which the new technology is different is that it doesn’t use a chain infrastructure. Instead, it’s using a mathematical function known as a Merkle tree. Every time the system adds an entry to the ledger, it generates a cryptographic hash summarizing not only that latest ledger entry, but also the previous ledger values.
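The chain property is easier to see in code than in prose. Below is a minimal Python sketch of an append-only, hash-linked audit ledger. It is not DeepMind’s design, and it uses a simple linear chain rather than a Merkle tree, but it shows the key idea: each entry’s hash covers everything recorded before it, so history can’t be silently edited:

```python
# Append-only audit ledger: each entry commits to the previous state.
import hashlib
import json
import time

class AuditLedger:
    def __init__(self):
        self.entries = []
        self.head = "0" * 64  # genesis value

    def append(self, who, what, why):
        record = {"who": who, "what": what, "why": why,
                  "ts": time.time(), "prev": self.head}
        self.head = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()).hexdigest()
        self.entries.append((record, self.head))
        return self.head

    def verify(self):
        prev = "0" * 64
        for record, digest in self.entries:
            recomputed = hashlib.sha256(
                json.dumps(record, sort_keys=True).encode()).hexdigest()
            if record["prev"] != prev or recomputed != digest:
                return False  # someone tampered with the history
            prev = digest
        return True

ledger = AuditLedger()
ledger.append("dr_jones", "viewed renal panel", "direct care")
assert ledger.verify()
```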

DeepMind is also providing a dedicated online interface which participating hospitals can use to review the audit trail compiled by the system in real time. In the future, the company hopes to add automated queries which would “sound the alarm” if data appeared to be compromised.

Though DeepMind does expect to give patients direct oversight over how, where and why their data has been used, it doesn’t expect that to happen for some time, as it’s not yet clear how to secure such access. In the meantime, participating hospitals are getting a taste of the future, one in which patients will ultimately control access to their health data assets.