
Waiting For The Perfect “Standard” Is Not The Answer To Healthcare’s Interoperability Problem

Posted on October 16, 2017 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

Have you bought into the “standards will solve healthcare’s interoperability woes” train of thought? Everyone understands that standards are necessary to enable disparate systems to communicate with each other, but as new applications and new uses for data continually appear, healthcare organizations that are waiting for universal standards are not maximizing the value of their data. More importantly, they will be waiting a long time to realize the full potential of their data.

Healthcare interoperability is not just a matter of transferring data as an entire file from one user to another. Instead, effective exchange of information allows each user to select which elements of a patient’s chart are needed, and then access them in a format that enables analysis of different data sets to provide a holistic picture of the patient’s medical history or clinical trends in a population of patients. Healthcare’s interoperability challenge is further exacerbated by different contextual interpretations of the words within those data fields. For instance, how many different ways are there to say heart attack?
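
To make that last problem concrete, here is a minimal sketch, in Python, of the kind of terminology normalization an interoperability layer has to perform. The synonym table is an invented illustration (the SNOMED CT code shown is the one commonly cited for myocardial infarction); production systems rely on full vocabulary services such as SNOMED CT or ICD-10, not hand-built dictionaries.

```python
from typing import Optional

# Illustrative-only synonym table; a real interoperability layer would use a
# terminology service backed by SNOMED CT or ICD-10 with thousands of concepts.
HEART_ATTACK_SYNONYMS = {
    "heart attack": "22298006",          # SNOMED CT: myocardial infarction
    "myocardial infarction": "22298006",
    "acute myocardial infarction": "22298006",
    "mi": "22298006",
    "ami": "22298006",
}

def normalize_diagnosis(free_text: str) -> Optional[str]:
    """Map a free-text diagnosis to a canonical concept code, if known."""
    return HEART_ATTACK_SYNONYMS.get(free_text.strip().lower())

for phrase in ["Heart Attack", "Acute Myocardial Infarction", "MI"]:
    print(f"{phrase!r} -> {normalize_diagnosis(phrase)}")
```

Only once every source phrasing resolves to the same concept can two systems agree that they are talking about the same condition.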

The development of Health Level Seven’s (HL7®) FHIR®, which stands for Fast Healthcare Interoperability Resources, represents a significant step toward interoperability. While the data exchange draft being developed and published by HL7 eliminates many of the complexities of earlier HL7 versions and facilitates real-time data exchange via web technology, publication of Release 4 – the first normative version of the standard – is not anticipated until October 2018.
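
For a flavor of the web-technology approach FHIR takes: resources are plain JSON retrieved over ordinary HTTP. The Python sketch below fetches a single Patient resource; the server base URL and patient ID are placeholders, not a real endpoint.

```python
import requests

# Placeholder FHIR server and patient ID; substitute a real endpoint.
FHIR_BASE = "https://fhir.example.org/baseR4"
PATIENT_ID = "12345"

resp = requests.get(
    f"{FHIR_BASE}/Patient/{PATIENT_ID}",
    headers={"Accept": "application/fhir+json"},  # FHIR's JSON media type
    timeout=10,
)
resp.raise_for_status()
patient = resp.json()

# FHIR resources are self-describing JSON with structured fields.
print(patient["resourceType"])             # "Patient"
name = patient.get("name", [{}])[0]
print(name.get("family"), name.get("given"))
```

Compare that to parsing pipe-delimited HL7 v2 segments, and the appeal to web developers is obvious.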

As these standards are further developed, the key to universal adoption will be simplicity, according to John Lynn, founder of HealthcareScene.com. However, he suggests that CIOs stop waiting for “perfect standards” and focus on how they can best achieve interoperability now.

Even with standards that can be implemented in all organizations, the complexity and diversity of the healthcare environment mean that it will take time to move everyone to the same standards. This is complicated by the number of legacy systems and the patchwork of applications that have been added to healthcare IT systems in an effort to meet quickly changing needs throughout the organization. Shrinking financial resources for capital investment and increasing competition for IT professionals limit a health system’s ability to make the overall changes necessary for interoperability – no matter which standards are adopted.

Some organizations are turning to cloud-based, managed service platforms to perform the integration, aggregation and harmonization that makes data available to all users – regardless of the system or application in which the information was originally collected. This approach solves the financial and human resource challenges by making it possible to budget integration and data management requirements as an operational rather than a capital investment. This strategy also relieves the burden on in-house IT staff by relying on the expertise of professionals who focus on emerging technologies, standards and regulations that enable safe, compliant data exchange.

How are you planning to scale your interoperability and integration efforts?  If you're waiting for standards, why are you waiting?

As a leading provider of healthcare interoperability solutions, Liaison is a proud sponsor of Healthcare Scene. While the conversation about interoperability has been ongoing for many years, ideas, new technology and new strategies discussed and shared by IT professionals will lead to successful healthcare data exchange that will transform healthcare and result in better patient care.

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

Visible and Useful Patient Data in an Era of Interoperability Failure

Posted on October 13, 2017 | Written By

Healthcare as a Human Right. Physician Suicide Loss Survivor. Janae writes about Artificial Intelligence, Virtual Reality, Data Analytics, Engagement and Investing in Healthcare. Twitter: @coherencemed

Health record interoperability and patient data are debated topics in Health IT. Government requirements and business interests create a complex exchange about who should own data, how it should be used, and who should profit from it. Many find themselves asking what the next steps in innovation are. Patient data, when it is available, is usually not in a format that is visible and useful for patients or providers. The debate about data can distract from progress in making patient data visible and useful.

Improvements in health IT will improve outcomes through better data interpretation and visibility, and increasing the utility of health data is a needed step. Visibility of patient data has been a topic of debate since the creation of electronic health records. This was highlighted in a recent exchange between former Vice President Joe Biden and Judy Faulkner, CEO of Epic Systems.

Earlier this year at the Cancer Moonshot, Faulkner expressed her skepticism about the usefulness of allowing patients access to their medical records. Biden replied, asking Faulkner for his personal health data.

Faulkner was quick to retort, questioning why Mr. Biden wanted his records, reportedly responding, “Why do you want your medical records? There are a thousand pages, of which you understand 10.”

My interpretation of her response: “You don’t even know what you are asking. Do not get distracted by the shiny vendor trying to make money from interpreting my company’s data.”

As reported in Politico, Biden (and really, I think that man can do no wrong) responded, “None of your business.”

In the wake of the Biden–Faulkner exchange, the entire internet constituency of health IT and patient records had an ischemic attack. Since this exchange we’ve gone on to look at interoperability in times of crisis: records needed in Houston, in Puerto Rico, and in other natural disasters. The importance of sharing data, and the scope of what data is useful, remains the same.

It was during what became several months of research on the state of interoperability that I started reading about the Biden and Faulkner exchange. This was not the first time I had read extensively about patient data and whether EHR and EMR data is useful; it just reminded me of the frustrations I’ve heard for years about EHR records being useless. Like many of us, I disappeared down the rabbit hole of tweets about electronic health records for a full day. Patient advocates, STILL frustrated by the lack of cooperation between EHR and EMR vendors, found renewed vigor; they cited valid data. Studies were boldly thrown back, and the exchange included some seriously questionable math and a medium level of personal attack.

Everyone was like: are we STILL on this problem, where very little happens and it’s incredibly complex? How? How do we still not have a system that makes patient data more useful? Others were like: obviously it doesn’t make sense, because A) usefulness in care and B) money.

Some argued that patients just want to get better. Others pointed out that acting as if patients were stupid children not only creates a culture of contempt for providers and vendors alike, but also kills patients. Interestingly, Christina Farr of CNBC reported that the original exchange may have been more civil than originally interpreted.

My personal opinion: Biden obviously knew we needed to talk about patient rights, open data, and interoperability more. It has had more coverage since then. I don’t know Faulkner, but it sounds like a lot of people on Twitter don’t feel like she is very cooperative. She sounds like a slightly savage businesswoman, which for me is usually a positive thing. I met Peter from Epic who works with interoperability and population health and genomics and he was delightful.

Undeniably, there is some validity to Judy’s assertion that the data would not be useful to Biden; EHR and EMR data, at least in the format available from the rare cooperative vendors, is not very useful. It is essentially a digital copy of the paper record. I am willing to bet Biden, much as I adore the guy, didn’t even offer a jump drive on which to store his data. The potential of EHR data visualization to improve patient outcomes needs more coverage. Let’s not focus on the business motivations of parties that don’t want to share their data; let’s look at potential improvements in data usefulness.

The timing was magic: I had just had a conversation about data innovation with Dr. Michael Rothman. An early veteran of the artificial intelligence field, Dr. Rothman worked in data modeling before the AI winter of the 1980s and the current resurgence of investment and popularity; he predates the current buzz cycle of blockchain and artificial-intelligence everything. With many data scientists frustrated by the abandonment of elegant, simple solutions in favor of venture capital and sexy advertising vaporware, it is timely to look at tools that improve outcomes.

In speaking with Dr. Rothman, I was surprised by the cadence of his voice. He asked me what I knew about the history of artificial intelligence, and I asked him to tell his data story. He started by outlining the theory of statistical modeling and the “data dump” approach to neural net modeling. His company, PeraHealth, represents part of the solution for making EMR and EHR data useful to clinicians and patients.

The idea that data alone is going to give you the solution is, in a sense, slightly possible but extremely unlikely. If you look at situations where people have been successful, a lot of human ingenuity goes into selecting and transforming the variables into meaningful forms before building the neural network or deep learning algorithm. Without a framework of understanding, a lot of EHR data is simply a data dump that lacks the clinical knowledge or visualization to provide appropriate scaffolding. You do need ingenuity, and you do need the right data. There are so many problems and complexities with data that innovation and ingenuity in health IT lag behind.
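
A toy illustration of that point, in Python with invented numbers (and not any vendor’s actual method): the same vital-sign series becomes far more informative once a human encodes its shape as features, rather than dumping raw readings into a model.

```python
import numpy as np

# A raw "data dump": one patient's daily heart-rate readings.
heart_rate = np.array([72.0, 75, 79, 84, 90, 97])

# Each reading on its own looks unremarkable; the trend is the signal.
# Human ingenuity: transform the series into meaningful features first.
days = np.arange(len(heart_rate))
features = {
    "latest": heart_rate[-1],
    "mean": heart_rate.mean(),
    "slope_per_day": np.polyfit(days, heart_rate, 1)[0],   # steady rise
    "days_rising": int((np.diff(heart_rate) > 0).sum()),
}
print(features)
```

Fed the raw array, a model must rediscover the trend on its own; fed the engineered features, the upward slope is explicit.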

The question is – is the answer you are looking for in the input data? If you have the answer in the data, you will be able to provide insights based on it. Innovation in healthcare predictions and patient records will come from looking at data sets that are actually predictive of health.

Dr. Rothman’s work in healthcare started with a medical error. His mother had valve replacement surgery and came through in good shape. Although she was initially recovering quickly, she started to deteriorate after a few days, and the problem was that the system made it difficult to see. Each day she was evaluated. Each day her condition was viewed as reasonable given her surgery and age. What the staff couldn’t see was that each day she was getting worse. They couldn’t see the trend. She was discharged, returned to the ED four days later, and died.

As a scientist, he recognized that the hospital staff didn’t have everything they needed to avoid an error like this. He approached the hospital CEO and asked for permission to help them solve the problem. Dr. Rothman explained, “I didn’t feel that the doctors had given poor medical care; this was a failure of the system.”

The hospital CEO did something remarkable: they shared their data. Within a safe system, they allowed an expert in data science to come in and see what he could find in their patient records, rather than telling him he probably wouldn’t understand the printout. The hospital was an early adopter of EHR records, so they were able to look at a long history of data to find what was being missed. Using vital signs, lab tests and, importantly, an overlooked source of data, nursing notes, Dr. Rothman (and his brother) found a way to synthesize a unified score: a single number that captures the overall condition of the patient, is fed from the EMR, and WOULD show a trend. There is an answer if you include the right data.
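
Here is a deliberately simplified sketch, in Python, of what “a single number, fed from the EMR, that shows a trend” can look like. The inputs and weights are invented for illustration and bear no relationship to the actual Rothman Index computation.

```python
# Toy composite score: combine vitals, labs, and a nursing-note flag
# into one number per day, then look at the trend. Weights are invented.

def daily_score(heart_rate, creatinine, nursing_concern):
    score = 100.0
    score -= max(0, heart_rate - 80) * 0.5   # penalize elevated heart rate
    score -= max(0, creatinine - 1.2) * 20   # penalize abnormal labs
    score -= 10 if nursing_concern else 0    # nursing notes carry signal too
    return score

days = [
    (78, 1.0, False),
    (84, 1.3, False),
    (90, 1.6, True),
    (97, 2.0, True),
]
scores = [daily_score(*d) for d in days]
print(scores)  # each day may look "reasonable"; the steady decline does not

if scores[-1] < scores[0] and all(a >= b for a, b in zip(scores, scores[1:])):
    print("Alert: patient condition deteriorating day over day")
```

No single day’s readings scream danger, which is exactly the failure mode in his mother’s case; the trend in the composite number does.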

Doctors and nurses look at a myriad of data and synthesize it to reach an understanding. Judy is right that a layman looking at random pieces of data will not likely gain much understanding, BUT they may. And with more help they might. Certainly, they deserve a chance to look. And certainly, the EMR and EHR companies have an obligation to present the data in some readable form.

Patients should be demanding their data, and they should be demanding that hospitals give them usable data, normalized against their personal history, to help save their lives.

Based on this experience, Michael and Steven built the Rothman Index, a measure of patient health based on analytics that visualizes data found in EHRs. They went on to found PeraHealth, whose software puts the trend line on nursing kiosks and unit screens so staff can see whether any patient is declining. In some health systems, an attending physician can get an alert about patients in danger. The visualization from the record isn’t just a screen by the patient; it is also on physicians’ and nurses’ screens and includes warnings. Providers have time to evaluate what is wrong before it is too late. The data in the health record is made visual and becomes a tool for providers.


Visualization of Patient Status with the Rothman Index and PeraHealth

Is PeraHealth everywhere? Not yet. For every innovation and potential improvement there is a period when slow adopters wait and invest in sure bets. Just as interoperable data isn’t yet an actuality in a system that desperately needs it, this basic step toward improving patient outcomes hasn’t reached every hospital. Scaling implementation of an effective data tool is not always clear to hospital CMIO and CEO teams. The triage of which health IT solution a healthcare system chooses to implement is complex. Change also requires strong collaborative efforts and clear expectations. Often, even if hospital systems know something benefits patients, they don’t have the right framework to implement the solution. They need a strategy for adoption and a strong motivation, and the strongest motivations seem to be financial and outcomes-based. The largest savings for the minimum effort usually takes precedence in adoption. Solutions should also be aligned with end users: if a nurse uses the system, it needs to improve their workflow, not just give them another task.

One of the hospitals successfully collaborating to make patient data more useful and visual is Houston Methodist. I spoke with Katherine Walsh, Chief Nursing Officer at Houston Methodist, about their journey to use EHR data with PeraHealth. She explained it to me: data is the tool; without great doctors and nurses knowing the danger zone, it doesn’t help. This reminded me of Faulkner’s point that not all patient data is useful. Clinical support should be designed around visible data to give better care. Without a plan, data is not actionable. Katherine explained that once nurses could see that the data was useful, they also had to make sure their workflow included timely records. When EHR data is actually being used in the care of patients, data entry workflow suddenly changes. When nurses and doctors can see that their actions are saving lives, they are motivated.
The process of changing their workflow and visualizing patient data did not happen overnight. In the story of Houston Methodist’s adoption of PeraHealth, Walsh said they wanted to make sure they helped doctors and nurses understand what the data meant. “We put large screens on all the units. You can immediately see the patients that are at risk; it’s aggregated by the highest risk factor.” If you are waiting for someone to pull this data up on their desktop, you are waiting for them to search for something. Putting it on the unit where you can see it makes it much easier to round and to get a sense of what is going on. You can always identify what and who is at risk because it’s on a TV screen. The Houston Methodist team showed great leadership in nursing informatics, improving outcomes and using an internal strategy for implementation.

They normalize the variance for each person. A heart rate of 40 for a runner might be normal; then on the next shift 60 seems normal, and at 80 it also seems normal. With a personal baseline, you can tell the system when you want an alert. To help with motivation, Walsh needed to make the impact of PeraHealth visual. They hung 23 hospital gowns around a room, representing the patients they had saved using the system.
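
A minimal sketch of that per-patient normalization, in Python with invented numbers: rather than a fixed population threshold, each new reading is compared to the patient’s own baseline, so the runner’s heart rate of 40 stays quiet while a creep toward 80 raises an alert.

```python
import statistics

def needs_alert(history, new_reading, z_threshold=3.0):
    """Flag a reading that departs sharply from this patient's own baseline."""
    baseline = statistics.mean(history)
    spread = statistics.stdev(history) or 1.0  # guard against zero spread
    z = (new_reading - baseline) / spread
    return z > z_threshold

runner_history = [40, 42, 41, 39, 43]   # low, but normal *for this patient*
print(needs_alert(runner_history, 44))  # False: within personal baseline
print(needs_alert(runner_history, 80))  # True: huge departure from baseline
```
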
The future of electronic health records will be about creating usable data, not just a data dump of fields. Partnerships like these are transforming EHRs from a cost hemorrhage into a life-saving tool. Physicians don’t want another administrative task or another impersonal device. Nurses don’t want to go through meaningless measures and lose track of patients during shift changes. Show them the success they’ve had and let the data help them give great care.

Hospital administrators don’t want another data tool that doesn’t improve patient outcomes but has raised capital on vaporware. Creators don’t want more EHR companies that don’t know how to work with agile partners to create innovation.

The real ingenuity is in understanding – what data do you need? What data do patients need? Who can electronic healthcare record companies partner with to bridge the data divide?

We can bridge the gap of electronic health records that aren’t legible or useful to patients and create tools to save lives. Tools like those from PeraHealth are the result of a collaborative effort to take the data we have and synthesize it and visualize it and let care providers SEE their patients.  This saves lives.

Without this, the data is there, it’s just not usable.

Don’t just give the patients their data, show them their health.

Geisinger Partners With Pharmas To Improve Diabetes Outcomes

Posted on October 10, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Geisinger has struck a deal with Boehringer Ingelheim to develop a risk-prediction model for three of the most common adverse outcomes from type 2 diabetes. The agreement is on behalf of Boehringer’s diabetes alliance with Eli Lilly and Company.

What makes this partnership interesting is that the players involved in this kind of pharma relationship are usually health plans. For example:

  • In May, UnitedHealth Group’s Optum struck a deal to develop reimbursement models in which payment for prescription drugs is better structured to improve outcomes.
  • Earlier this year, Aetna cut a deal with Merck in which the two will use predictive analytics to identify target populations and offer them specialized health and wellness services. The program started by focusing on patients with diabetes and hypertension in the mid-Atlantic US.
  • Another example is the 2015 agreement between Harvard Pilgrim health plan and Amgen, in which the pharma would pay rebates if its cholesterol-control medication Repatha didn’t meet agreed-upon thresholds.

As the two organizations note in their joint press statement, cardiovascular disease is the leading cause of death associated with diabetes, and diabetes is the top cause of kidney failure in the U.S. population. Cardiovascular complications alone cost the U.S. more than $23 billion per year, and roughly 68 percent of deaths in people with type 2 diabetes in the U.S. are caused by cardiovascular disease.

The two partners hope to improve the odds for diabetics by identifying their condition quickly and treating it effectively.

Under the Geisinger/Boehringer agreement, the partners will attempt to predict which adults with type 2 diabetes are most likely to develop kidney failure, undergo hospitalization for heart failure or die from cardiovascular causes.

To improve the health of diabetics, the partners will develop predictive risk models using de-identified EHR data from Geisinger. The goal is to develop more precise treatment pathways for people with type 2 diabetes, and see that the pathways align with quality guidelines.
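
For a sense of the mechanics involved, here is a minimal Python sketch of a risk model of this general kind, built with scikit-learn on fabricated, de-identified-style features; the actual Geisinger/Boehringer models are proprietary and far richer.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Fabricated de-identified features per patient:
# [age, years_with_diabetes, hba1c, egfr]
X = np.array([
    [55,  4,  6.8, 90],
    [67, 12,  8.9, 45],
    [72, 15,  9.4, 30],
    [49,  2,  7.1, 95],
    [60,  9,  8.2, 60],
    [70, 20, 10.1, 25],
])
# 1 = experienced the adverse outcome (e.g., kidney failure), 0 = did not.
y = np.array([0, 1, 1, 0, 0, 1])

model = LogisticRegression(max_iter=1000).fit(X, y)

new_patient = np.array([[65, 10, 9.0, 40]])
risk = model.predict_proba(new_patient)[0, 1]
print(f"Predicted risk of adverse outcome: {risk:.0%}")
```

The predicted probabilities are what feed a treatment pathway: patients above a risk cutoff get flagged for earlier, more aggressive intervention.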

Though this agreement itself doesn’t have a value-based component, it’s likely that health systems like Geisinger will take up health plans’ strategies for lowering spend on medications, as the systems will soon be on the hook for excess spending.

After all, according to a KPMG survey, value-based contracts are becoming a meaningful percentage of health system revenue. The survey found that while value-based agreements aren’t dominant, 36 percent of respondents generated some of their revenue from value-based payments and 14 percent said the majority of revenue is generated by value-based payments.

In the meantime, partnerships like this one may help to improve outcomes for expensive, prevalent conditions like diabetes, high blood pressure, arthritis and heart disease. Expect to see more health systems strike such agreements in the near future.

Predictive Analytics with Andy Bartley from Intel

Posted on September 20, 2017 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

#Paid content sponsored by Intel.

In the latest Healthcare Scene video interview, I talk with Andy Bartley, Senior Solutions Architect in the Health and Life Sciences Group at Intel. Andy and I talk about the benefits of and challenges to using predictive analytics in healthcare.

Andy offers some great insights on the subject, having had a long and varied career in the industry. Before joining Intel, he served in multiple healthcare organizations, including nurse communication and scheduling application startup NurseGrid, primary care practice One Medical Group and medical device manufacturer Stryker.

In my interview, he provides a perspective on what hospitals and health systems should be doing to leverage predictive analytics to improve care and outcomes, even if they don’t have a massive budget. Plus, he talks about predictive analytics that are already happening today.

Here is the list of questions I asked him, if you’d like to skip to a specific topic in the video. Otherwise, you can watch the full video interview in the embedded video at the bottom of this post:

  • What are your thoughts on predictive analytics?
  • How is it changing healthcare as we know it?
  • What examples have you seen of effective predictive analytics?

We look forward to seeing your thoughts in the comments and on social media.

Interoperability: Is Your Aging Healthcare Integration Engine the Problem?

Posted on September 18, 2017 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.
There is no shortage of data collected by healthcare organizations that can be used to improve clinical as well as business decisions. We are bombarded on a regular basis with announcements of new technology that collects the patient information, clinical outcome data and operational metrics that will help a physician or hospital provide better, more cost-effective care.

The problem today is not the amount of data available to help us make better decisions; the problem is the inaccessibility of the data. When different users – physicians, allied health professionals, administrators and financial managers – turn to data for decision support, they find themselves limited to their own silos of information. The inability to access and share data across different disciplines within the healthcare organization prevents the user from making a decision based on a holistic view of the patient or operational process.

In a recent article, Alan Portela points out that precision medicine, which requires “the ability to collect real-time data from medical devices at the moment of care,” cannot happen easily without interoperability – the ability to access data across disparate systems and applications. He also points out that interoperability does not exist yet in healthcare.

Why are healthcare IT departments struggling to achieve interoperability?

Although new and improved applications are adopted on a regular basis, healthcare organizations are just now realizing that their integration middleware can no longer handle new types of data (such as social media), today’s data volumes, or the increasing number of ways to connect in real time. Their integration platforms also cannot handle the exchange of information from disparate data systems and applications beyond the four walls of the hospital. In fact, hospitals of 500 beds or more average 25 unique data sources, with six electronic medical records systems in use. Those numbers will only move up over time, not down.

Integration engines in place throughout healthcare today were designed well before the explosion of the data-collection tools and digital information that exist today. Although updates and additions to integration platforms have enabled some interoperability, the need for complete interoperability is creating a movement to replace integration middleware with cloud-based managed services.

A study by the Aberdeen Group reveals that 76 percent of organizations will be replacing their integration middleware, and 70 percent of those organizations will adopt cloud-based integration solutions in the next three years.

The report also points out that as healthcare organizations move from an on-premises solution to a cloud-based platform, business leaders see migration to the cloud and managed services as a way to better manage operational expenses on a monthly basis versus large, up-front capital investments. An additional benefit is better use of in-house IT staff members who are tasked with mission critical, day-to-day responsibilities and may not be able to focus on continuous improvements to the platform to ensure its ability to handle future needs.

Healthcare has come a long way in the adoption of technology that can collect essential information and put it in the hands of clinical and operational decision makers. Taking that next step to effective, meaningful interoperability is critical.

As a leading provider of healthcare interoperability solutions, Liaison is a proud sponsor of Healthcare Scene. It is only through discussions and information-sharing among Health IT professionals that healthcare will achieve the organizational support for the steps required for interoperability.

Join John Lynn and Liaison for an insightful webinar on October 5, titled: The Future of Interoperability & Integration in Healthcare: How can your organization prepare?

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

Open Source Tool Offers “Synthetic” Patients For Hospital Big Data Projects

Posted on September 13, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

As readers will know, using big data in healthcare comes with a host of security and privacy problems, many of which are thorny.

For one thing, the more patient data you accumulate, the bigger the disaster when and if the database is hacked. Another important concern is that if you decide to share the data, there’s always the chance that your partner will use it inappropriately, violating the terms of whatever consent to disclose you had in mind. Then, there’s the issue of working with incomplete or corrupted data which, if extensive enough, can interfere with your analysis or even lead to inaccurate results.

But now, there may be a realistic alternative, one which allows you to experiment with big data models without taking all of these risks. A unique software project is underway which gives healthcare organizations a chance to scope out big data projects without using real patient data.

The software, Synthea, is an open source synthetic patient generator that models the medical history of synthetic patients. It seems to have been built by The MITRE Corporation, a not-for-profit research and development organization sponsored by the U.S. federal government. (This page offers a list of other open source projects in which MITRE is or has been involved.)

Synthea is built on a Generic Module Framework which allows it to model varied diseases and conditions that play a role in the medical history of these patients. The Synthea modules create synthetic patients using not only clinical data, but also real-world statistics collected by agencies like the CDC and NIH. MITRE kicked off the project using models based on the top ten reasons patients see primary care physicians and the top ten conditions that shorten years of life.

Its makers were so thorough that each patient’s medical experiences are simulated independently from their “birth” to the present day. The profiles include a full medical history, which includes medication lists, allergies, physician encounters and social determinants of health. The data can be shared using C-CDA, HL7 FHIR, CSV and other formats.
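
As a quick sketch of consuming that output: Synthea can emit one FHIR Bundle JSON file per synthetic patient, and a few lines of Python will inventory what a bundle contains. The file path below is a placeholder for wherever your Synthea run writes its output.

```python
import json
from collections import Counter

# Placeholder path: point this at a bundle produced by your Synthea run.
with open("output/fhir/synthetic_patient.json") as f:
    bundle = json.load(f)

# A FHIR Bundle wraps individual resources in an "entry" list.
types = Counter(
    entry["resource"]["resourceType"] for entry in bundle.get("entry", [])
)
for resource_type, count in types.most_common():
    print(f"{resource_type}: {count}")
# Typical output mixes Patient, Encounter, Condition, MedicationRequest, ...
```
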

On its site, MITRE says its intent in creating Synthea is to provide “high-quality, synthetic, realistic but not real patient data and associated health records covering every aspect of healthcare.” As MITRE notes, having a batch of synthetic patient data on hand can be pretty, well, handy in evaluating new treatment models, care management systems, clinical support tools and more. It’s also a convenient way to predict the impact of public health decisions quickly.

This is such a good idea that I’m surprised nobody else has done something comparable. (Well, at least as far as I know no one has.) Not only that, it’s great to see the software being made available freely via the open source distribution model.

Of course, in the final analysis, healthcare organizations want to work with their own data, not synthetic substitutes. But at least in some cases, Synthea may offer hospitals and health systems a nice head start.

Healthcare Interoperability and Standards Rules

Posted on September 11, 2017 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Dave Winer is a true expert on standards. I remember coming across him in the early days of social media when every platform was considering some sort of API. To illustrate his early involvement in standards, Dave was one of the early developers of the RSS standard that is now available on every blog and many other places.

With this background in mind, I was extremely fascinated by a manifesto that Dave Winer published earlier this year that he calls “Rules for Standards-Makers.” Sounds like something we really need in healthcare, no?

You should really go and read the full manifesto if you’re someone involved in healthcare standards. However, here’s the list of rules Dave offers standards makers:

  1. There are tradeoffs in standards
  2. Software matters more than formats (much)
  3. Users matter even more than software
  4. One way is better than two
  5. Fewer formats is better
  6. Fewer format features is better
  7. Perfection is a waste of time
  8. Write specs in plain English
  9. Explain the curiosities
  10. If practice deviates from the spec, change the spec
  11. No breakage
  12. Freeze the spec
  13. Keep it simple
  14. Developers are busy
  15. Mail lists don’t rule
  16. Praise developers who make it easy to interop

If you’ve never had to program to a standard, then you might not understand these. However, those who are deep into standards will understand the pitfalls. Plus, you’ll have horror stories about when you didn’t follow these rules and what challenges that caused for you going forward.

The thing I love most about Dave’s rules is that they focus on simplicity and function. Unfortunately, many standards in healthcare are focused on complexity and perfection. Healthcare has nailed the complexity part, and as Dave’s rules highlight, perfection is impossible with standards.

In fact, I skipped over Dave’s first rule for standards makers which highlights the above really well:

Rule #1: Interop is all that matters

As I briefly mentioned in the last CXO Scene podcast, many healthcare CIOs are waiting until the standards are perfect before they worry about interoperability. It’s as if they think that waiting for the perfect standard is going to solve healthcare interoperability. It won’t.

I hope that those building out standards in healthcare will take a deep look at the rules Dave Winer outlines above. We need better standards in healthcare and we need healthcare data to be interoperable.

Hospital EMR Adoption Divide Widening, With Critical Access Hospitals Lagging

Posted on September 8, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

I don’t know about you, but I was a bit skeptical when HIMSS Analytics rolled out its EMRAM (Electronic Medical Record Adoption Model) research program. As some of you doubtless know, EMRAM breaks EMR adoption into eight stages, from Stage 0 (no health IT ancillaries installed) to Stage 7 (complete EMR installed, with data analytics on board).

From its launch onward, I’ve been skeptical about EMRAM’s value, in part because I’ve never been sure that hospital EMR adoption could be packaged neatly into the EMRAM stages. Perhaps the research model is constructed well, but the presumption that a multivariate process of health IT adoption can be tracked this way is a bit iffy in my opinion.

On the other hand, I like the way the following study breaks things out. New research published in the Journal of the American Medical Informatics Association looks at broader measures of hospital EHR adoption, as well as their level of performance in two key categories.

The study’s main goal was to assess the divide between hospitals using their EHRs in an advanced fashion and those that were not. One of the key steps in their process was to crunch numbers in a manner allowing them to identify hospital characteristics associated with high adoption in each of the advanced use criteria.

To conduct the research, the authors dug into 2008 to 2015 American Hospital Association Information Technology Supplement survey data. Using the data, the researchers measured “basic” and “comprehensive” EHR adoption among hospitals. (The ONC has created definitions for both basic and comprehensive adoption.)

Next, the research team used new supplement questions to evaluate advanced use of EHRs. As part of this process, they also used EHR data to evaluate performance management and patient engagement functions.

When all was said and done, they drew the following conclusions:

  • 80.5% of hospitals had adopted a basic EHR system, up 5.3% from 2014
  • 37.5% of hospitals had adopted at least 8 (of 10) EHR data sets useful for performance measurement
  • 41.7% of hospitals adopted at least 8 (of 10) EHR functions related to patient engagement

One thing that stood out among all the data was that critical access hospitals were less likely to have adopted at least eight performance measurement functions and at least eight patient engagement functions. (Notably, HIMSS Analytics research from 2015 had already found that rural hospitals had begun to close this gap.)

“A digital divide appears to be emerging [among hospitals], with critical-access hospitals in particular lagging behind,” the article says. “This is concerning, because EHR-enabled performance measurement and patient engagement are key contributors to improving hospital performance.”

While the results don’t surprise me – and probably won’t surprise you either – it’s a shame to be reminded that critical access hospitals are trailing other facilities. As we all know, they’re always behind the eight ball financially, often understaffed and overloaded.

Given their challenges, it’s predictable that critical access hospitals would continue to lag behind on the health IT adoption curve. Unfortunately, this deprives them of feedback which could improve care and perhaps offer a welcome boost to their efficiency as well. It’s a shame the way the poor always get poorer.

The More Hospital IT Changes, The More It Remains The Same

Posted on June 23, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Once every year or two, some technical development leads the HIT buzzword list, and at least at first it’s very hard to tell whether it will stick. But over time, the technologies that actually work well are subsumed into the industry as it exists, lose their buzzworthy quality and just do their job.

Once in a while, the hot new thing sparks real change — such as the use of mobile health applications — but more often the ideas are mined for whatever value they offer and discarded.  That’s because in many cases, the “new thing” isn’t actually novel, but rather a slightly different take on existing technology.

I’d argue that this is particularly true when it comes to hospital IT, given the exceptionally high cost of making large shifts and the industry’s conservative bent. In fact, other than the (admittedly huge) changes fostered by the adoption of EMRs, hospital technology deployments are much the same as they were ten years ago.

Of course, I’d be undercutting my thesis dramatically if I didn’t stipulate that EMR adoption has been a very big deal. Things have certainly changed dramatically since 2007, when an American Hospital Association study reported that 32% of hospitals had no EMR in place and 57% had only partially implemented their EMR, with only the remaining 11% having implemented the platform fully.

Today, as we know, virtually every hospital has implemented an EMR and integrated it with ancillary systems (some more integrated and some less). Not only that, some hospitals with more mature deployments in place have used EMRs and connected tools to make major changes in how they deliver care.

That being said, the industry is still struggling with many of the same problems it did a decade ago.

The most obvious example of this is the extent to which health data interoperability efforts have stagnated. While hospitals within a health system typically share data with their sister facilities, I’d argue that efforts to share data with outside organizations have made little material progress.

Another major stagnation point is data analytics. Even organizations that spent hundreds of millions of dollars on their EMR are still struggling to squeeze the full value of this data out of their systems. I’m not suggesting that we’ve made no progress on this issue (certainly, many of the best-funded, most innovative systems are getting there), but such successes are still far from common.

Over the longer term, I suspect the shifts in consciousness fostered by EMRs and digital health will gradually reshape the industry. But don’t expect those technology lightning bolts to speed up the evolution of hospital IT. It’s going to take some time for that giant ship to turn.

We Can’t Afford To Be Vague About Population Health Challenges

Posted on June 19, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Today, I looked over a recent press release from Black Book Research touting its conclusions on the role of EMR vendors in the population health technology market. Buried in the release were some observations by Alan Hutchison, vice president of Connect & Population Health at Epic.

As part of the text, the release observes that “the shift from quantity-based healthcare to quality-based patient-centric care is clearly the impetus” for population health technology demand. This sets up some thoughts from Hutchison.

The Epic exec’s quote rambles a bit, but in summary, he argues that existing systems are geared to tracking units of care under fee-for-service reimbursement schemes, which makes them dinosaurs.

And what’s the solution to this problem? Why, health systems need to invest in new (Epic) technology geared to tracking patients across their path of care. “Single-solution systems and systems built through acquisition [are] less able to effectively understand the total cost of care and where the greatest opportunities are to reduce variation, improve outcomes and lower costs,” Hutchison says.

Yes, I know that press releases generally summarize things in broad terms, but these words are particularly self-serving and empty, mashing together hot air and jargon into an unappetizing patty. Not only that, I see a little bit too much of stating as fact things which are clearly up for grabs.

Let’s break some of these issues down, shall we?

  • First, I call shenanigans on the notion that the shift to “value-based care” means that providers will deliver quality care over quantity. If nothing else, the shifts in our system can’t be described so easily. Yeah, I know, don’t expect much from a press release, but words matter.
  • Second, though I’m not surprised Hutchison made the argument, I challenge the notion that you must invest in entirely new systems to manage population health.
  • Also, nobody is mentioning that while buying a new system to manage pop health data may be cleaner in some respects, it could make it more difficult to integrate existing data. Having to do that undercuts the value of the new system, and may even overshadow those benefits.

I don’t know about you, but I’m pretty tired of reading low-calorie vendor quotes about the misty future of population health technology, particularly when a vendor rep claims to have The Answer.  And I’m done with seeing clichéd generalizations about value-based care pass for insight.

Actually, I get a lot more out of analyses that break down what we *don’t* know about the future of population health management.

I want to know what hasn’t worked in transitioning to value-based reimbursement. I hope to see stories describing how health systems identified their care management weaknesses. And I definitely want to find out what worries senior executives about supporting necessary changes to their care delivery models.

It’s time to admit that we don’t yet know how this population health management thing is going to work and abandon the use of terminally vague generalizations. After all, once we do, we can focus on answering our toughest questions — and that’s when we’ll begin to make real progress.