
Connecting the Data: Three Steps to Meet Digital Transformation Goals

Posted on July 16, 2018 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

A white paper published by the World Economic Forum in 2016 begins with the statement, “Few industries have the potential to be changed so profoundly by digital technology as healthcare, but the challenges facing innovators – from regulatory barriers to difficulties in digitalizing patient data – should not be underestimated.”

That was two years ago, and many of the same challenges still exist as the digital transformation of healthcare continues.

In a recent HIMSS focus group sponsored by Liaison, participants identified their major digital transformation and interoperability goals for the near future as:

  • EMR rollout and integration
  • Population health monitoring and analytics
  • Remote clinical encounters
  • Mobile clinical applications

These goals are not surprising. Although EMRs have been in place in many healthcare organizations for years, the growth of health systems as they add physicians, clinics, hospitals and diagnostic centers represents a growing need to integrate disparate systems. The continual increase in the number of mobile applications and medical devices that can be used to gather information to feed into EMR systems further exacerbates the challenge.

What is surprising is the low percentage of health systems that believe that they are very or somewhat well-prepared to handle these challenges – only 35 percent of the HIMSS/Liaison focus group members identified themselves as well-prepared.

“Chaos” was the word focus group participants used to describe what happens in a health system when numerous players, overlapping projects, the lack of a single coordinator, and a tendency to pursue niche solutions that address one need rather than the organization’s overall needs drive digital transformation projects.

It’s easy to understand the frustration. Too few IT resources and too many needs in the pipeline lead to multiple groups of people working on projects with overlapping goals – sometimes duplicating each other’s efforts – and taxing limited staff, budget and infrastructure resources. Focus group participants also noted that new technologies and changing regulatory requirements keep derailing multi-year projects.

Throughout all the challenges identified by healthcare organizations, the issue of data integrity is paramount. The addition of new technologies, including mobile and AI-driven analytics, and new sources of information, increases the need to ensure that data is in a format that is accessible to all users and all applications. Otherwise, the full benefits of digital transformation will not be realized.

The lack of universal standards to enable interoperability is being addressed, but until those standards arrive, healthcare organizations must evaluate other ways to integrate and harmonize data so it is available to the myriad users and applications that can benefit from the insights it provides. Unlocking access to previously unseen data takes resources that many health organizations have in short supply. And the truth is, we’ll never have perfect standards – they will always continue to change – so there’s no reason to wait.

Infrastructure, however, was not the number one resource identified in the HIMSS focus group as lacking in participants’ interoperability journey. In fact, only 15 percent saw infrastructure as the missing piece, while 30 percent identified IT staffing resources and 45 percent identified the right level of expertise as the most critical needs for their organization.

As all industries focus on digital transformation, competition for expert staff to handle interoperability challenges makes it difficult for healthcare organizations to attract the talent needed. For this reason, 45 percent of healthcare organizations outsource IT data integration and management to address staffing challenges.

Health systems are also evaluating managed services strategies. A managed services solution takes over day-to-day integration and data management with the right expertise and the manpower to handle complex work and fluctuating project loads. That way, in-house staff can focus on the innovation and efficiencies that support patient care and operations, while the operating budget covers data management fees – leaving capital dollars available for critical patient care needs.

Removing day-to-day integration responsibilities from in-house staff also provides time to look strategically at the organization’s overall interoperability needs – coordinating efforts in a holistic manner. The ability to implement solutions for current needs with an eye toward future needs future-proofs an organization’s digital investment and helps avoid the “app-trap” – a reliance on narrowly focused applications with bounded data that cannot be accessed by disparate users.

There is no one answer to healthcare’s digital transformation questions, but taking the following three steps can move an organization closer to the goal of meaningful interoperability:

  • Don’t wait for interoperability standards to be developed – find a data integration and management platform that will integrate and harmonize data from disparate sources and make the information available to all users in the way they need it, when they need it (see the sketch after this list).
  • Turn to a data management and integration partner who can provide the expertise required to stay current on interoperability, security and regulatory compliance requirements and other mandatory capabilities.
  • Approach digital transformation holistically, with a coordinated strategy that treats each new application or capability as a source of data gathered for the benefit of the entire organization rather than siloed for use by a narrowly focused group of users.
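
To make the first step above a little more concrete, here is a minimal, hypothetical sketch of what “integrate and harmonize” can mean in practice: two source systems describe the same patient with different field names, date formats and gender codes, and a thin mapping layer turns both into one canonical shape. The schemas and mapping rules are invented for illustration, not any particular vendor’s format.

```python
from datetime import datetime

# Hypothetical source records: two systems, two different schemas.
record_from_system_a = {"pt_name": "SMITH, JANE", "dob": "03/14/1968", "sex": "F"}
record_from_system_b = {"name": {"family": "Smith", "given": "Jane"},
                        "birthDate": "1968-03-14", "gender": "female"}

def harmonize_a(rec):
    """Map system A's flat, MM/DD/YYYY-style record to a canonical form."""
    family, given = [part.strip().title() for part in rec["pt_name"].split(",")]
    return {
        "family_name": family,
        "given_name": given,
        "birth_date": datetime.strptime(rec["dob"], "%m/%d/%Y").date().isoformat(),
        "gender": {"F": "female", "M": "male"}.get(rec["sex"], "unknown"),
    }

def harmonize_b(rec):
    """Map system B's nested, ISO-style record to the same canonical form."""
    return {
        "family_name": rec["name"]["family"],
        "given_name": rec["name"]["given"],
        "birth_date": rec["birthDate"],
        "gender": rec["gender"],
    }

# Downstream users and applications see one shape, regardless of source system.
canonical_records = [harmonize_a(record_from_system_a), harmonize_b(record_from_system_b)]
print(canonical_records)
```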

The digital transformation of healthcare and the interoperability challenges that must be overcome are not minor issues, nor are they insurmountable. It is only through the sharing of ideas, information about new technologies and best practices that healthcare organizations can maximize the insights provided by data shared across the enterprise.

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies, a proud sponsor of Healthcare Scene. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

Healthcare Interoperability Insights

Posted on June 29, 2018 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

I came across this great video by Diameter Health where Bonny Roberts talked with a wide variety of people at the interoperability showcase at HIMSS. If you want to get a feel for the challenges and opportunities associated with healthcare interoperability, take 5 minutes to watch this video:

What do you think of these healthcare interoperability perspectives? Does one of them stand out more than others?

I love the statement that’s on the Diameter Health website:

“We Cure Clinical Data Disorder”

What an incredible way to describe clinical data today. I’m not sure what the ICD-10 code for it is, but there’s definitely a lot of clinical data disorder. It takes a real professional to clean the data, organize the data, enrich the data, and know how to make that data useful to people. It’s not a disorder that most people can treat on their own.

What’s a little bit scary is that this disorder is not going to get any easier. More data is on its way. Better to deal with your disorder now before it becomes a full-on chronic condition.

The Truth about AI in Healthcare

Posted on June 18, 2018 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

Those who watched the first season of the television show “The Good Doctor” got to see how a young autistic surgeon with savant syndrome faces challenges in his everyday life as he learns to connect with the people in his world. His extraordinary medical skill and intuition not only save patients’ lives but also build bridges with co-workers.

During each show, there is at least one scene in which the young doctor “visualizes” the inner workings of the patient’s body – evaluating and analyzing the cause of the medical condition.

Although all physicians can describe what happens to cause illness, the speed, detail and clarity of the young surgeon’s ability to gather information, predict reactions to treatments and identify the protocol that will produce the best outcome greatly surpasses his colleagues’ abilities.

Yes, this is a television show, but artificial intelligence promises the same capabilities that will disrupt all of our preconceived notions about healthcare on both the clinical and the operational sides of the industry.

Doctors rely on their medical training as well as their personal experience with hundreds of patients, but AI can allow clinicians to tap into the experience of hundreds of doctors with thousands of patients. Even if physicians had personal experience with thousands of patients, the human mind can’t process all of that data effectively.

How can AI improve patient outcomes as well as the bottom line?

We’re already seeing the initial benefits of AI in many areas of the hospital. A report by Accenture identifies the top three uses of AI in healthcare as robot-assisted surgery, virtual nursing assistants and administrative workflow assistance. These three AI applications alone represent a potential estimated annual benefit of $78 billion for the healthcare industry by 2026.

The benefits of AI include improved precision in surgery, decreased length of stay, reduction in unnecessary hospital visits through remote assessment of patient conditions, and time-saving capabilities such as voice-to-text transcription. According to Accenture, these improvements represent a work time savings of 17 percent for physicians and 51 percent for registered nurses – at a critical time when there is no end in sight for the shortages of both nurses and doctors.

In a recent webinar discussing the role of AI in healthcare, John Lynn, founder of HealthcareScene.com, described other ways that AI can improve diagnosis, treatment and patient safety. These areas include dosage error detection, treatment plan design, determination of medication adherence, medical imaging, tailored prescription medicine and automated documentation.

One of the challenges to fully leveraging the insights and capabilities of AI is that so much of the information accumulated in electronic medical records is unstructured data. Translating this information into a format that clinical providers as well as financial and administrative staff can use to optimize treatment plans and workflows is possible with natural language processing – a branch of AI that enables technology to interpret speech and text and determine which information is critical.
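
As a purely illustrative sketch of what turning unstructured text into usable fields means, the snippet below pulls a blood pressure reading and a medication mention out of a made-up note fragment with regular expressions. A real natural language processing engine works from trained language models rather than hand-written patterns, so treat this only as a toy stand-in for the idea.

```python
import re

# A made-up, de-identified note fragment, for illustration only.
note = "Pt reports chest tightness. BP 142/91 in clinic. Continue lisinopril 10 mg daily."

# Pull a blood pressure reading such as "142/91" into structured fields.
bp_match = re.search(r"\bBP\s*(\d{2,3})/(\d{2,3})\b", note)

# Pull a medication mention from a small, illustrative vocabulary.
med_match = re.search(r"\b(lisinopril|metformin|atorvastatin)\b\s*(\d+\s*mg)?", note, re.IGNORECASE)

structured = {
    "systolic": int(bp_match.group(1)) if bp_match else None,
    "diastolic": int(bp_match.group(2)) if bp_match else None,
    "medication": med_match.group(1).lower() if med_match else None,
    "dose": med_match.group(2) if med_match and med_match.group(2) else None,
}
print(structured)  # {'systolic': 142, 'diastolic': 91, 'medication': 'lisinopril', 'dose': '10 mg'}
```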

The most often cited fear about relying on AI in healthcare is the possibility of mistakes. Of course, humans make mistakes as well. We must remember that AI’s ability to tap into a much wider pool of information to make decisions or recommend options will result in more deeply informed decisions – if the data is good.

The proliferation of legacy systems, continually added applications and multiple EMRs in a health system increases the risk of data that cannot be accessed or cannot be shared in real-time to aid clinicians or an AI-supported program. Ensuring that data is aggregated into a central location, harmonized, transformed into a usable format and cleaned to provide high quality data is necessary to support reliable AI performance.
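
As a rough sketch of that “aggregate, harmonize and clean before you compute” point, the snippet below applies a few hypothetical quality checks – required fields present, physiologically plausible values, duplicates dropped – before records are handed to any analytics or AI step. The specific rules are assumptions made for the example, not a clinical standard.

```python
def is_clean(record):
    """Hypothetical quality checks applied before a record feeds analytics or AI."""
    required = ("patient_id", "observed_at", "heart_rate")
    if any(record.get(field) in (None, "") for field in required):
        return False                              # missing required data
    return 20 <= record["heart_rate"] <= 250      # physiologically plausible range

def deduplicate(records):
    """Keep one record per (patient, timestamp) - a simple stand-in for real matching."""
    seen, unique = set(), []
    for rec in records:
        key = (rec["patient_id"], rec["observed_at"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

raw = [
    {"patient_id": "A1", "observed_at": "2018-06-01T08:00", "heart_rate": 72},
    {"patient_id": "A1", "observed_at": "2018-06-01T08:00", "heart_rate": 72},   # duplicate
    {"patient_id": "B7", "observed_at": "2018-06-01T08:05", "heart_rate": 900},  # implausible
]
clean = [rec for rec in deduplicate(raw) if is_clean(rec)]
print(clean)  # only the single valid A1 record survives
```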

While AI might be able to handle the data aggregation and harmonization tasks in the future, we are not there yet. This is not, however, a reason to delay the use of AI in hospitals and other organizations across the healthcare spectrum.

Healthcare organizations can partner with companies that specialize in the aggregation of data from disparate sources to make the information available to all users. Increasing access to data throughout the organization is beneficial to health systems – even before they implement AI tools.

Although making data available to all of the organization’s providers, staff and vendors as needed may seem onerous, it is possible to do so without adding to the hospital’s IT staff burden or the capital improvement budget. The complexities of translating structured and unstructured data, multiple formats and a myriad of data sources can be balanced against data security concerns by a team that focuses on these issues every day.

While most AI capabilities in use today are algorithms that reflect current best practices or research that are programmed by healthcare providers or researchers, this will change. In the future, AI will expand beyond algorithms, and the technology will be able to learn and make new connections among a wider set of data points than today’s more narrowly focused algorithms.

Whether or not your organization is implementing AI, considering AI or just watching its development, I encourage everyone to start by evaluating the data that will be used to “run” AI tools. Taking steps now to ensure clean, easy-to-access data will not only benefit clinical and operational tasks now but will also position the organization to more quickly adopt AI.

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies, a proud sponsor of Healthcare Scene. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

Healthcare Interoperability is Solved … But What Does That Really Mean? – #HITExpo Insights

Posted on June 12, 2018 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

One of the best parts of the new community we created at the Health IT Expo conference is the way attendees at the conference and those in the broader healthcare IT community engage on Twitter using the #HITExpo hashtag before, during, and after the event.  It’s a treasure trove of insights, ideas, practical innovations, and amazing people.  Don’t forget that last part since social media platforms are great at connecting people even if they are usually in the news for other reasons.

A great example of the knowledge sharing that happened on the #HITExpo hashtag came from Don Lee (@dflee30), who runs #HCBiz, a long-running podcast that he recorded live from Health IT Expo.  After the event, Don offered his thoughts on what he considered the most important conversation from the conference: “Solving Interoperability.”  You can read his thoughts on Twitter, or we’ve compiled all 23 tweets for easy reading below (a big thanks to Thread Reader for making this easy).

As shared by Don Lee:

1/ Finally working through all my notes from the #HITExpo. The most important conversation to me was the one about “solving interoperability” with @RasuShrestha, @PaulMBlack and @techguy.

2/ Rasu told the story of what UPMC accomplished using DBMotion. How it enabled the flow of data amongst the many hospitals, clinics and docs in their very large system. #hitexpo

3/ John challenged him a bit and said: it sounds like you’re saying that you’ve solved #interoperability. Is that what you’re telling us? #hitexpo

4/ Rasu explained in more detail that they had done the hard work of establishing syntactic interop amongst the various systems they dealt with (I.e. they can physically move the data from one system to another and put it in a proper place). #hitexpo

5/ He went on and explained how they had then done the hard work of establishing semantic interoperability amongst the many systems they deal with. That means now all the data could be moved, put in its proper place, AND they knew what it meant. #hitexpo

6/ Syntactic interop isn’t very useful in and of itself. You have data but it’s not mastered and not yet useable in analytics. #hitexpo

7/ Semantic interop is the mastering of the data in such a way that you are confident you can use it in analytics, ML, AI, etc. Now you can, say, find the most recent BP for a patient pop regardless of which EMR in your system it originated. And have confidence in it. #hitexpo

8/ Semantic interop is closely related to the concept of #DataFidelity that @BigDataCXO talks about. It’s the quality of data for a purpose. And it’s very hard work. #hitexpo

9/ In the end, @RasuShrestha’s answer was that UPMC had done all of that hard work and therefore had made huge strides in solving interop within their system. He said “I’m not flying the mission accomplished banner just yet”. #hitexpo

10/ Then @PaulMBlack – CEO at @Allscripts – said that @RasuShrestha was being modest and that they had in fact “Solved interoperability.”

I think he’s right and that’s what this tweet storm is about. Coincidentally, it’s a matter of semantics. #hitexpo

11/ I think Rasu dialed it back a bit because he knew that people would hear that and think it means something different. #hitexpo

12/ The overall industry conversation tends to be about ubiquitous, semantic interop where all data is available everywhere and everyone knows what it means. I believe Rasu was saying that they hadn’t achieved that. And that makes sense… because it’s impossible. #hitexpo

13/ @GraceCordovano asked the perfect question and I wish there had been a whole session dedicated to answering it: (paraphrasing) What’s the difference between your institutional definition of interop and what the patients are talking about? #hitexpo

14/ The answer to that question is the crux of our issue. The thing patients want and need is for everyone who cares for them to be on the same page. Interop is very relevant to that issue, obviously, but there’s a lot of friction and it goes way beyond tech. #hitexpo

15/ Also, despite common misconception, no other industry has solved this either. Sure, my credit card works in Europe and Asia and gets back to my bank in the US, but that’s just a use case. There is no ubiquitous semantic interop between JP Morgan Chase and HSBC.

16/ There are lots of use cases that work in healthcare too. E-Prescribing, claims processing and all the related HIPAA transactions, etc. #hitexpo

17/ Also worth noting… Canada has single payer system and they also don’t have clinical interoperability.

This is not a problem unique to healthcare nor the US. #hitexpo

18/ So healthcare needs to pick its use cases and do the hard work. That’s what Rasu described on stage. That’s what Paul was saying has been accomplished. They are both right. And you can do it too. #hitexpo

19/ So good news: #interoperability is solved in #healthcare.

Bad news: It’s a ton of work and everyone needs to do it.

More bad news: You have to keep doing it forever (it breaks, new partners, new sources, new data to care about, etc). #hitexpo

19/ Some day there will be patient mediated exchange that solves the patient side of the problem and does it in a way that works for everyone. Maybe on a #blockchain. Maybe something else. But it’s 10+ years away. #hitexpo

20/ In the meantime my recommendation to clinical orgs – support your regional #HIE. Even UPMC’s very good solution only works for data sources they know about. Your patients are getting care outside your system and in a growing # of clinical and community based settings. #hitexpo

21/ the regional #HIE is the only near-term solution that even remotely resembles semantic, ubiquitous #interoperability in #healthcare.
#hitexpo

22/ My recommendation to patients: You have to take matters into your own hands for now. Use consumer tools like Apple health records and even Dropbox like @ShahidNShah suggested in another #hitexpo session. Also, tell your clinicians to support and use the regional #HIE.

23/ So that got long. I’ll end it here. What do you think?

P.S. the #hitexpo was very good. You should check it out in 2019.

A big thank you to Don Lee for sharing these perspectives and diving in much deeper than we can in 45 minutes on stage. This is what makes the Health IT Expo community special: people with a deep understanding of a problem fleshing out its realities so we can better understand how to address them. Plus, the sharing happens year round as opposed to just a few days at the conference.
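
To make the syntactic-versus-semantic distinction from Don’s thread a bit more concrete, here’s a minimal, hypothetical sketch. The extracts, field names and codes are invented for illustration: moving the records at all is the syntactic part, and the small mapping functions that let you treat two EMRs’ blood pressure fields as the same concept are the semantic part.

```python
# Hypothetical extracts that have already been moved between systems (syntactic interop done).
emr_one = [{"mrn": "123", "obs_code": "BP_SYS", "value": 138, "taken": "2018-05-02"}]
emr_two = [{"patientId": "123", "loinc": "8480-6", "result": "141", "effective": "2018-06-10"}]

# The semantic layer: map each source's local vocabulary onto one shared concept.
def normalize_one(rec):
    return {"mrn": rec["mrn"], "concept": "systolic_bp",
            "value": float(rec["value"]), "date": rec["taken"]}

def normalize_two(rec):
    concept_map = {"8480-6": "systolic_bp"}  # LOINC 8480-6 = systolic blood pressure
    return {"mrn": rec["patientId"], "concept": concept_map[rec["loinc"]],
            "value": float(rec["result"]), "date": rec["effective"]}

combined = [normalize_one(r) for r in emr_one] + [normalize_two(r) for r in emr_two]

# Now "most recent systolic BP for this patient" is answerable across both EMRs.
latest = max((r for r in combined if r["concept"] == "systolic_bp" and r["mrn"] == "123"),
             key=lambda r: r["date"])
print(latest)  # the 2018-06-10 reading from the second EMR
```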

Speaking of which, what do you think of Don’s thoughts above? Is he right? Is there something he’s missing? Is there more depth to this conversation that we need to understand? Share your thoughts, ideas, insights, and perspectives in the comments or on social media using the #HITExpo hashtag.

Reasonable and Unreasonable Healthcare Interoperability Expectations

Posted on February 12, 2018 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Other than EMR and EHR, I don’t think there’s any topic I’ve written about more than healthcare interoperability. It’s a challenging topic with a lot of nuances. Plus, it’s something that would benefit healthcare greatly if we could make it a reality. However, after all these years I’m coming to some simple conclusions that I think often get lost in most discussions, especially those in the healthcare IT media.

First, we all know that it would be wonderful for all of your healthcare records to be available to anyone who needs them at any time and in any place and not available to those who shouldn’t have access to them. I believe that in the next 15 years, that’s not going to happen. Sure, it would be great if it did (we all see that), but I don’t see it happening.

The reasons why are simple. Our healthcare system doesn’t want it to happen and there aren’t enough benefits to the system to make it happen.

Does that mean we should give up on interoperability? Definitely not!

Just because we can’t have perfect healthcare interoperability doesn’t mean we shouldn’t create meaningful interoperability (Yes, I did use the word meaningful just to annoy you).

I think one of the major failures of most interoperability efforts is that they’re too ambitious. They try to do everything, and since that’s not achievable, they end up doing nothing. There are plenty of reasonable interoperability efforts that make a big difference in healthcare. We can’t let the perfect be the enemy of better. That’s exactly what’s happened with most of healthcare interoperability.

At the HIMSS conference next month, they’re going to once again have an interoperability showcase full of vendors that can share data. If HIMSS were smart, they’d do away with the showcase and instead only allow those vendors to show dashboards of the amount of data that’s actually being transferred between organizations in real time. We’d learn a lot more from seeing interoperability that’s really happening as opposed to interoperability that could happen but doesn’t because organizations don’t want it to.

Interoperability is a challenging topic, but we make it harder than it needs to be because we want to share everything with everyone. I’m looking for companies that are focused on slices of interoperability that practically solve a problem. If you have some of these, let us know about them in the comments.

Breaking Bad: Why Poor Patient Identification is Rooted in Integration, Interoperability

Posted on December 20, 2017 | Written By

The following is a guest blog post by Dan Cidon, Chief Technology Officer, NextGate.

The difficulty of accurate patient ID matching is rooted in interoperability and integration.

Coordinated, accountable, patient-centered care relies on access to quality patient data. Yet healthcare continues to be daunted by software applications and IT systems that don’t communicate or share information effectively. Health data spread across multiple source systems and settings hampers the reconciliation and de-duplication of patient records, leading to suboptimal outcomes and avoidable costs of care. For organizations held prisoner by their legacy systems, isolation and silo inefficiencies worsen as IT environments become more complex and as the volume and speed of health data generation grow.

A panoramic view of individuals across the enterprise is a critical component for value-based care and population health initiatives. Accurately identifying patients, and consistently matching them with their data, is the foundation for informed clinical decision-making, collaborative care, and healthier, happier populations. As such, the industry has seen a number of high-profile initiatives in the last few years attempting to address the issue of poor patient identification.

The premature end of CHIME’s National Patient ID Challenge last month should be a sobering industry reminder that a universal solution may never be within reach. However, the important lesson emanating in the wake of the CHIME challenge is that technology alone will not solve the problem. Ultimately, the real challenge of identity management and piecing together a longitudinal health record has to do with integration and interoperability. More specifically, it revolves around the demographics and associated identifiers dispersed across multiple systems.

Because these systems often have little reason to communicate with one another, and because they store their data through fragmented architecture, an excessive proliferation of identifiers occurs. The result is unreliable demographic information, triggering further harm in data synchronization and integrity.

Clearly, keeping these identifiers and demographics as localized silos of data is an undesirable model for healthcare that will never function properly. While secondary information such as clinical data should remain local, the core identity of a patient and basic demographics including name, gender, date of birth, address and contact information shouldn’t be in the control of any single system. This information must be externalized from these insulated applications to maintain accuracy and consistency across all connected systems within the delivery network.

Fortunately, there are long-standing and relatively simple standards in place, such as HL7 PIX/PDQ, that allow systems to feed a central demographic repository and query that repository for data. For the past eight years, NextGate has participated in the annual IHE North American Connectathon – the healthcare industry’s largest interoperability testing event. Year after year, we see hundreds of other participating vendors demonstrating that, with effective standards, it is indeed possible to externalize patient identity.

In the United Kingdom, for example, the Personal Demographics Service – a similar concept of querying a central repository for demographics and maintaining a global identifier – has seen slow but steady success. While a national-scale service of that kind is unlikely in the U.S. in the near term, smaller-scale regional registries are clearly an achievable goal. And every deployment of our Enterprise Master Patient Index (EMPI) is confirmation that such systems can work and do provide value.

What is disappointing is that very few systems in actual practice today will query the EMPI as part of the patient intake process. Many, if not most, of the systems we integrate with will only fulfill half of the bargain: they will feed the EMPI with demographic data and identifiers. This is because many systems have already been designed to produce this outbound communication for purposes other than the management of demographic data. Querying the EMPI for patient identity, on the other hand, requires a fundamental paradigm shift for many vendors and a modest investment to enhance their software. Rather than relying solely on their limited view of patient identity, they are expected to query an outside source and integrate that data into their local repository.
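
To show what that missing query step could look like on the wire, here is a rough, hypothetical sketch that assembles a bare-bones HL7 v2 PDQ “find candidates” query of the kind the PIX/PDQ profiles describe. The application and facility names are invented, the segments are abbreviated, and the real IHE ITI-21 transaction defines many more details, so treat this as an outline rather than a conformant message.

```python
from datetime import datetime

def build_pdq_query(family_name, birth_date, query_tag="Q123"):
    """Rough sketch of an HL7 v2 PDQ 'find candidates' query (IHE ITI-21).

    Field layout is abbreviated for illustration; consult the IHE PDQ
    specification for the authoritative, field-by-field definition.
    """
    ts = datetime.now().strftime("%Y%m%d%H%M%S")
    msh = f"MSH|^~\\&|REG_APP|MY_CLINIC|EMPI|MY_HOSPITAL|{ts}||QBP^Q22^QBP_Q21|{query_tag}|P|2.5"
    # QPD-3 carries the demographic criteria as @PID.field^value pairs.
    qpd = f"QPD|Q22^Find Candidates^HL7|{query_tag}|@PID.5.1^{family_name}~@PID.7.1^{birth_date}"
    rcp = "RCP|I|10^RD"  # immediate response, at most 10 candidate matches
    return "\r".join([msh, qpd, rcp])

# At patient intake, the registration system would send this query to the EMPI
# and reconcile the returned candidates with its local record, instead of
# relying only on its own limited view of the patient.
print(build_pdq_query("SMITH", "19680314"))
```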

This isn’t rocket science, and yet there are so few systems in production today that initiate this simple step. Worse yet, we see many healthcare providers resorting to band-aids to remedy the deficiency, such as ineffective screen-scraping technology that manually transfers data from the EMPI to their local systems.

With years of health IT standards in place that yield a centralized and uniform way of managing demographic data, the meager pace and progress of vendors to adopt them is troubling. It is indefensible that a modern registration system, for instance, wouldn’t have this querying capability as a default module. Yet, that is what we see in the field time and time again.

In other verticals, such as banking and manufacturing, where standards-based exchange has been adopted at a much faster pace, it raises the question: how can healthcare accelerate this type of adoption? As we prepare for the upcoming IHE Connectathon in January, we issue our own challenge to the industry: engage in an open and frank dialogue to identify what the barriers are and how vendors can be incentivized, so patients can benefit from the free flow of accurate, real-time data from provider to provider.

Ultimately, accurate patient identification is a fundamental component to leveraging IT for the best possible outcomes. Identification of each and every individual in the enterprise helps to ensure better care coordination, informed clinical decision making, and improved quality and safety.

Dan Cidon is CTO and co-founder of NextGate, a leader in healthcare identity management, managing nearly 250 million lives for health systems and HIEs in the U.S. and around the globe.

Waiting For The Perfect “Standard” Is Not The Answer To Healthcare’s Interoperability Problem

Posted on October 16, 2017 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

Have you bought into the “standards will solve healthcare’s interoperability woes” train of thought? Everyone understands that standards are necessary to enable disparate systems to communicate with each other, but as new applications and new uses for data continually appear, healthcare organizations that are waiting for universal standards are not maximizing the value of their data. More importantly, they will be waiting a long time to realize its full potential.

Healthcare interoperability is not just a matter of transferring data as an entire file from one user to another. Instead, effective exchange of information allows each user to select which elements of a patient’s chart are needed, and then access them in a format that enables analysis of different data sets to provide a holistic picture of the patient’s medical history or clinical trends in a population of patients. Healthcare’s interoperability challenge is further exacerbated by different contextual interpretations of the words within those fields. For instance, how many different ways are there to say heart attack?
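
At bottom, the “how many ways are there to say heart attack” problem is a terminology-mapping problem. The sketch below uses a deliberately tiny, hand-built synonym table to normalize a few local spellings to one shared concept and an ICD-10 category; a real harmonization layer would lean on managed vocabularies such as SNOMED CT or ICD-10 itself rather than a hard-coded dictionary.

```python
# A tiny, hand-built synonym table for illustration; a real platform would
# rely on managed terminologies (SNOMED CT, ICD-10) instead.
LOCAL_TERM_TO_CONCEPT = {
    "heart attack": "acute_mi",
    "myocardial infarction": "acute_mi",
    "acute mi": "acute_mi",
    "ami": "acute_mi",
    "stemi": "acute_mi",   # simplification: STEMI is a specific subtype
}

CONCEPT_TO_ICD10 = {"acute_mi": "I21"}  # ICD-10 category for acute myocardial infarction

def normalize_diagnosis(free_text_term):
    """Map a locally worded diagnosis to a shared concept and a billing code."""
    concept = LOCAL_TERM_TO_CONCEPT.get(free_text_term.strip().lower())
    return {"source_term": free_text_term,
            "concept": concept,
            "icd10": CONCEPT_TO_ICD10.get(concept)}

for term in ("Heart Attack", "STEMI", "Myocardial Infarction"):
    print(normalize_diagnosis(term))
```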

The development of the Health Level Seven (HL7®) FHIR®, which stands for Fast Healthcare Interoperability Resources, represents a significant step forward to interoperability. While the data exchange draft that is being developed and published by HL7 eliminates many of the complexities of earlier HL7 versions and facilitates real-time data exchange via web technology, publication of release 4 – the first normative version of the standard – is not anticipated until October 2018.
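
Part of FHIR’s appeal is that the exchange looks like ordinary web development: resources are JSON (or XML) documents that are retrieved and searched over plain HTTP. The sketch below runs a standard Patient search against a placeholder FHIR endpoint; the base URL is invented, and a production client would add authentication and error handling.

```python
import json
import urllib.request

FHIR_BASE = "https://fhir.example-hospital.org/r3"  # placeholder base URL, not a real server

# Search for a patient by name and birth date using standard FHIR search parameters.
url = f"{FHIR_BASE}/Patient?family=Smith&birthdate=1968-03-14"
request = urllib.request.Request(url, headers={"Accept": "application/fhir+json"})

with urllib.request.urlopen(request) as response:
    bundle = json.load(response)          # FHIR returns search results as a Bundle resource

# Each Bundle entry wraps a matching Patient resource; pull out a couple of fields.
for entry in bundle.get("entry", []):
    patient = entry["resource"]
    print(patient.get("id"), patient.get("birthDate"))
```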

As these standards are further developed, the key to universal adoption will be simplicity, according to John Lynn, founder of HealthcareScene.com. However, he suggests that CIOs stop waiting for “perfect standards” and focus on how they can best achieve interoperability now.

Even with standards that can be implemented in all organizations, the complexity and diversity of the healthcare environment means that it will take time to move everyone to the same standards. This is complicated by the number of legacy systems and patchwork of applications that have been added to healthcare IT systems in an effort to meet quickly changing needs throughout the organization. Shrinking financial resources for capital investment and increasing competition for IT professionals limits a health system’s ability to make the overall changes necessary for interoperability – no matter which standards are adopted.

Some organizations are turning to cloud-based, managed service platforms to perform the integration, aggregation and harmonization that makes data available to all users – regardless of the system or application in which the information was originally collected. This approach solves the financial and human resource challenges by making it possible to budget integration and data management requirements as an operational rather than a capital investment. This strategy also relieves the burden on in-house IT staff by relying on the expertise of professionals who focus on emerging technologies, standards and regulations that enable safe, compliant data exchange.

How are you planning to scale your interoperability and integration efforts?  If you're waiting for standards, why are you waiting?

As a leading provider of healthcare interoperability solutions, Liaison is a proud sponsor of Healthcare Scene. While the conversation about interoperability has been ongoing for many years, ideas, new technology and new strategies discussed and shared by IT professionals will lead to successful healthcare data exchange that will transform healthcare and result in better patient care.

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

KLAS Summit: Interoperability Doing the Work to Move HealthIT Forward

Posted on October 9, 2017 | Written By

Healthcare as a Human Right. Physician Suicide Loss Survivor. Janae writes about Artificial Intelligence, Virtual Reality, Data Analytics, Engagement and Investing in Healthcare. twitter: @coherencemed

I had the privilege of attending the KLAS research event with leaders in patient data interoperability. From the ONC to EHR vendors, executives from EHR companies and hospital systems made their way to a summit about standards for measurement and improvement. These meetings are convened with the mutual goal of contributing to advances in health IT and improvements in patient outcomes. I’m a big fan of collaborative efforts that produce measurable results. KLAS Research is successfully convening the meetings everyone in the health IT industry has said are necessary for progress.

The theme of Interoperability lately is: Things are not moving fast enough.

The long history of data in health records and the variety of standards across records have created a system that is reluctant to change. Some EMR vendors seem to think the next step is a single patient record – their record.

Watching interactions between EHR vendors and the ONC was interesting. Vendors are frustrated that progress and years of financial investment might be overturned by an unstable political atmosphere and lack of funding. Additionally, device innovation and creation is changing the medical device landscape at a rapid rate. We aren’t on the same page with new data and we are creating more and more data from disparate sources.

Informatics experts in healthcare need a huge knowledge base to organize data sharing and create a needs-based strategy for it. They have a unique perspective across the organization: few other executives have the same visibility into both the clinical and business sides. They have to understand clinical workflows and strategy as well as financial reimbursement. Informatics management is a major burden and responsibility – these leaders are charged with improving care and making workflows easier for clinicians and patients. EMR use has frequently been cited as a contributor to physician burnout and early retirement. Data moving from one system to another can have a huge impact on care delivery costs and patient outcomes. Duplicated tests and records can mean delayed diagnoses for surgeons and specialists. Summit participants discussed how patients can be part of improving data sharing.

We have made great progress on interoperability, but there is still much to be done. Some of the discussion was fascinating, such as the monumental patient data task the VA faces as troops deploy and receive care. There was also frank discussion about business interests and data blocking, ranging from government reluctance to create a single patient identifier to a lack of resources for cleaning duplicated records.

Stakeholders want to know what the next steps are: how do we innovate, and how do we improve from this point forward? Do we build it internally or partner with outside vendors for scale? They are tired of the confusion and lack of progress. Participants want more. I asked a few participants what they think will help things move forward more quickly. Not everyone really knows how to make that happen.

Keith Fraidenburg of CHIME praised systems for coming together and sharing patient data to improve patient outcomes. I spoke with him about the summit itself and his work with informatics in healthcare. He described the people involved in this effort as some of the hardest working in healthcare. Their expertise in clinical knowledge and data science is highly specialized and has huge implications for patient outcomes.

“To get agreement on standards would be an important big step forward. It wouldn’t solve everything but to get industry wide standards to move things forward the industry needs a single set of standards or a playbook.”

We might have different interests, but the people involved in interoperability care about its advancement. KLAS Research formed a collaborative of over 31 organizations dedicated to giving great feedback and data about end users. The formation of the EMR Improvement Collaborative can help measure the success of data interoperability. Current satisfaction measures are helpful, but they might not give health IT experts, CMIOs and CIOs the data they need to formulate an interoperability strategy.

The gaps in transitions of care are a significant oversight in the existing interoperability marketplace. Post-acute organizations have a huge need for better data sharing, and interorganizational trust is a factor. Government mandates about data blocking and regulated sharing have a huge impact on data coordination. Don Rucker, MD, John Fleming, MD, Genevieve Morris and Steve Posnack participated in a listening session about interoperability. Some EMR vendors mentioned that this listening session and the ability to have a face-to-face meeting were the most valuable part of the summit.

Conversations and meetings about interoperability help bridge the gaps in progress. Convening the key conversations between stakeholders helps healthcare interoperability move faster. There is still work to be done and many opportunities for innovation and improvement. Slow progress is still progress. Sharing data from these efforts by the KLAS research team shows a dedication to driving interoperability advancement. We will need better business communication between stakeholders and better data sharing to meet the needs of an increasingly complex and data rich world.

What do you think the next steps are in interoperability?

Interoperability: Is Your Aging Healthcare Integration Engine the Problem?

Posted on September 18, 2017 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.
There is no shortage of data collected by healthcare organizations that can be used to improve clinical as well as business decisions. Announcements of new technology that collects patient information, clinical outcome data and operational metrics that will make a physician or hospital provide better, more cost-effective care bombard us on a regular basis.

The problem today is not the amount of data available to help us make better decisions; the problem is the inaccessibility of the data. When different users – physicians, allied health professionals, administrators and financial managers – turn to data for decision support, they find themselves limited to their own silos of information. The inability to access and share data across different disciplines within the healthcare organization prevents the user from making a decision based on a holistic view of the patient or operational process.

In a recent article, Alan Portela points out that precision medicine, which requires “the ability to collect real-time data from medical devices at the moment of care,” cannot happen easily without interoperability – the ability to access data across disparate systems and applications. He also points out that interoperability does not exist yet in healthcare.

Why are healthcare IT departments struggling to achieve interoperability?

Although new and improved applications are adopted on a regular basis, healthcare organizations are just now realizing that their integration middleware is no longer able to handle new types of data such as social media, the volume of data and the increasing number of methods to connect on a real-time basis. Their integration platforms also cannot handle the exchange of information from disparate data systems and applications beyond the four walls of hospitals. In fact, hospitals of 500 beds or more average 25 unique data sources with six electronic medical records systems in use. Those numbers will only move up over time, not down.

Integration engines in place throughout healthcare today were designed well before the explosion of the data-collection tools and digital information that exist today. Although updates and additions to integration platforms have enabled some interoperability, the need for complete interoperability is creating a movement to replace integration middleware with cloud-based managed services.

A study by the Aberdeen Group reveals that 76 percent of organizations will be replacing their integration middleware, and 70 percent of those organizations will adopt cloud-based integration solutions in the next three years.

The report also points out that as healthcare organizations move from an on-premises solution to a cloud-based platform, business leaders see migration to the cloud and managed services as a way to better manage operational expenses on a monthly basis versus large, up-front capital investments. An additional benefit is better use of in-house IT staff members who are tasked with mission critical, day-to-day responsibilities and may not be able to focus on continuous improvements to the platform to ensure its ability to handle future needs.

Healthcare has come a long way in the adoption of technology that can collect essential information and put it in the hands of clinical and operational decision makers. Taking that next step to effective, meaningful interoperability is critical.

As a leading provider of healthcare interoperability solutions, Liaison is a proud sponsor of Healthcare Scene. It is only through discussions and information-sharing among Health IT professionals that healthcare will achieve the organizational support for the steps required for interoperability.

Join John Lynn and Liaison for an insightful webinar on October 5, titled: The Future of Interoperability & Integration in Healthcare: How can your organization prepare?

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

Healthcare Interoperability and Standards Rules

Posted on September 11, 2017 | Written By

Sunny is a serial entrepreneur on a mission to improve quality of care through data science. Sunny’s last venture, docBeat, a healthcare care coordination platform, was successfully acquired by Vocera Communications. Sunny has an impressive track record of strategy, business development, innovation and execution in the healthcare, casino entertainment, retail and gaming verticals. Sunny has been the Co-Chair of the Las Vegas Chapter of the Akshaya Patra Foundation (www.foodforeducation.org) since 2010.

Dave Winer is a true expert on standards. I remember coming across him in the early days of social media when every platform was considering some sort of API. To illustrate his early involvement in standards, Dave was one of the early developers of the RSS standard that is now available on every blog and many other places.

With this background in mind, I was extremely fascinated by a manifesto that Dave Winer published earlier this year that he calls “Rules for Standards-Makers.” Sounds like something we really need in healthcare, no?

You should really go and read the full manifesto if you’re someone involved in healthcare standards. However, here’s the list of rules Dave offers standards makers:

  1. There are tradeoffs in standards
  2. Software matters more than formats (much)
  3. Users matter even more than software
  4. One way is better than two
  5. Fewer formats is better
  6. Fewer format features is better
  7. Perfection is a waste of time
  8. Write specs in plain English
  9. Explain the curiosities
  10. If practice deviates from the spec, change the spec
  11. No breakage
  12. Freeze the spec
  13. Keep it simple
  14. Developers are busy
  15. Mail lists don’t rule
  16. Praise developers who make it easy to interop

If you’ve never had to program to a standard, then you might not understand these. However, those who are deep into standards will understand the pitfalls. Plus, you’ll have horror stories about when you didn’t follow these rules and what challenges that caused for you going forward.

The thing I love most about Dave’s rules is that they focus on simplicity and function. Unfortunately, many standards in healthcare are focused on complexity and perfection. Healthcare has nailed the complexity part, and as Dave’s rules highlight, perfection is impossible with standards.

In fact, I skipped over Dave’s first rule for standards makers which highlights the above really well:

Rule #1: Interop is all that matters

As I briefly mentioned in the last CXO Scene podcast, many healthcare CIOs are waiting until the standards are perfect before they worry about interoperability. It’s as if they think that waiting for the perfect standard is going to solve healthcare interoperability. It won’t.

I hope that those building out standards in healthcare will take a deep look at the rules Dave Winer outlines above. We need better standards in healthcare and we need healthcare data to be interoperable.