
From Fragmented to Coordinated: The Big Data Challenge

Posted on November 27, 2018 | Written By

The following is a guest blog post by Patty Sheridan, MBA, RHIA, FAHIMA; SVP, Life Sciences at Ciox.

When healthcare organizations have access to as much data as possible, that translates into improved coordination and quality of care, reduced costs for patients, payers and providers, and more efficient medical care. Yet, there is a void in the healthcare data landscape when it comes to securing the right information to make the right decision at the right time. It is becoming increasingly critical to ensure that providers understand data and are able to properly utilize it. Technologies are emerging today that can help deliver a full picture of a patient’s health data, which can lead to more consistent care and the development of improved therapies by helping providers derive better insights from clinical data.

Across the country, patient data is scattered among multiple systems, in a variety of structured and unstructured formats. The lack of interoperability makes it difficult for organizations to access the data they need to run programs that are critical to patient care. Often, various departments within an organization seek the same information and request it separately and repeatedly, leading to a fragmented picture of a patient’s health status.

Managing Complexity, Inside and Out

While analytics tools work well within select facilities and research communities, these vast data sets and the useful information within them are very complex, especially when combined with data sets from outside organizations. The current state of data illiquidity even makes it challenging to seamlessly share and use data within an organization.

For example, in the life sciences arena, disease staging is often the foundation needed to identify a sample of patients and to link to other relevant data, which is then abstracted and mined for real-world use; yet clinical and patient-reported data is rarely documented in a consistent manner in EHRs. Not only do providers often equivocate and contradict their own documentation, but EHR conventions also promote errors in the documentation of diagnostic findings. Much of the documentation can be found in unstructured EHR notes that require a combination of abstraction and clinician review to determine the data’s relevance.
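
To make that abstraction-plus-review workflow concrete, here is a minimal sketch in Python. It scans free-text notes for disease-staging mentions and flags conflicting or missing findings for clinician review; the patterns and the sample note are illustrative assumptions, not any vendor's actual pipeline.

```python
import re

# Illustrative patterns only; a production pipeline pairs NLP with clinician
# review rather than trusting regular expressions alone.
STAGE_PATTERN = re.compile(r"\bstage\s+(0|I{1,3}V?|[0-4])[ABC]?\b", re.IGNORECASE)
TNM_PATTERN = re.compile(r"\bT[0-4x]\s*N[0-3x]\s*M[01x]\b", re.IGNORECASE)

def extract_staging(note_text: str) -> dict:
    """Pull candidate staging mentions and decide whether a clinician must review."""
    stages = {m.group(0).upper() for m in STAGE_PATTERN.finditer(note_text)}
    tnm = {m.group(0).upper().replace(" ", "") for m in TNM_PATTERN.finditer(note_text)}
    needs_review = len(stages) > 1 or (not stages and not tnm)  # conflicting or absent
    return {"stages": sorted(stages), "tnm": sorted(tnm), "needs_review": needs_review}

note = "Pathology consistent with stage II disease; prior note documented Stage III. T2 N0 M0."
print(extract_staging(note))  # conflicting stages -> flagged for clinician review
```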

Improved Interoperability, Improved Outcomes

Problems with EHR interoperability continue to obstruct care coordination, health data exchange and clinical efficiency. EHRs are designed and developed to support patient care delivery but, in today’s world of value-based care, the current state of EHR interoperability is insufficient at best.

Consider the difficulty in collecting a broad medical data set. The three largest EHRs combined still corner less than one-third of the market, and there are hundreds of active EHR vendors across the healthcare landscape, each bringing its own unique approach to the information transfer equation. Because many hospitals use more than one EHR, tracking down records for a single patient at a single hospital often requires connecting to multiple systems. To collect a broader population data set would require ubiquitous connection to all of the hundreds of EHR vendors across the country.

The quality integration of health data systems is essential for patients with chronic conditions, for example. Patients with more serious illnesses often require engagement with several specialists, which means it is particularly important that the findings and data from each specialist are succinctly and properly communicated to fellow doctors and care providers.

Leveraging Technology

As the industry matures in its use of data, emerging technologies are beginning to break down information roadblocks. Retrieving, digitizing and delivering medical records is a complex endeavor, and technology must be layered within all operations to streamline data acquisition and make executable data available at scale, securing population-level data more quickly and affordably.

When planning to take advantage of new advanced technologies, seek a vendor partner that provides a mix of traditional and emerging technologies, including robotic process automation (RPA), computer vision, natural language processing (NLP) and machine learning. All of these technologies serve vital functions:

  • RPA can be used to streamline manually intensive and repetitive systematic tasks, increasing the speed and quality at which clinical and administrative data are retrieved from the various end-point EHRs and specialty systems.
  • NLP and neural networks can analyze the large volume of images and text received to extract, organize and provide context to coded content, dealing with ambiguous data and packaging the information in an agreed-upon standard (see the sketch after this list).
  • With machine learning, an augmented workforce can be equipped to increase the quality of records digitization and the continuous learning across the ecosystem, where every touchpoint is a learning opportunity.
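
As a toy illustration of that NLP step, the sketch below (in Python) pulls a blood-pressure reading out of free text and packages it in a simplified, FHIR-style structure. The field names are abbreviated for readability and the regular expression is an assumption; production systems handle far more variation and ambiguity.

```python
import re
from datetime import datetime, timezone

# Toy NLP step: pull a blood-pressure reading out of free text and package it
# as a simplified, FHIR-style Observation. Field names are abbreviated and the
# pattern is an assumption; real systems handle far more variation.
BP_RE = re.compile(r"\b(\d{2,3})\s*/\s*(\d{2,3})\s*mm\s*hg\b", re.IGNORECASE)

def bp_observation(note_text: str, patient_id: str):
    match = BP_RE.search(note_text)
    if match is None:
        return None  # ambiguous or absent: leave for human abstraction
    systolic, diastolic = int(match.group(1)), int(match.group(2))
    return {
        "resourceType": "Observation",
        "subject": {"reference": f"Patient/{patient_id}"},
        "code": {"text": "Blood pressure"},
        "component": [
            {"code": {"text": "Systolic"}, "valueQuantity": {"value": systolic, "unit": "mmHg"}},
            {"code": {"text": "Diastolic"}, "valueQuantity": {"value": diastolic, "unit": "mmHg"}},
        ],
        "issued": datetime.now(timezone.utc).isoformat(),
    }

print(bp_observation("BP recorded at 142/88 mmHg during intake.", "example-123"))
```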

Smarter, faster and higher-quality systems of information exchange will soon be the catalysts that lead to paradigm-shifting improvements in the U.S. care ecosystem, such as:

  • Arming doctors with relevant information about patients
  • Increasing claims accuracy and accelerating providers’ payments
  • Empowering universities and research organizations with timely, accurate and clinically relevant data sets
  • Correlating epidemics with the preparedness of field teams
  • Alerting pharmacists with drug-interaction warnings

Ultimately, improving information exchange will enable healthcare industry professionals to elevate patient safety and quality, reduce medical and coding errors tenfold and enhance operational efficiencies by providing the relevant data needed to quickly define treatment.

Achieving this paradigm shift depends almost entirely on taking the necessary steps to adopt these emerging technologies and drive a systematic redesign of many of our operations and systems. Only then will we access the insights necessary to truly impact the quality of care across the healthcare landscape.

About Ciox
Ciox, a health technology company and proud sponsor of Healthcare Scene, is dedicated to significantly improving U.S. health outcomes by transforming clinical data into actionable insights. Combined with an unmatched network offering ubiquitous access to healthcare data, Ciox’s expertise, relationships, technology and scale allow for the extraction of insights from structured and unstructured clinical data to create value for healthcare stakeholders. Through its HealthSource technology platform, which includes solutions for data acquisition, release of information, clinical coding, data abstraction, and analytics, Ciox helps clients securely and consistently solve the last mile challenges in clinical interoperability. Ciox improves data management and sharing by modernizing workflows and increasing the accuracy and flow of information, while providing transparency across the healthcare ecosystem and helping clients manage disparate medical records. Learn more at www.ciox.com.

Interoperability Is On An Accelerated Trajectory Says Redox CEO

Posted on November 16, 2018 | Written By

Colin Hung is the co-founder of the #hcldr (healthcare leadership) tweetchat, one of the most popular and active healthcare social media communities on Twitter. Colin speaks, tweets and blogs regularly about healthcare, technology, marketing and leadership. He is currently an independent marketing consultant working with leading healthIT companies. Colin is a member of #TheWalkingGallery. His Twitter handle is: @Colin_Hung.

The lack of interoperability in healthcare continues to be a vexing challenge for Health IT companies, IT departments and patients. Redox is a company taking a unique approach to solving this problem. They have built a platform of reusable data connections between healthcare providers and the innovative cloud-based companies that have products those providers want to use.

Redox recently held their second annual Interoperability Summit at the new CatalystHTI facility in Denver, Colorado. Over three hundred people attended the event. The diverse audience included startups, hospitals, large HealthIT vendors, payors and government health agencies. The sessions at the Summit reflected the diversity of the audience and ranged from “Hacking the Health System Sales Cycle” to “FHIR: Be the Right Amount of Excited”.

During the Summit, I sat down with Redox CEO, Luke Bonney, to talk about the state of interoperability, the willingness of the industry to share data and what advice he has for the dozens of startups that approach Redox each month.

Below is a transcript of our conversation.

What is the state of healthcare interoperability today?

I think we are in a good state right now, but more importantly I think we are on an accelerated trajectory to something better.

An accelerated trajectory?

Yes, but in order to explain why I’m saying that, we have to take a step back.

In my opinion healthcare interoperability is inextricably tied to the adoption and migration to the cloud. We will never have true data liquidity, which is the state that everyone wants – physicians, clinicians, administrators, patients, providers, payers, etc – until healthcare fully embraces cloud architectures and cloud thinking.

Healthcare is still predominantly an “on-premise” world. It’s not wrong. It’s just how the industry has grown up. We installed servers behind our own firewalls. As we added systems we bought more servers and of course we added them to the other servers behind the firewall. Eventually we built connections between these systems so that they could talk to each other. But because everything was behind the firewall and because we were really just sharing data within the same organization, we didn’t give much thought to sharing that data in a standard way. As long as we were behind the firewall we could safely exchange data.

When you approach things from a cloud perspective, the thinking is completely different. When you build cloud applications you HAVE TO think about data portability and security. You HAVE TO work out ways to connect systems together across the Internet without a single big firewall acting as your shield.

So as people move more and more to this way of thinking we will see more movement towards frictionless data exchange.

So is healthcare moving more to the cloud?

Working at EPIC and now at Redox, I’ve had a front-row seat to this change in attitude towards the cloud by healthcare providers. Prior to 2015 healthcare IT leaders were still asking “What is the cloud?” and “Why should I bother with it?”. But today leaders are starting to ask “How can I better leverage the cloud for my organization?” It’s great to see so many proactively looking for ways to adopt cloud-based applications.

I also think that the consumer tech giants are helping propel healthcare forward. Companies like Amazon and Google have always been cloud-based. As they push into healthcare they are going to have a huge advantage versus on-premise legacy companies. As they gain traction so too will the cloud.

I can see how embracing the cloud will help healthcare achieve secure connectivity and certainly scalability, but even if we move completely to the cloud won’t we still need to exchange data in a standard way in order to achieve true interoperability?

Having a data standard would certainly be helpful.

Is that going to be HL7 v2? v3? FHIR? Smart-on-FHIR? Or something that Commonwell Alliance puts out?

(Laughing). We do seem to have a lot of standards, don’t we?

Actually this is what is driving Redox. There really isn’t a ton of incentive to tear out the investments already made in HL7 v2 or v3. It works for the use cases where it has been deployed. The same applies to FHIR and Commonwell. All these approaches work wonderfully for specific use cases, but I really doubt any one of these approaches is going to be the single solution for all of our interoperability challenges.

Think about it. If I’m a CIO at a hospital and I have a working HL7 v2 integration between two systems, why would I waste precious resources to move to a different integration standard if there is really nothing to be gained from it? It’d be a waste of time and resources.

The one good thing about all these standards and interoperability initiatives is that we are building an audience that is asking the right questions and pushing healthcare in the right direction. APIs are the right thing to do. FHIR is the right thing to do…and so on. All are relevant and needed.

So if not a universal data standard, what do we need?

The way I see things, we might not need a single data standard if someone can build a common platform through which data can be shared. That’s what we’re doing here at Redox. We’re taking a pragmatic approach. Whatever data standard you are using internally is fine with us. We’ll work with you to find a way to share your data through our platform. And once you share it with us once, you don’t have to rebuild that connection over and over again each time a different company wants to connect. We handle that.
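
To illustrate the “translate once, reuse everywhere” idea (not Redox’s actual API, just a hypothetical sketch), the Python snippet below parses the PID segment of an HL7 v2 message into a neutral, JSON-ready structure that any connected application could consume.

```python
# A minimal sketch of the "translate once, reuse everywhere" idea: parse the
# PID segment of an HL7 v2 message into a neutral, JSON-ready dict. This is
# not Redox's API; real engines handle escaping, repetitions and many more
# message types.
RAW_HL7 = (
    "MSH|^~\\&|SENDER|FACILITY|RECEIVER|FACILITY|202411050830||ADT^A01|12345|P|2.3\r"
    "PID|1||MRN-0042^^^HOSPITAL||DOE^JANE||19800201|F"
)

def parse_pid(raw_message: str) -> dict:
    for segment in raw_message.split("\r"):
        fields = segment.split("|")
        if fields[0] == "PID":
            name_parts = fields[5].split("^")  # PID-5: family^given
            return {
                "identifier": fields[3].split("^")[0],  # PID-3: patient ID
                "name": {"family": name_parts[0],
                         "given": name_parts[1] if len(name_parts) > 1 else ""},
                "birthDate": fields[7],  # PID-7: date of birth
                "gender": fields[8],     # PID-8: administrative sex
            }
    raise ValueError("No PID segment found")

print(parse_pid(RAW_HL7))
```

Normalize once, and every new consumer reuses the same connection instead of standing up another point-to-point interface.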

Is that the problem Redox set out to solve?

Actually when we started Redox we really just wanted to make it easier for cloud-based healthcare companies to scale and grow. What we realized is that one of the biggest impediments to growth was integrating legacy on-prem systems with cloud-based applications. Even if these companies could convince hospital IT teams to put their integration on the priority list, it would take a long time to actually get it done.

So we built the Redox engine to make this easier. Our goal wasn’t to solve interoperability per se; we just wanted to bring innovative web developers closer to healthcare providers so that they can solve problems together.

But because we were cloud from Day 1, we wanted to build everything in a reusable way, so that once we built a connection to one hospital, we wouldn’t have to build it again when the next company wanted to connect with that same hospital. This network effect wasn’t something we originally set out to build, but now it’s central to our success. It’s why we can talk about being a platform that enables data sharing vs being a tool that helps systems share data.

Solving interoperability is only partly a technology challenge. There is also the challenge of getting the healthcare ecosystem to actually share their data. Because Redox works with so many players in the ecosystem, have you noticed any change in attitude around sharing data?

Let me start by saying that I think everyone WANTS the data. There’s incredible value in health data. Medical records are a gold mine for researchers, public health authorities, pharma companies, payors, etc. Everyone would love nothing more than to build a comprehensive health record for their own purposes. The challenge of course is that it’s not easy to do that today. As you said, this is partly because of technology and partly because no one really wants to share their data altruistically.

I think there is one party that truly wants data to be shared and that’s patients. Patients are way more interested in sharing data than anyone else in the ecosystem. As a patient, data should follow me wherever I go. I never want to wonder if my doctor has all my medical information. I want people to have the data because I want the best outcome possible and my data can help make that happen.

I think companies and organizations in the healthcare ecosystem are slowly waking up to the fact that sharing data helps support their customers – whether those customers are providers, payors, members, patients, clinicians or government agencies. Sharing data makes things better. And as financial pressures in healthcare mount, everyone is looking for ways to do more, better, faster and with more accuracy. Sharing data is necessary for that to happen.

Redox works a lot with startups and small/medium sized HealthIT companies. What advice would you give to those that are considering working with Redox? What should they have considered?

There are two key questions that I think every HealthIT company should ask themselves. First, what is the value your product or service provides? Second, who is the buyer? Success in healthcare is less about your technology and more about aligning three things:

  1. An actual problem that needs to be solved
  2. A solution to that problem
  3. A buyer who can make a buying decision in a healthcare organization

I see a lot of companies that don’t really consider this last question. You can create an amazing product that solves a problem in healthcare but if the target audience for your product cannot make the buying decision then you have a difficult road ahead of you.

Beyond these questions, I would advise companies to really consider how their products integrate into the clinical or administrative workflow. Many startups begin with an application that isn’t integrated with existing hospital systems, like the EHR. But after they gain a little bit of traction they realize they need to become more integrated. And building real-time data exchange into an application isn’t easy. You need to really think through how your product will handle this.

Lastly I would caution healthcare entrepreneurs about building their applications on the assumption that FHIR will be universally adopted. It isn’t and it will likely take years before it gains real-world traction. There is a lot of excitement around FHIR, but it isn’t the best solution for all situations.

Final Thoughts?

One thing I am encouraged by is the number of people and companies from outside of healthcare that are coming into this space. I think they bring an energy and perspective that will help us all get better. Granted, many of them have stars in their eyes and don’t realize how tough healthcare can be…but, the fact that they aren’t burdened with any legacy thinking is exciting. Healthcare needs more outside thinking.

Taming the Healthcare Compliance and Data Security Monster: How Well Are We Doing?

Posted on October 18, 2018 | Written By

The following is a guest blog post by Lance Pilkington, Vice President of Global Compliance at Liaison Technologies.

Do data breach nightmares keep you up at night?

For 229 healthcare organizations, the nightmare became a reality in 2018. As of late August, more than 6.1 million individuals were affected by 229 healthcare-related breaches, according to the Department of Health and Human Services’ HIPAA Breach Reporting Tool website – commonly called the HIPAA “wall of shame.”

Although security and privacy requirements for healthcare data have been in place for many years, the reality is that many healthcare organizations are still at risk for non-compliance with regulations and for breaches.

In fact, only 65 percent of 112 hospitals and hospital groups recently surveyed by Aberdeen, an industry analyst firm, reported compliance with 11 common regulations and frameworks for data security. According to the healthcare-specific brief – Enterprise Data in 2018: The State of Privacy and Security Compliance in Healthcare – protected health information has the highest percentage of compliance, with 85 percent of participants reporting full compliance, and the lowest compliance rates were reported for ISO 27001 and the General Data Protection Regulation at 63 percent and 48 percent respectively.

An index developed by Aberdeen to measure the maturity of an organization’s compliance efforts shows that although the healthcare organizations surveyed were mature in their data management efforts, they were far less developed in their compliance efforts when they stored and protected data, syndicated data between two applications, ingested data into a central repository or integrated data from multiple, disparate sources.

The immaturity of compliance efforts has real-world consequences for healthcare entities. Four out of five (81 percent) study participants reported at least one data privacy and non-compliance issue in the past year, and two out of three (66 percent) reported at least one data breach in the past year.

It isn’t surprising to find that healthcare organizations struggle with data security. The complexity and number of types of data and data-related processes in healthcare is daunting. In addition to PHI, hospitals and their affiliates handle financial transactions, personally identifiable information, employee records, and confidential or intellectual property records. Adding to the challenge of protecting this information is the ever-increasing use of mobile devices in clinical and business areas of the healthcare organization.

In addition to the complexities of data management and integration, there are budgetary considerations. As healthcare organizations face increasing financial challenges, investment in new technology and the IT personnel to manage it can be formidable. However, healthcare participants in the Aberdeen study reported a median of 37 percent of the overall IT budget dedicated to investment in compliance activities. Study participants from life sciences and other industries included in Aberdeen’s total study reported lower budget commitments to compliance.

This raises the question: If healthcare organizations are investing in compliance activities, why do we still see significant data breaches, fines for non-compliance and difficulty reaching full compliance?

While there are practical steps that every privacy and security officer should take to ensure the organization is compliant with HIPAA, there are also technology options that enhance a healthcare entity’s ability to better manage data integration from multiple sources and address compliance requirements.

An upcoming webinar, The State of Privacy and Security Compliance for Enterprise Data: “Why Are We Doing This Ourselves?”, discusses the Aberdeen survey results and offers thought-provoking guidance on how healthcare IT leaders can evaluate their compliance readiness and identify potential solutions.

One of the solutions is the use of third-party providers who can handle the data integration and management needs of the healthcare organization to ensure compliance with data security requirements. This strategy can also address a myriad of challenges faced by hospitals. Not only can the expertise and specialty knowledge of the third party take a burden off in-house IT staff, but choosing a managed services strategy that eliminates the need for a significant upfront investment moves the expense from the IT capital budget to the operating budget with predictable recurring costs.

Freeing capital dollars to invest in other digital transformation strategies and enabling IT staff to focus on mission-critical activities in the healthcare organization are benefits of exploring outsource opportunities with the right partner.

More importantly, moving toward a higher level of compliance with data security requirements will improve the likelihood of a good night’s sleep!

About Lance Pilkington
Lance Pilkington is the Vice President of Global Compliance at Liaison Technologies, a position he has held since joining the company in September 2012. Lance is responsible for establishing and leading strategic initiatives under Liaison’s Trust program to ensure the company is consistently delivering on its compliance commitments. Liaison Technologies is a proud sponsor of Healthcare Scene.

Connecting the Data: Three Steps to Meet Digital Transformation Goals

Posted on July 16, 2018 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

A white paper published by the World Economic Forum in 2016 begins with the statement, “Few industries have the potential to be changed so profoundly by digital technology as healthcare, but the challenges facing innovators – from regulatory barriers to difficulties in digitalizing patient data – should not be underestimated.”

That was two years ago, and many of the same challenges still exist as the digital transformation of healthcare continues.

In a recent HIMSS focus group sponsored by Liaison, participants identified their major digital transformation and interoperability goals for the near future as:

  • EMR rollout and integration
  • Population health monitoring and analytics
  • Remote clinical encounters
  • Mobile clinical applications

These goals are not surprising. Although EMRs have been in place in many healthcare organizations for years, the growth of health systems as they add physicians, clinics, hospitals and diagnostic centers represents a growing need to integrate disparate systems. The continual increase in the number of mobile applications and medical devices that can be used to gather information to feed into EMR systems further exacerbates the challenge.

What is surprising is the low percentage of health systems that believe that they are very or somewhat well-prepared to handle these challenges – only 35 percent of the HIMSS/Liaison focus group members identified themselves as well-prepared.

“Chaos” was a word used by focus group participants to describe what happens in a health system when numerous players, overlapping projects, lack of a single coordinator and a tendency to find niche solutions that focus on one need rather than overall organizational needs drive digital transformation projects.

It’s easy to understand the frustration. Too few IT resources and too many needs in the pipeline lead to multiple groups of people working on projects that overlap in goals – sometimes duplicating each other’s efforts – and tax limited staff, budget and infrastructure resources. It was also interesting to see that focus group participants noted that new technologies and changing regulatory requirements keep derailing efforts over multi-year projects.

Throughout all the challenges identified by healthcare organizations, the issue of data integrity is paramount. The addition of new technologies, including mobile and AI-driven analytics, and new sources of information, increases the need to ensure that data is in a format that is accessible to all users and all applications. Otherwise, the full benefits of digital transformation will not be realized.

The lack of universal standards to enable interoperability is being addressed, but until those standards are available, healthcare organizations must evaluate other ways to integrate and harmonize data to make it available to the myriad of users and applications that can benefit from insights provided by the information. Unlocking access to previously unseen data takes resources that many health organizations have in short supply. And the truth is, we’ll never have perfect standards, as they will always continue to change, so there’s no reason to wait.

Infrastructure, however, was not the number one resource identified in the HIMSS focus group as lacking in participants’ interoperability journey. In fact, only 15 percent saw infrastructure as the missing piece, while 30 percent identified IT staffing resources and 45 percent identified the right level of expertise as the most critical needs for their organization.

As all industries focus on digital transformation, competition for expert staff to handle interoperability challenges makes it difficult for healthcare organizations to attract the talent needed. For this reason, 45 percent of healthcare organizations outsource IT data integration and management to address staffing challenges.

Health systems are also evaluating the use of managed services strategies. A managed services solution takes over the day-to-day integration and data management with the right expertise and the manpower to take on complex work and fluctuating project levels. That way in-house staff resources can focus on the innovation and efficiencies that support patient care and operations, while the operating budget covers data management fees – leaving capital dollars available for critical patient care needs.

Removing day-to-day integration responsibilities from in-house staff also provides time to look strategically at the organization’s overall interoperability needs – coordinating efforts in a holistic manner. The ability to implement solutions for current needs with an eye toward future needs future-proofs an organization’s digital investment and helps avoid the “app-trap” – a reliance on narrowly focused applications with bounded data that cannot be accessed by disparate users.

There is no one answer to healthcare’s digital transformation questions, but taking the following three steps can move an organization closer to the goal of meaningful interoperability:

  • Don’t wait for interoperability standards to be developed – find a data integration and management platform that will integrate and harmonize data from disparate sources to make the information available to all users the way they need it and when they need it.
  • Turn to a data management and integration partner who can provide the expertise required to remain up-to-date on all interoperability, security and regulatory compliance requirements and other mandatory capabilities.
  • Approach digital transformation holistically with a coordinated strategy that considers each new application or capability as data gathered for the benefit of the entire organization rather than siloed for use by a narrowly-focused group of users.

The digital transformation of healthcare and the interoperability challenges that must be overcome are not minor issues, nor are they insurmountable. It is only through the sharing of ideas, information about new technologies and best practices that healthcare organizations can maximize the insights provided by data shared across the enterprise.

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies, a proud sponsor of Healthcare Scene. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

Healthcare Interoperability Insights

Posted on June 29, 2018 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

I came across this great video by Diameter Health where Bonny Roberts talked with a wide variety of people at the interoperability showcase at HIMSS. If you want to get a feel for the challenges and opportunities associated with healthcare interoperability, take 5 minutes to watch this video:

What do you think of these healthcare interoperability perspectives? Does one of them stand out more than others?

I love the statement that’s on the Diameter Health website:

“We Cure Clinical Data Disorder”

What an incredible way to describe clinical data today. I’m not sure what the ICD-10 code for it is, but there’s definitely a lot of clinical data disorder. It takes a real professional to clean the data, organize the data, enrich the data, and know how to make that data useful to people. It’s not a disorder that most people can treat on their own.

What’s a little bit scary is that this disorder is not going to get any easier. More data is on its way. Better to deal with your disorder now before it becomes a full-on chronic condition.

The Truth about AI in Healthcare

Posted on June 18, 2018 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

Those who watched the television show “The Good Doctor” in its first season got to see how a young autistic surgeon with savant syndrome faces challenges in his everyday life as he learns to connect with people in his world. His extraordinary medical skill and intuition not only save patients’ lives but also create bridges with co-workers.

During each show, there is at least one scene in which the young doctor “visualizes” the inner workings of the patient’s body – evaluating and analyzing the cause of the medical condition.

Although all physicians can describe what happens to cause illness, the speed, detail and clarity of the young surgeon’s ability to gather information, predict reactions to treatments and identify the protocol that will produce the best outcome greatly surpasses his colleagues’ abilities.

Yes, this is a television show, but artificial intelligence promises the same capabilities that will disrupt all of our preconceived notions about healthcare on both the clinical and the operational sides of the industry.

Doctors rely on their medical training as well as their personal experience with hundreds of patients, but AI can allow clinicians to tap into the experience of hundreds of doctors with thousands of patients. Even if physicians had personal experience with thousands of patients, the human mind can’t process all of that data effectively.

How can AI improve patient outcomes as well as the bottom line?

We’re already seeing the initial benefits of AI in many areas of the hospital. A report by Accenture identifies the top three uses of AI in healthcare as robot-assisted surgery, virtual nursing assistants and administrative workflow assistance. These three AI applications alone represent a potential estimated annual benefit of $78 billion for the healthcare industry by 2026.

The benefits of AI include improved precision in surgery, decreased length of stay, reduction in unnecessary hospital visits through remote assessment of patient conditions, and time-saving capabilities such as voice-to-text transcription. According to Accenture, these improvements represent a work time savings of 17 percent for physicians and 51 percent for registered nurses – at a critical time when there is no end in sight for the shortages of both nurses and doctors.

In a recent webinar discussing the role of AI in healthcare, John Lynn, founder of HealthcareScene.com, described other ways that AI can improve diagnosis, treatment and patient safety. These areas include dosage error detection, treatment plan design, determination of medication adherence, medical imaging, tailored prescription medicine and automated documentation.

One of the challenges to fully leveraging the insights and capabilities of AI is the volume of unstructured data accumulated in electronic medical records. Translating this information into a format that clinical providers as well as financial and administrative staff can use to optimize treatment plans and workflows is possible with natural language processing – a branch of AI that enables technology to interpret speech and text and determine which information is critical.

The most often cited fear about reliance on AI in healthcare is the potential for mistakes. Of course, humans make mistakes as well. We must remember that AI’s ability to tap into a much wider pool of information to make decisions or recommend options will result in more deeply informed decisions – if the data is good.

The proliferation of legacy systems, continually added applications and multiple EMRs in a health system increases the risk of data that cannot be accessed or cannot be shared in real-time to aid clinicians or an AI-supported program. Ensuring that data is aggregated into a central location, harmonized, transformed into a usable format and cleaned to provide high quality data is necessary to support reliable AI performance.
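
As a small illustration of that aggregate, harmonize and clean step, the sketch below (Python, with invented data and thresholds) pulls glucose results from two sources, converts units and drops implausible values before anything downstream, AI included, consumes them.

```python
# Minimal sketch of the aggregate / harmonize / clean step that reliable AI
# depends on: pull glucose results from two sources, convert units and drop
# implausible values. Data, field names and thresholds are illustrative.
raw = [
    {"source": "lab_a", "test": "glucose", "value": 5.8, "unit": "mmol/L"},
    {"source": "lab_b", "test": "glucose", "value": 104, "unit": "mg/dL"},
    {"source": "lab_b", "test": "glucose", "value": -2, "unit": "mg/dL"},  # bad row
]

MGDL_PER_MMOL = 18.0  # approximate conversion factor for glucose

def harmonize(records):
    clean = []
    for rec in records:
        value = rec["value"] * MGDL_PER_MMOL if rec["unit"] == "mmol/L" else rec["value"]
        if 10 <= value <= 1000:  # crude plausibility filter
            clean.append({"test": rec["test"], "value_mgdl": round(value, 1)})
    return clean

print(harmonize(raw))  # two usable rows; the negative reading is rejected
```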

While AI might be able to handle the data aggregation and harmonization tasks in the future, we are not there yet. This is not, however, a reason to delay the use of AI in hospitals and other organizations across the healthcare spectrum.

Healthcare organizations can partner with companies that specialize in the aggregation of data from disparate sources to make the information available to all users. Increasing access to data throughout the organization is beneficial to health systems – even before they implement AI tools.

Although making data available to all of the organization’s providers, staff and vendors as needed may seem onerous, it is possible to do so without adding to the hospital’s IT staff burden or the capital improvement budget. The complexities of translating structured and unstructured data, multiple formats and a myriad of data sources can be balanced with data security concerns with the use of a team that focuses on these issues each day.

While most AI capabilities in use today are algorithms that reflect current best practices or research that are programmed by healthcare providers or researchers, this will change. In the future, AI will expand beyond algorithms, and the technology will be able to learn and make new connections among a wider set of data points than today’s more narrowly focused algorithms.

Whether or not your organization is implementing AI, considering AI or just watching its development, I encourage everyone to start by evaluating the data that will be used to “run” AI tools. Taking steps now to ensure clean, easy-to-access data will not only benefit clinical and operational tasks now but will also position the organization to more quickly adopt AI.

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies, a proud sponsor of Healthcare Scene. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

Healthcare Interoperability is Solved … But What Does That Really Mean? – #HITExpo Insights

Posted on June 12, 2018 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

One of the best parts of the new community we created at the Health IT Expo conference is the way attendees at the conference and those in the broader healthcare IT community engage on Twitter using the #HITExpo hashtag before, during, and after the event.  It’s a treasure trove of insights, ideas, practical innovations, and amazing people.  Don’t forget that last part since social media platforms are great at connecting people even if they are usually in the news for other reasons.

A great example of the knowledge sharing that happens on the #HITExpo hashtag came from Don Lee (@dflee30), who runs #HCBiz, a long-running podcast which he recorded live from Health IT Expo.  After the event, Don shared what he thought was the most important conversation from the conference: “Solving Interoperability.”  You can read his thoughts on Twitter or we’ve compiled all 23 tweets for easy reading below (A Big Thanks to Thread Reader for making this easy).

As shared by Don Lee:

1/ Finally working through all my notes from the #HITExpo. The most important conversation to me was the one about “solving interoperability” with @RasuShrestha, @PaulMBlack and @techguy.

2/ Rasu told the story of what UPMC accomplished using DBMotion. How it enabled the flow of data amongst the many hospitals, clinics and docs in their very large system. #hitexpo

3/ John challenged him a bit and said: it sounds like you’re saying that you’ve solved #interoperability. Is that what you’re telling us? #hitexpo

4/ Rasu explained in more detail that they had done the hard work of establishing syntactic interop amongst the various systems they dealt with (I.e. they can physically move the data from one system to another and put it in a proper place). #hitexpo

5/ He went on and explained how they had then done the hard work of establishing semantic interoperability amongst the many systems they deal with. That means now all the data could be moved, put in its proper place, AND they knew what it meant. #hitexpo

6/ Syntactic interop isn’t very useful in and of itself. You have data but it’s not mastered and not yet useable in analytics. #hitexpo

7/ Semantic interop is the mastering of the data in such a way that you are confident you can use it in analytics, ML, AI, etc. Now you can, say, find the most recent BP for a patient pop regardless of which EMR in your system it originated. And have confidence in it. #hitexpo

8/ Semantic interop is closely related to the concept of #DataFidelity that @BigDataCXO talks about. It’s the quality of data for a purpose. And it’s very hard work. #hitexpo

9/ In the end, @RasuShrestha’s answer was that UPMC had done all of that hard work and therefore had made huge strides in solving interop within their system. He said “I’m not flying the mission accomplished banner just yet”. #hitexpo

10/ Then @PaulMBlack – CEO at @Allscripts – said that @RasuShrestha was being modest and that they had in fact “Solved interoperability.”

I think he’s right and that’s what this tweet storm is about. Coincidentally, it’s a matter of semantics. #hitexpo

11/ I think Rasu dialed it back a bit because he knew that people would hear that and think it means something different. #hitexpo

12/ The overall industry conversation tends to be about ubiquitous, semantic interop where all data is available everywhere and everyone knows what it means. I believe Rasu was saying that they hadn’t achieved that. And that makes sense… because it’s impossible. #hitexpo

13/ @GraceCordovano asked the perfect question and I wish there had been a whole session dedicated to answering it: (paraphrasing) What’s the difference between your institutional definition of interop and what the patients are talking about? #hitexpo

14/ The answer to that question is the crux of our issue. The thing patients want and need is for everyone who cares for them to be on the same page. Interop is very relevant to that issue, obviously, but there’s a lot of friction and it goes way beyond tech. #hitexpo

15/ Also, despite common misconception, no other industry has solved this either. Sure, my credit card works in Europe and Asia and gets back to my bank in the US, but that’s just a use case. There is no ubiquitous semantic interop between JP Morgan Chase and HSBC.

16/ There are lots of use cases that work in healthcare too. E-Prescribing, claims processing and all the related HIPAA transactions, etc. #hitexpo

17/ Also worth noting… Canada has single payer system and they also don’t have clinical interoperability.

This is not a problem unique to healthcare nor the US. #hitexpo

18/ So healthcare needs to pick its use cases and do the hard work. That’s what Rasu described on stage. That’s what Paul was saying has been accomplished. They are both right. And you can do it too. #hitexpo

19/ So good news: #interoperability is solved in #healthcare.

Bad news: It’s a ton of work and everyone needs to do it.

More bad news: You have to keep doing it forever (it breaks, new partners, new sources, new data to care about, etc). #hitexpo

19/ Some day there will be patient mediated exchange that solves the patient side of the problem and does it in a way that works for everyone. Maybe on a #blockchain. Maybe something else. But it’s 10+ years away. #hitexpo

20/ In the meantime my recommendation to clinical orgs – support your regional #HIE. Even UPMC’s very good solution only works for data sources they know about. Your patients are getting care outside your system and in a growing # of clinical and community based settings. #hitexpo

21/ the regional #HIE is the only near-term solution that even remotely resembles semantic, ubiquitous #interoperability in #healthcare.
#hitexpo

22/ My recommendation to patients: You have to take matters into your own hands for now. Use consumer tools like Apple health records and even Dropbox like @ShahidNShah suggested in another #hitexpo session. Also, tell your clinicians to support and use the regional #HIE.

23/ So that got long. I’ll end it here. What do you think?

P.S. the #hitexpo was very good. You should check it out in 2019.

A big thank you to Don Lee for sharing these perspectives and diving in much deeper than we can do in 45 minutes on stage. This is what makes the Health IT Expo community special: people with a deep understanding of a problem fleshing out its realities so we can better understand how to address it. Plus, the sharing happens year round as opposed to just during a few days at the conference.
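
To make Don’s syntactic-versus-semantic distinction concrete, here is a minimal Python sketch: two hypothetical EMR exports store blood pressure in different shapes, and only after both are mapped onto one model (the semantic step) can you answer “most recent BP for this patient” across systems. All field names and records are invented for illustration.

```python
from datetime import datetime

# Hypothetical exports from two EMRs that record blood pressure differently.
emr_a = [{"patient": "p1", "obs": "BP", "value": "130/85", "ts": "2018-05-01T09:30:00"}]
emr_b = [{"pid": "p1", "loinc": "85354-9", "systolic": 128, "diastolic": 82,
          "recorded": "2018-06-12T14:00:00"}]

def harmonize():
    """Map both feeds onto one shape: the 'semantic' step Don describes."""
    unified = []
    for row in emr_a:
        if row["obs"] == "BP":
            sys_v, dia_v = (int(x) for x in row["value"].split("/"))
            unified.append({"patient": row["patient"], "systolic": sys_v,
                            "diastolic": dia_v, "ts": datetime.fromisoformat(row["ts"])})
    for row in emr_b:
        if row["loinc"] == "85354-9":  # LOINC code for a blood pressure panel
            unified.append({"patient": row["pid"], "systolic": row["systolic"],
                            "diastolic": row["diastolic"],
                            "ts": datetime.fromisoformat(row["recorded"])})
    return unified

def latest_bp(patient_id):
    readings = [r for r in harmonize() if r["patient"] == patient_id]
    return max(readings, key=lambda r: r["ts"], default=None)

print(latest_bp("p1"))  # returns the June reading that originated in EMR B
```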

Speaking of which, what do you think of Don’s thoughts above? Is he right? Is there something he’s missing? Is there more depth to this conversation that we need to understand? Share your thoughts, ideas, insights, and perspectives in the comments or on social media using the #HITExpo hashtag.

Reasonable and Unreasonable Healthcare Interoperability Expectations

Posted on February 12, 2018 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Other than EMR and EHR, I don’t think there’s any topic I’ve written about more than healthcare interoperability. It’s a challenging topic with a lot of nuances. Plus, it’s a subject which would benefit greatly if we could make it a reality. However, after all these years I’m coming to some simple conclusions that I think often get lost in most discussions. Especially those in the healthcare IT media.

First, we all know that it would be wonderful for all of your healthcare records to be available to anyone who needs them at any time and in any place and not available to those who shouldn’t have access to them. I believe that in the next 15 years, that’s not going to happen. Sure, it would be great if it did (we all see that), but I don’t see it happening.

The reasons why are simple. Our healthcare system doesn’t want it to happen and there aren’t enough benefits to the system to make it happen.

Does that mean we should give up on interoperability? Definitely not!

Just because we can’t have perfect healthcare interoperability doesn’t mean we shouldn’t create meaningful interoperability (Yes, I did use the word meaningful just to annoy you).

I think one of the major failures of most interoperability efforts is that they’re too ambitious. They try to do everything and since that’s not achievable, they end up doing nothing. There are plenty of reasonable interoperability efforts that make a big difference in healthcare. We can’t let the perfect be the enemy of better. That’s been exactly what’s happened with most of healthcare interoperability.

At the HIMSS conference next month, they’re going to once again have an interoperability showcase full of vendors that can share data. If HIMSS were smart, they’d do away with the showcase and instead only allow those vendors to show dashboards of the amount of data that’s actually being transferred between organizations in real time. We’d learn a lot more from seeing interoperability that’s really happening as opposed to seeing interoperability that could happen but doesn’t because organizations don’t want that type of interoperability to happen.

Interoperability is a challenging topic, but we make it harder than it needs to be because we want to share everything with everyone. I’m looking for companies that are focused on slices of interoperability that practically solve a problem. If you have some of these, let us know about them in the comments.

Breaking Bad: Why Poor Patient Identification is Rooted in Integration, Interoperability

Posted on December 20, 2017 | Written By

The following is a guest blog post by Dan Cidon, Chief Technology Officer, NextGate.

The difficulty surrounding accurate patient ID matching is rooted in integration and interoperability.

Coordinated, accountable, patient-centered care is reliant on access to quality patient data. Yet, healthcare continues to be daunted by software applications and IT systems that don’t communicate or share information effectively. Health data, spread across multiple source systems and settings, breeds encumbrances in the reconciliation and de-duplication of patient records, leading to suboptimal outcomes and avoidable costs of care. For organizations held prisoner by their legacy systems, isolation and silo inefficiencies worsen as IT environments become increasingly more complex, and the growth and speed to which health data is generated magnifies.

A panoramic view of individuals across the enterprise is a critical component for value-based care and population health initiatives. Accurately identifying patients, and consistently matching them with their data, is the foundation for informed clinical decision-making, collaborative care, and healthier, happier populations. As such, the industry has seen a number of high-profile initiatives in the last few years attempting to address the issue of poor patient identification.

The premature end of CHIME’s National Patient ID Challenge last month should be a sobering industry reminder that a universal solution may never be within reach. However, the important lesson emanating in the wake of the CHIME challenge is that technology alone will not solve the problem. Ultimately, the real challenge of identity management and piecing together a longitudinal health record has to do with integration and interoperability. More specifically, it revolves around the demographics and associated identifiers dispersed across multiple systems.

Because these systems often have little reason to communicate with one another, and because they store their data through fragmented architecture, an excessive proliferation of identifiers occurs. The result is unreliable demographic information, triggering further harm in data synchronization and integrity.

Clearly, keeping these identifiers and demographics as localized silos of data is an undesirable model for healthcare that will never function properly. While secondary information such as clinical data should remain local, the core identity of a patient and basic demographics including name, gender, date of birth, address and contact information shouldn’t be in the control of any single system. This information must be externalized from these insulated applications to maintain accuracy and consistency across all connected systems within the delivery network.

Fortunately, there are long-standing and relatively simple standards in place, such as HL7 PIX/PDQ, that allow systems to feed a central demographic repository and query that repository for data. Every year for the past eight years, NextGate has participated in the annual IHE North American Connectathon – the healthcare industry’s largest interoperability testing event. Year after year, we see hundreds of other participating vendors demonstrating that, with effective standards, it is indeed possible to externalize patient identity.

In the United Kingdom, for example, there has been slow but steady success of the Patient Demographic Service – a relatively similar concept of querying a central repository for demographics and maintaining a global identifier. While implementation of such a national scale service in the U.S. is unlikely in the near-term, the concept of smaller scale regional registries is clearly an achievable goal. And every deployment of our Enterprise Master Patient Index (EMPI) is a confirmation that such systems can work and do provide value.

What is disappointing is that very few systems in actual practice today will query the EMPI as part of the patient intake process. Many, if not most, of the systems we integrate with fulfill only half of the bargain: they feed the EMPI with demographic data and identifiers. This is because many systems have already been designed to produce this outbound communication for purposes other than the management of demographic data. Querying the EMPI for patient identity, by contrast, requires a fundamental paradigm shift for many vendors and a modest investment to enhance their software. Rather than relying solely on their limited view of patient identity, they are expected to query an outside source and integrate that data into their local repository.
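
What would querying the EMPI at intake look like? Here is a hypothetical sketch of a PDQm-style (FHIR Patient) demographics query in Python; the endpoint URL and the surrounding registration flow are assumptions for illustration, not any particular product’s API.

```python
import requests

# Hypothetical PDQm-style demographics query at patient intake: ask the EMPI
# for matching patients before creating a new local record. The base URL and
# registration flow are illustrative assumptions.
EMPI_BASE = "https://empi.example.org/fhir"

def find_patient(family: str, birth_date: str):
    resp = requests.get(
        f"{EMPI_BASE}/Patient",
        params={"family": family, "birthdate": birth_date},
        timeout=10,
    )
    resp.raise_for_status()
    bundle = resp.json()
    return [entry["resource"] for entry in bundle.get("entry", [])]

# Query first; register locally only if the enterprise index has no match.
matches = find_patient("Doe", "1980-02-01")
if not matches:
    print("No EMPI match: register as new and feed the new identity back.")
else:
    print(f"Reusing enterprise identifier {matches[0]['id']} from the EMPI.")
```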

This isn’t rocket science, and yet there are so few systems in production today that initiate this simple step. Worse yet, we see many healthcare providers resorting to band-aids to remedy the deficiency, such as ineffective screen-scraping technology to manually transfer data from the EMPI to their local systems.

With years of health IT standards in place that yield a centralized and uniform way of managing demographic data, the meager pace and progress of vendors to adopt them is troubling. It is indefensible that a modern registration system, for instance, wouldn’t have this querying capability as a default module. Yet, that is what we see in the field time and time again.

With other verticals such as banking and manufacturing leveraging standards-based exchange at a much faster pace, it begs the question: how can healthcare accelerate this type of adoption? As we prepare for the upcoming IHE Connectathon in January, we pose our own challenge to the industry: engage in an open and frank dialogue to identify what the barriers are and how vendors can be incentivized, so that patients can benefit from the free flow of accurate, real-time data from provider to provider.

Ultimately, accurate patient identification is a fundamental component to leveraging IT for the best possible outcomes. Identification of each and every individual in the enterprise helps to ensure better care coordination, informed clinical decision making, and improved quality and safety.

Dan Cidon is CTO and co-founder of NextGate, a leader in healthcare identity management, managing nearly 250 million lives for health systems and HIEs in the U.S. and around the globe.

Waiting For The Perfect “Standard” Is Not The Answer To Healthcare’s Interoperability Problem

Posted on October 16, 2017 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

Have you bought into the “standards will solve healthcare’s interoperability woes” train of thought? Everyone understands that standards are necessary to enable disparate systems to communicate with each other, but as new applications and new uses for data continually appear, healthcare organizations that are waiting for universal standards are not maximizing the value of their data. More importantly, they will be waiting a long time to realize the full potential of their data.

Healthcare interoperability is not just a matter of transferring data as an entire file from one user to another. Instead, effective exchange of information allows each user to select which elements of a patient’s chart are needed, and then access them in a format that enables analysis of different data sets to provide a holistic picture of the patient’s medical history or clinical trends in a population of patients. Healthcare’s interoperability challenge is further exacerbated by different contextual interpretations of the words within those fields. For instance, how many different ways are there to say heart attack?
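
A toy example of that normalization problem in Python: many surface forms, one underlying concept. The mapping below is an illustrative assumption; real systems rely on terminology services built on vocabularies such as SNOMED CT and ICD-10.

```python
# Toy normalization: many surface forms, one concept. The mapping is an
# illustrative assumption; real systems use terminology services built on
# vocabularies such as SNOMED CT and ICD-10.
SYNONYMS = {
    "heart attack": "I21.9",            # acute myocardial infarction, unspecified
    "myocardial infarction": "I21.9",
    "acute mi": "I21.9",
    "stemi": "I21.9",                   # real mappings are finer-grained than this
}

def normalize(term: str):
    return SYNONYMS.get(term.strip().lower())  # None means "send to human review"

for phrase in ["Heart attack", "Myocardial Infarction", "STEMI", "chest pain"]:
    print(phrase, "->", normalize(phrase))
```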

The development of the Health Level Seven (HL7®) FHIR®, which stands for Fast Healthcare Interoperability Resources, represents a significant step forward to interoperability. While the data exchange draft that is being developed and published by HL7 eliminates many of the complexities of earlier HL7 versions and facilitates real-time data exchange via web technology, publication of release 4 – the first normative version of the standard – is not anticipated until October 2018.

As these standards are further developed, the key to universal adoption will be simplicity, according to John Lynn, founder of HealthcareScene.com. However, he suggests that CIOs stop waiting for “perfect standards” and focus on how they can best achieve interoperability now.

Even with standards that can be implemented in all organizations, the complexity and diversity of the healthcare environment means that it will take time to move everyone to the same standards. This is complicated by the number of legacy systems and patchwork of applications that have been added to healthcare IT systems in an effort to meet quickly changing needs throughout the organization. Shrinking financial resources for capital investment and increasing competition for IT professionals limits a health system’s ability to make the overall changes necessary for interoperability – no matter which standards are adopted.

Some organizations are turning to cloud-based, managed service platforms to perform the integration, aggregation and harmonization that makes data available to all users – regardless of the system or application in which the information was originally collected. This approach solves the financial and human resource challenges by making it possible to budget integration and data management requirements as an operational rather than a capital investment. This strategy also relieves the burden on in-house IT staff by relying on the expertise of professionals who focus on emerging technologies, standards and regulations that enable safe, compliant data exchange.

How are you planning to scale your interoperability and integration efforts?  If you're waiting for standards, why are you waiting?

As a leading provider of healthcare interoperability solutions, Liaison is a proud sponsor of Healthcare Scene. While the conversation about interoperability has been ongoing for many years, ideas, new technology and new strategies discussed and shared by IT professionals will lead to successful healthcare data exchange that will transform healthcare and result in better patient care.

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.