
VA Lighthouse Lab – Is the Healthcare Industry Getting It Right?

Posted on April 30, 2018 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network, which currently consists of 10 blogs containing over 8,000 articles, more than 4,000 of which John has written himself. These EMR and Healthcare IT articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading Health IT career job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. He is highly involved in social media, and in addition to his blogs can be found on Twitter (@techguy and @ehrandhit) and LinkedIn.

The following is a guest blog by Monica Stout from MedicaSoft

The U.S. Department of Veterans Affairs announced the launch of its Lighthouse Lab platform at HIMSS18 earlier this year. Lighthouse Lab is an open API framework that gives software developers the tools to create mobile and web applications that help veterans manage their VA care, services, and benefits. It is also intended to help VA adopt more enterprise-wide and commercial-off-the-shelf products and to bring the agency's digital experience in line with the private sector. The end goal is patient-centric: making it easier for veterans to manage their own care, services, and benefits.

Given its size and reach, VA is easily the biggest healthcare provider in the country. By adopting enterprise-level HL7 Fast Healthcare Interoperability Resources (FHIR)-based application programming interfaces (APIs) as its preferred way to share data when veterans receive care, whether in the community or at VA facilities, the agency sends a clear message to industry: rapidly deployed, FHIR-ready solutions are where the industry is going. Simple, fast access to data is not only necessary but expected. The HL7 FHIR standard and FHIR APIs are here to stay.

There is a lot of value in enterprise-wide FHIR-based APIs. They take a RESTful approach, meaning they use a uniform, predefined set of operations consistent with the way today's web and mobile applications work, which makes systems easier to connect and interoperate. Following an 80/20 rule, FHIR focuses on the 80% of common use cases rather than the 20% of exceptions. The standard supports a whole host of healthcare needs, including mobile applications, flexible custom workflows, and device integration, while reducing integration costs.
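To make the RESTful pattern concrete, here is a minimal sketch of a FHIR read and search in Python using the requests library. The server URL and patient ID are invented for illustration; any FHIR-capable server exposes the same uniform interface.

```python
import requests

FHIR_BASE = "https://example.org/fhir"  # hypothetical FHIR server

# RESTful read: GET [base]/Patient/[id] returns one Patient resource as JSON.
resp = requests.get(
    f"{FHIR_BASE}/Patient/12345",  # "12345" is a made-up logical ID
    headers={"Accept": "application/fhir+json"},
    timeout=10,
)
resp.raise_for_status()
patient = resp.json()

# Resources are predictable JSON, so a client can pull out just what it needs.
name = patient.get("name", [{}])[0]
print(name.get("family"), name.get("given"))

# Search uses the same uniform interface, just with query parameters:
meds = requests.get(
    f"{FHIR_BASE}/MedicationRequest",
    params={"patient": "12345", "status": "active"},
    headers={"Accept": "application/fhir+json"},
    timeout=10,
).json()
print(meds.get("total"), "active medication orders")
```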

There is also value in sharing records. There are countless examples of a lack of interoperability harming patients and hindering care coordination; imagine if technology eliminated those failures. With Lighthouse Lab, VA appears headed in the direction of innovation and interoperability, including improved patient care for the veterans it serves.

What do you think about VA Lighthouse Lab? Will this be the impetus to push the rest of the healthcare industry toward real interoperability?

About Monica Stout
Monica is a HIT teleworker in Grand Rapids, Michigan by way of Washington, D.C., who has consulted at several government agencies, including the National Aeronautics Space Administration (NASA) and the U.S. Department of Veterans Affairs (VA). She’s currently the Marketing Director at MedicaSoft. Monica can be found on Twitter @MI_turnaround or @MedicaSoftLLC.

About MedicaSoft
MedicaSoft designs, develops, delivers, and maintains EHR, PHR, and UHR software solutions and HISP services for healthcare providers and patients around the world. MedicaSoft is a proud sponsor of Healthcare Scene. For more information, visit www.medicasoft.us or connect with us on Twitter @MedicaSoftLLC, Facebook, or LinkedIn.

PointClickCare Tackling Readmissions from Long-Term and Post-Acute Care Facilities Head-On

Posted on January 12, 2018 | Written By

Colin Hung is the co-founder of the #hcldr (healthcare leadership) tweetchat, one of the most popular and active healthcare social media communities on Twitter. Colin speaks, tweets, and blogs regularly about healthcare, technology, marketing, and leadership. He is currently an independent marketing consultant working with leading healthIT companies. Colin is a member of #TheWalkingGallery. His Twitter handle is @Colin_Hung.

Transitioning from an acute care to a long-term/post-acute care (LTPAC) facility can be dangerous.

According to one study, nearly 23% of patients discharged from a hospital to an LTPAC facility had at least one readmission. Research indicates that the leading cause of readmission is harm caused by medication, known as an adverse drug event, and studies have shown that as much as 56% of all medication errors happen at a transitional point of care.

By 2050, more than 27 million Americans are projected to be using LTPAC services. The majority of these LTPAC patients will transition from an acute care facility at least once each year. With this many transitions, the number of medication errors each year would balloon into the millions, and the impact on patients and on the healthcare system itself would be astronomical.

Thankfully, there is a solution: medication reconciliation.

The Agency for Healthcare Research and Quality (AHRQ) states: “Patients frequently receive new medications or have medications changed during hospitalizations. Lack of medication reconciliation results in the potential for inadvertent medication discrepancies and adverse drug events—particularly for patients with low health literacy, or those prescribed high-risk medications or complex medication regimens.”

Medication reconciliation is the process of maintaining an accurate list of a patient's medications at all times and comparing that list against admission, transfer, and/or discharge orders at every transitional point, both within a facility and between facilities. By seeing orders alongside existing medications, clinicians and caregivers can prevent drug interactions and complications due to omissions or dosage discrepancies.
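To make the comparison step concrete, here is a toy sketch of that orders-versus-current-list check. The data structures, drug names, and doses are invented for illustration; a real implementation would work from coded medication data, not strings.

```python
# Toy medication reconciliation: compare a patient's current medication list
# against new discharge orders and flag discrepancies for clinician review.
current_meds = {
    "metformin": {"dose_mg": 500, "frequency": "BID"},
    "lisinopril": {"dose_mg": 10, "frequency": "QD"},
    "warfarin": {"dose_mg": 5, "frequency": "QD"},
}

discharge_orders = {
    "metformin": {"dose_mg": 1000, "frequency": "BID"},  # dosage change
    "lisinopril": {"dose_mg": 10, "frequency": "QD"},    # unchanged
    # warfarin missing -> possible omission
}

def reconcile(current, orders):
    flags = []
    for drug, regimen in current.items():
        if drug not in orders:
            flags.append(f"OMISSION: {drug} not in discharge orders")
        elif orders[drug] != regimen:
            flags.append(f"DISCREPANCY: {drug} {regimen} -> {orders[drug]}")
    for drug in orders:
        if drug not in current:
            flags.append(f"NEW: {drug} added at discharge")
    return flags

for flag in reconcile(current_meds, discharge_orders):
    print(flag)
```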

What is surprising is the lack of progress in this area.

We have been talking about interoperability for years in HealthIT. Hundreds of vendors make announcements at the annual HIMSS conference about their ability to share data. Significant investments have been made in Health Information Exchanges (HIEs). Yet despite all of this, there has been relatively little progress made or coverage given to this problem of data exchange between hospitals and LTPAC facilities.

One company in the LTPAC space is working to change that. PointClickCare, one of the largest EHR providers to skilled nursing facilities, home care providers and senior living centers in North America, is dedicating resources and energy to overcoming the challenge of data sharing – specifically for medication reconciliation.

“We are tackling the interoperability problem head-on,” says Dave Wessinger, co-founder and Chief Operating Officer at PointClickCare. “The way we see it, there is absolutely no reason why it can take up to three days for an updated list of medications to arrive at our customer’s facility from a hospital. In that time patients are unnecessarily exposed to potential harm. That’s unacceptable and we are working with our customers and partners to address it.”

Over the past 12 months, the PointClickCare team has made significant progress integrating their platform with other players in the healthcare ecosystem – hospitals, pharmacies, HIEs, ACOs, physician practices, and labs. According to Wessinger, PointClickCare is now at a point where they have "FHIR-ready" APIs and web services.

“We believe that medication reconciliation is the key to getting everyone in the ecosystem to unlock their data,” continues Wessinger. “There is such a tremendous opportunity for all of us in the healthcare vendor community to work together to solve one of the biggest causes of hospital readmissions.”

Amie Downs, Senior Director of ISTS Info & App Services at Good Samaritan Society, a PointClickCare customer that operates 165 skilled nursing facilities in 24 states, agrees strongly with Wessinger: “We have the opportunity to make medication reconciliation our first big interoperability win as an industry. We need a use-case that shows benefit. I can’t think of a better one than reducing harm to patients while simultaneously preventing costly readmissions. I think this can be the first domino so to speak.”

Having the technology infrastructure in place is just part of the challenge. Getting organizations to agree to share data is a significant hurdle, and once you get them to sit down with each other, the challenge is resisting the temptation to simply dump data on each other. Downs summed it up this way:

“What is really needed is for local acute care facilities to partner with local long-term and post-acute care facilities. We need to sit down together and pick the data that we each want/need to provide the best care for patients. We need to stop just sending everything to each other through a direct connection, on some sort of encrypted media that travels with the patient, via fax or physically printed on a piece of paper and then expecting the other party to sort it out.”

Downs goes on to explain how narrowing the scope of data exchange is beneficial: “I definitely see a strong future for CCDA data exchange to help in medication reconciliation. Right now medication information is just appended to the file we receive from acute care facilities. We need to agree on what medication information we really need. Right now, we get the entire medication history of the patient. What we really need is just the active medications that the patient is on.”

In addition to working on FHIR and APIs, BJ Boyle, Director of Product Management at PointClickCare, is also leading a data sharing initiative for those instances when there is no fellow EHR platform to connect to. “We are working towards something that is best described as a ‘Post-Acute Care Cloud’ or ‘PAC Cloud’,” explains Boyle. “We’re designing it so that hospital case managers can go to a single place and get all the information they need from the various SNFs they refer patients to. Today, when HL7 integration isn’t possible, case managers have to be given authorized access to the SNF’s system. That’s not ideal.”

PointClickCare has already taken an initial step toward this vision with an offering called eINTERACT. According to the company’s website, eINTERACT allows for the “early identification of changes in condition…and the sooner a change in condition is identified, the quicker interventions can be implemented to prevent decline and avoid potential transfers,” which is key to managing patient/resident health.

It’s worth noting that John Lynn blogged about LTPAC readmissions in 2014. Unfortunately, at the macro/industry level, not much has changed. Dealing with readmissions from LTPAC facilities is not particularly exciting; much of the attention remains on consumer monitoring devices, apps, and gadgets around the home.

Having said that, I do find it encouraging to see real progress being made by companies like PointClickCare and Good Samaritan Society. I hope to find more examples of practical interoperability that impacts patient care while touring the HIMSS18 exhibit floor in early March. In the meantime, I will be keeping my eye on PointClickCare and the LTPAC space to see how these interoperability initiatives progress.

Waiting For The Perfect “Standard” Is Not The Answer To Healthcare’s Interoperability Problem

Posted on October 16, 2017 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

Have you bought into the “standards will solve healthcare’s interoperability woes” train of thought? Everyone understands that standards are necessary for disparate systems to communicate with each other, but as new applications and new uses for data continually appear, healthcare organizations that wait for universal standards are not maximizing the value of their data. More importantly, they will be waiting a long time to realize its full potential.

Healthcare interoperability is not just a matter of transferring data as an entire file from one user to another. Effective exchange of information allows each user to select which elements of a patient’s chart are needed, and then access them in a format that enables analysis of different data sets, providing a holistic picture of the patient’s medical history or of clinical trends in a population of patients. The challenge is further exacerbated by different contextual interpretations of the words within those data elements. For instance, how many different ways are there to say heart attack?
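As a toy illustration of that semantic problem, the sketch below maps free-text variants to one canonical concept code. The synonym list is invented for illustration; 22298006 is the SNOMED CT concept commonly used for myocardial infarction.

```python
# Toy terminology normalization: different source systems record the same
# clinical concept in different words. Mapping them all to one canonical
# code is a precondition for comparing data across systems.
SYNONYMS = {
    "heart attack": "22298006",
    "myocardial infarction": "22298006",
    "mi": "22298006",
    "acute mi": "22298006",
}

def normalize(term: str) -> str | None:
    """Return the canonical code for a raw term, or None if unmapped."""
    return SYNONYMS.get(term.strip().lower())

for raw in ["Heart Attack", "MI", "myocardial infarction", "angina"]:
    print(raw, "->", normalize(raw) or "unmapped (needs human review)")
```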

The development of Health Level Seven’s (HL7®) FHIR® standard, which stands for Fast Healthcare Interoperability Resources, represents a significant step toward interoperability. While the data exchange specification being developed and published by HL7 eliminates many of the complexities of earlier HL7 versions and facilitates real-time data exchange via web technology, publication of Release 4 – the first normative version of the standard – is not anticipated until October 2018.

As these standards are further developed, the key to universal adoption will be simplicity, according to John Lynn, founder of HealthcareScene.com. However, he suggests that CIOs stop waiting for “perfect standards” and focus on how they can best achieve interoperability now.

Even with standards that can be implemented in all organizations, the complexity and diversity of the healthcare environment means it will take time to move everyone to the same standards. This is complicated by the number of legacy systems and the patchwork of applications that have been added to healthcare IT systems to meet quickly changing needs throughout the organization. Shrinking financial resources for capital investment and increasing competition for IT professionals limit a health system’s ability to make the overall changes necessary for interoperability – no matter which standards are adopted.

Some organizations are turning to cloud-based, managed service platforms to perform the integration, aggregation, and harmonization that make data available to all users – regardless of the system or application in which the information was originally collected. This approach solves the financial and human resource challenges by making it possible to budget integration and data management requirements as an operational rather than a capital investment. It also relieves the burden on in-house IT staff by relying on the expertise of professionals who focus on the emerging technologies, standards, and regulations that enable safe, compliant data exchange.

How are you planning to scale your interoperability and integration efforts? If you're waiting for standards, why are you waiting?

As a leading provider of healthcare interoperability solutions, Liaison is a proud sponsor of Healthcare Scene. While the conversation about interoperability has been ongoing for many years, the ideas, new technologies, and new strategies discussed and shared by IT professionals will lead to successful healthcare data exchange, transforming healthcare and resulting in better patient care.

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

Healthcare Interoperability

Posted on February 18, 2016 | Written By John Lynn

UPDATE: In case you missed our discussion, you can watch the video recording below:


One of the hottest topics in all of healthcare is the concept of healthcare interoperability. I remember when Farzad Mostashari said that he would use every lever he had at his disposal to make healthcare interoperability happen. Karen DeSalvo and Andy Slavitt have carried on that tradition and really want to make interoperability of health data a reality in healthcare. However, it’s certainly not without its challenges.

With this challenge in mind, on Monday, February 22, 2016 at Noon ET (9 AM PT), I’ll be sitting down with two of the biggest healthcare interoperability nerds I know (I say that with a ton of affection since I love nerds) to talk about the topic. Here’s a little more info on the healthcare interoperability panel we’ll be having:

You can join our live conversation with Mario and Richard and even add your own comments to the discussion or ask them questions. All you need to do to watch live is visit this blog post on Monday, February 22, 2016 at Noon ET (9 AM PT) and watch the video embed at the bottom of the post, or you can subscribe to the Blab directly. We’ll be doing a more formal interview for the first 30 minutes and then open up the Blab to others who want to add to the conversation or ask us questions. The conversation will be recorded as well and available on this post after the interview.

In this discussion we’ll dive into the always popular FHIR standard and its potential to achieve “scalable interoperability” in health care. We’ll talk about FHIR’s weaknesses and challenges. Then we’ll turn to health care interoperability testing and the recently announced AEGIS Touchstone Test platform, and how it differs from other interoperability testing being done today. We’ll also talk about who’s paying for interoperability testing and where this is all headed in the future.

If you’d like to see the archives of Healthcare Scene’s past interviews, you can find and subscribe to all of Healthcare Scene’s interviews on YouTube.

What’s the Glue Holding EHR Migration and Conversion Projects Together? – Optimize Healthcare Integration Series

Posted on August 13, 2015 | Written By

The following is a guest blog post by Stephane Vigot, President of Caristix, a leading provider of healthcare integration products and services. This post is part of the Optimize Healthcare Integration series.
Are you considering migrating from an older EHR to a newer EHR, or are you in the process of that conversion? If so, you are well aware of the complexity of this process. Many reasons drive the EHR conversion decision, but the primary reason organizations undertake it is simple: to improve patient care and safety by providing clinicians and caregivers with the right information at the right time.

It’s easy to think this is all about the technology, but EHR conversion is far more than an IT project. It is a central business issue that needs to be strategically sponsored and backed by upper-level management. In our previous post, we addressed the issue of aligning integration goals for business and technology. In a project of this magnitude, that alignment becomes critical. Implementation takes hard work and time, and it is very expensive. Effectively dealing with scope, budget and time creep, and change management matched to the stated business goals is the key to success. The complex planning needed is just one part of the story; the actual execution can be extremely problematic.

Since the primary reason for undertaking EHR conversion is to improve patient care and safety, clinical workflow is top-of-mind and coupled to data exchange and flow through your systems. On the IT side, your analysts define the project requirements and your developers build the interfaces based on those requirements. But the team that plays the most critical role is your quality team. Think of them as your project’s glue.

QA has layers of responsibility. They hold the requirements as the project blueprint and make sure that those requirements, driven by the pre-identified business needs, are being met. They also make sure that all defined processes are being followed; where processes are not followed, QA defines the resulting risks that must be accounted for in the system. QA’s final gate-keeping responsibility is the testing and validation process that addresses the functionality and metrics of the project.

Analysts build the interfaces and provide QA with the expected workflows. If those workflows are not correctly defined, QA steps in to clarify them and the expected data exchange, and builds test cases to best represent that evolving knowledge. Identifying workflow is often done blindly, with little or no existing information. Once the interface is built, those test cases become the basis for testing. QA also plays an important role in maintenance and in building the library of artifacts that helps guarantee interoperability over time.

Though it is difficult to estimate the actual costs of interfacing, given the variance implicit in such projects, functional and integrated testing is often up to three times more time-consuming than development. This most likely reflects defects in the process: in traditional software development those numbers are inverted, with QA taking about one-third of development time. It’s quite common for requirements to be incomplete by the time the project lands in QA’s lap, and new requirements are continually discovered during testing. These are usually treated as bugs, but they should have been identified before the development phase started. Another major reason for the lengthy time needed is that testing is commonly done entirely by hand: a 25-minute fix may require hours of testing when done manually.

In technology projects, risk is always present. QA teams continuously work to confine and evaluate risk based on a predefined process and to report issues. The questions continually being asked are: what are the odds that X will be a problem, and how big is the impact if it is? Here the devil is in the details, and QA is constantly dancing with that devil. Risk is not an all-or-nothing thing; if one were to try to eliminate all risk, projects would never be completed. QA adds order and definition to projects, but there are always blind alleys and unknown consequences that cannot be anticipated even with the most well-defined requirements. Dealing with the unknown unknowns is a constant for QA teams. The question becomes how much risk can be tolerated while creating the cleanest and most efficient exchange of data on an ongoing basis.

If QA is your glue, what are you doing to increase the quality of that glue and turn it into super glue? You can provide tools that offset the challenges your QA team faces while helping contain scope, time, and budget creep and maintaining continual alignment with business goals. The right tools should help identify requirements before interface development and throughout the process, identify the necessary workflows, and support QA in building test cases. De-identification of PHI should be included so that production data can be used in testing. Tools should automate the testing and validation process and include the capability of running tests repeatedly. In addition, they should provide easily shared traceability of the entire QA process through a central repository for all assets and documentation, providing continuity for the interoperability goals defined for the entire ecosystem.
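As one illustration of the de-identification capability described above, here is a minimal sketch that masks PHI fields in an HL7 v2 PID segment so production-shaped data can be used in test cases. The message content and masking rules are simplified for illustration.

```python
# Toy de-identification of an HL7 v2 PID segment. Field positions follow
# HL7 v2 conventions (PID-3 identifier, PID-5 name, PID-7 date of birth,
# PID-11 address); all values here are invented.
PID = "PID|1||483275^^^HOSP^MR||DOE^JANE^Q||19610615|F|||12 MAIN ST^^SPRINGFIELD^IL^62701"

PHI_FIELDS = {3: "MASKED-ID", 5: "PATIENT^TEST", 7: "19000101", 11: "MASKED"}

def deidentify_pid(segment: str) -> str:
    """Replace populated PHI fields with fixed masks, preserving structure."""
    fields = segment.split("|")  # index n corresponds to PID-n
    for idx, replacement in PHI_FIELDS.items():
        if idx < len(fields) and fields[idx]:
            fields[idx] = replacement
    return "|".join(fields)

print(deidentify_pid(PID))
```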

What is your organization experiencing in your conversion projects? We’d love to hear your thoughts in the comments.

Caristix, a leading healthcare integration company, is the sponsor of the Optimize Healthcare Integration blog post series. Check out this free online demo of the Caristix Workgroup product, which helps you test your interfaces and speed up HL7 interface development.

About Stéphane Vigot
Stéphane Vigot, President of Caristix, has over 20 years of experience in product management and business development leadership roles in technology and healthcare IT. Formerly with CareFusion and Cardinal Health, his experience spans from major enterprises to startups. Caristix is one of the few companies in the health IT ecosystem that is uniquely focused on integrating, connecting, and exchanging data between systems. He can be reached at stephane.vigot@caristix.com

The Top 3 Reasons Your Health IT Systems Take So Long To Integrate – Optimize Healthcare Integration Series

Posted on July 1, 2015 | Written By

The following is a guest blog post by Stephane Vigot, President of Caristix, a leading provider of healthcare integration products and services. This post is part of the Optimize Healthcare Integration series.
The push for interoperability is on. What’s at the core of interoperability that truly supports next-generation analytics, real patient engagement, true care coordination, and high-value population health? Data exchange through interfacing. And that means HL7.

HL7 represents 95% of interfacing for clinical systems and many other information systems in hospital environments. Many people make the error of thinking HL7 is just simple strings, but it’s a lot more than that: it’s a system of data organization, a dynamic framework that establishes the basis of data exchange through its specifics, syntax, and structure. Yet despite standards, if you take two identical systems from the same vendor deployed in two different environments, you’ll find discrepancies 100% of the time when it comes to data management.
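A small example shows what that structure looks like in practice. Below is a bare-bones HL7 v2.x ADT message (all values invented) and the few lines needed to pull structured fields out of it; the pipes, carets, and segment layout are defined by the standard, not arbitrary string formatting.

```python
# A minimal HL7 v2.x ADT message. Segments are separated by carriage
# returns, fields by "|", and components within a field by "^".
msg = "\r".join([
    "MSH|^~\\&|SENDAPP|SENDFAC|RECVAPP|RECVFAC|202301151200||ADT^A01|0001|P|2.5",
    "PID|1||123456^^^HOSP^MR||SMITH^JOHN||19800101|M",
    "PV1|1|I|ICU^101^A",
])

# Index each segment by its name (unique in this small example).
segments = {line.split("|", 1)[0]: line.split("|") for line in msg.split("\r")}

# PID-5 is the patient name field; its components are separated by "^".
family, given = segments["PID"][5].split("^")[:2]
print("Patient:", given, family)

# In MSH the field separator itself counts as MSH-1, so after splitting,
# list index 8 holds MSH-9, the message type.
print("Message type:", segments["MSH"][8])  # ADT^A01
```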

What’s the result? It takes too long to bring systems live. And that means time, money, resource drain, and headaches for integration, maintenance, and quality teams. The most critical impact is on essential clinical care. Beyond that, the delays undermine your short- and long-term business goals, and the impact will only grow with the increasing demands of interoperability, particularly the drive for automation and easy data access and analytics.

There are three primary challenges that feed into this problem of getting a system live. These are:

Aligning the integration goals for business and technology users – This step alone can differentiate success from failure. Without a clear picture of your goals and environment from day one, you can’t measure the required investment and resources, and project planning becomes a wild guess. How do you get everyone involved on deck with a common understanding of the overall project? Is it crystal clear how your new system fits into your existing ecosystem in the context of your data flow and structure? Do you know what information you need, from whom, and when? Is all documentation readily available? Are the business impacts of the interface understood?

Complete and clear data transformation requirements – It’s common to manually compare outdated specs, list the differences, and jump into the code, which makes it virtually impossible to quickly come up with a complete list. Complete requirements are often not identified until too late in the project, sometimes not until it’s in production. Are all data flows and system workflows identified? Are the system’s data semantics clear? Are the documented system specs accurate? Has customized data been included? Are all the transformations and mappings defined? Have you automated the processes that define requirements?

Testing/Verification – Your QA team knows this is about a lot more than making sure all the dots are connected. You want to get this right before go-live and avoid handling data-related emergencies in production with constant break-fix repairs. Are you doing enough testing before go-live so your caregivers can count on applications being completely functional for their critical patient care? Are your test cases based on your requirements? Are you testing against your clinical workflows? Do you include edge cases and performance in your testing? Are you testing with de-identified production data that accurately represents your system’s data flow and needs? Is your testing HIPAA compliant? Are you prepared for ongoing maintenance and updating with reusable test cases backed by reliable and repeatable quality measures? Is your testing automated? (A minimal sketch of one such automated check follows this list.)
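Here is a minimal sketch of what one automated, repeatable interface check might look like. The message, the pass-through transform standing in for the interface under test, and the expected values are all invented for illustration.

```python
# Sketch of an automated interface test: feed a known inbound message
# through the interface and assert on the fields the requirements call out.
def transform(inbound: str) -> str:
    """Stand-in for the interface under test (trivial pass-through here)."""
    return inbound

def field(segment_line: str, n: int) -> str:
    """Return HL7 field n of a pipe-delimited segment line."""
    return segment_line.split("|")[n]

inbound = "PID|1||123456^^^HOSP^MR||SMITH^JOHN||19800101|M"
outbound = transform(inbound)

# Each requirement becomes a reusable assertion instead of a manual check.
assert field(outbound, 3).startswith("123456"), "MRN must be preserved"
assert field(outbound, 5) == "SMITH^JOHN", "Name must map to PID-5 unchanged"
assert field(outbound, 8) == "M", "Administrative sex must be preserved"
print("all interface checks passed")
```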

What’s the most efficient solution to these three challenges? Productivity software that supports your integration and workflow process from start to finish. With the right solution, you understand the big picture before you start, with complete requirements built upon your specifications that set you up for robust system testing and maintenance. The right solution will cut your project timelines in half, reduce your resource drain and costs, and deliver predictable results while streamlining your teams’ repetitive tasks. In addition, gap analysis, automatic specification management, HL7 message comparison and editing, debugging tools, PHI de-identification, documentation building, and collaborative team repositories should be included. As seen in the chart below, savings of up to 52% can be realized through optimization with productivity software.
[Chart: Healthcare Integration Project Time]
Do these healthcare integration challenges resonate with you? What is your organization experiencing? We’d love to hear your thoughts in the comments.

Caristix, a leading healthcare integration company, is the sponsor of the Optimize Healthcare Integration blog post series. If you’d like to learn more about how you can simplify your healthcare integration process, download this Free Whitepaper.

About Stéphane Vigot
Stéphane Vigot, President of Caristix, has over 20 years of experience in product management and business development leadership roles in technology and healthcare IT. Formerly with CareFusion and Cardinal Health, his experience spans from major enterprises to startups. Caristix is one of the few companies in the health IT ecosystem that is uniquely focused on integrating, connecting, and exchanging data between systems. He can be reached at stephane.vigot@caristix.com

The Path to Interoperability

Posted on August 28, 2014 | Written By

The following is a guest blog post by Dave Boerner, Solutions Consultant at Orion Health.

Since the inception of electronic medical records (EMRs), interoperability has been a recurrent topic of discussion in our industry because it is critical to quality care delivery. With all of the disparate technology systems that healthcare organizations use, it can be hard to assemble all of the information needed to understand a patient’s health profile and coordinate their care. It’s clear that we’re all working hard to achieve this goal, but with new systems, business models, and technology developments, the perennial problem of interoperability is significantly heightened. With the industry transition from fee-for-service to a value-oriented model, the lack of interoperability is a stumbling block for initiatives such as the Patient-Centered Medical Home (PCMH) and the Accountable Care Organization (ACO), which rely heavily on accurate, comprehensive data being readily accessible to disparate parties and systems.

In a PCMH, the team of collaborating providers needs to share timely and accurate information in order to achieve the best care possible for their patient. Enhanced interoperability gives them access to real-time data that is consistently reliable, helping them make more informed clinical decisions. In the same vein, in an ACO, a patient’s different levels of care – from primary care physician to surgeon to pharmacist – all need to be bundled together to understand the cost of a treatment. A reliable method is needed to connect these networks and provide a comprehensive view of a patient’s interactions with the system. It’s clear that interoperability is essential to making value-based care a reality.

Of course, interoperability can take many forms, and there are many possible paths to the desired outcome of distributed access to comprehensive, accurate patient information. Standards efforts over the years have taken on the challenge of improving interoperability, and while achievements such as HL7, HIPAA, and C-CDA have been fundamental to recent progress, standards alone fall far short of the goal. After all, even with good intentions all around, standard-making is a fraught process, especially for vendors coming to the table with such a diversity of development cycles, foundational technologies, and development priorities – not to mention the perverse incentives to limit interoperability and portability in order to retain market share. So, despite the historic progress we have made and current initiatives such as the Office of the National Coordinator’s JASON task force, standards are likely to provide useful foundational support for interoperability, but individual organizations and larger systems will, at least for the time being, continue to require significant additional technology and effort dedicated to interoperability to meet their needs.

So what is a responsible health system to do? To achieve robust, real-time data exchange among its critical systems, an organization needs something stronger than standards alone. More and more healthcare executives are realizing that direct integration is the more successful approach to interoperability among systems. For simpler IT infrastructures, one-to-one integration of systems can work well. Given the complexity of larger health systems and networks, however, developing and managing an escalating number of interfaces is untenable: with n systems, point-to-point integration can require on the order of n(n-1)/2 interfaces, while a hub architecture needs only one per system. This applies not only to connecting systems within an organization, but also to connecting systems and organizations throughout a state or region. For these more complex scenarios, an integration engine is the best practice. Rather than multiple point-to-point connections, which require costly development, management, and maintenance, the integration engine acts as a central hub, allowing all of the healthcare organization’s systems – clinical, claims, radiology, and more – to speak to each other in one universal language, no matter the vendor or the version of the technology. Integration engines provide comprehensive support for an extensive range of communication protocols and message formats, and help interface analysts and hospital IT administrators reduce their workload while meeting complex technical challenges. Organizations can track and document patient interactions in real time, and can proactively identify at-risk patients and deliver comprehensive intervention and ongoing care. This is the next level of care that organizations are working to achieve.
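A toy sketch of the hub pattern, with invented formats and field layouts: each connected system contributes one adapter into a shared canonical form, so connecting another system means writing one new adapter rather than a new interface to every existing system.

```python
# Toy integration-engine pattern: adapters normalize each source's native
# format into one canonical form at the hub.
from typing import Callable, Dict

Canonical = Dict[str, str]

adapters: Dict[str, Callable[[str], Canonical]] = {
    # Formats and field layouts are invented for illustration.
    "lab": lambda raw: {"patient_id": raw.split("|")[1], "kind": "lab"},
    "claims": lambda raw: {"patient_id": raw.split(",")[0], "kind": "claim"},
}

def route(source: str, raw: str) -> Canonical:
    """Normalize an inbound message at the hub, then hand it off."""
    record = adapters[source](raw)
    # ...delivery to subscribers in their own formats would happen here.
    return record

print(route("lab", "ORU|98765|GLUCOSE|105"))
print(route("claims", "98765,2023-01-15,99213"))
```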

Interoperability allows for enhanced care coordination, which ultimately helps improve care quality and patient outcomes. At Orion Health, we understand that an open integration engine platform with an all-access API is critical for success. Vendors, public health agencies, and other health IT stakeholders are all out there fighting the good fight, working together to make complete interoperability among systems a reality. That said, past experience proves that it’s the users who will truly drive this change. If hospital and health system CIOs demand solutions that help enhance interoperability, it will happen. Only with this sustained effort will complete coordination and collaboration across the continuum of care become a reality.

About David Boerner
David Boerner works as a Solutions Consultant (pre-sales) for Orion Health where he provides technical consultation and specializes in the design and integration of EHR/HIE solutions involving Rhapsody Integration Engine.

Can Big Data Do What Vendors Claim?

Posted on December 6, 2013 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience, and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week, and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

There’s no doubt about it — the air is ringing with the sounds of vendors promising big things from big data, from population health to clinical decision support to management of bundled payments. But can they really deliver these blessings? According to enterprise health IT architect Michael Planchart (known to many as @theEHRGuy), there’s a lot of snake oil being sold.

In his experience, many of the experts on what he calls Big Bad Data either weren’t in healthcare or had never touched healthcare IT until the big data trend hit the industry. And they’re pitching the big data concept to providers that aren’t ready, he says:

  • Most healthcare providers haven’t been collecting data in a consistent way with a sound data governance model.
  • Most hospitals have paper charts that collect data in unstructured and disorganized ways.
  • Most hospitals — he asserts — have spent millions or even billions of dollars on EMRs but have been unable to implement them properly. (And those that have succeeded have done so in “partial and mediocre ways,” he says.)

Given these obstacles, where is big data going to come from today? Probably not the right place, he writes:

Well, some geniuses from major software vendors thought they could get this data from the HL7 transactions that had been moving back and forth between systems.  Yes, indeed.  They used some sort of “aggregation” software to extract this data out of HL7 v2.x messages.  What a disaster!  Who in their sane mind would think that transactional near real time data could be used as the source for aggregated data?

As Planchart sees it, institutions need quality, pertinent, relevant, and accurate data, not coarsely aggregated data from any of the sources hospitals and providers currently have. Instead of rushing into big data deals, he suggests that CIOs start collecting discrete, relevant, and pertinent data within their EMRs, a move that will pay off over the next several years.

In the meantime, my colleague John Lynn suggests, it’s probably best to focus on “skinny data” — a big challenge in itself given how hard it can be to filter out data “noise” — rather than aggregate a bunch of high-volume data from all directions.

Trusting Healthcare Data

Posted on August 9, 2013 | Written By John Lynn

Healthcare is generating data at an unprecedented rate. EHR software is becoming a large repository of healthcare data. Patient portals are starting to collect data from patients. Labs are creating large amounts of data. Insurance companies have been collecting and playing with data for years. We’re surrounded by healthcare data. The question is: how do we make sure clinicians trust that data?

Anyone who has worked with an Enterprise Data Warehouse (EDW) realizes what a challenge it is to make sure the data you pull in from multiple systems can be trusted. It’s really hard to trust data that comes from a system you don’t understand or use regularly. When you use a system regularly, you have an idea of how it captures data and of that data’s strengths and weaknesses. When the data is in the EDW, you often don’t know those details.

With all of this said, the EDW is a walk in the park compared to trusting data that comes from an outside source – an HIE, the patient, or even a patient device. The irony is that doctors have trusted outside data for quite a while. They receive chart notes faxed over from a specialist all the time, and they trust those notes and act on the data presented in them. So we shouldn’t act like the idea of trusting outside data is impossible. We just have to learn from the existing sources of trusted data and see how we can make that data flow easily and in a trusted way.

A great example of this is HL7 lab interfaces. For some reason those interfaces have reached a level of trust where doctors receive lab results and trust that the data in those results is correct. I think we’ll get there with other forms of data transfer from outside entities; it will just take time to build up those networks of trust.

Being able to trust the data a doctor receives, or the data stored in a data warehouse, is one of the most important things we can achieve. Without trust in the data, it has little to no value and won’t provide the benefit to healthcare that we need it to produce. Healthcare big data is happening, but we need trusted big data.

HL7 Invites Clinicians To Help With Standards

Posted on June 8, 2012 | Written By Anne Zieger

I don’t know about you, but I’m always interested in ways in which clinicians get a chance to make health data use more to their liking. In that spirit, here’s an item from Information Week which just caught my eye — one I think you’ll find interesting too.

Apparently, the HL7 organization has launched a new pilot membership program allowing clinicians to join and share their knowledge of clinical requirements.  The hope is that clinicians will help HL7 develop in a direction that better supports patient-centered care, IW reports.

Anyone who’s involved in direct patient care, including doctors, nurses and pharmacists, can join HL7 for one year for $100.

Clinicians who join will be encouraged to plug in to the group and:

* Improve the usefulness and quality of HIT standards developed by the group, and by doing so, make EMRs more usable

* Help other members understand how data standards affect how they deliver care

* Make sure that HL7 standards can support useful exchange of data between EMRs and across HIEs

While one would hope HL7 takes clinician needs into account regardless of whether they’re members, it’s good to see the organization making a real pitch for physician membership.

Hospitals, if you want to be at the cutting edge of interoperability, I’d offer to pay even that trivial $100 and encourage clinicians to share what they learn within your organization.

By the way, I was particularly intrigued by a side issue mentioned in the article, which was that HL7 has created an infrastructure for connecting personal health data — notably genetic records, IW reports — to care delivery.

Tying in personalized medical data sounds like a very fruitful direction for future HL7 deployments, as it will encourage more such research and create the kind of virtuous cycle we all hope to see. (Research used, more research produced, more used, better care and so on…)