
VA Lighthouse Lab – Is the Healthcare Industry Getting It Right?

Posted on April 30, 2018 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

The following is a guest blog by Monica Stout from MedicaSoft

The U.S. Department of Veterans Affairs announced the launch of their Lighthouse Lab platform at HIMSS18 earlier this year. Lighthouse Lab is an open API framework that gives software developers tools to create mobile and web applications to help veterans manage their VA care, services, and benefits. Lighthouse Lab is also intended to help VA adopt more enterprise-wide and commercial-off-the-shelf products and to move the agency more in line with digital experiences in the private sector. The end goal is patient-centric: helping veterans better manage their care, services, and benefits.

Given its size and reach, VA is easily the biggest healthcare provider in the country. Adopting enterprise-level HL7 Fast Healthcare Interoperability Resources (FHIR)-based application programming interfaces (APIs) as their preferred way to share data when veterans receive care both in the community and at VA facilities sends a clear message to industry: rapidly-deployed, FHIR-ready solutions are where the industry is going. Simple and fast access to data is not only necessary, but expected. The HL7 FHIR standard and FHIR APIs are here to stay.

There is a lot of value in using enterprise-wide FHIR-based APIs. They use a RESTful approach, which means they use a uniform and predefined set of operations consistent with the way today’s web and mobile applications work. This makes it easier to connect and interoperate. Following an 80/20 rule, FHIR focuses on hitting the 80% of common use cases instead of the 20% of exceptions. FHIR supports a whole host of healthcare needs, including mobile applications, flexible custom workflows, and device integrations, while also saving money.
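To see what that uniform, predefined set of operations looks like in practice, here is a minimal sketch. The base URL below is hypothetical (not a real VA endpoint), but real FHIR servers expose these same URL patterns for every resource type:

```python
# Sketch: FHIR's RESTful interface maps a small, predefined set of
# interactions onto resource types and IDs. The base URL is hypothetical.

BASE = "https://fhir.example-va.gov/r4"  # hypothetical endpoint

def fhir_read(resource_type: str, resource_id: str) -> str:
    """URL for a 'read' interaction: GET [base]/[type]/[id]."""
    return f"{BASE}/{resource_type}/{resource_id}"

def fhir_search(resource_type: str, **params: str) -> str:
    """URL for a 'search' interaction: GET [base]/[type]?name=value."""
    query = "&".join(f"{k}={v}" for k, v in sorted(params.items()))
    return f"{BASE}/{resource_type}?{query}"

# The same operations work for any resource type (Patient,
# MedicationRequest, Observation), which is what makes the API uniform
# and easy for web and mobile developers to consume.
print(fhir_read("Patient", "12345"))
print(fhir_search("MedicationRequest", patient="12345", status="active"))
```

An actual client would issue these as HTTP GETs and receive JSON resources back, but the uniformity of the paths is the point here.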

There is also value in sharing records. There are so many examples of how a lack of interoperability has harmed patients and hindered care coordination. Imagine if technology eliminated those issues. With Lighthouse Lab, it appears VA is headed in the direction of innovation and interoperability, including improved patient care for the veterans it serves.

What do you think about VA Lighthouse Lab? Will this be the impetus to push the rest of the healthcare industry toward real interoperability?

About Monica Stout
Monica is a HIT teleworker in Grand Rapids, Michigan by way of Washington, D.C., who has consulted at several government agencies, including the National Aeronautics Space Administration (NASA) and the U.S. Department of Veterans Affairs (VA). She’s currently the Marketing Director at MedicaSoft. Monica can be found on Twitter @MI_turnaround or @MedicaSoftLLC.

About MedicaSoft
MedicaSoft designs, develops, delivers, and maintains EHR, PHR, and UHR software solutions and HISP services for healthcare providers and patients around the world. MedicaSoft is a proud sponsor of Healthcare Scene. For more information, visit www.medicasoft.us or connect with us on Twitter @MedicaSoftLLC, Facebook, or LinkedIn.

Is EMR Use Unfair To Patients?

Posted on April 24, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

As we all know, clinicians have good reasons to be aggravated with their EMRs. While the list of grievances is long — and legitimate — perhaps the biggest complaint is loss of control. I have to say that I sympathize; if someone forced me to adopt awkward digital tools to do my work I would go nuts.

We seldom discuss, however, the possibility that these systems impose an unfair burden on patients as well. But that’s the argument one physician makes in a recent op-ed for the American Council on Science and Health.

The author, Jamie Wells, MD, calls the use of EMRs “an ethical disaster,” and suggests that forced implementation of EMRs may violate the basic tenets of bioethics.

Some of the arguments Dr. Wells makes apply exclusively to physicians. For one thing, she contends that penalizing doctors who don’t adapt successfully to EMR use is unfair. She also suggests that EMRs create needless challenges that can erode physicians’ ability to deliver quality care, add significant time to a physician’s workday and force doctors to participate in related continuing education whether or not they want to do so.

Unlike many essays critiquing this topic, Wells also contends that patients are harmed by EMR use.

For example, Wells argues that since patients are never asked whether they want physicians to use EMRs, they never get the chance to consider the risks and benefits associated with EHR data use in developing care plans. Also, they are never given a chance to weigh in on whether they are comfortable having less face time with their physicians, she notes.

In addition, she says that since EMRs prompt physicians to ask questions not relevant to that patient’s care, adding extra steps to the process, they create unfair delays in a patient’s getting relief from pain and suffering.

What’s more, she argues that since EMR systems typically aren’t interoperable, they create inconveniences which can ultimately interfere with the patient’s ability to choose a provider.

Folks, you don’t have to convince me that EMR implementations can unfairly rattle patients and caregivers. As I noted in a previous essay, my mother recently went through a terrifying experience when the hospital caring for my brother went through an EMR implementation at a crucial point in his care. She was rightfully concerned that staff might be more focused on adapting to the EMR and somewhat less focused on her extremely fragile son’s care.

As I noted in the linked article above, I believe that health executives should spend more time considering the potentially negative effects of their health IT initiatives on patients. Maybe these execs will have to have a sick relative in the hospital during a rollout before they’ll make the effort.

Hospital Patient Identification Still A Major Problem

Posted on April 18, 2018 | Written By Anne Zieger

A new survey suggests that problems with duplicate patient records and patient identification are still costing hospitals a tremendous amount of money.

The survey, which was conducted by Black Book Research, collected responses from 1,392 health technology managers using enterprise master patient index technology. Researchers asked them what gaps, challenges and successes they’d seen in patient identification processes from Q3 2017 to Q1 2018.

Survey respondents reported that 33% of denied claims were due to inaccurate patient identification. Ultimately, inaccurate patient identification cost the average hospital $1.5 million last year. The survey also concluded that the average cost of duplicate records was $1,950 per patient per inpatient stay and more than $800 per ED visit.

In addition, researchers found that hospitals with over 150 beds took an average of more than 5 months to clean up their data. This included process improvements focused on data validity checking, normalization and data cleansing.

Having the right tools in place seemed to help. Hospitals said that before they rolled out enterprise master patient index solutions, an average of 18% of their records were duplicates, and that match rates when sharing data with other organizations averaged 24%.

Meanwhile, hospitals with EMPI support in place since 2016 reported that patient records were identified correctly during 93% of registrations and in 85% of records shared externally with non-networked providers.

Not surprisingly, though, this research doesn’t tell the whole story. While using EMPI tools makes sense, the healthcare industry should hardly stop there, according to Gartner Group analyst Wes Rishel.

“We simply need innovators that have the vision to apply proven identity matching to the healthcare industry – as well as the gumption and stubbornness necessary to thrive in a crowded and often slow-moving healthcare IT market,” he wrote.

Rishel argues that to improve patient matching, it’s time to start cross-correlating demographic data from patients with demographic data from third-party sources, such as public records, credit agencies or telephone companies. What makes this data particularly helpful is that it includes not just current and correct attributes for a person, but also out-of-date and incorrect attributes like previous addresses, maiden names and typos.

Ultimately, these “referential matching” approaches will significantly outperform existing probabilistic models, Rishel argues.
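As a toy sketch of how referential matching differs from direct record comparison: the incoming record is matched against a reference store whose entries deliberately retain stale and incorrect attributes. All reference data, the attribute scheme, and the two-hit threshold below are invented for illustration:

```python
# Toy "referential matching" sketch: match against a reference database
# that keeps out-of-date attributes (previous addresses, maiden names,
# known typos), not just the current, correct ones. Data is made up.
from typing import Optional

REFERENCE = {
    "person-001": {
        ("name", "ann walker"), ("name", "ann zeiger"),      # maiden name, known typo
        ("address", "12 oak st"), ("address", "9 elm ave"),  # previous and current
        ("phone", "555-0100"),
    },
}

def match_person(record: dict, min_hits: int = 2) -> Optional[str]:
    """Return the reference ID sharing at least `min_hits` attributes
    with the incoming record, else None."""
    attrs = {(k, v.lower()) for k, v in record.items()}
    for person_id, known in REFERENCE.items():
        if len(attrs & known) >= min_hits:
            return person_id
    return None

# A record carrying an *old* address still resolves, where a direct
# comparison against only the current chart would likely fail.
print(match_person({"name": "Ann Walker", "address": "12 Oak St"}))  # person-001
```

A production engine would use weighted scoring over many more attributes, but the principle (matching through historical reference data rather than record-to-record) is the same.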

It’s really shocking that so many healthcare organizations don’t have an EMPI solution in place, especially now that cloud offerings have made EMPI solutions available to organizations of all sizes. EMPI is needed for the financial reasons mentioned above, but also from a patient care and patient safety perspective.

PointClickCare Tackling Readmissions from Long-Term and Post-Acute Care Facilities Head-On

Posted on January 12, 2018 | Written By

Colin Hung is the co-founder of the #hcldr (healthcare leadership) tweetchat, one of the most popular and active healthcare social media communities on Twitter. Colin speaks, tweets and blogs regularly about healthcare, technology, marketing and leadership. He is currently an independent marketing consultant working with leading healthIT companies. Colin is a member of #TheWalkingGallery. His Twitter handle is: @Colin_Hung.

Transitioning from an acute care to a long-term/post-acute care (LTPAC) facility can be dangerous.

According to one study, nearly 23% of patients discharged from a hospital to a LTPAC facility had at least 1 readmission. Research indicates that the leading cause of readmission is harm caused by medication (called an adverse drug event). Studies have shown that as much as 56% of all medication errors happen at a transitional point of care.

By the year 2050 more than 27 million Americans will be using LTPAC services. The majority of these LTPAC patients will transition from an acute care facility at least once each year. With this many transitions, the number of medication errors each year would balloon into the millions. The impact on patients and on the healthcare system itself would be astronomical.

Thankfully, there is a solution: medication reconciliation.

The Agency for Healthcare Research and Quality (AHRQ) states: “Patients frequently receive new medications or have medications changed during hospitalizations. Lack of medication reconciliation results in the potential for inadvertent medication discrepancies and adverse drug events—particularly for patients with low health literacy, or those prescribed high-risk medications or complex medication regimens.”

Medication reconciliation is a process in which an accurate list of the medications a patient is taking is maintained at all times. That list is compared to admission, transfer and/or discharge orders at all transitional points, both within a facility and between facilities. By comparing orders against existing medications, clinicians and caregivers are able to prevent drug interactions and complications due to omissions or dosage discrepancies.
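The comparison step at the heart of that process can be sketched in a few lines. The drug names, doses, and flag wording below are invented purely for illustration:

```python
# Minimal sketch of the reconciliation comparison: the maintained active
# medication list is checked against incoming transfer orders, flagging
# omissions, new orders, and dose discrepancies for clinician review.

def reconcile(active_meds: dict, transfer_orders: dict) -> list:
    """Both arguments map drug name -> dose string."""
    flags = []
    for drug, dose in active_meds.items():
        if drug not in transfer_orders:
            flags.append(f"OMISSION: {drug} {dose} missing from orders")
        elif transfer_orders[drug] != dose:
            flags.append(f"DOSE CHANGE: {drug} {dose} -> {transfer_orders[drug]}")
    for drug, dose in transfer_orders.items():
        if drug not in active_meds:
            flags.append(f"NEW ORDER: {drug} {dose} - verify intended")
    return flags

active = {"metformin": "500 mg BID", "lisinopril": "10 mg daily"}
orders = {"metformin": "1000 mg BID", "warfarin": "5 mg daily"}
for flag in reconcile(active, orders):
    print(flag)
```

Each flag still requires human judgment (a dose change may be intentional), but surfacing the discrepancies at the transition point is exactly what prevents the adverse drug events described above.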

What is surprising is the lack of progress in this area.

We have been talking about interoperability for years in HealthIT. Hundreds of vendors make announcements at the annual HIMSS conference about their ability to share data. Significant investments have been made in Health Information Exchanges (HIEs). Yet despite all of this, there has been relatively little progress made or coverage given to this problem of data exchange between hospitals and LTPAC facilities.

One company in the LTPAC space is working to change that. PointClickCare, one of the largest EHR providers to skilled nursing facilities, home care providers and senior living centers in North America, is dedicating resources and energy to overcoming the challenge of data sharing – specifically for medication reconciliation.

“We are tackling the interoperability problem head-on,” says Dave Wessinger, co-founder and Chief Operating Officer at PointClickCare. “The way we see it, there is absolutely no reason why it can take up to three days for an updated list of medications to arrive at our customer’s facility from a hospital. In that time patients are unnecessarily exposed to potential harm. That’s unacceptable and we are working with our customers and partners to address it.”

Over the past 12 months, the PointClickCare team has made significant progress integrating their platform with other players in the healthcare ecosystem – hospitals, pharmacies, HIEs, ACOs, physician practices and labs. According to Wessinger, PointClickCare is now at a point where they have “FHIR-ready” APIs and web-services.

“We believe that medication reconciliation is the key to getting everyone in the ecosystem to unlock their data,” continues Wessinger. “There is such a tremendous opportunity for all of us in the healthcare vendor community to work together to solve one of the biggest causes of hospital readmissions.”

Amie Downs, Senior Director ISTS Info & App Services at Good Samaritan Society, an organization that operates 165 skilled nursing facilities in 24 states and a PointClickCare customer, agrees strongly with Wessinger: “We have the opportunity to make medication reconciliation our first big interoperability win as an industry. We need a use-case that shows benefit. I can’t think of a better one than reducing harm to patients while simultaneously preventing costly readmissions. I think this can be the first domino so to speak.”

Having the technology infrastructure in place is just part of the challenge. Getting organizations to agree to share data is a significant hurdle, and once you get organizations to sit down with each other, the challenge is resisting the temptation to simply dump data on each other. Downs summed it up this way:

“What is really needed is for local acute care facilities to partner with local long-term and post-acute care facilities. We need to sit down together and pick the data that we each want/need to provide the best care for patients. We need to stop just sending everything to each other through a direct connection, on some sort of encrypted media that travels with the patient, via fax or physically printed on a piece of paper and then expecting the other party to sort it out.”

Downs goes on to explain how narrowing the scope of data exchange is beneficial: “I definitely see a strong future for CCDA data exchange to help in medication reconciliation. Right now medication information is just appended to the file we receive from acute care facilities. We need to agree on what medication information we really need. Right now, we get the entire medication history of the patient. What we really need is just the active medications that the patient is on.”
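The narrowing Downs describes, keeping only the active medications out of a full history, is straightforward once the data is structured. The field names and entries below are illustrative (not actual C-CDA structures):

```python
# Sketch of filtering a received medication history down to active
# medications only. Entries with no end date, or an end date still in
# the future, are treated as active. Data is invented for illustration.
from datetime import date

def active_medications(history: list, today: date) -> list:
    """Keep entries with no end date, or an end date on/after today."""
    return [m for m in history if m.get("end") is None or m["end"] >= today]

history = [
    {"drug": "amoxicillin", "end": date(2017, 3, 10)},  # completed course
    {"drug": "metformin", "end": None},                 # ongoing therapy
    {"drug": "warfarin", "end": date(2019, 1, 1)},      # still active
]
print([m["drug"] for m in active_medications(history, date(2018, 1, 12))])
```

The hard part in practice is not the filter itself but agreeing, as Downs says, on which fields the sending facility must populate so the filter has reliable data to work with.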

In addition to working on FHIR and APIs, BJ Boyle, Director of Product Management at PointClickCare, is also leading a data sharing initiative for those instances when there is no fellow EHR platform to connect to. “We are working towards something that is best described as a ‘Post-Acute Care Cloud’ or ‘PAC Cloud’,” explains Boyle. “We’re designing it so that hospital case managers can go to a single place and get all the information they need from the various SNFs they refer patients to. Today, when HL7 integration isn’t possible, case managers have to be given authorized access to the SNF’s system. That’s not ideal.”

PointClickCare has already taken an initial step towards this vision with an offering called eINTERACT. According to the company’s website, eINTERACT allows for the “early identification of changes in condition…and the sooner a change in condition is identified, the quicker interventions can be implemented to prevent decline and avoid potential transfers” which is key to managing patient/resident health.

It’s worth noting that John Lynn blogged about LTPAC readmissions in 2014. Unfortunately, at the macro/industry level, not much has changed. Dealing with readmissions from LTPAC facilities is not particularly exciting; much of the attention remains on consumer monitoring devices, apps and gadgets around the home.

Having said that, I do find it encouraging to see real progress being made by companies like PointClickCare and Good Samaritan Society. I hope to find more examples of practical interoperability that impacts patient care while touring the HIMSS18 exhibit floor in early March. In the meantime, I will be keeping my eye on PointClickCare and the LTPAC space to see how these interoperability initiatives progress.

When It Comes To Meaningful Use, Some Vendors May Have An Edge

Posted on December 1, 2017 | Written By Anne Zieger

A new article appearing in the Journal of the American Medical Informatics Association has concluded that while EHRs certified under the meaningful use program should perform more or less equally, they don’t.

After conducting an analysis, researchers found that there were significant associations between specific vendors and level of hospital performance for all six meaningful use criteria they were using as a yardstick. Epic came out on top by this measure, demonstrating significantly higher performance on five of the six criteria.

However, it’s also worth noting that EHR vendor choice by hospitals accounted for anywhere between 7% and 34% of performance variation across the six meaningful use criteria. In other words, researchers found that at least in some cases, EHR performance was influenced as much by the fit between platform and hospital as the platform itself.

To conduct the study, researchers used recent national data on which certified EHR vendors hospitals had implemented, along with hospital performance on six meaningful use criteria. They sought to find out:

  • Whether certain vendors were found more frequently among the highest performing hospitals, as measured by performance on Stage 2 meaningful use criteria;
  • Whether the relationship between vendor and hospital performance was consistent across the meaningful use criteria, or whether vendors specialized in certain areas; and
  • What proportion of variation in performance across hospitals could be explained by vendor characteristics.

To measure the performance of various vendors, the researchers chose six core Stage 2 meaningful use criteria, including: 60% of medication orders entered using CPOE; providing 50% of patients with the ability to view/download/transmit their health information; performing medication reconciliation for 50% of patients received from another setting or care provider; providing a summary of care record for 50% of patient transitions to another setting or care provider; and electronically transmitting a summary of care record for 10% of patient transitions to another setting or care provider.

After completing their analysis, researchers found that three hospitals were in the top performance quartile for all meaningful use criteria, and all used Epic. Of the 17 hospitals in the top performance quartile for five criteria, 15 used Epic, one used MEDITECH, and one used another, smaller vendor. Among the 68 hospitals in the top quartile for four criteria, 64.7% used Epic, 11.8% used Cerner and 8.8% used MEDITECH.

When it came to hospitals that were not in the top quartile for any of the criteria, there was no overwhelming connection between vendor and results. For the 355 hospitals in this category, 28.7% used MEDITECH, 25.1% used McKesson, 20.3% used Cerner, 14.4% used MEDHOST and 6.8% used Epic.

All of this being said, the researchers noted that neither the hospital characteristics nor the vendor choice explained more than a small amount of the performance variation they saw. This won’t surprise anybody who’s seen firsthand how much other issues, notably human factors, can change the outcome of processes like these.

It’s also worth noting that there might be other causes for these differences. For example, a hospital or health system that can afford the notably expensive Epic systems can likely afford to invest in meaningful use compliance as well. This added investment could explain hospitals’ meaningful use performance as much as EHR choice does.

Surescripts Deal Connects EMR Vendors And PBMs To Improve Price Transparency

Posted on November 22, 2017 | Written By Anne Zieger

I’m no expert on the pharmacy business, but from where I sit as a consumer it’s always looked to me as though pharmaceutical pricing is something of a shell game. It makes predicting what your airline ticket will cost seem like child’s play.

Yes, in theory, the airlines engage in demand-oriented pricing, while pharma pricing is based on negotiated prices spread among multiple contracted parties, but in either case end-users like me have very little visibility into where these numbers are coming from. And in my opinion, at least, that’s not good for anyone involved. You can say “blah blah blah skin in the game” all you want, but co-pays are a poor proxy for making informed decisions as a patient about the benefits you’ll accrue and the problems you’ll face when buying a drug.

Apparently, Surescripts hopes to change the rules to some degree. It just announced that it has come together with two other interest groups within the pharmacy supply chain to offer patient-specific benefit and price information to providers at the point of care.

Its partners in the venture include a group of EMR companies, including Cerner, Epic, Practice Fusion and Aprima Medical Software, which it says represent 53% of the U.S. physician base. It’s also working with two pharmacy benefit managers (CVS Health and Express Scripts) which together cover almost two-thirds of U.S. patients.

The new Surescripts effort actually has two parts, a Real-Time Prescription Benefit tool and an expanded version of its Prior Authorization solution.  Used together, and integrated with an EHR, these tools will clarify whether the patient’s health insurance will cover the drug suggested by the provider and offer therapeutic alternatives that might come at a lower price.
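Conceptually, a point-of-care benefit check like this boils down to a formulary lookup plus a search for cheaper covered drugs in the same therapeutic class. The sketch below is purely hypothetical (invented drugs, prices, and data model; it is not Surescripts’ actual API):

```python
# Hypothetical sketch of a real-time benefit check: look up the
# prescribed drug in the patient's plan formulary and surface cheaper
# covered alternatives in the same therapeutic class. All data invented.

FORMULARY = {  # drug -> (covered?, patient copay in dollars, class)
    "brandstatin":  (True, 85.00, "statin"),   # fictional brand drug
    "atorvastatin": (True, 10.00, "statin"),
    "simvastatin":  (True,  5.00, "statin"),
}

def benefit_check(drug: str) -> dict:
    """Return coverage, copay, and lower-cost same-class alternatives."""
    covered, copay, drug_class = FORMULARY[drug]
    cheaper = sorted(
        (alt for alt, (c, p, k) in FORMULARY.items()
         if k == drug_class and c and p < copay),
        key=lambda alt: FORMULARY[alt][1])
    return {"drug": drug, "covered": covered, "copay": copay,
            "alternatives": cheaper}

print(benefit_check("brandstatin"))
```

The value of surfacing this inside the EHR at prescribing time is that the therapeutic-alternative conversation happens before the patient reaches the pharmacy counter.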

If you ask me, this is clever but fails to put pressure on the right parties. You don’t have to be a pharmaceutical industry expert to know that middlemen like PBMs and pharmacies use a number of less-than-visible stratagems to jack up drug prices. Patients are forced to just cope with whatever deal these parties strike among themselves.

If you really want to build a network which helps consumers keep prices down, go for some real disclosure. Create a network which gathers and shares price information every time the drug changes hands, up to and including when the patient pays for that drug. This could have a massive effect on drug pricing overall.

Hey, look at what Amazon did just by making costs of shipping low and relatively transparent to end-users. They sucked a lot of the transaction costs out of the process of shipping products, then gave consumers tools allowing them to watch that benefit in action.

Give consumers even one-tenth of that visibility into their pharmacy supply chain, and prices would fall like a hot rock. Gee, I wonder why nobody’s ever tried that. Could it be that pharmaceutical manufacturers don’t want us to know the real costs of making and shipping their product?

Waiting For The Perfect “Standard” Is Not The Answer To Healthcare’s Interoperability Problem

Posted on October 16, 2017 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

Have you bought into the “standards will solve healthcare’s interoperability woes” train of thought? Everyone understands that standards are necessary to enable disparate systems to communicate with each other, but as new applications and new uses for data continually appear, healthcare organizations that are waiting for universal standards are not maximizing the value of their data. More importantly, they will be waiting a long time to realize the full potential of their data.

Healthcare interoperability is not just a matter of transferring data as an entire file from one user to another. Instead, effective exchange of information allows each user to select which elements of a patient’s chart are needed, and then access them in a format that enables analysis of different data sets to provide a holistic picture of the patient’s medical history or clinical trends in a population of patients. Healthcare’s interoperability challenge is further exacerbated by different contextual interpretations of the words within those fields. For instance, how many different ways are there to say heart attack?
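As a toy illustration of that “heart attack” problem, harmonization ultimately means mapping local terms and codes onto one canonical concept. The synonym table below is invented and far smaller than a real terminology service, though ICD-10-CM I21.9 and SNOMED CT 22298006 are the standard codes for myocardial infarction:

```python
# Toy terminology harmonization: many local spellings and codes for the
# same clinical concept are normalized to one canonical identifier.

SYNONYMS = {
    "heart attack": "myocardial_infarction",
    "mi": "myocardial_infarction",
    "acute myocardial infarction": "myocardial_infarction",
    "i21.9": "myocardial_infarction",     # ICD-10-CM code as text
    "22298006": "myocardial_infarction",  # SNOMED CT concept ID
}

def normalize(term: str) -> str:
    """Map a local term or code to a canonical concept, or mark unmapped."""
    return SYNONYMS.get(term.strip().lower(), "UNMAPPED:" + term)

print(normalize("Heart Attack"))
print(normalize("I21.9"))
```

Only once every system’s “heart attack” resolves to the same concept can data sets from different sources be analyzed together for the holistic picture described above.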

The development of Health Level Seven (HL7®) FHIR®, which stands for Fast Healthcare Interoperability Resources, represents a significant step toward interoperability. While the data exchange draft being developed and published by HL7 eliminates many of the complexities of earlier HL7 versions and facilitates real-time data exchange via web technology, publication of Release 4 – the first normative version of the standard – is not anticipated until October 2018.

As these standards are further developed, the key to universal adoption will be simplicity, according to John Lynn, founder of HealthcareScene.com. However, he suggests that CIOs stop waiting for “perfect standards” and focus on how they can best achieve interoperability now.

Even with standards that can be implemented in all organizations, the complexity and diversity of the healthcare environment means that it will take time to move everyone to the same standards. This is complicated by the number of legacy systems and patchwork of applications that have been added to healthcare IT systems in an effort to meet quickly changing needs throughout the organization. Shrinking financial resources for capital investment and increasing competition for IT professionals limits a health system’s ability to make the overall changes necessary for interoperability – no matter which standards are adopted.

Some organizations are turning to cloud-based, managed service platforms to perform the integration, aggregation and harmonization that makes data available to all users – regardless of the system or application in which the information was originally collected. This approach solves the financial and human resource challenges by making it possible to budget integration and data management requirements as an operational rather than a capital investment. This strategy also relieves the burden on in-house IT staff by relying on the expertise of professionals who focus on emerging technologies, standards and regulations that enable safe, compliant data exchange.

How are you planning to scale your interoperability and integration efforts?  If you're waiting for standards, why are you waiting?

As a leading provider of healthcare interoperability solutions, Liaison is a proud sponsor of Healthcare Scene. While the conversation about interoperability has been ongoing for many years, the ideas, new technologies and new strategies discussed and shared by IT professionals will lead to successful healthcare data exchange that transforms healthcare and results in better patient care.

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

Interoperability: Is Your Aging Healthcare Integration Engine the Problem?

Posted on September 18, 2017 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

There is no shortage of data collected by healthcare organizations that can be used to improve clinical as well as business decisions. We are regularly bombarded with announcements of new technology that collects patient information, clinical outcome data and operational metrics, all promising to help physicians and hospitals provide better, more cost-effective care.

The problem today is not the amount of data available to help us make better decisions; the problem is the inaccessibility of the data. When different users – physicians, allied health professionals, administrators and financial managers – turn to data for decision support, they find themselves limited to their own silos of information. The inability to access and share data across different disciplines within the healthcare organization prevents the user from making a decision based on a holistic view of the patient or operational process.

In a recent article, Alan Portela points out that precision medicine, which requires “the ability to collect real-time data from medical devices at the moment of care,” cannot happen easily without interoperability – the ability to access data across disparate systems and applications. He also points out that interoperability does not exist yet in healthcare.

Why are healthcare IT departments struggling to achieve interoperability?

Although new and improved applications are adopted on a regular basis, healthcare organizations are just now realizing that their integration middleware can no longer handle new types of data such as social media, today’s data volumes, or the increasing number of real-time connection methods. Their integration platforms also cannot handle the exchange of information with disparate data systems and applications beyond the four walls of the hospital. In fact, hospitals of 500 beds or more average 25 unique data sources, with six electronic medical records systems in use. Those numbers will only move up over time, not down.

Integration engines in place throughout healthcare today were designed well before the explosion of the data-collection tools and digital information that exist today. Although updates and additions to integration platforms have enabled some interoperability, the need for complete interoperability is creating a movement to replace integration middleware with cloud-based managed services.

A study by the Aberdeen Group reveals that 76 percent of organizations will be replacing their integration middleware, and 70 percent of those organizations will adopt cloud-based integration solutions in the next three years.

The report also points out that as healthcare organizations move from on-premises solutions to cloud-based platforms, business leaders see migration to the cloud and managed services as a way to better manage operational expenses on a monthly basis, versus large, up-front capital investments. An additional benefit is better use of in-house IT staff members, who are tasked with mission-critical, day-to-day responsibilities and may not be able to focus on continuously improving the platform to meet future needs.

Healthcare has come a long way in the adoption of technology that can collect essential information and put it in the hands of clinical and operational decision makers. Taking that next step to effective, meaningful interoperability is critical.

As a leading provider of healthcare interoperability solutions, Liaison is a proud sponsor of Healthcare Scene. It is only through discussion and information-sharing among health IT professionals that healthcare will build the organizational support for the steps interoperability requires.

Join John Lynn and Liaison for an insightful webinar on October 5, titled: The Future of Interoperability & Integration in Healthcare: How can your organization prepare?

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

Open Source Tool Offers “Synthetic” Patients For Hospital Big Data Projects

Posted on September 13, 2017 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

As readers will know, using big data in healthcare comes with a host of security and privacy problems, many of which are thorny.

For one thing, the more patient data you accumulate, the bigger the disaster when and if the database is hacked. Another important concern is that if you decide to share the data, there’s always the chance that your partner will use it inappropriately, violating the terms of whatever consent to disclose you had in mind. Then, there’s the issue of working with incomplete or corrupted data which, if extensive enough, can interfere with your analysis or even lead to inaccurate results.

But now, there may be a realistic alternative, one which allows you to experiment with big data models without taking all of these risks. A unique software project is underway which gives healthcare organizations a chance to scope out big data projects without using real patient data.

The software, Synthea, is an open source synthetic patient generator that models the medical history of synthetic patients. It seems to have been built by The MITRE Corporation, a not-for-profit research and development organization sponsored by the U.S. federal government. (This page offers a list of other open source projects in which MITRE is or has been involved.)

Synthea is built on a Generic Module Framework which allows it to model varied diseases and conditions that play a role in the medical history of these patients. The Synthea modules create synthetic patients using not only clinical data, but also real-world statistics collected by agencies like the CDC and NIH. MITRE kicked off the project using models based on the top ten reasons patients see primary care physicians and the top ten conditions that shorten years of life.

Its makers were so thorough that each patient’s medical experiences are simulated independently from their “birth” to the present day. The profiles include a full medical history, which includes medication lists, allergies, physician encounters and social determinants of health. The data can be shared using C-CDA, HL7 FHIR, CSV and other formats.
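To give a feel for what consuming that output looks like, here is a minimal sketch of reading a FHIR-style JSON Bundle, the general shape Synthea can export. The tiny inline bundle is a hand-made stand-in, not real Synthea output; real exports carry far more resources and fields per patient.

```python
# Minimal sketch: pull patients and condition names out of a
# FHIR-style JSON Bundle (hand-made stand-in for Synthea output).
import json

bundle_json = """
{"resourceType": "Bundle",
 "entry": [
   {"resource": {"resourceType": "Patient",
                 "name": [{"family": "Smith", "given": ["Jane"]}]}},
   {"resource": {"resourceType": "Condition",
                 "code": {"text": "Hypertension"}}}
 ]}
"""

bundle = json.loads(bundle_json)
resources = [entry["resource"] for entry in bundle.get("entry", [])]

# Split the bundle into the resource types we care about.
patients = [r for r in resources if r["resourceType"] == "Patient"]
conditions = [r["code"]["text"] for r in resources
              if r["resourceType"] == "Condition"]
```

Because the synthetic records arrive in the same standard formats as real ones, tooling validated against Synthea data should need no structural changes when pointed at production data.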

On its site, MITRE says its intent in creating Synthea is to provide “high-quality, synthetic, realistic but not real patient data and associated health records covering every aspect of healthcare.” As MITRE notes, having a batch of synthetic patient data on hand can be pretty, well, handy in evaluating new treatment models, care management systems, clinical support tools and more. It’s also a convenient way to predict the impact of public health decisions quickly.

This is such a good idea that I’m surprised nobody else has done something comparable. (Well, at least as far as I know no one has.) Not only that, it’s great to see the software being made available freely via the open source distribution model.

Of course, in the final analysis, healthcare organizations want to work with their own data, not synthetic substitutes. But at least in some cases, Synthea may offer hospitals and health systems a nice head start.

Healthcare Interoperability and Standards Rules

Posted on September 11, 2017 | Written By

Sunny is a serial entrepreneur on a mission to improve quality of care through data science. Sunny’s last venture, docBeat, a healthcare care coordination platform, was successfully acquired by Vocera Communications. Sunny has an impressive track record of strategy, business development, innovation, and execution in the healthcare, casino entertainment, retail, and gaming verticals. Sunny has been the Co-Chair of the Las Vegas Chapter of the Akshaya Patra Foundation (www.foodforeducation.org) since 2010.

Dave Winer is a true expert on standards. I remember coming across him in the early days of social media, when every platform was considering some sort of API. To illustrate his early involvement in standards: Dave was one of the early developers of the RSS standard that now powers feeds on virtually every blog and in many other places.

With this background in mind, I was extremely fascinated by a manifesto that Dave Winer published earlier this year called “Rules for Standards-Makers.” Sounds like something we really need in healthcare, no?

You should really go and read the full manifesto if you’re someone involved in healthcare standards. However, here’s the list of rules Dave offers standards makers:

  1. There are tradeoffs in standards
  2. Software matters more than formats (much)
  3. Users matter even more than software
  4. One way is better than two
  5. Fewer formats is better
  6. Fewer format features is better
  7. Perfection is a waste of time
  8. Write specs in plain English
  9. Explain the curiosities
  10. If practice deviates from the spec, change the spec
  11. No breakage
  12. Freeze the spec
  13. Keep it simple
  14. Developers are busy
  15. Mail lists don’t rule
  16. Praise developers who make it easy to interop

If you’ve never had to program to a standard, you might not understand these. Those who are deep into standards, however, will recognize the pitfalls, and will likely have horror stories about the challenges caused when these rules weren’t followed.
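A tiny, hand-made illustration of why rules like “one way is better than two” and “fewer format features is better” matter: the moment a hypothetical spec permits two date formats, every consumer must carry parsing code for both, forever.

```python
# Illustration only: a hypothetical spec that allows two date formats
# doubles the parsing paths every consumer must maintain.
from datetime import date, datetime

def parse_spec_date(value: str) -> date:
    """Accept both formats the hypothetical spec permits."""
    for fmt in ("%Y-%m-%d", "%m/%d/%Y"):  # two ways in the spec -> two branches here
        try:
            return datetime.strptime(value, fmt).date()
        except ValueError:
            continue
    raise ValueError(f"unrecognized date: {value!r}")
```

Both `"2017-09-11"` and `"09/11/2017"` parse to the same date, but only because every consumer has quietly agreed to do double the work; had the spec frozen on one format, the second branch would never exist.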

The thing I love most about Dave’s rules is that they focus on simplicity and function. Unfortunately, many standards in healthcare are focused on complexity and perfection. Healthcare has nailed the complexity part, and as Dave’s rules highlight, perfection is impossible with standards.

In fact, I skipped over Dave’s first rule for standards makers which highlights the above really well:

Rule #1: Interop is all that matters

As I briefly mentioned in the last CXO Scene podcast, many healthcare CIOs are waiting until the standards are perfect before they worry about interoperability. It’s as if they think that waiting for the perfect standard is going to solve healthcare interoperability. It won’t.

I hope that those building out standards in healthcare will take a deep look at the rules Dave Winer outlines above. We need better standards in healthcare and we need healthcare data to be interoperable.