
Connecting the Data: Three Steps to Meet Digital Transformation Goals

Posted on July 16, 2018 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

A white paper published by the World Economic Forum in 2016 begins with the statement, “Few industries have the potential to be changed so profoundly by digital technology as healthcare, but the challenges facing innovators – from regulatory barriers to difficulties in digitalizing patient data – should not be underestimated.”

That was two years ago, and many of the same challenges still exist as the digital transformation of healthcare continues.

In a recent HIMSS focus group sponsored by Liaison, participants identified their major digital transformation and interoperability goals for the near future as:

  • EMR rollout and integration
  • Population health monitoring and analytics
  • Remote clinical encounters
  • Mobile clinical applications

These goals are not surprising. Although EMRs have been in place in many healthcare organizations for years, the growth of health systems as they add physicians, clinics, hospitals and diagnostic centers represents a growing need to integrate disparate systems. The continual increase in the number of mobile applications and medical devices that can be used to gather information to feed into EMR systems further exacerbates the challenge.

What is surprising is the low percentage of health systems that believe that they are very or somewhat well-prepared to handle these challenges – only 35 percent of the HIMSS/Liaison focus group members identified themselves as well-prepared.

“Chaos” was a word used by focus group participants to describe what happens in a health system when numerous players, overlapping projects, lack of a single coordinator and a tendency to find niche solutions that focus on one need rather than overall organizational needs drive digital transformation projects.

It’s easy to understand the frustration. Too few IT resources and too many needs in the pipeline lead to multiple groups of people working on projects with overlapping goals – sometimes duplicating each other’s efforts – and taxing limited staff, budget and infrastructure resources. Focus group participants also noted that new technologies and changing regulatory requirements keep derailing multi-year projects.

Throughout all the challenges identified by healthcare organizations, the issue of data integrity is paramount. The addition of new technologies, including mobile and AI-driven analytics, and new sources of information, increases the need to ensure that data is in a format that is accessible to all users and all applications. Otherwise, the full benefits of digital transformation will not be realized.

The lack of universal standards to enable interoperability is being addressed, but until those standards are available, healthcare organizations must evaluate other ways to integrate and harmonize data so it is available to the myriad of users and applications that can benefit from the insights it provides. Unlocking access to previously unseen data takes resources that many health organizations have in short supply. And the truth is, we’ll never have perfect standards because they will always continue to change, so there’s no reason to wait.
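To make the harmonization idea concrete, here is a minimal sketch in Python, assuming two made-up source feeds with different field names and date formats. The canonical schema and field names are illustrative assumptions, not any particular vendor’s data model.

```python
# Illustrative sketch only: harmonizing patient records from two hypothetical
# source systems into one canonical format. Field names and formats are
# assumptions for demonstration, not any specific vendor's schema.
from datetime import datetime

def from_clinic_system(rec: dict) -> dict:
    """Map a record shaped like the hypothetical clinic feed to the canonical form."""
    return {
        "patient_id": rec["mrn"],
        "name": f'{rec["first_name"]} {rec["last_name"]}',
        "birth_date": datetime.strptime(rec["dob"], "%m/%d/%Y").date().isoformat(),
        "source": "clinic",
    }

def from_hospital_system(rec: dict) -> dict:
    """Map a record shaped like the hypothetical hospital feed to the canonical form."""
    return {
        "patient_id": rec["patientId"],
        "name": rec["fullName"],
        "birth_date": rec["birthDate"],  # already ISO 8601 in this feed
        "source": "hospital",
    }

if __name__ == "__main__":
    clinic_record = {"mrn": "C-1001", "first_name": "Jane", "last_name": "Doe", "dob": "03/14/1975"}
    hospital_record = {"patientId": "H-2002", "fullName": "Jane Doe", "birthDate": "1975-03-14"}

    harmonized = [from_clinic_system(clinic_record), from_hospital_system(hospital_record)]
    for row in harmonized:
        print(row)
```

Once both feeds land in the same shape, downstream users and applications can consume the data without caring which system it came from.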

Infrastructure, however, was not the number one resource identified in the HIMSS focus group as lacking in participants’ interoperability journey. In fact, only 15 percent saw infrastructure as the missing piece, while 30 percent identified IT staffing resources and 45 percent identified the right level of expertise as the most critical needs for their organization.

As all industries focus on digital transformation, competition for expert staff to handle interoperability challenges makes it difficult for healthcare organizations to attract the talent needed. For this reason, 45 percent of healthcare organizations outsource IT data integration and management to address staffing challenges.

Health systems are also evaluating the use of managed services strategies. A managed services solution takes over the day-to-day integration and data management with the right expertise and the manpower to take on complex work and fluctuating project levels. That way in-house staff resources can focus on the innovation and efficiencies that support patient care and operations, while the operating budget covers data management fees – leaving capital dollars available for critical patient care needs.

Removing day-to-day integration responsibilities from in-house staff also provides time to look strategically at the organization’s overall interoperability needs – coordinating efforts in a holistic manner. The ability to implement solutions for current needs with an eye toward future needs future-proofs an organization’s digital investment and helps avoid the “app-trap” – a reliance on narrowly focused applications with bounded data that cannot be accessed by disparate users.

There is no one answer to healthcare’s digital transformation questions, but taking the following three steps can move an organization closer to the goal of meaningful interoperability:

  • Don’t wait for interoperability standards to be developed – find a data integration and management platform that will integrate and harmonize data from disparate sources and make the information available to all users the way they need it and when they need it.
  • Turn to a data management and integration partner who can provide the expertise required to remain up-to-date on all interoperability, security and regulatory compliance requirements and other mandatory capabilities.
  • Approach digital transformation holistically with a coordinated strategy that considers each new application or capability as data gathered for the benefit of the entire organization rather than siloed for use by a narrowly-focused group of users.

The digital transformation of healthcare and the interoperability challenges that must be overcome are not minor issues, nor are they insurmountable. It is only through the sharing of ideas, information about new technologies and best practices that healthcare organizations can maximize the insights provided by data shared across the enterprise.

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies, a proud sponsor of Healthcare Scene. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

Important Patient Data Questions Hospitals Need To Address

Posted on July 13, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Obviously, managing and protecting patients’ personal health information is already very important. But with high-profile incidents highlighting questionable uses of consumer data – such as the recent Facebook scandal – patients are more aware of data privacy issues than they were in the past, says Dr. Oleg Bess, founder and CEO of clinical data exchange company 4medica.

According to Bess, hospitals should prepare to answer four key questions about personal health information that patients, the media and regulators are likely to ask. They include:

  • Who owns the patient’s medical records? While providers and EHR vendors may contend that they own patient data, it actually belongs to the patient, Bess says. What’s more, hospitals need to make sure patients have a clear idea of what data hospitals hold about them. Patients should also be able to access their health data regardless of where it is stored.
  • What if the patient wants his or her data deleted? Unfortunately, deleting patient data may not be possible in many cases due to legal constraints. For example, CMS demands that Medicare providers retain records for a fixed period, and many states have patient record retention laws as well, Bess notes. However, if nothing else, patients should have the ability to decline having their personally-identifiable data shared with third parties other than providers and payers, he writes.
  • Who is responsible for data integrity? Right now, problems with patient data accuracy are common. For example, health data can end up being mangled, particularly when patient matching tools like an enterprise master patient index aren’t in place. To this point, Bess cites a Black Book Research survey concluding that when records are transmitted between hospitals that don’t use these tools, the records have just a 24% match rate. Hospital data stewards need to get on top of this problem, he says.
  • Without a national patient ID in place, how should hospitals verify patient identities? In addition to existing issues regarding patient safety, emerging problems such as the growing opioid abuse epidemic would be better handled with a unique patient identifier, Bess contends. According to Bess, while the federal government may not develop unique patient IDs, commercially developed master patient index technology might offer a solution.

To better address patient matching issues, Bess recommends including historical data which goes back decades in the mix if possible. A master patient index solution should also offer enterprise scalability and real-time matching, he says.
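As a rough illustration of why historical attributes matter for matching, here is a minimal Python sketch. The scoring weights, field names and records are assumptions for demonstration only, not how any commercial EMPI actually scores matches.

```python
# Illustrative sketch only: matching an incoming registration against master
# patient index entries that keep historical attributes (previous names and
# addresses). The scoring weights are arbitrary assumptions, not a real EMPI
# algorithm.

def attribute_values(entry: dict, field: str) -> set:
    """Collect current and historical values for one attribute."""
    return {entry[field]} | set(entry.get("history", {}).get(field, []))

def match_score(incoming: dict, candidate: dict) -> float:
    score = 0.0
    if incoming["birth_date"] == candidate["birth_date"]:
        score += 0.4
    if incoming["last_name"].lower() in {v.lower() for v in attribute_values(candidate, "last_name")}:
        score += 0.3
    if incoming["address"].lower() in {v.lower() for v in attribute_values(candidate, "address")}:
        score += 0.3
    return score

index = [
    {"id": "MPI-1", "last_name": "Smith", "birth_date": "1980-06-01",
     "address": "12 Oak St", "history": {"last_name": ["Jones"], "address": ["9 Elm Ave"]}},
]

incoming = {"last_name": "Jones", "birth_date": "1980-06-01", "address": "9 Elm Ave"}

best = max(index, key=lambda c: match_score(incoming, c))
print(best["id"], match_score(incoming, best))  # matches via prior name and prior address
```

Without the historical name and address, this registration would look like a brand-new patient and create yet another duplicate record.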

Healthcare Interoperability Insights

Posted on June 29, 2018 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

I came across this great video by Diameter Health where Bonny Roberts talked with a wide variety of people at the interoperability showcase at HIMSS. If you want to get a feel for the challenges and opportunities associated with healthcare interoperability, take 5 minutes to watch this video:

What do you think of these healthcare interoperability perspectives? Does one of them stand out more than others?

I love the statement that’s on the Diameter Health website:

“We Cure Clinical Data Disorder”

What an incredible way to describe clinical data today. I’m not sure of the ICD-10 code for it, but there’s definitely a lot of clinical data disorder. It takes a real professional to clean the data, organize the data, enrich the data, and know how to make that data useful to people. It’s not a disorder that most people can treat on their own.

What’s a little bit scary is that this disorder is not going to get any easier. More data is on its way. Better to deal with your disorder now before it becomes a full-on chronic condition.

The Truth about AI in Healthcare

Posted on June 18, 2018 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

Those who watched the first season of the television show “The Good Doctor” got to see how a young autistic surgeon with savant syndrome faces challenges in his everyday life as he learns to connect with the people in his world. His extraordinary medical skill and intuition not only save patients’ lives but also create bridges with co-workers.

During each show, there is at least one scene in which the young doctor “visualizes” the inner workings of the patient’s body – evaluating and analyzing the cause of the medical condition.

Although all physicians can describe what happens to cause illness, the speed, detail and clarity of the young surgeon’s ability to gather information, predict reactions to treatments and identify the protocol that will produce the best outcome greatly surpasses his colleagues’ abilities.

Yes, this is a television show, but artificial intelligence promises the same capabilities that will disrupt all of our preconceived notions about healthcare on both the clinical and the operational sides of the industry.

Doctors rely on their medical training as well as their personal experience with hundreds of patients, but AI can allow clinicians to tap into the experience of hundreds of doctors with thousands of patients. Even if physicians had personal experience with thousands of patients, the human mind can’t process all of that data effectively.

How can AI improve patient outcomes as well as the bottom line?

We’re already seeing the initial benefits of AI in many areas of the hospital. A report by Accenture identifies the top three uses of AI in healthcare as robot-assisted surgery, virtual nursing assistants and administrative workflow assistance. These three AI applications alone represent a potential estimated annual benefit of $78 billion for the healthcare industry by 2026.

The benefits of AI include improved precision in surgery, decreased length of stay, reduction in unnecessary hospital visits through remote assessment of patient conditions, and time-saving capabilities such as voice-to-text transcription. According to Accenture, these improvements represent a work time savings of 17 percent for physicians and 51 percent for registered nurses – at a critical time when there is no end in sight for the shortages of both nurses and doctors.

In a recent webinar discussing the role of AI in healthcare, John Lynn, founder of HealthcareScene.com, described other ways that AI can improve diagnosis, treatment and patient safety. These areas include dosage error detection, treatment plan design, determination of medication adherence, medical imaging, tailored prescription medicine and automated documentation.

One of the challenges to fully leveraging the insights and capabilities of AI is the volume of unstructured data accumulated in electronic medical records. Translating this information into a format that clinical providers as well as financial and administrative staff can use to optimize treatment plans and workflows is possible with natural language processing, a branch of AI that enables technology to interpret speech and text and determine which information is critical.
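As a toy illustration of pulling structure out of free text, here is a minimal Python sketch. It uses a simple regular expression rather than real clinical natural language processing, and the note text and dose units are made up.

```python
# Illustrative sketch only: a toy stand-in for natural language processing that
# pulls drug-and-dose mentions out of a free-text note with a regular expression.
# Production NLP pipelines rely on trained clinical language models, not regexes.
import re

NOTE = ("Patient reports taking lisinopril 10 mg daily and metformin 500 mg twice daily. "
        "Denies chest pain.")

# Pattern: a word followed by a number and a dose unit (assumed unit list).
DOSE_PATTERN = re.compile(r"\b([A-Za-z]+)\s+(\d+(?:\.\d+)?)\s*(mg|mcg|g|units)\b", re.IGNORECASE)

for drug, amount, unit in DOSE_PATTERN.findall(NOTE):
    print(f"medication={drug.lower()} dose={amount} {unit.lower()}")
```

The point is simply that once narrative text is converted into discrete fields, it can feed the same analytics and workflows as structured EMR data.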

The most often cited fear about a reliance on AI in healthcare is the potential for mistakes. Of course, humans make mistakes as well. We must remember that AI’s ability to tap into a much wider pool of information to make decisions or recommend options will result in more deeply informed decisions – if the data is good.

The proliferation of legacy systems, continually added applications and multiple EMRs in a health system increases the risk of data that cannot be accessed or cannot be shared in real-time to aid clinicians or an AI-supported program. Ensuring that data is aggregated into a central location, harmonized, transformed into a usable format and cleaned to provide high quality data is necessary to support reliable AI performance.

While AI might be able to handle the data aggregation and harmonization tasks in the future, we are not there yet. This is not, however, a reason to delay the use of AI in hospitals and other organizations across the healthcare spectrum.

Healthcare organizations can partner with companies that specialize in the aggregation of data from disparate sources to make the information available to all users. Increasing access to data throughout the organization is beneficial to health systems – even before they implement AI tools.

Although making data available to all of the organization’s providers, staff and vendors as needed may seem onerous, it is possible to do so without adding to the hospital’s IT staff burden or the capital improvement budget. The complexities of translating structured and unstructured data, multiple formats and a myriad of data sources can be balanced with data security concerns with the use of a team that focuses on these issues each day.

While most AI capabilities in use today are algorithms that reflect current best practices or research that are programmed by healthcare providers or researchers, this will change. In the future, AI will expand beyond algorithms, and the technology will be able to learn and make new connections among a wider set of data points than today’s more narrowly focused algorithms.

Whether or not your organization is implementing AI, considering AI or just watching its development, I encourage everyone to start by evaluating the data that will be used to “run” AI tools. Taking steps now to ensure clean, easy-to-access data will not only benefit clinical and operational tasks now but will also position the organization to more quickly adopt AI.

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies, a proud sponsor of Healthcare Scene. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

VA Lighthouse Lab – Is the Healthcare Industry Getting It Right?

Posted on April 30, 2018 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

The following is a guest blog by Monica Stout from MedicaSoft

The U.S. Department of Veterans Affairs announced the launch of their Lighthouse Lab platform at HIMSS18 earlier this year. Lighthouse Lab is an open API framework that gives software developers tools to create mobile and web applications to help veterans manage their VA care, services, and benefits. Lighthouse Lab is also intended to help VA adopt more enterprise-wide and commercial-off-the-shelf products and to move the agency more in line with digital experiences in the private sector. Lighthouse Lab has a patient-centric end goal to help veterans better facilitate their care, services, and benefits.

Given its size and reach, VA is easily the biggest healthcare provider in the country. Adopting enterprise-level HL7 Fast Healthcare Interoperability Resources (FHIR)-based application programming interfaces (APIs) as its preferred way to share data when veterans receive care both in the community and at VA facilities sends a clear message to industry: rapidly-deployed, FHIR-ready solutions are where the industry is going. Simple and fast access to data is not only necessary, but expected. The HL7 FHIR standard and FHIR APIs are here to stay.

There is a lot of value in using enterprise-wide FHIR-based APIs. They use a RESTful approach, which means they rely on a uniform, predefined set of operations consistent with the way today’s web and mobile applications work. This makes it easier to connect and interoperate. Following an 80/20 rule, FHIR focuses on hitting the 80% of common use cases instead of the 20% of exceptions. FHIR supports a whole host of healthcare needs, including mobile applications, flexible custom workflows, device integrations, and cost savings.
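As a small illustration of how simple the RESTful pattern is, here is a minimal Python sketch of the standard FHIR “read” interaction. The base URL and patient ID are placeholders, not a real server or real data.

```python
# Illustrative sketch only: reading a Patient resource over FHIR's RESTful API.
# The base URL below is a placeholder; substitute your server's FHIR endpoint.
import requests

FHIR_BASE = "https://example.org/fhir"  # placeholder endpoint, not a real server

def get_patient(patient_id: str) -> dict:
    """GET [base]/Patient/[id], the standard FHIR read interaction."""
    resp = requests.get(
        f"{FHIR_BASE}/Patient/{patient_id}",
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    patient = get_patient("12345")  # hypothetical resource id
    print(patient.get("name"), patient.get("birthDate"))
```

Because the same uniform operations work for any resource type, developers can reuse the tooling they already know from ordinary web and mobile development.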

There is also value in sharing records. There are many examples of how a lack of interoperability has harmed patients and hindered care coordination. Imagine if technology eliminated those issues. With Lighthouse Lab, it appears VA is headed in the direction of innovation and interoperability, including improved patient care for the veterans it serves.

What do you think about VA Lighthouse Lab? Will this be the impetus to push the rest of the healthcare industry toward real interoperability?

About Monica Stout
Monica is a HIT teleworker in Grand Rapids, Michigan by way of Washington, D.C., who has consulted at several government agencies, including the National Aeronautics and Space Administration (NASA) and the U.S. Department of Veterans Affairs (VA). She’s currently the Marketing Director at MedicaSoft. Monica can be found on Twitter @MI_turnaround or @MedicaSoftLLC.

About MedicaSoft
MedicaSoft  designs, develops, delivers, and maintains EHR, PHR, and UHR software solutions and HISP services for healthcare providers and patients around the world. MedicaSoft is a proud sponsor of Healthcare Scene. For more information, visit www.medicasoft.us or connect with us on Twitter @MedicaSoftLLC, Facebook, or LinkedIn.

Is EMR Use Unfair To Patients?

Posted on April 24, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

As we all know, clinicians have good reasons to be aggravated with their EMRs. While the list of grievances is long — and legitimate — perhaps the biggest complaint is loss of control. I have to say that I sympathize; if someone forced me to adopt awkward digital tools to do my work I would go nuts.

We seldom discuss, however, the possibility that these systems impose an unfair burden on patients as well. But that’s the argument one physician makes in a recent op-ed for the American Council on Science and Health.

The author, Jamie Wells, MD, calls the use of EMRs “an ethical disaster,” and suggests that forced implementation of EMRs may violate the basic tenets of bioethics.

Some of the arguments Dr. Wells makes apply exclusively to physicians. For one thing, she contends that penalizing doctors who don’t adapt successfully to EMR use is unfair. She also suggests that EMRs create needless challenges that can erode physicians’ ability to deliver quality care, add significant time to a physician’s workday and force doctors to participate in related continuing education whether or not they want to do so.

Unlike many essays critiquing this topic, Wells also contends that patients are harmed by EMR use.

For example, Wells argues that since patients are never asked whether they want physicians to use EMRs, they never get the chance to consider the risks and benefits associated with EHR data use in developing care plans. Also, they are never given a chance to weigh in on whether they are comfortable having less face time with their physicians, she notes.

In addition, she says that since EMRs prompt physicians to ask questions not relevant to that patient’s care, adding extra steps to the process, they create unfair delays in a patient’s getting relief from pain and suffering.

What’s more, she argues that since EMR systems typically aren’t interoperable, they create inconveniences which can ultimately interfere with the patient’s ability to choose a provider.

Folks, you don’t have to convince me that EMR implementations can unfairly rattle patients and caregivers. As I noted in a previous essay, my mother recently went through a terrifying experience when the hospital where my brother was being cared for went through an EMR implementation at a crucial point in his care. She was rightfully concerned that staff might be more concerned with adapting to the EMR and somewhat less focused on her extremely fragile son’s care.

As I noted in the linked article above, I believe that health executives should spend more time considering the potentially negative effects of their health IT initiatives on patients. Maybe these execs will have to have a sick relative at the hospital during a rollout before they’ll make the effort.

Hospital Patient Identification Still A Major Problem

Posted on April 18, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

A new survey suggests that problems with duplicate patient records and patient identification are still costing hospitals a tremendous amount of money.

The survey, which was conducted by Black Book Research, collected responses from 1,392 health technology managers using enterprise master patient index technology. Researchers asked them what gaps, challenges and successes they’d seen in patient identification processes from Q3 2017 to Q1 2018.

Survey respondents reported that 33% of denied claims were due to inaccurate patient identification, and that inaccurate patient identification cost the average hospital $1.5 million last year. The survey also concluded that the average cost of duplicate records was $1,950 per patient per inpatient stay and more than $800 per ED visit.

In addition, researchers found that hospitals with over 150 beds took an average of more than 5 months to clean up their data. This included process improvements focused on data validity checking, normalization and data cleansing.

Having the right tools in place seemed to help. Hospitals said that before they rolled out enterprise master patient index solutions, an average of 18% of their records were duplicates, and that match rates when sharing data with other organizations averaged 24%.

Meanwhile, hospitals with EMPI support in place since 2016 reported that patient records were identified correctly during 93% of registrations and in 85% of records shared externally with non-networked providers.

Not surprisingly, though, this research doesn’t tell the whole story. While using EMPI tools makes sense, the healthcare industry should hardly stop there, according to Gartner Group analyst Wes Rishel.

“We simply need innovators that have the vision to apply proven identity matching to the healthcare industry – as well as the gumption and stubbornness necessary to thrive in a crowded and often slow-moving healthcare IT market,” he wrote.

Rishel argues that to improve patient matching, it’s time to start cross-correlating demographic data from patients with demographic data from third-party sources, such as public records, credit agencies or telephone companies. What makes this data particularly helpful is that it includes not just current and correct attributes for a person, but also out-of-date and incorrect attributes like previous addresses, maiden names and typos.

Ultimately, these “referential matching” approaches will significantly outperform existing probabilistic models, Rishel argues.
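Here is a minimal Python sketch of the referential idea. The reference dataset, names and addresses are invented for illustration, and the lookup is far simpler than a production matching engine.

```python
# Illustrative sketch only: "referential matching" against a hypothetical
# third-party reference dataset that carries both current and out-of-date
# attributes (prior addresses, maiden names). Two hospital records that each
# resolve to the same reference identity are treated as the same person.
from typing import Optional

REFERENCE_DB = [
    {"ref_id": "R-777",
     "names": {"ann walker", "ann carter"},        # maiden and married names
     "addresses": {"44 birch rd", "18 lake dr"}},  # prior and current addresses
]

def resolve(record: dict) -> Optional[str]:
    """Return the reference identity this record resolves to, if any."""
    name = record["name"].lower()
    addr = record["address"].lower()
    for person in REFERENCE_DB:
        if name in person["names"] and addr in person["addresses"]:
            return person["ref_id"]
    return None

hospital_a = {"name": "Ann Carter", "address": "18 Lake Dr"}
hospital_b = {"name": "Ann Walker", "address": "44 Birch Rd"}  # older name and address

print(resolve(hospital_a) == resolve(hospital_b))  # True: same reference identity
```

A purely probabilistic comparison of these two records would likely score them as different people; the reference data is what links the old and new identities.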

It’s really shocking that so many healthcare organizations don’t have an EMPI solution in place, especially now that cloud offerings have made EMPI available to organizations of all sizes. EMPI is needed for the financial reasons mentioned above, but also from a patient care and patient safety perspective.

PointClickCare Tackling Readmissions from Long-Term and Post-Acute Care Facilities Head-On

Posted on January 12, 2018 | Written By

Colin Hung is the co-founder of the #hcldr (healthcare leadership) tweetchat, one of the most popular and active healthcare social media communities on Twitter. Colin speaks, tweets and blogs regularly about healthcare, technology, marketing and leadership. He is currently an independent marketing consultant working with leading healthIT companies. Colin is a member of #TheWalkingGallery. His Twitter handle is: @Colin_Hung.

Transitioning from an acute care to a long-term/post-acute care (LTPAC) facility can be dangerous.

According to one study, nearly 23% of patients discharged from a hospital to a LTPAC facility had at least 1 readmission. Research indicates that the leading cause of readmission is harm caused by medication (called an adverse drug event). Studies have shown that as much as 56% of all medication errors happen at a transitional point of care.

By the year 2050 more than 27 million Americans will be using LTPAC services. The majority of these LTPAC patients will transition from an acute care facility at least once each year. With this many transitions, the number of medication errors each year would balloon into the millions. The impact on patients and on the healthcare system itself would be astronomical.
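To see how quickly the numbers add up, here is a back-of-envelope sketch. The transition share and error rate below are assumptions chosen only to illustrate the scale; they are not figures from the article or the studies cited above.

```python
# Back-of-envelope illustration only: the rates below are assumptions chosen to
# show how transition counts scale into error counts, not published figures.
LTPAC_USERS_2050 = 27_000_000   # from the article: 27+ million Americans using LTPAC services
TRANSITION_FRACTION = 0.6       # assumption: share transitioning from acute care each year
ERRORS_PER_TRANSITION = 0.1     # assumption: medication errors per transition

transitions = LTPAC_USERS_2050 * TRANSITION_FRACTION
estimated_errors = transitions * ERRORS_PER_TRANSITION
print(f"{transitions:,.0f} transitions -> {estimated_errors:,.0f} medication errors per year")
```

Even with conservative assumptions, the count lands in the millions, which is the point the projection above is making.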

Thankfully, there is a solution: medication reconciliation.

The Agency for Healthcare Research and Quality (AHRQ) states: “Patients frequently receive new medications or have medications changed during hospitalizations. Lack of medication reconciliation results in the potential for inadvertent medication discrepancies and adverse drug events—particularly for patients with low health literacy, or those prescribed high-risk medications or complex medication regimens.”

Medication reconciliation is a process in which an accurate list of the medications a patient is taking is maintained at all times. That list is compared to admission, transfer and/or discharge orders at all transitional points, both within a facility and between facilities. By comparing orders against existing medications, clinicians and caregivers are able to prevent drug interactions and complications due to omissions or dosage discrepancies.
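Here is a minimal Python sketch of the comparison at the heart of medication reconciliation. The drug names, doses and record layout are hypothetical, and a real implementation would also handle routes, frequencies and therapeutic equivalents.

```python
# Illustrative sketch only: comparing a patient's current medication list with
# discharge orders to flag omissions and dose discrepancies. All data is made up.

current_meds = {
    "lisinopril": "10 mg daily",
    "metformin": "500 mg twice daily",
    "warfarin": "5 mg daily",
}

discharge_orders = {
    "lisinopril": "20 mg daily",        # dose changed during the stay
    "metformin": "500 mg twice daily",
    # warfarin missing from the orders: possible omission
}

for drug, dose in current_meds.items():
    ordered = discharge_orders.get(drug)
    if ordered is None:
        print(f"OMISSION: {drug} ({dose}) not in discharge orders")
    elif ordered != dose:
        print(f"DISCREPANCY: {drug} ordered as {ordered}, patient was on {dose}")

for drug in discharge_orders.keys() - current_meds.keys():
    print(f"NEW MEDICATION: {drug} added at discharge")
```

The value comes from running this comparison at every transition point, not just once, so that a changed dose or a dropped anticoagulant is caught before it harms the patient.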

What is surprising is the lack of progress in this area.

We have been talking about interoperability for years in HealthIT. Hundreds of vendors make announcements at the annual HIMSS conference about their ability to share data. Significant investments have been made in Health Information Exchanges (HIEs). Yet despite all of this, there has been relatively little progress made or coverage given to this problem of data exchange between hospitals and LTPAC facilities.

One company in the LTPAC space is working to change that. PointClickCare, one of the largest EHR providers to skilled nursing facilities, home care providers and senior living centers in North America, is dedicating resources and energy to overcoming the challenge of data sharing – specifically for medication reconciliation.

“We are tackling the interoperability problem head-on,” says Dave Wessinger, co-founder and Chief Operating Officer at PointClickCare. “The way we see it, there is absolutely no reason why it can take up to three days for an updated list of medications to arrive at our customer’s facility from a hospital. In that time patients are unnecessarily exposed to potential harm. That’s unacceptable and we are working with our customers and partners to address it.”

Over the past 12 months, the PointClickCare team has made significant progress integrating their platform with other players in the healthcare ecosystem – hospitals, pharmacies, HIEs, ACOs, physician practices and labs. According to Wessinger, PointClickCare is now at a point where they have “FHIR-ready” APIs and web-services.

“We believe that medication reconciliation is the key to getting everyone in the ecosystem to unlock their data,” continues Wessinger. “There is such a tremendous opportunity for all of us in the healthcare vendor community to work together to solve one of the biggest causes of hospital readmissions.”

Amie Downs, Senior Director ISTS Info & App Services at Good Samaritan Society, an organization that operates 165 skilled nursing facilities in 24 states and a PointClickCare customer, agrees strongly with Wessinger: “We have the opportunity to make medication reconciliation our first big interoperability win as an industry. We need a use-case that shows benefit. I can’t think of a better one than reducing harm to patients while simultaneously preventing costly readmissions. I think this can be the first domino so to speak.”

Having the technology infrastructure in place is just part of the challenge. Getting organizations to agree to share data is a significant hurdle and once you get organizations to sit down with each other, the challenge is resisting the temptation just to dump data to each other. Downs summed it up this way:

“What is really needed is for local acute care facilities to partner with local long-term and post-acute care facilities. We need to sit down together and pick the data that we each want/need to provide the best care for patients. We need to stop just sending everything to each other through a direct connection, on some sort of encrypted media that travels with the patient, via fax or physically printed on a piece of paper and then expecting the other party to sort it out.”

Downs goes on to explain how narrowing the scope of data exchange is beneficial: “I definitely see a strong future for CCDA data exchange to help in medication reconciliation. Right now medication information is just appended to the file we receive from acute care facilities. We need to agree on what medication information we really need. Right now, we get the entire medication history of the patient. What we really need is just the active medications that the patient is on.”
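Here is a minimal Python sketch of the kind of narrowing Downs describes. The record layout and status values are assumptions for illustration, not the C-CDA structure itself.

```python
# Illustrative sketch only: reducing a full medication history to the active
# medications before sending it on. The layout and status values are assumed.

medication_history = [
    {"drug": "amoxicillin", "status": "completed",    "stop_date": "2016-02-10"},
    {"drug": "lisinopril",  "status": "active",       "stop_date": None},
    {"drug": "ibuprofen",   "status": "discontinued", "stop_date": "2017-08-01"},
    {"drug": "metformin",   "status": "active",       "stop_date": None},
]

active_meds = [m for m in medication_history if m["status"] == "active"]
print([m["drug"] for m in active_meds])  # ['lisinopril', 'metformin']
```

Sending only the active list keeps the receiving facility from having to sift through years of history to figure out what the patient is actually taking today.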

In addition to working on FHIR and APIs, BJ Boyle, Director of Product Management at PointClickCare, is also leading a data sharing initiative for those instances when there is no fellow EHR platform to connect to. “We are working towards something that is best described as a ‘Post-Acute Care Cloud’ or ‘PAC Cloud’,” explains Boyle. “We’re designing it so that hospital case managers can go to a single place and get all the information they need from the various SNFs they refer patients to. Today, when HL7 integration isn’t possible, case managers have to be given authorized access to the SNF’s system. That’s not ideal.”

PointClickCare has already taken an initial step towards this vision with an offering called eINTERACT. According to the company’s website eINTERACT allows for the “early identification of changes in condition…and the sooner a change in condition is identified, the quicker interventions can be implemented to prevent decline and avoid potential transfers” which is key to managing patient/resident health.

It’s worth noting that John Lynn blogged about LTPAC readmissions in 2014. Unfortunately at the macro/industry level, not much has changed. Dealing with readmissions from LTPAC facilities is not particularly exciting. Much of the attention remains with consumer-monitoring devices, apps and gadgets around the home.

Having said that, I do find it encouraging to see real progress being made by companies like PointClickCare and Good Samaritan Society. I hope to find more examples of practical interoperability that impacts patient care while touring the HIMSS18 exhibit floor in early March. In the meantime, I will be keeping my eye on PointClickCare and the LTPAC space to see how these interoperability initiatives progress.

When It Comes To Meaningful Use, Some Vendors May Have An Edge

Posted on December 1, 2017 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

A new article appearing in the Journal of the American Medical Informatics Association has concluded that while EHRs certified under the meaningful use program should perform more or less equally, they don’t.

After conducting an analysis, researchers found that there were significant associations between specific vendors and level of hospital performance for all six meaningful use criteria they were using as a yardstick. Epic came out on top by this measure, demonstrating significantly higher performance on five of the six criteria.

However, it’s also worth noting that EHR vendor choice by hospitals accounted for anywhere between 7% and 34% of performance variation across the six meaningful use criteria. In other words, researchers found that at least in some cases, EHR performance was influenced as much by the fit between platform and hospital as the platform itself.

To conduct the study, researchers used recent national data on which certified EHR vendors hospitals had implemented, along with hospital performance on six meaningful use criteria. They sought to find out:

  • Whether certain vendors were found more frequently among the highest performing hospitals, as measured by performance on Stage 2 meaningful use criteria;
  • Whether the relationship between vendor and hospital performance was consistent across the meaningful use criteria, or whether vendors specialized in certain areas; and
  • What proportion of variation in performance across hospitals could be explained by vendor characteristics.

To measure the performance of various vendors, the researchers chose six core Stage 2 meaningful use criteria, including:

  • 60% of medication orders entered using CPOE;
  • providing 50% of patients with the ability to view/download/transmit their health information;
  • medication reconciliation performed for 50% of patients received from another setting or care provider;
  • a summary of care record provided for 50% of patient transitions to another setting or care provider; and
  • a summary of care record electronically transmitted for 10% of patient transitions to another setting or care provider.
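Here is a minimal Python sketch that checks a hospital’s measure rates against the Stage 2 thresholds cited above; the performance figures are made up for illustration.

```python
# Illustrative sketch only: checking hypothetical hospital measure rates against
# the Stage 2 thresholds cited above. The rates in `hospital_rates` are invented.

thresholds = {
    "cpoe_medication_orders": 0.60,
    "view_download_transmit": 0.50,
    "medication_reconciliation": 0.50,
    "summary_of_care_provided": 0.50,
    "summary_of_care_transmitted": 0.10,
}

hospital_rates = {  # hypothetical performance figures
    "cpoe_medication_orders": 0.82,
    "view_download_transmit": 0.41,
    "medication_reconciliation": 0.67,
    "summary_of_care_provided": 0.55,
    "summary_of_care_transmitted": 0.12,
}

for measure, minimum in thresholds.items():
    met = hospital_rates[measure] >= minimum
    print(f"{measure}: {hospital_rates[measure]:.0%} "
          f"(threshold {minimum:.0%}) -> {'met' if met else 'not met'}")
```

The researchers’ comparison works the same way at scale: each hospital is scored against each threshold, and the top performance quartile is drawn from those scores.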

After completing their analysis, researchers found that three hospitals were in the top performance quartile for all six meaningful use criteria, and all three used Epic. Of the 17 hospitals in the top performance quartile for five criteria, 15 used Epic, one used MEDITECH and one used another smaller vendor. Among the 68 hospitals in the top quartile for four criteria, 64.7% used Epic, 11.8% used Cerner and 8.8% used MEDITECH.

When it came to hospitals that were not in the top quartile for any of the criteria, there was no overwhelming connection between vendor and results. For the 355 hospitals in this category, 28.7% used MEDITECH, 25.1% used McKesson, 20.3% used Cerner, 14.4% used MEDHOST and 6.8% used Epic.

All of this being said, the researchers noted that neither hospital characteristics nor vendor choice explained more than a small amount of the performance variation they saw. This won’t surprise anybody who’s seen firsthand how much other issues, notably human factors, can change the outcome of processes like these.

It’s also worth noting that there might be other causes for these differences. For example, a hospital or health system that can afford the notably expensive Epic systems can likely also afford to invest in meaningful use compliance. This added investment could explain hospitals’ meaningful use performance as much as EHR choice does.

Surescripts Deal Connects EMR Vendors And PBMs To Improve Price Transparency

Posted on November 22, 2017 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

I’m no expert on the pharmacy business, but from where I sit as a consumer it’s always looked to me as though pharmaceutical pricing is something of a shell game. It makes predicting what your airline ticket will cost seem like child’s play.

Yes, in theory, the airlines engage in demand-oriented pricing, while pharma pricing is based on negotiated prices spread among multiple contracted parties, but in either case end-users such as myself have very little visibility into where these numbers are coming from. And in my opinion, at least, that’s not good for anyone involved. You can say “blah blah blah skin in the game” all you want, but co-pays are a poor proxy for making informed decisions as a patient about what benefits you’ll accrue and what problems you’ll face when buying a drug.

Apparently, Surescripts hopes to change the rules to some degree. It just announced that it has come together with two other interest groups within the pharmacy supply chain to offer patient-specific benefit and price information to providers at the point of care.

Its partners in the venture include a group of EMR companies, including Cerner, Epic, Practice Fusion and Aprima Medical Software, which it says represent 53% of the U.S. physician base. It’s also working with two pharmacy benefit managers (CVS Health and Express Scripts) which embrace almost two-thirds of US patients.

The new Surescripts effort actually has two parts: a Real-Time Prescription Benefit tool and an expanded version of its Prior Authorization solution. Used together, and integrated with an EHR, these tools will clarify whether the patient’s health insurance will cover the drug suggested by the provider and offer therapeutic alternatives that might come at a lower price.
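Here is a minimal Python sketch of the kind of point-of-care check such tools enable. The formulary data, drug names and prices are invented, and this is not Surescripts’ actual API; it only shows the idea of surfacing coverage and lower-cost therapeutic alternatives at prescribing time.

```python
# Illustrative sketch only: a made-up formulary lookup that reports coverage,
# estimated patient cost, and cheaper therapeutic alternatives.

formulary = {  # hypothetical plan data: drug -> coverage, copay, alternatives
    "brandstatin":   {"covered": True, "copay": 140.00, "alternatives": ["genericstatin"]},
    "genericstatin": {"covered": True, "copay": 10.00,  "alternatives": []},
}

def benefit_check(drug: str) -> None:
    info = formulary.get(drug)
    if info is None:
        print(f"{drug}: no benefit information available")
        return
    print(f"{drug}: covered={info['covered']}, estimated copay=${info['copay']:.2f}")
    for alt in info["alternatives"]:
        alt_info = formulary[alt]
        if alt_info["copay"] < info["copay"]:
            print(f"  lower-cost alternative: {alt} at ${alt_info['copay']:.2f}")

benefit_check("brandstatin")
```

Surfacing that kind of comparison inside the EHR at the moment of prescribing is what gives the provider, and by extension the patient, a chance to choose the cheaper option.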

If you ask me, this is clever but fails to put pressure on the right parties. You don’t have to be a pharmaceutical industry expert to know that middlemen like PBMs and pharmacies use a number of less-than-visible stratagems to jack up drug prices. Patients are forced to just cope with whatever deal these parties strike among themselves.

If you really want to build a network which helps consumers keep prices down, go for some real disclosure. Create a network which gathers and shares price information every time the drug changes hands, up to and including when the patient pays for that drug. This could have a massive effect on drug pricing overall.

Hey, look at what Amazon did just by making costs of shipping low and relatively transparent to end-users. They sucked a lot of the transaction costs out of the process of shipping products, then gave consumers tools allowing them to watch that benefit in action.

Give consumers even one-tenth of that visibility into their pharmacy supply chain, and prices would fall like a hot rock. Gee, I wonder why nobody’s ever tried that. Could it be that pharmaceutical manufacturers don’t want us to know the real costs of making and shipping their product?