
Do We Need Another Interoperability Group?

Posted on September 20, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Over the last few years, industry groups dedicated to interoperability have been popping up like mushrooms after a hard rain. All seem to be dedicated to solving the same set of intractable data sharing problems.

The latest interoperability initiative on my radar, known as the Da Vinci Project, is focused on supporting value-based care.

The Da Vinci Project, which brings together more than 20 healthcare companies, is using HL7 FHIR to foster VBC (Value Based Care). Members include technology vendors, providers, and payers, including Allscripts, Anthem Blue Cross and Blue Shield, Cerner, Epic, Rush University Medical Center, Surescripts, UnitedHealthcare, Humana and Optum. The initiative is hosted by HL7 International.

Da Vinci project members plan to develop a common set of standards for data exchange that can be used nationally. The idea is to help partner organizations avoid spending money on one-off data sharing development projects.

The members are already at work on two test cases, one addressing 30-day medication reconciliation and the other coverage requirements discovery. Next, members will begin work on test cases for document templates and coverage rules, along with eHealth record exchange in support of HEDIS/STARS and clinician exchange.
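To make the medication reconciliation test case a bit more concrete, here is a minimal sketch of the kind of FHIR query a partner system might run to pull a patient's recent medication orders. The endpoint URL and patient ID are hypothetical, and a real Da Vinci implementation would add authentication and follow the project's published implementation guides.

```python
from datetime import date, timedelta
import requests

# Hypothetical FHIR R4 endpoint and patient ID; a real Da Vinci partner would
# use the other organization's server plus an OAuth 2.0 access token.
FHIR_BASE = "https://fhir.example-payer.com/r4"
HEADERS = {"Accept": "application/fhir+json"}

def fetch_recent_medications(patient_id: str, days: int = 30) -> list:
    """Pull MedicationRequest resources authored in the last `days` days,
    the kind of query a 30-day medication reconciliation workflow starts from."""
    cutoff = (date.today() - timedelta(days=days)).isoformat()
    params = {"patient": patient_id, "authoredon": f"ge{cutoff}"}
    resp = requests.get(f"{FHIR_BASE}/MedicationRequest", params=params, headers=HEADERS)
    resp.raise_for_status()
    bundle = resp.json()
    return [entry["resource"] for entry in bundle.get("entry", [])]

if __name__ == "__main__":
    for med in fetch_recent_medications("12345"):
        name = med.get("medicationCodeableConcept", {}).get("text", "unknown")
        print(name, med.get("authoredOn"))
```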

Of course, these goals sound good in theory. Making it simpler for health plans, vendors and providers to create data sharing standards in common is probably smart.

The question is, is this effort really different from others fronted by Epic, Cerner and the like? Or perhaps more importantly, does its approach suffer from limitations that seem to have crippled other attempts at fostering interoperability?

As my colleague John Lynn notes, it’s probably not wise to be too ambitious when it comes to solving interoperability problems. “One of the major failures of most interoperability efforts is that they’re too ambitious,” he wrote earlier this year. “They try to do everything and since that’s not achievable, they end up doing nothing.”

John’s belief – which I share — is that it makes more sense to address “slices of interoperability” rather than attempt to share everything with everyone.

It’s possible that the Da Vinci Project may actually be taking such a practical approach. Enabling partners to create point-to-point data sharing solutions easily sounds very worthwhile, and could conceivably save money and improve care quality. That’s what we’re all after, right?

Still, the fact that they’re packaging this as a VBC initiative gives me pause. Hey, I know that fee-for-service reimbursement is on its way out and that it will take new technology to support new payment models, but is that really what’s happening here? I have to wonder.

Bottom line, if the giants involved are still slapping buzzwords on the project, I’m not sure they know what they’re doing yet. I guess we’ll just have to wait and see where they go with it.

Within Two Years, 20% Of Healthcare Orgs Will Be Using Blockchain

Posted on August 16, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

I don’t know about you, but to me, blockchain news seems to be all over the map. It’s like a bunch of shiny objects. Here! Look at the $199 zillion investment this blockchain company just picked up! Wow! Giant Hospital System is using blockchain to automate its cafeteria! And so on. It gets a bit tiring.

However, I’m happy to say that the latest piece of blockchain news to cross my desk seems boring (and practical) in comparison. The news is that according to a Computerworld piece, 20% of healthcare organizations should be using blockchain for operations management and patient identity by 2020, or in other words within two years. And to be clear, we’re talking about systems in day-to-day use, not pilot projects.

The stats come from a report by analyst firm IDC Health Insights, which takes a look at, obviously, blockchain use in the healthcare industry. In the report, researchers note that healthcare has been slower out of the blockchain gate than other industries for reasons that include regulatory and security concerns and blockchain resource availability. Oh, and while the story doesn’t spell this out, good ol’ conservative decision-making has played its part too.

But now things are changing. IDC predicts that in addition to supporting internal operations, blockchain could form the basis for a new health information exchange architecture. Specifically, blockchain could be used to create a mesh network capable of sharing information between stakeholders such as providers, pharmacies, insurance payers and clinical researchers, the report suggests. This architecture could be far more useful than the existing point-to-point approach HIEs use now, as it would be more flexible, more fault-tolerant and less prone to bottlenecks.
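To illustrate the "immutable" property the report leans on, here is a toy sketch of a hash-linked chain of exchange events. It is purely illustrative and not any vendor's implementation; a production blockchain would add consensus, distribution across nodes, and careful design so that no PHI lives on the chain itself.

```python
import hashlib
import json
import time

def make_block(record_metadata: dict, prev_hash: str) -> dict:
    """Create a block whose hash covers the payload and the previous block's
    hash, so tampering with any earlier block breaks the chain."""
    block = {
        "timestamp": time.time(),
        "payload": record_metadata,   # e.g. a pointer to a record, not PHI itself
        "prev_hash": prev_hash,
    }
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

def verify_chain(chain: list) -> bool:
    """Recompute every hash and confirm each block points at its predecessor."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if recomputed != block["hash"]:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_block({"event": "genesis"}, prev_hash="0" * 64)
chain = [genesis,
         make_block({"event": "record shared", "from": "Hospital A", "to": "Pharmacy B"},
                    genesis["hash"])]
print(verify_chain(chain))  # True
```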

As part of the report, IDC offers some advice to healthcare organizations interested in taking on blockchain options. It includes recommendations that they:

  • See to it that any blockchain-related decisions are evidence-based and informed and that stakeholders share information about the pros and cons of blockchain interoperability freely
  • Develop a blockchain interoperability proof of concept which demonstrates how decentralized, distributed and immutable properties could make a contribution
  • Pitch the benefits of blockchain interoperability to providers and patients, letting them know that it could eliminate barriers to getting the data they need when and where they need it
  • Adopt blockchain interoperability early if at all, as this can offer benefits even prior to implementation, and gives leaders a chance to tackle concerns privately if need be

Of course, these suggestions and factoids barely scratch the surface of the blockchain discussion, which is why IDC gets $4,000 a copy for the full report. (Though I should note that the article goes into a lot more depth than I have here.)

Regardless, what came across to me from the article is worth thinking about when kicking around possible blockchain strategies. Broadly speaking, providers should get in early, keep everyone involved (including patients and providers), work out differences over its use privately and see to it that the rollout meets concrete needs. You may also want to read this article on 5 blockchain uses for healthcare; the best applications may not be in the places you’d have thought of previously.

And now, back to silly blockchain news. I’ll let you know when another set of practical ideas shows up.

Hospitals That Share Patients Don’t Share Patient Data

Posted on August 7, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

If anyone in healthcare needs access to your records, it’s another provider who is treating you as a mutual patient. In this day and age, there’s no good reason why clinicians at one hospital should have to guess what their counterparts at another hospital would see (or, as is far too often the case, not see).

Over the last few years, we’ve certainly seen signs of data sharing progress. For example, in early August the marriage between health data sharing networks CommonWell and Carequality was consummated, with providers using Cerner and Greenway Health going live with their connections.

Still, health data exchange is far more difficult than it should be. Despite many years of trying, hospitals still don’t share data with each other routinely, even when they’re treating the same patient.

To learn more about this issue, researchers surveyed pairs of hospitals likely to share patients across the United States. The teams chose pairs which referred the largest volume of patients to each other in a given hospital referral region.

After reaching out to many facilities, the researchers ended up with 63 pairs of hospitals. Researchers then asked them how likely they were to share patient health information with nearby institutions with whom they share patients.

The results, which appeared in the Journal of the American Medical Informatics Association, suggest that while virtually all of the hospitals they studied could be classified as routinely sharing data by federal definitions, that didn’t tell the whole story.

For one thing, while 97% of respondents met the federal guidelines, only 63% shared data routinely with hospitals with the highest shared patient (HSP) volume.

In fact, 23% of respondents reported that information sharing with their HSP hospital was worse than with other hospitals, and 48% said there was no difference. Just 17% said they enjoyed better sharing of patient health data with their HSP hospital.

It’s not clear how to fix the problem highlighted in the JAMIA study. While HIEs have been lumbering along for well more than a decade, only a few regional players seem to have developed a trusted relationship with the providers in their area.

The techniques HIEs use to foster such loyalty, which include high-touch methods such as personal check-ins with end users, don’t seem to work as well for some HIEs as they do for others. Not only that, HIE funding models still vary, which can have a meaningful impact on how successful they’ll be overall.

Regardless, it would be churlish to gloss over the fact that almost two-thirds of hospitals are getting the right data to their peers. I don’t know about you, but this seems like a hopeful development.

Connecting the Data: Three Steps to Meet Digital Transformation Goals

Posted on July 16, 2018 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

A white paper published by the World Economic Forum in 2016 begins with the statement, “Few industries have the potential to be changed so profoundly by digital technology as healthcare, but the challenges facing innovators – from regulatory barriers to difficulties in digitalizing patient data – should not be underestimated.”

That was two years ago, and many of the same challenges still exist as the digital transformation of healthcare continues.

In a recent HIMSS focus group sponsored by Liaison, participants identified their major digital transformation and interoperability goals for the near future as:

  • EMR rollout and integration
  • Population health monitoring and analytics
  • Remote clinical encounters
  • Mobile clinical applications

These goals are not surprising. Although EMRs have been in place in many healthcare organizations for years, the growth of health systems as they add physicians, clinics, hospitals and diagnostic centers represents a growing need to integrate disparate systems. The continual increase in the number of mobile applications and medical devices that can be used to gather information to feed into EMR systems further exacerbates the challenge.

What is surprising is the low percentage of health systems that believe that they are very or somewhat well-prepared to handle these challenges – only 35 percent of the HIMSS/Liaison focus group members identified themselves as well-prepared.

“Chaos” was a word used by focus group participants to describe what happens in a health system when numerous players, overlapping projects, lack of a single coordinator and a tendency to find niche solutions that focus on one need rather than overall organizational needs drive digital transformation projects.

It’s easy to understand the frustration. Too few IT resources and too many needs in the pipeline lead to multiple groups of people working on projects that overlap in goals – sometimes duplicating each other’s efforts – and tax limited staff, budget and infrastructure resources. It was also interesting to see that focus group participants noted that new technologies and changing regulatory requirements keep derailing efforts over multi-year projects.

Throughout all the challenges identified by healthcare organizations, the issue of data integrity is paramount. The addition of new technologies, including mobile and AI-driven analytics, and new sources of information, increases the need to ensure that data is in a format that is accessible to all users and all applications. Otherwise, the full benefits of digital transformation will not be realized.

The lack of universal standards to enable interoperability is being addressed, but until those standards are available, healthcare organizations must evaluate other ways to integrate and harmonize data to make it available to the myriad of users and applications that can benefit from insights provided by the information. Unlocking access to previously unseen data takes resources that many health organizations have in short supply. And the truth is, we’ll never have perfect standards, as they will always continue to change, so there’s no reason to wait.

Infrastructure, however, was not the number one resource identified in the HIMSS focus group as lacking in participants’ interoperability journey. In fact, only 15 percent saw infrastructure as the missing piece, while 30 percent identified IT staffing resources and 45 percent identified the right level of expertise as the most critical needs for their organization.

As all industries focus on digital transformation, competition for expert staff to handle interoperability challenges makes it difficult for healthcare organizations to attract the talent needed. For this reason, 45 percent of healthcare organizations outsource IT data integration and management to address staffing challenges.

Health systems are also evaluating the use of managed services strategies. A managed services solution takes over the day-to-day integration and data management with the right expertise and the manpower to take on complex work and fluctuating project levels. That way in-house staff resources can focus on the innovation and efficiencies that support patient care and operations, while the operating budget covers data management fees – leaving capital dollars available for critical patient care needs.

Removing day-to-day integration responsibilities from in-house staff also provides time to look strategically at the organization’s overall interoperability needs – coordinating efforts in a holistic manner. The ability to implement solutions for current needs with an eye toward future needs future-proofs an organization’s digital investment and helps avoid the “app-trap” – a reliance on narrowly focused applications with bounded data that cannot be accessed by disparate users.

There is no one answer to healthcare’s digital transformation questions, but taking the following three steps can move an organization closer to the goal of meaningful interoperability:

  • Don’t wait for interoperability standards to be developed – find a data integration and management platform that will integrate and harmonize data from disparate sources to make the information available to all users the way they need it and when they need it.
  • Turn to a data management and integration partner who can provide the expertise required to remain up-to-date on all interoperability, security and regulatory compliance requirements and other mandatory capabilities.
  • Approach digital transformation holistically with a coordinated strategy that considers each new application or capability as data gathered for the benefit of the entire organization rather than siloed for use by a narrowly-focused group of users.

The digital transformation of healthcare and the interoperability challenges that must be overcome are not minor issues, nor are they insurmountable. It is only through the sharing of ideas, information about new technologies and best practices that healthcare organizations can maximize the insights provided by data shared across the enterprise.

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies, a proud sponsor of Healthcare Scene. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

Important Patient Data Questions Hospitals Need To Address

Posted on July 13, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Obviously, managing and protecting patients’ personal health information is very important already.  But with high-profile incidents highlighting questionable uses of consumer data — such as the recent Facebook scandal – patients are more aware of data privacy issues than they had been in the past, says Dr. Oleg Bess, founder and CEO of clinical data exchange company 4medica.

According to Bess, hospitals should prepare to answer four key questions about personal health information that patients, the media and regulators are likely to ask. They include:

  • Who owns the patient’s medical records? While providers and EHR vendors may contend that they own patient data, it actually belongs to the patient, Bess says. What’s more, hospitals need to make sure patients have a clear idea of what data hospitals hold about them. Patients should also be able to access their health data regardless of where it is stored.
  • What if the patient wants his or her data deleted? Unfortunately, deleting patient data may not be possible in many cases due to legal constraints. For example, CMS demands that Medicare providers retain records for a fixed period, and many states have patient record retention laws as well, Bess notes. However, if nothing else, patients should have the ability to decline having their personally-identifiable data shared with third parties other than providers and payers, he writes.
  • Who is responsible for data integrity? Right now, problems with patient data accuracy are common. For example, particularly when patient matching tools like an enterprise master patient index aren’t in place, health data can end up being mangled. To this point, Bess cites a Black Book Research survey concluding that when records are transmitted between hospitals that don’t use these tools, they have just a 24% match rate. Hospital data stewards need to get on top of this problem, he says.
  • Without a national patient ID in place, how should hospitals verify patient identities? In addition to existing issues regarding patient safety, emerging problems such as the growing opioid abuse epidemic would be better handled with a unique patient identifier, Bess contends. According to Bess, while the federal government may not develop unique patient IDs, commercially developed master patient index technology might offer a solution.

To better address patient matching issues, Bess recommends including historical data which goes back decades in the mix if possible. A master patient index solution should also offer enterprise scalability and real-time matching, he says.
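As a rough illustration of what real-time matching involves, the sketch below scores two demographic records for similarity after basic normalization. The fields, weights and threshold are assumptions for the example, not how any particular master patient index product actually works.

```python
from difflib import SequenceMatcher

def normalize(value: str) -> str:
    """Basic normalization: case-fold and drop punctuation and whitespace."""
    return "".join(ch for ch in value.lower() if ch.isalnum())

def score_pair(rec_a: dict, rec_b: dict) -> float:
    """Weighted similarity score over a few demographic fields.
    Field list and weights are illustrative only."""
    weights = {"last_name": 0.35, "first_name": 0.25, "dob": 0.3, "zip": 0.1}
    total = 0.0
    for field, weight in weights.items():
        a, b = normalize(rec_a.get(field, "")), normalize(rec_b.get(field, ""))
        total += weight * SequenceMatcher(None, a, b).ratio()
    return total

rec1 = {"first_name": "Jon", "last_name": "Smith", "dob": "1970-02-01", "zip": "49503"}
rec2 = {"first_name": "John", "last_name": "Smith", "dob": "1970-02-01", "zip": "49503"}
print(score_pair(rec1, rec2) > 0.9)  # True: likely the same person despite the name variant
```

Historical data helps here because old addresses, prior surnames and known typos give the matcher more attributes to compare, which is exactly why Bess recommends including decades of history where possible.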

Healthcare Interoperability Insights

Posted on June 29, 2018 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

I came across this great video by Diameter Health where Bonny Roberts talked with a wide variety of people at the interoperability showcase at HIMSS. If you want to get a feel for the challenges and opportunities associated with healthcare interoperability, take 5 minutes to watch this video:

What do you think of these healthcare interoperability perspectives? Does one of them stand out more than others?

I love the statement that’s on the Diameter Health website:

“We Cure Clinical Data Disorder”

What an incredible way to describe clinical data today. I’m not sure what the ICD-10 code for it would be, but there’s definitely a lot of clinical data disorder out there. It takes a real professional to clean the data, organize the data, enrich the data, and know how to make that data useful to people. It’s not a disorder that most people can treat on their own.

What’s a little bit scary is that this disorder is not going to get any easier. More data is on its way. Better to deal with your disorder now before it becomes a full on chronic condition.

The Truth about AI in Healthcare

Posted on June 18, 2018 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

Those who watched the television show “The Good Doctor” in its first season got to see how a young autistic surgeon with savant syndrome faces challenges in his everyday life as he learns to connect with the people in his world. His extraordinary medical skill and intuition not only save patients’ lives but also create bridges with co-workers.

During each show, there is at least one scene in which the young doctor “visualizes” the inner workings of the patient’s body – evaluating and analyzing the cause of the medical condition.

Although all physicians can describe what happens to cause illness, the speed, detail and clarity of the young surgeon’s ability to gather information, predict reactions to treatments and identify the protocol that will produce the best outcome greatly surpasses his colleagues’ abilities.

Yes, this is a television show, but artificial intelligence promises the same capabilities that will disrupt all of our preconceived notions about healthcare on both the clinical and the operational sides of the industry.

Doctors rely on their medical training as well as their personal experience with hundreds of patients, but AI can allow clinicians to tap into the collective experience of hundreds of doctors with thousands of patients. Even if physicians had personal experience with thousands of patients, the human mind can’t process all of that data effectively.

How can AI improve patient outcomes as well as the bottom line?

We’re already seeing the initial benefits of AI in many areas of the hospital. A report by Accenture identifies the top three uses of AI in healthcare as robot-assisted surgery, virtual nursing assistants and administrative workflow assistance. These three AI applications alone represent a potential estimated annual benefit of $78 billion for the healthcare industry by 2026.

The benefits of AI include improved precision in surgery, decreased length of stay, reduction in unnecessary hospital visits through remote assessment of patient conditions, and time-saving capabilities such as voice-to-text transcription. According to Accenture, these improvements represent a work time savings of 17 percent for physicians and 51 percent for registered nurses – at a critical time when there is no end in sight for the shortages of both nurses and doctors.

In a recent webinar discussing the role of AI in healthcare, John Lynn, founder of HealthcareScene.com, described other ways that AI can improve diagnosis, treatment and patient safety. These areas include dosage error detection, treatment plan design, determination of medication adherence, medical imaging, tailored prescription medicine and automated documentation.

One of the challenges to fully leveraging the insights and capabilities of AI is the volume of information accumulated in electronic medical records that is unstructured data. Translating this information into a format that can be used by clinical providers as well as financial and administrative staff to optimize treatment plans as well as workflows is possible with natural language processing – a branch of AI that enables technology to interpret speech and text and determine which information is critical.
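As a toy illustration of that idea, the sketch below pulls medication names and doses out of free text with a regular expression. Real clinical NLP engines rely on trained models and large terminologies rather than a single pattern, so treat this only as a picture of turning unstructured notes into structured fields.

```python
import re

# Toy pattern for "<drug name> <number> mg" style mentions. Real clinical NLP
# uses trained models and drug vocabularies, not one regular expression.
DOSE_PATTERN = re.compile(
    r"\b([A-Z][a-z]+(?:in|ol|ide|pril|statin))\s+(\d+(?:\.\d+)?)\s*(mg|mcg|g)\b"
)

def extract_medications(note: str):
    """Return structured (drug, dose, unit) tuples found in a free-text note."""
    return [(m.group(1), float(m.group(2)), m.group(3)) for m in DOSE_PATTERN.finditer(note)]

note = "Patient discharged on Lisinopril 10 mg daily and Atorvastatin 40 mg at bedtime."
print(extract_medications(note))
# [('Lisinopril', 10.0, 'mg'), ('Atorvastatin', 40.0, 'mg')]
```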

The most often cited fear about a reliance on AI in healthcare is the potential for mistakes. Of course, humans make mistakes as well. We must remember that AI’s ability to tap into a much wider pool of information to make decisions or recommend options will result in a more deeply-informed decision – if the data is good.

The proliferation of legacy systems, continually added applications and multiple EMRs in a health system increases the risk of data that cannot be accessed or cannot be shared in real-time to aid clinicians or an AI-supported program. Ensuring that data is aggregated into a central location, harmonized, transformed into a usable format and cleaned to provide high quality data is necessary to support reliable AI performance.

While AI might be able to handle the data aggregation and harmonization tasks in the future, we are not there yet. This is not, however, a reason to delay the use of AI in hospitals and other organizations across the healthcare spectrum.

Healthcare organizations can partner with companies that specialize in the aggregation of data from disparate sources to make the information available to all users. Increasing access to data throughout the organization is beneficial to health systems – even before they implement AI tools.

Although making data available to all of the organization’s providers, staff and vendors as needed may seem onerous, it is possible to do so without adding to the hospital’s IT staff burden or the capital improvement budget. The complexities of translating structured and unstructured data, multiple formats and a myriad of data sources can be balanced with data security concerns with the use of a team that focuses on these issues each day.

While most AI capabilities in use today are algorithms that reflect current best practices or research that are programmed by healthcare providers or researchers, this will change. In the future, AI will expand beyond algorithms, and the technology will be able to learn and make new connections among a wider set of data points than today’s more narrowly focused algorithms.

Whether or not your organization is implementing AI, considering AI or just watching its development, I encourage everyone to start by evaluating the data that will be used to “run” AI tools. Taking steps now to ensure clean, easy-to-access data will not only benefit clinical and operational tasks now but will also position the organization to more quickly adopt AI.

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies, a proud sponsor of Healthcare Scene. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

VA Lighthouse Lab – Is the Healthcare Industry Getting It Right?

Posted on April 30, 2018 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

The following is a guest blog by Monica Stout from MedicaSoft

The U.S. Department of Veterans Affairs announced the launch of their Lighthouse Lab platform at HIMSS18 earlier this year. Lighthouse Lab is an open API framework that gives software developers tools to create mobile and web applications to help veterans manage their VA care, services, and benefits. Lighthouse Lab is also intended to help VA adopt more enterprise-wide and commercial-off-the-shelf products and to move the agency more in line with digital experiences in the private sector. Lighthouse Lab has a patient-centric end goal to help veterans better facilitate their care, services, and benefits.

Given its size and reach, VA is easily the biggest healthcare provider in the country. Adopting enterprise-level HL7 Fast Healthcare Interoperability Resources (FHIR)-based application programming interfaces (APIs) as its preferred way to share data when veterans receive care both in the community and at VA facilities sends a clear message to industry: rapidly-deployed, FHIR-ready solutions are where the industry is going. Simple and fast access to data is not only necessary, but expected. The HL7 FHIR standard and FHIR APIs are here to stay.

There is a lot of value in using enterprise-wide FHIR-based APIs. They use a RESTful approach, which means they rely on a uniform, predefined set of operations that are consistent with the way today’s web and mobile applications work. This makes it easier to connect and interoperate. Following an 80/20 rule, FHIR focuses on hitting the 80% of common use cases instead of the 20% of exceptions. FHIR supports a whole host of healthcare needs, including mobile apps, flexible custom workflows, device integration, and cost savings.
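Here is a minimal sketch of what that uniform, predefined set of operations looks like in practice: the same two HTTP verbs cover reading a single resource and searching for a set of them. The base URL is hypothetical, and a real Lighthouse application would authenticate with OAuth credentials issued by VA.

```python
import requests

# Hypothetical FHIR R4 base URL; a real VA Lighthouse app would point at the
# Lighthouse sandbox and send an OAuth 2.0 bearer token with each request.
BASE = "https://fhir.example.org/r4"
HEADERS = {"Accept": "application/fhir+json"}

def read(resource_type: str, resource_id: str) -> dict:
    """RESTful 'read': GET [base]/[type]/[id] returns a single resource."""
    resp = requests.get(f"{BASE}/{resource_type}/{resource_id}", headers=HEADERS)
    resp.raise_for_status()
    return resp.json()

def search(resource_type: str, params: dict) -> dict:
    """RESTful 'search': GET [base]/[type]?name=value returns a Bundle."""
    resp = requests.get(f"{BASE}/{resource_type}", params=params, headers=HEADERS)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # The same two verbs cover most of the "80% of common use cases".
    patient = read("Patient", "example")
    active_conditions = search("Condition", {"patient": "example", "clinical-status": "active"})
    print(patient.get("id"), active_conditions.get("total"))
```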

There is also value in sharing records. There are countless examples of how a lack of interoperability has harmed patients and hindered care coordination. Imagine if technology eliminated those issues. With Lighthouse Lab, it appears VA is headed in the direction of innovation and interoperability, including improved patient care for the veterans it serves.

What do you think about VA Lighthouse Lab? Will this be the impetus to push the rest of the healthcare industry toward real interoperability?

About Monica Stout
Monica is a HIT teleworker in Grand Rapids, Michigan by way of Washington, D.C., who has consulted at several government agencies, including the National Aeronautics and Space Administration (NASA) and the U.S. Department of Veterans Affairs (VA). She’s currently the Marketing Director at MedicaSoft. Monica can be found on Twitter @MI_turnaround or @MedicaSoftLLC.

About MedicaSoft
MedicaSoft designs, develops, delivers, and maintains EHR, PHR, and UHR software solutions and HISP services for healthcare providers and patients around the world. MedicaSoft is a proud sponsor of Healthcare Scene. For more information, visit www.medicasoft.us or connect with us on Twitter @MedicaSoftLLC, Facebook, or LinkedIn.

Is EMR Use Unfair To Patients?

Posted on April 24, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

As we all know, clinicians have good reasons to be aggravated with their EMRs. While the list of grievances is long — and legitimate — perhaps the biggest complaint is loss of control. I have to say that I sympathize; if someone forced me to adopt awkward digital tools to do my work I would go nuts.

We seldom discuss, however, the possibility that these systems impose an unfair burden on patients as well. But that’s the argument one physician makes in a recent op-ed for the American Council on Science and Health.

The author, Jamie Wells, MD, calls the use of EMRs “an ethical disaster,” and suggests that forced implementation of EMRs may violate the basic tenets of bioethics.

Some of the arguments Dr. Wells makes apply exclusively to physicians. For one thing, she contends that penalizing doctors who don’t adapt successfully to EMR use is unfair. She also suggests that EMRs create needless challenges that can erode physicians’ ability to deliver quality care, add significant time to a physician’s workday and force doctors to participate in related continuing education whether or not they want to do so.

Unlike many essays critiquing this topic, Wells also contends that patients are harmed by EMR use.

For example, Wells argues that since patients are never asked whether they want physicians to use EMRs, they never get the chance to consider the risks and benefits associated with EHR data use in developing care plans. Also, they are never given a chance to weigh in on whether they are comfortable having less face time with their physicians, she notes.

In addition, she says that since EMRs prompt physicians to ask questions not relevant to that patient’s care, adding extra steps to the process, they create unfair delays in a patient’s getting relief from pain and suffering.

What’s more, she argues that since EMR systems typically aren’t interoperable, they create inconveniences which can ultimately interfere with the patient’s ability to choose a provider.

Folks, you don’t have to convince me that EMR implementations can unfairly rattle patients and caregivers. As I noted in a previous essay, my mother recently went through a terrifying experience when the hospital where my brother was being cared for went through an EMR implementation at a crucial point in his care. She was rightfully concerned that staff might be more focused on adapting to the EMR and somewhat less focused on her extremely fragile son’s care.

As I noted in the linked article above, I believe that health executives should spend more time considering the potentially negative effects of their health IT initiatives on patients. Maybe these execs will have to have a sick relative at the hospital during a rollout before they’ll make the effort.

Hospital Patient Identification Still A Major Problem

Posted on April 18, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

A new survey suggests that problems with duplicate patient records and patient identification are still costing hospitals a tremendous amount of money.

The survey, which was conducted by Black Book Research, collected responses from 1,392 health technology managers using enterprise master patient index technology. Researchers asked them what gaps, challenges and successes they’d seen in patient identification processes from Q3 2017 to Q1 2018.

Survey respondents reported that 33% of denied claims were due to inaccurate patient identification. Ultimately, inaccurate patient identification cost the average hospital $1.5 million last year. The survey also concluded that the average cost of duplicate records was $1,950 per patient per inpatient stay and more than $800 per ED visit.

In addition, researchers found that hospitals with over 150 beds took an average of more than 5 months to clean up their data. This included process improvements focused on data validity checking, normalization and data cleansing.

Having the right tools in place seemed to help. Hospitals said that before they rolled out enterprise master patient index solutions, an average of 18% of their records were duplicates, and that match rates when sharing data with other organizations averaged 24%.

Meanwhile, hospitals with EMPI support in place since 2016 reported that patient records were identified correctly during 93% of registrations and in 85% of externally shared records among non-networked providers.

Not surprisingly, though, this research doesn’t tell the whole story. While using EMPI tools makes sense, the healthcare industry should hardly stop there, according to Gartner Group analyst Wes Rishel.

“We simply need innovators that have the vision to apply proven identity matching to the healthcare industry – as well as the gumption and stubbornness necessary to thrive in a crowded and often slow-moving healthcare IT market,” he wrote.

Rishel argues that to improve patient matching, it’s time to start cross-correlating demographic data from patients with demographic data from third-party sources such as public records, credit agencies or telephone companies. What makes this data particularly helpful is that it includes not just current and correct attributes for a person, but also out-of-date and incorrect attributes like previous addresses, maiden names and typos.

Ultimately, these “referential matching” approaches will significantly outperform existing probabilistic models, Rishel argues.
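To show why referential matching can succeed where field-by-field comparison fails, here is a toy sketch in which a reference database maps every attribute ever associated with a person, including old addresses and maiden names, to one stable identity. The data and structure are invented for the example and are not Rishel's or any vendor's actual algorithm.

```python
# Toy referential-matching sketch: a reference "identity graph" maps every
# attribute value ever associated with a person (old addresses, maiden names)
# to one stable reference ID. Two hospital records match if their attributes
# resolve to the same reference ID, even when the records share no field values.

REFERENCE_DB = {
    # (attribute type, value) -> reference identity (illustrative data only)
    ("name", "mary jones"): "REF-001",
    ("name", "mary smith"): "REF-001",     # maiden name
    ("address", "12 oak st"): "REF-001",   # previous address
    ("address", "98 elm ave"): "REF-001",  # current address
}

def resolve(record: dict) -> set:
    """Collect every reference ID any of the record's attributes points to."""
    ids = set()
    for attr in ("name", "address"):
        ref = REFERENCE_DB.get((attr, record.get(attr, "").lower()))
        if ref:
            ids.add(ref)
    return ids

hospital_a = {"name": "Mary Smith", "address": "12 Oak St"}   # old name, old address
hospital_b = {"name": "Mary Jones", "address": "98 Elm Ave"}  # new name, new address

# The two records share no field values, yet both resolve to REF-001, so they match.
print(resolve(hospital_a) & resolve(hospital_b))  # {'REF-001'}
```

A purely probabilistic comparison of these two records would score them as different people, which is the gap the referential approach is meant to close.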

It’s really shocking that so many healthcare organizations don’t have an EMPI solution in place, especially now that cloud offerings have made EMPI available to organizations of all sizes. EMPI is needed for the financial reasons mentioned above, but also from a patient care and patient safety perspective.