
Predictive Analytics Will Save Hospitals, Not IT Investment

Posted on October 27, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Most hospitals run on very slim operating margins. In fact, not-for-profit hospitals’ mean operating margins fell from 3.4% in fiscal year 2015 to 2.7% in fiscal year 2016, according to Moody’s Investors Service.

To turn this around, many seem to be pinning their hopes on better technology, spending between 25% and 35% of their capital budget on IT infrastructure investment. But that strategy might backfire, suggests an article appearing in the Harvard Business Review.

Author Sanjeev Agrawal, who serves as president of healthcare and chief marketing officer at healthcare predictive analytics company LeanTaaS, argues that throwing more money at IT won’t help hospitals become more profitable. “Healthcare providers can’t keep spending their way out of trouble by investing in more and more infrastructure,” he writes. “Instead, they must optimize the use of the assets currently in place.”

Rather, he suggests, hospitals need to go the way of retail, transportation and airlines, industries that also manage complex operations on narrow margins. Those industries have improved their performance by building up their data science capabilities.

“[Hospitals] need to create an operational ‘air traffic control’ for their hospitals — a centralized command-and-control capability that is predictive, learns continually, and uses optimization algorithms and artificial intelligence to deliver prescriptive recommendations throughout the system,” Agrawal says.

Agrawal predicts that hospitals will use predictive analytics to refine their key care-delivery processes, including resource utilization, staff schedules, and patient admits and discharges. If they get it right, they’ll meet many of their goals, including better patient throughput, lower costs and more efficient asset utilization.

For example, he notes, hospitals can optimize OR utilization, which generates about 65% of revenue at most hospitals. Rather than relying on traditional block-scheduling techniques, which are often inefficient, hospitals can use predictive analytics and mobile apps to give surgeons more control over OR scheduling.
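
Agrawal doesn’t spell out the math, but the core idea is simple enough to sketch. Here’s a minimal, purely illustrative Python example of flagging OR blocks likely to go unused so the time can be released early; the data, threshold and naive forecast are all invented, and real products use far richer models.

```python
# Illustrative sketch only: flag OR blocks likely to go unused so the time can
# be released to other surgeons earlier. Commercial tools use far richer
# features and optimization; the data and threshold here are made up.
from statistics import mean

# Hypothetical history: fraction of each allocated block a surgeon actually used.
block_utilization_history = {
    "Dr. Adams": [0.95, 0.88, 0.91, 0.97],
    "Dr. Baker": [0.42, 0.55, 0.38, 0.47],
    "Dr. Chen":  [0.71, 0.64, 0.80, 0.69],
}

RELEASE_THRESHOLD = 0.60  # predicted utilization below this triggers early release

def predicted_utilization(history):
    """Naive forecast: recent average utilization (a real model would add
    case mix, booking lead time, seasonality, etc.)."""
    return mean(history[-4:])

for surgeon, history in block_utilization_history.items():
    forecast = predicted_utilization(history)
    if forecast < RELEASE_THRESHOLD:
        print(f"{surgeon}: forecast {forecast:.0%} - release unused block time early")
    else:
        print(f"{surgeon}: forecast {forecast:.0%} - keep block as scheduled")
```

In practice, the forecast would account for case mix, booking lead times and surgeon-specific behavior, and the release decision would feed the mobile scheduling apps Agrawal mentions.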

Another area ripe for process improvements is the emergency department. As Agrawal notes, hospitals can avoid bottlenecks by using analytics to define the most efficient order for ED activities. Not only can this improve hospital finances, it can improve patient satisfaction, he says.

Of course, Agrawal works for a predictive analytics vendor, which makes him more than a little bit biased. But on the other hand, I doubt any of us would disagree that adopting predictive analytics strategies is the next frontier for hospitals.

After all, having collectively spent many billions to implement EMRs, hospitals have created enormous data stores, and few would dispute that it’s high time to leverage them. For example, if they want to adopt population health management – and it’s a question of when, not if – they’ve got to use these tools to reduce outcome variations and improve quality and cost performance across populations. Also, while the deep-pocketed hospitals are doing it first, it seems likely that over time, virtually every hospital will use EMR data to streamline operations as well.

The question is, will vendors like LeanTaaS take a leading role in this transition, or will hospital IT leaders know what they want to do?  At this stage, it’s anyone’s guess.

A New Hospital Risk-Adjustment Model

Posted on August 23, 2017 | Written By

Anne Zieger

Virtually all of the risk adjustment models with which I’m familiar are based on retrospective data. This data clearly has some predictive benefits – maybe it’s too cliché to say the past is prologue – and is already in our hands.

To look at just one example of what existing data archives can do, we need go no further than the pages of this blog. Late last year, I shared the story of a group of French hospitals which are working to predict admission rates as much as 15 days in advance by mining a store of historical data. Not surprisingly, the group’s key data includes 10 years’ worth of admission records.

The thing is, using historical data may not be as helpful when you’re trying to develop risk-adjustment models. After all, among other problems, the metrics by which we evaluate care shift over time, and our understanding of disease states changes as well, so using such models to improve care and outcomes has its limitations.

I’ve been thinking about these issues since John shared some information on a risk-adjustment tool which leverages relevant patient care data collected almost in real time.

The Midas Hospital Risk Adjustment Model, which is created specifically for single organizations, samples anywhere from 20 to 600 metrics, which can include data on mortality, hospital-acquired complications, unplanned readmissions, lengths of stay and charges. It’s built using the Midas Health Analytics Platform, which comes from a group within healthcare services company Conduent. The platform captures data across hospital functional areas and aggregates it for use in care management.

The Midas team chooses what metrics to include using its in-house tools, which include a data warehouse populated with records on more than 100 million claims as well as data from more than 800 hospitals.

What makes the Midas model special, Conduent says, is that it incorporates a near-real-time feed of health data from hospital information systems. One of the key advantages of doing so is that rather than basing its analysis on ICD-9 data, which was in use until relatively recently, it can leverage clinically detailed ICD-10 data, the company says.

The result of this process is a model which is far more capable of isolating small but meaningful differences between individual patients, Conduent says. Then, using this model, hospitals can risk-adjust clinical and financial outcomes data by provider for hospitalized patients and, hopefully, have a better basis for making future decisions.
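
Conduent doesn’t disclose the Midas math, but the generic mechanics of hospital risk adjustment are worth a quick sketch. One common approach is to compare a provider’s observed outcomes against the expected count implied by patient-level risk predictions, an observed-to-expected (O/E) ratio. The figures below are invented for illustration, not drawn from the Midas model.

```python
# Generic risk-adjustment sketch (not Conduent's actual model): compare a
# provider's observed outcomes to the expected count implied by patient-level
# risk predictions, i.e. an observed-to-expected (O/E) mortality ratio.
patients = [
    # (died, predicted probability of death from some upstream risk model)
    (0, 0.02),
    (1, 0.30),
    (0, 0.10),
    (1, 0.45),
    (0, 0.05),
]

observed = sum(died for died, _ in patients)
expected = sum(prob for _, prob in patients)
oe_ratio = observed / expected

print(f"Observed deaths: {observed}")
print(f"Expected deaths: {expected:.2f}")
print(f"O/E ratio: {oe_ratio:.2f}  (>1 suggests worse-than-expected outcomes)")
```

An O/E ratio near 1 suggests outcomes roughly in line with the patient mix; the presumed value of a near-real-time, ICD-10-fed model is sharper patient-level predictions feeding that expected count.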

This approach sounds desirable (though I don’t know if it’s actually new). We probably need to move in the direction of using fresh data when analyzing care trends. I suspect few hospitals or health systems would have the resources to take this on today, but it’s something to consider.

Still, I’d want to know two things before digging into Midas further. First, while the idea sounds good, is there evidence to suggest that collecting recent data offers superior clinical results? And in that vein, how much of an improvement does it offer relative to analysis of historical data? Until we know these things, it’s hard to tell what we’ve got here.

2 Core Healthcare IT Principles

Posted on May 10, 2017 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

One of my favorite bloggers I found when I first started blogging about Healthcare IT was a hospital CIO named Will Weider, who blogged on a site he called Candid CIO. At the time he was CIO of Ministry Health Care, and he always offered exceptional insights from his perspective as a hospital CIO. A little over a month ago, Will decided to move on as CIO after 22 years. That was great news for me since it meant he’d probably have more time to blog. The good news is that he has been posting more.

In a recent post, Will offered two guiding principles that I thought were very applicable to any company working to take part in the hospital health IT space:

1. Embed everything in the EHR
2. Don’t hijack the physician workflow

Go and read Will’s post to get his insights, but I agree with both of these principles.

I would add one clarification to his first point. I think there is a space for an outside provider to work outside of the EHR. Think of someone like a care manager. EHR software doesn’t do care management well and so I think there’s a space for a third party care management platform. However, if you want the doctor to access it, then it has to be embedded in the EHR. It’s amazing how much of a barrier a second system is for a doctor.
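
Will and John don’t prescribe a mechanism for embedding, but a common route today is a SMART on FHIR app that launches inside the EHR and reads patient context through the EHR’s FHIR API. Here’s a hedged sketch: the endpoint, token and patient ID are placeholders, and it assumes the SMART/OAuth2 EHR launch has already supplied the access token and patient context.

```python
# Hedged sketch: one common way to "embed" a third-party care-management tool
# in the EHR is a SMART on FHIR app. Assume the SMART/OAuth2 EHR launch has
# already happened and gave us an access token plus the patient ID in context;
# here we simply read that patient's active conditions from the FHIR R4 API.
# The base URL, token and patient ID below are placeholders, not real endpoints.
import requests

FHIR_BASE = "https://ehr.example.org/fhir"   # hypothetical FHIR endpoint
ACCESS_TOKEN = "token-from-smart-launch"     # obtained during the SMART launch
PATIENT_ID = "12345"

resp = requests.get(
    f"{FHIR_BASE}/Condition",
    params={"patient": PATIENT_ID, "clinical-status": "active"},
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
             "Accept": "application/fhir+json"},
)
resp.raise_for_status()

for entry in resp.json().get("entry", []):
    condition = entry["resource"]
    print(condition.get("code", {}).get("text", "unnamed condition"))
```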

Ironically, we’ve seen the opposite is also true for people like radiologists. If it’s not in their PACS interface, then it takes a nearly herculean effort for them to leave their PACS system to look something up in the EHR. That’s why I was excited to see some PACS interfaces at RSNA last year which had the EHR data integrated into the radiologists’ interface. The same is true for doctors working in an EHR.

Will’s second point is a really strong one. In his description of this principle, he even suggests that alerts should all but be done away with in an EHR except for “the most critical safety situations.” He’s right that alert blindness is real, and I haven’t seen anyone nail alerts so well that doctors are happy to see them. That’s the bar we should place on alerts that hijack the physician workflow: will the doctor be happy you hijacked their workflow and gave them the alert? If the answer is no, then you probably shouldn’t send it.
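
Expressed as logic, Will’s bar boils down to a severity gate: interrupt the workflow only for the handful of situations where the doctor would thank you, and route everything else to a passive channel. A trivial sketch follows; the severity labels are illustrative, not taken from any vendor’s alert taxonomy.

```python
# Hedged sketch of Will's bar for interruptive alerts: only hijack the
# physician's workflow for the most critical safety situations; route
# everything else to a passive, non-interruptive channel.
CRITICAL_SEVERITIES = {"life-threatening-interaction", "wrong-patient", "critical-lab"}

def route_alert(alert: dict) -> str:
    """Return 'interrupt' only when an alert clears the 'doctor would thank you' bar."""
    if alert["severity"] in CRITICAL_SEVERITIES:
        return "interrupt"   # pop-up that stops the workflow
    return "passive"         # in-basket / dashboard item, no interruption

alerts = [
    {"id": 1, "severity": "duplicate-order"},
    {"id": 2, "severity": "life-threatening-interaction"},
    {"id": 3, "severity": "formulary-suggestion"},
]
for a in alerts:
    print(a["id"], route_alert(a))
```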

Welcome back to the blogosphere Will! I look forward to many more posts from you in the future.

Cleveland Clinic Works To Eliminate Tech Redundancies

Posted on March 1, 2017 | Written By

Anne Zieger

The Cleveland Clinic has relied on its EMR for quite some time. In fact, it adopted Epic in the 1990s, long before most healthcare organizations were ready to make a bet on EMRs. Today, decades later, the Epic EMR is the “central data hub” for the medical center and is central to both its clinical and operational efforts, according to William Morris, MD, the Clinic’s associate chief information officer.

But Morris, who spoke about the Clinic’s health IT with Health Data Management, also knows its limitations. In an interview with the magazine’s Greg Slabodkin, he notes that while the EMR may be necessary, it isn’t sufficient. The Epic EMR is “just a digital repository,” he told Slabodkin. “Ultimately, it’s what you do with the technology in your ecosystem.”

These days, IT leaders at the Clinic are working to streamline the layers of additional technology which have accreted on top of the EMR over the years. “As an early adopter of Epic, we have accumulated quite a bit of what I’ll call technical debt,” said Doug Smith, interim chief information officer. “What I mean by that is multiple enhancements, bolt-ons, or revisions to the core application. We have to unburden ourselves of that.”

It’s not that Clinic leaders are unhappy with their EMR. In fact, they’re finding ways to tap its power to improve care. For example, to better leverage its EMR data, the Cleveland Clinic has developed data-driven “risk scores” designed to let doctors know if patients need intervention. The models, developed by the Clinic’s Quantitative Health Sciences group, offer outcome risk calculators for several conditions, including cancer, cardiovascular disease and diabetes.
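
The Clinic hasn’t published those calculators here, but the general shape of an EMR-driven risk score is easy to sketch: a fitted model turns a few EMR-derived features into a probability, and a threshold flags patients for intervention. The coefficients, features and threshold below are invented for illustration, not the Quantitative Health Sciences group’s actual models.

```python
# Illustrative only: the general shape of an EMR-driven outcome "risk score",
# not the Cleveland Clinic's actual models. A logistic model turns a few
# EMR-derived features into a probability; a threshold flags patients for
# intervention. Coefficients and threshold are invented.
import math

COEFFICIENTS = {"intercept": -5.0, "age": 0.04, "hba1c": 0.35, "prior_admissions": 0.60}
INTERVENTION_THRESHOLD = 0.20

def risk_score(age: float, hba1c: float, prior_admissions: int) -> float:
    z = (COEFFICIENTS["intercept"]
         + COEFFICIENTS["age"] * age
         + COEFFICIENTS["hba1c"] * hba1c
         + COEFFICIENTS["prior_admissions"] * prior_admissions)
    return 1 / (1 + math.exp(-z))

p = risk_score(age=67, hba1c=9.1, prior_admissions=2)
flag = "-> flag for intervention" if p >= INTERVENTION_THRESHOLD else ""
print(f"Predicted risk: {p:.1%} {flag}")
```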

(By the way, if predictive analytics interest you, you might want to check out our coverage of such efforts at New York’s Mount Sinai Hospital, which is developing a platform to predict which patients might develop congestive heart failure and care for patients already diagnosed with the condition more effectively. I’ve also taken a look at a related product being developed by Google’s DeepMind, an app named Streams which will ping clinicians if a patient needs extra attention.)

Ultimately, though, the organization hopes to simplify its larger health IT infrastructure substantially, to the point where 85% of the HIT functionality comes from the core Epic system. This includes keeping a wary eye on Epic upgrades, and implementing new features selectively. “When you take an upgrade in Epic, they are always turning on more features and functions,” Smith notes. “Most are optional.”

Not only will such improvements streamline IT operations, they will make clinicians more efficient, Smith says. “They are adopting standard workflows that also exist in many other organizations—and, we’re more efficient in supporting it because we don’t take as long to validate or support an upgrade.”

As an aside, I’m interested to read that Epic is tossing more features at Cleveland Clinic than it cares to adopt. I wonder if those are what engineers think customers want, or what they’re demanding today?

The Distributed Hospital On The Horizon

Posted on February 24, 2017 | Written By

Anne Zieger

If you’re reading this blog, you already know that distributed, connected devices and networks are the future of healthcare. Connected monitoring devices are growing more mature by the day, network architectures are becoming amazingly fluid, and with the growth of the IoT, we’re adding huge numbers of smart devices to an already-diverse array of endpoints. While we may not know what all of this will look like when it’s fully mature, we’ve already made amazing progress in connecting care.

But how will these trends play out? One nice look at where all this is headed comes from Jeroen Tas, chief innovation and strategy officer at Philips. In a recent article, Tas describes a world in which even major brick-and-mortar players like hospitals go almost completely virtual.  Certainly, there are other takes out there on this subject, but I really like how Tas explains things.

He starts with the assertion that the hospital of the future “is not a physical location with waiting rooms, beds and labs.” Instead, a hospital will become an abstract network overlay connecting nodes. It’s worth noting that this isn’t just a concept. For an example, Tas points to the Mercy Virtual Care Center, a $54 million “hospital without beds” dedicated to telehealth and connected care.  The Center, which has over 300 employees, cares for patients at home and in beds across 38 hospitals in seven states.

While the virtual hospital may not rely on a single, central campus, physical care locations will still matter – they’ll just be distributed differently. According to Tas, the connected health network will work best if care is provided as needed through retail-type outlets near where people live, specialist hubs, inpatient facilities and outpatient clinics. Yes, of course, we already have all of these things in place, but in the new connected world, they’ll all be on a single network.

Ultimately, even if brick-and-mortar hospitals never disappear, virtual care should make it possible to cut down dramatically on hospital admissions, he suggests.  For example, Tas notes that Philips partner Banner Health has slashed hospital admissions almost 50% by using telehealth and advanced analytics for patients with multiple chronic conditions. (We’ve also reported on a related pilot by Partners HealthCare Brigham and Women’s Hospital, the “Home Hospital,” which sends patients home with remote monitoring devices as an alternative to admissions.)

Of course, the broad connected care outline Tas offers can only take us so far. It’s all well and good to have a vision, but there are still some major problems we’ll have to solve before connected care becomes practical as a backbone for healthcare delivery.

After all, to cite one major challenge, community-wide connected health won’t be very practical until interoperable data sharing becomes easier – and we really don’t know when that will happen. Also, until big data analytics tools are widely accessible (rather than the province of the biggest, best-funded institutions) it will be hard for providers to manage the data generated by millions of virtual care endpoints.

Still, if Tas’s piece is any indication, consensus is building on what next-gen care networks can and should be, and there are certainly plenty of ways to lay the groundwork for the future. Even small-scale, preliminary connected health efforts seem to be fostering meaningful changes in how care is delivered. And there’s little doubt that over time, connected health will turn many brick-and-mortar care models on their heads, becoming a large – or even dominant – part of care delivery.

Getting there may be tricky, but if providers keep working at connected care, it should offer an immense payoff.

UCSF Partners With Intel On Deep Learning Analytics For Health

Posted on January 30, 2017 | Written By

Anne Zieger

UC San Francisco’s Center for Digital Health Innovation has agreed to work with Intel to deploy and validate a deep learning analytics platform. The new platform is designed to help clinicians make better treatment decisions, predict patient outcomes and respond quickly in acute situations.

The Center’s existing projects include CareWeb, a team-based collaborative care platform built on Salesforce.com social and mobile communications tech; Tidepool, which is building infrastructure for next-gen smart diabetes management apps; Health eHeart, a clinical trials platform using social media, mobile and realtime sensors to change heart disease treatment; and Trinity, which offers “precision team care” by integrating patient data with evidence and multi-disciplinary data.

These projects seem to be a good fit with Intel’s healthcare efforts, which are aimed at helping providers succeed at distributed care communication across desktop and mobile platforms.

As the two note in their joint press release, creating a deep learning platform for healthcare is extremely challenging, given that the relevant data is complex and stored in multiple incompatible systems. Intel and UCSF say the next-generation platform will address these issues, allowing them to integrate not only data collected during clinical care but also inputs from genomic sequencing, monitors, sensors and wearables.

Supporting all of this activity obviously calls for a lot of computing power. The partners will run deep learning use cases in a distributed fashion on a CPU-based cluster designed to crunch through very large datasets handily. Intel is rolling out the computing environment on its Xeon processor-based platform, which supports data management and the algorithm development lifecycle.

As the deployment moves forward, Intel leaders plan to study how deep learning analytics and machine-driven workflows can optimize clinical care and patient outcomes, and leverage what they learn when they create new platforms for the healthcare industry. Both partners believe that this model will scale for future use case needs, such as larger convolutional neural network models, artificial neural networks patterned after living organisms, and very large multidimensional datasets.
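
The announcement names convolutional neural networks but not a framework or architecture, so the following is only a toy illustration of the kind of model involved, written in Keras as an assumption of convenience; the real platform presumably trains far larger models, distributed across Xeon-based CPU clusters.

```python
# Hedged sketch only: a tiny Keras CNN over imaging-like input to illustrate
# the kind of convolutional model the press release describes. The framework,
# architecture and input shape are assumptions, not details from UCSF or Intel.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(256, 256, 1)),       # e.g. a single-channel image slice
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.GlobalAveragePooling2D(),
    layers.Dense(1, activation="sigmoid"),   # e.g. probability of a finding
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```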

Once implemented, the platform will allow users to conduct advanced analytics on all of this disparate data, using machine learning and deep learning algorithms. And if all performs as expected, clinicians should be able to draw on these advanced capabilities on the fly.

This looks like a productive collaboration. If nothing else, it appears that in this case the technology platform UCSF and Intel are developing may be productized and made available to other providers, which could be very valuable. After all, while individual health systems (such as Geisinger) have the resources to kick off big data analytics projects on their own, it’s possible a standardized platform could make such technology available to smaller players. Let’s see how this goes.

A Look At Geisinger’s Big Data Efforts

Posted on December 28, 2016 | Written By

Anne Zieger

This week I got a look at a story appearing in a recent issue of Harvard Business Review which offers a description of Geisinger Health System’s recent big data initiatives. The ambitious project is designed not only to track and analyze patient outcomes, but also to visualize healthcare data across cohorts of patients and networks of providers and even correlate genomic sequences with clinical care. Particularly given that Geisinger has stayed on the cutting edge of HIT for many years, I think it’s worth a look.

As the article’s authors note, Geisinger rolled out a full-featured EMR in 1996, well ahead of most of its peers. Like many other health systems, though, Geisinger has struggled to aggregate and make use of its data, particularly because the legacy analytics systems it still has in place can’t accommodate the growing flood of new data types emerging today.

Last year, Geisinger decided to create a new infrastructure which could bring this data together. It implemented a Unified Data Architecture (UDA), allowing it to integrate big data into its existing data analytics and management. According to the article, Geisinger’s UDA rollout is the largest practical application of point-of-care big data in the industry. Of particular note, Geisinger is crunching not only enterprise healthcare data (including HIE inputs, clinical departmental systems and patient satisfaction surveys) and consumer health tools (like smartphone apps) but even grocery store and loyalty program info.

Though all of its data hasn’t yet been moved to the UDA, Geisinger has already seen some big data successes, including:

* “Close the Loop” program: Using natural language processing, the UDA analyzes clinical and diagnostic imaging reports, including free text. Sometimes it detects problems that may not be relevant to the initial issue (such as injuries from a car crash) but which can themselves cause serious harm. The program has already saved patient lives. (A crude sketch of this kind of free-text screening appears after this list.)

* Early sepsis detection/treatment: Geisinger uses the UDA to bring all sepsis-patient information together in one place as patients move through the hospital. The system alerts providers to real-time physiologic data in patients with life-threatening septic shock, as well as tracking when antibiotics are prescribed and administered. Ninety percent of providers who use this tool consistently adhere to sepsis treatment protocols, as opposed to 40% of those who don’t.

* Surgery costs/outcomes: The Geisinger UDA tracks and integrates surgical supply-chain data along with clinical data, offering a comprehensive view of performance by provider and surgery type. In addition to offering performance insight, this approach has also generated insights about supply use patterns which allow the health system to negotiate better vendor deals.
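
Geisinger’s actual NLP pipeline is surely far more sophisticated, but a crude sketch shows the “Close the Loop” idea: scan free-text imaging reports for incidental findings that need follow-up, even when they’re unrelated to the reason the study was ordered. The patterns and sample report below are invented for illustration.

```python
# Crude illustration of the "Close the Loop" idea, not Geisinger's actual NLP:
# scan free-text imaging reports for incidental findings that need follow-up,
# even when they're unrelated to the reason the study was ordered.
import re

FOLLOW_UP_PATTERNS = [
    r"\bincidental(ly)?\b.*\b(nodule|mass|lesion|aneurysm)\b",
    r"\b(nodule|mass|lesion|aneurysm)\b.*\brecommend(ed)? follow[- ]?up\b",
]

def needs_follow_up(report_text: str) -> bool:
    text = report_text.lower()
    return any(re.search(pattern, text) for pattern in FOLLOW_UP_PATTERNS)

report = ("Indication: motor vehicle crash. No acute fracture. "
          "Incidental 8 mm pulmonary nodule noted; recommend follow-up CT in 6 months.")
print(needs_follow_up(report))   # True -> route to a follow-up work queue
```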

To me, one of the most interesting things about this story is that while Geisinger is at a relatively early stage of its big data efforts, it has already managed to generate meaningful benefits from its efforts. My guess is that its early successes are more due to smart planning – which includes worthwhile goals from day one of the rollout — than the technology per se. Regardless, let’s hope other hospital big data projects fare so well. (Meanwhile, for a look at another interesting hospital big data project, check out this story.)

Longitudinal Patient Record Needed To Advance Care?

Posted on November 23, 2016 | Written By

Anne Zieger

In most day to day settings, a clinician only needs a small (if precisely focused) amount of data to make clinical decisions. Both in ambulatory and acute settings, they rely on immediate and near-term information, some collected during the visit, and a handful of historical factors likely to influence or even govern what plan of care is appropriate.

That may be changing, though, according to Cheryl McKay of Orion Health. In a recent blog item, McKay argues that as the industry shifts from fee-for-service payment models to value-based reimbursement, we’ll need new types of medical records to support this model. Today, the longitudinal patient record and community care plan are emerging as substitutes to old EMR models, McKay says. These new entities will be built from varied data sources including payer claims, provider EMRs, patient health devices and the patients themselves.

As these new forms of patient medical record emerge, effective population health management is becoming more feasible, she argues. Longitudinal patient records and community care plans are “essential as we steer away from FFS… The way records are delivered to healthcare providers – with an utter lack of visibility and a lot of noise from various data sources – creates unnecessary risks for everyone involved.”

She contends that putting these types of documentation in place, which summarize patient-based rather than episodic clinical experiences, closes big gaps in patient history which would otherwise generate mistakes. Longitudinal record-keeping also makes it easier for physicians to aggregate information, do predictive modeling and intervene proactively in patient care at both the patient and population level.

She also predicts that with both a longitudinal patient record and community care plan in place, giving providers of all stripes a “panoramic” look at patients, costs will fall as providers stop performing needless tests and procedures. Not only that, these new entities would ideally offer real-time information as well, including event notifications, keeping all the providers involved in a patient’s care in sync.

To be sure, this blog item is a pitch for Orion’s technology. While the notion of a community-care plan isn’t owned by anyone in particular, Orion is pitching a specific model which rides upon its population health technology. That being said, I’m betting most of us would agree that the idea (regardless of which vendor you work with) of establishing a community-wide care plan does make sense. And certainly, putting a rich longitudinal patient record in place could be valuable too.

However, given the sad state of interoperability today, I doubt it’s possible to build this model unless you choose a single vendor-centric solution. At present, I think it’s more of a dream than a reality for most of us.

Managing Health Information to Ensure Patient Safety

Posted on August 17, 2016 | Written By

Erin Head is the Director of Health Information Management (HIM) and Quality for an acute care hospital in Titusville, FL. She is a renowned speaker on a variety of healthcare and social media topics and currently serves as CCHIIM Commissioner for AHIMA. She is heavily involved in many HIM and HIT initiatives such as information governance, health data analytics, and ICD-10 advocacy. She is active on social media on Twitter @ErinHead_HIM and LinkedIn. Subscribe to Erin’s latest HIM Scene posts here.

This post is part of the HIM Series of blog posts. If you’d like to receive future HIM posts by Erin in your inbox, you can subscribe to future HIM Scene posts here.

Electronic Medical Records (EMRs) have been a great addition to healthcare organizations and I know many would agree that some tasks have been significantly improved from paper to electronic. Others may still be cautious with EMRs due to the potential patient safety concerns that EMRs bring to light.

The Joint Commission expects healthcare organizations to engage in the latest health information technologies but we must do so safely and appropriately. In 2008, The Joint Commission released Sentinel Event Alert Issue 42 which advised organizations to be mindful of the patient safety risks that can result from “converging technologies”.

The electronic technologies we use to gather patient data could pose potential threats and adverse events. Some of these threats include the use of computerized physician order entry (CPOE), information security, incorrect documentation, and clinical decision support (CDS).  Sentinel Event Alert Issue 54 in 2015 again addressed the safety risks of EMRs and the expectation that healthcare organizations will safely implement health information technology.

Having incorrect data in the EMR poses serious patient safety risks that are preventable which is why The Joint Commission has put this emphasis on safely using the technology. We will not be able to blame patient safety errors on the EMR when questioned by surveyors, especially when they could have been prevented.

Ensuring medical record integrity has always been the objective of HIM departments. HIM professionals’ role in preventing errors and adverse events has been apparent from the start of EMR implementations. HIM professionals should monitor and develop methods to prevent issues in the following areas, to name a few:

Copy and paste

Ensure policies are in place to address copy and paste. Records can contain repeated documentation from day to day which could have been documented in error or is no longer current. Preventing and governing the use of copy and paste will prevent many adverse issues with conflicting or erroneous documentation.

Dictation/Transcription errors

Dictation software tools are becoming more intelligent and many organizations are utilizing front end speech recognition to complete EMR documentation. With traditional transcription, we have seen anomalies remaining in the record due to poor dictation quality and uncorrected errors. With front end speech recognition, providers are expected to review and correct their own dictations which presents similar issues if incorrect documentation is left in the record.

Information Security

The data that is captured in the EMR must be kept secure and available when needed. We must ensure the data remains functional and accessible to the correct users and not accessible by those without the need to know. Cybersecurity breaches are a serious threat to electronic data including those within the EMR and surrounding applications.

Downtime

Organizations must be ready to function if there is a planned or unexpected downtime of systems. Proper planning includes maintaining a master list of forms and order sets that will be called upon in the case of a downtime to ensure documentation is captured appropriately. Historical information should be maintained in a format that will allow access during a downtime, ensuring users are able to provide uninterrupted care for patients.

Ongoing EMR maintenance

As we continue to enhance and optimize EMRs, we must take into consideration all of the potential downstream effects of each change and how these changes will affect the integrity of the record. HIM professionals need prior notification of upcoming changes and adequate time to test the new functionality. No changes should be made to an EMR without all of the key stakeholders reviewing and approving the change’s downstream implications. The Joint Commission claims, “as health IT adoption becomes more widespread, the potential for health IT-related patient harm may increase.”

If you’d like to receive future HIM posts by Erin in your inbox, you can subscribe to future HIM Scene posts here.

EHRs Can Help Find Patients At High Risk Of Dying

Posted on June 1, 2016 | Written By

Anne Zieger

Much of the discussion around EMRs and EHRs these days focuses on achieving broad, long-term goals such as improved population health. But here’s some data suggesting that these systems can serve a far more immediate purpose – finding inpatients at imminent risk of death.

A study appearing in The American Journal of Medicine details how researchers from Arizona-based Banner Health created an algorithm looking for key indicators suggesting that patients were in immediate danger of death. It was set up to send an alert when patients met at least two of four systemic inflammatory response syndrome (SIRS) criteria, plus at least one of 14 acute organ dysfunction parameters. The algorithm was applied in real time to 312,214 patients across 24 hospitals in the Banner system.
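
The paper describes the trigger rule but not its implementation, so here’s a hedged sketch of that logic: fire an alert when at least two of the four SIRS criteria are met along with at least one organ dysfunction parameter. The SIRS cutoffs shown are the standard ones; the organ-dysfunction list is abbreviated and illustrative, not Banner’s full set of 14.

```python
# Hedged sketch of the trigger rule described in the study: alert when a patient
# meets at least two of the four SIRS criteria plus at least one acute organ
# dysfunction parameter. The organ-dysfunction list here is abbreviated and
# illustrative; it is not Banner Health's actual parameter set.
def sirs_count(vitals: dict) -> int:
    criteria = [
        vitals["temp_c"] > 38.0 or vitals["temp_c"] < 36.0,
        vitals["heart_rate"] > 90,
        vitals["resp_rate"] > 20,
        vitals["wbc"] > 12.0 or vitals["wbc"] < 4.0,   # x10^3 cells/uL
    ]
    return sum(criteria)

def organ_dysfunction_count(labs: dict) -> int:
    parameters = [
        labs["lactate"] > 2.0,      # mmol/L
        labs["creatinine"] > 2.0,   # mg/dL
        labs["systolic_bp"] < 90,   # mmHg
        labs["platelets"] < 100,    # x10^3/uL
    ]
    return sum(parameters)

def should_alert(vitals: dict, labs: dict) -> bool:
    return sirs_count(vitals) >= 2 and organ_dysfunction_count(labs) >= 1

vitals = {"temp_c": 38.6, "heart_rate": 112, "resp_rate": 24, "wbc": 13.5}
labs = {"lactate": 3.1, "creatinine": 1.1, "systolic_bp": 102, "platelets": 210}
print(should_alert(vitals, labs))   # True -> notify the rapid-response team
```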

Researchers found that the alert was able to identify the majority of high-risk patients within 48 hours of their admission to a hospital, allowing clinical staff to deliver early and targeted medical interventions.

This is not the first study to suggest that clinical data analysis can have a significant impact on patients’ health status. Research from last year on clinical decision support tools appearing in Generating Evidence & Methods to Improve Patient Outcomes found that such tools can be beefed up to help providers prevent stroke in vulnerable patients.

In that study, researchers from Ohio State University created the Stroke Prevention in Healthcare Delivery Environments tool to pull together and display data relevant to cardiovascular health. The idea behind the tool was to help clinicians have more effective discussions with patients and help address risk factors such as smoking and weight.

They found that the tool, which was tested at two outpatient settings at Ohio State University’s Wexner Medical Center, garnered a “high” level of satisfaction from providers. Also, patient outcomes improved in some areas, such as diabetes status and body mass index.

Despite their potential, few tools are in place today to achieve such immediate benefits as identifying inpatients at high risk of death. Certainly, clinicians are deluged with alerts, such as the ever-present med interaction warnings, but alerts analyzing specific patients’ clinical picture aren’t common. However, they should be. While drug warnings might irritate physicians, I can’t see them ignoring an alert warning them that the patient might die.

And I can hardly imagine a better use of EMR data than leveraging it to predict adverse events among sick inpatients. After all, few hospitals would spend dozens or hundreds of millions of dollars to implement a system that merely creates a repository mimicking paper records.

In addition to preventing adverse events, real-time EMR data analytics will also support the movement to value-based care. If the system can predict which patients are likely to develop expensive complications, physicians can do a better job of preventing them. While clinicians, understandably, aren’t thrilled with being told how to deliver care, they are trained to respond to problems and solve them.

I’m hoping to read more about technologies that leverage EMR data to solve day-to-day care problems. This is a huge opportunity.