
Yale New Haven Hospital Partners With Epic On Centralized Operations Center

Posted on February 5, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Info, info, all around, and not a place to manage it all. That’s the dilemma faced by most hospitals as they work to leverage the massive data stores they’re accumulating in their health IT systems.

Yale New Haven Hospital’s solution to the problem is to create a centralized operations center which connects the right people to real-time data analytics. Its Capacity Command Center (nifty alliteration, folks!) was created by YNHH, Epic and the YNHH Clinical Redesign Initiative.

The Command Center project comes five years into YNHH’s long-term High Reliability project, which is designed to prepare the institution for future challenges. These efforts are focused not only on care quality and patient safety but also on managing what YNHH says are the highest patient volumes in Connecticut. Its statement also notes that with transfers from other hospitals increasing, the hospital is seeing a growth in patient acuity, which is obviously another challenge it must address.

The Capacity Command Center’s functions are fairly straightforward, though they must have been a beast to develop.

On the one hand, the Center offers technology which sorts through the flood of operational data generated by and stored in its Epic system, generating dashboards which change in real time and drive process changes. These dashboards present real-time metrics such as bed capacity, delays for procedures and tests, and ambulatory utilization, which are made available on Center screens as well as within Epic.
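
To make the dashboard idea concrete, here is a minimal sketch of how one such metric, bed occupancy by unit, could be derived from a running feed of admit and discharge events. The event format, unit names and capacity figures are hypothetical; YNHH’s actual implementation lives inside Epic and hasn’t been published.

```python
from collections import defaultdict

# Hypothetical licensed-bed counts per unit (not YNHH's real figures).
UNIT_CAPACITY = {"MICU": 20, "SICU": 16, "MED_SURG_4": 32}

def bed_occupancy(adt_events, capacity=UNIT_CAPACITY):
    """Compute current census and occupancy rate per unit from a list of
    ADT-style events, each a dict like
    {"type": "admit" | "discharge", "unit": "MICU"}."""
    census = defaultdict(int)
    for event in adt_events:
        if event["type"] == "admit":
            census[event["unit"]] += 1
        elif event["type"] == "discharge":
            census[event["unit"]] -= 1
    return {
        unit: {"census": census[unit],
               "occupancy": round(census[unit] / beds, 2)}
        for unit, beds in capacity.items()
    }

# Example: two admissions and one discharge on the MICU.
events = [
    {"type": "admit", "unit": "MICU"},
    {"type": "admit", "unit": "MICU"},
    {"type": "discharge", "unit": "MICU"},
]
print(bed_occupancy(events))  # {'MICU': {'census': 1, 'occupancy': 0.05}, ...}
```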

In addition, YNHH has brought representatives from all of the relevant operational areas into a single physical location, including bed management, the Emergency Department, nursing staffing, environmental services and patient transport. Not only is this a good approach overall, it’s particularly helpful when patient admissions levels climb precipitously, the hospital notes.

This model is already having a positive impact on the care process, according to YNHH’s statement. For example, it notes, infection prevention staffers can now identify all patients with Foley catheters and review their charts. With this knowledge in hand, these staffers can discuss whether the patient is ready to have the catheter removed and avoid the urinary tract infections associated with prolonged use.

I don’t know about you, but I was excited to read about this initiative. It sounds like YNHH is doing exactly what it should do to get more out of patient data. For example, I was glad to read that the dashboard offered real-time analytics options rather than one-off projections from old data. Bringing key operational players together in one place makes great sense as well.

Of course, not all hospitals will have the resources to pull off something like this. YNHH is a 1,541-bed giant which had the cash to take on a command center project. Few community hospitals would have the staff or money to make such a thing happen. Still, it’s good to see somebody at the cutting edge.

Texas Hospital Association Dashboard Offers Risk, Cost Data

Posted on January 22, 2018 | Written By Anne Zieger

The Texas Hospital Association has agreed to a joint venture with health IT vendor IllumiCare to roll out a new tool for physicians. The new dashboard offers an unusual but powerful mix of risk data and real-time cost information.

According to THA, physician orders represent 87% of hospital expenses, yet most physicians know little about the cost of the items they order. The new dashboard, Smart Ribbon, gives doctors information on treatment costs and the risk of patient harm at the point of care. THA’s assumption is that this data will lead physicians to order fewer, and less costly, tests and medications.

To my mind, the tool sounds neat. IllumiCare’s Smart Ribbon technology doesn’t need to be integrated with the hospital’s EMR. Instead, it works with existing HL7 feeds and piggybacks onto existing user authorization schemes, eliminating the need to build costly interfaces to EMR data. The dashboard includes patient identification, a timer if the patient is on observation status, a tool for looking up costs, and tabs providing wholesale costs for meds, labs and radiology. It also estimates iatrogenic risks resulting from physician decisions.
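
The HL7 angle is worth a quick illustration. Below is a minimal sketch, under my own assumptions about the message content, of pulling a patient ID and an ordered test out of a pipe-delimited HL7 v2 order message and looking up a wholesale cost. This is roughly the kind of lightweight, feed-based approach described above rather than a full EMR interface; the sample message and cost table are invented.

```python
# A minimal sketch of reading an order out of a pipe-delimited HL7 v2 message,
# the kind of feed Smart Ribbon reportedly consumes. The sample message and
# cost table are hypothetical; real feeds carry many more segments and fields.
SAMPLE_ORM = "\r".join([
    "MSH|^~\\&|CPOE|HOSP|RIBBON|HOSP|202401011200||ORM^O01|123|P|2.3",
    "PID|1||MRN12345||DOE^JANE",
    "OBR|1|||80053^Comprehensive metabolic panel^CPT",
])

WHOLESALE_COST = {"80053": 14.50}  # hypothetical wholesale cost lookup, in USD

def order_cost(hl7_message, costs=WHOLESALE_COST):
    """Return (patient_id, order_code, cost) from the PID and OBR segments."""
    segments = {line.split("|")[0]: line.split("|") for line in hl7_message.split("\r")}
    patient_id = segments["PID"][3]
    order_code = segments["OBR"][4].split("^")[0]
    return patient_id, order_code, costs.get(order_code)

print(order_cost(SAMPLE_ORM))  # ('MRN12345', '80053', 14.5)
```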

Unlike some clinical tools I’ve seen, Smart Ribbon doesn’t generate alerts or alarms, which makes it a different beast than many other clinical decision support tools. That doesn’t mean tools that do generate alerts are bad, but that feature does set it apart from others.

We’ve covered many other tools designed to support physicians, and as you’d probably guess, those technologies come in all sizes. For example, last year contributor Andy Oram wrote about a different type of dashboard, PeraHealth, a surveillance system targeting at-risk patients in hospitals.

PeraHealth identifies at-risk patients through analytics and displays them on a dashboard that doctors and nurses can pull up, including trends over several shifts. Its analytical processes pull in nursing assessments in addition to vital signs and other standard data sets. This approach sounds promising.

Ultimately, though, dashboard vendors are still figuring out what physicians need, and it’s hard to tell whether their market will stay alive. In fact, according to one take from Kalorama Information, this year technologies like dashboarding, blockchain and even advanced big data analytics will be integrated into EMRs.

As for me, I think Kalorama’s prediction is too aggressive. While I agree that many freestanding tools will be integrated into the EMR, I don’t think it will happen this or even next year. In the meantime, there’s certainly a place for creating dashboards that accommodate physician workflow and aren’t too intrusive. For the time being, they aren’t going away.

Predictive Analytics Will Save Hospitals, Not IT Investment

Posted on October 27, 2017 | Written By Anne Zieger

Most hospitals run on very slim operating margins. In fact, not-for-profit hospitals’ mean operating margins fell from 3.4% in fiscal year 2015 to 2.7% in fiscal year 2016, according to Moody’s Investors Service.

To turn this around, many seem to be pinning their hopes on better technology, spending between 25% and 35% of their capital budget on IT infrastructure investment. But that strategy might backfire, suggests an article appearing in the Harvard Business Review.

Author Sanjeev Agrawal, who serves as president of healthcare and chief marketing officer at healthcare predictive analytics company LeanTaaS, argues that throwing more money at IT won’t help hospitals become more profitable. “Healthcare providers can’t keep spending their way out of trouble by investing in more and more infrastructure,” he writes. “Instead, they must optimize the use of the assets currently in place.”

Rather, he suggests, hospitals need to go the way of retail, transportation and airlines, industries which also manage complex operations and work on narrow margins. Those industries have improved their performance by improving their data science capabilities.

“[Hospitals] need to create an operational ‘air traffic control’ for their hospitals — a centralized command-and-control capability that is predictive, learns continually, and uses optimization algorithms and artificial intelligence to deliver prescriptive recommendations throughout the system,” Agrawal says.

Agrawal predicts that hospitals will use predictive analytics to refine their key care-delivery processes, including resource utilization, staff schedules, and patient admits and discharges. If they get it right, they’ll meet many of their goals, including better patient throughput, lower costs and more efficient asset utilization.

For example, he notes, hospitals can optimize OR utilization, which brings in 65% of revenue at most hospitals. Rather than relying on current block-scheduling techniques, which have been proven to be inefficient, hospitals can use predictive analytics and mobile apps to give surgeons more control of OR scheduling.
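
As a rough illustration of the sort of optimization Agrawal has in mind, here is a minimal sketch of a rule that flags chronically under-used surgical block time for early release to the open schedule. The thresholds, data and logic are hypothetical, not LeanTaaS’s actual algorithm.

```python
def blocks_to_release(block_history, lookback=8, threshold=0.6):
    """Flag surgeons whose average block utilization over the last `lookback`
    blocks falls below `threshold`, so the time can be released to the open
    schedule earlier. block_history maps surgeon -> list of utilization
    fractions (minutes used / minutes allocated), most recent last."""
    flagged = {}
    for surgeon, utilizations in block_history.items():
        recent = utilizations[-lookback:]
        avg = sum(recent) / len(recent)
        if avg < threshold:
            flagged[surgeon] = round(avg, 2)
    return flagged

# Hypothetical utilization history for two surgeons.
history = {
    "Dr. A": [0.9, 0.85, 0.8, 0.95, 0.9, 0.88, 0.92, 0.9],
    "Dr. B": [0.5, 0.4, 0.55, 0.45, 0.6, 0.5, 0.35, 0.4],
}
print(blocks_to_release(history))  # {'Dr. B': 0.47}
```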

Another area ripe for process improvements is the emergency department. As Agrawal notes, hospitals can avoid bottlenecks by using analytics to define the most efficient order for ED activities. Not only can this improve hospital finances, it can improve patient satisfaction, he says.

Of course, Agrawal works for a predictive analytics vendor, which makes him more than a little bit biased. But on the other hand, I doubt any of us would disagree that adopting predictive analytics strategies is the next frontier for hospitals.

After all, having spent many billions collectively to implement EMRs, hospitals have created enormous data stores, and few would dispute that it’s high time to leverage them. For example, if they want to adopt population health management – and it’s a question of when, not if – they’ve got to use these tools to reduce outcome variations and improve quality of care across populations. Also, while the deep-pocketed hospitals are doing it first, it seems likely that over time, virtually every hospital will use EMR data to streamline operations as well.

The question is, will vendors like LeanTaaS take a leading role in this transition, or will hospital IT leaders know what they want to do?  At this stage, it’s anyone’s guess.

A New Hospital Risk-Adjustment Model

Posted on August 23, 2017 | Written By Anne Zieger

Virtually all of the risk adjustment models with which I’m familiar are based on retrospective data. This data clearly has some predictive benefits – maybe it’s too cliché to say the past is prologue – and is already in our hands.

To look at just one example of what existing data archives can do, we need go no further than the pages of this blog. Late last year, I shared the story of a group of French hospitals which are working to predict admission rates as much as 15 days in advance by mining a store of historical data. Not surprisingly, the group’s key data includes 10 years’ worth of admission records.

The thing is, using historical data may not be as helpful when you’re trying to develop risk-adjustment models. After all, among other problems, the metrics by which we evaluate care shift over time, and our understanding of disease states changes as well, so using such models to improve care and outcomes has its limitations.

I’ve been thinking about these issues since John shared some information on a risk-adjustment tool which leverages relevant patient care data collected almost in real time.

The Midas Hospital Risk Adjustment Model, which is created specifically for single organizations, samples anywhere from 20 to 600 metrics, which can include data on mortality, hospital-acquired complications, unplanned readmissions, lengths of stay and charges. It’s built using the Midas Health Analytics Platform, which comes from a group within healthcare services company Conduent. The platform captures data across hospital functional areas and aggregates it for use in care management.

The Midas team chooses what metrics to include using its in-house tools, which include a data warehouse populated with records on more than 100 million claims as well as data from more than 800 hospitals.

What makes the Midas model special, Conduent says, is that it incorporates a near-time feed of health data from hospital information systems. One of the key advantages to doing so is that rather than basing its analysis on ICD-9 data, which was in use until relatively recently, it can leverage clinically-detailed ICD-10 data, the company says.

The result of this process is a model which is far more capable of isolating small but meaningful differences between individual patients, Conduent says. Then, using this model, hospitals can risk-adjust clinical and financial outcomes data by provider for hospitalized patients and, hopefully, have a better basis for making future decisions.
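
For readers who haven’t worked with risk adjustment, most models of this type boil down to an observed-to-expected comparison: fit a model of outcome probability on patient-level factors, then compare each provider’s actual outcome rate to what the model expects given that provider’s case mix. The sketch below shows that pattern on synthetic data using scikit-learn; it is purely illustrative and not Conduent’s model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic patient-level data: age, comorbidity count, emergent admission flag.
n = 5000
X = np.column_stack([
    rng.normal(65, 12, n),          # age
    rng.poisson(2, n),              # comorbidity count
    rng.integers(0, 2, n),          # emergent admission
])
logit = -6 + 0.04 * X[:, 0] + 0.5 * X[:, 1] + 0.8 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))      # observed outcome (e.g. mortality)
provider = rng.integers(0, 3, n)                  # which of 3 providers treated the patient

# Fit the risk model on patient factors only.
model = LogisticRegression(max_iter=1000).fit(X, y)
expected = model.predict_proba(X)[:, 1]

# Observed-to-expected ratio per provider: >1 means worse than case mix predicts.
for p in range(3):
    mask = provider == p
    oe = y[mask].sum() / expected[mask].sum()
    print(f"Provider {p}: O/E = {oe:.2f}")
```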

This approach sounds desirable (though I don’t know if it’s actually new). We probably need to move in the direction of using fresh data when analyzing care trends. I suspect few hospitals or health systems would have the resources to take this on today, but it’s something to consider.

Still, I’d want to know two things before digging into Midas further. First, while the idea sounds good, is there evidence to suggest that collecting recent data offers superior clinical results? And in that vein, how much of an improvement does it offer relative to analysis of historical data? Until we know these things, it’s hard to tell what we’ve got here.

2 Core Healthcare IT Principles

Posted on May 10, 2017 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

One of my favorite bloggers I found when I first started blogging about Healthcare IT was a hospital CIO named Will Weider, who blogged on a site he called Candid CIO. At the time he was CIO of Ministry Health Care, and he always offered exceptional insights from his perspective as a hospital CIO. A little over a month ago, Will decided to move on as CIO after 22 years. That was great news for me, since it meant he’d probably have more time to blog. The good news is that he has been posting more.

In a recent post, Will offered two guiding principles that I thought were very applicable to any company working to take part in the hospital health IT space:

1. Embed everything in the EHR
2. Don’t hijack the physician workflow

Go and read Will’s post to get his insights, but I agree with both of these principles.

I would add one clarification to his first point. I think there is a space for an outside provider to work outside of the EHR. Think of someone like a care manager. EHR software doesn’t do care management well and so I think there’s a space for a third party care management platform. However, if you want the doctor to access it, then it has to be embedded in the EHR. It’s amazing how much of a barrier a second system is for a doctor.

Ironically, we’ve seen the opposite is also true for people like radiologists. If it’s not in their PACS interface, then it takes a nearly herculean effort for them to leave their PACS system to look something up in the EHR. That’s why I was excited to see some PACS interfaces at RSNA last year which had the EHR data integrated into the radiologists’ interface. The same is true for doctors working in an EHR.

Will’s second point is a really strong one. In his description of this principle, he even suggests that alerts should all but be done away with in an EHR, except for “the most critical safety situations.” He’s right that alert blindness is real, and I haven’t seen anyone nail alerts so well that doctors are happy to see them. That’s the bar we should place on alerts that hijack the physician workflow: will the doctor be happy you hijacked their workflow and gave them the alert? If the answer is no, then you probably shouldn’t send it.

Welcome back to the blogosphere Will! I look forward to many more posts from you in the future.

Cleveland Clinic Works To Eliminate Tech Redundancies

Posted on March 1, 2017 | Written By Anne Zieger

The Cleveland Clinic has relied on its EMR for quite some time. In fact, it adopted Epic in the 1990s, long before most healthcare organizations were ready to make a bet on EMRs. Today, decades later, the Epic EMR is the “central data hub” for the medical center and is central to both its clinical and operational efforts, according to William Morris, MD, the Clinic’s associate chief information officer.

But Morris, who spoke about the Clinic’s health IT with Health Data Management, also knows its limitations. In an interview with the magazine’s Greg Slabodkin, he notes that while the EMR may be necessary, it isn’t sufficient. The Epic EMR is “just a digital repository,” he told Slabodkin. “Ultimately, it’s what you do with the technology in your ecosystem.”

These days, IT leaders at the Clinic are working to streamline the layers of additional technology which have accreted on top of the EMR over the years. “As an early adopter of Epic, we have accumulated quite a bit of what I’ll call technical debt,” said Doug Smith, interim chief information officer. “What I mean by that is multiple enhancements, bolt-ons, or revisions to the core application. We have to unburden ourselves of that.”

It’s not that Clinic leaders are unhappy with their EMR. In fact, they’re finding ways to tap its power to improve care. For example, to better leverage its EMR data, the Cleveland Clinic has developed data-driven “risk scores” designed to let doctors know if patients need intervention. The models, developed by the Clinic’s Quantitative Health Sciences group, offer outcome risk calculators for several conditions, including cancer, cardiovascular disease and diabetes.
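
The Clinic hasn’t published the internals of these calculators, but outcome risk scores of this kind typically reduce to a logistic model: a weighted sum of patient factors passed through a sigmoid to produce a probability. Here is a minimal sketch with made-up coefficients, just to show the shape of such a calculator.

```python
import math

# Hypothetical coefficients for an outcome risk calculator; real models are
# fit on institutional data by groups like the Clinic's Quantitative Health
# Sciences team and would use many more variables.
INTERCEPT = -7.2
COEFFICIENTS = {"age": 0.05, "hba1c": 0.3, "systolic_bp": 0.01, "smoker": 0.6}

def outcome_risk(patient):
    """Return the predicted probability of the outcome for one patient,
    given a dict of risk-factor values keyed like COEFFICIENTS."""
    score = INTERCEPT + sum(COEFFICIENTS[k] * patient[k] for k in COEFFICIENTS)
    return 1 / (1 + math.exp(-score))

patient = {"age": 62, "hba1c": 8.1, "systolic_bp": 145, "smoker": 1}
print(f"Predicted risk: {outcome_risk(patient):.1%}")
```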

(By the way, if predictive analytics interest you, you might want to check out our coverage of such efforts at New York’s Mount Sinai Hospital, which is developing a platform to predict which patients might develop congestive heart failure and care for patients already diagnosed with the condition more effectively. I’ve also taken a look at a related product being developed by Google’s DeepMind, an app named Streams which will ping clinicians if a patient needs extra attention.)

Ultimately, though, the organization hopes to simplify its larger health IT infrastructure substantially, to the point where 85% of the HIT functionality comes from the core Epic system. This includes keeping a wary eye on Epic upgrades, and implementing new features selectively. “When you take an upgrade in Epic, they are always turning on more features and functions,” Smith notes. “Most are optional.”

Not only will such improvements streamline IT operations, they will make clinicians more efficient, Smith says. “They are adopting standard workflows that also exist in many other organizations—and, we’re more efficient in supporting it because we don’t take as long to validate or support an upgrade.”

As an aside, I’m interested to read that Epic is tossing more features at Cleveland Clinic than it cares to adopt. I wonder whether those are features Epic’s engineers think customers want, or features customers are actually demanding today.

The Distributed Hospital On The Horizon

Posted on February 24, 2017 | Written By Anne Zieger

If you’re reading this blog, you already know that distributed, connected devices and networks are the future of healthcare. Connected monitoring devices are growing more mature by the day, network architectures are becoming amazingly fluid, and with the growth of the IoT, we’re adding huge numbers of smart devices to an already-diverse array of endpoints. While we may not know what all of this will look like when it’s fully mature, we’ve already made amazing progress in connecting care.

But how will these trends play out? One nice look at where all this is headed comes from Jeroen Tas, chief innovation and strategy officer at Philips. In a recent article, Tas describes a world in which even major brick-and-mortar players like hospitals go almost completely virtual.  Certainly, there are other takes out there on this subject, but I really like how Tas explains things.

He starts with the assertion that the hospital of the future “is not a physical location with waiting rooms, beds and labs.” Instead, a hospital will become an abstract network overlay connecting nodes. It’s worth noting that this isn’t just a concept. For an example, Tas points to the Mercy Virtual Care Center, a $54 million “hospital without beds” dedicated to telehealth and connected care.  The Center, which has over 300 employees, cares for patients at home and in beds across 38 hospitals in seven states.

While the virtual hospital may not rely on a single, central campus, physical care locations will still matter – they’ll just be distributed differently. According to Tas, the connected health network will work best if care is provided as needed through retail-type outlets near where people live, specialist hubs, inpatient facilities and outpatient clinics. Yes, of course, we already have all of these things in place, but in the new connected world, they’ll all be on a single network.

Ultimately, even if brick-and-mortar hospitals never disappear, virtual care should make it possible to cut down dramatically on hospital admissions, he suggests.  For example, Tas notes that Philips partner Banner Health has slashed hospital admissions almost 50% by using telehealth and advanced analytics for patients with multiple chronic conditions. (We’ve also reported on a related pilot by Partners HealthCare Brigham and Women’s Hospital, the “Home Hospital,” which sends patients home with remote monitoring devices as an alternative to admissions.)

Of course, the broad connected care outline Tas offers can only take us so far. It’s all well and good to have a vision, but there are still some major problems we’ll have to solve before connected care becomes practical as a backbone for healthcare delivery.

After all, to cite one major challenge, community-wide connected health won’t be very practical until interoperable data sharing becomes easier – and we really don’t know when that will happen. Also, until big data analytics tools are widely accessible (rather than the province of the biggest, best-funded institutions) it will be hard for providers to manage the data generated by millions of virtual care endpoints.

Still, if Tas’s piece is any indication, consensus is building on what next-gen care networks can and should be, and there are certainly plenty of ways to lay the groundwork for the future. Even small-scale, preliminary connected health efforts seem to be fostering meaningful changes in how care is delivered. And there’s little doubt that over time, connected health will turn many brick-and-mortar care models on their heads, becoming a large – or even dominant – part of care delivery.

Getting there may be tricky, but if providers keep working at connected care, it should offer an immense payoff.

UCSF Partners With Intel On Deep Learning Analytics For Health

Posted on January 30, 2017 | Written By Anne Zieger

UC San Francisco’s Center for Digital Health Innovation has agreed to work with Intel to deploy and validate a deep learning analytics platform. The new platform is designed to help clinicians make better treatment decisions, predict patient outcomes and respond quickly in acute situations.

The Center’s existing projects include CareWeb, a team-based collaborative care platform built on Salesforce.com social and mobile communications tech; Tidepool, which is building infrastructure for next-gen smart diabetes management apps; Health eHeart, a clinical trials platform using social media, mobile and realtime sensors to change heart disease treatment; and Trinity, which offers “precision team care” by integrating patient data with evidence and multi-disciplinary data.

These projects seem to be a good fit with Intel’s healthcare efforts, which are aimed at helping providers succeed at distributed care communication across desktop and mobile platforms.

As the two note in their joint press release, creating a deep learning platform for healthcare is extremely challenging, given that the relevant data is complex and stored in multiple incompatible systems. Intel and UCSF say the next-generation platform will address these issues, allowing them to integrate not only data collected during clinical care but also inputs from genomic sequencing, monitors, sensors and wearables.

Supporting all of this activity obviously calls for a lot of computing power. The partners will run deep learning use cases in a distributed fashion on a CPU-based cluster designed to crunch through very large datasets handily. Intel is rolling out the computing environment on its Xeon processor-based platform, which supports data management and the algorithm development lifecycle.

As the deployment moves forward, Intel leaders plan to study how deep learning analytics and machine-driven workflows can optimize clinical care and patient outcomes, and leverage what they learn when they create new platforms for the healthcare industry. Both partners believe that this model will scale for future use case needs, such as larger convolutional neural network models (artificial neural networks patterned after living organisms) and very large multidimensional datasets.
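
For readers unfamiliar with the “convolutional neural network models” mentioned above, here is a minimal sketch of one in PyTorch, sized for a small imaging task and run on CPU. It illustrates the model class generally, not anything UCSF or Intel have actually built.

```python
import torch
from torch import nn

class TinyImagingCNN(nn.Module):
    """A small convolutional network for single-channel medical images,
    e.g. 64x64 grayscale patches, classifying into two classes."""
    def __init__(self, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

# One forward pass on a batch of four random 64x64 "images", on CPU.
model = TinyImagingCNN()
logits = model(torch.randn(4, 1, 64, 64))
print(logits.shape)  # torch.Size([4, 2])
```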

Once implemented, the platform will allow users to conduct advanced analytics on all of this disparate data, using machine learning and deep learning algorithms. And if all performs as expected, clinicians should be able to draw on these advanced capabilities on the fly.

This looks like a productive collaboration. If nothing else, it appears that in this case the technology platform UCSF and Intel are developing may be productized and made available to other providers, which could be very valuable. After all, while individual health systems (such as Geisinger) have the resources to kick off big data analytics projects on their own, it’s possible a standardized platform could make such technology available to smaller players. Let’s see how this goes.

A Look At Geisinger’s Big Data Efforts

Posted on December 28, 2016 | Written By Anne Zieger

This week I got a look at a story appearing in a recent issue of Harvard Business Review which offers a description of Geisinger Health System’s recent big data initiatives. The ambitious project is designed not only to track and analyze patient outcomes, but also to visualize healthcare data across cohorts of patients and networks of providers and even correlate genomic sequences with clinical care. Particularly given that Geisinger has stayed on the cutting edge of HIT for many years, I think it’s worth a look.

As the article’s authors note, Geisinger rolled out a full-featured EMR in 1996, well ahead of most of its peers. Like many other health systems, Geisinger has struggled to aggregate and make use of data, particularly because the legacy analytics systems it still has in place can’t accommodate the growing flood of new data types emerging today.

Last year, Geisinger decided to create a new infrastructure which could bring this data together. It implemented a Unified Data Architecture (UDA), allowing it to integrate big data into its existing data analytics and management. According to the article, Geisinger’s UDA rollout is the largest practical application of point-of-care big data in the industry. Of particular note, Geisinger is crunching not only enterprise healthcare data (including HIE inputs, clinical departmental systems and patient satisfaction surveys) and consumer health tools (like smartphone apps) but even grocery store and loyalty program info.

Though all of its data hasn’t yet been moved to the UDA, Geisinger has already seen some big data successes, including:

* “Close the Loop” program: Using natural language processing, the UDA analyzes clinical and diagnostic imaging reports, including free text. Sometimes it detects problems unrelated to the initial issue (such as injuries from a car crash) which could themselves cause serious harm. The program has already saved patient lives.

* Early sepsis detection/treatment: Geisinger uses the UDA to bring all sepsis-patient information into one place as patients move through the hospital. The system alerts providers to real-time physiologic data in patients with life-threatening septic shock, as well as tracking when antibiotics are prescribed and administered. Ninety percent of providers who use this tool consistently adhere to sepsis treatment protocols, as opposed to 40% of those who don’t. (A minimal sketch of this kind of alert rule follows the list.)

* Surgery costs/outcomes: The Geisinger UDA tracks and integrates surgical supply-chain data, plus clinical data by surgery type and provider, which offers a comprehensive view of performance by provider and surgery type.  In addition to offering performance insight, this approach has also helped generate insights about supply use patterns which allow the health system to negotiate better vendor deals.
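
As promised above, here is a minimal sketch of the kind of rule-based sepsis screen such a tool might run against real-time physiologic data. The thresholds are the widely used SIRS criteria; the data structure and alerting logic are my own assumptions, not Geisinger’s actual implementation.

```python
def sirs_flags(vitals):
    """Return which SIRS criteria a set of vitals meets.
    vitals: dict with temp_c, heart_rate, resp_rate, wbc (x1000/uL)."""
    return {
        "temperature": vitals["temp_c"] > 38.0 or vitals["temp_c"] < 36.0,
        "heart_rate": vitals["heart_rate"] > 90,
        "resp_rate": vitals["resp_rate"] > 20,
        "wbc": vitals["wbc"] > 12 or vitals["wbc"] < 4,
    }

def sepsis_alert(vitals, infection_suspected):
    """Alert when a suspected infection is accompanied by two or more SIRS
    criteria -- the classic screen; real systems add antibiotic timing,
    lactate, and organ-dysfunction checks."""
    flags = sirs_flags(vitals)
    return infection_suspected and sum(flags.values()) >= 2, flags

vitals = {"temp_c": 38.6, "heart_rate": 112, "resp_rate": 24, "wbc": 15.2}
alert, flags = sepsis_alert(vitals, infection_suspected=True)
print(alert, flags)  # True, with all four criteria met
```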

To me, one of the most interesting things about this story is that while Geisinger is at a relatively early stage of its big data efforts, it has already managed to generate meaningful benefits from its efforts. My guess is that its early successes are more due to smart planning – which includes worthwhile goals from day one of the rollout — than the technology per se. Regardless, let’s hope other hospital big data projects fare so well. (Meanwhile, for a look at another interesting hospital big data project, check out this story.)

Longitudinal Patient Record Needed To Advance Care?

Posted on November 23, 2016 | Written By Anne Zieger

In most day to day settings, a clinician only needs a small (if precisely focused) amount of data to make clinical decisions. Both in ambulatory and acute settings, they rely on immediate and near-term information, some collected during the visit, and a handful of historical factors likely to influence or even govern what plan of care is appropriate.

That may be changing, though, according to Cheryl McKay of Orion Health. In a recent blog item, McKay argues that as the industry shifts from fee-for-service payment models to value-based reimbursement, we’ll need new types of medical records to support this model. Today, the longitudinal patient record and community care plan are emerging as substitutes for old EMR models, McKay says. These new entities will be built from varied data sources including payer claims, provider EMRs, patient health devices and the patients themselves.

As these new forms of patient medical record emerge, effective population health management is becoming more feasible, she argues. Longitudinal patient records and community care plans are “essential as we steer away from FFS… The way records are delivered to healthcare providers – with an utter lack of visibility and a lot of noise from various data sources – creates unnecessary risks for everyone involved.”

She contends that putting these types of documentation in place, which summarize patient-based clinical experiences rather than episodic clinical experiences, closes big gaps in patient history which would otherwise generate mistakes. Longitudinal record-keeping also makes it easier for physicians to aggregate information, do predictive modeling and intervene proactively in patient care at both the patient and population level.
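
To show what aggregating information across sources might look like in practice, here is a minimal sketch that merges events from hypothetical claims, EMR and home-device feeds into a single chronological timeline keyed by a shared patient identifier. Real longitudinal records require patient matching, terminology mapping and far richer data than this.

```python
from datetime import date

# Hypothetical feeds from three sources, already keyed by a shared patient ID.
claims = [{"patient": "P001", "date": date(2016, 3, 2), "event": "ED visit, claim 99284"}]
emr = [{"patient": "P001", "date": date(2016, 3, 2), "event": "Dx: CHF exacerbation"},
       {"patient": "P001", "date": date(2016, 6, 15), "event": "Echo: EF 35%"}]
devices = [{"patient": "P001", "date": date(2016, 6, 20), "event": "Home scale: +2.5 kg in 3 days"}]

def longitudinal_record(patient_id, *sources):
    """Merge events for one patient from any number of source feeds into a
    single chronologically ordered timeline."""
    events = [e for feed in sources for e in feed if e["patient"] == patient_id]
    return sorted(events, key=lambda e: e["date"])

for entry in longitudinal_record("P001", claims, emr, devices):
    print(entry["date"], entry["event"])
```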

She also predicts that with both a longitudinal patient record and a community care plan in place, giving providers of all stripes a “panoramic” look at patients, costs will fall as providers stop performing needless tests and procedures. Not only that, these new entities would ideally offer real-time information as well, including event notifications, keeping all the providers involved in a patient’s care in sync.

To be sure, this blog item is a pitch for Orion’s technology. While the notion of a community-care plan isn’t owned by anyone in particular, Orion is pitching a specific model which rides upon its population health technology. That being said, I’m betting most of us would agree that the idea (regardless of which vendor you work with) of establishing a community-wide care plan does make sense. And certainly, putting a rich longitudinal patient record in place could be valuable too.

However, given the sad state of interoperability today, I doubt it’s possible to build this model unless you choose a single vendor-centric solution. At present, I think it’s more of a dream than a reality for most of us.