
“Learning Health System” Pilot Cuts Care Costs While Improving Quality

Posted on January 11, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

As some of you will know, the goal of the ONC’s Shared Nationwide Interoperability Roadmap is to create a “nationwide learning health system.” In this system, individuals, providers and organizations will freely share health information and, more importantly, will share that information in “closed loops” which allow for continuous learning and care improvement.

When I read about this model, which is backed by the Institute of Medicine, I thought it sounded interesting but not terribly practical. Recently, though, I stumbled upon an experiment which attempts to bring this approach to life. And it’s more than just unusual: it seems to be successful.

What I’m talking about is a pilot study, done by a team from Nationwide Children’s Hospital and The Ohio State University, which involved implementing a “local” learning health system. During the pilot, team members used EHR data to create personalized treatments for patients based on data from others with similar conditions and risk factors.

To date, building a learning health system has been very difficult, largely because integrating EHRs across multiple hospital systems remains a formidable challenge. For that reason, researchers with the two organizations decided to implement a “local” learning health system, according to a press statement from Nationwide Children’s.

To build the local learning health system, the team from Nationwide Children’s and Ohio State optimized the EHR to support their efforts. They also relied on a “robust” care coordination system which sat at the core of the EHR. The pilot subjects were a group of 131 children treated through the hospital’s cerebral palsy program.
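
The press materials don’t spell out how the team matched patients, but the core idea of learning from every patient can be pictured as a similarity search over prior cases. Here is a minimal sketch in Python; every field name, weight and scoring choice is a hypothetical illustration, not the pilot’s actual method:

```python
# A minimal, hypothetical sketch of the "learn from every patient" idea:
# find previously treated patients most similar to a new patient and
# summarize their outcomes. The pilot's real methods are not public.
from dataclasses import dataclass

@dataclass
class PatientRecord:
    age: float
    gmfcs_level: int        # Gross Motor Function Classification level (1-5), standard in CP care
    prior_admissions: int
    outcome_score: float    # e.g., a post-treatment function score (hypothetical)

def similarity(a: PatientRecord, b: PatientRecord) -> float:
    """Inverse normalized distance over a few illustrative risk factors."""
    d = (abs(a.age - b.age) / 18.0
         + abs(a.gmfcs_level - b.gmfcs_level) / 4.0
         + abs(a.prior_admissions - b.prior_admissions) / 10.0)
    return 1.0 / (1.0 + d)

def expected_outcome(new_patient: PatientRecord,
                     treated_cohort: list[PatientRecord], k: int = 5) -> float:
    """Average outcome among the k prior patients most similar to the new one."""
    nearest = sorted(treated_cohort,
                     key=lambda p: similarity(new_patient, p), reverse=True)[:k]
    return sum(p.outcome_score for p in nearest) / len(nearest)
```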

Children treated in the 12-month program, named “Learn From Every Patient,” experienced a 43% reduction in total inpatient days, a 27% reduction in inpatient admissions, a 30% reduction in emergency department visits and a 29% reduction in urgent care visits.

The two institutions spent $225,000 to implement the pilot during the first year. However, the return on this investment was dramatic.  Researchers concluded that the program cut healthcare costs by $1.36 million. This represented a savings of about $6 for each dollar invested.
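
For what it’s worth, the reported numbers are internally consistent:

```python
# Reproducing the reported return on investment from the figures above.
investment = 225_000        # first-year pilot cost
savings = 1_360_000         # estimated reduction in healthcare costs
print(f"${savings / investment:.2f} saved per dollar invested")  # -> $6.04
```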

An added benefit: the clinicians working in the CP clinic found that this approach to care simplified documentation, which saved time and made it possible for them to see more patients during each session.

Not surprisingly, the research team thinks this approach has a lot of potential. “This method has the potential to be an effective complementary or alternative strategy to the top-down approach of learning health systems,” the release said. In other words, maybe bottom-up, incremental efforts are worth a try.

Given these results, it’d be nice to think that we’ll have full interoperability someday, and that we’ll be able to scale the learning health system approach up to the whole US. In the meantime, it’s good to see at least one health system make some headway with it.

Some Projections For 2017 Hospital IT Spending

Posted on January 4, 2017 | Written By Anne Zieger

A couple of months ago, HIMSS released some statistics from its survey on US hospitals’ plans for IT investment over the next 12 months. The results contain a couple of data points that I found particularly interesting:

  • While I had expected the most common type of planned spending to be focused on population health or related solutions, HIMSS found that pharmacy was the most active category. In fact, 51% of hospitals were planning to invest in at least one pharmacy technology, largely to improve tracking of medication dispensing in additional patient care environments. Researchers also found that 6% of hospitals were planning to add carousels or packagers in their pharmacies.
  • Eight percent of hospitals said that they plan to invest in EMR components, which I hadn’t anticipated (though it makes sense in retrospect). HIMSS reported that 14% of hospitals at Stage 1-4 of its Electronic Medical Record Adoption Model are investing in pharmacy tech for closed-loop med administration (sketched just below), and 17% in auto-ID tech. Four percent of Stage 6 hospitals plan to support or expand information exchange capabilities. Meanwhile, 60% of Stage 7 hospitals are investing in hardware infrastructure “for the post-EMR world.”
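
For readers unfamiliar with the term, closed-loop med administration pairs barcode (“auto ID”) scans of the patient and the medication with the active order before a dose is recorded. A minimal sketch of that bedside check follows; the data structures and time window are assumptions for illustration, not any vendor’s implementation or HIMSS’s criteria:

```python
# A minimal sketch of the closed-loop check barcode med administration
# systems perform at the bedside: scan the patient and the medication,
# then verify both against the active order before recording the dose.
# All names, fields and thresholds here are hypothetical illustrations.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class MedOrder:
    patient_id: str
    ndc_code: str            # drug code carried on the package barcode
    dose_mg: float
    due: datetime

def verify_administration(scanned_patient: str, scanned_ndc: str,
                          order: MedOrder, now: datetime) -> list[str]:
    """Return a list of safety problems; an empty list means OK to administer."""
    problems = []
    if scanned_patient != order.patient_id:
        problems.append("wrong patient")
    if scanned_ndc != order.ndc_code:
        problems.append("wrong medication")
    if abs(now - order.due) > timedelta(minutes=30):   # assumed window
        problems.append("outside administration window")
    return problems
```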

Other data from the HIMSS report included news of new analytics and telecom plans:

  • Researchers say that recent mergers and acquisitions are triggering new investments in telephony. They found that 12% of hospitals with inpatient revenues between $25 million and $125 million, and 6% of hospitals with more than $500 million in inpatient revenues, are investing in VoIP and telemedicine. FWIW, I’m not sure how mergers and acquisitions would trigger telemedicine rollouts, as these are already well underway at many hospitals. Maybe these deals foster new thinking and innovation?
  • As readers know, hospitals are increasingly spending on analytics solutions to improve care and make use of big data. However (and this surprised me), only 8% of hospitals reported plans to buy at least one analytics technology. My guess is that this number is small because a) hospitals may not have collected their big data assets in easily-analyzed form yet and b) they’re still hoping to make better use of their legacy analytics tools.

Looking at these stats as a whole, I get the sense that the hospitals surveyed are expecting to play catch-up and shore up their infrastructure next year, rather than sink big dollars into future-looking solutions.

Without a doubt, hospital leaders are likely to invest soon in game-changing technologies, such as cutting-edge patient engagement and population health platforms, to prepare for the shift to value-based care. It’s inevitable.

But in the meantime it probably makes sense for them to focus on internal cost drivers like pharmacy departments, whose average annual inpatient drug spending shot up by more than 23% between 2013 and 2015. Without stanching that kind of bleeding, hospitals are unlikely to get as much value as they’d like from big-idea investments in the future.

A Look At Geisinger’s Big Data Efforts

Posted on December 28, 2016 | Written By Anne Zieger

This week I got a look at a story in a recent issue of Harvard Business Review describing Geisinger Health System’s recent big data initiatives. The ambitious project is designed not only to track and analyze patient outcomes, but also to visualize healthcare data across cohorts of patients and networks of providers, and even to correlate genomic sequences with clinical care. Particularly given that Geisinger has stayed on the cutting edge of HIT for many years, I think it’s worth a look.

As the article’s authors note, Geisinger rolled out a full-featured EMR in 1996, well ahead of most of its peers. Even so, like many other health systems, Geisinger has struggled to aggregate and make use of data, particularly because the legacy analytics systems still in place can’t accommodate the growing flood of new data types emerging today.

Last year, Geisinger decided to create a new infrastructure to bring this data together. It implemented a Unified Data Architecture (UDA), allowing it to integrate big data into its existing data analytics and management systems. According to the article, Geisinger’s UDA rollout is the largest practical application of point-of-care big data in the industry. Of particular note, Geisinger is crunching not only enterprise healthcare data (including HIE inputs, clinical departmental systems and patient satisfaction surveys) and consumer health tools (like smartphone apps), but even grocery store and loyalty program info.

Though all of its data hasn’t yet been moved to the UDA, Geisinger has already seen some big data successes, including:

* “Close the Loop” program: Using natural language processing, the UDA analyzes clinical and diagnostic imaging reports, including free text. Sometimes it detects problems unrelated to the initial issue (such as injuries from a car crash) which could themselves cause serious harm if missed. The program has already saved patient lives.

* Early sepsis detection/treatment: Geisinger uses the UDA to bring all of a sepsis patient’s information into one place as the patient travels through the hospital. The system alerts providers to real-time physiologic data in patients with life-threatening septic shock, and also tracks when antibiotics are prescribed and administered. Ninety percent of providers who use this tool consistently adhere to sepsis treatment protocols, as opposed to 40% of those who don’t. (A simple rule-based sketch of this kind of alerting follows this list.)

* Surgery costs/outcomes: The Geisinger UDA tracks and integrates surgical supply-chain data, plus clinical data by surgery type and provider, offering a comprehensive view of performance by provider and surgery type. Beyond performance insight, this approach has also generated insights about supply use patterns which allow the health system to negotiate better vendor deals.
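
The article doesn’t describe Geisinger’s alerting logic, but rule-based sepsis screens are often built on the classic SIRS criteria. A minimal sketch, assuming a simple two-of-four rule plus suspected infection (an illustration, not Geisinger’s algorithm):

```python
# A toy rule-based sepsis screen using the standard SIRS criteria.
# Real-time systems like the one described above are far richer.
from dataclasses import dataclass

@dataclass
class Vitals:
    temp_c: float
    heart_rate: int        # beats/min
    resp_rate: int         # breaths/min
    wbc_k_per_ul: float    # white blood cells, thousands per microliter

def sirs_criteria_met(v: Vitals) -> int:
    """Count how many of the four SIRS criteria the current vitals meet."""
    return sum([
        v.temp_c > 38.0 or v.temp_c < 36.0,
        v.heart_rate > 90,
        v.resp_rate > 20,
        v.wbc_k_per_ul > 12.0 or v.wbc_k_per_ul < 4.0,
    ])

def sepsis_alert(v: Vitals, suspected_infection: bool) -> bool:
    """Alert when two or more SIRS criteria coincide with suspected infection."""
    return suspected_infection and sirs_criteria_met(v) >= 2
```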

To me, one of the most interesting things about this story is that while Geisinger is at a relatively early stage of its big data efforts, it has already managed to generate meaningful benefits. My guess is that its early successes are due more to smart planning, including worthwhile goals from day one of the rollout, than to the technology per se. Regardless, let’s hope other hospital big data projects fare as well. (Meanwhile, for a look at another interesting hospital big data project, check out this story.)

ACO-Affiliated Hospitals May Be Ahead On Strategic Health IT Use

Posted on December 26, 2016 | Written By Anne Zieger

Over the past several years I’ve been struck by how seldom ACOs seem to achieve the objectives they’re built to meet – particularly cost savings and quality improvement goals – even when the organizations involved are pretty sophisticated.

For example, the results generated by the Medicare Shared Savings Program and Pioneer ACO Model have been inconsistent at best, with just 31% of participants getting a savings bonus for 2015, despite the fact that the “Pioneers” were chosen for their savvy and willingness to take on risk.

Some observers suggested this would change as hospitals and ACOs found better health IT solutions, but I’ve always been somewhat skeptical. I’m no fan of the results we got when capitation was all the rage, and to me current models have always looked like tarted-up capitation, whose fundamental flaws can’t be fixed by technology.

All that being said, a new journal article suggests that I may be wrong about the hopelessness of trying to engineer a workable value-based solution with health IT. The study, which was published in the American Journal of Managed Care, has concluded that if nothing else, ACO incentives are pushing hospitals to make more strategic HIT investments than they may have before.

To conduct the study, which compared health IT adoption in hospitals participating in ACOs with hospitals that weren’t ACO-affiliated, the authors gathered data from 2013 and 2014 surveys by the American Hospital Association. They focused on hospitals’ adherence to Stage 1 and Stage 2 Meaningful Use criteria, patient engagement-oriented health IT use and HIE participation.

When they compared 393 ACO hospitals and 810 non-ACO hospitals, the researchers found that a larger percentage of ACO hospitals were capable of meeting MU Stage 1 and Stage 2. They also noted that nearly 40% of ACO hospitals had patient engagement tech in place, as compared with 15.2% of non-ACO hospitals. Meanwhile, 49% of ACO hospitals were involved with HIEs, compared with 30.1% of non-ACO hospitals.

Bottom line, the authors concluded that ACO-based incentives are proving more effective than Meaningful Use at getting hospitals to adopt new and arguably more effective technologies. Fancy that! (Finding and implementing those solutions is still a huge challenge for ACOs, but that’s a story for another day.)

Of course, the authors seem to take it as a given that patient engagement tech and HIEs are strategic for more or less any hospital, an assumption they don’t do much to justify. Also, they don’t address how hospitals in and out of ACOs are pursuing population health or big data strategies, which seems like a big omission. This weakens their argument somewhat in my view. But the data is worth a look nonetheless.

I’m quite happy to see some evidence that ACO models can push hospitals to make good health IT investment decisions. After all, it’d be a bummer if hospitals had spent all of that time and money building them out for nothing.

Paris Hospitals Use Big Data To Predict Admissions

Posted on December 19, 2016 | Written By Anne Zieger

Here’s a fascinating story from Paris (or par-ee, if you’re a Francophile), courtesy of Forbes. The article details how a group of top hospitals there is running a trial of big data and machine learning tech designed to predict admission rates. The hospitals’ predictive model, which is being tested at four of the hospitals that make up the Assistance Publique-Hôpitaux de Paris (AP-HP), is designed to predict admission rates as much as 15 days in advance.

The four hospitals participating in the project have pulled together a massive trove of data from both internal and external sources, including 10 years’ worth of hospital admission records. The goal is to forecast admissions by the day and even by the hour for the four facilities participating in the test.

According to Forbes contributor Bernard Marr, the project involves using time series analysis techniques which can detect patterns in the data useful for predicting admission rates at different times.  The hospitals are also using machine learning to determine which algorithms are likely to make good predictions from old hospital data.
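
Neither Marr nor AP-HP has published the models themselves, but the flavor of the approach is easy to illustrate. The sketch below fits a seasonal-naive baseline (predict each day from its historical weekday average) to synthetic admission counts and scores a 15-day-ahead forecast; the pilot’s actual algorithms and its Trusted Analytics Platform pipeline are far more sophisticated:

```python
# Illustrative only: a seasonal-naive forecast of daily admissions, standing in
# for the much richer time-series models the AP-HP pilot uses. Data is synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
dates = pd.date_range("2006-01-01", "2015-12-31", freq="D")  # ~10 years of history
admissions = pd.Series(100 + 15 * (dates.dayofweek < 5)      # busier on weekdays
                       + rng.poisson(10, len(dates)),
                       index=dates, name="admissions")

train, test = admissions[:-15], admissions[-15:]  # hold out 15 days, as in the pilot

# Predict each held-out day from the historical mean for that weekday.
weekday_means = train.groupby(train.index.dayofweek).mean()
forecast = pd.Series(weekday_means.loc[test.index.dayofweek].to_numpy(),
                     index=test.index)

mae = (forecast - test).abs().mean()
print(f"15-day-ahead mean absolute error: {mae:.1f} admissions/day")
```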

The system the hospitals are using is built on the open source Trusted Analytics Platform. According to Marr, the partners felt that the platform offered a particularly strong capacity for ingesting and crunching large amounts of data. They also built on TAP because it was geared towards open, collaborative development environments.

The pilot system is accessible via a browser-based interface, designed to be simple enough that data science novices like doctors, nurses and hospital administration staff could use the tool to forecast visit and admission rates. Armed with this knowledge, hospital leaders can then pull in extra staffers when increased levels of traffic are expected.

Being able to work in a distributed environment will be key if AP-HP decides to roll the pilot out to all of its 44 hospitals, so developers built with that in mind. To prepare for a future which might call for adding a great deal of storage and processing power, they designed a distributed, cloud-based system.

“There are many analytical solutions for these type of problems, [but] none of them have been implemented in a distributed fashion,” said Kyle Ambert, an Intel data scientist and TAP contributor who spoke with Marr. “Because we’re interested in scalability, we wanted to make sure we could implement these well-understood algorithms in such a way that they work over distributed systems.”

To make this happen, however, Ambert and the development team had to build their own tools. That effort produced the first contribution to an open-source framework of code designed to carry out analysis over a scalable, distributed framework, one which is already being deployed in other healthcare environments, Marr reports.

My feeling is that there’s no reason American hospitals can’t experiment with this approach. In fact, maybe they already are. Readers, are you aware of any US facilities which are doing something similar? (Or are most still focused on “skinny” data?)

Easing The Transition To Big Data

Posted on December 16, 2016 | Written By Anne Zieger

Tapping the capabilities of big data has become increasingly important for healthcare organizations in recent years. But as HIT expert Adheet Gogate notes, the transition is not an easy one, forcing these organizations to migrate from legacy data management systems to new systems designed specifically for use with new types of data.

Gogate, who serves as vice president of consulting at Citius Tech, rightly points out that even when hospitals and health systems spend big bucks on new technology, they may not see any concrete benefits. But if they move through the big data rollout process correctly, their efforts are more likely to bear fruit, he suggests. And he offers four steps organizations can take to ease this transition. They include:

  • Have the right mindset:  Historically, many healthcare leaders came up through the business in environments where retrieving patient data was difficult and prone to delays, so their expectations may be low. But if they hope to lead successful big data efforts, they need to embrace the new data-rich environment, understand big data’s potential and ask insightful questions. This will help to create a data-oriented culture in their organization, Gogate writes.
  • Learn from other industries: Bear in mind that other industries have already grappled with big data models, and that many have seen significant successes already. Healthcare leaders should learn from these industries, which include civil aviation, retail and logistics, and consider adopting their approaches. In some cases, they might want to consider bringing an executive from one of these industries on board at a leadership level, Gogate suggests.
  • Employ the skills of data scientists: To tame the floods of data coming into their organizations, healthcare leaders should actively recruit data scientists, whose job is to translate business questions into the methods, approaches and processes for developing analytics that will answer them. Once they hire such scientists, leaders should make sure they have the active support of frontline staffers and operations leaders, so that the analyses they provide are useful to the team, Gogate recommends.
  • Think like a startup: It helps when leaders adopt an entrepreneurial mindset toward big data rollouts. These efforts should be led by senior leaders comfortable with this space, who treat the effort as an enterprise of its own and invest first in building critical mass in data science. Then, assign a group of core team members and frontline managers to the areas where analytics capabilities are most needed. Rotate these teams across the organization to wherever business problems reside, and let them generate valuable improvement insights. Over time, these insights will help the whole organization improve its big data capabilities, Gogate says.

Of course, taking an agile, entrepreneurial approach to big data will only work if it has widespread support, from the C-suite on down. Also, healthcare organizations will face some concrete barriers in building out big data capabilities, such as recruiting the right data scientists and identifying and paying for the right next-gen technology. Other issues include falling reimbursements and the need to personalize care, according to healthcare CIO David Chou.

But assuming these other challenges are met, embracing big data with a willing-to-learn attitude is more likely to work than treating it as just another development project. And the more you learn, the more successful you’ll be in the future.

Sutter Health Blends EHR, Patient-Reported Data For MS Treatment

Posted on December 5, 2016 | Written By Anne Zieger

The Sutter Health network is launching a new research project which will blend patient-reported and EHR-based data to improve the precision of multiple sclerosis treatment. Sutter will fund the project with a $1.2 million award from the California Initiative to Advance Precision Medicine.

To conduct the project, Sutter Health researchers are partnering with colleagues at the University of California, San Francisco. Working together, the team is developing a neurology application dubbed MS-SHARE which will be used by patients and doctors during appointments, and by patients between appointments.

During the 18-month demonstration project, the team will build the app with input from the health system’s doctors as well as MS patients. Throughout the process of care, the app will organize both patient-reported data and EHR data, in a manner intended to let doctors and patients view the data together and work together on care planning.

Over the short term, researchers and developers are focusing on outcomes like patient and doctor use of the app and an enhanced patient experience. The project’s big-picture goals, meanwhile, include improving patient outcomes such as disease progression and symptom control. Ultimately, the team hopes the results will go beyond supporting multiple sclerosis patients to help improve care for other neurological diseases such as Parkinson’s disease, seizure disorders and migraine headaches.

The Sacramento, Calif.-based health network pitches the project as potentially transformative. “MS-SHARE has the potential to change how doctors and patients spend their time during appointments,” the press release asserts. “Instead of ‘data finding and gathering,’ doctors and patients can devote more time to conversation about how the care is working and how it needs to be changed to meet patient needs.”

Time for an editorial aside here. As a patient with a neurological disorder (Parkinson’s), I’m here to say that while this sounds like an excellent start at collaborating with patients, at first glance it may be doomed to limited success at best.

What I mean is as follows. When I meet with the neurologist to discuss progression of my symptoms, he or she typically does little beyond the standard exam. In fact, my sense is that most seem quite satisfied that they know enough about my status to make decisions after doing that exam. In most cases, little or nothing about my functioning outside the office makes it into the chart.

What I’m trying to say is that, based on my experience, it will take more than a handy-dandy app to win neurologists over to collaborating on charts and data with patients. (Honestly, I think that’s true of many doctors outside this specialty, too.) And I’m not suggesting that this is because they’re arrogant, although some may be. Rather, I’m suggesting that it’s a workflow issue. Integrating patients into the discussion isn’t just a change of pace; it could be seen as a distraction that leads to worse care rather than better. It will be interesting to see if that’s how things turn out.

Hospital Program Uses Connected Health Monitoring To Admit Patients “To Home”

Posted on November 28, 2016 | Written By Anne Zieger

A Boston-based hospital has kicked off a program evaluating whether a mix of continuous connected patient monitoring and clinician visits can reduce hospitalizations for conditions that commonly lead to medical admissions.

The Home Hospital pilot, which will take place at Partners HealthCare Brigham and Women’s Hospital, is being led by David Levine, MD, MA, a physician who practices at the hospital. The hospital team is working with two vendors to implement the program, Vital Connect and physIQ. Vital Connect is supplying a biosensor that will continuously stream patient vital signs; those vital signs, in turn, will be analyzed and viewable through physIQ’s physiology analytics platform.

The Home Hospital pilot is one of two efforts planned by the team to analyze how technology can support home-based care for patients who might otherwise have been admitted to the hospital. For this initiative, a randomized controlled trial, patients diagnosed at the BWH Emergency Department with exacerbation of heart failure, pneumonia, COPD, cellulitis or complicated urinary tract infection are being cared for at home with the Vital Connect/physIQ solution and receiving daily clinician visits.
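
physIQ’s analytics are proprietary, so the following is only a toy sketch of the general idea behind continuous monitoring: learn a patient’s own baseline from the streamed signal and alert on sharp deviations. The window size and threshold are assumptions for illustration:

```python
# A toy illustration of baseline-drift detection on a streamed vital sign.
# The actual physiology analytics platform is far more sophisticated.
from collections import deque
from statistics import mean, stdev

class VitalSignMonitor:
    """Flag readings that deviate sharply from the patient's recent baseline."""

    def __init__(self, window: int = 120, threshold: float = 3.0):
        self.baseline = deque(maxlen=window)   # e.g., last 120 heart-rate samples
        self.threshold = threshold             # alert at |z-score| >= 3 (assumed)

    def observe(self, value: float) -> bool:
        alert = False
        if len(self.baseline) >= 30:           # wait until the baseline is stable
            mu, sigma = mean(self.baseline), stdev(self.baseline)
            if sigma > 0 and abs(value - mu) / sigma >= self.threshold:
                alert = True
        self.baseline.append(value)
        return alert

monitor = VitalSignMonitor()
for hr in [72, 74, 71, 73] * 10 + [118]:       # simulated heart-rate stream
    if monitor.observe(hr):
        print(f"Alert: heart rate {hr} deviates sharply from baseline")
```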

The primary aim of the program, according to participants, is to demonstrate that the proposed in-home model can provide appropriate care at lower cost, while improving outcome measures such as health-related quality of life, patient safety, care quality and overall patient experience.

According to a written statement, the first phase of the initiative began in September of this year and involves roughly 60 patients, half of whom are receiving traditional in-hospital care while the other half are being treated at home. If the early phase proves successful, the hospital will probably scale the pilot up to 500 patients in early 2017.

Expect to see more hospital-based connected care options like these emerge over the next year or two, as they’re just too promising to ignore at this point.

Perhaps the most advanced effort I’ve written about to date is the Chesterfield, Mo.-based Mercy Virtual Care Center, which describes itself as a “hospital without beds.” The $54M Virtual Care Center, which launched in October 2015, employs 330 staffers providing a variety of telehealth services, including virtual hospitalists, telestroke and, perhaps most relevant to this story, a “home monitoring” service which provides continuous monitoring for more than 3,800 patients.

My general impression is that few hospitals are ready to make the kind of commitment Mercy did, but most are curious, and some are quite interested in actively implementing connected care and monitoring as a significant part of their service line. My guess is that it won’t take many more successful tests to convince a wide swath of hospitals to get off the fence and join them.

Longitudinal Patient Record Needed To Advance Care?

Posted on November 23, 2016 | Written By Anne Zieger

In most day-to-day settings, a clinician needs only a small (if precisely focused) amount of data to make clinical decisions. In both ambulatory and acute settings, they rely on immediate and near-term information, some collected during the visit, plus a handful of historical factors likely to influence or even govern the appropriate plan of care.

That may be changing, though, according to Cheryl McKay of Orion Health. In a recent blog item, McKay argues that as the industry shifts from fee-for-service payment models to value-based reimbursement, we’ll need new types of medical records to support this model. Today, the longitudinal patient record and community care plan are emerging as substitutes for old EMR models, McKay says. These new entities will be built from varied data sources, including payer claims, provider EMRs, patient health devices and the patients themselves.

As these new forms of patient medical record emerge, effective population health management is becoming more feasible, she argues. Longitudinal patient records and community care plans are “essential as we steer away from FFS… The way records are delivered to healthcare providers – with an utter lack of visibility and a lot of noise from various data sources – creates unnecessary risks for everyone involved.”

She contends that putting these types of documentation in place, summarizing patient-based rather than episodic clinical experiences, closes big gaps in patient history which would otherwise generate mistakes. Longitudinal record-keeping also makes it easier for physicians to aggregate information, do predictive modeling and intervene proactively in patient care at both the patient and population level.

She also predicts that with both a longitudinal patient record and a community care plan in place, giving providers of all stripes a “panoramic” look at patients, costs will fall as providers stop performing needless tests and procedures. Not only that, these new entities would ideally offer real-time information as well, including event notifications, keeping all the providers involved in a patient’s care in sync.
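
Whatever the vendor, the mechanical core of a longitudinal record is simple: merge events from every source into one chronological view of the patient. A minimal, vendor-neutral sketch, with all field names and sample values invented for illustration:

```python
# A minimal sketch of assembling a longitudinal patient record: events from
# claims, EMR and home-device feeds merged into one time-ordered history.
# Every field name and sample value here is a hypothetical illustration.
from dataclasses import dataclass
from datetime import datetime
from typing import Iterable

@dataclass
class CareEvent:
    when: datetime
    source: str      # "claims", "emr", "device", "patient-reported"
    kind: str        # "dx", "rx", "lab", "vital", ...
    detail: str

def longitudinal_record(*feeds: Iterable[CareEvent]) -> list[CareEvent]:
    """Merge every source's events into one chronological patient history."""
    return sorted((e for feed in feeds for e in feed), key=lambda e: e.when)

claims = [CareEvent(datetime(2016, 3, 1), "claims", "dx", "I50.9 heart failure")]
emr = [CareEvent(datetime(2016, 6, 12), "emr", "lab", "BNP 900 pg/mL")]
device = [CareEvent(datetime(2016, 6, 20), "device", "vital", "weight +2.5 kg in 3 days")]

for event in longitudinal_record(claims, emr, device):
    print(event.when.date(), event.source, event.kind, event.detail)
```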

To be sure, this blog item is a pitch for Orion’s technology. While the notion of a community-care plan isn’t owned by anyone in particular, Orion is pitching a specific model which rides upon its population health technology. That being said, I’m betting most of us would agree that the idea (regardless of which vendor you work with) of establishing a community-wide care plan does make sense. And certainly, putting a rich longitudinal patient record in place could be valuable too.

However, given the sad state of interoperability today, I doubt it’s possible to build this model now unless you choose a single vendor-centric solution. At present, I think it’s more of a dream than a reality for most of us.

Health System Sees Big Dividends From Sharing Data

Posted on November 21, 2016 | Written By Anne Zieger

For some health organizations, the biggest obstacle to data sharing isn’t technical. Many a health IT pundit has argued — I think convincingly — that while health organizations understand the benefits of data sharing, they still see it as against their financial interests, as patients with access to data everywhere aren’t bound to them.

But recently, I read an intriguing story in Healthcare IT News about a major exception to the rule. The story laid out how one healthcare system has been sharing its data with community researchers in an effort to promote innovation. According to writer Mike Miliard, the project could proceed because the institution first eliminated many data silos, giving it a disciplined view of the data it shared.

At Sioux Falls, South Dakota-based Sanford Health, one health leader has departed from standard health system practices and shared a substantial amount of proprietary data with research organizations in his community, including certain clinical, claims, financial and operational data. Sanford is working with researchers at South Dakota State University on mathematics issues, University of South Dakota business researchers, Dakota State University on computer science/informatics and University of North Dakota on public health.

The effort is led by Benson Hsu, MD, vice president of enterprise data and analytics for the system. Hsu tells the magazine that the researchers have been developing analytical apps which are helping the health system with key issues like cost efficiencies, patient engagement and quality improvement. And more radically, Hsu plans to share what he discovers with competitors in the community.

Hsu laid the groundwork for the program, Healthcare IT News reports, by integrating far-flung data across the sprawling health system: multiple custom versions of the Epic EHR, multiple financial accounts and a variety of HR systems; analytics silos cutting across areas from clinical decision support and IT reports to HR/health plan analytics; and data barriers that included a lack of common data terms, benchmarking tools and a common analytic calculator. After spending a year pulling these areas into a functioning analytics foundation, Sanford was ready to share data with outside entities.

At first, Hsu’s managers weren’t fond of the idea of sharing masses of clinical data with anyone, but he sold them on the idea. “It’s the right thing to do. More importantly, it’s the right thing to do for the community — and the community is going to recognize that Sanford health is here for the community,” he argued. “Secondly, it’s innovation. Innovation in our backyard, based on our population, our social determinants, our disparities.”

According to the publication, this “crowdsourced” approach to analytics has helped Sanford make progress with predicting risk, chronic disease management, diagnostic testing and technology utilization, among other things. And there’s no reason to think that the effort won’t keep generating progress.

Many institutions would have shot down an effort like this immediately, before it could accomplish results. But it seems that Sanford’s creative approach to big data and analytics is paying off. While it might not work everywhere, I’m betting there are many other institutions that could benefit from tapping the intellect of researchers in their community. After all, no matter how smart people are, some answers always lie outside your walls.