
Searching for Disruptive Healthcare Innovation in 2017

Posted on January 17, 2017 | Written By

Colin Hung is the co-founder of the #hcldr (healthcare leadership) tweetchat, one of the most popular and active healthcare social media communities on Twitter. Colin is a true believer in #HealthIT, social media and empowered patients. Colin speaks, tweets and blogs regularly about healthcare, technology, marketing and leadership. He currently leads the marketing efforts for @PatientPrompt, a Stericycle product. Colin’s Twitter handle is: @Colin_Hung

Disruptive Innovation has been the brass ring for technology companies ever since Clayton Christensen popularized the term in his seminal book The Innovator’s Dilemma in 1997. According to Christensen, disruptive innovation is:

“A process by which a product or service takes root initially in simple applications at the bottom of a market and then relentlessly moves up market, eventually displacing established competitors.”

Disruption is more likely to occur, therefore, when you have a well-established market with slow-moving large incumbents who are focused on incremental improvements rather than truly innovative offerings. Using this definition, healthcare has been ripe for disruption for a number of years. But where is the Airbnb/Uber/Google of healthcare?

On a recent #hcldr tweetchat we asked what disruptive healthcare technologies might emerge in 2017. By far the most popular response was Artificial Intelligence (AI) and Machine Learning.

Personally, I’m really excited about the potential of AI applied to diagnostics and decision support. There is just no way a single person can stay up to speed on all the latest clinical research while simultaneously remembering every symptom/diagnosis from the past. I believe that one day we will all be using AI assistance to guide our care – as commonly as we use GPS today to help navigate unknown roads.

Some #hcldr participants, however, were skeptical of AI.

While I don’t think @IBMWatson is on the same trajectory as Theranos, there is merit to being wary of “over-hype” when it comes to new technologies. When a shining star like Theranos falls, it can set an entire industry back and stifle innovation in an area that may warrant investment. Can you imagine seeking funding for a technology that uses small amounts of blood to detect diseases right now? Too much hype can prematurely kill innovation.

Other potentially disruptive technologies that were raised during the chat included: #telehealth, #wearables, patient-generated health data (#PGHD), combining #HealthIT with consumer services and #patientengagement.

The funniest and perhaps most thoughtful tweet came from @YinkaVidal, who warned us that innovations have a window of usefulness. What was once ground-breaking can be rendered junk by the next generation.

What do you believe will be the disruptive healthcare technology to emerge in 2017?

“Learning Health System” Pilot Cuts Care Costs While Improving Quality

Posted on January 11, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

As some of you will know, the ONC’s Shared Nationwide Interoperability Roadmap’s goal is to create a “nationwide learning health system.”  In this system, individuals, providers and organizations will freely share health information, but more importantly, will share that information in “closed loops” which allow for continuous learning and care improvement.

When I read about this model – which is backed by the Institute of Medicine — I thought it sounded interesting, but didn’t think it terribly practical. Recently, though, I stumbled upon an experiment which attempts to bring this approach to life. And it’s more than just unusual — it seems to be successful.

What I’m talking about is a pilot study, done by a team from Nationwide Children’s Hospital and The Ohio State University, which involved implementing a “local” learning health system. During the pilot, team members used EHR data to create personalized treatments for patients based on data from others with similar conditions and risk factors.

To date, building a learning health system has been very difficult indeed, largely because integrating EHRs between multiple hospital systems is very difficult. For that reason, researchers with the two organizations decided to implement a “local” learning health system, according to a press statement from Nationwide Children’s.

To build the local learning health system, the team from Nationwide Children’s and Ohio State optimized the EHR to support their efforts. They also relied on a “robust” care coordination system which sat at the core of the EHR. The pilot subjects were a group of 131 children treated through the hospital’s cerebral palsy program.

Children treated in the 12-month program, named “Learn From Every Patient,” experienced a 43% reduction in total inpatient days, a 27% reduction in inpatient admissions, a 30% reduction in emergency department visits and a 29% reduction in urgent care visits.

The two institutions spent $225,000 to implement the pilot during the first year. However, the return on this investment was dramatic.  Researchers concluded that the program cut healthcare costs by $1.36 million. This represented a savings of about $6 for each dollar invested.
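The arithmetic behind that figure is straightforward, using the numbers reported above:

```python
# Quick check of the reported return on investment, using the
# figures from the Nationwide Children's pilot.
pilot_cost = 225_000           # first-year implementation cost
estimated_savings = 1_360_000  # researchers' estimated reduction in care costs

roi_per_dollar = estimated_savings / pilot_cost
print(f"Savings per dollar invested: ${roi_per_dollar:.2f}")  # ≈ $6.04
```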

An added benefit of the program was that clinicians working in the CP clinic found that this approach to care simplified documentation, which saved time and made it possible for them to see more patients during each session.

Not surprisingly, the research team thinks this approach has a lot of potential. “This method has the potential to be an effective complementary or alternative strategy to the top-down approach of learning health systems,” the release said. In other words, maybe bottom-up, incremental efforts are worth a try.

Given these results, it’d be nice to think that we’ll have full interoperability someday, and that we’ll be able to scale up the learning health system approach to the whole US. In the meantime, it’s good to see at least a single health system make some headway with it.

Some Projections For 2017 Hospital IT Spending

Posted on January 4, 2017 | Written By

Anne Zieger

A couple of months ago, HIMSS released some statistics from its survey on US hospitals’ plans for IT investment over the next 12 months. The results contain a couple of data points that I found particularly interesting:

  • While I had expected the most common type of planned spending to be focused on population health or related solutions, HIMSS found that pharmacy was the most active category. In fact, 51% of hospitals were planning to invest in at least one pharmacy technology, largely to improve tracking of medication dispensing in additional patient care environments. Researchers also found that 6% of hospitals were planning to add carousels or packagers in their pharmacies.
  • Eight percent of hospitals said that they plan to invest in EMR components, which I hadn’t anticipated (though it makes sense in retrospect). HIMSS reported that 14% of hospitals at Stage 1-4 of its Electronic Medical Record Adoption Model are investing in pharmacy tech for closed-loop med administration, and 17% in auto-ID tech. Four percent of Stage 6 hospitals plan to support or expand information exchange capabilities. Meanwhile, 60% of Stage 7 hospitals are investing in hardware infrastructure “for the post-EMR world.”

Other data from the HIMSS report included news of new analytics and telecom plans:

  • Researchers say that recent mergers and acquisitions are triggering new investments around telephony. They found that 12% of hospitals with inpatient revenues between $25 million and $125 million – and 6% of hospitals with more than $500 million in inpatient revenues — are investing in VOIP and telemedicine. FWIW, I’m not sure how mergers and acquisitions would trigger telemedicine rollouts, as they’re already well underway at many hospitals — maybe these deals foster new thinking and innovation?
  • As readers know, hospitals are increasingly spending on analytics solutions to improve care and make use of big data. However (and this surprised me) only 8% of hospitals reported plans to buy at least one analytics technology. My guess is that this number is small because a) hospitals may not have collected their big data assets in easily-analyzed form yet and b) that they’re still hoping to make better use of their legacy analytics tools.

Looking at these stats as a whole, I get the sense that the hospitals surveyed are expecting to play catch-up and shore up their infrastructure next year, rather than sink big dollars into future-looking solutions.

Hospital leaders will no doubt invest soon in game-changing technologies, such as cutting-edge patient engagement and population health platforms, to prepare for the shift to value-based care. It’s inevitable.

But in the meantime it probably makes sense for them to focus on internal cost drivers like pharmacy departments, whose average annual inpatient drug spending shot up by more than 23% between 2013 and 2015. Without stanching that kind of bleeding, hospitals are unlikely to get as much value as they’d like from big-idea investments in the future.

A Look At Geisinger’s Big Data Efforts

Posted on December 28, 2016 | Written By

Anne Zieger

This week I got a look at a story appearing in a recent issue of Harvard Business Review which offers a description of Geisinger Health System’s recent big data initiatives. The ambitious project is designed not only to track and analyze patient outcomes, but also to visualize healthcare data across cohorts of patients and networks of providers and even correlate genomic sequences with clinical care. Particularly given that Geisinger has stayed on the cutting edge of HIT for many years, I think it’s worth a look.

As the article’s authors note, Geisinger rolled out a full-featured EMR in 1996, well ahead of most of its peers. Like many other health systems, Geisinger has struggled to aggregate and make use of data. That’s particularly the case because, as with other systems, the legacy analytics systems Geisinger still has in place can’t accommodate the growing flood of new data types emerging today.

Last year, Geisinger decided to create a new infrastructure which could bring this data together. It implemented a Unified Data Architecture (UDA) allowing it to integrate big data into its existing data analytics and management. According to the article, Geisinger’s UDA rollout is the largest practical application of point-of-care big data in the industry. Of particular note, Geisinger is crunching not only enterprise healthcare data (including HIE inputs, clinical departmental systems and patient satisfaction surveys) and consumer health tools (like smartphone apps) but even grocery store and loyalty program info.

Though all of its data hasn’t yet been moved to the UDA, Geisinger has already seen some big data successes, including:

* “Close the Loop” program: Using natural language processing, the UDA analyzes clinical and diagnostic imaging reports, including free text. Sometimes it detects incidental findings unrelated to the initial issue (such as injuries from a car crash) which can themselves cause serious harm. The program has already saved patient lives.

* Early sepsis detection/treatment: Geisinger uses the UDA to bring all sepsis-patient information into one place as patients travel through the hospital. The system alerts providers to real-time physiologic data in patients with life-threatening septic shock, as well as tracking when antibiotics are prescribed and administered. Ninety percent of providers who use this tool consistently adhere to sepsis treatment protocols, as opposed to 40% of those who don’t.

* Surgery costs/outcomes: The Geisinger UDA tracks and integrates surgical supply-chain data, plus clinical data by surgery type and provider, which offers a comprehensive view of performance by provider and surgery type.  In addition to offering performance insight, this approach has also helped generate insights about supply use patterns which allow the health system to negotiate better vendor deals.
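The sepsis alerting described above can be illustrated with a simple rule-based screen. This is a hypothetical sketch using SIRS-style criteria, not Geisinger’s actual algorithm, which the article doesn’t detail:

```python
# Illustrative only (not Geisinger's algorithm): flag patients whose
# vitals meet two or more SIRS-style criteria, a common sepsis screen.
def sirs_flags(temp_c, heart_rate, resp_rate, wbc_k):
    """Return the list of SIRS criteria met by one set of vitals."""
    flags = []
    if temp_c > 38.0 or temp_c < 36.0:
        flags.append("temperature")
    if heart_rate > 90:
        flags.append("heart_rate")
    if resp_rate > 20:
        flags.append("resp_rate")
    if wbc_k > 12.0 or wbc_k < 4.0:  # white-cell count, thousands/uL
        flags.append("wbc")
    return flags

def sepsis_alert(vitals):
    """Alert when two or more criteria are met (classic SIRS threshold)."""
    return len(sirs_flags(**vitals)) >= 2

print(sepsis_alert({"temp_c": 38.6, "heart_rate": 104,
                    "resp_rate": 18, "wbc_k": 9.5}))  # True
```

A production system would of course stream these vitals in real time and track antibiotic orders alongside, as the article describes.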

To me, one of the most interesting things about this story is that while Geisinger is at a relatively early stage of its big data efforts, it has already managed to generate meaningful benefits from its efforts. My guess is that its early successes are more due to smart planning – which includes worthwhile goals from day one of the rollout — than the technology per se. Regardless, let’s hope other hospital big data projects fare so well. (Meanwhile, for a look at another interesting hospital big data project, check out this story.)

Paris Hospitals Use Big Data To Predict Admissions

Posted on December 19, 2016 | Written By

Anne Zieger

Here’s a fascinating story from Paris (or par-ee, if you’re a Francophile), courtesy of Forbes. The article details how a group of top hospitals there are running a trial of big data and machine learning tech designed to predict admission rates. The predictive model, which is being tested at four of the hospitals that make up the Assistance Publique-Hôpitaux de Paris (AP-HP), is designed to predict admission rates as much as 15 days in advance.

The four hospitals participating in the project have pulled together a massive trove of data from both internal and external sources, including 10 years’ worth of hospital admission records. The goal is to forecast admissions by the day and even by the hour for the four facilities participating in the test.

According to Forbes contributor Bernard Marr, the project involves using time series analysis techniques which can detect patterns in the data useful for predicting admission rates at different times.  The hospitals are also using machine learning to determine which algorithms are likely to make good predictions from old hospital data.
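Marr’s article doesn’t publish the model itself, but the pattern-detection idea can be sketched with a minimal weekday-seasonality baseline. The data below is a toy series, not AP-HP’s:

```python
from collections import defaultdict
from datetime import date, timedelta

# Toy history of (date, admissions); a real system would ingest ten
# years of admission records. Here Tuesdays and Wednesdays run busier.
history = [(date(2016, 11, 1) + timedelta(days=i),
            80 + 25 * ((i % 7) in (0, 1)))
           for i in range(28)]

# Seasonal-mean baseline: predict each weekday's admissions as the
# historical average for that weekday.
totals, counts = defaultdict(int), defaultdict(int)
for day, n in history:
    totals[day.weekday()] += n
    counts[day.weekday()] += 1

def forecast(target_day):
    wd = target_day.weekday()
    return totals[wd] / counts[wd]

print(forecast(date(2016, 11, 29)))  # a Tuesday -> 105.0
```

Real time-series models layer trend, holidays and external signals on top of this kind of seasonal pattern, and machine learning is then used (as the article notes) to pick which algorithm predicts best.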

The system the hospitals are using is built on the open source Trusted Analytics Platform. According to Marr, the partners felt that the platform offered a particularly strong capacity for ingesting and crunching large amounts of data. They also built on TAP because it was geared towards open, collaborative development environments.

The pilot system is accessible via a browser-based interface, designed to be simple enough that data science novices like doctors, nurses and hospital administration staff could use the tool to forecast visit and admission rates. Armed with this knowledge, hospital leaders can then pull in extra staffers when increased levels of traffic are expected.

Being able to work in a distributed environment will be key if AP-HP decides to roll the pilot out to all of its 44 hospitals, so developers built with that in mind. To be prepared for a future which might call for adding a great deal of storage and processing power, they designed a distributed, cloud-based system.

“There are many analytical solutions for these type of problems, [but] none of them have been implemented in a distributed fashion,” said Kyle Ambert, an Intel data scientist and TAP contributor who spoke with Marr. “Because we’re interested in scalability, we wanted to make sure we could implement these well-understood algorithms in such a way that they work over distributed systems.”

To make this happen, however, Ambert and the development team had to build their own tools. That effort resulted in the first contribution to an open-source framework of code designed to carry out analysis over a scalable, distributed framework, one which is already being deployed in other healthcare environments, Marr reports.

My feeling is that there’s no reason American hospitals can’t experiment with this approach. In fact, maybe they already are. Readers, are you aware of any US facilities which are doing something similar? (Or are most still focused on “skinny” data?)

Easing The Transition To Big Data

Posted on December 16, 2016 | Written By

Anne Zieger

Tapping the capabilities of big data has become increasingly important for healthcare organizations in recent years. But as HIT expert Adheet Gogate notes, the transition is not an easy one, forcing these organizations to migrate from legacy data management systems to new systems designed specifically for use with new types of data.

Gogate, who serves as vice president of consulting at Citius Tech, rightly points out that even when hospitals and health systems spend big bucks on new technology, they may not see any concrete benefits. But if they move through the big data rollout process correctly, their efforts are more likely to bear fruit, he suggests. And he offers four steps organizations can take to ease this transition. They include:

  • Have the right mindset:  Historically, many healthcare leaders came up through the business in environments where retrieving patient data was difficult and prone to delays, so their expectations may be low. But if they hope to lead successful big data efforts, they need to embrace the new data-rich environment, understand big data’s potential and ask insightful questions. This will help to create a data-oriented culture in their organization, Gogate writes.
  • Learn from other industries: Bear in mind that other industries have already grappled with big data models, and that many have seen significant successes already. Healthcare leaders should learn from these industries, which include civil aviation, retail and logistics, and consider adopting their approaches. In some cases, they might want to consider bringing an executive from one of these industries on board at a leadership level, Gogate suggests.
  • Employ the skills of data scientists: To tame the floods of data coming into their organization, healthcare leaders should actively recruit data scientists, whose job is to translate business questions into the methods, approaches and processes for developing analytics that will answer them. Once they hire such scientists, leaders should be sure that they have the active support of frontline staffers and operations leaders to make sure the analyses they provide are useful to the team, Gogate recommends.
  • Think like a startup: It helps when leaders adopt an entrepreneurial mindset toward big data rollouts. These efforts should be led by senior leaders comfortable with this space, who run the effort like an enterprise of its own and invest first in building critical mass in data science. Then, assign a group of core team members and frontline managers to areas where analytics capabilities are most needed. Rotate these teams across the organization to wherever business problems reside, and let them generate valuable improvement insights. Over time, these insights will help the whole organization improve its big data capabilities, Gogate says.

Of course, taking an agile, entrepreneurial approach to big data will only work if it has widespread support, from the C-suite on down. Also, healthcare organizations will face some concrete barriers in building out big data capabilities, such as recruiting the right data scientists and identifying and paying for the right next-gen technology. Other issues include falling reimbursements and the need to personalize care, according to healthcare CIO David Chou.

But assuming these other challenges are met, embracing big data with a willing-to-learn attitude is more likely to work than treating it as just another development project. And the more you learn, the more successful you’ll be in the future.

Using NLP with Machine Learning for Predictive Analytics in Healthcare

Posted on December 12, 2016 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

There are a lot of elements involved in doing predictive analytics in healthcare effectively. In most cases I’ve seen, organizations working on predictive analytics do some but not all that’s needed to really make predictive analytics as effective as possible. This was highlighted to me when I recently talked with Frank Stearns, Executive Vice President from HBI Solutions at the Digital Health Conference in NYC.

Here’s a great overview of the HBI Solutions approach to patient risk scores:

[Image: HBI Solutions healthcare predictive analytics model]

This process will look familiar to most people in the predictive analytics space. You take all the patient data you can find, put it into a machine learning engine and output a patient risk score. One of the biggest trends here is the real-time nature of the process. Plus, I also love the way the patient risk score includes the attributes that influenced a patient’s risk score. Both of these are incredibly important when trying to make this data actionable.
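As a rough sketch of that flow, a logistic-style model can report both the score and the attributes driving it. The weights and feature names below are invented for illustration and are not HBI Solutions’ model:

```python
import math

# Hypothetical weights, for illustration only: a logistic risk model
# whose per-feature contributions explain each patient's score.
WEIGHTS = {"age_over_65": 0.9, "prior_admissions": 0.7,
           "diabetes": 0.5, "smoker": 0.4}
BIAS = -2.0

def risk_score(patient):
    # Per-feature contribution = weight * feature value.
    contributions = {f: w * patient.get(f, 0) for f, w in WEIGHTS.items()}
    logit = BIAS + sum(contributions.values())
    score = 1 / (1 + math.exp(-logit))  # squash to a 0..1 risk score
    # Rank features by how much they pushed the score up.
    top = sorted(contributions, key=contributions.get, reverse=True)
    return round(score, 3), top[:2]     # score plus its top drivers

print(risk_score({"age_over_65": 1, "prior_admissions": 2, "diabetes": 1}))
# -> (0.69, ['prior_admissions', 'age_over_65'])
```

Returning the top drivers alongside the score is what makes the output actionable: a care team can see not just that a patient is high-risk, but why.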

However, the thing that stood out for me in HBI Solutions’ approach is the inclusion of natural language processing (NLP) in their analysis of the unstructured patient data. I’d seen NLP being used in EHR software before, but I think the implementation of NLP is even more powerful in doing predictive analytics.

In the EHR world, you have to be absolutely precise. If you’re not precise with the way you code a visit, you won’t get paid. If you’re not precise with how the diagnosis is entered into the EHR, that can have long term consequences. This has posed a real challenge for NLP since NLP is not 100% accurate. It’s gotten astoundingly good, but still has its shortcomings that require a human review when utilizing it in an EHR.

The same isn’t true when applying NLP to unstructured data for predictive analytics. Predictive analytics by its very nature incorporates some modicum of variation and error. It’s understood that a prediction could be wrong; it is an indication of risk. Certainly a failure in NLP’s recognition of certain data could throw off a prediction. That’s unfortunate, but predictive analytics aren’t relied on the way documentation in an EHR is relied upon. So, it’s not nearly as big of a deal.

Plus, the value gained from applying NLP to pull out the nuggets of information that exist in the unstructured narrative sections of healthcare data is well worth that small risk of the NLP being incorrect. As Frank Stearns from HBI Solutions pointed out to me, the unstructured data is often where the really valuable data about a patient’s risk exists.

I’d be interested in having HBI Solutions do a study of the whole list of findings that are often available in the unstructured data and weren’t available otherwise. However, it’s not hard to imagine a doctor documenting patient observations in the unstructured EHR narrative that they didn’t want to include as a formal diagnosis. Not the least of these are behavioral health observations that the doctor observed and documented but didn’t want to fully diagnose. NLP can pull these out of the narrative and include them in the patient risk score.
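A toy sketch of that kind of extraction, far simpler than a real clinical NLP engine, with invented terms and a crude negation rule:

```python
import re

# Illustrative only: pull behavioral-health mentions out of a free-text
# note, skipping mentions that are negated ("denies depression").
TERMS = ["anxiety", "depression", "insomnia"]

def extract_mentions(note):
    found = []
    for sentence in re.split(r"[.!?]", note.lower()):
        for term in TERMS:
            negated = re.search(r"\b(no|denies|without)\b[^.]*" + term,
                                sentence)
            if term in sentence and not negated:
                found.append(term)
    return found

note = ("Patient reports ongoing insomnia and anxiety. "
        "Denies depression.")
print(extract_mentions(note))  # ['anxiety', 'insomnia']
```

Production engines handle negation scope, abbreviations and misspellings far more robustly, which is exactly the "not 100% accurate" territory discussed above.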

Given this perspective, it’s hard to imagine we’ll ever be able to get away from using NLP or related technology to pull out the valuable insights in the unstructured data. Plus, it’s easy to see how predictive analytics that don’t use NLP are going to be deficient when trying to use machine learning to analyze patients. What’s amazing is that HBI Solutions has been applying machine learning to healthcare for 5 years. That’s a long time, and it also explains why they’ve implemented such advanced solutions as NLP in their predictive analytics offerings.

Population Health 101: The One Where It All Starts

Posted on December 7, 2016 | Written By

The following is a guest blog post by Abhinav Shashank, CEO & Co-founder of Innovaccer.
[Image: Population Health 101]
Former US President Abraham Lincoln once said, “Give me six hours to chop down a tree and I’ll spend four hours sharpening the ax.” After having a look at the efficiency of the US healthcare system, one cannot help but notice the irony. A country spending $10,345 per person on healthcare shouldn’t sit in last place in the OECD rankings for life expectancy at birth!

Increasing Troubles
A report from the Commonwealth Fund points out how massive the US health care budget is. Successive US governments have left no stone unturned in becoming the highest spender on healthcare, yet have managed to see much of that money go down the drain!

Here are some highlights from the report:

  1. The US is 3rd in public spending on health care, at $4,197 per capita, yet that spending covers only 34% of its residents. The UK, on the other hand, spends only $2,802 per capita and covers 100% of its population!
  2. At $1,074 per capita, the US has the 2nd-highest private spending on healthcare.
  3. In 2013, the US allotted 17.1% of its GDP to healthcare, the highest of any OECD country. In dollar terms, this was almost 50% more than the country in the 2nd spot.
  4. In 2013, the number of practicing physicians in the US was 2.6 per 1,000 persons, less than the OECD median (3.2).
  5. The infant mortality rate in the US was also higher than in other OECD nations.
  6. 68 percent of the US population above 65 suffers from two or more chronic conditions, again the highest among OECD nations.

A major cause of these problems is a lack of knowledge about population trends. Strategies will only work as intended if they are designed according to the needs of the people.

[Image: population health trends]

What is Population Health Management?
Population health management (PHM) may have been mentioned in the ACA (2010), but its meaning is lost on many. I feel the definition of population health given by Richard J. Gilfillan, President and CEO of Trinity Health, is the most suitable one:

Population health refers to addressing the health status of a defined population. A population can be defined in many different ways, including demographics, clinical diagnoses, geographic location, etc. Population health management is a clinical discipline that develops, implements and continually refines operational activities that improve the measures of health status for defined populations.

The true realization of Population Health Management (PHM) is a care delivery model which provides quality, coordinated care in an efficient manner. Efforts in the right direction are being made, but the tools required are advanced, and most providers lack the resources to own them.

Countless Possibilities
With Population Health Management in place, technology can be leveraged to find proactive solutions to acute episodes. Based on past episodes and outcomes, better decisions can be made.

The concept of health coaches and care managers can actually be implemented. When a patient is discharged, care managers can confirm compliance with the care plan. They can mitigate the possibility of readmission by keeping up with patients’ needs and appointments. Patients can be reminded about their medications, and the linked health coaches can be notified to further reduce the possibility of readmission.

Consider diabetes, for instance. Diabetes is often hereditary, and preventive measures like patient engagement play an important role in mitigating risk. Remote glucometers can be useful in keeping a check on patients’ sugar levels at home. They can also send alerts to health coaches so the at-risk population can be engaged in near real time.
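A minimal sketch of that kind of remote alerting. The thresholds below are illustrative; in practice they would be set clinically and per patient:

```python
# Hypothetical glucose alert bands (mg/dL), for illustration only.
LOW, HIGH = 70, 180

def glucose_alert(reading_mg_dl):
    """Map one home glucometer reading to an action for the care team."""
    if reading_mg_dl < LOW:
        return "alert: hypoglycemia - notify health coach"
    if reading_mg_dl > HIGH:
        return "alert: hyperglycemia - notify health coach"
    return "ok"

print(glucose_alert(210))  # alert: hyperglycemia - notify health coach
```

The point is the loop, not the rule: a connected device feeds readings in, and the care team is engaged before an acute episode becomes an admission.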

Population Health Management not only keeps track of population trends but also reduces the cost of quality care. Timely engagement of the at-risk population reduces the possibility of extra expenditure in the future, and it reduces readmission rates. The whole point of population health management is to be able to offer cost-effective, quality care.

The best thing to do with the past is to learn from it. If providers implement Population Health Management the way it is meant to be, the healthcare system will be far better and more patient-centric.

Success Story
A Virginia-based collaborative started a health information-based project in mid-2010. Since then, 11 practices have been successful in earning recognition from the NCQA (National Committee for Quality Assurance). The implemented technologies have had a profound impact on the organizations’ performance.

  1. For medical home patients, the 30-day readmission rate is below 2%.
  2. Patient engagement scores are at the 97th percentile.
  3. With the help of the patient outreach program, almost 40,000 patients have been visited as part of preventive measures.

All this has increased revenue by $7 million.

Barriers in the journey of Population Health Management
Currently, population health management faces a lot of challenges. Internal management and leadership quality have to be top-notch so that interests remain aligned. After all, Population Health Management is all about team effort.

The current reimbursement model is also a concern: it has been carried forward from the 1950s and is now obsolete. Fee-for-service is anything but cost-effective.

Patient-centric care is the heart of Population Health Management, and the transition to it brings us to the biggest challenge and opportunity: data. There is a lot of unstructured data, and true HIE can be achieved only if data are made available in a proper format, one that doesn't require tiring effort from providers to get patient information. Providers should be able to gain access to health data in seconds.

The Road Ahead
We believe the basic requirement for Population Health Management is patient data. Everything related to a patient is golden: outcome reports, age, and the conditions in which the patient was born, lives and works. To accurately determine the cost of care, activity-based costing could come in handy.
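Activity-based costing simply means totaling the cost of each discrete activity a patient actually receives, rather than relying on averaged charges. A minimal sketch, with entirely made-up activities and rates:

```python
# Minimal activity-based costing sketch: total a patient's cost of care
# by pricing each recorded activity individually. The activity names and
# rates below are illustrative assumptions, not real billing data.

ACTIVITY_RATES = {
    "office_visit": 120.0,           # hypothetical cost per activity, USD
    "lab_panel": 45.0,
    "care_coordination_call": 15.0,
}

def cost_of_care(activities):
    """Sum the cost of a list of (activity_name, count) pairs."""
    return sum(ACTIVITY_RATES[name] * count for name, count in activities)

# e.g. two office visits, one lab panel and three coaching calls
total = cost_of_care([("office_visit", 2), ("lab_panel", 1),
                      ("care_coordination_call", 3)])  # 330.0
```

The value of this approach for population health is that per-member costs can be tracked and compared at any point in time, which is exactly what fee-for-service billing records make difficult.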

Today's EMRs aren't capable of addressing population health. Even the most basic model of population health management demands engagement on a per-member basis, with the ability to track and report the cost of care at any point. EMRs haven't been designed that way; they focus on the fee-for-service model.

In recent years, there has been an increased focus on population health management. Advances in software have been prominent, and software accounts for the lion's share of expenditure on population health. I think this could be credited to the Affordable Care Act of 2010, which pushed providers toward population health management solutions.

Today, the Population Health Management market is worth $14 billion, and according to a report by Tractica it will reach $31.8 billion within five years. This is a good sign, because it shows that the focus is on value-based care. There is no doubt we have miles to go, but at least we are now on the right path!

Hospital Program Uses Connected Health Monitoring To Admit Patients “To Home”

Posted on November 28, 2016 I Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

A Boston-based hospital has kicked off a program that will evaluate whether a mix of continuous connected patient monitoring and clinician visits can reduce hospitalizations for common medical admissions.

The Home Hospital pilot, which will take place at Partners HealthCare Brigham and Women’s Hospital, is being led by David Levine, MD, MA, a physician who practices at the hospital. The hospital team is working with two vendors to implement the program, Vital Connect and physIQ. Vital Connect is supplying a biosensor that will continuously stream patient vital signs; those vital signs, in turn, will be analyzed and viewable through physIQ’s physiology analytics platform.
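To give a sense of what continuous analysis of streamed vitals involves, here is a deliberately simplified sketch. This is not physIQ's actual analytics (which personalize to each patient's physiology); it just flags a reading that drifts well outside the patient's own recent baseline:

```python
# Illustrative sketch only -- not physIQ's actual platform. Flags a
# streamed vital-sign reading when it deviates beyond a fractional
# band around the patient's own rolling baseline.

from collections import deque

class VitalMonitor:
    def __init__(self, window=10, tolerance=0.2):
        self.readings = deque(maxlen=window)  # most recent readings only
        self.tolerance = tolerance            # allowed fractional drift

    def observe(self, value):
        """Record a reading; return True if it deviates from baseline."""
        if len(self.readings) == self.readings.maxlen:
            baseline = sum(self.readings) / len(self.readings)
            deviates = abs(value - baseline) > self.tolerance * baseline
        else:
            deviates = False  # still establishing a baseline
        self.readings.append(value)
        return deviates
```

A real system would do far more (multi-signal fusion, personalized models, clinician-facing dashboards), but the core loop is the same: sensor data streams in continuously, and the analytics layer decides what deserves a clinician's attention.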

The Home Hospital pilot is one of two efforts planned by the team to analyze how technology in home-based care can treat patients who might otherwise have been admitted to the hospital. For this initiative, a randomized controlled trial, patients diagnosed at the BWH Emergency Department with exacerbation of heart failure, pneumonia, COPD, cellulitis or complicated urinary tract infection are being placed at home with the Vital Connect/physIQ solution and receive daily clinician visits.

The primary aim of this program, according to participants, is to demonstrate that the in-home model they've proposed can provide appropriate care at a lower cost, as well as improve outcome measures such as health-related quality of life, patient safety and quality, and overall patient experience.

According to a written statement, the first phase of the initiative, which began in September of this year, involves roughly 60 patients, half of whom are receiving traditional in-hospital care while the other half are being treated at home. If the early phase proves successful, the hospital will probably scale the pilot up to 500 patients in early 2017.

Expect to see more hospital-based connected care options like these emerge over the next year or two, as they’re just too promising to ignore at this point.

Perhaps the most advanced effort I've written about to date is the Chesterfield, Mo.-based Mercy Virtual Care Center, which describes itself as a “hospital without beds.” The $54M Virtual Care Center, which launched in October 2015, employs 330 staffers providing a variety of telehealth services, including virtual hospitalists, telestroke and, perhaps most relevant to this story, a “home monitoring” service that provides continuous monitoring for more than 3,800 patients.

My general impression is that few hospitals are ready to make the kind of commitment Mercy did, but that most are curious and some quite interested in actively implementing connected care and monitoring as a significant part of their service line. It's my guess that it won't take many more successful tests to convince a wide swath of hospitals to get off the fence and join them.

Longitudinal Patient Record Needed To Advance Care?

Posted on November 23, 2016 I Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

In most day-to-day settings, a clinician needs only a small (if precisely focused) amount of data to make clinical decisions. In both ambulatory and acute settings, they rely on immediate and near-term information, some collected during the visit, and a handful of historical factors likely to influence or even govern the appropriate plan of care.

That may be changing, though, according to Cheryl McKay of Orion Health. In a recent blog item, McKay argues that as the industry shifts from fee-for-service payment models to value-based reimbursement, we'll need new types of medical records to support this model. Today, the longitudinal patient record and community care plan are emerging as substitutes for old EMR models, McKay says. These new entities will be built from varied data sources including payer claims, provider EMRs, patient health devices and the patients themselves.
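Conceptually, assembling a longitudinal record is a merge of time-stamped events from heterogeneous sources into a single chronological patient timeline. A toy sketch (the field names and sample events are my own illustration, not Orion's schema):

```python
# Toy sketch of a longitudinal patient record: merge time-stamped
# events from several sources (claims, EMR, devices) into one ordered
# timeline. Field names here are illustrative assumptions only.

def build_longitudinal_record(*sources):
    """Each source is a list of event dicts with an ISO-8601 'time' key.
    Returns all events merged and sorted chronologically."""
    merged = [event for source in sources for event in source]
    return sorted(merged, key=lambda e: e["time"])

claims = [{"time": "2016-03-01", "source": "payer", "event": "claim: ER visit"}]
emr = [{"time": "2016-02-10", "source": "EMR", "event": "dx: heart failure"}]
device = [{"time": "2016-03-02", "source": "device", "event": "weight +2 kg"}]

timeline = build_longitudinal_record(claims, emr, device)
# earliest entry is the EMR diagnosis from 2016-02-10
```

The hard part in practice is not the merge itself but normalizing the varied source formats and resolving patient identity across systems, which is exactly where today's interoperability gaps bite.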

As these new forms of patient medical record emerge, effective population health management is becoming more feasible, she argues. Longitudinal patient records and community care plans are “essential as we steer away from FFS…The way records are delivered to healthcare providers– with an utter lack of visibility and a lot of noise from various data sources– creates unnecessary risks for everyone involved.”

She contends that putting these types of documentation in place, which summarize patient-based clinical experiences rather than episodic ones, closes big gaps in patient history that would otherwise generate mistakes. Longitudinal record-keeping also makes it easier for physicians to aggregate information, do predictive modeling and intervene proactively in care at both the patient and population level.

She also predicts that with both a longitudinal patient record and a community care plan in place, giving providers of all stripes a “panoramic” look at patients, costs will fall as providers stop performing needless tests and procedures. Not only that, these new entities would ideally offer real-time information as well, including event notifications, keeping all the providers involved in a patient's care in sync.

To be sure, this blog item is a pitch for Orion’s technology. While the notion of a community-care plan isn’t owned by anyone in particular, Orion is pitching a specific model which rides upon its population health technology. That being said, I’m betting most of us would agree that the idea (regardless of which vendor you work with) of establishing a community-wide care plan does make sense. And certainly, putting a rich longitudinal patient record in place could be valuable too.

However, given the sad state of interoperability today, I doubt it's possible to build this model unless you choose a single vendor-centric solution. At present, I think it's more of a dream than a reality for most of us.