
The Distributed Hospital On The Horizon

Posted on February 24, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

If you’re reading this blog, you already know that distributed, connected devices and networks are the future of healthcare. Connected monitoring devices are growing more mature by the day, network architectures are becoming amazingly fluid, and with the growth of the IoT, we’re adding huge numbers of smart devices to an already-diverse array of endpoints. While we may not know what all of this will look like when it’s fully mature, we’ve already made amazing progress in connecting care.

But how will these trends play out? One nice look at where all this is headed comes from Jeroen Tas, chief innovation and strategy officer at Philips. In a recent article, Tas describes a world in which even major brick-and-mortar players like hospitals go almost completely virtual.  Certainly, there are other takes out there on this subject, but I really like how Tas explains things.

He starts with the assertion that the hospital of the future “is not a physical location with waiting rooms, beds and labs.” Instead, a hospital will become an abstract network overlay connecting nodes. It’s worth noting that this isn’t just a concept. For an example, Tas points to the Mercy Virtual Care Center, a $54 million “hospital without beds” dedicated to telehealth and connected care.  The Center, which has over 300 employees, cares for patients at home and in beds across 38 hospitals in seven states.

While the virtual hospital may not rely on a single, central campus, physical care locations will still matter – they’ll just be distributed differently. According to Tas, the connected health network will work best if care is provided as needed through retail-type outlets near where people live, specialist hubs, inpatient facilities and outpatient clinics. Yes, of course, we already have all of these things in place, but in the new connected world, they’ll all be on a single network.

Ultimately, even if brick-and-mortar hospitals never disappear, virtual care should make it possible to cut down dramatically on hospital admissions, he suggests.  For example, Tas notes that Philips partner Banner Health has slashed hospital admissions almost 50% by using telehealth and advanced analytics for patients with multiple chronic conditions. (We’ve also reported on a related pilot by Partners HealthCare Brigham and Women’s Hospital, the “Home Hospital,” which sends patients home with remote monitoring devices as an alternative to admissions.)

Of course, the broad connected care outline Tas offers can only take us so far. It’s all well and good to have a vision, but there are still some major problems we’ll have to solve before connected care becomes practical as a backbone for healthcare delivery.

After all, to cite one major challenge, community-wide connected health won’t be very practical until interoperable data sharing becomes easier – and we really don’t know when that will happen. Also, until big data analytics tools are widely accessible (rather than the province of the biggest, best-funded institutions) it will be hard for providers to manage the data generated by millions of virtual care endpoints.

Still, if Tas’s piece is any indication, consensus is building on what next-gen care networks can and should be, and there’s certainly plenty of ways to lay the groundwork for the future. Even small-scale, preliminary connected health efforts seem to be fostering meaningful changes in how care is delivered. And there’s little doubt that over time, connected health will turn many brick-and-mortar care models on their heads, becoming a large – or even dominant – part of care delivery.

Getting there may be tricky, but if providers keep working at connected care, it should offer an immense payoff.

Indiana Health System Takes On Infection Control With Predictive Analytics

Posted on February 22, 2017 | Written By


Indiana University Health, a 15-hospital non-profit health system, has taken aim at reducing its rate of central line-associated bloodstream infections – better known to infection control specialists as CLABSIs.

According to the CDC, CLABSIs are preventable, but at present they still result in thousands of deaths each year and add billions of dollars to U.S. healthcare spending. According to CDC data, patient mortality rates related to CLABSI range from 12% to 25%, and the infections cost $3,700 to $36,000 per episode.

Hospitals have been grappling with this problem for a long time, but now technology may offer preventive options. To cut its rate of CLABSIs, IU Health has decided to use predictive analytics in addition to traditional prevention strategies, according to an article in the AHA’s Hospitals & Health Systems magazine.

Reducing the level of hospital-acquired infections suffered by your patients always makes sense, but IU Health arguably has additional incentives to do so. The decision to attack CLABSIs comes as IU Health takes on a strategic initiative likely to demand a close watch on such metrics. At the beginning of January, Indiana University Health kicked off its participation in the CMS Next Generation Accountable Care Organization Model, putting its ACO in the national spotlight as a potential model for improving fee-for-service Medicare.

According to H&HN, IU Health has launched its predictive analytics pilot for CLABSI prevention at its University Hospital location, which includes a 600-bed Level I trauma center and a 300-bed tertiary care center that also serves as one of the 10 largest transplant centers in the U.S.

Executives there told the magazine that the predictive analytics effort was an outgrowth of its long-term EMR development effort, which has pushed them to streamline data flow across platforms and locations over the past several years.

Prior to the predictive analytics effort, the hospital’s existing technology did include an e-surveillance program for hospital-acquired infections, but even using the full powers of the EMR and the e-surveillance solution together, the hospitals could only monitor for CLABSIs that had already been diagnosed.

This retrospective approach succeeded in cutting IU Health’s CLABSI rate from 1.7 infections per 1,000 central-line days in 2015 to 1.2 last year. But IU Health hopes to improve the hospital’s results even further by getting ahead of the game.
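For readers unfamiliar with the metric, CLABSI rates like these are conventionally reported as infections per 1,000 central-line days. A minimal sketch of the calculation (the counts below are illustrative, not IU Health’s actual figures):

```python
def clabsi_rate(infections: int, central_line_days: int) -> float:
    """Infections per 1,000 central-line days, the standard CLABSI metric."""
    if central_line_days <= 0:
        raise ValueError("central_line_days must be positive")
    return infections / central_line_days * 1000

# e.g. 17 infections over 10,000 central-line days works out to a rate of 1.7
```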

Last year, the system implemented a data visualization platform designed to give providers a quick-and-easy look at data in real time. The platform lets managers keep track of many important variables easily, including whether hospital units have skipped any line maintenance activities or failed to follow-through on CLABSI bundles. It’s also saving time for nurse managers, who used to have to track data manually, and letting them check on patient trend line data at a glance.

The H&HN article doesn’t say whether the hospital has managed to cut its CLABSI rate any further, but it’s hard to imagine that predictive analytics wouldn’t deliver at least some additional results. Let’s wish IU Health luck in driving its CLABSI rate down further.

National Health Service Hospitals Use Data Integration Apps

Posted on February 20, 2017 | Written By


While many providers in the US are still struggling with selecting and deploying apps, UK National Health Service trusts are ready to use them to collect vital data.

According to New Scientist, the four National Health Services serving the United Kingdom are rolling out two apps that help patients monitor their health at home. Both apps, which are being tested at four hospitals in Oxfordshire, UK, focus on managing a single disease state.

One, called GDm-health, helps manage the treatment of gestational diabetes, which affects one in 10 pregnant women. Women use the app to send each of their blood glucose readings to the clinician monitoring their diabetes. The Oxford University Institute of Biomedical Engineering led development of the app, which has allowed patients to avoid needless in-person visits. In fact, the number of patient visits has dropped by 25%, the article notes.
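The article doesn’t spell out GDm-health’s clinical logic, but the core of remote glucose monitoring is comparing each submitted reading against a target for its context. A hedged sketch, with thresholds modeled loosely on UK (NICE-style) gestational diabetes targets rather than the app’s actual rules:

```python
def flag_glucose(reading_mmol_l: float, context: str) -> bool:
    """Flag a gestational-diabetes glucose reading for clinician review.

    The targets below are illustrative, in the spirit of UK (NICE-style)
    guidance -- the article doesn't specify GDm-health's actual thresholds.
    """
    targets = {
        "fasting": 5.3,              # mmol/L, upper target before breakfast
        "one_hour_post_meal": 7.8,   # mmol/L, upper target 1h after eating
    }
    if context not in targets:
        raise ValueError(f"unknown context: {context}")
    return reading_mmol_l > targets[context]
```

In a system like this, only flagged readings would need a clinician’s attention, which is consistent with the drop in in-person visits the article describes.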

The other app, which was also developed by the Institute, helps patients manage chronic obstructive pulmonary disease, which affects between 1 million and 1.5 million UK patients. COPD patients check their heart rate and blood oxygen saturation every day, entering each result into the app.

After collecting three months of measurements, the app “learns” to recognize what a normal oxygen saturation level is for that patient. Because it has data on what is normal for that patient, it will neither alert clinicians too often nor ignore potential problems. During initial use, the app, which has already been through a 12-month clinical trial, cut hospital admissions among this population by 17% and general practitioner visits by 40%.
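The article doesn’t describe the app’s model, but the general idea — learn each patient’s own normal over a training window, then alert only on meaningful deviation — can be sketched with a plain mean-and-standard-deviation baseline (the window length and alert threshold here are my assumptions, not the app’s):

```python
from statistics import mean, stdev

class Spo2Monitor:
    """Per-patient oxygen-saturation baseline, in the spirit of the COPD app.

    The real model is unpublished here; this sketch just learns a mean and
    standard deviation over a training window, then flags readings that fall
    well below the patient's own normal.
    """

    def __init__(self, training_days: int = 90, threshold_sds: float = 2.5):
        self.training_days = training_days
        self.threshold_sds = threshold_sds
        self.readings: list[float] = []

    def add_reading(self, spo2: float) -> bool:
        """Record a daily reading; return True if it should raise an alert."""
        if len(self.readings) < self.training_days:
            self.readings.append(spo2)  # still learning this patient's normal
            return False
        baseline = mean(self.readings)
        spread = stdev(self.readings)
        self.readings.append(spo2)
        return spo2 < baseline - self.threshold_sds * spread
```

Because the threshold is relative to the individual’s baseline, a patient whose normal saturation is low won’t trigger constant false alarms, which is exactly the trade-off the article describes.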

NHS leaders are also preparing to launch a third app soon. The technology, which is known as SEND, is an iPad app designed to collect information on hospital patients. As they make their rounds, nurses will use the app to input data on patients’ vital signs. The system then automatically produces an early warning score for each patient, and provides an alert if the patient’s health may be deteriorating.
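The piece doesn’t publish SEND’s scoring rules, but aggregate early warning scores generally work by assigning points to each vital sign based on how far it sits from a normal band, then summing. A sketch with illustrative, NEWS-style bands (not SEND’s actual chart):

```python
def early_warning_score(resp_rate: int, spo2: int, heart_rate: int,
                        systolic_bp: int, temp_c: float) -> int:
    """Aggregate an early-warning score from vital signs.

    The bands below are illustrative, loosely modeled on NEWS-style charts;
    SEND's actual scoring rules aren't described in the article.
    """
    def band(value, bands):
        # bands: list of (low, high, points); first matching band wins
        for low, high, points in bands:
            if low <= value <= high:
                return points
        return 3  # outside all listed ranges: worst per-sign score

    score = 0
    score += band(resp_rate, [(12, 20, 0), (9, 11, 1), (21, 24, 2)])
    score += band(spo2, [(96, 100, 0), (94, 95, 1), (92, 93, 2)])
    score += band(heart_rate, [(51, 90, 0), (41, 50, 1), (91, 110, 1),
                               (111, 130, 2)])
    score += band(systolic_bp, [(111, 219, 0), (101, 110, 1), (91, 100, 2)])
    score += band(temp_c, [(36.1, 38.0, 0), (35.1, 36.0, 1), (38.1, 39.0, 1)])
    return score

# a higher total flags possible deterioration for clinician review
```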

One might think that because UK healthcare is delivered by centralized Trusts, providers there don’t face data-sharing problems in integrating data from apps like these. But apparently, we would be wrong. According to Rury Holman of the Oxford Biomedical Research Centre, who spoke with New Scientist, few apps are designed to work with NHS’ existing IT systems.

“It’s a bit like the Wild West out there with lots of keen and very motivated people producing these apps,” he told the publication. “What we need are consistent standards and an interface with electronic patient records, particularly with the NHS, so that information, with permission from the patients, can be put to use centrally.”

In other words, even in a system providing government-delivered, ostensibly integrated healthcare, it’s still hard to manage data sharing effectively. Guess we shouldn’t feel too bad about the issues we face here in the US.

Many Providers Still Struggle With Basic Data Sharing

Posted on February 15, 2017 | Written By


One might assume that by this point, virtually every provider with a shred of IT in place is doing some form of patient data exchange. After all, many studies tout the number of healthcare data send and receive transactions a given vendor network or HIE has seen, and it sure sounds like a lot. But if a new survey is any indication, such assumptions are wrong.

According to a study by Black Book Research, which surveyed 3,391 current hospital EMR users, 41% of responding medical record administrators find it hard to exchange patient health records with other providers, especially if the physicians involved aren’t on their EMR platform. Worse, 25% said they still can’t use any patient information that comes in from outside sources.

The problem isn’t a lack of interest in data sharing. In fact, Black Book found that 81% of network physicians hoped that their key health system partners’ EMR would provide interoperability among the providers in the system. Moreover, the respondents say they’re looking forward to working on initiatives that depend on shared patient data, such as value-based payment, population health and precision medicine.

The problem, as we all know, is that most hospitals are at an impasse and can’t find ways to make interoperability happen. According to the survey, 70% of hospitals that responded weren’t using information outside of their EMR.  Respondents told Black Book that they aren’t connecting clinicians because external provider data won’t integrate with their EMR’s workflow.

Even if the data flows are connected, that may not be enough. Researchers found that 22% of surveyed medical record administrators felt that transferred patient information wasn’t presented in a useful format. Meanwhile, 21% of hospital-based physicians contended that shared data couldn’t be trusted as accurate when it was transmitted between different systems.

Meanwhile, the survey found, technology issues may be a key breaking point for independent physicians, many of whom fear that they can’t make it on their own anymore.  Black Book found that 63% of independent docs are now mulling a merger with a big healthcare delivery system to both boost their tech capabilities and improve their revenue cycle results. Once they have the funds from an acquisition, they’re cleaning house; the survey found that EMR replacement activities climbed 52% in 2017 for acquired physician practices.

Time for a comment here. I wish I agreed with medical practice leaders that being acquired by a major health system would solve all of their technical problems. But I don’t, really. While being acquired may give them an early leg up, allowing them to dump their arguably flawed EMR, I’d wager that they won’t have the attention of senior IT people for long.

My sense is that hospital and health system leaders are focused externally rather than internally. Most of the big threats and opportunities – like ACO integration – are coming at leaders from the outside.

True, while a practice is a valuable ally that remains independent of the health system, CIOs and VPs may spend lots of time and money to link arms with it technically. But once it gets in house, it’s more of a “get in line” situation from what I’ve seen. Readers, what is your experience?

An Approach For Privacy-Protecting Big Data

Posted on February 6, 2017 | Written By


There’s little doubt that the healthcare industry is zeroing in on some important discoveries as providers and researchers mine collections of clinical and research data. Big data does come with some risks, however, with some observers fearing that aggregated and shared information may breach patient privacy. However, at least one study suggests that patients can be protected without interrupting data collection.

In what it calls a first, a new study appearing in the Journal of the American Medical Informatics Association has demonstrated that protecting the privacy of patients can be done without too much fuss, even when the patient data is pulled into big data stores used for research.

According to the study, a single patient anonymization algorithm can offer a standard level of privacy protection across multiple institutions, even when they are sharing clinical data back and forth. Researchers say that larger clinical datasets can protect patient anonymity without generalizing or suppressing data in a manner which would undermine its use.

To conduct the study, researchers set a privacy adversary out to beat the system. This adversary, who had collected patient diagnoses from a single unspecified clinic visit, was asked to match them to a record in a de-identified research dataset known to include the patient. To conduct the study, researchers used data from Vanderbilt University Medical Center, Northwestern Memorial Hospital in Chicago and Marshfield Clinic.

The researchers knew that according to prior studies, the more data associated with each de-identified record, and the more complex and diverse the patient’s problems, the more likely it was that their information would stick out from the crowd. And that would typically force managers to generalize or suppress data to protect patient anonymity.

In this case, the team hoped to find out how much generalization and suppression would be necessary to protect the identities found within the three institutions’ data, and afterward, whether the protected data would still be of any use to future researchers.

The team processed relatively small datasets from each institution representing patients in a multi-site genotype-disease association study; larger datasets representing patients in the three institutions’ banks of de-identified DNA samples; and large sets standing in for each institution’s EMR population.

Using the algorithm they developed, the team found that most of the data’s value was preserved despite the occasional need for generalization and suppression. On average, 12.8% of diagnosis codes needed generalization; the medium-sized biobank models saw only 4% of codes needing generalization; and among the large databases representing EMR populations, only 0.4% needed generalization and no codes required suppression.
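To make the generalization-versus-suppression trade-off concrete, here is a toy sketch of the idea (not the study’s published algorithm): truncate diagnosis codes to a broader category when a record’s exact code set is too distinctive, and suppress the record only if generalization still leaves it unique. The `k` parameter and the ICD-style truncation are my assumptions for illustration:

```python
def generalize(code: str) -> str:
    """Generalize a diagnosis code to its 3-character category,
    e.g. 'E11.9' -> 'E11' (illustrative ICD-10-style truncation)."""
    return code.split(".")[0][:3]

def anonymize(record, dataset, k=2):
    """Toy version of the generalize-then-suppress idea in the study.

    record: a set of diagnosis codes; dataset: list of such sets.
    If the record's exact code set is shared by fewer than k records,
    generalize its codes; if it is still too distinctive, suppress (drop)
    it by returning None. The published algorithm is far more selective --
    it needed to generalize only a few percent of codes.
    """
    def count_matches(rec, data):
        return sum(1 for other in data if other == rec)

    if count_matches(record, dataset) >= k:
        return record                       # already blends into the crowd
    general = {generalize(c) for c in record}
    general_data = [{generalize(c) for c in rec} for rec in dataset]
    if count_matches(general, general_data) >= k:
        return general                      # generalization was enough
    return None                             # suppress: record is too unique
```

The study’s headline result is that, at scale, almost no records ever reach the suppression branch: the bigger the dataset, the easier it is for any one record to blend in.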

More work like this is clearly needed as the demand for large-scale clinical, genomic and transactional datasets grows. But in the meantime, this seems to be good news for budding big data research efforts.

Boston Children’s Benefits From the Carequality and CommonWell Agreement

Posted on February 3, 2017 | Written By


Recently two of the bigger players working on health data interoperability – Carequality and the CommonWell Health Alliance – agreed to share data with each other. The two, which were fierce competitors, agreed that CommonWell would share data with any Carequality participant, and that Carequality users would be able to use the CommonWell record locator service.

That is all well and good, but at first I wasn’t sure if it would pan out. Being the cranky skeptic that I am, I assumed it would take quite a while for the two to get their act together, and that we’d hear little more of their agreement for a year or two.

But apparently, I was wrong. In fact, a story by Scott Mace of HealthLeaders suggests that Boston Children’s Hospital and its physicians are likely to benefit right away. According to the story, the hospital and its affiliated Pediatric Physicians Organization at Children’s Hospital (PPOC) will be able to swap data nicely despite their using different EMRs.

According to Mace, Boston Children’s runs a Cerner EMR, as well as an Epic installation to manage its revenue cycle. Meanwhile, PPOC is going live with Epic across its 80 practices and 400 providers. On the surface, the mix doesn’t sound too promising.

To add even more challenges to the mix, Boston Children’s also expects an exponential jump in the number of patients it will be caring for via its Medicaid ACO, the article notes.

Without some form of data sharing compatibility, the hospital and practice would have faced huge challenges, but now it has an option. Boston Children’s is joining CommonWell, and PPOC is joining Carequality, solving a problem the two have struggled with for a long time, Mace writes.

Previously, the story notes, the hospital tried unsuccessfully to work with a local HIE, the Mass Health Information HIway. According to hospital CIO Dan Nigrin, MD, who spoke with Mace, providers using the HIway were usually asked to push patient data to their peers via the Direct protocol, rather than pull data from other providers when they needed it.

Under the new regime, however, providers will have much more extensive access to data. Also, the two entities will face fewer data-sharing hassles, such as establishing point-to-point or bilateral exchange agreements with other providers, PPOC CIO Nael Hafez told HealthLeaders.

Even this step upwards does not perfect interoperability make. According to Micky Tripathi, president and CEO of the Massachusetts eHealth Collaborative, providers leveraging the CommonWell/Carequality data will probably customize their experience. He contends that even those who are big fans of the joint network may add, for example, additional record locator services such as one provided by Surescripts. But it does seem that Boston Children’s and PPOC are, well, pretty psyched to get started with data sharing as is.

Now, back to me as Queen Grump again. I have to admit that Mace paints a pretty attractive picture here, and I wish Boston Children’s and PPOC much success. But my guess is that there will still be plenty of difficult issues to work out before they have even the basic interoperability they’re after. Regardless, some hope of data sharing is better than none at all. Let’s just hope this new data sharing agreement between CommonWell and Carequality lives up to its billing.

Health IT Preserves Idaho Hospital’s Independence

Posted on February 1, 2017 | Written By


Most of the time, when I write about hospital IT adoption, I end up explaining why a well-capitalized organization is going into the red to implement its EMR. But I recently found a story in RevCycle Intelligence in which a struggling hospital actually seems to have benefitted financially from investing in IT infrastructure. According to the story, a 14-bed critical access hospital in Idaho recently managed to stave off a forced merger or even closure by rolling out an updated EMR and current revenue cycle management technology.

Only a few years ago, Arco, Idaho-based Lost Rivers Medical Center was facing serious financial hurdles, and its technology was very outdated. In particular, it was using an EMR from 1993, which was proving so inflexible that claims stayed in accounts receivable for an average of 108 days. “We didn’t have wifi,” CEO Brad Huerta told the site. “We didn’t have fiber. We literally had copper wires for our phone system…we had an EMR in a technical sense, but nobody was using it. It was a proverbial paperweight.”

Not only was the cost of paying for upgrades daunting, the hospital’s location was as well. Arco is a “frontier” location, making it hard to recruit IT staffers to implement and maintain infrastructure and servers, the story notes. Though “fiercely independent,” as Huerta put it, it was getting hard for Lost Rivers to succeed without merging with a larger organization.

That being said, Huerta and his team decided to stick it out. They feared diluting their impact, or losing the ability to offer services like trauma care and tele-pharmacy, if they were to merge with a bigger organization.

Instead of conceding defeat, Huerta decided to focus on improving the hospital’s revenue cycle performance, which would call for installing an up-to-date EMR and more advanced medical billing tools. After the hospital finished putting in fiber in its area, Lost Rivers invested in athenahealth’s cloud-based EMR and medical billing tools.

Once the hospital put its new systems in place, it was able to turn things around on the revenue cycle front. Total cash flow climbed rapidly, and days in accounts receivable fell from 108 to 52 days.
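Days in accounts receivable — the metric Lost Rivers drove from 108 down to 52 — is a standard revenue cycle calculation: outstanding A/R divided by average daily charges. A quick sketch (the dollar figures in the example are made up for illustration):

```python
def days_in_ar(ar_balance: float, charges: float, period_days: int = 365) -> float:
    """Days in accounts receivable: outstanding A/R divided by
    average daily charges over the measurement period."""
    if charges <= 0 or period_days <= 0:
        raise ValueError("charges and period_days must be positive")
    average_daily_charges = charges / period_days
    return ar_balance / average_daily_charges

# e.g. $1.08M outstanding against $3.65M in annual charges -> 108 days in A/R
```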

According to Huerta, part of the reason the hospital was able to make such significant improvements was that the new systems improved workflow. In the past, he told RevCycle Intelligence, providers and staff often failed to code services correctly or bill patients appropriately, which led to financial losses.

Now, doctors chart on laptops, tablets or even phones while at the patients’ bedside. Not only did this improve coding accuracy, it cut down on the amount of time doctors spend in administrative work, giving them time to generate revenue by seeing additional patients.

What’s more, the new system has given Lost Rivers access to some of the advantages of merging with other facilities without having to actually do so. According to the story, the system now connects the critical access hospital with larger health systems, as the athenahealth system captures rule changes made by those other organizations and effectively shares the improvements with Lost Rivers. This means the coding proposed by the system gradually gets more accurate, without forcing Lost Rivers to spend big bucks on coding training, Huerta said.

While the story doesn’t say so specifically, I’m sure that Lost Rivers is spending a lot on its spiffy new EMR and billing tech, which must have been painful at least at first. But it’s always good to see the gamble pay off.

UCSF Partners With Intel On Deep Learning Analytics For Health

Posted on January 30, 2017 | Written By


UC San Francisco’s Center for Digital Health Innovation has agreed to work with Intel to deploy and validate a deep learning analytics platform. The new platform is designed to help clinicians make better treatment decisions, predict patient outcomes and respond quickly in acute situations.

The Center’s existing projects include CareWeb, a team-based collaborative care platform built on Salesforce.com social and mobile communications tech; Tidepool, which is building infrastructure for next-gen smart diabetes management apps; Health eHeart, a clinical trials platform using social media, mobile and realtime sensors to change heart disease treatment; and Trinity, which offers “precision team care” by integrating patient data with evidence and multi-disciplinary data.

These projects seem to be a good fit with Intel’s healthcare efforts, which are aimed at helping providers succeed at distributed care communication across desktop and mobile platforms.

As the two note in their joint press release, creating a deep learning platform for healthcare is extremely challenging, given that the relevant data is complex and stored in multiple incompatible systems. Intel and UCSF say the next-generation platform will address these issues, allowing them to integrate not only data collected during clinical care but also inputs from genomic sequencing, monitors, sensors and wearables.

Supporting all of this activity obviously calls for a lot of computing power. The partners will run deep learning use cases in a distributed fashion on a CPU-based cluster designed to crunch through very large datasets handily. Intel is rolling out the computing environment on its Xeon processor-based platform, which supports data management and the algorithm development lifecycle.

As the deployment moves forward, Intel leaders plan to study how deep learning analytics and machine-driven workflows can optimize clinical care and patient outcomes, and leverage what they learn when they create new platforms for the healthcare industry. Both partners believe that this model will scale for future use case needs, such as larger convolutional neural network models, artificial neural networks patterned after living organisms and very large multidimensional datasets.

Once implemented, the platform will allow users to conduct advanced analytics on all of this disparate data, using machine learning and deep learning algorithms. And if all performs as expected, clinicians should be able to draw on these advanced capabilities on the fly.

This looks like a productive collaboration. If nothing else, it appears that in this case the technology platform UCSF and Intel are developing may be productized and made available to other providers, which could be very valuable. After all, while individual health systems (such as Geisinger) have the resources to kick off big data analytics projects on their own, it’s possible a standardized platform could make such technology available to smaller players. Let’s see how this goes.

Do Health IT Certificate Of Need Requirements Make Sense?

Posted on January 23, 2017 | Written By


The other day, I read an interesting piece about the University of Vermont Medical Center’s plans to create an integrated EMR connecting its four network hospitals. The article noted that unlike its peers in some other states, UVMC was required to file a Certificate of Need (CON) application with the state before it proceeds with the work.  And that struck me as deserving some analysis.

According to a story appearing in Healthcare Informatics,  UVMC plans to invest an initial $112.4 million in the project, which includes an upgrade to informatics, billing and scheduling systems used by UVMC and network facilities Central Vermont Medical Center, Champlain Valley Physicians Hospital and Elizabethtown Community Hospital. The total costs of implementing and operating the integrated system should hit $151.6 million over the first six years. (For all of you vendor-watchers, UVMC is an Epic shop.)

In its CON application, UVMC noted that some of the systems maintained by network hospitals are 20 years old and in dire need of replacement. It also asserted that if the four hospitals made upgrades independently rather than in concert, it would cost $200 million and still leave the facilities without a connection to each other.

Given the broad outline provided in the article, these numbers seem reasonable, perhaps even modest given what execs are trying to accomplish. And that would be all most hospital executives would need to win the approval of their board and steam ahead with the project, particularly if they were gunning for value-based contracts.

But clearly, this doesn’t mean that such investments aren’t risky, or that they don’t stand a chance of triggering a financial meltdown. There are countless examples of health systems that have faced major financial problems (like this and this), operational problems (particularly in this case) or been forced to make difficult tradeoffs (such as this). And their health IT decisions can have a major impact on the rest of the marketplace, which sometimes bears the indirect costs of any mistakes they make.

Given these concerns, I think there’s an argument to be made for requiring hospitals to get CONs for major health IT investments. If there’s any case to be made for CON programs at all, I can’t see why it wouldn’t apply here. After all, the idea behind them is to look at the big picture rather than the incremental successes of one organization. If investment in, say, MRIs can increase costs needlessly, the big bucks dropped on health IT systems certainly could.

Part of the reason I sympathize with these requirements is that I believe healthcare IS fundamentally different from any other industry, and that as a public good, it should face oversight that other industries do not. Simply put, healthcare costs are everybody’s costs, and that’s unique.

What’s more, I’m all too familiar with the bubble in which hospital execs and board members often live. Because they are compelled to generate the maximum profit (or excess) they can, there’s little room for analyzing how such investments impact their communities over the long term. Yes, the trend toward ACOs and population health may mitigate this effect to some degree, but probably not enough.

Of course, there are lots of arguments against CONs, and ultimately against government intervention in the marketplace generally. If nothing else, it’s obvious that CON board members aren’t necessarily impartial arbiters of truth. (I once knew a consultant who pushed CONs through for a healthcare chain, who said that whichever competitor presented the last, not the best, statistics to the room almost always won.)

Regardless, I’d be interested in studying the results of health IT CON requirements in five or ten years to see whether they had any measurable impact on healthcare competition and costs. We’d learn a lot about health IT market dynamics, don’t you think?

“Learning Health System” Pilot Cuts Care Costs While Improving Quality

Posted on January 11, 2017 | Written By Anne Zieger

As some of you will know, the ONC’s Shared Nationwide Interoperability Roadmap’s goal is to create a “nationwide learning health system.”  In this system, individuals, providers and organizations will freely share health information, but more importantly, will share that information in “closed loops” which allow for continuous learning and care improvement.

When I read about this model (which is backed by the Institute of Medicine), I thought it sounded interesting, but not terribly practical. Recently, though, I stumbled upon an experiment that attempts to bring this approach to life. And it’s more than just unusual: it seems to be successful.

What I’m talking about is a pilot study, done by a team from Nationwide Children’s Hospital and The Ohio State University, which involved implementing a “local” learning health system. During the pilot, team members used EHR data to create personalized treatments for patients based on data from others with similar conditions and risk factors.

To date, building a learning health system has been very difficult, largely because integrating EHRs across multiple hospital systems is such a challenge. For that reason, researchers with the two organizations decided to implement a “local” learning health system instead, according to a press statement from Nationwide Children’s.

To build the local learning health system, the team from Nationwide Children’s and Ohio State optimized the EHR to support their efforts. They also relied on a “robust” care coordination system which sat at the core of the EHR. The pilot subjects were a group of 131 children treated through the hospital’s cerebral palsy program.

Children treated in the 12-month program, named “Learn From Every Patient,” experienced a 43% reduction in total inpatient days, a 27% reduction in inpatient admissions, a 30% reduction in emergency department visits and a 29% reduction in urgent care visits.

The two institutions spent $225,000 to implement the pilot during the first year. However, the return on this investment was dramatic.  Researchers concluded that the program cut healthcare costs by $1.36 million. This represented a savings of about $6 for each dollar invested.
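The arithmetic behind that return is simple to check. A quick sketch using the figures reported for the pilot (the cost and savings numbers come from the study; the variable names are mine):

```python
# ROI figures reported for the "Learn From Every Patient" pilot
implementation_cost = 225_000    # first-year cost to implement the pilot
reported_savings = 1_360_000     # healthcare cost reduction found by researchers

# Savings returned per dollar invested
savings_per_dollar = reported_savings / implementation_cost
print(f"${savings_per_dollar:.2f} saved per dollar invested")  # roughly $6
```

That ratio (about 6:1) is where the "$6 for each dollar invested" figure comes from.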

An added benefit: the clinicians working in the CP clinic found that this approach to care simplified documentation, which saved time and made it possible for them to see more patients during each session.

Not surprisingly, the research team thinks this approach has a lot of potential. “This method has the potential to be an effective complementary or alternative strategy to the top-down approach of learning health systems,” the release said. In other words, maybe bottom-up, incremental efforts are worth a try.

Given these results, it’d be nice to think that we’ll have full interoperability someday, and that we’ll be able to scale up the learning health system approach across the whole US. In the meantime, it’s good to see at least one health system make some headway with it.