
Waiting For The Perfect “Standard” Is Not The Answer To Healthcare’s Interoperability Problem

Posted on October 16, 2017 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

Have you bought into the “standards will solve healthcare’s interoperability woes” train of thought? Everyone understands that standards are necessary to enable disparate systems to communicate with each other, but as new applications and new uses for data continually appear, healthcare organizations that are waiting for universal standards are not maximizing the value of their data. More importantly, they will be waiting a long time to realize the full potential of their data.

Healthcare interoperability is not just a matter of transferring data as an entire file from one user to another. Instead, effective exchange of information allows each user to select which elements of a patient’s chart are needed, and then access them in a format that enables analysis of different data sets to provide a holistic picture of the patient’s medical history or clinical trends in a population of patients. Healthcare’s interoperability challenge is further exacerbated by different contextual interpretations of the words within those fields. For instance, how many different ways are there to say heart attack?
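To make the “heart attack” example concrete, here is a minimal Python sketch of the kind of terminology normalization a data-harmonization layer has to perform. The term list and hard-coded mapping are illustrative assumptions only; real systems rely on full terminology services and the complete SNOMED CT and ICD-10 code systems.

```python
# Illustrative only: map a few local ways of writing "heart attack" to one
# standard concept (the commonly used SNOMED CT and ICD-10 codes for
# myocardial infarction). Real harmonization uses terminology services,
# not hard-coded dictionaries.
from typing import Optional

LOCAL_TERM_MAP = {
    "heart attack": {"snomed": "22298006", "icd10": "I21"},
    "myocardial infarction": {"snomed": "22298006", "icd10": "I21"},
    "acute mi": {"snomed": "22298006", "icd10": "I21"},
    "ami": {"snomed": "22298006", "icd10": "I21"},
}

def normalize_diagnosis(local_term: str) -> Optional[dict]:
    """Return standard codes for a locally worded diagnosis, if known."""
    return LOCAL_TERM_MAP.get(local_term.strip().lower())

if __name__ == "__main__":
    for term in ("Heart Attack", "Acute MI", "angina"):
        print(term, "->", normalize_diagnosis(term))
```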

The development of the Health Level Seven (HL7®) FHIR®, which stands for Fast Healthcare Interoperability Resources, represents a significant step toward interoperability. While the data exchange draft being developed and published by HL7 eliminates many of the complexities of earlier HL7 versions and facilitates real-time data exchange via web technology, publication of release 4 – the first normative version of the standard – is not anticipated until October 2018.
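For readers who have not worked with FHIR directly, its web-based exchange model is easy to illustrate: resources are plain JSON retrieved over ordinary REST calls. The sketch below assumes a generic FHIR R4 endpoint; the base URL and patient ID are placeholders, not any particular vendor’s server.

```python
# Minimal sketch of FHIR's REST model: fetch one Patient resource as JSON.
# FHIR_BASE and the patient ID are placeholders for a real endpoint.
import json
import urllib.request

FHIR_BASE = "https://fhir.example-hospital.org/baseR4"  # placeholder endpoint

def get_patient(patient_id: str) -> dict:
    """Fetch a single Patient resource over the FHIR REST API."""
    req = urllib.request.Request(
        f"{FHIR_BASE}/Patient/{patient_id}",
        headers={"Accept": "application/fhir+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    patient = get_patient("example")
    print(patient.get("name"), patient.get("birthDate"))
```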

As these standards are further developed, the key to universal adoption will be simplicity, according to John Lynn, founder of HealthcareScene.com. However, he suggests that CIOs stop waiting for “perfect standards” and focus on how they can best achieve interoperability now.

Even with standards that can be implemented in all organizations, the complexity and diversity of the healthcare environment mean that it will take time to move everyone to the same standards. This is complicated by the number of legacy systems and the patchwork of applications that have been added to healthcare IT systems in an effort to meet quickly changing needs throughout the organization. Shrinking financial resources for capital investment and increasing competition for IT professionals limit a health system’s ability to make the overall changes necessary for interoperability – no matter which standards are adopted.

Some organizations are turning to cloud-based, managed service platforms to perform the integration, aggregation and harmonization that makes data available to all users – regardless of the system or application in which the information was originally collected. This approach solves the financial and human resource challenges by making it possible to budget integration and data management requirements as an operational rather than a capital investment. This strategy also relieves the burden on in-house IT staff by relying on the expertise of professionals who focus on emerging technologies, standards and regulations that enable safe, compliant data exchange.

How are you planning to scale your interoperability and integration efforts?  If you're waiting for standards, why are you waiting?

As a leading provider of healthcare interoperability solutions, Liaison is a proud sponsor of Healthcare Scene. While the conversation about interoperability has been ongoing for many years, ideas, new technology and new strategies discussed and shared by IT professionals will lead to successful healthcare data exchange that will transform healthcare and result in better patient care.

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

Interoperability: Is Your Aging Healthcare Integration Engine the Problem?

Posted on September 18, 2017 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.
There is no shortage of data collected by healthcare organizations that can be used to improve clinical as well as business decisions. We are bombarded on a regular basis with announcements of new technology that collects patient information, clinical outcome data and operational metrics promising to help physicians and hospitals provide better, more cost-effective care.

The problem today is not the amount of data available to help us make better decisions; the problem is the inaccessibility of the data. When different users – physicians, allied health professionals, administrators and financial managers – turn to data for decision support, they find themselves limited to their own silos of information. The inability to access and share data across different disciplines within the healthcare organization prevents the user from making a decision based on a holistic view of the patient or operational process.

In a recent article, Alan Portela points out that precision medicine, which requires “the ability to collect real-time data from medical devices at the moment of care,” cannot happen easily without interoperability – the ability to access data across disparate systems and applications. He also points out that interoperability does not exist yet in healthcare.

Why are healthcare IT departments struggling to achieve interoperability?

Although new and improved applications are adopted on a regular basis, healthcare organizations are just now realizing that their integration middleware can no longer handle new types of data such as social media, the growing volume of data, and the increasing number of methods to connect in real time. Their integration platforms also cannot handle the exchange of information with disparate data systems and applications beyond the four walls of the hospital. In fact, hospitals of 500 beds or more average 25 unique data sources, with six electronic medical records systems in use. Those numbers will only move up over time, not down.

Integration engines in place throughout healthcare today were designed well before the explosion of the data-collection tools and digital information that exist today. Although updates and additions to integration platforms have enabled some interoperability, the need for complete interoperability is creating a movement to replace integration middleware with cloud-based managed services.

A study by the Aberdeen Group reveals that 76 percent of organizations will be replacing their integration middleware, and 70 percent of those organizations will adopt cloud-based integration solutions in the next three years.

The report also points out that as healthcare organizations move from an on-premises solution to a cloud-based platform, business leaders see migration to the cloud and managed services as a way to better manage operational expenses on a monthly basis versus large, up-front capital investments. An additional benefit is better use of in-house IT staff members who are tasked with mission critical, day-to-day responsibilities and may not be able to focus on continuous improvements to the platform to ensure its ability to handle future needs.

Healthcare has come a long way in the adoption of technology that can collect essential information and put it in the hands of clinical and operational decision makers. Taking that next step to effective, meaningful interoperability is critical.

As a leading provider of healthcare interoperability solutions, Liaison is a proud sponsor of Healthcare Scene. It is only through discussions and information-sharing among Health IT professionals that healthcare will achieve the organizational support for the steps required for interoperability.

Join John Lynn and Liaison for an insightful webinar on October 5, titled: The Future of Interoperability & Integration in Healthcare: How can your organization prepare?

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

Open Source Tool Offers “Synthetic” Patients For Hospital Big Data Projects

Posted on September 13, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

As readers will know, using big data in healthcare comes with a host of security and privacy problems, many of which are thorny.

For one thing, the more patient data you accumulate, the bigger the disaster when and if the database is hacked. Another important concern is that if you decide to share the data, there’s always the chance that your partner will use it inappropriately, violating the terms of whatever consent to disclose you had in mind. Then, there’s the issue of working with incomplete or corrupted data which, if extensive enough, can interfere with your analysis or even lead to inaccurate results.

But now, there may be a realistic alternative, one which allows you to experiment with big data models without taking all of these risks. A unique software project is underway which gives healthcare organizations a chance to scope out big data projects without using real patient data.

The software, Synthea, is an open source synthetic patient generator that models the medical history of synthetic patients. It seems to have been built by The MITRE Corporation, a not-for-profit research and development organization sponsored by the U.S. federal government. (This page offers a list of other open source projects in which MITRE is or has been involved.)

Synthea is built on a Generic Module Framework which allows it to model varied diseases and conditions that play a role in the medical history of these patients. The Synthea modules create synthetic patients using not only clinical data, but also real-world statistics collected by agencies like the CDC and NIH. MITRE kicked off the project using models based on the top ten reasons patients see primary care physicians and the top ten conditions that shorten years of life.

Its makers were so thorough that each patient’s medical experiences are simulated independently from their “birth” to the present day. The profiles include a full medical history, which includes medication lists, allergies, physician encounters and social determinants of health. The data can be shared using C-CDA, HL7 FHIR, CSV and other formats.
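As a rough illustration of how a team might consume that output, the sketch below assumes a Synthea-generated FHIR Bundle saved as JSON (the file path is a placeholder) and simply counts the resource types in one synthetic patient’s record.

```python
# Count resource types (Patient, Condition, MedicationRequest, ...) in a
# FHIR Bundle exported by Synthea. The path is a placeholder for wherever
# your Synthea run wrote its FHIR output.
import json
from collections import Counter

def summarize_bundle(path: str) -> Counter:
    """Tally the resource types contained in a FHIR Bundle JSON file."""
    with open(path) as f:
        bundle = json.load(f)
    return Counter(
        entry["resource"]["resourceType"] for entry in bundle.get("entry", [])
    )

if __name__ == "__main__":
    print(summarize_bundle("output/fhir/synthetic_patient.json"))  # placeholder path
```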

On its site, MITRE says its intent in creating Synthea is to provide “high-quality, synthetic, realistic but not real patient data and associated health records covering every aspect of healthcare.” As MITRE notes, having a batch of synthetic patient data on hand can be pretty, well, handy in evaluating new treatment models, care management systems, clinical support tools and more. It’s also a convenient way to predict the impact of public health decisions quickly.

This is such a good idea that I’m surprised nobody else has done something comparable. (Well, at least as far as I know no one has.) Not only that, it’s great to see the software being made available freely via the open source distribution model.

Of course, in the final analysis, healthcare organizations want to work with their own data, not synthetic substitutes. But at least in some cases, Synthea may offer hospitals and health systems a nice head start.

Healthcare Interoperability and Standards Rules

Posted on September 11, 2017 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Dave Winer is a true expert on standards. I remember coming across him in the early days of social media, when every platform was considering some sort of API. To illustrate his early involvement in standards, Dave was one of the early developers of the RSS standard that now powers nearly every blog and many other applications.

With this background in mind, I was extremely fascinated by a manifesto that Dave Winer published earlier this year that he calls “Rules for Standards-Makers.” Sounds like something we really need in healthcare, no?

You should really go and read the full manifesto if you’re someone involved in healthcare standards. However, here’s the list of rules Dave offers standards makers:

  1. There are tradeoffs in standards
  2. Software matters more than formats (much)
  3. Users matter even more than software
  4. One way is better than two
  5. Fewer formats is better
  6. Fewer format features is better
  7. Perfection is a waste of time
  8. Write specs in plain English
  9. Explain the curiosities
  10. If practice deviates from the spec, change the spec
  11. No breakage
  12. Freeze the spec
  13. Keep it simple
  14. Developers are busy
  15. Mail lists don’t rule
  16. Praise developers who make it easy to interop

If you’ve never had to program to a standard, then you might not understand these. However, those who are deep into standards will understand the pitfalls. Plus, you’ll have horror stories about the times you didn’t follow these rules and the challenges that caused for you going forward.

The thing I love most about Dave’s rules is that they focus on simplicity and function. Unfortunately, many standards in healthcare are focused on complexity and perfection. Healthcare has nailed the complexity part, and as Dave’s rules highlight, perfection is impossible with standards.

In fact, I skipped over Dave’s first rule for standards makers which highlights the above really well:

Rule #1: Interop is all that matters

As I briefly mentioned in the last CXO Scene podcast, many healthcare CIOs are waiting until the standards are perfect before they worry about interoperability. It’s as if they think that waiting for the perfect standard is going to solve healthcare interoperability. It won’t.

I hope that those building out standards in healthcare will take a deep look at the rules Dave Winer outlines above. We need better standards in healthcare and we need healthcare data to be interoperable.

Is It Time To Put FHIR-Based Development Front And Center?

Posted on August 9, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

I like to look at questions other people in the #HIT world wonder about, and see whether I have a different way of looking at the subject, or something to contribute to the discussion. This time I was provoked by one asked by Chad Johnson (@OchoTex), editor of HealthStandards.com and senior marketing manager with Corepoint Health.

In a recent HealthStandards.com article, Chad asks: “What do CIOs need to know about the future of data exchange?” I thought it was an interesting question; after all, everyone in HIT, including CIOs, would like to know the answer!

In his discussion, Chad argues that #FHIR could create significant change in healthcare infrastructure. He notes that if vendors like Cerner or Epic publish a capabilities-based API, providers’ technical, clinical and workflow teams will be able to develop custom solutions that connect to those systems.

As he rightfully points out, today IT departments have to invest a lot of time doing rework. Without an interface like FHIR in place, IT staffers need to develop workflows for one application at a time, rather than creating them once and moving on. That’s just nuts. It’s hard to argue against the idea that if FHIR APIs offer uniform data access, everyone wins.
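As a rough sketch of the capabilities-based idea Chad describes: a FHIR server publishes a CapabilityStatement at [base]/metadata, so a team can discover which resources and interactions an endpoint supports before writing any workflow code. The base URL below is a placeholder, not a real Cerner or Epic endpoint.

```python
# Read a FHIR server's CapabilityStatement and list what it supports.
# The base URL is a placeholder, not an actual vendor endpoint.
import json
import urllib.request

def supported_resources(fhir_base: str) -> dict:
    """Map resource type -> supported interactions from the CapabilityStatement."""
    req = urllib.request.Request(
        f"{fhir_base}/metadata",
        headers={"Accept": "application/fhir+json"},
    )
    with urllib.request.urlopen(req) as resp:
        capability = json.load(resp)
    rest = capability.get("rest", [{}])[0]
    return {
        r["type"]: [i["code"] for i in r.get("interaction", [])]
        for r in rest.get("resource", [])
    }

if __name__ == "__main__":
    caps = supported_resources("https://ehr-vendor.example/fhir")  # placeholder
    print(caps.get("Patient"), caps.get("Observation"))
```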

Far be it from me to argue with a good man like @OchoTex. He makes a good point about FHIR, one which can’t be emphasized enough – that FHIR has the potential to make vendor-specific workflow rewrites a thing of the past. Without a doubt, healthcare CIOs need to keep that in mind.

As for me, I have a couple of responses to bring to the table, and some additional questions of my own.

Since I’m an HIT trend analyst rather than an actual tech pro, I can’t say whether FHIR APIs can or can’t do what Chad is describing, though I have little doubt that Chad is right about their potential uses.

Still, I’d point out that since none other than FHIR project director Grahame Grieve has cautioned us about its current limitations, we probably want to temper our enthusiasm a bit. (I know I’ve made this point a few times here, perhaps ad nauseam, but I still think it bears repeating.)

So, given that FHIR hasn’t reached its full potential, it may be that health IT leaders should invest added time on solving other important interoperability problems.

One example that leaps to mind immediately is solving patient matching problems. This is a big deal: after all, if you can’t match patient records accurately across providers, it’s likely to lead to wrong-patient related medical errors.

In fact, according to a study released by AHIMA last year, 72 percent of HIM professionals who responded work on mitigating possible patient record duplicates every week. I have no reason to think things have gotten better. We must find an approach that will scale if we want interoperable data to be worth using.
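To show why matching is so error-prone, here is a deliberately simplified sketch of deterministic matching on a few normalized fields. The field choices and normalization rules are illustrative assumptions; production systems use probabilistic or referential matching, not a three-letter name prefix.

```python
# Simplified duplicate detection: normalize a few identifying fields and
# compare. Small differences ("Jon" vs "Jonathan", ZIP+4 vs ZIP) are the
# kind of variation that creates duplicate records in the first place.
from dataclasses import dataclass

@dataclass(frozen=True)
class PatientRecord:
    first_name: str
    last_name: str
    birth_date: str  # ISO format, YYYY-MM-DD
    zip_code: str

def match_key(rec: PatientRecord) -> tuple:
    """Build a normalized key; records with the same key are candidate duplicates."""
    return (
        rec.first_name.strip().lower()[:3],  # crude tolerance for nicknames
        rec.last_name.strip().lower(),
        rec.birth_date,
        rec.zip_code[:5],
    )

a = PatientRecord("Jonathan", "Smith", "1980-04-02", "80203")
b = PatientRecord("Jon", "Smith", "1980-04-02", "80203-1234")
print(match_key(a) == match_key(b))  # True: flagged as a possible duplicate for review
```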

And patient data matching is just one item on a long list of health data interoperability concerns. I’m sure you’re aware of other pressing problems which could undercut the value of sharing patient records. The question is, are we going to address those problems before we begin full-scale health data exchange? Or does it make more sense to pave the road to data exchange and address bumps in the road later?

The More Hospital IT Changes, The More It Remains The Same

Posted on June 23, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Once every year or two, some technical development leads the HIT buzzword list, and at least at first it’s very hard to tell whether it will stick. But over time, the technologies that actually work well are subsumed into the industry as it exists, lose their buzzworthy quality and just do their job.

Once in a while, the hot new thing sparks real change — such as the use of mobile health applications — but more often the ideas are mined for whatever value they offer and discarded.  That’s because in many cases, the “new thing” isn’t actually novel, but rather a slightly different take on existing technology.

I’d argue that this is particularly true when it comes to hospital IT, given the exceptionally high cost of making large shifts and the industry’s conservative bent. In fact, other than the (admittedly huge) changes fostered by the adoption of EMRs, hospital technology deployments are much the same as they were ten years ago.

Of course, I’d be undercutting my thesis dramatically if I didn’t stipulate that EMR adoption has been a very big deal. Things have certainly changed dramatically since 2007, when an American Hospital Association study reported that 32% of hospitals had no EMR in place and 57% had only partially implemented their EMR, with only the remaining 11% having implemented the platform fully.

Today, as we know, virtually every hospital has implemented an EMR and integrated it with ancillary systems (some more integrated and some less). Not only that, some hospitals with more mature deployments in place have used EMRs and connected tools to make major changes in how they deliver care.

That being said, the industry is still struggling with many of the same problems it did a decade ago.

The most obvious example of this is the extent to which health data interoperability efforts have stagnated. While hospitals within a health system typically share data with their sister facilities, I’d argue that efforts to share data with outside organizations have made little material progress.

Another major stagnation point is data analytics. Even organizations that spent hundreds of millions of dollars on their EMR are still struggling to squeeze the full value of this data out of their systems. I’m not suggesting that we’ve made no progress on this issue (certainly, many of the best-funded, most innovative systems are getting there), but such successes are still far from common.

Over the longer term, I suspect the shifts in consciousness fostered by EMRs and digital health will gradually reshape the industry. But don’t expect those technology lightning bolts to speed up the evolution of hospital IT. It’s going to take some time for that giant ship to turn.

Google’s DeepMind Rolling Out Bitcoin-Like Health Record Tracking To Hospitals

Posted on May 8, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Blockchain technology is gradually becoming part of how we think about healthcare data. Even government entities like the ONC and FDA – typically not early adopters – are throwing their hat into the blockchain ring.

In fact, according to recent research by Deloitte, healthcare and life sciences companies are planning the most aggressive blockchain deployments of any industry. Thirty-five percent of Deloitte’s respondents told the consulting firm that they expected to put blockchain into production this year.

Many companies are tackling the practical uses of blockchain tech in healthcare. But to me, few are more interesting than Google’s DeepMind, a hot UK-based AI firm that Google acquired a few years ago.

DeepMind has already signed an agreement with a branch of Britain’s National Health Service (NHS), under which it accesses patient data in the development of a healthcare app named Streams. Now, it’s launching a new project in partnership with the NHS, in which it will use a new technology based on bitcoin to let hospitals, the NHS and, over time, patients track what happens to personal health data.

The new technology, known as “Verifiable Data Audit,” will create a specialized digital ledger which automatically records every time someone touches patient data, according to British newspaper The Guardian.

In a blog entry, DeepMind co-founder Mustafa Suleyman notes that the system will track not only that the data was used, but also why. In addition, the ledger supporting the audit will be set to append-only, so once the system records an activity, that record can’t be erased.

The technology differs from existing blockchain models in some important ways, however. For one thing, unlike in other blockchain models, Verifiable Data Audit won’t rely on decentralized ledger verification by a broad set of participants. The developers have assumed that trusted institutions like hospitals can be relied on to verify ledger records.

Another way in which the new technology is different is that it doesn’t use a chain infrastructure. Instead, it uses a cryptographic data structure known as a Merkle tree. Every time the system adds an entry to the ledger, it generates a cryptographic hash summarizing not only that latest ledger entry, but also the previous ledger values.
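DeepMind describes Verifiable Data Audit as Merkle-tree based; the sketch below uses a plain hash chain instead, purely to illustrate the tamper-evidence property described above. It is a simplified illustration, not DeepMind’s implementation.

```python
# Append-only audit ledger sketch: each entry's hash covers the new record
# plus the previous hash, so altering any earlier entry invalidates every
# hash that follows it.
import hashlib
import json

class AuditLedger:
    def __init__(self):
        self.entries = []           # list of (record, entry_hash) pairs
        self._last_hash = "0" * 64  # genesis value

    def append(self, record: dict) -> str:
        payload = json.dumps(record, sort_keys=True) + self._last_hash
        entry_hash = hashlib.sha256(payload.encode()).hexdigest()
        self.entries.append((record, entry_hash))
        self._last_hash = entry_hash
        return entry_hash

    def verify(self) -> bool:
        prev = "0" * 64
        for record, stored_hash in self.entries:
            payload = json.dumps(record, sort_keys=True) + prev
            if hashlib.sha256(payload.encode()).hexdigest() != stored_hash:
                return False
            prev = stored_hash
        return True

ledger = AuditLedger()
ledger.append({"who": "renal_team", "what": "viewed_labs", "why": "direct_care"})
ledger.append({"who": "ed_clinician", "what": "viewed_meds", "why": "treatment"})
print(ledger.verify())  # True; tampering with an earlier record would make this False
```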

DeepMind is also providing a dedicated online interface which participating hospitals can use to review the audit trail compiled by the system in real time. In the future, the company hopes to add automated queries that would “sound the alarm” if data appeared to be compromised.

Though DeepMind does expect to give patients direct oversight over how, where and why their data has been used, they don’t expect that to happen for some time, as it’s not yet clear how to secure such access. In the meantime, participating hospitals are getting a taste of the future, one in which patients will ultimately control access to their health data assets.

Is There a Case to Be Made that Interoperability Saves Hospitals Money?

Posted on April 17, 2017 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Back in 2013 I argued that we needed a lot less talk and a lot more action when it came to interoperability in healthcare. It seemed very clear to me then and even now that sharing health data was the right thing to do for the patient. I have yet to meet someone who thinks that sharing a person’s health data with their providers is not the right thing to do for the patient. No doubt we shouldn’t be reckless with how we share the data, but patient care would improve if we shared data more than we do today.

While the case for sharing health data seems clear from the patient perspective, there were obvious business reasons why many organizations didn’t want to share their patients’ health data. From a business perspective, sharing was often seen as an expense they’d incur that could actually make them lose money.

This tension between the two perspectives is what makes healthcare interoperability so challenging. We all know it’s the right thing to do, but there are business reasons why it doesn’t make sense to invest in it.

While I understand both sides of the argument, I wondered if we could make the financial case for why a hospital or healthcare organization should invest in interoperability.

The easy argument is that value-based care is going to require you to share data to be successful. That repeat X-ray that was previously seen as a great revenue source will become a cost center in a value-based reimbursement world. At least that’s the idea, and healthcare organizations should prepare for this. That’s all well and good, but the value-based reimbursement stats show that we’re not there yet.

What are the other cases we can make for interoperability actually saving hospitals money?

I recently saw a stat that 70% of accidental deaths and injuries in hospitals are caused by communication issues. Accidental deaths and injuries are very expensive for a hospital. How many lives could be saved, hospital readmissions avoided, or accidental injuries prevented if providers had the right health data at the right place and the right time?

My guess is that not having the right healthcare data to treat a patient correctly is a big problem that causes a lot of patients to suffer needlessly. I wonder how many malpractice lawsuits could be avoided if providers had the patient’s full health record available to them. Should malpractice insurance companies start offering healthcare organizations and doctors a discount if they have high-quality interoperability solutions in their organization?

Obviously, I’m just exploring this idea. I’d love to hear your thoughts on it. Can interoperability solutions help a hospital save money? Are there financial reasons why interoperability should be implemented now?

While I still think we should make health data interoperability a reality because it’s the right thing to do for patients, it seems like we need to dive deeper into the financial reasons why we should be sharing patients’ health data. Otherwise, we’ll likely never see the needle move when it comes to health data sharing.

Database Linked With Hospital EMR To Encourage Drug Monitoring

Posted on March 31, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

According to state officials, Colorado occupies the unenviable position of second worst in the US for prescription drug misuse, with more than 255,000 Coloradans misusing prescribed medications.

One way the state is fighting back is by running the Colorado Prescription Drug Monitoring Program which, like comparable efforts in other states, tracks prescriptions for controlled medications. Every regular business day, the state’s pharmacists upload prescription data for medications listed in Schedules II through V.

While this effort may have value, many physicians haven’t been using the database, largely because it can be difficult to access. In fact, historically physicians have been using the system only about 30 percent of the time when prescribing controlled substances, according to a story appearing in HealthLeaders Media.

As things stand, it can take physicians up to three minutes to access the data, given that they have to sign out of their EMR, visit the PDMP site, log in using separate credentials, click through to the right page, enter patient information and sort through possible matches before they get to the patient’s aggregated prescription history. Given the ugliness of this workflow, it’s no surprise that clinicians aren’t seeking out PDMP data, especially if they don’t regard a patient as being at high risk for drug abuse or diversion.

But perhaps taking some needless steps out of the process can make a difference, a theory which one of the state’s hospitals is testing. Colorado officials are hoping a new pilot program linking the PDMP database to an EMR will foster higher use of the data by physicians. The pilot, funded by a federal grant through the Bureau of Justice Assistance, connects the drug database directly to the University of Colorado Hospital’s Epic EMR.

The project began with a year-long build-out phase, during which IT leaders created a gateway connecting the PDMP database and the Epic installation. Several months ago, the team followed up with a launch at the school of medicine’s emergency medicine department. Eventually, the PDMP database will be available in five EDs which have a combined total of 270,000 visits per year, HealthLeaders notes.

Under the pilot program, physicians can access the drug database with a single click, directly from within the Epic EMR system. Once the PDMP database was made available, the pilot brought physicians on board gradually, moving from evaluating their baseline use, to giving clinicians raw data, to presenting data through a risk-stratification tool, and eventually to requiring that they use the tool.
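Purely as a hypothetical sketch of what single-click access could look like behind the scenes, the code below imagines an integration gateway that accepts patient demographics from the EMR and returns the aggregated prescription history in one call. The endpoint, parameters and response fields are invented for illustration and do not describe the actual Colorado gateway or Epic’s interfaces.

```python
# Hypothetical gateway call: the EMR passes patient demographics it already
# has, and the gateway returns the PDMP prescription history. All names and
# fields here are invented for illustration.
import json
import urllib.request

GATEWAY_URL = "https://pdmp-gateway.example.org/query"  # hypothetical endpoint

def fetch_prescription_history(first: str, last: str, dob: str) -> list:
    """POST demographics to the gateway and return the list of dispensations."""
    body = json.dumps({"first_name": first, "last_name": last, "dob": dob}).encode()
    req = urllib.request.Request(
        GATEWAY_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("dispensations", [])

# Invoked from within the EMR context, so the clinician never re-enters
# credentials or patient demographics.
history = fetch_prescription_history("Jane", "Doe", "1975-06-01")
print(len(history), "controlled-substance dispensations found")
```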

Researchers guiding the pilot are evaluating whether providers use the PDMP more and whether it has an impact on high-risk patients. Researchers will also analyze what happened to patients a year before, during and a year after their ED visits, using de-identified patient data.

It’s worth pointing out that people outside of Colorado are well aware of the PDMP access issue. In fact, the ONC has been paying fairly close attention to the problem of making PDMP data more accessible. That being said, the agency notes that integrating PDMPs with other health IT systems won’t come easily, given that no uniform standards exist for linking prescription drug data with health IT systems. ONC staffers have apparently been working to develop a standard approach for delivering PDMP data to EMRs, pharmacy systems and health information exchanges.

However, at present it looks like custom integration will be necessary. Perhaps pilots like this one will lead by example.

The Distributed Hospital On The Horizon

Posted on February 24, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

If you’re reading this blog, you already know that distributed, connected devices and networks are the future of healthcare. Connected monitoring devices are growing more mature by the day, network architectures are becoming amazingly fluid, and with the growth of the IoT, we’re adding huge numbers of smart devices to an already-diverse array of endpoints. While we may not know what all of this will look like when it’s fully mature, we’ve already made amazing progress in connecting care.

But how will these trends play out? One nice look at where all this is headed comes from Jeroen Tas, chief innovation and strategy officer at Philips. In a recent article, Tas describes a world in which even major brick-and-mortar players like hospitals go almost completely virtual.  Certainly, there are other takes out there on this subject, but I really like how Tas explains things.

He starts with the assertion that the hospital of the future “is not a physical location with waiting rooms, beds and labs.” Instead, a hospital will become an abstract network overlay connecting nodes. It’s worth noting that this isn’t just a concept. For an example, Tas points to the Mercy Virtual Care Center, a $54 million “hospital without beds” dedicated to telehealth and connected care.  The Center, which has over 300 employees, cares for patients at home and in beds across 38 hospitals in seven states.

While the virtual hospital may not rely on a single, central campus, physical care locations will still matter – they’ll just be distributed differently. According to Tas, the connected health network will work best if care is provided as needed through retail-type outlets near where people live, specialist hubs, inpatient facilities and outpatient clinics. Yes, of course, we already have all of these things in place, but in the new connected world, they’ll all be on a single network.

Ultimately, even if brick-and-mortar hospitals never disappear, virtual care should make it possible to cut down dramatically on hospital admissions, he suggests.  For example, Tas notes that Philips partner Banner Health has slashed hospital admissions almost 50% by using telehealth and advanced analytics for patients with multiple chronic conditions. (We’ve also reported on a related pilot by Partners HealthCare Brigham and Women’s Hospital, the “Home Hospital,” which sends patients home with remote monitoring devices as an alternative to admissions.)

Of course, the broad connected care outline Tas offers can only take us so far. It’s all well and good to have a vision, but there are still some major problems we’ll have to solve before connected care becomes practical as a backbone for healthcare delivery.

After all, to cite one major challenge, community-wide connected health won’t be very practical until interoperable data sharing becomes easier – and we really don’t know when that will happen. Also, until big data analytics tools are widely accessible (rather than the province of the biggest, best-funded institutions) it will be hard for providers to manage the data generated by millions of virtual care endpoints.

Still, if Tas’s piece is any indication, consensus is building on what next-gen care networks can and should be, and there are certainly plenty of ways to lay the groundwork for the future. Even small-scale, preliminary connected health efforts seem to be fostering meaningful changes in how care is delivered. And there’s little doubt that over time, connected health will turn many brick-and-mortar care models on their heads, becoming a large – or even dominant – part of care delivery.

Getting there may be tricky, but if providers keep working at connected care, it should offer an immense payoff.