
Healthcare Interoperability is Solved … But What Does That Really Mean? – #HITExpo Insights

Posted on June 12, 2018 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network, which currently consists of 10 blogs containing over 8,000 articles, more than 4,000 of which John has written himself. These EMR and healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading Health IT career job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs he can be found on Twitter (@techguy and @ehrandhit) and LinkedIn.

One of the best parts of the new community we created at the Health IT Expo conference is the way attendees at the conference and those in the broader healthcare IT community engage on Twitter using the #HITExpo hashtag before, during, and after the event.  It’s a treasure trove of insights, ideas, practical innovations, and amazing people.  Don’t forget that last part since social media platforms are great at connecting people even if they are usually in the news for other reasons.

A great example of the knowledge sharing that happened on the #HITExpo hashtag came from Don Lee (@dflee30), who runs #HCBiz, a long-running podcast he recorded live from Health IT Expo.  After the event, Don offered his take on what he considered the conference's most important conversation: "Solving Interoperability."  You can read his thoughts on Twitter, or you can read all 23 tweets we've compiled below for easy reading (a big thanks to Thread Reader for making this easy).

As shared by Don Lee:

1/ Finally working through all my notes from the #HITExpo. The most important conversation to me was the one about “solving interoperability” with @RasuShrestha, @PaulMBlack and @techguy.

2/ Rasu told the story of what UPMC accomplished using DBMotion. How it enabled the flow of data amongst the many hospitals, clinics and docs in their very large system. #hitexpo

3/ John challenged him a bit and said: it sounds like you’re saying that you’ve solved #interoperability. Is that what you’re telling us? #hitexpo

4/ Rasu explained in more detail that they had done the hard work of establishing syntactic interop amongst the various systems they dealt with (i.e., they can physically move the data from one system to another and put it in a proper place). #hitexpo

5/ He went on and explained how they had then done the hard work of establishing semantic interoperability amongst the many systems they deal with. That means now all the data could be moved, put in its proper place, AND they knew what it meant. #hitexpo

6/ Syntactic interop isn’t very useful in and of itself. You have data but it’s not mastered and not yet usable in analytics. #hitexpo

7/ Semantic interop is the mastering of the data in such a way that you are confident you can use it in analytics, ML, AI, etc. Now you can, say, find the most recent BP for a patient pop regardless of which EMR in your system it originated. And have confidence in it. #hitexpo

8/ Semantic interop is closely related to the concept of #DataFidelity that @BigDataCXO talks about. It’s the quality of data for a purpose. And it’s very hard work. #hitexpo

9/ In the end, @RasuShrestha’s answer was that UPMC had done all of that hard work and therefore had made huge strides in solving interop within their system. He said “I’m not flying the mission accomplished banner just yet”. #hitexpo

10/ Then @PaulMBlack – CEO at @Allscripts – said that @RasuShrestha was being modest and that they had in fact “Solved interoperability.”

I think he’s right and that’s what this tweet storm is about. Coincidentally, it’s a matter of semantics. #hitexpo

11/ I think Rasu dialed it back a bit because he knew that people would hear that and think it means something different. #hitexpo

12/ The overall industry conversation tends to be about ubiquitous, semantic interop where all data is available everywhere and everyone knows what it means. I believe Rasu was saying that they hadn’t achieved that. And that makes sense… because it’s impossible. #hitexpo

13/ @GraceCordovano asked the perfect question and I wish there had been a whole session dedicated to answering it: (paraphrasing) What’s the difference between your institutional definition of interop and what the patients are talking about? #hitexpo

14/ The answer to that question is the crux of our issue. The thing patients want and need is for everyone who cares for them to be on the same page. Interop is very relevant to that issue, obviously, but there’s a lot of friction and it goes way beyond tech. #hitexpo

15/ Also, despite common misconception, no other industry has solved this either. Sure, my credit card works in Europe and Asia and gets back to my bank in the US, but that’s just a use case. There is no ubiquitous semantic interop between JP Morgan Chase and HSBC.

16/ There are lots of use cases that work in healthcare too. E-Prescribing, claims processing and all the related HIPAA transactions, etc. #hitexpo

17/ Also worth noting… Canada has a single-payer system and they also don’t have clinical interoperability.

This is not a problem unique to healthcare nor the US. #hitexpo

18/ So healthcare needs to pick its use cases and do the hard work. That’s what Rasu described on stage. That’s what Paul was saying has been accomplished. They are both right. And you can do it too. #hitexpo

19/ So good news: #interoperability is solved in #healthcare.

Bad news: It’s a ton of work and everyone needs to do it.

More bad news: You have to keep doing it forever (it breaks, new partners, new sources, new data to care about, etc). #hitexpo

19/ Some day there will be patient mediated exchange that solves the patient side of the problem and does it in a way that works for everyone. Maybe on a #blockchain. Maybe something else. But it’s 10+ years away. #hitexpo

20/ In the meantime my recommendation to clinical orgs – support your regional #HIE. Even UPMC’s very good solution only works for data sources they know about. Your patients are getting care outside your system and in a growing # of clinical and community based settings. #hitexpo

21/ the regional #HIE is the only near-term solution that even remotely resembles semantic, ubiquitous #interoperability in #healthcare.
#hitexpo

22/ My recommendation to patients: You have to take matters into your own hands for now. Use consumer tools like Apple health records and even Dropbox like @ShahidNShah suggested in another #hitexpo session. Also, tell your clinicians to support and use the regional #HIE.

23/ So that got long. I’ll end it here. What do you think?

P.S. the #hitexpo was very good. You should check it out in 2019.

A big thank you to Don Lee for sharing these perspectives and diving in much deeper than we can in 45 minutes on stage. This is what makes the Health IT Expo community special: people with a deep understanding of a problem fleshing out its realities so we can better understand how to address them. Plus, the sharing happens year round as opposed to just a few days at the conference.
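To make the distinction Don draws in tweets 4 through 7 a bit more concrete, here is a minimal sketch of what semantic interoperability buys you once the syntactic work is done. Everything in it is hypothetical (the system names, local codes, and values are invented), but it shows the kind of cross-EMR question, such as "what is each patient's most recent systolic blood pressure, regardless of source system?", that only becomes easy after local codes have been mapped to a shared vocabulary.

```python
# Minimal sketch: semantic interoperability as terminology mapping plus a
# cross-EMR query. System names, local codes, and values are hypothetical.

from datetime import datetime

# Syntactic interop got the data out of each EMR and into one structure...
observations = [
    {"emr": "EMR_A", "patient_id": "p1", "local_code": "BP_SYS", "value": 128, "time": datetime(2018, 5, 1)},
    {"emr": "EMR_B", "patient_id": "p1", "local_code": "8480-6", "value": 122, "time": datetime(2018, 5, 20)},
    {"emr": "EMR_A", "patient_id": "p2", "local_code": "BP_SYS", "value": 141, "time": datetime(2018, 4, 11)},
]

# ...semantic interop is the mapping that says what each local code *means*
# (here, everything is normalized to the LOINC code for systolic BP).
LOCAL_TO_LOINC = {"BP_SYS": "8480-6", "8480-6": "8480-6"}

def most_recent_systolic_bp(obs):
    """Latest systolic BP per patient, regardless of which EMR it came from."""
    latest = {}
    for o in obs:
        if LOCAL_TO_LOINC.get(o["local_code"]) != "8480-6":
            continue
        current = latest.get(o["patient_id"])
        if current is None or o["time"] > current["time"]:
            latest[o["patient_id"]] = o
    return latest

for pid, o in most_recent_systolic_bp(observations).items():
    print(pid, o["value"], o["emr"], o["time"].date())
```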

Speaking of which, what do you think of Don’s thoughts above? Is he right? Is there something he’s missing? Is there more depth to this conversation that we need to understand? Share your thoughts, ideas, insights, and perspectives in the comments or on social media using the #HITExpo hashtag.

Making Healthcare Data Useful

Posted on May 14, 2018 | Written By

The following is a guest blog by Monica Stout from MedicaSoft

At HIMSS18, we spoke about making health data useful to patients with the Delaware Health Information Network (DHIN). Useful data for patients is one piece of the complete healthcare puzzle. Providers also need useful data to provide more precise care to patients and to reach patient populations who would benefit directly from the insights they gain. Payers want access to clinical data, beyond just claims data, to aggregate data historically. This helps payers define which patients should be included in care coordination programs or who should receive additional disease management assistance or outreach.

When you’re a provider, hospital, health system, health information exchange, or insurance provider and have the data available, where do you start? It’s important to start at the source of the data to organize it in a way that makes insights and actions possible. Having the data is only half of the solution for patients, clinicians or payers. It’s what you do with the data that matters and how you organize it to be usable. Just because you may have years of data available doesn’t mean you can do anything with it.

Historically, healthcare has seen many barriers to marrying clinical and claims data. Things like system incompatibility, poor data quality, or siloed data can all impact organizations’ ability to access, organize, and analyze data stores. One way to increase the usability of your data is to start with the right technology platform. But what does that actually mean?

The right platform starts with a data model that is flexible enough to support a wide variety of use models. It makes data available via open, standards-based APIs. It organizes raw data into longitudinal records. It includes services, such as patient matching and terminology mapping, that make it easy to use the data in real-world applications. The right platform transforms raw data into information that helps providers and payers improve outcomes and manage risk, and gives patients a more complete view of their overall health and wellness.
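As a rough sketch of what that can look like in practice, here is a small example that pulls observations from a hypothetical FHIR-style API and folds them into a simple longitudinal record, applying terminology mapping along the way. The endpoint URL and the code map are placeholders, not MedicaSoft's actual platform or API.

```python
# Sketch: pulling raw observations from an open, standards-based API and
# folding them into a simple longitudinal record. The endpoint URL and the
# terminology map below are hypothetical placeholders.

import requests

BASE_URL = "https://example-platform.local/fhir"        # hypothetical endpoint
LOCAL_TO_STANDARD = {"GLU": "2345-7", "A1C": "4548-4"}   # illustrative map to LOINC

def longitudinal_record(patient_id):
    """Return the patient's observations, normalized and sorted by time."""
    resp = requests.get(
        f"{BASE_URL}/Observation",
        params={"patient": patient_id, "_sort": "date"},
        timeout=30,
    )
    resp.raise_for_status()
    entries = resp.json().get("entry", [])

    record = []
    for entry in entries:
        obs = entry["resource"]
        local_code = obs["code"]["coding"][0]["code"]
        record.append({
            "code": LOCAL_TO_STANDARD.get(local_code, local_code),  # terminology mapping
            "value": obs.get("valueQuantity", {}).get("value"),
            "time": obs.get("effectiveDateTime"),
        })
    return record
```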

Do you struggle with making your data insightful and actionable? What are you doing to transform your data? Share your insights, experiences, challenges, and thoughts in the comments or with us on Twitter @MedicaSoftLLC.

About Monica Stout
Monica is a HIT teleworker in Grand Rapids, Michigan by way of Washington, D.C., who has consulted at several government agencies, including the National Aeronautics and Space Administration (NASA) and the U.S. Department of Veterans Affairs (VA). She’s currently the Marketing Director at MedicaSoft. Monica can be found on Twitter @MI_turnaround or @MedicaSoftLLC.

About MedicaSoft
MedicaSoft designs, develops, delivers, and maintains EHR, PHR, and UHR software solutions and HISP services for healthcare providers and patients around the world. MedicaSoft is a proud sponsor of Healthcare Scene. For more information, visit www.medicasoft.us or connect with us on Twitter @MedicaSoftLLC, Facebook, or LinkedIn.

Improving Data Outcomes: Just What The Doctor Ordered

Posted on May 8, 2018 | Written By

The following is a guest blog post by Dave Corbin, CEO of HULFT.

Health care has a data problem. Vast quantities of data are generated, but inefficiencies around sharing, retrieval, and integration have acute repercussions in an environment of squeezed budgets and growing patient demands.

The sensitive nature of much of the data being processed is a core issue. Confidential patient information has traditionally encouraged a ‘closed door’ approach to data management and an unease over hyper-accessibility to this information.

Compounding the challenge is the sheer scale and scope of the typical health care environment and its myriad departmental layers. The mix of new and legacy IT systems used for everything from billing records to patient tracking often means deep silos and poor data connections, the cumulative effect of which undermines decision-making. As delays become commonplace, this ongoing battle to coordinate disparate information manifests itself in many different ways in a busy hospital.

Optimizing bed occupancies – a data issue?

One example involves managing bed occupancy, a complex task which needs multiple players to be in the loop when it comes to the latest on a patient’s admission or discharge status. Anecdotal evidence points to a process often driven by manual feedback and competing information. Nurses at the end of their shift may report that a patient is about to be discharged, unaware that a doctor has since requested more tests for that patient. As everyone is left waiting for the results from the laboratory, the planned changeover of beds is delayed with many knock-on effects, increasing congestion and costs and frustrating staff and patients in equal measure.

How data is managed becomes a critical factor in tackling the variations that creep into critical processes and resource utilization. In the example above, harnessing predictive modelling and data mining to forecast patient discharges, so that the number of beds available in the coming weeks can be estimated more accurately, will no doubt become an increasingly mainstream option for the sector.
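As a deliberately simplified illustration of that kind of forecasting (not HULFT's product, and using invented numbers), a first cut at predicting next week's discharges can be nothing more than a day-of-week average over historical daily discharge counts:

```python
# Toy sketch: estimate beds freeing up next week from historical daily
# discharge counts, using a day-of-week average. All numbers are invented.

from collections import defaultdict
from datetime import date, timedelta

# (date, number of discharges) pulled from, say, an ADT feed or an EHR extract
history = [(date(2018, 4, 1) + timedelta(days=i), 20 + (i % 7) * 3) for i in range(60)]

def forecast_next_week(history, start):
    """Average discharges per weekday, projected over the next seven days."""
    totals, counts = defaultdict(int), defaultdict(int)
    for day, n in history:
        totals[day.weekday()] += n
        counts[day.weekday()] += 1
    forecast = {}
    for i in range(7):
        day = start + timedelta(days=i)
        forecast[day] = totals[day.weekday()] / counts[day.weekday()]
    return forecast

for day, expected in forecast_next_week(history, date(2018, 6, 4)).items():
    print(day, round(expected, 1), "expected discharges")
```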

Predictive analytics is great and all, but first….

Before any of this can happen, health care organizations need a solid foundation of accessible and visible data which is centralized, intuitive, and easy to manage.

Data logistics provides a holistic approach to data transfer and integration, helping deliver security, compliance, and seamless connectivity while speeding up the processing of large volumes of sensitive material such as electronic health records – the kind of data that simply cannot be lost. It can also ensure the reliable and secure exchange of intelligence with outside health care vendors and partners.

For better data outcomes, we’re calling for a new breed of data logistics that’s intuitive and easy to use. Monitoring interfaces that let anyone with permission to access the network see which integrations and transfers are running in real time, with no programming or coding required, are the kind of intervention that opens data management to a far wider section of an organization.

Collecting data across a network of multiple transfer and integration activities and putting it in a place where people can use, manage, and manipulate it becomes central to breaking down the barriers that have long compromised efficiency in the health care sector.
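A minimal sketch of what that central view might compute, assuming a generic list of transfer-job records rather than any particular vendor's API:

```python
# Sketch: roll up the status of file-transfer and integration jobs into one
# central view. Job records and field names are generic placeholders, not a
# real product's API.

from collections import Counter

jobs = [
    {"name": "lab_results_to_warehouse", "status": "running",  "records_moved": 12000},
    {"name": "claims_837_outbound",      "status": "complete", "records_moved": 4300},
    {"name": "adt_feed_to_hie",          "status": "failed",   "records_moved": 0},
]

def summarize(jobs):
    """Counts per status plus the jobs that need attention."""
    by_status = Counter(j["status"] for j in jobs)
    failed = [j["name"] for j in jobs if j["status"] == "failed"]
    return {"by_status": dict(by_status), "needs_attention": failed}

print(summarize(jobs))
```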

HULFT works with health care organizations of all sizes to establish a strong back-end data infrastructure that makes front-end advances possible. Learn how one medical technology pioneer used HULFT to drive operational efficiencies and improve quality assurance in this case study.

Dave Corbin is CEO of HULFT, a comprehensive data logistics platform that allows IT to find, secure, transform and move information at scale. HULFT is a proud sponsor of Health IT Expo, a practical innovation conference organized by Healthcare Scene.  Find out more at hulftinc.com

Health Orgs Were In Talks To Collect SDOH Data From Facebook

Posted on April 9, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

These days, virtually everyone in healthcare has concluded that integrating social determinants of health data with existing patient health information can improve care outcomes. However, identifying and collecting useful, appropriately formatted SDOH information can be a very difficult task. After all, in most cases it’s not just lying around somewhere ripe for picking.

Recently, however, Facebook began making the rounds with a proposal that might address the problem. While the research initiative has been put on hold in light of recent controversy over Facebook’s privacy practices, my guess is that the healthcare players involved will be eager to resume talks if the social media giant manages to calm the waters.

According to CNBC, Facebook was talking to healthcare organizations like Stanford Medical School and the American College of Cardiology, in addition to several other hospitals, about signing a data-sharing agreement. Under the terms of the agreement, the healthcare organizations would share anonymized patient data, which Facebook planned to match up with user data from its platform.

Facebook’s proposal will sound familiar to readers of this site. It suggested combining what a health system knows about its patients, such as their age, medication list and hospital admission history, with Facebook-available data such as the user’s marital status, primary language and level of community involvement.

The idea would then be to study, with an initial focus on cardiovascular health, whether this combined data could improve patient care, something its prospective partners seem to think possible. The CNBC story included a gushing statement from American College of Cardiology interim CEO Cathleen Gates suggesting that such data sharing could create revolutionary results. According to Gates, the ACC believes that mixing anonymized Facebook data with anonymized ACC data could help greatly in furthering scientific research on how social media can help in preventing and treating heart disease.

As the business site notes, the data would not include personally identifiable information. That being said, Facebook proposed to use hashing to match individuals existing in both data sets. If the project had gone forward, Facebook might have shared data on roughly 87 million users.
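The matching technique CNBC described is conceptually simple: both parties hash the same normalized identifier and compare hashes rather than raw PII. Here is a simplified sketch; the identifiers below are invented, and a real project would add salted or keyed hashing, normalization rules, and governance well beyond this.

```python
# Simplified sketch of hash-based record matching: both parties hash the same
# normalized identifier and compare hashes instead of raw PII. Identifiers
# are invented; production matching would use keyed/salted hashing.

import hashlib

def token(identifier: str) -> str:
    """Normalize then hash an identifier (e.g., an email address)."""
    normalized = identifier.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

hospital_tokens = {token(e): "patient-123" for e in ["Jane.Doe@example.com"]}
platform_tokens = {token(e): "user-987" for e in ["jane.doe@example.com", "other@example.com"]}

# Individuals present in both data sets are found by comparing hashes only.
matches = {
    h: (hospital_tokens[h], platform_tokens[h])
    for h in hospital_tokens.keys() & platform_tokens.keys()
}
print(matches)
```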

Looked at one way, this arrangement could raise serious privacy questions. After all, healthcare organizations should certainly exercise caution when exchanging even anonymized data with any outside organization, and with questions still lingering about how willing Facebook is to lock data down, projects like this become even riskier.

Still, under the right circumstances, Facebook could prove to be an all but ideal source of comprehensive, digitized SDOH data. While now, arguably, might not be the time to move ahead, hospitals should keep this kind of possibility in mind.

Health Leaders Go Beyond EHRs To Tackle Value-Based Care

Posted on March 30, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

In the broadest sense, EHRs were built to manage patient populations — but largely one patient at a time. As a result, it’s little wonder that they aren’t offering much support for value-based care as is, as a recent report from Sage Growth Partners suggests.

Sage spoke with 100 healthcare executives to find out what they saw as their value-based care capabilities and obstacles. Participants included leaders from a wide range of entities, including an ACO, several large physician practices and a midsize integrated delivery network.

The overall sense Sage seems to have gotten from its research was that while value-based care contracts are beginning to pay off, health execs are finding it difficult to support these contracts using the EHRs they have in place. While their EHRs can produce quality reports, most don’t offer data aggregation and analytics, risk stratification, care coordination or tools to foster patient and clinician engagement, the report notes.

To get the capabilities they need for value-based contracting, health organizations are layering population health management solutions on top of their EHRs. Though these additional PHM tools may not be fully mature, health executives told Sage that they’re already seeing a return on such investments.

This is not necessarily because these organizations aren’t comfortable with their existing EHR. The Sage study found that 65% of respondents were somewhat or highly unlikely to replace their EHR in the next three years.

However, roughly half of the 70% of providers who have had EHRs for at least three years also have third-party PHM tools in place. In addition, 64% of providers said that EHRs haven’t delivered many important value-based contracting tools.

Meanwhile, 60% to 75% of respondents are seeking value-based care solutions outside their EHR platform. And they like the results. Forty-six percent of the roughly three-quarters of respondents who were seeing ROI with value-based care felt that their third-party PHM solution was essential to their success.

Despite their concerns, healthcare organizations may not feel impelled to invest in value-based care tools immediately. Right now, just 5% of respondents said that value-based care accounted for over 50% of their revenues, while 62% said that such contracts represented just 0 to 10% of their revenues. Arguably, while the growth in value-based contracting is continuing apace, it may not be at a tipping point just yet.

Still, traditional EHR vendors may need to do a better job of supporting value-based contracting (not that they’re not trying). The situation may change, but in the near term, health executives are going elsewhere when they look at building their value-based contracting capabilities. It’s hard to predict how this will turn out, but if I were an enterprise EHR vendor, I’d take competition with population health management specialist vendors very seriously.

Study Offers EHR-Based Approach To Predicting Post-Hospital Opioid Use

Posted on March 27, 2018 | Written By

Sunny is a serial entrepreneur on a mission to improve quality of care through data science. Sunny’s last venture, docBeat, a healthcare care coordination platform, was successfully acquired by Vocera Communications. Sunny has an impressive track record of strategy, business development, innovation and execution in the healthcare, casino entertainment, retail and gaming verticals. Sunny has been the Co-Chair of the Las Vegas Chapter of the Akshaya Patra Foundation (www.foodforeducation.org) since 2010.

With opioid abuse a raging epidemic in the United States, hospitals are looking for effective ways to track and manage opioid treatment. In an effort to move in this direction, a group of researchers has developed a model which predicts the likelihood of future chronic opioid use based on hospital EHR data.

The study, which appears in the Journal of General Internal Medicine, notes that while opioids are frequently prescribed in hospitals, there has been little research on predicting which patients will progress to chronic opioid therapy (COT) after they are discharged. (The researchers defined COT as when patients were given a 90-day supply of opioids with less than a 30-day gap in supply over a 180-day period or receipt of greater than 10 opioid prescriptions during the past year.)
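Expressed as code, that COT definition is essentially a rule over a patient's prescription history. Here is a hedged sketch of that rule; the record layout and the gap calculation are my own simplification, not the study's actual implementation.

```python
# Sketch of the study's chronic opioid therapy (COT) definition: a 90-day
# supply of opioids with less than a 30-day gap in supply over a 180-day
# window, OR more than 10 opioid prescriptions in the past year.
# The record layout is a simplification of the paper's definition.

from datetime import date, timedelta

def is_cot(prescriptions, window_start):
    """prescriptions: opioid fills from the past year, each a dict with
    'fill_date' (date) and 'days_supply' (int)."""
    window_end = window_start + timedelta(days=180)
    in_window = sorted(
        (p for p in prescriptions if window_start <= p["fill_date"] <= window_end),
        key=lambda p: p["fill_date"],
    )

    total_supply = sum(p["days_supply"] for p in in_window)
    max_gap = 0
    for prev, cur in zip(in_window, in_window[1:]):
        prev_end = prev["fill_date"] + timedelta(days=prev["days_supply"])
        max_gap = max(max_gap, (cur["fill_date"] - prev_end).days)

    ninety_day_rule = total_supply >= 90 and max_gap < 30
    count_rule = len(prescriptions) > 10   # >10 opioid scripts in the past year
    return ninety_day_rule or count_rule

example = [
    {"fill_date": date(2018, 1, 1),  "days_supply": 30},
    {"fill_date": date(2018, 2, 5),  "days_supply": 30},
    {"fill_date": date(2018, 3, 10), "days_supply": 30},
]
print(is_cot(example, window_start=date(2018, 1, 1)))  # True: 90 days' supply, gaps < 30 days
```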

To address this problem, the researchers set out to create a statistical model that could predict which hospitalized patients not previously on COT would end up on it. Their approach involved a retrospective analysis of EHR data from 2008 to 2014 drawn from records of patients hospitalized in an urban safety-net hospital.

The researchers included a wide array of variables in their analysis: medical and mental health diagnoses, substance and tobacco use, chronic or acute pain, surgery during hospitalization, receipt of opioid or non-opioid analgesics or benzodiazepines during the past year, leaving the hospital with opioid prescriptions, and milligrams of morphine equivalents prescribed during the hospital stay.
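A model of this kind is typically built as a regression over features like those. Here is a generic scikit-learn sketch with synthetic data; the features and labels are placeholders, not the study's dataset or its actual model.

```python
# Generic sketch of training a COT prediction model from EHR-derived features.
# Feature names mirror the variables described above; the data is synthetic.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.integers(0, 2, n),           # surgery during hospitalization
    rng.integers(0, 2, n),           # received opioids in the past year
    rng.integers(0, 2, n),           # discharged with an opioid prescription
    rng.normal(50, 40, n).clip(0),   # morphine milligram equivalents in hospital
])
y = rng.integers(0, 2, n)            # 1 = progressed to COT (synthetic labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```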

After conducting the analysis, the researchers found that the model correctly predicted COT for 79% of patients and correctly predicted the absence of COT 78% of the time.

Being able to predict which patients will end up on COT after discharge could prove to be a very effective tool. As the authors note, using EHR data to create such a predictive model could offer many benefits, particularly the ability to identify patients at high risk for future chronic opioid use.

As the study notes, if clinicians have this information, they can offer early patient education on pain management strategies and where possible, wean them off of opioids before discharging them. They’ll also be more likely to consider incorporating alternative pain therapies into their discharge planning.

While this data is exciting and provides great opportunities, we need to be careful how we use this information. Done incorrectly, it could cause the 21% who are misidentified as at risk for COT to end up needing COT. It’s always important to remember that identifying those at risk is only the first challenge. The second challenge is what you do with that data to help those at risk while not damaging those who are misidentified as at risk.

One issue the study doesn’t address is whether data on social determinants of health could improve their predictions. Incorporating both SDOH and patient-generated data might lend further insight into their post-discharge living conditions and solidify discharge planning. However, it’s evident that this model offers a useful approach on its own.

Yale New Haven Hospital Partners With Epic On Centralized Operations Center

Posted on February 5, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Info, info, all around, and not a place to manage it all. That’s the dilemma faced by most hospitals as they work to leverage the massive data stores they’re accumulating in their health IT systems.

Yale New Haven Hospital’s solution to the problem is to create a centralized operations center which connects the right people to real-time data analytics. Its Capacity Command Center (nifty alliteration, folks!) was created by YNHH, Epic and the YNHH Clinical Redesign Initiative.

The Command Center project comes five years into YNHH’s long-term High Reliability project, which is designed to prepare the institution for future challenges. These efforts are focused not only on care quality and patient safety but also on managing what YNHH says are the highest patient volumes in Connecticut. Its statement also notes that with transfers from other hospitals increasing, the hospital is seeing a growth in patient acuity, which is obviously another challenge it must address.

The Capacity Command Center’s functions are fairly straightforward, though they must have been a beast to develop.

On the one hand, the Center offers technology which sorts through the flood of operational data generated by and stored in its Epic system, generating dashboards which change in real time and drive process changes. These dashboards present real-time metrics such as bed capacity, delays for procedures and tests and ambulatory utilization, which are made available on Center screens as well as within Epic.
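Computing a metric like real-time bed capacity from event data is conceptually straightforward, even if doing it reliably at YNHH's scale inside Epic is anything but. Here is a toy sketch with invented events and unit names, not YNHH's or Epic's implementation:

```python
# Toy sketch: derive current bed occupancy from a stream of admit/discharge
# events, the kind of operational metric a command-center dashboard surfaces.
# Events, unit names, and capacities are invented.

events = [
    {"unit": "ICU",     "type": "admit"},
    {"unit": "ICU",     "type": "admit"},
    {"unit": "MedSurg", "type": "admit"},
    {"unit": "ICU",     "type": "discharge"},
]
capacity = {"ICU": 20, "MedSurg": 60}

occupied = {unit: 0 for unit in capacity}
for e in events:
    occupied[e["unit"]] += 1 if e["type"] == "admit" else -1

for unit, beds in capacity.items():
    print(unit, f"{occupied[unit]}/{beds} beds occupied",
          f"({occupied[unit] / beds:.0%})")
```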

In addition, YNHH has brought representatives from all of the relevant operational areas into a single physical location, including bed management, the Emergency Department, nursing staffing, environmental services and patient transport. Not only is this a good approach overall, it’s particularly helpful when patient admissions levels climb precipitously, the hospital notes.

This model is already having a positive impact on the care process, according to YNHH’s statement. For example, it notes, infection prevention staffers can now identify all patients with Foley catheters and review their charts. With this knowledge in hand, these staffers can discuss whether the patient is ready to have the catheter removed and avoid the urinary tract infections associated with prolonged use.

I don’t know about you, but I was excited to read about this initiative. It sounds like YNHH is doing exactly what it should do to get more out of patient data. For example, I was glad to read that the dashboard offered real-time analytics options rather than one-off projections from old data. Bringing key operational players together in one place makes great sense as well.

Of course, not all hospitals will have the resources to pull off something like this. YNHH is a 1,541-bed giant which had the cash to take on a command center project. Few community hospitals would have the staff or money to make such a thing happen. Still, it’s good to see somebody at the cutting edge.

Texas Hospital Association Dashboard Offers Risk, Cost Data

Posted on January 22, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

The Texas Hospital Association has agreed to a joint venture with health IT vendor IllumiCare to roll out a new tool for physicians. The new dashboard offers an unusual but powerful mix of risk data and real-time cost information.

According to THA, physician orders represent 87% of hospital expenses, but most physicians know little about the cost of the items they order. The new dashboard, Smart Ribbon, gives doctors information on treatment costs and risk of patient harm at the point of care. THA’s assumption is that the data will lead them to order fewer and less costly tests and meds.

To my mind, the tool sounds neat. IllumiCare’s Smart Ribbon technology doesn’t need to be integrated with the hospital’s EMR. Instead, it works with existing HL7 feeds and piggybacks onto existing user authorization schemes. In other words, it eliminates the need for creating costly interfaces to EMR data. The dashboard includes patient identification, a timer if the patient is on observation status, a tool for looking up costs and tabs providing wholesale costs for meds, labs and radiology. It also estimates iatrogenic risks resulting from physician decisions.
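To give a sense of what "works with existing HL7 feeds" can mean in practice, here is a heavily simplified sketch that pulls medication orders out of a pipe-delimited HL7 v2 message and attaches wholesale costs from a lookup table. The sample message, codes, and prices are all illustrative; this is not IllumiCare's implementation.

```python
# Heavily simplified sketch: scan an HL7 v2 pharmacy message for ordered meds
# and annotate each with a wholesale cost from a lookup table. The sample
# message, codes, and prices are illustrative only.

SAMPLE_HL7 = "\r".join([
    "MSH|^~\\&|EMR|HOSP|COSTAPP|HOSP|201801220930||RDE^O11|123|P|2.5",
    "PID|1||555555^^^HOSP^MR||DOE^JANE",
    "RXE|^^^|00093712^ONDANSETRON 4MG TAB^NDC|4||mg",
    "RXE|^^^|00781305^CEFTRIAXONE 1G INJ^NDC|1||g",
])

WHOLESALE_COST = {"00093712": 0.42, "00781305": 3.10}   # hypothetical prices

def ordered_meds_with_cost(raw_message):
    """Extract ordered meds from RXE segments and attach a cost if known."""
    meds = []
    for segment in raw_message.split("\r"):
        fields = segment.split("|")
        if fields[0] != "RXE":
            continue
        code, description = fields[2].split("^")[:2]    # RXE-2: give code ^ text
        meds.append({"code": code, "name": description,
                     "cost": WHOLESALE_COST.get(code)})
    return meds

for med in ordered_meds_with_cost(SAMPLE_HL7):
    print(med)
```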

Unlike some clinical tools I’ve seen, Smart Ribbon doesn’t generate alerts or alarms, which makes it a different beast than many other clinical decision support tools. That doesn’t mean tools that do generate alerts are bad, but that feature does set it apart from others.

We’ve covered many other tools designed to support physicians, and as you’d probably guess, those technologies come in all sizes. For example, last year contributor Andy Oram wrote about a different type of dashboard, PeraHealth, a surveillance system targeting at-risk patients in hospitals.

PeraHealth identifies at-risk patients through analytics and displays them on a dashboard that doctors and nurses can pull up, including trends over several shifts. Its analytical processes pull in nursing assessments in addition to vital signs and other standard data sets. This approach sounds promising.

Ultimately, though, dashboard vendors are still figuring out what physicians need, and it’s hard to tell whether their market will stay alive. In fact, according to one take from Kalorama Information, this year technologies like dashboarding, blockchain and even advanced big data analytics will be integrated into EMRs.

As for me, I think Kalorama’s prediction is too aggressive. While I agree that many freestanding tools will be integrated into the EMR, I don’t think it will happen this or even next year. In the meantime, there’s certainly a place for creating dashboards that accommodate physician workflow and aren’t too intrusive. For the time being, they aren’t going away.

Using Geography to Combat the Opioid Crisis

Posted on January 10, 2018 | Written By

Colin Hung is the co-founder of the #hcldr (healthcare leadership) tweetchat, one of the most popular and active healthcare social media communities on Twitter. Colin speaks, tweets and blogs regularly about healthcare, technology, marketing and leadership. He is currently an independent marketing consultant working with leading health IT companies. Colin is a member of #TheWalkingGallery. His Twitter handle is @Colin_Hung.

When it comes to the opioid crisis, the numbers aren’t good. According to the latest CDC numbers, over 66,000 Americans died from drug overdoses between May 2016 and May 2017. Unfortunately, this continues the rapid upward trend of the past five years.

Credit: New York Times, The First Count of Fentanyl Deaths in 2016: Up 540% in Three Years, 2 Sept 2017, https://www.nytimes.com/interactive/2017/09/02/upshot/fentanyl-drug-overdose-deaths.html

One of the biggest drivers for this increase is the prevalence of opioids – a class of drugs that includes pain medications, heroin and fentanyl (a synthetic opioid). The opioid crisis is not the stereotypical street-drug problem. It is not confined to inner cities or to any socio-economic boundaries. It affects all neighborhoods…and therein lies one of the greatest challenges of dealing with the crisis, knowing where to deploy precious resources.

As governments and public health authorities begin to take more aggressive action, some are wisely turning to geographic information systems (GIS) to determine where the need is greatest. GIS (also called geospatial mapping) is designed specifically to capture, store, manage and analyze geographical data. It has been a mainstay in mining, engineering and environmental sciences since the early 1990s. For more information about GIS, please see this excellent PBS documentary. In recent years, GIS has been applied to a number of new areas, including healthcare.

Esri is one of the companies doing pioneering GIS work in healthcare, and recently it has focused on applying its ArcGIS technology to help tackle the opioid crisis. “One of the basic challenges that public health authorities face is clearly defining the scope of the opioid problem in their local area,” says Estella Geraghty, MD, Chief Medical Officer & Health Solutions Director at Esri. “The good news is that the information to map the extent of the problem is available, it’s just stored in disparate systems and in incompatible formats. We help bring it all together.”

Geraghty points to their work with the Tri-County Health Department (TCHD) as an example of how effective GIS can be. TCHD is one of the largest public health agencies in the US, serving 1.5 million residents in three of Denver’s metropolitan counties: Adams, Arapahoe and Douglas. Using Esri’s ArcGIS solution, TCHD created an open data site that allows internal teams and external partners to pool and share their opioid health information using a visual map of the region as a common base of reference.
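For readers who want to experiment with this kind of mapping without an ArcGIS license, a bare-bones equivalent can be put together with open-source tools such as folium. The coordinates and counts below are invented, and this is not TCHD's or Esri's actual implementation.

```python
# Bare-bones sketch: plot (invented) overdose-related data points on an
# interactive map with folium, an open-source alternative to a full GIS stack.

import folium

# Hypothetical points: (latitude, longitude, overdose deaths 2011-2016)
points = [
    (39.83, -104.90, 14),
    (39.65, -104.80, 9),
    (39.58, -105.05, 21),
]

m = folium.Map(location=[39.73, -104.96], zoom_start=10)   # centered near Denver
for lat, lon, deaths in points:
    folium.CircleMarker(
        location=[lat, lon],
        radius=3 + deaths,                # scale marker size by count
        popup=f"{deaths} deaths, 2011-2016",
        fill=True,
    ).add_to(m)

m.save("opioid_overdoses_map.html")       # open in a browser to explore
```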

According to Esri: “Since the creation of the Open Data site, there has been a dramatic increase in both the information available to the public and the community’s understanding of the opioid crisis.” You can see the Open Data site here, and if you scroll down you will see six different maps available to the public. Particularly sobering is the Opioid Overdose Deaths 2011-2016 map, which allows you to zoom down to specific streets and blocks. Another interesting map is the Household Medication Take-Back Locations, which seems to indicate there is a lack of coverage for the city of Denver.

Esri itself has created its own site to bring attention to the opioid crisis at a national level. Two maps in particular stand out to me. The first is the map of Opioid Prescriptions per Provider. The red zones on that map represent areas where a high number of opioid prescriptions are being made by relatively few providers. This points to potential areas where opioid abuse may be occurring.

By mapping the data in this way, some interesting insights emerge. Take Taliaferro County in Georgia, for example, where 2,069 claims out of a total of 29,016 (roughly 7%) were for opioids, yet the county only has 2 providers. Or Clinch County in Georgia, where a whopping 10% of all claims were for opioids.

The second interesting map is Lost Loved Ones (located at the bottom of the Esri site). This is a completely open map where anyone can pay tribute to a loved one who has been lost to the opioid crisis. Each dot is a person – a stark reminder that behind each statistic is a son, daughter, mother, or father who has died from opioids. Anyone can add to the map by clicking the button at the top of the map.

There is something to be said about seeing data overlaid onto an interactive map. It takes data from abstract lines, bars or numbers on a page and transforms it into something more tangible, more “real”. I suspect that for many on the front lines of this crisis, having the opioid data visualized in this manner helps to drive home the need for additional resources.

“Esri is helping public health officials all over the country make better decisions,” continued Geraghty. “We are helping them determine if they have enough coverage for places where people can drop off expired drugs, places where Naloxone is available and mental health program coverage. We can visually present the types of drugs being dropped off by region. We can track where first responders have had to use Naloxone. We plan on continuing to collaborate closely with customers, especially with public health authorities. This opioid crisis is impacting so many neighborhoods. We can make a difference.”

Given the continued upward trend in opioid-related deaths, healthcare can use all the difference makers it can get.

Pennsylvania Health Orgs Agree to Joint $1 Billion Network Dev Effort

Posted on December 27, 2017 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

If the essence of deal-making is putting your money where your mouth is, a new agreement between Pennsylvania healthcare giants fits the description. They’ve certainly bitten off a mouthful.

Health organizations Penn State Health and Highmark Health have agreed to make a collective investment of more than $1 billion. That is a pretty big number to swallow, even for two large organizations, though it may well take even more to develop the kind of network they have in mind.

The two are building out what they describe as a “community-based healthcare network,” which they’re designing to foster collaboration with community doctors and keep care local across its service areas.  Makes sense, though the initial press release doesn’t do much to explain how the two are going to make that happen.

The agreement between Penn State and Highmark includes efforts to support population health, the next step in accepting value-based payment. The investors’ plans include the development of population health management capabilities and the use of analytics to manage chronic conditions. Again, pretty much to be expected these days, though their goals are more likely to actually be met given the money being thrown at the problem.

That being said, one aspect of possible interest in this deal is its inclusion of a regionally-focused academic medical center. Penn State plans to center the effort around teaching hospital Milton S. Hershey Medical Center, a 548-bed hospital affiliated with more than 1,100 clinicians. In my experience, too few agreements take enough advantage of hospital skills in their zeal to spread their arms around large areas, so involving the Medical Center might offer extra benefits to the agreement.

Highmark Health, for its part, is an ACO which encompasses healthcare businesses serving almost 50 million consumers across all 50 states.  Clearly, an ACO with national reach has every reason in the world to make this kind of investment.

I don’t know what the demographics of the Penn State market are, but one can assume a few things about them, given the big bucks the pair are throwing at the deal:

  • That there are a lot of well-insured consumers in the region, which will help pay for a return on the huge investment the players are making
  • That community doctors are substantially independent, but the two allies are hoping to buy a bunch of practices and solidify their network
  • That prospective participants in the network are lacking the IT tools they need to make value-based schemes work, which is why, in part, the two players need to spend so heavily

I know that ACOs and healthcare systems are already striking deals like this one. If you’re part of a health system hoping to survive the next generation of reimbursement, big budgets are necessary, as are new strategies better adapted to value-based reimbursement.

Still, this is a pretty large deal by just about any measure. If it works out, we might end up with new benchmarks for building better-distributed healthcare networks.