
What If You Live Tweeted an EHR Go Live?

Posted on December 3, 2018 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Have you ever wondered what an EHR go live is like? Ok, those of you who have been through one probably don’t want to relive it and may even have a little PTSD from the experience. However, as an EHR addict myself, I couldn’t resist watching Golden Valley Memorial Healthcare (GVMH) in Clinton, MO live tweet their MEDITECH Expanse go live on the @gvmhe Twitter account.

I loved this kind of transparency and documentation of a go live. Pretty cool to see the process. The only thing I wish they had done is use a hashtag throughout and share it with others who were tweeting about the go live. If they had, it would have been easier to find great tweets like this one from their CMIO Bill Dailey, MD:

I won’t share the full go live stream since you can go and read it on the @gvmhe account. However, here are some tweets that stood out.


This is an exciting and nerve-wracking part of any go-live.


I’m sure the team will look back on this picture fondly. Plus, they’ll probably note all the people who were too busy to get in on the picture.


One of the best and worst parts of a go-live: the countdown clock, which shows you how long until the real work begins and how much time you have left to finish your preparations. It’s ironic that there’s always more prep that could be done, but you have to go live anyway.


You have to have a little fun during the go live.


The stress is real. Is there an ICD-10 for EHR go lives?


It’s like New Year’s, but with less champagne and kissing. I like the matching shirts though.


Another stressful clock


War room in action!


The inevitable issues of getting your vendors on the phone. I wonder how effective this tweet was in helping the vendor respond. Especially since the tweet above was the 2nd one.


The moment before go live.


15 minutes later!


Don’t forget the power of food during a go live.


Must be a pretty nice Christmas gift to have the go live done with relatively few hiccups.


The reality of the first few days.


I wonder how they measured this, but pretty interesting to consider.


Monday with a full day of patients. Congrats GVMH!

I left off a number of things, so go and check out the full @gvmhe Twitter feed. Plus, you can follow along to see how the first few weeks on MEDITECH Expanse go for them. I hope they keep tweeting once all the go live staff leave. That’s usually a challenging time as well.

Next Steps In Making Healthcare AI Practical

Posted on November 30, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

In recent times, AI has joined blockchain on the list of technologies that just sort of crept into the health IT toolkit.

After all, blockchain was born out of the development of bitcoin, and not so long ago the idea that it was good for anything else hadn’t really taken hold. I doubt its creators ever contemplated using it for secure medical data exchange, though the notion seems obvious in retrospect.

And until fairly recently, artificial intelligence was largely a plaything for advanced computing researchers. I’m sure some AI researchers gave thought to cyborg doctors that could diagnose patients while beating them at chess and serving them lunch, but few practical applications existed.

Today, blockchain is at the core of countless health IT initiatives, many by vendors but an increasing number by providers as well. Healthcare AI projects, for their part, seem likely to represent the next wave of “new stuff” adoption. AI is at the stage blockchain was at a year or two ago.

Before AI becomes more widely adopted in healthcare circles, though, the industry needs to tackle some practical issues with AI, and the list of “to-dos” keeps expanding. Only a few months ago, I wrote an item citing a few obstacles to healthcare AI deployment, which included:

  • The need to make sure clinicians understand how the AI draws its conclusions
  • Integrating AI applications with existing clinical workflow
  • Selecting, cleaning and normalizing healthcare data used to “train” the AI

Since then, other tough challenges to the use of healthcare AI have emerged as healthcare leaders think things over, such as:

Agreeing on best practices

Sure, hospitals would be interested in rolling out machine learning if they could, say, decrease the length of hospital stays for pneumonia and save millions. The thing is, how would they get going? At present, there’s no real playbook as to how these kinds of applications should be conceptualized, developed and maintained. Until healthcare leaders reach a consensus position on how healthcare AI projects should generally work, such projects may be too risky and/or prohibitively expensive for providers to consider.

Identifying use cases

As an editor, I see a few interesting healthcare AI case studies trickle into my email inbox every week, which keeps me intrigued. The thing is, if I were a healthcare CIO this probably wouldn’t be enough information to help me decide whether it’s time to take up the healthcare AI torch. Until we’ve identified some solid use cases for healthcare AI, almost anything providers do with it is likely to be highly experimental. Yes, there are some organizations that can afford to research new tech but many just don’t have the staff or resources to invest. Until some well-documented standard use cases for healthcare AI emerge, they’re likely to hang back.

The healthcare AI discussion is clearly at a relatively early stage, and more obstacles are likely to show up as providers grapple with the technology. In the meantime, getting these handled is certainly enough of a challenge.

Less Than Half of Healthcare Users Trust Critical Organizational Data

Posted on November 29, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

If you’re a healthcare CIO, you must hope that your users trust and feel they can leverage data to do their jobs better. However, some of your colleagues don’t seem to be so sure. A new study has concluded that less than half of users in responding healthcare organizations have a high degree of trust in their clinical, operational or financial data.

The study, which was conducted by Dimensional Insight, surveyed 85 chief information officers and other senior health IT leaders. It asked these leaders how they rated trust in the data leveraged by their various user communities, the percentage of user population they felt was self-service oriented and making data-driven decisions, and whether they planned to increase or decrease their investments in data trust and self-service analytics.

When rating the level of data trust on a 10-point scale, just 40% of respondents rated their trust in financial data at eight or above, followed by 40% of clinical data users and 36% of operational data users.

Perhaps, then, it follows that healthcare organizations responding to the survey had low levels of self-service data use. Clinical data users had a particularly low rate of self-service use, while financial users seemed fairly likely to be accessing and using data independently.

Given these low levels of trust and self-service data usage, it’s not surprising to find that 76% of respondents said they plan to increase their investment in improving clinical data trust, 77% in operational data trust and 70% in financial data trust.

Also, 78% said they plan to increase their spending on self-service analytics for clinical data and 73% expect to spend more on self-service analytics for operational data. Meanwhile, 68% plan to increase spending on financial self-service analytics, though 2% actually planned to decrease spending in this area, suggesting that this category is perhaps a bit healthier.

In summing up, the report included recommendations on creating more trust in organizational data from George Dealy, Dimensional Insight’s vice president of healthcare applications. Dealy’s suggestions include making sure that subject matter experts help to design systems providing information critical to their decision-making process, especially when it comes to clinicians. He also points out that health IT leaders could benefit from keeping key users aware of what data exists and making it easy for them to access it.

Unfortunately, there are still far too many data silos protected by jealous guardians in one department or another. While subject matter experts can design the ideal data sharing platform for their needs, there are still a lot of control issues to address before everyone gets what they need. In other words, increasing trust is well and good, but the real task is seeing to it that the data is rich and robust when users get it.

AI May Be Less Skilled At Analyzing Images From Outside Organizations

Posted on November 26, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Using AI technologies to analyze medical images is looking more and more promising by the day. However, new research suggests that when AI tools have to cope with images from multiple health systems, they have a harder time than when they stick to just one.

According to a new study published in PLOS Medicine, interest is growing in analyzing medical images using convolutional neural networks, a class of deep neural networks often dedicated to this purpose. To date, CNNs have made progress in analyzing X-rays to diagnose disease, but it’s not clear whether CNNs trained on X-rays from one hospital or system will work just as well in other hospitals and health systems.

To look into this issue, the authors trained pneumonia screening CNNs on 158,323 chest X-rays, including 112,120 X-rays from the NIH Clinical Center, 42,396 X-rays from Mount Sinai Hospital and 3,807 images from the Indiana University Network for Patient Care.

In their analysis, the researchers examined the effect of pooling data from sites with a different prevalence of pneumonia. One of their key findings was that when two training data sites had the same pneumonia prevalence, the CNNs performed consistently, but when a 10-fold difference in pneumonia rates was introduced between sites, their performance diverged. In that instance, the CNN performed better on internal data than on data supplied by an external organization.

The research team found that in 3 out of 5 natural comparisons, the CNNs’ performance on chest X-rays from outside hospitals was significantly lower than on held-out X-rays from the original hospital system. This may point to future problems when health systems try to use AI for imaging on partners’ data. This is not great to learn given the benefits AI-supported diagnosis might offer across, say, an ACO.
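The study’s code isn’t reproduced here, but the basic internal-versus-external check it describes is easy to sketch. Below is a minimal, hypothetical Python example, with a stand-in logistic regression playing the role of the trained CNN and random arrays standing in for labeled X-rays, showing how you would score a held-out test set from the training hospital and one from an outside hospital with the same model and compare the AUCs:

# A minimal sketch, not the study's code. A logistic regression on random
# "features" stands in for the trained pneumonia CNN so the example runs.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Stand-in training data and model (the real study trained CNNs on X-rays).
X_train = rng.normal(size=(1000, 20))
y_train = rng.integers(0, 2, size=1000)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Two labeled held-out sets: "internal" from the training hospital,
# "external" from an outside hospital.
X_internal, y_internal = rng.normal(size=(200, 20)), rng.integers(0, 2, size=200)
X_external, y_external = rng.normal(size=(200, 20)), rng.integers(0, 2, size=200)

# Score both sets with the same model and compare AUCs.
internal_auc = roc_auc_score(y_internal, model.predict_proba(X_internal)[:, 1])
external_auc = roc_auc_score(y_external, model.predict_proba(X_external)[:, 1])
print(f"Internal AUC: {internal_auc:.3f}, external AUC: {external_auc:.3f}")

In the actual study, a meaningful gap between those two numbers is what signaled that a model trained at one site wasn’t generalizing to X-rays from another.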

On the other hand, it’s worth noting that the CNNs were able to determine which organization originally created the images at an extremely high rate of accuracy and to calibrate their diagnostic predictions accordingly. In other words, it sounds as though over time, CNNs might be able to adjust to different sets of data on the fly. (The researchers didn’t dig into how this might affect their computing performance.)

Of course, it’s possible that we’ll develop a method for normalizing imaging data that works in the age of AI, in which case adjusting for different data attributes may not be necessary. However, we’re at the very early stages of training AIs for image sharing, so it’s anyone’s guess as to what form that normalization will take.

Sharing Records with Patients is the Right Thing to Do – OpenNotes

Posted on November 21, 2018 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

I’ve been a big fan of the OpenNotes effort for a long time. While I’ve heard every excuse in the business for why patients shouldn’t have access to their chart, all of those reasons have fallen flat. Much of that is thanks to the good work of the people at OpenNotes.

If your organization has not embraced opening up your chart notes to patients, what’s holding you back? The case for opening your notes to patients is clear.

If you want a more humorous look at this, check out this video featuring e-Patient Dave and clips from Seinfeld.

I’m not sure how I missed this video when it first came out, but it’s timeless. Plus, there’s no one better to share this message than e-Patient Dave whose life was literally saved because he demanded access to his chart.

No doubt, a lot of things have changed in the 20 years since the above episode aired. Among them are patients’ desire to access their chart and technology’s ability to deliver the chart to the patient at basically no cost.

If your organization hasn’t embraced OpenNotes, I encourage you all to do so now. They can answer all your questions and address all your doubts. Join the Movement and improve the care you provide patients.

Apple Health, Opioid Challenge, Safety Risk Heat Maps, and athenahealth Acquisition

Posted on November 20, 2018 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

We’re back again with a quick roll around Twitter in a roundup of some of the interesting tweets we’ve seen shared. This was quite a diverse set of tweets, so I think there will be something of interest for everyone in this Twitter Round Up.


This tweet is a little annoying for me. I know Matthew has the best of intentions, but there’s no way I’d call and ask my provider or hospital to take part in this. I’m an Android user. This type of access does nothing for me. Apple users seem to forget that. Plus, it’s worth mentioning that there are more Android users out there than Apple users. It’s great that Apple is doing this, but it’s not the game changing thing that so many make it out to be.


Numbers like this always take me aback. I just have to keep reminding myself that the opioid crisis wasn’t created overnight and it won’t be fixed overnight either.


Love this type of collaboration and creativity. One of the big things missing in healthcare is getting doctors off the reimbursement treadmill so they can take part in these types of creative activities. Also, a heat map of patient safety risk is pretty interesting to consider.


No doubt, we’ll hear a lot more about this acquisition in the future. As soon as Jonathan Bush was out as CEO, this company and people’s perception of it changed. He was the heart and soul of the company and it’s going to be much different going forward. As for the hospital piece of this tweet, I’ll be really interested to see if private equity is brave enough to continue Jonathan Bush’s ambitious hospital EHR strategy. I won’t be surprised if they pull the plug on it, but time will tell.

What’s the Future of the Open Source EHR, VistA?

Posted on November 19, 2018 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

I was going through some old draft posts (as I mentioned yesterday) and found a post that was started by Nate DiNiro which said “Will the VA and DoD help tip the scales on VistA adoption with OSEHRA?” Granted, this post was first started back in 2011. It’s amazing how much has changed since then.

We all know about the DoD’s selection of Cerner and Leidos to replace their EHR. A more surprising move was the VA’s decision to sole-source its EHR selection, choosing Cerner on the premise that it was essential to match the DoD’s choice. Certainly a topic for another blog post.

We’ve certainly heard many complaints from those in the VA community who are going to have a really hard time giving up VistA, which was basically tailored to many of their unique needs. However, there seems to be nothing stopping that ship now.

These events raise an interesting question about the future of VistA as the VA replaces its version of VistA with Cerner. The good news for those healthcare organizations on VistA is that it’s now open source, so the software can persist as long as there is a community of developers behind it. The core question is how much of VistA’s ongoing development came from the VA versus the community.

The two players I’ve seen using the open source VistA EHR platform are Medsphere and WorldVistA. I’ll admit that I haven’t seen too much news from either of them lately, but they both seem to be humming along.

I took a look at the ONC’s latest Health IT Dashboard stats for hospitals. In 2017 (their latest data), it reported 11 “providers with certified technology” for Medsphere and 1 for WorldVistA. Of course, this only counts those who have taken part in the meaningful use government program. It’s reasonable to assume that some open source EHR customers probably didn’t want to take part in meaningful use. Plus, these numbers don’t include international VistA installs, which obviously can’t take part in meaningful use.

Given these numbers and the VA pulling out of VistA, I have a feeling it’s going to be a hard road ahead for VistA.

I’ll never forget when it was first announced that the VA was open sourcing VistA and that anyone who wanted a free EHR could have it. What was amazing is that the HIM manager I was working with found an article about the announcement and brought it to me. She wondered why we were paying for an EHR if VistA was available for free. It gave me a chance to explain to her that “free software” doesn’t mean it’s free to implement and manage. Not to mention the fact that this was a small ambulatory clinic that was likely not a good fit for the hospital-focused VistA software.

What have you heard or seen with VistA? Has more been happening with the open source versions of VistA that I just haven’t seen? As a big open source user myself (my blogs run on pretty much all open source software), I’d love to see an open source EHR succeed. Unfortunately, it just hasn’t seen anywhere near the adoption it needs to really create that momentum yet.

Interoperability Is On An Accelerated Trajectory Says Redox CEO

Posted on November 16, 2018 | Written By

Colin Hung is the co-founder of the #hcldr (healthcare leadership) tweetchat, one of the most popular and active healthcare social media communities on Twitter. Colin speaks, tweets and blogs regularly about healthcare, technology, marketing and leadership. He is currently an independent marketing consultant working with leading healthIT companies. Colin is a member of #TheWalkingGallery. His Twitter handle is @Colin_Hung.

The lack of interoperability in healthcare continues to be a vexing challenge for Health IT companies, IT departments and patients. Redox is a company taking a unique approach to solving this problem. They have built a platform of reusable data connections between healthcare providers and the innovative cloud-based companies that have products those providers want to use.

Redox recently held their second annual Interoperability Summit at the new CatalystHTI facility in Denver, Colorado. Over three hundred people attended the event. The diverse audience included startups, hospitals, large HealthIT vendors, payors and government health agencies. The sessions at the Summit reflected the diversity of the audience and ranged from topics like “Hacking the Health System Sales Cycle” to “FHIR: Be the Right Amount of Excited”.

During the Summit, I sat down with Redox CEO, Luke Bonney, to talk about the state of interoperability, the willingness of the industry to share data and what advice he has for the dozens of startups that approach Redox each month.

Below is a transcript of our conversation.

What is the state of healthcare interoperability today?

I think we are in a good state right now, but more importantly I think we are on an accelerated trajectory to something better.

An accelerated trajectory?

Yes, but in order to explain why I’m saying that, we have to take a step back.

In my opinion healthcare interoperability is inextricably tied to the adoption and migration to the cloud. We will never have true data liquidity, which is the state that everyone wants – physicians, clinicians, administrators, patients, providers, payers, etc – until healthcare fully embraces cloud architectures and cloud thinking.

Healthcare is still predominantly an “on-premise” world. It’s not wrong. It’s just how the industry has grown up. We installed servers behind our own firewalls. As we added systems we bought more servers and of course we added them to the other servers behind the firewall. Eventually we built connections between these systems so that they could talk to each other. But because everything was behind the firewall and because we were really just sharing data within the same organization, we didn’t give much thought to sharing that data in a standard way. As long as we were behind the firewall we could safely exchange data.

When you approach things from a cloud perspective, the thinking is completely different. When you build cloud applications you HAVE TO think about data portability and security. You HAVE TO work out ways to connect systems together across the Internet without a single big firewall acting as your shield.

So as people move more and more to this way of thinking we will see more movement towards frictionless data exchange.

So is healthcare moving more to the cloud?

Working at Epic and now at Redox, I’ve had a front-row seat to this change in attitude towards the cloud by healthcare providers. Prior to 2015, healthcare IT leaders were still asking “What is the cloud?” and “Why should I bother with it?” But today leaders are starting to ask “How can I better leverage the cloud for my organization?” It’s great to see so many proactively looking for ways to adopt cloud-based applications.

I also think that the consumer tech giants are helping propel healthcare forward. Companies like Amazon and Google have always been cloud-based. As they push into healthcare they are going to have a huge advantage versus on-premise legacy companies. As they gain traction so too will the cloud.

I can see how embracing the cloud will help healthcare achieve secure connectivity and certainly scalability, but even if we move completely to the cloud won’t we still need to exchange data in a standard way in order to achieve true interoperability?

Having a data standard would certainly be helpful.

Is that going to be HL7 v2? v3? FHIR? Smart-on-FHIR? Or something that Commonwell Alliance puts out?

(Laughing) We do seem to have a lot of standards, don’t we?

Actually this is what is driving Redox. There really isn’t a ton of incentive to tear out the investments already made in HL7 v2 or v3. It works for the use cases where it has been deployed. The same applies to FHIR and Commonwell. All these approaches work wonderfully for specific use cases, but I really doubt any one of these approaches is going to be the single solution for all of our interoperability challenges.

Think about it. If I’m a CIO at a hospital and I have a working HL7 v2 integration between two systems, why would I waste precious resources to move to a different integration standard if there is really nothing to be gained from it? It’d be a waste of time and resources.

The one good thing about all these standards and interoperability initiatives is that we are building an audience that is asking the right questions and pushing healthcare in the right direction. APIs are the right thing to do. FHIR is the right thing to do…and so on. All are relevant and needed.
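For readers who haven’t handled these formats, here’s a rough, hypothetical illustration of why the standards aren’t simply interchangeable: the same made-up patient demographics expressed as an HL7 v2 PID segment and then reshaped into a FHIR-style Patient resource using plain Python. This is illustrative only, not Redox or vendor code.

# Made-up values; field positions follow the common HL7 v2 PID layout
# (PID-3 identifier, PID-5 name, PID-7 date of birth, PID-8 sex).
pid_segment = "PID|1||12345^^^HOSP^MR||DOE^JANE||19800101|F"

fields = pid_segment.split("|")
patient_id = fields[3].split("^")[0]
family, given = fields[5].split("^")[:2]
dob = fields[7]

# The same demographics reshaped to mirror the shape of a FHIR Patient resource.
fhir_patient = {
    "resourceType": "Patient",
    "id": patient_id,
    "name": [{"family": family.title(), "given": [given.title()]}],
    "birthDate": f"{dob[:4]}-{dob[4:6]}-{dob[6:]}",
    "gender": {"F": "female", "M": "male"}.get(fields[8], "unknown"),
}
print(fhir_patient)

The point of the sketch isn’t that one format is better; it’s that a working v2 feed already carries these fields, so a hospital has little reason to rip it out just to resend the same data as FHIR, which is exactly the incentive problem Luke describes.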

So if not a universal data standard, what do we need?

The way I see things, we might not need a single data standard if someone can build a common platform through which data can be shared. That’s what we’re doing here at Redox. We’re taking a pragmatic approach. Whatever data standard you are using internally is fine with us. We’ll work with you to find a way to share your data through our platform. And once you share it with us once, you don’t have to rebuild that connection over and over again each time a different company wants to connect. We handle that.

Is that the problem Redox set out to solve?

Actually when we started Redox we really just wanted to make it easier for cloud-based healthcare companies to scale and grow. What we realized is that one of the biggest impediments to growth was integrating legacy on-prem systems with cloud-based applications. Even if these companies could convince hospital IT teams to put their integration on the priority list, it would take a long time to actually get it done.

So we built the Redox engine to make this easier. Our goal wasn’t to solve interoperability per se; we just wanted to bring innovative web developers closer to healthcare providers so that they can solve problems together.

But because we were cloud from Day 1, we wanted to build everything in a reusable way, so that once we built a connection to one hospital, we wouldn’t have to build it again when the next company wanted to connect with that same hospital. This network effect wasn’t something we originally set out to build, but now it’s central to our success. It’s why we can talk about being a platform that enables data sharing vs being a tool that helps systems share data.

Solving interoperability is only partly a technology challenge. There is also the challenge of getting the healthcare ecosystem to actually share their data. Because Redox works with so many players in the ecosystem, have you noticed any change in attitude around sharing data?

Let me start by saying that I think everyone WANTS the data. There’s incredible value in health data. Medical records are a gold mine for researchers, public health authorities, pharma companies, payors, etc. Everyone would love nothing more than to build a comprehensive health record for their own purposes. The challenge of course is that it’s not easy to do that today. As you said, this is partly because of technology and partly because no one really wants to share their data altruistically.

I think there is one party that truly wants data to be shared and that’s patients. Patients are way more interested in sharing data than anyone else in the ecosystem. As a patient, data should follow me wherever I go. I never want to wonder if my doctor has all my medical information. I want people to have the data because I want the best outcome possible and my data can help make that happen.

I think companies and organizations in the healthcare ecosystem are slowly waking up to the fact that sharing data helps support their customers – whether those customers are providers, payors, members, patients, clinicians or government agencies. Sharing data makes things better. And as financial pressures in healthcare mount, everyone is looking for ways to do more, better, faster and with more accuracy. Sharing data is necessary for that to happen.

Redox works with a lot of startups and small/medium sized HealthIT companies. What advice would you give to those that are considering working with Redox? What should they consider beforehand?

There are two key questions that I think every HealthIT company should ask themselves. First, what is the value your product or service provides? Second, who is the buyer? Success in healthcare is less about your technology and more about aligning three things:

  1. An actual problem that needs to be solved
  2. A solution to that problem
  3. A buyer who can make a buying decision in a healthcare organization

I see a lot of companies that don’t really consider this last question. You can create an amazing product that solves a problem in healthcare but if the target audience for your product cannot make the buying decision then you have a difficult road ahead of you.

Beyond these questions, I would advise companies to really consider how their products integrate into the clinical or administrative workflow. Many startups begin with an application that isn’t integrated with existing hospital systems, like the EHR. But after they gain a little bit of traction they realize they need to become more integrated. Building real-time data exchange into an application isn’t easy, so you need to really think through how your product will handle this.

Lastly I would caution healthcare entrepreneurs about building their applications on the assumption that FHIR will be universally adopted. It isn’t and it will likely take years before it gains real-world traction. There is a lot of excitement around FHIR, but it isn’t the best solution for all situations.

Final Thoughts?

One thing I am encouraged by is the number of people and companies from outside of healthcare that are coming into this space. I think they bring an energy and perspective that will help us all get better. Granted, many of them have stars in their eyes and don’t realize how tough healthcare can be…but, the fact that they aren’t burdened with any legacy thinking is exciting. Healthcare needs more outside thinking.

Interoperability Problems Undercut Conclusions of CHIME Most Wired Survey

Posted on November 13, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Most of you have probably already seen the topline results from CHIME’s  “Healthcare’s Most Wired: National Trends 2018” study, which was released last month.

Some of the more interesting numbers coming out of the survey, at least for me, included the following:

  • Just 60% of responding physicians could access a hospital network’s virtual patient visit technology from outside its network, which kinda defeats the purpose of decentralizing care delivery.
  • The share of hospitals sending clinical alerts from a surveillance system integrated with an EHR topped out at 58% (for alerts sent to critical care units), with 35% of respondents reporting that they had no surveillance system in place. This seems like quite a lost opportunity.
  • Virtually all (94%) participating organizations said that their organization’s EHR could consume discrete data, and 64% said they could incorporate CCDs and CCRs from physician-office EHRs as discrete data.

What really stands out for me, though, is that if CHIME’s overall analysis is correct, many aspects of our data analytics and patient engagement progress still hang in the balance.

Perhaps by design, the hospital industry comes out looking like it’s doing well in most of the technology strategy areas the survey asks about, but the report leaves out some important areas of weakness.

Specifically, in the introduction to its survey report, the group lists “integration and interoperability” as one of two groups of foundational technologies that must be in place before population health management/value-based care,  patient engagement and telehealth programs can proceed.

If that’s true, and it probably is, it throws up a red flag, which is probably why the report glossed over the fact that overall interoperability between hospitals is still very much in question. (If nothing else, it’s high time the hospitals adjust their interoperability expectations.) While it did cite numbers regarding what can be done with CCDs, it didn’t address the much bigger problems the industry faces in sharing data more fluidly.

Look, I don’t mean to be too literal here. Even if CHIME didn’t say so specifically, hospitals and health systems can make some progress on population health, patient engagement, and telehealth strategies even if they’re forced to stick to using their own internal data. Failing to establish fluid health data sharing between facility A and facility B may lead to less-than-ideal results, but it doesn’t stop either of them from marching towards goals like PHM or value-based care individually.

On the other hand, there certainly is an extent to which a lack of interoperability drags down the quality of our results. Perhaps the data sets we have are good enough even if they’re incomplete, but I think we’ve already got a pretty good sense that no amount of CCD exchange will get the results we ultimately hope to see. In other words, I’m suggesting that we take the CHIME survey’s data points in context.

MRI Installation Slip Disables Hospital iOS Devices

Posted on November 9, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

The following is the story of an MRI installation which took a surprising turn. According to a recent post on Reddit which has since gone viral in the IT press, a problem with the installation managed to shut down and completely disable every iOS-based device in the facility.

A few weeks ago, Erik Wooldridge of  Chicago’s Morris Hospital, a perplexed member of the r/sysadmin subreddit, posted the following:

This is probably the most bizarre issue I’ve had in my career in IT. One of our multi-practice facilities is having a new MRI installed and apparently something went wrong when testing the new machine. We received a call near the end of the day from the campus that none of their cell phones work after testing [the] MRI… After going out there we discovered that this issue only impacted iOS devices. iPads, iPhones, and Apple Watches were all completely disabled.

According to Wooldridge, the outage affected about 40 users. Many of the affected devices were completely dead. Others that could power on seemed to have issues with the cellular radio, though the Wi-Fi connections continued to work. Over time, the affected devices began to recover, but one iPhone had severe service issues after the incident, and while some of the Apple Watches remained on, the touchscreens hadn’t begun working after several days.

At first, Wooldridge and his colleagues feared that the outage could be due to an electromagnetic pulse, a terrifying possibility which could’ve meant very bad things for the hospital’s data center. Fortunately, that didn’t turn out to be the problem.

Later, the vendor, GE, told Wooldridge and his colleagues that the problem was a leak of the liquid helium used to cool the MRI’s superconducting magnets. GE engineers turned out to be right that the leak was the source of the problems, but couldn’t explain why Android devices were untouched by the phenomenon.

Eventually, a blogger named Kyle Wiens with iFixit.org seems to have found an explanation for why iOS devices were hit so hard by the helium leak. Newer iPhones rely on tiny MEMS timing oscillators rather than traditional quartz crystals, and helium atoms are small enough to seep into those sealed components and throw off the device’s clock. Apparently, even Apple admits that exposing iPhones to evaporating liquefied gases such as helium could take them offline.

While no one’s suggesting that liquefied helium is good for any type of microelectronic device, the bottom line seems to be that iOS devices are more sensitive to this effect than Android devices. Let’s hope most readers never have to test this out for themselves.