
Sutter Health Blends EHR, Patient-Reported Data For MS Treatment

Posted on December 5, 2016 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as an editor-in-chief, and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth.

The Sutter Health network is launching a new research project which will blend patient-reported and EHR-based data to improve the precision of multiple sclerosis treatment. Sutter will fund the project with a $1.2 million award from the California Initiative to Advance Precision Medicine.

To conduct the project, Sutter Health researchers are partnering with colleagues at the University of California, San Francisco. Working together, the team is developing a neurology application dubbed MS-SHARE which will be used by patients and doctors during appointments, and by patients between appointments.

During the 18-month demonstration project, the team will build the app with input from the health system’s doctors as well as MS patients. Throughout the process of care, the app will organize both patient-reported data and EHR data, in a manner intended to let doctors and patients view the data together and work together on care planning.

Over the short term, researchers and developers are focusing on outcomes like patient and doctor uptake of the app and improvements in the patient experience. The project's big-picture goals, meanwhile, include improving patient outcomes such as disease progression and symptom control. Ultimately, the team hopes the results of this project go beyond supporting multiple sclerosis patients to helping improve care for other neurological diseases such as Parkinson's disease, seizure disorders and migraine headaches.

The Sacramento, Calif.-based health network pitches the project as potentially transformative. “MS-SHARE has the potential to change how doctors and patients spend their time during appointments,” the press release asserts. “Instead of ‘data finding and gathering,’ doctors and patients can devote more time to conversation about how the care is working and how it needs to be changed to meet patient needs.”

Time for an editorial aside here. As a patient with a neurological disorder (Parkinson’s), I’m here to say that while this sounds like an excellent start at collaborating with patients, at first glance it may be doomed to limited success at best.

What I mean is as follows. When I meet with the neurologist to discuss progression of my symptoms, he or she typically does little beyond the standard exam. In fact, my sense is that most seem quite satisfied that they know enough about my status to make decisions after doing that exam. In most cases, little or nothing about my functioning outside the office makes it into the chart.

What I'm trying to say here is that, based on my experience, it will take more than a handy-dandy app to win neurologists over to collaborating over charts and data with any patient. (Honestly, I think that's true of many doctors outside this specialty, too.) And I'm not suggesting that this is because they're arrogant, although they may be in some cases. Rather, I'm suggesting that it's a workflow issue. Integrating patients into the discussion isn't just a change of pace; it could be seen as a distraction that could lead to worse care rather than better. It will be interesting to see if that's how things turn out.

Bringing EHR Data to Radiologists

Posted on December 2, 2016 | Written By

John Lynn is the founder of a blog network that currently consists of 10 blogs containing over 8,000 articles, with John having written over 4,000 of the articles himself. These EMR and healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career health IT job board and blog. John is highly involved in social media, and in addition to his blogs can also be found on Twitter (@techguy and @ehrandhit) and LinkedIn.

One of the most interesting things I saw at RSNA 2016 in Chicago this week was Philips' Illumeo. Besides being a really slick take on the radiology interfaces Philips has been building forever, it adds a kind of "war room" dashboard for the patient that includes a bunch of data brought in from the EHR using FHIR.

When I talked with Yair Briman, General Manager for Healthcare Informatics Solutions and Services at Philips, he talked about the various algorithms and machine learning that go into the interface a radiologist sees in Illumeo. As has become an issue across much of healthcare IT, the amount of health data available for a patient is overwhelming. In Illumeo, Philips is working to present only the information that's needed for the patient at the time it's needed.

For example, if I'm working on a head injury, do I want to see the old X-ray from a knee issue the patient had 20 years ago? Probably not, so that information can be hidden. I may be interested in the problem list from the EHR, but do I really need to know about a cold from 10 years ago? Probably not. Notice the "probably." The radiologist can still drill down into that other medical history if they want, but this type of smart interface, which understands context and hides irrelevant info, is something we're seeing across all of healthcare IT. It's great to see Philips working on it for radiologists.
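
To make the idea concrete, here is a minimal sketch of this kind of context-aware filtering. The record fields, relevance rule and five-year cutoff are illustrative assumptions on my part, not Philips' actual logic:

```python
from datetime import date

def relevant_history(history, context, today, max_age_years=5):
    """Keep items that match the current study's body region, or anything recent.

    Everything else is hidden by default, but remains available to drill into.
    """
    kept = []
    for item in history:
        age_years = (today - item["date"]).days / 365.25
        same_region = item["body_region"] == context["body_region"]
        if same_region or age_years <= max_age_years:
            kept.append(item)
    return kept

history = [
    {"desc": "Knee X-ray", "body_region": "knee", "date": date(1996, 5, 1)},
    {"desc": "Head CT", "body_region": "head", "date": date(2016, 2, 10)},
]
context = {"body_region": "head"}
# The 20-year-old knee film is hidden; the relevant head study surfaces.
shown = relevant_history(history, context, today=date(2016, 12, 1))
```

A real system would weigh far more signals (modality, problem list, reason for exam), but the principle is the same: default to showing only what the current context makes relevant.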

While creating a relevant, adaptive interface for radiologists is great, I was fascinated by Philips' work pulling EHR data into the radiologist's native interface. Far too often we only talk about the exchange happening in the other direction. It's great to see third-party applications utilizing data from the EHR.

In my discussion with Yair Briman, he pointed out some interesting data. He commented that Philips manages 135 billion images. For those keeping track at home, that amounts to more than 25 petabytes of data. I don’t think most reading this understand how large a petabyte of data really is. Check out this article to get an idea. Long story short: that’s a lot of data.

How much data is in every EHR? Maybe one petabyte? This is just a guess, but it's significantly smaller than imaging, since most EHR data is text. OK, so the EHR data is probably 100 terabytes of text and 900 terabytes of scanned faxes. (Sorry, I couldn't help but take a swipe at faxes.) Regardless, this pales in comparison to the size of radiology data. With this difference in mind, should we stop thinking about trying to pull the radiology data into the EHR and start spending more time on how to pull the EHR data into a PACS viewer?
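
A quick back-of-envelope check of the numbers quoted above (decimal units, and treating the 1 PB EHR figure as the guess it is):

```python
# Figures from the post: 135 billion images, "more than 25 petabytes".
images = 135e9
imaging_bytes = 25e15
avg_image = imaging_bytes / images   # average bytes per image, ~185 KB

# The guessed EHR total: 100 TB of text plus 900 TB of scanned faxes = 1 PB.
ehr_bytes = 1e15
text_tb, fax_tb = 100, 900
assert (text_tb + fax_tb) * 1e12 == ehr_bytes

# Imaging dwarfs the EHR estimate by roughly this factor.
ratio = imaging_bytes / ehr_bytes
```

The ~185 KB average per image sounds low for radiology until you remember that a single study contains many individual images; either way, the 25x gap is what drives the question about which direction the data should flow.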

What was also great about the Philips product I saw was its really slick browser-based HTML5 viewer for radiology images. Certainly this is a great way to send radiology images to a referring physician, but it also points to the opportunity to link all of these radiology images from the EHR. The reality is that most doctors don't need all the radiology images in the EHR. However, if they had an easy link to access the radiology images in a browser when they did need them, that would be a powerful thing. In fact, I think many of the advanced EHR implementations have, or are working on, this type of integration.
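
As a sketch of what such linking could look like, the snippet below turns imaging study records into browser viewer URLs the EHR could embed. The viewer base URL and record shape are hypothetical; a real integration would likely pull FHIR ImagingStudy resources and key off the DICOM study UID, as modeled here:

```python
def viewer_links(studies, viewer_base="https://viewer.example.org/study/"):
    """Turn imaging study records into (description, viewer URL) pairs.

    Only a link leaves the EHR; the pixel data stays in the imaging system
    and is rendered on demand in the browser viewer.
    """
    links = []
    for study in studies:
        uid = study["uid"]  # DICOM Study Instance UID
        desc = study.get("description", "Imaging study")
        links.append((desc, viewer_base + uid))
    return links

studies = [{"uid": "1.2.840.113619.2.55.3", "description": "Head CT"}]
links = viewer_links(studies)
```

The design point is that the EHR never stores the images themselves, just stable identifiers it can resolve to a viewer session when a clinician (or, ideally, a patient) actually needs them.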

Of course, we shouldn't just stop with physicians. How about linking all your radiology images from the patient portal as well? It's nice when they hand you a DVD of your radiology images. It would be much nicer to be able to easily access them anytime and from anywhere through the patient portal. The great part is, the technology to make this happen is there. Now we just need to implement it and open these images up to patients.

All in all, I love that Philips is bringing the EHR data to the radiologists. That context can really improve healthcare. I also love that they’re working to make the interface smarter by removing data that’s irrelevant to the specific context being worked on. I also can’t wait until they make all of this imaging data available to patients.

Hospital Program Uses Connected Health Monitoring To Admit Patients “To Home”

Posted on November 28, 2016 | Written By

A Boston-based hospital has kicked off a program in which it will evaluate whether a mix of continuous connected patient monitoring and daily clinician visits can reduce hospitalizations for common medical admissions.

The Home Hospital pilot, which will take place at Partners HealthCare Brigham and Women’s Hospital, is being led by David Levine, MD, MA, a physician who practices at the hospital. The hospital team is working with two vendors to implement the program, Vital Connect and physIQ. Vital Connect is supplying a biosensor that will continuously stream patient vital signs; those vital signs, in turn, will be analyzed and viewable through physIQ’s physiology analytics platform.
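
As an illustration of the kind of analysis such a platform might run on a continuously streamed vital sign (a generic sketch, not physIQ's actual analytics), one simple approach flags readings that drift well above the patient's own rolling baseline. The window size and tolerance are invented for illustration:

```python
from collections import deque

def baseline_alerts(readings, window=5, tolerance=15):
    """Return (index, value) pairs for readings far above the rolling mean.

    Comparing each reading to the patient's own recent baseline, rather than
    a fixed population threshold, is the core idea behind personalized
    physiology analytics.
    """
    recent = deque(maxlen=window)
    alerts = []
    for i, value in enumerate(readings):
        if len(recent) == window:
            baseline = sum(recent) / window
            if value - baseline > tolerance:
                alerts.append((i, value))
        recent.append(value)
    return alerts

heart_rate = [72, 74, 73, 75, 74, 76, 110, 75]
alerts = baseline_alerts(heart_rate)
```

In a real deployment, the alert would be what routes a home-based patient to a same-day clinician visit instead of waiting for the scheduled one.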

The Home Hospital pilot is one of two efforts planned by the team to analyze how technology-enabled home-based care can treat patients who might otherwise have been admitted to the hospital. For this initiative, a randomized controlled trial, patients diagnosed at the BWH Emergency Department with exacerbation of heart failure, pneumonia, COPD, cellulitis or complicated urinary tract infection are being placed at home with the Vital Connect/physIQ solution and receiving daily clinician visits.

The primary aim of this program, according to participants, is to demonstrate that the in-home model they've proposed can provide appropriate care at a lower cost at home, as well as improve outcome measures such as health-related quality of life, patient safety, quality and overall patient experience.

According to a written statement, the first phase of the initiative, which began in September of this year, involves roughly 60 patients, half of whom are receiving traditional in-hospital care while the other half are being treated at home. If the early phase proves successful, the hospital will probably scale the pilot up to 500 patients in early 2017.

Expect to see more hospital-based connected care options like these emerge over the next year or two, as they’re just too promising to ignore at this point.

Perhaps the most advanced example I've written about to date is the Chesterfield, Mo.-based Mercy Virtual Care Center, which describes itself as a "hospital without beds." The $54M Virtual Care Center, which launched in October 2015, employs 330 staffers providing a variety of telehealth services, including virtual hospitalists, telestroke and, perhaps most relevant to this story, a "home monitoring" service which provides continuous monitoring for more than 3,800 patients.

My general impression is that few hospitals are ready to make the kind of commitment Mercy did, but that most are curious and some quite interested in actively implementing connected care and monitoring as a significant part of their service line. It's my guess that it won't take many more successful tests to convince a wide swath of hospitals to get off the fence and join them.

6 Things EHRs Should Be Thankful For

Posted on November 25, 2016 | Written By

Colin Hung is the co-founder of the #hcldr (healthcare leadership) tweetchat, one of the most popular and active healthcare social media communities on Twitter. Colin is a true believer in #HealthIT, social media and empowered patients. Colin speaks, tweets and blogs regularly about healthcare, technology, marketing and leadership. He currently leads the marketing efforts for @PatientPrompt, a Stericycle product. Colin's Twitter handle is @Colin_Hung.

'Tis the season for being thankful for the friends, family and bounty we have in our lives. It is a time to celebrate the end of the year with copious amounts of food and reflect upon the good things that have happened in our lives.

In the spirit of Thanksgiving, I thought it would be fun to give voice to what an EHR would be thankful for this year. So if I put my mind into that of an EHR here are the top 6 things I’d be thankful for.

  1. Meaningful Use. Thank you for five great years. It was only through the infusion of $35B that thousands of my brethren were adopted/implemented across the United States. Without MU, we EHRs would not have proliferated to the degree that we have. #forevergrateful
  2. Doctors. Absolutely thankful for all the doctors who use us every day. We love how much time and attention you are giving us in 2016. It’s almost embarrassing how you stare at our screens and don’t get distracted by the other people in the exam room with you (I think you call them patients…and I think I have a field for that). We look forward to more of the same next year. Thank you!
  3. Nurses. Thank you to all the nurses out there. Your constant clicking on our drop-down boxes and check boxes is like a daily “tickle”. We hope you aren’t too mad at us for making it difficult to get the information you want. It’s only because we want to spend more time with you. #love
  4. EHR Consultants. I am grateful this year for the army of EHR consultants that are out there. Without you, we EHRs would have been relegated to the scrap heap long ago. Thank you for working hard to optimize us, customizing us to better suit user needs and teaching people how to use us effectively. We owe our longevity to you.
  5. Health IT Media. Thank you to the Health IT media for keeping the spotlight on EHRs in 2016 – despite it being the last year of the Meaningful Use program. Whether you like us or not, we EHRs have become the backbone of healthcare and there are a lot of things that can be improved – but only if people stay focused on their EHR journeys. Installation was just the first step. So all you columnists, writers, bloggers and Tweeters out there, please keep EHRs on the radar.
  6. EHR Vendors. I shudder to think of where we would be without our creators in 2016. It was exciting to watch you build “partner ecosystems” around us. These add-ons really helped to unlock the usefulness of the data we’ve been keeping safe. I know we wanted to work on something called “usability” but I’m sure we’ll get to it next year.

Happy Thanksgiving!

Longitudinal Patient Record Needed To Advance Care?

Posted on November 23, 2016 | Written By

In most day to day settings, a clinician only needs a small (if precisely focused) amount of data to make clinical decisions. Both in ambulatory and acute settings, they rely on immediate and near-term information, some collected during the visit, and a handful of historical factors likely to influence or even govern what plan of care is appropriate.

That may be changing, though, according to Cheryl McKay of Orion Health. In a recent blog item, McKay argues that as the industry shifts from fee-for-service payment models to value-based reimbursement, we’ll need new types of medical records to support this model. Today, the longitudinal patient record and the community care plan are emerging as substitutes for old EMR models, McKay says. These new entities will be built from varied data sources, including payer claims, provider EMRs, patient health devices and the patients themselves.

As these new forms of patient medical record emerge, effective population health management is becoming more feasible, she argues. Longitudinal patient records and community care plans are “essential as we steer away from FFS…The way records are delivered to healthcare providers– with an utter lack of visibility and a lot of noise from various data sources– creates unnecessary risks for everyone involved.”

She contends that putting these types of documentation in place, which summarize patient-based clinical experiences rather than episodic ones, closes big gaps in patient history that would otherwise generate mistakes. Longitudinal record-keeping also makes it easier for physicians to aggregate information, do predictive modeling and intervene proactively in patient care at both the patient and population level.

She also predicts that with both a longitudinal patient record and a community care plan in place, giving providers of all stripes a “panoramic” look at patients, costs will fall as providers stop performing needless tests and procedures. Not only that, these new entities would ideally offer real-time information as well, including event notifications, keeping all the providers involved in a patient’s care in sync.
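
The core mechanics of a longitudinal record are easy to sketch: pull events for one patient from several sources and order them into a single timeline. The record shapes below are invented for illustration; the hard part in practice is the interoperability, not the merge:

```python
def longitudinal_record(patient_id, *sources):
    """Merge one patient's events from many sources into a single timeline.

    Each source (EMR, payer claims, home devices) contributes events; the
    result is chronological rather than episodic. ISO dates sort lexically.
    """
    events = [
        event
        for source in sources
        for event in source
        if event["patient_id"] == patient_id
    ]
    return sorted(events, key=lambda e: e["date"])

emr = [{"patient_id": "p1", "date": "2016-03-01", "event": "Office visit"}]
claims = [{"patient_id": "p1", "date": "2016-01-15", "event": "ED claim"}]
devices = [{"patient_id": "p1", "date": "2016-06-20", "event": "Home BP high"}]
timeline = longitudinal_record("p1", emr, claims, devices)
```

Note what the merge quietly assumes: a shared patient identifier and comparable date semantics across sources, which is exactly where real-world efforts tend to struggle.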

To be sure, this blog item is a pitch for Orion’s technology. While the notion of a community-care plan isn’t owned by anyone in particular, Orion is pitching a specific model which rides upon its population health technology. That being said, I’m betting most of us would agree that the idea (regardless of which vendor you work with) of establishing a community-wide care plan does make sense. And certainly, putting a rich longitudinal patient record in place could be valuable too.

However, given the sad state of interoperability today, I doubt it’s possible to build this model unless you choose a single vendor-centric solution. At present, I think it’s more of a dream than a reality for most of us.

Health System Sees Big Dividends From Sharing Data

Posted on November 21, 2016 | Written By

For some health organizations, the biggest obstacle to data sharing isn’t technical. Many a health IT pundit has argued — I think convincingly — that while health organizations understand the benefits of data sharing, they still see it as against their financial interests, as patients with access to data everywhere aren’t bound to them.

But recently, I read an intriguing story by Healthcare IT News about a major exception to the rule. The story laid out how one healthcare system has been sharing its data with community researchers in an effort to promote innovation. According to writer Mike Miliard, the project could proceed because the institution first eliminated many data silos, giving it a disciplined view of the data it shared.

At Sioux Falls, South Dakota-based Sanford Health, one health leader has departed from standard health system practices and shared a substantial amount of proprietary data with research organizations in his community, including certain clinical, claims, financial and operational data. Sanford is working with researchers at South Dakota State University on mathematics issues, University of South Dakota business researchers, Dakota State University on computer science/informatics and University of North Dakota on public health.

The effort is led by Benson Hsu, MD, vice president of enterprise data and analytics for the system. Hsu tells the magazine that the researchers have been developing analytical apps which are helping the health system with key issues like cost efficiencies, patient engagement and quality improvement. And more radically, Hsu plans to share what he discovers with competitors in the community.

Hsu laid the groundwork for the program, Healthcare IT News reports, by integrating far-flung data across the sprawling health system, including multiple custom versions of the Epic EHR, multiple financial accounts and a variety of HR systems; analytics silos cutting across areas from clinical decision support and IT reports to HR/health plan analytics; and data barriers that included a lack of common data terms, benchmarking tools and a common analytics calculator. But after spending a year pulling these areas into a functioning analytics foundation, Sanford was ready to share data with outside entities.

At first, Hsu’s managers weren’t fond of the idea of sharing masses of clinical data with anyone, but he sold them on the idea. “It’s the right thing to do. More importantly, it’s the right thing to do for the community — and the community is going to recognize that Sanford Health is here for the community,” he argued. “Secondly, it’s innovation. Innovation in our backyard, based on our population, our social determinants, our disparities.”

According to Healthcare IT News, this “crowdsourced” approach to analytics has helped Sanford make progress with predicting risk, chronic disease management, diagnostic testing and technology utilization, among other things. And there’s no reason to think that the effort won’t keep generating progress.

Many institutions would have shot down an effort like this immediately, before it could accomplish results. But it seems that Sanford’s creative approach to big data and analytics is paying off. While it might not work everywhere, I’m betting there are many other institutions that could benefit from tapping the intellect of researchers in their community. After all, no matter how smart people are, some answers always lie outside your walls.

Rush University Medical Center Rolls Out OpenNotes

Posted on November 18, 2016 | Written By

Back in 2010, a group of primary care doctors from three different healthcare organizations across the US came together to launch a project in which they’d begin sharing their clinical notes directly with their patients. The doctors involved were part of a 12-month study designed to explore how such sharing would affect healthcare. The project was a success, and today, 10 million patients have access to their clinicians’ notes via OpenNotes.

Now, Rush University Medical Center has joined the party. The 664-bed academic hospital, which is based in Chicago, now allows patients to see all of their doctor’s notes through a secure web link which is part of Epic’s MyChart portal. According to Internet Health Management, Rush has been piloting OpenNotes since February and rolled it out across the system last month.  Patients could already use MyChart to review physician instructions, prescriptions and test orders online.

If past research is any indication, the new service is likely to be a hit with patients. According to a study from a few years ago, which looked at 3,874 primary care patients at Beth Israel Deaconess Medical Center, Geisinger Health System and Harborview Medical Center, 99% of study participants wanted continued access to clinician notes after having it for one year. This was true despite the fact that almost 37% of patients reported being concerned about privacy after using the portal during that time.

Dr. Allison Weathers, Rush’s associate chief medical information officer, told the site that having access to the notes can help individuals with complex health needs who are under the care of multiple providers. “Research shows that when patients can access their physicians’ notes, they better understand the medical issues and treatment plan as active partners in their care,” she said. “When a patient is sick, tired or stressed during a doctor’s visit, they may forget what the doctors said or prescribed.”

I think it’s also apparent that giving patients access to clinician notes helps them engage further with the process of care. Ordinarily, for many patients, medical notes from their doctor are just something they hand along to another doctor. However, when they have easy access to their notes, alongside test results, appointment scheduling, physician email access and other portal functions, it helps them become accustomed to wading through these reports.

Of course, some doctors still aren’t OpenNotes-friendly. It’s easy to see why. For many, the idea of sharing such private notes (and perhaps some unflattering conclusions) has been out of the question. Many have suggested that if patients read the notes, physicians won’t feel free to share their real opinions on matters of patient care and prognosis. But the growth of the OpenNotes program suggests to me that the effect of sharing notes has largely been beneficial, giving patients the opportunity not only to correct any factual mistakes but to better understand their provider’s perspective. As I see it, only good can come from this over the long run.

EMR Data Archival Strategy Deep Dive – Tackling EHR & EMR Transition Series

Posted on November 14, 2016 | Written By

The following is a guest blog post by Robert Downey, VP of Product Development at Galen Healthcare Solutions.

Inside the world of data archival (Download this Free Data Archive Whitepaper for a deep dive into the subject), there are nearly as many different types of archives as there are vendors. Many of the existing archival solutions that have gained popularity with large healthcare organizations are ones that are also frequently utilized by other sectors and often claim to be able to “archive anything.”

This can be very appealing, as an organization going through a merger will often retire dozens or even hundreds of systems, some clinical, but most only tangentially related to the delivery of care. HR systems, general ledger financial systems, inventory management, time tracking systems, and CRMs are just a few of the systems that might also be slated for the chopping block. The idea of retiring all of these into a single logical archival solution is very appealing, but this approach can be a dangerous one. The needs of healthcare organizations are not necessarily the same as the needs of other sectors.

To understand why some archival approaches are superior to others, it’s useful to visualize the way each solution extracts, stores, and visualizes data. The methodologies used typically trade fidelity (how well the archive preserves the original shape and precision of the data) for accessibility (how easy it is to get at the information you need), and they likewise trade breadth (how easily the solution can archive disparate sources of data, such as both an EMR and a time-tracking system) for, again, accessibility.

There are certainly other ways to judge an archival solution. For instance, an important factor may be whether the solution is hosted on-premises or remotely by the archival vendor. Some factors, such as the reliability of the system, service level agreements, or overall licensing cost, are big inputs into the equation as well, but those aren’t necessarily specific to the overall archival strategy the solution uses. There are also factors so critical, such as security and regulatory compliance, that deficiencies in these areas are deal-breakers. Now that we have criteria with which to judge a solution, let’s delve into the specific archival strategies being used in the marketplace.

Raw Data Backups
A shockingly large number of organizations treat raw data backups of the various databases and file systems as their archival solution. There are some scenarios in which this may be good enough, such as when the source system is not so much being retired as it is being upgraded or otherwise still maintained. Another scenario might be when the data in question comes from systems so well known that the organization won’t have significant issues retrieving information when it becomes necessary. The greatest benefit to this approach is that acquiring the data is fairly trivial. Underlying data stores almost always offer easy built-in backup mechanisms. Indeed, the ability to back up data is a certification requirement for EMRs, as well as a HIPAA and HITECH legal requirement. This strategy also offers “perfect” data fidelity, as the data is in the raw, original format.

Once it actually comes time to access the “archived” data, however, the organization is forced to fully reverse engineer the underlying database schemas and file system encodings. This leads to mammoth costs and protracted timelines for even simple data visualization, and it’s a major undertaking to offer any kind of significant direct clinician or compliance access to the data.

Another danger with raw database backups is that many clinical system vendors have language in their licensing related to the “reverse engineering” of their products. So while it may be “your” data, the vendor may consider their schema intellectual property — and the act of deciphering it, not to mention keeping a copy of it after the licensing agreements with the system vendor have been terminated — may well be a direct violation of the original licensing agreement.

Fully Modeled Schema
A common approach utilized by healthcare-specific archival solutions is to create a lightweight EMR and practice management schema that includes the most common data attributes from many different source system vendors and then map the data in the source system to this fully modeled schema. The mapping involved is usually limited to field-type mapping rather than dictionary mapping, although occasionally, dictionary data that feeds user interface aspects such as grouping (problem categories, for instance) may require some high-level mapping.
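
As a rough sketch of what such field-type mapping might look like in practice: a per-source map from source columns to the archive's common schema, with simple type coercion. The source fields, target schema and coercions here are invented for illustration; the catch-all bucket shows one way to avoid silently dropping unknown fields:

```python
# Target archival schema: field name -> expected type (illustrative only).
TARGET_TYPES = {"mrn": str, "dob": str, "allergy_severity": str}

# Per-source map from the source system's field names to the common schema.
SOURCE_MAP = {
    "PatientMRN": "mrn",
    "DateOfBirth": "dob",
    "AllergySev": "allergy_severity",
}

def map_row(source_row):
    """Map one source record into the common archival schema.

    Fields with no mapping are kept in a catch-all bucket rather than
    discarded, preserving some fidelity for future compliance requests.
    """
    mapped, unmapped = {}, {}
    for field, value in source_row.items():
        if field in SOURCE_MAP:
            target = SOURCE_MAP[field]
            mapped[target] = TARGET_TYPES[target](value)
        else:
            unmapped[field] = value
    mapped["_unmapped"] = unmapped
    return mapped

row = map_row({"PatientMRN": 12345, "DateOfBirth": "1970-01-01", "Zz9": "x"})
```

Real products maintain libraries of these per-source maps, which is why vendors with existing extraction processes for a given EMR can implement more cheaply.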

This approach usually yields excellent clinical accessibility because the vendor can create highly focused clinical workflows just like an EMR vendor can. Since these visualizations don’t need to be created or altered based on the source system being archived, it means that there is generally no data visualization implementation cost.

As the mapping is limited to the schema, the extraction and load phase is usually not as expensive as a full EMR data migration, but because every required source field must have a place in the target archival schema, the process is typically more time-consuming and expensive than the hybrid modeled / extracted schema or non-discrete document approaches. That said, vendors that have a solid library of extraction processes for various source systems can often offer lower initial implementation costs than would otherwise be possible.

The compliance accessibility and data fidelity of this strategy can be problematic, however, as unknown fields are often dropped and data types are frequently normalized. This fundamentally alters a substantial portion of the data being archived in the same way that a full data migration can — although, again, not as severely given the typical lack of data dictionary mapping requirements. In some cases, vendors will recommend that a full backup of the original data be kept in addition to the “live” archive, providing some level of data fidelity problem mitigation. Should a compliance request require this information, however, the organization may be left in a similar position to those utilizing raw data backups or extracted schema stores with no pre-built visualizations.

Archival solutions utilizing this strategy may also frequently require augmentation by the vendor as new sources of data are encountered. This can make the implementation phase longer, as those changes typically need to happen before any data can be loaded.

There will never be a one-size-fits-all archival solution, whether across organizations or across the multiple systems within a single organization. Another key takeaway: always be wary of the "phases of implementation." Many vendors will attempt to win deals with quick and inexpensive initial implementations, but leave significant work for when the data actually needs to be visualized in a meaningful way. That task either falls on the organization, or it must be further contracted to the archival solution provider.

It is also valuable to consider solutions specifically designed for archival purposes, ideally ones focused on the healthcare sector. There are simply too many archival-specific scenarios for a general-purpose data backup to handle, and many organizations find that healthcare-specific requirements make general-purpose archival products ill-suited to their needs.

Download Galen Healthcare’s full archival whitepaper to evaluate available EMR data migration & EMR data archival options and processes critical to EMR replacement and legacy system decommissioning.

About Robert Downey
Robert is Vice President, Product Development, at Galen Healthcare Solutions. He has nearly 10 years of healthcare IT experience and over 20 years in software engineering. Robert is responsible for the design and development of Galen's products and supporting technology, including the VitalCenter Online Archival solution. He is an expert in healthcare IT and software development, as well as cloud-based solutions delivery. Connect with Robert on LinkedIn.

About Galen Healthcare Solutions
Galen Healthcare Solutions is an award-winning, #1 in KLAS healthcare IT technical & professional services and solutions company providing high-skilled, cross-platform expertise and proud sponsor of the Tackling EHR & EMR Transition Series. For over a decade, Galen has partnered with more than 300 specialty practices, hospitals, health information exchanges, health systems and integrated delivery networks to provide high-quality, expert level IT consulting services including strategy, optimization, data migration, project management, and interoperability. Galen also delivers a suite of fully integrated products that enhance, automate, and simplify the access and use of clinical patient data within those systems to improve cost-efficiency and quality outcomes. For more information, visit Connect with us on Twitter, Facebook and LinkedIn.

Hospital CIOs Say Better Data Security Is Key Goal

Posted on November 9, 2016 I Written By

Anne Zieger is veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or

A new study has concluded that while they obviously have other goals, an overwhelming majority of healthcare CIOs see data protection as their key objective for the near future. For the study, which was sponsored by Spok and administered by CHIME, more than 100 IT leaders were polled on their perspectives on communications in healthcare.

In addition to underscoring the importance of data security efforts, the study also highlighted the extent to which CIOs are being asked to add new functions and wear new hats (notably patient satisfaction management).

Goals and investments
When asked what business goals they expected to be focused on for the next 18 months, the top goal of 12 possible options was “strengthening data security,” which was chosen by 81%. “Increasing patient satisfaction” followed relatively closely at 70%, and “improving physician satisfaction” was selected by 65% of respondents.

When asked which factors were most important in making investments in communications-related technologies for their hospital, the top factor of 11 possible options was "best meets clinician/organizational needs," with 82% selecting that choice, followed by "ease of use for end users (e.g. physician/nurse)" at 80% and "ability to integrate with current systems (e.g. EHR)" at 75%.

When it came to workflows they hoped to support with better tools, "care coordination for treatment planning" was the clear leader, chosen by 67% of respondents, followed by "patient discharge" (48%), "patient handoffs within hospital" (46%) and "patient handoffs between health services and facilities" (40%).

Mobile developments
Turning to mobile, Spok asked healthcare CIOs which of nine technology use cases were driving the selection and deployment of mobile apps. The top choices, by far, were “secure messaging in communications among care team” at 84% and “EHR access/integrations” with 83%.

A significant number of respondents (68%) said they were currently in the process of rolling out a secure texting solution. Respondents said their biggest challenges in doing so were "physician adoption/stakeholder buy-in" at 60% and "technical setup and provisioning" at 40%. A substantial majority (78%) said they'd judge the success of their rollout by the rate at which the solution was adopted by physicians.

Finally, when Spok asked the CIOs to look to the future and predict which issues will be most important to them three years from now, the top-rated choice was "patient centered care," chosen by 29% of respondents, followed by "EHR integrations" and "business intelligence."

A couple of surprises
While much of this is predictable, I was surprised by a couple of things.

First, while the study doesn't seem to have been designed for statistical significance, it's still worth noting that so many CIOs said improving patient satisfaction was one of their top three goals for the next 18 months. I'm not sure what they can do to achieve this end, but clearly they're trying. (Exactly what steps they should take is a subject for another article.)

Also, I didn't expect to see so many CIOs engaged in rolling out secure texting, partly because I would've expected such rollouts to already be in place at this point, and partly because I assumed that more CIOs would be focused on higher-level mobile apps (such as EHR interfaces). I guess that while mobile clinical integration efforts are maturing, many healthcare facilities aren't ready to take them on yet.

Health System Pays Docs To Use Cerner EHR

Posted on November 7, 2016 I Written By

Anne Zieger is veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or

Typically, we cover US-based stories in this blog, but the following is just too intriguing to miss. According to a Vancouver newspaper, an area hospital system agreed to pay physicians a daily fee to use its unpopular Cerner EHR, positioning the payments as compensation for unpaid overtime spent learning the system.

The Times Colonist is reporting that local hospital system Island Health has offered on-call physicians at its Nanaimo Regional General Hospital $260 a day, and emergency department physicians up to $780 a day, to use its unpopular Cerner system.

The newspaper cites a memo from hospital chief medical officer and executive vice president Dr. Jeremy Etherington, which says that the payment was in recognition of “the extra burden the new electronic health record has placed on many physicians during the rollout phase” of the new EHR.

In 2013, Island Health (which is based in British Columbia, Canada) signed a 10-year, $50 million deal with Cerner to implement its platform across its three hospitals. More recently, in March of this year, Island Health's three facilities went live on the Cerner platform.

Within weeks, physicians at Nanaimo Regional Hospital were flooding executives with complaints about the new platform, which they claimed randomly lost, buried or changed orders for drugs and diagnostic tests. Some physicians at the hospital reverted to using pen and paper to complete orders.

Not long after, physicians signed a petition asking the health system to stop further implementation, citing safety and workability concerns, but executives still moved forward with the rollout.

Neither the newspaper article nor other reports indicate how many physicians accepted the offer from Island Health. The health system's management also hasn't shared how it determined which doctors were eligible for the payout, or what criteria it used to set the size of the higher emergency department payouts. However, according to a Nanaimo physician and medical staff member quoted by Becker's Health IT & CIO Review, execs structured the payments to reflect the unpaid overtime doctors put in to learn the system.

As for the claims that the Cerner system was causing clinical problems and even perhaps endangering patients, that issue is still seemingly unresolved. In late July, British Columbia Minister of Health Terry Lake apparently ordered a review of the Cerner system, but results of that review do not appear to be available just yet.

It's not clear whether the payments bought Island Health enough goodwill to mollify the doctors who didn't receive one, nor whether those who are being paid will stay mollified. And that's the real question here. Whether you call the payments a publicity stunt, an attempt at fairness or a cynical political strategy, they may not be enough to get physicians onto the system if they're convinced it doesn't work. I guess we'll have to wait and see what happens.