
EMR Vendors Slow To Integrate Telemedicine Options

Posted on August 27, 2015 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Despite the massive growth in demand for virtual medical services, major EMR vendors are still proving slow to support such options, seemingly ceding the market to more agile telemedicine startups.

Independent telemedicine vendors targeting consumers are growing like weeds. Players like Doctor on Demand, NowClinic, American Well and HealthTap are becoming household names, touted not only in healthcare blogs but on morning TV talk shows. These services, which typically hire physicians as consultants, offer little continuity of care but provide a level of easy access unheard of in other settings.

Part of what’s fueling this growth is that health insurers are finally starting to pay for virtual medical visits. Medicare and nearly every state Medicaid plan now cover at least some telemedicine services, and 29 states require private payers to cover telehealth the same way they cover in-person services.

Hospitals and health systems are also getting on board the telemedicine train. For example, Stanford Healthcare recently rolled out a mobile health app, connected to Apple HealthKit and its Epic EMR, which allows patients to participate in virtual medical appointments through its ClickWell Care clinic. Given how popular virtual doctor visits have become, I’m betting that most next-gen apps created by large providers will offer this option.

EMR vendors, for their part, are adding telemedicine support to their platforms, but they’re not doing much to publicize it. Take Epic, whose EpicCare Ambulatory EMR can be hooked up to a telemedicine module. The EpicCare page on its site mentions that telemedicine functionality is available, but certainly does little to convince buyers to select it. In fact, Epic has offered such options for years, but I never knew that, and lately I spend more time tracking telemedicine than I do any other HIT trend.

As I noted in my latest broadcast on Periscope (follow @ziegerhealth), EMR vendors are arguably the best-positioned tech vendors to offer telemedicine services. After all, EMRs are already integrated into a hospital or clinic’s infrastructure and workflow. And this would make storage and clinical classification of the consults easier, making the content of the videos more valuable. (Admittedly, developing a classification scheme — much less standards — probably isn’t trivial, but that’s a subject for another article.)
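To make the storage-and-classification idea concrete, here is a minimal sketch of what an EMR-linked record for a video consult might look like. The class, field names, and VisitType categories are illustrative assumptions on my part, not any vendor's actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum


class VisitType(Enum):
    """Hypothetical classification buckets for virtual visits."""
    URGENT_CARE = "urgent-care"
    CHRONIC_FOLLOW_UP = "chronic-follow-up"
    MEDICATION_REVIEW = "medication-review"
    BEHAVIORAL_HEALTH = "behavioral-health"


@dataclass
class VideoConsult:
    """Metadata tying a recorded video visit back to an EMR encounter."""
    patient_id: str                  # EMR patient identifier
    encounter_id: str                # EMR encounter the video belongs to
    clinician_id: str
    visit_type: VisitType            # classification makes the video searchable
    started_at: datetime
    duration_minutes: int
    recording_uri: str               # where the video itself is stored
    chief_complaint: str = ""
    diagnosis_codes: list = field(default_factory=list)


# Filing the consult under the right encounter and visit type is most of
# what "clinical classification" buys you here.
consult = VideoConsult(
    patient_id="P12345",
    encounter_id="E67890",
    clinician_id="C2468",
    visit_type=VisitType.CHRONIC_FOLLOW_UP,
    started_at=datetime(2015, 8, 20, 14, 30),
    duration_minutes=18,
    recording_uri="https://storage.example.org/consults/E67890.mp4",
    chief_complaint="diabetes follow-up",
    diagnosis_codes=["E11.9"],
)
print(consult.visit_type.value)
```

Even a thin layer like this would let a consult be retrieved alongside the chart and searched by visit type, which is a big part of what makes the video content valuable later.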

What’s more, rather than relying on the rudimentary information supplied by patient self-reports, clinicians could draw on the full-bodied medical data stored in that EMR. I could even see next-gen video visit technology that exposes medical data to patients and lets them discuss it live with their doctors.

But that’s not how things are evolving. Instead, it seems that providers are largely outsourcing telemedicine services, a respectable but far less robust way to get things done. I don’t know if this will end up being the default way they deliver virtual visits, but unless EMR vendors step up, they’ll certainly have to work harder to get a toehold in this market.

I don’t know why so few EMR companies are rolling out their own virtual visit options. To me, it seems like a no-brainer, particularly for smaller ambulatory vendors which still need to differentiate themselves. But if I were an investor in a lagging EMR venture, you can bet your bottom dollar I’d want to know the answer.

EMRs Must Support Hospital Outcomes Reporting

Posted on August 25, 2015 | Written By

Anne Zieger

Should a hospital be paid if it doesn’t make its outcomes statistics public? Pediatric heart surgeon Dr. Jeffrey Jacobs says “no.” Jacobs, who chairs the Society of Thoracic Surgeons National Database workforce, recently told CNN that he believes reimbursement should be tied to whether a hospital shares data transparently. “We believe in the right of patients and families to know these outcomes,” said Jacobs, who is with the Johns Hopkins All Children’s Heart Institute in St. Petersburg, FL.

Jacobs’ views might be on the extreme side of the industry spectrum, but they’re growing more common. In today’s healthcare industry, which pushes patients to be smart shoppers, hospitals are coming under increasing pressure to share some form of outcomes data with the public.

I’ve argued elsewhere that in most cases, hospital report cards and ratings are unlikely to help your average consumer, as they don’t offer much context on how the data was compiled or why those criteria matter. But this problem should be righting itself. Given that most hospitals have spent millions on EMR technology, you’d think that they’d finally be ready to produce, say, risk-adjusted mortality, error and readmission data that patients can actually use.

Today, EMRs are focused on collecting and managing clinical data, not providing context on that data, but this can be changed. Hospitals can leverage EMRs to create fair, risk-adjusted outcomes reports, at least if they have modules that filter for key data points and connect them with non-EMR-based criteria such as a physician’s experience and training.

While this kind of functionality isn’t at the top of hospitals’ must-buy list, they’re likely to end up demanding that EMRs offer such options in the future. I foresee a time when outcomes reporting will be a standard feature of EMRs, even if that means mashing up clinical data with outside sources. EMRs will need to interpret and process information sources ranging from credentialing databases and claims to physician CVs alongside acuity modifiers.
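As a rough illustration of the risk adjustment involved, here is a minimal sketch of an observed-versus-expected mortality calculation. In practice the expected-death probabilities would come from a validated acuity model; the numbers below are invented purely for the example.

```python
# Each case pairs an outcome with a model's predicted probability of death,
# given the patient's acuity and comorbidities (values are illustrative only).
cases = [
    (False, 0.02),
    (True,  0.30),
    (False, 0.10),
    (False, 0.05),
    (True,  0.45),
]

observed = sum(1 for died, _ in cases if died)     # deaths that actually happened
expected = sum(prob for _, prob in cases)          # deaths the model predicted

# An O/E ratio above 1.0 means more deaths than the case mix would predict;
# below 1.0 means fewer.
oe_ratio = observed / expected
print(f"Observed: {observed}, Expected: {expected:.2f}, O/E ratio: {oe_ratio:.2f}")
```

The hard part, of course, is the expected column; that is exactly where the credentialing, claims and acuity data described above would have to come in.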

I know that what I’m suggesting isn’t trivial. Mixing non-clinical data with clinical records would require not only new EMR technology, but systems for classifying non-clinical data in a machine-readable and parseable format. Creating a classification scheme for this outside data is no joke, and at first there will probably be intermittent scandals when EMR-generated outcomes reports don’t tell the real story.

Still, in a world that increasingly demands quality data from providers, it’s hard to argue that you can share data with everyone but the patients you’re treating. Patients deserve decision support too.

It’s high time for hospitals to stop hiding behind arguments that interpreting outcomes data is too hard for consumers and start providing accurate outcomes data. With a multi-million-dollar tool under their roof designed to record every time a doctor sneezes, analyzing their performance doesn’t take magic powers, though it may shake things up among the medical staff. Bottom line: there’s less excuse than ever not to be transparent with outcomes. And if that takes adding new functionality to EMRs, well, it’s time to do that.

NYC Hospitals Face Massive Problems With Epic Install

Posted on August 24, 2015 | Written By

Anne Zieger

A municipal hospital system’s Epic EMR install has gone dramatically south over the past two years, with four top officials forced out and a budget that has more than doubled.

In early 2013, New York City-based Health and Hospitals Corp. announced that it had signed a $302 million EMR contract with Epic. The system said that it planned to implement the Epic EMR at 11 HHC hospitals, four long term care facilities, six diagnostic treatment centers and more than 70 community-based clinics.

The 15-year contract, which was set to be covered by federal funding, was supposed to cover everything from soup to nuts, including software and database licenses, professional services, testing and technical training, software maintenance, and database support and upgrades.

Fast forward to the present, and the project has plunged into crisis. The budget has expanded to $764 million, and HHC’s CTO, CIO, the CIO’s interim deputy and the project’s head of training have been given the axe amid charges of improper billing. Seven consultants — earning between $150 and $185 an hour — have also been kicked off the payroll.

With HHC missing so many top leaders, the system has brought in a consulting firm to stabilize the Epic effort. Washington, DC-based Clinovations, which brought in an interim CMIO, CIO and other top managers to HHC, now has a $4 million, 15-month contract to provide project management.

The Epic launch date for the first two hospitals in the network was originally set for November 2014 but has been pushed back to April 2016, according to the New York Post. HHC leaders say that the full Epic launch should take place in 2018 if all now goes as planned. The final price tag for the system could end up being as high as $1.4 billion, the newspaper reports.

So how did the massive Epic install effort go astray? According to an audit by the city’s Technology Development Corp., the project has been horribly mismanaged. “At one point, there were 14 project managers — but there was no leadership,” the audit report said.

The HHC consultants didn’t help much either, according to an employee who spoke to the Post. The employee said that the consultants racked up travel, hotels and other expenses to train their own employees before they began training HHC staff.

HHC is now telling the public that things will be much better going forward. Spokeswoman Ana Marengo said that the chain has adopted a new oversight and governance structure that will prevent the implementation from falling apart again. “We terminated consultants, appointed new leadership, and adopted new timekeeping tools that will help strengthen the management of this project,” Marengo told the newspaper.

What I’d like to know is just which items in the budget expanded so much that a $300-odd million all-in contract turned into a $1B+ debacle. While nobody in the Post articles has suggested that Epic is at fault in any of this, it seems to me that it’s worth investigating whether the vendor managed to jack up its fees beyond the scope of the initial agreement. For example, if HHC was forced to buy more Epic support than it had originally expected, that support wouldn’t come cheap. Then again, maybe the extra costs mostly come from paying for people with Epic experience; Epic has driven up the price of that talent by keeping Epic certification opportunities tightly restricted.

On the surface, though, this appears to be a high-profile example of a very challenging IT project that went bad in a hurry. And the fact that city politics are part of the mix can’t have been helpful. What happened to HHC could conceivably happen to private health systems, but the massive budget overrun and billing questions have government stamped all over them. Regardless, for New York City patients’ sake I hope HHC gets the implementation right from here on in.

25% of EHR Budget Goes to EHR Training…At Least for the DoD EHR

Posted on August 14, 2015 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

In all of the news surrounding Leidos and Cerner winning the DoD EHR bid, I was really struck by this one piece from Healthcare IT News:

Training. As is often the case in massive software implementations, training eats up a lot of the costs and, in the DoD’s case, “over 25 percent of the contract goes to training users and clinicians,” Miller said.

Think about how much training you can get for $1+ billion. I get that training is not cheap. I also get that the DoD EHR implementation is a massive project, but that’s a lot of money for training. Do you think that most EHR implementations spend 1/4 of their budget on training?

Hopefully people will chime in with their answer to that question in the comments. My experience is that hospitals probably should devote a quarter of their budget to training, but most don’t come anywhere near that amount. Plus, the EHR training line item often starts out much larger; once budget overruns begin, training is one of the first places organizations go to cut.

How much EHR training is enough in your experience? Should it be 25% of the budget? I’m not sure how much is needed, but I do know that most organizations don’t purchase enough. Sounds like the DoD might be the exception.

What’s the Future of Health Information “Disposal”?

Posted on July 30, 2015 | Written By

John Lynn

While at the HIM Summit, Deborah Green from AHIMA talked about the information lifecycle in healthcare. She showed a number of representations and flow charts of how information is collected and used in healthcare. The part of the chart that intrigued me the most, though, was the “disposal” element at the end. In fact, it prompted me to tweet about it.


Looking back at history, disposal of paper charts was pretty straightforward. Most charts were organized by year, so you could set a six-year retention policy: collect all the charts older than six years and either shred them or move them to a longer-term storage facility.

This concept gets much murkier in the world of EHR and digital charts. In fact, I talked with Deborah after her talk and asked if they’ve ever seen an EHR vendor which had a feature that would allow them to digitally “dispose” of an electronic chart. I’ve talked to hundreds of EHR vendors and I’ve never seen such a feature.

As a tech guy, I’ll admit that I wouldn’t want to be the programmer responsible for writing the code that “disposes” of an electronic chart. EHR software has been coded to never delete anything. At most it might mark a record as inactive or essentially hide it, but very few things in an EHR are ever really deleted. The concept of deletion is scary and has lots of consequences. Plus, what happens if your algorithm to delete old charts goes wrong and removes the wrong information? Good backups can mitigate that, but I can imagine a lot of scenarios where even the backup could fail.
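For what it’s worth, here is a minimal sketch of how a retention-driven “disposal” routine might be structured, using a mark-then-purge approach with a grace window rather than immediate deletion. The table, column names and retention numbers are hypothetical, not any EHR’s actual schema.

```python
import sqlite3
from datetime import datetime, timedelta

RETENTION_YEARS = 6   # mirrors the six-year paper-chart example above
GRACE_DAYS = 90       # inactive charts wait this long before hard deletion


def purge_expired_charts(conn, now):
    """Two-step disposal: mark expired charts inactive, then hard-delete
    only charts that have already sat inactive past the grace window."""
    cutoff = (now - timedelta(days=365 * RETENTION_YEARS)).isoformat()
    conn.execute(
        "UPDATE charts SET status = 'inactive', inactivated_at = ? "
        "WHERE status = 'active' AND last_activity < ?",
        (now.isoformat(), cutoff),
    )
    grace = (now - timedelta(days=GRACE_DAYS)).isoformat()
    cur = conn.execute(
        "DELETE FROM charts WHERE status = 'inactive' AND inactivated_at < ?",
        (grace,),
    )
    conn.commit()
    return cur.rowcount  # number of charts actually disposed of


# Hypothetical schema and usage, purely for illustration:
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE charts (id INTEGER PRIMARY KEY, last_activity TEXT, "
    "status TEXT DEFAULT 'active', inactivated_at TEXT)"
)
conn.execute("INSERT INTO charts (last_activity) VALUES ('2005-01-15T00:00:00')")
print(purge_expired_charts(conn, datetime(2015, 7, 30)))  # 0: marked, not yet purged
```

A design like this at least gives staff (and backups) a window to catch a purge gone wrong before anything becomes unrecoverable.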

Technical challenges of an EHR delete feature aside, what does the future of digital chart “disposal” look like? What should digital chart disposal look like? Do we “shred” digital charts? Do we “shred” part of them? Do we keep them forever?

The reality is that the decision of what to do with the electronic chart also depends on the culture of the hospital. Research organizations want to keep all of their data forever and never delete anything, since that old data might benefit their research. Rural organizations often want to keep their data as long as possible as well; the idea of deleting their friends’ and neighbors’ data is foreign to them. In larger urban areas, many organizations want to dispose of the chart as soon as the retention requirements are met, because keeping the old chart is a liability. Those are just a few examples, but EHR vendors are going to have to deal with a wide variety of requirements.

If you think of the bigger picture, what’s the consequence if we shred something that could benefit the patient later? Will we need all of the historical patient information in order to provide a patient the best care possible?

These are challenging issues and I don’t think EHR vendors have really tackled them. This is largely because most organizations haven’t had an EHR long enough that they’re ready to start purging digital charts. However, that day is fast approaching. It will be interesting to see the wide variety of requests that organizations make when it comes to disposing of digital charts. It will also be interesting to see how EHR vendors implement these requests.

Why Not “Meaningful Interoperability” For EMR Vendors?

Posted on July 28, 2015 | Written By

Anne Zieger

At this point, arguably, Meaningful Use has done virtually all of the work that it was designed to do. But as we all know, vendors are behind the curve. If they aren’t forced to guarantee interoperability — or at least meet a standard that satisfies most interconnectivity demands — they’re simply not going to bother.

While there’s obviously a certification process in place for EMR vendors which requires them to meet certain standards, interoperability seemingly didn’t make the cut. And while there are many ways vendors could have shown they’re on board, none have done anything that really unifies the industry.

PR-driven efforts like the CommonWell Alliance don’t impress me much, as I’m skeptical that they’ll get anywhere. And the only example I can think of where a vendor is doing something to improve interoperability, Epic’s Care Everywhere, is intended only to connect Epic implementations with one another. It’s not exactly an efficient solution.

A case in point: One of my own Epic-based providers logged on to Care Everywhere a couple of weeks ago to request my chart from another institution, but as of yet, no chart has arrived. That’s not exactly an effective way to coordinate care! (Of course, Epic in particular only recently dropped its fees for clinical data sharing, which weren’t exactly care coordination-friendly either.)

Increasingly, I’ve begun to think that the next stage of EMR maturation will come from some kind of “Meaningful Interoperability” incentive paid to vendors who really go the extra mile. Yes, this is iffy financially, but I believe it has to be done. As time and experience have shown, EMR vendors have approximately zero compelling reasons to foster universal interoperability, and perhaps a zillion to keep their systems closed.

Of course, the problem with rewarding interoperability is deciding which standards would be the accepted ones. Mandating interoperability would also force regulators to decide whether variations from the core standard were acceptable, and how to define what “acceptable” interoperability was. None of this is trivial.

The feds would also have to decide how to phase in vendor interoperability requirements, a process which would have to run on its own tracks, as provider Meaningful Use concerns itself with entirely different issues. And while ONC might be the first choice that comes to mind in supervising this process, it’s possible a separate entity would be better given the differences in what needs to be accomplished here.

I realize that some readers might think I’m dreaming if I believe this will ever happen. After all, given the many billions spent coaxing (or hammering) providers to comply with Meaningful Use, Congress may prefer to lean on the stick rather than the carrot. Also, vendors aren’t dependent on CMS, whose involvement made it important for providers to get on board. And it may seem more sensible to rejigger certification programs — but if that worked they’d have done it already.

But regardless of how it goes down, the federal government is likely to take action at some point on this issue. The ongoing lack of interoperability between EMRs has become a sore spot with at least some members of Congress, for good reasons. After all, the lack of free and easy sharing of clinical data has arguably limited the return on the $30B spent on Meaningful Use. But throwing the book at vendors isn’t going to cut it, in my view. As reluctant as Congressional leaders may be to throw more money at the problem, it may be the only way to convince recalcitrant EMR vendors to invest significant development resources in creating interoperable systems.

Key Big Data Challenges Providers Must Face

Posted on July 17, 2015 | Written By

Anne Zieger

Everybody likes to talk about the promise of big data, but managing it is another story. Taming big data will take new strategies and new IT skills, neither of which is a no-brainer, according to new research by the BPI Network.

While BPI Network has identified seven big data pain points, I’d argue that they boil down to just a few key issues:

* Data storage and management: While providers may prefer to host their massive data stores in-house, this approach is beginning to wear out, at least as the only strategy in town. Over time, hospitals have begun moving to cloud-based solutions, at least in hybrid models that offload some of their data. As they cautiously explore outsourcing some of their data management and storage, they have to make sure that they have security locked down well enough to comply with HIPAA and repel hackers.

* Staffing: Health IT leaders may need to look for a new breed of IT hire, as the skills associated with running datacenters have shifted toward the application level rather than the data transmission and security levels, and this has changed hiring patterns in many IT shops. When BPI queried IT leaders, 41% said they’d be looking for application development pros, compared with 24% seeking security skills. Ultimately, health IT departments will need staffers with a different mindset than those who maintained datasets over the long term, as these days providers need IT teams that solve emerging problems.

* Data and application availability: Health IT execs may finally be comfortable moving at least some of their data into the cloud, probably because they’ve come to believe that their cloud vendor offers good enough security to meet regulatory requirements. But that’s only part of what they need to consider. Whether their data lives in the cloud or in a data center, health IT departments need to be sure they can offer high data availability, even if a datacenter is destroyed. What’s more, they also need to offer very high availability for EMRs and other clinical data-wrangling apps, something that gets even more complicated if the app is hosted in the cloud. (A rough sketch of what a fallback read path might look like follows below.)
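To make the availability point slightly more concrete, here is a minimal sketch of a read path that tries the on-premise data center first and falls back to a cloud replica. The endpoints are hypothetical placeholders, and a real deployment would add authentication, audit logging and consistency checks.

```python
import urllib.request

# Hypothetical locations for the same record; neither URL is real.
SOURCES = [
    "https://emr.datacenter.example.org/api/records/{id}",      # primary, on-premise
    "https://emr.cloud-replica.example.org/api/records/{id}",   # cloud fallback
]


def fetch_record(record_id, timeout=2.0):
    """Return the record from the first source that responds in time."""
    last_error = None
    for template in SOURCES:
        url = template.format(id=record_id)
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return resp.read()
        except OSError as err:   # covers connection failures and timeouts
            last_error = err
            continue             # primary unavailable; try the next source
    raise RuntimeError(f"Record {record_id} unavailable from all sources") from last_error
```

The same pattern works in reverse for a cloud-first deployment; the point is simply that availability has to be designed for rather than assumed.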

Now, the reality is that these problems aren’t big issues for every provider just yet. In fact, according to an analysis by KPMG, only 10% of providers are currently using big data to its fullest potential. The 271 healthcare professionals surveyed by KPMG said that there were several major barriers to leveraging big data in their organization, including having unstandardized data in silos (37%), lacking the right technology infrastructure (17%) and failing to have data and analytics experts on board (15%).  Perhaps due to these roadblocks, a full 21% of healthcare respondents had no data analytics initiatives in place yet, though they were at the planning stages.

Still, it’s good to look at the obstacles health IT departments will face when they do take on more advanced data management and analytics efforts. After all, while ensuring high data and app availability, stocking the IT department with the right skillsets and implementing a wise data management strategy aren’t trivial, they’re doable for CIOs that plan ahead. And it’s not as if health leaders have a choice. Going from maintaining an enterprise data warehouse to leveraging health data analytics may be challenging, but it’s critical to make it happen.

Why Should You Invest in Health Information Governance?

Posted on July 14, 2015 | Written By

John Lynn

Hospitals are becoming large data centers of health information. In some ways, they’ve always been the storage facility of health information, but how we store, transfer, access, and share health information is dramatically changing in our new digital world. Plus, the volume of information we collect and store is expanding dramatically. This is why health information governance is becoming an extremely important topic in every hospital.

In order to better understand what’s happening with health information governance, I sat down with Rita Bowen, Senior Vice President of HIM and Privacy Officer at HealthPort, to talk about the topic. We shot these videos as one long video, but then chopped them up into shorter versions so you could more easily watch the ones that interest you most. You can find 2 of the videos below and 3 more over on EMR and HIPAA.

Who Should Manage Information Governance at Healthcare Organizations?

Why Invest in Health Information Governance?

8 Biggest Epic Price Tags in 2015

Posted on July 3, 2015 | Written By

John Lynn

Akanksha Jayanthi from Becker’s Hospital Review has aggregated a list of Epic purchases in 2015. The article does note that some hospitals and health systems have not yet disclosed the price of their Epic purchase, so there are likely more Epic purchases out there. Still, the Becker’s list gives you some insight into how much it costs to purchase Epic.

  • Partners HealthCare: $1.2 billion
  • Lehigh Valley Health Network: $200 million
  • Mayo Clinic: “Hundreds of millions”
  • Lahey Hospital & Medical Center: $160 million
  • Lifespan: $100 million
  • Erlanger Health System: $97 million
  • Wheaton Franciscan Healthcare: $54 million
  • Saint Francis Medical Center: $43 million

This list isn’t surprising to me. In fact, the most surprising part is that Epic would sell a $43 million implementation. That would previously have been unheard of from Epic. However, we’ve seen Epic moving slowly down the chain. I’m not sure if that’s because the top of the chain has dried up or something else, but Epic has definitely been doing smaller implementations it wouldn’t have considered before.

What should also be noted is that many of these numbers are estimates. With projects of this size, it’s really easy for the cost of the EHR implementation to balloon out of control. In fact, the Partners HealthCare Epic implementation at the top of the list is a great example. It was originally estimated at $600 million and you can see that estimate has doubled.

When you look at these numbers, is it any surprise that investors want to take down Epic? I’d like to see a list of the Epic renewal prices. Can you imagine what the Epic renewal for Kaiser’s $9 billion Epic EHR implementation will be? That’s where the opportunity lies for someone wanting to disrupt Epic.

Interoperability Becoming Important To Consumers

Posted on June 26, 2015 | Written By

Anne Zieger

The other day, I was talking with my mother about her recent primary care visit — and she was pretty po’d. “I can’t understand why my cardiologist didn’t just send the information to my family doctor,” she said. “Can’t they do that online these days? Why isn’t my doctor part of it?”

Now, to understand why this matters you need to know that my mother, who’s extremely bright, is nonetheless such a technophobe that she literally won’t touch my father’s desktop PC. She’s never opened a browser and has sent perhaps two or three e-mails in her life. She doesn’t even know how to use the text function on her basic “dumb” phone.

But she understands what interoperability is — even if the term would be foreign — and has little patience for care providers that don’t have it in place.

If this was just about my 74-year-old mom, who’s never really cared for technology generally, it would just be a blip. But research suggests that she’s far from alone.

In fact, a study recently released by the Society for Participatory Medicine and conducted by ORC International suggests that most U.S. residents are in my mother’s camp. Nearly 75% of Americans surveyed by SPM said that it was very important that critical health information be shared between hospitals, doctors and other providers.

What’s more, respondents expect these transfers to be free. Eighty-seven percent were dead-set against any fees being charged to either providers or patients for health data transfers. That flies in the face of current business practices, in which doctors may pay between $5,000 and $50,000 to connect with laboratories, HIEs or government agencies, sometimes also paying fees each time they send or receive data.

There are many things to think about here, but a couple stand out in my mind.

For one thing, providers should definitely be on notice that consumers have lost patience with cumbersome paper record transfers in the digital era. If my mom is demanding frictionless data sharing, then I can only imagine what Millennials are thinking. Doctors and hospitals may actually gain a marketing advantage by advertising how connected they are!

One other important issue to consider is that interoperability, arguably a fevered dream for many providers today, may eventually become the standard of care. You don’t want to be the hospital that stands out as having set patients adrift without adequate data sharing, and I’d argue that the day is coming sooner rather than later when that will mean electronic data sharing.

Admittedly, some consumers may remain exercised only as long as health data sharing is discussed on Good Morning America. But others have got it in their heads that they deserve to have their doctors on the same page, with no hassles, and I can’t say I blame them. As we all know, it’s about time.