
Hospital EMR Adoption Divide Widening, With Critical Access Hospitals Lagging

Posted on September 8, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

I don’t know about you, but I was a bit skeptical when HIMSS Analytics rolled out its EMRAM (Electronic Medical Record Adoption Model) research program. As some of you doubtless know, EMRAM breaks EMR adoption into eight stages, from Stage 0 (no health IT ancillaries installed) to Stage 7 (complete EMR installed, with data analytics on board).

From its launch onward, I’ve been skeptical about EMRAM’s value, in part because I’ve never been sure that hospital EMR adoption could be packaged neatly into the EMRAM stages. Perhaps the research model is constructed well, but the presumption that a multivariate process of health IT adoption can be tracked this way is a bit iffy in my opinion.

On the other hand, I like the way the following study breaks things out. New research published in the Journal of the American Medical Informatics Association looks at broader measures of hospital EHR adoption, as well as hospitals’ performance in two key categories.

The study’s main goal was to assess the divide between hospitals using their EHRs in an advanced fashion and those that are not. A key step in the process was crunching the numbers in a way that identified hospital characteristics associated with high adoption on each of the advanced use criteria.

To conduct the research, the authors dug into 2008 to 2015 American Hospital Association Information Technology Supplement survey data. Using the data, the researchers measured “basic” and “comprehensive” EHR adoption among hospitals. (The ONC has created definitions for both basic and comprehensive adoption.)

Next, the research team used new supplement questions to evaluate advanced use of EHRs. As part of this process, they also used EHR data to evaluate performance management and patient engagement functions.

When all was said and done, they drew the following conclusions:

  • 80.5% of hospitals had adopted a basic EHR system, up 5.3% from 2014
  • 37.5% of hospitals had adopted at least 8 (of 10) EHR data sets useful for performance measurement
  • 41.7% of hospitals adopted at least 8 (of 10) EHR functions related to patient engagement

One thing that stood out among all the data was that critical access hospitals were less likely to have adopted at least eight performance measurement functions and at least eight patient engagement functions. (Notably, HIMSS Analytics research from 2015 had already found that rural hospitals had begun to close this gap.)

“A digital divide appears to be emerging [among hospitals], with critical-access hospitals in particular lagging behind,” the article says. “This is concerning, because EHR-enabled performance measurement and patient engagement are key contributors to improving hospital performance.”

While the results don’t surprise me – and probably won’t surprise you either – it’s a shame to be reminded that critical access hospitals are trailing other facilities. As we all know, they’re always behind the eight ball financially, often understaffed and overloaded.

Given their challenges, it’s predictable that critical access hospitals would continue to lag behind on the health IT adoption curve. Unfortunately, this deprives them of feedback which could improve care and perhaps offer a welcome boost to their efficiency as well. It’s a shame the way the poor always get poorer.

Google’s DeepMind Rolling Out Bitcoin-Like Health Record Tracking To Hospitals

Posted on May 8, 2017 | Written By


Blockchain technology is gradually becoming part of how we think about healthcare data. Even government entities like the ONC and FDA – typically not early adopters – are throwing their hat into the blockchain ring.

In fact, according to recent research by Deloitte, healthcare and life sciences companies are planning the most aggressive blockchain deployments of any industry. Thirty-five percent of Deloitte’s respondents told the consulting firm that they expected to put blockchain into production this year.

Many companies are tackling the practical uses of blockchain tech in healthcare. But to me, few are more interesting than Google’s DeepMind, a hot AI firm based in the UK that Google acquired a few years ago.

DeepMind has already signed an agreement with a branch of Britain’s National Health Service (NHS), under which it will access patient data in the development of a healthcare app named Streams. Now it’s launching a new project in partnership with the NHS, in which it will use a new technology based on bitcoin to let hospitals, the NHS and, over time, patients track what happens to personal health data.

The new technology, known as “Verifiable Data Audit,” will create a specialized digital ledger which automatically records every time someone touches patient data, according to British newspaper The Guardian.

In a blog entry, DeepMind co-founder Mustafa Suleyman notes that the system will track not only that the data was used, but also why. In addition, the ledger supporting the audit will be set to append-only, so once the system records an activity, that record can’t be erased.

The technology differs from existing blockchain models in some important ways, however. For one thing, unlike other blockchain models, Verifiable Data Audit won’t rely on decentralized ledger verification by a broad set of participants. The developers have assumed that trusted institutions like hospitals can be relied on to verify ledger records.

Another way in which the new technology is different is that it doesn’t use a chain infrastructure. Instead, it’s using a mathematical function known as a Merkle tree. Every time the system adds an entry to the ledger, it generates a cryptographic hash summarizing not only that latest ledger entry, but also the previous ledger values.
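To make the tamper-evidence idea concrete, here’s a minimal sketch of that kind of ledger in Python. This is purely illustrative, not DeepMind’s actual implementation: it uses a simple hash chain (the degenerate case of the Merkle construction) in which each new entry’s hash also commits to everything recorded before it, so altering any past record invalidates every later hash.

```python
import hashlib
import json

def entry_hash(entry: dict, prev_hash: str) -> str:
    """Hash the new entry together with the previous running hash,
    so every hash commits to the entire history before it."""
    payload = json.dumps(entry, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

class AppendOnlyLedger:
    def __init__(self):
        self.entries = []        # list of (entry, running_hash) pairs
        self.head = "0" * 64     # genesis value before any entries exist

    def append(self, who: str, what: str, why: str):
        # Record not just that data was touched, but why (per the article).
        entry = {"who": who, "what": what, "why": why}
        self.head = entry_hash(entry, self.head)
        self.entries.append((entry, self.head))

    def verify(self) -> bool:
        """Recompute the chain; any tampered entry breaks every later hash."""
        h = "0" * 64
        for entry, recorded in self.entries:
            h = entry_hash(entry, h)
            if h != recorded:
                return False
        return True

ledger = AppendOnlyLedger()
ledger.append("clinician-17", "viewed lab results", "direct care")
ledger.append("analyst-03", "ran cohort query", "quality audit")
assert ledger.verify()

# Tampering with a past record is detectable on the next verification pass:
ledger.entries[0][0]["why"] = "marketing"
assert not ledger.verify()
```

A production system like the one described would use a true Merkle tree rather than a flat chain, so a verifier can prove any single entry’s inclusion in logarithmic time instead of re-hashing the whole history.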

DeepMind is also providing a dedicated online interface which participating hospitals can use to review the audit trail compiled by the system, in real-time. In the future, the company hopes to make automated queries which would “sound the alarm” if data appeared to be compromised.

Though DeepMind does expect to give patients direct oversight over how, where and why their data has been used, it doesn’t expect that to happen for some time, as it’s not yet clear how to secure such access. In the meantime, participating hospitals are getting a taste of the future, one in which patients will ultimately control access to their health data assets.

Database Linked With Hospital EMR To Encourage Drug Monitoring

Posted on March 31, 2017 | Written By


According to state officials, Colorado occupies the unenviable position of second worst in the US for prescription drug misuse, with more than 255,000 Coloradans misusing prescribed medications.

One way the state is fighting back is by running the Colorado Prescription Drug Monitoring Program which, like comparable efforts in other states, tracks prescriptions for controlled medications. Every regular business day, the state’s pharmacists upload prescription data for medications listed in Schedules II through V.

While this effort may have value, many physicians haven’t been using the database, largely because it can be difficult to access. In fact, historically physicians have been using the system only about 30 percent of the time when prescribing controlled substances, according to a story appearing in HealthLeaders Media.

As things stand, it can take physicians up to three minutes to access the data: they have to sign out of their EMR, visit the PDMP site, log in using separate credentials, click through to the right page, enter patient information and sort through possible matches before they get to the patient’s aggregated prescription history. Given the ugliness of this workflow, it’s no surprise that clinicians aren’t seeking out PDMP data, especially if they don’t regard a patient as being at high risk for drug abuse or diversion.

But perhaps taking some needless steps out of the process can make a difference, a theory which one of the state’s hospitals is testing. Colorado officials are hoping a new pilot program linking the PDMP database to an EMR will foster higher use of the data by physicians. The pilot, funded by a federal grant through the Bureau of Justice Assistance, connects the drug database directly to the University of Colorado Hospital’s Epic EMR.

The project began with a year-long build-out phase, during which IT leaders created a gateway connecting the PDMP database and the Epic installation. Several months ago, the team followed up with a launch at the school of medicine’s emergency medicine department. Eventually, the PDMP database will be available in five EDs which have a combined total of 270,000 visits per year, HealthLeaders notes.

Under the pilot program, physicians can access the drug database with a single click, directly from within the Epic EMR system. Once the PDMP database was made available, the pilot brought physicians on board gradually, moving from evaluating their baseline use, to giving clinicians raw data, to presenting that data through a risk-stratification tool, and eventually to requiring that they use the tool.

Researchers guiding the pilot are evaluating whether providers use the PDMP more and whether it has an impact on high-risk patients. Researchers will also analyze what happened to patients a year before, during and a year after their ED visits, using de-identified patient data.

It’s worth pointing out that people outside of Colorado are well aware of the PDMP access issue. In fact, the ONC has been paying fairly close attention to the problem of making PDMP data more accessible. That being said, the agency notes that integrating PDMPs with other health IT systems won’t come easily, given that no uniform standards exist for linking prescription drug data with health IT systems. ONC staffers have apparently been working to develop a standard approach for delivering PDMP data to EMRs, pharmacy systems and health information exchanges.

However, at present it looks like custom integration will be necessary. Perhaps pilots like this one will lead by example.

“Learning Health System” Pilot Cuts Care Costs While Improving Quality

Posted on January 11, 2017 | Written By


As some of you will know, the ONC’s Shared Nationwide Interoperability Roadmap’s goal is to create a “nationwide learning health system.”  In this system, individuals, providers and organizations will freely share health information, but more importantly, will share that information in “closed loops” which allow for continuous learning and care improvement.

When I read about this model – which is backed by the Institute of Medicine — I thought it sounded interesting, but didn’t think it terribly practical. Recently, though, I stumbled upon an experiment which attempts to bring this approach to life. And it’s more than just unusual — it seems to be successful.

What I’m talking about is a pilot study, done by a team from Nationwide Children’s Hospital and The Ohio State University, which involved implementing a “local” learning health system. During the pilot, team members used EHR data to create personalized treatments for patients based on data from others with similar conditions and risk factors.

To date, building a learning health system has been very difficult indeed, largely because integrating EHRs between multiple hospital systems is very difficult. For that reason, researchers with the two organizations decided to implement a “local” learning health system, according to a press statement from Nationwide Children’s.

To build the local learning health system, the team from Nationwide Children’s and Ohio State optimized the EHR to support their efforts. They also relied on a “robust” care coordination system which sat at the core of the EHR. The pilot subjects were a group of 131 children treated through the hospital’s cerebral palsy program.

Children treated in the 12-month program, named “Learn From Every Patient,” experienced a 43% reduction in total inpatient days, a 27% reduction in inpatient admissions, a 30% reduction in emergency department visits and a 29% reduction in urgent care visits.

The two institutions spent $225,000 to implement the pilot during the first year. However, the return on this investment was dramatic: researchers concluded that the program cut healthcare costs by $1.36 million, a savings of about $6 for each dollar invested.
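The arithmetic behind that figure checks out; a quick back-of-envelope using the numbers reported above:

```python
# Figures reported for the "Learn From Every Patient" pilot (first year).
investment = 225_000       # implementation cost, USD
savings = 1_360_000        # reported reduction in healthcare costs, USD

roi = savings / investment
print(f"~${roi:.2f} saved per dollar invested")  # ~$6.04
```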

An added benefit was that the clinicians working in the CP clinic found this approach to care simplified documentation, which saved time and made it possible for them to see more patients during each session.

Not surprisingly, the research team thinks this approach has a lot of potential. “This method has the potential to be an effective complementary or alternative strategy to the top-down approach of learning health systems,” the release said. In other words, maybe bottom-up, incremental efforts are worth a try.

Given these results, it’d be nice to think that we’ll have full interoperability someday, and that we’ll be able to scale up the learning health system approach to the whole US. In the meantime, it’s good to see at least a single health system make some headway with it.

Hospitals Offering Broad Access To Health Data, But There Are Limits

Posted on October 5, 2016 | Written By


A new study released by the ONC concludes that hospitals are almost universally offering patients the ability to view their data electronically, with large numbers allowing them to download and share that data digitally as well.

While the data reveals that hospitals have become more ready to offer electronic access to patient records, it also suggests that they are struggling to provide a full array of electronic access options. That some hospitals still haven’t gotten there may be just a phase, but it may also point to unresolved issues they need to address before they can offer a full range of patient data functions.

On the one hand, the results of the study are promising. The ONC data demonstrates that there’s been a very substantial uptick in the deployment of patient data access technologies between 2012 and 2015. The data shows that in 2015, 95% of U.S. hospitals gave patients the ability to view their health information electronically, 87% allowed them to download their health information and 69% offered the trifecta (patients get to view, download and transmit the health information).

These numbers represent huge changes that took place during the period studied. For example, in 2013 no state had 40% or more of its hospitals offering patients the ability to view, download and transmit their data; now all states have at least 40% of their hospitals offering all three options. Meanwhile, the number of hospitals offering view and download capability has grown 70% compared to 2012, the ONC reports. And the proportion of hospitals providing view, download and transmit capabilities increased sevenfold from 2013.

These numbers track closely with data reported by the American Hospital Association earlier this year, which found that 92% of hospitals responding to its survey offered patients access to their medical records in 2015, up from just 43% in 2013. The AHA also found that 84% of hospitals allowed patients to download information from their records, 70% let patients suggest changes to their medical record and 70% had made it possible for patients to send a referral summary electronically.

All that being said, however, I find it a bit troubling that roughly 31% of hospitals aren’t offering all three major functions mentioned above. It appears that a failure to offer patients the ability to share their data is what disqualifies most of that 31% from the list of broadly-functioning data-sharing candidates. And that’s just too bad.

I guess I shouldn’t be surprised that a substantial subset of hospitals haven’t enabled such sharing, given that many still seem to see the data as proprietary. (I can’t prove this but I’ve heard many anecdotes to that effect.) But I’m still disappointed to find that many hospitals haven’t enabled such a lightweight model of interoperability.

$34.7 Billion Spent on Meaningful Use

Posted on July 8, 2016 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

CMS has put out the latest data on meaningful use participation and payments. They broke the Medicare dollars out by meaningful use stage 1 and stage 2. Meaningful use stage 1 paid out nearly $20 billion; meaningful use stage 2 paid out $3.4 billion. Stage 2 payment amounts are smaller, but that’s still a massive drop-off.

Fewer than half of the eligible providers who participated in stage 1 went on to participate in stage 2 (145k, down from 308k). Participating hospitals dropped from 4,600 to 3,096. This illustrates well what we’ve been saying for a while: hospitals are still largely participating in meaningful use, while most doctors are choosing not to.

Also interesting to note is that at its peak, meaningful use was paying about $10 billion per year. In 2015, they spent $2.8 billion.

What I didn’t see in this report was any numbers on the cost savings that the meaningful use program provided. The OIG estimates for meaningful use talked about how much money would be spent, but they also calculated how much money would be saved. As I recall, they estimated about $36 billion in spending and about $16 billion in savings. That would put the cost of the meaningful use program at $20 billion instead of the full $36 billion, which it looks like we’ve now pretty much spent.
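The net-cost arithmetic from those recalled estimates is simple enough to spell out (these are my recollections of the OIG projections, not audited figures):

```python
# Back-of-envelope using the OIG estimates recalled above (billions of USD).
estimated_spending = 36
estimated_savings = 16

net_cost = estimated_spending - estimated_savings
print(net_cost)  # 20 -- the program's net cost, if the projected savings materialized
```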

I like that HHS puts out this accountability as far as where the meaningful use money was spent. Shouldn’t we have some accountability as far as the savings as well? Do they not have a way to calculate it? Are they afraid that there weren’t cost savings? Or that meaningful use actually added costs? Maybe it’s in another report and I just missed it. If you know of that other report, I’d love to see it.

What do you think of these numbers? What’s been the benefit of the $34.7 billion that’s been spent? I’d love to hear your thoughts in the comments.

Data Sharing Largely Isn’t Informing Hospital Clinical Decisions

Posted on July 6, 2016 | Written By


Some new data released by ONC suggests that while healthcare data is being shared far more frequently between hospitals than in the past, few hospital clinicians use such data regularly as part of providing patient care.

The ONC report, which is based on a supplement to the 2015 edition of an annual survey by the American Hospital Association, concluded that 96% of hospitals had an EHR in place which was federally tested and certified for the Meaningful Use program. That’s an enormous leap from 2009, the year the federal economic stimulus law creating the program was signed, when only 12.2% of hospitals had even a basic EHR in place.

Also, hospitals have improved dramatically in their ability to share data with other facilities outside their system, according to an AHA article from February. While just 22% of hospitals shared data with peer facilities in 2011, that number had shot up to 57% in 2014. Also, the share of hospitals exchanging data with ambulatory care providers outside the system climbed from 37% to 60% during the same period.

On the other hand, hospitals are not meeting federal goals for data use, particularly the use of data not created within their institution. While 82% of hospitals shared lab results, radiology reports, clinical care summaries or medication lists with hospitals or ambulatory care centers outside of their orbit — up from 45% in 2009 — the data isn’t having as much of an impact as it could.

Only 18% of those surveyed by the AHA said that hospital clinicians often used patient information gathered electronically from outside sources. Another 35% reported that clinicians used such information “sometimes,” 20% used it “rarely” and 16% “never” used such data. (The remaining 11% said that they didn’t know how such data was used.)

So what’s holding hospital clinicians back? More than half of AHA respondents (53%) said that the biggest barrier to using interoperable data was integrating that data into physician routines. They noted that since shared information usually wasn’t available to clinicians in their EHRs, they had to go outside their regular workflows to review the data.

Another major barrier, cited by 45% of survey respondents, was difficulty integrating exchanged information into their EHR. According to the AHA survey, only 4 in 10 hospitals had the ability to integrate data into their EHRs without manual data entry.

Other barriers to clinician use of shared data included that information was not always available when needed (40%), that it wasn’t presented in a useful format (29%) and that clinicians did not trust the accuracy of the information (11%). Also, 31% of survey respondents said that many recipients of care summaries felt the data itself was not useful, up from 26% in 2014.

What’s more, some technical problems in sharing data between EHRs seem to have gotten slightly worse between the 2014 and 2015 surveys. For example, 24% of respondents to the 2014 survey said that matching or identifying patients was a concern in data exchange. That number jumped to 33% in the 2015 results.

By the way, you might want to check out this related chart, which suggests that paper-based data exchange remains wildly popular. Given the challenges that still exist in sharing such data digitally, I guess we shouldn’t be surprised.

New Federal Health IT Strategic Plan for 2015-2020

Posted on December 8, 2014 | Written By


The big news came out today that HHS had released its Health IT Strategic Plan for 2015-2020. You can find more details about the plan and also read the 28-page Federal Health IT Strategic Plan online. Unlike many of the regulations, this strategic plan is very readable and gives a pretty good idea of where ONC wants to take healthcare IT (hint: interoperability). The document is also open for comment, so your feedback could help improve the proposed plan.

I think this image from the document really does a nice job summarizing the plan’s goals:
Federal Health IT Strategic Plan Summary

When I see a plan like this, the goals are noble and appropriate. No doubt we could argue about some of the details, but I think this is directionally good. What I’m not so sure about is how this plan will really help healthcare reach the specified goals. I need to dive into the specific strategies offered in the document to know if they really have the ability to reach these goals. I might have to take each goal and strategy and make a series out of it.

What do you think of this new health IT strategic plan?

The State of Government Healthcare IT Initiatives

Posted on November 12, 2014 | Written By


Brian Eastwood has written a really great article on CIO.com that looks at why healthcare IT is under fire. His final couple of paragraphs summarize the current challenge for government healthcare IT initiatives:

ONC – as well as HHS at large – admittedly finds itself between Scylla and Charybdis. Too much regulation (medical devices) can do just as much harm as too little regulation (interoperability). Moving too quickly (meaningful use) can cause as much frustration as moving too slowly (telehealth). Politics can explain some industry challenges (reform’s uncertain future) but not others (public perception of Healthcare.gov).

That said, healthcare wants to change. Healthcare has to change. As healthcare continues its rapid, unprecedented march toward modernity, industry leaders have every right to expect – no, demand – a strong, confident voice in their corner. Right now, ONC can barely muster a whisper when, instead, it should be shouting.

I don’t think I’ve seen a better concise summary of the challenges that ONC, CMS, FDA and other agencies face. This shouldn’t be seen as an excuse for these organizations. We all face challenges in our jobs and have to learn to balance them. The same is true for organizations like ONC.

What makes this challenge even harder for ONC is that they’re in the midst of a massive change in leadership. Not to mention a leader, Karen DeSalvo, who at best has her time split between important issues like Ebola and her work as National Coordinator over healthcare IT. Considering DeSalvo’s passion for public health, you can guess where she’s going to spend most of her time.

In some ways it reminds me of when I started my first healthcare IT blog: EMR and HIPAA. As I started blogging, I realized that I had a real passion for writing about EMR. The same could not be said for HIPAA. Despite its name, I was spending most of my time writing about EMR and only covering HIPAA when breaches or other major changes happened. I imagine that DeSalvo will take a similar path.

Without a dedicated leader, I don’t see Brian Eastwood’s vision of ONC shouting with confidence becoming a reality. A bifurcated leader won’t likely be able to muster more than the current whisper. It’s no wonder that CHIME, HIMSS and other major organizations are asking for DeSalvo to be full time at ONC or for her to be replaced with someone who can be dedicated full time to ONC.

What should be clear to us all is that healthcare IT isn’t going anywhere. Technology is going to be a major part of healthcare going forward. Why the government wouldn’t want to make a sound investment with strong leadership is beyond me.

What’s Happening with All the Departures at ONC?

Posted on October 3, 2014 | Written By


In many ways, it’s expected that there will be a fair amount of change in the leadership of an organization when the leader leaves. The new leader often wants to bring in people they’ve worked with before and trust. Plus, I’ve previously noted that the Golden Age of EHR is over, so it’s not surprising that many people would leave ONC as the MU money runs out and the future of ONC remains uncertain.

You’ll see the letter below that Karen DeSalvo just sent out about the latest ONC departure: Judy Murphy, Chief Nursing Officer (CNO) at ONC. This is the fourth high-level leader to leave ONC in the past few months. For those keeping track at home, Doug Fridsma MD, ONC’s Chief Science Officer, Joy Pritts, the first Chief Privacy Officer at ONC, and Lygeia Ricciardi, Director of the Office of Consumer eHealth, are the other three who have left.

When Karen DeSalvo announced the ONC reorganization, here’s the leadership team she outlined:
Office of Care Transformation: Kelly Cronin
Office of the Chief Privacy Officer: Joy Pritts
Office of the Chief Operating Officer: Lisa Lewis
Office of the Chief Scientist: Doug Fridsma, MD, PhD
Office of Clinical Quality and Safety: Judy Murphy, RN
Office of Planning, Evaluation, and Analysis: Seth Pazinski
Office of Policy: Jodi Daniel
Office of Programs: Kim Lynch
Office of Public Affairs and Communications: Nora Super
Office of Standards and Technology: Steve Posnack

Three of the people on this list have already left ONC. That’s a pretty big hit to an organization that will likely have to do some hard work to ensure it’s included in future budgets in a post-MU era. It’s hard to fault any of these people for taking an opportunity to make a lot more money working in industry. It will be interesting to see who steps in to fill all these departures (including Dr. Jon White and Dr. Andy Gettinger, whom DeSalvo mentions in her letter below).

Must be an interesting time in the hallways of ONC.

Letter from Karen DeSalvo to ONC team about the departure of Judy Murphy, CNO of ONC:

ONC Team:

I am writing to let you know that Judy Murphy, our Chief Nursing Officer (CNO) and Director of the Office of Clinical Quality and Safety (OCQS), will be leaving ONC to take on an exciting new position as Chief Nursing Officer with IBM Healthcare Global Business Services. Her last day will be October 17.

Judy came to ONC in December 2011 and continued her established tradition of giving passionately and tirelessly to the entire health IT community. As Deputy National Coordinator for Programs and Policy, she led the HITECH funded program offices to achieve key milestones, such as the RECs providing assistance to 150,000 providers and helping 100,000 of them meet the meaningful use incentive requirements (exceeding the goal by 150%). She ensured that dedicated resources were available to help 1,300 critical access and rural hospitals exceed the same goals by 200%. She helped grow the MUVer (Meaningful Use Vanguard) Program to 1,000 providers and the Health IT Fellows Program to 45, giving us real boots on the ground to help providers adopt and use EHRs.

Her long-standing reputation for patient advocacy and maintaining a “patient-centric” point of view helped in ONC’s creation of the Office of Consumer eHealth, as well as in identifying annual strategic goals to promote consumer engagement. With the office, she helped launch the now very successful “Blue Button: Download your Health Data” campaign.

Most recently, as CNO, she championed a Nursing Engagement Strategy for ONC and initiated the joint ONC and American Nurses Association Health IT for Nurses Summit, which was attended by 200 RNs and NPs. In addition, her astute organizational and project management skills were put to use strengthening portfolio management and project performance management at ONC.

In her time here, she received several awards spotlighting her work, including the HIMSS Federal Health IT Leadership Award, the AMIA President’s Leadership Award, and the Distinguished Alumni Achievement Award from her alma mater, Alverno College, in Wisconsin.

We are planning a smooth transition of Judy’s current duties. Judy’s CNO responsibilities will be entrusted to the other nurses at ONC until a replacement CNO can be named.

Dr. Jon White will be on a part-time detail to ONC from the Agency for Healthcare Research and Quality (AHRQ) to serve as interim lead of OCQS and serve as ONC’s Acting Chief Medical Officer, reporting to Deputy National Coordinator Jacob Reider, while ONC searches for permanent staff to fill these positions. Dr. White directs AHRQ’s Health IT portfolio and will continue in that role part-time.

Dr. Andy Gettinger, from Dartmouth Hitchcock Medical Center, has agreed to lead the OCQS Safety team and the patient safety work. Dr. Gettinger comes to us with vast experience in many areas of health IT and we are excited to welcome him to the team. Judy is working closely with Jon, Andy, the extraordinary OCQS team, and me to ensure a seamless transition of her responsibilities.

Please join me in wishing Judy all the best in her new role, thanking her for her public service to our nation, and welcoming Andy and Jon to our team.

kd