
New Federal Health IT Strategic Plan for 2015-2020

Posted on December 8, 2014 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

The big news today is that HHS has released its Health IT Strategic Plan for 2015-2020. You can find more details about the plan and read the 28-page Federal Health IT Strategic Plan online. Unlike many of the regulations, this strategic plan is very readable and gives a pretty good idea of where ONC wants to take healthcare IT (hint: interoperability). The document is also open for comment, so your feedback could help improve the proposed plan.

I think this image from the document really does a nice job summarizing the plan’s goals:
Federal Health IT Strategic Plan Summary

When I see a plan like this, the goals are noble and appropriate. No doubt we could argue about some of the details, but I think this is directionally good. What I’m not so sure about is how this plan will really help healthcare reach the specified goals. I need to dive into the specific strategies offered in the document to know if they really have the ability to reach these goals. I might have to take each goal and strategy and make a series out of it.

What do you think of this new health IT strategic plan?

John Glaser to Stay on as Senior VP of Cerner Upon Close of Acquisition

Posted on November 19, 2014 | Written By John Lynn

In case you’re living under a rock (or, more affectionately, you’re too busy working to follow the inside baseball of EHR company acquisitions), Cerner is set to acquire Siemens in late winter or early spring, pending all the approvals needed for companies this size. Watching the merging of these two companies is going to be very interesting indeed.

Neil Versel just reported that John Glaser, current CEO of Siemens Health Services, has announced that upon close of acquisition he’ll be joining the Cerner team as a Senior VP. I also love that John Glaser made this announcement on the Cerner blog.

I think this is a big deal since I believe John Glaser is at the point in his career that he could do just about anything (or nothing) if that’s what he desired. The few times I’ve interacted with John Glaser, he was sincerely interested in moving healthcare forward through the use of advanced IT. I imagine that’s what’s motivating him to stay with Cerner. No doubt, Cerner is sitting on a huge opportunity.

In John Glaser’s blog post, he provided an interesting insight into Neal Patterson’s comments at the Cerner user conference:

In his CHC keynote address, Cerner CEO Neal Patterson did a masterful job of conveying Cerner’s commitment to patient-centered care. Before he spoke, a patient and her nurse were introduced with explanation that the woman’s life was saved by a Cerner sepsis alerting system. Neal then shared the incredible challenges he and his wife have faced in her battle with cancer because of limited interoperability.

Neal’s keynote was very personal – about how we can make a loved one’s care journey easier by ensuring that all records – every detail – are available electronically and accurately wherever the patient receives care. It was the case for interoperability but also the case for making a patient’s life easier and the care better.

It’s hard for me to say how much of this was theatrics, but I’m glad they are at least talking the right talk. I really do hope that Neal’s personal experience will drive interoperability forward. Neil Versel suggested that interoperability would be John Glaser’s focus at Cerner. I hope he’s successful.

While at CHIME, I talked with Judy Faulkner, CEO of Epic, and we talked briefly about interoperability. At one point in our conversation I asked Judy, “Do you know the opportunity that you have available to you?” She looked at me with a bit of a blank stare (admittedly we were both getting our lunch). I then said, “You are big enough and have enough clout that you (Epic) could set the standard for interoperability and the masses would follow.” I’m not sure she’s processed this opportunity, but it’s a huge one that they have yet to capitalize on for the benefit of healthcare as we know it.

The same opportunity is available for Cerner as well. I really hope that both companies embrace open data, open APIs, and interoperability in a big way. Both have stated their interest in these areas, but I’d like to see a little less talk…a lot more action. They’re both well positioned to be able to make interoperability a reality. They just need to understand what that really means and go to work on it.

I’m hopeful that both companies are making progress on this. Having John Glaser focused on it should help that as well. The key will be that both companies have to realize that interoperability is what’s best for healthcare in general and in the end that will be what’s best for their customers as well.

Do Hospitals Want Interoperability?

Posted on November 17, 2014 | Written By John Lynn

This discussion has come up over and over again today at the NYeC Digital Health Conference in NYC. Many people are blaming the EHR vendors for not being interoperable. Other people are blaming standards. Some like to blame HIPAA (which is ironic, since it was passed to make health data portable). People give many more reasons why healthcare isn’t exchanging data and interoperability isn’t a reality.

Still, in all of these discussions, I keep going back to the core question of whether hospitals and healthcare organizations really want their healthcare data to be interoperable. Looking back, I can think of some doctors who’ve wanted it for a while, but I think the healthcare industry as a whole didn’t really want interoperability to happen. They would never admit this in public, because we all know that, on its face, interoperability benefits both the healthcare system and the patient. However, interoperability would have been a bad thing financially for many healthcare organizations.

It’s one of the dirty little secrets of healthcare. Sure, the EHR vendors never provided the interoperability functionality, but that’s largely because the healthcare providers never asked for it and largely didn’t want that functionality. They were all a little complicit in hiding the dirty little secret that healthcare organizations were benefiting from the inefficiency of the system.

I’m extremely hopeful that we’re starting to see a shift away from the above approach. I think the wheels are turning where hospitals are starting to see why their organization is going to need to be interoperable or their reimbursement will be affected. ACOs are leading this charge as the hospitals are going to need the data from other providers in order to improve the care they provide and lower costs.

Now, I think the biggest barrier to interoperability for most hospitals is figuring out the right way to approach it. Will their EHR vendor handle it? Do they need to create their own solution? Are CCDs enough? Should they use Direct? Should they use a local HIE, or build a private HIE? And this doesn’t even touch on the complexities of the hospital system and outside providers. Plus, there’s no one catch-all answer.

I hope that we’re entering a new era of healthcare interoperability. I certainly think we’re heading in that direction. What are you seeing in your organizations?

More Epic Interoperability Discussion

Posted on October 7, 2014 | Written By John Lynn

Looks like Epic is starting to open up and join the conversation about healthcare interoperability. The latest is an article in the New York Times which includes a few comments from Judy Faulkner, CEO of Epic. Here are the main comments from Judy:

In 2005, when it became clear to her [Judy] that the government was not prepared to create a set of rules around interoperability, Ms. Faulkner said, her team began writing the code for Care Everywhere. Initially seen as a health information exchange for its own customers, Care Everywhere today connects hospitals all over the country as well as to various public health agencies and registries.

“Let’s say a patient is coming from U.C.L.A. and going to the University of Chicago, an Epic-to-Epic hospital. Boom. That’s easy,” Ms. Faulkner said. “These are hospitals that have agreed to the Rules of the Road, a legal contract, that says the other organization is going to take good care of the data.”

This is a really interesting approach. Blame the government for not applying a standard. Talk about how you’ve had to do it yourself and that’s why you built Care Everywhere. I wish that Judy would come out with the heart of the matter. Epic’s customers never asked for it and so they never did it. I believe that’s the simple reality. Remember that interoperability might be a big negative for many healthcare systems. If they’re interoperable, that could be a hit to revenue. Hopefully ACOs and other value based reimbursement will change this.

The key to coming clean like this though, is to come out with a deep set of initiatives that show that while it wasn’t something you worked on in the past, you’re going all in on interoperability now. We’re a very forgiving people, and if Epic (or any other large EHR vendor for that matter) came out with a plan to be interoperable, many would jump on board and forgive them for past transgressions (wherever the blame may lie).

Unfortunately, we don’t yet see this. I’d love to catch up with Judy Faulkner at CHIME and talk to her about it. The key will be to have a full-spectrum interoperability plan, not just Care Everywhere that doesn’t work everywhere. Remember that Epic has charts for about 50% of the US patient population, but that’s still only 50%. Plus, of the patients they do have, only a very small percentage have all of their records stored in the same Epic system. My guess would be that 99+% of patients who have a record in Epic have medical records in other places as well. This means that Epic will need data from other, non-Epic systems.

As I’ve said before, Epic wouldn’t need to wait for the government to do this. They are more than large enough to set the standard for the industry. In fact, doing so puts them in a real position of power. Plus, it’s the right thing to do for the US healthcare system.

Will the interoperability be perfect? No. It will take years and years to get everything right, but that’s OK. Progress will be better than what we have now. I love this quote from the NY Times article linked above:

“We’ve spent half a million dollars on an electronic health record system about three years ago, and I’m faxing all day long. I can’t send anything electronically over it,” said Dr. William L. Rich III, a member of a nine-person ophthalmology practice in Northern Virginia and medical director of health policy for the American Academy of Ophthalmology.

I hope that Epic continues down the path to interoperability and becomes even more aggressive. I think the climate’s right for them to make it happen. They’re in a really unique position to be able to really change the way we think and talk about interoperability. I’m interested to see if they seize the opportunity or just talk about it.

Of course, we’ve focused this article talking about Epic. That’s what happens when you’re the A list celebrity on the red carpet. People want to talk about you. The NY Times article pretty aptly points out that the other EHR vendors aren’t much more or less interoperable than Epic. Feel free to replace Epic with another large EHR vendor’s name and the story will likely read the same.

My hope is that EHR vendors won’t wait for customers to demand interoperability, but will instead make interoperability so easy that their customers will love taking part. Watch for a future series of posts on healthcare interoperability and why this is much easier said than done.

The Path to Interoperability

Posted on August 28, 2014 | Written By

The following is a guest blog post by Dave Boerner, Solutions Consultant at Orion Health.

Since the inception of electronic medical records (EMR), interoperability has been a recurrent topic of discussion in our industry, as it is critical to quality care delivery. With all of the disparate technology systems that healthcare organizations use, it can be hard to assemble all of the information needed to understand a patient’s health profile and coordinate their care. It’s clear that we’re all working hard at achieving this goal, but with new systems, business models and technology developments, the perennial problem of interoperability is significantly heightened. With the industry transition from fee-for-service to a value-oriented model, the lack of interoperability is a stumbling block for initiatives such as the Patient-Centered Medical Home (PCMH) and the Accountable Care Organization (ACO), which rely heavily on accurate, comprehensive data being readily accessible to disparate parties and systems.

In a PCMH, the team of collaborating providers needs to share timely and accurate information in order to achieve the best possible care for their patient. Enhanced interoperability gives them access to real-time data that is consistently reliable, helping them make more informed clinical decisions. In the same vein, in an ACO, a patient’s different levels of care – from primary care physician to surgeon to pharmacist – all need to be bundled together to understand the cost of a treatment. A reliable method is needed to connect these networks and provide a comprehensive view of a patient’s interaction with the system. It’s clear that interoperability is essential to making value-based care a reality.

Of course, interoperability can take many forms and there are many possible paths to the desired outcome of distributed access to comprehensive and accurate patient information.  Standards efforts over the years have taken on the challenge of improving interoperability, and while achievements such as HL7, HIPAA and C-CDA have been fundamental to recent progress, standards alone fall far short of the goal.  After all, even with good intentions all around, standard-making is a fraught process, especially for vendors coming to the table with such a diversity of development cycles, foundational technologies and development priorities.  Not to mention the perverse incentives to limit interoperability and portability to retain market share.  So, despite the historic progress we have made and current initiatives such as the Office of the National Coordinator’s JASON task force, standards initiatives are likely to provide useful foundational support for interoperability, but individual organizations and larger systems will at least for the time being continue to require significant additional technology and effort dedicated to interoperability to meet their needs.

So what is a responsible health system to do? To achieve robust, real-time data exchange among its critical systems, an organization needs something stronger than just standards. More and more healthcare executives are realizing that direct integration is the more successful approach to interoperability among systems. For simpler IT infrastructures, one-to-one integration of systems can work well. However, given the complexity of larger health systems and networks, the challenge of developing and managing an escalating number of interfaces is untenable. This applies not only to connecting systems within an organization, but also to connecting systems and organizations throughout a state or region. For these more complex scenarios, an integration engine is the best practice. Rather than multiple point-to-point connections, which require costly development, management and maintenance, the integration engine acts as a central hub, allowing all of the healthcare organization’s systems – from clinical to claims to radiology – to speak to each other in one universal language, no matter the vendor or the version of the technology. Integration engines provide comprehensive support for an extensive range of communication protocols and message formats, and help interface analysts and hospital IT administrators reduce their workload while meeting complex technical challenges. Organizations can track and document patient interactions in real time, and can proactively identify at-risk patients and deliver comprehensive intervention and ongoing care. This is the next level of care that organizations are working to achieve.
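The economics behind the hub-and-spoke argument come down to simple combinatorics: with point-to-point integration, every pair of systems needs its own interface, while a central engine needs only one interface per system. A quick back-of-the-envelope sketch (illustrative only, not any vendor's sizing tool) shows how fast the gap grows:

```python
def point_to_point_interfaces(n_systems: int) -> int:
    """Every system pairs directly with every other system: n*(n-1)/2 links."""
    return n_systems * (n_systems - 1) // 2

def hub_interfaces(n_systems: int) -> int:
    """With a central integration engine, each system needs one link to the hub."""
    return n_systems

# Compare interface counts as the number of connected systems grows.
for n in (5, 20, 50):
    print(f"{n} systems: {point_to_point_interfaces(n)} point-to-point "
          f"interfaces vs {hub_interfaces(n)} hub interfaces")
```

At 50 systems – not unusual for a large health network once you count clinical, claims, lab and radiology systems – point-to-point integration means maintaining over a thousand interfaces, while a hub keeps it at 50.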

Interoperability allows for enhanced care coordination, which ultimately helps improve care quality and patient outcomes. At Orion Health, we understand that an open integration engine platform with an all-access API is critical for success. Vendors, public health agencies and other health IT stakeholders are all out there fighting the good fight – working together to make complete interoperability among systems a reality. That said, past experience proves that it’s the users who will truly drive this change. Hospital and health system CIOs need to demand solutions that enhance interoperability, and it will happen. Only with this sustained effort will complete coordination and collaboration across the continuum of care become a reality.

About David Boerner
David Boerner works as a Solutions Consultant (pre-sales) for Orion Health where he provides technical consultation and specializes in the design and integration of EHR/HIE solutions involving Rhapsody Integration Engine.

Time for Government to Step Out of the Way of EHR and Let the Market Take Over?

Posted on May 22, 2014 | Written By John Lynn

The always interesting and insightful John Moore from Chilmark Research has a post up that asks a very good question: whether it’s time for the government to get out of the EHR regulation business and let market forces back in so they can innovate. I love this section of the post, which describes our current situation really well:

But as often happens with government initiatives, initial policy to foster adoption of a given technology can have unintended consequences no matter how well meaning the original intent may be.

During my stint at MIT my research focus was diffusion of technology into regulated markets. At the time I was looking at the environmental market and what both the Clean Air Act and Clean Water Act did to foster technology adoption. What my research found was that the policies instituted by these Acts led to rapid adoption of technology to meet specific guidelines and subsequently contributed to a cleaner environment. However, these policies also led to a complete stalling of innovation as the policies were too prescriptive. Innovation did not return to these markets until policies had changed allowing market forces to dictate compliance. In the case of the Clean Air Act, it was the creation of a market for trading of COx, SOx and NOx emissions.

We are beginning to see something similar play-out in the HIT market. Stage one got the adoption ball rolling for EHRs. Again, this is a great victory for federal policy and public health. But we are now at a point where federal policy needs to take a back seat to market forces. The market itself will separate the winners from the losers.

His points highlight another reason why I think that ONC should blow up meaningful use. In my plan, I basically see it as the government getting out of the EHR business. I do disagree with John Moore’s comments that the government should step away from interoperability. If they do, we just won’t have interoperability. I guess he’d make the argument that value based reimbursement will force it, but not in the same way that the rest of the EHR incentive money could force the issue.

I have learned that to really get out of this game, or even do what I describe, will take an act of Congress. HHS can’t do this without Congress’s help, although it could get pretty close. Plus, maybe HHS could exert its influence to get Congress to act, but I won’t be holding my breath on that one.

$11 Billion DoD EHR Contract

Posted on May 1, 2014 | Written By John Lynn

Nextgov has a great article up which outlines many of the details of the soon-to-be-bid Healthcare Management Systems Modernization contract. I’d prefer to call it the DoD EHR contract or the AHLTA replacement contract. Certainly there’s more to it than EHR, but that will be the core of the contract, and the big-name EHR vendors will all be involved.

Here’s a section of the article which gives you an idea of the size of the contract:

The DHMSM contract’s estimated lifecycle value is approximately $11 billion and would include initial operating capabilities by 2017 and full functionality by 2023, according to Dr. Jonathan Woodson, assistant secretary of Defense for health affairs, who testified in February before the House Appropriations Committee’s defense panel.

Even in Washington, $11 billion is a lot of money, and it would surely rank among the largest IT-related contracts in government. What’s unique about this effort is that the Pentagon wants a single contractor to lead the integration of a commercial electronic health records system to cover its nearly 10 million beneficiaries and large assortment of health care facilities worldwide. Defense is one of the largest health care providers in the country, on par in size with the Veterans Affairs Department and private sector leaders like Kaiser Permanente.

The article also states that they want to issue a contract with one vendor. We’ll see how that plays out. I’ve seen some rumors out there about who will be bidding. No doubt it will be a combination of the usual government contractors and EHR consultant companies together with EHR vendors like Epic and Cerner.

What’s going to be really interesting is that the VA, with its VistA EHR (VistA Evolution), is said to be bidding on the contract as well. We’ll see if theirs is the only VistA bid or if others join in too.

Here’s another great insight into the DoD EHR Contract:

Major data gaps in patient records occur when health care is delivered to beneficiaries outside the DOD network, and today approximately half of DOD’s 9.8 million beneficiaries receive their health care outside the network.

This was an obvious complaint of the AHLTA system. The DoD and VA couldn’t even get their EHR systems to be interoperable. I’m not optimistic that interoperability will be obtainable even under this new contract. That includes DoD to VA interoperability, let alone trying to connect with the care the DoD beneficiaries receive outside the DoD network.

I realize that nothing with the DoD health system is simple. It’s an enormous system with all sorts of crazy government regulations. However, I’d hope that for $11 billion we could do better than we’re doing for our veterans.

At HIMSS, I talked with a largely military EHR vendor. They told me that they were close to being able to exchange records between different locations. That’s right: at HIMSS 2014, the amazing breakthrough was that two locations with the same EHR software were close to being able to exchange data.

A Meaningful EHR Certification

Posted on April 16, 2014 | Written By John Lynn

In many ways this post could be considered a continuation of my previous post on data liberation. I’ve really loved the idea of creating a meaningful EHR certification, and that could include data liberation. Let’s be honest for a minute: do any of you find value in the current EHR certification?

You know that a certification is screwed up when it requires certain interoperability standards, and then, when you go to actually implement the sharing of data between two systems, you find out that the two systems are working on two different standards. They are close standards, but close doesn’t count with standards. Many have asked, “What did the EHR certification do if it couldn’t test the standard?” I have no answer to that question.

Now imagine we created an EHR certification that actually did require a standard for interoperability. Not a flavor of a standard, or something that closely resembles a standard. I’m talking about a standard. Would hospitals find this useful? I think so.

Another example of a meaningful EHR certification could be certifying that an EHR vendor will not hold your EHR data hostage. Think about how beneficial that would be to the industry. Instead of EHR vendors trying to trap your data in their system, they could focus on providing the end user what they need so the end user never wants to leave that EHR. What a beautiful shift that would be for our industry.

There could be many more things that could be meaningfully certified. However, this would be a simple and good place to start. I have no doubt that some would resist this certification. That’s why those who do become meaningfully certified need the proper PR boost that a meaningful certification deserves. No EHR vendor wants to be cast as the vendor that can’t figure out the standard and holds its customers hostage. Yet that’s what they’re able to get away with today.

What do you think of this idea?

eFax in Hospitals

Posted on January 23, 2014 | Written By John Lynn

Over the years, I’ve had the chance to interact with basically all of the major eFax services out there. I’ve even had a number of them as advertisers. This largely makes sense, since healthcare is still the haven for fax. I won’t go into all the reasons why fax remains so popular in healthcare, but it’s still the most trusted form of interoperability in the industry. As an eFax vendor pointed out to me, fax is great because it produces an unalterable document. Sure, it’s not impossible to alter, but it’s pretty difficult.

I am hopeful that fax will one day be replaced by true interoperability in healthcare, although I’m more hopeful that Direct Project will get us there even sooner. For those not familiar with Direct Project, it’s like fax, but with metadata attached and securely sent over the internet. Both true interoperability of data and Direct Project still have a long way to go, though. So don’t hold your breath waiting for them to take out fax…yet.
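The "fax plus metadata" idea is easier to see with a concrete sketch. Under the hood, a Direct message is essentially secure email: a MIME message carrying a clinical document as an attachment, signed and encrypted with S/MIME. The sketch below (the addresses and payload are hypothetical, and it omits the S/MIME signing/encryption and certificate discovery that real Direct transport requires) shows just the message structure:

```python
from email.message import EmailMessage

# Illustrative only: a real Direct message must be S/MIME signed and
# encrypted using certificates tied to the Direct addresses; this sketch
# shows only the "fax plus metadata" shape of the payload.
msg = EmailMessage()
msg["From"] = "drsmith@direct.examplehospital.org"    # hypothetical Direct address
msg["To"] = "intake@direct.examplespecialist.org"     # hypothetical Direct address
msg["Subject"] = "Referral: transition of care summary"
msg.set_content("Summary of care attached as a C-CDA document.")

# Attach the structured clinical document (placeholder payload).
ccda_xml = b"<ClinicalDocument><!-- C-CDA payload --></ClinicalDocument>"
msg.add_attachment(ccda_xml, maintype="application", subtype="xml",
                   filename="summary_of_care.xml")

print(msg["To"], msg.is_multipart())
```

Unlike a fax image, the attachment here is structured data the receiving system can parse, which is exactly the step up from fax that Direct promises.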

A trend I have seen is organizations replacing their current fax solution with some sort of eFax option. In many cases this shift has been driven by issues with fax in a VoIP environment. Yes, I know that many VoIP environments can support fax, but it takes work. In fact, it takes just as much work to get it functioning as it does to implement an eFax solution.

The other real benefit many consider when looking at eFax is its cost structure. Instead of investing in fax hardware up front, many organizations like that eFax can be bought on a pay-as-you-go or usage-based plan. If faxing is indeed starting to give way to some other electronic transfer of data, then your organization can save money as your fax load is reduced.
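The up-front-versus-usage trade-off is easy to sanity-check with a break-even calculation. All of the numbers below are hypothetical placeholders for illustration only (real fax server and eFax pricing varies widely by vendor), but the shape of the math is the point:

```python
# Hypothetical numbers for illustration; real pricing varies by vendor.
fax_server_upfront = 8000.0   # assumed hardware + software license cost
fax_server_monthly = 150.0    # assumed phone lines + maintenance per month
efax_per_page = 0.05          # assumed usage-based rate per page

def monthly_cost_efax(pages: int) -> float:
    """eFax spend for a month at the assumed per-page rate."""
    return pages * efax_per_page

def breakeven_pages_per_month(months: int = 36) -> float:
    """Pages/month at which eFax spend matches the in-house server
    over the given horizon (default: a 3-year comparison window)."""
    server_total = fax_server_upfront + fax_server_monthly * months
    return server_total / (efax_per_page * months)

print(round(breakeven_pages_per_month()), "pages/month break-even over 3 years")
```

Below the break-even volume, pay-as-you-go wins; and if your fax volume is trending down as other electronic exchange picks up, the usage-based model keeps getting cheaper, which is exactly the appeal described above.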

Those are a few trends I’ve seen with eFax in healthcare. What trends have you been seeing in your organization?

2013 Hospital EHR and Health IT Trends

Posted on December 31, 2013 | Written By


There are a number of amazing milestones and trends happening with EHR and Healthcare IT. I think as we look back on 2013, we’ll remember it for a number of important changes that will impact us for many years to come. Here are a few of the top trends and milestones that I’ll remember from 2013.

Epic and Cerner Separate Themselves – This has certainly been happening for a couple of years, but 2013 is the year I’ll remember as the one when everyone agreed that for big hospitals it’s a two horse race between Cerner and Epic. There’s still an amazing battle brewing in the small hospital market with no clear winner yet. However, in the large hospital race the battle between Cerner and Epic is on. Epic had been winning most of the deals, but Cerner just landed a big left hook when Intermountain chose Cerner.

I expect we’re living in an Epic and Cerner world until at least a few years post meaningful use. The job listings on Healthcare IT Central illustrate Cerner and Epic dominance as well.

Near Universal EHR Adoption in Hospitals – I can’t find the latest EHR adoption (meaningful use) numbers from ONC, but the last ones I saw were in the high 80s. That basically leaves a number of small rural hospitals that likely don’t have much tech infrastructure at all, let alone an EHR. Every major hospital institution now has an EHR. I guess we can now stop talking about hospital EHR adoption and start talking about hospital EHR use?

The Cracks in the Healthcare Interoperability Dam Appear – Interoperability has always been a hard nut to crack in healthcare. Everyone knew it was the right thing to do, but there were some real systemic reasons organizations didn’t go that direction. Not to mention, there was little financial motivation to do it (and often a financial disincentive).

With that background, I think in 2013 we started to see cracks in the dam that was holding back interoperability. They are still just cracks, but once water starts seeping through, the whole structure of the dam will break and the water will flow freely. Watch for the same with interoperability. Some of this year’s cracks started with the announcement of CommonWell. I think in response to being left out of CommonWell, Epic has chosen to start being more interoperable as well.

Skinny Data Happens – I was first introduced to the concept of skinny data vs big data at HIMSS 2013 by Encore Health Resources. While I’m not sure the skinny data branding will stick, the concept of doing a data project with a slice of data that produces meaningful (excuse the use of the word) outcomes is the trend in data analytics, and it’s going to dominate the conversations going forward.
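To illustrate the skinny data idea: instead of warehousing everything and hoping insight emerges, you pull one narrow, outcome-linked slice and answer one concrete question with it. The records, field names, and condition below are invented purely for illustration.

```python
# Hedged sketch of the "skinny data" approach: a tiny, focused slice of
# admission records answering one question. All data here is fabricated.
admissions = [
    {"dx": "CHF", "readmitted_30d": True},
    {"dx": "CHF", "readmitted_30d": False},
    {"dx": "CHF", "readmitted_30d": True},
    {"dx": "COPD", "readmitted_30d": False},
]

# The "skinny" slice: only heart-failure admissions, only the readmission flag
chf = [a["readmitted_30d"] for a in admissions if a["dx"] == "CHF"]
rate = sum(chf) / len(chf)
print(f"CHF 30-day readmission rate: {rate:.0%}")
```

The same question asked of a full "big data" warehouse would take far more plumbing; the skinny version trades breadth for a fast, actionable answer.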

As I posted on EMR and EHR, Big Data is Like Teenage Sex, but skinny data is very different. Skinny data is about doing something valuable with the data. Sadly, not enough people are doing skinny data, but they all will in 2014.

Hospitals Ignore Consumer Health Devices – Consumer health devices are popping up everywhere in healthcare. We’re quickly reaching the point where consumers can monitor all of their vital information at near hospital-grade quality using their smartphone and sometimes an external device. This is a real revolution in medical devices. Many are still making their way through FDA approval, but some have passed and are starting to gain traction.

With all of this innovation, hospitals seem to have mostly ignored what’s happening. Sure, the larger ones have a few pilot projects going. However, most hospitals have no idea what’s about to hit them upside the head. Gone will be the days of patients going to the hospital to be “monitored.” I don’t think most hospitals are ready for this shift.