
Interoperability Is On An Accelerated Trajectory Says Redox CEO

Posted on November 16, 2018 | Written By

Colin Hung is the co-founder of the #hcldr (healthcare leadership) tweetchat, one of the most popular and active healthcare social media communities on Twitter. Colin speaks, tweets and blogs regularly about healthcare, technology, marketing and leadership. He is currently an independent marketing consultant working with leading healthIT companies. Colin is a member of #TheWalkingGallery. His Twitter handle is @Colin_Hung.

The lack of interoperability in healthcare continues to be a vexing challenge for Health IT companies, IT departments and patients. Redox is a company taking a unique approach to solving this problem. They have built a platform of reusable data connections between healthcare providers and the innovative cloud-based companies that have products those providers want to use.

Redox recently held their second annual Interoperability Summit at the new CatalystHTI facility in Denver, Colorado. Over three hundred people attended the event. The diverse audience included startups, hospitals, large HealthIT vendors, payors and government health agencies. The sessions at the Summit reflected the diversity of the audience, ranging from “Hacking the Health System Sales Cycle” to “FHIR: Be the Right Amount of Excited”.

During the Summit, I sat down with Redox CEO, Luke Bonney, to talk about the state of interoperability, the willingness of the industry to share data and what advice he has for the dozens of startups that approach Redox each month.

Below is a transcript of our conversation.

What is the state of healthcare interoperability today?

I think we are in a good state right now, but more importantly I think we are on an accelerated trajectory to something better.

An accelerated trajectory?

Yes, but in order to explain why I’m saying that, we have to take a step back.

In my opinion healthcare interoperability is inextricably tied to the adoption and migration to the cloud. We will never have true data liquidity, which is the state that everyone wants – physicians, clinicians, administrators, patients, providers, payers, etc – until healthcare fully embraces cloud architectures and cloud thinking.

Healthcare is still predominantly an “on-premise” world. It’s not wrong. It’s just how the industry has grown up. We installed servers behind our own firewalls. As we added systems we bought more servers and of course we added them to the other servers behind the firewall. Eventually we built connections between these systems so that they could talk to each other. But because everything was behind the firewall and because we were really just sharing data within the same organization, we didn’t give much thought to sharing that data in a standard way. As long as we were behind the firewall we could safely exchange data.

When you approach things from a cloud perspective, the thinking is completely different. When you build cloud applications you HAVE TO think about data portability and security. You HAVE TO work out ways to connect systems together across the Internet without a single big firewall acting as your shield.

So as people move more and more to this way of thinking we will see more movement towards frictionless data exchange.

So is healthcare moving more to the cloud?

Working at Epic and now at Redox, I’ve had a front-row seat to this change in attitude towards the cloud by healthcare providers. Prior to 2015, healthcare IT leaders were still asking “What is the cloud?” and “Why should I bother with it?” But today leaders are starting to ask “How can I better leverage the cloud for my organization?” It’s great to see so many proactively looking for ways to adopt cloud-based applications.

I also think that the consumer tech giants are helping propel healthcare forward. Companies like Amazon and Google have always been cloud-based. As they push into healthcare they are going to have a huge advantage versus on-premise legacy companies. As they gain traction so too will the cloud.

I can see how embracing the cloud will help healthcare achieve secure connectivity and certainly scalability, but even if we move completely to the cloud won’t we still need to exchange data in a standard way in order to achieve true interoperability?

Having a data standard would certainly be helpful.

Is that going to be HL7 v2? v3? FHIR? SMART on FHIR? Or something that the CommonWell Health Alliance puts out?

(Laughing). We do seem to have a lot of standards, don’t we?

Actually, this is what is driving Redox. There really isn’t a ton of incentive to tear out the investments already made in HL7 v2 or v3. It works for the use cases where it has been deployed. The same applies to FHIR and CommonWell. All these approaches work wonderfully for specific use cases, but I really doubt any one of them is going to be the single solution for all of our interoperability challenges.

Think about it. If I’m a CIO at a hospital and I have a working HL7 v2 integration between two systems, why would I waste precious resources to move to a different integration standard if there is really nothing to be gained from it? It’d be a waste of time and resources.

The one good thing about all these standards and interoperability initiatives is that we are building an audience that is asking the right questions and pushing healthcare in the right direction. APIs are the right thing to do. FHIR is the right thing to do…and so on. All are relevant and needed.

So if not a universal data standard, what do we need?

The way I see things, we might not need a single data standard if someone can build a common platform through which data can be shared. That’s what we’re doing here at Redox. We’re taking a pragmatic approach. Whatever data standard you are using internally is fine with us. We’ll work with you to find a way to share your data through our platform. And once you share it with us, you don’t have to rebuild that connection over and over again each time a different company wants to connect. We handle that.
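To make this pragmatic approach concrete, here is a minimal sketch in Python of normalizing two different source formats, a pipe-delimited HL7 v2 message and a FHIR Patient resource, into one shared model. The field mappings and the `Patient` model are illustrative assumptions for this example, not Redox’s actual schema or API.

```python
from dataclasses import dataclass

@dataclass
class Patient:
    """A minimal common patient model (hypothetical, for illustration only)."""
    mrn: str
    family_name: str
    given_name: str

def from_hl7v2(message: str) -> Patient:
    """Parse the PID segment of a pipe-delimited HL7 v2 message."""
    for segment in message.split("\r"):
        fields = segment.split("|")
        if fields[0] == "PID":
            name_parts = fields[5].split("^")  # PID-5: name components separated by ^
            return Patient(mrn=fields[3].split("^")[0],   # PID-3: patient identifier
                           family_name=name_parts[0],
                           given_name=name_parts[1] if len(name_parts) > 1 else "")
    raise ValueError("No PID segment found")

def from_fhir(resource: dict) -> Patient:
    """Map a FHIR R4 Patient resource to the same common model."""
    name = resource["name"][0]
    return Patient(mrn=resource["identifier"][0]["value"],
                   family_name=name["family"],
                   given_name=name["given"][0])

hl7 = "MSH|^~\\&|SENDER|FAC|RCVR|FAC|20181116||ADT^A01|123|P|2.3\rPID|1||12345^^^HOSP||Doe^Jane"
fhir = {"resourceType": "Patient",
        "identifier": [{"value": "12345"}],
        "name": [{"family": "Doe", "given": ["Jane"]}]}

# Both sources yield the same normalized record
assert from_hl7v2(hl7) == from_fhir(fhir)
```

Once data from any source lands in the common model, every downstream consumer can be written once against that model instead of once per standard.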

Is that the problem Redox set out to solve?

Actually when we started Redox we really just wanted to make it easier for cloud-based healthcare companies to scale and grow. What we realized is that one of the biggest impediments to growth was integrating legacy on-prem systems with cloud-based applications. Even if these companies could convince hospital IT teams to put their integration on the priority list, it would take a long time to actually get it done.

So we built the Redox engine to make this easier. Our goal wasn’t to solve interoperability per se, we just wanted to bring innovative web developers closer to healthcare providers so that they can solve problems together.

But because we were cloud from Day 1, we wanted to build everything in a reusable way, so that once we built a connection to one hospital, we wouldn’t have to build it again when the next company wanted to connect with that same hospital. This network effect wasn’t something we originally set out to build, but now it’s central to our success. It’s why we can talk about being a platform that enables data sharing vs being a tool that helps systems share data.
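The network effect described above can be sketched as a toy connection registry (the class and names are purely illustrative, not Redox’s internals): the first application to reach a hospital pays the one-time integration cost, and every later application reuses the same connection.

```python
class ConnectionRegistry:
    """Toy model of reusable hospital connections (illustrative only)."""

    def __init__(self):
        self._connections = {}  # hospital name -> connection id
        self.builds = 0         # how many expensive integrations were built

    def connect(self, app: str, hospital: str) -> str:
        """Return a connection for (app, hospital), building one only if needed."""
        if hospital not in self._connections:
            self.builds += 1  # the expensive one-time integration work happens here
            self._connections[hospital] = f"conn-{self.builds}"
        return self._connections[hospital]

registry = ConnectionRegistry()
registry.connect("app-A", "General Hospital")   # first app pays the integration cost
registry.connect("app-B", "General Hospital")   # second app reuses the connection
# registry.builds -> 1
```

The value of each new connection accrues to every future participant, which is what turns a point-to-point tool into a platform.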

Solving interoperability is only partly a technology challenge. There is also the challenge of getting the healthcare ecosystem to actually share their data. Because Redox works with so many players in the ecosystem, have you noticed any change in attitude around sharing data?

Let me start by saying that I think everyone WANTS the data. There’s incredible value in health data. Medical records are a gold mine for researchers, public health authorities, pharma companies, payors, etc. Everyone would love nothing more than to build a comprehensive health record for their own purposes. The challenge of course is that it’s not easy to do that today. As you said, this is partly because of technology and partly because no one really wants to share their data altruistically.

I think there is one party that truly wants data to be shared and that’s patients. Patients are way more interested in sharing data than anyone else in the ecosystem. As a patient, data should follow me wherever I go. I never want to wonder if my doctor has all my medical information. I want people to have the data because I want the best outcome possible and my data can help make that happen.

I think companies and organizations in the healthcare ecosystem are slowly waking up to the fact that sharing data helps support their customers – whether those customers are providers, payors, members, patients, clinicians or government agencies. Sharing data makes things better. And as financial pressures in healthcare mount, everyone is looking for ways to do more, better, faster and with more accuracy. Sharing data is necessary for that to happen.

Redox works with a lot of startups and small/medium-sized HealthIT companies. What advice would you give to those that are considering working with Redox? What should they have considered?

There are two key questions that I think every HealthIT company should ask themselves. First, what is the value your product or service provides? Second, who is the buyer? Success in healthcare is less about your technology and more about aligning three things:

  1. An actual problem that needs to be solved
  2. A solution to that problem
  3. A buyer who can make a buying decision in a healthcare organization

I see a lot of companies that don’t really consider this last question. You can create an amazing product that solves a problem in healthcare but if the target audience for your product cannot make the buying decision then you have a difficult road ahead of you.

Beyond these questions, I would advise companies to really consider how their products integrate into the clinical or administrative workflow. Many startups begin with an application that isn’t integrated with existing hospital systems, like the EHR. But after they gain a little bit of traction they realize they need to become more integrated. Building real-time data exchange into an application isn’t easy, so you need to really think through how your product will handle it.
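As a sketch of what handling real-time data exchange involves, the snippet below validates a pushed event, acknowledges it immediately, and queues the real work for a background worker so the sending system never blocks on slow downstream processing. The JSON event shape (`eventType`, `patient.id`) is a hypothetical example, not any vendor’s actual format.

```python
import json
import queue

# Events awaiting background processing; a worker thread would drain this queue.
inbox = queue.Queue()

def handle_event(raw_body: bytes) -> dict:
    """Validate and enqueue one pushed event; return the acknowledgement payload.

    Acknowledging fast and deferring real work is the key pattern: the sender
    should never wait on our database writes or downstream calls.
    """
    event = json.loads(raw_body)
    for field in ("eventType", "patient"):
        if field not in event:
            return {"status": 400, "error": f"missing {field}"}
    inbox.put(event)  # hand off to the background worker
    return {"status": 200, "received": event["eventType"]}

ack = handle_event(b'{"eventType": "patient.discharged", "patient": {"id": "12345"}}')
# ack == {"status": 200, "received": "patient.discharged"}
```

In a production system the same split applies regardless of framework: a thin, fast acknowledgement path, plus durable queuing and retry logic for everything else.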

Lastly I would caution healthcare entrepreneurs about building their applications on the assumption that FHIR will be universally adopted. It isn’t and it will likely take years before it gains real-world traction. There is a lot of excitement around FHIR, but it isn’t the best solution for all situations.

Final Thoughts?

One thing I am encouraged by is the number of people and companies from outside of healthcare that are coming into this space. I think they bring an energy and perspective that will help us all get better. Granted, many of them have stars in their eyes and don’t realize how tough healthcare can be…but, the fact that they aren’t burdened with any legacy thinking is exciting. Healthcare needs more outside thinking.

Interoperability Problems Undercut Conclusions of CHIME Most Wired Survey

Posted on November 13, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Most of you have probably already seen the topline results from CHIME’s “Healthcare’s Most Wired: National Trends 2018” study, which was released last month.

Some of the more interesting numbers coming out of the survey, at least for me, included the following:

  • Just 60% of responding physicians could access a hospital network’s virtual patient visit technology from outside its network, which kinda defeats the purpose of decentralizing care delivery.
  • The number of clinical alerts sent from a surveillance system integrated with an EHR topped out at 58% (alerts to critical care units), with 35% of respondents reporting that they had no surveillance system in place. This seems like quite a lost opportunity.
  • Virtually all (94%) participating organizations said that their organization’s EHR could consume discrete data, and 64% said they could incorporate CCDs and CCRs from physician-office EHRs as discrete data.

What really stands out for me, though, is that if CHIME’s overall analysis is correct, many aspects of our data analytics and patient engagement progress still hang in the balance.

Perhaps by design, the hospital industry comes out looking like it’s doing well in most of the technology strategy areas the survey asks about, but the report leaves out some important areas of weakness.

Specifically, in the introduction to its survey report, the group lists “integration and interoperability” as one of two groups of foundational technologies that must be in place before population health management/value-based care, patient engagement and telehealth programs can proceed.

If that’s true, and it probably is, it throws up a red flag, which is probably why the report glossed over the fact that overall interoperability between hospitals is still very much in question. (If nothing else, it’s high time the hospitals adjust their interoperability expectations.) While it did cite numbers regarding what can be done with CCDs, it didn’t address the much bigger problems the industry faces in sharing data more fluidly.

Look, I don’t mean to be too literal here. Even if CHIME didn’t say so specifically, hospitals and health systems can make some progress on population health, patient engagement, and telehealth strategies even if they’re forced to stick to using their own internal data. Failing to establish fluid health data sharing between facility A and facility B may lead to less-than-ideal results, but it doesn’t stop either of them from marching towards goals like PHM or value-based care individually.

On the other hand, there certainly is an extent to which a lack of interoperability drags down the quality of our results. Perhaps the data sets we have are good enough even if they’re incomplete, but I think we’ve already got a pretty good sense that no amount of CCD exchange will get the results we ultimately hope to see. In other words, I’m suggesting that we take the CHIME survey’s data points in context.

Real Interoperability and Other Micro-moments From #PCCSummit18

Posted on November 7, 2018 | Written By

Colin Hung is the co-founder of the #hcldr (healthcare leadership) tweetchat. His Twitter handle is @Colin_Hung.

I love attending user group conferences. They are THE BEST way to get a true sense of what is on the minds of healthcare professionals. I find that people at user meetings are very open and candid. I don’t know why this happens, but I’m grateful it does.

This week, I had the privilege of attending PointClickCare’s annual #PCCSummit18 in Nashville, TN. PointClickCare is the leading EHR provider to the Long-Term and Post-Acute Care (LTPAC) space. Their customers are Skilled Nursing Facilities (SNFs), Senior Living organizations and Home Care providers.

I learned so much about the challenges facing LTPAC providers and I had so much fun connecting with PointClickCare staff as well as their customers. These are some of the memorable/notable moments from the event.

Real Interoperability happening between Hospitals and LTPAC

Interoperability wasn’t just talked about at #PCCSummit18; you could actually see it in action in PointClickCare’s partnership with Redox and their upcoming release of the Harmony interoperability module. More on this in a future article.

Investing in LTPAC Innovation Paying Off

For years PointClickCare has poured millions of dollars into R&D – researching, building, testing and in some cases acquiring new products for the LTPAC market. That investment in innovation continues to pay dividends as end-users and partners applauded each of the new modules/features unveiled at #PCCSummit18.

We’re still talking about faxes?!

The most eye-opening data point shared at #PCCSummit18 came via a real-time audience survey in one of the breakout sessions on LTPAC process optimization. The presenters asked the audience to text back their answer to the following question:

In the past 12 months, which (patient) transitions improvement projects, or remote patient reporting projects have you been a part of?

  1. Improved paper/fax processes
  2. Direct Messaging
  3. 3rd party tools
  4. None

The result was surprising. The majority of the audience had either not worked on any such transition improvement project or had been part of one that improved a paper/fax process. Yikes! We have a lot of work to do in #HealthIT.

Using storytelling to make data memorable

My favorite breakout session was by Doug Landis, a professionally trained actor who went on to become the chief storyteller at Box and who is now a venture capitalist. Landis’s presentation was full of useful tips and tactics on how to present data in a memorable way through the power of stories.

No single path to success

On the theme of storytelling, four Nashville songwriters presented their stories as the keynote session on Day 3. Each of the musicians came to Nashville wanting to become the next breakout star. What happened instead is that each became a songwriter who created a piece that helped a rising star hit it big on the music charts – Carrie Underwood, Lady Antebellum and Miranda Lambert, to name just a few. Their stories are proof positive that there are many roads to success, and sometimes your own success can be found by helping others succeed.

Everyone leaving happy

Every attendee that I spoke with had nothing but praise for PointClickCare. They felt well taken care of, they thought the venue was fantastic, they thought the social events were incredible and they loved the food. It’s fun to be part of a conference where everyone leaves happy.

 

New Reporting and Interop Features Hit The Right Note for PointClickCare

Posted on November 6, 2018 | Written By

Colin Hung is the co-founder of the #hcldr (healthcare leadership) tweetchat. His Twitter handle is @Colin_Hung.

The new reporting and data sharing capabilities of PointClickCare's LTPAC EHR platform were a big hit with the 2,000 users gathered on Day 1 of the company's annual #PCCSummit18 being held in Nashville, TN.

In the opening session, Co-Founder and COO of PointClickCare, Dave Wessinger, bravely walked through the company’s new report engine in a live demo. He started by showing off the new searching capability that will allow users to quickly find the report they need by simply typing a keyword into the search bar. Any report with a matching word in its description appears in the results. This one feature replaces dozens of weekly calls to systems administrators who have to help end-users find the right report to run because the current system has limited ability to organize and find reports. There was an audible “Yes” and collective fist pump from many in the audience.

Wessinger then went on to demonstrate the new data visualization tools and data export capabilities in the report engine.

“The export capabilities alone are a game changer for me,” said Timothy Carey, Director of Data and Performance Analytics at BaneCare. “Right now it’s not that easy to export data from a report into Excel where it can be further analyzed or combined with other data sources. The new export capabilities will cut out many hours from our work week.”

Skilled Nursing Facilities (or SNFs) like BaneCare have to produce detailed reports on the patients (residents) that are transferred to them from their acute care partners. These reports are required by the case managers at the acute care organizations – who need them to ensure their patients are getting the post-acute care their physicians prescribed.

“Having the right data and providing it quickly to our acute care partners is what differentiates us from competing facilities,” continued Carey. “The goal is to be the preferred LTPAC partner to acute care organizations and being able to provide timely data is a key criteria of being a good partner. Having good data also helps our own organization determine where to invest additional resources.”

BJ Boyle, VP of Product Management at PointClickCare, followed Wessinger on the main stage, where he gave a live demonstration of the company’s new data sharing module called Harmony. Harmony was first announced at HIMSS18 and was something Boyle and I spoke about in this interview.

One of the main components of Harmony is a customizable dashboard that allows case managers at acute care organizations to see how their referred patients are faring at the SNF. Through Harmony, the case managers and SNF staff can see the same patient data in real-time. This allows for unprecedented collaboration between the organizations.

“Right now we spend a lot of time making phone calls, sending emails and in meetings with our acute care partner,” said Cyndi Howell, Lead RNAC and PCC Clinical Liaison at Willow Valley Communities. “This is needed to keep each organization informed of what’s happening with patients that we are both responsible for. We do it because we are both committed to providing the best care possible. We love working collaboratively with our partners at Lancaster General Hospital (part of Penn Medicine). It’s just what we have to do to take care of people in our community.”

When Willow Valley Communities implements Harmony, they will no longer have to manually pull data from their PointClickCare system in order to facilitate the discussions with Lancaster. Instead, staff from both organizations will simply log onto Harmony and view the same data together in real-time.

“We are very excited and happy about Harmony,” explained Howell. “It’s going to make all our lives so much easier and patients will end up benefitting from better and more coordinated care.”

The real-time dashboard isn’t the only feature of Harmony. The module also includes a robust data integration engine, powered by Redox, that will allow PointClickCare to quickly connect its cloud-based system to EHRs at acute care organizations.

“PointClickCare wanted to get off the integration treadmill,” said Boyle. “It simply wasn’t scalable to connect to each hospital system one by one. We are happy to partner with our friends at Redox and leverage the power of their engine and the network of providers/vendors they work with.”

Through the Redox engine, patients transitioning from an acute care organization to a SNF or other LTPAC facility will have all their data seamlessly sent as part of the discharge process. No more faxes or paper-based binders of medical information.

“Part of our vision is for everyone in healthcare to have a complete view of the patients they are taking care of,” stated Luke Bonney, CEO and Co-Founder of Redox who presented with Boyle in a breakout session later on Day 1 of #PCCSummit18. “That can only happen when every member of the healthcare ecosystem can share data in an easy way and in a format that is meaningful to everyone involved.”

Luke Bonney, CEO at Redox (left) and BJ Boyle, VP Product Management at PointClickCare

“I am totally bought into the vision,” said Carey. “All of us here at BaneCare want patients to have the best possible experience while in our facilities. That means we need all the relevant information right at the point of transition from the acute care organization – medications, care plans, etc. Harmony will automate this entire step.”

I must admit I did not expect to meet so many people here at #PCCSummit18 who were excited about interoperability. I was also truly surprised that there are so many organizations actively working together on practical interoperability use cases that are true win-win-wins (for acute care organizations, LTPAC facilities and patients).

But then again, when you are in Nashville (aka Music City) you’d expect a little harmony.

Hospitals Sharing More Patient Data Than Ever, But Is It Having An Impact On Patient Care?

Posted on November 1, 2018 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Brace yourself for more happy talk and positive interoperability spin, folks. Even if they aren’t exchanging as much health data as they might have hoped, hospitals are sharing more patient health data than ever before, according to a new report from the ONC.

The ONC, which recently analyzed 2017 data from the American Hospital Association’s Information Technology Supplement Survey, concluded that 93% of non-federal acute care hospitals have upgraded to the 2015 Edition Health IT Certification Criteria or plan to upgrade. These criteria include new technical capabilities that support health data interoperability.

Today, most hospitals (88%) can send patient summary of care records electronically, and receive them from outside sources (74%), ONC’s analysis concluded. In addition, last year the volume of hospitals reporting that they could query and integrate patient health data significantly increased.

Not only that, the number of hospitals engaged in all four key interoperability activities (electronically sending, receiving, finding and integrating health data) climbed 41% over 2016. On the downside, however, only four in 10 hospitals reported being able to find patient health information and to send, receive and integrate patient summary of care records from outside sources into their own systems.

According to ONC, hospitals that work across these four key interoperability domains tend to be more sophisticated than their peers who don’t.

In fact, in 2017, 83% of hospitals able to send, receive, find and integrate outside health information also had that information electronically available at the point of care. This is 20% higher than hospitals engaging in just three domains, and a whopping seven times higher than hospitals that don’t engage in any domain.

Without a doubt, on its face this is good news. What’s not to like? Hospitals seem to be stepping up the interoperability game, and this can only be good for patients over time.

On the other hand, it’s hard for me to measure just how important it is in the near term. Yes, it seems like hospitals are getting more nimble, more motivated and more organized when it comes to data sharing, but it’s not clear what impact this may be having on patient care processes and outcomes.

Most interoperability measures I’ve seen have focused on receipt and transmission of patient health data far more than on integration of that data into EHRs. I’d argue that it’s time to move beyond measuring the back-and-forth of data and put more emphasis on how often physicians actually use that data in their work.

There’s certainly a compelling case to be made that health data interoperability matters. I’ve never disputed that. But I think it’s time we measure success a bit more stringently. In other words, if ONC can’t define the clinical benefits of health data exchange clearly, in terms that matter to physicians, it’s time to make it happen.

Taming the Healthcare Compliance and Data Security Monster: How Well Are We Doing?

Posted on October 18, 2018 | Written By

The following is a guest blog post by Lance Pilkington, Vice President of Global Compliance at Liaison Technologies.

Do data breach nightmares keep you up at night?

For 229 healthcare organizations, the nightmare became a reality in 2018. As of late August, more than 6.1 million individuals were affected by 229 healthcare-related breaches, according to the Department of Health and Human Services’ HIPAA Breach Reporting Tool website – commonly called the HIPAA “wall of shame.”

Although security and privacy requirements for healthcare data have been in place for many years, the reality is that many healthcare organizations are still at risk for non-compliance with regulations and for breaches.

In fact, only 65 percent of 112 hospitals and hospital groups recently surveyed by Aberdeen, an industry analyst firm, reported compliance with 11 common regulations and frameworks for data security. According to the healthcare-specific brief – Enterprise Data in 2018: The State of Privacy and Security Compliance in Healthcare – protected health information has the highest percentage of compliance, with 85 percent of participants reporting full compliance. The lowest compliance rates were reported for ISO 27001 and the General Data Protection Regulation, at 63 percent and 48 percent respectively.

An index developed by Aberdeen to measure the maturity of an organization’s compliance efforts shows that although the healthcare organizations surveyed were mature in their data management efforts, they were far less developed in their compliance efforts when they stored and protected data, syndicated data between two applications, ingested data into a central repository or integrated data from multiple, disparate sources.

The immaturity of compliance efforts has real-world consequences for healthcare entities. Four out of five (81 percent) study participants reported at least one data privacy and non-compliance issue in the past year, and two out of three (66 percent) reported at least one data breach in the past year.

It isn’t surprising to find that healthcare organizations struggle with data security. The complexity and number of types of data and data-related processes in healthcare is daunting. In addition to PHI, hospitals and their affiliates handle financial transactions, personally identifiable information, employee records, and confidential or intellectual property records. Adding to the challenge of protecting this information is the ever-increasing use of mobile devices in clinical and business areas of the healthcare organization.

In addition to the complexities of data management and integration, there are budgetary considerations. As healthcare organizations face increasing financial challenges, investment in new technology and the IT personnel to manage it can be formidable. However, healthcare participants in the Aberdeen study reported a median of 37 percent of the overall IT budget dedicated to investment in compliance activities. Study participants from life sciences and other industries included in Aberdeen’s total study reported lower budget commitments to compliance.

This raises the question: If healthcare organizations are investing in compliance activities, why do we still see significant data breaches, fines for non-compliance and difficulty reaching full compliance?

While there are practical steps that every privacy and security officer should take to ensure the organization is compliant with HIPAA, there are also technology options that enhance a healthcare entity’s ability to better manage data integration from multiple sources and address compliance requirements.

An upcoming webinar, The State of Privacy and Security Compliance for Enterprise Data: “Why Are We Doing This Ourselves?”, discusses the Aberdeen survey results and offers thought-provoking guidance on how healthcare IT leaders can evaluate their compliance readiness and identify potential solutions.

One of the solutions is the use of third-party providers who can handle the data integration and management needs of the healthcare organization and ensure compliance with data security requirements. This strategy can also address a myriad of other challenges faced by hospitals. Not only can the expertise and specialty knowledge of a third party ease the burden on in-house IT staff, but choosing a managed services strategy that eliminates the need for a significant upfront investment moves the expense from the IT capital budget to the operating budget with predictable recurring costs.

Freeing capital dollars to invest in other digital transformation strategies and enabling IT staff to focus on mission-critical activities in the healthcare organization are benefits of exploring outsource opportunities with the right partner.

More importantly, moving toward a higher level of compliance with data security requirements will improve the likelihood of a good night’s sleep!

About Lance Pilkington
Lance Pilkington is the Vice President of Global Compliance at Liaison Technologies, a position he has held since joining the company in September 2012. Lance is responsible for establishing and leading strategic initiatives under Liaison’s Trust program to ensure the company is consistently delivering on its compliance commitments. Liaison Technologies is a proud sponsor of Healthcare Scene.

Connecting the Data: Three Steps to Meet Digital Transformation Goals

Posted on July 16, 2018 I Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

A white paper published by the World Economic Forum in 2016 begins with the statement, “Few industries have the potential to be changed so profoundly by digital technology as healthcare, but the challenges facing innovators – from regulatory barriers to difficulties in digitalizing patient data – should not be underestimated.”

That was two years ago, and many of the same challenges still exist as the digital transformation of healthcare continues.

In a recent HIMSS focus group sponsored by Liaison, participants identified their major digital transformation and interoperability goals for the near future as:

  • EMR rollout and integration
  • Population health monitoring and analytics
  • Remote clinical encounters
  • Mobile clinical applications

These goals are not surprising. Although EMRs have been in place in many healthcare organizations for years, the growth of health systems as they add physicians, clinics, hospitals and diagnostic centers represents a growing need to integrate disparate systems. The continual increase in the number of mobile applications and medical devices that can be used to gather information to feed into EMR systems further exacerbates the challenge.

What is surprising is the low percentage of health systems that believe that they are very or somewhat well-prepared to handle these challenges – only 35 percent of the HIMSS/Liaison focus group members identified themselves as well-prepared.

“Chaos” was a word used by focus group participants to describe what happens in a health system when numerous players, overlapping projects, lack of a single coordinator and a tendency to find niche solutions that focus on one need rather than overall organizational needs drive digital transformation projects.

It’s easy to understand the frustration. Too few IT resources and too many needs in the pipeline lead to multiple groups of people working on projects that overlap in goals – sometimes duplicating each other’s efforts – and tax limited staff, budget and infrastructure resources. It was also interesting to see that focus group participants noted that new technologies and changing regulatory requirements keep derailing efforts over multi-year projects.

Throughout all the challenges identified by healthcare organizations, the issue of data integrity is paramount. The addition of new technologies, including mobile and AI-driven analytics, and new sources of information, increases the need to ensure that data is in a format that is accessible to all users and all applications. Otherwise, the full benefits of digital transformation will not be realized.

The lack of universal standards to enable interoperability is being addressed, but until those standards are available, healthcare organizations must evaluate other ways to integrate and harmonize data to make it available to the myriad of users and applications that can benefit from the insights it provides. Unlocking access to previously unseen data takes resources that many health organizations have in short supply. And the truth is, we’ll never have perfect standards; they will always continue to change, so there’s no reason to wait.
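To make that concrete, here is a minimal sketch (in Python) of the kind of mapping a data integration platform performs under the hood: records from two source systems with different field names and date formats are harmonized into one canonical shape. The systems, field names and formats are invented for illustration; System B’s names loosely echo FHIR conventions.

```python
# Minimal sketch: harmonizing patient records from two hypothetical
# source systems into one canonical schema, without waiting for a
# universal interoperability standard to arrive.

def from_system_a(rec):
    """System A stores 'Last, First' names and MM/DD/YYYY dates."""
    last, first = [p.strip() for p in rec["patient_name"].split(",")]
    mm, dd, yyyy = rec["dob"].split("/")
    return {"first_name": first, "last_name": last,
            "birth_date": f"{yyyy}-{mm}-{dd}", "mrn": rec["id"]}

def from_system_b(rec):
    """System B already uses ISO dates but different field names."""
    return {"first_name": rec["given"], "last_name": rec["family"],
            "birth_date": rec["birthDate"], "mrn": rec["mrn"]}

def harmonize(records):
    """Route each record through the mapper for its source system."""
    mappers = {"A": from_system_a, "B": from_system_b}
    return [mappers[source](rec) for source, rec in records]

records = [
    ("A", {"patient_name": "Smith, Jane", "dob": "04/09/1972", "id": "12345"}),
    ("B", {"given": "Jane", "family": "Smith",
           "birthDate": "1972-04-09", "mrn": "12345"}),
]
canonical = harmonize(records)
print(canonical[0] == canonical[1])  # both sources map to the same record
```

The point of the sketch is the pattern, not the code: each new source system needs only its own small mapper, and every downstream user sees one consistent schema.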

Infrastructure, however, was not the number one resource identified in the HIMSS focus group as lacking in participants’ interoperability journey. In fact, only 15 percent saw infrastructure as the missing piece, while 30 percent identified IT staffing resources and 45 percent identified the right level of expertise as the most critical needs for their organization.

As all industries focus on digital transformation, competition for expert staff to handle interoperability challenges makes it difficult for healthcare organizations to attract the talent needed. For this reason, 45 percent of healthcare organizations outsource IT data integration and management to address staffing challenges.

Health systems are also evaluating the use of managed services strategies. A managed services solution takes over the day-to-day integration and data management with the right expertise and the manpower to take on complex work and fluctuating project levels. That way in-house staff resources can focus on the innovation and efficiencies that support patient care and operations, while the operating budget covers data management fees – leaving capital dollars available for critical patient care needs.

Removing day-to-day integration responsibilities from in-house staff also provides time to look strategically at the organization’s overall interoperability needs – coordinating efforts in a holistic manner. The ability to implement solutions for current needs with an eye toward future needs future-proofs an organization’s digital investment and helps avoid the “app-trap” – a reliance on narrowly focused applications with bounded data that cannot be accessed by disparate users.

There is no one answer to healthcare’s digital transformation questions, but taking the following three steps can move an organization closer to the goal of meaningful interoperability:

  • Don’t wait for interoperability standards to be developed – find a data integration and management platform that will integrate and harmonize data from disparate sources to make the information available to all users the way they need it and when they need it.
  • Turn to a data management and integration partner who can provide the expertise required to remain up-to-date on all interoperability, security and regulatory compliance requirements and other mandatory capabilities.
  • Approach digital transformation holistically, with a coordinated strategy that treats the data gathered by each new application or capability as a resource for the entire organization rather than a silo for a narrowly focused group of users.

The digital transformation of healthcare and the interoperability challenges that must be overcome are not minor issues, nor are they insurmountable. It is only through the sharing of ideas, information about new technologies and best practices that healthcare organizations can maximize the insights provided by data shared across the enterprise.

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies, a proud sponsor of Healthcare Scene. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

Revenue Cycle Trends To Watch This Year

Posted on July 13, 2018 I Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Revenue cycle management is something of a moving target. Every time you think you’ve got your processes and workflow in line, something changes and you have to tweak them again. There’s no better example than the proposed E/M changes that came out yesterday. While we wait for that to play out, here’s one look at the trends influencing RCM strategies this year, according to Healthcare IT Leaders revenue cycle lead Larry Todd, CPA.

Mergers

As healthcare organizations merge, many legacy systems begin to sunset, driving organizations to roll out new systems that can support growth. Health leaders need to figure out how to retire old systems and embrace new ones during a revenue cycle implementation. “Without proper integrations, many organizations will be challenged to manage their reimbursement processes,” Todd says.

Claims denial challenges

Providers are having a hard time addressing claims denials and documentation to support appeals. RCM leaders need to find ways to tighten up these processes and reduce denial rates. They can do so either by adopting third-party systems or working within their own infrastructure, he notes.

CFO engagement

Any technology implementation will have an impact on revenue, so CFOs should stay engaged in the rollout process, he says. “These are highly technical projects, so there’s a tendency to hand over the reins to IT or the software vendor,” notes Todd, a former CFO. “But financial executives need to stay engaged throughout the project, including weekly implementation status updates.”

Providers should form a revenue cycle action team that brings all the stakeholders to the table, including the CFO and clinicians, he says. If the CFO is involved in this process, he or she can offer critical executive oversight of decisions that impact A/R and cash.

User training and adoption

During the transition from a legacy system to a new platform, healthcare leaders need to make sure their staff are trained to use it. If they aren’t comfortable with the new system, it can mean trouble. Bear in mind that some employees may have used the legacy system for many years and need support as they make the transition. Otherwise, they may balk and productivity could fall.

Outside expertise

Given the complexity of rolling out new systems, it can help to hire experts who understand the technical and operational aspects of the software, along with organizational processes involved in the transition. “It’s very valuable to work with a consulting firm that employs real consultants – people who have worked in operations for years,” Todd concludes.

The Truth about AI in Healthcare

Posted on June 18, 2018 I Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

Those who watched the first season of the television show “The Good Doctor” got to see how a young autistic surgeon with savant syndrome faced challenges in his everyday life as he learned to connect with people in his world. His extraordinary medical skill and intuition not only saved patients’ lives but also built bridges with co-workers.

During each show, there is at least one scene in which the young doctor “visualizes” the inner workings of the patient’s body – evaluating and analyzing the cause of the medical condition.

Although all physicians can describe what happens to cause illness, the speed, detail and clarity of the young surgeon’s ability to gather information, predict reactions to treatments and identify the protocol that will produce the best outcome greatly surpasses his colleagues’ abilities.

Yes, this is a television show, but artificial intelligence promises the same capabilities that will disrupt all of our preconceived notions about healthcare on both the clinical and the operational sides of the industry.

Doctors rely on their medical training as well as their personal experience with hundreds of patients, but AI can allow clinicians to tap into the experiences of hundreds of doctors with thousands of patients. Even if physicians had personal experience with thousands of patients, the human mind can’t process all of that data effectively.

How can AI improve patient outcomes as well as the bottom line?

We’re already seeing the initial benefits of AI in many areas of the hospital. A report by Accenture identifies the top three uses of AI in healthcare as robot-assisted surgery, virtual nursing assistants and administrative workflow assistance. These three AI applications alone represent a potential estimated annual benefit of $78 billion for the healthcare industry by 2026.

The benefits of AI include improved precision in surgery, decreased length of stay, reduction in unnecessary hospital visits through remote assessment of patient conditions, and time-saving capabilities such as voice-to-text transcription. According to Accenture, these improvements represent a work time savings of 17 percent for physicians and 51 percent for registered nurses – at a critical time when there is no end in sight for the shortages of both nurses and doctors.

In a recent webinar discussing the role of AI in healthcare, John Lynn, founder of HealthcareScene.com, described other ways that AI can improve diagnosis, treatment and patient safety. These areas include dosage error detection, treatment plan design, determination of medication adherence, medical imaging, tailored prescription medicine and automated documentation.

One of the challenges to fully leveraging the insights and capabilities of AI is the volume of information accumulated in electronic medical records that is unstructured data. Translating this information into a format that can be used by clinical providers as well as financial and administrative staff to optimize treatment plans as well as workflows is possible with natural language processing – a branch of AI that enables technology to interpret speech and text and determine which information is critical.
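As a toy illustration of what extracting critical information from unstructured text can mean at the simplest level, the sketch below pulls structured fields out of a free-text note with pattern matching. Real clinical NLP relies on trained language models rather than regular expressions; the note text and patterns here are invented for illustration.

```python
import re

# Toy illustration: turning free-text clinical note content into
# structured, queryable fields. Production clinical NLP uses trained
# models; this regex-based sketch only shows the input/output shape.

NOTE = ("Patient reports taking lisinopril 10 mg daily. "
        "Blood pressure today 132/84. Denies chest pain.")

def extract_medications(text):
    """Find 'drug dose unit frequency' patterns in the note."""
    pattern = r"(\w+)\s+(\d+)\s*(mg|mcg)\s+(daily|weekly|bid)"
    return [{"drug": d, "dose": f"{n} {u}", "frequency": f}
            for d, n, u, f in re.findall(pattern, text, re.IGNORECASE)]

def extract_bp(text):
    """Find a systolic/diastolic blood pressure reading."""
    m = re.search(r"\b(\d{2,3})/(\d{2,3})\b", text)
    return ({"systolic": int(m.group(1)), "diastolic": int(m.group(2))}
            if m else None)

print(extract_medications(NOTE))
# [{'drug': 'lisinopril', 'dose': '10 mg', 'frequency': 'daily'}]
print(extract_bp(NOTE))
# {'systolic': 132, 'diastolic': 84}
```

Once note text is reduced to fields like these, it can flow into the same analytics and workflow systems as structured EMR data.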

The most often cited fear about a reliance on AI in healthcare is the potential for mistakes. Of course, humans make mistakes as well. We must remember that AI’s ability to tap into a much wider pool of information to make decisions or recommend options will result in more deeply informed decisions – if the data is good.

The proliferation of legacy systems, continually added applications and multiple EMRs in a health system increases the risk of data that cannot be accessed or cannot be shared in real-time to aid clinicians or an AI-supported program. Ensuring that data is aggregated into a central location, harmonized, transformed into a usable format and cleaned to provide high quality data is necessary to support reliable AI performance.
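A minimal sketch of what such a quality gate might look like: after aggregation, duplicate records are dropped and records that fail basic validity checks are quarantined before they reach an AI or analytics pipeline. All field names and rules here are hypothetical.

```python
# Sketch of a simple data-quality gate: after aggregating records from
# multiple systems, drop duplicates and quarantine records that fail
# basic validity checks before they feed an AI/analytics pipeline.

REQUIRED = ("mrn", "birth_date", "encounter_date")

def is_valid(rec):
    """A record must carry every required field, non-empty."""
    return all(rec.get(k) for k in REQUIRED)

def clean(records):
    seen, good, quarantined = set(), [], []
    for rec in records:
        if not is_valid(rec):
            quarantined.append(rec)   # route to manual review
            continue
        key = (rec["mrn"], rec["encounter_date"])
        if key in seen:               # same visit arrived from two feeds
            continue
        seen.add(key)
        good.append(rec)
    return good, quarantined

raw = [
    {"mrn": "1001", "birth_date": "1980-01-02", "encounter_date": "2018-06-01"},
    {"mrn": "1001", "birth_date": "1980-01-02", "encounter_date": "2018-06-01"},
    {"mrn": "1002", "birth_date": "", "encounter_date": "2018-06-01"},
]
good, bad = clean(raw)
print(len(good), len(bad))  # 1 1
```

Real pipelines layer many more rules on top (terminology mapping, range checks, referential integrity), but the shape is the same: only records that pass the gate reach downstream models.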

While AI might be able to handle the data aggregation and harmonization tasks in the future, we are not there yet. This is not, however, a reason to delay the use of AI in hospitals and other organizations across the healthcare spectrum.

Healthcare organizations can partner with companies that specialize in the aggregation of data from disparate sources to make the information available to all users. Increasing access to data throughout the organization is beneficial to health systems – even before they implement AI tools.

Although making data available to all of the organization’s providers, staff and vendors as needed may seem onerous, it is possible to do so without adding to the hospital’s IT staff burden or the capital improvement budget. The complexities of translating structured and unstructured data, multiple formats and a myriad of data sources can be balanced with data security concerns with the use of a team that focuses on these issues each day.

While most AI capabilities in use today are algorithms that reflect current best practices or research that are programmed by healthcare providers or researchers, this will change. In the future, AI will expand beyond algorithms, and the technology will be able to learn and make new connections among a wider set of data points than today’s more narrowly focused algorithms.

Whether or not your organization is implementing AI, considering AI or just watching its development, I encourage everyone to start by evaluating the data that will be used to “run” AI tools. Taking steps now to ensure clean, easy-to-access data will not only benefit clinical and operational tasks now but will also position the organization to more quickly adopt AI.

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies, a proud sponsor of Healthcare Scene. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

Geisinger Integrates Precision Medicine Into Care

Posted on May 21, 2018 I Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Lately, it seems like we read about new advances in precision medicine every day. Increasingly, physicians are able to adjust drug therapies and predict conditions like cancer and heart disease before they develop, particularly in the case of some cancers. However, many health organizations are still focused on research rather than delivering genomic medicine results to consumers.

The process of basing medical decisions on genomic data has certainly begun, with a number of health systems jumping on board. For example, a few months ago Intermountain Healthcare began the process of validating and launching several tests designed to identify hereditary genetic patterns that might lead to disease. Intermountain expects this work to be particularly fruitful for individuals with a family history of breast or ovarian cancer. The tests should identify both those previously diagnosed with cancer and healthy individuals with hereditary cancer gene mutations.

Now, at least one health system is taking things even further. Geisinger Health has announced that it plans to expand its genomics program beyond its research phase and into everyday care. The new program will target not only patients who have obvious symptoms but all patients Geisinger treats. The health system’s clinical DNA sequencing efforts will begin with a 1,000-patient pilot program taking place in mid-to-late 2018.

According to David Ledbetter, Ph.D., Geisinger executive vice president and chief scientific officer, the program will not only help current patients but also amass data that will help future patients. “As we sequence the exomes of our patients and learn even more about particular genome variants and their impact on different health conditions, we predict that as many as 10 to 15 percent of our patients will benefit,” he said.

The new strategy follows on the success of its MyCode Community Health Initiative, which it launched in 2014 in collaboration with Regeneron Pharmaceuticals. Since then, Geisinger has been analyzing the DNA of patients participating in the program, which has attracted more than 190,000 patient sign-ups to date. To date, more than 500 MyCode participants have been notified that they have a genomic variant which increases the chance that they’ll develop cancer or heart disease.

Geisinger’s effort sounds exciting, there’s little doubt. However, programs like these face some obstacles which the health system wouldn’t call attention to in a press release. For example, as my colleague John Lynn notes, integrating genomic data with other clinical information could be quite difficult, and sharing it even more so.

“Healthcare organizations have problems even sharing something as standard and simple as a PDF,” he wrote last year. “Once we have real genomic data and the markers behind them, EHRs won’t have any idea how to handle them. We’ll need a whole new model and approach or our current interoperability problems will look like child’s play.” Let’s hope the industry develops this new approach soon.