
5 Ways Allscripts Will Help Fight Opioid Abuse In 2018

Posted on May 22, 2018 | Written By

The following is a guest blog post by Paul Black, CEO of Allscripts, a proud sponsor of Health IT Expo.

Prescription opioid misuse and overdoses are on the rise. The Centers for Disease Control and Prevention (CDC) reports that more than 40 Americans die every day from prescription opioid overdose. It also estimates that the economic impact in the United States is $78.5 billion a year, including the costs of healthcare, lost productivity, addiction treatment and criminal justice involvement.

The opioid crisis has taken a devastating toll on our communities, families and loved ones. It is a complex problem that will require a lot of hard work from stakeholders across the healthcare continuum.

We all have a part to play. At Allscripts, we feel it is our responsibility to continuously improve our solutions to help providers address public health concerns. Our mission is to design technology that enables smarter care, delivered with greater precision, for better outcomes.

Here are five ways Allscripts plans to help clinicians combat the opioid crisis in 2018:

1) Establish a baseline. Does your patient population have a problem with opioids?

Before healthcare organizations can start addressing opioid abuse, they need to understand how the crisis is affecting their patient population. We are all familiar with the national statistics, but how does the crisis manifest in each community? What are the specific prescribing practices or overdose patterns that need the most attention?

Now that healthcare is on a fully digital platform, we can gain insights from the data. Organizations can more precisely manage the needs of each patient population. We are working with clients to uncover some of these patterns. For example, one client is using Sunrise™ Clinical Performance Manager (CPM) reports to more closely examine opioid prescribing patterns in emergency rooms.
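
To make that concrete, here is a minimal sketch of the kind of baseline report an analyst might run against an e-prescribing extract. The table and column names are hypothetical, not the actual Sunrise CPM schema; the point is simply to surface which departments prescribe opioids at unusual rates.

```python
import pandas as pd

# Hypothetical e-prescribing extract; column names are illustrative,
# not the actual Sunrise CPM schema.
rx = pd.DataFrame({
    "department": ["ED", "ED", "Ortho", "ED", "Primary Care"],
    "drug_class": ["opioid", "nsaid", "opioid", "opioid", "opioid"],
    "days_supply": [3, 10, 30, 5, 90],
})

# Share of prescriptions that are opioids, per department: a simple
# baseline that makes unusual prescribing patterns easy to spot.
baseline = (
    rx.assign(is_opioid=rx["drug_class"].eq("opioid"))
      .groupby("department")["is_opioid"]
      .agg(total_rx="size", opioid_rate="mean")
)
print(baseline.sort_values("opioid_rate", ascending=False))
```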

2) Secure the prescribing process. Is your prescribing process safe and secure?

Electronic prescribing of controlled substances (EPCS) can help reduce fraud. Unfortunately, even though the technology is widely available, it is not widely adopted. Areas where clinicians regularly use EPCS have seen significantly less prescription fraud and abuse.

EPCS functionality is already in place across our EHRs. While more than 90% of all pharmacies are EPCS-enabled, only 14% of controlled substances are prescribed electronically. We’re making EPCS adoption one of our top priorities at Allscripts, and we continue to discuss the benefits with policymakers.

3) Provide clinical decision support. Are you current with evidence-based best practices?

We are actively pursuing partnerships with health plans, pharmaceutical companies and third-party content providers to collaborate on evidence-based prescribing guidelines. These guidelines may suggest quantity limits, recommendations for fast-acting versus extended-release medications, protocols for additional and alternative therapies, and expanded educational material and content.

We’ll use the clinical decision support technologies we already have in place to present these assessment tools and guidelines at the time needed within clinical workflows. Our goal is to provide the information to providers at the right time, so that they can engage in productive conversations with patients, make informed decisions and create optimal treatment plans.
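
As a rough illustration of how such guideline checks might surface in the workflow, here is a toy decision-support rule. The threshold and field names are invented for this sketch; real limits would come from the evidence-based content partners described above.

```python
# Hypothetical guideline threshold, invented for this sketch; real limits
# come from evidence-based content, not this number.
OPIOID_DAYS_SUPPLY_LIMIT = 7

def cds_alerts(rx: dict) -> list:
    """Toy clinical decision support check run at prescribing time."""
    alerts = []
    if rx["drug_class"] != "opioid":
        return alerts
    if rx["days_supply"] > OPIOID_DAYS_SUPPLY_LIMIT:
        alerts.append("Days supply exceeds guideline limit; "
                      "consider a shorter course.")
    if rx.get("extended_release") and rx.get("opioid_naive"):
        alerts.append("Extended-release opioid for an opioid-naive patient; "
                      "consider a fast-acting alternative.")
    return alerts

print(cds_alerts({"drug_class": "opioid", "days_supply": 30,
                  "extended_release": True, "opioid_naive": True}))
```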

4) Simplify access to Prescription Drug Monitoring Programs (PDMPs). Are you avoiding prescribing because it’s too hard to check PDMPs?

PDMPs are state-level databases that collect, monitor and analyze e-prescribing data from pharmacies and prescribers. The CDC guidelines recommend that clinicians review the patient’s history of controlled substance prescriptions by checking PDMPs.

PDMPs, however, are not a unified source of information, which can make it challenging for providers to check them at the point of care. The College of Healthcare Information Management Executives (CHIME) has called for better EHR-PDMP integration, combined with data-driven reports to identify physician prescribing patterns.

In 2018, we’re working on integrating the PDMP into the clinician’s workflow for every patient. The EHR will take PDMP data and provide real-time alert scores that can make it easier to discern problems at the point of care.
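
For a sense of what might feed such an alert, here is a deliberately simplified scoring heuristic built around classic PDMP red flags: multiple prescribers, multiple pharmacies, and overlapping opioid and benzodiazepine fills. It is an illustration only, not Allscripts’ or any state’s actual algorithm.

```python
from datetime import date

def pdmp_alert_score(fills, as_of, lookback_days=90):
    """Toy PDMP-style alert score built on classic red flags.
    An illustrative heuristic only, not a vendor or state algorithm."""
    recent = [f for f in fills
              if (as_of - f["filled"]).days <= lookback_days]
    prescribers = {f["prescriber"] for f in recent}
    pharmacies = {f["pharmacy"] for f in recent}
    has_opioid = any(f["drug_class"] == "opioid" for f in recent)
    has_benzo = any(f["drug_class"] == "benzodiazepine" for f in recent)
    score = 10 * len(prescribers) + 10 * len(pharmacies)
    if has_opioid and has_benzo:  # risky combination flag
        score += 30
    return min(score, 100)

fills = [
    {"filled": date(2018, 5, 1), "prescriber": "Dr. A", "pharmacy": "P1",
     "drug_class": "opioid"},
    {"filled": date(2018, 5, 10), "prescriber": "Dr. B", "pharmacy": "P2",
     "drug_class": "benzodiazepine"},
]
print(pdmp_alert_score(fills, as_of=date(2018, 5, 22)))  # 70
```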

5) Predict risk. Can big data help you predict risk for addiction?

Allscripts has a team of data scientists dedicated to transforming data into information and actionable insights. These analysts combine vast amounts of information from within the EHR, our Clinical Data Warehouse – data that represents millions of patients – and public health mechanisms (such as PDMPs).

We use this “data lake” to develop algorithms to identify at-risk patients and reveal prescription patterns that most often lead to abuse, overdose and death. Our research on this is nascent, and early insights are compelling.
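
For a flavor of what such an algorithm involves, here is a minimal sketch that fits a classifier to synthetic stand-in features. Everything here, the features, labels and weights, is invented for illustration and bears no relation to the actual models.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-ins for the kinds of features a risk model might draw
# from EHR and PDMP-style sources; invented for illustration only.
X = np.column_stack([
    rng.integers(0, 6, 500),   # opioid prescriptions in the past year
    rng.integers(0, 4, 500),   # distinct prescribers seen
    rng.integers(0, 2, 500),   # prior substance-use diagnosis (0/1)
])
# Synthetic labels standing in for an observed adverse outcome.
y = (X @ np.array([0.4, 0.5, 1.5]) + rng.normal(0, 1, 500)) > 2.5

model = LogisticRegression().fit(X, y)
print("risk score:", model.predict_proba([[3, 2, 1]])[0, 1].round(2))
```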

The opioid epidemic cannot be solved overnight, nor is it something any of us can address alone. But we are enthusiastic about the teamwork and efforts of our entire industry to address this complex, multi-faceted epidemic.

Hear Paul Black discuss the future of health IT beyond the EHR at this year’s HIT Expo.

Making Healthcare Data Useful

Posted on May 14, 2018 | Written By

The following is a guest blog by Monica Stout from MedicaSoft

At HIMSS18, we spoke with the Delaware Health Information Network (DHIN) about making health data useful to patients. Useful data for patients is one piece of the complete healthcare puzzle. Providers also need useful data to deliver more precise care and to reach the patient populations who would benefit most directly from the insights they gain. Payers want access to clinical data, beyond just claims data, so they can aggregate it historically. This helps payers define which patients should be included in care coordination programs or who should receive additional disease management assistance or outreach.

When you’re a provider, hospital, health system, health information exchange, or insurance provider and have the data available, where do you start? It’s important to start at the source of the data to organize it in a way that makes insights and actions possible. Having the data is only half of the solution for patients, clinicians or payers. It’s what you do with the data that matters and how you organize it to be usable. Just because you may have years of data available doesn’t mean you can do anything with it.

Historically, healthcare has seen many barriers to marrying clinical and claims data. Things like system incompatibility, poor data quality, or siloed data can all impact organizations’ ability to access, organize, and analyze data stores. One way to increase the usability of your data is to start with the right technology platform. But what does that actually mean?

The right platform starts with a data model that is flexible enough to support a wide variety of use models. It makes data available via open, standards-based APIs. It organizes raw data into longitudinal records. It includes services, such as patient matching and terminology mapping, that make it easy to use the data in real-world applications. The right platform transforms raw data into information that helps providers and payers improve outcomes and manage risk, and gives patients a more complete view of their overall health and wellness.
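
Patient matching is a good example of such a platform service. The sketch below uses a deliberately simplistic deterministic match key; production platforms rely on probabilistic matching across many more fields, but the goal of linking differently formatted records into one longitudinal view is the same.

```python
import re

def normalize(name: str) -> str:
    """Crude normalization: lowercase and strip everything but letters."""
    return re.sub(r"[^a-z]", "", name.lower())

def match_key(record: dict) -> tuple:
    """A simplistic deterministic match key (last name + first initial +
    date of birth). Real platforms use probabilistic matching across many
    more fields; this only illustrates the idea."""
    return (normalize(record["last"]),
            normalize(record["first"])[:1],
            record["dob"])

a = {"last": "O'Brien", "first": "Katherine", "dob": "1980-04-02"}
b = {"last": "OBRIEN",  "first": "K.",        "dob": "1980-04-02"}
print(match_key(a) == match_key(b))  # True: the two records link
```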

Do you struggle with making your data insightful and actionable? What are you doing to transform your data? Share your insights, experiences, challenges, and thoughts in the comments or with us on Twitter @MedicaSoftLLC.

About Monica Stout
Monica is a HIT teleworker in Grand Rapids, Michigan by way of Washington, D.C., who has consulted at several government agencies, including the National Aeronautics and Space Administration (NASA) and the U.S. Department of Veterans Affairs (VA). She’s currently the Marketing Director at MedicaSoft. Monica can be found on Twitter @MI_turnaround or @MedicaSoftLLC.

About MedicaSoft
MedicaSoft designs, develops, delivers, and maintains EHR, PHR, and UHR software solutions and HISP services for healthcare providers and patients around the world. MedicaSoft is a proud sponsor of Healthcare Scene. For more information, visit www.medicasoft.us or connect with us on Twitter @MedicaSoftLLC, Facebook, or LinkedIn.

Investment in IT Infrastructure Needed to Power Healthcare Transformation

Posted on May 11, 2018 | Written By

Colin Hung is the co-founder of the #hcldr (healthcare leadership) tweetchat, one of the most popular and active healthcare social media communities on Twitter. Colin speaks, tweets and blogs regularly about healthcare, technology, marketing and leadership. He is currently an independent marketing consultant working with leading health IT companies. Colin is a member of #TheWalkingGallery. His Twitter handle is @Colin_Hung.

Genomics, artificial intelligence, chatbots and a host of other technologies are accelerating the transformation of healthcare from a paper-based system to a digital one. In order to power this transformation, IT infrastructure (storage, computational power, security, etc) needs to move from an implementation afterthought to the forefront of strategic planning. Windstream Enterprise is one company that is working closely with healthcare providers to ensure their IT infrastructure is ready for the challenges ahead and helping to put the infrastructure conversation front-and-center.

Windstream is part of a wave of companies with successful track records in other industries that are now bringing their solutions to healthcare. This wave is being led by technology giants like Amazon, Apple and Google. The moves they make in healthcare get a lot of attention, and rightfully so. Check out the excellent coverage by Christina Farr over at CNBC for more information.

Although I am intrigued by what the big tech companies are doing, what I truly find fascinating (and frankly inspiring) is the work of the hundreds of companies not named Amazon, Apple and Google. Windstream is one of the companies I have been interested in ever since I saw them at HIMSS17 and wondered “What is an Internet access provider doing in healthcare?”

Windstream was formed back in 2006 when Alltel spun off its landline business and merged it with VALOR Communications Group. Back then, the company provided 3.4 million access lines (telephone and internet connections) in 16 states. Over the years it has continued to grow through acquisition, expanding into fiber transport networks and fixed wireless. Windstream Enterprise, a division of Windstream, has had tremendous success helping clients in the retail and banking industries build and manage their technology infrastructure.

I recently had the chance to sit down with Windstream’s President and CEO, Tony Thomas, and one of their clients, the University of Kentucky Healthcare (UK), to talk about why healthcare needs to continue to invest in good IT infrastructure.


Thomas sees a lot of similarity between the digital transformation happening in healthcare and the ones that swept through the financial services and retail sectors.

“When you look at the success we’ve had in retail and banking, and then you look at where healthcare is heading, the commonality is the push to technology,” explained Thomas. “You can see that technology is changing the way that healthcare gets done. There is a focus on the patient experience and cost transformation.” This same focus on improving customer experiences and reducing costs is what helped spur the adoption of cloud and other advanced technologies at banks and retailers.

One driver of technology adoption in healthcare is the higher expectations patients have of healthcare providers. These higher expectations are fueled by the prevalence (and convenience) of consumer technologies that have made our lives so much easier: online shopping, online banking, booking appointments through our smartphones, etc.

In 2016, Deloitte released a study that compared consumers’ use of technology for health with their use of it in other aspects of their lives. Not surprisingly, the findings showed that healthcare lagged behind.

I see the delta between the use of technology as a consumer vs as a patient as a demand gap. The more healthcare lags behind, the more demand patients will put on healthcare organizations to adopt technologies that mirror what they experience as consumers.

To close that gap, organizations have accelerated the implementation of technologies like chatbots, omni-channel communications, artificial intelligence and data analytics. However, when you couple these new technologies with the use of electronic health records and advanced lab systems, the result is explosive data growth.

“Over the last two years we have generated over 1.2 Petabytes of pathology data,” noted Cody Bumgardner, PhD, Assistant Professor of Pathology & Laboratory Medicine at the University of Kentucky Healthcare (a Windstream client). “Pathology is really the collection of different points of data: images, genomic & laboratory data. Digital Pathology is taking all that data, making it both operationally effective and ready for computational analysis – transforming it into something useful and actionable for clinicians. Pathology and pathological reports arguably provide the most actionable data in the patient record, and they are low-cost relative to other data sources.”

“As the number of health and wellness devices increases,” continued Bumgardner, “it means we will have to collect and analyze more and more data. You will need some solid infrastructure to allow that data to flow, and you will need good computational power as close to the point of data generation as possible.”

Getting funds and resources to keep IT infrastructure up to date is not easy. Jan Bates, Director of Systems Operations at the University of Kentucky Healthcare summed it up succinctly: “It’s hard to get buy-in from executives because it’s not something they have a keen interest in discussing. In fact, many find it boring. You HAVE TO relate infrastructure back to the business. You have to answer the questions: What benefits will the organization realize? and What will the organization be able to do when the infrastructure is well maintained?”

The answer according to Windstream’s CEO Tony Thomas is nothing short of transforming the way healthcare is delivered: “We are really at an exciting time in healthcare. There are tons of new technologies emerging [like AI]. We’re going to need solid investments in the underlying infrastructure to support those technologies which will revolutionize the way healthcare is delivered.”

It gives me hope that companies like Windstream are bringing their extensive data and infrastructure management expertise from the banking sector to healthcare. Although healthcare is a unique industry, that doesn’t mean we are restricted to adopting solutions developed by healthcare insiders. We can (and should) leverage the best from other industries and adapt them to the unique aspects of healthcare.

“Given the opportunities we see in healthcare, we are increasing our investment here,” stated Thomas. “We think we can be a provider of choice for many healthcare organizations. In the end it’s all about the application of technology to solve problems in healthcare.”

Well said.

Improving Data Outcomes: Just What The Doctor Ordered

Posted on May 8, 2018 | Written By

The following is a guest blog post by Dave Corbin, CEO of HULFT.

Health care has a data problem. Vast quantities of data are generated, but inefficiencies around sharing, retrieval, and integration have acute repercussions in an environment of squeezed budgets and growing patient demands.

The sensitive nature of much of the data being processed is a core issue. Confidential patient information has traditionally encouraged a ‘closed door’ approach to data management and an unease over hyper-accessibility to this information.

Compounding the challenge is the sheer scale and scope of the typical health care environment and its myriad departmental layers. The mix of new and legacy IT systems used for everything from billing records to patient tracking often means deep silos and poor data connections, the cumulative effect of which undermines decision-making. As delays become commonplace, this ongoing battle to coordinate disparate information manifests itself in many different ways in a busy hospital.

Optimizing bed occupancies – a data issue?

One example involves managing bed occupancy, a complex task which needs multiple players to be in the loop on the latest admission or discharge status of a patient. Anecdotal evidence points to a process often informed by manual feedback and competing information. Nurses at the end of their shift may report that a patient is about to be discharged, unaware that a doctor has since requested more tests for that patient. As everyone is left waiting for the results from the laboratory, the planned changeover of beds is delayed, with many knock-on effects, increasing congestion and costs and frustrating staff and patients in equal measure.

How data is managed becomes a critical factor in tackling the variations that creep into critical processes and resource utilization. In the example above, harnessing predictive modeling and data mining to forecast patient discharges, so that the number of beds available in the coming weeks can be estimated more accurately, will no doubt become an increasingly mainstream option for the sector.
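
The simplest version of such a forecast is a seasonal-naive model: assume next week’s discharges mirror each weekday’s historical average. The numbers below are invented, and a real deployment would use far richer models, but the sketch shows the basic shape of the approach.

```python
import pandas as pd

# Invented daily discharge counts for four weeks, with a weekly rhythm
# (fewer discharges on weekends). 2018-04-02 is a Monday.
history = pd.Series(
    [30, 32, 31, 29, 35, 20, 18] * 4,
    index=pd.date_range("2018-04-02", periods=28, freq="D"),
)

# Seasonal-naive forecast: assume next week's discharges mirror the
# historical average for the same weekday.
by_weekday = history.groupby(history.index.dayofweek).mean()
next_week = pd.date_range(history.index[-1] + pd.Timedelta(days=1), periods=7)
forecast = pd.Series([by_weekday[d.dayofweek] for d in next_week],
                     index=next_week)
print(forecast)
```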

Predictive analytics is great and all, but first….

Before any of this can happen, health care organizations need a solid foundation of accessible and visible data which is centralized, intuitive, and easy to manage.

By providing a holistic approach to data transfer and integration, data logistics can help deliver security, compliance, and seamless connectivity, speeding up the processing of large volumes of sensitive material such as electronic health records, the kind of data that simply cannot be lost. It can also ensure the reliable and secure exchange of intelligence with outside health care vendors and partners.

For better data outcomes, we’re calling for a new breed of data logistics that’s intuitive and easy to use. Monitoring interfaces that let anyone with permission to access the network see which integrations and transfers are running in real time, with no programming or coding required, open up data management to a far wider section of an organization.

Collecting data across a network of multiple transfer and integration activities, and putting it in a place where people can use, manage and manipulate it, becomes central to breaking down the barriers that have long compromised efficiency in the health care sector.

HULFT works with health care organizations of all sizes to establish a strong back-end data infrastructure that makes front-end advances possible. Learn how one medical technology pioneer used HULFT to drive operational efficiencies and improve quality assurance in this case study.

Dave Corbin is CEO of HULFT, a comprehensive data logistics platform that allows IT to find, secure, transform and move information at scale. HULFT is a proud sponsor of Health IT Expo, a practical innovation conference organized by Healthcare Scene. Find out more at hulftinc.com.

An Approach For Privacy-Protecting Big Data

Posted on February 6, 2017 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

There’s little doubt that the healthcare industry is zeroing in on some important discoveries as providers and researchers mine collections of clinical and research data. Big data does come with some risks, however, with some observers fearing that aggregated and shared information may breach patient privacy. However, at least one study suggests that patients can be protected without interrupting data collection.

In what it calls a first, a new study appearing in the Journal of the American Medical Informatics Association has demonstrated that protecting the privacy of patients can be done without too much fuss, even when the patient data is pulled into big data stores used for research.

According to the study, a single patient anonymization algorithm can offer a standard level of privacy protection across multiple institutions, even when they are sharing clinical data back and forth. Researchers say that larger clinical datasets can protect patient anonymity without generalizing or suppressing data in a manner which would undermine its use.

To conduct the study, researchers set a privacy adversary out to beat the system. This adversary, who had collected patient diagnoses from a single unspecified clinic visit, was asked to match them to a record in a de-identified research dataset known to include the patient. To conduct the study, researchers used data from Vanderbilt University Medical Center, Northwestern Memorial Hospital in Chicago and Marshfield Clinic.

The researchers knew that according to prior studies, the more data associated with each de-identified record, and the more complex and diverse the patient’s problems, the more likely it was that their information would stick out from the crowd. And that would typically force managers to generalize or suppress data to protect patient anonymity.

In this case, the team hoped to find out how much generalization and suppression would be necessary to protect the identities found within the three institutions’ data, and whether the protected data would ultimately be of any use to future researchers.

The team processed relatively small datasets from each institution representing patients in a multi-site genotype-disease association study; larger datasets representing patients in the three institutions’ banks of de-identified DNA samples; and large sets which stood in for each institution’s EMR population.

Using the algorithm they developed, the team found that most of the data’s value was preserved despite the occasional need for generalization and suppression. On average, 12.8% of diagnosis codes needed generalization; the medium-sized biobank models saw only 4% of codes needing generalization; and among the large databases representing EMR populations, only 0.4% needed generalization and no codes required suppression.
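
The study’s algorithm is more sophisticated than anything that fits in a few lines, but a toy version of the generalize-then-suppress idea might look like the following, with illustrative diagnosis codes and a made-up k threshold.

```python
from collections import Counter

def generalize(code: str) -> str:
    """Generalize an ICD-10 code to its 3-character category
    (e.g. 'E11.9' -> 'E11'), trading detail for anonymity."""
    return code.split(".")[0]

def anonymize(records, k=2):
    """Generalize codes, then suppress any code that still appears in
    fewer than k records: a toy version of the generalize/suppress idea,
    not the JAMIA study's algorithm."""
    generalized = [[generalize(c) for c in rec] for rec in records]
    counts = Counter(c for rec in generalized for c in set(rec))
    return [[c for c in rec if counts[c] >= k] for rec in generalized]

records = [["E11.9", "I10"], ["E11.65", "I10"], ["C50.911"]]
print(anonymize(records))  # [['E11', 'I10'], ['E11', 'I10'], []]
```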

More work like this is clearly needed as the demand for large-scale clinical, genomic and transactional datasets grows. But in the meantime, this seems to be good news for budding big data research efforts.

UCSF Partners With Intel On Deep Learning Analytics For Health

Posted on January 30, 2017 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

UC San Francisco’s Center for Digital Health Innovation has agreed to work with Intel to deploy and validate a deep learning analytics platform. The new platform is designed to help clinicians make better treatment decisions, predict patient outcomes and respond quickly in acute situations.

The Center’s existing projects include CareWeb, a team-based collaborative care platform built on Salesforce.com social and mobile communications tech; Tidepool, which is building infrastructure for next-gen smart diabetes management apps; Health eHeart, a clinical trials platform using social media, mobile and real-time sensors to change heart disease treatment; and Trinity, which offers “precision team care” by integrating patient data with evidence and multi-disciplinary data.

These projects seem to be a good fit with Intel’s healthcare efforts, which are aimed at helping providers succeed at distributed care communication across desktop and mobile platforms.

As the two note in their joint press release, creating a deep learning platform for healthcare is extremely challenging, given that the relevant data is complex and stored in multiple incompatible systems. Intel and UCSF say the next-generation platform will address these issues, allowing them to integrate not only data collected during clinical care but also inputs from genomic sequencing, monitors, sensors and wearables.

Supporting all of this activity obviously calls for a lot of computing power. The partners will run deep learning use cases in a distributed fashion on a CPU-based cluster designed to crunch through very large datasets handily. Intel is rolling out the computing environment on its Xeon processor-based platform, which supports data management and the algorithm development lifecycle.

As the deployment moves forward, Intel leaders plan to study how deep learning analytics and machine-driven workflows can optimize clinical care and patient outcomes, and leverage what they learn when they create new platforms for the healthcare industry. Both partners believe that this model will scale for future use cases, such as larger convolutional neural network models, artificial networks patterned after living organisms, and very large multidimensional datasets.

Once implemented, the platform will allow users to conduct advanced analytics on all of this disparate data, using machine learning and deep learning algorithms. And if all performs as expected, clinicians should be able to draw on these advanced capabilities on the fly.

This looks like a productive collaboration. If nothing else, it appears that in this case the technology platform UCSF and Intel are developing may be productized and made available to other providers, which could be very valuable. After all, while individual health systems (such as Geisinger) have the resources to kick off big data analytics projects on their own, it’s possible a standardized platform could make such technology available to smaller players. Let’s see how this goes.

Some Projections For 2017 Hospital IT Spending

Posted on January 4, 2017 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

A couple of months ago, HIMSS released some statistics from its survey on US hospitals’ plans for IT investment over the next 12 months. The results contain a couple of data points that I found particularly interesting:

  • While I had expected the most common type of planned spending to be focused on population health or related solutions, HIMSS found that pharmacy was the most active category. In fact, 51% of hospitals were planning to invest in at least one pharmacy technology, largely to improve tracking of medication dispensing in additional patient care environments. Researchers also found that 6% of hospitals were planning to add carousels or packagers in their pharmacies.
  • Eight percent of hospitals said that they plan to invest in EMR components, which I hadn’t anticipated (though it makes sense in retrospect). HIMSS reported that 14% of hospitals at Stage 1-4 of its Electronic Medical Record Adoption Model are investing in pharmacy tech for closed-loop med administration, and 17% in auto ID tech. Four percent of Stage 6 hospitals plan to support or expand information exchange capabilities. Meanwhile, 60% of Stage 7 hospitals are investing in hardware infrastructure “for the post-EMR world.”

Other data from the HIMSS report included news of new analytics and telecom plans:

  • Researchers say that recent mergers and acquisitions are triggering new investments around telephony. They found that 12% of hospitals with inpatient revenues between $25 million and $125 million, and 6% of hospitals with more than $500 million in inpatient revenues, are investing in VOIP and telemedicine. FWIW, I’m not sure how mergers and acquisitions would trigger telemedicine rollouts, as they’re already well underway at many hospitals; maybe these deals foster new thinking and innovation?
  • As readers know, hospitals are increasingly spending on analytics solutions to improve care and make use of big data. However (and this surprised me), only 8% of hospitals reported plans to buy at least one analytics technology. My guess is that this number is small because a) hospitals may not have collected their big data assets in easily-analyzed form yet and b) they’re still hoping to make better use of their legacy analytics tools.

Looking at these stats as a whole, I get the sense that the hospitals surveyed are expecting to play catch-up and shore up their infrastructure next year, rather than sink big dollars into future-looking solutions.

Without a doubt, hospital leaders are likely to invest soon in game-changing technologies such as cutting-edge patient engagement and population health platforms to prepare for the shift to value-based care. It’s inevitable.

But in the meantime it probably makes sense for them to focus on internal cost drivers like pharmacy departments, whose average annual inpatient drug spending shot up by more than 23% between 2013 and 2015. Without stanching that kind of bleeding, hospitals are unlikely to get as much value as they’d like from big-idea investments in the future.

Paris Hospitals Use Big Data To Predict Admissions

Posted on December 19, 2016 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Here’s a fascinating story in from Paris (or par-ee, if you’re a Francophile), courtesy of Forbes. The article details how a group of top hospitals there is running a trial of big data and machine learning tech designed to predict admission rates. The hospitals’ predictive model, which is being tested at four of the hospitals that make up the Assistance Publique-Hôpitaux de Paris (AP-HP), is designed to forecast admission rates as much as 15 days in advance.

The four hospitals participating in the project have pulled together a massive trove of data from both internal and external sources, including 10 years’ worth of hospital admission records. The goal is to forecast admissions by the day and even by the hour for the four facilities participating in the test.

According to Forbes contributor Bernard Marr, the project involves using time series analysis techniques which can detect patterns in the data useful for predicting admission rates at different times.  The hospitals are also using machine learning to determine which algorithms are likely to make good predictions from old hospital data.
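
For readers who want a feel for the technique, here is a minimal autoregressive sketch on synthetic hourly counts. It is not the AP-HP model, just the general pattern of learning to predict admissions from lagged admission data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

# Synthetic hourly admission counts with a daily cycle plus noise,
# standing in for years of real admission records.
hours = np.arange(24 * 120)
admissions = 10 + 4 * np.sin(2 * np.pi * hours / 24) + rng.poisson(2, hours.size)

# Autoregressive features: the counts 24 hours and one week (168 hours)
# before each target hour.
lags = [24, 168]
X = np.column_stack([admissions[168 - lag: -lag] for lag in lags])
y = admissions[168:]
model = LinearRegression().fit(X, y)

# Forecast the coming hour from the most recent lagged observations.
next_features = np.array([[admissions[-24], admissions[-168]]])
print("forecast:", model.predict(next_features).round(1))
```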

The system the hospitals are using is built on the open source Trusted Analytics Platform. According to Marr, the partners felt that the platform offered a particularly strong capacity for ingesting and crunching large amounts of data. They also built on TAP because it was geared towards open, collaborative development environments.

The pilot system is accessible via a browser-based interface, designed to be simple enough that data science novices like doctors, nurses and hospital administration staff could use the tool to forecast visit and admission rates. Armed with this knowledge, hospital leaders can then pull in extra staffers when increased levels of traffic are expected.

Being able to work in a distributed environment will be key if AP-HP decides to roll the pilot out to all of its 44 hospitals, so developers built with that in mind. To be prepared for the future, which might call for adding a great deal of storage and processing power, they designed a distributed, cloud-based system.

“There are many analytical solutions for these type of problems, [but] none of them have been implemented in a distributed fashion,” said Kyle Ambert, an Intel data scientist and TAP contributor who spoke with Marr. “Because we’re interested in scalability, we wanted to make sure we could implement these well-understood algorithms in such a way that they work over distributed systems.”

To make this happen, however, Ambert and the development team had to build their own tools, an effort which resulted in the first contribution to an open-source framework of code designed to carry out analysis over a scalable, distributed framework, one which is already being deployed in other healthcare environments, Marr reports.

My feeling is that there’s no reason American hospitals can’t experiment with this approach. In fact, maybe they already are. Readers, are you aware of any US facilities which are doing something similar? (Or are most still focused on “skinny” data?)

Easing The Transition To Big Data

Posted on December 16, 2016 | Written By

Anne Zieger is a veteran healthcare branding and communications expert with more than 25 years of industry experience. Her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also worked extensively with healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Tapping the capabilities of big data has become increasingly important for healthcare organizations in recent years. But as HIT expert Adheet Gogate notes, the transition is not an easy one, forcing these organizations to migrate from legacy data management systems to new systems designed specifically for use with new types of data.

Gogate, who serves as vice president of consulting at CitiusTech, rightly points out that even when hospitals and health systems spend big bucks on new technology, they may not see any concrete benefits. But if they move through the big data rollout process correctly, their efforts are more likely to bear fruit, he suggests. And he offers four steps organizations can take to ease this transition. They include:

  • Have the right mindset:  Historically, many healthcare leaders came up through the business in environments where retrieving patient data was difficult and prone to delays, so their expectations may be low. But if they hope to lead successful big data efforts, they need to embrace the new data-rich environment, understand big data’s potential and ask insightful questions. This will help to create a data-oriented culture in their organization, Gogate writes.
  • Learn from other industries: Bear in mind that other industries have already grappled with big data models, and that many have seen significant successes already. Healthcare leaders should learn from these industries, which include civil aviation, retail and logistics, and consider adopting their approaches. In some cases, they might want to consider bringing an executive from one of these industries on board at a leadership level, Gogate suggests.
  • Employ the skills of data scientists: To tame the floods of data coming into their organizations, healthcare leaders should actively recruit data scientists, whose job is to translate business questions into the methods, approaches and processes for developing analytics that can answer them. Once they hire such scientists, leaders should ensure they have the active support of frontline staffers and operations leaders, so that the analyses they provide are useful to the team, Gogate recommends.
  • Think like a startup: It helps when leaders adopt an entrepreneurial mindset toward big data rollouts. These efforts should be led by senior leaders who are comfortable in this space, treat the initiative like a startup within the enterprise, and invest first in building critical mass in data science. Then, assign a group of core team members and frontline managers to the areas where analytics capabilities are most needed. Rotate these teams across the organization to wherever business problems reside, and let them generate valuable improvement insights. Over time, these insights will help the whole organization improve its big data capabilities, Gogate says.

Of course, taking an agile, entrepreneurial approach to big data will only work if it has widespread support, from the C-suite on down. Also, healthcare organizations will face some concrete barriers in building out big data capabilities, such as recruiting the right data scientists and identifying and paying for the right next-gen technology. Other issues include falling reimbursements and the need to personalize care, according to healthcare CIO David Chou.

But assuming these other challenges are met, embracing big data with a willing-to-learn attitude is more likely to work than treating it as just another development project. And the more you learn, the more successful you’ll be in the future.

Using NLP with Machine Learning for Predictive Analytics in Healthcare

Posted on December 12, 2016 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

There are a lot of elements involved in doing predictive analytics in healthcare effectively. In most cases I’ve seen, organizations working on predictive analytics do some, but not all, of what’s needed to make it as effective as possible. This was highlighted for me when I recently talked with Frank Stearns, Executive Vice President at HBI Solutions, at the Digital Health Conference in NYC.

Here’s a great overview of the HBI Solutions approach to patient risk scores:

[Image: HBI Solutions healthcare predictive analytics model]

This process will look familiar to most people in the predictive analytics space. You take all the patient data you can find, put it into a machine learning engine and output a patient risk score. One of the biggest trends is the real-time nature of this process. Plus, I love the way the patient risk score includes the attributes that influenced a patient’s risk score. Both of these are incredibly important when trying to make this data actionable.
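
A stripped-down version of that output, a score accompanied by the attributes that drove it, can be sketched with a simple linear model. The features and training data below are invented for illustration; HBI’s engine is, of course, far more elaborate.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
features = ["age_scaled", "er_visits", "med_count"]

# Invented training data standing in for the structured and NLP-derived
# fields that feed the engine; not HBI Solutions' actual inputs.
X = rng.normal(size=(400, 3))
y = (X @ np.array([0.8, 1.2, 1.0]) + rng.normal(0, 1, 400)) > 0

model = LogisticRegression().fit(X, y)

def risk_with_drivers(x):
    """Return the risk score plus each feature's contribution to the
    log-odds, so the score ships with its influencing attributes."""
    score = model.predict_proba([x])[0, 1]
    contributions = model.coef_[0] * np.asarray(x)
    drivers = sorted(zip(features, contributions.round(2)),
                     key=lambda kv: -abs(kv[1]))
    return round(float(score), 2), drivers

print(risk_with_drivers([1.5, 2.0, -0.3]))
```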

However, the thing that stood out for me in HBI Solutions’ approach is the inclusion of natural language processing (NLP) in their analysis of the unstructured patient data. I’d seen NLP being used in EHR software before, but I think the implementation of NLP is even more powerful in doing predictive analytics.

In the EHR world, you have to be absolutely precise. If you’re not precise with the way you code a visit, you won’t get paid. If you’re not precise with how the diagnosis is entered into the EHR, that can have long-term consequences. This has posed a real challenge for NLP, since NLP is not 100% accurate. It has gotten astoundingly good, but it still has shortcomings that require human review when it is used in an EHR.

The same isn’t true when applying NLP to unstructured data for predictive analytics. Predictive analytics by its very nature incorporates some modicum of variation and error. It’s understood that a prediction could be wrong; it’s an indication of risk, not a definitive finding. Certainly, a failure in NLP’s recognition of certain data could throw off a prediction. That’s unfortunate, but predictive analytics aren’t relied upon the way documentation in an EHR is. So it’s not nearly as big of a deal.

Plus, the value gained from applying NLP to pull out the nuggets of information that exist in the unstructured narrative sections of healthcare data is well worth the small risk of the NLP being incorrect. As Frank Stearns from HBI Solutions pointed out to me, the unstructured data is often where the really valuable data about a patient’s risk exists.

I’d be interested in having HBI Solutions do a study of the findings that are often available in the unstructured data but weren’t available otherwise. However, it’s not hard to imagine a doctor documenting patient observations in the unstructured EHR narrative that they didn’t want to include as a formal diagnosis. Not the least of these are behavioral health observations that the doctor observed and documented but didn’t want to fully diagnose. NLP can pull these out of the narrative and include them in the patient risk score.
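
To illustrate the idea only (clinical NLP engines are vastly more sophisticated), here is a toy extractor that flags behavioral health mentions in a free-text note while skipping simple negations.

```python
import re

# Toy lexicon; production systems use full clinical NLP engines with
# robust negation and context handling, not keyword lists.
BEHAVIORAL_TERMS = ["anxious", "depressed", "alcohol use", "insomnia"]

def extract_observations(note: str):
    """Flag behavioral-health mentions in a free-text note, skipping
    sentences with simple negations like 'no' or 'denies'."""
    found = []
    for sentence in re.split(r"[.!?]", note.lower()):
        if re.search(r"\b(no|denies)\b", sentence):
            continue
        for term in BEHAVIORAL_TERMS:
            if term in sentence:
                found.append(term)
    return found

note = ("Patient appears anxious and reports insomnia. "
        "Denies alcohol use.")
print(extract_observations(note))  # ['anxious', 'insomnia']
```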

Given this perspective, it’s hard to imagine we’ll ever be able to get away from using NLP or related technology to pull out the valuable insights in the unstructured data. Plus, it’s easy to see how predictive analytics that don’t use NLP are going to be deficient when trying to use machine learning to analyze patients. What’s amazing is that HBI Solutions has been applying machine learning to healthcare for 5 years. That’s a long time, but also explains why they’ve implemented such advanced solutions like NLP in their predictive analytics solutions.