
Waiting For The Perfect “Standard” Is Not The Answer To Healthcare’s Interoperability Problem

Posted on October 16, 2017 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

Have you bought into the “standards will solve healthcare’s interoperability woes” train of thought? Everyone understands that standards are necessary to enable disparate systems to communicate with each other, but as new applications and new uses for data continually appear, healthcare organizations that are waiting for universal standards are not maximizing the value of their data. More importantly, they will be waiting a long time to realize the full potential of their data.

Healthcare interoperability is not just a matter of transferring data as an entire file from one user to another. Instead, effective exchange of information allows each user to select which elements of a patient’s chart are needed, and then access them in a format that enables analysis of different data sets to provide a holistic picture of the patient’s medical history or of clinical trends in a population of patients. Healthcare’s interoperability challenge is further exacerbated by different contextual interpretations of the terms within those data elements. For instance, how many different ways are there to say heart attack?
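To make the “heart attack” point concrete, here is a minimal sketch of the kind of terminology normalization an integration layer has to perform. The synonym map and canonical code are illustrative stand-ins; production harmonization relies on curated vocabularies such as SNOMED CT or ICD-10 and on terminology services, not a hand-built dictionary.

```python
# Illustrative sketch only: normalize free-text diagnosis terms to a single
# canonical concept. The synonym list and code are made up for illustration;
# real harmonization uses curated vocabularies (e.g., SNOMED CT, ICD-10).
CANONICAL_CODES = {"myocardial infarction": "I21"}  # ICD-10 family for acute MI

SYNONYMS = {
    "heart attack": "myocardial infarction",
    "mi": "myocardial infarction",
    "acute mi": "myocardial infarction",
    "stemi": "myocardial infarction",
}

def normalize(term: str) -> tuple[str, str | None]:
    """Map a raw term to (canonical name, code), if the term is recognized."""
    key = term.strip().lower()
    canonical = SYNONYMS.get(key, key)
    return canonical, CANONICAL_CODES.get(canonical)

print(normalize("Heart Attack"))  # ('myocardial infarction', 'I21')
print(normalize("STEMI"))         # ('myocardial infarction', 'I21')
```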

The development of the Health Level Seven (HL7®) FHIR® standard, which stands for Fast Healthcare Interoperability Resources, represents a significant step toward interoperability. While the data exchange draft that is being developed and published by HL7 eliminates many of the complexities of earlier HL7 versions and facilitates real-time data exchange via web technology, publication of Release 4 – the first normative version of the standard – is not anticipated until October 2018.
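For readers who have not worked with FHIR yet, its exchange model is an ordinary RESTful API: each resource type (Patient, Observation, MedicationRequest and so on) is read or searched at a predictable URL and returned as JSON or XML. Here is a minimal sketch of reading a Patient resource, assuming a hypothetical FHIR server URL:

```python
# Minimal FHIR read sketch. The base URL is a hypothetical placeholder, not a
# real server; any FHIR R4 endpoint follows the same pattern.
import requests

BASE = "https://fhir.example.org/baseR4"  # hypothetical FHIR endpoint

def get_patient(patient_id: str) -> dict:
    """Fetch a single Patient resource as FHIR JSON."""
    resp = requests.get(
        f"{BASE}/Patient/{patient_id}",
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

patient = get_patient("example")
name = patient.get("name", [{}])[0]  # Patient.name is a list of HumanName entries
print(name.get("family"), " ".join(name.get("given", [])))
```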

As these standards are further developed, the key to universal adoption will be simplicity, according to John Lynn, founder of HealthcareScene.com. However, he suggests that CIOs stop waiting for “perfect standards” and focus on how they can best achieve interoperability now.

Even with standards that can be implemented in all organizations, the complexity and diversity of the healthcare environment mean that it will take time to move everyone to the same standards. This is complicated by the number of legacy systems and the patchwork of applications that have been added to healthcare IT systems in an effort to meet quickly changing needs throughout the organization. Shrinking financial resources for capital investment and increasing competition for IT professionals limit a health system’s ability to make the overall changes necessary for interoperability – no matter which standards are adopted.

Some organizations are turning to cloud-based, managed service platforms to perform the integration, aggregation and harmonization that makes data available to all users – regardless of the system or application in which the information was originally collected. This approach solves the financial and human resource challenges by making it possible to budget integration and data management requirements as an operational rather than a capital investment. This strategy also relieves the burden on in-house IT staff by relying on the expertise of professionals who focus on emerging technologies, standards and regulations that enable safe, compliant data exchange.

How are you planning to scale your interoperability and integration efforts?  If you're waiting for standards, why are you waiting?

As a leading provider of healthcare interoperability solutions, Liaison is a proud sponsor of Healthcare Scene. While the conversation about interoperability has been ongoing for many years, ideas, new technology and new strategies discussed and shared by IT professionals will lead to successful healthcare data exchange that will transform healthcare and result in better patient care.

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

AHA Asks Congress To Reduce Health IT Regulations for Medicare Providers

Posted on September 22, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

The American Hospital Association has sent a letter to Congress asking members to reduce regulatory burdens for Medicare providers, including mandates affecting a wide range of health IT services.

The letter, which is addressed to the House Ways and Means Health subcommittee, notes that in 2016, CMS and other HHS agencies released 49 rules impacting hospitals and health systems, which make up nearly 24,000 pages of text.

“In addition to the sheer volume, the scope of changes required by the new regulations is beginning to outstrip the field’s ability to absorb them,” says the letter, which was signed by Thomas Nickels, executive vice president of government relations and public policy for the AHA. The letter came with a list of specific changes AHA is proposing.

Proposals of potential interest to health IT leaders include the following. The AHA is asking Congress to:

  • Expand Medicare coverage of telehealth to patients outside of rural areas and expand the types of technology that can be used. It also suggests that CMS should automatically reimburse for Medicare-covered services when delivered via telehealth unless there’s an individual exception.
  • Remove HIPAA barriers to sharing patient medical information with providers that don’t have a direct relationship with that patient, in the interests of improving care coordination and outcomes in a clinically-integrated setting.
  • Cancel Stage 3 of the Meaningful Use program, institute a 90-day reporting period for future program years and eliminate the all-or-nothing approach to compliance.
  • Suspend eCQM reporting requirements, given how difficult it is at present to pull outside data into certified EHRs for quality reporting.
  • Remove requirements that hospitals attest both that they have bought technology which supports health data interoperability and that they have responded promptly and in good faith to others’ requests to exchange data. At present, hospitals could face penalties for technical issues outside their control.
  • Refocus the ONC to address a narrower scope of issues, largely EMR standards and certification, including testing products to assure health data interoperability.

I am actually somewhat surprised to say that these proposals seem largely reasonable. Typically, proposals developed by trade groups tend to be a bit too stacked in favor of that group’s own concerns. (By the way, I’m not taking a position on the rest of the regulatory ideas the AHA put forth.)

For example, expanding Medicare telehealth coverage seems prudent. Given Medicare beneficiaries’ age, levels of chronic illness and attendant mobility issues, telehealth could potentially do great things for them.

Though it should be done carefully, tweaking HIPAA rules to address the realities of clinical integration could be a good thing. Certainly, no one is suggesting that we ought to throw the rulebook out the window, but it probably makes sense to square it with today’s clinical realities.

The idea of torquing down MU 3 makes some sense to me as well, given the uncertainties around the entirety of MU. I don’t know if limiting future reporting to 90-day intervals is wise, but I wouldn’t take it off the table.

In other words, despite spending much of my career ripping apart trade groups’ legislative proposals, I find myself in the unusual position of supporting the majority of the ones I list above. I hope Congress gives these suggestions some serious consideration.

Interoperability: Is Your Aging Healthcare Integration Engine the Problem?

Posted on September 18, 2017 | Written By

The following is a guest blog post by Gary Palgon, VP Healthcare and Life Sciences Solutions at Liaison Technologies.

There is no shortage of data collected by healthcare organizations that can be used to improve clinical as well as business decisions. Announcements of new technology that collects patient information, clinical outcome data and operational metrics promising to help physicians and hospitals deliver better, more cost-effective care bombard us on a regular basis.

The problem today is not the amount of data available to help us make better decisions; the problem is the inaccessibility of the data. When different users – physicians, allied health professionals, administrators and financial managers – turn to data for decision support, they find themselves limited to their own silos of information. The inability to access and share data across different disciplines within the healthcare organization prevents the user from making a decision based on a holistic view of the patient or operational process.

In a recent article, Alan Portela points out that precision medicine, which requires “the ability to collect real-time data from medical devices at the moment of care,” cannot happen easily without interoperability – the ability to access data across disparate systems and applications. He also points out that interoperability does not exist yet in healthcare.

Why are healthcare IT departments struggling to achieve interoperability?

Although new and improved applications are adopted on a regular basis, healthcare organizations are just now realizing that their integration middleware is no longer able to handle new types of data such as social media content, today’s data volumes, and the increasing number of methods for connecting on a real-time basis. Their integration platforms also cannot handle the exchange of information with disparate data systems and applications beyond the four walls of the hospital. In fact, hospitals of 500 beds or more average 25 unique data sources, with six electronic medical records systems in use. Those numbers will only move up over time, not down.

Integration engines in place throughout healthcare today were designed well before the explosion of the data-collection tools and digital information that exist today. Although updates and additions to integration platforms have enabled some interoperability, the need for complete interoperability is creating a movement to replace integration middleware with cloud-based managed services.

A study by the Aberdeen Group reveals that 76 percent of organizations will be replacing their integration middleware, and 70 percent of those organizations will adopt cloud-based integration solutions in the next three years.

The report also points out that as healthcare organizations move from on-premises solutions to cloud-based platforms, business leaders see migration to the cloud and managed services as a way to manage operational expenses on a monthly basis rather than as large, up-front capital investments. An additional benefit is better use of in-house IT staff members, who are tasked with mission-critical, day-to-day responsibilities and may not have time to focus on continuously improving the platform to ensure it can handle future needs.

Healthcare has come a long way in the adoption of technology that can collect essential information and put it in the hands of clinical and operational decision makers. Taking that next step to effective, meaningful interoperability is critical.

As a leading provider of healthcare interoperability solutions, Liaison is a proud sponsor of Healthcare Scene. It is only through discussions and information-sharing among Health IT professionals that healthcare will achieve the organizational support for the steps required for interoperability.

Join John Lynn and Liaison for an insightful webinar on October 5, titled: The Future of Interoperability & Integration in Healthcare: How can your organization prepare?

About Gary Palgon
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. Gary holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.

Rush Sues Patient Monitoring Vendor, Says System Didn’t Work

Posted on August 25, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Rush University Medical Center has filed suit against one of its health IT vendors, claiming that its patient monitoring system didn’t work as promised and may have put patients in danger.

According to a story in the Chicago Tribune, Rush spent $18 million installing the Infinity Acute Monitoring Solution system from Telford, PA-based Draeger Inc. between 2012 and early 2016.  The Infinity system included bedside monitors, larger data aggregating monitors at central nursing stations, battery-powered portable monitors and M300 wireless patient-worn monitors.

However, despite years of attempts to fix the system, its patient alarms remained unreliable and inaccurate, Rush contends in the filing, which accuses Draeger of breach of contract, unjust enrichment and fraud.

In the suit, the 664-bed hospital and academic medical center says that the system was dogged by many issues which could have had an impact on patient safety. For example, it says, the portable monitors stopped collecting data when moved to wireless networks and sometimes stole IP addresses from bedside monitors, knocking the bedside monitor offline and leaving the patient unmonitored.

In addition, the system allegedly sent out false alarms for heart arrhythmia patients with pacemakers, distracting clinicians from performing their jobs, and failed to monitor apnea until 2015, according to the complaint. Even then, the system wasn’t monitoring some sets of apnea patients accurately, it said. Near the end, the system erased some patient records as well, the suit contends.

Not only that, Draeger didn’t deliver everything it was supposed to provide, the suit alleges, including wired-to-wireless monitoring and monitoring for desaturation of neonatal patients’ blood oxygen.

As if that weren’t enough, Draeger didn’t respond effectively when Rush executives told it about the problems it was having, according to the suit. “Rather than effectively remediating these problems, Draeger largely, and inaccurately, blamed them on Rush,” it contends.

While Draeger provided a software upgrade for the system, it was extremely difficult to implement, didn’t fix the original issues and created new problems, the suit says.

According to Rush, the Draeger system was supposed to last 10 years. However, because of technical problems it observed, the medical center replaced the system after only five years, spending $30 million on the new software, it says.

Rush is asking the court to make Draeger repay the $18 million it spent on the system, along with punitive damages and legal fees.

It’s hard to predict the outcome of such a case, particularly given that the system’s performance must have depended in part on how Rush managed the implementation. Plus, we’re only seeing the allegations made by Rush in the suit, not Draeger’s perspective, which could be very different and offer other details. Regardless, it seems likely these proceedings will be watched closely in the industry. Whether or not it is ultimately found at fault, no vendor can afford to get a reputation for endangering patient safety, and no hospital can afford to buy from one that does.

Is It Time To Put FHIR-Based Development Front And Center?

Posted on August 9, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

I like to look at questions other people in the #HIT world wonder about, and see whether I have a different way of looking at the subject, or something to contribute to the discussion. This time I was provoked by one asked by Chad Johnson (@OchoTex), editor of HealthStandards.com and senior marketing manager with Corepoint Health.

In a recent HealthStandards.com article, Chad asks: “What do CIOs need to know about the future of data exchange?” I thought it was an interesting question; after all, everyone in HIT, including CIOs, would like to know the answer!

In his discussion, Chad argues that #FHIR could create significant change in healthcare infrastructure. He notes that if vendors like Cerner or Epic publish a capabilities-based API, providers’ technical, clinical and workflow teams will be able to develop custom solutions that connect to those systems.

As he rightly points out, today IT departments have to invest a lot of time doing rework. Without an interface like FHIR in place, IT staffers need to develop workflows for one application at a time, rather than creating them once and moving on. That’s just nuts. It’s hard to argue with the idea that if FHIR APIs offer uniform data access, everyone wins.
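Chad’s “build once” argument is easiest to see in code. In the sketch below, the same search for a patient’s recent vital-sign observations runs unchanged against any server exposing a standard FHIR endpoint; the two base URLs are hypothetical placeholders, not real vendor endpoints.

```python
# Sketch of a vendor-neutral FHIR search. The base URLs are hypothetical
# placeholders; the point is that the query logic does not change per vendor.
import requests

VENDOR_ENDPOINTS = {
    "vendor_a": "https://ehr-a.example.org/fhir/r4",  # hypothetical
    "vendor_b": "https://ehr-b.example.org/fhir/r4",  # hypothetical
}

def recent_vitals(base_url: str, patient_id: str) -> list[dict]:
    """Search Observation resources in the vital-signs category for one patient."""
    resp = requests.get(
        f"{base_url}/Observation",
        params={"patient": patient_id, "category": "vital-signs", "_count": 20},
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    resp.raise_for_status()
    bundle = resp.json()
    return [entry["resource"] for entry in bundle.get("entry", [])]

for vendor, base in VENDOR_ENDPOINTS.items():
    print(vendor, len(recent_vitals(base, "12345")))  # same code for any vendor
```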

Far be it from me to argue with a good man like @OchoTex. He makes a good point about FHIR, one which can’t be emphasized enough – that FHIR has the potential to make vendor-specific workflow rewrites a thing of the past. Without a doubt, healthcare CIOs need to keep that in mind.

As for me, I have a couple of responses to bring to the table, and some additional questions of my own.

Since I’m an HIT trend analyst rather than an actual tech pro, I can’t say whether FHIR APIs can or can’t do what Chad is describing, though I have little doubt that Chad is right about their potential uses.

Still, I’d point out that since none other than FHIR project director Grahame Grieve has cautioned us about its current limitations, we probably want to temper our enthusiasm a bit. (I know I’ve made this point a few times here, perhaps ad nauseam, but I still think it bears repeating.)

So, given that FHIR hasn’t reached its full potential, it may be that health IT leaders should invest added time on solving other important interoperability problems.

One example that leaps to mind immediately is solving patient matching problems. This is a big deal: after all, if you can’t match patient records accurately across providers, wrong-patient medical errors become far more likely.

In fact, according to a study released by AHIMA last year, 72 percent of the HIM professionals who responded work on mitigating possible duplicate patient records every week. I have no reason to think things have gotten better. We must find an approach that will scale if we want interoperable data to be worth using.
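To illustrate why matching is hard, and why naive approaches don’t scale, here is a minimal sketch of a fuzzy comparison between two demographic records using only the Python standard library. The field weights and the 0.85 threshold are invented for illustration; production master patient index algorithms use far more sophisticated deterministic and probabilistic methods.

```python
# Minimal fuzzy patient-matching sketch (illustrative only). Field weights
# and the 0.85 threshold are invented; real matching algorithms are far
# more sophisticated.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def match_score(rec1: dict, rec2: dict) -> float:
    """Weighted similarity across a few demographic fields."""
    weights = {"last_name": 0.35, "first_name": 0.25, "dob": 0.30, "zip": 0.10}
    return sum(w * similarity(rec1[f], rec2[f]) for f, w in weights.items())

a = {"last_name": "Smith", "first_name": "Jon", "dob": "1954-03-02", "zip": "80012"}
b = {"last_name": "Smyth", "first_name": "John", "dob": "1954-03-02", "zip": "80012"}

score = match_score(a, b)
print(f"score={score:.2f}:", "probable match" if score >= 0.85 else "no match")
```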

And patient data matching is just one item on a long list of health data interoperability concerns. I’m sure you’re aware of other pressing problems which could undercut the value of sharing patient records. The question is, are we going to address those problems before we begin full-scale health data exchange? Or does it make more sense to pave the road to data exchange and address the bumps in that road later?

We Can’t Afford To Be Vague About Population Health Challenges

Posted on June 19, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

Today, I looked over a recent press release from Black Book Research touting its conclusions on the role of EMR vendors in the population health technology market. Buried in the release were some observations by Alan Hutchison, vice president of Connect & Population Health at Epic.

As part of the text, the release observes that “the shift from quantity-based healthcare to quality-based patient-centric care is clearly the impetus” for population health technology demand. This sets up some thoughts from Hutchison.

The Epic exec’s quote rambles a bit, but in summary, he argues that existing systems are geared to tracking units of care under fee-for-service reimbursement schemes, which makes them dinosaurs.

And what’s the solution to this problem? Why, health systems need to invest in new (Epic) technology geared to tracking patients across their path of care. “Single-solution systems and systems built through acquisition [are] less able to effectively understand the total cost of care and where the greatest opportunities are to reduce variation, improve outcomes and lower costs,” Hutchison says.

Yes, I know that press releases generally summarize things in broad terms, but these words are particularly self-serving and empty, mashing together hot air and jargon into an unappetizing patty. Not only that, the release states as fact a number of things which are clearly up for grabs.

Let’s break some of these issues down, shall we?

  • First, I call shenanigans on the notion that the shift to “value-based care” means that providers will deliver quality care over quantity. If nothing else, the shifts in our system can’t be described so easily. Yeah, I know, don’t expect much from a press release, but words matter.
  • Second, though I’m not surprised Hutchison made the argument, I challenge the notion that you must invest in entirely new systems to manage population health.
  • Also, nobody is mentioning that while buying a new system to manage pop health data may be cleaner in some respects, it could make it more difficult to integrate existing data. Having to do that integration work undercuts the value of the new system, and may even outweigh its benefits.

I don’t know about you, but I’m pretty tired of reading low-calorie vendor quotes about the misty future of population health technology, particularly when a vendor rep claims to have The Answer.  And I’m done with seeing clichéd generalizations about value-based care pass for insight.

Actually, I get a lot more out of analyses that break down what we *don’t* know about the future of population health management.

I want to know what hasn’t worked in transitioning to value-based reimbursement. I hope to see stories describing how health systems identified their care management weaknesses. And I definitely want to find out what worries senior executives about supporting necessary changes to their care delivery models.

It’s time to admit that we don’t yet know how this population health management thing is going to work and abandon the use of terminally vague generalizations. After all, once we do, we can focus on answering our toughest questions — and that’s when we’ll begin to make real progress.

UCHealth Adds Claims Data To Population Health Dataset

Posted on April 24, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

A Colorado-based health system is implementing a new big data strategy which incorporates not only data from clinics, hospitals and pharmacies, but also a broad base of payer claim data.

UCHealth, which is based in Aurora, includes a network of seven hospitals and more than 100 clinics, caring collectively for more than 1.2 million unique patients in 2016. Its facilities include the University of Colorado Hospital, the principal teaching hospital for the University of Colorado School of Medicine.

Leaders at UCHealth are working to improve their population health efforts by integrating data from seven state insurers, including Anthem Blue Cross and Blue Shield, Cigna, Colorado Access, Colorado Choice Health Plans, Colorado Medicaid, Rocky Mountain Health Plans and United Healthcare.

The health system already has an Epic EMR in place across the system which, as readers might expect, offers a comprehensive view of all patient treatment taking place at the system’s clinics and hospitals.

That being said, the Epic database suffers from the same limitations as any other locally-based EMR. As UCHealth notes, its existing EMR data doesn’t track whether a patient changes insurers, ages into Medicare, changes doctors or moves out of the region.

To close the gaps in its EMR data, UCHealth is using technology from software vendor Stratus, which offers a healthcare data intelligence application. According to the vendor, UCHealth will use Stratus technology to support its accountable care organizations as well as its provider clinical integration strategy.

While health system execs expect to benefit from integrating payer claims data, the effort doesn’t satisfy every item on their wish list. One major challenge they’re facing is that while Epic data is available to all the instant it’s added, the payer data is not. In fact, it can take as much as 90 days before the payer data is available to UCHealth.

That being said, UCHealth’s leaders expect to be able to do a great deal with the new dataset. For example, by using Stratus, physicians may be able to figure out why a patient is visiting emergency departments more than might be expected.

Rather than guessing, physicians will be able to request the diagnoses associated with those visits. If the doctor concludes that a patient’s condition can be treated in one of the system’s primary care clinics, he or she can reach out to the patient and explain how clinic-based care can keep them in better health.
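As a rough illustration of the kind of query this enables, the sketch below joins a hypothetical extract of payer claims to an EMR patient list and flags patients with frequent emergency department visits, along with the diagnoses on those claims. The table layouts, column names and visit threshold are invented; this is not the Stratus or Epic data model.

```python
# Illustrative sketch: flag frequent ED utilizers by joining payer claims to
# an EMR patient list. Column names and the visit threshold are invented;
# this is not the Stratus or Epic data model.
import pandas as pd

emr_patients = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "pcp_clinic": ["Clinic A", "Clinic B", "Clinic A"],
})

claims = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 3],
    "place_of_service": ["ED", "ED", "ED", "office", "ED"],
    "diagnosis": ["J45.901", "J45.901", "R07.9", "E11.9", "S93.401"],
})

ed_claims = claims[claims["place_of_service"] == "ED"]
ed_summary = ed_claims.groupby("patient_id").agg(
    ed_visits=("diagnosis", "size"),
    diagnoses=("diagnosis", lambda s: sorted(set(s))),
).reset_index()

frequent = ed_summary[ed_summary["ed_visits"] >= 3].merge(emr_patients, on="patient_id")
print(frequent)  # patients a primary care clinic might proactively contact
```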

And of course, the health system will conduct other increasingly standard population health efforts, including spotting health trends across their community and better understanding each patient’s medical needs.

Over the next several months, 36 of UCHealth’s primary care clinics will begin using the Stratus tool. While the system hasn’t announced a formal pilot test of how Stratus works out in a production setting, rolling this technology out to just 36 clinics is clearly a modest start. But if it works, look for other health systems to scoop up claims data too!

Database Linked With Hospital EMR To Encourage Drug Monitoring

Posted on March 31, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

According to state officials, Colorado occupies the unenviable position of second worst in the US for prescription drug misuse, with more than 255,000 Coloradans misusing prescribed medications.

One way the state is fighting back is by running the Colorado Prescription Drug Monitoring Program which, like comparable efforts in other states, tracks prescriptions for controlled medications. Every regular business day, the state’s pharmacists upload prescription data for medications listed in Schedules II through V.

While this effort may have value, many physicians haven’t been using the database, largely because it can be difficult to access. In fact, historically physicians have been using the system only about 30 percent of the time when prescribing controlled substances, according to a story appearing in HealthLeaders Media.

As things stand, it can take physicians up to three minutes to access the data: they have to sign out of their EMR, visit the PDMP site, log in using separate credentials, click through to the right page, enter patient information and sort through possible matches before they get to the patient’s aggregated prescription history. Given the ugliness of this workflow, it’s no surprise that clinicians aren’t seeking out PDMP data, especially if they don’t regard a patient as being at high risk for drug abuse or diversion.

But perhaps taking some needless steps out of the process can make a difference, a theory which one of the state’s hospitals is testing. Colorado officials are hoping a new pilot program linking the PDMP database to an EMR will foster higher use of the data by physicians. The pilot, funded by a federal grant through the Bureau of Justice Assistance, connects the drug database directly to the University of Colorado Hospital’s Epic EMR.

The project began with a year-long build-out phase, during which IT leaders created a gateway connecting the PDMP database and the Epic installation. Several months ago, the team followed up with a launch in the school of medicine’s emergency medicine department. Eventually, the PDMP database will be available in five EDs with a combined total of 270,000 visits per year, HealthLeaders notes.

Under the pilot program, physicians can access the drug database with a single click, directly from within the Epic EMR system. Once the PDMP database was made available, the pilot brought physicians on board gradually, moving from evaluating their baseline use, to giving clinicians the raw data, to presenting that data through a risk-stratification tool, and eventually to requiring that they use the tool.
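The integration details aren’t public, but conceptually the gateway gives the EMR a single call that returns the aggregated prescription history for the patient already in context. Here is a hedged sketch of that idea; the endpoint, parameters and response shape are entirely hypothetical, not the actual Colorado PDMP or Epic interface.

```python
# Conceptual sketch only: the gateway endpoint, parameters and response shape
# are hypothetical, not the real Colorado PDMP or Epic integration.
import requests

GATEWAY_URL = "https://pdmp-gateway.example.org/v1/prescription-history"  # hypothetical

def pdmp_history(first_name: str, last_name: str, dob: str) -> list[dict]:
    """Fetch a patient's aggregated controlled-substance history in one call,
    using demographics the EMR already has in context."""
    resp = requests.get(
        GATEWAY_URL,
        params={"first_name": first_name, "last_name": last_name, "dob": dob},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("prescriptions", [])

# One click in the EMR might translate to something like:
for rx in pdmp_history("Jane", "Doe", "1975-08-14"):
    print(rx.get("drug"), rx.get("schedule"), rx.get("fill_date"))
```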

Researchers guiding the pilot are evaluating whether providers use the PDMP more and whether it has an impact on high-risk patients. Researchers will also analyze what happened to patients a year before, during and a year after their ED visits, using de-identified patient data.

It’s worth pointing out that people outside of Colorado are well aware of the PDMP access issue. In fact, the ONC has been paying fairly close attention to the problem of making PDMP data more accessible. That being said, the agency notes that integrating PDMPs with other health IT systems won’t come easily, given that no uniform standards exist for linking prescription drug data with health IT systems. ONC staffers have apparently been working to develop a standard approach for delivering PDMP data to EMRs, pharmacy systems and health information exchanges.

However, at present it looks like custom integration will be necessary. Perhaps pilots like this one will lead by example.

National Health Service Hospitals Use Data Integration Apps

Posted on February 20, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

While many providers in the US are still struggling with selecting and deploying apps, the UK National Health Service trusts are ready to use them to collect vital data.

According to the New Scientist, the four National Health Services serving the United Kingdom are rolling out two apps which help patients monitor their health at home. Both of the apps, which are being tested at four hospitals in Oxfordshire, UK, focus on management of a disease state.

One, called GDm-health, helps manage the treatment of gestational diabetes, which affects one in 10 pregnant women. Women use the app to send each of their blood glucose readings to the clinician monitoring their diabetes. The Oxford University Institute of Biomedical Engineering led development of the app, which has allowed patients to avoid needless in-person visits. In fact, the number of patient visits has dropped by 25%, the article notes.

The other app, which was also developed by the Institute, helps patients manage chronic obstructive pulmonary disease, which affects between 1 million and 1.5 million UK patients. COPD patients check their heart rate and blood oxygen saturation every day, entering each result into the app.

After collecting three months of measurements, the app “learns” to recognize what a normal oxygen saturation level is for that patient. Because it has data on what is normal for that patient, it will neither alert clinicians too often nor ignore potential problems. During initial use, the app, which had already been through a 12-month clinical trial, cut hospital admissions among this population by 17% and general practitioner visits by 40%.
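The “learning a personal normal” step can be pictured as building a per-patient baseline from the first months of readings and alerting only on meaningful deviations from it. The sketch below uses a simple mean-and-standard-deviation rule with made-up numbers; the article does not describe the app’s actual algorithm.

```python
# Illustrative per-patient baseline sketch; not the actual COPD app algorithm.
# Readings and the alerting threshold are invented for illustration.
from statistics import mean, stdev

def build_baseline(readings: list[float]) -> tuple[float, float]:
    """Learn a patient's typical SpO2 from an initial observation period."""
    return mean(readings), stdev(readings)

def needs_review(reading: float, baseline: tuple[float, float], k: float = 2.0) -> bool:
    """Alert only when a reading falls well below this patient's own normal."""
    mu, sigma = baseline
    return reading < mu - k * sigma

history = [93, 94, 92, 93, 95, 94, 93, 92, 94, 93]  # simplified three-month history
baseline = build_baseline(history)

print(needs_review(92, baseline))  # False: within this patient's normal range
print(needs_review(87, baseline))  # True: worth a clinician's attention
```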

NHS leaders are also preparing to launch a third app soon. The technology, which is known as SEND, is an iPad app designed to collect information on hospital patients. As they make their rounds, nurses will use the app to input data on patients’ vital signs. The system then automatically produces an early warning score for each patient, and provides an alert if the patient’s health may be deteriorating.
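Early warning scores of this kind are typically simple point systems over routine vital signs. The sketch below is a deliberately simplified illustration in that spirit, loosely modeled on published early warning scores; it is not the scoring logic SEND actually uses, and the bands and escalation threshold are illustrative.

```python
# Simplified early-warning-score sketch. Bands, points and the escalation
# threshold are illustrative, loosely in the spirit of scores like NEWS;
# this is not SEND's actual logic.
def score_respiratory_rate(rr: int) -> int:
    if rr <= 8 or rr >= 25:
        return 3
    if 21 <= rr <= 24:
        return 2
    return 0

def score_spo2(spo2: int) -> int:
    if spo2 <= 91:
        return 3
    if spo2 <= 93:
        return 2
    if spo2 <= 95:
        return 1
    return 0

def score_heart_rate(hr: int) -> int:
    if hr <= 40 or hr >= 131:
        return 3
    if 111 <= hr <= 130:
        return 2
    if 41 <= hr <= 50 or 91 <= hr <= 110:
        return 1
    return 0

def early_warning(rr: int, spo2: int, hr: int) -> int:
    return score_respiratory_rate(rr) + score_spo2(spo2) + score_heart_rate(hr)

total = early_warning(rr=22, spo2=93, hr=112)
print(total, "escalate to clinician" if total >= 5 else "routine monitoring")
```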

One might think that because UK healthcare is delivered by centralized trusts, providers there don’t face data-sharing problems in integrating data from apps like these. But apparently, that assumption would be wrong. According to Rury Holman of the Oxford Biomedical Research Centre, who spoke with New Scientist, few apps are designed to work with the NHS’s existing IT systems.

“It’s a bit like the Wild West out there with lots of keen and very motivated people producing these apps,” he told the publication. “What we need are consistent standards and an interface with electronic patient records, particularly with the NHS, so that information, with permission from the patients, can be put to use centrally.”

In other words, even in a system providing government-delivered, ostensibly integrated healthcare, it’s still hard to manage data sharing effectively. Guess we shouldn’t feel too bad about the issues we face here in the US.

UCSF Partners With Intel On Deep Learning Analytics For Health

Posted on January 30, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

UC San Francisco’s Center for Digital Health Innovation has agreed to work with Intel to deploy and validate a deep learning analytics platform. The new platform is designed to help clinicians make better treatment decisions, predict patient outcomes and respond quickly in acute situations.

The Center’s existing projects include CareWeb, a team-based collaborative care platform built on Salesforce.com social and mobile communications tech; Tidepool, which is building infrastructure for next-gen smart diabetes management apps; Health eHeart, a clinical trials platform using social media, mobile and real-time sensors to change heart disease treatment; and Trinity, which offers “precision team care” by integrating patient data with evidence and multi-disciplinary data.

These projects seem to be a good fit with Intel’s healthcare efforts, which are aimed at helping providers succeed at distributed care communication across desktop and mobile platforms.

As the two note in their joint press release, creating a deep learning platform for healthcare is extremely challenging, given that the relevant data is complex and stored in multiple incompatible systems. Intel and UCSF say the next-generation platform will address these issues, allowing them to integrate not only data collected during clinical care but also inputs from genomic sequencing, monitors, sensors and wearables.

Supporting all of this activity obviously calls for a lot of computing power. The partners will run deep learning use cases in a distributed fashion on a CPU-based cluster designed to crunch through very large datasets handily. Intel is rolling out the computing environment on its Xeon processor-based platform, which supports data management and the algorithm development lifecycle.

As the deployment moves forward, Intel leaders plan to study how deep learning analytics and machine-driven workflows can optimize clinical care and patient outcomes, and to leverage what they learn when they create new platforms for the healthcare industry. Both partners believe that this model will scale to future use cases, such as larger convolutional neural network models – artificial networks patterned loosely after living organisms – and very large multidimensional datasets.

Once implemented, the platform will allow users to conduct advanced analytics on all of this disparate data, using machine learning and deep learning algorithms. And if all performs as expected, clinicians should be able to draw on these advanced capabilities on the fly.
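As a toy illustration of what “advanced analytics on all of this disparate data” can look like once clinical, genomic and sensor inputs have been harmonized into one feature table, here is a minimal scikit-learn sketch. The features, synthetic data and model choice are placeholders, not the UCSF/Intel platform.

```python
# Toy sketch: train a classifier on features drawn from several integrated
# sources. The feature names, synthetic data and model are placeholders,
# not the actual UCSF/Intel deep learning platform.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 500

# Columns stand in for harmonized inputs: a lab value from the EMR, a genomic
# risk score, and a wearable-derived activity measure.
X = np.column_stack([
    rng.normal(140, 20, n),    # systolic blood pressure (EMR)
    rng.normal(0.0, 1.0, n),   # standardized genomic risk score
    rng.normal(5000, 1500, n), # average daily step count (wearable)
])
# Synthetic outcome label for demonstration purposes only.
y = (X[:, 0] + 10 * X[:, 1] - 0.005 * X[:, 2] + rng.normal(0, 10, n) > 140).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```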

This looks like a productive collaboration. If nothing else, it appears that in this case the technology platform UCSF and Intel are developing may be productized and made available to other providers, which could be very valuable. After all, while individual health systems (such as Geisinger) have the resources to kick off big data analytics projects on their own, it’s possible a standardized platform could make such technology available to smaller players. Let’s see how this goes.