
Cleveland Clinic Works To Eliminate Tech Redundancies

Posted on March 1, 2017 | Written By

Anne Zieger is a veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

The Cleveland Clinic has relied on its EMR for quite some time. In fact, it adopted Epic in the 1990s, long before most healthcare organizations were ready to make a bet on EMRs. Today, decades later, the Epic EMR is the “central data hub” for the medical center and is central to both its clinical and operational efforts, according to William Morris, MD, the Clinic’s associate chief information officer.

But Morris, who spoke about the Clinic’s health IT with Health Data Management, also knows its limitations. In an interview with the magazine’s Greg Slabodkin, he notes that while the EMR may be necessary, it isn’t sufficient. The Epic EMR is “just a digital repository,” he told Slabodkin. “Ultimately, it’s what you do with the technology in your ecosystem.”

These days, IT leaders at the Clinic are working to streamline the layers of additional technology which have accreted on top of the EMR over the years. “As an early adopter of Epic, we have accumulated quite a bit of what I’ll call technical debt,” said Doug Smith, interim chief information officer. “What I mean by that is multiple enhancements, bolt-ons, or revisions to the core application. We have to unburden ourselves of that.”

It’s not that Clinic leaders are unhappy with their EMR. In fact, they’re finding ways to tap its power to improve care. For example, to better leverage its EMR data, the Cleveland Clinic has developed data-driven “risk scores” designed to let doctors know if patients need intervention. The models, developed by the Clinic’s Quantitative Health Sciences group, offer outcome risk calculators for several conditions, including cancer, cardiovascular disease and diabetes.
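The article doesn’t describe the calculators’ internals, but risk scores of this kind are often a weighted combination of EMR-derived factors pushed through a logistic function. Here’s a minimal sketch in Python, assuming a logistic-regression-style model; the feature names and weights are hypothetical, and a real calculator would be fit and validated against outcome data for a specific condition:

```python
import math

# Hypothetical weights such a model might learn from historical EMR
# outcome data; real calculators are fit and validated per condition.
WEIGHTS = {"age_over_65": 1.2, "hba1c_above_9": 0.8, "prior_admission": 0.9}
INTERCEPT = -3.0

def risk_score(features: dict) -> float:
    """Return a probability-style risk score from binary patient features."""
    z = INTERCEPT + sum(WEIGHTS.get(name, 0.0)
                        for name, present in features.items() if present)
    return 1 / (1 + math.exp(-z))

# A patient over 65 with a prior admission scores ~0.29 -- a flag that
# might prompt a doctor to consider intervention.
print(risk_score({"age_over_65": True, "prior_admission": True}))
```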

(By the way, if predictive analytics interest you, you might want to check out our coverage of such efforts at New York’s Mount Sinai Hospital, which is developing a platform to predict which patients might develop congestive heart failure and to care more effectively for patients already diagnosed with the condition. I’ve also taken a look at a related product being developed by Google’s DeepMind, an app named Streams which will ping clinicians if a patient needs extra attention.)

Ultimately, though, the organization hopes to simplify its larger health IT infrastructure substantially, to the point where 85% of the HIT functionality comes from the core Epic system. This includes keeping a wary eye on Epic upgrades, and implementing new features selectively. “When you take an upgrade in Epic, they are always turning on more features and functions,” Smith notes. “Most are optional.”

Not only will such improvements streamline IT operations, they will make clinicians more efficient, Smith says. “They are adopting standard workflows that also exist in many other organizations—and, we’re more efficient in supporting it because we don’t take as long to validate or support an upgrade.”

As an aside, I’m interested to read that Epic is tossing more features at Cleveland Clinic than the organization cares to adopt. I wonder whether those features are what Epic’s engineers think customers want, or what customers are actually demanding today.

The Distributed Hospital On The Horizon

Posted on February 24, 2017 | Written By Anne Zieger

If you’re reading this blog, you already know that distributed, connected devices and networks are the future of healthcare.  Connected monitoring devices are growing more mature by the day, network architectures are becoming amazingly fluid, and with the growth of the IoT, we’re adding huge numbers of smart devices to an already-diverse array of endpoints.  While we may not know what all of this will look like when it’s fully mature, we’ve already made amazing progress in connecting care.

But how will these trends play out? One nice look at where all this is headed comes from Jeroen Tas, chief innovation and strategy officer at Philips. In a recent article, Tas describes a world in which even major brick-and-mortar players like hospitals go almost completely virtual.  Certainly, there are other takes out there on this subject, but I really like how Tas explains things.

He starts with the assertion that the hospital of the future “is not a physical location with waiting rooms, beds and labs.” Instead, a hospital will become an abstract network overlay connecting nodes. It’s worth noting that this isn’t just a concept. For an example, Tas points to the Mercy Virtual Care Center, a $54 million “hospital without beds” dedicated to telehealth and connected care.  The Center, which has over 300 employees, cares for patients at home and in beds across 38 hospitals in seven states.

While the virtual hospital may not rely on a single, central campus, physical care locations will still matter – they’ll just be distributed differently. According to Tas, the connected health network will work best if care is provided as needed through retail-type outlets near where people live, specialist hubs, inpatient facilities and outpatient clinics. Yes, of course, we already have all of these things in place, but in the new connected world, they’ll all be on a single network.

Ultimately, even if brick-and-mortar hospitals never disappear, virtual care should make it possible to cut down dramatically on hospital admissions, he suggests.  For example, Tas notes that Philips partner Banner Health has slashed hospital admissions almost 50% by using telehealth and advanced analytics for patients with multiple chronic conditions. (We’ve also reported on a related pilot by Partners HealthCare Brigham and Women’s Hospital, the “Home Hospital,” which sends patients home with remote monitoring devices as an alternative to admissions.)

Of course, the broad connected care outline Tas offers can only take us so far. It’s all well and good to have a vision, but there are still some major problems we’ll have to solve before connected care becomes practical as a backbone for healthcare delivery.

After all, to cite one major challenge, community-wide connected health won’t be very practical until interoperable data sharing becomes easier – and we really don’t know when that will happen. Also, until big data analytics tools are widely accessible (rather than the province of the biggest, best-funded institutions) it will be hard for providers to manage the data generated by millions of virtual care endpoints.

Still, if Tas’s piece is any indication, consensus is building on what next-gen care networks can and should be, and there are certainly plenty of ways to lay the groundwork for the future. Even small-scale, preliminary connected health efforts seem to be fostering meaningful changes in how care is delivered. And there’s little doubt that over time, connected health will turn many brick-and-mortar care models on their heads, becoming a large – or even dominant – part of care delivery.

Getting there may be tricky, but if providers keep working at connected care, it should offer an immense payoff.

UCSF Partners With Intel On Deep Learning Analytics For Health

Posted on January 30, 2017 | Written By Anne Zieger

UC San Francisco’s Center for Digital Health Innovation has agreed to work with Intel to deploy and validate a deep learning analytics platform. The new platform is designed to help clinicians make better treatment decisions, predict patient outcomes and respond quickly in acute situations.

The Center’s existing projects include CareWeb, a team-based collaborative care platform built on Salesforce.com social and mobile communications tech; Tidepool, which is building infrastructure for next-gen smart diabetes management apps; Health eHeart, a clinical trials platform using social media, mobile and realtime sensors to change heart disease treatment; and Trinity, which offers “precision team care” by integrating patient data with evidence and multi-disciplinary data.

These projects seem to be a good fit with Intel’s healthcare efforts, which are aimed at helping providers succeed at distributed care communication across desktop and mobile platforms.

As the two note in their joint press release, creating a deep learning platform for healthcare is extremely challenging, given that the relevant data is complex and stored in multiple incompatible systems. Intel and UCSF say the next-generation platform will address these issues, allowing them to integrate not only data collected during clinical care but also inputs from genomic sequencing, monitors, sensors and wearables.
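The press release doesn’t specify the integration layer, but conceptually the platform must join per-patient records from incompatible systems into one analysis-ready table before any deep learning can happen. A minimal sketch using pandas, with hypothetical source extracts and column names:

```python
import pandas as pd

# Hypothetical extracts from three incompatible source systems.
clinical = pd.DataFrame({"patient_id": [1, 2], "dx_code": ["E11.9", "I10"]})
genomic = pd.DataFrame({"patient_id": [1, 2], "variant": ["BRCA1", None]})
wearable = pd.DataFrame({"patient_id": [1, 1, 2], "heart_rate": [72, 88, 64]})

# Reduce the high-frequency sensor stream to per-patient features,
# then join everything on the shared patient identifier.
hr_features = wearable.groupby("patient_id")["heart_rate"].agg(["mean", "max"])
merged = clinical.merge(genomic, on="patient_id").join(hr_features, on="patient_id")
print(merged)
```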

Supporting all of this activity obviously calls for a lot of computing power. The partners will run deep learning use cases in a distributed fashion on a CPU-based cluster designed to crunch through very large datasets handily. Intel is rolling out the computing environment on its Xeon processor-based platform, which supports data management and the algorithm development lifecycle.

As the deployment moves forward, Intel leaders plan to study how deep learning analytics and machine-driven workflows can optimize clinical care and patient outcomes, and leverage what they learn when they create new platforms for the healthcare industry. Both partners believe that this model will scale for future use case needs, such as larger convolutional neural network models, artificial networks patterned after living organisms and very large multidimensional datasets.

Once implemented, the platform will allow users to conduct advanced analytics on all of this disparate data, using machine learning and deep learning algorithms. And if all performs as expected, clinicians should be able to draw on these advanced capabilities on the fly.

This looks like a productive collaboration. If nothing else, it appears that in this case the technology platform UCSF and Intel are developing may be productized and made available to other providers, which could be very valuable. After all, while individual health systems (such as Geisinger) have the resources to kick off big data analytics projects on their own, it’s possible a standardized platform could make such technology available to smaller players. Let’s see how this goes.

A Look At Geisinger’s Big Data Efforts

Posted on December 28, 2016 | Written By Anne Zieger

This week I got a look at a story appearing in a recent issue of Harvard Business Review which offers a description of Geisinger Health System’s recent big data initiatives. The ambitious project is designed not only to track and analyze patient outcomes, but also to visualize healthcare data across cohorts of patients and networks of providers and even correlate genomic sequences with clinical care. Particularly given that Geisinger has stayed on the cutting edge of HIT for many years, I think it’s worth a look.

As the article’s authors note, Geisinger rolled out a full-featured EMR in 1996, well ahead of most of its peers. Like many other health systems, Geisinger has struggled to aggregate and make use of data. That’s particularly the case because, as with other systems, the legacy analytics systems Geisinger still has in place can’t accommodate the growing flood of new data types emerging today.

Last year, Geisinger decided to create a new infrastructure which could bring this data together. It implemented a Unified Data Architecture (UDA), allowing it to integrate big data into its existing data analytics and management.  According to the article, Geisinger’s UDA rollout is the largest practical application of point-of-care big data in the industry. Of particular note, Geisinger is crunching not only enterprise healthcare data (including HIE inputs, clinical departmental systems and patient satisfaction surveys) and consumer health tools (like smartphone apps) but even grocery store and loyalty program info.

Though all of its data hasn’t yet been moved to the UDA, Geisinger has already seen some big data successes, including:

* “Close the Loop” program:  Using natural language processing, the UDA analyzes clinical and diagnostic imaging reports, including free text. Sometimes it detects problems unrelated to the initial issue (such as injuries from a car crash) which can themselves cause serious harm. The program has already saved patient lives. (A minimal sketch of this kind of free-text screening appears after this list.)

* Early sepsis detection/treatment: Geisinger uses the UDA to bring all sepsis-patient information into one place as patients move through the hospital. The system alerts providers to real-time physiologic data in patients with life-threatening septic shock, as well as tracking when antibiotics are prescribed and administered. Ninety percent of providers who use this tool consistently adhere to sepsis treatment protocols, as opposed to 40% of those who don’t.

* Surgery costs/outcomes: The Geisinger UDA tracks and integrates surgical supply-chain data, plus clinical data by surgery type and provider, which offers a comprehensive view of performance by provider and surgery type.  In addition to offering performance insight, this approach has also helped generate insights about supply use patterns which allow the health system to negotiate better vendor deals.
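The article doesn’t detail how “Close the Loop” parses reports, but the general idea is to screen free-text imaging reports for language that signals incidental findings. A minimal sketch, assuming simple pattern matching; the phrase list below is hypothetical, and production clinical NLP would be far more sophisticated:

```python
import re

# Hypothetical phrases that often signal incidental findings in imaging
# reports; a production system would use trained clinical NLP models.
INCIDENTAL_PATTERNS = [
    r"incidental(ly noted)?",
    r"pulmonary nodule",
    r"adrenal (mass|lesion)",
]

def flag_incidental_findings(report_text: str) -> list[str]:
    """Return the patterns that matched, so a human can close the loop."""
    return [p for p in INCIDENTAL_PATTERNS
            if re.search(p, report_text, flags=re.IGNORECASE)]

report = ("Indication: motor vehicle crash. No acute fracture. "
          "Incidentally noted 8 mm pulmonary nodule in the right upper lobe.")
print(flag_incidental_findings(report))  # both nodule patterns match
```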

To me, one of the most interesting things about this story is that while Geisinger is at a relatively early stage of its big data efforts, it has already managed to generate meaningful benefits from its efforts. My guess is that its early successes are more due to smart planning – which includes worthwhile goals from day one of the rollout — than the technology per se. Regardless, let’s hope other hospital big data projects fare so well. (Meanwhile, for a look at another interesting hospital big data project, check out this story.)

Longitudinal Patient Record Needed To Advance Care?

Posted on November 23, 2016 I Written By

Anne Zieger is veteran healthcare editor and analyst with 25 years of industry experience. Zieger formerly served as editor-in-chief of FierceHealthcare.com and her commentaries have appeared in dozens of international business publications, including Forbes, Business Week and Information Week. She has also contributed content to hundreds of healthcare and health IT organizations, including several Fortune 500 companies. She can be reached at @ziegerhealth or www.ziegerhealthcare.com.

In most day to day settings, a clinician only needs a small (if precisely focused) amount of data to make clinical decisions. Both in ambulatory and acute settings, they rely on immediate and near-term information, some collected during the visit, and a handful of historical factors likely to influence or even govern what plan of care is appropriate.

That may be changing, though, according to Cheryl McKay of Orion Health. In a recent blog item, McKay argues that as the industry shifts from fee-for-service payment models to value-based reimbursement, we’ll need new types of medical records to support this model. Today, the longitudinal patient record and community care plan are emerging as substitutes for old EMR models, McKay says. These new entities will be built from varied data sources including payer claims, provider EMRs, patient health devices and the patients themselves.
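McKay doesn’t spell out the mechanics, but the core move is merging events from payer, provider and patient sources into a single time-ordered record. A minimal sketch, with hypothetical sources and fields:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Event:
    when: date
    source: str   # e.g. "payer_claim", "provider_emr", "patient_device"
    detail: str

# Hypothetical events arriving from three different systems.
events = [
    Event(date(2016, 3, 1), "provider_emr", "Office visit: hypertension follow-up"),
    Event(date(2016, 1, 15), "payer_claim", "Claim: lipid panel"),
    Event(date(2016, 3, 2), "patient_device", "Home BP reading: 148/92"),
]

# A longitudinal record is the union of all sources, ordered in time,
# rather than one system's episodic slice.
for e in sorted(events, key=lambda e: e.when):
    print(e.when, e.source, e.detail)
```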

As these new forms of patient medical record emerge, effective population health management is becoming more feasible, she argues. Longitudinal patient records and community care plans are “essential as we steer away from FFS…The way records are delivered to healthcare providers – with an utter lack of visibility and a lot of noise from various data sources – creates unnecessary risks for everyone involved.”

She contends that putting these types of documentation in place, which summarize patient-based rather than episodic clinical experiences, closes big gaps in patient history which would otherwise generate mistakes. Longitudinal record-keeping also makes it easier for physicians to aggregate information, do predictive modeling and intervene proactively in patient care at both the patient and population level.

She also predicts that with both a longitudinal patient record and community care plan in place, giving providers of all stripes a “panoramic” look at patients, costs will fall as providers stop performing needless tests and procedures. Not only that, these new entities would ideally offer real-time information as well, including event notifications, keeping all of the providers involved in a patient’s care in sync.

To be sure, this blog item is a pitch for Orion’s technology. While the notion of a community-care plan isn’t owned by anyone in particular, Orion is pitching a specific model which rides upon its population health technology. That being said, I’m betting most of us would agree that the idea (regardless of which vendor you work with) of establishing a community-wide care plan does make sense. And certainly, putting a rich longitudinal patient record in place could be valuable too.

However, given the sad state of interoperability today, I doubt it’s possible to build this model unless you choose a single vendor-centric solution. At present, I think it’s more of a dream than a reality for most of us.

Managing Health Information to Ensure Patient Safety

Posted on August 17, 2016 | Written By

Erin Head is the Director of Health Information Management (HIM) and Quality for an acute care hospital in Titusville, FL. She is a renowned speaker on a variety of healthcare and social media topics and currently serves as CCHIIM Commissioner for AHIMA. She is heavily involved in many HIM and HIT initiatives such as information governance, health data analytics, and ICD-10 advocacy. She is active on social media on Twitter @ErinHead_HIM and LinkedIn. Subscribe to Erin's latest HIM Scene posts here.

This post is part of the HIM Series of blog posts. If you’d like to receive future HIM posts by Erin in your inbox, you can subscribe to future HIM Scene posts here.

Electronic Medical Records (EMRs) have been a great addition to healthcare organizations, and I know many would agree that some tasks have improved significantly in the move from paper to electronic. Others may still be cautious with EMRs due to the potential patient safety concerns that EMRs bring to light.

The Joint Commission expects healthcare organizations to engage in the latest health information technologies but we must do so safely and appropriately. In 2008, The Joint Commission released Sentinel Event Alert Issue 42 which advised organizations to be mindful of the patient safety risks that can result from “converging technologies”.

The electronic technologies we use to gather patient data can pose threats and lead to adverse events. Some of these threat areas include computerized physician order entry (CPOE), information security, incorrect documentation, and clinical decision support (CDS).  Sentinel Event Alert Issue 54 in 2015 again addressed the safety risks of EMRs and the expectation that healthcare organizations will safely implement health information technology.

Having incorrect data in the EMR poses serious, preventable patient safety risks, which is why The Joint Commission has put this emphasis on using the technology safely. We will not be able to blame patient safety errors on the EMR when questioned by surveyors, especially when they could have been prevented.

Ensuring medical record integrity has always been the objective of HIM departments. HIM professionals’ role in preventing errors and adverse events has been apparent from the start of EMR implementations. HIM professionals should monitor and develop methods to prevent issues in the following areas, to name a few:

Copy and paste

Ensure policies are in place to address copy and paste. Records can contain documentation repeated from day to day that was entered in error or is no longer current. Preventing and governing the use of copy and paste will prevent many adverse issues caused by conflicting or erroneous documentation.
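Detection approaches vary by vendor, but one simple way to surface suspect copy-and-paste is to compare successive notes for near-identical text. A minimal sketch using Python’s standard library; the 0.9 threshold is an arbitrary illustration:

```python
from difflib import SequenceMatcher

def note_similarity(prior_note: str, current_note: str) -> float:
    """Similarity between two notes, from 0.0 (different) to 1.0 (identical)."""
    return SequenceMatcher(None, prior_note, current_note).ratio()

day1 = "Patient resting comfortably. Lungs clear. Plan: continue IV antibiotics."
day2 = "Patient resting comfortably. Lungs clear. Plan: continue IV antibiotics."

# Flag notes suspiciously similar to the previous day's entry for HIM review.
if note_similarity(day1, day2) > 0.9:  # threshold chosen for illustration
    print("Possible copy-and-paste: review for outdated documentation")
```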

Dictation/Transcription errors

Dictation software tools are becoming more intelligent and many organizations are utilizing front end speech recognition to complete EMR documentation. With traditional transcription, we have seen anomalies remaining in the record due to poor dictation quality and uncorrected errors. With front end speech recognition, providers are expected to review and correct their own dictations which presents similar issues if incorrect documentation is left in the record.

Information Security

The data that is captured in the EMR must be kept secure and available when needed. We must ensure the data remains functional and accessible to the correct users and not accessible by those without the need to know. Cybersecurity breaches are a serious threat to electronic data including those within the EMR and surrounding applications.

Downtime

Organizations must be ready to function if there is a planned or unexpected downtime of systems. Proper planning includes maintaining a master list of forms and order sets that will be called upon in the case of a downtime to ensure documentation is captured appropriately. Historical information should be maintained in a format that allows access during a downtime, making sure users are able to provide uninterrupted care for patients.

Ongoing EMR maintenance

As we continue to enhance and optimize EMRs, we must take into consideration all of the potential downstream effects of each change and how these changes will affect the integrity of the record. HIM professionals need prior notification of upcoming changes and adequate time to test the new functionality. No changes should be made to an EMR without all of the key stakeholders reviewing and approving the changes and their downstream implications. The Joint Commission claims, “as health IT adoption becomes more widespread, the potential for health IT-related patient harm may increase.”

If you’d like to receive future HIM posts by Erin in your inbox, you can subscribe to future HIM Scene posts here.

EHRs Can Help Find Patients At High Risk Of Dying

Posted on June 1, 2016 | Written By Anne Zieger

Much of the discussion around EMRs and EHRs these days focuses on achieving broad, long-term goals such as improved population health. But here’s some data suggesting that these systems can serve a far more immediate purpose – finding inpatients at imminent risk of death.

A study appearing in The American Journal of Medicine details how researchers from Arizona-based Banner Health created an algorithm looking for key indicators suggesting that patients were in immediate danger of death. It was set up to send an alert when patients met at least two of four systemic inflammatory response syndrome (SIRS) criteria, plus at least one of 14 acute organ dysfunction parameters. The algorithm was applied in real time to 312,214 patients across 24 hospitals in the Banner system.
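The trigger logic lends itself to a compact sketch. Below is a minimal Python rendering of the rule as described; the vital-sign cutoffs are the standard SIRS criteria, and the organ dysfunction list is abbreviated and illustrative rather than Banner’s actual 14 parameters:

```python
def sirs_count(temp_c: float, heart_rate: int, resp_rate: int, wbc: int) -> int:
    """Count how many of the four standard SIRS criteria are met."""
    return sum([
        temp_c > 38.0 or temp_c < 36.0,
        heart_rate > 90,
        resp_rate > 20,
        wbc > 12_000 or wbc < 4_000,
    ])

def should_alert(vitals: dict, organ_dysfunction_flags: list) -> bool:
    """Banner-style trigger: at least 2 of 4 SIRS criteria plus at least
    1 organ dysfunction parameter (14 in the study; abbreviated here)."""
    return sirs_count(**vitals) >= 2 and any(organ_dysfunction_flags)

vitals = {"temp_c": 38.6, "heart_rate": 112, "resp_rate": 24, "wbc": 15_000}
organ_flags = [False, True, False]  # e.g. lactate, creatinine, bilirubin flags
print(should_alert(vitals, organ_flags))  # True -> alert the care team
```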

Researchers found that the alert was able to identify the majority of high-risk patients within 48 hours of their admission to a hospital, allowing clinical staff to deliver early and targeted medical interventions.

This is not the first study to suggest that clinical data analysis can have a significant impact on patients’ health status. Research from last year on clinical decision support tools, appearing in Generating Evidence & Methods to Improve Patient Outcomes, found that such tools can be beefed up to help providers prevent stroke in vulnerable patients.

In that study, researchers from Ohio State University created the Stroke Prevention in Healthcare Delivery Environments tool to pull together and display data relevant to cardiovascular health. The idea behind the tool was to help clinicians have more effective discussions with patients and help address risk factors such as smoking and weight.

They found that the tool, which was tested at two outpatient settings at Ohio State University’s Wexner Medical Center, garnered a “high” level of satisfaction from providers. Also, patient outcomes improved in some areas, such as diabetes status and body mass index.

Despite their potential, few tools are in place today to achieve such immediate benefits as identifying inpatients at high risk of death. Certainly, clinicians are deluged with alerts, such as the ever-present med interaction warnings, but alerts analyzing a specific patient’s overall clinical picture aren’t common. However, they should be. While drug warnings might irritate physicians, I can’t see them ignoring an alert warning them that a patient might die.

And I can hardly imagine a better use of EMR data than leveraging it to predict adverse events among sick inpatients. After all, few hospitals would spend dozens or hundreds of millions of dollars to implement a system that simply creates a repository mimicking paper records.

In addition to preventing adverse events, real-time EMR data analytics will also support the movement to value-based care. If the system can predict which patients are likely to develop expensive complications, physicians can do a better job of preventing them. While clinicians, understandably, aren’t thrilled with being told how to deliver care, they are trained to respond to problems and solve them.

I’m hoping to read more about technologies that leverage EMR data to solve day-to-day care problems. This is a huge opportunity.

Can HIM Professionals Become Clinical Documentation Improvement Specialists?

Posted on April 21, 2016 | Written By Erin Head

Most acute care hospitals have implemented a clinical documentation improvement (CDI) program to drive appropriate reimbursement and clarification of documentation. These roles typically live (and should live) within the HIM department. Clinical Documentation Specialists (CDS) work closely with the medical staff and coders to ensure proper documentation and must have an understanding of coding and reimbursement methodologies along with clinical knowledge.

Certain aspects of the CDI or CDS role require in-depth clinical knowledge and experience to read and understand what documentation is already in the chart and find what is missing. Some diagnoses may be hiding in ambiguous documentation and it is up to the CDS to gather consensus from the medical staff to clarify through front-end queries. There are many tools available to assist in this process by creating worklists and documentation suggestions based on diagnosis criteria and best practices. The focus of CDI is not entirely on reimbursement, although it is a nice reward to receive appropriate reimbursement for the treatment provided while obtaining compliant documentation for regulatory purposes.

Determining or changing the potential DRG prior to discharging a patient provides a secondary data source for many healthcare functions such as case management, the plan of care, decision support, and alternative payment models. For these reasons, a CDS must know the coding guidelines for selecting a principal diagnosis that will ultimately determine the DRG.
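As a toy illustration of that point, consider a lookup keyed on the principal diagnosis. The mapping below is simplified and illustrative; real DRG grouping software also weighs procedures, complications/comorbidities (CCs/MCCs) and discharge status:

```python
# Simplified, illustrative mapping; real DRG grouping software considers
# far more than the principal diagnosis alone.
PRINCIPAL_DX_TO_DRG = {
    "A41.9": "DRG 871: septicemia or severe sepsis w MCC",
    "J18.9": "DRG 193: simple pneumonia & pleurisy w MCC",
}

def working_drg(principal_dx: str) -> str:
    return PRINCIPAL_DX_TO_DRG.get(principal_dx, "ungroupable: physician query needed")

# A CDS query that clarifies the principal diagnosis changes the working
# DRG, and with it the expected reimbursement.
print(working_drg("J18.9"))
print(working_drg("R65.20"))  # ambiguous documentation -> front-end query
```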

Inpatient coders also have the foundational skills to perform this role. Coders and HIM professionals are required to have advanced knowledge of anatomy and physiology, pharmacology, and clinical documentation. Therefore, to answer my original question, “Can HIM professionals become Clinical Documentation Improvement Specialists?”, the answer is absolutely. But I will say that it depends on the organization whether nursing licensure and clinical experience are required in the job description.

Some organizations have mixed CDI teams consisting of coders and nurses, while others may allow only nurses to qualify for this role. Whoever performs the CDS role, success in a CDI program lies in understanding the documentation, knowing the coding guidelines, and doing the detective work to remedy missing or conflicting documentation.

If you’d like to receive future HIM posts by Erin in your inbox, you can subscribe to future HIM Scene posts here.

A Look At Precision Medicine Solutions Available Today

Posted on December 22, 2015 | Written By

John Lynn is the Founder of the HealthcareScene.com blog network which currently consists of 10 blogs containing over 8000 articles with John having written over 4000 of the articles himself. These EMR and Healthcare IT related articles have been viewed over 16 million times. John also manages Healthcare IT Central and Healthcare IT Today, the leading career Health IT job board and blog. John is co-founder of InfluentialNetworks.com and Physia.com. John is highly involved in social media, and in addition to his blogs can also be found on Twitter: @techguy and @ehrandhit and LinkedIn.

Personalized and precision medicine have been all the buzz since President Obama announced the Precision Medicine Initiative. However, after the government tragedy known as meaningful use, many are reasonably skeptical of government initiatives to improve healthcare. Plus, there’s a massive disconnect between the rhetoric around what’s possible with precision medicine and the realities that most hospitals and doctors face every day.

The reality is that there’s good reason to be skeptical of precision medicine. Think about the scope of the problem. The world of health data that we live in today is 10-20 times bigger than it was even a decade ago. That’s a massive increase in the amount of data available. Plus, much of that data is unstructured. Combine the volume of data with the accessibility (or lack thereof) of that data and it’s easy to see why some are skeptical of really implementing precision medicine in their hospital today.

When you look at current EHR systems, none of them are built to enable precision medicine. First, they were built as massive billing engines, not as engines designed to improve care. Second, meaningful use has hijacked their development roadmaps for years and will likely continue to hijack their development teams for years to come. Finally, there’s been so much money in doing what they’re already doing, what motivation do the entrenched EHR companies have to go out and do more?

The unfortunate reality of EHR systems is that they’re not built for the real-time data analytics that enable improved care and precise, personalized medicine. Some may get there eventually, but we’re unlikely to see them get there anytime soon. I’ve heard precision medicine defined as a puzzle with 3 billion pieces. We have to start looking outside of traditional EHR companies to start solving such a complex puzzle.

The good news is that even though EHR vendors are not providing precision medicine solutions, we’re starting to see other vendors providing precision medicine solutions today. You no longer need to wait for an EHR vendor to participate.

One example of precision medicine happening today is the recently announced SAP Foundation for Health (we’ll forgive them the somewhat confusing name). At the core of the SAP Foundation for Health is the SAP HANA engine. Unlike many EHR systems, SAP HANA was designed for real-time analysis of massive amounts of data, including both granular and free-form data. You can see this capability firsthand in the work SAP is doing with ASCO (American Society of Clinical Oncology) and their CancerLinQ project.

Dr. Clifford Hudis of CancerLinQ (created by ASCO) described how, for his grandfather, personalized medicine meant going around and visiting each patient. Over time that practice stopped and we started seeing patients in clinics, where we generally only had one data set available to us: the clinical data that we captured ourselves on a paper chart. Unfortunately, as we moved electronic, we just recreated our paper chart world in electronic form. It’s too bad we didn’t do more during our shift to electronic records. However, that still means we have the opportunity to aggregate and analyze health data for the benefit of our patients. In some ways, we’re starting to democratize access to health data in order to enable precision medicine.

As Dr. Hudis pointed out, healthcare currently learns almost exclusively from patients who take part in clinical research trials, which amounts to only about 3% of adult patients. This limits our view, since most clinical research trials have a biased sample that isn’t representative of the general population. How can we create personalized medicine if we only have data on 3% of the patient population? This is the problem CancerLinQ and SAP Foundation for Health are working to solve. Can they create a platform that learns from every patient?

ASCO, together with SAP’s Foundation for Health, is working to aggregate and analyze data across cancer patients regardless of whether they’re part of a clinical research study or not. In the past, Dr. Hudis pointed out, cancer tracking grouped cancer populations into simple buckets like “small cell cancer” versus “non-small cell cancer.” That was a start, but it had limited precision when trying to treat a patient. With this relatively new world of genomics, ASCO can now identify, track, and compare a patient’s cancer by specific genomic alterations. This is a fantastic development, since tumors generally contain changed DNA. We can now use these DNA abnormalities to classify and track cancer patients in a much more precise way than we’ve done in the past.
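A minimal sketch of that shift: group patients by specific genomic alteration rather than a broad histology label, so treatments and outcomes can be compared within molecular subgroups. The patient records and responses below are hypothetical:

```python
from collections import defaultdict

# Hypothetical records: (patient_id, genomic_alteration, treatment_response)
patients = [
    ("p1", "EGFR L858R", "responded"),
    ("p2", "EGFR L858R", "responded"),
    ("p3", "KRAS G12C", "progressed"),
    ("p4", "EGFR L858R", "progressed"),
]

# Group by specific alteration rather than a broad label like
# "non-small cell cancer", then compare outcomes within each subgroup.
by_alteration = defaultdict(list)
for pid, alteration, response in patients:
    by_alteration[alteration].append(response)

for alteration, responses in by_alteration.items():
    rate = responses.count("responded") / len(responses)
    print(f"{alteration}: {len(responses)} patients, {rate:.0%} responded")
```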

This platform gives oncologists the opportunity to see real-time information about their patient, personalized to the patient’s own genetic abnormalities. Instead of calling around to their network of oncologist friends, CancerLinQ provides real-time access to other patient populations with similar genetic abnormalities and can give oncologists insight into what treatments are working for similar patients. It can also provide benchmarking for oncologists to see how they compare against their colleagues. Plus, it can show real-time data to an oncologist so they can know how thorough and consistent they are with their patient population. Instead of working in a bubble, the oncologist can leverage the network of data to provide true precision medicine for their patients.

Another great example of precision medicine happening today is seen in the work of Carlos Bustamante, Professor of Genetics at Stanford University School of Medicine. Carlos is using SAP Foundation for Health to quickly identify genetic abnormalities in high-performing athletes. Rather than recount the stories of Carlos’ work here, I’ll just link to this video where Carlos talks about the amazing insights they’ve found from studying the genomic abnormalities of high-performing athletes. I love that his precision medicine work with high-performing athletes has significant potential benefits for every patient.

Carlos is spot on in the video linked above when he says that the drop in genomic sequencing costs is like taking a $400,000 Ferrari and selling it for 10 cents. Sequencing the first genome originally took $13 billion and years of effort; now it takes $1,500 and a few days. Access to every patient’s genome is going to change the types of drugs we develop, the treatment options we provide patients, our choice of drugs to treat a patient, and much much more. You can see that firsthand in the work that ASCO and Stanford University School of Medicine are doing. Is there any more personalized medicine than the human genome?
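A quick sanity check of the analogy using the article’s own figures shows both drops are millions-fold, so the comparison is roughly right:

```python
# Ratios implied by the article's own figures.
genome_ratio = 13e9 / 1500      # first genome vs. ~$1,500 today
ferrari_ratio = 400_000 / 0.10  # $400,000 Ferrari vs. 10 cents

print(f"genome sequencing cost fell ~{genome_ratio:,.0f}-fold")   # ~8,666,667
print(f"the Ferrari analogy implies ~{ferrari_ratio:,.0f}-fold")  # 4,000,000
```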

Of course, the genome is just one of the many factors we’re seeing in the precision medicine revolution. We can’t forget about other variables that impact a patient’s health: environmental, behavioral, patient preference, and much more. We really are looking at a multi-billion-piece puzzle and we’re just getting started. Remember that healthcare is not linear, but we’ve been treating it like it is for years. Healthcare is a complex matrix of challenges and we need our technology solutions to reflect that fact.

I see a beautiful future for precision medicine, one that has already begun and will keep building. We’re developing and targeting new drugs, devices and services that work for populations and individuals. We’re seeing new open, secure platforms that provide real-time, flexible R&D analysis, genomics and other “omics” disciplines, patient cohort building and analysis, patient trial matching, and extended care collaboration solutions.

Data by itself is not valuable. However, the right engine on top of the right data is changing how we look at healthcare. We’re getting a much more precise view of each individual patient. Where have you seen precision medicine starting to take hold? What precision medicine solutions are you using in your organization?

Also, check out this infographic which looks at SAP’s view of precision medicine:
Personalized Medicine You Can Do Today

SAP is uniquely positioned to help advance personalized medicine. The SAP Foundation for Health is built on the SAP HANA platform, which provides scalable cloud analytics solutions across the spectrum of healthcare. SAP is a sponsor of Influential Networks, of which Healthcare Scene is a member.

Not So Far Far Away From Star Wars Medical Droids

Posted on December 18, 2015 | Written By

Colin Hung is the co-founder of the #hcldr (healthcare leadership) tweetchat one of the most popular and active healthcare social media communities on Twitter. Colin is a true believer in #HealthIT, social media and empowered patients. Colin speaks, tweets and blogs regularly about healthcare, technology, marketing and leadership. He currently leads the marketing efforts for @PatientPrompt, a Stericycle product. Colin’s Twitter handle is: @Colin_Hung

Friday, December 18th is the day that Star Wars: The Force Awakens hits theatres. It carries with it the dreams of generations of fans. From old timers like me (who remember watching Star Wars: A New Hope in a converted opera house in 1977) to the new generation who grew up watching the prequels and the Clone Wars – everyone is looking forward to this new film.

As a fan, I thought it would be remiss of me not to write a blog using Star Wars as the theme this week.

One of the things that always struck me about Star Wars was the lack of doctors in the movies. Unlike the Star Trek universe, where we had the lovable character of Dr. Leonard McCoy (Bones), you never really see a physician in Star Wars. Instead, all the healing is done by droids.

In The Empire Strikes Back, we are introduced to a medical droid that heals Luke Skywalker after his encounter with the abominable-snowman-like Wampa on the frozen planet of Hoth. At the end of the movie we see other droids caring for Luke when he loses his hand battling Darth Vader.

Back in the 80s when The Empire Strikes Back was released, these medical droids were pure science fiction. In 2015, medical robots are a reality, and some are surprisingly similar to the ones depicted in the movie. Take for example the da Vinci Surgical Robot by Intuitive Surgical (on the left), which looks like a precursor to the FX series of medical droids from Star Wars (on the right).

Da Vinci Xi Robot and Star Wars FX Medical Droid

I’ve never seen the da Vinci surgical robot, but the write-ups have been incredible. This robot allows surgeons to perform minimally invasive surgeries using the four finely controlled arms. The surgeon controls everything through a console. It is not hard to imagine that one day soon the surgeon performing the surgery may not be in the same hospital or even the same country as the robot itself – the ultimate in telemedicine!

Surgical robots are a hot area of healthcare innovation. Just last week Johnson & Johnson and Verily Life Sciences (formerly Google Life Sciences) got together to create Verb Surgical. According to the press release, “in the coming years, Verb Surgical aims to develop a comprehensive surgical solutions platform that will incorporate leading-edge robotic capabilities and best-in-class medical device technology for operating room professionals”.

The more companies enter this space, the faster these robots will evolve.

However, having articulating surgical robots only gets us partway to a fully functional Star Wars medical droid. We have the body, but now we need the brains. That’s where IBM’s Watson comes in.

Watson is arguably the closest thing we currently have to artificial intelligence. IBM’s brainchild is able to analyze data and draw patterns and conclusions faster than any computer system that has ever existed. It is already capable of crunching through millions of medical records and using that knowledge to help with cancer treatment. In pilots with several institutions, Watson is already assisting with diagnosis and treatment of disease.

It’s not hard to imagine that one day a Watson-like system will be combined with a surgical robot. Add in a little bit of advanced machine vision plus a few antimicrobial nanomaterials and all of a sudden you have the basics of a Star Wars medical droid.

The optimist in me believes it will happen in my lifetime. I only wish lightsabers and x-wing fighters weren’t so far far away.

Image Credit

Da Vinci Xi Robot – engadget http://www.engadget.com/2014/04/01/da-vinci-xi-surgical-robot/

FX medical droid – starwars.wikia.com http://starwars.wikia.com/wiki/FX-series_medical_assistant_droid