I will occasionally report in Lab Soft News about examples of artificial intelligence (AI) that are being introduced into healthcare because the use of such tools will radically change the way care is rendered. One such example is a recently developed algorithm that generates warnings about which patients are in imminent danger of "coding" in the hospital (see: Ochsner Health System: Preventing cardiac arrests with AI that predicts which patients will ‘code’). Such a warning enables physicians to intervene earlier for them. Below is an excerpt from the article:
In modern hospitals, doctors and nurses are trained to rush into action when a patient “codes” – suffers a cardiac or respiratory arrest and needs immediate medical help. But what if doctors could see into the future and know who is about to code, so they could prevent a code from happening in the first place? A new artificial intelligence tool launched by Ochsner Health System enables doctors to do just that, by analyzing thousands of data points to predict which patients will deteriorate in the near future.....In a 90-day pilot with the system last fall, Ochsner successfully reduced the hospital’s typical number of codes by 44 percent. It’s now expanding the technology to a 24-hour schedule and to more hospitals in its network, which constitutes Louisiana’s largest not-for-profit system.....Predictive models need massive amounts of data, and the groundwork with Epic’s comprehensive health records software now allows Ochsner to query a large database from 11 hospitals. That data enabled Ochsner to build, train and validate a model to predict patient deterioration....
Building the model was hard work, but integrating it into a human workflow was also tough....Patient statuses constantly change, and the system sends only a handful of alerts a day – six to 10, out of hundreds of patients. Each alert requires attention from a provider trained in quick diagnoses and not just resuscitation.....Alerts provide a four-hour warning. To reduce false positives, his team trained the model for Ochsner’s population, which has a lot of patients with kidney and heart failure. Lab values worsen before a dialysis patient undergoes treatment, but the patient isn’t at risk of coding.....Too few alerts missed high-risk patients, but too many produced alert fatigue. Alerts sent too soon didn’t convey urgency, while those sent too late didn’t allow for interventions. A four-hour warning turned out to be ideal, giving enough time for a rapid response provider to finish what she’s doing, walk – not run – to a patient’s room and conduct an evaluation. Interventions may be a medication change, transfer to ICU or another form of elevated care.
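The alerting logic described in the excerpt — a risk threshold tuned so only a handful of alerts fire per day, suppression of dialysis-driven false positives, and a roughly four-hour lead time — can be sketched in a few lines. This is a hypothetical illustration, not Ochsner's actual system; the `Patient` fields, the threshold value, and the suppression flag are all assumptions I am introducing for the sketch.

```python
from dataclasses import dataclass

# Hypothetical values; the real cutoffs were tuned to Ochsner's population.
ALERT_THRESHOLD = 0.85   # assumed risk cutoff, tuned so only ~6-10 alerts fire per day
TARGET_LEAD_HOURS = 4.0  # the lead time the article found ideal

@dataclass
class Patient:
    patient_id: str
    risk_score: float             # deterioration risk from a trained model, 0..1
    dialysis_pre_treatment: bool  # worsening labs explained by upcoming dialysis
    est_hours_to_event: float     # model's estimated time until deterioration

def should_alert(p: Patient) -> bool:
    """Fire an alert only for high-risk patients inside the target lead window."""
    if p.risk_score < ALERT_THRESHOLD:
        return False
    if p.dialysis_pre_treatment:
        # Lab values worsen before dialysis, but the patient isn't at risk of coding.
        return False
    # Alerts sent too soon convey no urgency; too late leaves no time to intervene.
    return p.est_hours_to_event <= TARGET_LEAD_HOURS
```

A real deployment would of course replace the boolean suppression flag with population-specific model training, which is exactly the hard work the article describes.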
Take the time to read this whole article because it goes into detail about how unwise it is to "deploy an algorithm" into a healthcare setting without adequate preparation. Such preparation involves integrating the tool into the normal workflow in order to accommodate the new processes that it brings with it. For example, the tool needs to be refined to reduce false positives. It also needs to provide adequate advance warning when it correctly identifies patients whose clinical status is deteriorating but who might otherwise be missed. Attention also needs to be paid to the type of interventions that such patients require. As noted above, these include a medication change, transfer to the ICU, or another form of elevated care.
All of this reminds me of the APACHE (Acute Physiology and Chronic Health Evaluation) system that was introduced into critical care units about four decades ago (see: APACHE 1978-2001: The Development of a Quality Assurance System Based on Prognosis). An early decision support tool, the software was used to assess the prognosis of patients in critical care units based on a panel of lab test results and some clinical observations. One of the initial intended uses of an APACHE score for an ICU patient was to identify those near death and for whom continued monitoring and intervention would be futile. Such patients could be discharged from the critical care unit, freeing up a bed for a salvageable patient and reducing healthcare costs. However, the use of APACHE for this purpose caused an uproar among some patient family members.
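To make concrete how a prognostic score of this kind works, here is a toy severity score in the spirit of APACHE: points accumulate for physiologic derangement and for age, and a higher total implies a worse prognosis. The variables, cutoffs, and weights below are invented for illustration — they are NOT the published APACHE tables.

```python
def toy_severity_score(temp_c: float, mean_bp: int, heart_rate: int, age: int) -> int:
    """Toy APACHE-style score: points for abnormal physiology plus age.
    All cutoffs and weights are illustrative, not the published tables."""
    score = 0
    if temp_c < 36.0 or temp_c > 38.5:
        score += 2      # abnormal temperature
    if mean_bp < 70 or mean_bp > 110:
        score += 3      # deranged mean arterial pressure
    if heart_rate < 55 or heart_rate > 110:
        score += 2      # brady- or tachycardia
    if age >= 75:
        score += 5      # age contribution, echoing APACHE's
    elif age >= 65:
        score += 3      # chronic-health/age component
    return score
```

In the real APACHE systems the score was then mapped to a mortality estimate; the controversy described above arose from acting on that estimate, not from computing it.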
Today such a use of the score might even be deemed humane, allowing a patient to die at home rather than in a sterile ICU. Again, the following lesson bears repeating: predictive algorithms in healthcare can't be deployed in a vacuum. They need to be integrated, a priori, into workflow and clinical processes with attention to their efficiency and value. It's important to note that Google has already been able to predict in-hospital mortality using deep learning research based on EHR data (see: Scalable and accurate deep learning with electronic health records).