AI Overriding Health Care Decisions
By Brian Simpson
We have been covering health concerns about AI, much of it based upon educated guesses about where the technology will go. But there are already concerns about what is being done with AI right now. In some US hospitals (though not, as far as I could ascertain, in any Australian hospitals yet), health decisions are being made by AI systems that override the practical experience of nurses. The problem is that the AI algorithms are limited: they cannot take in all the possibilities, realistically evaluate them, and justify a decision. They perform badly when there are competing medical hypotheses, each consistent with the evidence, and struggle to evaluate such situations. I would simply hate to be sick in a hospital bed, my life hanging in the balance, with the decision left to a machine: cold, and very often, just plain wrong.
https://www.naturalnews.com/2023-06-19-ai-overriding-decisions-human-care-nurses-hospitals.html
“Actual human beings are getting phased out of health care in exchange for artificial intelligence (AI) robots that are now reportedly overruling nurses at hospitals.
The life-or-death decisions that have long been made by real people at health care facilities are now being made by computers that have been programmed with who-knows-what to do God-only-knows-what to patients.
One oncology nurse by the name of Melissa Beebe, who relies on her observational skills to help patients in need of emergency care, spoke with The Wall Street Journal about the changes she is seeing in the way care is administered due to the AI infiltration.
“I’ve been working with cancer patients for 15 years so I know a septic patient when I see one,” Beebe said about an alert she recently received in the oncology unit at UC Davis Medical Center in California that she knew was wrong. “I knew this patient wasn’t septic.”
The alert Beebe received had been created by AI based on an elevated white blood cell count it observed in said patient, which it correlated with a septic infection. What the AI system failed to recognize is that the patient in question also had leukemia, which can cause similarly elevated white blood cell counts.
“The algorithm, which was based on artificial intelligence, triggers the alert when it detects patterns that match previous patients with sepsis,” the Journal reported. “The algorithm didn’t explain its decision.”
Being admitted to a hospital in the age of AI is a recipe for early death
The rules at the hospital where Beebe works stipulate that she and all other nurses must follow certain protocols whenever a patient is flagged for sepsis – even if the flag is a mistake based on wrong assumptions made by AI.
The only way to override the AI’s decision is to get a doctor to approve – though if the modified decision ends up being wrong, then nurses can face disciplinary action. The threat of this causes most of them to simply follow orders, even when they know those orders are wrong.
“When an algorithm says, ‘Your patient looks septic,’ I can’t know why,” Beebe, a representative of the California Nurses Association, says. “I just have to do it.”
“I’m not demonizing technology,” she added while noting that, in the case of the aforementioned cancer patient, she was right and the AI was wrong. “But I feel moral distress when I know the right thing to do and I can’t do it.”
While there are arguably some things that AI can maybe, possibly do better than a human being, relying on AI systems to control the direction of medicine and care at hospitals is dangerous business.
Who is to say that the AI machines will not suddenly start targeting certain patients for early elimination if their names come up on a government-created “agitator” list, as one hypothetical dystopian outcome? What about when the AI machines are just plain wrong and hospital staff are too tired, ambivalent, or even apathetic to try to override them and risk their own careers in the process?
“AI should be used as clinical decision support and not to replace the expert,” warns Kenrick Cato, a professor of nursing at the University of Pennsylvania and a nurse scientist at the Children’s Hospital of Philadelphia.
“Hospital administrators need to understand there are lots of things an algorithm can’t see in a clinical setting.””
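To see how such a system can go wrong, here is a minimal sketch in Python of the kind of pattern-matching alert described above. The threshold, field names, and rule are my own hypothetical illustrations; the reporting does not disclose how the actual UC Davis algorithm works.

# A minimal sketch of the failure mode described above. The threshold,
# field names, and rule are hypothetical: the reporting does not say
# how the real UC Davis sepsis algorithm is built.

NORMAL_WBC_MAX = 11.0  # 10^9 cells/L, a typical upper reference limit

def sepsis_alert(patient: dict) -> bool:
    """Naive pattern match: flag any elevated white blood cell count."""
    return patient["wbc_count"] > NORMAL_WBC_MAX

# A leukemia patient can have a high count for reasons unrelated to
# infection, but this rule cannot tell the two causes apart, and it
# produces no explanation a nurse could inspect or contest.
patient = {"wbc_count": 40.0, "diagnosis": "leukemia"}
print(sepsis_alert(patient))  # True: a false positive

Real sepsis models take in far more inputs than this, but the structural problem Beebe describes is the same: an unexplained trigger, followed by a mandatory protocol, no matter how many features feed the pattern match.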