AI and medicine

The DO Book Club, September 2023: “The Algorithm Will See You Now”

J.L. Lycette, MD, imagines a future in which artificial intelligence (AI) has been so thoroughly integrated into the practice of medicine that an AI algorithm is making life-and-death decisions for patients.


Welcome back to The DO Book Club!

Imagine the year is 2035 and the miracle of artificial intelligence (AI) has been so thoroughly integrated into the art and science of medical care that life-and-death decisions are made for us by a computerized algorithm that can predict whether cancer treatment will be successful or not. What could possibly go wrong? J.L. Lycette, MD, imagines the answer for us in her futuristic medical thriller, “The Algorithm Will See You Now.”

In this book, surgery resident Hope Kestrel works for Seattle-based health system Prognostic Intelligent Medical Algorithms (PRIMA). She has earned the official title of high resident, which PRIMA uses instead of chief resident, in its Oncologic and Surgical Intervention Success (OASIS) unit. The use of computerized surveillance and ominous acronyms reads like an Orwellian fantasy. The omnipresent robotic Online Speech and Recognition System (OSLR), whose name purposely alludes to Sir William Osler, can be called up by voice from anywhere in the hospital.

Connecting algorithms to patients

Diagnosis and treatment are performed at PRIMA entirely with AI and OSLR. The Algorithm examines patrons’ (PRIMA’s term for patients) previously input genetic information and other details to determine whether they will benefit from a particular treatment or are instead likely to be a “non-responder.” The goal is to optimize outcomes by reserving treatment for predicted responders.

“The AI frees both patients and doctors from the fallacy of choice,” Dr. Kestrel proclaims. “The algorithms are more trustworthy than people.” (p. 8)

Things soon start to go wrong for several of the supporting characters and their family members. Some of the physicians, including Dr. Kestrel, suspect that something is rotten at the core of the Algorithm as PRIMA seeks to leverage its technology in a corporate takeover of both regional and national cancer care. Villainous hospital administrator Maddox removes dissenters, pits residents against each other and strong-arms her way past any threat to her inhumane and greedy plans.

The author, Dr. Lycette, practices as an oncologist in Oregon, so her concern regarding the use and misuse of AI to treat cancer is right on target. Her multilayered story addresses issues of racial bias in treatment, sexual harassment and the removal of human decision and interaction in health care. In the treatment of cancer, one of the most emotionally charged circumstances, can AI be trusted to make humane choices?

Of course, a machine-driven algorithm cannot hold someone’s hand or be present with them at the bedside. In one particularly creepy, but not inconceivable, scene in the book, when it is time to deliver devastating news to a patron, the physician does not sit down face-to-face with them. Instead, a nurse delivers the bad news so the physician can disengage entirely from sick people and do more “productive” work on responders.

Opposing the plan of the unscrupulous health care corporation, a rogue podcaster questions the methods and morals of the Algorithm-driven enterprise. The podcaster cites the corporate CEO’s cold-hearted description of the billions of dollars spent on ineffective treatments at the end of life. We are reminded, “the most dangerous lies are the ones that use the truth to sell themselves.” (p. 42)

Homing in on our choices

The novel is a tense, fast-paced thriller that keeps the reader guessing what will happen next. I do not normally read this type of book for precisely this reason: it made me so uncomfortable. Yet I kept turning those pages to see how the very likable characters were going to work their way out of quite a few jams. There is a plot to save a mistreated patient and to foil the nefarious plan of the villain and her corporation. At the same time, the reader takes a voyage of self-discovery along with the protagonist.

In her lingering grief over the death of her mother, young Dr. Kestrel foolishly buys into the heartless decisions of the Algorithm to ration care to the patrons and remove patient choice. But removing patients’ ability to choose how they want to be treated at the end of life proves ripe for misuse by a greedy system that seeks to save money by denying and rationing care. Dr. Kestrel finally realizes, “… To hope is to choose. Choice makes us human.” (p. 220)

In the way that art imitates life, Dr. Lycette anticipates the recent entry of AI into our daily lives and health care decisions. She cleverly weaves the moral injury of nurses, the unequal distribution of power and the limits on opioid pain relievers into the story. Where will humanity and justice in medical care go when AI is fully incorporated into our system?

By the end, the hero comes to the truth:

“Hope used to think that doctors could—and should—be perfect only if they had the right technology. But now, she knew that technology didn’t make a doctor who she was. A doctor had to see beyond.

“It wasn’t about predicting cure. It was about nurturing hope. Even at the end—especially at the end.” (p. 256)

In a system that runs for profit and is heavily controlled by corporate interests, Dr. Lycette’s tale of the depersonalization of health care is chilling indeed. It also reaffirms that the vision and work of physicians and nurses who truly care, and who work to change the system, can win the day. “The Algorithm Will See You Now” will keep you reading long past your bedtime.

Editor’s note: The views expressed in this article are the author’s own and do not necessarily represent the views of The DO or the AOA.

Related reading:

The DO Book Club, July 2023: “Darkness Visible: A Memoir of Madness”

The DO Book Club, June 2023: “Long Walk out of the Woods: A Physician’s Story of Addiction, Depression, Hope and Recovery”

2 comments

  1. Steven Kamajian

    Knowledge and wisdom are two separate fields. Health care involves caring … caring for another person. AI can present data and give standard explanations with cascades of algorithms, but it cannot care. It can present knowledge and protocols and try to explain, but that is the minimum and easy part of our profession. The physician’s profession is also to give peace of mind … to comfort … to reassure … to coach … to encourage … to support … and ultimately to have the patient’s back, so that the patient knows that costs and statistics are not making the decisions. How many insurance malpractice cases are the result of the patient or the family being told “there is nothing we can do” (oh … we didn’t bother mentioning that a tertiary referral center might be able to fix this issue … oh, we forgot to mention that end-of-life care doesn’t have to start today)? Trust but verify. Who will be monitoring the programming (insurance companies, the government)? Who monitors the data entry (oh, the patient is 19, not 91)? Etc. The engineering analogy still holds true: the physician is not the conductor on the train punching the ticket, nor the driver at the controls of the locomotive; the physician is the one who lays the track and gives the direction where the train shall go. The patient is the one who tells the physician which stop they plan to get off at, and if the physician and the patient interact properly, the journey not only lasts longer but will be better.
