Denied by AI: How Medicare Advantage plans use algorithms to cut off care for seniors in need


Illustration: Mike Reddy for STAT

By Casey Ross and Bob Herman

An algorithm, not a doctor, predicted a rapid recovery for Frances Walter, an 85-year-old Wisconsin woman with a shattered left shoulder and an allergy to pain medicine. In 16.6 days, it estimated, she would be ready to leave her nursing home.

On the 17th day, her Medicare Advantage insurer, Security Health Plan, followed the algorithm and cut off payment for her care, concluding she was ready to return to the apartment where she lived alone. Meanwhile, medical notes in June 2019 showed that Walter’s pain was maxing out the scales and that she could not dress herself, go to the bathroom, or even push a walker without help.

It would take more than a year for a federal judge to conclude the insurer’s decision was “at best, speculative” and that Walter was owed thousands of dollars for more than three weeks of treatment. While she fought the denial, she had to spend down her life savings and enroll in Medicaid just to progress to the point of putting on her shoes, her arm still in a sling.

Health insurance companies have rejected medical claims for as long as they’ve been around. But a STAT investigation found artificial intelligence is now driving their denials to new heights in Medicare Advantage, the taxpayer-funded alternative to traditional Medicare that covers more than 31 million people.

Behind the scenes, insurers are using unregulated predictive algorithms, under the guise of scientific rigor, to pinpoint the precise moment when they can plausibly cut off payment for an older patient’s treatment. The denials that follow are setting off heated disputes between doctors and insurers, often delaying treatment of seriously ill patients who are neither aware of the algorithms, nor able to question their calculations.

Older people who spent their lives paying into Medicare, and are now facing amputation, fast-spreading cancers, and other devastating diagnoses, are left to either pay for their care themselves or get by without it. If they disagree, they can file an appeal, and spend months trying to recover their costs, even if they don’t recover from their illnesses.

“We take patients who are going to die of their diseases within a three-month period of time, and we force them into a denial and appeals process that lasts up to 2.5 years,” Chris Comfort, chief operating officer of Calvary Hospital, a palliative and hospice facility in the Bronx, N.Y., said of Medicare Advantage. “So what happens is the appeal outlasts the beneficiary.”

The algorithms sit at the beginning of the process, promising to deliver personalized care and better outcomes. But patient advocates said in many cases they do the exact opposite — spitting out recommendations that fail to adjust for a patient’s individual circumstances and conflict with basic rules on what Medicare plans must cover.

“While the firms say [the algorithm] is suggestive, it ends up being a hard-and-fast rule that the plan or the care management firms really try to follow,” said David Lipschutz, associate director of the Center for Medicare Advocacy, a nonprofit group that has reviewed such denials for more than two years in its work with Medicare patients. “There’s no deviation from it, no accounting for changes in condition, no accounting for situations in which a person could use more care.”

Medicare Advantage has become highly profitable for insurers as more patients over 65 and people with disabilities flock to plans that offer lower premiums and prescription drug coverage, but give insurers more latitude to deny and restrict services.

Over the last decade, a new industry has formed around these plans to predict how many hours of therapy patients will need, which types of doctors they might see, and exactly when they will be able to leave a hospital or nursing home. The predictions have become so integral to Medicare Advantage that insurers themselves have started acquiring the makers of the most widely used tools. Elevance, Cigna, and CVS Health, which owns insurance giant Aetna, have all purchased these capabilities in recent years. One of the biggest and most controversial companies behind these models, NaviHealth, is now owned by UnitedHealth Group.

It was NaviHealth’s algorithm that suggested Walter could be discharged after a short stay. Its predictions about her recovery were referenced repeatedly in NaviHealth’s assessments of whether she met coverage requirements. Two days before her payment denial was issued, a medical director from NaviHealth again cited the algorithm’s length-of-stay prediction — 16.6 days — in asserting that Walter no longer met Medicare’s coverage criteria because she had sufficiently recovered, according to records obtained by STAT.

Her insurer, Security Health Plan, which had contracted with NaviHealth to manage nursing home care, declined to respond to STAT’s questions about its handling of Walter’s case, saying that doing so would violate the health privacy law known as HIPAA.

Walter died shortly before Christmas last year.

NaviHealth did not respond directly to STAT’s questions about the use of its algorithm. But a spokesperson for the company said in a statement that its coverage decisions are based on Medicare criteria and the patient’s insurance plan. “The NaviHealth predict tool is not used to make coverage determinations,” the statement said. “The tool is used as a guide to help us inform providers, families and other caregivers about what sort of assistance and care the patient may need both in the facility and after returning home.”

As the influence of these predictive tools has spread, a recent examination by federal inspectors of denials made in 2019 found that private insurers repeatedly strayed beyond Medicare’s detailed set of rules. Instead, they were using internally developed criteria to delay or deny care.

But the precise role the algorithms play in these decisions has remained opaque.

STAT’s investigation revealed these tools are becoming increasingly influential in decisions about patient care and coverage. The investigation is based on a review of hundreds of pages of federal records, court filings, and confidential corporate documents, as well as interviews with physicians, insurance executives, policy experts, lawyers, patient advocates, and family members of Medicare Advantage beneficiaries.

It found that, for all of AI’s power to crunch data, insurers with huge financial interests are leveraging it to help make life-altering decisions with little independent oversight. AI models used by physicians to detect diseases such as cancer, or suggest the most effective treatment, are evaluated by the Food and Drug Administration. But tools used by insurers in deciding whether those treatments should be paid for are not subjected to the same scrutiny, even though they also influence the care of the nation’s sickest patients.

In interviews, doctors, medical directors, and hospital administrators described increasingly frequent Medicare Advantage payment denials for care routinely covered in traditional Medicare. UnitedHealthcare and other insurers said they offer to discuss a patient’s care with providers before a denial is made. But many providers said their attempts to get explanations are met with blank stares and refusals to share more information. The black-box nature of the AI has become a blanket excuse for denials.

“They say, ‘That’s proprietary,’” said Amanda Ford, who facilitates access to rehabilitation services for patients following inpatient stays at Lowell General Hospital in Massachusetts. “It’s always that canned response: ‘The patient can be managed in a lower level of care.’”
