The 21st Century Cures Act Will Finally Put Algorithms at Doctors’ Fingertips

By Adam Chekroud, Blog Contributor and Policy Fellow
July 17, 2017

Most medical research is based on the idea of an “average” patient. Now, more and more people hope that we can “personalize” medicine and move away from averages. The 21st Century Cures Act, which breezed through the House (392-26) and the Senate (94-5) before being signed into law by President Obama late last year, may finally help realize that ambition.

The bill exempts specific kinds of health software from being regulated as medical devices. This revised definition paves the way for clinicians to use a host of digital tools to make better diagnostic and treatment decisions for their patients, because those tools now face a lighter regulatory burden.

An example: in 2006, a group of researchers at Massachusetts General Hospital developed a brief clinical assessment that can tell whether a patient is more likely to be suffering from unipolar or bipolar depression. In clinical practice the two conditions can be difficult to tell apart, yet the authors' statistical model could tell the difference 87% of the time.
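For readers curious what a model like this looks like under the hood, here is a minimal sketch in Python, assuming questionnaire items are scored numerically and diagnoses are known for a training sample. The data, features, and numbers it prints are synthetic and purely illustrative; this is not the MGH instrument or its published model.

```python
# A minimal, hypothetical sketch of a diagnostic screening model.
# The questionnaire items, data, and results below are synthetic;
# they are not the MGH assessment or its published model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical questionnaire: each row is one patient's item scores (0-3);
# each label is 1 for bipolar depression, 0 for unipolar depression.
X = rng.integers(0, 4, size=(500, 15)).astype(float)
y = rng.integers(0, 2, size=500)

model = LogisticRegression(max_iter=1000)

# Cross-validated accuracy estimates how often the model's call is correct,
# analogous to the "87% of the time" figure reported for the real instrument.
accuracy = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
print(f"Estimated accuracy: {accuracy:.2f}")

# Fit on all available patients, then screen a new patient from item scores.
model.fit(X, y)
new_patient = rng.integers(0, 4, size=(1, 15)).astype(float)
prob_bipolar = model.predict_proba(new_patient)[0, 1]
print(f"Estimated probability of bipolar depression: {prob_bipolar:.2f}")
```

In practice, a validated instrument, real clinical data, and careful external validation would be essential before any such score informed care.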

Techniques like this can help us understand treatment as well as diagnosis. In treating depression, patients often go through a trial-and-error process of multiple medications before finding the right one. Earlier this year, our group at Yale analyzed the symptom profiles of over 4,000 patients with depression, using artificial intelligence to learn complex relationships that predict whether a specific antidepressant will help a patient get better. The algorithm, which performed better than many practicing psychiatrists, is now available as an online questionnaire and is being used by other clinicians daily.
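Again purely for illustration, a minimal sketch of this kind of treatment-response prediction might look like the following, assuming baseline symptom ratings and known outcomes for past patients. The data are synthetic and the choice of model here is an assumption for the example; the published algorithm's details differ.

```python
# A minimal, hypothetical sketch of predicting antidepressant response from
# baseline symptom profiles. The features and data are synthetic; this is not
# the published Yale model, only an illustration of the general approach.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Each row: one patient's baseline symptom ratings.
# Label: 1 if the patient remitted on a given antidepressant, 0 otherwise.
X = rng.normal(size=(4000, 25))
y = (X[:, :5].sum(axis=1) + rng.normal(scale=2.0, size=4000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

clf = GradientBoostingClassifier().fit(X_train, y_train)

# AUC on held-out patients gauges how well the model separates likely
# responders from non-responders before treatment begins.
auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
print(f"Held-out AUC: {auc:.2f}")
```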

Decision support software is not just limited to psychiatry. The Memorial Sloan Kettering Cancer Center has hosted a number of tools like this to help physicians understand the nature of a patient’s prostate cancer, and predict the likely outcomes of treatment.

Because there are thousands of ways that two patients might differ, the number of “personalized” rules to remember for each diagnosis or treatment can become overwhelming. To get around this, statistical models usually have to be implemented using computers.

However, until now, the FDA would have considered this type of algorithm a medical device, equivalent to a pacemaker or a medical thermometer, for instance.

The new act no longer treats these applications as traditional medical devices. Instead, it allows the use of “health software” designed to display or analyze medical information. The asterisk: these recommendations must be given to clinicians, rather than straight to patients.

This opens up a world of opportunities to personalize healthcare, opportunities that will no longer take years to realize. With a better understanding of a patient's molecular and symptom profile, we will be able to design or develop treatments that focus on an individual's response rather than the average response.

The change will especially please academics and digital health start-up companies, as it will allow us to translate our research into real-world impact much sooner.

Despite the advances the act will bring, it also has limitations. First and foremost, the software must remain clinician (rather than patient) facing, and the clinician must not be expected to rely primarily on the software when making decisions. Finally, software that is reasonably likely to have serious adverse health consequences, or that is used to control another regulated device, will continue to be regulated as a medical device. This might include, for example, real-time software used in critical care, or automatic monitoring of vital signs under anesthesia.

However, the safety and efficacy of these tools, just like those of other medical interventions, require careful inquiry, and it is not clear who will now be responsible for this.

We cannot simply pass this regulatory burden onto clinicians, who may not be familiar with the latest scientific or medical device literature and who could formerly rely on the FDA for that scrutiny.

Even more concerning is that the clinicians who will ultimately bear this burden may not be specialist providers, such as psychiatrists or cardiologists. Health software is often designed to allow less specialized clinicians (e.g. primary care providers) to treat more complex cases, in order to reduce the demand for specialty care.

By finally putting algorithms into doctors’ hands, 21st Century Cures will allow researchers and companies to test their tools in real-world settings much sooner than they ever could. However, failing to evaluate the safety and utility of medical software could rush the use of new and unproven tools, and ultimately risk reducing the quality of care that patients receive.

As someone who founded a health software startup, I could have a lot to gain from 21st Century Cures. I’m concerned that, if we aren’t careful, patients could have a lot to lose.

Adam Chekroud is a PhD Candidate at Yale University and an ISPS Graduate Policy Fellow. He is co-founder & Chief Scientist at Spring Health.