Christina Brandon

Writer | Researcher

How algorithms break things

We’re finally into February, and I hope this month trots by at a good clip instead of slogging along like January did. There’s over a foot of snow in Chicago, and I’m drinking Irish coffees and making galettes for dinner; seriously, it’s an easy, flexible dish that will leave you feeling fancy and sated. Pair one with a glass of wine and a green salad and indulge in the fantasy that you’re in a Parisian bistro, or any bistro.

In between my food and drink indulgences, I’ve been thinking more and more about the not-sexy but interesting and vital work of process: the systems in place that allow, for example, teams to function well, products to be designed and built, or, in the case below, vaccines to be fairly allocated.

I’ve touched on this before: the behind-the-scenes, often invisible thinking that goes into the products and services we use. But here’s a case that shows how that thinking can be riddled with blind spots.

This article details how Stanford Medicine botched its vaccine rollout. An algorithm prioritized doctors who were not on the front lines seeing Covid patients over those who were. Any person in charge would surely place those facing the most risk at the top of the list, not the bottom.

Besides flaws in the construction of the algorithm itself, there was a very human issue: at no point did the medical workers get any visibility into why they were placed where they were in line for the vaccine. It sounds like it had the feel of being assigned a random number, rather than a spot determined by a programmed set of decision criteria.

A core problem was one of process. At no point, from building and running the program to announcing its “decision” about who would get the shot first, did the people in charge set aside time to communicate with the staff, or be transparent with them, about how and why those decisions were made.

The rollout also had no channel for feedback from the front lines, which would have immediately alerted officials to significant problems with the algorithm.

Algorithms can be created out of a real desire for fairness, which is a great thing, especially in a case like this, with a life-saving vaccine in short supply. They can also be deployed to save costs, whether money, labor, or both. Either way, what often ends up happening is that those in charge punt the problem to an algorithm and walk away, as if the machine alone can fix it.

This Stanford Medicine example illustrates why that is faulty thinking, and the frustration, confusion, and risk involved when we hand down algorithmic decisions with a shoulder shrug. These programs clearly don’t fit neatly into our lives, into whatever processes we, our hospitals, our workplaces have already established. They break them.

I keep thinking: what if a couple of doctors or nurses, or anyone on the medical staff, had been brought into the process early on, before the vaccine allocations were announced? What if the medical staff had been asked, simply, “What do you think of this? What’s missing or confusing?” No doubt Stanford would have heard their concerns. They would have been able to course-correct before rolling out their program to everyone. Before creating a disaster.

What this shows is that we need to rethink our relationship to AI, to the kinds of algorithms used by Stanford Medicine. We think we can simply delegate tasks, but it may be better to think of it as a partnership. We humans have to figure out how to work with these tools, and how to reimagine our work and our processes to account for the breakage wrought by algorithms. New systems must be designed with input from those affected. People need to be part of this reimagining.


Subscribe to my newsletter Humdrum for thoughtful explorations in how technology and design affect our everyday lives. Delivered monthly. Subscribers also get a free copy of my book, Failing Better.