To Love, To Hate AI

AI · Design Ethics · Unsolicited Opinion

When I was teaching during the golden age of bootcamps, I had too many students who, like me, were originally drawn to design as a less-starving alternative to being an artist. And like me, they didn't immediately grasp the responsibility of the role.

So I started introducing the course with a quote from cynical mathematician Donald Saari. Slide 01, black text on white:

For a price, I will come to your organization just prior to your next important election. You tell me who you want to win. I will talk with the voters to determine their preferences over the candidates. Then I will design a 'democratic voting method' which involves all candidates. In the election, the specified candidate will win.

He was joking about the offer, but the mechanism is very real.1

Design works the same way, but it’s an industry where rigging the system isn’t a provocative joke or dirty secret; it’s the whole point.

Click. Buy. Like.

For a price, we study people, map their behavior, and build systems that steer them where someone—typically someone signing our checks—wants them to go.

Follow. Share. Repeat.

Structure shapes understanding. Systems define behavior. Language colors perception. Design isn't moral; it's prescriptive. It creates rules, incentives, and hierarchies that all favor one party over another.

The age of AI is here, as every bullshit tech puff piece published over the last three years will tell you. It's not a matter of stopping it; that ship's done sailed. But we still have time to decide what it looks like. Who's in the room when decisions get made? Whose values inform what gets prioritized?

There's a moral panic happening across an industry where disruption is a business plan—and disruption is, by its nature, destructive. For a lot of smart people across design and engineering, the only ethical stance is refusal. If you touch AI, you're complicit. The problem is, AI gets built whether you engage with it or not. When you refuse to participate, you're not protecting your values; you're retreating.

I watched this happen with the web. The people who cared most about what it could be stepped back when we left the wild west, when people became brands, when it started feeling more like a mall than a neighborhood. They let someone else make the decisions. We got surveillance instead of curiosity, extraction instead of connection. Those values largely ended up relegated to the fringes of FOSS, the indie web movement, and hobbyists.2

Opting out doesn't make the system better; it didn't 20 years ago, and it won't now.3 It just means your perspective isn't in the room. If you believe design has power—if you believe that how we build systems actually matters—then your absence is a choice with consequences.

Absurd sums of money are being pumped into wildly unprofitable tech by companies desperate for ROI. I don't trust that AI will be built responsibly by default. I don't trust that ethics will be prioritized over growth metrics, or that safety will win against speed. That’s why I believe AI's biggest critics need to be the most involved in its development.

"Fuck AI," right? Except it's not OpenAI, Meta, or any other company pouring money into AI slop who stand to lose. It's already rigged in their favor. If we consider ourselves advocates for the user—for people, for ethical tech in general—now's the time to either advocate or admit we were failed artists this whole time.

Footnotes

  1. See also: gerrymandering.

  2. I'm a big believer in FOSS and the DIY web, so this is coming from first-hand experience more than anything.

  3. Still pissed at how successful (and powerful) some of these early tech companies became purely due to a lack of alternatives.
