
We Consumed Fashion, Now We Consume Tokens

April 8, 2026 · Restorative · 6 min read

Fast fashion taught us nothing. We just moved the same compulsive consumption into the cloud, and the planet is paying the same price.

Fast fashion broke something in how we relate to making things. The whole game was speed and volume: produce it cheap, buy it fast, throw it away faster. We knew it was wasteful. We talked about it endlessly. And then we did it anyway because the friction was so low and the dopamine hit was real.

Now look at what we're doing with AI.

The same pattern, different medium. Instead of a €4 polyester top from a fast fashion drop, we're generating the millionth to-do list app, the habit tracker nobody asked for, the task manager that's identical to the last one except this person didn't like the color scheme. We're consuming tokens the way we consumed trend cycles: compulsively, at volume, without pausing to ask whether the thing needed to exist.

The Vibe Coding Problem Is a Consumption Problem

I went to ChangeNOW 2026 with one specific argument in my head. I wanted to challenge my inner vibe coder, the persona building things not because there's a real need but because building has become as frictionless as scrolling. You can spin up a full app in an afternoon now. That's remarkable. It's also how we ended up with fast fashion racks that reset every two weeks.

Having ADHD is not an excuse for burning through compute to avoid finishing a thought. Neither is "I just wanted to try something." Excel exists. Notion exists. A piece of paper exists. The compulsion to build a new tool instead of using an existing one is, at this point, a consumption habit dressed up as productivity.

At that same event I met Jin from Peace Therapist, an NGO doing mental health recovery work with people affected by war. Real problem. Measurable harm. People whose nervous systems have been shattered by conflict, and Jin is building infrastructure to help them recover. That's what intentional building looks like. That's the bar. Not "I built something this weekend" but "I built something that addresses a gap that wasn't already covered by seventeen existing tools."

The contrast was hard to ignore. Same technology, same tokens burned. One person building toward a real need. Everyone else adding to the digital landfill.

The Cloud Has a Physical Address

There's a convenient fiction in digital consumption, which is that because you can't hold it, it doesn't weigh anything. A request to a large language model feels like a thought. It feels instantaneous, weightless, free. But it's none of those things.

Data centers are physical buildings full of servers that generate heat, require cooling, draw power, and consume water for that cooling. A lot of water. Estimates vary (it would be genuinely useful if the cloud companies running LLMs disclosed the numbers and proved me wrong), but a single conversation with a frontier model is far from free in energy and cooling water, depending on the data center's location and infrastructure. Multiply that by billions of daily queries. Multiply that by the habit of running the same query four times because you didn't really think through what you needed before hitting send.

Fast fashion's environmental cost was visible if you looked: the rivers in India running blue from dye runoff, the mountains of discarded clothing in the Atacama Desert. Cloud's environmental cost is mostly invisible because it happens in industrial zones outside cities, hidden behind interfaces that feel clean and minimal. But the cost is there. It's just been abstracted away from the person generating it.

That abstraction is doing a lot of work to keep the consumption loop running.

Intention Is the Only Meaningful Difference

The argument I keep coming back to is not that we should use less AI. It's that the way most people are using it right now is structurally identical to fast fashion consumption, and the outcomes are going to rhyme.

Fast fashion didn't fail because making clothes is bad. It failed because the relationship between maker and consumer was built on extraction, not value. The whole model depended on you not thinking too hard about what you actually needed, just responding to the stimulus of low price and high availability.

AI consumption looks the same from this angle. The model is: make it frictionless, make it fast, make it feel like it costs nothing, and watch people consume at volume. The companies benefit from the volume. The planet absorbs the cost. The user gets a mediocre output they could have produced with twenty minutes of focused thinking.

Intention breaks this loop. Not perfectly, not always, but structurally. When I decide, before prompting the model, what I actually need, what I'm going to do with the output, and whether this is the right tool for this specific problem, I consume less and get more. The session is tighter. The output is better. The water I notionally consumed was in service of something real.

This is what I mean when I say that intentional AI produces better outcomes than fast AI. It's not a moral claim. It's a practical one. Intention makes you a better operator because it forces the clarifying work that most people skip.

What a Sustainable Relationship With Compute Actually Looks Like

I don't think individuals doing mindful AI usage will save the planet. That's the same trap as telling people to bring their own tote bags while corporations design systems that make single-use plastic structurally necessary. The infrastructure decisions matter more than individual choices, and I'm not pretending otherwise.

But individual choices do shape what we normalize. Fast fashion normalized the idea that clothing should be cheap, disposable, and trend-driven. That normalization made it harder to argue for a different model, not impossible, but harder. The cultural story got set.

Right now the cultural story around AI is being set. The vibe coding narrative, the "I built this in a weekend" flex, the constant generation as a form of identity expression: this is setting a norm. And the norm it's setting today is fast fashion with servers.

The counter-narrative, the one worth building, is that the quality of your thinking before you open your AI interface of choice matters more than the speed at which you generate output. That constraint is generative. That asking whether something needs to exist is not a creative block but a creative discipline.

I walked and talked with Jin for maybe ten minutes before we both moved on to different parts of the event. But in that conversation she was clearer about the problem she was solving than most teams I've seen spend months circling. She knew what she was building and why. The technology was in service of a real thing.

That clarity is the only thing that makes the footprint worth it.

We got a second chance after fast fashion to build a different relationship with a new medium. The fact that the medium is digital doesn't make the choice easier. It makes it more invisible, which is worse. The Atacama Desert can't absorb discarded tokens, but somewhere there's a data center drawing water, cooling the servers that ran the query you forgot to think through.
