Friday evening, Sam Altman was out. Saturday morning, the board was in chaos. Monday morning, he was back in. The OpenAI weekend happened, and everyone who uses GPT for anything got reminded of something they probably knew and actively ignored: you’re building on somebody else’s platform, and that platform has a board.
I had a conversation with the team on Monday morning about what this means for us. Not the hyperbolic “AI is dying” takes that were doing the rounds on Twitter, but the actual question. Filter runs AI tooling. Not exclusively, but it’s in our stack now. Code generation, content scaffolding, sometimes reasoning over data. What happens when the model provider has an internal crisis? What’s our exposure?
We’re not locked into GPT-4 for everything. We use it in specific places where it’s clearly the best tool. Content generation, code review, that kind of thing. If OpenAI went away tomorrow, we’d have to rewrite some things and find alternatives for others, and that would be an engineering problem, not a business killer. We’d be annoyed, but we could pivot quickly.
But if I were running a startup that had built its entire product on top of OpenAI’s API, with zero redundancy and no way to swap to Claude or Google’s offering without reworking the whole product, that weekend would have felt very different.
There’s a flip side, though. Total redundancy is expensive. Testing against multiple models, building abstractions, maintaining fallback paths. That takes engineering time, and if you’re a five-person startup trying to find product-market fit, you don’t have that time. So the practical answer is: build with one, know your Plan B, and start building it before you need it.
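The "build with one, know your Plan B" idea can be kept cheap: a thin abstraction over whichever provider you use today, plus a fallback path you can exercise before an outage forces you to. This is a minimal sketch of that shape, with hypothetical stub clients standing in for real SDK calls; the names and error handling are illustrative, not any vendor's actual API.

```python
# Minimal sketch of a provider abstraction with a fallback path.
# The provider clients below are hypothetical stubs, not real SDK calls.
from dataclasses import dataclass
from typing import Callable


class ProviderDown(Exception):
    """Raised when a provider is unavailable (outage, 5xx, etc.)."""


@dataclass
class Provider:
    name: str
    complete: Callable[[str], str]  # prompt -> completion text


def complete_with_fallback(providers: list[Provider], prompt: str) -> str:
    """Try providers in priority order; fall through on failure."""
    errors = []
    for p in providers:
        try:
            return p.complete(prompt)
        except ProviderDown as e:
            errors.append(f"{p.name}: {e}")
    raise RuntimeError("all providers failed: " + "; ".join(errors))


# Stubs simulating a primary having a bad weekend and a healthy backup.
def flaky_primary(prompt: str) -> str:
    raise ProviderDown("API returned 503")


def healthy_backup(prompt: str) -> str:
    return f"[backup] {prompt}"


providers = [
    Provider("primary", flaky_primary),
    Provider("backup", healthy_backup),
]
print(complete_with_fallback(providers, "summarize this ticket"))
# prints "[backup] summarize this ticket"
```

The point isn't the twenty lines of code; it's that the rest of your product only ever talks to `complete_with_fallback`, so swapping or reordering providers is a config change, not a rewrite.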
The other thing the OpenAI weekend taught everyone is that the lab companies matter more than the cloud companies in this story. You can switch between cloud providers relatively smoothly because they offer similar services. You can’t switch foundation models smoothly because the model itself is the differentiation: prompts tuned for one model behave differently on another. This is a much deeper lock-in than we’ve had before, and it’s worth being honest about.
What it doesn’t mean is that you should avoid using AI tools. It means you should build the same way you’d build anything that depends on external infrastructure. Know your dependencies. Plan for failure. Keep optionality. Don’t pretend a company with a board is the same as a utility.
We’re going to keep using a set of tools that includes GPT, Claude, and probably one or two others as we figure out what they’re good for.
—