When Your CEO Forces AI Adoption With Weekend Firings
Coinbase CEO Brian Armstrong just admitted he fired engineers for refusing to use AI coding tools.
And honestly, the way he did it is exactly what's wrong with how tech executives think about engineering.

Here's what happened: Armstrong gave his team one week to onboard AI coding tools.
When engineers pushed back saying adoption would take months, he basically said "fuck that" and demanded everyone be onboard by Friday. Then he scheduled a Saturday meeting with anyone who hadn't complied.

"I jumped on this call on Saturday and there was a couple people that had not done it," Armstrong explained. "Some of them had a good reason, because they were just getting back from some trip or something, and some of them didn't and they got fired."

Weekend. Firing. Session. With the CEO.

## Why This Approach Is Backwards
I've been writing code for 15 years, and I've used Copilot extensively.
It's decent for boilerplate but suggests security vulnerabilities constantly.
Recent research published by IEEE found that 27% of Copilot's code suggestions contain security vulnerabilities.
The idea that you can mandate 50% AI-generated code at a financial company while maintaining security is insane, especially when financial regulators are explicitly warning about AI security risks.
Armstrong even admits this contradiction: "It's not clear how you run an AI-coded code base and what the best way to do it is." So he's firing people for not adopting tools he doesn't understand how to implement properly.
The whole situation reveals a fundamental misunderstanding of how good tools spread. When Docker solved deployment hell, nobody needed CEO ultimatums; engineers adopted it because it worked.
When Git replaced SVN, it happened organically because the benefits were obvious.
The fact that Armstrong had to threaten jobs suggests AI coding tools aren't the productivity revolution everyone claims.

## The Reality of AI Coding Tools

Don't get me wrong: tools like Copilot can be useful.
They're great for generating boilerplate, suggesting API usage patterns, and helping with syntax in unfamiliar languages.
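To make that concrete, here's the kind of pattern-heavy boilerplate where completion models genuinely shine. This is a hypothetical sketch; the `Trade` class and its fields are invented for illustration, not taken from any real codebase:

```python
# The kind of mechanical boilerplate AI assistants autocomplete well:
# a dataclass plus a from_dict constructor. (Hypothetical example;
# the class and field names are illustrative.)
from dataclasses import dataclass


@dataclass
class Trade:
    symbol: str
    quantity: float
    price: float

    @classmethod
    def from_dict(cls, data: dict) -> "Trade":
        # Predictable field-by-field mapping: exactly the pattern
        # a completion model reproduces reliably.
        return cls(
            symbol=data["symbol"],
            quantity=float(data["quantity"]),
            price=float(data["price"]),
        )
```

Code like this is predictable enough that the suggestion is usually right, and cheap to verify when it isn't.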
But they're nowhere near reliable enough for the kind of mandate Armstrong implemented. I've seen Copilot suggest:

- `rm -rf /` as a debugging solution
- SQL injection vulnerabilities in authentication code (sketched after this list)
- Race conditions in concurrent code
- Memory leaks in performance-critical sections

Security researchers have documented these issues extensively: GitGuardian reports that developers need specialized training to identify AI-generated vulnerabilities, and Pillar Security has discovered new attack vectors that specifically target AI coding assistants.
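To show what the authentication failure mode from the list above looks like, here's an illustrative reconstruction. It's not a verbatim Copilot output, and the function and schema names are invented for this sketch:

```python
import sqlite3

# Illustrative reconstruction of the SQL-injection pattern AI assistants
# keep suggesting in auth code; table and column names are made up.

def check_login_unsafe(conn: sqlite3.Connection, user: str, pw_hash: str) -> bool:
    # BAD: string interpolation. Input like  x' OR '1'='1' --
    # turns the rest of the query into a comment and bypasses auth.
    query = f"SELECT 1 FROM users WHERE name = '{user}' AND pw_hash = '{pw_hash}'"
    return conn.execute(query).fetchone() is not None

def check_login_safe(conn: sqlite3.Connection, user: str, pw_hash: str) -> bool:
    # GOOD: parameterized query; the driver handles escaping.
    query = "SELECT 1 FROM users WHERE name = ? AND pw_hash = ?"
    return conn.execute(query, (user, pw_hash)).fetchone() is not None
```

The unsafe version looks perfectly plausible in a diff, which is exactly why AI-suggested variants of it keep slipping past tired reviewers.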
The real skill isn't getting AI to write code. It's recognizing when AI suggestions will destroy your production environment.
That takes experience and judgment you can't mandate in a one-week ultimatum.

## What This Means for Engineers

Armstrong's approach is becoming common across the industry. The message is clear: you can't ignore AI tools anymore, even if you've seen them generate garbage.
But here's the thing: being forced to use broken tools doesn't make you more productive. It makes you a babysitter for AI that hallucinates edge cases and security vulnerabilities.

The engineers who got fired might have been asking the right questions about code quality and maintainability. They just worked for a CEO who values adoption metrics over engineering judgment.

For financial services, this creates a ridiculous situation. Armstrong warns against sloppy coding while demanding 50% AI-generated code. Every line still needs human review, which raises the obvious question: if you need to double-check everything anyway, where's the productivity gain?
The New York Department of Financial Services has explicitly warned about AI cybersecurity risks in banking operations.

## The Bigger Problem
Armstrong's weekend firing spree isn't really about AI adoption.
It's about executives who don't understand engineering trying to optimize metrics that don't matter.

Coinbase now has 33% of code written by AI, targeting 50% by year-end. But what matters isn't the percentage of AI-generated code; it's whether the code works, scales, and doesn't lose customer money.

The engineers who resisted might have understood something Armstrong doesn't: good software engineering isn't about maximizing AI adoption. It's about building systems that work reliably when real money is on the line.

But when your CEO schedules weekend firing sessions, engineering judgment becomes irrelevant. You either comply with the metrics or update your LinkedIn.