Robby Walker was the guy in charge of Siri until earlier this year, and now he's quitting Apple entirely. Sources say he's frustrated with how slowly Apple moves on AI, which is corporate speak for "Siri is embarrassingly bad and nobody wants to fix it properly."
Here's the brutal reality: I can ask ChatGPT to write a Python script to analyze my expenses, help me plan a vacation, or explain quantum physics. I ask Siri to set a timer and she gives me web search results about "timer meditation techniques." It's 2025 and Apple's AI can't reliably understand basic commands that Google Assistant handled in 2018.
The Technical Reasons Siri Is Broken
Apple crippled Siri from the start with two decisions that seemed smart at the time but are now killing it:
1. Privacy-first on-device processing: While Google and Amazon were building massive cloud-based AI systems that could learn from millions of users, Apple insisted on doing everything locally on your phone. This sounds great for privacy, but your iPhone doesn't have the processing power of a data center.
2. Rigid command structure: Siri was built around predefined commands and specific phrase patterns. You have to learn to talk to Siri in specific ways - "Hey Siri, set a timer for 5 minutes" works, but "Hey Siri, remind me in 5 minutes" might not. Modern AI should adapt to how humans naturally speak, not the other way around.
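The brittleness of predefined phrase patterns is easy to demonstrate with a toy rule-based intent matcher. This is a simplified sketch of the general technique, not Siri's actual architecture, and every name and pattern in it is invented for illustration:

```python
import re

# Toy rule-based intent matcher: each intent only fires on the exact
# phrasings its authors anticipated -- the rigid-command problem.
INTENT_PATTERNS = {
    "set_timer": [
        re.compile(r"set an? timer for (\d+) minutes?"),
        re.compile(r"start an? (\d+)[- ]minute timer"),
    ],
}

def match_intent(utterance: str):
    """Return (intent, captured args), or None if no pattern matches."""
    text = utterance.lower().strip()
    for intent, patterns in INTENT_PATTERNS.items():
        for pattern in patterns:
            m = pattern.search(text)
            if m:
                return intent, m.groups()
    return None  # unanticipated phrasings fall through to web search

print(match_intent("Set a timer for 5 minutes"))  # ('set_timer', ('5',))
print(match_intent("Remind me in 5 minutes"))     # None -- same need, different words
```

The second query expresses the same need as the first, but because nobody wrote a pattern for that phrasing, it falls through entirely. An LLM-based parser generalizes across phrasings instead of enumerating them.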
Apple's Cultural Problem with AI
Walker's departure highlights a deeper issue: Apple's perfectionist culture is toxic for AI development. They want to ship polished, finished products, but AI improves through iteration and user feedback.
Google releases "beta" AI features that get better over time. Microsoft integrated ChatGPT into everything and iterates based on user behavior. Apple launches AI features that feel like they went through 3 years of committees, legal reviews, and focus groups - which kills any innovative spark.
I know people who worked on Siri teams. They describe endless meetings about edge cases, compliance reviews for every new capability, and feature requests that take months to approve. Meanwhile, ChatGPT went from launch to 100 million users in two months.
What Actually Needs to Change
Apple Intelligence (their new AI initiative) is supposed to fix this, but early demos look like more of the same - limited, careful features that won't offend anyone and won't be particularly useful either.
What they really need to do:
- Admit Siri is broken and rebuild it from scratch using modern large language models
- Accept some cloud processing for complex queries - privacy is important but usability matters too
- Ship beta AI features and iterate based on real usage instead of internal testing
- Compete on AI capabilities instead of just talking about privacy
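The hybrid-processing idea in the second bullet doesn't have to be all-or-nothing. A minimal sketch of the routing pattern, with a heuristic and keyword list invented purely for illustration (a real system would use a trained on-device classifier, not word matching):

```python
# Hypothetical query router: a cheap local check decides whether a
# request can be handled on-device or needs a cloud model.
SIMPLE_KEYWORDS = {"timer", "alarm", "call", "volume", "play", "pause"}

def route_query(query: str) -> str:
    """Return 'on_device' for short command-like queries, else 'cloud'."""
    words = query.lower().split()
    # Short utterances containing a known command keyword stay local;
    # open-ended or multi-step requests go to the cloud model.
    if len(words) <= 8 and SIMPLE_KEYWORDS & set(words):
        return "on_device"
    return "cloud"

print(route_query("set a timer for 10 minutes"))                  # on_device
print(route_query("plan a week-long trip to Japan under $3000"))  # cloud
```

The point of the pattern: privacy-sensitive, latency-critical commands never leave the phone, while only the queries a phone genuinely can't handle pay the privacy and latency cost of a round trip.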
But this requires a fundamental culture change that Apple has shown zero willingness to make. So we'll keep getting incremental Siri improvements while ChatGPT, Claude, and Google Assistant keep pulling further ahead.
Walker probably got tired of fighting this bureaucracy and decided to join a company that actually wants to build good AI. I don't blame him.