I watched Apple's September 9th event and couldn't stop cringing. Their "revolutionary" iPhone 17 AI features? Text prediction that's still worse than SwiftKey from 2015, visual search that my old Pixel 3 did better, and photo sorting that Google Photos has been doing automatically for years.
Google Duplex books restaurants for me while I'm in meetings. Siri still fucks up when I say "set two alarms and text Sarah I'm running late." It treats each part as a separate request, then asks me to clarify which Sarah from my contacts. Google figured this out in 2018.
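The thing Siri trips over is multi-intent parsing: splitting one utterance into independent actions before executing any of them. Here's a toy sketch in Python of the idea — the intent labels and keyword rules are invented for illustration; real assistants use learned models, not regexes:

```python
import re

# Toy multi-intent splitter: break a compound utterance into separate
# commands, then classify each with crude keyword rules.
# Labels like "set_alarm" are hypothetical, not any real assistant API.
def split_intents(utterance):
    parts = re.split(r"\band\b|,|\bthen\b", utterance.lower())
    intents = []
    for part in (p.strip() for p in parts if p.strip()):
        if "alarm" in part:
            intents.append(("set_alarm", part))
        elif part.startswith("text") or "message" in part:
            intents.append(("send_message", part))
        else:
            intents.append(("unknown", part))
    return intents

print(split_intents("Set two alarms and text Sarah I'm running late"))
```

Running it on the exact phrase above yields two separate intents instead of one muddled request — which is all I'm asking Siri to do.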
The A19 chip has enough horsepower to run decent AI, but Apple's obsession with privacy kneecaps everything. They insist on local-only processing, so iPhone users get a neutered assistant that barely works, while Google's cloud-powered features actually save time.
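The tradeoff Apple is making boils down to a routing policy: keep requests that touch personal data on a small on-device model, send everything else to a bigger cloud model. A minimal Python sketch of that policy — the categories and the rule are made up for illustration, not Apple's actual logic:

```python
# Toy privacy-aware request router. "Sensitive" categories and the
# routing rule are invented to illustrate the on-device/cloud tradeoff.
SENSITIVE = {"contacts", "messages", "photos", "health"}

def route(request_text, data_needed):
    if SENSITIVE & set(data_needed):
        return "on-device"   # smaller model, more privacy, less capability
    return "cloud"           # bigger model, more capability

print(route("translate this sign", {"camera"}))               # cloud
print(route("text Sarah I'm late", {"contacts", "messages"}))  # on-device
```

Google's answer to the same question is mostly "cloud"; Apple's is mostly "on-device" — and that one design choice explains most of the capability gap.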
iOS 26's visual search lets you point your camera at text to copy it. Revolutionary! Except Google Lens has been doing this since 2017 AND it translates in real-time AND recognizes objects AND suggests shopping links. Apple's version just... copies text. Slowly.
The productivity gap makes me want to throw my work iPhone out the window. My Android automatically transcribes meetings, drafts email replies, and creates calendar events from text messages. iPhone users still copy-paste between apps like it's 2012.
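That calendar-event trick isn't magic, either: pull a date/time phrase out of a message and build an event from it. A bare-bones stdlib Python sketch — the regex and event fields are illustrative only; real assistants use ML parsers, not a regex this naive:

```python
import re

# Toy event extractor: find a "Friday at 3pm"-style phrase in a message
# and return a minimal event dict. Pattern and fields are illustrative.
def extract_event(message):
    m = re.search(
        r"(monday|tuesday|wednesday|thursday|friday|saturday|sunday)"
        r"\s+at\s+(\d{1,2})\s*(am|pm)",
        message.lower(),
    )
    if not m:
        return None
    day, hour, meridiem = m.groups()
    hour = int(hour) % 12 + (12 if meridiem == "pm" else 0)
    return {"title": "New event", "day": day, "hour": hour}

print(extract_event("Dentist on Friday at 3pm, don't forget"))
```

Thirty lines of amateur regex gets you most of the way; Android ships the polished version of this, and the iPhone makes you do it by hand.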
Apple spent a year hyping "Apple Intelligence" while their engineers scrambled to rebuild basic functionality that Android has had forever. They've pushed the major Siri upgrades back to next year's iOS 27 because what they've built still sucks.
The iPhone 17 isn't innovation - it's Apple admitting they're years behind and begging for more time to catch up to 2018 Android features.