Style, UX, and the AI Gap That Could Define the Next Era of Mobile
I usually don’t pick on Apple products because they generally have good UX, but lately things have been off. Trying to track a run? Harder than it should be. Finding a photo you know you took? Good luck. Simple tasks that used to feel effortless now feel like navigating a maze. And I’m not alone in noticing this.
For nearly two decades, the iPhone was the gold standard, the device that made smartphones feel human. But somewhere between chasing the next aesthetic trend and racing to catch up in the AI arms race, something has gotten lost. The question now isn’t whether Apple is still selling iPhones. They are, impressively so. Apple hit a record 20% global market share in 2025, its highest-ever quarterly shipment share in Q4. The question is whether the experience of using one is keeping up with the brand promise that made people loyal in the first place.
The UX That Once Made Apple Untouchable
There was a time when picking up an iPhone for the first time required no instructions. You just knew how to use it. That intuitive simplicity was Apple’s superpower, and it wasn’t accidental. It was the result of obsessive attention to design detail.
That era feels increasingly distant.
iOS 26 introduced “Liquid Glass,” a translucent, shimmery design language that looks stunning in a keynote and frustrating in real life. The Nielsen Norman Group, one of the world’s most respected UX research organizations, put it plainly: one of the oldest findings in usability is that anything placed on top of something else becomes harder to see. Yet Apple is now encouraging users to set photos as backgrounds for text messages, camouflaging words against beach vacation shots and pet fur. The result? Text becomes harder to read, controls are harder to find, and the interface (once Apple’s greatest competitive advantage) now trips people up.
The UX design community noticed it too. Critics described iOS 26 as a change “driven by the need to appear innovative, not by real user needs.” This echoes a warning from design pioneer Alan Cooper about “thrashing”: changing things to signal progress rather than to respond to real user problems. Apple has walked back design decisions before when backlash grew loud enough. The question is whether they’re listening this time.
The Fitness Tracking Fumble
Let’s go back to that run. Fitness tracking on iPhone should be frictionless. Apple has had years, the Apple Watch ecosystem, and Health app infrastructure to nail this. Yet starting a workout, finding your pace mid-run, or pulling up a past run’s data has become a multi-tap scavenger hunt. Workout data is siloed across Health, Fitness, and third-party apps, and they don’t talk to each other cleanly. Compare this to Garmin or even Google Fit, where workout summaries surface the moment you’re done, and the gap is obvious.
Apple Intelligence was supposed to help here. “Workout Buddy” launched with iOS 26, but broader AI-powered health insights and personalized coaching remain shallow compared to what the hardware is capable of. Apple has world-class sensors on the wrist and world-class chips in the phone. The connective tissue, the intelligence that turns raw data into something genuinely useful, is still missing.
Photos: A Beautiful Mess
The Photos app redesign has been one of the most complained-about changes in recent memory. Apple reorganized everything (collections, albums, memories) in ways that broke muscle memory built over years. Finding a specific photo now requires knowing which of several new tabs to check, and the search function, while technically capable, often surfaces results that feel arbitrary.
This is particularly maddening because the underlying technology is impressive. Apple’s on-device photo recognition can identify faces, landmarks, pets, and objects with remarkable accuracy. But the interface buries those capabilities under layers of reorganized UI that even longtime iPhone users find confusing.
The Photos redesign is a microcosm of Apple’s current design problem: powerful technology wrapped in an interface that creates friction instead of removing it.
The AI Problem Apple Can’t Hide Anymore
Apple announced Apple Intelligence with enormous fanfare at WWDC 2024. A year later, the verdict from users and critics is the same: it fell short. News summaries had to be rolled back after the system began generating fabricated headlines. The personalized Siri upgrade, the one that was supposed to understand context across your apps and life, still hasn’t materialized at the level promised. The next-generation Siri that Apple teased remains a work in progress.
Meanwhile, Google’s Gemini is embedded directly into Android, offering contextual assistance that feels genuinely useful. Samsung’s Galaxy AI features have become a selling point in markets where Apple once dominated on brand alone. And in China, Apple’s most important single market, local competitors like Huawei and Xiaomi are shipping phones with AI models trained on local language and culture that Apple simply can’t match from Cupertino.
Apple built its AI on a promise of privacy: on-device processing that doesn’t send your data to the cloud. That’s a noble and differentiating principle. But “private AI” is only compelling if the AI is actually good. Right now, Apple Intelligence is neither powerful enough to be a reason to buy an iPhone, nor polished enough to be a reason to stay.
What AI Could and Should Do for the iPhone
Here’s the hopeful part: the potential is enormous, and Apple has the infrastructure to deliver it. The path forward isn’t just adding more AI features. It’s using AI to fundamentally fix the UX problems that have piled up.
A smarter Siri that actually knows you. The original vision of Siri, a true personal assistant that understands your calendar, your relationships, your habits, and your preferences, is still unrealized. Imagine asking “Did I take a photo of that restaurant menu last week?” and getting the right answer instantly. Or saying “Start my usual Thursday run” and having Siri launch the right app, set your goal, and connect your AirPods without three extra taps. This is table stakes for what AI should do on a smartphone in 2026.
Proactive UX that surfaces what you need before you look for it. The best AI doesn’t wait to be asked. Your iPhone knows you have a flight tomorrow. It should surface your boarding pass, the weather at your destination, and your rental car confirmation without you digging through emails. Google has been doing versions of this for years. Apple has the data (kept private, on-device) to do it better.
Photos that actually organize themselves. The Photos app should be able to answer “find the photo I took of my kid’s soccer trophy” or “show me all the menus I’ve photographed” without requiring perfect search terms or knowing which tab to look in. On-device vision models are capable of this today. The interface just needs to let them do their job.
AI-assisted fitness coaching. Apple Watch collects extraordinary health data. AI should connect the dots, noticing that your heart rate variability has been low this week, that you’ve been sleeping less, and suggesting you take an easy day before pushing hard on a run. That kind of proactive, personalized coaching would make the Watch-iPhone ecosystem feel genuinely irreplaceable.
Design that adapts to the user, not the trend. The most ambitious use of AI in UX would be an interface that learns how you use your phone and adapts accordingly. Frequent photographers get a Camera shortcut that learns your style. Heavy texters get an optimized Messages layout. Accessibility needs get automatically detected and accommodated. Instead of one design for 1.5 billion users, AI could enable an iPhone that feels personally tailored, which is, ironically, what Apple used to promise.
The Stakes Are Higher Than the Sales Numbers Suggest
Apple’s sales numbers look great on paper. Record shipments. Record market share. Record revenue. But the smartphone market is maturing, upgrade cycles are lengthening (the average iPhone user now holds onto their device for over three years), and the competitive pressure from Samsung, Google, and Chinese manufacturers is intensifying in ways that pure brand loyalty can no longer fully absorb.
The iPhone’s moat has always been the experience: the hardware, the ecosystem, and the UX, combined into something that felt worth paying a premium for. That premium is now harder to justify when fitness tracking is clunky, Photos is confusing, AI features are half-baked, and the new design language is generating more complaints than compliments.
Apple has the talent, the chips, and the on-device infrastructure to build the most intelligent, most private, and most intuitive smartphone experience in the world. The technology is there. What’s needed is a return to the discipline that made Apple’s UX legendary in the first place: less thrashing, more solving. Less announcing features before they’re ready, more delivering on the ones that are. Less aesthetics for aesthetics’ sake, more design that genuinely makes people’s lives easier.
The grip isn’t gone. But it’s loosening. And that’s exactly the kind of problem Apple has always been best at solving, when it decides to.
Have thoughts on Apple’s UX direction? The conversation is wide open.
#UXDesign #ProductDesign #Apple #iPhone #ArtificialIntelligence #HumanCenteredDesign #DigitalStrategy #MobileDesign #InteractionDesign #DesignLeadership #ProductThinking #ExplainableAI
I usually don’t pick on Apple products because they generally have good UX. But lately, things have been off.
Trying to track a run? Harder than it should be. Looking for a photo you know you took? Good luck. Simple tasks that used to feel effortless now feel like navigating a maze.
The iPhone is still the world’s best-selling smartphone. But is the experience still worth the premium?
I wrote about the UX cracks starting to show, the AI gap Apple hasn’t closed, and what needs to change. Fitness tracking, Photos, Siri, design choices that look great in keynotes and frustrate in real life. And what AI could actually fix, if Apple gets serious.
The grip isn’t gone. But it’s loosening.
Full article in the link. Would love to know if you’re feeling this too.