Amazon tried the smartphone thing once. The Fire Phone launched in 2014 with Dynamic Perspective, a head-tracking interface nobody asked for. It bombed spectacularly. The company took a $170 million write-down and killed the project within a year.
Now they're trying again.
According to GeekWire, Amazon is building a new smartphone codenamed "Transformer" - and this time, the pitch is AI integration. The project is reportedly led by J Allard, the former Microsoft executive who helped shape the Xbox and the Zune. That last one is... not the most reassuring credential.
What Makes This Different
The Fire Phone failed because it was a solution looking for a problem. Dynamic Perspective was clever engineering with no practical use. The shopping integration felt forced. The app ecosystem was barren. It was a device built to serve Amazon's business model, not the customer's needs.
This time, the angle is mobile personalisation powered by AI. The phone would sync deeply with Alexa, learning usage patterns and surfacing contextual actions. The most ambitious part? Reports suggest Amazon might bypass traditional app stores entirely, building a system where AI agents handle tasks without needing individual apps at all.
That's either visionary or delusional, depending on how the execution lands.
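Nothing concrete is public about how such an agent layer would work, but the app-less model boils down to one architectural swap: instead of the OS launching an app for each task, a router maps a parsed user intent directly to a service integration. A purely hypothetical sketch, with every name and handler invented for illustration:

```python
from typing import Callable, Dict

# Hypothetical service integrations. On a real device these would be
# API calls to third-party services; here they are stubs.
def book_ride(args: dict) -> str:
    return f"ride booked to {args['destination']}"

def send_money(args: dict) -> str:
    return f"sent ${args['amount']} to {args['recipient']}"

# The agent layer: intents resolve to integrations, not to apps.
INTENT_HANDLERS: Dict[str, Callable[[dict], str]] = {
    "book_ride": book_ride,
    "send_money": send_money,
}

def handle(intent: str, args: dict) -> str:
    handler = INTENT_HANDLERS.get(intent)
    if handler is None:
        # The gap the article describes: any service without an
        # integration simply can't be reached from this phone.
        return "no integration available"
    return handler(args)
```

The sketch also makes the failure mode visible: the model only works for intents Amazon has negotiated an integration for, and everything else falls through to "no integration available".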
The App Store Problem
If Amazon actually attempts to replace apps with AI-driven task completion, they're not just building a phone - they're challenging the fundamental architecture of mobile computing. Apple and Google have spent 15 years training users to think in apps. Banking app. Email app. Messaging app. The mental model is deeply embedded.
An AI-first phone would need to be dramatically better at common tasks to justify the cognitive shift. Not incrementally better. Not "this is neat" better. It would need to make app-based workflows feel clunky by comparison.
The problem is twofold. First, most people don't find apps clunky. They're predictable, reliable, and familiar. Second, the businesses behind those apps have no incentive to let Amazon's AI layer replace their carefully designed interfaces. Why would a bank let Alexa handle transactions when it has spent millions building its own secure app experience?
Amazon would either need to build integrations with thousands of services - a massive, slow, politically fraught process - or accept that their phone works beautifully for Amazon services and awkwardly for everything else. That's the Fire Phone trap all over again.
The Business Model Question
Amazon doesn't make money selling hardware. The Kindle is cheap because it sells books. Echo devices are cheap because they drive Prime subscriptions and shopping. Fire tablets exist to get people into Amazon's content ecosystem.
A smartphone is a much higher bar. People expect flagship performance, premium cameras, long battery life, and seamless integration with their existing digital lives. That's expensive to deliver. If Amazon prices the phone as a loss leader, they need a clear path to recurring revenue. If they price it competitively with iPhones, they're asking customers to take a risk on an unproven platform.
The AI pitch only works if it unlocks something valuable enough to offset the friction of switching. What does this phone do that an iPhone with the Alexa app installed cannot? That's the question Amazon needs a compelling answer to.
Why This Might Actually Work
Here's the optimistic case. The smartphone market has been boring for years. Incremental camera improvements and slightly faster chips. The interface paradigm hasn't meaningfully changed since the iPhone launched in 2007. If AI can genuinely reduce the friction of daily tasks - booking travel, managing schedules, handling errands - there's a real opening.
Amazon also has scale that few companies can match. They have cloud infrastructure, logistics networks, payment systems, content libraries, and existing customer relationships. If they can tie all of that together into a cohesive experience, the phone becomes less about hardware and more about ecosystem integration.
And unlike 2014, the technology might actually be ready. Large language models can understand context and intent in ways that previous generations of voice assistants could not. Multimodal AI can parse images, text, and voice together. The gap between what people want their phone to do and what it can actually do is narrower than it's ever been.
That said - wanting something to work and making it work are very different things. Amazon's track record on consumer hardware is mixed at best. The Echo succeeded because it was cheap, simple, and solved a specific problem. A smartphone is none of those things.
We'll see if the second attempt lands differently than the first.