Is iPhone 16 with Apple Intelligence powerful enough to entice buyers?

Time will tell whether Apple did enough at its iPhone 16 lineup announcement to push customers to upgrade for access to new Apple Intelligence features.

Thanks to advance leaks, much of the new hardware in the four new iPhone 16 models was already known, including the new A18 chip for the iPhone 16 and iPhone 16 Plus, and the A18 Pro chip for the iPhone 16 Pro and iPhone 16 Pro Max. Those are the chips advertised as making Apple Intelligence features possible, with their faster cores and power savings.

Pricing might have been a bit of a surprise, with the top-of-the-line Pro Max with 512 GB at $1,399, putting it $100 higher than another GenAI-class phone, the Google Pixel 9 Pro XL (Porcelain, 512 GB), which goes for $1,300. (Apple will also sell a Pro Max with 1 TB for $1,599.) Still, the entry-level iPhone 16 will start at $799 with 128 GB, which could entice some people to upgrade.

What could be off-putting to those interested in the new AI features: Apple described rolling out Apple Intelligence features over the coming weeks, even though the phones launch Sept. 20. An iOS 18 upgrade is also part of what unlocks the intelligence. In its official press release, Apple notes Apple Intelligence is in beta, with “AI-opening possibilities/Coming this fall.” A footnote indicates Apple Intelligence will be available in beta on all four models, with the device language set to US English, as an iOS 18 update this fall, with “some features and additional languages…coming over the course of the next year.”

Analysts have questioned how big a deal Apple Intelligence will be. In May, a survey by Canalys found that 7% of consumers globally were very highly inclined to buy a GenAI smartphone, with another 18% saying they were highly inclined. About 40% were not inclined, or only “a little” inclined, to buy such a phone. Samsung and Google Pixel are already on the GenAI track, but some analysts have predicted greater interest in what Apple comes up with, because of its mostly closed ecosystem and self-developed processors, as well as its control over app development. The LLM parameters on Apple devices are not as extensive as on Android, TrendForce argued earlier.

“For years, artificial intelligence and machine learning have been essential in delivering so many of the features and experiences you love,” said CEO Tim Cook at the introduction of Monday’s announcement. “In June, we launched Apple Intelligence, our powerful new personal intelligence system, which will have an incredible impact.”

At least one Gartner analyst, Annette Zimmerman, declared after hearing the keynote that “there will be a huge demand for the iPhone 16.” She said she had earlier assumed purchases would come mainly from the group of iPhone users who always upgrade after 24 months, while those on longer upgrade cycles would wait until more features are released in the coming months, delaying any rise in sales. But that latter group will act faster, she argued.

She singled out AI-enabled visual features like Camera Control and Visual Control as important and predicted they “will become part of iPhone users’ daily lives.” Camera Control is a feature that lets users press a button on the side of the phone and slide from the telephoto lens to a closer-range lens, among other controls.

Despite a "huge list of upgrades," including new camera features, bigger screens, the new Action button and Apple Intelligence, IDC analyst Nabila Popal said Apple won't see a major bump in sales. "The bigger impact of Apple Intelligence will be next year when language support expands, when Apple establishes partnerships with local AI models in China...and as consumers get more familiar with AI features." IDC predicts 4% year-over-year growth in Apple shipments in 2025, up from nearly flat 0.8% growth in 2024.

"Bottom line, this is a long-term play for Apple, and while we may not see the bigger impact immediately, Apple Intelligence will eventually change the smartphone user experience completely, like with the first iPhone," Popal added. 

While Cook promised an “incredible impact,” Apple senior vice president of software engineering Craig Federighi illustrated Apple Intelligence with several examples enabled by iPhone 16 hardware. He grouped them under four headings: “Express yourself; Relive memories; Prioritize and focus; Get things done.”

In one example, Apple highlighted using Notes or Phone apps to capture audio recordings and transcripts, with Apple Intelligence then generating summaries of the transcripts to get the most important information. Also, the Image Playground app is described as allowing a user to create unique images in seconds based on a description, concept or a person from the Photos library. Image styles can be animations, illustrations or sketches.

Apple has also promised “richer language understanding and an enhanced voice to make communicating with Siri even more natural.” If a user stumbles over words, Siri will use AI to “know what you’re getting at,” Apple’s online materials say.

Another example: Siri can help you find your passport number while booking a flight, “without compromising your privacy.”

Federighi also described the ability to create original emojis with “endless possibilities” simply by typing a description. And, he said, searching photos will be easier, as Apple Intelligence will allow searching for a friend who was dancing in a red dress. It will even enable finding a specific moment inside a video on the user’s phone.

Apple also emphasized designs that protect personal privacy, which it said is integrated into the core of the iPhone through on-device processing. For more complex requests, however, Apple Intelligence relies on Private Cloud Compute, server-based cloud services running on Apple silicon.

Apple has also posted a five-minute video showing a series of use cases for Apple Intelligence on the iPhone.