As Apple's annual fall event draws near and large AI models dominate the conversation, when Apple will release its own "AppleGPT" has become a question of keen interest.

After all, with Huawei leading and Xiaomi following close behind, Chinese smartphone makers are racing to put large AI models onto their phones. Apple, by contrast, appears far more "restrained" about the technology.

During a recent analyst conference following the release of their latest financial report, Apple CEO Tim Cook emphasized two key terms when discussing AI progress: "product" and "responsibility."

Cook mentioned that Apple has been researching generative artificial intelligence (AI) and other models for years, stating they are integrated into almost every product they manufacture. He further added that Apple will continue to invest and innovate, responsibly leveraging these technologies to enhance people's lives.

"Don't expect Apple to talk about AI like its peers," said an analyst who has long followed Apple's AI developments. According to CNBC data, compared to other tech giants who are vocal about their AI endeavors, Apple seems less inclined to tell more stories through AI in this technological wave.

Is that really the case? With tech giants worldwide rushing to embrace AI, it is hard to imagine Apple content to fall behind. The company's reticence about AI is, to a large extent, a product of its "vertical integration" and "hardware-software synergy" strategy.

To say that Apple will fall behind in this tech wave is premature, as Apple is gearing up for a new AI "counterattack." This article will focus on two key questions:

  1. How far has Apple's AI strategy progressed?
  2. Why is Apple so calm about AI?

01 Apple's AI Strategy: The Wasted "First-Mover Advantage"

For most people, Apple's AI journey began with Siri.

Siri had a promising start. Spun out of SRI International and acquired under Steve Jobs for upwards of $200 million, "Hey Siri" became what AI meant to many people of that era.

From the outset, Jobs envisioned Siri not just as a smart voice assistant but as a bet on a new mode of interaction. He believed voice-based interaction was a natural human inclination. Siri co-founder Norman Winarsky recalled, "Jobs recognized the power of that technology earlier than others, realizing the unique value of having a personal assistant that interacts with you as a real person would."

This meant that Siri's scope was broader than products like Google Assistant, which was primarily a search tool.

Jobs had established a strong first-mover advantage for Apple's AI strategy. However, subsequent vague positioning, a closed ecosystem, and management issues caused Siri to fall behind.

How did Siri squander its advantage and become perceived as an "artificial idiot"? It can be summarized in three steps:

The first step was ambiguous positioning, oscillating between being an "execution engine" and a "search product."

A former Siri employee mentioned that the initial hope was to shape Siri into a virtual intelligent voice butler, similar to "Samantha" from the movie "Her." However, after Jobs' passing, Apple brought in talent from other tech giants, leading to multiple shifts in Siri's positioning, gradually evolving it into a search engine.

Bill Stasior, who presided over Siri's rise and decline, had previously run Amazon's search and advertising business. He leaned toward making Siri a world-class search engine, wiring it into Apple's search resources to improve its performance. To be fair, a search engine and a smart voice assistant are not mutually exclusive; the question is whether the positioning serves the product's core objectives and delivers a better user experience.

In contrast, Google Assistant, despite its late start, has had Google's full support, serving as the vanguard of Google's "AI first" strategy. Google understood the early "task-oriented" AI needs of users and combined its search strengths to meet them.

The second step was internal friction. Poor management gradually marginalized Siri.

Bill Stasior may be one of the unluckiest "hired hands" in tech history: during his seven years running the Siri project at Apple, he reported to four different bosses.

The third step was the closed ecosystem. Apple's emphasis on secrecy and the closed nature of iOS meant external developers couldn't get involved, preventing Siri from forming an ideal data feedback loop.

It wasn't until the launch of iOS 10 in 2016 that this trend began to reverse, marking the beginning of Apple's new AI journey.

The reason iOS 10 is considered a turning point in Apple's AI strategy is twofold. On one hand, the strategy itself became clearer: rather than pinning its hopes on a single smart voice assistant, Apple wove its AI work directly into the iPhones running iOS 10.

On the other hand, in terms of data and ecosystem, Apple introduced SiriKit so third-party apps could plug into Siri, and rolled out "differential privacy," which adds statistical noise to usage data on the device before it is uploaded to Apple's servers, letting Apple analyze behavior in aggregate without identifying individual users and placing a strong emphasis on "privacy protection."
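
For a sense of the underlying idea (illustrative only; Apple's production mechanisms and privacy parameters are not fully public), local differential privacy can be sketched with the classic randomized-response technique:

```swift
import Foundation

// Minimal sketch of local differential privacy via randomized response.
// Illustrative only: this is not Apple's actual implementation.
// `epsilon` is the privacy budget; smaller values mean more noise and stronger privacy.
func randomizedResponse(_ trueValue: Bool, epsilon: Double) -> Bool {
    // Probability of reporting the true value rather than flipping it.
    let p = exp(epsilon) / (exp(epsilon) + 1)
    return Double.random(in: 0..<1) < p ? trueValue : !trueValue
}

// Each device perturbs its own answer before upload. The server can still
// estimate population-level frequencies, but no single report is trustworthy on its own.
let noisyReport = randomizedResponse(true, epsilon: 1.0)
print(noisyReport)
```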

On the developer side, Apple released two major tools: Core ML in 2017 and Create ML the following year. The former lets developers integrate trained machine learning models into their apps and run them efficiently on-device; the latter makes it easier to train those models in the first place.
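
To make that division of labor concrete, here is a minimal sketch of the workflow, assuming a hypothetical text classifier trained with Create ML; the function name, model file, and "text" feature name are placeholders that depend entirely on the model you actually build:

```swift
import Foundation
import CoreML

// Minimal sketch: load a compiled Core ML model and run one prediction on-device.
// "classify" and the "text" input feature are hypothetical; real names come from your model.
func classify(text: String, compiledModelURL: URL) throws -> MLFeatureProvider {
    let model = try MLModel(contentsOf: compiledModelURL)
    let input = try MLDictionaryFeatureProvider(dictionary: ["text": text])
    return try model.prediction(from: input)
}
```

Because the model ships inside the app and inference runs locally, no user data has to leave the device for the prediction itself.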

With everything in place, Apple officially began its "AI counterattack."

02 Embracing Openness: Apple's "Counterattack"

In fact, set against the loudly proactive AI postures of giants like Google and Microsoft, Apple's measured approach reflects both the history sketched above and the natural consequence of its "vertical integration" and "hardware-software synergy" strategy.

Understanding Apple as a product company or a supply chain enterprise makes it clear why Cook seldom discusses AI ambitions.

Apple's enduring value as a tech company rests on two pillars: a global industrial footprint, with a powerful "Apple supply chain" that guarantees stability and tight cost control; and consistently cutting-edge products (not always perfect, but always the subject of debate) that let Apple capture outsized margins and hold the dominant voice in the value chain.

Within this business model, understanding Apple's AI strategy reveals that the company has been quietly making significant strides.

Firstly, there's vertical integration.

Currently, Apple's AI investments span semiconductors, machine learning, voice recognition, facial recognition, and expression tracking, giving the company a broad base of AI applications. Apple rarely discloses these moves, largely because, inside its closed ecosystem, the AI companies it acquires are folded quietly into the "Apple kingdom" and placed firmly under its control.

Secondly, there's the synergy of hardware and software.

On one side, AI built into Apple's operating systems improves product performance and creates a differentiated user experience. Vision Pro, for instance, pushes human-computer interaction into the augmented reality era by doing away with the handheld controllers that ship with most current VR headsets, relying instead on a multi-modal combination of eye tracking, hand gestures, and voice.

At the same time, Apple excels at integrating its software capabilities to offer users a smarter experience. Moreover, due to the low marginal cost of software services, they are more likely to generate positive growth momentum.

On the other side, Apple's in-house AI chip development and hardware innovation capabilities also drive the implementation of local AI features on consumer electronic hardware platforms.

In 2014, Apple began designing a dedicated AI engine into its next-generation SoC architecture, offloading AI tasks from the CPU and GPU to this specialized silicon and eventually shipping it in end devices. The A-series chips have since advanced to the A16, and Apple's on-device AI capability has grown with them. To run more complex machine learning tasks efficiently on-device, Apple introduced the Apple Neural Engine, a dedicated neural processing unit.

Apple's goal is to push as much work as possible onto the Neural Engine, cutting memory consumption and delivering speeds the main CPU or GPU cannot match. Neural Engine-backed features span natural language processing, computer vision, augmented reality, video analysis, and photo management. They are woven throughout Apple devices and often go unnoticed by users, yet each one depends on both compute power and algorithmic support.
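
To give a sense of how apps express that preference, Core ML exposes a compute-units setting. The sketch below is a plausible configuration rather than anything Apple prescribes; it simply asks the framework to favor the Neural Engine and skip the GPU on newer OS versions:

```swift
import CoreML

// Sketch: steer Core ML work toward the Neural Engine where possible.
// The framework, not the app, makes the final per-operation placement decision.
let config = MLModelConfiguration()
if #available(iOS 16.0, macOS 13.0, *) {
    config.computeUnits = .cpuAndNeuralEngine   // newer targets: avoid the GPU entirely
} else {
    config.computeUnits = .all                  // older targets: CPU, GPU, and Neural Engine
}
// The configuration is passed when loading a model, e.g.:
// let model = try MLModel(contentsOf: modelURL, configuration: config)
```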

Furthermore, delivering AI features demands tight collaborative innovation across software and hardware. Take facial recognition, one of the field's flagship applications. Apple began shipping deep learning-based face detection on-device with iOS 10, and with the iPhone X it introduced Face ID for facial authentication. Implementing Face ID means balancing computational cost against privacy (facial data must stay on the device) and recognition accuracy, which raises the bar for both algorithms and hardware. Apple could pull it off largely thanks to the array of sensors and optics packed into the "notch."
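
From an app developer's point of view, all of that hardware-software machinery is hidden behind a very small API surface. Here is a minimal sketch using the LocalAuthentication framework, with a hypothetical "unlock notes" scenario:

```swift
import LocalAuthentication

// Sketch: gate an app feature behind Face ID (or Touch ID, depending on the device).
// The biometric match runs entirely on-device in the Secure Enclave; the app only
// ever receives a success/failure result, never any facial data.
func unlockWithBiometrics(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        completion(false)   // biometrics unavailable or not enrolled
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your saved notes") { success, _ in
        completion(success)
    }
}
```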

In other words, under the strategy of "vertical integration" and "hardware-software synergy," Apple aims to make "AI omnipresent."

03 A Tougher Innovation Landscape, More Competitive Rivals

Apple's unique business strategy and years of AI groundwork seem to place a higher emphasis on AI's practicality and usability compared to other tech companies. However, this doesn't mean Apple can rest easy.

Recently, from internal deployment of "Apple GPT" to poaching generative AI talent from tech companies like Meta, Apple's anxiety is evident.

This anxiety stems partly from the negative effects of the "vertical integration" and "hardware-software synergy" strategy.

The advantage of vertical integration is that Apple has built its own kingdom. However, the downside is that the closed ecosystem amplifies the challenges of management. Apple's key R&D departments have been experiencing significant talent attrition. According to The Information, many engineers and executives from Apple's chip department have left in search of better opportunities. Even Apple's Senior Vice President of Hardware Technologies, Johnny Srouji, has expressed concerns about this.

This has led to an awkward "innovator's dilemma" for Apple's hardware and software innovation. For instance, with the A-series chips, the progress of Apple's in-house chips has gradually slowed down in recent generations, and a series of product feature iterations have been criticized as "incremental innovations."

As Richard Kramer, a partner at research firm Arete Research, stated, "Apple has entered a phase of incremental reform." Previously, TF International Securities analyst Ming-Chi Kuo also mentioned that Apple's progress in generative AI significantly lags behind its competitors.

Apple's rivals are unanimously going all-in on AI in this tech wave.

Tech giants like Microsoft, Google, and Amazon are planning to weave generative AI into every corner of their businesses. Amazon CEO Andy Jassy has said the company plans to invest more than $50 billion in cash in 2023, with AWS's technology buildout a priority. Meta, for its part, has been exploring how to combine AI with its business since open-sourcing Llama 2.

Although Apple's latest quarterly report showed stable profitability and vindicated Cook's hardware-plus-software playbook, sluggish device demand and a global smartphone downturn mean Apple, like most tech companies, needs a new story to tell.

AI might be one of the few cards in Apple and Cook's hand. After all, as long as they're still in the game, the trillion-dollar Apple can always find its own solution.