Chinese-market iPhones could feature AI powered by Baidu

Future iPhones in China could include AI features powered by Baidu's ERNIE chatbot.

Apple is apparently in talks with the Chinese web giant to integrate its machine-learning technology into iPhones sold in the Middle Kingdom, according to a Wall Street Journal report on Friday citing people familiar with the matter.

Cupertino has been exploring the integration of third-party models from Google (which, funnily enough, invented BERT) and OpenAI into its iDevices to bolster its AI ambitions. Doing so in China, at least, may run into obstacles.

Last summer Chinese authorities instituted rules requiring models to be reviewed by regulators prior to their public launch. This is presumably to ensure that guardrails are in place to prevent them from generating responses that are, ahem, incompatible with government policy.

Launched in early 2023, ERNIE is Baidu's attempt at a ChatGPT-style large language model and, as we reported at the time, it's self-censoring. When asked about Chinese President Xi Jinping, the chatbot stayed silent, and questions on other political topics, from Tiananmen Square to the treatment of Uyghur minorities, were rejected outright.

Apple already uses Baidu as the default search engine on iPhones sold in China, so it’s not surprising Cupertino would consider expanding that relationship to include AI.

If Apple does integrate ERNIE into its China-market devices, it wouldn't be the first to do so. Samsung is already using Baidu's ERNIE chatbot in Galaxy smartphones sold in China. In the rest of the world, the South Korean electronics giant is relying on Google's Gemini.

Until recently, Apple had largely avoided the term AI in its marketing, preferring machine learning to describe the capabilities baked into its smartphones, tablets, and Macs.

Apple's software already makes extensive use of ML, whether for processing photographs, performing object detection or optical character recognition, or suggesting apps based on your usage patterns. Wherever possible, these operations run on the device's integrated neural processing unit (NPU).
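To give a flavour of what that on-device work looks like through Apple's public APIs, here is a minimal, purely illustrative Swift sketch using the Vision framework for text recognition. It is a stand-in for the kind of feature described above, not Apple's own code; the OS decides whether the request runs on the Neural Engine, GPU, or CPU.

    import Foundation
    import Vision

    // Illustrative only: on-device optical character recognition via Vision.
    func recognizeText(in imageURL: URL) throws -> [String] {
        let request = VNRecognizeTextRequest()
        request.recognitionLevel = .accurate  // favour accuracy over speed

        let handler = VNImageRequestHandler(url: imageURL, options: [:])
        try handler.perform([request])

        // Each observation carries ranked candidates; keep the best string per line.
        return (request.results ?? []).compactMap { $0.topCandidates(1).first?.string }
    }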

Meanwhile, outside the Apple bubble, we’re being flooded with “AI” PCs and smartphones. Intel, AMD, and Qualcomm have all announced chips with integrated neural network accelerators to handle advanced models on personal devices rather than off in the cloud.

Possibly in light of this, Apple has started to embrace AI in its marketing. The iGiant touted its recently launched M3 MacBook Air as the “world’s best consumer laptop for AI,” and highlighted the ability to run LLMs and diffusion models locally on the machine “with great performance.”

However, it's not yet clear whether Apple intends to build LLMs into its operating system or to integrate them using API calls to remote models (a rough sketch of the latter approach appears below). We suspect we may learn more about Apple's AI strategy at WWDC in June. ®
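As a bootnote, here is what the remote-model route could amount to in its simplest form: a network call from the handset to a hosted model. The sketch below is hypothetical; the endpoint URL, JSON fields, and model name are invented for illustration and do not correspond to any announced Apple or Baidu service.

    import Foundation

    // Hypothetical sketch: the URL, payload shape, and model name are placeholders.
    struct ChatRequest: Codable { let model: String; let prompt: String }
    struct ChatResponse: Codable { let text: String }

    func askRemoteModel(_ prompt: String) async throws -> String {
        var request = URLRequest(url: URL(string: "https://example.invalid/v1/chat")!)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try JSONEncoder().encode(ChatRequest(model: "example-llm", prompt: prompt))

        let (data, _) = try await URLSession.shared.data(for: request)
        return try JSONDecoder().decode(ChatResponse.self, from: data).text
    }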
