Apple is using artificial intelligence to quietly improve essential functions in its new gadgets, a contrast with other companies that promote AI as a sweeping transformation.
Apple showed off a new line of iPhones and a new watch without once using the term "artificial intelligence" to describe the technology inside them. The new AI features are powered by improved designs for the semiconductors in the devices, and they make everyday tasks like placing calls and taking pictures noticeably better.
The term didn't come up at Apple's developer conference in June, either. Yet AI has been quietly reshaping Apple's core software products for months.
By contrast, Microsoft and Alphabet's Google have set sweeping goals for the changes they want AI to bring. Industry leaders, meanwhile, have warned about the potential harm of letting new tools like generative AI grow without limits.
Apple gave the Watch Series 9 a new chip with stronger data-crunching abilities, including a four-core "Neural Engine" that can handle machine learning tasks up to twice as fast. The Neural Engine is Apple's name for the parts of its chips that accelerate AI workloads.
Those AI components also make Siri, Apple's voice assistant, 25% more accurate.
The machine learning hardware also enabled a new way to interact with the device: users can "double tap" by tapping the thumb and a finger of their watch hand together to answer or end phone calls, pause music, or pull up information like the weather.
The idea is to give people a way to control their Apple Watch when the hand not wearing it is busy, say, holding a cup of coffee or walking a dog. When a person taps their fingers together, the new chip runs machine learning models that pick up on the small movements and changes in blood flow the gesture produces.
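Apple has not published how its double-tap detection works, but the description above, combining subtle motion with blood-flow changes, can be illustrated with a toy sketch. Everything here (the function name, thresholds, and sensor streams) is hypothetical, not Apple's actual pipeline:

```python
# Toy sketch of a "double tap" detector (hypothetical; Apple's real model
# and sensor fusion are not public). It looks for two accelerometer spikes
# close together while the optical blood-flow reading also shifts.

def detect_double_tap(accel, blood_flow, accel_thresh=1.5,
                      flow_thresh=0.2, max_gap=10):
    """Return True if two motion spikes occur within max_gap samples
    of each other while blood flow changes enough to suggest a tap."""
    spikes = [i for i, a in enumerate(accel) if abs(a) > accel_thresh]
    flow_shift = max(blood_flow) - min(blood_flow) > flow_thresh
    for i, j in zip(spikes, spikes[1:]):
        if j - i <= max_gap and flow_shift:
            return True
    return False

# Example: two sharp motion spikes four samples apart, plus a small
# blood-flow change, register as a double tap.
accel = [0.1, 0.0, 2.0, 0.1, 0.0, 0.1, 1.9, 0.0]
flow = [1.0, 1.0, 1.25, 1.1, 1.0, 1.22, 1.1, 1.0]
print(detect_double_tap(accel, flow))  # True
```

A real on-device model would be a trained classifier rather than fixed thresholds, but the sketch shows why both sensor streams matter: motion alone could be a bump, while the blood-flow change points to the fingers actually pressing together.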
Apple also showed how its new iPhones take better pictures. The phones have long offered a "portrait mode" that uses computing power to blur the background, simulating a large camera lens, but users had to remember to turn it on. Now the camera detects when a person is in the frame and automatically captures the depth information needed to blur the background later.
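The depth-for-later-blur idea can be sketched in a few lines. This is an illustrative simplification, not Apple's algorithm: given a tiny image and a per-pixel depth map, pixels beyond the subject's depth are averaged with their neighbors to mimic a wide-aperture lens.

```python
# Minimal sketch of portrait-style background blur (illustrative only):
# pixels whose stored depth is farther than the subject get replaced by
# the average of their 3x3 neighborhood, softening the background.

def portrait_blur(image, depth, subject_depth=1.0):
    """image and depth are 2D lists of the same size; blur any pixel
    whose depth exceeds subject_depth."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            if depth[y][x] > subject_depth:
                neighbors = [image[j][i]
                             for j in range(max(0, y - 1), min(h, y + 2))
                             for i in range(max(0, x - 1), min(w, x + 2))]
                out[y][x] = sum(neighbors) / len(neighbors)
    return out

# A 1x3 strip: the rightmost pixel is "background" (depth 3.0), so it is
# blended with its neighbor while the in-focus pixels stay sharp.
img = [[10, 10, 40]]
depth = [[0.5, 0.5, 3.0]]
print(portrait_blur(img, depth))  # [[10, 10, 25.0]]
```

The key point the article makes is that the depth map is now captured automatically whenever a person is detected, so this kind of blur can be applied, or adjusted, after the fact.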
Apple is one of many companies building AI into their phones. Users of Google's Pixel phones, for example, can erase unwanted people or objects from photos.