You may not realise it yet, but your iPhone is already using artificial intelligence.
Companies like Google and Samsung actively promote their artificial intelligence (AI) features these days. But did you realise that your iPhone also uses AI in numerous ways you may not have noticed? Let’s look at some of them in more detail.
Copy Text From Images and Videos via Live Text
Using Live Text on your iPhone, you can copy text from any picture or video. Using machine learning and image recognition, it can recognise handwritten and typed text in a variety of languages, including Chinese, French, and German. Note that you’ll need an iPhone XS or later to access this feature, though you can also use it on your iPad or Mac.
To activate it, simply launch the Camera app, point it at the text you wish to capture, and tap the Live Text button in the lower-right corner of the viewfinder. From there, you can copy the text, translate it, or look it up online. Alternatively, you can use Live Text in the Photos app: open a picture or video, then long-press the text you want to select.
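To give a feel for what happens after recognition, here is a conceptual Python sketch (not Apple’s implementation) of the kind of post-processing a Live Text-style feature might do: take recognised text observations, drop low-confidence noise, and join the rest into copyable text. The observation structure and confidence threshold are invented for illustration.

```python
# Conceptual sketch (not Apple's code): Live Text-style post-processing
# of OCR results. Each hypothetical "observation" is a tuple of
# (text, confidence, vertical_position), in any order.

def extract_copyable_text(observations, min_confidence=0.5):
    """Keep confident recognitions, sort top-to-bottom, join into copyable text."""
    lines = [text
             for text, conf, _y in sorted(observations, key=lambda o: o[2])
             if conf >= min_confidence]
    return "\n".join(lines)

observations = [
    ("Hello, world", 0.98, 10),
    ("xQz#1", 0.21, 30),        # low-confidence noise gets dropped
    ("Call 555-0123", 0.91, 50),
]
print(extract_copyable_text(observations))  # prints two clean lines
```

On a real device, developers get similar results from Apple’s Vision framework rather than rolling their own OCR.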
People and Pets Recognition in Photos
The Photos app has a great feature that organises the people and pets in your library, making it really simple to find photos of a particular person. Since Apple places a high value on privacy, all of the features discussed in this article, including this one, use on-device processing for their AI tasks, and no data is uploaded to Apple’s servers.
Simply launch the Photos app and scroll down beneath the Albums tab to People & Pets. Tap the icon of anyone who frequently appears in your gallery to see a grid of every photo they’re in.
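Under the hood, features like this typically work by clustering face embeddings, i.e. grouping photos whose faces look numerically similar. As a hedged illustration (not Apple’s actual algorithm), here is a minimal greedy threshold clustering in Python; the 2-D embeddings and the distance threshold are made up for the example.

```python
# Conceptual sketch: grouping photos by face, the way a People-style
# album might, via greedy threshold clustering of face embeddings.
import math

def distance(a, b):
    """Euclidean distance between two embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def cluster_faces(embeddings, threshold=0.5):
    """Assign each photo to the first cluster whose centre is close enough."""
    clusters = []  # list of (centre_embedding, member_photo_indices)
    for i, emb in enumerate(embeddings):
        for centre, members in clusters:
            if distance(emb, centre) < threshold:
                members.append(i)
                break
        else:  # no existing cluster matched: start a new "person"
            clusters.append((emb, [i]))
    return [members for _centre, members in clusters]

faces = [(0.1, 0.2), (0.12, 0.19), (0.9, 0.8), (0.11, 0.21)]
print(cluster_faces(faces))  # photos 0, 1 and 3 group as one person
```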
Photonic Engine and Night Mode
We’re all aware that iPhones have great cameras, but the underlying software is just as crucial as the hardware.
Night mode is the feature that best showcases your iPhone’s software prowess. When you take a photograph in low light with Night mode enabled, the camera captures several pictures at various exposures and then uses machine learning techniques to combine them, bringing out the best features of each frame.
The Photonic Engine, included in the iPhone 14 and later models, is another fantastic illustration of how AI is used in iPhone photography. It applies computational techniques to uncompressed image data, making use of the larger sensors to enhance low-light photography and produce brighter, more detailed photographs.
I’ve included a photo below that I took on my iPhone 14 with Night mode enabled, to demonstrate the impact AI and computational photography can have.
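The merging step behind Night mode can be illustrated with classic exposure fusion: each frame’s pixels are weighted by how well-exposed they are, then blended. This is a toy Python sketch of that general idea, not Apple’s pipeline; the grayscale pixel values and the mid-grey weighting curve are assumptions for the example.

```python
# Conceptual sketch of multi-exposure fusion (in the spirit of Night
# mode's merge step, not Apple's actual pipeline). Each frame is a list
# of grayscale pixel values in [0, 1]; pixels near mid-grey (0.5) are
# considered well exposed and get more weight in the blend.
import math

def well_exposedness(p, sigma=0.2):
    """Gaussian weight that peaks when a pixel sits at mid-grey (0.5)."""
    return math.exp(-((p - 0.5) ** 2) / (2 * sigma ** 2))

def fuse(frames):
    """Per-pixel weighted average across all exposures."""
    fused = []
    for pixels in zip(*frames):  # the same pixel position in every frame
        weights = [well_exposedness(p) for p in pixels]
        total = sum(weights)
        fused.append(sum(w * p for w, p in zip(weights, pixels)) / total)
    return fused

dark = [0.05, 0.10, 0.40]    # underexposed frame
bright = [0.55, 0.90, 0.95]  # overexposed frame
result = fuse([dark, bright])
# Each fused pixel leans toward whichever frame exposed it better.
```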
Personalized Suggestions in the Journal App
The Journal app is one of Apple’s first efforts to promote mental wellness. Although most users might find the app rather simple and assume it doesn’t use any AI, a lot more is going on behind the scenes.
Using on-device machine learning, the app analyses your recent activity, including your workouts, music choices, and even the people you’ve been talking to. Based on this information, it makes tailored journaling suggestions that aim to reflect your current mood and state of mind.
All of this data collection may seem like a nightmare for your privacy, but according to Apple, every entry is end-to-end encrypted. Your data is processed entirely on the device, so it never leaves your iPhone.
Personal Voice
There are many accessibility options on your iPhone, but my personal favourite is Personal Voice. This tool helps people who are at risk of losing their ability to speak due to conditions like ALS.
When you enable the feature, you record roughly fifteen minutes of audio. Your iPhone then processes this overnight to create a synthesised voice you can use anywhere. It’s an astounding display of the power of the Neural Engine, highlighting the extraordinary possibilities of Apple’s on-device AI hardware.
Once configured, you can type what you want to say during FaceTime and phone calls and have it spoken aloud in your Personal Voice.
Image Descriptions
This accessibility feature is really helpful for people with visual impairments, even though it’s tucked away on your iPhone. If you have low vision, you can use Image Descriptions with VoiceOver to have your iPhone read aloud what it sees in an image.
The feature isn’t restricted to the Photos app, either: with VoiceOver enabled, you can use the Camera app to get real-time descriptions. Alternatively, you can use the Magnifier app; tap the Settings icon in the bottom-left corner, then tap the plus (+) button next to Image Descriptions to activate the feature.
Face ID
Because Face ID unlocks your iPhone so smoothly, it’s easy to overlook. But did you know that it uses the Apple Neural Engine to build a detailed 3D map of your face?
This process uses the TrueDepth camera, which projects and analyses over 30,000 invisible infrared dots across your face to gather depth data. These dots produce an accurate depth map that is used to build a comprehensive model of your facial structure.
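To make the idea of a depth-map comparison concrete, here is a deliberately simplified Python sketch: an enrolled depth map and a new scan are compared sample by sample, and the scan is accepted only if the average difference falls under a tolerance. Face ID’s real matching runs a neural network on the Neural Engine; the depth values and tolerance below are invented for illustration.

```python
# Conceptual sketch of depth-map matching (illustrative only; Face ID's
# real pipeline uses a neural network on the Neural Engine). The enrolled
# face and the new scan are tiny lists of made-up depth samples (metres).

def mean_abs_diff(a, b):
    """Average absolute difference between two depth maps."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def matches(enrolled, scan, tolerance=0.01):
    """Accept the scan if its depth profile is close enough to the enrolled map."""
    return mean_abs_diff(enrolled, scan) < tolerance

enrolled = [0.30, 0.28, 0.31, 0.29]        # enrolled depth samples
same_face = [0.301, 0.279, 0.312, 0.288]   # same face, tiny sensor noise
other_face = [0.33, 0.25, 0.35, 0.26]      # different facial geometry
print(matches(enrolled, same_face))   # True
print(matches(enrolled, other_face))  # False
```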
Predictive Text and Autocorrect
Before iOS 17, one of my main complaints about iPhones was how unreliable autocorrect was compared to Android. Fortunately, a lot has changed: the iPhone keyboard now runs a machine learning model every time you tap a key.
Another significant improvement is the keyboard’s ability to recognise the context of the sentence you’re typing and offer more precise autocorrect suggestions. Apple achieved this by adopting a transformer language model, which uses neural networks to analyse the relationships between the words in your sentences.
Remember that iPhones also use AI for Predictive Text, which offers inline suggestions to complete your sentences.
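As a rough intuition for context-aware correction and prediction, here is a toy Python sketch using bigram counts, a far simpler stand-in for the transformer model the keyboard actually uses. The word counts are invented for the example.

```python
# Conceptual sketch: context-aware autocorrect and next-word prediction
# using a toy bigram model (a stand-in for the keyboard's transformer).
bigram_counts = {
    ("good", "morning"): 9, ("good", "mornings"): 1,
    ("every", "mornings"): 3, ("tomorrow", "morning"): 5,
}

def best_candidate(prev_word, candidates):
    """Autocorrect: pick the candidate most often seen after prev_word."""
    return max(candidates, key=lambda w: bigram_counts.get((prev_word, w), 0))

def predict_next(prev_word):
    """Predictive Text: suggest the most frequent word following prev_word."""
    followers = {w: c for (p, w), c in bigram_counts.items() if p == prev_word}
    return max(followers, key=followers.get) if followers else None

print(best_candidate("good", ["morning", "mornings"]))  # morning
print(predict_next("good"))                             # morning
```

A real transformer conditions on the whole sentence rather than just the previous word, which is why the iOS 17 keyboard handles longer-range context so much better.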
Apple has quietly integrated artificial intelligence into many of its features, making your iPhone far smarter than you may have imagined. And while we’ve only covered a few hidden examples today, we anticipate that Apple will reveal a number of new AI features at WWDC 2024.