Every year when Google and Apple release their latest operating system updates, there are always features that take inspiration from each other. Whether it’s a new customization, design or accessibility feature, someone has always done it first. Here are the five biggest iOS 16 features that Google made first and Android phones can do right now.
Smarter lock screen
I’ll be the first to admit that the new lock screen in iOS 16 is fantastic. The way Apple uses the depth effect to add realism to photos and let them interact with the clock is genuinely impressive. Having Apple Watch-style complications for health, stocks, battery, and weather is great, and you can change the font and clock/complication color to match the wallpaper or any color of your choice. It’s all up to you.
Google did a lot of this first. Its At a Glance widget intelligently surfaces similar information by predicting what you’ll need. It always shows the weather and the date, but it also smartly adds details like upcoming calendar events, severe weather warnings, or a boarding pass right before you catch a flight. In that sense it’s more powerful than Apple’s complications; the trade-off is that you can’t always manually choose what appears. The clock color can change too, pulling from the Material You palette that matches your wallpaper. Android 12 gives you four color scheme options, and Android 13 expands that to up to 12.
A smaller addition in iOS 16 is Live Activities, which lets apps pin a widget to the bottom of the lock screen with live information such as sports scores or how far away your Uber is. It looks a lot like the rich notifications that app developers have been shipping on Android for years.
The new iOS 16 lock screen looks good and works well, but it’s also something Android users have had for years. iOS users are lucky to get it now, though it’s safe to say Apple took heavy inspiration from Google.
Auto-share in photos
In iOS 16, the Photos app can now automatically share your family’s photos to a shared album that you all have access to. You can include all photos taken after a certain date or all photos that feature your family members, and there’s even a toggle in the Camera app that sends new shots straight to the shared album. Everyone gets equal access to add, edit, and delete photos.
Google Photos has been doing this for at least two years. Partner Sharing, Google Photos’ equivalent, lets you automatically share photos that include a chosen person. It offers much the same functionality as Apple’s feature, except it isn’t limited to Apple devices: since Google Photos is web-based, you can upload DSLR photos from any computer and share those as well.
Beyond that, Google also offers automatic albums you can share. These automatically add any photos you take of a certain person or pet to an album that can be shared with a link or directly through the app. You can even turn on collaboration so others can add their own photos, letting a whole group of friends automatically pool everyone’s shots in one album that everyone can access.
Google’s feature has been around longer and is still a bit more powerful than Apple’s. Luckily for iOS users, you can simply download the Google Photos app on your iPhone to get these features today, with no need to wait for iOS 16.
Smarter dictation with punctuation and user interaction
In iOS 16, dictation now lets you edit and interact with text while you dictate it. You can tap into the text and delete things, or simply tell the phone what you want changed, and it will do it. Punctuation is also filled in automatically.
These dictation features are a near-direct clone of Assistant voice typing on the Pixel 6 and 6 Pro, which offers the same mix of touch interaction while dictating, voice control over text you’ve already entered, and automatic punctuation.
From my use of both iOS 16 and Assistant voice typing, Google still has a major lead here. iOS 16 likes to put punctuation where it shouldn’t and still struggles to understand me correctly. This is only the first iOS 16 beta, though, so the feature will likely improve.
Multiple stops in Maps
Apple Maps now supports adding up to 15 stops along a route. This seemingly simple feature has been in Google Maps for years at this point. The only real difference is that Apple Maps supports up to 15 stops while Google Maps maxes out at 10. If you want multiple stops on iOS today, you can always download the Google Maps app on your iPhone.
Live Caption
Live Caption was introduced at Google I/O 2019, using Google’s speech recognition technology to generate captions for on-device content that didn’t already have them. It works in real time for any audio except phone calls, and in March this year, Google announced support for phone calls as well.
iOS 16 brings exactly the same functionality: it captions real-time audio in any app, including phone calls and FaceTime, and the interface even looks identical. In a quick test, however, it seemed a bit slower and less accurate than Google’s version.
Check out 9to5Google on YouTube for more info.