Like every year, this new version of iOS (which is available for the iPhone 8 and newer starting September 12) is full of improvements and changes to nearly every app and screen on the iPhone. In recent years, that has been almost the whole story of a new release: iOS is a great, mature piece of software, and Apple is clearly not looking for excuses to reinvent the wheel. But this year, Apple found a part of its software that hasn’t gotten much attention recently and revamped it. The lock screen is the real star of iOS 16. Apple has completely reimagined its purpose, shifting it from a clock and a pile of notifications into something much more like a second home screen.

The lock screen widgets were an instant upgrade to my phone life: I can now see my calendar without unlocking my phone or even swiping right to that widget page everyone always forgets about, and I have a tiny widget that launches a new note in my notes app. My favorite iOS 16 widget comes from the Streaks habit tracker. I have a daily goal of “take 5,000 steps” (we’re still in a pandemic, I work from home, and 5,000 steps feels like an accomplishment some days), and now there’s a widget on my lock screen with a meter that slowly fills up as I approach that number. It’s a subtle reminder, every time I look at my phone, that I should probably get out and touch some grass.

The iPhone has never been good at these kinds of light interactions. Before iOS 16, most things required you to pick up your phone, unlock it, swipe to the right home screen, and open an app. Apple has tried to shrink this process through Siri voice commands, and part of the overall appeal of the Apple Watch is easier access to simple tasks. But “put a bunch of them on the lock screen” might be Apple’s best solution yet. And when you combine that with the iPhone 14 Pro’s always-on display, the iPhone becomes a source of useful information without a single tap required.

You can do a lot with a lock screen, but setting things up is a lot of work. Image: Apple / David Pierce

Apple isn’t done here, though. For one thing, these widgets are still annoyingly non-interactive: they can update with new information, but the only way to use them is to tap them to open their app. Why can’t I long-press the Calendar widget to see my entire day? Why can’t I tap the Streaks “drink water” widget to actually log my water consumption? The new Dynamic Island on the iPhone 14 Pro is a slight improvement in this regard, at least while you’re actively using the phone: you see a tiny strip of information in the pill at the top of the screen, and you can tap it to open the app or press and hold to expand it into a fuller view. But I’d rather just be able to play and pause music from the pill itself. Live Activities are a kind of interactive widget, too, with live updates of sports scores and more, but only a few first-party apps seem to use them so far. (Kudos to Clock, the eternal early adopter of iOS features.) In general, widgets are still basically app shortcuts, and I’d prefer them to be tiny apps.
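For context, these lock screen slots are built on the same WidgetKit framework as home screen widgets, just with new “accessory” sizes, and as of iOS 16 the framework only lets a widget display a timeline of content and deep-link into its app. Here is a rough sketch of what a Streaks-style step gauge involves on the developer side (this is not Streaks’ actual code, and the step numbers are invented):

```swift
import WidgetKit
import SwiftUI

// Hypothetical data model; a real app would pull steps from HealthKit or its own store.
struct StepsEntry: TimelineEntry {
    let date: Date
    let steps: Int
    let goal: Int
}

struct StepsProvider: TimelineProvider {
    func placeholder(in context: Context) -> StepsEntry {
        StepsEntry(date: .now, steps: 2_500, goal: 5_000)
    }

    func getSnapshot(in context: Context, completion: @escaping (StepsEntry) -> Void) {
        completion(placeholder(in: context))
    }

    func getTimeline(in context: Context, completion: @escaping (Timeline<StepsEntry>) -> Void) {
        // The system refreshes the widget on a schedule; the user can't poke it to update.
        let entry = StepsEntry(date: .now, steps: 3_200, goal: 5_000)
        completion(Timeline(entries: [entry], policy: .after(.now.addingTimeInterval(15 * 60))))
    }
}

struct StepsGaugeView: View {
    let entry: StepsEntry

    var body: some View {
        // The circular accessory gauge is the slowly filling meter you see on the lock screen.
        Gauge(value: Double(entry.steps), in: 0...Double(entry.goal)) {
            Image(systemName: "figure.walk")
        } currentValueLabel: {
            Text("\(entry.steps)")
        }
        .gaugeStyle(.accessoryCircular)
    }
}

@main
struct StepsWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "StepsWidget", provider: StepsProvider()) { entry in
            StepsGaugeView(entry: entry)
        }
        .configurationDisplayName("Step Goal")
        .description("Progress toward a daily step goal.")
        // The accessory families are the new iOS 16 lock screen slots.
        .supportedFamilies([.accessoryCircular, .accessoryRectangular])
    }
}
```

Notice that nothing in there can respond to a tap except by deep-linking into the app, which is exactly the limitation I’m complaining about.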
With iOS 16, lock screens are also a way to change Focus modes, which is super smart. I’ve never been one to change my background; I’ve had the same photo there for four years. But now I have a home and lock screen background for the work week and another for weekends, each paired with a Focus mode that turns off email and Slack notifications.

My work week lock screen shows information. The weekend lock screen is much more relaxed. Image: Apple / David Pierce

Setting up all of this is a fair amount of work (you have to choose a background, pick font colors for the clock, add lock screen widgets to each one, and then go through the Focus rigamarole all over again), but it’s worth it, because now I can just swipe between what amount to different versions of my phone. The Focus modes in particular still seem to require an advanced degree to get right, but the lock screens make for a nice context-switching mechanism, and I’m sold on it. My weekend lock screen is a picture of my dog, and every time I see it, something in my brain says “get off your phone and go outside.”

All the little things

It’s a long-standing and remarkably true joke that two-thirds of Apple’s new iOS features each year are just Android features from six years ago. Much of the other third is Apple taking features from third-party apps and baking them into the operating system itself. Aside from Apple occasionally pretending to invent decades-old software tricks, this is the right strategy: most users don’t want to download tons of apps or learn new things, and the more functional the iPhone is out of the box, the better it’s going to be for more people.

There is one place where Apple does things no other manufacturer or operating system can match, though, and that’s the camera. With iOS 16, you get Live Text in videos, which means you can capture some footage, pause playback (it doesn’t work while recording), and long-press on some text to copy it. It’s not perfect (it’ll occasionally decide “organic” is spelled “WACIGINIC”), but it’s good enough to be useful. Same with the feature that can automatically lift the subject, as long as it’s a human or an animal, out of a photo so you can paste or save it somewhere else. It works much better when your subject and background are cleanly separated, but I’ve been consistently impressed with how well it was able to mask my dog’s head off the couch or my face off the wall behind me.

With iOS 16, you can automatically separate a subject from the background. It mostly works! Image: Apple / David Pierce
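For developers, the text-recognition half of this is exposed through Apple’s VisionKit framework, so third-party apps can offer the same long-press-to-copy behavior on their own images. A rough sketch, assuming a plain UIKit image view (the function name is mine, and error handling is kept minimal):

```swift
import UIKit
import VisionKit

// Attach Live Text-style text selection to an existing UIImageView.
// Requires iOS 16 and a device that supports image analysis.
@MainActor
func enableLiveText(on imageView: UIImageView, for image: UIImage) async {
    guard ImageAnalyzer.isSupported else { return }

    // The interaction handles the selection UI (long-press, select, copy).
    let interaction = ImageAnalysisInteraction()
    interaction.preferredInteractionTypes = .textSelection
    imageView.addInteraction(interaction)

    // The analyzer does the actual recognition work.
    let analyzer = ImageAnalyzer()
    let configuration = ImageAnalyzer.Configuration([.text])

    do {
        let analysis = try await analyzer.analyze(image, configuration: configuration)
        interaction.analysis = analysis
    } catch {
        print("Live Text analysis failed: \(error)")
    }
}
```

Conceptually, the paused-video trick in Photos is the same move: grab the current frame, run it through the analyzer, and hand the result to the interaction.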
Beyond that, iOS 16 is full of semi-obvious features that Apple could and should have added a long time ago. Undo Send and scheduled sending in Mail are obvious additions; almost every other email service and app has offered them for years, but they work pretty well in Mail now. The same goes for Maps, which can now handle multiple stops in a single trip. It works well, although it’s not terribly advanced, and it makes you wonder what took Apple so long. But here we are. iOS 16 also comes with a handful of new accessibility features, including a really cool system-wide live captioning feature and some clever real-time image recognition.

Of all these quality-of-life improvements, two have made my phone life significantly better. The first is haptic feedback while typing. After weeks of using it and feeling that soft thump every time I hit a key, I don’t know how I spent so many years mashing my fingers against unmoving glass. I’m not sure it’s made me a better typist, but it’s a much more pleasant way to type. The second is the ability to mark conversations as unread in Messages. For too many years, my general behavior with text messages was to either reply immediately or forget all about the message and never get back to it. Now I can mark a message as unread and find it later. It’s still ridiculous that iOS 16 doesn’t let you filter to only show unread messages, but I’ll take what I can get.

How did I ever type without haptics?

In Messages, you can also now unsend and edit a message. If you and your recipient are both using iOS 16, it works seamlessly: the text changes in place, with a small blue “Edited” label underneath that you can tap to see every version of the message. (You can edit up to five times and for up to 15 minutes after you first sent it.) If you’re not running the latest Apple software on your hardware, you’re doomed to that horrible “David edited this message” text that Android users will come to know very well. The unsend feature, meanwhile, only works from iMessage to iMessage; there’s no taking back that text you sent to an Android friend. And don’t hold your breath for RCS to solve this problem.

Editing and unsending messages is great, as long as you remember to press and hold. Image: Apple / David Pierce

One feature I had high hopes for was the dictation improvements in iOS 16. In theory, you can dictate both more and better than ever: dictation now recognizes emoji, so saying “heart emoji” actually renders the heart emoji, and it tries to automatically insert punctuation (that last part is the one piece third-party apps can adopt directly; there’s a quick sketch of it at the end of this section). You can now also dictate and type at the same time, which is mostly confusing when you accidentally hit the mic button without realizing it and your text field suddenly fills with whatever background chatter the mic picked up. These features were so hit-or-miss that I stopped using them altogether. And honestly, if you can remember the names of all the emoji, you should be studied by scientists.

As phones have gotten bigger, Apple has started to shift the emphasis of the user interface toward the bottom of the screen. The URL bar in Safari, the Spotlight search bar, and all sorts of other tappable UI elements have moved down to spare your overstretched thumbs. It’s definitely the right idea, but the new layouts take some getting used to.
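About that punctuation: the automatic punctuation shows up for third-party apps as a new flag in the Speech framework, while the emoji handling, as far as I can tell, lives only in the system keyboard’s dictation and isn’t something you toggle in code. A minimal sketch, with authorization and error handling kept deliberately simple:

```swift
import Speech

// Transcribe an audio file with iOS 16's automatic punctuation turned on.
func transcribe(audioFileAt url: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(),
              recognizer.isAvailable else { return }

        let request = SFSpeechURLRecognitionRequest(url: url)
        request.addsPunctuation = true  // New in iOS 16: commas, periods, and question marks are inserted for you.

        // A real app would keep a reference to the task so it can cancel it.
        _ = recognizer.recognitionTask(with: request) { result, error in
            if let result, result.isFinal {
                print(result.bestTranscription.formattedString)
            } else if let error {
                print("Transcription failed: \(error)")
            }
        }
    }
}
```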