
Definitive review of Apple’s completely redesigned iOS 7 software update for iPhone, iPod touch, and iPad

iOS 7 represents nothing more and nothing less than a radical rethinking of the mainstream multitouch interface. A complete visual departure from previous versions, it focuses on clarity by removing all but the most essential elements and chrome, deference by getting out of the way of content and apps, and depth by building the entire experience around a physics and particle engine that moves, blurs, parallaxes, and layers in virtual 3D. It touches every app, every pixel, and every bit of the system. It’s far from perfect, and there are issues as superficial as icons and as deep as consistency yet to be overcome, but along with new features like Control Center and AirDrop, and improvements to Notification Center, multitasking, the Camera and Photos apps, Safari, Siri, and more, it’s the most exciting update to iOS in years, and to the mobile interface since the original iPhone. But it’s also facing the most competitive market ever and, given the alternatives, will it be enough?

Note: Some of this material was originally published in our iOS 7 preview, but was incomplete and outdated due to Apple’s non-disclosure agreement (NDA). It’s been fully updated, expanded upon, and refined here into our full-on iOS 7 review. Enjoy!

Previously on iOS

iOS was introduced back in 2007 with the original iPhone and has been expanded, refined, and improved ever since. Part of knowing where we’re going is knowing where we’ve been. Here are our reviews of past versions of iOS for iPhone and iPod touch, and since 2010, iOS for iPad as well.

Compatibility and updating

iOS 7 comes pre-installed on any new iPhone, iPod touch, or iPad. It’s available as a free update to anyone using an iPhone 4, iPhone 4s, or iPhone 5; an iPad 2, iPad 3, iPad 4, or iPad mini; or an iPod touch (5th generation). Not all features are available on older devices.

You can update over-the-air (OTA) on-device, or over USB using iTunes on Mac or Windows. OTA updates-in-place are typically the fastest, though setting up as a new device is typically the best way to get the best performance.

iOS 7 interface and experience get physical if not always beautiful

The biggest change to iOS 7, and the most important, is the system-wide redesign. With it, Apple has taken interface and experience from static to dynamic. It’s more nuanced than that, of course, but that you have to see it moving to understand how it looks and works reveals the essential truth of that statement. iOS 7 feels alive and vibrant. It’s the vision of Apple’s senior vice president of design, Jony Ive. Formerly restricted to hardware, he’s now responsible for both hardware and software, and his predilection for stripping away everything inessential until only the most authentic, most necessary elements remain is evident. The green felt is gone. The wooden shelves are gone. The stitched leather Steve Jobs was so fond of is gone. In their place is a lot of solid colors with only the subtlest of gradients and textures remaining. Architecturally, it’s laid bare. Design-wise, there’s nowhere left to hide.

If you’ve used iOS before, everything is going to look different after you update to iOS 7. The change is striking. Here are some examples showing iOS 6 on top and iOS 7 on the bottom, including the Lock screen, Home screen, and Notification Center. (Yes, the hands on the Clock icon on the iOS 7 Home screen really do move now.)

Say goodbye to richly textured themes in Game Center, Compass, and Newsstand. You’re not going to have green felt to kick around anymore:

And say hello to objects that exist in a virtual three-dimensional space and can be directly manipulated, like multitasking cards and Safari tabs, which behave like a souped-up version of Passbook passes.

Every screen of every app has been given a fresh coat of paint, from Calendar to Notes to Reminders. Every. One.

The redesign is based on three key principles: clarity, deference, and depth.

Depth is handled by the new physics and particle engine. The entire interface and experience is built on it. Screens no longer move because someone animated them; they drop and collide and bounce because of behavior ascribed to them. Likewise, icons fly in like a fleet coming out of hyperspace; apps and folders and days and months zoom in and out like portals into deeper worlds; chat bubbles bounce like balloons; cards knock together; and wallpapers and entire Home screens shift with your every movement, providing glimpses into what’s just below the surface.
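That engine isn’t just for Apple’s own apps, either: developers get it too, through the new UIKit Dynamics API. Here’s a minimal Objective-C sketch of the drop-collide-bounce behavior described above (the view controller and the `card` view are illustrative, not from any Apple sample):

```objc
// A minimal UIKit Dynamics sketch: instead of animating a view along a
// path, you attach behaviors to it and let the engine resolve the motion.
#import <UIKit/UIKit.h>

@interface CardViewController : UIViewController
@property (nonatomic, strong) UIDynamicAnimator *animator;
@end

@implementation CardViewController
- (void)viewDidLoad {
    [super viewDidLoad];
    UIView *card = [[UIView alloc] initWithFrame:CGRectMake(60, 0, 200, 120)];
    card.backgroundColor = [UIColor whiteColor];
    [self.view addSubview:card];

    self.animator = [[UIDynamicAnimator alloc] initWithReferenceView:self.view];

    // Gravity pulls the card down; a collision boundary makes it bounce
    // and settle against the bottom edge, the way the Lock screen and
    // Notification Center do.
    UIGravityBehavior *gravity = [[UIGravityBehavior alloc] initWithItems:@[card]];
    UICollisionBehavior *collision = [[UICollisionBehavior alloc] initWithItems:@[card]];
    collision.translatesReferenceBoundsIntoBoundary = YES;

    [self.animator addBehavior:gravity];
    [self.animator addBehavior:collision];
}
@end
```

Nowhere in that code is a duration, a curve, or an end frame; the card moves because physics says it should, which is exactly the shift iOS 7 makes system-wide.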

Gaussian blur shaders are used liberally throughout iOS 7 as well. So much so they seem to be on more than they’re off. The pixels below are sampled in real time, so if a banner moves in the App Store beneath Notification Center, you see it, blurred, moving beneath Notification Center. When you swipe between modes in the Camera app, the live preview image you’re looking at blurs as it transitions. When you start FaceTime – now a dedicated app even on iPhone – before you place a call you see your own image, captured by the front-facing camera, blurred and looking back at you. It’s computationally expensive enough to make a graphics engineer cry, but it’s also every bit as visually distinctive as the physics and particle animations.

Here’s what it looks like:

iOS has always stressed direct manipulation and 1:1 touch tracking, because it created the illusion of genuine interaction. Combine that with the new engines, and now the illusion is even better. You’re not just tapping buttons, you’re moving objects through their own virtual space. You’re not just flipping through a stack of tabs tediously drawn to look and move like cards, hiccuping and losing proper perspective as you go. You’re flipping real card-shaped objects that fly past, always in perfect perspective because they’re rendered to be. It’s so real, it begins to feel like a game. And that’s exactly the point. Real gamification is about enabling discovery through play. It’s about rewarding intuition with delight. It’s about making computing fun.

Here are some examples of directly manipulable objects in the form of multitasking cards, Safari tabs, and Passbook passes:

Deference is handled by getting rid of the heavy chrome: the obtrusive and unmoving title bars and tab bars and thickly delineated buttons of the past. Now everything is edge to edge, from the subtle animations of snow and rain and lightning in the Weather app, to the unified search field that minimizes and the controls that fade away in Safari, to the use of translucency so content can continue to provide context. It’s also philosophical, prompting developers to rely less on Apple’s default UIKit and paint the screen in a way that best suits their own tastes and apps.

Buttons have also been simplified to the point where, in many cases, I don’t even know if I can still call them buttons (though Apple does). They’re utterly without chrome or adornment of any kind, naked bits of sometimes colored text that trust in a new generation’s learned knowledge of multitouch.

Clarity is best highlighted by the new Text Kit, which allows fonts to dynamically scale not only in size but in weight so type always looks great, and people who want a bigger size for increased legibility can have it, screen size be damned. Text Kit isn’t getting the attention other elements of iOS 7 are getting, but it absolutely deserves it.
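For developers, the visible face of Text Kit is Dynamic Type: ask for a semantic text style instead of a hard-coded point size, and the system hands back a font scaled (and weighted) to the user’s preferred reading size. A minimal Objective-C sketch:

```objc
// Dynamic Type sketch: request a text style, not a fixed size.
// The system returns a font matched to the user's Settings preference.
UILabel *label = [[UILabel alloc] init];
label.font = [UIFont preferredFontForTextStyle:UIFontTextStyleBody];
label.numberOfLines = 0;

// Re-fetch the font when the user changes their preferred size,
// so the app updates without a relaunch.
[[NSNotificationCenter defaultCenter]
    addObserverForName:UIContentSizeCategoryDidChangeNotification
                object:nil
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
    label.font = [UIFont preferredFontForTextStyle:UIFontTextStyleBody];
}];
```

Apps that adopt this get the “bigger, heavier text everywhere” behavior for free, which is why Text Kit deserves more attention than it’s getting.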

It also highlights one of Apple’s best qualities. They look at problems that need to be solved, not solutions that keep being proposed. When people said they wanted multitasking, they didn’t mean they wanted battery-melting infinite processes. They wanted to listen to Pandora while surfing the web. So Apple made an API for that, and other high-demand background services, and has now created just-in-time multitasking to meet even more needs (see below). Likewise, when some people say they want a bigger screen, what they’re saying is they want to see more content and have text at a bigger size. Deference and Text Kit solve both those problems on devices of all sizes.

Here’s the new Lock screen, which shows the text, the translucency, and the physics-based wallpaper:

Speaking of which, the new default system font is Helvetica Neue, and it tends towards the ultra-thin at times. Why Apple didn’t go with their own, custom system font is a mystery, and while Helvetica Neue looks beautiful at times, it can be hard to read as well. Luckily the same Text Kit system allows you to easily scale and thicken it if and as needed.

There are other problems too. The icons all reference the same grid now, one that seems drawn from Apple’s hardware designs. They range from beautiful, like Photos, to unbalanced, like Safari, to background dependent, like Stocks and Voice Memos. Rumor has it they were specced out by the graphic design department instead of the human interaction department, something Jony Ive felt would bring fresh eyes and a new approach. It’s triggered some legitimate criticism and some change aversion both. Over the last three months most of the icons have come to no longer bother me, but flat or not, few leap out at me as genuine improvements. Also, the glyphs are thin to the point of looking fragile, and sometimes simplified to the point of non-obviousness.

Likewise, some of the interfaces are breathtakingly gorgeous, to the degree that even now I can’t stop staring at them. Everything from passcode entry to the dialer is palpably improved. Other interfaces, not so much. Particularly the status bar, which comes off as more cluttered and confusing now than at any point previously. Also, the cellular signal strength indicators, circles now instead of curved bars, convey the same information yet take up far more space.

Here’s a look at Safari, which shows the deference of receding interface, but also the status bar and glyphs:

It feels like some of the key ideas of iOS 7 – clarity, deference, and depth – were taken a step too far in some places, to the point where they get in their own way. Maybe that’s the way it works. Maybe it’s what happens when you’re sprinting so quickly towards something new that you can’t decelerate fast enough after crossing the line.

Hopefully that gets pulled back and polished in future versions. It’s hard changing something used and depended upon by hundreds of millions of people. Even with the massive changes in iOS 7, most major components remain spatially consistent. People familiar with where the phone icon was in iOS 6 will find it in exactly the same place in iOS 7.

The new buttons that look like naked text links on the web might confuse some people. The new location for Spotlight – swipe down on any Home screen – might frustrate those simply trying to find their apps, and the new way to access video – swipe left in the Camera app – might escape those simply trying to make a recording.

That’ll change quickly. Shock, if any, will pass. This is better. It sets the stage for the future and since Apple seldom looks back, they’ll drag the rest of us along with them. A week in, a month, a year, we’ll look back at old versions of iOS the way we look back at old video games.

iOS 7 also looks fantastic on the iPad. It should. The original iOS was designed in a pre-iPad world and was retrofitted onto its bigger screen. This is the first iOS designed for the iPad. It’s open. It’s expansive. It fills the screen without spreading itself too thin. You could argue it looks even better on the iPad than on the iPhone.

Right before iOS 7 was announced, I asked what was next for human interface. With iOS 7, Apple answered.

iOS 7 Siri gets Wikipedia, Bing, Settings, Twitter, but doesn’t get on-board processing

Siri sits on top of iOS as a secondary, natural-language interface layer: a personal digital assistant big on personality and partnerships, but challenged in reliability. With iOS 7, Apple has continued to add new services while redesigning everything that came before. Gone are the linen and the beautifully rendered sports, movie, and other widgets, and in their place is a starker, cleaner, and more translucent treatment. It’ll even fly in sample questions for you if you’re not sure what to ask. The resulting look is sometimes hauntingly great, other times murkily bad.

The principal new element is a sound wave that harkens back to Siri’s predecessor, Voice Control. It’s a fun visual. Not as fun is the heaviness of the text, which looks out of place compared to the thinner treatment found in the rest of the interface. It does help usability, however, and more specifically, glance-ability, which is more important than consistency when it comes to how Siri is used.

Siri gets two new, high quality voices in iOS 7. One is male, the other female. They’re not available in all languages yet, but it’s just a matter of time. Having new voices was increasingly important for Apple. The original female Siri voice wasn’t exclusive to Apple, which was an odd choice to begin with, and something others could use to graft themselves onto Apple’s attention, and competitors could use to tease them. Hopefully these new voices are original, and Apple’s alone.

New features include the ability to change Settings. While that’s geek-centric and mirrors another new iOS 7 feature, Control Center, it’s also welcome. Likewise, Siri can now access more communications features. Where previously Siri could find email and messages, and read messages, playing voice mail is a nice addition. So is the ability to find and show tweets. Hopefully Apple continues to expand on this until Siri can find, read, and otherwise interact with all messaging on iOS.

New services include Wikipedia and Microsoft’s Bing for Siri web search, especially image search. Some might assume that it’s just one more casualty in Apple and Google’s cold war, but Siri has always been a partnership play and it’s just as possible Microsoft offered Apple the best deal. What remains to be seen is how good the results are, because that’s the only thing that really matters at the end of the day.

Siri has also become persistent. Previously if you left Siri for any reason and then came back, all your previous results were gone. Now you can simply scroll backwards and see search results, movie listings, and whatever else you recently called up. This might seem trivial, but it’s incredibly useful.

What Apple hasn’t added is any local, on-device functionality for Siri. Google’s been doing this for a while, and it helps minimize network connections and backend servers as a point of failure. Basically, for any action that involves only the apps on the phone or tablet, for example, toggling a setting or adding a reminder, all voice parsing is done on the device. Only when a request needs to go online, like to check the web or check with a service, does it hit servers. Siri currently goes to the servers for everything, making it slower and subject to more failures than Google’s voice tech.

Siri also didn’t get any of the predictive assistant services Google Now enjoys. Like Google on Android (and in more limited fashion in the Google Search app for iOS), Apple on iOS can aggregate all sorts of calendar, location, environmental, and social data, and can synthesize from it where we are, where we need to be, with whom, and under what conditions. Instead of waiting for us to ask, Siri could be providing it preemptively so we don’t even need to ask.

Apple has shown they’re doing a little bit of that with Notification Center’s new Today screen, which will tell you the time (with traffic) to your next most likely location. Perhaps they’ll evolve a system complementary to Siri, rather than a component of Siri, to handle predictive assistance. That’d be a shame, though, since Siri has that Pixar-like personality that helps make assistant services accessible.

Ideally, a predictive Siri would replace the current notifications on the Lock screen, and the Today screen. Either way, Google seems closer to the movie version of Tony Stark’s Jarvis right now than Apple, and I hope that turns around, and soon.

iOS 7 Notification Center flirts with prediction, stays away from action

Notification Center could previously be pulled down from anywhere in iOS except the Lock screen. Now it can be pulled down from the Lock screen as well. Notification Center previously shared the linen texture indicative of the sub-layers below iOS. Now it shares the gaussian blur shader – smoked glass variant – indicative of the layers above. That makes it more consistent both visually and behaviorally. If privacy is a concern, however, Lock screen access can be disabled in Settings.

Because, like the rest of iOS 7, Notification Center uses the new physics engine, you can not only pull it down now but yank it and watch it collide with the bottom, bounce, and then settle into place. Fun. It also makes use of navigation gestures, in this case to move through three new, tabbed states: Today, All, and Missed.

All is similar to what Notification Center has shown since iOS 5, though the Weather and Stocks widgets have been moved to the new Today view, and the iOS 6 Tweet and Post to Facebook buttons are gone. Some might lament their loss, but they were out of place there. The wrong solution to a real problem. Unfortunately, no right solution has replaced them.

Missed is similar, but constrained to the last 24 hours. How useful that is depends on the volume and type of your notifications. Labeling it “Missed”, however, doesn’t seem to accurately describe its contents. “Recent” would be a better fit.

Today shows you the current day and date with a brief, written description of the current weather in your current location, and a written description of your next appointment. It can also tell you if current traffic conditions will impact your next trip. As visual representations of data go, it’s non-optimal.

The written out weather and next appointment are a step backwards when it comes to glance-ability, if a step forwards in terms of informational density. The graphical weather widget was easier to take in a glance, but provided little more than “sunny” or “rainy”. In a perfect world, Apple would find a way to balance both. Re-introduce a graphical element and keep the deeper text. Likewise with stocks, which used to scroll in one tidy widget, and now sprawls out row after row after row after row…

Integrating traffic information for frequent locations, on the other hand, is outstanding and hopefully only the first indication that Apple is heading towards a more Google Now-style implementation where they parse location, time, calendar, and every other metric they have at their disposal and present contextually appropriate, predictive alerts in Notification Center or in something even better that replaces it in the future. Hopefully the near future.

If you’re not a fan of your phone tracking you, which is how it predicts where you want to go and when, you can disable Frequent Locations in Settings > Privacy > System Services. Privacy, like security, is at perpetual war with convenience.

Beneath the text you get a more elaborate, more graphical look at Calendar, Reminders, and Stocks, as well as another written out description, this time recapping what’s coming up tomorrow. If you can make it down that far. Like in previous versions of iOS, you can turn off what shows up in Notification Center, and in the Today view specifically, and use Do Not Disturb to selectively make sure notifications don’t become annoyances.

Unfortunately, Apple still hasn’t added any gesture-based way to dismiss notifications. Other platforms have allowed you to swipe away notifications for a long time already. The immediacy of “tossing things away” is tough to beat. Hopefully Apple addresses this, because the tiny little X button is discoverable, but not very usable.

Apple has, however, added notification sync, so when you dismiss a notification on one device, it will dismiss it on all devices, so you don’t have to deal with the same alerts, again and again and again…

Perhaps the biggest omission in the whole system remains interactive notifications (sometimes called actionable notifications), which Apple just introduced for the Mac in OS X Mavericks but hasn’t added to iOS. The ability to quickly respond to a message, reset a timer, or otherwise handle simple items without having to switch apps is even more necessary on mobile than on the desktop. Android has had them for a while, so here’s hoping OS X is just a precursor to the same or similar system on iOS, and sooner rather than later.

iOS 7 Control Center provides quick if not customizable access to toggles

Quick access to system-level toggles has been something every power user has wanted since the day the original iPhone shipped. Some six years later, Apple gives us Control Center. Like Notification Center, Control Center is a layer that you can slide over the main iOS interface, including the Lock screen if you so choose. It enjoys the same bouncing, playful iOS 7 physics, and the same blur effect that mutes but doesn’t entirely obliterate what’s underneath. Unlike Notification Center, which comes from the top down, Control Center is activated by swiping up from beneath the screen, and rather than dark, smoked glass, it’s given a light, frosted treatment.

That Control Center functions so much like Notification Center, and even uses similar nomenclature makes it easy to understand, even for non-power-users who haven’t been lamenting its absence on iOS for years. It’ll give the obsessive compulsive among us nearly instant access to toggles we probably ought not be toggling all the time, but it’ll also give plenty of regular people a fast, easy way – and more obvious than the old fast app switcher controls – to get at things as simple as media controls and even a flashlight when they need them.

Control Center’s top row provides handy on/off switches for commonly used settings like Airplane mode (which, when turned on, will turn off the cellular radio), the Wi-Fi radio, and the Bluetooth radio, as well as toggles for Do Not Disturb mode, and the portrait/landscape orientation lock. Black means off, white means on, and a brief bit of text will show up to confirm it so.

The next row is a brightness slider, from dark to light, then media controls that includes a positional scrubber, the title of the track/episode you’re listening to or watching, the name of the album/series that track/episode is from, skip backwards or forwards buttons, pause/play, and a volume slider. If you tap the track title, you’ll be taken to whichever app is currently playing the media, be it Music, Podcasts, or something else.

If available, AirDrop and AirPlay occupy the next row, and allow you to quickly access sheets with their individual options.

The bottom row consists of icons to toggle the LED flash-cum-flashlight on or off, and variants of the Clock, Calculator, and Camera icons for quickly accessing those apps.

The wedding cake design is serviceable and keeps all the controls organized while avoiding clutter. The toggles on the top look good, though some of the lines lower down are thin to the point of fragility. The only downside is that Control Center isn’t customizable, at least not yet. If you’d rather have different toggles, like personal hotspot, or different fast app access, like Twitter, well that’s your tough luck, at least for now.

But, baby steps. I once wrote that iOS wasn’t meant for geeks, and while I still think that’s generally true, with iOS 7 and OS X Mavericks, Apple is now showing that they have more than enough love to go around.

iOS 7 Gesture navigation provides expert if inconsistent shortcuts

iOS 7 continues Apple’s long history of gesture-based controls, some system-wide like the new swipe up from the bottom bezel to open Control Center, and some app (or multi-app) specific, like the new swipe right from the left bezel to travel back up the hierarchy in Mail or Messages, or back through the history in Safari, or the new toss to close apps in multitasking or tabs in Safari. There are also fantastic new “peek” gestures that let you pull left just a little bit to see individual time stamps in Messages, or pull down to turn a notification banner into the full-fledged Notification Center. Gesture controls can be tricky, however. If not direct they can be hard to discover, if not consistent they can be hard to habituate, and if not carefully considered they can collide and conflict with each other, both system-wide and app specific.

For example, when Apple first introduced four-finger navigation gestures for the iPad, you could accidentally swipe your way out of Fruit Ninja and into Mail. Now, you can swipe up in Hue to try and manage your lights and end up with Control Center instead. You can disable Control Center from being accessible inside apps, but since not everyone will, developers have to assume it’ll stay on, and cede basic gestures to Apple and the system.

Because the swipe-right gesture appears limited to certain apps, namely Mail and Messages, it won’t collide with other apps already using that gesture. However, the way Apple is implementing the interface in iOS 7 in general, because of that gesture in Mail and Messages, could make other apps look odd. Especially ones that currently use the popular “hamburger button and basement sidebar” design (I’m looking at you, Facebook, Google apps, etc.). Even if iOS doesn’t stomp all over them, if they look wrong, or simply feel wrong on iOS 7, they may be forced to change and become more Mail or Messages-like. (And that might not be a bad thing.)

The good news is that all of these are direct manipulations. The bad news is that they’re not all consistent or symmetrical. Direct manipulations are more easily discovered than abstract gesture controls (which iOS stays completely away from for everything but accessibility), but in order for them to be habituated they need to be consistent. Notification Center is the perfect example. Any time, from anywhere, you can swipe down and what happens is exactly what you expect to happen – it appears. Control Center is the same.

The sideways gestures are where iOS 7 starts running into problems. First, because they’re only implemented in specific apps, they require the user to remember which apps include them. Worse, because they’re implemented inconsistently and asymmetrically across apps, they require the user to remember what they do in each app. That’s a high cognitive burden.

For example, in Safari – and in Photos, Calendar, Weather, and other apps before it – swiping from left to right takes you backwards through the sequence, and swiping right to left takes you forward. That’s logical and symmetrical. Even Camera, where swiping changes modes, moves through the modes in sequence and remains consistent.

However, in Mail and Messages, swiping from left to right doesn’t take you back through the sequence of messages, but up in the message hierarchy. You swipe back from message to message list to – in Mail alone – mailbox list. Where it gets more challenging is swiping from right to left, because not only doesn’t that take you forward through the sequence, it doesn’t take you deeper into the hierarchy either. What it does is switch from direct manipulation to quasi-abstract command, revealing a destructive action – delete. That’s not only asymmetrical (swiping in different directions results in different kinds of behavior) and inconsistent with other apps, it’s a massive contextual change.

Photos can have hierarchies with albums, Calendar days with months, so there’s some overlap, but Apple’s recognizing that hierarchies in Messages and Mail are far more important in real-world use cases than they are in other apps, and re-assigning the gesture. They’re also keeping it simple by not, for example, keeping a one-finger swipe to move through sequences of messages and using a two-finger swipe to move back up the hierarchy. That’s understandable and, in a world filled with trade-offs, sensible.

Switching from direct manipulation to go back to abstract command to delete is less understandable and sensible, but more a reflection of a legacy control Apple’s been using since iOS 1 (iPhone OS 1.0).

Here are some examples, with the Mail gestures (back vs. delete) on the left, Safari gestures (back vs. forward) in the center, and downward swipe gesture on Home (Notification Center vs. Spotlight) on the right:

In a perfect world, swiping from right to left from the edge would move you into whatever message you’re touching, while touching a message and holding would allow you to delete it, much like cards and tabs. Apple has used modal gestures before, for example an edit button that changes an upward movement from the general scroll gesture to a specific item re-arranging gesture. Likewise, swiping down from the bezel reveals Notification Center, but swiping down from the screen in Home reveals Spotlight search. It adds complexity but also functionality. Detect if the gesture started at or near the edge, and if so make it navigation. If not, if it started on the meaty part of an item in a list, make it editorial. It will require learning, but not much.
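iOS 7 actually hands developers a primitive for exactly this edge-versus-content distinction: the new UIScreenEdgePanGestureRecognizer. A minimal Objective-C sketch (the handler selectors are illustrative, not Apple’s):

```objc
// Sketch: distinguish an edge swipe (navigation) from a swipe that
// starts on the "meaty part" of a list item (editing).
UIScreenEdgePanGestureRecognizer *edgeSwipe =
    [[UIScreenEdgePanGestureRecognizer alloc] initWithTarget:self
                                                      action:@selector(navigateBack:)];
edgeSwipe.edges = UIRectEdgeLeft; // only fires when the pan begins at the left edge
[self.view addGestureRecognizer:edgeSwipe];

UIPanGestureRecognizer *rowSwipe =
    [[UIPanGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(revealDeleteButton:)];
// Defer the row swipe until the edge swipe has failed, so the two
// gestures never collide.
[rowSwipe requireGestureRecognizerToFail:edgeSwipe];
[self.view addGestureRecognizer:rowSwipe];
```

The plumbing exists, in other words; what’s missing is Apple and developers applying it the same way everywhere.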

The most important thing is consistency. Unless and until a swipe takes you back in every app where there’s something to go back to, it’ll always be harder to remember and become habituated to. Unless and until a forward swipe does something in every app where there’s a backward swipe, and there’s something to forward to, likewise. Unless you can pull a pass up out of Passbook as easily as you can shove one down and back into the stack… You get the idea.

For gestures to succeed for the mainstream, they have to always be where they’re expected, and always do as expected. With iOS 7, we’re only part way there.

iOS 7 Multitasking made intelligent

Mobile multitasking is all about compromise. You either limit what can be done by apps, or you limit the battery life of the device running them. iOS has always been fantastic at multitasking. It was built on the same foundation as OS X, after all. The very first iPhone demo showed Steve Jobs start some music, fade into a phone call, jump out to check the web and email, jump back to the call, and then fade back to the music. The compromise then was no third-party apps, and post iOS 2.0 and the App Store, no multitasking for third-party apps.

iOS 4.0 brought multitasking to App Store apps, but compromised on who got access and what they could do. VoIP, navigation, and streaming audio were wide open; everything else was tightly timed or still turned off. With iOS 7, Apple is trying to have their background and their battery life too, and they’re using some very smart technology to do it. Instead of simply allowing persistent, pre-emptive multitasking like OS X does on the desktop, and like some competitors do on mobile, Apple is recognizing that they have neither a power cable plugged into the wall, nor a desire to offload battery and task management to their customers, and they’re using a dynamic, just-in-time system to try and get the best of both worlds. Here’s how it works:

Intelligent scheduling lets apps you use frequently – for example, Facebook or Twitter if you check them near-constantly – to update frequently so whenever you launch them, they’ll have all the latest information ready and waiting for you. Apps you use regularly but not frequently – for example, if you check the news when you wake up and before you go to sleep – can update just before you typically check them so they use less power but still have the information you want, when you want it.

Opportunism and coalescence let apps take advantage of circumstances to update efficiently as well. For example, apps can update during any of the very many times a day you unlock your device and the system is powered up. Apps that require it can update when your radio signal is strong and power requirements are at a minimum. And if and when something like GPS gets powered up for one app, other apps that need it can tag along for the ride and get their updates handled as well.

Where previously you’d get a push notification, go to the app, and then have to wait for the app to download the data, now push triggers prompt background updates so that the data is ready and waiting by the time the app opens. At least in theory. Developers can even send silent, invisible push triggers to wake their apps up for updates, which greatly increases the feature’s usefulness.
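For reference, a silent push is distinguished by the `content-available` flag in the notification payload. A minimal payload might look like this (the key names follow Apple’s push notification format; the point is what’s absent – no alert, sound, or badge keys):

```json
{
  "aps": {
    "content-available": 1
  }
}
```

Because nothing user-visible is attached, the device shows nothing; the app is simply woken in the background to go fetch whatever the push was announcing.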

With the iPhone 5s specifically, the M7 motion coprocessor will persistently track accelerometer, magnetometer (digital compass), and gyroscope data without the need to power up the main processor. Apps can then pull that data, which essentially gives them full background access without the need to actually be open and consuming resources in the background.

This all works based on the concept of perception being reality. It doesn’t really matter when an update happens as long as it happens before we see it. That’s what makes just-in-time so much more efficient – and so much less wasteful – than all-the-time.

All of this sounds great in theory, but it remains to be seen how well it will work in practice, especially at first. As more and more developers integrate the new multitasking features, and Apple continues to improve the system, it should become better and better.

The new multitasking interface, however, is a huge improvement right now. The old fast app switcher was never a great solution. Apple tested other metaphors for iOS 4 before settling on it, including something like OS X’s Exposé, but Safari pages and, especially, webOS cards always felt like a better solution. Cards not only match the physicality of iOS 7 in general, they’re something with which almost everyone is already familiar.

Not that it’s perfect yet. There’s a Home card, for example, that might help ensure mainstream users aren’t confused about how to find the Home screen, but there’s already a Home button for that. All the Home card does in card view is break the metaphor (how can the card view sit on top of Home when Home is in it?).

Unlike some other platforms, cards aren’t kept “live”. You can’t watch a video play in card mode, for example, and websites don’t seem to update if you just sit there staring at Safari either. It’s arguable that live cards aren’t necessary and aren’t a great use of resources, but, like constant blur filters, they can be an impressive effect.

Also, in webOS, every instance of an app could have a card. For example, you could have multiple web pages open at the same time in card view, or multiple email drafts ready and waiting. Multiple web pages would quickly overrun the interface, however, and are better handled in Safari’s rolodex. webOS used Stacks to organize sets of cards. Again, greater complexity, but greater functionality. Right now, simpler feels better.

Thankfully, Apple did duplicate the webOS method for closing apps. Instead of holding apps down until they jiggle, and then hitting the little X badge – which conflated the action with deleting apps from the Home screen – you simply touch and hold a card and then toss it up and away. You can also toss multiple cards away at once (up to three – the maximum shown on screen at any time). And no, there’s still no option to “kill all apps”, because you don’t ever need to “kill all apps” even if sometimes it’s a fast way to troubleshoot rogue processes.

In addition to the new card interface, Apple also retained the old fast app switcher’s icons, placing them at the bottom of the cards. Cards capture static views from the apps they represent, but those representations might not be immediately recognizable. One mostly white page can be hard to differentiate from another. Icons are made to be recognizable, even at a glance. Cards and icons together provide both greater information and faster recognition. Win. Win.

Back before iOS 6, I hoped for a better fast app switcher. With iOS 7, Apple delivered.

iOS 7 Camera gets real-time filters… and a square

Like much of iOS 7, the Camera app has gotten a complete makeover, but for the most part it has remained spatially consistent with previous versions. The shutter button, flash button, camera-switch button, and photo thumbnail are all exactly where they used to be. The Options button has been replaced by a dedicated HDR button, however; panorama has moved, and the grid toggle has been banished to Settings.

The method for changing between still and video has changed as well. Instead of a binary switch, you now swipe left to go from the still to the video camera, swipe right to get to the new square mode (cropped stills), and swipe right again to get to panorama. Taking the place of the old still/video switch is the new filters button. There’s a real-time blur effect between each mode, of course, just for good measure.

On the iPhone 5s you also get a burst mode on the still camera using the same shutter button, 3x video zoom, and an additional video camera – 120fps slow motion.

Taking photos on iOS 7 is lightning fast. Gone is the old shutter-closing animation; new is a fade-to-white-and-back so fast that if you blink you might miss it. You can just tap, tap, tap, and take photo after photo after photo. Did I mention how fast it is? Even high-dynamic range (HDR) is noticeably faster, though still much slower than non-HDR photos.

On the iPhone 5s, if you hold your finger down on the shutter button it’ll take bursts of photos at 10fps (10 photo frames a second). Instead of overwhelming you with tens of photos per second, however, Apple leverages the new A7 to automagically choose and present the best ones, including the multiple highlights of an action shot, if available, yet still lets you dive into all the shots if you ever want to pick your own. That’s a great example of providing primary level ease of use, and secondary level expanded use, and how these types of features should be done. By everyone.
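Apple hasn’t published how the A7 actually scores burst frames, but the general shape of the feature – surface the best shot up front while keeping the whole burst browsable – can be sketched with a toy scorer. The per-frame “sharpness” and “eyes open” signals here are entirely hypothetical stand-ins:

```python
# Toy "best shot" picker (illustrative only -- Apple's burst analysis
# on the A7 is not public). Each frame carries made-up quality
# signals; the best-scoring frame is surfaced first, but the full
# burst stays available for manual picking.

def pick_best(frames):
    """Return (best_frame, all_frames) so a UI could show one shot
    up front while keeping the whole burst browsable."""
    best = max(frames, key=lambda f: f["sharpness"] + f["eyes_open"])
    return best, frames

burst = [
    {"id": 1, "sharpness": 0.40, "eyes_open": 1.0},
    {"id": 2, "sharpness": 0.95, "eyes_open": 1.0},
    {"id": 3, "sharpness": 0.90, "eyes_open": 0.0},
]

best, everything = pick_best(burst)
print(best["id"], len(everything))  # 2 3
```

The design choice worth noting is in the return value: the automatic pick is a default, not a decision – every frame survives for the user who wants to choose their own.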

The new filters apply to the still and square cameras; they don’t apply to the video or panorama cameras. Filters are live, so you’ll see them in the preview exactly the way they’ll look when the photo is taken. They’re subtle, as filters go: Mono, Tonal, Noir, Fade, Chrome, Process, Transfer, and Instant. There are three types of black and white, one desaturated, one oversaturated, and one each that tints toward blue, red, and green. Nothing blown out, nothing vignetted, and nothing overly dramatic.

Here’s what the iOS 7 black and white filters look like compared to the black and white filters from Instagram, Google+, Twitter, and Camera Noir, in order:

There’s no tilt-shift, no frame or border effects, and no sliders for controlling the amount of filtration. Depending on your tastes, that’s either a huge negative or a huge plus. I’ve wanted Apple to co-opt filtering for a while now, given how many other apps were piling on the feature. Whether this helps calm that down, or only spurs it on further, remains to be seen.

Alongside the new square mode, the new filters highlight the immense influence Instagram has had on mobile photography. I don’t mind it at all. When not using Instagram I’ve still had the urge to square-cut photos anyway, and having to do it in post with the crop tool is less than elegant.

The new eye candy and the new features are fine. It’s the new speed that’s killer.

iOS 7 Photos filters your life into years, collections, and moments

After 6 years and 6 versions of iOS, the trusty sunflower icon that’s come to represent the Photos app has been retired and a new, more abstract, multi-color “flower” has taken its place. So has a completely new interface metaphor. Back in 2007, the ability to pinch-to-zoom photos was one of the major multitouch selling points of the original iPhone. Back in 2010, so was the ability to peek into stacks of photos on the iPad. What was once done by genius animation is now done by iOS 7’s new physics engine. You can still swipe. You can still pinch. You can still peek (though it’s a little clumsy right now). But after 6 years and 6 versions, Apple is also replacing the default view, the never-ending linear chronology of images known as the Camera Roll, with a new Photos view, divided into Years, Collections, and Moments, that automatically organizes your images based on time and location.

Moments divides up your photos more completely, introducing breaks for every major change in time or place. Photo thumbnails are roughly the same size as they were in the old Camera Roll, fitting four across in portrait mode.