When Steve Jobs introduced the original iPhone in 2007, he had to convince millions of people that a touchscreen was a better input method than a physical keyboard. He pulled it off by mimicking physical targets on the screen, in the form of buttons placed in specific areas. While this choice eventually degenerated into the much-hated “skeuomorphic” design, it was arguably the easiest way to bridge the gap between the physical keyboard/keypad metaphor and the touchscreen one. Indeed, exceptions to this “on-screen button” metaphor were few and far between, usually to make space for a scrolling interaction (e.g. the Music app, or the date picker). Heck, apps themselves are buttons occupying a fixed space on the screen, and initially they could not even be moved around.
All this made sense in a world where the touchscreen was still a fancy but unusual UI: buttons are ubiquitous, familiar, and give users clear targets on the screen. They also imply that interacting with this thing on the screen will result in some kind of action.
Apple eventually abandoned skeuomorphic design with iOS 7. I assume you know iOS 7 inside out by now (otherwise have a look here, I’ll be waiting), so I won’t repeat the obvious. But I believe that under the skin of iOS 7 lies a fundamental revision of the button metaphor as we know it. And while buttons will still be around, the original physicality of the iOS screen, with static targets patiently waiting to be activated, is on its way out.
The beginning of the end: Notification Center and iOS 7
I regard Notification Center as the first sign of this design change. Introduced with iOS 5, Notification Center violates the principle that each screen represents a physical slate that the user can either interact with or leave. Indeed, there is no specific target on the screen to activate it (it’s not a thing on the screen). You need to swipe from the top, and whether you swipe down from top-center, top-left or top-right, the system does not care. That’s a start.
Where this concept took off is in iOS 7, particularly with Control Center and the new Spotlight search tool. Control Center is my favorite app, even though it isn’t an app. It gives quick access to common actions and settings with a swipe up from the bottom of the screen (again, no horizontal target).
After swiping, your targets will be, well, buttons. But as with Notification Center, there’s no sign that Control Center is there, and activating it is more “location agnostic” than other iOS elements (and apps in particular). Also, Control Center can be activated from anywhere, even from inside an app. The new Spotlight is similar, allowing you to search for apps or information with a swipe down from the middle of the screen.
Another interesting way iOS 7 changes buttons, without abandoning them, is visible in Mail. In Mail you can swipe down to refresh the mailbox, or swipe left in list view to archive or delete (more is coming with iOS 8).
Finally and most importantly, Apple introduced a system-wide swipe-right-to-go-back gesture, again allowing for a different interaction for something that used to need the push of a button.
iOS 8 and 3rd Party Apps
With iOS 8, Apple is set to take this design approach to the next level. This post is not meant to be a review of iOS 8, because a) I have never actually used it, and b) in a few weeks there will be a ton of great reviews flocking around the internet. I will just list here some of the key areas where this trend is clearly visible:
- Interactive Notification Center: the new rendition of the Notification Center allows for quick interaction by just swiping down from the notification banner.
- Multitasking screen: triggered by a double tap on the home button (as in iOS 7), the new multitasking screen is now more interactive, showing favorite and recent contacts you can tap to call or text (you might argue that the pictures are in fact buttons; still, they’re not necessarily positioned in a specific space on the screen).
- Messages: while the most celebrated new feature is group messaging, most of the audio messaging interaction is gesture-based. Video messaging seems to work in a similar manner, though Apple did not show a live demo at WWDC. On the Apple website the “wheel” for video recording and sharing is to the left of the screen (while the microphone icon is to the right, just above the keyboard), which might mean that it is also activated via a gesture.
- The swipe-to-delete gesture that made its way into the iOS 7 Mail app is now even more pervasive, and settings typically no longer show buttons but reveal options with just a swipe to the left. For instance, this is how you remove people from group messages. Most interestingly, it looks like most iOS features are now reachable without ever tapping the top part of the display (more on that in a second).

It’s also important to note that this trend is not unique to Apple. Indeed, Apple has largely drawn inspiration from 3rd party apps that have experimented with gesture-based interaction. Some obvious examples include Unread, Clear, Listen, WhatsApp and Snapchat, among others.
Assuming I’m right in spotting this trend of a) removing the need for buttons to interact with iOS or b) removing the need for the buttons to be on a specific point on the screen, the next obvious question is why. Why would Apple move away from an easy way of interacting with iOS, one that arguably sustained the initial success of the iPhone and iPad?
I believe the answer is the rumored larger screen of the new iPhone. A 4.7" iPhone is widely expected to be introduced this fall (likely on Sep 9th). But, as many Android users confess to themselves in the mirror, creating a UI for a large-screen phone is no easy-peasy job. The reason is the size of our hands, which have not grown an inch since the iPhone’s introduction 7 years ago (see this old post from Dustin Curtis for a visual explanation, still based on the 3.5" iPhone 4).
You might argue that iOS is already working on the larger screen of the iPad, but the iPad is a completely different ballgame, being a two-handed device. And in any case, iOS on the iPad is different from iOS on the iPhone (just as an example, think about the keyboard design and behavior).
The benefits of a larger screen are obvious. More screen real estate improves the reading experience and video playback, allows a larger portion of our emails to be previewed, and so forth. But it also comes with a major drawback: it makes the edges of the screen harder to reach while using the device with one hand. Steve Jobs was famously convinced that 3.5" was the perfect screen size for one-handed usage (though he most likely gave the green light to the project that later became the 4" iPhone 5). Tim Cook, when asked about the possibility of a larger-screen iPhone, mentioned “trade offs” such as “resolution, color quality, brightness, reflectivity, screen longevity, power consumption, portability, compatibility with apps, many things”. He added that Apple “would not ship a larger display iPhone while these trade offs exist”.
While Cook’s technical trade-offs should by now have been addressed through component and process improvements, the one-handed usage problem remains. How to fix that? How to tweak the iOS UI for great one-handed usage AND a larger display? I’m sure you have connected the dots by now. I believe the answer is to reduce the need for tapping buttons, replacing them with gestures. When buttons are still needed, make them location agnostic (or at least keep them closer to the lower half of the screen). For example, let me activate a “wheel” to create and share my videos by swiping from the side of the screen, without ever having to reach the top.
Obviously, all this is just a wild guess. I have no insider knowledge of what Apple is going to do, nor am I a UI expert by any means. And the issue remains of what happens to the swipe from the top that activates Notification Center (although with interactive notifications the basic activities are available without the swipe-from-top gesture).
I guess time will tell. In the meantime, feel free to point out how wrong I am in the comments.