This post first appeared on Medium.

When Steve Jobs introduced the original iPhone in 2007, he had to convince millions of people that using a touchscreen to interact with a phone was better than using a physical keyboard, which was the common interface for the — don’t laugh — market-leading Nokia (with Symbian) and RIM (with BlackBerry). By and large, that involved mimicking physical targets on the screen, mostly in the form of fixed buttons placed in a specific area of the screen. While this fundamental design choice later degenerated into the largely hated “skeuomorphic” design championed by Scott Forstall (and Steve Jobs himself), it was arguably the easiest way to bridge the gap between the physical keyboard/keypad metaphor and the touchscreen one. Possibly for this reason, the exceptions to this “on-screen button” metaphor were few and far between, mostly to make space for a scrolling interaction as in the music (at the time “iPod”) app or in the date picker. Heck, apps themselves are buttons in a fixed space on the screen, and initially they could not even be moved.

This made sense in a world where the touchscreen was still a fancy but unusual UI: buttons are ubiquitous and familiar, and they give users clear targets on the screen while inherently implying that interacting with this thing on the screen will result in an action.

Skeuomorphic design was famously abandoned by Apple with iOS 7, which adopted a much cleaner and “flatter” interface. I assume you know iOS 7 inside out by now (otherwise have a look here, I’ll be waiting), so I won’t repeat the obvious. However, I believe that under the skin of iOS 7 lies a fundamental revision of the button metaphor as we know it. I don’t mean that we won’t have buttons anymore, but I think that the original physicality of the iOS screen, with static targets patiently waiting to be activated, is on its way out.

The beginning of the end: Notification Center and iOS 7

I regard Notification Center as the first sign of this design change. Introduced with iOS 5 (and not an original concept by any means, having been used in Android for quite some time), Notification Center violates the principle that each screen represents a physical slate that the user can either interact with or leave, typically via the home button or on-screen buttons. While the initial Notification Center was mostly a passive experience, what sets it apart from anything else in iOS is the absence of a specific target on the screen for it to be activated. In other words, it’s not a thing on the screen. Yes, you need to slide your finger down from the top, but there’s no sign it is there, and whether you swipe down from the top-center, top-left or top-right, the system does not care. That’s a start.

Where this concept really took off is iOS 7, particularly with Control Center and the new Spotlight search tool. Control Center is my favourite app despite not being an app. It gives quick access to common actions and settings with a swipe up from the bottom of the screen (again, no horizontal target).

While after swiping your targets will be — you know — buttons, as with Notification Center there is no sign that Control Center is there, and activating it is more “location agnostic” than other iOS elements (and apps in particular). Also, Control Center can be activated from basically anywhere, even from inside an app. The new and finally useful Spotlight is a similar example, allowing you to search for apps or information with a swipe down from the middle of the screen (though it can’t be activated from within an app).

Another interesting way iOS 7 is changing, while not abandoning, buttons is visible in Mail. Some of the most useful interactions with Apple’s Mail app are activated via a swipe down (to refresh the mailbox) and a swipe left on the preview of an email to archive or delete it (this particular set of actions will receive a significant boost in iOS 8, with more gestures for deleting without tapping, flagging, etc.). Again, with these actions Apple largely mimicked innovations that started in 3rd party apps or Android, but the fact that they included these gestures in some of their most used apps (see below for Safari) further incentivises developers to move in the gesture direction as opposed to the button one.
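For developers, both Mail gestures map onto plain UIKit: pull-to-refresh via UIRefreshControl (available since iOS 6) and the swipe-to-reveal row actions that iOS 8 opens up to third parties via UITableViewRowAction. Here is a minimal Swift sketch; MailboxViewController, messages and reloadMessages are illustrative names of mine, not Apple’s code.

    import UIKit

    // A sketch of Mail's two signature gestures in a third-party app:
    // pull-to-refresh (UIRefreshControl, iOS 6+) and swipe-to-reveal row
    // actions (UITableViewRowAction, new in iOS 8). All names below are
    // illustrative, not Apple's code.
    class MailboxViewController: UITableViewController {

        var messages = ["Lunch?", "Build failed", "Weekly report"]

        override func viewDidLoad() {
            super.viewDidLoad()
            tableView.register(UITableViewCell.self, forCellReuseIdentifier: "Cell")
            // Swipe down from the top of the list to refresh the mailbox.
            refreshControl = UIRefreshControl()
            refreshControl?.addTarget(self, action: #selector(reloadMessages), for: .valueChanged)
        }

        @objc func reloadMessages() {
            // Fetch new mail here, then stop the spinner.
            refreshControl?.endRefreshing()
        }

        override func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
            return messages.count
        }

        override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
            let cell = tableView.dequeueReusableCell(withIdentifier: "Cell", for: indexPath)
            cell.textLabel?.text = messages[indexPath.row]
            return cell
        }

        // Swiping left on a row reveals the action; no button is visible beforehand.
        override func tableView(_ tableView: UITableView, editActionsForRowAt indexPath: IndexPath) -> [UITableViewRowAction]? {
            let archive = UITableViewRowAction(style: .normal, title: "Archive") { _, indexPath in
                self.messages.remove(at: indexPath.row)
                tableView.deleteRows(at: [indexPath], with: .automatic)
            }
            return [archive]
        }
    }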

Finally, and possibly most importantly, Apple introduced a system-wide swipe-right-to-go-back-to-the-previous-page gesture, again allowing for a different interaction for something that used to require the push of a button.
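Third-party apps get this one essentially for free: any app built on UINavigationController inherits the edge swipe on iOS 7 and later, and UIKit exposes it through the interactivePopGestureRecognizer property. A minimal sketch (DetailViewController is an illustrative name):

    import UIKit

    // Any app built on UINavigationController inherits the swipe-right-to-go-back
    // gesture on iOS 7+; UIKit exposes it as interactivePopGestureRecognizer.
    // DetailViewController is an illustrative name.
    class DetailViewController: UIViewController {
        override func viewDidLoad() {
            super.viewDidLoad()
            // Usually nothing to do: the edge swipe works out of the box.
            // UIKit disables it when a custom back button is installed, so a
            // defensive re-enable looks like this:
            navigationController?.interactivePopGestureRecognizer?.isEnabled = true
        }
    }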

iOS 8 and 3rd Party Apps

With iOS 8, Apple is set to take this design approach to the next level. Again, this post is not supposed to be a review of iOS 8 because a) I have never actually used it and b) in a few weeks there will be a ton of great reviews floating around the internet. I will just list here some of the key areas where this trend is clearly visible:

  • Interactive Notification Center: the new rendition of the Notification Center allows for quick interaction by just swiping down from the notification banner.
  • Multitasking screen: triggered by a double press of the home button (and an iteration of what was introduced in iOS 7), the new multitasking screen is now more interactive, showing favourite and recent contacts that can simply be tapped to call or send a text (you might argue that the pictures are in fact buttons — and you would be right — but they’re not necessarily positioned in a specific place on the screen).
  • Messages: while the most celebrated new feature is arguably group messaging, most of the audio messaging is based on gestures (even though it starts by tapping and holding the microphone button). Video messaging seems to work in a similar manner, but Apple did not show a live demo and, interestingly, on the Apple website the “wheel” for video recording and sharing is on the left of the screen (while the microphone icon is on the right, just above the keyboard).
  • The swipe-to-delete gesture that made its way into the iOS 7 Mail app is now even more pervasive, and settings now typically don’t show buttons but can reveal options with just a swipe to the left (e.g. that’s how you remove people from group messages). Most interestingly, it looks like most iOS features are now reachable without ever tapping the top part of the display (more on that in a second). It’s also important to note that this trend is not unique to Apple. In fact, as mentioned, Apple has largely drawn inspiration from the many 3rd party apps that have been experimenting with gesture-based interaction for quite some time. Some obvious examples include Unread, Clear, Listen, WhatsApp and Snapchat, among others (a minimal sketch of this kind of whole-screen gesture follows this list).
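Under the hood, this style of interaction is typically just a UIKit gesture recognizer attached to a whole view rather than to a button, so the touch can start anywhere on screen. A minimal sketch of the pattern, with NoteViewController and showOptions as hypothetical names:

    import UIKit

    // The whole screen is the gesture target, so it does not matter where the
    // finger starts. NoteViewController and showOptions are hypothetical names.
    class NoteViewController: UIViewController {
        override func viewDidLoad() {
            super.viewDidLoad()
            // A left swipe anywhere on the view reveals options,
            // replacing a fixed on-screen button.
            let revealOptions = UISwipeGestureRecognizer(target: self, action: #selector(showOptions))
            revealOptions.direction = .left
            view.addGestureRecognizer(revealOptions)
        }

        @objc func showOptions() {
            // Slide in an options panel, mark an item done, etc.
        }
    }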

Why?

Assuming I’m right in spotting this trend of a) removing the need for buttons to interact with iOS or b) removing the need for buttons to sit at a specific point on the screen, the next obvious question is why. Why would Apple move away from an easy way of interacting with iOS, one that arguably sustained the initial success of the iPhone and iPad?

I believe the answer is the rumored larger screen of the new iPhone. A 4.7" iPhone is widely expected to be introduced this fall (likely on Sep 9th), but as many Android users confess to themselves in the mirror when they decide to be really honest, creating a UI for a large-screen phone is not an easy-peasy job. The reason, of course, is the size of our hands, which in most cases have not grown 1" since the iPhone’s introduction 7 years ago (see this old post from Dustin Curtis for a visual explanation, still based on the 3.5" iPhone 4).

You might argue that iOS is already working on the larger screen of the iPad, but the iPad is a completely different ballgame, being a two-handed device. And we should not forget that, even if the differences might be almost invisible to users, iOS on the iPad is different from iOS on the iPhone (just as an example, think about the keyboard design and behaviour).

The benefits of a larger screen are quite obvious. More screen real estate improves the reading experience and video playback, allows for a larger portion of our emails to be previewed, and so forth, but it comes with a major drawback: it makes the edges of the screen harder to reach while using the device with one hand. Steve Jobs was famously convinced that 3.5" was the optimal screen size for one-handed usage (though he most likely gave the green light to the project that later became the 4" iPhone 5). As for Tim Cook, when he was asked whether Apple would launch a larger-screen iPhone he mentioned “trade offs” such as “resolution, color quality, brightness, reflectivity, screen longevity, power consumption, portability, compatibility with apps, many things”. He added that Apple “would not ship a larger display iPhone while these trade offs exist”.

While Cook’s technical trade offs should largely be fixed by now via component and process improvements, the one-handed usage problem remains. How to fix that? How do you tweak the iOS UI to allow great one-handed usage AND a larger display? I’m sure you have connected the dots by now. I believe the answer is to reduce the need for tapping buttons, replacing them with gestures. When buttons are still needed, make them location agnostic (or at least keep them closer to the lower half of the screen). For example, allow me to activate a “wheel” to create and share my videos by swiping from the side of the screen, without ever having to reach the top.
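UIKit already ships a primitive for exactly this kind of trigger: UIScreenEdgePanGestureRecognizer (iOS 7 and later) fires on a swipe in from a chosen screen edge, anywhere along that edge, so the thumb never has to reach the top. A minimal sketch of the hypothetical video “wheel” described above; CameraWheelViewController and presentWheel are names I made up for illustration:

    import UIKit

    // UIScreenEdgePanGestureRecognizer (iOS 7+) fires on a swipe in from a
    // chosen screen edge, anywhere along that edge, so the thumb never has to
    // reach the top. CameraWheelViewController and presentWheel are made-up names.
    class CameraWheelViewController: UIViewController {
        override func viewDidLoad() {
            super.viewDidLoad()
            let edgeSwipe = UIScreenEdgePanGestureRecognizer(target: self, action: #selector(presentWheel(_:)))
            edgeSwipe.edges = .left // the entire left edge is the target
            view.addGestureRecognizer(edgeSwipe)
        }

        @objc func presentWheel(_ gesture: UIScreenEdgePanGestureRecognizer) {
            if gesture.state == .began {
                // Slide the recording "wheel" in from the edge here.
            }
        }
    }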

Obviously all this is just a wild guess. I have no insider knowledge of what Apple is going to do, nor am I a UI expert by any means. And the issue remains of what happens to the swipe from the top to activate Notification Center (although with the new interactive notifications you might argue that the basic activities are available without the swipe-from-top gesture, and if you want to catch up on older notifications you can wait for a moment when you can use two hands).

I guess time will tell; in the meantime, feel free to point out how wrong I am in the comments.

Author: Simone Rizzo
Categories: Technology