Perception, reality, and the iPhone

Our ever-changing mental models.

Nate Smith
UX Collective

--

Cover Image: 3D view of iPhone’s card stack with one-point perspective lines

What we think of as a phone was upended in 2007, when iPhone made its way into the world. Six years after its initial release, iOS 7 signaled a new direction for the design of the device’s operating system, and many of the things pioneered in that update remain largely unchanged in the eight years since. More recently, iPhone X ushered in yet another paradigm shift, and interaction patterns across the mobile device market are becoming increasingly homogenized. I’d like to consider how iPhone’s interface design has been shaped over the years to emulate an organic, interactive space, and how our sensory perception has adapted accordingly. Visual effects like blurring and scaling are more than mere aesthetic nuances; in fact, means of visual communication like these are precisely what help people build mental models, and continually broaden their understanding of what these incredible multitools can do.

Incremental Redefinitions

“Every new medium contains another pre-existing medium as its content.” — Marshall McLuhan

It’s no coincidence that the paradigm-shifting device was called i-Phone, instead of i-PocketComputer or Newton. People understood phones pretty well; they could use them as a reference point. Apple specifically chose to call this post-WIMP device a “phone,” even though it was, of course, much more than that. It’s worth considering what kind of phone interfaces people were familiar with at the time iPhone was released:

Photo of Motorola Razr interface, depicting red bars and blue text message containers.

Although these small displays were very limited in their ability to render graphics compared to desktop computers, various colors and icons helped people understand how to interact with different elements across the screen, as well as which elements mapped to which buttons on the keypad. Motorola’s Razr, like many cell phones, used different colored bars to distinguish informational and interactive areas. Apple referenced visual features like these, and many others, throughout the design of its iPhone operating system. The first six versions of the software looked mostly the same, and the visual style was a playground for graphic designers, who spent hours tweaking layers and blend modes in Photoshop (a photo editor turned makeshift interface design tool).

iOS 5 Interface elements, such as buttons, alerts, bars, tables, etc.
iOS 5 interface elements (OSX Daily)

Perception and the Design of Objects

When we perceive objects in the world, we make assumptions about their utility; we see things as either tools or obstacles. If I think of a thing as being use-able or use-full, that means I think I understand some of its features; I can see that this thing does that. If there’s a mismatch between what I “think” something does, and what it “actually” does, its utility is lost to me. Therefore there is no such thing as a “wrong way” to use something. There is only use or nonuse. The level of abstract thinking required to perceive utility in something like a computer is astounding.

From desktop PCs to cell phones, software cannot be discussed properly without a consideration of the hardware on which it manifests. The physical affordances of an object indicate to people how it can be used (consider scissors, light switches, keyboards, etc.). Industrial designers are concerned with, among many things, the size and form of an object; these dictate decisions that are made regarding every feature. Even though the primary interaction with a piece of hardware may happen through a graphical interface on a display, the overall object itself and its formal properties are not arbitrary. In other words, the design of a graphical interface is the design of features of an object.

In one fell swoop, iOS 7 dispelled many of the skeuomorphic elements characteristic of previous versions of iOS. This led many techies and designers to describe the new aesthetic as “flat design.” However, this was perhaps less a dramatic departure in design direction and more an acknowledgment of something that was already true: smartphones were flat.

iOS 7 and the Misnomer of Flat Design

“iOS 7 has a whole new structure that is coherent and that is applied across the entire system… Distinct, functional layers help establish hierarchy and order, and the use of translucency gives you a sense of your context. These planes, combined with new approaches to animation and motion create a sense of depth and vitality.” — Jony Ive

Animation of key design features in iOS 7: lock screen, control center, app switcher, messages.
Introducing iOS 7 (Apple)

“iOS 7 isn’t some huge change, some complete rethinking of how we use our smartphones. It’s really just a coat of paint.” — The Verge

The flashy redesign of iOS 7 certainly led to its fair share of buzz—and controversy—in the tech world. Much of the conversation surrounding the update was, understandably, regarding the vibrant, flatter appearance of the interface. Where you previously found textures reminiscent of felt and linen, neon gradients with simple icons and typography constituted the entire experience. Where you previously found distinct, beveled buttons, simple text labels signified controls. Where you previously found an unlock slider knob on a track, the entire screen now responded to the movement of your finger. Normal, non-tech-ey, non-designer-ey people might have described the update as making the system “brighter” or “more colorful.” Or maybe “harder to read,” since the system’s typeface, Helvetica, was replaced with its ultralight counterpart in Helvetica Neue. In any case, this software update certainly didn’t go unnoticed.

If anything, mobile interfaces got deeper, not flatter

Perhaps the more overlooked, yet more impactful, changes in iOS 7 were refinements to the system’s interaction design and motion design. Gestures we now take for granted were introduced here, such as swiping up from the bottom of the display to open Control Center, or swiping right from the left edge to go back to the previous page. The whole system seemed to be more aware of its hardware, to feel more sensitive to every touch and gesture, to move in response to your movement. iOS 7 and subsequent releases have leaned on several related design motifs to help build a sense of space and depth:

  1. Translucency let blurred visuals from lower surfaces peek through, influencing the color of higher surfaces while retaining readability and focus (sketched in code after this list).
  2. Parallax changed layers’ perspective based on the device’s position or explicit input (closer things appear to move more than distant things).
  3. Windowing enabled browsing through open apps with small previews of content above their respective icons.
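
To make the first of these motifs concrete, here’s a rough SwiftUI sketch of translucency, assuming a system material does the blurring; the view name and placeholder content are made up for illustration, so this is a sketch rather than Apple’s implementation.

```swift
import SwiftUI

// A minimal sketch of translucency: the header sits on a system material,
// so blurred color from the content scrolling beneath it bleeds through
// while the label itself stays readable.
struct TranslucentHeaderDemo: View {
    var body: some View {
        ZStack(alignment: .top) {
            ScrollView {
                LazyVStack(spacing: 12) {
                    ForEach(0..<40, id: \.self) { i in
                        RoundedRectangle(cornerRadius: 12)
                            .fill(Color(hue: Double(i) / 40, saturation: 0.7, brightness: 0.9))
                            .frame(height: 80)
                    }
                }
                .padding()
            }
            Text("Library")
                .font(.headline)
                .frame(maxWidth: .infinity)
                .padding()
                .background(.ultraThinMaterial) // the translucent "higher surface"
        }
    }
}
```

Parallax has a similarly small counterpart in UIKit: UIInterpolatingMotionEffect shifts a view as the device tilts, which is essentially how home screen icons appear to float above the wallpaper. The helper function and the ±12-point offset below are arbitrary choices for this sketch.

```swift
import UIKit

// Parallax via UIKit's motion-effect API: the view drifts up to ±12 points
// as the device tilts, so it reads as sitting on a nearer plane.
func addParallax(to view: UIView) {
    let x = UIInterpolatingMotionEffect(keyPath: "center.x", type: .tiltAlongHorizontalAxis)
    x.minimumRelativeValue = -12
    x.maximumRelativeValue = 12

    let y = UIInterpolatingMotionEffect(keyPath: "center.y", type: .tiltAlongVerticalAxis)
    y.minimumRelativeValue = -12
    y.maximumRelativeValue = 12

    let group = UIMotionEffectGroup()
    group.motionEffects = [x, y]
    view.addMotionEffect(group)
}
```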

Space, distance, and movement

Distance, of course, describes the amount of space between two things or people. Let’s consider two different sets of things, whose spatial relationships influence the way interfaces are designed with depth in mind:

  1. The human hand and a multitouch display, and
  2. App windows and a home screen.

Even though a distinction might be drawn between the above, as one appears physical and the other virtual, our means of understanding each of these relationships is not actually that different. In the same way your finger touches down and lifts up from a display, apps on an iPhone lift off of the home screen and create spatial relationships with one another. Even though software might be considered artificial, our minds use the same mechanisms to make sense of “animated” movement as they do to make sense of real-world movement. Furthermore, physical movements and the movements of those around us shape our consciousness in a fundamental way (we think in terms of space and motion, not just language), as psychologist Barbara Tversky describes in Mind in Motion.

With multitouch, the physical distance between the human hand and the content of a display was eliminated. This created a perception of direct manipulation of content, as opposed to indirect manipulation of content with a mouse and cursor. As new functionality and visuals like translucency, parallax, and windowing were implemented, Apple was building a clearer sense of space and depth inside of the software. Blurred lines between the physical and the digital set the stage for a post-button interface.

Animation of iPhone X home gesture. Mail app is open, person swipes up, and mail collapses into top left corner of home grid
iPhone X introduced an entirely gestural interface (Apple)

Because iPhone’s hardware and software are engineered together, Apple has been able to design physical and graphical interactions in tandem. Released in 2017, iPhone X introduced entirely new ways to perform the device’s most basic functions. But before that happened, another device quietly bridged the new way of interacting with the old. The previous year’s iPhone 7 saw the removal of both the headphone jack and the mechanical home button; the former garnered far more attention, which is probably a testament to the ingenuity of the latter’s design. These hardware changes were likely essential to making the device water resistant, but the interactions that followed them also scaffolded a new way of thinking about the device’s physicality.

iPhone 7’s home button was not really a button at all: it was a solid, force-sensitive surface that triggered both haptic and audio feedback when pressed, providing the illusion of a mechanical button click. Most people might have noticed a slightly different feeling the first time they pressed it, but the new sensation quickly became the norm. The physical and the digital were now interchangeable, fused like never before. Just a year later, iPhone X removed the home button entirely, yet two on-screen shortcut “buttons” were incorporated for the flashlight and camera. Each required a firm press and provided haptic and audio feedback, just like iPhone 7’s “physical” button.

iPhone 7 and iPhone X
iPhone 7’s artificial home button functions in exactly the same way as iPhone X’s dynamic shortcut buttons.
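
Apple’s home button firmware isn’t public, but the underlying idea is easy to approximate with UIKit’s haptics API. The sketch below is a hypothetical, simplified stand-in: a view with no moving parts that fires a haptic “click” once a touch presses hard enough. The SolidButtonView name and the 0.6 force threshold are invented for this example.

```swift
import UIKit

// A hypothetical "solid button": nothing moves, but once a touch crosses a
// force threshold the view plays a haptic tap and runs its action, giving
// the impression of a mechanical click. Force is only reported on
// 3D Touch-capable hardware, so the guard below keeps it from firing elsewhere.
final class SolidButtonView: UIView {
    private let haptics = UIImpactFeedbackGenerator(style: .medium)
    private var didFire = false
    var onPress: (() -> Void)?

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        didFire = false
        haptics.prepare() // reduces latency for the upcoming haptic
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, !didFire,
              touch.maximumPossibleForce > 0 else { return }
        if touch.force / touch.maximumPossibleForce > 0.6 {
            didFire = true
            haptics.impactOccurred() // the "click" you feel
            onPress?()
        }
    }
}
```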

The first decade of iPhones and iPhone-inspired smartphones all used fixed buttons below the display for primary navigation (Android devices used a set of three: back, home, and app switcher). Each year’s new iPhone and iOS release showed careful attention to detail in its interaction design, and gradually chipped away at the notion that such a portable computer requires buttons at all. Fluid, gestural interfaces are what Apple sees as the future of screen-based human-computer interaction, a direction clearly evident in the evolution of iPhone’s operating system and in Apple’s repeated claims that iPad is the future of productivity. This direction has only been further solidified by Android’s introduction of an almost identical set of gestures.

Designing for Space & Motion

Beyond the visual techniques already discussed, designers should consider—and experiment with—the main approaches Apple’s used to create a sense of space in interfaces. At this time, Apple’s HIG doesn’t provide much along these lines. Google’s Material Design offers a good breakdown of surfaces and elevation, but this is certainly not the only approach, and resulting products may or may not fit neatly into iPhone’s established spatial and motion patterns.

Scale

We intuitively know that most objects in the physical world have a fixed size; when something’s far away, it just looks smaller than when it’s close, not as though it has actually shrunk. Our minds are constantly re-adjusting our interpretation of the scale of things in relation to other things (consider how large the moon looks when it’s just on the horizon, compared to when it’s high in the sky). So when we see something in an interface getting smaller (reducing its scale), we could imagine that its mass is actually decreasing, like a deflating balloon. But a reading more closely aligned with how we perceive the world is that the thing is simply moving further away from us.

Depth of Field

In photography, “depth of field” describes the range of distances within which a lens renders its subject in focus; anything beyond that range appears blurry. An intentionally shallow DoF can be used to emphasize things in the foreground, like a person in a portrait. Depth of field is one of the methods used in iOS to differentiate distant surfaces. When you open an app, the home screen recedes into the background, out of the depth of field. As you swipe up to return home, you’re throwing the app away from you, until you release and it falls into place, bringing the home screen back into the DoF.

Animation showing iPhone X’s home gesture from a side view, depicting scale and depth of field changing

Both scale and depth of field are essential in establishing the primary spatial model in iPhone. The app-to-home transition described above has been gradually refined, but has remained fairly consistent since iOS 7. It provides a clear sense of your position in relation to a single app and to the grid of apps on the home screen.
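
In SwiftUI, that spatial model can be roughed out by pairing the two cues: scale the receding layer down and blur it at the same time, so “smaller” and “out of focus” both read as “farther away.” This is an approximation for experimentation rather than the system transition itself; the view names and animation values are guesses.

```swift
import SwiftUI

// A rough approximation of the app-to-home transition: when `isAppOpen`
// flips, the home layer recedes (smaller and blurred, i.e. out of the
// depth of field) while the app layer scales up toward you.
struct AppToHomeDemo: View {
    @State private var isAppOpen = false

    var body: some View {
        ZStack {
            // "Home screen" layer
            Color.indigo
                .overlay(Text("Home Grid").foregroundStyle(.white))
                .scaleEffect(isAppOpen ? 0.85 : 1.0) // smaller = farther away
                .blur(radius: isAppOpen ? 10 : 0)    // outside the depth of field
                .ignoresSafeArea()

            // "App" layer
            RoundedRectangle(cornerRadius: 40)
                .fill(.white)
                .overlay(Text("Mail"))
                .padding(isAppOpen ? 0 : 120)
                .scaleEffect(isAppOpen ? 1.0 : 0.3)  // grows as it comes toward you
                .opacity(isAppOpen ? 1 : 0)
        }
        .onTapGesture {
            withAnimation(.spring(response: 0.45, dampingFraction: 0.8)) {
                isAppOpen.toggle()
            }
        }
    }
}
```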

Depth of an App

What’s been less clear, though, is how “deep” any given app is meant to be. That is, until last year’s iOS 13 update, which introduced a new pattern across the system: modal views that appear on top of primary views push the primary views slightly into the background — further away and slightly dimmed, but still within the depth of field. Music and several other native apps used this pattern for a few years, but iOS 13 saw its system-wide adoption on share sheets and activity views. We’ll likely see further expansion of this pattern in future OS updates, and it certainly gives interface designers and engineers an opportunity to think more clearly about depth, space, and motion in today’s interfaces.

Animation showing iOS 13 modal card stack: Music App, Share Sheet, Calendar Event Modal
Animation showing iOS 13 share sheet with side view (depth of field and scale)
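
Prototyping the card pattern yourself is a useful exercise. The sketch below is a hypothetical approximation, not the private system implementation: presenting a card scales the primary view down slightly and dims it, so it stays visibly behind the modal rather than disappearing. The view names and numbers are illustrative guesses.

```swift
import SwiftUI

// A sketch of the iOS 13-style card pattern: presenting a modal pushes the
// primary view slightly back (scaled down and dimmed) while the card slides
// up in front of it.
struct CardStackDemo: View {
    @State private var showingCard = false

    var body: some View {
        ZStack {
            // Primary view: recedes, but stays within the depth of field.
            PrimaryContent()
                .scaleEffect(showingCard ? 0.92 : 1.0)
                .opacity(showingCard ? 0.6 : 1.0)

            // Modal card layered on top.
            if showingCard {
                RoundedRectangle(cornerRadius: 16)
                    .fill(.white)
                    .overlay(Text("Share Sheet"))
                    .padding(.top, 60)
                    .transition(.move(edge: .bottom))
            }
        }
        .onTapGesture {
            withAnimation(.easeInOut(duration: 0.3)) { showingCard.toggle() }
        }
    }
}

// Placeholder primary content so the sketch is self-contained.
struct PrimaryContent: View {
    var body: some View {
        Color.teal
            .overlay(Text("Music").font(.largeTitle))
            .ignoresSafeArea()
    }
}
```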

Co-Evolution

Tools and technology, by definition, expand our human capacity and influence our sensory perception; we are always co-evolving with the tools we create. Modern technology, like iPhone, has profoundly changed the way we live and make sense of the world. Traditionally, designers consider certain physical affordances as being inherently intuitive, but this view seems increasingly limited. That is, it’s not just that we’re continually moving towards a more perfect form of anything we design. It’s that the things we design can reshape our perception of what’s possible, useful, and desirable. Therefore, the form and experience of an interface is not only a design or technical challenge, but also a cultural one.

Thank you for reading. If you found this interesting, please leave a response below, or hit me up on Twitter!
