Following up on the story of the Withings scale, here is a piece on the design of the Withings Pulse's interface.

The project

When I started designing at Withings in December 2012, two products were in the works: a new scale (the Smart Body Analyzer) and the Withings Pulse. The hardware choices for the Pulse were already made and the product was going to production soon, but the whole interaction and UI remained to be designed.

While the Smart Body Analyzer was an iteration on an existing product, the Pulse was an entirely new line, with no existing base to draw from.

As a wearable activity tracker, the Pulse would monitor the user's activity throughout the day: counting steps, tracking runs and elevation, analyzing sleep, and measuring heart rate.

This article focuses on two aspects of the project:

  • The hardware interaction
  • The look & feel of the interface

Hardware Interaction

The elements of interaction

Using a systematic approach, I listed all the hardware elements that could be used for user interaction:

  • A screen
  • A hardware button placed on top of the device
  • A four-zone touch-sensitive surface (TSI)
  • An accelerometer

I then explored all the interaction possibilities they offered:

Document listing the possible interactions using the Pulse's hardware

Most gesture interactions were set aside because they were complex to implement and hard for users to master (gestures need to be learned).

In parallel, I mapped the content users would need access to and the actions they could perform:

  • Step count, elevation, distance, and calories burned for the day
  • Start/end the “night mode”
  • Take a heart rate measurement
  • Access a few days of history for all data
  • Events such as low battery or step goal achieved
  • System functions such as time, reset, connect…

The goal was to make the best use of the available interaction vocabulary to create an interface that was easy to use and easy to learn.

Prototyping for hardware interactions

At that point we had no working prototypes, so I built a simulator of the hardware (in Flash, running on an iPhone) to prototype interactions:

Very early interaction concept art, made just after being briefed on the hardware interaction elements
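The original simulator was built in Flash; as a rough illustration of the idea, here is a comparable sketch in Processing (the other tool used later in the project). The scaled-up 128 x 32 grid, the key standing in for the hardware button, and the four click bands standing in for the TSI zones are all assumptions made for this example.

```processing
// Minimal display/input simulator sketch (illustrative, not the original Flash code).
final int COLS = 128, ROWS = 32, SCALE = 6;
boolean[][] fb = new boolean[COLS][ROWS];   // 1-bit framebuffer: each pixel on or off

void setup() {
  size(768, 192);                           // COLS*SCALE x ROWS*SCALE
  noStroke();
}

void draw() {
  background(0);
  fill(255);
  for (int x = 0; x < COLS; x++)
    for (int y = 0; y < ROWS; y++)
      if (fb[x][y]) rect(x * SCALE, y * SCALE, SCALE - 1, SCALE - 1);
}

void keyPressed() {
  if (key == ' ') println("hardware button pressed");   // spacebar = top button
}

void mousePressed() {
  int zone = mouseX / (width / 4);                      // four bands = four TSI zones
  println("touch zone " + zone + " pressed");
}
```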

The touch-sensitive zone was to sit below the screen, and was seen as the most important interaction element. It allowed:

  • Swiping left and right
  • Clicking on several zones of the screen
  • Pressing and dragging elements from left to right (“slide to activate”)

The working hypothesis was then to press the hardware button to turn the screen on, and to use the touch zone as the main interaction element.

Video of an early interaction mockup based on the device's specs

When the prototype devices arrived, a few things became clear:

  • The touch zone suffered from latency, introduced to reduce false positives
  • Testing with external users showed that the touch zone below the screen was very difficult to use, even after more than 20 minutes of practice
  • Using the touch zone while moving was difficult
  • By contrast, the hardware button was very easy to use in all conditions

Given those observations, I turned the architecture around: the hardware button became the main interaction element (one press turns the screen on, a second press switches to the second screen, and so on), while the touch zone handled secondary interactions (browsing the history of a given metric, starting a program).
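As a minimal sketch of that navigation logic (screen names, their order, and the auto-off timeout are my assumptions, not the actual firmware):

```processing
String[] screens = { "TIME", "STEPS", "ELEVATION", "DISTANCE", "CALORIES" };
int current = -1;                // -1 means the display is off
int lastPress = 0;
final int TIMEOUT_MS = 5000;     // assumed auto-off delay

void setup() {
  size(200, 60);
  fill(255);
}

void keyPressed() {
  // The hardware button: first press lights the screen,
  // each further press cycles to the next screen.
  current = (current + 1) % screens.length;
  lastPress = millis();
}

void draw() {
  // The display turns itself back off after a period of inactivity.
  if (current >= 0 && millis() - lastPress > TIMEOUT_MS) current = -1;
  background(0);
  if (current >= 0) text(screens[current], 10, height / 2);
}
```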

The problem of the touch zone's location remained, and visual solutions kept failing in user tests. The solution was finally to update the hardware specs (a few weeks before production), replacing the opaque TSI with a transparent one placed on top of the screen, turning it into a touch-sensitive screen.

Delivering the User Interface specifications

Specifications for the Pulse User Interface

The look & feel

The Smart Body Analyzer and the Pulse were very different pieces of hardware: different reading distances, different screen types (an LCD screen for the Smart Body Analyzer, an OLED for the Pulse), even different pixel shapes (rectangular versus square pixels).

The challenge was to create a UI that would carry a similar identity while accommodating different constraints.

The Pulse’s screen

The Pulse's screen is a 1 x 0.3 inch (2.5 x 0.8 cm) monochrome OLED screen with a definition of 128 x 32 square pixels, a high frame rate, and almost no afterglow. It is meant to be read from a distance of 20 inches (50 cm).
No grey levels, so no anti-aliasing.
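In practice, this means every glyph is a plain binary bitmap: each pixel is either on or off, with no intermediate shades. A hypothetical example (this 5 x 7 "8" is purely illustrative, not the actual Withings font):

```processing
// Each character is a grid of on/off pixels; no intermediate shades exist.
String[] eight = {
  " ### ",
  "#   #",
  "#   #",
  " ### ",
  "#   #",
  "#   #",
  " ### "
};

// Copy a glyph into a 1-bit framebuffer at position (ox, oy).
void blit(boolean[][] fb, String[] glyph, int ox, int oy) {
  for (int y = 0; y < glyph.length; y++)
    for (int x = 0; x < glyph[y].length(); x++)
      fb[ox + x][oy + y] = glyph[y].charAt(x) == '#';
}
```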

The screen was integrated inside a rubber-touch black plastic case that made it totally invisible when off, but also blurred the display when on.

Working “in situ”

The best way to work was to display test screens directly on the final product, via a serial port and some Processing code. This allowed an efficient workflow:

One of the first UI tests on a prototype of the Pulse

This is a picture of a prototype. The blurry effect of the rubber-touch case was largely reduced on the final product, but the same characteristics remained.
The blur was a legibility constraint, but also a natural anti-aliasing that was interesting to use.
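To give an idea of what this workflow could look like, here is a minimal Processing sketch that renders a test screen, packs it to one bit per pixel, and pushes it to the device over serial. The port selection, baud rate, and frame packing are assumptions; the actual Pulse protocol was internal to Withings.

```processing
import processing.serial.*;

Serial port;

void setup() {
  size(128, 32);                 // match the Pulse's screen definition
  frameRate(5);                  // keep well within serial bandwidth
  fill(255);
  port = new Serial(this, Serial.list()[0], 115200);  // assumed: first available port
}

void draw() {
  background(0);
  text("10:47", 4, 20);          // draw the test screen with normal Processing calls
  sendFrame();
}

void sendFrame() {
  loadPixels();
  byte[] frame = new byte[(width * height) / 8];   // 512 bytes for 128 x 32
  for (int i = 0; i < width * height; i++) {
    // Threshold each pixel to 1 bit: the OLED has no grey levels.
    if (brightness(pixels[i]) > 127) frame[i / 8] |= (1 << (i % 8));
  }
  port.write(frame);
}
```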

From the Smart Body Analyzer to the Pulse

In order to compensate for the blur effect, I needed to add more negative space to the font that had been developed for the scale:

Left: Smart Body Analyzer’s font grid to Pulse’s font grid — Right: the “8” on the Smart Body Analyzer and on the Pulse
Iteration on the font, still on a prototype

As on the scale, a secondary font would be used for text and other secondary information.

Research on fonts for the Pulse

The final result is very close to the Smart Body Analyzer’s font:

Left: Smart Body Analyzer’s font — Right: Pulse’s font

The reading distance being very short on the Pulse, we could use an even smaller font for secondary information on the screen:

Pulse’s smaller font

On this snapshot of the final product, you can see all three typefaces: 10:47 is written in the main digit font, AM uses the main alphanumeric font, and SCOTT is written in the smaller font:

Final fonts, on the final product

A look at the final UI

Here is a look at the final product and software, and a selection of screens.

A few mockup screens of the Pulse
Video of final UI of the Withings Pulse

Final words

What turned out to be key in the development of the Pulse’s interface was:

  • Getting feedback early, using a simulator before the product was available
  • User-testing the product as soon as prototypes were available
  • Visualizing the UI mockups directly on the device

Each of those elements enabled key product decisions (validating an architecture, changing a key hardware component, optimizing the look & feel for the hardware), making the most of the device's components and characteristics.

The next article in the series will focus on the Withings Thermo; I had the chance to work on that project as product manager in the very early phase of development, leading product definition and taking part in key hardware decisions.


