PulseUX Blog

Theory, Analysis and Reviews on UX User Experience Research and Design


How 4 Simple Usability Heuristics Could Have Saved SNAP Glasses

Wall Street And Hardware

SNAP is the quintessential example of spinning gold from code. This is not to take away from SNAP’s apparent success as a virtual social media platform or its possible business performance, although the quarterly reports are, in real cash-flow terms, terrifyingly low for such a staggering market cap, and marketers are not flocking to SNAP as predicted. We will leave that issue to the spreadsheet wizards at the investment banks that pumped SNAP stock to a heady IPO. Apparently, during the pre-IPO roadshow, SNAP indicated that hardware would be a major part of its value proposition going forward. This was a big ask by the investment banks and a big promise on the part of SNAP. Based on a detailed usability and feature/function allocation analysis, it is clear that the SNAP Spectacles (Glasses) failed for the most part due to poor usability and a flawed feature/function design. Here is what SNAP could have done to dramatically improve the success of this potentially groundbreaking product concept.

Broken Usability From The First User Experience (FUE) Onward

When purchasers opened the clever container holding the SNAP hardware, they found only a single small cardboard box containing a glass-cleaning cloth and the charging cable, with instructions for syncing the glasses with a smartphone printed on the inside of the box. There was not a single word of instruction on how the SNAP Spectacles were to be operated by the purchaser. Translated into SNAP language, this FUE essentially says: “Hey, we designed this great new clever hardware, and if you cannot figure it out on your own then who cares, that is your problem. By the way, we already dinged you for $200 and besides, they look cool on your bookshelf.” This is essentially in line with how the SNAP web experience is designed in terms of helping the user develop a mental model of how it actually functions.

SNAP Is Not Apple

This approach may work for Apple, which can capitalize on billions of hours of prior user experience when creating new products, but it is not going to work for SNAP hardware, which asks users to intuitively understand an entirely new user interface and a related set of status displays that make NO SENSE and are fundamentally flawed in terms of human factors engineering design. What comes with this approach is warehouses full of hardware that no one wants at any price, which is apparently exactly what has happened, as recently reported by TechCrunch.

What Could SNAP Have Done To Solve The Problem?

For a hardware device to have high levels of usability and UX quality, it must be tested using rigorous research methods before design freeze, using a range of mockups, simulations, and functional prototypes. Such testing is a fundamental requirement for any consumer-facing hardware that hopes to deliver a high degree of engagement. One can often predict the potential usability and UX performance of a hardware device simply by examining the development process of the teams involved and the degree to which formal human factors engineering science and unbiased professional usability testing have been applied. These are criteria rarely investigated by institutional investors, but they are often routine for VC funds with a winning track record.

What Really Is Professional Usability Testing?

This question may seem obvious, but in fact hardware and software development groups routinely fail to make use of the professional usability and UX optimization research methods available. This is due to simple arrogance and lack of awareness. Very few UX design programs teach formal usability testing methodologies, and development teams rarely understand which methods to utilize during the various phases of hardware development.

Standard methods

Hardware development teams have available a wide palette of formal research methodologies that can be employed to ensure the usability and UX performance of hardware, and for that matter software as well. Traditional usability testing methods include:

  • Lab-based usability testing
  • Large-sample online usability testing
  • Formative user testing
  • Summative user testing
  • Environmental usability testing
  • Heuristics analysis / best practice reviews
  • Children’s Online Privacy Protection Act (COPPA)-age user testing
  • User guide testing
  • Reference guide testing
  • Behavioral data mining
  • Ethnographic and field research
  • Media and cross-platform testing
  • Cognitive modeling and user testing
  • Persona development and needs testing

These methods have been traditionally utilized to optimize the usability and UX performance of hardware devices and software interfaces prior to design freeze.

Advanced methods

Recently, a broad palette of more advanced testing methodologies has been developed that allows hardware and software development teams to dramatically increase the probability that a new product will be successful in the marketplace. These new methods include:

  • Environmental navigation eye-tracking and workload analysis
  • Advanced emotional response user testing
  • Advanced consumer preference testing
  • 3D spatial tracking and UX optimization
  • Electromyography / physiological effort testing
  • Specialized eye-tracking for whole system UX optimization
  • Newtonian force measurement
  • Multi-factorial visual design testing and UX optimization
  • Physical ergonomic optimization
  • Cognitive learning decay modeling
  • Mobile device utilization and cognitive resource allocation
  • Wearable eye-tracking for UX optimization
  • Ethnographic total user experience optimization (TUXO) and data mining
  • Virtual world avatar behavior tracking and UX optimization

These methodologies vary in terms of cost, time to execute, and relative scientific validity. To be effective, they must be applied by individuals with advanced degrees in human factors engineering science. It is important to note that the science of human factors engineering as a professional discipline is not part of the normal skill set of the UX designers or industrial designers commonly charged with creating hardware or software solutions that offer high levels of UX and usability performance. All of the most successful hardware and software development teams in industry have professional human factors engineering experts fully engaged during product design, testing, verification, training system development, and new feature/function identification.

When Professional Heuristics May Be Good Enough

Of all the methods listed above that SNAP could have employed to improve the UX performance of the SNAP Spectacles, the one with the shortest lead time and lowest cost is professional human factors engineering heuristics analysis. This methodology involves an audit of the proposed hardware platform, conducted early in the development process or at any later point, provided it is executed before design freeze. For heuristics analysis to be effective, it must be conducted by a highly experienced professional human factors engineer; the best results are obtained from a certified human factors professional (CHFP) or an individual with similar qualifications. Heuristics analysis was selected for the following discussion due to its low cost and fast execution time. Heuristics do not fully replace the other forms of respondent-based observational research listed above. However, when properly applied, a heuristics analysis produces robust insights into potential usability and UX optimization problems early in development.

The Process Is Direct

A professional heuristics analysis involves rating the product against a standardized set of twenty heuristic rules. The first step is to determine whether the hardware interaction design and physical product design violate a given rule and, if so, to determine the severity of the violation. Violation severity is rated on a scale from 1 to 5, with 1 being “no violation and no usability and UX impact” and 5 being “extreme impact likely to significantly degrade usability and UX performance”. The original analysis on which this article is based involved ratings across all twenty heuristic rules: the SNAP Spectacles violated all twenty, and the severity ratings were high on almost all dimensions. The important point is that by executing a low-cost, rapid-response heuristics analysis, SNAP could have understood, and in fact predicted, the failure of the current hardware in the marketplace. This type of analysis could have been executed at any point prior to design freeze; in professional usability optimization practice, heuristics analysis has been conducted as early as the paper prototype phase, with the resulting analysis identifying serious usability and UX performance problems at the earliest stages of development. Below is a matrix of four of the twenty heuristic tests and rule ratings for the production SNAP Spectacles.
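To make the scoring mechanics concrete, here is a minimal sketch of how such an audit can be recorded and summarized. This is an illustrative data model only, not MAURO's internal tooling; the four entries mirror the four tests presented below, and the remaining sixteen rules are elided.

```python
from dataclasses import dataclass

@dataclass
class RuleRating:
    rule: str        # heuristic rule name
    violated: bool   # does the design violate this rule?
    severity: float  # 1 = no violation/impact ... 5 = extreme impact

# Hypothetical excerpt of the twenty-rule audit; ratings mirror the
# four SNAP Spectacles tests discussed below.
audit = [
    RuleRating("Device Information Clarity", True, 5.0),
    RuleRating("Device Element Conspicuity", True, 4.5),
    RuleRating("Device Non-Modal", True, 4.5),
    RuleRating("Device and Error Messaging", True, 5.0),
    # ...sixteen more rules in the full analysis
]

violations = [r for r in audit if r.violated]
print(f"{len(violations)} of {len(audit)} rules violated; "
      f"mean severity {sum(r.severity for r in violations) / len(violations):.2f}")
```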

Heuristic Test #1 – Device Information Clarity Rule

Violation: Yes – Severity: 5

Problem / Analysis: All feedback provided by the Spectacles is presented as light patterns on either the outward-facing light display or the inner hinge of the glasses. The display states require the user to learn an entirely new set of information formats and their mapping to device functions. Prior instruction is required to know what each of these light patterns represents, yet no such information is provided either with the product or in an easy-to-locate format. Thus, all critical feedback requires secondary processing of functional information that is not provided in the device's instructional materials.

Why This Matters: Users of any form of hardware or software want the information flowing from their new device to be clear and understandable. When users are forced to work with information formats that are totally unique and presented without ANY explanatory information, they are left to random walk the interface in the hope of determining what the device is trying to communicate. This is the primary usability failure of the SNAP Glasses and a perfect example of clever hardware design overriding the far more important usability and information clarity attributes of the device during routine operation and common error states. Strike 1 for the SNAP Spectacles.
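The remedy for this particular violation is conceptually simple: publish an explicit legend mapping each light pattern to the device state it encodes. Below is a minimal sketch of such a legend; the pattern names and states are invented for illustration and are not SNAP's actual display vocabulary.

```python
# Hypothetical legend mapping observable light patterns to device
# states -- the explicit mapping the Spectacles' materials never provide.
LIGHT_PATTERN_LEGEND = {
    "outward_ring_sweep":   "recording video",
    "outward_ring_pulse":   "charging",
    "hinge_led_blink_fast": "pairing in progress",
    "hinge_led_solid":      "paired and ready",
}

def explain(pattern: str) -> str:
    """Return a user-facing explanation for an observed light pattern."""
    return LIGHT_PATTERN_LEGEND.get(pattern, "unknown pattern -- consult the user guide")

print(explain("hinge_led_blink_fast"))  # -> "pairing in progress"
```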


Heuristic Test #2 – Device Element Conspicuity Rule

Violation: Yes – Severity: 4.5

Problem / Analysis: The outward-facing light display is not in view when the user is wearing the Spectacles; thus, the user is unable to check the battery level while wearing them. The hardware design does not make clear which elements of the device control critical functions or how one interacts with those elements to make productive use of the SNAP Spectacles. This is known in the field of human factors science as poor control/display compatibility.

Why This Matters: When users first engage with a new product, they bring to the experience a vast knowledge base of prior experience with other devices. The surprising fact that most UX designers fail to grasp is that the best UX designs, first and foremost, take extensive advantage of their users’ prior knowledge. When a hardware or software design like the SNAP Spectacles enters the marketplace requiring an entirely new learning profile, in terms of which components require interaction and which display information, the cognitive effort heaped upon the user often exceeds the implied benefit of using the product. Simply put: new hardware needs to be cognitively familiar or a significant number of users will simply give up…this problem started the slow degradation of the SNAP hardware into usability purgatory. There is more. Strike 2 for the SNAP Spectacles.

Heuristic Test #3 – Device Non-Modal Rule

Violation: Yes – Severity: 4.5

Problem / Analysis: All critical device interface states are presented through the same basic visual status ring light display and a small flashing LED on the front and inside of the glasses. Displaying different information in the same display format forces the user to expend excessive cognitive workload when attempting to understand key device states. This use of a modal display violates fundamental HFE science.

Why This Matters: Modal displays are well understood to create high levels of cognitive complexity. The reason is found in learning theory, where it has been demonstrated that learning something new is far less complex than unlearning and then relearning something. This is exactly what happens when different device states are displayed through the same display interface. Every time the user looks at the SNAP circular display they are forced to forget what the display indicated before and to query long-term memory for a new meaning flowing from the same display. Violation of the Non-Modal Rule pushed the SNAP Spectacles further into the domain of truly poor usability. Strike 3 for the SNAP Spectacles.
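To make the Non-Modal Rule concrete, here is a minimal sketch contrasting the two design approaches. The device states and signal assignments are hypothetical, invented for illustration; they are not the Spectacles' actual firmware behavior.

```python
from enum import Enum

class DeviceState(Enum):
    RECORDING = "recording"
    CHARGING = "charging"
    PAIRING = "pairing"
    LOW_BATTERY = "low battery"

# Modal design (the Spectacles' approach): every state drives the same
# ring display, so the signal alone never identifies the state.
def modal_display(state: DeviceState) -> str:
    return "status ring animates"  # identical output for every state

# Non-modal design: each critical state gets its own dedicated,
# distinguishable signal, so no unlearning/relearning is required.
NON_MODAL_SIGNALS = {
    DeviceState.RECORDING:   "red indicator, solid",
    DeviceState.CHARGING:    "amber indicator, solid",
    DeviceState.PAIRING:     "blue indicator, blinking",
    DeviceState.LOW_BATTERY: "amber indicator, blinking",
}

def non_modal_display(state: DeviceState) -> str:
    return NON_MODAL_SIGNALS[state]

for s in DeviceState:
    print(f"{s.value}: modal -> {modal_display(s)} | non-modal -> {non_modal_display(s)}")
```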

Heuristic Test #4 – Device and Error Messaging Rule

Violation: Yes – Severity: 5

Problem / Analysis: When the initial pairing of the Spectacles to the mobile device is unsuccessful, the user is given no information about why the error occurred or how to correct it. One is left to random walk the device and the online interface looking for possible solutions to the initial pairing problem; the user must eventually search the web for assistance, found in videos posted on YouTube by other frustrated users. This is especially problematic during the First User Experience (FUE), as the instructions provided with the Spectacles suggest the user's first steps should be to turn the phone's Bluetooth on, install and open the latest version of Snapchat, swipe down in the app to view their Snapcode, and press the button atop the left hinge of the Spectacles to pair them with the mobile device. In reality, the Spectacles must first be charged before they can be paired with a mobile device and used, and users receive this information from neither the device nor the mobile app. Users cannot understand and manage critical error states covering pairing, battery-level detection, and interfacing the Spectacles with the on-screen Snapchat app. In devices of this type, error state management builds user confidence and, over time, contributes in a major way to brand value. There is more than one reason that Apple has a Genius Bar…think error state management.

Why This Matters: Device error state management is the key to helping users build true confidence in using a product. All products fail at some point during normal use cycles. However, most UX designers fail to think of error state management as part of the normal use case. As a result, software and hardware design teams fail to develop the display framework and instructional support that allow the user to easily recover from error states. Error state management can have a greater impact on brand image than almost any other product attribute. The SNAP Spectacles leave the user in error state management purgatory. So arrogant is the UX design of the SNAP Spectacles that the device and its instructional support, for the most part, fail to even acknowledge that the device may fail to sync, lose charge, or otherwise stop playing nice with the user. There is virtually no way that the SNAP Spectacles could have generated wide user acceptance or high levels of engagement: the cognitive workload far exceeds the functional benefits offered to the user. Strike 4 for the SNAP Spectacles.
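As a sketch of what recoverable error messaging could look like, the pairing flow described above reduces to a sequence of precondition checks, each failing with an actionable message rather than silence. The function and its checks are hypothetical illustrations, not SNAP's actual firmware or app logic.

```python
def pair_spectacles(battery_charged: bool, bluetooth_on: bool,
                    snapchat_installed: bool) -> str:
    """Check pairing preconditions in order; on failure, return an
    actionable, user-facing message instead of failing silently."""
    if not battery_charged:
        return ("Spectacles are not charged. Charge them with the "
                "included cable, then retry pairing.")
    if not bluetooth_on:
        return "Turn on Bluetooth in your phone's settings, then retry pairing."
    if not snapchat_installed:
        return "Install and open the latest version of Snapchat, then retry pairing."
    return ("Ready to pair: swipe down in Snapchat to view your Snapcode, "
            "then press the button atop the left hinge.")

# The undocumented failure mode from the FUE: uncharged glasses.
print(pair_spectacles(battery_charged=False, bluetooth_on=True,
                      snapchat_installed=True))
```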

One can see from the heuristic analysis above that the SNAP hardware had the most basic usability and UX optimization problems and that the severity ratings were at the extreme end of the scale. These findings were consistent across all twenty rule assessments. Clearly, this hardware was going to cause consumers a high degree of usability and UX performance pain, and this was totally knowable months before design freeze. Even if SNAP employed a hardware-accelerator approach in developing the Spectacles, it would have been trivial to conduct professional human factors heuristic reviews during acceleration. Leaving usability and UX optimization to the UX design and hardware engineering teams is a clear path to marketplace problems for innovative new products like the SNAP Spectacles.

Read The Full Usability Analysis of the SNAP Spectacles

For a comprehensive discussion covering the failure of the SNAP Spectacles, visit the long-form post here.

Contributors

Chris Morley, M.S. Human Factors Engineer / Aileen S. Gabriel, Human Factors Engineer

About MAURO Usability Science

Founded in 1975, we are among the most experienced international consulting firms focused on helping world-class clients and leading startups solve business-critical problems related to the usability and interactive quality of their products and services. In short, we help make complex products simple and simple products empowering. We are proud to have solutions running at the heart of the world economy.
