Why SNAP Spectacles Failed: A Detailed Professional Usability Heuristics Analysis
Spinning gold from code This is an object lesson in why hardware design is exceedingly difficult for software businesses. It is always an interesting exercise to watch a software-based, cloud-delivered business, even a very clever one with a massive user base and a seemingly successful business model, attempt hardware design. There is a certain arrogance that comes from building a successful business essentially on code, server farms, and prodigious levels of capital looking for the next Facebook. Such business models, when they succeed, can be a lot like minting money without the risks associated with producing actual physical products. With the right combination of usability, features, social media exposure, and a large measure of luck, seemingly ridiculous ideas turn into billion-dollar corporations backed by Wall Street and investors. Such is the basic trajectory of SNAP.
Wall Street and hardware SNAP is the quintessential example of spinning gold from code. This is not to take away from SNAP’s apparent success as a virtual social media platform and possible business performance, although the quarterly reports are, in real cash-flow terms, terrifyingly low for such a staggering market cap, and marketers are not flocking to SNAP as predicted. We will leave that issue to the spreadsheet wizards at the investment banks that pumped SNAP stock to a heady IPO. Apparently, during the pre-IPO roadshow, SNAP indicated that hardware would be a major part of its value proposition going forward. This was a big ask by the investment banks and a big promise on the part of SNAP. Here is why.
Software is not hardware It turns out that creating world-class hardware is, on several critical levels, vastly more complex than creating internet-delivered, software-based features. It has always been a surprise how many software developers think of hardware engineering as less complex and less demanding. That is not the case; hardware design today is vastly more complex than most software developers understand.
Prior experience means a lot It is especially difficult for those whose lifeblood stems from software-based business models to transition to the design and production of highly engaging, consumer-facing, high-tech hardware. Perhaps for lack of such working knowledge, or out of apparent arrogance, SNAP has promised to deliver both innovative and usable hardware and software in order to achieve its projected growth and market success. SNAP’s goal is to create the best of both worlds…hmmm, not so fast. If the SNAP Spectacles are any indication, one might reasonably question such an objective.
Why creating world-class hardware and software is really hard It turns out that creating both successful internet-based software and world-class hardware as one’s primary business model falls into the category of near magic. In fact, you can count the number of major corporations that have achieved this feat today on one finger…APPLE. This is one reason that Apple is the most highly valued company on the planet and is the primary reason that others are currently attempting to develop the same hardware/software paradigm.
Amazon and Google are not Apple Let’s not think for even a moment that Amazon or Google have software/hardware combinations that approach the combined performance of APPLE. The hardware Amazon and Google have designed and sold makes, at best, a micro-level contribution to either’s profits. To be clear, Amazon is in the warehouse/shipping business and Google is in the advertising business…full stop. The extent to which hardware fits into either business model is structurally limited to opening sales and distribution channels into primary revenue models. Hardware will never be a major profit center for either Google or Amazon the way it is for Apple.
Clearly, when social media purveyors like SNAP cross the great divide from software development to creating actual consumer-facing hardware, success is not assured and in fact the reverse is almost always guaranteed. The reason that this is true is that the underlying decision models and associated professional skill sets are vastly different for hardware vs. software. The skill set that makes a great software engineer rarely translates to the ability to create consumer-facing hardware of the highest order.
The point is simply that, on the surface, the most important software and hardware development problems look somewhat the same to those lacking a deep understanding of hardware and software development processes. However, underneath the covers, software and hardware development are entirely different universes. Of the hundreds of major ways in which hardware and software development differ, a look at just two that are well understood to be strong predictors of success in the marketplace brings into focus why the SNAP Spectacles were a failure.
Core development tasks that directly impact customer acceptance
Task 1: Determine which features/functions to deliver to the customer
Task 2: Determine how to optimize overall system usability
In the following discussion, these two development tasks are examined in detail to demonstrate how each is executed in professionally managed product development programs.
Software design encourages flexibility Software, by its very nature, is a highly flexible technology that resides inside a framework allowing constant updating and modification. This is also a major limitation, because with flexibility frequently come a lack of discipline, a lack of planning, and a propensity to ignore professional usability testing during development.
When determining which features and functions to deliver to customers, a software-based business model like SNAP frequently starts with a feature set that it feels will compel engagement. This rarely works out the first time, but the key point is that, given the flexibility embedded in software-based business models, the development team can remove, add, change up, or combine features and functions almost on the fly. Compared to hardware design, software lacks all manner of feature/function planning rigor. Zuckerberg famously champions this approach, along with others who have said, in part, “build it fast, launch it today, fix it tomorrow”. The surprising outcome of this style of development is that it tends to work, assuming the team can hit on a core feature set that drives increasing levels of user engagement. Facebook is really nothing more than a massive patch-up around a simple messaging structure. Many have said that software-based business models must have two features: 1) one core feature that drives increasing engagement and 2) the ability to rapidly scale while making updates to the core feature set. When there are serious usability problems, software development allows for modifications without massive disruption to the business model. There is an incremental optimization option in software usability that is allowed by the nature of the development model. The point with a software-based business model is that one can, usually without the threat of a total loss, change up a feature set and live to tell about it.
Hardware design limits flexibility The exact opposite is true for hardware design. Hardware designers that are charged with creating actual physical products like the SNAP Spectacles are bound from the start by a lack of flexibility in terms of feature/function design and allocation. The key difference between hardware and software design is that all hardware projects have what is known technically as “Design Freeze”. This is the point where no other changes can be made to the hardware design because the physical product must be committed to actual production. In hardware design, there is no “launch now and fix later”. What you freeze is what you build.
Why hardware success is so elusive There is a long list of complex processes that hardware development must progress through, including the following brief summary of well-known gating functions.
The look and feel of the hardware design First, the overall visual design of the product must be determined by the industrial design team, and the preferred visual design must be passed to component engineering for actual engineering of internal and external component parts.
Creation of the feature set and user interaction framework This is the step of hardware design that is most important in terms of determining overall hardware usability and feature function engagement. In world-class hardware, this step encompasses a detailed examination of how the user will interact with the hardware in terms of cognitive structure and physical engagement. This involves a determination of the primary and secondary use cases, task flows, and error messaging and cognitive workload calculations. Physical hardware design also frequently requires ergonomic analysis to determine how manipulation of the actual 3D product will impact ease of use and user engagement. Such task flows and related interface designs must be simulated and tested repeatedly to determine usability and feature engagement. This may well be the defining difference between hardware and software development. In world-class hardware development programs, the hardware product is not approved for design freeze without the conduct of scientifically valid user testing by an unbiased research team.
Design freeze On professionally managed hardware development programs, a Design Freeze is required at this point in the design of the physical product and the underlying UI. This is the point at which all key aspects of the hardware and UI design have been defined and the product is passed to component design and production engineering. Changes to the core product beyond this point are highly disruptive to funding and scheduling.
Production freeze Production of complex consumer-facing hardware requires management of a complex global supply chain that presents an entirely new set of development challenges for those tasked with hardware production. Based on the industrial design and UI design defined in the design freeze, all external and internal component parts must be designed separately and as a combined product. The interconnections between each component must be verified and tested. Complex piece parts must be designed and tested for a wide range of performance variables, including production viability, structural integrity, heat transfer, material finish, section thickness, assembly, vibration, and final assembled rigidity. Once these processes are complete, production tooling must be produced for all component parts, including metal stampings, injection-molded parts, PC boards, memory modules, the visual display, battery holders, and all related interconnections. Production tooling must be machined from solid blocks of steel so that parts can be molded rapidly and with high quality.
The software aspects of hardware design Today, all new high-tech hardware carries prodigious levels of software that drives the product and the related user experience. The software components of the user interface must be architected by those responsible for the UI design, and all internal code must be written, tested, and verified for reliability with the underlying firmware and ICs. Suitable APIs must be written to ensure that the hardware will interface with IoT frameworks. The software development team must produce and test the security layer required to ensure the hardware cannot be hacked while still allowing interactions with a vast range of other devices, data sources, and social media platforms.
The hardware APP development The APP development team must create the screen-based APP UI, which must be fully tested for usability and interconnection with the hardware product design. All of these interfaces must then be tested for usability and interoperability with the company’s cloud-delivered, software-based infrastructure.
Testing the product for compliance The product must be submitted to FCC and other agencies for approval based on federal and international compliance rulings.
Not nearly done yet Once production lines are set up and testing processes determined, package designs must be created and set up on package assembly lines, assembly of the product must be tested, and production-line employees trained. Quality control procedures must be put in place and verified. Pre-production assembly must be tested and updated. Sub-assemblies contracted to outside vendors must be received, tested, and logged into the production system. Product storage, shipping, and logistics must be determined, sales and marketing promotions set up, pricing models tested and refined, shelf space negotiated with retailers…the list goes on. This is another way of saying hardware is hard.
A word about hardware accelerators There is an important new development in hardware production known as “Hardware Acceleration”. The process involves working within a special development framework where startups, or even more mature companies, can theoretically reduce the time-to-market for hardware-based products. This new process is best described by the recent explosive growth of hardware accelerators located in Shenzhen, China. The best known of these entities is HAX, which offers a unique business proposition for early-stage companies involved in hardware development. Hardware acceleration does not reduce the importance of feature/function definition and user testing in any way. To date, none of the most well-known hardware accelerators employ professionally managed user testing for usability or user engagement. Companies that rely on these new accelerators are putting their entire market success at risk for the benefit of possibly shipping an unusable product to the marketplace. Whether or not SNAP employed a hardware accelerator is unknown, but such an approach routinely produces solutions with unverified usability and feature/function engagement. As can be seen from this brief description, how features and functions are defined varies widely between hardware and software projects. The second critical way that software and hardware vary, and one that is critical to market success, relates to the usability of each system.
This second development task, optimizing the user experience and usability of each system’s consumer-facing interfaces, is probably where hardware development varies the most from software development. In web-based software, development teams have the benefit of immediate feedback once the system has been launched and the ability to fix serious usability or UX problems relatively rapidly. Most software-based, cloud-delivered business models today objectively conduct minimal professional usability and UX optimization research during development or prior to primary launch. Let’s be clear: most of these new cloud-based businesses, like SNAP, have large teams of UX designers filling conference room walls with sticky notes, all in the interest of creating a compelling user experience. This is not the same as conducting objective, professional, and unbiased usability and UX optimization research.
Once cloud-based interfaces are up and running with real users, usability testing is even less likely. Only when major pain points show up do software development groups undertake a kind of forensic approach to identifying usability and UX design problems. Rarely does such analysis include formal usability and UX optimization research. One can see the rationale for this oversight given that continuous change is built into the software-based and cloud-delivered business model. This is a costly and arrogant approach but not usually fatal in software-based and cloud-delivered business models. However, on the hardware side, such oversight is career-ending and stock price damaging.
Self-evident operation is preordained Some cloud-based businesses are so arrogant about their ability to create winning software and related business models that they refuse to offer any form of instructions for use that might aid new customers in accessing the features and functions of the system. SNAP is famous for this approach. If you wish to learn how to use SNAP online, you have two options: beg a friend for help or figure out the interface by random walk…a term of art in the field of usability science. Clearly, as one can see immediately when attempting to use the SNAP Spectacles, this same development model has found its way into the SNAP way of thinking about hardware design.
Apparently, SNAP discovered that significant usability problems were present in the Spectacles after production but before the actual release of the product. When one visited the now infamous SNAP vending machines, SNAP staff were on the floor and at the exit making sure that anyone purchasing a set of glasses from the vending machines received a small printed card directing them to a specific URL for help.
The FUE / First User Experience However, when one opened the clever container holding the SNAP hardware, there was only a single small cardboard box containing a glass-cleaning cloth and the charging cable, with instructions for syncing the glasses with your smartphone printed on the inside of the small box. There was not a single word of instruction on how the SNAP Spectacles were to be manipulated successfully by the purchaser. Translated into SNAP language, this FUE means essentially: “Hey, we designed this great new clever hardware, and if you cannot figure it out on your own then who cares, that is your problem. By the way, we already dinged you for $200, and besides, they look cool on your bookshelf.” This is essentially in line with how the SNAP web experience is designed in terms of helping the user develop a mental model of how it actually functions. This approach may work for Apple, which can capitalize on billions of hours of user prior experience, but it is not going to work for SNAP hardware, which asks users to intuitively understand an entirely new user interface and a related set of status displays that make NO SENSE and are fundamentally flawed in terms of human factors engineering design. This approach is going to leave SNAP with warehouses full of hardware that no one wants at any price, which is apparently exactly what has happened, as reported recently by TechCrunch.
Arrogance and Wall Street To be totally clear, such arrogance with respect to optimizing the usability of hardware is only going to cause SNAP and its investors continued heartburn, as this approach goes nowhere in the long term and dramatically impacts corporate profit, brand reputation, and investor confidence. Actually developing innovative and highly usable hardware devices is massively complex compared to the SNAP software-based and cloud-delivered business model. It would not be a stretch to assume that Wall Street and a prodigious number of investors have miscalculated SNAP’s ability to hitch profits to a combined hardware + software business model. If the current SNAP Spectacles are an indication of future performance, short positions are bound to increase.
What could SNAP have done to solve the problem? In order for a hardware device to have high levels of usability and UX quality, it must be tested using rigorous research methods before design freeze, using a range of mockups, simulations, and functional prototypes. It turns out that such testing is a fundamental requirement of all consumer-facing hardware that will eventually deliver a high degree of consumer engagement. One can often predict the potential usability and UX performance of a hardware device simply by examining the development process of the teams involved and the degree to which formal human factors engineering science and unbiased professional usability testing have been applied. These are criteria never investigated by institutional investors but often routine for VC funds with a winning track record.
What really is professional usability testing? This question may seem obvious, but in fact hardware development groups routinely fail to make use of the usability and UX optimization research methods available. This is due to simple arrogance and a lack of awareness. Very few UX design programs teach formal usability testing methodologies, and development teams rarely understand which methods to utilize during the various phases of hardware development.
Standard methods Hardware development teams have available a wide palette of formal research methodologies that can be employed to ensure the usability and UX performance of hardware and for that matter software as well. Traditional usability testing methods include:
- Lab-based usability testing
- Large-sample online usability testing
- Formative user testing
- Summative user testing
- Environmental usability testing
- Heuristics analysis / best practice reviews
- Children’s Online Privacy Protection Act (COPPA)-age user testing
- User guide testing
- Reference guide testing
- Behavioral data mining
- Ethnographic and field research
- Media and cross-platform testing
- Cognitive modeling and user testing
- Persona development and needs testing
These methods have been traditionally utilized to optimize the usability and UX performance of hardware devices and software interfaces prior to design freeze.
Advanced methods However, recently an entire palette of more advanced testing methodologies has been developed that allows hardware and software development teams to dramatically increase the probability that a new product will be successful in the marketplace. These new methods include:
- Environmental navigation eye-tracking and workload analysis
- Advanced emotional response user testing
- Advanced consumer preference testing
- 3D spatial tracking and UX optimization
- Electromyography / physiological effort testing
- Specialized eye-tracking for whole system UX optimization
- Newtonian force measurement
- Multi-factorial visual design testing and UX optimization
- Physical ergonomic optimization
- Cognitive learning decay modeling
- Mobile device utilization and cognitive resource allocation
- Wearable eye-tracking for UX optimization
- Ethnographic total user experience optimization (TUXO) and data mining
- Virtual world avatar behavior tracking and UX optimization
These methodologies vary in terms of cost, time to execute, and relative scientific validity. To be effective, they must be applied by individuals with advanced degrees in human factors engineering science. It is important to note that the science of human factors engineering as a professional discipline is not part of the normal skill set of the UX designers or industrial designers commonly charged with creating hardware or software solutions that offer high levels of UX and usability performance. All of the most successful hardware and software development teams in industry have teams of professional human factors engineering experts fully engaged during product design, testing, verification, training system development, and new feature/function identification.
When professional heuristics may be good enough Of all the methods listed above that SNAP could have employed to improve the UX performance of the SNAP Spectacles, the one with the shortest lead time and lowest cost is professional human factors engineering heuristics analysis. This methodology involves an audit of the proposed hardware platform early in the development process or at any point during later development; heuristics UX analysis must be executed before design freeze. To be effective, a heuristics analysis must be conducted by a highly experienced professional human factors engineer. The best results are obtained from a certified HFE professional (CHFP) or an individual with similar qualifications. Heuristics analysis was selected for the following discussion due to its low cost and fast execution time. Heuristics do not fully replace the other forms of respondent-based observational research listed above, but heuristics analysis, when properly applied, does produce robust insights into potential usability and UX optimization problems early in development.
The process is direct A professional heuristics analysis involves rating the product against a standardized set of twenty heuristic rules. The first step is to determine whether or not the hardware interaction design and physical product design violate a given rule and, if so, to determine the severity of the violation. Violation severity is rated on a scale from 1 to 5, with 1 being “no violation and no usability and UX impact” and 5 being “extreme impact likely to significantly degrade usability and UX performance”. Below are four of the twenty rule ratings for the SNAP Spectacles. The original analysis on which this article is based involved ratings across all twenty heuristic rules. The SNAP Spectacles violated all twenty, and the severity ratings were high in almost all dimensions. The important point is that by executing a low-cost, rapid-response heuristic analysis, SNAP could have understood, and in fact predicted, the failure of the current hardware in the marketplace. This type of analysis could have been executed at any point prior to design freeze. In our professional usability optimization practice, heuristics analysis has been conducted as early as the paper prototype phase, with the resulting analysis identifying serious usability and UX performance problems at the earliest stages of development. Below is a matrix of four heuristic tests and rule ratings for the production SNAP Spectacles.
Violation: Yes – Severity: 5
Problem: All feedback provided by the Spectacles is presented as light patterns on either the outward-facing light display or the inner hinge of the glasses. The display states require the user to learn an entirely new set of information formats with an associated mapping to device functions, yet prior instruction is required to know what each of these light patterns corresponds to. Thus, all critical feedback requires secondary processing of functional information not provided in the device’s instructional materials.
Violation: Yes – Severity: 4.5
Problem: The outward-facing light display is not in view when the user is wearing the Spectacles. Thus, the user is unable to check the battery level while wearing the Spectacles. The hardware design does not make clear which elements of the device control critical functions or how one interacts with device elements to make productive use of the SNAP Spectacles. The device reflects poor control/display compatibility.
Violation: Yes – Severity: 4.5
Problem: All critical device interface functional states are presented through the same basic visual status ring light display. By displaying different information in the same display format, the device requires the user to expend excessive cognitive workload when attempting to understand key device states. Overloading a single display mode in this way, rather than employing multi-modal displays, violates fundamental HFE science.
Violation: Yes – Severity: 5
Problem: When the initial pairing of the Spectacles to the mobile device is unsuccessful, the user is not provided information about why the error occurred or how to correct it. One is left to random walk the device and online interface looking for possible solutions to the initial pairing problem. The user must eventually search the web for assistance, to be found in videos posted on YouTube by other frustrated users. This is especially problematic during the first user experience (FUE), as the instructions provided with the Spectacles suggest users’ first steps should be to turn their phone’s Bluetooth on, install and open the latest version of Snapchat, swipe down in the app to view their Snapcode, and press the button atop the left hinge of the Spectacles to pair them with their mobile device. In reality, the Spectacles must first be charged before they can be paired with a mobile device and used. Users do not receive this information from the device or the mobile app. Users cannot understand and manage critical error states during pairing, battery level detection, and interfacing the Spectacles with the SNAP on-screen App interface. In these types of devices, error state management builds user confidence and over time contributes in a major way to brand value. There is more than one reason that Apple has a Genius Bar…think error state management.
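The pairing failure described above is, at root, an unreported precondition. As a hedged illustration of the error-state-management principle (the function name, messages, and battery threshold below are assumptions for the sketch, not SNAP's actual logic), a device companion app could surface the first failed precondition directly instead of failing silently:

```python
def pairing_guidance(battery_pct, bluetooth_on, app_installed):
    """Return an actionable message for the first failed pairing
    precondition, rather than leaving the user to a random walk."""
    if battery_pct < 10:  # illustrative threshold, not SNAP's actual value
        return "Charge your Spectacles before pairing."
    if not bluetooth_on:
        return "Turn on Bluetooth in your phone settings."
    if not app_installed:
        return "Install and open the latest version of Snapchat."
    return "Ready to pair: press the button atop the left hinge."
```

Checking preconditions in the order users hit them, and reporting the first failure in plain language, is the essence of the error-state management the Spectacles lacked.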
One can see from the heuristic analysis above that the SNAP hardware had the most basic usability and UX optimization problems and that the severity ratings were on the extreme end of the scale. These findings were consistent across all twenty rule assessments. Clearly, this hardware was going to cause consumers a high degree of usability and UX performance pain, and this was totally knowable months before design freeze. Even if SNAP employed a hardware accelerator approach in the development of the Spectacles, it would have been trivial to conduct professional human factors heuristic reviews during acceleration. Leaving usability and UX optimization to the UX design and hardware engineering teams is a clear path to problems in the marketplace for innovative new products like the SNAP Spectacles.
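The rating procedure just described (twenty rules, violation yes/no, severity 1 to 5) reduces naturally to a small audit record. A minimal sketch, with hypothetical rule names and illustrative scores rather than the original audit's actual rule set:

```python
from dataclasses import dataclass

@dataclass
class HeuristicRating:
    rule: str        # heuristic rule being assessed
    violated: bool   # does the design violate the rule?
    severity: float  # 1 = no violation/impact ... 5 = extreme impact

def summarize(ratings):
    """Aggregate an audit: count violations and flag severe ones (>= 4)."""
    violations = [r for r in ratings if r.violated]
    return {
        "rules_assessed": len(ratings),
        "violations": len(violations),
        "severe": [r.rule for r in violations if r.severity >= 4],
    }

# Illustrative ratings only; the rule names are stand-ins, not the
# standardized twenty-rule set used in the original analysis.
audit = [
    HeuristicRating("Visibility of system status", True, 5.0),
    HeuristicRating("Control/display compatibility", True, 4.5),
    HeuristicRating("Error state management", True, 5.0),
    HeuristicRating("Match to user expectations", True, 4.5),
]
report = summarize(audit)
```

The value of keeping the audit in this form is that severity-weighted findings can be tracked release over release and compared directly against the design-freeze gate.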
Really, how important is the usability of the SNAP Spectacles? As the relatively simple professional heuristic analysis above shows, the usability of the SNAP glasses is a major problem, but ultimately the larger problem is simply a failure to allocate functions that would have had an emotional resonance with the existing user base of the SNAP social media platform. The Spectacles simply fail on the most important structural level in terms of feature/function allocation and mapping to the underlying engagement model of SNAP. Even if outstanding usability performance were present in the SNAP hardware, the Spectacles would likely have been a failure in the marketplace. The fundamental question that should have been asked by Wall Street, and now by investors in SNAP, is how any new hardware (or, for that matter, software innovation) engages and expands the SNAP user experience that currently drives its user base to higher levels of engagement. In the field of formal usability science, this is known as function allocation. What can the SNAP Spectacles do better, actually much better, than the user’s smartphone? The answer? Nothing. Producing a novel image capture channel without attendant engagement innovations is, and always has been, a non-starter.
SNAP is not alone But let’s be clear: SNAP is not the only major high-tech entity to attempt the development of spectacle-based data capture and information display hardware. By far the largest and most visible failure was Google Glass. It is interesting to note that Google Glass offered an unprecedented technology solution for data capture and data display, and it did so in a potentially transformative manner. However, Google Glass was a massive failure of industrial design. It was and remains a primary example of how the visual appearance of a product is far more complex than most industrial designers realize. In the case of Google Glass, the industrial design team failed entirely to realize that any form factor positioned directly on the user’s face produces an extraordinary amount of cognitive impact in terms of impression and projected meaning on the part of those observing and wearing the product.
The human face is a new frontier for hardware design The perception of faces is so important in our evolutionary and day-to-day existence that face perception has its own primary neurological center in the human brain. Anything associated with the human face is loaded with special significance and embedded meaning. Objects positioned on the face undergo instantaneous assessment when viewed by others in one’s social sphere. The industrial designers of Google Glass applied their own biased visual style to create a wildly differentiated and high-tech visual impression, an impression that instantly communicated the negative functional attributes of privacy over-reach and high-tech elitism. Google Glass was dead in the water from day one based on the naïve application of an industrial design visual style theme that was both inappropriate and psychologically off-putting. This was an unfortunate mistake on the part of the Google Glass industrial design team, because the underlying technical platform was and is exceedingly innovative and potentially useful. Today, in certain occupational applications where visual style is moderated by professional need, Google Glass is apparently finding a rich set of new applications. In the same way that the SNAP Spectacles failed to drive customer engagement with SNAP’s core platform, Google Glass failed to drive customer acceptance of its core feature set based on how it appeared visually, not how it functioned technically.
Heat map of a pharmaceutical dosage table generated from MUS eye-tracking data, showing confusion over a specific dosage combination required to properly deliver the associated drug.
The future of visual gaze tracking hardware Even though SNAP and Google have produced major flameouts in the design and production of these types of interfaces, spectacle-based data capture and information display as a structural concept remains an area of massive potential. Hardware that successfully integrates data capture of the environment with the actual fixations of what the user is viewing produces exceedingly powerful insights into how we navigate our everyday lives and make decisions in a rapidly changing, technology-based world. We know from our work applying advanced eye-tracking technology to consumer research that such a paradigm is very powerful. Take, for example, a recent study undertaken by our UX Research Lab on medical device design and instructions for use.
Eye-Tracking as a research and data capture tool As noted in the list earlier in this article, one of the advanced usability and UX optimization testing methodologies available to hardware development groups is head-mounted eye-tracking. This methodology uses advanced data capture glasses to track the user’s entire visual search behavior. It can record and provide highly reliable research data on what the user is viewing, how long they view certain information, which information they return to during a given task, which information they fail to view or read entirely, and a range of more sophisticated measures, including the relative cognitive workload required to deal with a hardware device and/or its instruction set. The following example is drawn from a recent study conducted by our UX Labs Group examining the relative usability of a consumer-facing blood pressure device sold in pharmacies. This study used eye-tracking to determine where in the first user experience (FUE) consumers and users of this type of device encountered critical errors and confusion. The image below shows the component parts of the study setup. The research included an examination of the total user experience (TUX), including unboxing and use.
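To make the metrics concrete, the sketch below shows, in simplified form, how raw fixation data of the kind described above can be aggregated into per-area dwell times and fixation counts. The fixation format, coordinates, and area-of-interest (AOI) names are illustrative assumptions, not the actual MUS analysis pipeline.

```python
# Hypothetical sketch: aggregating raw eye-tracking fixations into per-AOI
# (area of interest) dwell-time and fixation-count metrics, the basic
# quantities behind "how long they view" and "which information they return to".
from collections import defaultdict

# Each fixation: (x, y, duration_ms), coordinates in screen pixels (assumed format).
fixations = [
    (120, 80, 240), (130, 85, 310),    # near the battery-insertion step
    (400, 300, 180), (410, 310, 150),  # near the cuff diagram
    (125, 90, 500),                    # return visit to the battery step
]

# AOIs as axis-aligned rectangles: name -> (x0, y0, x1, y1); names are invented.
aois = {
    "battery_step": (100, 60, 200, 120),
    "cuff_diagram": (380, 280, 480, 360),
}

def aoi_metrics(fixations, aois):
    """Return per-AOI total dwell time (ms) and fixation counts."""
    dwell = defaultdict(int)
    counts = defaultdict(int)
    for x, y, dur in fixations:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                dwell[name] += dur
                counts[name] += 1
    return dict(dwell), dict(counts)

dwell, counts = aoi_metrics(fixations, aois)
print(dwell)   # {'battery_step': 1050, 'cuff_diagram': 330}
print(counts)  # {'battery_step': 3, 'cuff_diagram': 2}
```

An AOI that attracts many short return fixations versus one long dwell tells very different usability stories, which is why both measures are typically reported together.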
Visualizing the cognitive problem The image below shows one of the study’s heat maps, which demonstrates that during this respondent’s first-time use of the hardware, they experienced high levels of confusion and error around a specific set of images and text during initial setup. As can be seen from the image, the user focused intense visual attention on a small set of instructions and ignored most of the remaining procedures. The confusion in this example shows how important it is to test even simple instructions together with the actual hardware. In this study, the user failed the first-time use of the device because they skipped entirely the step requiring insertion of a set of AA batteries. They continued attempting to take their own blood pressure with the device for several minutes before finally giving up. From a human information processing point of view, the cognitive complexity of this device is comparable to that of the SNAP Spectacles, based on the number of steps required to achieve initial success, the relative complexity of those steps, the prior learning the user brought to the task, and the device’s interconnection with an external app for recording and tracking. The confusion shown in the heat map is due to the insertion of a very low-frequency use case exception into the task flow. Such deviations are well known in human factors engineering science to dramatically increase users’ cognitive workload and critical errors. This is similar to the SNAP Spectacles problem that occurs if one happens NOT to update to the latest SNAPCHAT smartphone application.
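Conceptually, a fixation heat map of the kind discussed above is built by having each fixation deposit a duration-weighted Gaussian "blob" of attention onto a grid overlaying the stimulus. The sketch below illustrates that accumulation; the grid size, kernel width, and fixation values are assumptions for demonstration, not the study's actual parameters.

```python
# Hypothetical sketch: building a fixation heat map by accumulating
# duration-weighted Gaussian kernels onto a grid over the instruction sheet.
import math

def heat_map(fixations, width, height, sigma=2.0):
    """Accumulate one duration-weighted Gaussian per fixation into a 2D grid."""
    grid = [[0.0] * width for _ in range(height)]
    for fx, fy, dur in fixations:
        for y in range(height):
            for x in range(width):
                d2 = (x - fx) ** 2 + (y - fy) ** 2
                grid[y][x] += dur * math.exp(-d2 / (2 * sigma ** 2))
    return grid

# Illustrative data on a 20x10 grid: a heavy cluster on the left (analogous to
# the over-attended instruction block) and one lighter fixation on the right.
fixations = [(4, 5, 300), (5, 5, 400), (15, 4, 100)]
grid = heat_map(fixations, width=20, height=10)

# The hottest cell falls inside the heavy left cluster.
peak = max((v, x, y) for y, row in enumerate(grid) for x, v in enumerate(row))
print((peak[1], peak[2]))  # -> (5, 5)
```

In practice the accumulated grid is color-mapped (red for high values) and overlaid on a photograph of the stimulus, which is exactly the rendering shown in heat maps like the one discussed here.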
Where is the big opportunity? It is clear to those involved in professional HFE research that the types of usability and UX performance problems seen in the SNAP hardware were entirely avoidable through the application of standardized professional usability testing methods. What is less obvious is how more advanced research methods, such as behavioral response testing of product features and 3D spatial tracking, could have been used by SNAP to provide users with an entirely new set of functions, driving deeper platform engagement, increasing user acquisition for the online platform, and generating profits from exceptional hardware design. In the case of Google Glass, multi-factorial visual design testing and UX optimization research would likely have saved an amazing product and given Google deep insight into how industrial design solutions are tested and optimized for the long term.
Chris Morley, M.S. Human Factors Engineer / Aileen S. Gabriel, Human Factors Engineer
About MAURO Usability Science Founded in 1975, we are among the most experienced international consulting firms focused on helping world-class clients and leading startups solve business-critical problems related to the usability and interactive quality of their products and services. In short, we help make complex products simple and simple products empowering. We are proud to have solutions running at the heart of the world economy.