2007 Annual User Experience Design Review

Overview

2007 was a significant year for user experience design. Several UED innovations fundamentally altered the way users will interact with important technology platforms in the future. Most notable was the introduction of the iPhone, which changed how mobile Telco systems are developed and presented to users. Important user experience design innovations in gaming applications were Guitar Hero and the Nintendo Wii. Google Docs received kudos, but with interesting reservations. Recent developments at MTV are also noted.

The 2007 Annual User Experience Design Review also discusses the year's notable user experience design missteps. We make special note of Wall Street's obsession with algorithm-based risk management systems that in recent years have progressively removed the human component from risk assessment. The Review looks at Facebook/Beacon and why it raised so much concern with users and the national media. In a new section called "Viewpoint on Social Change and Technology", the authors ask critical questions about the viability of the One Laptop Per Child (OLPC) program being promoted as a solution to third-world education. Here is our take on the most noteworthy events in user experience design for 2007.

Milestones of 2007

Apple iPhone

Analysis: The most important UED introduction of 2007 was the Apple iPhone. In terms of total units sold, the iPhone was barely a blip on the sales radar of the more than one billion cell phones purchased worldwide last year. However, in terms of short-term media impact, the iPhone was an unqualified winner. The iPhone media coverage eclipsed that of any other product launch in recent history.

Looking at mid-term impact, the iPhone fundamentally altered the way mobile technology platforms that interface with Telco systems will be developed and presented to the customer. It will be far more difficult for Telco giants to dictate the configuration, feature set, and interface of mobile telephony. Those days are gone and reverberations were felt from South Korea to Finland.

However, the most important impact of the iPhone will be long-term. For the first time, a world-class corporation has shown convincingly that it is possible to develop a mobile configuration which includes hundreds of features while maintaining a highly engaging user experience. Apple created a market-altering combination of size, weight and user experience fluency. This will fundamentally change mobile platform user experience design over the long term.

Predictions / 2008: Apple will continue to dominate the market for this level of device. Other major cell phone manufacturers will scramble to catch up but find a competitive solution elusive. Major Telcos will rethink their negotiation strategies (Verizon, are you out there?). Other major mobile device manufacturers will waste time with Android, the open platform alliance for mobile devices that fails to acknowledge that the future is all about the user experience, not infrastructure.

Interesting side note: One of the most frequently searched terms on Google last year was “iPhone”.

Guitar Hero and Nintendo Wii

Analysis: It has always been curious that some of the best user experience design solutions have been developed in gaming systems, yet few of these innovations find their way into mainstream user experiences. Both Guitar Hero and the Nintendo Wii are exceptional examples of how creating a tighter connection between the user's interaction with the physical world and a screen-based display produces high levels of engagement and commercial success. Guitar Hero and several games for the Wii take advantage of the customer's ability to acquire skills through the use of familiar real-world gestures and actions. By creating an interface between these actions and the feedback mechanisms of the game, these products do one very important thing: they tighten the connection between the user and the user experience.

However, the reason why these systems will have long-term impact is counter-intuitive. Upon deeper analysis, it becomes clear that their success is actually an example of advanced virtual reality development. Users become psychologically immersed in the virtual experience of the game by the creation of a more fluent physical interface with the game-play model. This form of interactive fluency is also present in the iPhone interface.

Predictions / 2008: Nintendo will sell the Wii in huge numbers. Guitar Hero will continue to create millions of unskilled guitar players. Major game companies will announce new divisions focused on this new user experience model.

MTV

Analysis: MTV, the largest and most successful purveyor of youth culture in the world, did a surprising thing in 2007. At a time when other mass media companies were stuffing their sites with FLASH, MTV took down its FLASH site and put up a surprisingly simple HTML site. The new user experience and related sub-sites were focused on providing the fastest and most direct access to content associated with MTV media properties. Out was the FLASHY user experience. In was fast navigation, easy way-finding and improved SEO.

This change went essentially unnoticed by the media but was, in our opinion, a precursor to significant changes in the mass media landscape with respect to user experience design. In the mid-term, we believe that other mass media companies will begin to understand that usability of the overall customer experience drives acquisition, retention, and migration. Over the long term, mass media companies will be forced to employ more rigorous customer experience research methods to successfully migrate from the traditional CPM advertising model to a new model based on monetizing their content through more robust online user experiences. In the new mass media landscape, it may be better to deeply engage existing customers than to buy new ones through acquisitions.

Second Life and VMTV: At the same time that Second Life was imploding as a platform for creating compelling corporate VR customer experiences, MTV quietly launched no less than 5 new virtual worlds based on the Makena platform. One of these MTV virtual media properties recently received an Emmy (Gold) Award for Outstanding Achievement in Advanced Media Technology. This suggests that there is more to corporate virtual life than Second Life.

Predictions / 2008: New solutions to the complex problem of cross-platform media distribution will begin to form, but no meaningful changes will occur until 2009. Other mass media companies will begin to understand the relationship between customer experience design, content development, and profitability in web-based delivery channels. In virtual reality, new properties will surface that prove the viability of these platforms for certain applications.

Google Docs

Analysis: First, a disclaimer: we use Google Docs. Second, they are as promising as they are frustrating. The idea of internet-based applications designed to replace MS Office goes back to the original Netscape. Many smart folks have gotten up in the morning with a plan, venture capital funding, and a bevy of clever software developers – all with a bead drawn on MS Office. One might even say that Google is the ultimate version of this model. Yet success evades even Google, for as anyone who uses GooDoos (the term we coined for Google Docs) knows, they are both wondrous and highly dysfunctional. It is our opinion that in the short and mid term, Google Docs will have a relatively low impact. However, if one peers slightly over the horizon at the next generation of hardware and software operating systems and focuses on Google's plans for cloud computing, a much different picture appears. In the long term, the game is over: cloud-based applications will take the day for these types of user experiences.

Surprisingly, the final solution will be far more complex than Google assumes. Simply mapping MS Office functionality into an impoverished screen-based (but cloud-focused) interface will not win the day. Who will figure this out remains an open question. Aside from the fact that Google may well own the cloud, it does not, and probably never will, own the user's local platform employed to run these new cloud-based applications.

From extensive research with clients, we know that users' machines are surprisingly outdated, underpowered, and unreliable, and that a majority of users have little knowledge of how to maintain and manage their computers' operating systems. Recent studies conducted by MauroNewMedia confirmed that fewer than 15% of those using MS Windows knew how to disable their personal firewall. Only about 55% knew how to turn off their pop-up blockers. It is at the end of the line, where the pixels hit the glass, that cloud-based applications hit the wall in terms of achieving large-scale impact.

In our consulting work with some of the most advanced cloud-based interfaces, we rarely see development teams evaluating user experiences on an objectively determined range of user hardware and software configurations. When this type of research is executed in an unbiased and professional way, startling insights result: many user experiences that are wonderfully fluent and productive on the engineering team's development system are unreliable, slow, and unsatisfying to use in the field. This makes cloud-based applications complex both from a computer science perspective and in a cognitive engineering sense. One must wonder whether an engineering-centric culture like Google has the user-centric bandwidth to solve this vexing problem.
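
To make the problem concrete, below is a minimal sketch of what testing a cloud application against an objectively determined configuration matrix might look like. This is purely illustrative: the configuration dimensions, the latency budget, and the stubbed latency model are our invented assumptions, not data from any actual study or test harness.

```python
import itertools

# Hypothetical configuration matrix. The dimensions and values are
# illustrative assumptions, not measurements from any real study.
CONFIG_MATRIX = {
    "os": ["Windows XP", "Windows Vista", "Mac OS X 10.4"],
    "browser": ["IE6", "IE7", "Firefox 2"],
    "ram_mb": [256, 512, 1024],
    "bandwidth_kbps": [128, 768, 3000],
}

# Usability budget: interactions slower than this are assumed to frustrate
# users (the threshold itself is an assumption for this sketch).
MAX_ACCEPTABLE_LATENCY_MS = 2000

def estimated_latency_ms(config):
    """Stubbed latency model so the example runs end to end.

    A real harness would drive the actual application on real machines;
    this stub simply penalizes low RAM, low bandwidth, and old browsers.
    """
    latency = 400.0
    if config["ram_mb"] <= 256:
        latency += 300
    latency += 1500 * (768 / config["bandwidth_kbps"])
    if config["browser"] == "IE6":
        latency += 200
    return latency

def run_matrix():
    # Enumerate every combination in the matrix and flag the failures.
    keys = list(CONFIG_MATRIX)
    failures = []
    total = 0
    for values in itertools.product(*CONFIG_MATRIX.values()):
        total += 1
        config = dict(zip(keys, values))
        latency = estimated_latency_ms(config)
        if latency > MAX_ACCEPTABLE_LATENCY_MS:
            failures.append((config, latency))
    print(f"{len(failures)} of {total} configurations exceed the latency budget")
    for config, latency in failures[:5]:
        print(f"  {latency:.0f} ms: {config}")

if __name__ == "__main__":
    run_matrix()
```

The point of even a toy harness like this is the enumeration itself: the development team's machine is only one cell in the matrix, and the failures cluster in the cells that real users actually occupy.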

Predictions / 2008: Google will plug away at GooDoos and make incremental improvements but will put major focus on the larger cloud-computing problem. Microsoft…well, your guess is as good as ours. Microsoft's primary business problem makes cloud computing look simple.

Notable missteps of 2007

Wall Street risk management systems

Analysis: Risk management on Wall Street is out of control. Over the past 15 years, MauroNewMedia has been involved in research and design of user interfaces for several of Wall Street's most complex trading and risk-management systems. During this period, we have witnessed a staggering shift in how risk is managed by these new automated, screen-based systems. That shift has been the aggressive removal of the "human component" from many complex risk management systems. Gone are human intelligence and experience. In are increasingly complex mathematical models that attempt to supplant human pattern-finding in complex data.

Wall Street paid the price for this shift, and it was a big bill. In the end, several trillion dollars will have vanished in the sub-prime mortgage debacle. Short term, this problem will have minimal impact, as banks will sell equity to retain balance-sheet performance. In the mid and long term, the design of algorithm-based risk management systems, and the user interfaces that visually represent risk for these types of financial transactions, must change.

It is clear that human intelligence cannot be completely replaced by mathematical models, no matter how many PhDs are stuffed into the software development process. Our bet is that the CEOs of Citibank, Merrill Lynch and other leading banks had no way to adequately visualize the risks they entered into with these hideously complex transactions. What we are proposing is a new combination of human intelligence and the best analytics possible. This is known in the field of professional human factors engineering as a "function allocation" problem, in which key tasks are dynamically allocated between human control and automation.

It is interesting to note that Goldman Sachs profited massively from having PRECISELY the right analytics and hedging strategies, which let them cash in all the way down to the bottom of the subprime mess. It is safe to say that Goldman Sachs had a better function allocation solution than the banks that lost billions during the same period. This was not a good year for understanding and managing risk on Wall Street, and it was an even worse year for all those customers who bought homes with mortgages they never understood. 2007 was a low point in customer experience design in the financial services industry!
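
For readers unfamiliar with the term, here is a minimal sketch of function allocation in code. Everything in it is an invented illustration of the concept: the thresholds, field names, and routing rule are our assumptions, not a description of how any bank's system actually works. The essential idea is that routine, well-understood decisions stay automated, while large, exotic, or low-confidence decisions are escalated to a human.

```python
from dataclasses import dataclass

@dataclass
class TradeAssessment:
    exposure_usd: float         # size of the position at risk
    model_confidence: float     # 0.0-1.0 confidence of the risk model
    instrument_complexity: int  # 1 (vanilla) to 5 (exotic/structured)

# Illustrative thresholds -- assumptions for this sketch only.
MAX_AUTOMATED_EXPOSURE = 5_000_000
MIN_CONFIDENCE_FOR_AUTOMATION = 0.9
MAX_AUTOMATED_COMPLEXITY = 2

def allocate(a: TradeAssessment) -> str:
    """Dynamically allocate a risk decision between automation and a human.

    Automation keeps the routine, well-understood decisions; anything large,
    exotic, or low-confidence goes to a human analyst for judgment.
    """
    if (a.exposure_usd <= MAX_AUTOMATED_EXPOSURE
            and a.model_confidence >= MIN_CONFIDENCE_FOR_AUTOMATION
            and a.instrument_complexity <= MAX_AUTOMATED_COMPLEXITY):
        return "automated approval"
    return "escalate to human risk analyst"

if __name__ == "__main__":
    routine_trade = TradeAssessment(1_000_000, 0.97, 1)
    structured_cdo = TradeAssessment(250_000_000, 0.55, 5)
    print(allocate(routine_trade))   # -> automated approval
    print(allocate(structured_cdo))  # -> escalate to human risk analyst
```

The design point is not the particular thresholds but that the allocation between human and machine is explicit and reviewable, rather than buried inside the mathematical model.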

Predictions / 2008: No change is expected.

Facebook/Beacon

Analysis: Tinkering with a winning user experience sometimes results in outsized responses from users and the media. No other user experience design issue in 2007 created more bad press for a company than Facebook/Beacon. This automated behavior tracking and posting function was a low point in user experience design for 2007. While the failed implementation of this system caused an uproar with users, the bigger problem for Facebook was the amount of bad press Beacon created in the national media. For example, Louise Story of the New York Times took Facebook to task over Beacon in a series of interesting pieces that focused on Facebook's attempt to circumvent user privacy issues. Facebook made the problem worse by forcing users to search for the link allowing them to disable the new feature. In the short term, this problem will not go away for Facebook, and in the mid term, Beacon may have damaged the credibility of Facebook as a user-centered social networking destination. Beacon provides important lessons from which other companies entering the social networking space can benefit.

Why Facebook has such a large user base: The success of Facebook has been based on providing users with a critical combination of simplicity and control. This is known in professional human factors engineering as functional transparency. It is interesting to note that the iPhone user experience also presents such a balance. In Facebook, this transparency gave users total control of the flow of personal information into and out of their profiles. The combination of simplicity and control resulted in increased social interaction for most users and produced a significant network effect in terms of registration. However, Facebook made a critical error with Beacon in terms of user experience design.

How control impacts complexity: With Beacon, Facebook psychologically removed (or degraded) the “control” attribute from the combination of simplicity and control. Users no longer believed that they had complete control of their Facebook profiles. What we know about all successful technology (and related man-machine interfaces), from the iPod to the Space Shuttle, is that a sense of “control” is all-important. With a loss of control, the user’s impression of complexity increases even if the interface remains unchanged in all other ways. While Facebook executives may have thought they were only impacting the attribute of “control”, they were also affecting the user’s impression of simplicity. Facebook/Beacon is an example of how making changes to a major user experience design can have deep implications. Did Facebook test Beacon in an unbiased, professional user experience study? Maybe, or maybe not!

The larger conceptual problem: Facebook/Beacon was an example of clumsy automation. For those who understand how to professionally measure human interactions with technology, automation is frequently another word for complexity. Strategically, Beacon was a failed attempt to use automation to deliver ad impressions and to build brand connection by peer association. Beacon opens the door for new social networking platforms to bring back the proper balance between simplicity and control. These two attributes are the fundamental building blocks of successful social networking user experiences. Unfortunately, monetizing these two attributes is far more complex than Facebook imagined.

Predictions / 2008: Facebook will continue to have problems monetizing its massive user base. New social networking sites will emerge in 2008 that begin to capture smaller, specific segments of the Facebook user base that are not well served by the Facebook user experience.

Viewpoint on social change and technology

The goal of this new section is to provide a forum for increased dialogue around technology platforms or systems that make broad claims for impacting positive social change. The One Laptop Per Child (OLPC) project is such a system.

One Laptop Per Child (OLPC)

Analysis: It is difficult to take a negative view of what appears to be an attempt to do profound social good. The OLPC project, under the direction of Nicholas Negroponte (formerly head of the MIT Media Lab), has been controversial from the beginning. For those involved in research at the interface between technology and the human mind and body, the OLPC project seemed like a technology solution looking for a problem. Sometimes, in the march to create technological solutions to vexing social problems, we forget that technology in its best form should support and expand human potential in both a social and individual context. Here are three questions we have about OLPC. These questions are intended to expand the dialogue about the objectives and the execution of the OLPC program.

Q1: Why a non-standard user experience interface? It is true that the esteemed graphic design firm Pentagram created a radically new and visually interesting user interaction model for the OLPC computer. The question is why? What was the real, objective benefit of creating a totally new interface for millions of children in the third world? It is often surprising to those who study such things that the real cost of a personal computer is not the hardware and software but rather the training time spent learning how to use the system. From this data, one might assume that a system that is easier and faster to learn is better. But this is not a simple question. For example, billions of man-hours have already been spent learning to use the MS Office interface and other standard GUIs. From research, we know that much of this training takes place between experienced users and novice users: users teach users, and MS Windows and other standard GUI interfaces have several billion users who know how to use the software. No one is suggesting that MS Office applications are a paragon of usability (far from it), but why did the OLPC team throw away billions of man-hours of training for a totally new interface? If supporting research exists, then problem solved. Without such research, introducing a new interface does not appear to have been a decision in the best interest of the world's population of OLPC users.

A bigger problem: What good is it to teach millions of children to use an interface that the rest of the productive world will never use? Doesn't this approach run contrary to doing good for those children? Regardless of what Mr. Negroponte thinks, these children are going to learn MS Office, or another standard GUI, when they reach middle or upper school. Long term, OLPC hurts more than it helps because these kids end up with no transferable skills. The solution of a stripped-down MS Windows, or even a Linux version of the standard GUI, may have been repugnant to Mr. Negroponte, but in the end it may have been a better solution for the greater good of the children who may use the OLPC computer.

Research challenge to OLPC: If a non-standard interface requirement was defined by objective and professionally executed field research, then the publication of such research is necessary to support the basic premise of the OLPC user experience design.

Q2: Why the student and not the teacher? If one were to research comprehensively the factors that impact acquiring the skills, rules and knowledge required to navigate the modern world, several key factors surface. One of the critical factors (some say the most critical factor) is the teacher and his or her ability to convey both content and compassion in the classroom. Wouldn't it have been a potentially better solution for OLPC to focus on creating tools for enhancing the teacher's experience first? In other words, "One Laptop Per Teacher", with higher levels of functionality, communication, and, most important, access to other teachers who can aid in defining and propagating teaching methods in complex and difficult third-world environments. In environments where basic social and political processes are broken, wouldn't it have been more productive to focus high levels of innovation on the teacher? This shift changes three critical problems for OLPC: it changes the pricing model, refocuses training on factors that impact teacher effectiveness, and opens up the hardware platform options on a massive scale.

Research challenge to OLPC: If the need to focus a laptop on the student vs. the teacher was defined by objective and professionally executed field research, then the publication of such research is necessary to support the basic premise of the OLPC system overall.

Q3: Why not Intel and a standard hardware platform? A recent article in The New York Times made clear that Intel, a supporter of OLPC, no longer wished to be associated with the project and would not develop a proprietary processor for the OLPC computer. The breakup was based on Mr. Negroponte's insistence that Intel stop marketing its basic PC to the same countries OLPC is attempting to capture. Mr. Negroponte was especially upset by comparisons between OLPC and the low-cost Intel computer, which sports a stripped-down version of the standard GUI. What can Mr. Negroponte have been thinking? Not only is it totally appropriate for Intel or anyone else to sell to these customers, but it is also in the best interest of millions of children that they do so. Is it possible that, in the larger context, a low-cost computer that builds the skills of kids over the long term is a better solution? Totally objective technical evaluations will ultimately determine the answer, but in the final analysis, OLPC may be another clever computer project in search of a problem, one that is not helping to educate the impoverished kids of the world.

Research challenge to OLPC: If it has been shown through objective and professionally executed field research that the OLPC design is superior in terms of usability, functional feature-set, reliability, technical support, and upgradeability, then the publication of such research is necessary to support the basic premise of the OLPC solution overall. If this data exists, there should be no reason other manufacturers cannot respond with solutions to the same problems without pressure from OLPC.

UED Review background information

About the impact rating scales:
Short-term = ability to bring attention to a user experience design innovation that resulted in media exposure and buzz over the past 12 months
Mid-term = ability to influence a specific category of products or services in terms of user experience design over the next 1-2 years
Long-term = ability to bring about a fundamental industry-changing user experience design innovation for a basic technology platform or large industry sector over the next 3-5 years

Disclosure is important to us:
Over the past 30 years, MauroNewMedia has been retained by many leading corporations, start-ups and non-profit entities. The placement of products on this list is entirely independent of relationships MauroNewMedia has with these companies. The opinions expressed in this review are the sole opinions of MauroNewMedia. We receive no compensation of any type for products reviewed and presented in the Annual User Experience Design Review.

Actual release dates of the selected systems:
We note that some of the systems presented in the 2007 review were not actually launched in 2007. Inclusion was based on these systems achieving significant momentum during the 2007 time frame.
