Designing for stressful situations

On Saturday 13th January 2018, residents of the US state of Hawaii received an alert on their phones. Instead of a text from a relative, or the latest football scores, the alert carried the ominous message: “BALLISTIC MISSILE THREAT INBOUND TO HAWAII. SEEK IMMEDIATE SHELTER. THIS IS NOT A DRILL”. In an environment where the threat of nuclear war felt all too real, stoked by the US President’s tweets and by the continual goading and threats broadcast on North Korea’s state television networks, the announcement was met with derision, confusion and widespread panic. It later turned out, following a second message indicating that the first was a “false alarm”, that the alert had been sent by an employee during a shift change at the Hawaii Emergency Management Agency, who had intended to send a “test” message instead of a “live” one. The mistake resulted in a public apology from the Governor of the state, and much embarrassment amongst those responsible.

Much has been said in the days following the incident about how such a mistake could have been made. Aside from the general panic, it carries the extra element of “cry wolf”: if the alert is sent out again, those receiving it will now view it with a level of skepticism, instead of immediately acting upon the advice – something which is crucial in the estimated 20 minutes it would take for a nuclear missile to reach Hawaii from North Korea.

Aside from the social or political elements of this situation, it also raises some questions about the design of the system built to send out these alerts, and how to ensure that operators send the message they actually intend, whether “test” or “live”. Two luminaries of the design world, Don Norman, author of The Design of Everyday Things, and Jared Spool, founder of UIE and the Center Centre, have both written insightful pieces on what went wrong, and how it can be prevented in future through better design principles. Writing on Co.Design, Norman’s piece focusses on the fact that we like to blame people rather than poor design, and that the system should have been thoroughly tested before it went live. Spool’s piece on Medium focusses on the system’s poorly chosen file names, as well as its poorly designed software.

Both pieces come to a similar, and vital, conclusion. When we design and build a system, we should always consider what it will be like for someone to use it under stressful conditions. We may never find ourselves designing alert mechanisms for incoming nuclear missiles, but we should still consider how easy our webpage, app or system is to use when the user is pushed by a tight deadline, for example, or is being distracted in some way. Interestingly, my workplace at this very moment is inflicting stress upon my colleagues and me: work is being done on the roof, and we are frequently subjected to loud bangs, or the sound of hammering and drilling. These interruptions distract us from what we’re doing, causing us to lose our place in our work and forcing us to spend time reacquainting ourselves with the task at hand. In this situation, it is common for us to make mistakes as our brains make incorrect assumptions, which we then have to notice and correct.

So, how do we design for these stressful situations? Thankfully, much of this is already incorporated into the skills we learn in User Experience Design. Giving structure to pages and forms, giving visual dominance to important elements, and clearly labelling and colouring key features can all help the brain work out more quickly what needs to be done. We also need to consider the implications of a user making a mistake. How easy would it be to choose the wrong thing? Can an error be easily rectified? Does the system indicate the user’s selections clearly, so that they can go back and fix them later? We can even test these things ourselves – take a prototype or a test version of the system, sit in a busy or noisy environment, and ask people to try it. Compare this with people in a quiet, calm place, and see if there is a difference in successful outcomes. The more we trial the system or software, the more sure we can be of its success. Good design and testing can improve revenue or experience, but some day, as in the case of the Hawaii missile alert, it might just save our lives.
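To make the error-prevention idea concrete, here is a minimal sketch, in TypeScript, of one way a system like Hawaii’s might guard the dangerous path behind a deliberate confirmation step. Everything here is hypothetical – the names (`AlertMode`, `canSend`), the confirmation phrase, and the overall shape are my own illustration, not the agency’s actual software – but the underlying pattern, forcing a typed confirmation for a high-consequence action while keeping the routine action low-friction, is a common safeguard.

```typescript
// Hypothetical sketch: the real alert system's design is not public.
// The idea: a "test" send stays easy, but a "live" send only goes out
// when the operator has deliberately typed a confirmation phrase,
// rather than picking the wrong item from a menu under stress.

type AlertMode = "test" | "live";

interface SendRequest {
  mode: AlertMode;
  // Required only for live sends: the operator must retype this phrase
  // exactly, turning an accidental click into a conscious decision.
  confirmationPhrase?: string;
}

const LIVE_CONFIRMATION = "SEND LIVE ALERT";

function canSend(request: SendRequest): boolean {
  if (request.mode === "test") {
    return true; // test messages stay low-friction
  }
  // A live alert is blocked unless the confirmation phrase matches.
  return request.confirmationPhrase === LIVE_CONFIRMATION;
}
```

The design choice worth noting is the asymmetry: the cost of friction is placed only on the rare, dangerous action, so the everyday workflow is not slowed down, and a distracted operator cannot fire the live alert with a single misplaced click.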

A world of dark patterns – a manifesto

We’ve created a world of noise. A world filled with notifications, promotional emails, press releases and advertising billboards, all vying for our attention and clogging our day with distractions. How did we get to this state, and what can we do about it?

Look at your inbox. How many emails do you currently have in there? Who are they from? Sure, there are probably a few from your parents or your friends, but the majority are probably from companies, offering you money off something, or perhaps announcing an upcoming sale. Look outside – if you live in an urban area that’s not Grenoble or São Paulo, chances are you don’t have to go far before you see a billboard advertising something. Now, ask yourself: do you really need, or want, the things they’re offering? Chances are, probably not. We’ve created an ecosystem of noise where, to paraphrase Tyler Durden in Fight Club, “we buy things we don’t need with money we don’t have to impress people we don’t like”. Whole industries are set up to persuade you to part with your money, by making something you don’t actually need look attractive enough for you to think it’s worth it.

In UX, this is what’s known as a dark pattern: a design that fools the user into performing an action they don’t actually want to take. We’ve all seen them before. There’s the “Like us on Facebook” modal window that pops up on top of the article you’re trying to read, with the close button hidden so that most users think the only way to make it go away is to click the one visible button – the one that adds a Like to the company’s page. There’s the form where the sign-up button is huge and ostentatious, and the “no thanks” option is a tiny text link underneath. These patterns trick the user into performing an action that is more in the interest of the company that set it up than of the person performing it. This, I’m sure you will agree, is wrong, and needs to change.

The good thing is, it can. Above all, UX stands for putting the needs of the user first. All of these dark patterns are examples of business requirements being put above those of the user. As a UX Designer, it’s my job to make sure that doesn’t happen. Of course, I’m still in the business of building websites that sell things, but I view UX as an antidote to all the tricks advertisers used to pull to get you to buy them. I stand against the dark patterns, the hidden buttons, and the tracking methods designed to entice you into buying things you don’t really want, and instead present a different approach: to give you what you want, when you need it. I don’t define what you want, nor do I dictate when you need it – I just make sure that you can get it when you do. All of the content on this website is here to show you how I do this, and to help you understand that there’s a better way. Although I work in websites, UX doesn’t just extend to software – there are many more ways in which it can be used. I hope you explore some more, and I look forward to showing you what I know.