Tragic Design: How to Spot Bad UxD in Software

September 22, 2020

It's vital that user experience design is done right. Uncover the consequences of bad UxD and learn how to prevent them through effective design.

Tragic Design: The Consequences Of Bad UxD

As people interact with and rely on ever more technology, we don’t have to look far to spot potentially tragic UxD pitfalls that demand expert navigation.

For most of us, bad user experience means something like a frustrating tab that won’t go away while browsing an app or website. It’s usually nothing major – the kind of thing that earns an eye-roll or an impatient screen tap.

But what if bad UxD finds its way into situations involving people’s safety or well-being? UxD is often part of far more important systems, and there, bad design can truly be a matter of life or death.


UxD can be dangerous? How?  


Let’s look at a couple of examples.

Jenny was a young cancer patient who had been battling her disease for a long time when she started a new medication at a hospital. The treatment was so aggressive that she required pre- and post-hydration cycles, delivered intravenously over three days. The nurses looking after Jenny were responsible for entering her treatment information into the corresponding software application.

Sadly, the nurses missed the critical information about Jenny’s three-day hydration requirements on the app’s interface, and Jenny died of toxicity and dehydration the day after her treatment. These experienced nurses made the error because they were too distracted trying to make sense of the interface itself.

Analysis found that the interface was so overloaded with information that it was difficult to extract anything useful from it. Poor colour choices drew attention to the wrong parts of the app. Worse still, some critical information wasn’t visible on the main interface at all.

Another tragic accident involving poor UxD occurred in 2016 and resulted in the death of actor Anton Yelchin, known for playing Chekov in the recent Star Trek movies. Anton had exited his car and, believing the automatic transmission was in park, walked behind the vehicle. It was actually in reverse or neutral, and the car rolled down his driveway, killing the young actor. More than 1.1 million vehicles were recalled, with the design of the gear shifter judged to carry an unacceptable risk of confusing drivers.

These examples show two types of problem that can arise when UxD isn’t properly planned. In Jenny’s case, human cognitive abilities were overestimated: the information overload in the app’s interface kept the nurses from focusing on what mattered, and critical information was buried within the system. That lack of clarity and straightforwardness ultimately contributed to Jenny’s death.

In Anton’s case, the car lacked the familiar grooves and tactile feel people rely on when moving a gear shifter into park, drive or reverse. The design overlooked the fact that users become accustomed to certain mechanics over time and expect similar systems to respond in similar ways – something user testing could have picked up.


How can we create safe UxD for critical systems?


It’s a common misconception that UxD is not critical. This couldn’t be further from the truth. 

Many of the highest-profile and most costly system failures stem from poorly anticipating how users would respond to, understand, and use the system in question. We can’t look at systems and consider only their hardware and software – we must also treat the users as part of that system. We need to assess how people actually use critical systems in order to reduce operational errors, ensure safety and security, and optimise performance.

Understanding a design problem in all its facets and creating a valid, effective solution is a wide-ranging task. At Critical, for example, we make this happen through discovery workshops based on design-thinking techniques. The process includes specifying goals and anti-goals: we plan not only the things we want to incorporate into a system’s design, but also the things that must be avoided. Solutions are then developed with real-world testing on users built into the process – not just focus groups, but actual usability tests. Naturally, all our UxD activities conform to the relevant industry regulations and best practices, while also accounting for the full critical ecosystem at stake.

To learn more about how to get UxD right, we’ve put together a pocket guide on the topic. Download it for free below!