
At RKS, the design firm I lead, we believe User Experience is the backbone of building trust and empathy with the people who use the products and interfaces we design every day. Whether the interface is physical or digital, products should delight rather than surprise, and be dependable rather than frustrating. There should be no barriers to use that require lengthy explanation; instead, products should work in expected ways, even when they're entirely new to the user.
We design objects that live in the physical world and the digital experiences that reside on them, both responding to some form of human touch. These tactile and cognitive deliverables (the physical and the digital) are very different from one another, but our design approach to each is essentially the same.
When our designers gear up to create a great user interface, we don't concern ourselves with what the UI might look like. (Well, not at first.) We start by getting a clear understanding of the goals the end user needs to accomplish. Once we understand those needs, we can better determine what UI workflow solutions are required to achieve the goal. When designing a new UI, there's simply no room for taking a ‘style first’ approach to the design process. An appropriate user experience is seldom born from a concept sketch that “looks cool”. The right look and user experience (UX), along with the user interface (UI), must be born from a foundation of understanding.
The consequences of putting style over substance in either case are simply too high: users may not understand how to perform an action, may make input mistakes, or may find the action error-prone.
At RKS, we use a similar user-centric approach when designing physical artifacts and their matching UI/UX. Aesthetics is definitely an important component, and it's often what's celebrated, but functionality is paramount. This is perhaps most evident in the medical and scientific products we design. These products often reside in mission-critical environments, where a mistake in performing the desired action can be costly. Medical rescue personnel and first responders will tell you that their generally calm days are punctuated by moments of fear when the alarm bell rings. When they are dispatched into extreme conditions, there's a much higher probability of situational disabilities. When designing medical gear for deployment on life-flight helicopters, we had to consider that even well-qualified users might be more prone to making mistakes while performing under extreme mental and physical conditions. The same hardware platform engineered for field use by rescuers also resides in the quiescence of a hospital or lab setting. Obviously, the physical design and man-machine interface had to be appropriate for these disparate environments. The internal hardware is the same, but the physical designs are quite different from one another, and the physical design and feature sets had to be reprioritized. It was important to research the trade-offs, because you only get one chance to get the physical design right.
Generally, it's much easier to spot and debug a flawed digital interface or experience before it's released into the wild, and there's usually the option to push UI/UX updates in the next code release. In contrast, the physical placement of controls, buttons, access ports, trays, and information graphics on the product is not updatable. Once the physical design is in production, there's no turning back. We work tirelessly until the physical and digital experiences converge harmoniously into a holistic product experience, which is key to building brand loyalty.
Until next time, thanks for letting me be Frank.