Moore's Law, in its crudest terms, is the observation that computing capacity – strictly, the number of transistors on a chip – doubles roughly every two years. So far this prophecy – first posited by Intel co-founder Gordon Moore in 1965 – has been eerily consistent.
Mankind’s capacity for information-processing, sadly, hasn’t followed the same trajectory. Computers may be able to process information with pure objectivity, but the people who design them can’t.
The good news is we are not ignorant of our own ignorance. Those responsible for creating user-friendly software tools are more aware than most of our flatlined information-processing skills.
That’s why User Interface Design (UID) experts frequently use metaphors in order to make interaction between people and technology easier.
"We often make assumptions about how things work by transposing metaphors that help us make sense of complex technology," explains Alan Dargan, curriculum lead for digital designers with Digital Skills Academy in Dublin's Digital Hub. "Metaphors act as shorthand to help us infer things about the world. Think of the metaphors employed on the earliest desktop computers: files, folders, the trash can etc. By labelling various functions with names we already recognised it helped us make sense of something that was at that point quite alien."
Unconscious Bias
So far, the focus has been on the human weaknesses we’re conscious of. In this age of AI, however, with the increased risk of human bias being inadvertently incorporated into robotics, we must be extra vigilant that other, potentially nefarious, psychological shortcomings don’t make their way into UID.
“When metaphors prove to be highly effective they can start to shape how we view the world,” says Dargan.
The lines traditionally dividing artistic and technical design continue to blur. The trend stems, in part, from a growing appreciation of the innovation born of cross-collaboration. Seen this way, good design principles can be applied in a variety of settings – from the style of a dress to your desktop layout. However, while no one would dispute the merits of taking a more interdisciplinary approach generally, new challenges arise.
"I think the problem comes from the fact that most product developers are human beings and so think they can depend, in part, on their intuitions about what other human beings want and how they'll behave," stresses Prof Randolph G Bias, co-director of the Information eXperience Lab, at the University of Texas. "If you have been working on a project for six months, or one day, you no longer represent the perspectives of your future first-time user."
Then there’s the importance of context and perspective. How many of your users might be from Kerry or Mumbai? Are they loyal consumers who have used legacy technologies, or have they just switched from a competitor?
“How many are blind or are using a mobile device?” adds Bias. “Which of these variables even matter enough for your user audience to behave differently? How might their expectations, their ‘mental models’, their motivations differ? I would barely dare to try to answer this question about my own family, much less a group of tens of thousands of strangers, without gathering some data.”
Ever-evolving UID
The advantage in the context of something like UID – as opposed to traditional design practices such as architecture – is that it’s a never-ending project.
“Each iteration is a hypothesis that is tested and amended based on how well it performs,” says Dargan. “Take buttons on a keyboard, for example. There’s been a lot of talk around the optimum colour for, say, a better-converting Call to Action button. Colour, however, is shaped by things like cultural association and is also largely relative when placed in the context of the other colours around it.”
Humanity’s psychological shortcomings notwithstanding, each design decision leads to something measurable – conversion rates etc. So if the big red button is at the bottom of the page instead of the middle or top, we can determine scientifically how this affects the overall design and, therefore, reposition it if needs be.
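Determining this “scientifically” typically means an A/B test: show each variant to a sample of users and check whether the difference in conversion rate is bigger than chance would explain. A minimal sketch, using a standard two-proportion z-test and entirely hypothetical numbers:

```python
from math import sqrt, erf

def conversion_lift(clicks_a, visits_a, clicks_b, visits_b):
    """Two-proportion z-test for an A/B test of a design change.

    Returns the observed lift (difference in conversion rate) and a
    two-sided p-value, so we can judge whether, say, moving a Call to
    Action button actually changed behaviour or the gap is just noise.
    """
    p_a = clicks_a / visits_a
    p_b = clicks_b / visits_b
    pooled = (clicks_a + clicks_b) / (visits_a + visits_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_b - p_a, p_value

# Hypothetical numbers: button at the top (A) vs the bottom (B)
lift, p = conversion_lift(120, 2000, 165, 2000)
print(f"lift = {lift:.3%}, p = {p:.4f}")
```

If the p-value is small (conventionally below 0.05), the repositioning genuinely moved the needle; otherwise the design team keeps iterating.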
Missing the memo on bias
Some biases develop when an organisation thinks it already knows what its customers want without doing any research.
"This is not a random occurrence," stresses Doreen Lorenzo, clinical professor of design and future and director of the Center for Integrated Design, as well as an independent business advisor and design columnist. "In my experience, the more knowledge someone has, the less motivated they are to ask more questions, be innovative, or look outside of their comfort zone."
Companies spend millions on product launches and little or no money on qualitative research to actually understand the problem at hand, and what their customers actually need or want. For Lorenzo, this was such a huge issue she decided to found a new company, vidlet.com, which does mobile qualitative research at a fraction of the cost of traditional design research. “We formed the company because we saw too many firms developing products that users didn’t want,” she says.
The tortoise and the (soft)ware
Three years ago Prof Randolph G Bias published a paper, along with two colleagues, entitled The tortoise and the (soft)ware: Moore's Law, Amdahl's Law, and performance trends for human-machine systems.
The three scientists juxtaposed Moore’s Law with a century of research into human information-processing capabilities. On paper it’s a bit like putting a five-year-old who just earned her white belt in karate in a ring against Conor McGregor. Not exactly a fair fight.
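The scale of that mismatch is easy to check on the back of an envelope. A rough illustration, using the two-year doubling figure from the top of the article against an essentially flat human reading speed:

```python
# Hardware capacity doubling every two years vs human
# information-processing, which has barely moved in a century.
years = 50
doublings = years // 2          # one doubling every two years
hardware_growth = 2 ** doublings  # relative capacity after 50 years
human_growth = 1                  # reading speed: effectively unchanged

print(f"Hardware: ~{hardware_growth:,}x; humans: {human_growth}x")
```

Over 50 years that is roughly a 33-million-fold gain on the machine side against essentially none on ours, which is the gap the paper's title alludes to.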
Nevertheless, humans are adept at adapting. We have already found ways to come to terms with the growing amounts of data that surround us.
“New digital tools scan the internet on our behalf,” says Prof Bias. “So, even though we can still only read approximately 260 words per minute, we still get to read only the newest and most relevant information gathered on our behalf. We also employ cognitive strategies, such as organising files in ways that are meaningful to us, for easy subsequent access and use.
“Thus, though our fundamental human information processing skills continue advancing at glacial evolutionary speeds, we can avoid getting inundated by the fire-hose presentation of new information.”