As I mentioned earlier, I subscribe to the view that design is problem solving. But without a clear understanding of what a "problem" is, how can it be solved? And what does it mean to "solve" something, anyway?
The problem is, once you really understand a problem, you realize that most problems are not solvable at all. They're tangled webs of causality, wicked thickets of complexity. The best you can do is understand this complex causality and find opportunities to adjust, nudge, and tweak. Take, for example, this TED Talk by Sendhil Mullainathan on "solving" diarrhea and blindness:
Note how the "solutions" to the problems are all incremental: they change a few parts of a broken system, which leads to great improvements, but the problem is never "solved".
What then is a problem? Herb Simon said, "Everyone designs who devises courses of action aimed at changing existing situations into preferred ones." (The Sciences of the Artificial, 1969). I take Simon's view and see problems as "undesirable situations" (meaning undesirable to a human). Therefore, problems are really just situations that people don't want.
Now, that doesn't mean that a situation is undesirable to everyone. A situation might be undesirable to one person but greatly desirable to another. For example, most gambling addicts wish it was harder for them to gamble, but casinos are quite happy that it's easy to gamble. That means that problems are inherently tied to specific groups of people who wish their situation was different. Therefore, you can't define a problem without defining who it's a problem for.
What makes a situation undesirable? Let's talk about this in terms of consequences. What makes missing your bus a problem? There are consequences to being late to class or late to work. You might be stuck in the rain. Maybe you're cold. Maybe you don't get a seat on the next bus, which aggravates your arthritis. You can see, therefore, that problems are personal because consequences are personal.
Another critical piece of a problem is its cause. Every problem has multiple causes, and it's a designer's job to understand as many of them as they can. Let's continue with the bus example. Brainstorm for a moment: why do people miss the bus? Let's enumerate some possible reasons:
Maybe they slept in. Maybe they didn't know where the bus was or when it would arrive. And so on. Note that not everyone misses the bus for the same reason; some people miss the bus for more than one reason. And this problem doesn't apply to everyone, because not everyone uses a bus.
Helping people not miss their bus literally means changing one or more of those causes above. Each one would require a different solution and would have a different effect on the problem. If you focus on getting riders up on time, you only help riders who are sleeping in, and you'll probably build something sleep-related. If you focus on the problem of not knowing where the bus is, you only help riders who don't know, and you'll probably build something information-related.
Because solutions are specific to problems, solutions embody assumptions about the cause and effect of problems. And sometimes those assumptions are wrong. For example, consider this video by Tommy Edison, where he documents his attempt to use a Bank of America ATM as a blind person:
Why was it so hard for him to find the headphone jack? No one on the design team had any clue about the challenges of finding small headphone jack holes without sight. They did, however, include a nice big label above the hole that said "Audio jack", which, of course, Tommy couldn't see. Diebold, the manufacturer of the ATM, had a flawed understanding of the problem of blind ATM accessibility.
This brings us to one last point. Because everyone's problems are personal and have different causes and consequences, there is no such thing as the "average user". Every single solution will meet some people's needs while failing to meet others. Moreover, solutions will meet needs to different degrees, meaning that every solution will require some customization to accommodate the diversity of needs inherent to any problem. The person you're designing for is not like you and really not like anyone else. The best you can do is come up with a spectrum of needs to design against, and then decide who you're going to optimize for. If you're clever, perhaps you can find a design that's useful to a large, diverse group.
How do you come up with this spectrum? There are many ways: surveys, interviews, observations, reading research, and pretty much any other method a social scientist uses to understand human behavior. Learning all of these and learning to do all of them well is far outside the scope of this brief class, so we'll discuss two: interviews and contextual inquiries.
The essential quality of an interview is that you ask someone questions and they give you open-ended answers. Interviews can vary in how formal they are, ranging from a fully prepared sequence of questions to more of a conversation. They also vary in how structured they are, ranging from a predefined list of questions asked in a particular order to a set of possible questions you might ask in any order. The art and science of planning and conducting interviews is deep and complex, and you shouldn't expect to become an expert in this class. However, you will practice.
There are a few basic things to avoid in your questions.
When I prepare for an interview, I do the following:
Want some examples of great interviews? I highly recommend any of those by Fresh Air host Terry Gross. She's particularly good at establishing rapport, showing sincere interest in her guest, and asking surprising, insightful questions that reveal her guests' perspectives on the world.
Interviews are not perfect. They are out of context and require people to remember things (which people tend not to do well). That means your understanding of a problem could be biased or flawed based on fabricated memories, misrepresentations, or even lies. Another downside of interviews is that participants may change their responses to please the interviewer or conform with societal expectations for how a person should behave, based on the context of the interview. This is called socially desirable responding or response bias.
The second method we'll talk about is the exact opposite of an interview: rather than asking someone to tell you about their life in the abstract, you directly observe some aspect of their life. You go to where someone works or lives, you watch their work or life, you ask them about their work or life, and from these observations, make sense of the nature and dynamics of their work or life. This approach, called Contextual Inquiry, is part of a larger design approach called Contextual Design.
I'm not going to cover the whole method or approach here, but these are the basics:
As with an interview, once you have your data, it's time to step back and interpret it. What did you see? What implications does it have for the problem you're solving? How does it change your understanding of the problem?
Here's an example of what a contextual inquiry looks and feels like (it's not great, but it gives you a feel).
This contextual inquiry is good in that it happens in context. However, it fails in that the researcher is the one driving the conversation, rather than letting the work of the taxi driver determine the conversation. If the latter had happened, there may have been long periods of silence while the driver drove the car, consulted his smartphone, etc.
Like interviews, contextual inquiries are not perfect. They're extremely time consuming and so it's rare that you can do more than a few in a design project. That makes it hard to generalize from them, since you can't know how comparable your few observations are to all of the other people in the world you might want to design for.
Beyer, H., & Holtzblatt, K. (1997). Contextual Design: Defining Customer-Centered Systems. Elsevier.
Contextual Interviews and How to Handle Them. (2016). Interaction Design Foundation.
Coyne, R. (2005). Wicked problems revisited. Design studies, 26(1), 5-17.
Dell, N., Vaidyanathan, V., Medhi, I., Cutrell, E., & Thies, W. (2012). Yours is better!: participant response bias in HCI. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 1321-1330). ACM.
Rubin, H. J., & Rubin, I. S. (2011). Qualitative interviewing: The art of hearing data. Sage.
Trufelman, A. (2016). On Average. 99% Invisible.