Author: Taichi Nakatani
Interaction:
We might be experts at interacting with computers, but that doesn't make us experts at designing interactions between other humans and computers.
What is HCI:
Reference:
Learning goals:
Learning outcome: "To design effective interactions between humans and computers"
Learning strategies:
New application areas:
Learning Goals:
PPP Table
Test Case: Tesla interface screen
Processor model: Strictly observe user's behavior (e.g. timing)
Predictor model:
Participant model:
Takeaway: We'll use all of these models at different times and in different contexts
We might start with a participant model where we just ride around with users watching what they do.
Based on that, we might observe that they spend a lot of time fumbling around to return to the same few locations.
So, then we might redesign an interface to include some kind of ‘bookmarking’ system, and present it to users in interviews.
There, they might tell us that they like the design, but further note that they don’t need a long list of bookmarks -- they really only need work and home.
Based on that, we might then design an interface where a simple swipe takes them to work or home. Then, we might test that with users to see how much more efficiently they’re able to start navigation when these kinds of shortcuts are provided.
The results of each design phase inform the next, and different phases call for different types of evaluation, which echo different models of the user.
So, keeping in mind everything we’ve talked about, let’s design something for Morgan. Morgan walks to work. She likes to listen to audiobooks, mostly non-fiction. But she doesn’t just want to listen, she wants to be able to take notes and leave bookmarks as well. What would designing for her look like from the perspectives of viewing her as a processor, a predictor, and a participant?
Gulf of execution: How hard is it to do in the interface what is necessary to accomplish those goals? What’s the difference between what the user thinks they should have to do, and what they actually have to do?
3 Components:
Example: Microwave
Gulf of evaluation: How does the user become aware that their action succeeded?
3 Components:
Example: Thermostat
7 questions to bridge the gulf of execution / evaluation:
Norman also further articulates this by breaking the process into phases that span both execution and evaluation.
Tying it to KBAI:
What is the problem with the framing of the problem?
The right answer is: We shouldn't be thinking just about swiping or inserting a card, we should be thinking about the general purchasing process.
Lesson Goals
Lesson Outcomes
Assessments
"In order to design interactions that are better than existing designs, it is important to take into consideration the user’s needs at every stage of the design process."
ISO - Six principles to follow when pursuing user-centered design
"User-centered design isn’t just about catering to the user in the middle, but also in looking at the impact of our design on all the affected stakeholders."
Examples:
User = Teacher (uses the gradebook)
Secondary = Parents (receive gradebook output)
Tertiary = Students (affected by the grades)
Reference: “The Inmates Are Running the Asylum” by Alan Cooper
"In HCI, we’re designing interfaces to accomplish goals, and then based on the output of our evaluations with those interfaces, we judge whether or not the goals of the interface were accomplished. Then, we repeat and continue."
"In many ways, we’re doing the same things that our users are doing: trying to understand how to accomplish a task in an interface. "
Quantitative Data: observations described or summarized numerically; anything numeric qualifies.
Qualitative Data: observations described or summarized non-numerically.
Uses:
Infamous studies:
Response:
See document link below for explanation of:
Before we start our needfinding exercises, we also want to enter with some understanding of what data we want to gather.
In order to do some real needfinding, the first thing we need to do is identify the problem space.
Significance: We want to understand who we’re designing for.
Audiobook example:
"Differentiate whether I’m designing for business people who want to be able to exercise while reading, or exercisers who want something else to do while exercising."
- Identify these different types of users, and perform needfinding exercises on all of them.
- Reference: Doing Cultural Studies by Hugh Mackay and Linda Janes.
Definition: Fly on the wall approach. Note down what people are doing, and let that guide the design.
Definition: Be a participant in your own study.
Significance: Look at hacks users employ.
Errors:
Significance: Use ethnography (living close to users you're studying) to understand domain knowledge necessary to design new interface / improve the user task.
Definition: Ask users to talk about their perceptions of the task in the context of the task (while they're doing it).
5 Tips for Good Surveys:
Definition: Needs that our final interface must meet.
User data requirements:
External Requirements:
Definition: Direct manipulation is the principle that the user should feel as much as possible like they’re directly controlling the object of their task.
Invisible Interface: When the interface actually disappears. Users spend no time thinking about how to engage with the interface; all their time is dedicated to thinking about the task they're performing.
Hutchins, E., Hollan, J., & Norman, D. (1985). Direct Manipulation Interfaces. Human-Computer Interaction, 1, 311-338. doi:10.1207/s15327051hci0104_2. https://www.lri.fr/~mbl/ENS/FONDIHM/2013/papers/Hutchins-HCI-85.pdf
Significance:
Definition: distance between the user’s goals and the system itself. Encompasses gulf of execution/evaluation.
"...the feeling of directness is inversely proportional to the amount of cognitive effort it takes to manipulate and evaluate a system”."
"The user starts with some goals, translates them into their form of expression in the interface, and executes that expression. The system then returns some output in some form of expression, which is translated by the user into their understanding of the new state of the system."
Definition: Providing the user the feeling that they are directly controlling the objects.
"The systems that best exemplify direct manipulation all give the qualitative feeling that one is directly engaged with control of the objects--not with the programs, not with the computer, but with the semantic objects of our goals and intentions."
Examples:
Apple touchpad actions, which are direct engagements:
Significance: "Direct manipulation isn’t just about designing interactions that feel like you’re directly manipulating the interface. It’s also about designing interfaces that lend themselves to interactions that feel direct."
Example: Stylus vs mouse - stylus makes the gulf much narrower to the point of the interface becoming invisible.
Good vs Bad Design of "invisible-ness"
Significance: Interfaces become invisible not just through great design, but also through users learning to use them.
Goal: Users should feel immediately as if they’re interacting with the task underlying the interface.
5 Tips for Invisible Interfaces
Challenge: How would we design an invisible interface for universal remote control, one that doesn’t have the learning curves that most have?
Takeaway:
Information processing model:
Visual
Auditory
Haptic
Q: How to alert someone when they receive a text message, without disturbing others.
Solutions: Smartphones have cameras and light sensors; use these to infer where the phone is and choose the type of alert accordingly. (This could lead to a lot of surprise for the user, though.)
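A minimal sketch of this idea in Python; the sensor cues, thresholds, and function are hypothetical illustrations of the approach, not a real smartphone API:

```python
# Hypothetical sketch: infer phone context from rough sensor cues and pick an
# alert type that won't disturb others. Thresholds are made up for illustration.

def choose_alert(ambient_lux, camera_covered):
    """Pick an alert modality from rough phone-context cues."""
    if camera_covered and ambient_lux < 5:
        return "vibrate"       # likely in a pocket or bag: felt, not heard
    if ambient_lux < 50:
        return "screen_flash"  # dark room: a silent visual cue avoids waking others
    return "sound"             # phone is out in a lit environment

print(choose_alert(ambient_lux=2, camera_covered=True))    # vibrate
print(choose_alert(ambient_lux=300, camera_covered=False)) # sound
```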
3 kinds of memory:
Definition: very short term, less than a second.
Baddeley & Hitch's model of working memory:
Definition: Capacity for holding a small amount of information in an active, readily available state for a short interval.
"Chunking" - bits of short-term memory. We can only hold 4-5 chunks at a time.
Takeaways:
Definition: Seemingly unlimited store of memories. But harder to put something in there. Generally need to put it into short-term memory several times.
Leitner system: A way of memorizing key-value pairs (i.e., flashcards), reviewing frequently missed cards more often than well-known ones.
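A minimal sketch of a Leitner-style scheduler in Python; the three-box scheme and review schedule are illustrative choices, not from the course materials:

```python
# Illustrative Leitner-style flashcard scheduler: cards answered correctly move
# to a higher box and are reviewed less often; missed cards drop back to box 1.

NUM_BOXES = 3

def review(cards, session, ask):
    """cards: dict mapping prompt -> (answer, box); session: session counter;
    ask(prompt, answer) -> True if the user recalled the answer correctly."""
    for prompt, (answer, box) in list(cards.items()):
        if session % box != 0:
            continue  # box 1 is reviewed every session, box 2 every 2nd, box 3 every 3rd
        if ask(prompt, answer):
            cards[prompt] = (answer, min(box + 1, NUM_BOXES))  # promote
        else:
            cards[prompt] = (answer, 1)                        # back to box 1

# Example with an automatic "user" that always answers correctly:
deck = {"affordance": ("perceived possibility for action", 1)}
for s in range(1, 4):
    review(deck, s, lambda prompt, answer: True)
print(deck)  # the card ends up in the highest box after repeated successes
```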
"When we design interfaces, we are in some ways hoping the user has to learn as little as possible to find the interface useful. "
2 Kinds of Learning:
Definition: The amount of working memory resources used.
2 major implications on designing interfaces:
Example: Programming
Significance: In designing interfaces, we’re also interested in what is physically possible for users to do. Includes how fast / precise they can take an action (e.g. tapping).
Example: Spotify control widget
Two big mistakes:
Definition: The area in which we design our interfaces.
Goal: Generate lots of ideas
5 Tips for Effective Individual Brainstorming
4 behaviors in group brainstorming that can block progress (Thompson, 2008) + 1 more:
Takeaway:
"We should enter into group brainstorming with strong ideas of how to address these issues, ideally after a phase of individual brainstorming has occurred."
Osborn, 1957:
Oxley, Dzindolet, and Paulus, 1996:
Takeaway:
"Note that all eight of these rules prescribe what individuals should do, but they’re only effective if every individual does them. So, it’s good to cover these rules, post them publicly, and call one another on breaking from them."
Goal: Reduce ideas down to 3-4 ideas that are worth prototyping.
Definition: Create actual characters surrounding the user.
Definition: List out a large number of different variables about users and the possible values for each; more demographic than personas.
Examples:
Definition: Take personas and stretch them over the timeline of the task in which we're interested.
Definition: Examine specific scenarios users may encounter while using the interface.
Scenarios for audiobook:
Definition: Creating an interaction model of the user and their goals.
Checkout model example: Given design alternatives, what efficiency or speed is associated with each? We can measure how efficient one design is compared to another.
Applying learnings:
Main references:
TODO: Read more on nuances of perceptibility, tolerance and feedback between these authors
Condense from above reference
Definition: Relevant functions should be made visible so that the user can discover them, as opposed to having to read about them in the documentation or learn them through a tutorial.
Examples: Discovering functionality via toolbar in application.
Definition: The user should only be given as much information as they need.
Definition: "Relationship between the properties of an object and the capabilities of the agent that determine how the object could be possibly used" (Norman)
Example: Software buttons, where a click makes the button look like it's being depressed. Visualize the space of options with a drag bar for selecting a color.
Definition: In-context instructions, such as arrows to indicate which way to swipe or a menu icon to indicate how to access the options.
Affordance vs Signifiers
Definition: Relationship between interface controls and their effects in the world
Examples: Monitor display view, layout matches real world. Color range selection, color is shown with the slide bar.
Definition: User’s ability to actually perceive the state of the system
Example: Light switches (state determined by whether up or down), oven switch (can see where the dial is set).
Definition: We should be consistent both within and across interfaces to minimize the amount of learning the user needs to do to learn our interface. Follow convention if it exists.
Example: URL links should be highlighted in different color. Use consistent hotkeys used by other programs.
Definition: Don't force users against their preference.
Definition: Equity is largely about making the user experience the same for all users
Definition:
Examples:
Definition:
Example: Newspaper layout. Still applies to digital media, though with less text, since full articles can be embedded.
Definition: Prevent the user from performing erroneously in the first place by constraining their choices/actions.
Example: Password reset screen, explicitly tells you password constraints. Three-prong plug can only be installed a certain way.
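A small sketch of the error-prevention idea in code: state the constraints up front and refuse the input until all of them are met. The specific password rules below are made up for illustration:

```python
import re

# Illustrative password constraints, shown to the user up front so an invalid
# password cannot be submitted in the first place (error prevention).
CONSTRAINTS = [
    ("at least 8 characters",        lambda p: len(p) >= 8),
    ("contains a digit",             lambda p: bool(re.search(r"\d", p))),
    ("contains an uppercase letter", lambda p: bool(re.search(r"[A-Z]", p))),
]

def unmet_constraints(password):
    """Return the constraints the candidate password still violates."""
    return [label for label, check in CONSTRAINTS if not check(password)]

# The submit button would stay disabled until this list is empty.
print(unmet_constraints("hci"))         # all three constraints unmet
print(unmet_constraints("Designer42"))  # []
```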
Norman's 4 Types of constraints:
Definition: Allow users to undo and redo when mistakes are made.
Definition:
Example:
Definition:
Topics:
Definition: A person's understanding of the way something in the real world works.
Mental models and education:
(From Dix, Finlay, Abowd, and Beale in their book Human-Computer Interaction)
Definition: How things are visualized to users, in order to mold their mental model.
Example - Wolf vs Sheep:
Example: Google Calendar
Example: Powerpoint animations
Definition: Grounding an interface to something that users already know
Types of slips:
Types of mistakes:
How to prevent slips and mistakes:
Exercise: Sending a text to the wrong person. Slip or mistake?
Definition: A user's sense that they are helpless to accomplish their goals in an interface.
From a designer's perspective: What feedback do you need from your user to figure out how you can help them?
Definition: When you’re an expert in something, there are parts of the task that you do subconsciously, without even thinking about them.
Definition: Looking at the different ideas available to us and actually building things we can put in front of users.
Goal: Get user feedback as quickly and rapidly as possible, and iterate on improved prototypes.
Definition: Verbally explaining what the prototype is to a user.
Definition: Draw prototype on paper. Show them the prototype and get their thoughts.
"Card prototyping" - each screen / state is on a different card.
Definition: User interacts authentically with an interface, and a human supplies functionality that hasn't yet been implemented.
Pros:
Definition: More fleshed out version of paper prototype. Start to think about font size, screen real estate, etc.
Definition: Physical form of new idea. Doesn't have to actually work, can use existing devices and recontextualize it (e.g. car keyfob as bluetooth remote). Ask user to use it and provide feedback.
Takeaways:
Lesson Outcomes
Task analysis vs. needfinding:
Definition: What task are they performing?
Two methods for formally articulating the tasks users are completing.
4 sets of information it proposes gathering about a task:
Motivation:
Cons:
Pros:
Takeaway: GOMS model helps us to focus on places where the interface is asking too much of the user
Paper: https://www.di.ubi.pt/~agomes/ihc/artigos/john2.pdf
Paper covers 4 variations on GOMS, which differ in what additional elements they provide.
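One of the variations the paper covers is the Keystroke-Level Model (KLM), which predicts expert task time by summing per-operator time estimates. A minimal sketch in Python; the operator times are commonly cited approximations, and the two operator sequences compared are made up for illustration:

```python
# Rough KLM-style prediction: total task time is the sum of estimated times
# for primitive operators (approximate, commonly cited values).
OPERATOR_TIME = {
    "K": 0.28,  # keystroke or button press
    "P": 1.10,  # point at a target
    "H": 0.40,  # home hands between keyboard and pointing device
    "M": 1.35,  # mental preparation
}

def predict_time(operators):
    return sum(OPERATOR_TIME[op] for op in operators)

# Compare two hypothetical designs for starting navigation to "home":
type_address   = ["M", "P", "K"] + ["K"] * 20  # tap the field, then type ~20 characters
swipe_shortcut = ["M", "P", "K"]               # single swipe/tap on a shortcut
print(predict_time(type_address), predict_time(swipe_shortcut))
```

The absolute numbers matter less than the comparison: the model points at where the interface demands too many operations from the user.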
Takeaways:
Definition: A general class of methods for evaluating how users complete tasks.
Problem: Goal of CTA is to build models of human reasoning and decision-making, but often tasks are complex and high-level. Models that are too high-level are useless.
Solution: Break down into smaller tasks, to the point the task could be implemented in a variety of contexts
Strengths of breaking down large tasks to hierarchy of tasks:
Example of shopping checkout. Note that real CTAs will be much more complex.
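A tiny sketch of how such a task hierarchy might be represented; the specific subtasks below are illustrative, not a complete analysis:

```python
# Illustrative hierarchical breakdown of the checkout task: each task is a
# (name, subtasks) pair, and leaves are small enough to implement in many contexts.
checkout = ("complete checkout", [
    ("present items", [
        ("place items on the belt", []),
        ("wait for items to be scanned", []),
    ]),
    ("pay", [
        ("choose a payment method", []),
        ("swipe or insert the card", []),
        ("confirm the amount", []),
    ]),
    ("collect items and receipt", []),
])

def print_hierarchy(task, depth=0):
    name, subtasks = task
    print("  " * depth + name)
    for sub in subtasks:
        print_hierarchy(sub, depth + 1)

print_hierarchy(checkout)
```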
Pros:
Cons:
For cognitive models:
Definition: "Expanding the unit we use to analyze intelligence from a single mind to the mind equipped with other minds and artifacts and their relationships."
Takeaways: The pilot, plus all the other controls, dictates what the "cockpit" remembers.
Problem: During descent, a plane must make various wing configuration changes which are dependent on speed. Pilot must remember a sequence of speeds at which multiple changes must be made (in narrowly-defined times).
Solutions: Multiple artifacts perform various cognitive roles in the system.
Example - Checkbooks: What artifacts distribute cognition?
Is distributed cognition a design principle? No.
Significance: Important because, under distributed cognition, our interfaces exhibit and extend our cognitive abilities.
Motivation: Technology is rapidly growing into the social sphere (e.g. social media), but interfaces are often at odds with how we really think about social interaction.
Definition: Focuses on the learners' responsiveness to their environments and the ways in which human action arises in “the flux of real activity” (Nardi, 1996).
Takeaways:
"We can try to structure it as much as we can, but until the users get started, the task doesn’t exist -- and once they get started they play a significant role in defining the task."
Takeaways:
Suchman, L. (1987). Plans and situated actions: The problem of human-machine communication. Cambridge University Press. http://bitsavers.trailing-edge.com/pdf/xerox/parc/techReports/ISL-6_Plans_and_Situated_Actions.pdf
What: Compares between two views of human action.
Significance:
"Rather than assuming the user has a plan in mind that they are actively carrying out, we might consider viewing only their immediate interaction with the current screen."
What: Large set of theories regarding interactions between various pieces of activity
Three main contributions of activity theory
Nardi, B. A. (1996). Context and Consciousness: Activity Theory and Human-computer Interaction. MIT Press.
What: Collection of papers on HCI
Significance:
"“Activity theory offers a set of perspectives on human activity and a set of concepts for describing that activity” and “This … is exactly what HCI research needs as we struggle to understand and describe “context”, “situation”, “practice”."
“Studying Context: A Comparison of Activity Theory, Situated Action Models, and Distributed Cognition”
What: Another paper in book, compares and contrasts the three philosophies.
Significance:
"Attention to the shaping force of goals in activity theory and distributed cognition... contrasts with the contingent, responsive, improvisatory emphasis of situated action."
- Activity theory and distributed cognition are focused on goals, while situated action is interested in improvisation.
"Goals are our musings out loud about why we did something after we have done it"
- Situated actions views goals to be constructed retroactively, interpreting past actions.
On "persistent structures" - Acitivity theory vs distributed cognition differs on how they evaluate symmetry between people and artifacts.
“Activity theory, with its emphasis on... motive and consciousness... sees artifacts and people as different.”
- Activity theory regards them as fundamentally different, given that humans have consciousness.
"“Distributed cognition... views people and [artifacts] as conceptually equivalent... “agents” in the system.”"
- Distributed cognition treats people and artifacts as equals.
- Artifacts can have cognitive roles.
Definition: Evaluation is where we take what we’ve designed and put it in front of users to get their feedback.
Types:
"It’s important that you very clearly articulate at the beginning what you’re evaluating, what data you’re gathering, and what analysis you will use
Series of steps to perform to ensure that your evaluation is useful:
Goal: Get qualitative feedback from user. Similar to needfinding
Type of Qs:
Questions you’ll have to answer in designing a qualitative evaluation:
How to record data:
"When selecting a way to capture your qualitative evaluation, ask yourself: will be subjects find the camera intrusive? Am I capturing what happens on screen? How difficult with this data be to analyze?"
Definition: Record quantitative data
Treatment vs Control
Assigning participants
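A small sketch of random assignment, assuming a simple between-subjects design; the participant IDs and group sizes are made up:

```python
import random

# Between-subjects design: each participant experiences exactly one condition.
participants = [f"P{i:02d}" for i in range(1, 13)]

random.seed(6750)  # fixed seed only so the example output is reproducible
random.shuffle(participants)
half = len(participants) // 2
assignment = {
    "control":   participants[:half],
    "treatment": participants[half:],
}
print(assignment)
```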
Problem: Difference in numbers could arise just by random chance, are they different enough to conclude they're really different?
Solution: Null hypothesis testing. Steps:
Motivation: Null and alternative hypotheses are common to all kinds of hypothesis tests. The specific kind of hypothesis test you conduct, however, depends on the kind of data that you have:
Problem: How to handle more than 2 independent variables?
Solution:
Problem: What if independent variable is interval or ratio?
Solution:
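A minimal sketch of how these choices might look with scipy.stats, assuming a ratio-scale dependent variable (task completion time in seconds); all data values are made up, and a design with several independent variables would typically call for a factorial ANOVA instead (not shown):

```python
from scipy import stats

# Made-up task-completion times (seconds), for illustration only.
control    = [41.2, 38.5, 44.0, 39.8, 42.1, 40.3]
treatment  = [35.1, 33.8, 36.9, 34.5, 37.2, 33.0]
treatment2 = [30.2, 31.8, 29.5, 32.4, 30.9, 31.1]

# Two treatments, interval/ratio dependent variable: independent-samples t-test.
t, p = stats.ttest_ind(control, treatment)
print(f"t-test: t={t:.2f}, p={p:.4f}")

# More than two treatment groups: one-way ANOVA.
f, p = stats.f_oneway(control, treatment, treatment2)
print(f"ANOVA: F={f:.2f}, p={p:.4f}")

# Interval/ratio independent variable (e.g., hours of prior practice): regression.
hours = [0, 1, 2, 3, 4, 5]
times = [44.0, 41.5, 39.2, 37.8, 36.1, 34.9]
result = stats.linregress(hours, times)
print(f"regression: slope={result.slope:.2f}, p={result.pvalue:.4f}")
```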
When to use: Only use if we wouldn't otherwise be doing any evaluation.
Heuristic Evaluation: Hand the interface and a set of guidelines to a few experts to evaluate.
Definition: Step through the process of interacting with an interface, mentally simulating at each stage what the user is seeing, thinking, and doing
"Is it reasonable to expect the user to cross the gulf of execution? Is the right action sufficiently obvious? Is the response to the action the one the user would expect?"
Goal: Apply multiple evaluation techniques to constantly center our designs around the user
In some emerging areas, you’ll also be facing multiple questions in evaluation.
Take virtual reality, for example: most people you encounter haven’t used virtual reality before. There is going to be a learning curve. How are you going to determine whether the learning curve is acceptable or not? If the user runs into difficulties, how can you tell if those come from your interface or if they’re part of the fundamental VR learning experience?
So, take a moment to brainstorm your evaluation approach for your chosen application area. What kinds of evaluations would you choose, and why?
Third Motivation: To change the user's behavior
Three goals of HCI:
Langdon Winner, Do Artifacts have Politics? (1980): https://faculty.cc.gatech.edu/~beki/cs4001/Winner.pdf
Takeaway:
Example: Robert Moses and Construction of Highways
"There are numerous examples of positive change happening more as a byproduct of technological advancement than as a goal of it."
Bijker, Of Bicycles, Bakelites, and Bulbs (Toward a Theory of Sociotechnical Change) - https://mitpress.mit.edu/9780262522274/of-bicycles-bakelites-and-bulbs/
Example: Internet access
Definition:
The Value-Sensitive Design Lab at the University of Washington defines Value-Sensitive Design by saying: “Value sensitive design seeks to provide theory and method to account for human values in a principled and systematic manner throughout the design process.”
Not only is an interface useful in accomplishing a task and usable by the user, but is it also consistent with their values?
https://www.ipc.on.ca/wp-content/uploads/resources/7foundationalprinciples.pdf
Friedman Paper: https://www.researchgate.net/publication/229068326_Value_Sensitive_Design_and_Information_Systems
Takeaways:
Takeaway: Values can differ between cultures; we need to be sensitive to this (e.g. privacy).
Takeaways:
Takeaway: Society changes technology as well (more often than the other way around).
"There’s one final thing you must understand about the guidelines and heuristics and principles we’ve talked about. They’re only half the picture. They’re necessary for good design, but they aren’t sufficient. You can’t just grab these guidelines off the shelf, throw them at a new task, and expect to have a perfect interface the first time."
Takeaway:
Commonalities:
Differences:
Paper (required reading): https://link.springer.com/chapter/10.1007/11774129_15
Takeaways:
Takeaways:
The first cycle:
The second cycle:
The third cycle:
Takeaways:
"In many ways, design principles capture takeaways and conclusions found by this design life cycle in the past in ways that can be transferred to new tasks."
Needfinding:
Design Alternatives:
Prototyping:
Evaluation:
Participatory Design
Action Research
Design-based Research