

U

Uncertainty, iterative development process and, 34

Unstructured diary studies, 371–375
    defined, 371
    email form for, 374–375
    email instructions for, 371–373
    structured diary studies vs., 371

Updating user profiles, 154–155

Upper management, presentations to, 500–501

U.S. Department of Agriculture, user advisory board charter, 389–391

Usability
    design and, 20
    ROI for, 522–525
    See also efficiency; functionality

Usability Engineering Lifecycle, The, 57

Usability Professionals Association, 114, 449

Usability test diaries, 377–379

Usability testing
    analyzing the data, 293–297
    asking "why," 288
    benefits and pitfalls, 70
    budget research lab, 529–531
    choosing features to test, 260, 268–270
    collecting observations, 293–295
    competitive, 67, 68, 428–430
    conducting the interview, 285–292
    cost-benefit cutoff for, 267
    creating tasks, 270–275
    defined, 259
    described, 70
    diary studies, 377–379
    estimating task time, 272
    evaluator for, 259, 290
    example of iterative process, 261–262
    example report, 297–302
    extracting trends, 296
    eye tracking, 466–467
    focus groups vs., 204
    fork task examples, 273–274
    friends and family test, 9–15
    hybrid interview, 291, 471
    for interaction design, 50
    investigating mistakes, 289
    in iterative development cycle, 36
    limitations, 395
    log analysis combined with, 473–474
    maintaining focus, 289–290
    moderating, 288–290
    observational interviewing combined with, 471–472
    observer instructions, 291, 540–541
    observers, 290–292
    open-ended, or fishing, 260
    organizing observations, 296
    organizing questions into projects, 71–73
    overview, 260
    physical layout, 286–287, 529
    preparation, 265–285
    probing expectations, 288
    probing nonverbal cues, 289
    process for, 265–292
    of prototypes, 67
    quantitative information from, 293–295
    recruiting, 265–268
    report example, 484–493
    for requirement gathering, 66
    in research plan, 67, 68
    in research plan example, 81
    schedule for, 264, 267
    scheduling service example, 39–40, 42
    script for, 275–285
    suggesting solutions, 288–289
    as survey follow-up, 358
    for surveys, 325, 327
    target audience for, 260
    task analysis combined with, 474–475
    task-based vs. hybrid, 291
    time requirements, 76
    timing for, 260
    tips and tricks, 292
    Typhoon example, 5–6
    user markets and, 267
    user severity measures for problems, 296
    uses for, 9, 259–260
    videotaping, 276, 286–287, 295
    virtual, 370, 464–466
    Webmonkey 2.0 global navigation example, 261–263

Usage frequency and sampling frame, 329–330

Usage logs. See log analysis; log files

Usage survey questions, 308, 536–537

Usage trends of audience, 142

User advisory boards, 385–391
    buyers on, 386
    charter for, 388–391
    defined, 385
    limitations of, 385–386
    qualities needed by members, 387
    recruiting, 386–387
    size of, 387
    uses for, 385, 386
    working with, 387–391

User-based statistics in log analysis, 411–412

User-centered corporate culture, 505–527
    difficulties creating, 506, 525–526
    encouraging user experience roles, 514–515
    hostility toward creating, 526
    inevitability of, 527
    integration, 506–515
    involving stakeholders early, 511–512
    justification, 516–525
    knowing the current process, 506–509
    measuring effectiveness, 518–522
    momentum and resistance to, 525
    need for, 505–506
    patience and persistence in creating, 513–514
    preparing for failure, 511
    reasons for user-centered processes, 516–518
    ROI calculation, 522–525
    short-term advantages of, 517–518
    showing results, 512–513
    starting small, 509–510

User experience
    categories of work when creating, 43–44
    changes over time, 368–369
    continuous nature of, 43
    identity design and, 44, 50–52
    information architecture and, 43, 44–48
    interaction design and, 44, 48–50
    mediocre, total malfunction vs., 18
    researchers, 52–53

User experience researchers
    goals process and, 61
    responsibilities, 52–53

User Experience Specialist, 514

User Interface Engineering, 441

User interfaces
    consistency of, 49
    efficiency of, 19
    emphasis and, 49
    identity design and users' attention to, 52
    interaction design and, 48
    predictability of, 49

User profiles, 129–157
    benefits and pitfalls, 69
    bias issues, 143
    described, 69, 130
    development using, 153–154
    documenting, 151–152
    example, 155–157
    for focus group recruiting, 211, 212–213
    in-house needs for, 152
    for information architecture development, 47
    from marketing research, 443–445
    multiple profiles, 145
    need for, 129–130, 152
    personal details in, 146–148
    portraits, 152
    prioritizing, 148–149
    for requirement gathering, 66
    in research plan, 79, 131
    role playing with, 130
    scenarios using, 149–150
    sharing, 153
    updating, 154–155
    using, 150–155
    See also attributes of audiences

User profiling
    bias issues, 143
    clustering attributes, 143–145
    creating people around the clusters, 146–148
    defined, 130
    guerrilla method, 145
    internal research for, 133–134
    listing attributes, 135–143
    overcoming resistance to, 132–133
    preliminary research for, 133–135
    prioritizing, 148–149
    schedule for, 131
    team for, 132
    user interviews for, 134–135

User research
    iterative development and, 35–36
    quality assurance (QA) vs., 391
    scheduling service example, 36–42

Users
    "all possible," 497–498
    complaints about, during presentations, 502
    customers vs., 134
    goals process and, 61
    interviewing for user profiling, 134–135
    making money vs. satisfying users, 17
    multiple profiles for, 145
    recruiting, 92
    success criteria for, 18–20
    as threats to some people, 526
    as Web site stakeholders, 17
    See also attributes of audiences




Observing the User Experience: A Practitioner's Guide to User Research
ISBN: 1558609237
