IBM's Joint Application Design (JAD), 468
ICQ, differentiation by, 25
IDC, 441
Identity cookies
defined, 407
expiration times, 407
session cookies vs., 407–408
user-based statistics using, 411–412
Identity design, 50–52
brand association and, 53
defined, 44
editorial voice for, 50
features and, 51
identity designer responsibilities, 51
information needs for, 51–52
tools, 52
user experience and, 44, 50–52
visual themes for, 50–51
Identity designers
information needs of, 51–52
responsibilities, 51
Impressions, 22
In-person surveys, 339–340
Incentives for research participants
choosing, 108–109
for contextual inquiry, 164–165, 166
costs, 76, 108
for diary studies, 381
no-shows and, 111
recruiting mistakes and, 110
for surveys, 328
Independent analysis, 440–441
Information architects
information needs of, 46–47
responsibilities, 45
Information architecture, 44–48
defined, 43
demographics for, 46
in Geocities Web hosting service, 44–45
implicit vs. explicit, 44–45
information architect responsibilities, 45
information needs for developing, 46–47
mental model for, 46–47
terminology for, 46
tools and techniques, 47–48
user experience and, 43, 44–48
user profiling and, 132
Web use profile for, 46
Information needs
of identity designers, 51–52
of information architects, 46–47
of interaction designers, 49
Informed consent statement
for focus groups, 219, 252
for usability testing, 277
Institutional caching of Web pages, 405–406
Instructions
email, for unstructured diary studies, 371–373
evaluation, for usability testing, 279–280
general, for surveys, 321–322
observer, for focus groups, 237–238, 539–550
observer, for usability tests, 291, 540–541
question, for surveys, 322–323
Interaction design, 48–50
audiences for, 49
defined, 44
information needs for, 49
interaction designer responsibilities, 48
task flows for, 49
tools, 49–50
user experience and, 44, 48–50
user interfaces and, 48, 49
user profiling and, 132
Interaction designers
information needs of, 49
responsibilities, 48
Interactions, contextual inquiry regarding, 172
Interactive Marketing Research Organization's Code of Ethics, 408
Interesting quotations section of reports, 493
Interfaces. See user interfaces
Internal discovery, requirement gathering using, 66
Internal traffic, removing hits from log files, 415
Internet Advertising Board Web site, 23
Internet resources
Advertising Board, 23
coding scheme, 400
coding software, 401
cookie-based session tracking information, 407
eye tracking equipment, 466
EZSort software, 197, 198
free research resources, 446–447
free Web survey tools, 325
independent analysis companies, 441
log analysis ethics, 408
log analysis tools, 415–416
mailing lists, 447
for professional recruiters, 113
specialist organizations, 449
usability blogs, 447
usability testing information, 267
for virtual usability tests, 464, 465
Internet use. See Web use
Interruption invitations for surveys, 337–339
Interviewer/interviewee model for contextual inquiry, 168
Interviewing, 117–127
artifacts in, 123–124
breaking the rules, 127
common problems, 124–126
composing nondirected questions for, 120–122
in contextual inquiry, 164, 170–171
defining terms, 122
dry runs for scripts, 118
listening to questions, 124
neutral interviewer for, 119–120
nondirected, 119–124
not forcing opinions, 122
observers for, 124
preliminary interview in usability testing, 277–279
restating answers, 123
reviewing tapes, 124
running a nondirected interview, 122–124
structure of interviews, 117–118
in task analysis, 184
usability testing combined with, 471–472
for user profiling, 134–135
using examples in, 123
videotaping interviews, 127
See also specific types
Introductions
in contextual inquiry, 169
in focus group discussion guide, 216–219, 251–252
in interviewing, 118
in surveys, 320
in telephone screener, 97–98
in usability testing script, 275–277
"Invisible Computer, The," 506
Invitations
for competitive research, 427
email, 336–337
interruption, 337–339
links for, 336
random selection JavaScript, 337–339
for research participants, 104–106
survey bias from, 334
for survey participants, 335–339
Issues
collecting from stakeholders, 59–60
presenting as goals, 60–61
in research plan example, 77–78
rewriting as research questions, 62–64
Iterative development
benefits, 32–34
corporate edict vs., 29, 30
creation step, 32
definition step, 31
examination step, 31
iterative spiral, 30–32
need for, 28
overview, 28–29
problems, 34–35
scheduling service example, 36–42
user research and, 35–36
waterfall method vs., 30