Caching of Web pages
defined, 404–405
institutional, 405–406
log analysis and, 405–406
personal, 405
Card sorting, 192–199
analyzing output informally, 195
analyzing output using cluster analysis, 195–198
benefits, 199
creating cards, 193–194
described, 192
for information architecture development, 47
preparation, 193
for prioritization, 198–199
process of, 193–194
the sort, 194
timing for, 192–193
Cardioid microphones, 225
Causation, correlation vs., 354
CCCORP (Computer Consultants Corporation) user advisory board charter, 388–389
Characteristic survey questions and subcategories, 308
Charters for user advisory boards, 388–391
Checklist survey questions, 310–311
Chi-square test, 354
Chief Experience Officer (CXO), 515
Click-throughs, 22
Clickstream analysis
average path, 413
content clustering, 414
cookies used for, 408
defined, 413
diary studies vs., 370
"next" pages, 413
overview, 413–414
purchase path, 413
shopping cart abandonment, 413–414
See also log analysis
Client domain statistics in log analysis, 410
Close-ended questions
open-ended vs., 121, 124–125
for surveys, 310, 311
Cluster analysis
analyzing card sorting output using, 196–198
defined, 195–196
diagrams, 197–198
EZSort software for, 196–198
Clustering of audience attributes, 143–145
c|net
differentiation by, 24–25
navigation inconsistency in sites, 432–433
visual theme of, 50–51
Coding data
customer support comments, 400–401
defined, 400
diaries, 383
from focus groups, 242–243
software for, 401
Combined techniques
focus groups and diaries, 469–471
log analysis and usability tests, 473–474
observational interviews and usability tests, 471–472
surveys and focus groups, 472–473
task analysis and usability tests, 474–475
Comments
coding for focus groups, 242–243
leaving space in surveys for, 319
See also customer feedback analysis
Commercial recruiters. See professional recruiters
Committees, user. See user advisory boards
Communications of the ACM (June 1993), 468
Companies
conflicting agendas in, 501–502
culture for iterative development, 34–35
independent analysis specialists, 441
success criteria for, 23–27
traffic/demographic specialists, 442
as Web site stakeholders, 17
See also user-centered corporate culture
Comparing survey variables, 345–349
Competitive advantage from user-centered processes, 517
Competitive profiles, 423–426
audience profile, 424
features and attributes, 424–426
product description, 423–424
Competitive research
acting on, 433–434
analyzing the data, 431–433
benchmarking, 433
competitive analysis techniques, 426–431
contextual inquiry, 427
extracting from focus group data, 246
feature audit, 425
focus groups for, 67, 206, 428
identifying the competition, 421–423
identity design and, 51, 52
invitations for, 427
profiling the competition, 423–426
recruiting for, 426–427
sorting competitors into categories, 422–423
surveys, 308, 430–431
usability testing, 67, 68, 428–430
user experience research, 419
uses for, 420
ZDNet example, 434–437
Computer Consultants Corporation (CCCORP) user advisory board charter, 388–389
ComScore, 442
Conclusion section of reports, 492
Confidence interval, 351–352
Confidentiality, reports and, 488
Confirming participant appointments, 106–108
Conflicting agendas in companies, 501–502
Conglomerate statistics in log analysis, 409–410
Consistency
in survey questions, 316
of user interfaces, 49
Consultants
contacting by email and phone, 454
finding, 449–454
guidelines for managing, 456–457
hiring, 447–457
reasons for using, 439
as resources after the initial research, 457
RFPs (request for proposals) for, 450–454
savings from using, 439
setting expectations, 454–457
timing for using, 448–449
See also specialists
Contact information for surveys, 320, 321, 322
Content clustering, clickstream analysis of, 414
Context
development over time, 369
presentations to engineers and, 496–497
Contextual Design, 57, 160, 176, 511
Contextual inquiry, 160–182
affinity diagrams for analyzing data, 176–179
after release, 67
analyzing data, 174–182
artifacts, 172
in atypical situations, 166
authenticity issues, 170
benefits and pitfalls, 69
big brother model, avoiding, 169
building models, 180–181
competitive, 427
conducting the inquiry, 167–169
defined, 160
developing scenarios, 166
expert/novice model, avoiding, 168
follow-up interview, 164, 170–171
frequency vs. importance and, 181
goals and, 176
guest model, avoiding, 168
for information architecture development, 47
information to collect, 171–173
interviewer/interviewee model, avoiding, 168
introduction and warm-up, 169
in iterative development cycle, 36
learning the domain, 165
limitations, 395
main observation period, 169–170
master/apprentice model, 167
mental models and, 175
methods and, 171–172, 175–176
organizing questions into projects, 72
overview, 69, 160–161
partnership model, 167–168
practical preparations for, 166–167
privacy and, 170
process of, 162–173
recruiting for, 163–164
for requirement gathering, 67, 68
results, 181–182
schedule for, 162
scheduling participants, 164–165
scheduling service example, 37
structure of the inquiry, 169–171
as survey follow-up, 357–358
target audience, 163
task analysis vs., 182, 184
terminology and, 175
time requirements, 76
tools and, 171, 175
uses for, 161–162
values and, 176
videotaping, 173–174
warm-up, 169
wrap-up, 170
Contractors. See specialists
Conversion rate, 412
Cookies
clickstreams using, 408
expiration dates, 407–408
log analysis and, 406–408
sampling and tracking cookies, 330
session vs. identity, 407–408
turning on session tracking, 407
Cooper, Alan, 130
Corporate culture. See user-centered corporate culture
Corporate edict, iterative development vs., 29, 30
Corporations. See companies
Correlation, causation vs., 354
Costs
equipment, 76, 530–531
incentives, 76, 108–109, 164
professional recruiters, 116–117
See also budgets
Counting survey results. See tabulating survey results
Creation step in iterative development
overview, 32
scheduling service example, 38, 39, 40, 41
CRM (customer relationship management), 416, 418
Cross-tabulation, 345–349
Cultural models, 181
Culture, corporate. See user-centered corporate culture
Cummings, e. e., 498
Current users, identity design and, 52
Customer feedback analysis
analyzing comments, 401–402
benefits and pitfalls, 71
coding comments, 400–401
collecting comments, 397–398
customer support process, 396–397
described, 71
negative nature of comments, 396
organizing comments by subject and severity, 400
reading comments, 398–399
scheduling service example, 42
"stock" answers to questions, 397
tabulating comments, 401
tips, 398–399
uses for, 395, 396
Customer relationship management (CRM), 416, 418
Customer support. See customer feedback analysis
Customers, users vs., 134
CXO (Chief Experience Officer), 515