Chounta & Avouris, Limassol 2011
1. Groupware Evaluation: An Overview
Irene-Angelica Chounta, Nikolaos Avouris
University of Patras, Greece
2. Overview
• A brief review of groupware evaluation
• Lessons learned from evaluation studies
• Discussion
http://hci.ece.upatras.gr
3. Groupware Applications
Software systems that support people involved in a common task to achieve their goals.
4. Groupware Evaluation
The assessment of the strengths and weaknesses of groupware applications and systems.
Groupware systems are expected to support:
– communication between partners
– awareness of others' actions
– the establishment of shared understanding and goals
5. Objectives of groupware evaluation
• the groupware application used as an interactive system
• the mechanics of group interactions and the communication means used
• the collaborative experience
• the group activity outcome
6. Methods based on single-user evaluation practice
• use of heuristics
• user testing
• interviews & questionnaires
• focus groups
• ethnographic methods
Question:
Are they enough to fully capture the groupware case?
How easily can they be applied to groupware (e.g. groupware evaluation in a real-world scenario)?
7. Groupware-specific methods
• Mechanics of collaboration: collaboration, communication and awareness aspects are analyzed and mapped through small-scale actions (Gutwin & Greenberg, 2000)
• Collaboration Usability Analysis (CUA): high- and low-level representations of collaborative activity and its interactions, built through field studies and task analysis techniques (Pinelle et al., 2003)
Gutwin, C., Greenberg, S., "The Mechanics of Collaboration: Developing Low-Cost Usability Evaluation Methods for Shared Workspaces" (2000)
Pinelle, D., Gutwin, C., Greenberg, S., "Task Analysis for Groupware Usability Evaluation: Modeling Shared-Workspace Tasks with the Mechanics of Collaboration" (2003)
8. Analytical methods
• Groupware Task Analysis (GTA): hierarchical task analysis combined with human information processing models, focusing on the triplet of people, work and situation (Van der Veer et al., 1996)
• Distributed GOMS (DGOMS): a representation of group activity used to predict execution time, distribution of workload and other performance variables (Min et al., 1999)
Van der Veer, G.C., Lenting, B.F., Bergevoet, B.A.J., "GTA: Groupware Task Analysis -- Modeling Complexity" (1996)
Min, D., Koo, S., Chung, Y.H., Kim, B., "Distributed GOMS: An Extension of GOMS to Group Task" (1999)
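The prediction idea behind DGOMS can be sketched in a few lines. This is only a toy illustration under assumptions: the operator names, their unit times and the two-phase task below are invented for the example, not the published DGOMS operator set. Each member executes a sequence of primitive operators; members work in parallel within a phase, so a phase takes as long as its slowest member, and sequential phases add up.

```python
# Toy sketch of a DGOMS-style execution-time prediction.
# Operator names and unit times (seconds) are illustrative assumptions.
OPERATOR_TIME = {"perceive": 0.3, "decide": 1.2, "act": 0.8, "communicate": 2.0}

def member_time(ops):
    """Predicted execution time of one member's operator sequence."""
    return sum(OPERATOR_TIME[op] for op in ops)

def phase_time(assignments):
    """A parallel phase ends when the slowest member finishes."""
    return max(member_time(ops) for ops in assignments.values())

def group_time(phases):
    """Sequential phases of parallel work add up."""
    return sum(phase_time(p) for p in phases)

# Two members (A, B) work in parallel within each phase.
phases = [
    {"A": ["perceive", "decide"], "B": ["perceive", "act"]},
    {"A": ["communicate"], "B": ["communicate", "act"]},
]
print(group_time(phases))
```

The same per-member sums give the workload distribution that DGOMS is also used to inspect.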
9. Groupware-specific frameworks
• Participatory evaluation (PETRA): a framework designed to address both theoretical concerns and practical design issues of groupware evaluation (Ross et al., 1995)
• Modeling and mapping awareness within a collaborative setting, focusing on the central relationships underlying the processes of distributed group work (Neale et al., 2004)
• Breakdown analysis: classifying different breakdown and repair scenarios to highlight design implications (Hartswood & Procter, 2000)
Ross, S., Ramage, M., Rogers, Y., "PETRA: Participatory Evaluation Through Redesign and Analysis" (1995)
Neale, D.C., Carroll, J.M., Rosson, M.B., "Evaluating Computer-Supported Cooperative Work: Models and Frameworks" (2004)
Hartswood, M., Procter, R., "Design Guidelines for Dealing with Breakdowns and Repairs in Collaborative Work Settings" (2000)
10. Examples of Evaluation Studies
• traditional HCI methods were combined with groupware evaluation methodologies
• special emphasis on the effectiveness of alternative awareness mechanisms
• qualitative analysis of video to assess the collaborative activity
11. Study A
• Evaluation of a web-based argumentation tool used by communities of practice [1]
• heuristic evaluation for the single-user interface was combined with Groupware Heuristic Evaluation
• both synchronous and asynchronous collaboration were studied
• overall, fifty (50) participants took part in the study
[1] Chounta, I.A., Avouris, N., "Heuristic Evaluation of an Argumentation Tool Used by Communities of Practice" (2009)
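Findings from a heuristic evaluation like Study A's are typically aggregated per heuristic before interpretation. The paper does not publish its analysis procedure, so the following is only a minimal sketch with invented data: each evaluator reports (heuristic, severity) pairs, and ranking heuristics by mean severity and number of reporting evaluators surfaces the most pressing problems.

```python
# Hypothetical aggregation of heuristic-evaluation findings.
# The reports below are invented example data, not results from the study.
from collections import defaultdict

reports = [  # (evaluator, heuristic, severity 0-4)
    ("e1", "visibility of others' actions", 3),
    ("e2", "visibility of others' actions", 4),
    ("e1", "protection of shared work", 2),
    ("e3", "visibility of others' actions", 3),
]

by_heuristic = defaultdict(list)
for _, heuristic, severity in reports:
    by_heuristic[heuristic].append(severity)

# Rank by mean severity, then by how many evaluators flagged the problem.
ranked = sorted(
    ((sum(s) / len(s), len(s), h) for h, s in by_heuristic.items()),
    reverse=True,
)
for mean_sev, n, h in ranked:
    print(f"{h}: mean severity {mean_sev:.1f} from {n} evaluator(s)")
```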
12. Results of Study A
• Various aspects need to be studied and evaluated separately for an overall assessment
• Expert-based inspection methods need to be combined with user observation
• We need to focus not only on the collaborative functionality but also on user interface design issues
• Most of the issues observed in a collaborative session were due to flaws in the interface design rather than communication and awareness problems
13. Study B
• Qualitative study of synchronous collaboration for problem solving, on the allocation of attention resources during different collaborative sessions [1]
• Three dyads' practice was monitored with an eye tracker and analyzed
• The dyads were formed in order to study different group dynamics
• The logfiles of the collaborative activity were combined with the logfiles of the eye tracker to analyze the interplay between task, awareness mechanisms and collaborative practice
[1] Chounta, I.A., Avouris, N., "Study of the Effect of Awareness on Synchronous Collaborative Problem-Solving" (2010)
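Combining the two logfiles amounts to aligning timestamped streams. The study's actual tooling and log formats are not published, so this is a minimal sketch under assumed formats: each gaze sample is attributed to the most recent workspace event at or before its timestamp, so attention can then be analyzed per collaborative action.

```python
# Sketch of aligning an activity log with eye-tracker samples by timestamp.
# Log formats and entries below are assumptions for illustration.
import bisect

activity_log = [  # (timestamp_s, event), sorted by time
    (0.0, "A inserts node"),
    (2.5, "B edits text"),
    (6.0, "A sends chat message"),
]
gaze_samples = [(1.0, "workspace"), (3.0, "chat"), (7.0, "chat")]

event_times = [t for t, _ in activity_log]

def event_for(gaze_t):
    """Most recent activity-log event at or before a gaze timestamp."""
    i = bisect.bisect_right(event_times, gaze_t) - 1
    return activity_log[i][1] if i >= 0 else None

combined = [(t, area, event_for(t)) for t, area in gaze_samples]
for row in combined:
    print(row)
```

Binary search keeps the alignment fast even for dense eye-tracker streams (tens of samples per second against a sparse event log).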
14. Results of Study B
• awareness and communication failures are often interpreted as unwillingness to collaborate or a gradual loss of interest in the collaborative activity
• the lack of adaptive awareness mechanisms that help users set priorities leads users to withdraw from the joint activity
• partners remain visible in the common workspace and are aware of their partners' actions, but take no actual role in the collaborative activity
15. Discussion
• The complexity of the group activity setting makes it difficult to identify the source of observed problems
• The outcome of collaborative activities relies on many factors, such as the quality of collaboration, the context of the activity and the tools that mediate the activity
• There are indications that single-user evaluation methodologies should be combined with groupware-specific methods
• There is still a long way to go in establishing an evaluation framework for a wide range of collaborative applications
• Open issue: how to include the context of use and the quality of the outcome in the evaluation process
16. Thank you
hci.edu.gr