USABILITY EVALUATION
    USER TESTS
TERM PAPER SCHEDULE
- December 19th
  - Set your topic/research question
  - Write a short literature review and a draft version (2-3 pages)
  - Write the methodology
- Turn in your draft: 10 pts
  - Late submission: -2 pts per day
- January 9th
  - Paper submission
  - Late submission: -5 pts per day

TERM PAPER
- Survey
  - One approach is to choose an area of research, read all the relevant studies, and organize them in a meaningful way.
    - An example of an organizing theme is a conflict or controversy in the area, where you might first discuss the studies that support one side, then discuss the studies that support the other side.
  - Another approach is to choose an organizing theme or a point that you want to make, then select your studies accordingly.
- Empirical
  - Gather data on an HCI topic

LITERATURE REVIEWS VERSUS RESEARCH ARTICLES
- Literature reviews survey research done in a particular area.
  - Although they also evaluate methods and results, their main emphasis is on knitting together theories and results from a number of studies to describe the "big picture" of a field of research.
- Research articles, on the other hand, are empirical articles, specifically describing one or a few related studies.
  - Research articles tend to focus on methods and results to document how a particular hypothesis was tested.
  - The introduction of a research article is like a condensed literature review that gives the rationale for the study that has been conducted.

EMPIRICAL PAPER - STRUCTURE
- Title
- Abstract
- Introduction (literature review & research question)
- Method
- Results
- Discussion
- Bibliography

ABSTRACT
A one-paragraph summary containing:
- A statement of the objective/purpose of the investigation
- A description of the participants
- A brief description of what participants did
- A summary of the findings

INTRODUCTION
- Literature review
  - Background & rationale (previous research, what it found, what it identifies as possible issues/questions)
  - Use EBSCO, ScienceDirect, SCI, SSCI, etc.
- Statement of purpose
  - "The current study was conducted to evaluate the effect of X on Y" ... or ... "to find out what factors lead to Z" or "to determine the relationship between A and B"

METHOD
Provide enough detail for a reader to replicate the study:
- Who participated (number, characteristics, volunteer or randomly selected)
- What materials were employed (systems, questionnaires - design, validity and reliability)
- What data were collected (dependent variables, i.e. scores, ratings, responses)
- What participants were required to do (where, who, sequence of events - include instructions & tasks)

RESULTS
- How have the data been treated?
- Text and graphs
- Statistics - descriptive/inferential
- Summarize the results

INDEPENDENT AND DEPENDENT VARIABLES
- Independent - what the experimenter does to the subject (e.g. exposure to an interface, training, or a mental model) or selects for (e.g. age or gender)
  - levels & controls
- Dependent - any behavior, performance, or attitude of the subject that is measured as the 'outcome' (e.g. scores on a test, type of knowledge)

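To make the distinction concrete, here is a minimal sketch of how one trial might be recorded, with the interface version as the independent variable and task time and errors as dependent variables. All names and values below are hypothetical, for illustration only.

    from dataclasses import dataclass

    @dataclass
    class Trial:
        participant_id: str
        interface: str        # independent variable: which design was used
        task: str
        time_seconds: float   # dependent variable: task completion time
        errors: int           # dependent variable: number of errors made

    # Hypothetical records, for illustration only
    trials = [
        Trial("P01", "new_design", "find_product", 42.5, 1),
        Trial("P02", "old_design", "find_product", 61.0, 3),
    ]

    # Group a dependent measure by level of the independent variable
    times_by_interface = {}
    for t in trials:
        times_by_interface.setdefault(t.interface, []).append(t.time_seconds)
    print(times_by_interface)  # {'new_design': [42.5], 'old_design': [61.0]}
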
DISCUSSION
- Interpretation
  - What do the results mean in terms of your original question?
  - Why do you think they turned out like this?
- Critique your study (limitations) and recommend improvements
- Suggestions for further research

CREDIBILITY OF THE STUDY
- Definition of the construct being measured
- Congruence between method & question
- Measurement
  - Bias: instructions & instrument (wording), administration
  - Reliability (stability/decision consistency)
    - the degree to which an instrument measures the same way each time it is used under the same conditions with the same subjects
  - Validity
    - the strength of our conclusions, inferences or propositions
    - Were we right?

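One common way to estimate reliability of this kind is a test-retest correlation: administer the same instrument twice under the same conditions and correlate the two sets of scores. Below is a minimal sketch with made-up scores; it assumes Python 3.10+ for statistics.correlation (Pearson's r), and this particular estimate is an illustrative choice rather than the one prescribed here.

    from statistics import correlation  # Pearson's r, available in Python 3.10+

    # Hypothetical scores from the same participants on two administrations
    # of the same questionnaire (made-up numbers, for illustration only)
    first_run = [4.0, 3.5, 5.0, 2.0, 4.5, 3.0]
    second_run = [4.2, 3.4, 4.8, 2.5, 4.4, 3.1]

    # Values close to 1.0 suggest the instrument measures consistently
    r = correlation(first_run, second_run)
    print(f"test-retest r = {r:.2f}")
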
PILOT YOUR METHOD….
Try your method before capturing data for real:
- Ask friends to answer your survey, take your test, perform your experiment, etc.
- Look for issues that confuse them (or you!) and modify accordingly

STATISTICS (IF YOU CHOOSE A QUANTITATIVE APPROACH)
- Descriptive/summary stats are sufficient
  - Mean (median, mode)
  - Range
  - Standard deviation (if you know how)
  - Run tests if you are comfortable
- Provide tabulated raw data if possible (put it in appendices if large)

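A minimal sketch of these summary statistics, using Python's standard statistics module on made-up task-completion times (the numbers are illustrative only):

    from statistics import mean, median, mode, stdev

    # Hypothetical task-completion times in seconds (illustrative only)
    times = [34, 41, 29, 52, 41, 38, 47, 33]

    print("mean:  ", mean(times))
    print("median:", median(times))
    print("mode:  ", mode(times))
    print("range: ", max(times) - min(times))
    print("st.dev:", round(stdev(times), 1))  # sample standard deviation
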
COMMON PITFALLS
- Rambling, unfocused style
  - Keep a question in mind as you write
- All claims and opinions, no evidence
  - Cite literature that supports your argument
- Misses relevant topics from class
  - Try to see how the readings and lectures fit

DETERMINANTS OF USABILITY RATING
[Diagram: Effectiveness, Efficiency, and Satisfaction determine the Usability Rating, which feeds into the Design Process.]

PLAN A USABILITY EVALUATION
- Describe:
  - What data will you collect?
  - What will these data tell you?
  - What data collection methods will you employ?
  - How long will it take to produce the test results?
  - What form of feedback will you provide?
- List the advantages/disadvantages of this plan

THOUGHT ACTIVITY
- VESTEL is designing an interface for its new VCD product (including the remote control) for the Turkish market, which aims to be the 'most usable' on the market. How would you test this claim?
- ASELSAN wishes to test its new weapon control system. You are charged with designing the test.
- An educational software company has developed an educational game for use in elementary schools; they want to know if it is usable/enjoyable.
- The Municipality of Ankara is producing an information booth for use in visitor centers, ASTI, etc., offering information and advice to visitors. They ask for a usability test. (Zero learning time.)

SETTING USABILITY CRITERIA
"Product X is usable to the extent that 70% of users, with no additional training, can perform all tasks with 95% accuracy, 25% faster than with the existing application, and report at least equivalent satisfaction."

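As a rough illustration of how such a criterion could be checked against test data, here is a minimal sketch. The thresholds come from the example statement above; the per-user numbers, the baseline figures, and all field names are hypothetical.

    # Hypothetical per-user results (illustrative numbers only)
    users = [
        {"all_tasks": True,  "accuracy": 0.97, "time": 48, "satisfaction": 4.1},
        {"all_tasks": True,  "accuracy": 0.96, "time": 55, "satisfaction": 3.9},
        {"all_tasks": False, "accuracy": 0.90, "time": 71, "satisfaction": 3.2},
        {"all_tasks": True,  "accuracy": 0.95, "time": 50, "satisfaction": 4.4},
    ]
    baseline_time = 70.0          # assumed mean task time with the existing application
    baseline_satisfaction = 3.8   # assumed mean satisfaction with the existing application

    def meets_criterion(u):
        return (u["all_tasks"]                                   # performed all tasks
                and u["accuracy"] >= 0.95                        # 95% accuracy
                and u["time"] <= 0.75 * baseline_time            # 25% faster than baseline
                and u["satisfaction"] >= baseline_satisfaction)  # at least equivalent satisfaction

    share = sum(meets_criterion(u) for u in users) / len(users)
    print(f"{share:.0%} of users meet the criterion (target: 70%)")
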
OR.....
"Product X is usable to the extent that 80% of users, with 2 days of training, can perform 90% of routine tasks with >90% accuracy, as efficiently as with the existing application, and report increases in satisfaction."

INSTEAD OF....
- "Product X is usable"
  (a meaningless statement for HCI)
- "This new application is more usable than the old application"
  (which raises the questions: "More usable in what sense? And for whom? And where?")

WHO SETS THE USABILITY CRITERIA?
- Purchasers
  - can be the basis for a contract
- Designers
  - basis for design targets
- Evaluators
  - provide the context/limits of generalization for the evaluation
- Users
  - key stakeholders with privileged knowledge

HOW ARE CRITERIA DERIVED?
- User analysis
- Task analysis
- Situation analysis

OUTPUT
- Scenarios of use
  - "Stories" of interaction in which users, tasks and contexts are described
- Scenarios form the basis of decisions on
  - Effectiveness
  - Efficiency
  - Satisfaction

FRAMEWORK FOR USABILITY EVALUATION
- Approach and Type
  - Approach refers to the source of data
    - User, Expert, or Model
  - Type refers to the purpose of the evaluation
    - Diagnostic (formative) or Metrication (summative)
- Any evaluation method is a combination of approach and type

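As a small illustration of that combination space, the sketch below enumerates every approach x type pairing; the labels are taken from this slide and the code itself is only illustrative.

    from itertools import product

    approaches = ["User", "Expert", "Model"]                       # source of data
    types = ["Diagnostic (formative)", "Metrication (summative)"]  # purpose of evaluation

    # Any evaluation method can be located somewhere in this approach x type grid
    for approach, eval_type in product(approaches, types):
        print(f"{approach}-based, {eval_type}")
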
EVALUATION APPROACH
The approach defines the source of the data, i.e., where does the evaluator gain the data about usability?
- from real users? (User-based)
- from usability experts or self-evaluation? (Expert-based)
- from the application of a formal theory or model? (Model-based)

CONDUCTING YOUR TEST: THINGS TO CONSIDER…
- How many users?
- Length of the test session?
- Where to conduct the session?
- Role of the facilitator:
  - put participant(s) at ease (you are testing the material, not them)
  - observe and take notes
  - do not intervene or assist
- Role, placement and responsibilities of other observers
- Verbal protocol ("think-aloud")
- Token reward for participation (if appropriate)

DATA COLLECTION
- Quantitative data
  - Number of errors made using the system
  - Time required for each activity
- Qualitative data
  - Ease of use: are the materials convenient, easy to locate, easy to use?
  - Learners' reactions to materials, activities, evaluation

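A minimal sketch of logging the quantitative measures above (time and errors) to a CSV file, one row per participant per task. The file name, field names, and values are illustrative assumptions, not part of any prescribed tool.

    import csv
    import time

    # One row per participant per task; field names are illustrative
    FIELDS = ["participant", "task", "seconds", "errors", "completed"]

    def log_task(writer, participant, task, start, end, errors, completed):
        writer.writerow({"participant": participant, "task": task,
                         "seconds": round(end - start, 1),
                         "errors": errors, "completed": completed})

    with open("usability_log.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        start = time.monotonic()
        # ... the participant performs the task while the observer counts errors ...
        end = time.monotonic()
        log_task(writer, "P01", "find_product", start, end, errors=2, completed=True)
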
ANALYZING & REPORTING YOUR USABILITY RESULTS
- Quantitative data
  - descriptive data (number of users, time spent, errors)
  - be sure to discuss any data tables (what do they mean?)
- Qualitative data
  - consolidate your observations (negatives and positives!)
  - extract common themes
  - identify critical themes (e.g. length of time required)
  - perform member checking if possible
  - determine solutions for addressing the problems
  - summarize and present your findings and solutions

Structure the report as a flow: Observations → Interpretation → Recommendation

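To illustrate the "extract common themes" step above, here is a minimal sketch that tallies how often each coded observation theme was noted, and how often it was negative. The theme labels and counts are made up for illustration.

    from collections import Counter

    # Hypothetical coded observations: (participant, theme, +/-)
    observations = [
        ("P01", "navigation confusing", "-"),
        ("P02", "navigation confusing", "-"),
        ("P03", "liked the search box", "+"),
        ("P02", "task took too long", "-"),
        ("P03", "navigation confusing", "-"),
    ]

    theme_counts = Counter(theme for _, theme, _ in observations)
    negative_counts = Counter(theme for _, theme, sign in observations if sign == "-")

    for theme, count in theme_counts.most_common():
        print(f"{theme}: noted {count} time(s), {negative_counts[theme]} negative")
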
PROTOCOL
- Introduction
  - Thank you... for agreeing to participate in this session.
  - Product description... a CD-ROM "book" on the topic of visual design for instructional multimedia.
  - Purpose of the session... to make this product better.
  - This product does have problems.
  - Any problems you have or find with the product are with the product, not your fault.
- Instructions...
  - I'll be asking you to do certain things with the program and watching and writing notes as you do them. That's just to help me remember how things went later on.
  - To help me do this, I'd like you to "think out loud" as you use the program and make your decisions to do certain things.
  - I'd like you to try to perform the given tasks on your own as best you can. If you're really stuck, I may be able to help, but I'd really like you to try it without my help.
  - At any time, you can quit a particular task and move on, or you may choose to quit the entire session.

OBSERVATION SHEET

Start time: ____        Finish time: ____        (efficiency)

    page/link name            | Notes                          | +/-
    --------------------------+--------------------------------+-----
    (name of starting page)   |                                |

(The start/finish times capture efficiency; the notes and the +/- column capture effectiveness.)

AND NOW.....THE USER
- How should we think of users?
- The user as a psychological being
- Parameters of human information processing
- Emergence of skilled behavior

NIELSEN (1993) - THE RANGE OF USER TYPES:
[Diagram: users plotted along two axes - computer experience (from minimal to extensive) and domain knowledge (from ignorant about the domain to knowledgeable about the domain).]

FOUR LEVELS OF THE USER IN HCI
- Psychophysiological
  - HCI issue: brain-computer interaction (BCI)
- Perceptual
  - HCI issue: screen layout, readability
- Cognitive
  - HCI issue: task structure, human learning
- Social
  - HCI issue: CSCW, organizational impact

BASIC PROPERTIES OF ALL USERS
- Changes with experience
- Actively learns
- Limited attention
- Makes mistakes
- Models the system in their mind
- Remains unique
- Goal oriented

Some Resources
- http://www.hcibib.org/
- http://www.acm.org/sigchi/
- Usability.gov (research-based evidence)
- http://www.microsoft.com/playtest/publications.htm
- comp.human-factors (USENET newsgroup)

Some Journals
- ACM Transactions on Computer-Human Interaction
- Human-Computer Interaction
- ACM Interactions
- International Journal of Human-Computer Interaction
- User Modeling and User-Adapted Interaction
- Computers in Human Behavior
- Human Factors in Computing Systems
- http://www.sciencedirect.com/science/journals