Chapter 4
Performance Metrics


          Presenter: 00335011 魏傳諺
Agenda


• Preface
• Task Success
• Time-on-Task
• Errors
• Efficiency
• Learnability
Preface of Performance Metrics

•   Based on specific user behaviors
     – User behaviors
     – The use of scenarios or tasks
•   How well users are actually using a product
•   Useful to estimate the magnitude of a specific usability issue
     – How many people are likely to encounter the same issue after the product is
       released?
     – How many users are able to successfully complete a core set of tasks using
       a product
•   Not a magical elixir for every situation
     – Sample size
     – Time & money
     – Tell the "what" very effectively, but not the "why"
Five Basic Types

                • The most widely used performance metric
Task Success    • How effectively users are able to complete a given set of tasks


Time-on-Task    • How much time is required to complete a task



   Errors       • Reflect the mistakes made during a task



  Efficiency    • The amount of effort a user expends to complete a task



 Learnability   • How performance changes over time
TASK SUCCESS
Task Success

• The most common usability metric
• As long as the user has a well-defined task, you can measure
  success
Collecting Any Type of Success Metric

• Each task must have a clear end-state
    – Define the success criteria → Data collection
        • Find the current price for a share of Google stock (clear end-state)
        • Research ways to save for your retirement (not a clear end-state)

• Ways to collect success data
    – Verbally articulate the answer after completing the task
    – Provide their answers in a more structured way
        • Try to avoid write-in answers if possible

• In some cases the correct solution to a task may not be verifiable
    – depends on the user's specific situation
    – testing is not being performed in person
Binary Success

•   Either participants complete a task successfully or they don't
•   How to Collect and Measure
     – Score each task as 1 (success) or 0 (failure)
•   How to Analyze and Present
     – By individual task
     – By user or type of user
         • Frequency of use
         • Previous experience using the product
         • Domain expertise
         • Age group
          • Can calculate the percentage of tasks that each participant successfully completed
               – Binary data → Continuous data

•   Calculating Confidence Intervals
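
The bullets above only name the steps. As a rough illustration, here is a minimal Python sketch (hypothetical data) that scores each participant's outcome as 0 or 1, computes the success rate, and estimates a 95% confidence interval with the adjusted Wald formula, one common choice for small-sample binary data:

```python
import math

def adjusted_wald_ci(successes, trials, z=1.96):
    """Approximate 95% CI for a binary success rate (adjusted Wald method)."""
    adj_n = trials + z ** 2
    adj_p = (successes + z ** 2 / 2) / adj_n
    margin = z * math.sqrt(adj_p * (1 - adj_p) / adj_n)
    return max(0.0, adj_p - margin), min(1.0, adj_p + margin)

# Hypothetical outcomes: 1 = task completed successfully, 0 = failure
outcomes = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]
rate = sum(outcomes) / len(outcomes)
low, high = adjusted_wald_ci(sum(outcomes), len(outcomes))
print(f"Success rate: {rate:.0%}, 95% CI: {low:.0%}-{high:.0%}")
```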
Levels of Success

• Partially completing a task?
   – coming close to fully completing a task may provide value to the
     participant
   – Helpful for you to know
       • Why some participants failed to complete a task
       • With which particular tasks they needed help
Levels of Success (cont’d)

• How to Collect and Measure
   – Must define the various levels
   – Based on the extent or degree to which a participant completed the task
       • Complete Success, Partial Success, and Failure
        • What constitutes "giving assistance" to the participant
       • Assign a numeric value for each level
       • Does not differentiate between different types of failure
   – Based on the experience in completing a task
       • No Problem, Minor Problem, Major Problem, and Failure/Gave up
        • Ordinal data → No average score
   – Based on the participant accomplishing the task in different ways
        • Depending on the quality of the answer (no numeric score needed)
Levels of Success (cont’d)

• How to Analyze and Present
   – To create a stacked bar chart
   – To report a “usability score”
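
As a concrete (and entirely hypothetical) illustration of both options, the sketch below tallies the share of participants at each success level per task, which is the data a stacked bar chart would display, and collapses the levels into a single weighted score using assumed weights:

```python
from collections import Counter

# Hypothetical level-of-success ratings, one per participant, per task
levels = {
    "Task 1": ["Complete", "Complete", "Partial", "Failure", "Complete"],
    "Task 2": ["Partial", "Partial", "Complete", "Failure", "Partial"],
}
# Assumed numeric weights for each level (a judgment call, not a standard)
weights = {"Complete": 1.0, "Partial": 0.5, "Failure": 0.0}

for task, results in levels.items():
    counts = Counter(results)
    n = len(results)
    # Percentage at each level: this is what one bar of a stacked bar chart shows
    breakdown = ", ".join(f"{lvl} {counts.get(lvl, 0) / n:.0%}" for lvl in weights)
    # Weighted average collapses the levels into a single "usability score"
    score = sum(weights[r] for r in results) / n
    print(f"{task}: {breakdown}  ->  score {score:.2f}")
```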
Issues in Measuring Success

• How to define whether a task was successful?
   – When unexpected situations arise
       • Make note of them
       • Afterward try to reach a consensus

• How or when to end a task
   – Stopping rule
       • Complete task / Reach the point at which they would give up or seek
         assistance
        • “Three strikes and you're out”
       • Set a time limit
   – If the participant is becoming particularly frustrated or agitated
TIME-ON-TASK
Time-on-Task

• Way to measure the efficiency of any product
    – The faster a participant can complete a task, the better the experience
• Exceptions to the assumption that faster is better
    – Game
    – Learning
Importance of Measuring Time-on-Task

• Particularly important for products
    – where tasks are performed repeatedly by the user
• The side benefits of measuring time-on-task
    – Increasing Efficiency → Cost Savings → Actual ROI
How to Collect and Measure Time-on-Task

•   The time elapsed between the start of a task and the end of a task
     – In minutes
     – In seconds
•   Measure by any time-keeping device
     – Start time & End time
     – Two people record the times
•   Automated Tools for Measuring Time-on-Task
     – less error-prone
     – Much less obtrusive
•   Turning on and off the Clock
     – Rules about how to measure time
          • Start the clock as soon as they finish reading the task
          • End timing when the participant hits the “answer” button
          • Stop timing when the participant has stopped interacting with the product
How to Collect and Measure Time-on-Task (cont’d)

• Tabulating Time Data
Analyzing and Presenting Time-on-Task Data

•   Ways to present
     – Mean
     – Median
     – Geometric mean
•   Ranges
     – Time interval
•   Thresholds
     – Whether users can complete certain tasks within an acceptable amount of
       time
•   Distributions and Outliers
     – Exclude outliers (> 3 SD above the mean)
     – Set up thresholds
     – Determine the fastest possible time
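
A minimal sketch of these calculations, using made-up task times in seconds; the 3-standard-deviation cutoff is the rule of thumb mentioned above, and the geometric mean is computed from log-transformed times:

```python
import math
import statistics

# Hypothetical times (seconds) for one task; 400 s is far above the rest
times = [34, 41, 38, 52, 47, 39, 44, 36, 50, 43, 37, 48, 42, 45, 400]

mean_t = statistics.mean(times)
median_t = statistics.median(times)
geo_mean_t = math.exp(statistics.mean(math.log(t) for t in times))

# Rule of thumb from above: drop times more than 3 SD above the mean
cutoff = mean_t + 3 * statistics.stdev(times)
kept = [t for t in times if t <= cutoff]

print(f"mean {mean_t:.1f}s, median {median_t:.1f}s, geometric mean {geo_mean_t:.1f}s")
print(f"outlier cutoff {cutoff:.1f}s -> kept {len(kept)} of {len(times)} times")
```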
Issues to Consider When Using Time Data

•   Only Successful Tasks or All Tasks?
     – Advantage of only including successful tasks
          • A cleaner measure of efficiency
     – Advantage of including all tasks
          • A more accurate reflection of the overall user experience
          • An independent measure in relation to the task success data
     – If the moderator always determined when to end the task → include all times
     – If participants sometimes decided when to end → only include successful tasks
•   Using a Think-Aloud Protocol?
     – Think-aloud protocol: to gain important insight
     – Have an impact on the time-on-task data
     – Retrospective probing technique
•   Should You Tell the Participants about the Time Measurement?
     – If so, ask them to perform the tasks as quickly and accurately as possible
ERRORS
Errors

• Usability issue vs. Error
    – A usability issue is the underlying cause of a problem
    – One or more errors are a possible outcome
• Errors
    – incorrect actions that may lead to task failure
When to Measure Errors

• When you want to understand the specific action or set of actions
  that may result in task failure
• Errors can tell
    – How many mistakes were made
    – Where they were made within the product
    – How various designs produce different frequencies and types of errors
    – How usable something really is
• Three general situations where measuring errors might be useful
    – When an error will result in a significant loss in efficiency
    – When an error will result in significant costs
    – When an error will result in task failure
What Constitutes an Error?

• No widely accepted definition of what constitutes an error
• Based on many different types of incorrect actions by the user
    – Entering incorrect data into a form field
    – Making the wrong choice in a menu or drop-down list
    – Taking an incorrect sequence of actions
    – Failing to take a key action
• Determine what constitutes an error
    – Make a list of all the possible actions
    – Define many of the different types of errors that can be made
What Constitutes an Error? (cont’d)
Collecting and Measuring Errors

• Not always easy
   – Need to know what the correct (set of) action(s) should be
• Consideration
   – Only a single error opportunity
   – Multiple error opportunities
• Way of organizing error data
   – Record the number of errors for each task and each user
    – Ranges from 0 to the maximum number of error opportunities
Analyzing and Presenting Errors

•   Tasks with a Single Error Opportunity
     – Look at the frequency of the error for each task
          • Frequency of errors
          • Percentage of participants who made an error for each task
     – From an aggregate perspective
          • Average the error rates for each task into a single error rate
          • Take an average of all the tasks that had a certain number of errors
          • Establish maximum acceptable error rates for each task
•   Tasks with Multiple Error Opportunities
      – Look at the frequency of errors for each task → error rate
     – The average number of errors made by each participant for each task
     – Which tasks fall above or below a threshold
     – Weight each type of error with a different value and then calculate an “error score”
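
A small sketch of the single- versus multiple-opportunity calculations, with hypothetical error counts; a weighted "error score" would simply multiply each error type's count by its weight before summing:

```python
# Hypothetical error counts per participant for each task, plus the number of
# error opportunities each task offers
error_counts = {
    "Task 1": [0, 1, 0, 2, 0],
    "Task 2": [1, 0, 0, 0, 1],
}
opportunities = {"Task 1": 4, "Task 2": 1}

for task, errors in error_counts.items():
    n = len(errors)
    if opportunities[task] == 1:
        # Single error opportunity: percentage of participants who made the error
        pct = sum(1 for e in errors if e > 0) / n
        print(f"{task}: {pct:.0%} of participants made the error")
    else:
        # Multiple opportunities: error rate = total errors / total opportunities,
        # plus the average number of errors per participant
        rate = sum(errors) / (n * opportunities[task])
        avg = sum(errors) / n
        print(f"{task}: error rate {rate:.1%}, avg {avg:.1f} errors per participant")
```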
Issues to Consider When Using Error Metrics

• Make sure you are not double-counting errors
• Need to know
    – An error rate, and
    – Why different errors are occurring
• If an error is the same as failing to complete the task
    – Report errors as task failure
EFFICIENCY
Efficiency

• Time-on-task is one measure of efficiency
• Another is to look at the amount of effort required to complete a task
   – In most products, the goal is to minimize the amount of effort
   – two types of effort
       • Cognitive
           – Finding the right place to perform an action
           – Deciding what action is necessary
           – Interpreting the results of the action
       • Physical
           – The physical activity required to take action
Collecting and Measuring Efficiency

• Identify the action(s) to be measured
• Define the start and end of an action
• Count the actions
• Actions must be meaningful
    – Incremental increase in cognitive effort
    – Incremental increase in physical effort
• Look only at successful tasks
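
One way to turn counted actions into a number (an illustrative choice, not something the slide prescribes) is to compare the average number of actions on successful tasks against the minimum number of actions the task requires:

```python
# Hypothetical data: number of actions (e.g., clicks or page views) each
# participant took on one task, and whether they completed it
attempts = [
    {"actions": 6, "success": True},
    {"actions": 12, "success": True},
    {"actions": 9, "success": False},   # excluded below: task not completed
    {"actions": 7, "success": True},
]
MIN_ACTIONS = 5  # assumed minimum number of actions the task can be done in

successful = [a["actions"] for a in attempts if a["success"]]
avg_actions = sum(successful) / len(successful)

# Ratio of the minimum path to the average observed path; 100% = no wasted effort
efficiency = MIN_ACTIONS / avg_actions
print(f"avg actions on successful attempts: {avg_actions:.1f}, "
      f"vs. minimum path: {efficiency:.0%}")
```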
Analyzing and Presenting Efficiency Data
Analyzing and Presenting Efficiency Data (cont’d)
Efficiency as a Combination of Task Success and Time


• Task Success + Time-on-Task
• Core measure of efficiency
   – The ratio of the task completion rate to the mean time per task
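
A minimal sketch of that ratio, with hypothetical completion rates and mean times; expressing time in minutes makes the result readable as "percent successful per minute":

```python
# Hypothetical per-task results: completion rate and mean time in minutes
tasks = {
    "Task 1": {"completion_rate": 0.90, "mean_time_min": 1.5},
    "Task 2": {"completion_rate": 0.65, "mean_time_min": 3.0},
}

for name, t in tasks.items():
    # Efficiency = task completion rate / mean time per task
    efficiency = t["completion_rate"] / t["mean_time_min"]
    print(f"{name}: {efficiency:.2f} ({efficiency:.0%} successful per minute)")
```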
LEARNABILITY
LEARNABILITY

•   Most products, especially new ones, require some amount of learning
•   Experience
     – Based on the amount of time spent using a product
     – Based on the variety of tasks performed
•   Learning
     – Sometimes quick and painless
     – At other times quite arduous and time consuming
•   Learnability
     – The extent to which something can be learned
     – How much time and effort are required to become proficient
     – When learning happens over a short period of time → focus on maximizing efficiency
     – When learning happens over a longer time period → memory plays a greater role
Collecting and Measuring Learnability Data

• Basically the same as they are for the other performance metrics
• Collect the data at multiple times
    – Based on expected frequency of use
• Decide which metrics to use → Decide how much time to allow between trials
• Alternatives
    – Trials within the same session
    – Trials within the same session but with breaks between tasks
    – Trials between sessions
Analyzing and Presenting Learnability Data

• By examining a specific performance metric
• Interpret the chart
    – Notice the slope of the line(s)
    – Notice the point of asymptote, or essentially where the line starts to
      flatten out
    – Look at the difference between the highest and lowest values on the y-axis
• Compare learnability across different conditions
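
The same interpretation can be done numerically. The sketch below uses hypothetical per-trial times for two conditions and an arbitrary 2-second threshold to decide where the curve "flattens out":

```python
# Hypothetical mean time-on-task (seconds) per trial for two design conditions
trials = {
    "Design A": [95, 70, 58, 52, 50, 49],
    "Design B": [80, 72, 68, 66, 65, 65],
}
FLAT_THRESHOLD = 2  # assumed: improvement of <= 2 s counts as "flattened out"

for condition, times in trials.items():
    # Change between consecutive trials shows how quickly performance improves
    deltas = [later - earlier for earlier, later in zip(times, times[1:])]
    # First trial where improvement becomes negligible (the asymptote)
    flat_at = next((i + 2 for i, d in enumerate(deltas) if abs(d) <= FLAT_THRESHOLD),
                   len(times))
    spread = max(times) - min(times)
    print(f"{condition}: per-trial change {deltas}, "
          f"flattens around trial {flat_at}, range {spread}s")
```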
Issues to Consider When Measuring Learnability


• What Is a Trial?
   – When learning is continuous, without breaks in time
       • Memory is much less a factor in this situation
       • More about developing and modifying different strategies to complete a set
         of tasks
       • Take measurements at specified time intervals

• Number of Trials
   – There must be at least two
   – In most cases there should be at least three or four
   – You should err on the side of more trials than you think you might need
     to reach stable performance.
Thanks for listening~
