Software Engineering Practice - Software Metrics and Estimation
- 1. Software metrics and estimation
McGill ECSE 428
Software Engineering Practice
Radu Negulescu
Winter 2004
- 2. McGill University ECSE 428 © 2004 Radu Negulescu
Software Engineering Practice Software metrics—Slide 2
About this module
Measuring software is very subjective and approximate, but necessary to
answer key questions in running a software project:
• Planning: How much time/money needed?
• Monitoring: What is the current status?
• Control: How to decide closure?
- 3. Metrics
What to measure/estimate?
Product metrics
• Size: LOC, modules, etc.
• Scope/specification: function points
• Quality: defects, defects/LOC, P1-defects, etc.
• Lifecycle statistics: requirements, fixed defects, open issues, etc.
• ...
Project metrics
• Time
• Effort: person-months
• Cost
• Test cases
• Staff size
• ...
- 4. Basis for estimation
What data can be used as basis for estimation?
• Measures of size/scope
• Baseline data (from previous projects)
• Developer commitments
• Expert judgment
• “Industry standard” parameters
- 5. Uncertainty of estimation
Cone of uncertainty
• [McConnell Fig. 8-2]
• [McConnell Table 8-1]
Sources of uncertainty
• Product related
Requirements change
Type of application (system, shrinkwrap, client-server, real-time, ...)
• Staff related
Sick days, vacation time
Turnover
Individual abilities
Analysts, developers (10:1 differences)
Debugging (20:1 differences)
Team productivity (5:1 differences)
• Process related
Tool support (or lack thereof)
Process used
• …
- 6. Estimate-convergence graph
[Figure: estimate ranges narrowing toward 1.0× as the project passes its milestones: initial product definition, approved product definition, requirements specification, product design specification, detailed design specification, product complete]
• Project cost (effort and size): 0.25×–4× at initial product definition, 0.5×–2× at approved product definition, 0.67×–1.5× at requirements specification, 0.8×–1.25× at product design specification, converging to 1.0× at product complete
• Project schedule: 0.6×–1.6× at initial product definition, 0.8×–1.25× at approved product definition, 0.85×–1.15× at requirements specification, 0.9×–1.1× at product design specification, converging to 1.0× at product complete
- 7. LOC metrics
LOC = lines of code
A measure of the size of a program
• Logical LOC vs. physical LOC
Not including comments and blank lines
Split lines count as one
• Rough approximation: #statements, semicolons
Advantages
• Easy to measure
• Easy to automate
• Objective
Disadvantages
• Easy to falsify
• Encourages counter-productive coding practices
• Implementation-biased
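The logical-vs-physical distinction above can be sketched in a few lines. This is an illustrative toy, not a production LOC counter; it assumes C-style `//` comments and uses semicolons as the rough statement proxy mentioned on the slide:

```python
def count_loc(source: str) -> dict:
    """Rough LOC counts for C/Java-style source (illustrative only)."""
    physical = 0   # non-blank, non-comment lines
    logical = 0    # approximated by counting semicolons
    for line in source.splitlines():
        stripped = line.strip()
        if not stripped or stripped.startswith("//"):
            continue  # skip blank lines and comment-only lines
        physical += 1
        logical += stripped.count(";")  # a split statement still counts once
    return {"physical": physical, "logical": logical}

sample = """\
// compute sum
int total = 0;
for (int i = 0; i < n; i++)
    total += i;
"""
print(count_loc(sample))  # {'physical': 3, 'logical': 4}
```

Note how even this toy shows the metric's weaknesses: the `for` header contributes two "logical" lines, and reformatting the code changes the physical count.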
- 8. FP metrics
A measure of the scope of the program
• External inputs (EI)
Number of screens, forms, dialogues, controls, or messages through which
an end user or another program adds, deletes, or changes data
• External outputs (EO)
Screens, reports, graphs or messages generated for use by end users or
other programs
• External inquiries (EQ)
Direct accesses to data in a database
• Internal logical files (ILF)
Major groups of end user data, could be a “file” or “database table”
• External interface files (EIF)
Files controlled by other applications with which the program interacts
- 9. Examples
[Source: David Longstreet]
EI:
EO:
- 10. Examples
EQ:
- 11. Examples
ILF
EIF
- 12. FP metrics
Complexity weights
Low Med High
EI 3 4 6
EO 4 5 7
EQ 3 4 6
ILF 7 10 15
EIF 5 7 10
Influence multiplier: 0.65–1.35
• 14 factors
- 13. Counting function points
Program Characteristic            Low          Medium       High         Total
                                  count × wt   count × wt   count × wt
Inputs (EI)                       6 × 3        2 × 4        3 × 6        44
Outputs (EO)                      7 × 4        7 × 5        0 × 7        63
Inquiries (EQ)                    0 × 3        2 × 4        4 × 6        32
Internal Logical Files (ILF)      5 × 7        2 × 10       3 × 15       100
External Interface Files (EIF)    9 × 5        0 × 7        2 × 10       65
Unadjusted Function Point total: 304
Influence Multiplier: 1.15
Adjusted Function Point Total: 304 × 1.15 = 349.6
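The unadjusted total on this slide can be checked mechanically. A minimal sketch, with the complexity weights and the example's counts transcribed from the slides:

```python
# IFPUG complexity weights (low, medium, high) from the weights slide
WEIGHTS = {
    "EI":  (3, 4, 6),
    "EO":  (4, 5, 7),
    "EQ":  (3, 4, 6),
    "ILF": (7, 10, 15),
    "EIF": (5, 7, 10),
}

def unadjusted_fp(counts: dict) -> int:
    """counts maps type -> (low, med, high) occurrence counts."""
    return sum(
        n * w
        for t, per_level in counts.items()
        for n, w in zip(per_level, WEIGHTS[t])
    )

# counts from the worked example on this slide
counts = {
    "EI":  (6, 2, 3),
    "EO":  (7, 7, 0),
    "EQ":  (0, 2, 4),
    "ILF": (5, 2, 3),
    "EIF": (9, 0, 2),
}
print(unadjusted_fp(counts))  # 304, matching the table
```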
- 14. Influence factors
1. Data communications: How many communication facilities are there to aid in the transfer or exchange of information with the application or system?
2. Distributed data processing: How are distributed data and processing functions handled?
3. Performance: Did the user require response time or throughput?
4. Heavily used configuration: How heavily used is the current hardware platform where the application will be executed?
5. Transaction rate: How frequently are transactions executed: daily, weekly, monthly, etc.?
6. On-Line data entry: What percentage of the information is entered On-Line?
7. End-user efficiency: Was the application designed for end-user efficiency?
- 15. Influence factors
8. On-Line update: How many ILF's are updated by On-Line transactions?
9. Complex processing: Does the application have extensive logical or mathematical processing?
10. Reusability: Was the application developed to meet one or many user's needs?
11. Installation ease: How difficult is conversion and installation?
12. Operational ease: How effective and/or automated are start-up, back up, and recovery procedures?
13. Multiple sites: Was the application specifically designed, developed, and supported to be installed at multiple sites for multiple organizations?
14. Facilitate change: Was the application specifically designed, developed, and supported to facilitate change?
- 16. Influence score
Score   Influence
0       Not present, or no influence
1       Incidental influence
2       Moderate influence
3       Average influence
4       Significant influence
5       Strong influence throughout
- 17. Influence score
INF = 0.65 + SCORE/100
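Since each of the 14 factors scores 0–5, SCORE ranges over 0–70 and INF over 0.65–1.35, matching the range given earlier. A sketch tying this to the counting example (the individual scores below are hypothetical; only their sum of 50, giving INF = 1.15, comes from the slides):

```python
def influence_multiplier(scores):
    """INF = 0.65 + (sum of the 14 influence scores) / 100."""
    assert len(scores) == 14 and all(0 <= s <= 5 for s in scores)
    return 0.65 + sum(scores) / 100

def adjusted_fp(ufp, scores):
    """Adjusted FP = unadjusted FP * influence multiplier."""
    return ufp * influence_multiplier(scores)

# hypothetical scores summing to 50, reproducing INF = 1.15 from the example
scores = [4] * 11 + [2] * 3
print(round(adjusted_fp(304, scores), 1))  # 349.6, as on the counting slide
```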
- 18. FP metrics
Some advantages
• Based on specification (black-box)
• Technology independent
• Strong relationship to actual effort
• Encourages good development
Some disadvantages
• Needs extensive training
• Subjective
- 19. Jones’ “rules of thumb” estimates
Code volumes:
• Approx. 100 LOC/FP, varies widely
[Source: C. Jones “Estimating Software Costs” 1998]
• Schedule:
#calendar months = FP^0.4
• Development staffing:
#persons = FP/150 (average)
Rayleigh curve
• Development effort:
#months * #persons = FP^1.4/150
McConnell
• Equation 8-1 “Software schedule equation”
#months = 3.0 * #man-months^(1/3)
• Table 8-9 “Efficient schedules”
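These rules of thumb chain together directly; a sketch applying the slide's formulas (rough planning numbers, not predictions, and the 350 FP input is just an example size):

```python
def rough_estimates(fp: float) -> dict:
    """Rules of thumb from the slide (Jones 1998; McConnell Eq. 8-1)."""
    months = fp ** 0.4                   # calendar schedule (Jones)
    staff = fp / 150                     # average development staffing
    effort = fp ** 1.4 / 150             # person-months (= months * staff)
    mcconnell = 3.0 * effort ** (1 / 3)  # McConnell's schedule equation
    return {"months": months, "staff": staff,
            "effort_pm": effort, "mcconnell_months": mcconnell}

est = rough_estimates(350)
print({k: round(v, 1) for k, v in est.items()})
# 350 FP -> about 10.4 calendar months, 2.3 staff, 24.3 person-months
```

Note that the two schedule formulas disagree (10.4 vs. about 8.7 months here), which is exactly why the slides recommend using several estimation techniques.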
- 20. Quality estimation
Typical tradeoff:
Adding a dimension: quality
• Early quality will actually reduce costs, time
• Late quality is traded against other parameters
product (scope)
cost (effort) schedule (time)
- 21. Quality estimation
Quality measure:
• Fault potential: # of defects introduced during development
• Defect rate: # of defects in the delivered product
[Source: C. Jones “Estimating Software Costs” 1998]
• Test case volumes:
#test cases = FP^1.2
• Fault potential:
#faults = FP^1.25
• Testing fault removal: ~30% per type of testing
85–99% total
• Inspection fault removal: 60–65% per inspection type
- 22. Other typical estimates
[Source: C. Jones “Estimating Software Costs” 1998]
• Maintenance staffing:
#persons = FP/750
• Post-release repair:
rate = 8 faults/PM
• Software plans and docs:
Page count = FP^1.15
• Creeping requirements:
Rate = 2%/month
0% … 5% / month, depending on method
• Cost per requirement:
$500/FP for initial requirements
$1200/FP for requirements added close to completion
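A sketch of these remaining rules; compounding the creep rate monthly is one reading (the slide only gives a %/month figure), and 350 FP over 10 months is an illustrative input:

```python
def maintenance_staff(fp: float) -> float:
    """Jones: one maintenance person per 750 FP."""
    return fp / 750

def doc_pages(fp: float) -> float:
    """Jones: page count for software plans and documents."""
    return fp ** 1.15

def final_scope(fp: float, months: int, creep_rate: float = 0.02) -> float:
    """Scope growth from creeping requirements, compounded monthly
    (compounding is an assumption; a linear reading is also defensible)."""
    return fp * (1 + creep_rate) ** months

print(round(doc_pages(350)))           # ~843 pages of plans and docs
print(round(final_scope(350, 10), 1))  # ~426.6 FP after 10 months at 2%/month
```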
- 23. Sample question
Consider a software project of 350 function points, assuming: the ratio of
calendar time vs. development time (development speed) is 2; testing
consists of unit, integration, and system testing; and new requirements
are added at a rate of 3% per month.
(a) Using the estimation rules of thumb discussed in class, give an
estimate for each of the following project parameters, assuming a
waterfall process.
(i) The total effort, expressed in person-months.
(ii) The total cost of the project.
(iii) The number of inspection steps required to obtain fewer than 175
defects.
(b) Re-do the estimates in part (a) assuming that the project can be
split into two nearly independent parts of 200 function points and 150
function points, respectively.
- 24. Lifecycle statistics
Life cycle of a project item
• Well represented by a state machine
• E.g. a “bug” life cycle
Simplest form: 3 states
May reach 10s of states when bug prioritization is involved
• Statistics on bugs, requirements, “issues”, tasks, etc.
[State diagram: Open → Fixed → Closed. DEV moves a bug from Open to Fixed; QA verifies Fixed → Closed; QA can also reopen, returning a bug to Open]
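The three-state bug life cycle above is exactly a transition table. An illustrative sketch, not any real tracker's model; the roles follow the diagram (DEV fixes, QA verifies or reopens), and QA reopening a Closed bug is an assumption:

```python
# transition table: (state, actor, action) -> new state
TRANSITIONS = {
    ("Open", "DEV", "fix"):     "Fixed",
    ("Fixed", "QA", "verify"):  "Closed",
    ("Fixed", "QA", "reopen"):  "Open",
    ("Closed", "QA", "reopen"): "Open",  # assumed: QA may reopen a closed bug
}

def step(state, actor, action):
    """Advance a bug; illegal moves raise instead of silently passing."""
    try:
        return TRANSITIONS[(state, actor, action)]
    except KeyError:
        raise ValueError(f"{actor} cannot {action} a bug in state {state}")

s = "Open"
for actor, action in [("DEV", "fix"), ("QA", "reopen"),
                      ("DEV", "fix"), ("QA", "verify")]:
    s = step(s, actor, action)
print(s)  # Closed
```

Counting bugs per state in such a table is what yields the lifecycle statistics the slide mentions; real trackers just have many more rows.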
- 25. Estimation process
[Figure: the perceived estimation process vs. the actual estimation process]
- 26. Example procedure
What do you think of the following procedure:
[Source: Schneider,Winters - “Applying Use Cases” Addison-Wesley, 1999]
Starting point: use cases.
UUCP: unadjusted use case points
• Each use case weighted 5, 10, or 15, based on ~ # of analysis classes
TCF: technical complexity factor
• 0.6 + sum(0.01 * TFactor)
• TFactor sum range: 14
EF: experience factor
• 1.4 + sum(-0.03 * EFactor)
• Efactor sum range: 4.5
UCP: use case points
• UUCP * TCF * EF
PH: person-hours
• UCP * (20..28) + 120
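The Schneider–Winters procedure chains directly. A sketch with hypothetical inputs (the use-case mix and the TFactor/EFactor sums below are invented for illustration; the formulas, the 20..28 hours/UCP range, and the 120-hour base are from the slide):

```python
def use_case_points(uucp, tfactor_sum, efactor_sum):
    """Schneider-Winters: UCP = UUCP * TCF * EF."""
    tcf = 0.6 + 0.01 * tfactor_sum   # technical complexity factor
    ef = 1.4 - 0.03 * efactor_sum    # experience factor
    return uucp * tcf * ef

def person_hours(ucp, hours_per_ucp=24):
    """Effort: the slide gives 20..28 hours per UCP plus a 120-hour base."""
    return ucp * hours_per_ucp + 120

# hypothetical mix: 12 simple (wt 5), 8 average (wt 10), 4 complex (wt 15)
uucp = 12 * 5 + 8 * 10 + 4 * 15      # = 200 unadjusted use case points
ucp = use_case_points(uucp, tfactor_sum=40, efactor_sum=15)
print(round(ucp, 1), round(person_hours(ucp)))  # 190.0 UCP, 4680 hours
```

Note the experience factor works in reverse: a higher EFactor sum (more experience) lowers EF and thus the effort estimate.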
- 27. Estimation tips
Adapted from [McConnell].
Avoid off-the-cuff estimates.
• Allow time for the estimation activity.
Use baselined data.
Use developer-based estimates.
• Estimate by walkthrough.
Estimate by categories.
Estimate at a low level of detail.
Use estimation tools.
Use several different estimation techniques.
• Change estimation practices during a project.