Lecture 5: Estimation. Estimate size, then estimate effort, schedule, and cost from size & complexity. CS 568
Project Metrics
Approaches to Cost Estimation
Staff-months vs. time: effort rises roughly linearly as the schedule is stretched beyond the theoretical time, and schedules shorter than 75% of the theoretical time fall in the "impossible design" region. Boehm: "A project cannot be done in less than 75% of theoretical time." T_theoretical = 2.5 * cube root(staff-months). But how can I estimate staff-months?
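The rule of thumb above is easy to compute directly. A minimal sketch in Python (the language and function names are my choice, not the lecture's):

```python
def theoretical_time_months(staff_months: float) -> float:
    # Boehm-style rule quoted on the slide:
    # T_theoretical = 2.5 * cube_root(staff-months)
    return 2.5 * staff_months ** (1.0 / 3.0)

def minimum_feasible_time(staff_months: float) -> float:
    # The slide's claim: no project finishes in under 75% of T_theoretical.
    return 0.75 * theoretical_time_months(staff_months)
```

For example, a 27 staff-month project has a theoretical schedule of 7.5 months, and by the 75% rule cannot be compressed below about 5.6 months.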
PERT estimation
Example
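The standard PERT three-point formulas are a weighted beta approximation: mean = (O + 4M + P) / 6 and standard deviation = (P - O) / 6, where O, M, P are the optimistic, most likely, and pessimistic estimates. A minimal sketch (Python, with my own naming):

```python
def pert_estimate(optimistic: float, most_likely: float, pessimistic: float):
    """Classic PERT three-point estimate (beta approximation)."""
    mean = (optimistic + 4.0 * most_likely + pessimistic) / 6.0
    std_dev = (pessimistic - optimistic) / 6.0
    return mean, std_dev
```

With optimistic = 10, most likely = 13.5, and pessimistic = 20 (inputs consistent with the Beta column of the distribution table that follows), the estimate is 14.0 with a standard deviation of about 1.67.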
Probability Distributions
See www.brighton-webs.co.uk/distributions/beta.asp. The mean, mode, and standard deviation in the table below are derived from the minimum, maximum, and shape factors that resulted from the PERT approximations.

                         Beta     Triangular
  Mean                  14.00      14.50
  Mode                  13.65      13.5
  Standard Deviation     1.67       2.07
  Q1 (25%)              12.75      12.96
  Q2 (50% - Median)     13.91      14.30
  Q3 (75%)              15.17      15.97
Sizing Software Projects (chart: staff-months vs. lines of code or function points)
Understanding the equations
How many software engineers?
Lines of Code
Bernstein's rule of thumb for small components
"Productivity" as measured in 2000:

  Classical rates:                                        130 - 195 NCSLOC/sm
  Evolutionary or incremental approaches (customized):    244 - 325 NCSLOC/sm
  New embedded flight software (customized):               17 - 105 NCSLOC/sm
  Reused code:                                           1000 - 2000 NCSLOC/sm
  Code for reuse:                                        3 x code for customized
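Rates like these convert a size estimate into effort by simple division: effort in staff-months = NCSLOC / productivity rate. A small sketch (names are mine):

```python
def staff_months_from_size(ncsloc: float, rate_ncsloc_per_sm: float) -> float:
    # effort = size / productivity (NCSLOC per staff-month)
    return ncsloc / rate_ncsloc_per_sm
```

For example, 39,000 NCSLOC at the high end of the classical rate (195 NCSLOC/sm) works out to 200 staff-months.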
QSE Lambda Protocol
Universal Software Engineering Equation
Post-Release Reliability Growth in Software Products. Authors: Pankaj Jalote, Brendan Murphy, Vibhu Saujanya Sharma. Guided by: Prof. Lawrence Bernstein. Prepared by: Mautik Shah
Introduction
Three possible reasons:
Failure rate model
Using product support data
Using data from Automated Reporting
Product stabilization time
Conclusion
Derivation of the reliability equation, valid after the stabilization interval
Function Point (FP) Analysis
Productivity = f(size) (chart: productivity in function points per staff-month vs. function points; Bell Laboratories data and Capers Jones data)
Adjusted Function Points
Unadjusted Function Points (UFP) x General System Characteristics (GSC) = Adjusted Function Points (AFP)
AFP = UFP * (0.65 + 0.01 * GSC), note GSC = VAF = TDI
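The adjustment formula above is a one-liner. A minimal sketch (Python; the function name and the range check are mine), assuming the usual IFPUG convention that GSC/TDI is the sum of fourteen ratings of 0-5 each, so the multiplier runs from 0.65 to 1.35:

```python
def adjusted_function_points(ufp: float, gsc_total: int) -> float:
    # AFP = UFP * (0.65 + 0.01 * GSC); GSC (a.k.a. VAF/TDI) sums the
    # 14 general-system-characteristic ratings, each 0..5, so 0..70.
    if not 0 <= gsc_total <= 70:
        raise ValueError("GSC total must be between 0 and 70")
    return ufp * (0.65 + 0.01 * gsc_total)
```

At the midpoint GSC of 35 the multiplier is 1.0, so AFP equals UFP.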
Function Point Calculations
Complexity Table

  Type             Simple   Average   Complex
  Input (I)           3        4         6
  Output (O)          4        5         7
  Inquiry (E)         3        4         6
  Log Int (L)         7       10        15
  Interfaces (F)      5        7        10
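Unadjusted function points are just the weighted sum of component counts against this table. A sketch (Python, with my own naming; each entry maps a type to its simple/average/complex counts):

```python
# Weights from the complexity table: (simple, average, complex)
WEIGHTS = {
    "input":     (3, 4, 6),
    "output":    (4, 5, 7),
    "inquiry":   (3, 4, 6),
    "log_int":   (7, 10, 15),
    "interface": (5, 7, 10),
}

def unadjusted_function_points(counts: dict) -> int:
    # counts maps each type to a (simple, average, complex) count triple.
    return sum(
        c * w
        for kind, triple in counts.items()
        for c, w in zip(triple, WEIGHTS[kind])
    )
```

For example, two simple inputs and one average output give 2*3 + 1*5 = 11 UFP.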
Complexity Factors
Problem Domain: Measure of Complexity (1 is simple and 5 is complex). Score ____
Architecture Complexity: Measure of Complexity (1 is simple and 5 is complex)
1. Code ported from one known environment to another. Application does not change more than 5%.
2. Architecture follows an existing pattern. Process design is straightforward. No complex hardware/software interfaces.
3. Architecture created from scratch. Process design is straightforward. No complex hardware/software interfaces.
4. Architecture created from scratch. Process design is complex. Complex hardware/software interfaces exist but they are well defined and unchanging.
5. Architecture created from scratch. Process design is complex. Complex hardware/software interfaces are ill defined and changing.
Score ____
Logic Design - Data. Score ____
Logic Design - Code. Score ____
Computing Function Points See http://www.engin.umd.umich.edu/CIS/course.des/cis525/js/f00/artan/functionpoints.htm
Adjusted Function Points - review
Function Points Qualifiers
Function Point pros and cons
Initial Conversion (http://www.qsm.com/FPGearing.html)

  Language        Median SLOC per UFP
  C                      104
  C++                     53
  HTML                    42
  Java                    59
  Perl                    60
  J2EE                    50
  Visual Basic            42
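A gearing table like this converts a function point count into a first-cut SLOC estimate by simple multiplication. A sketch using the table's median factors (the dictionary and function names are mine):

```python
# Median gearing factors (SLOC per unadjusted function point),
# from the QSM table cited above.
SLOC_PER_UFP = {
    "C": 104, "C++": 53, "HTML": 42, "Java": 59,
    "Perl": 60, "J2EE": 50, "Visual Basic": 42,
}

def estimate_sloc(ufp: float, language: str) -> float:
    # size estimate = function points * language gearing factor
    return ufp * SLOC_PER_UFP[language]
```

So a 100-UFP system sized in C comes out near 10,400 SLOC, but under 6,000 SLOC in Java, which is why the language must be fixed before converting.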
SLOC
Expansion Trends. The expansion factor is the ratio of a source line of code to a machine-level line of code. It has grown an order of magnitude every twenty years through successive technology changes: machine instructions, macro assemblers, high-level languages, database managers, on-line development, prototyping, subsecond time sharing, regression testing, 4GLs, small-scale reuse, object-oriented programming, large-scale reuse. Each date on the chart is an estimate of widespread use of a software technology.
Heuristics to do Better Estimates
Heuristics to meet aggressive schedules
Specification for Development Plan
COCOMO
COCOMO Formula: Effort in staff-months = a * KDLOC^b

  Mode             a       b
  organic         2.4    1.05
  semi-detached   3.0    1.12
  embedded        3.6    1.20
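The basic COCOMO effort equation is a direct translation of this table. A minimal sketch (Python; naming is mine):

```python
# Basic COCOMO coefficients (a, b) by project mode.
COCOMO_PARAMS = {
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

def cocomo_effort_staff_months(kdloc: float, mode: str) -> float:
    # Effort = a * KDLOC^b, with KDLOC in thousands of delivered lines.
    a, b = COCOMO_PARAMS[mode]
    return a * kdloc ** b
```

Because b > 1, effort grows faster than size: a 32-KDLOC embedded project costs far more than twice a 16-KDLOC one, which is the diseconomy of scale the model is meant to capture.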
A Retrospective on the Regression Models
Initial Conversion (http://www.qsm.com/FPGearing.html)

  Language        Median SLOC per function point
  C                      104
  C++                     53
  HTML                    42
  Java                    59
  Perl                    60
  J2EE                    50
  Visual Basic            42
Delphi Method
Function Point Method. Five key components are identified based on the logical user view of the application: External Input, External Output, External Inquiry, Internal Logical Files, External Interface File.
Downside
For each component, compute a Function Point value based on its make-up and the complexity of its data. Complexity is rated Low, Average, or High from the Record Element Types and the Data Elements (# of unique data fields) or File Types Referenced.

  Component                       Low      Average   High      Total
  Internal Logical File (ILF)    __ x 7    __ x 10   __ x 15    ___
  External Interface File (EIF)  __ x 5    __ x 7    __ x 10    ___
  External Input (EI)            __ x 3    __ x 4    __ x 6     ___
  External Output (EO)           __ x 4    __ x 5    __ x 7     ___
  External Inquiry (EQ)          __ x 3    __ x 4    __ x 6     ___
  Total Unadjusted FPs                                          ___
When to Count: sizing is repeated at each lifecycle phase (prospectus, requirements, architecture, implementation, testing, delivery) and again at each change request during corrective maintenance.
Estimates vary as a function of risk factors
Using the equations
Baseline current performance levels. Performance and productivity capabilities (time to market, effort, defects) are driven by management, skill levels, process, and technology, and are improved through software process improvement and productivity-improvement initiatives / best practices, with attendant risks. (Chart: measured baseline of productivity vs. software size, showing sub-performance, industry averages, best practices, and the organization baseline.)
Modeling Estimation. The counter establishes a size profile from the requirement; the analyst selects a matching profile from the metrics database; the project manager generates the estimate and runs what-if analyses; actuals feed back into plan-vs-actual reports. The estimate is based on the best available information: a poor requirements document will result in a poor estimate. Accurate estimating is a function of using historical data with an effective estimating process.
Establish a baseline. A representative selection of projects is measured. Size is expressed in terms of functionality delivered to the user; rate of delivery (function points per staff-month) is a measure of productivity. (Chart: organizational baseline, rate of delivery vs. software size.)
Monitoring improvements. Track progress against the baseline in the second year. (Chart: rate of delivery in function points per person-month vs. software size.)
Brooks: Calling the Shot
Editor's Notes

  1. Just differentiating metrics and their uses (and most metrics could be used for both).
  2. Function points have their fudge factors too, and most practitioners do not use the unadjusted function point metric.
  3. There is also a judgment made on the complexity of each of these domains that were counted.
  4. Adjusted function points consider factors similar to those of the advanced COCOMO model. These factors are shown on the next slide. In training, folks should strive for consistency in counting function points … that is, scorers should agree on the counts of the factors, the complexity of each of the counts and the scoring of the characteristics -- this can be achieved but it takes a fair amount of training. The measure of scorers agreeing with each other is often referred to as inter-rater reliability.
  5. In other software engineering courses at Stevens you will learn to calculate function points. It is a skill that has to be acquired and it does take a while. Function points are currently one of the more popular ways to estimate effort. B&Y stress it heavily in chapter 6.
  6. Here’s a listing of the advantages and disadvantages of function points from B&Y pp. 183-4. Function points are certainly more difficult to fudge than SLOC since they address aspects of the application. The other emphasis is on data collection -- you are only as good as your historical data, and if you use these techniques extensively you should endeavor to continue to collect data and tune the metrics from experience.
  7. For completeness and to provide you with a feel for the degree of effort a function point represents here’s another table mapping several computer languages to function points.