Cs 568 Spring 10 Lecture 5 Estimation
1. Lecture 5: Estimation. Estimate size first, then estimate effort, schedule, and cost from size and complexity. CS 568
4. Time vs. staff-months. [Chart: the theoretical time T_theoretical, the 75% * T_theoretical bound, the impossible-design region, and the region of linear increase.] Boehm: "A project cannot be done in less than 75% of theoretical time," where T_theoretical = 2.5 * cube root(staff-months). But how can I estimate staff-months?
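The schedule rule on the slide above can be sketched directly; a minimal illustration, assuming effort in staff-months has already been estimated (the 64 staff-month figure below is made up for the example):

```python
# Boehm's schedule rule: theoretical calendar time from estimated effort.
def theoretical_time(staff_months: float) -> float:
    """T_theoretical = 2.5 * cube root of staff-months, in calendar months."""
    return 2.5 * staff_months ** (1.0 / 3.0)

def minimum_schedule(staff_months: float) -> float:
    """Boehm: a project cannot be done in less than 75% of theoretical time."""
    return 0.75 * theoretical_time(staff_months)

effort = 64.0  # staff-months (illustrative)
print(round(theoretical_time(effort), 1))   # 10.0 calendar months
print(round(minimum_schedule(effort), 1))   # 7.5 calendar months
```

Note the cube-root shape: quadrupling the effort does not quadruple the schedule, which is why adding staff late compresses time so poorly.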
8. See www.brighton-webs.co.uk/distributions/beta.asp. The mean, mode, and standard deviation in the table below are derived from the minimum, maximum, and shape factors that resulted from the use of the PERT approximations.

Statistic            Triangular   Beta
Mean                 14.50        14.00
Mode                 13.50        13.65
Standard Deviation    2.07         1.67
Q1 (25%)             12.96        12.75
Q2 (50%, Median)     14.30        13.91
Q3 (75%)             15.97        15.17
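The standard PERT approximations behind the Beta column are mean = (min + 4*mode + max)/6 and standard deviation = (max - min)/6. A small sketch; the min/max values below are not from the slide but are chosen to be consistent with the Beta column (mean 14.00, s.d. ~1.67):

```python
# PERT (beta) approximations from a three-point estimate.
def pert_mean(low: float, mode: float, high: float) -> float:
    """Weighted mean: the mode counts four times as much as each extreme."""
    return (low + 4.0 * mode + high) / 6.0

def pert_stddev(low: float, high: float) -> float:
    """Spread: one sixth of the full range."""
    return (high - low) / 6.0

# Illustrative three-point estimate (in, say, staff-months).
low, mode, high = 9.7, 13.65, 19.7
print(round(pert_mean(low, mode, high), 2))  # 14.0
print(round(pert_stddev(low, high), 2))      # 1.67
```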
14. "Productivity" as measured in 2000:

Classical rates                                   130 – 195 NCSLOC/sm
Evolutionary or incremental approaches (customized)  244 – 325 NCSLOC/sm
Reused code                                       1000 – 2000 NCSLOC/sm
New embedded flight software (customized)          17 – 105 NCSLOC/sm

Code built for reuse costs about 3x code customized for a single use.
17. Post-Release Reliability Growth in Software Products. Authors: Pankaj Jalote, Brendan Murphy, Vibhu Saujanya Sharma. Guided by: Prof. Lawrence Bernstein. Prepared by: Mautik Shah
33. Architecture Complexity. Measure of complexity (1 is simple, 5 is complex):
1. Code ported from one known environment to another. Application does not change more than 5%.
2. Architecture follows an existing pattern. Process design is straightforward. No complex hardware/software interfaces.
3. Architecture created from scratch. Process design is straightforward. No complex hardware/software interfaces.
4. Architecture created from scratch. Process design is complex. Complex hardware/software interfaces exist, but they are well defined and unchanging.
5. Architecture created from scratch. Process design is complex. Complex hardware/software interfaces are ill defined and changing.
Score ____
36. Computing Function Points See http://www.engin.umd.umich.edu/CIS/course.des/cis525/js/f00/artan/functionpoints.htm
40. Initial Conversion (http://www.qsm.com/FPGearing.html)

Language        Median SLOC/UFP
C               104
C++              53
HTML             42
Java             59
Perl             60
J2EE             50
Visual Basic     42
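Gearing factors like these convert an unadjusted function point count into an estimated code size for a chosen language. A minimal sketch using the median values from the QSM table above (the 100-UFP count is illustrative):

```python
# Median SLOC per unadjusted function point (QSM gearing factors, above).
SLOC_PER_UFP = {
    "C": 104, "C++": 53, "HTML": 42, "Java": 59,
    "Perl": 60, "J2EE": 50, "Visual Basic": 42,
}

def estimated_sloc(ufp: int, language: str) -> int:
    """Estimated source lines of code from an unadjusted FP count."""
    return ufp * SLOC_PER_UFP[language]

print(estimated_sloc(100, "Java"))  # 5900
print(estimated_sloc(100, "C"))     # 10400
```

The same functionality costs roughly twice as many lines in C as in Java, which is why size-based effort models should be fed language-adjusted SLOC.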
44. Expansion Trends. The expansion factor is the ratio of a source line of code to a machine-level line of code; it has grown by an order of magnitude every twenty years. [Chart: each date is an estimate of the widespread use of a software technology — machine instructions, macro assemblers, high-level languages, database managers, on-line development, sub-second time sharing, prototyping, 4GLs, regression testing, small-scale reuse, object-oriented programming, large-scale reuse.]
49. COCOMO Formula: Effort in staff-months = a * KDLOC^b

Mode            a     b
organic         2.4   1.05
semi-detached   3.0   1.12
embedded        3.6   1.20
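The basic COCOMO formula above is easy to apply once size is estimated in thousands of delivered lines of code. A minimal sketch with the (a, b) pairs from the table (the 32 KDLOC size is illustrative):

```python
# Basic COCOMO: effort in staff-months = a * KDLOC^b.
COCOMO_PARAMS = {
    "organic": (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded": (3.6, 1.20),
}

def cocomo_effort(kdloc: float, mode: str) -> float:
    """Estimated effort in staff-months for a project of kdloc KLOC."""
    a, b = COCOMO_PARAMS[mode]
    return a * kdloc ** b

size = 32.0  # KDLOC (illustrative)
for mode in COCOMO_PARAMS:
    print(mode, round(cocomo_effort(size, mode), 1))
```

Note how the exponent b > 1 makes effort grow faster than size: the same 32 KDLOC costs markedly more as an embedded project than as an organic one.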
51. Initial Conversion (http://www.qsm.com/FPGearing.html)

Language        Median SLOC/function point
C               104
C++              53
HTML             42
Java             59
Perl             60
J2EE             50
Visual Basic     42
55. For each component, compute a function point value based on its make-up and the complexity of its data. Complexity is rated low, average, or high from the record element types and the data elements (number of unique data fields) or file types referenced.

Component                       Low       Avg       High      Total
Internal Logical File (ILF)     __ x 7    __ x 10   __ x 15   ___
External Interface File (EIF)   __ x 5    __ x 7    __ x 10   ___
External Input (EI)             __ x 3    __ x 4    __ x 6    ___
External Output (EO)            __ x 4    __ x 5    __ x 7    ___
External Inquiry (EQ)           __ x 3    __ x 4    __ x 6    ___
                                          Total Unadjusted FPs: ___
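The worksheet above is a weighted sum: count each component at each complexity level, multiply by the weight, and total. A minimal sketch with the weights from the table (the component counts below are illustrative):

```python
# Weights (low, average, high) per component type, from the table above.
WEIGHTS = {
    "ILF": (7, 10, 15),
    "EIF": (5, 7, 10),
    "EI":  (3, 4, 6),
    "EO":  (4, 5, 7),
    "EQ":  (3, 4, 6),
}

def unadjusted_fp(counts: dict) -> int:
    """counts maps component type -> (n_low, n_avg, n_high) occurrences."""
    total = 0
    for comp, (n_low, n_avg, n_high) in counts.items():
        w_low, w_avg, w_high = WEIGHTS[comp]
        total += n_low * w_low + n_avg * w_avg + n_high * w_high
    return total

# Illustrative count for a small application.
counts = {"ILF": (2, 1, 0), "EIF": (1, 0, 0), "EI": (3, 2, 0),
          "EO": (2, 1, 0), "EQ": (1, 0, 0)}
print(unadjusted_fp(counts))  # 62
```

The unadjusted total is then scaled by the value adjustment factors (similar to the advanced COCOMO cost drivers) to get adjusted function points.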
59. Baseline current performance levels. [Diagram: software process improvement links performance capabilities (time to market, effort, defects) and their drivers (management, skill levels, process, technology) to a measured baseline, productivity improvement initiatives / best practices, and risks. Chart: organization baseline of rate of delivery vs. software size, comparing sub-performance, industry averages, and best practices.]
60. Modeling Estimation. [Diagram: the analyst establishes a profile from the requirement; the counter determines size; the project manager selects a matching profile from the metrics database, generates an estimate, and runs what-if analysis; actuals feed back to the PM/user as plan-vs-actual reports.] The estimate is based on the best available information: a poor requirements document will result in a poor estimate. Accurate estimating is a function of using historical data with an effective estimating process.
61. Establish a baseline. [Chart: organizational baseline — rate of delivery (function points per staff-month) vs. software size.] A representative selection of projects is measured. Size is expressed in terms of functionality delivered to the user, and rate of delivery is a measure of productivity.
62. Monitoring improvements. [Chart: second-year track of progress — rate of delivery (function points per person-month) vs. software size.]
Editor's Notes
Just differentiating metrics and their uses (and most metrics could be used for both).
Function points have their fudge factors too, and most practitioners do not use the unadjusted function point metric.
There is also a judgment made on the complexity of each of these domains that were counted.
Adjusted function points consider factors similar to those of the advanced COCOMO model. These factors are shown on the next slide. In training, folks should strive for consistency in counting function points … that is, scorers should agree on the counts of the factors, the complexity of each of the counts and the scoring of the characteristics -- this can be achieved but it takes a fair amount of training. The measure of scorers agreeing with each other is often referred to as inter-rater reliability.
In other software engineering courses at Stevens you will learn to calculate function points. It is a skill that has to be acquired, and it does take a while. Function points are currently one of the more popular ways to estimate effort. B&Y stress them heavily in chapter 6.
Here's a listing of the advantages and disadvantages of function points from B&Y pp. 183-4. Function points are certainly more difficult to fudge than SLOC, since they address aspects of the application. The other emphasis is on data collection -- you are only as good as your historical data, and if you use these techniques extensively you should endeavor to continue to collect data and tune the metrics from experience.
For completeness and to provide you with a feel for the degree of effort a function point represents here’s another table mapping several computer languages to function points.