Summary: This research investigates how natural human error during data collection and analysis impacts PSP analysis results. The Personal Software Process (PSP) is used by software engineers to gather and analyze data about their work and to produce empirically based evidence for improving planning and quality in future projects. Published studies have suggested that adopting the PSP results in improved size and time estimation and in reduced numbers of defects found in the compile and test phases of development. However, the results of most studies evaluating the PSP are based on analysis of PSP data – data of unknown accuracy.
There are two basic phases when using the PSP: collection and analysis. In the collection phase a software engineer does actual work and records primary measures about it, such as time and defect data. In the analysis phase this data is analyzed to produce derived measures such as lines of code per hour or the percentage of defects removed before the first compile. One goal of this research was to learn what kinds of errors are made in the analysis phase and to see whether there is any evidence of problems in the collection phase; this information provides some idea of the overall quality of PSP data. A second goal was to determine the impact of those errors on commonly used PSP measures such as yield and cost performance index.
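To make the derived measures concrete, the following is a minimal sketch (not the study's tooling) of how two commonly cited PSP measures can be computed from primary data. The formulas follow the standard PSP definitions of yield and cost performance index; the function names and sample numbers are illustrative assumptions, not taken from the study.

```python
def process_yield(defects_removed_before_compile, total_defects_injected):
    """Yield: percentage of defects removed before the first compile."""
    return 100.0 * defects_removed_before_compile / total_defects_injected

def cost_performance_index(planned_minutes, actual_minutes):
    """CPI: planned effort divided by actual effort; 1.0 means on budget."""
    return planned_minutes / actual_minutes

# Hypothetical project data for illustration only.
print(process_yield(12, 20))                        # 60.0
print(round(cost_performance_index(300, 360), 2))   # 0.83
```

Because each derived measure is a ratio of primary measures, a single recording or transcription error in the time or defect log propagates directly into these values, which is why errors of the kind this study catalogs can distort the analysis results.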
In the fall of 1996 a one-semester software engineering class covering the PSP was taught. Because of prior concerns about data quality, many steps were taken to address the issue, including curriculum modifications, in-class technical reviews, and special forms. Using the PSP, 10 students completed 89 projects. Two database systems were then built using the Progress 4GL/RDBMS: the first was designed to automate the PSP as far as possible through process level PSP2, and the second was designed to record errors found in PSP data. The class projects were entered into the automated PSP system, and any discrepancies between the paper forms and the values generated by the automated system were recorded with the error-recording tool. All collected errors were then analyzed by type, subtype, severity, age, and other attributes. Finally, an attempt was made to correct the students' PSP data as far as possible, and the original and corrected values were compared.
In total, 1539 errors were found, most of them analysis errors. 34% were of the most severe type – errors that produced multiple bad values on multiple forms across multiple projects. There was also evidence of problems in the collection phase: 90 errors resulted from conflicts within or between time and defect data. Correcting the data showed that the students' errors had a substantial impact on important measures such as yield and cost performance index.
Principal researcher(s): Anne Disney
Status: Active development, 1996–1999.