The 19th Annual Software Engineering Workshop was held at NASA Goddard Space Flight Center on November 30 and December 1, 1994. The workshop was sponsored by the Software Engineering Laboratory (SEL), a joint venture of NASA/Goddard, Computer Sciences Corporation, and the University of Maryland. Following SEL Director Rose Pajerski's opening remarks, SEL personnel reported on recent research in the laboratory, covering work underway in experience-domain analysis, maintenance process characterization, and the experience "packaging" process.
Vic Basili (University of Maryland) explored ways to extend the SEL's quality improvement paradigm, examining how "local" experience, a hallmark of the SEL improvement concept, might be transferred beyond the local environment to other software organizations whose products and processes are similar enough to benefit from that experience. He proposed both quantitative and qualitative approaches for assessing the applicability of experience across domains (or organizations). Basili's structure for comparing software experience domains offers a scientific method for generalizing local software engineering experience.
Jon Valett (NASA/Goddard), the new Director of the SEL, reported on maintenance research. Valett's study examined software in the SEL's domain, where some of the operational systems are more than 10 years old. To baseline the local maintenance process, that is, to develop an accurate understanding of it, the research team examined organizational structure, process activities, and product attributes. Valett itemized some key differences to keep in mind when studying an evolving product: look at "release" characteristics, measure effort per software change, distinguish between operational errors (those found in the operational product) and "release" errors (those introduced during maintenance), and apply experience gained from the product to improve its future releases.
In the third presentation of the session, Sharon Waligora (Computer Sciences Corporation) explained how the SEL applies its lessons learned and improved approaches in the development environment. Experience "packaging," she said, is the process of culling the most relevant data and findings from research, experimentation, and process improvement activities; synthesizing them into useful information and guidance; and returning them to the source, the projects that supplied the raw experience. The SEL packaging process, Waligora said, can be viewed as a two-way feedback loop between the development organization and the experience "packagers." In one direction, the loop takes in the experience and expertise of development and maintenance personnel and identifies their needs. That qualitative input, along with quantitative data and results of experimentation with new technology, is then synthesized by professional packagers. Waligora shared data and lessons learned from SEL packaging efforts. Packaging costs varied with the format and level of detail of the material, averaging around 1 to 1.5 percent of the software development budget.

In addition to the SEL presentations, speakers from academia, industry, and other NASA organizations discussed process models, software certification, measurement, and other topics.
Daniel Roy (Software Technology, Process & People), who recently completed a term as a visiting scientist at the SEI, reported on his experience there using the Personal Software Process (PSP). The PSP is a software engineering approach intended to illuminate for the individual the importance of good software engineering practices. Studying the PSP, Roy completed several programming exercises that incorporated measurement activities, such as error logs and effort logs, project planning and tracking, and post-mortem reviews. According to Roy's experience, the PSP does bring software engineering practices into focus, highlighting for the individual the connection between performing these activities and improving one's own software product.
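The measurement activities Roy describes, logging each defect and each block of effort, then reviewing them at post-mortem, can be illustrated with a minimal sketch. The class and field names below are hypothetical, not taken from the PSP materials; they simply show the kind of per-phase bookkeeping a PSP exercise involves.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of PSP-style defect and effort logs.
# Names (DefectRecord, PSPLog, etc.) are illustrative, not from the PSP itself.

@dataclass
class DefectRecord:
    phase_injected: str   # phase where the defect was introduced (e.g., "design")
    phase_removed: str    # phase where it was found and fixed (e.g., "test")
    fix_minutes: float    # time spent fixing it

@dataclass
class EffortRecord:
    phase: str            # development phase (e.g., "plan", "code", "test")
    minutes: float        # time logged against that phase

@dataclass
class PSPLog:
    defects: list = field(default_factory=list)
    effort: list = field(default_factory=list)

    def log_defect(self, injected: str, removed: str, fix_minutes: float) -> None:
        self.defects.append(DefectRecord(injected, removed, fix_minutes))

    def log_effort(self, phase: str, minutes: float) -> None:
        self.effort.append(EffortRecord(phase, minutes))

    def defects_removed_per_phase(self) -> dict:
        """Post-mortem summary: how many defects were caught in each phase."""
        counts: dict = {}
        for d in self.defects:
            counts[d.phase_removed] = counts.get(d.phase_removed, 0) + 1
        return counts

    def total_effort_minutes(self) -> float:
        """Total logged effort, used for planning the next exercise."""
        return sum(e.minutes for e in self.effort)

# Example post-mortem for one small exercise:
log = PSPLog()
log.log_effort("design", 30)
log.log_effort("code", 45)
log.log_defect("code", "compile", 5)
print(log.total_effort_minutes())        # 75
print(log.defects_removed_per_phase())   # {'compile': 1}
```

Reviewing summaries like these after each exercise is what, in Roy's account, makes the link between practicing the activities and improving one's own product visible.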
Stan Rifkin (Master Systems Inc.) developed training in program comprehension techniques. Rifkin pointed out that reviewer comprehension is a factor often overlooked in inspections because it is assumed (wrongly) that developers, in their role as inspectors, know exactly how to go about reviewing software artifacts. Rifkin described program comprehension as a "complex process of translating, interpreting, and hypothesis testing among domains: from the real world, to the application domain, to the computing domain." Rifkin deployed his training and inspection technique in a commercial setting, where he trained a group of 35 software developers already using software inspections as part of their process. He then followed up by collecting data over two years to quantify the effect of the training on their customer-detected-defect rate. Two control groups were also tracked. While reported errors for the two control groups remained stable at their initial levels, Rifkin reported a steady decrease in the number of customer-detected defects in the products delivered by his group. The drop leveled off at 90 percent fewer customer-detected defects per day.
Scaling up measurement to a broad population, Craig Sinclair (SAIC) applied the SEL baselining technique across NASA. The research was sponsored by the NASA Software Engineering Program. The specific goals of the baseline study were to give NASA management insight into the scope and nature of its software; to help focus and direct future spending on training, technology, and software engineering; and to identify areas ripe for software product and process improvement. Sinclair's team learned that more than 10 percent of NASA personnel and support contractors work with software the majority of their time on the job. One billion dollars of NASA's 14-billion-dollar budget goes to building software and other software-related costs. This software can be divided into several domains, the two largest of which are mission ground support (35 percent) and administrative (26 percent) software. Programming language preferences and usage trends (always an intriguing bellwether) were examined. Far and away, most new systems are being built in C/C++ (47 percent), with FORTRAN a distant second at 29 percent and Ada trailing at 11 percent of new development projects. This is in contrast to operational systems, which are dominated by "other" languages (e.g., Assembler, Jovial, Lisp, Pascal) at 38 percent, followed by FORTRAN, C/C++, COBOL, and Ada. Understanding and assessing NASA products and processes at this level is an important first step toward developing Agency policy and guidance for local NASA organizations at the Center level and below.
The SEL is accessible on the World Wide Web:
The workshop proceedings will be available in April free of charge from: