Volume 5, Number 3 - Experience-Based Management
Individuals acquire knowledge while working on software development projects, but too often this knowledge is not documented or captured so that it can be shared or later reused. Dissemination of experience related to new development paradigms, such as development based on commercial off-the-shelf (COTS) components, can be particularly useful. To support sharing experiences about COTS-based software (CBS) development, we have built and made available to the software engineering community a web-based repository of lessons learned. This collection of practical knowledge is currently seeded with lessons extracted from published experiences and online workshops. The repository offers capabilities for online search and retrieval and for submission of lessons and feedback, as well as a visual query interface that supports analysis of the lessons learned and their usage. The repository will grow organically in the directions indicated by usage and contributions from its users.
COTS-based software development is different in many respects from in-house software development. Managers and developers must perform new activities and need new skills and knowledge whose acquisition is time-consuming. Let us illustrate what we mean with a fictional story.
Joe Cotsford is a software team lead working for a government contractor, and he has a problem: Joe has been charged by his project manager with evaluating and selecting a COTS product to use as a major subcomponent of their CCTwo system. He has solicited suggestions from his colleagues on products to evaluate and received about 20. He spent a lot of time collecting information on these 20 products and was able to eliminate about half of them based on functionality, platform, and price. But now he needs additional criteria to whittle the current list of 10 products down to just one. He is at a loss as to how to proceed because he cannot think of any other relevant factors to continue his evaluation. So he looks on the web for any available resources, but is quickly overwhelmed by the number of irrelevant hits. Surely someone somewhere has dealt with this problem before, but where can he find this experience?
Although this is a hypothetical character and scenario, the situation is not unusual. The problem is that this kind of knowledge might not be found in books, journals, or on the web, or, if it exists, it is scattered across many different web sites or buried in published papers. For example, some good general guidelines and procedures are available on the web, notably at the SEI's COTS-Based Systems web site (http://www.sei.cmu.edu/cbs/), but Joe knows that his results would be stronger if corroborated by specific previous experiences. Joe would thus have difficulty finding any good sources for this information in a timely manner.
One approach to facilitating knowledge sharing within communities is building experience repositories. The development of a COTS Lessons Learned Repository (CLLR), in which all the authors are involved, is part of the Center for Empirically-Based Software Engineering (CeBASE) initiative. CeBASE (www.cebase.org) is a National Science Foundation (NSF)-sponsored partnership between universities, research centers, and industry, whose main purpose is to promote experience sharing among software development practitioners. We present here the results of the effort focused on the acquisition and dissemination of experience throughout the CBS community of interest. The participation of project managers, developers, COTS integrators, and other practitioners is essential in order for the entire community to benefit from this effort.
The CLLR is a living repository that can be accessed at http://fc-md.umd.edu/fcmd/index.htmlll/index.asp. It is available for practitioners to use, comment on, and add to, and they are invited to help define its evolution. The repository will not grow or thrive unless it meets the needs of the practitioner community, so its actual evolutionary path will depend entirely on how it is used and on the needs expressed by its users.
In this paper we present the current status and capabilities of the repository, and our vision for its future evolution. We strongly encourage readers to comment on our ideas.
Luckily, Joe finds a reference to the CeBASE COTS Lessons Learned Repository at http://fc-md.umd.edu/fcmd/index.htmlll/index.asp. After accessing the repository and searching for lessons learned on "COTS evaluation", he finds several that catch his eye. One says "on safety-critical systems, only use components from a very mature vendor". Another says "using components from vendors who have been in business less than 3 years incurs a high risk of having to replace that component during maintenance, or even before initial deployment of the system". Since Joe's system is both critical and envisioned to have a long life, both of these lessons learned seem relevant. This gives Joe ideas for evaluation criteria. It also makes Joe think that, in addition to evaluating the components on his list, he really should be evaluating the vendors as well. So he decides to go to his company's software acquisition office and talk to the folks there about their previous experiences with the vendors of the components on his short list.
While talking to his company's procurement people, Joe got an earful. In addition to information about companies and their maturity, he heard lots of horror stories. These included stories of broken contracts, non-responsiveness of help desks, low component quality, lack of documentation, vaporware, and sleazy negotiating tactics. For example, he found out that the vendors of the three most attractive-sounding products used terms such as "in alpha testing" as a synonym for vaporware. Based on all this information, he was easily able to filter out most of the 10 products on his short list, ending up with one final candidate. Because he had collected information on the product and the vendor, he was confident in recommending this one remaining product to his project manager.
Based on this experience, Joe recommended to his manager that all COTS-based development projects take into account this kind of vendor information in COTS evaluation. His recommendation was adopted and written into his company's procedures for COTS selection.
To justify his COTS selection decision, Joe told his manager about the repository. It so happens that Joe's manager likes new gadgets, so he went online, browsed through the repository, and was pleasantly surprised to find a few good pieces of advice, such as "if your customer does not have flexible requirements, using COTS is not a good option" and "get your customer involved early". That made him immediately initiate discussions and requirements negotiation with the customer.
Also, while talking to the acquisition people, Joe mentioned the lessons learned repository. They were curious about whether there was anything relevant to them there. Since it only takes a few clicks, they decided to give it a try. They learned about a project that had a problem with a multi-vendor solicitation that could have been avoided had a pre-award hearing been held. To prevent a similar situation, Joe's organization decided to hold such a hearing itself.
The lessons learned repository that supported Joe in his decision making is real and is currently seeded with an initial set of about 70 lessons learned extracted from the literature. The sources that provided these lessons include journal articles [3], workshop presentations [6], and government reports [1], [7]. In addition, we organize online workshops (eWorkshops [4]) and use these discussions to synthesize new lessons and to refine the existing ones. We are also in the process of consolidating our repository with lessons learned by the SEI that have been collected from a large number of case studies, but have not been published.
The lessons are described in the repository by a set of attributes, the most important being the ones describing the context in which the lesson was learned (type of system, type of company, number and type of COTS). Other attributes refer to type of data, recommended audience, relevant life cycle phase, etc. Most of the attributes were chosen based on a bottom-up effort to differentiate the lessons learned in the initial repository, but others were added simply because they seemed to reflect issues of interest to potential practitioner users (e.g. impact on cost, quality and schedule).
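To make the structure of a lesson record more concrete, here is a minimal sketch in Python of what such a record might look like. The field names and example values are our own illustration, not the repository's actual schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Lesson:
    """Illustrative lesson-learned record; field names are hypothetical."""
    statement: str                       # the lesson itself
    recommended_action: str              # what the reader is advised to do
    # Context in which the lesson was learned
    system_type: str                     # e.g. "mission-critical defense system"
    company_type: str                    # e.g. "government contractor"
    num_cots_components: Optional[int]   # number of COTS products involved
    cots_types: List[str] = field(default_factory=list)
    # Other classification attributes
    data_type: str = ""                  # e.g. "anecdotal", "measured"
    recommended_audience: str = ""       # e.g. "Project Manager", "Developer"
    life_cycle_phase: str = ""           # e.g. "COTS evaluation", "Procurement"
    lesson_object: str = ""              # e.g. "Product", "Process", "Vendor"
    impacts: List[str] = field(default_factory=list)  # e.g. ["cost", "schedule"]
    source: str = ""                     # link to the original publication
```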
The interface to the repository supports search and retrieval based on text searches over all attributes. Links to the original source of each lesson are also provided.
The repository also has a built-in facility for tracking various metrics related to repository usage. These usage data can be used to tune the repository based on patterns of use.
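As a rough illustration of how attribute-wide text search and usage tracking could fit together, the sketch below searches every attribute of the illustrative Lesson records defined above and logs each query. This is a simplification of ours, not the repository's implementation.

```python
import datetime
from dataclasses import asdict

usage_log = []  # each entry records a query and when it was issued

def search_lessons(lessons, query):
    """Return lessons whose attribute values contain the query text (case-insensitive)."""
    usage_log.append({"query": query, "time": datetime.datetime.now()})
    q = query.lower()
    hits = []
    for lesson in lessons:
        # flatten every attribute value to text and look for the query in it
        text = " ".join(str(v) for v in asdict(lesson).values())
        if q in text.lower():
            hits.append(lesson)
    return hits

# e.g. search_lessons(all_lessons, "COTS evaluation") returns matching records,
# while usage_log accumulates the data needed to study patterns of use.
```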
Figure 1 presents the main components of the COTS Lessons Learned repository and how it is used.
The main component currently implemented is the COTS Lessons Learned database, which is accessible over the web through any browser at http://fc-md.umd.edu/fcmd/index.htmlll/index.asp. Users can interact directly with the repository to browse, search, and retrieve lessons. They can also submit feedback about existing lessons or submit new lessons. Feedback and new lessons first go to a buffer, where they are examined and validated by the validator before being uploaded. An administrator takes care of maintaining the repository, and an analyst is responsible for the repository's evolution. A component of the system (currently under development) will allow dialogues between users and experts, to help solve concrete problems. The logs of these dialogues are captured and used to extract new lessons (as discussed in Section 3: Future Work).
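The buffer-then-validate flow described above can be sketched as follows; the class and function names are placeholders we chose for illustration, not part of the actual system.

```python
class SubmissionBuffer:
    """Holding area for new lessons and feedback before a validator reviews them."""

    def __init__(self):
        self.pending = []

    def submit(self, item):
        # user submissions land here first, not directly in the live repository
        self.pending.append(item)

    def review(self, repository, accept):
        """Validator step: accepted items are uploaded; the rest stay pending."""
        still_pending = []
        for item in self.pending:
            if accept(item):
                repository.append(item)      # upload to the live repository
            else:
                still_pending.append(item)   # hold for clarification or rejection
        self.pending = still_pending

# e.g. buffer.submit(new_lesson) when a user fills in the submission form, and
# buffer.review(repository, accept=validator_check) when the validator reviews the queue.
```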
A user can browse through the lessons or search for specific ones, as shown in Figure 2. The statement of the lesson and the recommended action are listed on the front page and are thus immediately visible to the user. Details about a specific lesson are just a click away: the "Click here for details" link displays all the values of that lesson's attributes (as shown in Figure 3).
Feedback is always welcome, and users can at any time provide feedback on any of the lessons learned. This feedback reflects the utility of the lesson to the user and the user's opinion about the applicability of the lesson in their specific context. Examples of useful feedback are "this works differently in my environment because...", "I experienced the same situation in a similar project", and "I experienced the same situation in this different context ...".
Users can submit feedback not only about individual lessons, but also about the repository itself, its technology, organization, and usage. This will allow the repository to evolve in the direction desired by its users.
Users are of course highly encouraged to contribute to the "community experience" with lessons they have learned themselves. To share this experience with their peers, developers and managers are asked to submit new lessons using an online submission form.
For guidance on the use of the repository, there is a set of frequently asked questions (FAQ) accessible from the main page. If an answer cannot be found in the FAQ, the user can submit a question and one of the tech support people will provide an answer. The new question-answer pair will be posted in the FAQ adding to the community knowledge.
Another capability of the system is a Visual Query Interface (VQI) [5] to the lessons learned, as shown in Figure 4. Using the VQI, the user can visualize the entire content of the repository, which can facilitate the search for relevant lessons. In the VQI, each dot displayed in the main window corresponds to one lesson in the repository. The set of lessons displayed can be filtered using the attributes shown in the window on the right. The user can select which attributes are mapped to the X- and Y-axes. In Figure 4, the X-axis represents the "Life cycle phase/activity" (e.g., COTS evaluation, COTS upgrade, Early development phases, Procurement, System acquisition) during which the lesson was learned or can be applied. The Y-axis represents the "Object" of the lesson (e.g., Product, Process, People, Vendor). The lessons can be colored by a selected attribute, which in Figure 4 is "Recommended audience" (e.g., Program Manager, Project Manager, Developer). Clicking on a particular dot displays details about that lesson in a new window similar to the one in Figure 3.
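A rough, hypothetical re-creation of such a view in Python (using matplotlib and the illustrative Lesson fields sketched earlier) might look like the following; the real VQI is an interactive tool, so this only mimics its dot-per-lesson idea.

```python
import random
import matplotlib.pyplot as plt

def plot_lessons(lessons, x_attr, y_attr, color_attr):
    """One dot per lesson, placed by two categorical attributes and colored by a third."""
    xs = sorted({getattr(l, x_attr) for l in lessons})
    ys = sorted({getattr(l, y_attr) for l in lessons})
    groups = sorted({getattr(l, color_attr) for l in lessons})
    for i, g in enumerate(groups):
        subset = [l for l in lessons if getattr(l, color_attr) == g]
        # small jitter so lessons sharing the same categories do not overlap exactly
        plt.scatter([xs.index(getattr(l, x_attr)) + random.uniform(-0.2, 0.2) for l in subset],
                    [ys.index(getattr(l, y_attr)) + random.uniform(-0.2, 0.2) for l in subset],
                    color=plt.cm.tab10(i % 10), label=g)
    plt.xticks(range(len(xs)), xs, rotation=30)
    plt.yticks(range(len(ys)), ys)
    plt.xlabel(x_attr)
    plt.ylabel(y_attr)
    plt.legend(title=color_attr)
    plt.show()

# e.g. plot_lessons(all_lessons, "life_cycle_phase", "lesson_object", "recommended_audience")
```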
A year later, after his project had successfully completed, with the inclusion of the component he had recommended, Joe went back and took a look at his original "short list" of 10 candidate components. Almost all of the products he had eliminated were currently either no longer supported, or the vendors had gone out of business. Joe gathered information from other COTS-based projects that had taken place over the previous year and found many similar experiences. Joe felt this was a very useful lesson learned, so he went again to the CeBASE COTS Lessons Learned Repository, but this time to add his own experience. He summarized his lesson as "information about component vendors is as important as information about components in the COTS evaluation process." He also provided more information about the context of his observations (e.g. in defense-related, mission-critical systems) and provided a few more pieces of information, then submitted all this for others in the community to access.
The repository content grows organically through contributions from users like Joe who add new lessons. The content also evolves as a result of analysis, synthesis, and refinement of the existing lessons, an activity performed by CBS experts in collaboration with the maintainer of the database. The attributes used to characterize and classify the records will also evolve over time, again in the directions indicated by use and need.
Based on his experience with the CCTwo project, Joe became a regular user of the CLLR. One day, as he was browsing for information, he came across a lesson that had been derived from the lesson he had earlier submitted, along with similar lessons. While his lesson had addressed the general importance of vendor information in COTS evaluation in his domain, this new, synthesized lesson went further. It said, "If your customer is willing to negotiate the requirements, it is cost-effective to choose components from more dependable vendors than to choose components with a better functional fit to the original requirements". This statement was based on lessons submitted by software engineers in various domains who had both good and bad experiences with vendors. Some of them had done detailed cost/benefit analyses. This gave Joe a lot more information about how to use vendor information in future COTS evaluations. It also gave him a good feeling to know that he had contributed in a meaningful way to the body of knowledge in his community.
Reading this lesson left Joe wondering how to quantify the risks associated with choosing one vendor over another, so he clicked on the "Ask An Expert" option and submitted a question on the subject. This began a dialogue with an expert in the area, who asked Joe questions about the specifics of his project and the risks he was concerned about. Finally, they came up with a plan that allowed Joe to quantify the "cost" of a risk, as well as the "benefit" of reducing that risk. These figures could then be incorporated into the cost and benefit estimates for the task. Later, Joe was happy to find a new lesson in the repository synthesized from his discussion with the expert, so that others could benefit from what he had learned.
The Ask An Expert feature is currently being implemented. It will offer an opportunity for peer-to-peer communication (left part of Figure 1). The dialogue is recorded, and after the person who asked the question receives an answer, the log of the conversation is used to extract new lessons that are then added to the repository.
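A minimal sketch of how a recorded dialogue could be turned into a candidate lesson for the validation buffer is shown below; the structure and names are assumptions on our part, since the feature is still being implemented.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class DialogueLog:
    """Recorded 'Ask An Expert' exchange: a topic plus (speaker, text) turns."""
    topic: str
    turns: List[Tuple[str, str]]

def draft_lesson_from_dialogue(log: DialogueLog) -> dict:
    """Package a finished dialogue as a draft lesson for an analyst to edit.
    In practice the summarizing is done by a person; this only collects the material."""
    expert_text = " ".join(text for speaker, text in log.turns if speaker == "expert")
    return {
        "statement": f"Draft lesson on '{log.topic}' (to be edited by an analyst)",
        "supporting_material": expert_text,
        "status": "pending validation",
    }

# The resulting draft could then be routed through the same submission buffer
# used for user-contributed lessons and feedback.
```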
Joe began using his new approach for quantifying risks on more projects, and over time built up some experience with it. However, he still had problems with using it in some situations. In particular, he often had trouble estimating the potential loss of having to entirely replace a component due to unforeseen problems with it. He wanted to bounce some ideas off someone, but nobody in his immediate work area was familiar enough with these issues to be able to give any useful feedback. So Joe went to the CeBASE site again and this time joined a discussion group on COTS estimation. He posted his questions about incorporating risk into estimation to the discussion group, and an interesting discussion ensued. Many people shared their experiences using different approaches to cost estimation on COTS projects, and some offered well-tested alternative approaches, such as the COTS Cost Estimation Model (COCOTS [2]). So Joe got some new ideas to try out on his projects, and everyone involved learned something. The lessons shared in the discussion that were best supported with real experience and data were captured as new lessons learned in the repository, so that others not involved in the discussion might learn from them too.
Besides the peer-to-peer communication mechanism, we will offer other means of community support, such as group discussions and eWorkshops (moderated chat sessions whose logs are recorded). These logs will be analyzed and mined, and new lessons will be derived from them and added to the repository, following our approach of transforming "knowledge dust" into "knowledge pearls" [5]. We will thus provide the process and the technology to support knowledge collection, organization, storage, evolution, and dissemination for the CBS community.
The COTS Lessons Learned repository aims to disseminate valuable knowledge and experience among practitioners involved in COTS-based development. Some lessons learned in this area have been published in papers, but they have not previously been actively elicited and shared on a larger scale. By providing an online repository used by both experienced and less experienced people, it is possible to create a community of software engineers and managers who share this kind of knowledge and experience on a daily basis.
The COTS Lessons Learned Repository is waiting to serve your needs in CBS development. We appreciate your contributions with new lessons and any feedback you can give. Please visit us at http://fc-md.umd.edu/fcmd/index.htmlll/index.asp.
This work is partially sponsored by NSF grant CCR0086078 to the University of Maryland with subcontract to the Fraunhofer Center Maryland (FC-MD). Many thanks to Chris Abts from the University of Southern California (USC) for providing the material to initially populate the repository; to Forrest Shull and Patricia Costa from FC-MD and Dan Port from USC for their suggestions in developing the repository.