Volume 5, Number 3 - Experience-Based Management


Did We Lose Our Religion?

by Lloyd K. Mosemann II, Science Applications International Corporation (SAIC)


1.0 Background

Back in 1990 I declared that the 1980s were a lost decade from the perspective of software development progress. The question I posed was: "Will there be a breakthrough in the 1990s?" I went on to say:

"It won't happen automatically; people are too satisfied with unsatisfactory ways. We dare not make the mistake of complacency a la the automobile industry; we must push awareness and resource commitment to get ahead of the power curve of demand."

In 1994, I closed the annual DoD Software Technology Conference with the observation that the underlying need within the defense community is for predictability:

"From a Pentagon perspective, it is not the fact that software costs are growing annually and consuming more and more of our defense dollars that worries us. Nor is it the fact that our weapon systems and commanders are becoming more and more reliant on software to perform their mission. Our inability to predict how much a software system will cost, when it will be operational, and whether or not it will satisfy user requirements, is the major concern. What our senior managers and DoD leaders want most from us, is to deliver on our promises. They want systems that are on-time, within budget, that satisfy user requirements, and are reliable."

The question I should like to explore briefly in this article is:

"Where are we today, and where will we be tomorrow?" Did we lose our religion?

2.0 Did We Lose Our Religion?

Why do I use the metaphor of "religion"? Primarily because "religion" is the traditional example of "faith-based" behavior - that is, behavior that is based on a belief-system rather than on externally imposed rules such as the "law of gravity" or "she that has the gold, rules". Remember: the emotional discussions regarding whether Ada or C++ should be preferred were frequently described as "religious" arguments based on beliefs rather than facts.

Sadly, I still see the world of software being governed by religious-like belief systems rather than objective appraisal. When I left the Pentagon six years ago I described some of what was happening as "bumper sticker" management, and the situation has not changed for the better. For example, "Use Best Commercial Practice"; or, "Buy Product not Process".

What's wrong with "best commercial practice"? Fact is, it just doesn't exist among DoD contractors. It's a fantasy created by those who want to streamline acquisition, making it possible to cut the number of oversight personnel by reducing the opportunity for oversight. The best way to justify a "hands off" attitude is to assert that contractors always do everything right!

There are more mature software organizations today. Virtually every large DoD contractor can boast at least one organization at SEI CMM Level 4 or above, and several organizations at SEI CMM Level 3. On the other hand, most DoD software is still being developed in less mature organizations - mainly because the Program Executive Officer (PEO) or Program Manager doesn't demand that the part of the company that will actually build the software be Level 3!

Back in 1991 Paul Strassmann, who was then the Director of Defense Information, said:

"The #1 priority of DoD, as I see it, is to convert its software technology capability from a cottage industry into a modern industrial method of production."

Guess what? That has not happened. Why not? Because this requires "software engineering". Software engineering encompasses three key elements - methods, tools, and procedures - that enable the manager to control the process of software development and provide the practitioner with a foundation for building high-quality software in a productive manner.

The fundamental ingredient in a "software engineering" approach is the design of a robust software architecture. By architecture I don't mean the grouping and linkage of servers and routers and PCs, but rather the inherent design of the software itself - the identity of modules and their relationships, including in particular the infrastructure, control, and data interfaces that permit software components to operate as a system. An attendee at a DoD Systems Design Review several months ago told me that the contractor described his proposed system as "modular". That's a good architectural term, and it was accepted at face value. In fact, the system, when examined by the independent attendee, was found to have only two modules. When this was brought to the Government Program Manager's attention, he said, "The contractor says it is modular. He's smarter than we are." This little incident underscores two facts: architecture was understood by neither the contractor nor the government PM, and the probability of a successful software delivery is low.
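The point about modules and interfaces can be made concrete. What follows is a minimal, hypothetical sketch in Ada (the package and operation names are invented for illustration, not taken from any actual program) of the kind of explicit module interface a sound architecture is built from: clients see only the specification, and the control and data interfaces are spelled out.

```ada
--  Hypothetical module interface. Clients compile against this
--  specification only; the implementation can change without
--  rippling through the rest of the system.
package Track_Store is

   type Track_Id is private;

   --  Control interface: lifecycle operations on tracks.
   procedure Add_Track
     (Latitude, Longitude : in Float;
      Id                  : out Track_Id);

   procedure Drop_Track (Id : in Track_Id);

   --  Data interface: read-only queries available to other modules.
   function Track_Count return Natural;

private
   type Track_Id is new Positive;
end Track_Store;
```

A genuinely "modular" system decomposes into many such units with narrow interfaces; a system with only two modules, as in the incident described above, shows that no such decomposition was ever done.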

All too often the DoD excuse for not requiring an architectural evaluation is that "requirements change so rapidly that we can't afford to be locked into a specific architecture". Wrong - that is the very reason that attention should be given to architecture, so that changes to requirements can be accommodated easily.

I am told that SEI is still being asked to do Independent Technical Assessments of "Why has a software acquisition not produced the desired working software system?" Why wait to ask SEI to come in to do a post mortem and tell you how you screwed up? Why not ask them to come in, first, to review the RFP and Statement of Work; second, to assist in evaluating the software engineering practices, software development environment, and architecture proposed in response to the RFP; and then, again, after award to assess the quality of the software engineering and development process? SEI isn't cheap, but terminating a $100M project for lack of satisfactory performance isn't cheap either.

Interestingly, when one thinks about "best commercial practice", there are two very different worlds out there. There is the government contractor world; and there is the real commercial world - banks, insurance companies, UPS and FedEx, Eckerd Drug, and Disney World. These companies develop their own software using the best available software tools, such as Rational's software development environment. They don't pick the cheapest tools, and they don't rely on COTS or outside software developers. Their software is their business: they consider it a competitive advantage, they want to control it, and they use the best tools available regardless of cost.

Product Line Developments are also becoming increasingly commonplace in the true commercial world. In addition to the original example (CelsiusTech, the Swedish firm that was the first to exploit the benefits of a product line architecture approach to software application development back in the late 1980s), there are now a number of firms that you have heard of who are using an architecture-centered approach: for example, Nokia, Motorola, Hewlett-Packard, Philips, Cummins Engine, and (believe it or not) one government application at the National Reconnaissance Office (NRO). The NRO is enjoying 50% reductions in overall cost and schedule, and nearly tenfold reductions in defects and development personnel. Let me list for you the most common and relevant-to-DoD reasons that these companies give for adopting the "Architecture-Centered Product Line Approach" to software development:

These companies and the NRO do have "best practices", but the "best practices" are not yet widely recognized as such. Frankly, it will take DoD PEOs (not Program Managers) and their overseers, the Service Acquisition Executives, and especially the DoD Comptroller and PA&E folks, to recognize and direct the product line approach to software development and acquisition. (Unfortunately, these folks are not known for their software acumen.) The major impediments to the Product Line Development approach, aside from ignorance of its benefits, are cultural, organizational, and, especially, the DoD's stovepipe approach to budgeting and funding. DoD has many potential software product lines. None of them has been built, largely for "political" and stovepipe budgeting reasons. As a result, development and fielding of defense systems continues to take longer, cost more, and lack predictable quality. Product lines require strategic investment, which appears to be outside the DoD Comptroller and Acquisition communities' frames-of-reference. Yet, it is the Department of Defense that most often uses the term "strategic".

Cummins Engine Company used to take a year or more to bring software for a new engine to the test lab - now it can do it in less than a week! I strongly recommend that you obtain a copy of a new book, Software Product Lines, just published in 2002 by Addison-Wesley. The authors are from SEI. Sadly, with the exception of the NRO, it appears that the readers are mainly from commercial organizations. There was a previous book, Software Architecture in Practice, that sold more than 14,000 copies, but I suspect that few, if any, DoD folks have read it.

I work for one, but let me tell you that, although government contractors are commercial organizations, they do not have an identifiable "best commercial practice". They basically provide what the Government asks for. The only reason many of them can now boast at least a few SEI maturity level organizations is that, starting 10 years ago, many government RFPs required a Software Capability Evaluation as a condition of bidding. In fact, sad to say, many contractors today put satisfactory CMM credentials into their proposals, but then actually perform the work with an organization that couldn't spell CMM. Why does the Government let this happen? Why aren't there Software Capability Evaluations anymore? The word is that they "take too long and cost too much". Better to make a quick award, and then, down the line, accept an inferior product or terminate for convenience. Too often what the government has been asking for is COTS. How many failures of COTS-based acquisitions have there been over the past decade? Too many!

"Best commercial practice" is not eliminating all software smarts in government and relying 100% on contractors to deliver good software. "Best commercial practice" is what real commercial companies are doing. They have in-house software expertise, they use a robust software development environment, and they base their software development on a sound software architecture. It is no secret/no surprise that Rational and their competitors have a growing market in the commercial world and a shrinking market in the government world. I'm not suggesting that the Government can or should develop software in-house. I am strongly suggesting that the Government needs enough in-house software expertise to know what it is buying. It is still true that you get what you pay for", and that "apples and oranges" are not the same!

Watts Humphrey recently published a new book entitled Winning with Software - An Executive Strategy. Not surprisingly, the book is directed primarily at executives of commercial companies. But every DoD acquisition executive, PEO, and Program Manager needs to read and understand its message. Frankly, its message is pretty simple: software projects rarely fail for technical reasons; invariably, the problem is poor management. He poses two interesting questions and buttresses them with numerous examples:

  1. 1. "Why do competent software professionals agree to dates when they have no idea how to meet them?"

  2. 2. "Why do executives accept schedule commitments when the engineers offer no evidence that they can meet these commitments?".

He asserts that Management's undisciplined approach to commitments contributes to every one of the five most common causes of project failure:

  1. Unrealistic schedules
  2. Inappropriate staffing
  3. Changing requirements
  4. Poor quality work
  5. Believing in magic

3.0 What is Formal Methods Programming?

Basically, it is all of the above rolled together: sound management, established engineering processes, a robust software development environment, model-based architecture, and a reliable programming language. I was thrilled to see, in the March edition of CROSSTALK, an article by Peter Amey of Praxis Critical Systems, in which he stated:

"There is now compelling evidence that development methods that focus on bug prevention rather than bug detection can both raise quality and save time and money."

He then goes on to say that a key ingredient is the use of unambiguous programming languages that allow rigorous analysis very early in the development process. I was an early and vocal advocate of Ada, primarily because, unlike other languages, it enforces basic software engineering practice. In that article Peter describes the use of a subset of Ada known as SPARK. He says,

"The exact semantics of SPARK require software writers to think carefully and express themselves clearly; any lack of precision is ruthlessly exposed by its support tool, the SPARK Examiner."

He indicates that there were significant savings from using SPARK in a critical avionics program, including most notably that formal FAA test costs were reduced by 80%. Unfortunately, this is an isolated DoD example - the same rigor and discipline of formal methods programming should apply to all major software-intensive system developments.

4.0 Predictability

Finally, let me say a word about predictability. Predictability is the only metric that the warfighters care about. The question is: how can we make the warfighters (including the PEOs and PMs who are charged with delivering the capabilities needed by the warfighters) smart enough to know that you can't just buy software as a product off the showroom floor? There must be an understanding of the software engineering paradigm that produces software. In fact, more than this, to be assured of getting software that works on a predictable schedule and at predictable cost requires that someone in government be smart enough to enunciate the basic processes that the contractor will employ to produce it.

This is important because, otherwise, competitors will bid an unrealistically low price and an unrealistically fast schedule, and be awarded the contract. Performing at the low cost means no robust software development environment, no time and effort devoted to creating a valid software architecture, and probably cheap people. If the government wants to "get software right", sufficient process guidance must be given to assure that the contractors all bid essentially the same level of capability.

I believe that Government should be explicit about the need for architecture, a robust software development environment, perhaps even the requirement for a language like SPARK, but, as a minimum, it needs to specify that the performing organization must be at least CMM Level 3. Why? Because at Level 1 virtually all projects have cost and schedule overruns, whereas at Level 3 virtually all projects are on target. As regards defect rates and dollars per source line of code, there is substantial improvement, on the order of 20-50%. It really is true that YOU GET WHAT YOU PAY FOR. If you want it cheap, you'll get it cheap - but it may not work in the manner envisioned, if it will work at all.

Where have we been? The Government, I fear, has been wallowing in a slough of self-deceit, thinking it need not have software expertise but can simply rely on the so-called "best commercial practices" of contractors. The problem is that, left to their own devices, the "best practices" of defense contractors are not very good. Are there best commercial practices? You bet! The banks, insurance companies, and other truly commercial enterprises have them. But they have them not because of some automatic external happenstance, but because their senior managers have had the moxie to realize that it takes money to make money, and that it takes software expertise to develop or acquire software. The Government needs to go and do likewise. Otherwise, the decade of the 2000s will likely not show any lessening of the Software Crisis that carried over from the 1990s.

