Testing with Object-Oriented Technology (OOT) in the Telephone Industry

Larry Bernstein - Have Laptop - Will Travel


Introduction

The telephone is an enormously successful invention; each new level of system that surrounds it has spawned radical innovations and new services. Each change adds complexity to managing the telephone network. Object-Oriented Technology (OOT) is the best answer to controlling the multiplying configurations that appear with each new service. But OOT experiences the same birth pangs as every other new idea.

In addition, outsourcing is changing the telecommunications industry. In the last decade service providers have moved from developing their own services and equipment internally to buying them from third parties. The split of Lucent Technologies from AT&T in 1996 was the ultimate expression of this policy. With outsourcing comes the challenge of evaluating just how well vendor systems work before making a purchase decision.


Test Models

GlobalOne met this challenge with an innovative use of Teradyne’s TestMaster tool. TestMaster is an automated test design and coding tool, which was used to build an object-oriented model of the outsourced system. The model was based on the specifications contained in GlobalOne’s Request for Proposal and on system descriptions provided by the supplier. GlobalOne engineers were able to use this model to map the functions they wanted first against the system description and then against the system itself. This assured them that the contracted telephony functions were present and that the GlobalOne system engineers understood how the new service would fit into their business environment. This approach showed how giving modeling tools to the customer’s system engineers can head off unintended consequences well before the system is even developed.
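
TestMaster’s internals are proprietary, but the idea behind such a model can be sketched in a few lines of C++: capture the specified behavior as a state machine and walk every path through it to produce candidate test cases. The states, events and depth bound below are invented for illustration; they are not GlobalOne’s actual call model or TestMaster’s notation.

// Sketch of model-based test design: specified behavior is captured as a
// small state machine, and every path from the start state to a terminal
// state (or the depth bound) becomes a candidate test case.
#include <cstddef>
#include <iostream>
#include <map>
#include <string>
#include <vector>

struct Transition {
    std::string event;  // stimulus applied to the system under test
    std::string next;   // state the model expects afterwards
};

using Model = std::map<std::string, std::vector<Transition>>;

// Depth-first walk of the model; each complete path is printed as a test case.
void enumeratePaths(const Model& model, const std::string& state,
                    std::vector<std::string>& path, std::size_t maxDepth) {
    auto it = model.find(state);
    if (it == model.end() || path.size() >= maxDepth) {  // terminal or bound hit
        std::cout << "Test case:";
        for (const auto& step : path) std::cout << " " << step;
        std::cout << " -> end in " << state << "\n";
        return;
    }
    for (const auto& t : it->second) {
        path.push_back(t.event);
        enumeratePaths(model, t.next, path, maxDepth);
        path.pop_back();
    }
}

int main() {
    Model callModel = {
        {"Idle",      {{"off-hook", "DialTone"}}},
        {"DialTone",  {{"dial-valid-number", "Ringing"},
                       {"dial-invalid-number", "ErrorTone"}}},
        {"Ringing",   {{"answer", "Connected"}, {"abandon", "Idle"}}},
        {"Connected", {{"hang-up", "Idle"}}},
    };
    std::vector<std::string> path;
    enumeratePaths(callModel, "Idle", path, 8);
    return 0;
}

Each printed path corresponds to one test case that an automatic test driver could execute against the service node.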

The TestMaster model of the service node gave the GlobalOne system engineers insight into the dynamics of the complex system of systems that made up the service offering. With the model, the systems engineers were able to study the unique call flow for every variation of the service. For example, GlobalOne customers can use one of 12 languages to interact with their network. Manually evaluating the interaction of language selection, based on the object libraries, with the many service variations would have been a huge task without TestMaster and supporting evaluation tools. With traditional manual methods, the system engineers would study the system specifications and then develop test cases to verify that the system worked as they expected. Finding the error paths is always a challenge. Typically, many review meetings are needed among the system engineers themselves and then with the vendor’s technical people to ferret out potential incompatibilities. With this approach, serious problems are often overlooked; at best they show up in service testing and at worst they are found by the paying customers. Problems found and fixed during the service test effort cost three times as much as those found with the model, and those found by the user add another factor of ten in cost escalation.
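
The scale of that evaluation is easy to underestimate. A rough sketch of the combination count, with invented service-variation names, shows why manual enumeration breaks down: even before error paths are considered, the language choices alone multiply every variation by twelve.

// Illustration of the test-space explosion described above: 12 interface
// languages crossed with a handful of service variations already yields
// dozens of call flows to verify. The variation names are invented.
#include <iostream>
#include <string>
#include <vector>

int main() {
    const std::vector<std::string> languages = {
        "English", "French", "German", "Spanish", "Italian", "Dutch",
        "Portuguese", "Swedish", "Japanese", "Mandarin", "Korean", "Russian"};
    const std::vector<std::string> variations = {
        "prepaid-card", "postpaid-account", "conference-call",
        "follow-me-forwarding", "voice-mail-deposit"};

    std::size_t count = 0;
    for (const auto& language : languages)
        for (const auto& variation : variations)
            std::cout << "call flow " << ++count << ": "
                      << language << " x " << variation << "\n";

    std::cout << count << " combinations to evaluate, before error paths\n";
    return 0;
}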

The TestMaster model-based test creation method permits the early involvement of the test organization in the development process and is a powerful tool for facilitating communication between customer and supplier engineers. For example, the service offering systems use several different database technologies. To install the new service, a database of customers was needed that contained administrative data and their service requests. The database initialization process was modeled with TestMaster so that the database records were automatically generated from the model. Once the testers saw the strength of the model, they adopted it as their test case database repository. Consequently, the TestMaster model of the databases was used both for populating the component databases in the target system and as the input data for the test creation process. Expected results from the model were kept and later compared to the results from running the test cases against the system. When there were differences, analysts would compare the flow in the model with the flow in the service offering and find the problem. This moved debugging from detective work to analysis. Problems were found in the object libraries, in the component systems, in the model and even in the system design.
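
The dual use of the model can be illustrated with a short, hypothetical sketch: the same customer records drive both the database-load file and the expected results that later test runs are compared against. The record layout and field names here are assumptions for illustration, not the actual GlobalOne schema.

// Sketch of the dual use of the model: customer records generate the
// database-load file, and expected results kept with the model are later
// compared against actual test output. Field names and the flat-file
// format are hypothetical.
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

struct CustomerRecord {
    std::string id;
    std::string language;        // interaction language chosen by the customer
    std::string serviceRequest;  // feature the customer subscribed to
};

// Emit database-load records derived from the model.
void writeLoadFile(const std::vector<CustomerRecord>& records,
                   const std::string& path) {
    std::ofstream out(path);
    for (const auto& r : records)
        out << r.id << '|' << r.language << '|' << r.serviceRequest << '\n';
}

// Compare expected results kept with the model against actual test output.
bool compareResults(const std::vector<std::string>& expected,
                    const std::vector<std::string>& actual) {
    if (expected.size() != actual.size()) return false;
    for (std::size_t i = 0; i < expected.size(); ++i)
        if (expected[i] != actual[i]) {
            std::cerr << "Mismatch at step " << i << ": expected \""
                      << expected[i] << "\", got \"" << actual[i] << "\"\n";
            return false;
        }
    return true;
}

int main() {
    std::vector<CustomerRecord> records = {
        {"C001", "French", "follow-me-forwarding"},
        {"C002", "Japanese", "prepaid-card"}};
    writeLoadFile(records, "customer_load.dat");

    // In practice the "actual" results come from running the generated tests
    // against the service node; they are stubbed here so the sketch is
    // self-contained.
    std::vector<std::string> expected = {"C001 forwarded", "C002 debited"};
    std::vector<std::string> actual   = {"C001 forwarded", "C002 debited"};
    std::cout << (compareResults(expected, actual) ? "PASS" : "FAIL") << "\n";
    return 0;
}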

The model ensures that all features are present, not just the headliners. Once the service offering is installed in the evaluation labs, the model produces test suites for automatic test drivers. These tests verify that the system performs as expected.

The test scripts from the model resulted in high coverage rates for feature testing. Quite often testers are pressed for time and do not have the resources for exhaustive load and reliability testing. While testers focus on full-load testing, they often do not have the time to run “no-load” tests. These tests set up one or two simple transactions and then let the system idle, waiting for new work. With the TestMaster model, setting up such a script was easy to do, and it pointed to reliability problems in the service offering system. With the data in hand it was clear that even this minimal offered load was triggering reliability problems, and no one could argue that the test load was unrealistic. A long-term benefit is that once the system is installed, the model may be used for regression testing.
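
The shape of a no-load test is simple enough to sketch. The harness below runs a couple of trivial transactions, then idles and periodically confirms the system is still responsive; the transaction and health-check routines are stubs standing in for real test-driver calls, and the idle period is shortened for illustration.

// Sketch of a "no-load" test: a small amount of work, then a long idle
// period with periodic health checks. Resource leaks, runaway audits and
// timer bugs tend to surface while the system is idle.
#include <chrono>
#include <iostream>
#include <thread>

bool runSimpleTransaction(int n) {
    std::cout << "transaction " << n << " completed\n";
    return true;  // stub: a real driver would place and verify a call
}

bool systemStillResponsive() {
    return true;  // stub: a real check might poll an operations interface
}

int main() {
    using namespace std::chrono_literals;

    // Step 1: one or two simple transactions, nothing near full load.
    runSimpleTransaction(1);
    runSimpleTransaction(2);

    // Step 2: idle, checking health at intervals. A real no-load test would
    // idle for hours or days; the sketch uses a few short intervals.
    for (int check = 1; check <= 3; ++check) {
        std::this_thread::sleep_for(10s);
        if (!systemStillResponsive()) {
            std::cout << "failure while idle at check " << check << "\n";
            return 1;
        }
        std::cout << "still healthy after idle check " << check << "\n";
    }
    std::cout << "no-load test passed\n";
    return 0;
}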


How to Reap Those Benefits

There are four broad benefits to using OOT: it manages complexity better than anything else available; it increases development and testing speed; it makes reuse practical in a way it has not been before; and it permits the scaling-up of systems. To reap these benefits, three factors are critical: software testing techniques must be generally understood, tools and an infrastructure must be in place, and middle management must accept the new mind-set so that the culture and management processes suit this new method.

Most of what we hope to gain from OOT derives from module encapsulation plus the idea of pre-built libraries of modules or classes that are designed and tested for reuse.
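
A small, hypothetical class shows what that encapsulation buys: callers depend only on the public interface, so a pre-built, pre-tested class can be reused, and its representation changed, without rippling through the rest of the system. The class below is an illustration, not a class from the systems described here.

// Minimal illustration of encapsulation for reuse: clients depend only on
// the public interface, so the internal representation can change (or be
// replaced by a tested library class) without touching callers.
#include <iostream>
#include <stdexcept>
#include <string>

class TelephoneNumber {
public:
    explicit TelephoneNumber(const std::string& digits) : digits_(digits) {
        if (digits_.size() < 7 || digits_.size() > 15)
            throw std::invalid_argument("telephone number length out of range");
        for (char c : digits_)
            if (c < '0' || c > '9')
                throw std::invalid_argument("telephone number must be digits");
    }

    // The only operations callers may rely on; everything else is private.
    const std::string& digits() const { return digits_; }
    std::string areaCode() const { return digits_.substr(0, 3); }

private:
    std::string digits_;  // representation is hidden and free to change
};

int main() {
    TelephoneNumber number("2125551234");
    std::cout << "area code: " << number.areaCode() << "\n";
    return 0;
}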

I spoke with Bjarne Stroustrup, the inventor of C++, and he remarked that OOT forces one to think about architecture and design from the beginning, not just as an afterthought. Stroustrup, in his soft-spoken but direct way, points out that every significant change carries risk:

“My number 1 reason to worry about OOT is that middle managers will have to learn new tricks and techniques to manage development well. This is not easy. A development manager, say, is often in a position to get blame for whatever goes wrong, yet most of the credit goes to their people when things go well. This doesn’t encourage risk-taking or innovation. Unless managers somehow find the time to get a basic understanding and a minimum level of comfort with OOT, it will seem only a risk, and token efforts will abound.”

In 1988, I ran a project called MACSTAR whose purpose was to control changes on a Centrex system so people could have their telephone numbers follow them through moves. OOT was impressive here. There was a 3 to 1 increase in productivity. Testing proceeded quickly and new features were added to the object-oriented sections with less effort and with fewer interface changes within the system. Accommodations to external changes in the object-oriented modules caused significantly fewer source lines to be modified than for similar changes in structured design modules. Redesign of selected subsystems using the object-oriented paradigm produced savings in maintenance. This was, however, a small project, just 10K lines of code, 400 function points and 20 people.

Two years later, when there was a major effort to deploy OOT throughout the organization, the problems inherent in larger systems appeared. There was legacy code to deal with, the tools were primitive, the mind-set of relational databases was difficult to change, and there was a tendency to lose track of large numbers of object classes. There was some stumbling and hesitation. Testers became confused by the huge object libraries.

But the technology has improved enough to allow us to produce a system that changes the way a telephone network is run. Previously the focus had been on the various hardware architectures that would support broadband services. An integrated object-oriented operations support system platform that enables the efficient delivery, differentiation and billing of new high-quality services became possible. Key was the ability to understand and test new services.

This project changed the paradigm of software development to such a degree that the projections of 3,000 people and 36 months to delivery were invalidated. It took 425 people 15 months to deliver the first release. There are 14 separate subsystems; of these, 8 are new designs and 6 are encapsulations of previously built systems that were not redone. There are 9K function points, 22 modules, 47 system interfaces and 12 databases. The object classes are of 3 types: communications, administration of the system itself, and user interfaces. There are 195 object classes and 376 objects. The communications object classes constituted 190K lines in a 3.5-million-line system. The discipline required to do OOT exposed a 40% redundancy in the design specifications and allowed a paring down to elegance.

A guiding principle from our previous troubles was that the object class libraries should be very small, 0.5% of the number of function points, because we had no tools to keep track of large numbers of classes. There is a tendency for programmers to revert to the “cottage industry syndrome” of developing unique dialects that become unmaintainable. Testers were charged with accepting or rejecting new object libraries. The details of the object classes caused problems at boundary conditions. For example, when the attributes of data being passed were slightly different, modules were not initialized properly. There was considerable churn on attributes. Ultimately Fred Brooks’ realization of the major paradigm shift was made real to us as well: the greater the isolation of each object, the greater its power. Testers made sure that the isolation became a reality.
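
Those initialization failures at boundary conditions suggest a defensive style that testers can check for directly: an object should refuse to come up half-initialized when the attributes handed to it are slightly off. The sketch below, with invented attribute names and limits, shows one way to enforce that.

// Sketch of the boundary-condition discipline described above: an object
// that rejects attribute sets that do not satisfy its invariants rather
// than mis-initializing. Attribute names and limits are hypothetical.
#include <iostream>
#include <optional>
#include <string>

struct ServiceAttributes {
    std::string language;          // must name a supported language
    int maxSimultaneousCalls = 0;  // must be positive
};

class ServiceProfile {
public:
    // Factory that validates every attribute; callers never see an object
    // whose invariants do not hold.
    static std::optional<ServiceProfile> create(const ServiceAttributes& a) {
        if (a.language.empty() || a.maxSimultaneousCalls <= 0)
            return std::nullopt;  // reject rather than mis-initialize
        return ServiceProfile(a);
    }
    int maxSimultaneousCalls() const { return attrs_.maxSimultaneousCalls; }

private:
    explicit ServiceProfile(const ServiceAttributes& a) : attrs_(a) {}
    ServiceAttributes attrs_;
};

int main() {
    // Boundary-condition test: an attribute set that is only slightly wrong.
    ServiceAttributes bad{"", 3};
    std::cout << (ServiceProfile::create(bad) ? "accepted" : "rejected")
              << " incomplete attributes\n";

    ServiceAttributes good{"French", 3};
    std::cout << (ServiceProfile::create(good) ? "accepted" : "rejected")
              << " valid attributes\n";
    return 0;
}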

About the Author

Mr. Bernstein is president of the Center for National Software Studies and is a recognized expert in software technology. He provides consulting through his firm Have Laptop - Will Travel and is the Executive Technologist with Network Programs, Inc., building software systems for managing telephone services. Mr. Bernstein was an Executive Director of AT&T Bell Laboratories, where he worked for 35 years.

Author Contact Information

Larry Bernstein
Have Laptop - Will Travel
901 Minnesota Street
San Francisco, CA 94107 USA
Phone: (415) 550-3020, Tollfree (USA): 800-942-SOFT
Fax: (415) 550-303
E-mail: [email protected]
