Opinion on “Is System Debugging a valid concept?”

LinkedIn discussion from the System Integration and Test group

“Debugging” is a valid concept.

IMHO, “Debugging” is not on par with “Design”. “Debugging” is not a technical process, it is an outcome of the execution of a technical process.

ISO/IEC/IEEE 24765 Systems and software engineering – Vocabulary defines “Debug” as:

to detect, locate, and correct faults in a computer program.

Fault is defined as:

1. manifestation of an error in software;
2. an incorrect step, process, or data definition in a computer program;
3. a defect in a hardware device or component.

“Bug” is listed as a synonym for “fault”.

There is nothing prohibiting the extension of the term to all elements of a system (hardware, software, design, requirements, etc…).

“Bugs”, or faults, are found by executing test cases against a system element, the system under test (SUT), and comparing the expected result (the test oracle) against the observation. The expected result is derived from the test basis; if the observation does not conform to the oracle, then the SUT is faulty. The bug or fault must be removed, or the SUT will never be found compliant with its requirements. And yes, IMHO, a test case can be realized as an “inspection” of a design, an abstract implementation, etc…
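The execute-observe-compare loop described above can be sketched in code. This is a minimal illustration, not anything from the standards: the SUT function, the seeded fault, and the test-case tuples are all hypothetical, standing in for a system element and expected results derived from a test basis.

```python
# Minimal sketch of fault detection: execute test cases against an SUT,
# compare each observation to the oracle's expected result, and report
# non-conformances as faults. All names here are illustrative.

def sut_add(a, b):
    """Hypothetical system under test with a deliberately seeded fault."""
    return a + b if a < 100 else a - b  # fault: wrong behavior for a >= 100

# Test cases: (input, expected result) pairs derived from the test basis.
test_cases = [((2, 3), 5), ((10, 4), 14), ((100, 1), 101)]

def run_tests(sut, cases):
    """Return every non-conformance between observation and oracle."""
    faults = []
    for args, expected in cases:
        observation = sut(*args)
        if observation != expected:  # observation non-conforming to oracle
            faults.append((args, expected, observation))
    return faults

for args, expected, observed in run_tests(sut_add, test_cases):
    print(f"Fault: input {args}, expected {expected}, observed {observed}")
    # → Fault: input (100, 1), expected 101, observed 99
```

The point of the sketch is the comparison step: a fault is not an opinion, it is an observed non-conformance between the SUT's behavior and an expected result traceable to the test basis.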

A “Test System” is an Enabling System of the System of Interest and has a responsibility for the production of objective evidence that the system of interest as well as its elements satisfies acceptance criteria.

ISO 15288 identifies the life cycle stages and technical processes. Test is an activity of the integration, verification, and validation technical processes. Test is realized through the execution of test cases (behavior specifications realized by test components) against an SUT. Every element of a system traverses the development stage in its own life cycle and is subjected to the execution of technical processes. One outcome of “Integration” is:

c) A system capable of being verified against the specified requirements from architectural design is assembled and integrated.

The implication is that to “be capable of being verified”, and subsequently “accepted” by a stakeholder, the system element must be brought into compliance with its requirements, that is, “to be free of faults”. Faults/bugs in a system element are detected and corrected (“debugged”) as an outcome of the execution of process activities and tasks.

There is a new ISO standard under development for software testing, ISO/IEC/IEEE 29119, which currently consists of four volumes. Volume 1, Annex A, provides a depiction of the role of test in V&V. IMHO, the principles of the standard can apply to all engineering domains, not just software. I’m not asserting that the standard is the holy grail, but it does have some good content, and some information on it is available in the public domain.

ISO/IEC/IEEE 24765 Systems and software engineering — Vocabulary defines “test” as “an activity in which a system or component is executed under specified conditions, the results are observed or recorded, and an evaluation is made of some aspect of the system or component”.

The evaluation of the test’s observation may conclude that the observation does not conform to the expected result. If the expected result is confirmed to be consistent with the test basis, then in this case the existence of a fault/bug is confirmed.

One objective of the ISO/IEC/IEEE standards is to establish a common framework and lexicon to aid communication among diverse domains. There is still much work to be done toward this end, and there are some very committed individuals striving to harmonize the ISO standards. There is value in learning a common language.

Test is not analogous to the activity in which young children engage on Easter Sunday: an unstructured, random sequence of behavior in which the discovery of an egg is mere happenstance. Yet many individuals engage in exactly such behavior and call it test.

If bug=defect=fault, then debugging=dedefecting=defaulting

Food for thought.

NIST published a comprehensive report on project statistics and experiences, based on data from a large number of software projects:

- 70% of defects are introduced by the specifications.

- 30% are introduced later, in the technical solution.

- Only 5% of the specification defects are corrected in the specification phase.

- The remaining 95% are detected later in the project, or after delivery, where the cost of correction is on average 22 times higher than a correction made directly during the specification effort.

Find the requirement defects in the phase of the program where they occur and there will be fewer defects to find during integration test or system test.
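Taken at face value, those figures make the arithmetic of late detection stark. A quick calculation, using only the numbers quoted above (5% of specification defects corrected in phase at unit cost, 95% corrected later at 22 times that cost):

```python
# Average correction cost per specification defect, using the NIST
# figures quoted above. Unit cost 1.0 = correcting a defect directly
# during the specification effort.
in_phase_fraction = 0.05   # corrected in the specification phase
late_fraction = 0.95       # detected later in the project or after delivery
late_cost_multiplier = 22  # average cost factor for late correction

avg_cost = in_phase_fraction * 1.0 + late_fraction * late_cost_multiplier
print(f"Average cost per specification defect: {avg_cost:.2f}x")
# → Average cost per specification defect: 20.95x
```

In other words, every specification defect caught during specification rather than later saves, on average, 21 cost units, which is the whole argument for prevention over detection.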

A work product inspection is a Test. It employs the static verification method “inspection”. ISO 29119 supports this concept. The INCOSE Guide for Writing Requirements can serve as your inspection checklist. It is also the specification for writing requirements and is therefore your Test Basis.
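As an illustration of inspection-as-test, a requirement statement can be “executed” against a checklist of rules and each violation recorded as a non-conformance. The rules below are hypothetical examples in the spirit of the INCOSE Guide for Writing Requirements, not quotations from it:

```python
import re

# Hypothetical inspection checklist: each rule is (description, check).
# A requirement that violates a rule is a non-conformance, i.e. a fault
# in the work product. Rules are illustrative, not from the INCOSE guide.
CHECKLIST = [
    ("uses the imperative 'shall'",
     lambda r: re.search(r"\bshall\b", r) is not None),
    ("contains no vague terms",
     lambda r: not re.search(r"\b(user-friendly|quickly|adequate)\b", r)),
    ("states a single requirement (no conjoined 'and'/'or')",
     lambda r: " and " not in r and " or " not in r),
]

def inspect(requirement):
    """Return the checklist rules the requirement statement fails."""
    return [desc for desc, check in CHECKLIST if not check(requirement)]

print(inspect("The system shall respond quickly."))
# → ['contains no vague terms']
print(inspect("The system shall respond within 2 seconds."))
# → []
```

Structurally this is the same loop as a dynamic test: the checklist is the test basis and oracle, the requirement statement is the SUT, and each failed rule is a detected fault.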

SEs (as the authors of the specifications) are typically the source of the majority of defects in a system.

Stakeholder politics plays a role in the requirement problem. Incompetence is yet another significant contributor. There are a host of factors.

Many SEs are behind the power curve. The ISO, IEEE, and INCOSE are driving SE maturity; SEs need to get on board and support these efforts.

Emphasis needs to be on prevention and not detection.

