Thoughts on a topic: A Scenario Conceptual Model

A federated enterprise may be represented as an encapsulated collection of stakeholder-governed entities conducting enterprise services and processes for the benefit of the stakeholder collective.  Services and processes, in response to a set of defined conditions, transform inputs and provide value-added outputs.  Activity collections, aggregates of enterprise element capabilities, realize the enterprise capabilities provided by these services and processes.  Stakeholders assess the value of the enterprise’s service and process outcomes by applying a standard measure of effectiveness derived from stakeholder governance of the enterprise.  Objective evidence, collected from validation of the enterprise in its intended use environments, supplies this measure of effectiveness of enterprise capability.

Let us consider an architecture description of the enterprise as the conceptual blueprint that defines and documents the enterprise.  The architecture description captures the enterprise’s context as well as its structural and behavioral (i.e., service and process) elements.  The enterprise’s operational governance, objectives, standards and measures, doctrine, constraints, and its enclosing environment comprise the elements of its context.  The enterprise’s stakeholders stipulate the enterprise’s context.  Stakeholders establish the enterprise’s missions in intended use problem spaces, bounded contexts of use.

Missions are goal-oriented.  To accomplish a mission the enterprise delivers an effect, a change in a condition or set of conditions.  The pre-conditions of a bounded context pose a problem; the effect defines the end state of the transformation of the pre-conditions to the post-conditions.  Strategic enterprise goals are key stakeholder statements of the enterprise’s delivery of value.  A mission is both doctrine and an instance of enterprise activity.  At a minimum one, and likely more, threads of the enterprise’s activity elements (discrete activities and their constituent tasks) deliver a capability to accomplish a mission.  An enterprise may encompass multiple missions, and a number of scenarios may elaborate each mission.  A scenario details a specific thread of behavior (e.g., activities and interactions) of a use case specification.  A use case is not synonymous with a scenario.  A unique scenario accommodates each instance of a use case in any one of the multitude of defined contexts of intended use.

An enterprise’s architecture description may possess a single enterprise or strategic use case but possesses numerous scenarios and bounded context specifications.  Scenarios and their contexts are themselves detailed architecture descriptions for problem solving.  They postulate and describe a problem context for the enterprise, now and into the future.  A scenario contains an objective for the enterprise to achieve and a standard to effectively gauge the objective’s accomplishment in the light of the problem context.  A scenario and its bounded context serve both as activity execution guidance for the enterprise and as an implementation of enterprise workflow, a thread.  The bounded context of a scenario contains conditions that influence the enterprise’s behavior and the outcome of a thread.  The detail of the scenario description may specify the dynamic behavior of enterprise activities within and interactions without.  A scenario is concrete and may focus on a single capability provided by the enterprise from the perspective of a single external entity.  An elaboration of the scenario description provides the details of the enterprise’s external interaction activities and governance as well as the enterprise’s internal operational activities and governance.

A scenario is the context specification for an enterprise thread of activity elements.  Threads solve problems by providing effects.  Problem solving provides a capability to enterprise service consumers.  An event, a problem description, initiates a thread.  Every thread has a defined end state, a problem solution.  Exchanges, governed by requirements as well as constraints, between thread elements execute a thread.  The set of conditions applied to the thread performers and their response to these conditions defines the end state.  Successful execution of a thread attains a capability allocated to that thread.

The enterprise exposes its capabilities as the services of its interfaces.  Ports, points of enterprise interaction with the constituents of its environment, host these interfaces.  These ports breach the encapsulation of the enterprise.  Through these interfaces the enterprise provides services to, or requires services from, other enterprises, and is exposed to the conditions of the environment that influence it.  The exchange items of the enterprise’s environment context, as well as its services, flow through these ports and interfaces.  In a sense, the interfaces establish the enterprise’s field of regard to its environment and other enterprises and entities.  Conditions external to the enterprise can only interact with and possibly influence the enterprise through an interface.

A scenario may contain both temporal and causal events.  The goal of a temporal event initiated by an entity external to the enterprise is to induce an enterprise activity.  A set of conditions may accompany the temporal event.  Temporal events initiated by the enterprise allow it to learn about the state of external entities; this knowledge is required for the enterprise to complete a thread of activity.  The goal of a causal event is to specify a set of pre-conditions that elicits a response from the enterprise’s services, a condition set perceived as a problem by the enterprise.  By executing a thread of internal activities triggered by the event, the enterprise may close a loop on an input condition and alter it.  The thread delivers an effect with a defined set of post-conditions.  The application of a measurement standard to the outcome establishes the efficacy of the solution.  These generalized abstractions hold true even in enterprises consisting of activities with stochastic natures.
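
As a rough illustration of this loop, here is a minimal Python sketch (all names, including the temperature condition, are hypothetical): a causal event carries a set of pre-conditions, a thread transforms them into post-conditions, and a stakeholder-derived standard is applied to the outcome.

```python
from dataclasses import dataclass
from typing import Callable, Dict

Conditions = Dict[str, float]  # a condition set, keyed by condition name


@dataclass
class CausalEvent:
    """A problem description: the pre-conditions that elicit a response."""
    pre_conditions: Conditions


@dataclass
class Effect:
    """The delivered change: the post-conditions at the end state."""
    post_conditions: Conditions


def run_thread(event: CausalEvent,
               transform: Callable[[Conditions], Conditions]) -> Effect:
    """Execute a thread of activity: transform pre-conditions into post-conditions."""
    return Effect(post_conditions=transform(event.pre_conditions))


def effectiveness(effect: Effect, standard: Callable[[Conditions], bool]) -> bool:
    """Apply the stakeholder-derived measurement standard to the outcome."""
    return standard(effect.post_conditions)


if __name__ == "__main__":
    # Hypothetical problem: a temperature above a limit; the thread reduces it.
    event = CausalEvent(pre_conditions={"temperature": 80.0})
    effect = run_thread(event, lambda c: {"temperature": c["temperature"] - 15.0})
    print(effectiveness(effect, lambda c: c["temperature"] <= 70.0))  # True
```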

The scenario contains the specification of an event’s required or provided conditions.  The exposed services enable the sensing of these causal conditions.  An enterprise service senses and reacts to these sets of conditions.  Satisfaction of the required input conditions triggers an internal enterprise process thread, a collection of activities.  The objectives of stakeholder or social governance may drive these innate responses of the enterprise to conditions.  Data defines a condition’s scope.  A condition’s data is constrained through governance at the interface.  Natural law as well as doctrine governs an enterprise’s condition data.

The event invokes an instance of a defined enterprise behavior, a thread.  A service owns its behavior scenario and transforms the required input conditions to a new set of provided output conditions defining the instance, an end-state condition set.  The new provided conditions satisfy an enterprise objective, attaining a capability.  The scenario defines the post-conditions of the event-triggered service behavior and the standard to apply to the provided conditions present at the enterprise’s external interfaces, its goals.  An enterprise scenario event is the result of a collaboration of enterprise elements, both external and internal.  The collaboration results in an increment of satisfaction of an enterprise objective, thereby satisfying a stakeholder objective.

To achieve its objectives, an enterprise may interact with other enterprises to obtain resources; the scenario documents these interactions.  To fully satisfy an enterprise objective frequently requires the orchestration of a sequence of scenario events.  To fully satisfy all of the enterprise’s objectives may require a collection of sequential and parallel scenario event sequences.

There is an essence to a scenario conceptual model.  A scenario belongs to an entity that performs services.  The entity performs services to satisfy its objectives.  A causal set of required pre-conditions is an event that instantiates a service.  The result of the service instantiation is a defined set of provided post-conditions.  A collection of performers, performer interfaces, conditions, data, activities, events, and governance represents a scenario conceptual model.  The model provides doctrine, methods, and a test for an instance of problem solving for the enterprise.
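
One way to make this essence tangible is a small structural sketch.  The Python dataclasses below are illustrative only, not a normative metamodel; the field names simply mirror the element list above.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Performer:
    name: str
    interfaces: List[str] = field(default_factory=list)  # performer interfaces


@dataclass
class Activity:
    name: str
    performer: str  # name of the performer responsible for the activity


@dataclass
class Event:
    name: str
    pre_conditions: List[str]   # causal: the required condition set
    post_conditions: List[str]  # provided condition set at the end state


@dataclass
class ScenarioConceptualModel:
    """Performers, interfaces, conditions, data, activities, events, governance."""
    performers: List[Performer]
    activities: List[Activity]
    events: List[Event]
    data: List[str] = field(default_factory=list)        # condition-scoping data items
    governance: List[str] = field(default_factory=list)  # doctrine, standards, constraints
```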

This conceptual model is applicable at all levels of the hierarchy within the enterprise.  This scenario conceptual model supports defining an entity, simulating the entity, and serving as the context to verify an entity instance.

Thoughts on using the “Context Specific Value Specification” concepts of SysML

The discussion of the “Context Specific Value Specification” feature of the SysML language is quite limited in the OMG SysML v1.5 specification.  I wouldn’t be surprised if many weren’t even aware of its presence.  I want to share some ways I’ve thought of employing this feature of the language to support Model-Based Test.  I hope I’ve used the concept in conformance to the language specification.  Feedback on any flaws is appreciated.

I use No Magic’s Cameo Systems Modeler for my modeling and there is support for this SysML feature, albeit not without issue.

I employ context specific value specifications to specify the properties of the entities in a bounded context of the system’s problem space.  This enables the “Test Model” to trace explicitly to the problem space definitions, to construct test components traceable to the system definition, and to trace the value specifications of test vectors in the Test Context’s “Data Pool” (see the OMG UTP for the definition).
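
Outside the modeling tool, the idea can be approximated as a value bound to a property only within a given usage context, which is what allows a test vector to trace back to the bounded-context definition rather than to a free-floating literal.  The Python sketch below is an analogy under that assumption; names such as ExemplarRoad.drivingLane and width_m are invented for illustration and are not CityGML or SysML syntax.

```python
from dataclasses import dataclass
from typing import Dict, Tuple


@dataclass(frozen=True)
class ContextSpecificValue:
    context: str    # the bounded context / usage in which the value applies
    element: str    # the property owner, e.g. "ExemplarRoad.drivingLane" (hypothetical)
    prop: str       # the property, e.g. "width_m" (hypothetical)
    value: float


class ValueRegistry:
    """Looks up property values by usage context, preserving traceability."""

    def __init__(self) -> None:
        self._values: Dict[Tuple[str, str, str], ContextSpecificValue] = {}

    def bind(self, csv: ContextSpecificValue) -> None:
        self._values[(csv.context, csv.element, csv.prop)] = csv

    def resolve(self, context: str, element: str, prop: str) -> ContextSpecificValue:
        return self._values[(context, element, prop)]


if __name__ == "__main__":
    reg = ValueRegistry()
    reg.bind(ContextSpecificValue("RuralHighwayContext", "ExemplarRoad.drivingLane", "width_m", 3.6))
    reg.bind(ContextSpecificValue("UrbanStreetContext", "ExemplarRoad.drivingLane", "width_m", 3.0))
    # A test vector resolves its value through the context, not a hard-coded literal.
    print(reg.resolve("RuralHighwayContext", "ExemplarRoad.drivingLane", "width_m").value)
```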

I’ll provide some partial examples in this post.  Please understand these are incomplete examples for the purpose of provoking thoughts on the topic.

Several of the diagrams I’ll be presenting contain data modeling concepts based on the Open Geospatial Consortium (OGC) City Geography Markup Language (CityGML) Encoding Standard V2.0.0.  This standard supports the storage and exchange of virtual 3D city models.  I’ve employed only a small fraction of the concepts to represent fundamental properties of a roadway.

Here are the fundamental domain concepts supporting the modeling of a “Transportation Object”, specifically a roadway.

The above illustration conveys the relationships between the elements of the domain’s reference model (e.g., _TransportationObject, Transportation Complex, Traffic Area, Auxiliary Traffic Area, and several representative sub-classes) and their semantics.

The Traffic Area domain block

The Auxiliary Traffic Area

 

Several Instance specifications for elements of an exemplar roadway

Applying the instance specifications to the Default Value property of Part property specifications of the «block» Exemplar Road

The resulting bdd view of the Exemplar Road and its parts.

Note that the instance specifications in the illustration above are of Traffic Area and Auxiliary Traffic Area, yet in the bdd above the part properties of the Exemplar Road have been redefined using sub-classes of Traffic Area and Auxiliary Traffic Area.  This may be unnecessary, but I chose to include redefinition with sub-class properties in this example.

Why would I go to all of this work defining the specifics of a “Bounded Context” (an intended use problem space)?  First, I have defined the intended context of use of the system with considerable specificity.  Second, I should be using these contexts as the referent for the properties of the test components of a test context (see the OMG UTP specification for definitions) responsible for integration, verification, and validation of the system under development.

Here is an unrelated example of a data pool, represented in table format, of the vehicles which might be operating on this Exemplar Road, along with some initial properties of those vehicles.

The entire table represents the UTP concept of “Data Pool” with each column a “Data Partition” and each row a set of test vectors for a Test Case.  Some specifications are incomplete (empty cells).
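
A minimal sketch of the data pool idea follows (hypothetical Python, not the UTP metamodel): partitions are the columns, each row is the test vector handed to one test case run, and a simple selector picks rows.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional, Sequence


@dataclass
class DataPool:
    """Columns are data partitions; each row is a test vector for one test case run."""
    partitions: Sequence[str]                 # e.g. ("vehicle_type", "speed_kph", "lane")
    rows: List[Dict[str, Optional[object]]]   # None models an incomplete (empty) cell

    def select(self, partition: str, value: object) -> List[Dict[str, Optional[object]]]:
        """A trivial data selector: rows whose partition matches the given value."""
        return [row for row in self.rows if row.get(partition) == value]


if __name__ == "__main__":
    pool = DataPool(
        partitions=("vehicle_type", "speed_kph", "lane"),
        rows=[
            {"vehicle_type": "sedan", "speed_kph": 100, "lane": "driving"},
            {"vehicle_type": "truck", "speed_kph": 80, "lane": "driving"},
            {"vehicle_type": "sedan", "speed_kph": None, "lane": "emergency"},  # incomplete cell
        ],
    )
    for vector in pool.select("vehicle_type", "sedan"):
        print(vector)
```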

The illustration below is a fragment of a specific bounded context specification.

This illustration conveys how the imported context specification could be specialized to a Test Context specification, including the test components.

As each Test Case is instantiated, its associated test vector data is pulled from one or more Data Pools associated with the Test Context.  The Data Pool (see the Data Pool table above) in this example does not include a partition containing the expected observation of the SUT in response to test inputs.  This particular table relates only to properties of two of the test components; this example is typical of preliminary test design, where the system’s intended use context has been imported to support test vector and test component specification.  Hopefully you can fill in the blanks yourself.

Hopefully I’ve provided enough explanation to aid your comprehension, and perhaps you can share your thoughts on refinement of these ideas.

Thoughts on the development stages of a Test Case

From my perspective of the OMG’s UML Testing Profile

Conceptual Design Stage

The System Architecture description model and its sub-systems’ architecture description models own the test case conceptual design: in version 1.4 of the language standard, each requirement subject to verification has a SysML «Verify» dependency to a TestCase model element.  A Test Case in the UML Testing Profile (UTP) extends the UML metaclasses Behavior and Operation.  A Test Case ALWAYS returns a Verdict.
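
UTP verdicts are typically pass, fail, inconclusive, and error.  The sketch below (hypothetical Python, not generated from the profile) illustrates the “always returns a Verdict” rule by wrapping an arbitrary behavior so that it can only ever yield a verdict.

```python
from enum import Enum
from typing import Callable


class Verdict(Enum):
    PASS = "pass"
    FAIL = "fail"
    INCONCLUSIVE = "inconclusive"
    ERROR = "error"


def as_test_case(behavior: Callable[[], bool]) -> Callable[[], Verdict]:
    """Wrap a behavior so that it always yields a Verdict, never a bare result."""
    def test_case() -> Verdict:
        try:
            return Verdict.PASS if behavior() else Verdict.FAIL
        except Exception:
            return Verdict.ERROR
    return test_case


if __name__ == "__main__":
    verify_requirement = as_test_case(lambda: 2 + 2 == 4)  # stand-in for a real check
    print(verify_requirement())  # Verdict.PASS
```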

[Figure: TestObjectiveSpecificationAtSRRb]

Each requirement subject to verification should have such a dependency

The TestCase is named and it has a dependency relationship to its TestObjectiveSpecification model element per the UTP v1.2

A TestRequirement model element may be used to refine the TestObjectiveSpecification model element and act as a bridge between the System of Interest model and the Test System model

[Figure: TestReqCon]

Note: The UTP 2 specification addresses maturing concepts in these elements of a test specification

The TestObjectiveSpecification expresses the conceptual design of the TestCase in a natural language statement.  This statement identifies the most important components of the TestCase from the perspective of the Accepting Stakeholder’s concerns.  The components are: the pre- and invariant conditions of the system under test (SUT) environment, the SUT and its interacting Actors or Systems, their input(s) to the SUT, and the expected behavior of the SUT in response to its inputs.

This collection of components expresses the ‘acceptance criterion’ of the Accepting Stakeholder.  A TestCase, satisfying its TestObjectiveSpecification, will produce objective evidence verifying, to the satisfaction of the Accepting Stakeholder, that the system requirement has been satisfied by the responsible system feature.

The goal of the Test Objective is to articulate an abstract, yet understandable and unambiguous, description of the purpose of the TestCase.  The TestObjectiveSpecification does not state ‘How’ to implement the TestCase.  The conceptual design treats the TestCase as a “black box”.
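
The components listed above could be captured alongside the natural-language statement roughly as follows; the field names are assumptions made for illustration, since UTP itself prescribes only the specification, not this structure.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class TestObjectiveSpecification:
    """Conceptual design of a TestCase, stated from the Accepting Stakeholder's view."""
    statement: str                                             # the natural-language test objective
    preconditions: List[str] = field(default_factory=list)    # pre / invariant conditions of the SUT environment
    sut: str = ""                                              # the system under test
    actors: List[str] = field(default_factory=list)           # interacting Actors or Systems
    inputs: List[str] = field(default_factory=list)           # their inputs to the SUT
    expected_behavior: str = ""                                # expected SUT response; the 'what', never the 'how'


# Hypothetical example, not drawn from any real requirement set.
example = TestObjectiveSpecification(
    statement=("Verify that the SUT reports a configuration summary "
               "when a valid report request is received on its service interface."),
    preconditions=["SUT initialized", "service interface available"],
    sut="System of interest",
    actors=["Test operator"],
    inputs=["configuration report request"],
    expected_behavior="A configuration report is returned within the required time.",
)
```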

The Test System Architecture imports the conceptual design of a TestCase as the foundation for the development of the Test Design Specification for the TestCase

[Figure: TestObjectiveSpecificationAtSRRa]

A conceptual test case design may be realized by a collection of test cases by the Test System

[Figure: TestCaseConcepta]

Logical Design Stage

The Test Specification of the System Architecture has the responsibility for the Test Design Specification of each conceptual test case imported from the system architecture

A logical design identifies the components of a test case’s structure and behavior and their relationships.

The logical design activity is an iterative activity ending when a specification can be realized

[Figure: TestObjectiveSpecificationAtSFRa]

Test architecture and high-level test design are performed

Engineering tasks include: requirements analysis, evaluation, allocation, and component specification

The role each component plays in its test case, and its test objective responsibilities, are defined and traced

A logical design should not specify concrete values for test data properties.  Instead, specify concepts for test data properties (e.g., in bounds, at boundary, out of bounds)

[Figure: TestReqLog]

Logical properties can be realized as concrete data values through transformation rules
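
A sketch of such a transformation rule is shown below (hypothetical Python; the property, its valid range, and the chosen boundary offsets are invented for illustration): the logical design names only equivalence classes such as “in bounds”, “at boundary”, or “out of bounds”, and the rule turns each class into concrete candidate values.

```python
from typing import Dict, List, Tuple

# Hypothetical valid range for a test data property, e.g. a speed in km/h.
VALID_RANGE: Tuple[float, float] = (0.0, 120.0)


def realize(logical_class: str, valid_range: Tuple[float, float]) -> List[float]:
    """Transform a logical data class into concrete candidate values."""
    low, high = valid_range
    rules: Dict[str, List[float]] = {
        "in bounds": [(low + high) / 2.0],          # a nominal interior value
        "at boundary": [low, high],                 # both edges of the valid range
        "out of bounds": [low - 1.0, high + 1.0],   # just outside each edge
    }
    return rules[logical_class]


if __name__ == "__main__":
    for cls in ("in bounds", "at boundary", "out of bounds"):
        print(cls, "->", realize(cls, VALID_RANGE))
```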

Allocate a TestCase to its owning TestContext

A TestContext owns 1 or more TestCases having a common test configuration and a common composite TestObjectiveSpecification

Test component / SUT connections are detailed by the Test Configuration of the TestContext

Typically an internal block diagram when using SysML

Document the execution sequence/schedule of a TestContext’s TestCases
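
A test context’s schedule and verdict handling might be sketched as follows (hypothetical Python; UTP delegates verdict combination to an arbiter, reduced here to a simple “worst verdict wins” rule).

```python
from enum import IntEnum
from typing import Callable, List


class Verdict(IntEnum):
    PASS = 0
    INCONCLUSIVE = 1
    FAIL = 2
    ERROR = 3


class TestContext:
    """Owns test cases that share a test configuration; runs them on a fixed schedule."""

    def __init__(self, test_cases: List[Callable[[], Verdict]]) -> None:
        self.test_cases = test_cases

    def run(self) -> Verdict:
        overall = Verdict.PASS
        for case in self.test_cases:          # the documented execution sequence
            overall = max(overall, case())    # simple arbiter: worst verdict wins
        return overall


if __name__ == "__main__":
    ctx = TestContext([lambda: Verdict.PASS, lambda: Verdict.INCONCLUSIVE])
    print(ctx.run().name)  # INCONCLUSIVE
```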

Specify requirements for: the Test Environment, Test Components, test tools, test data, etc…

The Test Case Specification is fully described and can be realized

Concrete Design Stage

The Test Specification of the System Architecture has the responsibility for the Test Case Specification of each test case specified by the Test Design Specification

Define test data pools, partitions and selectors

Detailed design and specification of test data values

Detailed design and specification of the test environment

The physical design of a test case defines component deployment in the test environment

Identify constraints / limitations of the test environment, data flows, or execution infrastructure

The Test Case Specification is complete when an implementation can be realized

Interface Stimulator Services Enterprise

Identification
The Interface Stimulator Services Enterprise (ISSE) is a loosely coupled, modular set of simulation/stimulation service components and a core set of management services. The stimulator service components interact with SUT interfaces that normally exchange information with interfaces external to the SUT. The ISSE provides test capabilities to the integration, verification, and validation life cycle processes. The ISSE is employable with test automation tools (e.g., Quality Center) and is planned for employment as a component of the system simulation trainer element.


Overview of the Proposed System
The primary objective of the ISSE is to provide information exchange stimulation capability to all information exchange interfaces external to the system. This is a fundamental enabler of basic system integration and verification operations. The interface stimulation components resident in the ISSE provide a web services managed interface to models, simulations, tools, databases, and hardware used to stimulate the system’s interfaces. The stimulator service components are loosely coupled, and management of the stimulator service components is via core infrastructure services.

ISSE Overview
The ISSE is a fundamental enabler of integration and verification activities of system interfaces. Secondary design objectives are: to support integration and verification activities by simulating interfaces, to support system integration and verification activities by simulating information exchange interfaces external to the system, and to support trainer operations by stimulating all segment information exchange interfaces external to the segment.

[Figure: ISSE_SIL]
The design of the ISSE capability set supports evolution to fulfill the operational needs of system data and control interfaces. This expansion of ISSE functionality is largely dependent on evolving program objectives in the area of interoperability validation.
The final ISSE spiral is the trainer product to support training of operators, maintainers, and supervisors. In this deliverable instantiation the ISSE is a component of the trainer element.
Each stimulator component possesses its own web services management interface and the ISSE provides common services to manage infrastructure and the stimulator components. In addition to stimulation services, data logging functionality with time stamping is in the design plan for all stimulator components to support artifact collection automation.
Users (test personnel, instructors, supervisors) can connect and configure the stimulator components (models, simulations, tools, hardware) with specific data and parameter values to compose a stimulation environment to perform a test or conduct a training exercise. The ISSE is capable of archiving service orchestrations for future reference. Test automation tools like HP’s Quality Center can employ service request transactions directly with the ISSE’s core services interface to invoke archived orchestrations.

[Figure: ISSE_Realized]
A design objective is to provide the capability to interface with virtually any model, simulation, tool, database, or hardware. The initial development spiral will only incorporate models and simulations that match the need to integrate and verify system entities. The models and simulations developed for or incorporated into the ISSE will have varying levels of fidelity:
High – Appears to be reality from a system perspective; dynamic and variable behaviors.
Medium – System interactions have a visible but limited effect on behaviors.
Low – Correctly formatted static data without dynamic variability of behaviors.
To manage cost and risk, the functionality and fidelity of stimulator components will not evolve past the point where the components are suitable for integration and verification activities. Low and medium fidelity should suffice for many of the models and simulations. If additional functionality or greater fidelity is required to meet training and operational support objectives, the infusion of additional funding over and above that required for system integration and verification alone will be necessary.

Architecture & Design
Employ SOA design patterns and web services to manage simulation / stimulator components and component strings. Employ open standards from standards bodies with demonstrated domain influence. The Object Management Group is a prime example of one such body. Maintain loose coupling of the simulation / stimulator components.
Maintain a focus on the future evolution spiral of simulation / stimulator components and core services of the ISSE. Keep in mind the evolutionary spiral of the trainer, model use in the tactical applications supporting operations planning, and development of distributed test beds for developmental/operational test and evaluation (D/OT&E) of the total enterprise.

Background, Objectives, and Scope
The ISSE engineering effort is responsible for capturing the capability needs of the element, trainer, segment, operational support models, and system, translating these needs to requirements, and designing a capability that can evolve to provide those needs to the foreseen end state. The systems engineering effort is also responsible for identifying models for use in simulator component development (process models) and operation (parametric models). The implementation is limited to the capability set required to integrate and verify the segment.
The segment of interest engages in system data exchanges with entities external to the system. The possible number of external entities exceeds 10,000 instances. The integration and verification activities require information exchange stimulators to succeed in the testing of these interfaces.
COTS tools exist that may satisfy basic integration and verification of interfaces, at the element level, that exclusively engage in SOA system data exchanges at a single interface. In verification situations where coordination of complex exchanges of information occurs at multiple interfaces, existing COTS tools may prove inadequate. This requirement may emerge where system data exchanges at multiple interfaces require orchestration with a complex operational mission scenario. Coordinated scripts may suffice, but they may be subject to high maintenance overhead as scenarios evolve or change.
Realization of a distributed test bed concept mandates employment of advanced interface stimulator capabilities to bridge segment interfaces to Distributed Interactive Simulation (DIS), High Level Architecture (HLA), or Test and Training Enabling Architecture (TENA) simulation / experiment environments. The complexity of a virtual system of systems experiment environment is unlikely to be supportable using simple scripted system data exchanges.
The objective of this effort is to define these near-term and far-term capabilities and develop only those essential near-term capabilities for segment integration and verification.
Operational Description

Near-Term Capability Description
The ISSE employs a core web services interface. External service consumers interact with this interface to obtain customized stimulator services. Provision for direct access to the stimulator component web services interface is required. This requirement supports component re-use in other infrastructures.
The ISSE employs composable stimulator components. The components feature a web services interface that serves to encapsulate the model, simulation, database, etc., and provides for composition of the stimulator component at environment build time. Modification of the stimulator application at run time is not required. Control of the stimulator component is required during run time. There is a clear distinction between component composition and component control. Composition implies the creation of service chains, links to external data resources, or similar configuration of complex behavior models and simulations; actions that are difficult or impossible to orchestrate in real time. This is different from the simple exposure of a service or process control at the service interface or through a proprietary simulation control interface.
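
The distinction might be expressed as two separate groups of operations on a stimulator component, as in the hypothetical Python sketch below: composition is permitted only at environment build time, while control operations remain available at run time. The component and parameter names are invented for illustration.

```python
class StimulatorComponent:
    """Wraps a model, simulation, database, or hardware behind a service interface."""

    def __init__(self, name: str) -> None:
        self.name = name
        self._composed = False
        self._running = False
        self._parameters: dict = {}

    # Composition: build-time only (service chains, data sources, model wiring).
    def compose(self, parameters: dict) -> None:
        if self._running:
            raise RuntimeError("composition is not permitted at run time")
        self._parameters = dict(parameters)
        self._composed = True

    # Control: permitted at run time (start, stop, simple exposed settings).
    def start(self) -> None:
        if not self._composed:
            raise RuntimeError("component must be composed before it can run")
        self._running = True

    def stop(self) -> None:
        self._running = False


if __name__ == "__main__":
    comp = StimulatorComponent("radar_track_feed")  # hypothetical component name
    comp.compose({"data_source": "archived_scenario_42", "update_rate_hz": "1.0"})
    comp.start()
    comp.stop()
```
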
The ISSE interfaces to the user’s GUI through the bus’ core web services interface. Infrastructure services support management of the bus and stimulator components, as well as composition of the bus and stimulator components.
The ISSE provides an environment configuration report service. This service provides information relating to stimulator component composition data, model or simulation version data, database version data, and bus orchestration and component deployment data.
The ISSE provides a simulator component composition service. The simulator component composition service provides the service consumer the capability to control those simulation elements exposed to service consumers. This provides a level of service customization.
The ISSE provides a bus orchestration service. This service coordinates the behaviors of the stimulator components.
The ISSE provides a service consumer notification service. An event notification service provided to service consumers.
The ISSE provides a simulator component deployment service. Supports automated deployment of a set of stimulator components.
The ISSE and stimulator components have configurable data monitoring, acquisition, and storage capabilities.
The ISSE supports third party stimulator service requests through its native web services interface. Third party applications may be COTS Test automation tools capable of interacting with a SOA interface.
The ISSE supports interaction directly with a stimulator component by a third party application.
The ISSE supports real-time stimulation of segment interfaces.
The ISSE provides the capability to stimulate segment interfaces in non-real time as well as with a modified epoch time reference point.
The ISSE supports automated operation by software test automation tools such as HP Quality Center via the bus’ core web services interface.
The ISSE provides an automated archive service.
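
Taken together, the near-term capabilities above suggest a core service interface along the following lines. This is a hypothetical sketch; the method names are invented here and are not taken from any ISSE specification.

```python
from abc import ABC, abstractmethod
from typing import Dict, List


class IsseCoreServices(ABC):
    """Abstract view of the ISSE core web services interface (illustrative only)."""

    @abstractmethod
    def environment_configuration_report(self) -> Dict[str, str]:
        """Component composition, model/simulation/database versions, deployment data."""

    @abstractmethod
    def compose_component(self, component: str, parameters: Dict[str, str]) -> None:
        """Simulator component composition service."""

    @abstractmethod
    def orchestrate(self, components: List[str], schedule: List[str]) -> None:
        """Bus orchestration service: coordinate stimulator component behaviors."""

    @abstractmethod
    def subscribe(self, consumer: str, event: str) -> None:
        """Service consumer notification service."""

    @abstractmethod
    def deploy(self, components: List[str]) -> None:
        """Simulator component deployment service."""

    @abstractmethod
    def archive(self, run_id: str) -> str:
        """Automated archive service; returns a reference to the archived artifacts."""
```
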
Future Capability Description
System program documents convey the concept of developing a “system in the loop” test capability for the system prior to operational deployment. A “test bed” supports the system level test capability.
The concept of a test bed implies that a system or portion of a system is placed in an environment that exercises the system under test (SUT) as if it were in a real world situation. The test bed provides interfaces to other systems and/or portions of the same system that stimulate the SUT and accept and respond to outputs from the SUT.
In moving closer toward the real world environment, network components that are geographically distributed will comprise the test bed, removing the collocation requirements needed for earlier testing.
To establish the system test and verification environment one must identify a system test bed where the entire set of entities can be assembled and connected in their near operational state. This can be a virtual environment made up of several Integration Laboratories or it can be a physical single site.
The ISSE fits into the above system test environment as a part of the integration lab. The ISSE may evolve into a component of a test bed as testing evolves, allowing the use of actual external systems rather than simulators.
The system distributed test bed concept extends the integration lab to support program objectives in future development spirals of the Interface Stimulation Service Bus.

Definitions
Automated Archive Service – Archives are analogous to logs. There is a probability that artifacts will be required for test evidence or debugging. This service automates the collection and organization of the data logged by the different interface stimulator components, whatever they may have collected. It wraps it all up in a neat package and may even submit it to a CM service interface.
Bus Orchestration Service – If there is a need to have behaviors synchronized at various interface stimulator components, this is the service that is responsible for it. This service may be the timing service itself, or is very tightly coupled to it. In an HLA context it is similar to what the RTI is responsible for.
Component Strings – A component string is a concept where 2 or more atomic service components are engaged in a service contract to provide a complex service.
Composition – The creation of service chains, links to external data resources, or similar configuration of complex behavior models and simulations.
Environment Configuration Report Service – captures the test component environment attributes. Tool version, operating system, serial numbers of hardware. Supports re-execution of a test precisely as it was originally instantiated.
ISSE (Interface Stimulator Services Enterprise) – A loosely coupled, modular set of simulation/stimulation service components and a core set of management services.
Service Consumer Notification Service – This is the service that notifies the service consumer about alerts and could even provide the service output.
Simulator Component Composition Service – A service that employs an intelligent agent to assist a service consumer in composing another service. Services that publish their internal service process models can be customized to a specific consumer’s needs via this service mechanism.
Simulator Component Deployment Service – A service that might be employed to deploy atomic services to a test bed. Conversely, the infrastructure team may deploy the services and then this service is not required.
Stimulation Components – Models, simulations, tools, and hardware that provide the stimulus for the external interfaces.
Stimulator Components – See Stimulation Components.