Computer-aided specification and design tools
What is it?
Computer-aided specification and design tools are applications that help teams capture, structure, analyze, and maintain the requirements and design of safety-related systems. IEC 61508 (technique B.2.4, described in Part 7 and referenced from the Part 3 technique tables) recognizes several families: method-independent specification tools; model-oriented tools with hierarchical analysis (e.g., state machines, block diagrams, differential equations); entity-relationship-attribute (ERA) data models; and stimulus–response notations. Good tools provide automated inspections (e.g., missing cases, contradictions), simulation or animation, and strong traceability from safety requirements to design decisions.
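A stimulus–response notation lends itself directly to the kind of automated inspection described above. The sketch below shows a minimal completeness check over a hypothetical stimulus–response table; the stimulus names, responses, and the deliberately omitted case are all illustrative, not from any standard.

```python
from itertools import product

# Hypothetical stimulus–response table: each combination of boolean stimuli
# maps to a required response. Names and values are illustrative only.
STIMULI = ("high_pressure", "high_temperature")
TABLE = {
    (False, False): "RUN",
    (True,  False): "SHUTDOWN",
    (False, True):  "SHUTDOWN",
    # (True, True) deliberately left unspecified so the check fires
}

def missing_cases(table, n_stimuli):
    """Return stimulus combinations for which no response is specified."""
    return [c for c in product((False, True), repeat=n_stimuli)
            if c not in table]

print(missing_cases(TABLE, len(STIMULI)))  # -> [(True, True)]
```

A real tool performs the same kind of exhaustive case enumeration, but over richer notations (typed inputs, mode conditions, timing constraints) rather than plain booleans.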
How it supports functional safety
These tools directly target systematic failures by reducing ambiguity, inconsistency, and incompleteness in specifications and designs. Automated checks and model analysis expose defects before implementation, preventing latent errors from propagating into code, tests, and operations. When models include diagnostics and data handling, they can also reveal manifestations of random or common-cause hardware faults in signals (e.g., impossible state transitions, out-of-range values), helping to ensure the safety function does not silently act on corrupted information.
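The signal-level checks mentioned above can be sketched as simple plausibility predicates. The thresholds, state names, and transition table below are assumptions made for illustration, not values from IEC 61508.

```python
# Illustrative plausibility checks: reject out-of-range readings and
# impossible state transitions before the safety function acts on them.

# Which new states are reachable from each old state (hypothetical model).
VALID_TRANSITIONS = {
    "RUN":              {"RUN", "SHUTDOWN_PENDING"},
    "SHUTDOWN_PENDING": {"SHUTDOWN_PENDING", "SHUTDOWN"},
    "SHUTDOWN":         {"SHUTDOWN"},
}

def plausible_pressure(bar: float) -> bool:
    """Range check: the sensor physically cannot report outside 0-400 bar."""
    return 0.0 <= bar <= 400.0

def plausible_transition(old: str, new: str) -> bool:
    """An impossible transition (e.g. SHUTDOWN -> RUN) suggests corruption."""
    return new in VALID_TRANSITIONS.get(old, set())
```

In a modeled design, such predicates would be attached to interfaces so the tool can verify that every consumer of a signal handles the implausible cases.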
When to use
- During system/software requirements and architectural design, especially for SIL-rated functions.
- When multiple abstraction levels (system → subsystem → component) must stay consistent.
- Where interfaces are complex (sensor fusion, interlocks, communication protocols).
- When formal traceability and verification evidence are needed for assessment/audit.
- When change impact analysis must be reliable and quick.
Inputs & Outputs
Inputs
- Functional & safety requirements (including hazards, safety goals, constraints)
- Operational scenarios, interfaces, assumptions, and environmental conditions
- Standards/policies for modeling, naming, and traceability
Outputs
- Structured specifications and/or models with clear semantics
- Analysis reports (ambiguity, completeness, consistency, coverage)
- Traceability links (requirements ⇄ architecture ⇄ verification)
- Review and simulation/animation artifacts for validation
Procedure
- Select the tool family that matches your objective and SIL context:
  - Method-independent specification (structured editors, traceability, reviews)
  - Model-oriented (state machines, block/continuous-time models, hybrid models)
  - ERA data models (entities, relationships, attributes for data-rich domains)
  - Stimulus–response (formal tables/logic for event-driven behavior)
- Establish modeling rules: notation, naming, levels of abstraction, and traceability conventions (IDs, link types, change control).
- Capture requirements and design in the tool; link safety requirements to hazards and to architectural elements.
- Run automated checks (missing transitions, conflicting constraints, orphaned/duplicate requirements, undefined interfaces).
- Validate with stakeholders using simulations/animations or scenario walk-throughs; record review findings in the tool.
- Maintain alignment as design evolves: enforce bi-directional traceability and re-run checks after changes.
- Generate evidence for assessment: exports of traceability matrices, analysis reports, and review records.
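The orphaned/duplicate-requirement check in the procedure above can be sketched in a few lines. The traceability schema (requirement IDs, a `hazard` link, a `verified_by` list) is an assumption for illustration; real tools expose similar queries over their link databases.

```python
# Hypothetical traceability records: each requirement should link upstream
# to a hazard and downstream to at least one verification item.
REQUIREMENTS = {
    "SR-001": {"hazard": "HAZ-01", "verified_by": ["TC-010"]},
    "SR-002": {"hazard": "HAZ-02", "verified_by": []},          # no test
    "SR-003": {"hazard": None,     "verified_by": ["TC-011"]},  # no hazard
}

def orphaned(reqs):
    """Requirements missing an upstream hazard or downstream verification."""
    return sorted(rid for rid, r in reqs.items()
                  if r["hazard"] is None or not r["verified_by"])

print(orphaned(REQUIREMENTS))  # -> ['SR-002', 'SR-003']
```

Re-running such a query after every change, as the procedure recommends, keeps the link set clean as the design evolves.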
Worked Example
High-level
Scenario: A high-integrity process shutdown function must close valves on high pressure or high temperature. The team models the logic using a hierarchical state machine in a model-oriented tool and documents interlock conditions in a stimulus–response table. The tool flags unspecified transitions from “Voting=2oo3 FAIL” to “Shutdown Pending,” and highlights an orphaned requirement with no linked verification. After review, the team resolves the missing transition and adds a test case. Result: early removal of ambiguity and stronger evidence for SIL justification.
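The 2oo3 voting logic at the heart of the scenario is simple enough to sketch directly. This is an illustrative fragment of the voter only, not the modeled shutdown function.

```python
def vote_2oo3(a: bool, b: bool, c: bool) -> bool:
    """2-out-of-3 voting: demand shutdown when at least two of the
    three redundant channels report a trip condition."""
    return a + b + c >= 2  # Python bools sum as 0/1
```

The tool's value in the scenario lies one level up: it checked that every voter outcome, including the degraded "Voting=2oo3 FAIL" state, has a specified transition in the shutdown state machine.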
Tooling walk-through
- Capture safety requirements in a requirements tool and link them to hazards and acceptance criteria.
- Model sensor processing and shutdown logic in a state-based/model-based tool; auto-check for unreachable states and missing transitions.
- Document data entities (sensors, votes, alarms) in an ERA model to clarify ownership and interface contracts.
- Generate traceability & coverage reports for the assessor.
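The final step, generating coverage reports for the assessor, amounts to walking the link chain (hazard → requirement → design element → verification) and flagging gaps. All IDs and link data below are invented for the sketch.

```python
# Hypothetical link chains; None marks a missing link in the chain.
LINKS = [
    ("HAZ-01", "SR-001", "STM-ShutdownLogic", "TC-010"),
    ("HAZ-02", "SR-002", "STM-VotingBlock",   None),   # gap: no verification
]

def coverage_report(links):
    """Render the link chains as a CSV-style coverage table."""
    rows = ["hazard,requirement,design,verification,complete"]
    for hazard, req, design, verif in links:
        complete = all(x is not None for x in (hazard, req, design, verif))
        rows.append(f"{hazard},{req},{design},{verif or '-'},{complete}")
    return "\n".join(rows)

print(coverage_report(LINKS))
```

A commercial tool produces the same information as a traceability matrix; the point is that the report is generated from the live link data, not maintained by hand.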
Quality criteria
- Clarity & semantics: Notation is defined; every model element and requirement is unambiguous and reviewable by domain experts.
- Completeness & consistency: Automated checks show no missing cases, no contradictions, and no orphaned or duplicate items.
- Traceability: End-to-end links (hazard → safety requirement → design element → verification) are complete and current.
- Change control: Versioning, baselines, and impact analysis are used and auditable.
- Evidence quality: Reports are reproducible; review records are linked to specific artifacts and decisions.
Common pitfalls
- Over-modeling (too much detail): Keep the level of abstraction aligned to decisions and hazards; defer implementation detail to later artifacts.
- Tool without method: Define modeling rules and review checklists; otherwise automated checks give a false sense of security.
- Poor traceability hygiene: Enforce link types and IDs; run periodic orphan/duplicate reports.
- Insufficient expertise: Train practitioners and appoint modeling “owners” for consistency and quality.
- No evidence strategy: Plan upfront how reports, baselines, and review records will feed the safety case.
FAQ
What concrete tools fit each family?
Method-independent specification: IBM DOORS / DOORS Next, Jama Connect. Model-oriented: MATLAB/Simulink & Stateflow, SCADE Suite, Enterprise Architect (statecharts). ERA modeling: ER/Studio, Enterprise Architect (ER/UML). Stimulus–response: SCR Toolset and similar tabular/contract-based notations.
Do these tools replace reviews or testing?
No. They complement structured reviews and verification. Use them to make reviews sharper and tests more targeted by exposing gaps early.
Are they mandatory under IEC 61508?
They are recommended techniques/measures. Selection and rigor should reflect the claimed SIL and the project’s risk profile.