## Overview
Formal Verification (FV) controls apply rigorous mathematical and logical methods to verify security properties. These controls are optional but encouraged for OSSASAI-L3 deployments and for platform maintainers who want the highest confidence in security guarantees.
Note: Where a platform provides formal models, these controls provide normative evidence for core invariants (auth, session isolation, tool gating, misconfiguration safety).
---

## OSSASAI-FV-01: Formal Model Integration
### Metadata
| Attribute | Value |
|-----------|-------|
| **Control ID** | OSSASAI-FV-01 |
| **Requirement Level** | SHOULD (L3) |
| **Assurance Levels** | L3 |
| **Trust Boundary** | Cross-cutting |
| **Purpose** | High-assurance proof of security properties |
### Requirement
For OSSASAI-L3 and for platform maintainers, formal methods MAY be used as normative evidence for core invariants. Formal checks SHOULD be integrated into CI for security-sensitive changes.
### Evidence
- CI logs showing formal verification runs
- Model artifacts (TLA+, Alloy, etc.)
- Verification results and bounds
### Checks
- Formal models compile and verify successfully
- No invariant violations detected
```bash
# Run TLA+ model checking
tlc -config AuthModel.cfg AuthModel.tla

# Verify session isolation model
tlc -deadlock SessionModel.tla

# Check tool gating invariants (report coverage every minute)
tlc -coverage 1 ToolGatingModel.tla
```
### Implementation

**TLA+ Models:**
Example formal model for authentication:
```tla
---- MODULE AuthModel ----
EXTENDS Integers, Sequences, TLC
CONSTANTS Users, Tokens, Sessions, NULL
VARIABLES
    authenticated,  \* Set of authenticated users
    activeTokens,   \* Map: token -> user
    activeSessions  \* Map: session -> {user, token}

TypeInvariant ==
    /\ authenticated \subseteq Users
    /\ activeTokens \in [Tokens -> Users \cup {NULL}]
    /\ activeSessions \in [Sessions -> [user: Users, token: Tokens] \cup {NULL}]

\* Security Invariant: No session without valid token
SessionAuthInvariant ==
    \A s \in Sessions:
        activeSessions[s] # NULL =>
            /\ activeSessions[s].token \in DOMAIN activeTokens
            /\ activeTokens[activeSessions[s].token] = activeSessions[s].user

\* Security Invariant: Authenticated users have valid tokens
AuthTokenInvariant ==
    \A u \in authenticated:
        \E t \in DOMAIN activeTokens: activeTokens[t] = u
====
```
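The `-config AuthModel.cfg` flag above presupposes a TLC configuration file binding the declared constants and naming the invariants to check. A minimal sketch follows; the model values (`u1`, `t1`, `s1`, and so on) are illustrative assumptions, and a complete configuration would also name an `INIT`/`NEXT` pair or a `SPECIFICATION`:

```cfg
\* AuthModel.cfg -- illustrative; bounds should match the conformance report
CONSTANTS
    Users = {u1, u2, u3}
    Tokens = {t1, t2, t3, t4, t5}
    Sessions = {s1, s2}
    NULL = NULL
INVARIANTS
    TypeInvariant
    SessionAuthInvariant
    AuthTokenInvariant
```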
**Session Isolation Model:**

```tla
---- MODULE SessionModel ----
EXTENDS Integers, Sequences, FiniteSets
CONSTANTS Peers, Channels, Sessions
VARIABLES
    sessionMap,   \* Map: (peer, channel) -> session
    sessionData,  \* Map: session -> set of data items
    sessionOwner  \* Map: session -> peer

\* Invariant: Sessions are isolated per peer
IsolationInvariant ==
    \A s1, s2 \in DOMAIN sessionData:
        s1 # s2 /\ sessionOwner[s1] # sessionOwner[s2] =>
            sessionData[s1] \cap sessionData[s2] = {}

\* Invariant: No cross-peer data access
CrossPeerInvariant ==
    \A p1, p2 \in Peers, c1, c2 \in Channels:
        p1 # p2 =>
            LET s1 == sessionMap[p1, c1]
                s2 == sessionMap[p2, c2]
            IN sessionData[s1] \cap sessionData[s2] = {}
====
```
**CI Integration:**
```yaml
# .github/workflows/formal-verification.yml
name: Formal Verification
on:
  push:
    paths:
      - 'src/auth/**'
      - 'src/session/**'
      - 'src/tools/**'
      - 'models/**'
jobs:
  tla-check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install TLA+ Tools
        run: |
          wget -q https://github.com/tlaplus/tlaplus/releases/download/v1.8.0/tla2tools.jar
          echo "TLA_JAR=$(pwd)/tla2tools.jar" >> "$GITHUB_ENV"
      - name: Check Auth Model
        run: java -jar "$TLA_JAR" -config models/AuthModel.cfg models/AuthModel.tla
      - name: Check Session Model
        run: java -jar "$TLA_JAR" -deadlock models/SessionModel.tla
      - name: Check Tool Gating Model
        run: java -jar "$TLA_JAR" -coverage 1 models/ToolGatingModel.tla
      - name: Upload Results
        uses: actions/upload-artifact@v4
        with:
          name: tla-results
          path: models/*.out
```
### References
- TLA+ Specification Language
- Alloy Analyzer
- NIST SP 800-53 SA-17: Developer Security Architecture and Design
---
## OSSASAI-FV-02: Negative Model Regression
### Metadata
| Attribute | Value |
|-----------|-------|
| **Control ID** | OSSASAI-FV-02 |
| **Requirement Level** | SHOULD (L3) |
| **Assurance Levels** | L3 |
| **Trust Boundary** | Cross-cutting |
| **Purpose** | Verify known-bad states are unreachable |
### Requirement
Negative models SHOULD remain runnable and produce counterexample traces for their bug class. This ensures known vulnerability patterns remain detectable.
### Evidence
- Negative model artifacts
- Counterexample traces demonstrating detection
### Checks
- Negative models successfully find counterexamples
- Counterexamples match expected bug patterns
```bash
# Run negative model (should find counterexample)
tlc NegativeAuthModel.tla
# Expected output: "Error: Invariant violated"
# Verify counterexample matches expected pattern
grep -q "expected_vulnerability_pattern" counterexample.txt
```

### Implementation
**Negative Auth Model:**
Model that should fail (demonstrates auth bypass detection):
```tla
---- MODULE NegativeAuthModel ----
\* This model SHOULD produce a counterexample
\* demonstrating an authentication bypass
EXTENDS AuthModel

\* BUG: creates a session without checking that the user is
\* authenticated or that the token was issued to that user.
\* Checking SessionAuthInvariant against a spec that includes
\* this action yields the expected counterexample trace.
BypassCreateSession(s, u, t) ==
    /\ activeSessions[s] = NULL
    /\ activeSessions' = [activeSessions EXCEPT ![s] = [user |-> u, token |-> t]]
    /\ UNCHANGED <<authenticated, activeTokens>>

\* The model checker should find:
\* - A state where a session exists
\* - But its user is not authenticated
\* - That trace is the counterexample we want
====
```
**Negative Session Model:**
Model demonstrating session confusion detection:
```tla
---- MODULE NegativeSessionModel ----
\* Demonstrates cross-session data leakage
EXTENDS SessionModel
CONSTANT DataSet

\* BUG: writes the same datum into every session's data set,
\* violating IsolationInvariant. Checking that invariant
\* against a spec that includes this action yields a
\* counterexample showing data shared across sessions.
LeakGlobalData(d) ==
    /\ d \in DataSet
    /\ sessionData' = [s \in DOMAIN sessionData |-> sessionData[s] \cup {d}]
    /\ UNCHANGED <<sessionMap, sessionOwner>>
====
```
**Regression Testing:**

```yaml
# Negative model regression tests
negative_models:
  - name: "auth-bypass"
    model: "NegativeAuthModel.tla"
    expected: "counterexample"
    bug_class: "authentication_bypass"
  - name: "session-confusion"
    model: "NegativeSessionModel.tla"
    expected: "counterexample"
    bug_class: "cross_session_leakage"
  - name: "sandbox-escape"
    model: "NegativeSandboxModel.tla"
    expected: "counterexample"
    bug_class: "sandbox_escape"
```
```bash
# Run all negative models
for model in models/Negative*.tla; do
  echo "Testing: $model"
  java -jar tla2tools.jar "$model" 2>&1 | grep -q "Error:" || {
    echo "FAIL: $model should have found counterexample"
    exit 1
  }
done
echo "All negative models correctly detected bugs"
```
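Beyond grepping for `Error:`, the check that a counterexample matches its expected bug class can be sketched by mapping each bug class to the invariant whose violation TLC should report. The mapping below is an assumption drawn from the models above, not part of the specification:

```python
import re

# Map each negative model's bug class to the invariant whose
# violation TLC should report (names taken from the models above).
EXPECTED_INVARIANT = {
    "authentication_bypass": "SessionAuthInvariant",
    "cross_session_leakage": "IsolationInvariant",
}


def matches_bug_class(tlc_output: str, bug_class: str) -> bool:
    """True if TLC reported a violation of the invariant for this bug class."""
    invariant = EXPECTED_INVARIANT.get(bug_class)
    if invariant is None:
        return False
    # TLC reports violations as "Error: Invariant <Name> is violated."
    return bool(re.search(rf"Error: Invariant {invariant} is violated", tlc_output))
```

A trace that violates some other invariant, or no invariant at all, then fails the regression check even though it contains the string `Error:`.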
### References
- "Testing versus Formal Verification" - Lamport
- Property-based testing patterns
- Mutation testing for formal models
---
## OSSASAI-FV-03: CI Integration of Formal Checks
### Metadata
| Attribute | Value |
|-----------|-------|
| **Control ID** | OSSASAI-FV-03 |
| **Requirement Level** | SHOULD (L3) |
| **Assurance Levels** | L3 |
| **Trust Boundary** | Cross-cutting |
| **Purpose** | Continuous verification of security properties |
### Requirement
The conformance report SHOULD record model version, assumptions, and bounds. Formal checks should run in CI for security-sensitive changes.
### Evidence
- CI configuration showing formal verification steps
- Conformance report with verification metadata
### Checks
- CI pipeline includes formal verification
- Reports include model version and bounds
```bash
# Check CI configuration
grep -q "formal-verification" .github/workflows/*.yml
# Verify conformance report includes FV data
jq '.formal_verification' evidence/conformance-report.json
```

### Implementation
**Conformance Report Schema:**

```json
{
  "ossasai": {
    "version": "0.1",
    "level": "L3"
  },
  "formal_verification": {
    "enabled": true,
    "models": [
      {
        "name": "AuthModel",
        "version": "1.2.0",
        "tool": "TLC",
        "tool_version": "1.8.0",
        "last_verified": "2026-01-15T10:00:00Z",
        "result": "pass",
        "bounds": {
          "Users": 3,
          "Tokens": 5,
          "Sessions": 10
        },
        "assumptions": [
          "Single-threaded execution",
          "Atomic token operations"
        ],
        "coverage": {
          "states_explored": 15420,
          "distinct_states": 2341
        }
      }
    ],
    "negative_models": [
      {
        "name": "NegativeAuthModel",
        "bug_class": "auth_bypass",
        "result": "counterexample_found",
        "trace_file": "evidence/auth-bypass-trace.txt"
      }
    ]
  }
}
```

**Full CI Workflow:**

```yaml
name: Security Verification
on:
  push:
    branches: [main]
  pull_request:
    paths:
      - 'src/**'
      - 'models/**'
jobs:
  formal-verification:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install TLA+ Tools
        run: wget -q https://github.com/tlaplus/tlaplus/releases/download/v1.8.0/tla2tools.jar
      - name: Run Positive Models
        run: |
          for model in models/[A-Z]*Model.tla; do
            case "$model" in models/Negative*) continue ;; esac
            echo "Verifying: $model"
            java -jar tla2tools.jar "$model" || exit 1
          done
      - name: Run Negative Models
        run: |
          for model in models/Negative*.tla; do
            echo "Testing: $model (expecting counterexample)"
            java -jar tla2tools.jar "$model" 2>&1 | grep -q "Error:" || {
              echo "FAIL: Negative model should find bug"
              exit 1
            }
          done
      - name: Generate Report
        run: python scripts/generate-fv-report.py > evidence/fv-report.json
      - name: Upload Evidence
        uses: actions/upload-artifact@v4
        with:
          name: formal-verification-evidence
          path: evidence/
```
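The `scripts/generate-fv-report.py` script referenced in the workflow is not specified by the control. A minimal sketch follows; the `.out` file layout, the `parse_tlc_log` helper, and the field names (beyond those in the schema above) are assumptions:

```python
#!/usr/bin/env python3
"""Assemble a formal-verification evidence fragment from TLC logs.

Illustrative sketch: paths and helper names are assumptions,
not part of the OSSASAI specification.
"""
import json
import re
import sys
from pathlib import Path


def parse_tlc_log(text: str) -> dict:
    """Extract a pass/fail result and state counts from TLC output."""
    states = re.search(r"(\d+) states generated", text)
    distinct = re.search(r"(\d+) distinct states", text)
    return {
        "result": "fail" if "Error:" in text else "pass",
        "coverage": {
            "states_explored": int(states.group(1)) if states else None,
            "distinct_states": int(distinct.group(1)) if distinct else None,
        },
    }


def build_report(log_dir: Path) -> dict:
    """Build the formal_verification section from *.out files in log_dir."""
    models, negatives = [], []
    for log in sorted(log_dir.glob("*.out")):
        entry = {"name": log.stem, **parse_tlc_log(log.read_text())}
        if log.stem.startswith("Negative"):
            # Negative models must fail: a counterexample is the pass condition.
            entry["result"] = (
                "counterexample_found" if entry["result"] == "fail"
                else "missing_counterexample"
            )
            negatives.append(entry)
        else:
            models.append(entry)
    return {"formal_verification": {
        "enabled": True, "models": models, "negative_models": negatives,
    }}


if __name__ == "__main__":
    log_dir = Path(sys.argv[1]) if len(sys.argv) > 1 else Path("models")
    print(json.dumps(build_report(log_dir), indent=2))
```

A production version would merge in the per-model version, bounds, and assumptions from the model documentation rather than deriving everything from logs.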
**Model Documentation:**

```yaml
# models/README.yaml
models:
  AuthModel:
    description: "Authentication and token management"
    invariants:
      - "SessionAuthInvariant"
      - "AuthTokenInvariant"
    bounds:
      recommended:
        Users: 3
        Tokens: 5
        Sessions: 10
      minimum:
        Users: 2
        Tokens: 3
        Sessions: 5
    assumptions:
      - "Operations are atomic"
      - "No concurrent modifications"
    coverage: "full state space with recommended bounds"
  SessionModel:
    description: "Session isolation and data boundaries"
    invariants:
      - "IsolationInvariant"
      - "CrossPeerInvariant"
    bounds:
      recommended:
        Peers: 3
        Sessions: 5
    assumptions:
      - "Session IDs are unique"
```
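FV-03 requires the conformance report to record model version, assumptions, and bounds. A minimal validator for that requirement can be sketched as follows; the field names follow the schema shown earlier, and the required-field set is an assumption:

```python
import json

# Fields FV-03 expects per model, per the conformance report schema above.
REQUIRED_MODEL_FIELDS = {
    "name", "version", "tool", "tool_version", "result", "bounds", "assumptions",
}


def validate_fv_section(report: dict) -> list:
    """Return a list of problems with the report's formal_verification section."""
    fv = report.get("formal_verification")
    if not fv or not fv.get("enabled"):
        return ["formal_verification section missing or disabled"]
    problems = []
    for model in fv.get("models", []):
        name = model.get("name", "?")
        missing = REQUIRED_MODEL_FIELDS - model.keys()
        if missing:
            problems.append(f"{name}: missing fields {sorted(missing)}")
        # Bounds must be recorded so results can be reproduced.
        if not model.get("bounds"):
            problems.append(f"{name}: bounds must be non-empty")
    return problems
```

Running such a check in CI against `evidence/conformance-report.json` turns the "reports include model version and bounds" check above into an enforced gate.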
### References
- OSSASAI Section 7: Formal Verification
- TLA+ Hyperbook
- Amazon’s use of TLA+ for AWS services