Vector Stream Systems


MBSE, reimagined for agent-ready teams

Model-Based Systems Engineering is the practice of using formal, linked digital models as the primary artifacts for analysis and verification. VectorOWL makes those models machine-accessible to AI agents—without losing the rigor.

What is MBSE?

Replace documents with linked, queryable models

Traditional engineering relies on Word docs, spreadsheets, and slide decks to capture requirements, architecture, and verification. MBSE replaces those static artifacts with formal digital models—requirements, architecture, behavior, and interfaces—linked together in a graph you can query, diff, and validate.

The result: less ambiguity, better traceability, and a single source of truth that scales with system complexity.

Why MBSE matters

  • Reduces ambiguity from narrative-only specifications
  • Improves traceability from intent to evidence
  • Scales better for aerospace, automotive, and defense
  • Enables automated validation and impact analysis
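Impact analysis over a linked model reduces to a graph traversal: start at a changed element and walk every downstream link. A minimal sketch with a toy in-memory graph (the element names and link structure are invented for illustration, not VectorOWL's API):

```python
from collections import deque

# Hypothetical traceability edges: requirement -> architecture -> evidence
links = {
    "REQ-001": ["ARCH-wing", "ARCH-fuselage"],
    "ARCH-wing": ["TEST-flutter", "SIM-load-case-7"],
    "ARCH-fuselage": ["TEST-pressure"],
}

def impacted(node):
    """Breadth-first walk: everything downstream of a changed element."""
    seen, queue = set(), deque([node])
    while queue:
        current = queue.popleft()
        for child in links.get(current, []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return seen

# Changing REQ-001 touches both architecture elements and all their evidence.
print(sorted(impacted("REQ-001")))
# ['ARCH-fuselage', 'ARCH-wing', 'SIM-load-case-7', 'TEST-flutter', 'TEST-pressure']
```

With documents, the same question means grepping spreadsheets by hand; with a linked model it is one query.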

Why VectorOWL?

The missing layer for AI-assisted engineering

Semantic retrieval

Traditional MBSE tools are purely symbolic. VectorOWL adds embeddings so you can query by meaning—"find wing designs similar to this one"—not just by ID.
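Under the hood, querying by meaning comes down to comparing embedding vectors rather than matching IDs. A minimal cosine-similarity sketch (the vectors and design names are made up; VectorOWL's actual embedding model and query API are not shown here):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy embeddings; in practice these come from a learned model.
designs = {
    "wing-v1": [0.9, 0.1, 0.0],
    "wing-v2": [0.8, 0.3, 0.1],
    "fuel-pump": [0.0, 0.2, 0.9],
}

# Embedding of the reference design in "find wing designs similar to this one"
query = [0.85, 0.2, 0.05]
ranked = sorted(designs, key=lambda d: cosine(query, designs[d]), reverse=True)
print(ranked)  # wing designs rank above the unrelated pump
```

A symbolic-only tool could answer "give me element `wing-v1`" but not "give me things like `wing-v1`"; the embedding layer adds the latter.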

Agent-ready context

Coding agents need structured context, not PDF exports. VectorOWL exposes the graph via Model Context Protocol so agents reason over the same model humans review.

Hard constraints

When AI suggests a design change, anchors verify it against safety margins and policy limits before it reaches production. No silent violations.
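The idea can be illustrated as a simple limit check: a proposed change is rejected unless every anchored bound still holds. The anchor fields and limits below are invented for illustration, not VectorOWL's anchor schema:

```python
# Hypothetical anchors: (field, lower bound, upper bound); None = unbounded
ANCHORS = [
    ("max_skin_temp_c", None, 150.0),  # thermal limit
    ("stress_margin", 0.25, None),     # required safety margin
]

def violations(change):
    """Return the anchors a proposed design change would break."""
    broken = []
    for field, lo, hi in ANCHORS:
        value = change[field]
        if lo is not None and value < lo:
            broken.append(field)
        if hi is not None and value > hi:
            broken.append(field)
    return broken

proposal = {"max_skin_temp_c": 162.0, "stress_margin": 0.31}
print(violations(proposal))  # ['max_skin_temp_c'] -> blocked, never reaches production
```

The point is that the check is deterministic and lives outside the model: the AI can propose anything, but only changes with an empty violation list go through.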

Documents vs Models vs VectorOWL

Capability        | Documents               | Traditional MBSE      | VectorOWL
------------------|-------------------------|-----------------------|---------------------------
Traceability      | Manual cross-references | Linked model elements | Queryable graph + vectors
AI readiness      | None                    | Low (symbolic only)   | Native (MCP + embeddings)
Validation        | Manual review           | Static constraints    | Dynamic anchors + CI gates
Version control   | SharePoint / email      | Tool-specific exports | Git-native ontology diffs
Similarity search | None                    | None                  | Vector ANN over designs

Install

Self-host in minutes

VectorOWL is open source. Clone, build, and run locally. The only dependencies are Rust, LibTorch, and a triple store.

1. Clone & build

git clone https://github.com/sponsors/radsilent.git
cd VectorOWL
cargo build --release -p vectorowld

Requires Rust 1.75+ and LibTorch. CPU-only mode works without a GPU.

2. Run the server

VECTOROWL_REQUIRE_TORCH_GPU=false \
  cargo run -p vectorowld

Defaults to 127.0.0.1:8080. Set PORT env var to override.

3. Register MCP server

# Claude Desktop
python3 scripts/patch_claude_mcp.py

# Cursor
# Add to ~/.cursor/mcp.json
{
  "mcpServers": {
    "vectorowl-runtime": {
      "command": "vectorowl-mcp",
      "args": ["--url", "http://127.0.0.1:8080"]
    }
  }
}

See MCP setup docs for full config.

4. Verify

curl http://127.0.0.1:8080/openapi.json

You should see the OpenAPI spec. The UI runs separately at http://localhost:5173.

License key (optional)

# Not required for local development
# export VECTOROWL_LICENSE_KEY="your-license-key"

# Future: keys are generated per subscription and validated at runtime.
# For now, the server runs without a license.

License enforcement is not yet active. When enabled, keys will be tied to your subscription account.


Workflow

From zero to governed graph in 4 steps

1. Model your system in OWL

Define classes, individuals, and relationships. Start with requirements and architecture; add behavior and interfaces as you go.
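As a starting point, a requirement linked to the component that satisfies it might look like this in Turtle (the namespace and property names are illustrative, not a VectorOWL-prescribed vocabulary):

```turtle
@prefix :    <http://example.com/sys#> .
@prefix owl: <http://www.w3.org/2002/07/owl#> .

# Classes for the two element kinds we start with
:Requirement a owl:Class .
:Component   a owl:Class .
:satisfiedBy a owl:ObjectProperty .

# One requirement, traced to the component that satisfies it
:REQ-001 a :Requirement ;
    :satisfiedBy :WingAssembly .
:WingAssembly a :Component .
```

Behavior and interface classes slot into the same graph later without restructuring what is already there.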

2. Ingest evidence

Load simulation results, telemetry, and documents into the vector layer. Link them to ontology nodes for retrieval.
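A toy sketch of the ingest-and-link step: each evidence record carries an embedding plus a pointer back to the ontology node it supports, so retrieval can start from either side. The record shape and storage are invented stand-ins (real storage would be a vector index, not a list):

```python
# In-memory stand-in for the vector layer
evidence_store = []

def ingest(node_id, kind, payload, embedding):
    """Attach a piece of evidence to an ontology node."""
    evidence_store.append(
        {"node": node_id, "kind": kind, "payload": payload, "embedding": embedding}
    )

def evidence_for(node_id):
    """Traceability lookup: all evidence linked to a node."""
    return [e for e in evidence_store if e["node"] == node_id]

ingest("ARCH-wing", "simulation", "flutter_run_42.csv", [0.1, 0.7, 0.2])
ingest("ARCH-wing", "telemetry", "flight_003.parquet", [0.2, 0.6, 0.2])
ingest("ARCH-fuselage", "document", "pressure_report.pdf", [0.9, 0.0, 0.1])

print([e["payload"] for e in evidence_for("ARCH-wing")])
# ['flutter_run_42.csv', 'flight_003.parquet']
```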

3. Define anchors

Encode hard constraints—temperature limits, stress bounds, policy gates—so no design ships without proof.

4. Connect your toolchain

Register VectorOWL MCP servers in Claude, Cursor, or your own host. Agents now reason over the same graph your team reviews.