Security & Compliance

AI Workflows for Regulated Industries

Grove executes workflows as one-shot directed graphs: input in, DAG runs, output out. No hidden state, no implicit memory, no surprise side effects. Deploy within your own infrastructure, maintain a complete audit trail of every run, and keep regulated data under your control.

01

Workflows as Bounded Executions

Grove's execution model is predictable by design: each workflow run takes an explicit input, traverses a directed graph of nodes, and produces an explicit output. The graph is defined up front and cannot change mid-run. Every data dependency is visible in the definition. For non-conversational workloads — extraction, classification, enrichment, multi-step analysis — there is no implicit state between runs.
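The bounded-execution model can be sketched in a few lines. This is an illustrative stand-in, not Grove's actual API: the node names, handler signatures, and `run()` helper are all assumptions. The point is the shape of the contract: explicit input, a fixed graph traversed in dependency order, explicit output, and a results table that exists only for the duration of the run.

```python
# Minimal sketch of a one-shot DAG run (illustrative; not Grove's API).
from graphlib import TopologicalSorter

def run(dag, handlers, inputs):
    """Execute the graph once: explicit input in, explicit output out.
    All intermediate state lives in `results`, discarded after the run."""
    results = {"input": inputs}
    for node in TopologicalSorter(dag).static_order():
        if node == "input":
            continue
        # Every data dependency is visible in the graph definition.
        upstream = {dep: results[dep] for dep in dag[node]}
        results[node] = handlers[node](upstream)
    return results["output"]

# A three-node graph: input -> extract -> classify -> output
dag = {
    "input": set(),
    "extract": {"input"},
    "classify": {"extract"},
    "output": {"classify"},
}
handlers = {
    "extract": lambda up: up["input"]["text"].split(),
    "classify": lambda up: "long" if len(up["extract"]) > 3 else "short",
    "output": lambda up: {"label": up["classify"]},
}

print(run(dag, handlers, {"text": "invoice dated march fourth 2024"}))
# {'label': 'long'}
```

Because the graph is data, it can be validated, documented, and audited before anything executes, which is what makes the compliance features in the later sections possible.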

Air-Gap Compatible

Grove runs as a self-contained Kubernetes deployment within your VPC. The orchestration engine, database, and all workflow state remain inside your infrastructure perimeter.

Your LLM Keys, Your Contracts

API keys for LLM providers are passed per-request via HTTP headers. You use your own enterprise agreements with Anthropic, OpenAI, or Google. We are not a party to the LLM inference chain — your data flows directly from your infrastructure to your provider.

Self-Hosted Model Support

For workflows processing the most sensitive data, route specific nodes to self-hosted models running within your VPC. Different nodes in the same workflow can use different providers — cloud APIs for non-sensitive tasks, self-hosted models for everything else.
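Per-node routing by sensitivity can be sketched as a simple policy function. The backend names (`self-hosted-llama`, `anthropic`) and the `data_classification` field are hypothetical, chosen only to illustrate the split between in-VPC and cloud inference.

```python
# Sketch of sensitivity-based node routing (all names are assumptions).
def route(node):
    """Send sensitive nodes to a self-hosted model inside the VPC,
    everything else to a cloud provider."""
    if node.get("data_classification") == "sensitive":
        return "self-hosted-llama"   # hypothetical in-VPC model
    return "anthropic"               # hypothetical cloud backend

nodes = [
    {"id": "summarize_public_doc", "data_classification": "public"},
    {"id": "extract_patient_record", "data_classification": "sensitive"},
]
print({n["id"]: route(n) for n in nodes})
# {'summarize_public_doc': 'anthropic', 'extract_patient_record': 'self-hosted-llama'}
```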

No Implicit State

A workflow run is self-contained. It receives its inputs, executes its nodes, and returns its outputs. Nothing carries over to the next run except the persisted audit trail. Conversational state is an optional, opt-in layer — most workflows do not use it at all.

02

Encryption & Secrets Management

Credentials and sensitive configuration are encrypted at rest using industry-standard cryptography. Secret values are never exposed through the API.

Algorithm: AES-256-GCM — authenticated encryption with per-value nonces
Key Management: Server-side encryption key loaded from environment; never persisted to storage
API Design: Values are write-only — the API never returns secret content, only confirms existence
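The write-only design above can be sketched as follows. The `SecretStore` class and its method names are illustrative, not Grove's actual API; the real store encrypts with AES-256-GCM, which is modeled here only as an opaque blob with a fresh 12-byte nonce per value.

```python
# Sketch of a write-only secrets API: values go in, only existence comes out.
import secrets

class SecretStore:
    def __init__(self):
        self._values = {}

    def put(self, name, value):
        # AES-256-GCM requires a unique nonce per encryption; a fresh
        # 12-byte nonce per value stands in for the real ciphertext here.
        nonce = secrets.token_bytes(12)
        self._values[name] = (nonce, value.encode())

    def exists(self, name):
        return name in self._values

    # Deliberately no get(): the API confirms existence, never content.

store = SecretStore()
store.put("OPENAI_API_KEY", "example-key-value")
print(store.exists("OPENAI_API_KEY"))  # True
```

The absence of a read path is the guarantee: a compromised API client can confirm a secret exists but can never exfiltrate its value through the API.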
03

Workflow Documentation

Grove supports plain-language workflow documentation for compliance review. Each workflow can carry human-authored metadata and per-node descriptions, and the platform can export the current workflow as a markdown document written for compliance officers rather than engineers.

Plain-Language Metadata

Workflows can include a description, purpose statement, category, data classification, and owning team. These fields are designed for human reviewers rather than model execution.

Per-Node Documentation

Every node can carry its own description so compliance staff can understand what each step does without reading prompts or application code.

Markdown Compliance Export

A dedicated API endpoint renders the workflow, execution order, node details, referenced models, and accessible tools as a markdown document that can be reviewed directly or printed to PDF.
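A rendering step of this kind can be sketched as a small template function. The field names (`name`, `description`, `nodes`, `model`) are illustrative, not Grove's schema; the point is that the export is plain markdown a reviewer can read or print.

```python
# Sketch of rendering workflow metadata as a compliance markdown document.
def to_markdown(wf):
    """Flatten a workflow definition into reviewer-friendly markdown."""
    lines = [f"# {wf['name']}", "", wf["description"], "", "## Nodes", ""]
    for node in wf["nodes"]:
        lines.append(f"- **{node['id']}** ({node['model']}): {node['description']}")
    return "\n".join(lines)

wf = {
    "name": "Claims Triage",
    "description": "Classifies incoming insurance claims by urgency.",
    "nodes": [
        {"id": "extract", "model": "fast", "description": "Pull fields from the claim PDF."},
        {"id": "classify", "model": "standard", "description": "Assign an urgency tier."},
    ],
}
print(to_markdown(wf))
```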

Metadata Change History

Metadata edits are audit-logged field by field. Compliance staff can see who changed a workflow description, what the previous value was, and what it said at any historical point.
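Field-by-field audit logging can be sketched as a diff over two metadata snapshots, emitting one entry per changed field. The entry format, field names, and `actor` parameter are assumptions for illustration.

```python
# Sketch of field-level audit entries for a metadata edit (format assumed).
from datetime import datetime, timezone

def diff_metadata(old, new, actor):
    """One audit entry per changed field: who, when, old value, new value."""
    now = datetime.now(timezone.utc).isoformat()
    return [
        {"field": k, "old": old.get(k), "new": new.get(k), "actor": actor, "at": now}
        for k in sorted(set(old) | set(new))
        if old.get(k) != new.get(k)
    ]

entries = diff_metadata(
    {"purpose": "Initial triage", "owner": "claims-team"},
    {"purpose": "Triage and routing", "owner": "claims-team"},
    actor="alice",
)
print(entries[0]["field"], entries[0]["old"], "->", entries[0]["new"])
# purpose Initial triage -> Triage and routing
```

Replaying these entries in timestamp order reconstructs what any field said at any historical point.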

04

Per-Node Audit Trail

Every workflow execution produces a complete, timestamped record of what ran, when, with what inputs and outputs, and how long it took. This audit trail is persisted to PostgreSQL and queryable via API.

Node Execution Records

Each node in a workflow — every LLM call, tool invocation, and data transformation — generates a durable execution record capturing its upstream inputs, output value, status, error details, and wall-clock duration. Full data provenance for every decision.

Real-Time Event Stream

Server-Sent Events (SSE) stream every workflow event as it happens: node started, node completed, tool calls requested, results received, run completed. Build monitoring and alerting on top of the native event stream.
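Consuming an SSE stream requires only a small line parser, since the wire format is plain text: `event:` and `data:` lines terminated by a blank line. The event names below (`node.started`, `node.completed`) are assumptions, not Grove's documented event vocabulary.

```python
# Sketch of parsing a Server-Sent Events stream (event names assumed).
def parse_sse(stream):
    """Yield (event, data) pairs from raw SSE lines."""
    event, data = None, []
    for line in stream:
        if line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data.append(line[len("data:"):].strip())
        elif line == "":               # blank line terminates one event
            if event or data:
                yield event, "\n".join(data)
            event, data = None, []

raw = [
    "event: node.started", 'data: {"node": "extract"}', "",
    "event: node.completed", 'data: {"node": "extract"}', "",
]
for ev, payload in parse_sse(raw):
    print(ev, payload)
```

A monitoring service would feed HTTP response lines into the same loop and raise alerts on failure events as they arrive, rather than polling run status.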

Run Configuration Capture

The execution configuration for every run — which model, what tools, timeout settings, session bindings — is captured and persisted at run creation time. Full reproducibility of the execution context.

Tool Call Tracking

Every tool invocation within a workflow is recorded: what was requested, what inputs were provided, and when results were received. Both built-in and application-defined tool calls are captured in the audit trail.

05

Immutable Records & Non-Destructive Lifecycle

Workflow definitions and execution history are preserved throughout their lifecycle. The audit trail is append-only at the operational level — records are not overwritten or removed during normal operation.

Workflow Immutability

Workflow definitions are create-and-retire only. There is no update operation. A workflow deployed on day 1 remains byte-identical throughout its operational lifetime, so every audit record unambiguously references the exact workflow that was executed. Retired workflows are soft-deleted and retained.
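The create-and-retire contract can be sketched as a registry that simply has no update path. The class and method names are illustrative, not Grove's API; what matters is that the only mutation after creation is the soft-delete flag.

```python
# Sketch of a create-and-retire-only registry: no update operation exists.
class WorkflowRegistry:
    def __init__(self):
        self._workflows = {}   # id -> {"definition": ..., "retired": bool}

    def create(self, wf_id, definition):
        if wf_id in self._workflows:
            raise ValueError("workflow ids are permanent; deploy a new id instead")
        self._workflows[wf_id] = {"definition": definition, "retired": False}

    def retire(self, wf_id):
        # Soft delete: the definition stays byte-identical and queryable.
        self._workflows[wf_id]["retired"] = True

    def get(self, wf_id):
        return self._workflows[wf_id]

reg = WorkflowRegistry()
reg.create("claims-triage-v1", {"nodes": ["extract", "classify"]})
reg.retire("claims-triage-v1")
print(reg.get("claims-triage-v1"))
# {'definition': {'nodes': ['extract', 'classify']}, 'retired': True}
```

Changing a workflow means deploying a new id (say, `claims-triage-v2`), so every audit record keeps an unambiguous pointer to the exact definition that ran.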

Non-Destructive Audit Trail

Run records and node execution history are not deleted during normal operation. Failed runs retain their complete execution trail — which nodes ran, what inputs they received, what outputs they produced, and where execution halted — so incidents can be fully reconstructed after the fact.

06

Data Boundaries & Multi-Tenant Isolation

Grove enforces data boundaries at the run level, not just the session level. Every run executes in an isolated context with no implicit access to data from other runs, and records can be tagged with an opaque owner label for tenant-scoped operations.

Per-Run Execution Context

Each workflow run has its own execution context — inputs, tool registry, and scratch state — that is constructed at run start and discarded when the run completes. One run cannot read another run's data.

Owner-Scoped Records

Workflows, runs, and sessions can be tagged with an opaque owner label at creation time. The label is indexed and enables tenant-scoped queries, bulk disposal, and audit filtering without coupling Grove to any specific tenancy model.
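Because the label is opaque, tenant-scoped operations reduce to filtering on a single indexed field. A minimal sketch, with record shapes and the `owner` field name assumed for illustration:

```python
# Sketch of owner-scoped records: tag at creation, filter or purge by label.
records = [
    {"id": "run-1", "owner": "tenant-a"},
    {"id": "run-2", "owner": "tenant-b"},
    {"id": "run-3", "owner": "tenant-a"},
]

def purge_owner(records, owner):
    """Tenant-scoped bulk disposal: drop everything carrying the label."""
    return [r for r in records if r["owner"] != owner]

print(purge_owner(records, "tenant-a"))
# [{'id': 'run-2', 'owner': 'tenant-b'}]
```

Grove never interprets the label, so it can hold a customer id, a business unit, a case number, or whatever tenancy concept the deploying institution already uses.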

Optional Session Isolation

For conversational workloads that need persistent state across runs, Grove offers an optional session layer. Session data — messages, memory, summaries — is scoped per session with no cross-session access. Workflows that do not use sessions are entirely unaffected by this layer.

Namespace Isolation (Sessions)

Session memory supports hierarchical namespaces for structured data organization. Namespace boundaries enforce isolation within and across sessions for applications that need to partition conversational state.

07

Data Disposal & Right-to-Erasure

Grove provides provable data disposal with a forever-retained audit trail. When customer data needs to be removed — whether for retention policies, customer offboarding, or regulatory deletion requests — the disposal is genuine, recorded, and cryptographically attested.

Tombstones for Audit Records

Run history and per-node execution records are tombstoned: their content is erased while the row remains as a marker. The audit trail's structural integrity is preserved — foreign key references survive — without retaining any of the disposed data.

Hard Delete for Sensitive Data

Sessions, conversation messages, and session memory are hard-deleted on disposal. PII-bearing records are physically removed from storage, not flagged or hidden.

Disposal Audit Log

Every disposal event is recorded in a separate, forever-retained audit log: what was disposed, when, by whom, why, and a SHA-256 hash of the original record content. The hash proves the record existed without retaining its content.
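The attestation pattern can be sketched directly: hash a canonical serialization of the record before erasing it, then keep only the hash and the disposal metadata. The record shape, actor, and reason strings are hypothetical.

```python
# Sketch of a disposal event with SHA-256 attestation of the erased content.
import hashlib
import json

def dispose(record, actor, reason):
    """Hash the record before erasure; keep the hash, drop the content."""
    canonical = json.dumps(record, sort_keys=True).encode()
    return {
        "record_id": record["id"],
        "actor": actor,
        "reason": reason,
        "content_sha256": hashlib.sha256(canonical).hexdigest(),
    }

event = dispose(
    {"id": "run-42", "output": "customer PII"},
    actor="dpo@bank.example",
    reason="GDPR Art. 17 request",
)
print(sorted(event))  # ['actor', 'content_sha256', 'reason', 'record_id']
```

Anyone holding an authentic copy of the original record can recompute the hash and verify it matches the log entry; without such a copy, the hash reveals nothing about the disposed content.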

Owner-Scoped Bulk Disposal

Records can be tagged with an opaque owner label at creation time. A single API call disposes every workflow, run, and session associated with a given label — ideal for customer offboarding and tenant-scoped erasure requests.

Disposal Endpoints: Per-record (run, workflow, session) and bulk (by owner label), with dry-run preview
Cascade Semantics: Disposing a workflow cascades to its runs and node executions; sessions are intentionally not cascaded
Audit Query API: The disposal log is queryable by table, record ID, owner label, and date range
08

Durability & Crash Recovery

Grove persists execution state at every step. If the server crashes mid-workflow, failed runs can be resumed from the last completed checkpoint — no data loss, no re-execution of already-completed work.

Checkpoint Persistence: Per-node outputs persisted to PostgreSQL as each node completes
Stale Run Detection: On startup, orphaned in-progress runs are automatically detected and marked failed
Resume from Checkpoint: Failed runs resume execution from the last completed node — already-finished work is not repeated
Atomic Claims: Concurrent resume requests are safely handled — exactly one succeeds
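Resume-from-checkpoint can be sketched as a loop that skips any node whose output is already persisted. Here an in-memory dict stands in for the PostgreSQL checkpoint table; node names and handler shapes are assumptions.

```python
# Sketch of crash recovery: completed nodes are skipped on resume.
def run_with_checkpoints(order, handlers, checkpoints):
    """Execute nodes in order, persisting each output; skip checkpointed ones."""
    executed = []
    for node in order:
        if node in checkpoints:        # completed before the crash
            continue
        checkpoints[node] = handlers[node]()   # "persist" the output
        executed.append(node)
    return executed

order = ["extract", "classify", "report"]
handlers = {n: (lambda n=n: f"{n}-done") for n in order}

# Simulate a crash after "extract" completed and was checkpointed.
checkpoints = {"extract": "extract-done"}
print(run_with_checkpoints(order, handlers, checkpoints))
# ['classify', 'report']
```

Only the two unfinished nodes re-execute; the expensive completed work is read back from the checkpoint store instead of being repeated.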
09

Enterprise LLM Provider Management

Register and manage LLM providers through a secure API with support for multiple authentication methods. Route workflows through named model groups with automatic failover.

Five Provider Backends

Anthropic, OpenAI, Google Gemini, Vertex AI (Claude on GCP), and Azure OpenAI. Mix providers within a single workflow based on sensitivity, cost, or capability requirements.

Enterprise Auth

API keys (encrypted in the secrets store), OAuth2 client credentials, GCP managed identity, and Azure managed identity. No plaintext credentials in configuration.

Named Model Groups

Define model tiers — fast, standard, frontier — and the broker resolves them to concrete providers. Change what "fast" means across all workflows without modifying any workflow definition.

Automatic Failover

Model groups support ordered backends. If the primary provider is unavailable (rate limited, down, or timing out), the broker automatically falls through to the next backend.
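Group resolution with ordered failover can be sketched as a loop that tries each backend in turn. The group name, backend names, and `Unavailable` error type are all illustrative, not Grove's actual broker interface.

```python
# Sketch of named model groups with ordered failover (names assumed).
class Unavailable(Exception):
    """Stand-in for rate limits, outages, and timeouts."""

def call_group(groups, group_name, backends):
    """Try each backend in the group's order; return the first success."""
    last = None
    for name in groups[group_name]:
        try:
            return name, backends[name]()
        except Unavailable as exc:
            last = exc                 # fall through to the next backend
    raise last

groups = {"fast": ["primary-cloud", "secondary-cloud"]}

def flaky():
    raise Unavailable("rate limited")

backends = {"primary-cloud": flaky, "secondary-cloud": lambda: "ok"}
print(call_group(groups, "fast", backends))
# ('secondary-cloud', 'ok')
```

Because workflows reference only the group name, retargeting "fast" to a new provider is a one-line change to the group definition, with no workflow redeployments.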

10

Regulatory Alignment

Grove's architecture supports compliance with data protection frameworks across regulated industries — financial services, healthcare, legal, and any environment where data custody and auditability are required.

Administrative, technical, and physical safeguards
Air-gap deployment, AES-256-GCM encryption, per-session isolation, Kubernetes namespace controls
Service provider oversight
Client-owned API keys for LLM providers — your enterprise agreements, your data relationship with the inference provider. The orchestration vendor is not a party to the LLM inference chain.
Incident detection and response
Per-node audit trail, real-time SSE event stream, persistent run history in PostgreSQL — the technical substrate for detection, investigation, and documentation.
Recordkeeping and retention
PostgreSQL persistence of workflow definitions, run history, node executions, session data, and execution configuration. Retention periods configurable to your regulatory requirements.
Data custody and sovereignty
Entire platform deploys within the institution's cloud boundary. Customer information never routes through external SaaS.
Data disposal and right to erasure
Per-record and bulk disposal endpoints with cryptographically attested audit log. Tombstone-based disposal preserves audit integrity for run history while hard-deleting PII-bearing session data. Owner-scoped purges support customer offboarding workflows.

Grove provides the technical infrastructure for compliance. Regulatory compliance programs, written policies, and legal assessments are the responsibility of the deploying institution and should be developed with qualified compliance counsel.

Ready to deploy AI workflows in your cloud?

Contact Sales