Overview
Grove is a DAG-based AI workflow orchestration engine. You define workflows as directed acyclic graphs in TypeScript, then Grove Core — a Rust service deployed to your own Kubernetes cluster — executes them with automatic parallelism, crash recovery, and multi-provider LLM routing.
Most teams integrate with Grove in two steps:
- Install the TypeScript SDK in your application to define and run workflows.
- Deploy Grove Core to Kubernetes in your own cloud (EKS, AKS, GKE, or on-prem).
TypeScript SDK
The @thedouglenz/grove-sdk package is the primary way to build and run workflows. It provides a graph builder API, streaming run handles, tool handler registration, and type-safe access to the Grove Core HTTP API.
Install
The SDK requires Node.js 18 or later.
npm install @thedouglenz/grove-sdk
Quick Start
Connect to your Grove Core instance, build a small workflow, start a run, and stream events as nodes complete:
import {
  GroveClient,
  InputNode,
  LlmNode,
  OutputNode,
} from '@thedouglenz/grove-sdk';

const grove = new GroveClient({
  baseUrl: process.env.GROVE_URL ?? 'http://localhost:3000',
});

// Define a DAG: input → LLM → output
const input = new InputNode('input');
const processor = new LlmNode('process', input, {
  systemPrompt: 'Summarize this topic: {{inputs.topic}}',
  provider: 'anthropic',
});
const output = new OutputNode('output', processor);

// Register the workflow and start a run
const workflow = await grove.workflows.create(
  output.toWorkflow('summarize-topic'),
);
const run = await grove.workflows.startRun(workflow.id, {
  inputs: { topic: 'DAG-based workflow engines' },
});

// Stream events as nodes execute
for await (const event of run.events()) {
  if (event.type === 'node_completed') {
    console.log(`${event.nodeId} finished`);
  }
  if (event.type === 'run_completed') {
    console.log(event.outputs);
  }
}
Every run returns a RunHandle with a live SSE stream. You can pause workflows on external tool calls, handle them in your application, and resume execution — credentials stay in your environment.
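The event loop above branches on `event.type`. The authoritative payload shapes come from the SDK's own typings; as a minimal sketch, here is one way to model the stream as a discriminated union and fold it into a result. Only `type`, `nodeId`, and `outputs` appear in the Quick Start; the other variants and fields are illustrative assumptions, not the SDK's real types.

```typescript
// Hypothetical event union modeling the stream consumed above.
// `node_started` and `run_failed` are assumed variants for illustration.
type GroveEvent =
  | { type: 'node_started'; nodeId: string }
  | { type: 'node_completed'; nodeId: string }
  | { type: 'run_completed'; outputs: Record<string, unknown> }
  | { type: 'run_failed'; error: string };

// Fold a stream of events into the completed node IDs and final outputs.
async function collectRun(events: AsyncIterable<GroveEvent>) {
  const completedNodes: string[] = [];
  let outputs: Record<string, unknown> | undefined;
  for await (const event of events) {
    if (event.type === 'node_completed') completedNodes.push(event.nodeId);
    if (event.type === 'run_completed') outputs = event.outputs;
  }
  return { completedNodes, outputs };
}
```

Because `run.events()` is an async iterable, a helper like this composes directly with the `for await` loop shown above, and narrowing on the `type` field gives you type-safe access to each variant's payload.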
SDK Features
- Graph Builder API — Compose DAGs with InputNode, LlmNode, ToolCallNode, MergeNode, and OutputNode.
- Real-Time SSE Streaming — Consume node-level events as they happen. No polling.
- External Tool Handlers — Register handlers that run in your app; Grove pauses the workflow and waits for results.
- Multi-Provider Routing — Choose a provider per node: Anthropic Claude, OpenAI, Google Gemini, Azure OpenAI, or Vertex AI.
- Template Variables — Pass data between nodes with {{inputs.key}} and {{nodes.node_id.output}} syntax.
- Sessions & Memory — Multi-turn conversation context with rolling summarization.
- Trailhead Integration — Built-in tools for semantic search, entity operations, and knowledge graph traversal.
- Workflows, Skills, Secrets, MCP, Sessions APIs — Type-safe clients for every Grove Core resource.
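Template variables are resolved server-side by Grove Core, so your application never performs the substitution itself. To make the semantics concrete, here is an illustrative helper (not part of the SDK; `renderTemplate` and `TemplateContext` are hypothetical names) showing how {{inputs.key}} and {{nodes.node_id.output}} placeholders resolve against run state:

```typescript
// Illustrative only: Grove Core performs this substitution server-side.
// This hypothetical helper mirrors the placeholder semantics described above.
interface TemplateContext {
  inputs: Record<string, string>;
  nodes: Record<string, { output: string }>;
}

function renderTemplate(template: string, ctx: TemplateContext): string {
  return template
    // {{inputs.key}} pulls from the run's input payload
    .replace(/\{\{inputs\.([\w-]+)\}\}/g, (_, key) => ctx.inputs[key] ?? '')
    // {{nodes.node_id.output}} pulls a prior node's output
    .replace(
      /\{\{nodes\.([\w-]+)\.output\}\}/g,
      (_, id) => ctx.nodes[id]?.output ?? '',
    );
}

renderTemplate('Summarize this topic: {{inputs.topic}}', {
  inputs: { topic: 'DAG-based workflow engines' },
  nodes: {},
});
// → 'Summarize this topic: DAG-based workflow engines'
```

In practice you only write the placeholder strings (as in the Quick Start's systemPrompt); the engine fills them in at execution time with whatever each upstream node produced.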
Kubernetes Deployment
Grove deploys into your own Kubernetes cluster via Helm. The chart is an umbrella that provisions Grove Core (Rust engine), Trailhead (knowledge graph service), a PostgreSQL database, and optional supporting services — all in a namespace you control.
Prerequisites
- A Kubernetes cluster (EKS, AKS, GKE, on-prem, or air-gapped)
- CloudNativePG operator for PostgreSQL provisioning
- An ingress controller (Traefik, NGINX, or your preferred option)
- Image pull credentials for the Grove container registry
- API keys for the LLM providers you plan to use
Grove's onboarding team will provide prebuilt manifests for CloudNativePG and your ingress controller if you don't have them installed.
Install with Helm
A minimal install looks like this:
# Create the namespace
kubectl create namespace grove
# Install the chart
helm upgrade --install grove ./deploy/helm/grove-demo \
  --namespace grove \
  --set secrets.anthropicApiKey="<YOUR_KEY>" \
  --set secrets.openaiApiKey="<YOUR_KEY>" \
  --set ingress.host="grove.example.com"
See deploy/helm/grove-demo/values.yaml for the full list of configurable options — image tags, ingress class, database storage size, MinIO storage, resource limits, and more. We ship a values-prod.yaml template with recommended production settings.
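For anything beyond a quick test, the `--set` flags above are easier to manage as a values file. A hypothetical override file is sketched below; it assumes the key paths mirror the `--set` paths shown, so verify them against the chart's values.yaml before use:

```yaml
# values-override.yaml (hypothetical; check keys against
# deploy/helm/grove-demo/values.yaml before applying)
secrets:
  anthropicApiKey: "<YOUR_KEY>"
  openaiApiKey: "<YOUR_KEY>"
ingress:
  host: "grove.example.com"
```

You would then pass it with `helm upgrade --install grove ./deploy/helm/grove-demo --namespace grove -f values-override.yaml`, since Helm treats each `--set a.b=c` flag as equivalent to the corresponding nested YAML keys.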
What Gets Deployed
- Grove Core — Rust DAG execution engine (Axum + Tokio)
- Trailhead — Knowledge graph and RAG service (pgvector-backed)
- PostgreSQL — CloudNativePG cluster for durability, state, and vector storage
- MinIO — S3-compatible object storage for documents
- Kubernetes Secrets — API keys and database credentials (encrypted at rest with AES-256-GCM)
- Ingress — Path-based routing to all services
Everything runs inside your VPC. No data leaves your environment. Air-gapped installs are supported with offline image bundles.
White-Glove Support
Every enterprise license includes hands-on onboarding from the Grove team. We'll help you:
- Provision the cluster and dependencies
- Configure LLM provider routing and secrets
- Design your first workflows with your engineering team
- Set up monitoring, alerting, and backup policies
- Plan for production scaling and failover
Ready to deploy?
Get a guided install and your first workflow running in under a week.
Contact Sales →