Get started in under 5 minutes with no infrastructure setup.
1. Sign up at app.superglue.cloud

Or book a demo to talk to our team first.
2. Connect a system

Click one of the system icons (Slack, Stripe, GitHub, etc.) or describe the system you want to connect in the agent chat. The agent will guide you through authentication and configuration.
3. Build and test a tool

Describe what you need — the agent will set up your tool, test it, and let you review before saving.
4. Execute or schedule

Save the tool, then run it on demand, schedule it, or trigger it via webhook.

Self-Hosted

Deploy on your infrastructure for complete control and customizability.

Quick Start

Prerequisites: Docker, a PostgreSQL database, and an LLM API key (OpenAI, Anthropic, or Gemini)
1. Create a .env file

For full environment setup, see Production Setup below or the .env.example.
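For a quick start, a minimal .env sketch might look like the following, assembled from the required variables in the Production Setup reference below (all values are placeholders):

```shell
# LLM provider and key (ANTHROPIC shown; OPENAI and GEMINI work the same way)
LLM_PROVIDER=ANTHROPIC
ANTHROPIC_API_KEY=sk-ant-xxxxx

# PostgreSQL connection
POSTGRES_HOST=your-postgres-host
POSTGRES_PORT=5432
POSTGRES_USERNAME=superglue
POSTGRES_PASSWORD=secure-password
POSTGRES_DB=superglue
```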
2. Run the container

Run the following command:
docker run -d \
  --name superglue \
  --env-file .env \
  -p 3001:3001 \
  -p 3002:3002 \
  superglueai/superglue:latest
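To confirm the container started cleanly, standard Docker commands are enough:

```shell
# Check that the container is running and both ports are mapped
docker ps --filter name=superglue

# Inspect the startup logs for errors
docker logs superglue
```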
3. Access dashboard and API

With the port mapping above, the dashboard is available at http://localhost:3001 and the API at http://localhost:3002.
Local Development

git clone https://github.com/superglue-ai/superglue.git
cd superglue
cp .env.example .env
Edit .env with your configuration, then:
npm install
npm run dev

Production Setup

For production deployments:
# ===================
# REQUIRED
# ===================

# Server endpoints (required for web dashboard to connect)
API_ENDPOINT=https://your-domain.com:3002
SUPERGLUE_APP_URL=https://your-domain.com:3001

# Supabase Authentication (required)
NEXT_PUBLIC_SUPABASE_URL=https://your-project.supabase.co
NEXT_PUBLIC_SUPABASE_ANON_KEY=your-anon-key

# LLM Provider: OPENAI, ANTHROPIC, GEMINI, BEDROCK, VERTEX, or AZURE
LLM_PROVIDER=ANTHROPIC
# Provide the API key and model for your chosen provider:
# ANTHROPIC_API_KEY=sk-ant-xxxxx
# ANTHROPIC_MODEL=claude-sonnet-4-5

# GEMINI_API_KEY=your-gemini-key
# GEMINI_MODEL=gemini-2.5-flash

# OPENAI_API_KEY=sk-proj-xxxxx
# OPENAI_MODEL=gpt-5.2

# Agent chat LLM (can differ from backend LLM_PROVIDER)
FRONTEND_LLM_PROVIDER=ANTHROPIC

# PostgreSQL (required)
POSTGRES_HOST=your-postgres-host
POSTGRES_PORT=5432
POSTGRES_USERNAME=superglue
POSTGRES_PASSWORD=secure-password
POSTGRES_DB=superglue
# Uncomment for local/unsecured postgres:
# POSTGRES_SSL=false

# Credential encryption (HIGHLY recommended)
# Generate a strong key: openssl rand -hex 32
MASTER_ENCRYPTION_KEY=your-32-byte-encryption-key

# ===================
# OPTIONAL
# ===================

# Web search in agent
# TAVILY_API_KEY=your-tavily-key

# Workflow scheduling (only enable on a single instance)
# START_SCHEDULER_SERVER=false
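The MASTER_ENCRYPTION_KEY above can be generated and appended in one step, assuming openssl is available:

```shell
# Generate a random 32-byte key (64 hex characters) and append it to .env
echo "MASTER_ENCRYPTION_KEY=$(openssl rand -hex 32)" >> .env
```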

File Storage

File storage (S3 or MinIO) is required for system documentation — including URL scraping, OpenAPI spec fetching, and file uploads. Without it, tools can still be created and executed, but systems won’t have documentation context for the AI agent.
Add to your .env:
FILE_STORAGE_PROVIDER=aws
AWS_ACCESS_KEY_ID=your-access-key
AWS_SECRET_ACCESS_KEY=your-secret-key
AWS_REGION=us-east-1
AWS_BUCKET_NAME=your-bucket-name
AWS_BUCKET_PREFIX=raw
File processing: When files are uploaded (PDFs, CSVs, etc.), they need to be parsed before the agent can use them. To enable automatic processing, configure an S3 EventBridge notification to send Object Created events to your superglue instance:
  • Target: https://your-domain.com/v1/file-references/process (POST)
  • Authorization: Bearer token matching your AUTH_TOKEN
Without EventBridge, uploaded files will remain in PENDING status and won’t be parsed. Scraped documentation and OpenAPI specs are stored pre-processed and work without EventBridge.
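As a sketch of the EventBridge side (the rule name and bucket name are assumptions; adapt them to your setup), the event pattern that matches S3 Object Created events can be written to a file and passed to `aws events put-rule`:

```shell
# Write the EventBridge event pattern that matches S3 "Object Created"
# events from the documentation bucket (bucket name is a placeholder)
cat > s3-object-created-pattern.json <<'EOF'
{
  "source": ["aws.s3"],
  "detail-type": ["Object Created"],
  "detail": {
    "bucket": { "name": ["your-bucket-name"] }
  }
}
EOF

# Then create the rule (requires AWS credentials; shown for reference):
# aws events put-rule --name superglue-file-process \
#   --event-pattern file://s3-object-created-pattern.json
```

The rule's target then points at the /v1/file-references/process endpoint described above (for an HTTPS target, EventBridge requires an API destination with the Bearer token configured on its connection).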

Next Steps

  • Core Concepts: Understand systems, tools, and execution
  • Creating a Tool: Build your first tool
  • MCP Overview: Use superglue tools in your internal GPT, Cursor, or Claude