## Prerequisites
| Dependency | Version | Notes |
|---|---|---|
| Node.js | 18+ | Runtime for Cloudflare Workers and frontend |
| Java | 21 | SAP connector (Spring Boot) |
| Maven | 3.9+ | Builds the Java connector |
| Wrangler CLI | Latest | `npm install -g wrangler` |
| SAP JCo | 3.1 | Downloaded separately from the SAP Support Portal |
You also need:
- A Cloudflare account with Durable Objects enabled
- A Supabase project (free tier works for dev)
- An Anthropic API key with access to Claude Opus 4.5
- An OpenAI API key for embedding generation (scan pipeline)
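A quick way to sanity-check the version requirements above is a small shell helper that compares the major component of a version string. The helper and its name are illustrative, not part of the repo:

```sh
#!/bin/sh
# major_ok VERSION MIN — succeeds when VERSION's major component >= MIN.
# Handles both "v18.19.0" (node -v) and "21.0.2" (java/mvn) styles.
major_ok() {
  major=$(printf '%s' "$1" | sed 's/^v//' | cut -d. -f1)
  [ "$major" -ge "$2" ] 2>/dev/null
}

major_ok "$(node -v 2>/dev/null)" 18 && echo "Node.js OK" || echo "need Node.js 18+"
```

The same helper works for the Java and Maven checks by swapping in the relevant version string.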
## 1. Install dependencies

```sh
git clone <repo-url> && cd aisi
npm install
cp .dev.vars.example .dev.vars
```
Edit `.dev.vars` with your credentials:

```sh
ANTHROPIC_API_KEY=sk-ant-...
OPENAI_API_KEY=sk-...
PRIMARY_MODEL=anthropic:claude-opus-4-5
SECONDARY_MODEL=anthropic:claude-haiku-4-5
SUPABASE_URL=https://xyz.supabase.co
SUPABASE_ANON_KEY=eyJhbGc...
SUPABASE_SERVICE_ROLE_KEY=eyJhbGc...
ENCRYPTION_KEY=<64-char-hex-key>  # generate: openssl rand -hex 32
```
`PRIMARY_MODEL` and `SECONDARY_MODEL` are optional. Use the `provider:model` format (for example `anthropic:claude-opus-4-5` or `openai:gpt-5-mini`). If omitted, the app falls back to `anthropic:claude-opus-4-5` (primary) and `anthropic:claude-haiku-4-5` (secondary).
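As a sketch of the `provider:model` convention (illustrative only; this is not how the app itself parses the value), POSIX parameter expansion splits on the first colon:

```sh
#!/bin/sh
# split_model "provider:model" — prints the provider and model parts.
split_model() {
  printf '%s %s\n' "${1%%:*}" "${1#*:}"
}

split_model "anthropic:claude-opus-4-5"   # -> anthropic claude-opus-4-5
split_model "openai:gpt-5-mini"           # -> openai gpt-5-mini
```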
`.dev.vars` is gitignored. Wrangler injects these variables into the Cloudflare Worker at dev time.

Connector API keys are no longer global Worker env vars; they are per-connection secrets stored in `connection_secrets` (typically via the workspace connection wizard/API).
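Since `ENCRYPTION_KEY` must be 64 hex characters, a check like the following can catch a malformed key before first run. The `check_key` helper is illustrative, not part of the repo:

```sh
#!/bin/sh
# check_key KEY — succeeds when KEY is exactly 64 lowercase hex characters.
check_key() {
  case "$1" in
    *[!0-9a-f]*|"") return 1 ;;
  esac
  [ "${#1}" -eq 64 ]
}

# openssl rand -hex 32 (as suggested above) always produces a valid key.
check_key "$(openssl rand -hex 32)" && echo "generated key format OK"
```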
## 3. Set up the database

The application uses Supabase (Postgres) for metadata storage. Run the migrations:

```sh
cd supabase
supabase db push
```

Or apply the migration files in `supabase/migrations/` manually via the Supabase dashboard.
Tables created:

| Table | Purpose |
|---|---|
| `organizations` | Multi-tenant orgs |
| `org_members` | User-org membership with roles |
| `workspaces` | Workspace-level settings (org, web tools, etc.) |
| `workspace_connections` | Named SAP connections per workspace |
| `connection_secrets` | Per-connection secrets (`connector_key`, host/sysnr/client/router) |
| `user_connection_credentials` | Per-user encrypted SAP credentials per connection |
| `agents` | AI chat agents (Durable Object IDs) |
| `sap_search_entries` | Indexed SAP metadata (searchable objects) |
| `workspace_sap_graph` | Index + enrichment status per workspace |
| `workspace_sap_index` | Phase-level index tracking |
| `sap_generated_docs` | Generated module and technical docs |
| `clean_core_analysis` | Clean Core / conversion readiness results |
| `sap_sync_runs` | Full/delta/enrich sync history |
| `audit_logs` | Audit trail |
Row-level security (RLS) is enabled on all tables.
## 4. Build the SAP connector

SAP JCo is proprietary and not available in Maven Central. You must download it from the SAP Support Portal with S-user credentials.

```sh
cd connector
# Place JCo files in lib/:
# sapjco3.jar + the native library for your platform
# (libsapjco3.so, sapjco3.dll, or libsapjco3.jnilib)
ls lib/
# sapjco3.jar  libsapjco3.so
mvn clean package
```

To run without a real SAP connection, set `SAP_MOCK_MODE=true`:

```sh
SAP_MOCK_MODE=true java -jar target/connector-0.1.0.jar
```

See the Connector documentation for full configuration.
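Because the JCo artifacts are easy to misplace, a pre-build check along these lines can help. The filenames follow the list above; the `have_jco` helper itself is illustrative:

```sh
#!/bin/sh
# have_jco DIR — succeeds when DIR holds sapjco3.jar plus at least one
# platform-specific native library.
have_jco() {
  [ -f "$1/sapjco3.jar" ] || return 1
  for lib in libsapjco3.so sapjco3.dll libsapjco3.jnilib; do
    [ -f "$1/$lib" ] && return 0
  done
  return 1
}

have_jco lib && echo "JCo files present" || echo "place JCo files in lib/ first"
```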
## Start the SAP connector

```sh
cd connector
SAP_MOCK_MODE=true java -Xmx1536m -Xms512m \
  -Dloader.path=lib/sapjco3.jar \
  -jar target/connector-0.1.0.jar
```

Runs on http://localhost:8080 in mock mode for quick local bring-up. Verify with `curl http://localhost:8080/health`.

## Start the Cloudflare Worker

```sh
npm run dev
```

Runs on http://localhost:8787.

## Start the frontend

```sh
npm run dev:frontend
```

Runs on http://localhost:5173. API requests proxy to the Worker on port 8787.
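If you script local bring-up, it helps to wait for the connector's `/health` endpoint before starting dependent steps. A minimal retry helper (illustrative; the endpoint and port come from the connector section above):

```sh
#!/bin/sh
# wait_for TRIES CMD... — retries CMD up to TRIES times, sleeping 1s
# between attempts; succeeds as soon as CMD does.
wait_for() {
  tries=$1; shift
  i=0
  until "$@" >/dev/null 2>&1; do
    i=$((i + 1))
    [ "$i" -ge "$tries" ] && return 1
    sleep 1
  done
}

wait_for 10 curl -fsS http://localhost:8080/health \
  && echo "connector is up" || echo "connector not reachable"
```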
Once everything is running:

- Log in via the frontend (http://localhost:5173)
- Create an organization
- Create a workspace
- Open Workspace Settings and click Add Connection
- In the 3-step wizard:
  - Enter SAP system details (host, system number, client, optional router string)
  - Copy/run the generated Docker command for the connector
  - Verify connector health, test the connection, and save
- (Optional) In SAP Credentials, save your personal SAP username/password for each connection
- Create an agent in the workspace and start chatting
The agent automatically gets access to SAP tools for configured connections. When two or more connections are configured, cross-system tools (field mapping, ETL) become available.
## Quick reference

| Command | What it does |
|---|---|
| `npm run dev` | Start Cloudflare Worker (port 8787) |
| `npm run dev:frontend` | Start Vite frontend (port 5173) |
| `npm run build` | Build frontend + dry-run deploy |
| `npm run deploy` | Build + deploy to Cloudflare Workers |
| `npm test` | Run tests |
| `npm run typecheck` | Type-check without emitting |
| `npm run lint` | Run ESLint |