DSL Import API
The DSL Import API lets you programmatically create complete agents, tools, knowledge bases, and data lake structures from a single JSON definition. Instead of manually creating each resource through the UI, you define everything in one DSL file and import it in a single API call.
Base URL
POST /api/dsl/import
For self-hosted installations: https://kaman.in/api/dsl/import
Authentication
The endpoint supports two authentication methods:
| Method | Header | Use Case |
|---|---|---|
| Cookie auth | Cookie: kaman_access_token=... | Browser / logged-in user |
| Service headers | x-client-id + x-user-id | Service-to-service calls |
For KDL (data lake) creation and file uploads, also include a Bearer token:
Authorization: Bearer <your_token>
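A small helper can assemble these headers for service-to-service calls. This is a sketch based on the header names in the table above; the function name and shape are illustrative, not part of the platform SDK:

```typescript
// Sketch: assemble service-to-service headers for the import endpoint.
// The Bearer token is optional and only needed for KDL creation and
// file uploads, per the note above.
function buildAuthHeaders(
  orgId: number,
  userId: number,
  bearerToken?: string,
): Record<string, string> {
  const headers: Record<string, string> = {
    'x-client-id': String(orgId),
    'x-user-id': String(userId),
  };
  if (bearerToken) {
    headers['Authorization'] = `Bearer ${bearerToken}`;
  }
  return headers;
}
```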
Content Types
The endpoint accepts two content types:
| Content-Type | Use Case |
|---|---|
| `application/json` | DSL JSON body only (no file uploads) |
| `multipart/form-data` | DSL JSON + file attachments for KB documents |
Multipart Upload
To upload files alongside the DSL, send multipart/form-data with:
- A `dsl` field containing the JSON definition
- One or more file fields with the actual files
KB documents reference attachments by filename:
{
"knowledgeBase": {
"name": "My KB",
"documents": [
{ "type": "file", "name": "User Guide", "attachment": "guide.pdf" },
{ "type": "file", "name": "Remote Doc", "fileUrl": "https://example.com/doc.pdf" }
]
}
}
curl -X POST "https://kaman.in/api/dsl/import" \
-H "Authorization: Bearer YOUR_TOKEN" \
-H "x-client-id: 1" \
-H "x-user-id: 1" \
-F "dsl=<my-agent.json" \
-F "files=@guide.pdf;filename=guide.pdf"
The importer uploads attached files to storage and resolves URLs automatically. For fileUrl documents, kaman-agent downloads and processes the file on its end.
Quick Start
1. Create a DSL file
{
"tools": [
{
"name": "My Tool",
"description": "Does something useful",
"functions": [
{
"name": "doSomething",
"description": "Performs the operation and returns the result",
"inputFields": {
"query": {
"type": "string",
"description": "What to look up",
"required": true
}
},
"outputFields": {
"result": { "type": "string", "description": "The result" }
},
"callCode": "return { result: 'Hello ' + data.query };"
}
]
}
],
"agent": {
"name": "My Agent",
"description": "A helpful assistant with custom tools",
"options": [
{
"systemPrompt": "You are a helpful assistant. Use the doSomething tool when asked.",
"model": "gpt-4",
"tools": [{ "ref": "My Tool" }],
"faqs": ["What can you do?"]
}
]
}
}
2. Import it
curl -X POST "https://kaman.in/api/dsl/import" \
-H "Content-Type: application/json" \
-H "x-client-id: YOUR_ORG_ID" \
-H "x-user-id: YOUR_USER_ID" \
-d @my-agent.json
3. Check the response
{
"imported": 2,
"updated": 0,
"skipped": 0,
"errors": 0,
"details": [
{ "name": "My Tool", "type": "tool", "status": "imported", "pluginId": 142 },
{ "name": "My Agent", "type": "agent", "status": "imported", "pluginId": 143 }
]
}
Your agent and tool are now live and accessible in the platform.
DSL Structure
A DSL file is a JSON object with up to four top-level blocks. At least one must be present:
| Block | Purpose | Creates |
|---|---|---|
| `kdl` | Data lake structure | Lakes, schemas, tables, seed data (via KDL API) |
| `tools` | Tool/function definitions | Plugin records (`type=tool`) with functions |
| `knowledgeBase` | RAG knowledge base | Plugin record (`type=kb`) with documents |
| `agent` | Agent configuration | Plugin record (`type=app`) with full config |
Resources are imported in dependency order: KDL -> Tools -> Knowledge Base -> Agent. The agent can reference tools and KBs defined in the same file using ref.
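The top-level shape can be modeled roughly as follows. This interface is illustrative, derived from the table above; the real schema accepts more fields per block:

```typescript
// Illustrative top-level shape of a DSL document. Only the block names
// and the "at least one block" rule from this section are captured.
interface DslDocument {
  kdl?: object;            // data lake structure
  tools?: object[];        // tool/function definitions
  knowledgeBase?: object;  // RAG knowledge base
  agent?: object;          // agent configuration
}

// A document is importable only if at least one block is present.
function hasAtLeastOneBlock(dsl: DslDocument): boolean {
  return Boolean(
    dsl.kdl || (dsl.tools && dsl.tools.length) || dsl.knowledgeBase || dsl.agent,
  );
}
```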
Tools Block
An array of tool definitions. Each tool becomes a plugin with one or more callable functions.
{
"tools": [
{
"name": "Weather Lookup",
"description": "Looks up current weather for any city",
"version": "1.0.0",
"category": "Utilities",
"functions": [
{
"name": "getWeather",
"description": "Gets current weather for a city. Returns temperature, conditions, humidity.",
"inputFields": {
"city": {
"type": "string",
"description": "City name (e.g., 'London', 'Tokyo')",
"required": true
},
"units": {
"type": "string",
"description": "Temperature units: 'celsius' or 'fahrenheit'",
"required": false,
"default": "celsius"
}
},
"outputFields": {
"temperature": { "type": "number", "description": "Current temperature" },
"conditions": { "type": "string", "description": "Weather conditions" }
},
"callCode": "const resp = await axios.get(`https://api.weather.example/${data.city}`); return resp.data;",
"sync": true
}
]
}
]
}
Function Definition
| Field | Type | Required | Description |
|---|---|---|---|
| `name` | string | Yes | Function name in camelCase (e.g., `getWeather`, `sendEmail`) |
| `description` | string | Yes | What it does and returns. The LLM reads this to decide when to use the tool. |
| `callCode` | string | One of | Inline JavaScript. Available vars: `data`, `axios`, `_` (lodash), `crypto`, `f`, `config` |
| `url` | string | One of | HTTP endpoint to call instead of `callCode` |
| `inputFields` | object | No | Parameters the LLM provides at runtime (become the Zod schema) |
| `userFields` | object | No | Credentials/config the user stores once (NOT visible to LLM) |
| `outputFields` | object | No | What the function returns (documentation only) |
| `sync` | boolean | No | When true, results are cached for identical inputs |
Field Types
Fields in inputFields, userFields, and outputFields use the same format:
{
"<fieldName>": {
"type": "string | number | boolean | object | array | date | file | binary",
"description": "Human-readable description",
"required": true,
"default": "optional default value",
"innerFields": { }
}
}
Important: `inputFields` are what the LLM fills in at runtime. `userFields` are credentials or configuration that the user sets up once (API keys, passwords, tokens); these are never exposed to the LLM.
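The function-definition table notes that `inputFields` become a Zod schema at runtime. The dependency-free sketch below mimics that check for the primitive types; the platform's actual validation is richer (it also handles `innerFields`, nested objects, and so on), so treat this as an approximation:

```typescript
// Field spec format from the Field Types section above.
type FieldSpec = {
  type: string;
  description?: string;
  required?: boolean;
  default?: unknown;
};

// Validate runtime data against a set of inputFields; returns a list of
// error messages (empty if valid). Only primitive types are checked here.
function validateInputs(
  fields: Record<string, FieldSpec>,
  data: Record<string, unknown>,
): string[] {
  const errors: string[] = [];
  for (const [name, spec] of Object.entries(fields)) {
    let value = data[name];
    if (value === undefined && spec.default !== undefined) value = spec.default;
    if (value === undefined) {
      if (spec.required) errors.push(`${name}: required`);
      continue;
    }
    const ok =
      (spec.type === 'string' && typeof value === 'string') ||
      (spec.type === 'number' && typeof value === 'number') ||
      (spec.type === 'boolean' && typeof value === 'boolean') ||
      !['string', 'number', 'boolean'].includes(spec.type);
    if (!ok) errors.push(`${name}: expected ${spec.type}`);
  }
  return errors;
}
```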
Credentials & User Fields
Tools that need API keys or authentication should use userFields, not inputFields:
{
"functions": [
{
"name": "searchCRM",
"description": "Searches the CRM for contacts",
"inputFields": {
"query": { "type": "string", "description": "Search query", "required": true }
},
"userFields": {
"apiKey": {
"type": "string",
"description": "Your CRM API key",
"required": true,
"config": { "type": "apikey" }
}
},
"callCode": "const resp = await axios.get('https://crm.example/api/search', { params: { q: data.query }, headers: { 'X-API-Key': data.apiKey } }); return resp.data;"
}
]
}
Triggers
Tools can also define event triggers:
{
"triggers": [
{
"name": "onNewLead",
"description": "Fires when a new lead is created",
"outputFields": {
"leadId": { "type": "string", "description": "New lead ID" }
},
"registrationCallCode": "return axios.post('https://crm.example/webhooks', { url: f.url });",
"checkCallCode": "return axios.get('https://crm.example/webhooks/status');",
"checkInterval": 60000
}
]
}
Knowledge Base Block
A single knowledge base definition with documents for RAG (retrieval-augmented generation).
{
"knowledgeBase": {
"name": "Product Documentation",
"description": "Product guides and FAQ content",
"documents": [
{
"type": "file",
"name": "User Guide",
"fileUrl": "https://storage.example.com/docs/user-guide.pdf",
"mimeType": "application/pdf"
},
{
"type": "text",
"name": "FAQ Content",
"content": "# Frequently Asked Questions\n\n## How do I reset my password?\nGo to Settings > Security > Reset Password...",
"metadata": { "category": "faq", "last_updated": "2026-03-01" }
}
]
}
}
Document Types
| Type | Required Fields | Description |
|---|---|---|
| `file` | `fileId`, `fileUrl`, or `attachment` | Reference to a file. Automatically parsed and embedded. |
| `text` | `content` | Inline text content. Note: text documents are stored but may require file upload for full embedding support. |
File Document Sources
There are three ways to provide files for KB documents:
1. Direct attachment (multipart upload) — best for local files:
{ "type": "file", "name": "Guide", "attachment": "guide.pdf" }
Include the actual file in the multipart form data. The importer uploads it to storage and resolves the fileId automatically.
2. URL reference — best for files hosted online:
{ "type": "file", "name": "Guide", "fileUrl": "https://example.com/guide.pdf" }
The platform downloads the file and processes it during KB creation.
3. Pre-existing fileId — for files already in platform storage:
{ "type": "file", "name": "Guide", "fileId": "s3://bucket/org/hash.pdf" }
Agent Block
The complete agent configuration including system prompt, model, tool bindings, and settings.
{
"agent": {
"name": "Sales Assistant",
"description": "Helps sales teams with CRM lookups and email drafting",
"slug": "sales-assistant",
"category": "Sales",
"version": "1.0.0",
"options": [
{
"name": "default",
"systemPrompt": "You are a sales assistant. Help users look up contacts and draft emails.",
"model": "gpt-4",
"tools": [
{ "ref": "CRM Tool" },
{ "ref": "Email Tool" }
],
"knowledgeBases": [
{ "ref": "Sales Playbook" }
],
"datasources": [
{ "ref": "sales_lake.crm.deals" }
],
"faqs": [
"Look up contact info for Acme Corp",
"Draft a follow-up email",
"Show me this quarter's deals"
],
"settings": {
"maxIterations": 10,
"toolCallTimeout": 15000,
"toolCallRetries": 2,
"llmThinkingMode": "quick"
}
}
],
"pricing": {
"model": "free"
}
}
}
Agent Options
Each agent defines at least one options entry (a persona); most agents define exactly one.
| Field | Type | Required | Description |
|---|---|---|---|
| `systemPrompt` | string | Yes | The main instruction prompt for the LLM |
| `model` | string | No | LLM model ID (default: `gpt-4`). Examples: `gpt-4`, `claude-sonnet-4-5-20250514` |
| `tools` | array | No | Tool references (see below) |
| `knowledgeBases` | array | No | KB references |
| `datasources` | array | No | KDL data source references |
| `faqs` | string[] | No | Quick-prompt suggestions shown to users |
| `settings` | object | No | Execution settings (timeouts, retries, etc.) |
Tool References
Tools in agent options can reference resources three ways:
{ "ref": "My Tool" }
References a tool defined in the same DSL file by name.
{ "id": 42 }
References an existing tool in the platform by ID.
{ "name": "existingToolFunction" }
References an existing function by name.
Datasource References
{ "ref": "my_lake.analytics.deals" }
References a KDL table defined in the same DSL file (lakeName.schema.table format).
{ "lakeName": "my_lake", "schemaName": "analytics", "tableName": "deals" }
Explicit KDL coordinates for an existing data source.
Knowledge Base References
{ "ref": "My KB Name" }
References a knowledge base defined in the same DSL file.
{ "id": "123" }
References an existing knowledge base by ID.
KDL Block
Creates data lake structures: lakes, schemas, tables with column definitions, and optional seed data.
{
"kdl": {
"lakeName": "analytics_lake",
"description": "Company analytics data",
"schemas": [
{
"name": "sales",
"tables": [
{
"name": "deals",
"description": "Sales deal pipeline",
"columns": {
"id": { "type": "INTEGER", "primaryKey": true, "autoIncrement": true },
"company": { "type": "VARCHAR(255)" },
"amount": { "type": "DOUBLE" },
"stage": { "type": "VARCHAR(50)" },
"created_at": { "type": "TIMESTAMP", "default": "CURRENT_TIMESTAMP" }
},
"seedData": {
"rows": [
{ "company": "Acme Corp", "amount": 50000, "stage": "negotiation" },
{ "company": "Globex Inc", "amount": 120000, "stage": "closed_won" }
]
}
}
]
}
]
}
}
Column Types
Supported SQL types: VARCHAR, INTEGER, BIGINT, DOUBLE, FLOAT, BOOLEAN, TIMESTAMP, DATE, TEXT, JSON, JSONB, BLOB, BINARY. Size suffixes are supported (e.g., VARCHAR(255)).
Column Properties
| Property | Type | Default | Description |
|---|---|---|---|
| `type` | string | required | SQL column type |
| `primaryKey` | boolean | false | Primary key column |
| `nullable` | boolean | true | Allow NULL values |
| `unique` | boolean | false | Unique constraint |
| `autoIncrement` | boolean | false | Auto-incrementing value |
| `default` | any | - | Default value expression |
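To make the column properties concrete, here is an illustrative translation of a column map into a CREATE TABLE statement. The SQL the KDL importer actually generates may differ (dialect, quoting, constraint syntax), so this is a sketch of the semantics only:

```typescript
// Column definition format from the table above.
type ColumnDef = {
  type: string;
  primaryKey?: boolean;
  nullable?: boolean;
  unique?: boolean;
  autoIncrement?: boolean;
  default?: string | number;
};

// Render one illustrative DDL statement from a DSL column map.
function toCreateTable(table: string, columns: Record<string, ColumnDef>): string {
  const parts = Object.entries(columns).map(([name, c]) => {
    let sql = `${name} ${c.type}`;
    if (c.primaryKey) sql += ' PRIMARY KEY';
    if (c.autoIncrement) sql += ' AUTOINCREMENT';
    if (c.nullable === false) sql += ' NOT NULL';
    if (c.unique) sql += ' UNIQUE';
    if (c.default !== undefined) sql += ` DEFAULT ${c.default}`;
    return sql;
  });
  return `CREATE TABLE ${table} (${parts.join(', ')})`;
}
```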
Response Format
Success (200)
{
"imported": 3,
"updated": 0,
"skipped": 0,
"errors": 0,
"details": [
{ "name": "Weather Lookup", "type": "tool", "status": "imported", "pluginId": 142 },
{ "name": "Weather Tips KB", "type": "kb", "status": "imported", "pluginId": 143 },
{ "name": "Weather Buddy", "type": "agent", "status": "imported", "pluginId": 144 }
]
}
Validation Error (400)
{
"error": "DSL validation failed",
"validationErrors": [
{ "path": "tools/0/functions/0", "message": "Must have property \"callCode\" or \"url\"" },
{ "path": "agent/options/0/systemPrompt", "message": "Must be a non-empty string" }
]
}
Partial Failure (207)
When some resources fail but others succeed:
{
"imported": 1,
"updated": 0,
"skipped": 0,
"errors": 1,
"details": [
{ "name": "My Tool", "type": "tool", "status": "imported", "pluginId": 142 },
{ "name": "My Agent", "type": "agent", "status": "error", "error": "..." }
]
}
Status Values
| Status | Description |
|---|---|
| `imported` | New resource created successfully |
| `updated` | Existing resource updated (same name + type + org) |
| `skipped` | Resource already exists and didn't need changes (KDL only) |
| `error` | Resource failed to import |
Idempotency
The import is idempotent. Running the same DSL file twice will update existing resources on the second run instead of creating duplicates. Resources are matched by name + type within your organization — different organizations can have resources with the same name without conflict.
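The matching rule can be summarized as a composite key. The function below is only a sketch of the rule described above (the platform's internal key format is not documented here):

```typescript
// Resources are matched by (organization, type, name), so re-importing the
// same DSL updates rather than duplicates, and identical names in different
// organizations never collide.
function resourceKey(name: string, type: string, orgId: number): string {
  return `${orgId}:${type}:${name}`;
}
```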
Code Examples
TypeScript
const BASE_URL = 'https://kaman.in';
async function importDsl(dsl: object, orgId: number, userId: number) {
const response = await fetch(`${BASE_URL}/api/dsl/import`, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'x-client-id': String(orgId),
'x-user-id': String(userId),
},
body: JSON.stringify(dsl),
});
return response.json();
}
// Import from a file
import { readFile } from 'node:fs/promises';
const dsl = JSON.parse(await readFile('./my-agent.json', 'utf8'));
const result = await importDsl(dsl, 1, 1);
console.log(`Imported: ${result.imported}, Updated: ${result.updated}, Errors: ${result.errors}`);
Python
import requests
import json
BASE_URL = "https://kaman.in"
def import_dsl(dsl: dict, org_id: int, user_id: int) -> dict:
response = requests.post(
f"{BASE_URL}/api/dsl/import",
json=dsl,
headers={
"Content-Type": "application/json",
"x-client-id": str(org_id),
"x-user-id": str(user_id),
},
)
return response.json()
# Load and import
with open("my-agent.json") as f:
dsl = json.load(f)
result = import_dsl(dsl, org_id=1, user_id=1)
print(f"Imported: {result['imported']}, Updated: {result['updated']}, Errors: {result['errors']}")
cURL
# Import from a file
curl -X POST "https://kaman.in/api/dsl/import" \
-H "Content-Type: application/json" \
-H "x-client-id: 1" \
-H "x-user-id: 1" \
-d @my-agent.json
# Import inline JSON
curl -X POST "https://kaman.in/api/dsl/import" \
-H "Content-Type: application/json" \
-H "x-client-id: 1" \
-H "x-user-id: 1" \
-d '{
"agent": {
"name": "Quick Bot",
"options": [{
"systemPrompt": "You are a helpful assistant.",
"model": "gpt-4"
}]
}
}'
Error Handling
| Status | Error | Description |
|---|---|---|
| 400 | Validation error | DSL JSON doesn't match the schema |
| 401 | Unauthorized | Missing authentication (no cookie or x-client-id) |
| 207 | Partial failure | Some resources imported, some failed |
| 500 | Server error | Unexpected internal error |
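Callers should branch on these status codes, paying particular attention to 207, where the body still contains per-resource details. The sketch below is a pure function over `(status, body)` so it works with any HTTP client; the types mirror the response format documented above:

```typescript
// Minimal shape of the import response body documented above.
interface ImportResult {
  imported: number;
  updated: number;
  errors: number;
  details?: { name: string; status: string; error?: string }[];
}

// Turn a status code + body into a human-readable summary.
function summarizeImport(status: number, body: ImportResult): string {
  if (status === 400) return 'validation failed';
  if (status === 401) return 'unauthorized';
  if (status === 500) return 'server error';
  if (status === 207) {
    // Partial failure: surface the names of the resources that errored.
    const failed = (body.details ?? [])
      .filter((d) => d.status === 'error')
      .map((d) => d.name);
    return `partial failure: ${failed.join(', ')}`;
  }
  return `ok: ${body.imported} imported, ${body.updated} updated`;
}
```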
Best Practices
- Start small: Import just a tool or just an agent first, then add more blocks
- Use `ref` for cross-references: When your DSL has both tools and an agent, use `{ "ref": "Tool Name" }` to link them
- Test idempotency: Run your import twice to verify updates work correctly
- Include descriptions: Good descriptions on tools and functions help the LLM use them effectively
- Use `userFields` for secrets: Never put API keys in `inputFields`; use `userFields` instead so the LLM doesn't see credentials
Next Steps
- Tools API - Search and execute individual tools
- Knowledge Base API - Manage knowledge bases and RAG
- KDL API - Query your data lake directly
- Authentication - Learn about API authentication methods