Python Library.
Deploy anywhere with our Open-Source Library
The same library that powers the Kiln app can be used to deploy your project on any cloud or server, under an MIT open-source license.
Your Kiln project, from Python
Load a Kiln project, run a task, and access results—all with the same SDK that powers the desktop app. No migration, no rewrites.
from kiln_ai.datamodel import Project

# Load an existing Kiln project file (created in the app or via the library)
project = Project.load_from_file("./my_project/project.kiln")

# Pick a task and run it with the same prompts and models as the desktop app
task = project.tasks()[0]
result = await task.run(
    input='{"query": "Summarize the latest report"}',
)

From the app to production in 3 steps
1. Design tasks, refine prompts, and evaluate models using the Kiln desktop app.
2. pip install kiln_ai: the same engine as the app, now in your Python environment.
3. Load your Kiln project and run tasks programmatically in notebooks, servers, or CI pipelines (see the sketch below).
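For instance, step 3 as a standalone script you could run from a server or CI job might look like this. It is a sketch that reuses only the calls shown above; how you handle the result depends on the kiln_ai data model.

import asyncio
from kiln_ai.datamodel import Project

async def main():
    # Load the shared .kiln project file and pick a task to run.
    project = Project.load_from_file("./my_project/project.kiln")
    task = project.tasks()[0]

    # Run the task with the same prompts and models configured in the app.
    result = await task.run(
        input='{"query": "Summarize the latest report"}',
    )
    print(result)

if __name__ == "__main__":
    asyncio.run(main())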
How it works
Shared project files
The library reads and writes the same .kiln files as the desktop app. Tweak a prompt in the UI and your server/notebook sees it instantly.
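As a rough sketch, picking up a UI edit is just a matter of re-loading the shared file; the task attribute printed below is an assumption, not confirmed API.

from kiln_ai.datamodel import Project

# The .kiln file on disk is the single source of truth shared with the desktop app.
# Re-load it after editing in the UI and your code sees the updated project.
project = Project.load_from_file("./my_project/project.kiln")
for task in project.tasks():
    print(task.name)  # `name` is an assumed field on the task model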
MIT open-source
The Kiln Python library is MIT open-source. Zero lock-in.
Export your project for prod
Export a minimal representation of your Kiln project for production: your prompts, models, agents — without the dataset. Includes everything needed to deploy a Kiln RAG search tool.
- prompts
- models
- agents
- RAG search
- dataset (not included in the export)
Connect existing tools
Connect external tools and services as MCP tools.
Everything in the app, programmable
Execute any Kiln task with the same prompts and models as the app.
Iterate over task runs, ratings, and eval results programmatically.
Load Kiln datasets directly into DataFrames for analysis (see the sketch after this list).
Pydantic-validated data model with iterators for the full project hierarchy.
Run evals from code—same LLM-as-Judge and G-Eval scoring as the UI.
No analytics collected.
Designed for Jupyter and other notebook environments.
Open source, local-first, works with any AI provider.
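For example, task runs could be pulled into pandas for analysis. This is a sketch: `task.runs()` and the run fields below are assumptions about the data model, not confirmed API.

import pandas as pd
from kiln_ai.datamodel import Project

project = Project.load_from_file("./my_project/project.kiln")
task = project.tasks()[0]

# Collect task runs into a DataFrame. `runs()`, `input`, and `output` are
# assumed accessors/fields; adjust them to the actual kiln_ai schema.
rows = [{"input": run.input, "output": run.output} for run in task.runs()]
df = pd.DataFrame(rows)
print(df.head())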
AI development before and after the Kiln Python library
Before:
- Prototype in a UI, then rewrite everything in code for production—two separate codebases doing the same thing.
- Export datasets manually every time someone updates a prompt or adds a training example.
- Install a proprietary SDK that locks you into one platform's API and billing.

After:
- Build in the desktop app, ship from the library—same engine, same project files, no rewrite.
- Changes in the app are instantly visible in code, and vice versa—one source of truth.
- Works with any AI provider—swap models without changing your integration.
Frequently asked
Is the Python library the same code as the desktop app?
Yes. The desktop app is built on the kiln_ai library. pip install kiln_ai gives you the same engine — not a wrapper or client SDK.
Do I need the desktop app to use the library?
No. The library works standalone for creating projects, defining tasks, and running them. Most teams use both: app for building, library for shipping.
Can I use any AI provider?
Yes. OpenAI, Anthropic, Google Gemini, Amazon Bedrock, Ollama, and any OpenAI-compatible endpoint. Switch providers by changing a config value.
How do I deploy to production?
Point the library at your project folder and run tasks in your server. The companion REST API (pip install kiln_server) exposes the same capabilities over FastAPI.
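If you embed the library in your own server instead of using kiln_server, a minimal FastAPI endpoint might look like the sketch below. It reuses only the calls from the example above; kiln_server's actual routes will differ.

import json

from fastapi import FastAPI
from kiln_ai.datamodel import Project

app = FastAPI()

# Load the Kiln project once at startup; tasks are served from the shared .kiln file.
project = Project.load_from_file("./my_project/project.kiln")
task = project.tasks()[0]

@app.post("/run")
async def run_task(query: str):
    # Run the Kiln task with the same prompts and models configured in the app.
    result = await task.run(input=json.dumps({"query": query}))
    return {"result": str(result)}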
Build in the app. Ship from Python.
One pip install gives you the same engine that powers the Kiln desktop app—datasets, tasks, evals, and production deployment.