# A unified harness substrate for agentic frameworks
Trace, budget, sandbox, auth, and HITL — for any orchestrator. Entorin slides under your agent loop and supplies the harness layer everyone otherwise reinvents.
## What you get under your loop
### One OTel trace per run
Every node, LLM call, tool call, sandbox exec, checkpoint, and handoff is a span — carrying run_id, principal_id, tokens, and cost.
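As a sketch of that shape (the names here are illustrative, not Entorin's actual API), every harness event can be modeled as a span that carries the run's identity and cost fields, so all spans for one run roll up into one trace:

```python
from dataclasses import dataclass, field
from typing import Any

@dataclass
class Span:
    """One harness event: node, LLM call, tool call, sandbox exec, checkpoint, handoff."""
    name: str
    run_id: str
    principal_id: str
    tokens: int = 0
    cost_usd: float = 0.0
    attributes: dict[str, Any] = field(default_factory=dict)

@dataclass
class RunTrace:
    """All spans for a run share one run_id: one trace per run."""
    run_id: str
    principal_id: str
    spans: list[Span] = field(default_factory=list)

    def emit(self, name: str, tokens: int = 0, cost_usd: float = 0.0, **attrs: Any) -> Span:
        # Identity fields are stamped on by the trace, not by the caller.
        span = Span(name, self.run_id, self.principal_id, tokens, cost_usd, attrs)
        self.spans.append(span)
        return span

trace = RunTrace(run_id="run-1", principal_id="tenant-a")
trace.emit("llm_call", tokens=812, cost_usd=0.004, model="claude")
trace.emit("tool_call", tool="fs.read")
total_tokens = sum(s.tokens for s in trace.spans)  # 812
```

In the real substrate these would be OTel spans; the point is only that token and cost attribution rides on every event, keyed by `run_id` and `principal_id`.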
### One budget ledger
A single ledger across heterogeneous orchestrators. BudgetGate.check runs before any wire activity; cap-exceeded runs terminate before the network call.
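A minimal sketch of the gate-before-wire pattern (hypothetical names; the real `BudgetGate` API may differ). The check happens before the request is sent, so a cap-exceeded run raises instead of spending:

```python
class BudgetExceeded(RuntimeError):
    pass

class BudgetGate:
    """One ledger shared across orchestrators: check() before any wire
    activity, record() after the call settles."""
    def __init__(self, cap_usd: float) -> None:
        self.cap_usd = cap_usd
        self.spent_usd = 0.0

    def check(self, estimated_usd: float) -> None:
        # Raises BEFORE the network call, so the cap can never be blown.
        if self.spent_usd + estimated_usd > self.cap_usd:
            raise BudgetExceeded(
                f"cap {self.cap_usd:.2f} USD would be exceeded "
                f"(spent {self.spent_usd:.2f}, next {estimated_usd:.2f})"
            )

    def record(self, actual_usd: float) -> None:
        self.spent_usd += actual_usd

gate = BudgetGate(cap_usd=1.00)
gate.check(0.40); gate.record(0.40)   # first call: fine
gate.check(0.40); gate.record(0.40)   # second call: fine
try:
    gate.check(0.40)                  # would push past the cap
    terminated = False
except BudgetExceeded:
    terminated = True                 # run terminates, no network call made
```

Because every orchestrator's adapter calls the same gate, heterogeneous frameworks draw down one shared ledger.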
### BYOK / multi-tenant
API keys flow as Capability, never read from os.environ. Per-principal scopes for memory, tools, retrieval, checkpoints.
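A sketch of the capability flow, assuming hypothetical names: the key travels as an explicit, scoped value handed across the substrate boundary, and nothing inside ever reads `os.environ`:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Capability:
    """A key plus per-principal scopes, passed as a value (never read
    from os.environ inside the substrate)."""
    principal_id: str
    api_key: str
    scopes: frozenset = field(default_factory=frozenset)

    def allows(self, scope: str) -> bool:
        return scope in self.scopes

def call_tool(cap: Capability, scope: str) -> str:
    # Hypothetical boundary check: scope enforcement before any work.
    if not cap.allows(scope):
        raise PermissionError(f"{cap.principal_id} lacks scope {scope!r}")
    return f"ok:{scope}"

cap = Capability("tenant-a", "sk-...", frozenset({"tools", "memory"}))
result = call_tool(cap, "tools")
try:
    call_tool(cap, "checkpoints")     # tenant-a was never granted this
    denied = False
except PermissionError:
    denied = True
```

Per-principal scopes for memory, tools, retrieval, and checkpoints all reduce to this one check at the boundary.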
## How Entorin differs from agentic frameworks
Frameworks build DAGs. Entorin builds the harness under them. It plugs in; it doesn't replace.
| | Agentic framework | Entorin |
|---|---|---|
| Workflow shape | DAG / state graph you author | Whatever you already use — bare loop, SDK, framework |
| Observability | Per-framework, often partial | One OTel trace per run — every span, every adapter |
| Budget | Reinvented per framework, if at all | One ledger + gate across heterogeneous orchestrators |
| Auth / keys | os.environ reads scattered across the SDK | Capability flow at substrate boundary; never reads env |
| Tool & agent protocols | Bespoke abstractions | Reuses MCP (tools) and ACP (agent-as-tool) |
| What you write | Subclass nodes, register tools, build a graph | Plain Python; substrate emits without your loop knowing |
## What Entorin does not do
Scope discipline is the substrate philosophy. If you find yourself wanting one of these from Entorin, that is a sign the wrong tool is on your shortlist.
- **No DAG / workflow builder.** That's LangGraph, CrewAI, et al.
- **No prompt templating.** That's your code.
- **No vector DB / retrieval store.** Retrieval ships as a protocol; bring your own backend.
- **No deployment infra.** No Docker, K8s, queues, or load balancers.
- **No eval suites.** Traces are the regression substrate; you bring the assertions.
## Quick start
Install the package, mint a RunContext, and drop into a bare loop. The full harness emits because the substrate primitives emit it themselves — your loop doesn't need to know.
```shell
# install
uv add entorin
```

```python
# run (anthropic_model, ctx, bus, and fs_client are set up elsewhere)
from adapters.bare_loop import run_qa

answer = await run_qa(
    model=anthropic_model,
    ctx=ctx,
    bus=bus,
    question="how many roads must a man walk down?",
    tool_clients={"fs": fs_client},
    max_turns=6,
)
```

## Pick your path
- Httpx transport + wrapped MCP sessions + per-run sandbox.
- Same shape, OpenAI wire format. Chat Completions + Responses.
- Node-wrapper + HITL driver. Substrate inside; LangGraph outside.
- No SDK at all. 50 lines of Python inherit the full harness.