63 changes: 0 additions & 63 deletions .eslintrc.json

This file was deleted.

52 changes: 40 additions & 12 deletions README.md
@@ -11,38 +11,66 @@
<a href="https://github.com/constructive-io/agentic-kit/blob/main/LICENSE"><img height="20" src="https://img.shields.io/badge/license-MIT-blue.svg"/></a>
</p>

A unified, streaming-capable interface for multiple LLM providers.
A provider-portable LLM toolkit with structured streaming, model registries,
cross-provider message normalization, and an optional stateful agent runtime.

## Packages

- **agentic-kit** — core library with provider abstraction and `AgentKit` manager
- **agentic-kit** — low-level portability layer with model descriptors, registries, structured event streams, and compatibility helpers
- **@agentic-kit/agent** — minimal stateful runtime with sequential tool execution and lifecycle events
- **@agentic-kit/ollama** — adapter for local Ollama inference
- **@agentic-kit/anthropic** — adapter for Anthropic Claude models
- **@agentic-kit/openai** — adapter for OpenAI and OpenAI-compatible APIs
- **@agentic-kit/openai** — generalized adapter for OpenAI-compatible chat completion APIs

## Getting Started

```bash
git clone git@github.com:constructive-io/agentic-kit.git
cd agentic-kit
yarn install
yarn build
yarn test
pnpm install
pnpm build
pnpm test
```

## Usage

```typescript
import { createOllamaKit, createMultiProviderKit, OllamaAdapter } from 'agentic-kit';
import { complete, getModel } from "agentic-kit";

const kit = createOllamaKit('http://localhost:11434');
const text = await kit.generate({ model: 'mistral', prompt: 'Hello' });
const model = getModel("openai", "gpt-4o-mini");
const message = await complete(model!, {
  messages: [{ role: "user", content: "Hello", timestamp: Date.now() }],
});

// Multi-provider
const multi = createMultiProviderKit();
multi.addProvider(new OllamaAdapter('http://localhost:11434'));
console.log(message.content);
```

## Contributing

See individual package READMEs for docs and local dev instructions.

## Testing

Default tests stay deterministic and local:

```bash
pnpm test
```

There is also a local-only Ollama live lane that never contacts hosted
providers. Its default command runs the fast smoke tier:

```bash
OLLAMA_LIVE_MODEL=qwen3.5:4b pnpm test:live:ollama
```

Run the broader lane explicitly when you want slower behavioral coverage:

```bash
OLLAMA_LIVE_MODEL=qwen3.5:4b pnpm test:live:ollama:extended
```

The Ollama live script performs a preflight against `OLLAMA_BASE_URL` and exits
cleanly if the local server or requested model is unavailable. If
`nomic-embed-text:latest` is installed, the lane also exercises local embedding
generation.
76 changes: 76 additions & 0 deletions REDESIGN_DECISIONS.md
@@ -0,0 +1,76 @@
# Agentic Kit Redesign Decisions

Date: 2026-04-18

This document records the redesign decisions made while evaluating `agentic-kit`
against the comparable `pi-mono` architecture, especially `packages/ai` and
`packages/agent`.

## Scope and Package Boundaries

1. `agentic-kit` remains the low-level provider portability layer.
2. Stateful orchestration moves into a separate `@agentic-kit/agent` package.
3. Tool execution stays out of `agentic-kit` core; the core only models tools,
tool calls, and tool results.
4. `@agentic-kit/agent` v1 should be intentionally minimal, shipping only the
sequential tool loop, lifecycle events, abort/continue, and pluggable context
transforms. Steering/follow-up queues and richer interruption policies are
deferred to phase 2.
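
The sequential tool loop in decision 4 can be sketched as follows. The event
and tool shapes below are illustrative assumptions, not the shipped
`@agentic-kit/agent` API:

```typescript
// Illustrative sketch of the sequential tool loop with lifecycle events
// and abort support (decision 4). All names here are assumptions.
type ToolCall = { id: string; name: string; args: unknown };
type ToolHandler = (args: unknown) => Promise<unknown>;

type LifecycleEvent =
  | { type: "tool_start"; call: ToolCall }
  | { type: "tool_end"; call: ToolCall; result: unknown };

async function runToolsSequentially(
  calls: ToolCall[],
  tools: Record<string, ToolHandler>,
  emit: (ev: LifecycleEvent) => void,
  signal?: AbortSignal,
): Promise<unknown[]> {
  const results: unknown[] = [];
  for (const call of calls) {
    // Abort leaves a partial result set rather than throwing (see
    // the abort-driven partial-result semantics in decision 10).
    if (signal?.aborted) break;
    emit({ type: "tool_start", call });
    const result = await tools[call.name](call.args);
    emit({ type: "tool_end", call, result });
    results.push(result);
  }
  return results;
}
```

Tools run strictly one at a time, which keeps v1 semantics simple; parallel
execution is one of the policies deferred to phase 2.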

## Core Type System

5. Core tool definitions use plain JSON Schema.
6. TypeBox/Zod support stays as helper adapters, not the core contract.
7. Core models are represented by a provider-independent `ModelDescriptor`
registry with capability metadata.
8. The model registry must support both built-in descriptors and runtime
registration of custom models/providers from day one.
9. The core message model treats `image` input and `thinking` output as
first-class content blocks in v1.
10. `usage`, `cost`, `stopReason`, and abort-driven partial-result semantics are
mandatory parts of the core contract in v1.
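
Decisions 7–9 can be illustrated with a minimal registry sketch.
`ModelDescriptor`, `registerModel`, and the capability field names below are
assumed for illustration, not the shipped API:

```typescript
// Illustrative provider-independent model registry (decisions 7-8),
// with image input and thinking output surfaced as capability metadata
// (decision 9). Names and fields are assumptions.
interface ModelDescriptor {
  provider: string;
  id: string;
  capabilities: {
    streaming: boolean;
    toolCalls: boolean;
    imageInput: boolean;
    thinkingOutput: boolean;
  };
}

const registry = new Map<string, ModelDescriptor>();

// Runtime registration of custom models/providers (decision 8).
function registerModel(desc: ModelDescriptor): void {
  registry.set(`${desc.provider}/${desc.id}`, desc);
}

function getModelDescriptor(
  provider: string,
  id: string,
): ModelDescriptor | undefined {
  return registry.get(`${provider}/${id}`);
}

// Built-in descriptors use the same registration path as custom ones.
registerModel({
  provider: "openai",
  id: "gpt-4o-mini",
  capabilities: {
    streaming: true,
    toolCalls: true,
    imageInput: true,
    thinkingOutput: false,
  },
});
```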

## Streaming and Conversation Semantics

11. Structured event streams become the primary streaming primitive; text-only
chunk callbacks remain as convenience wrappers.
12. Cross-provider replay and handoff is a hard requirement for v1, including
normalization for reasoning blocks, tool-call IDs, aborted turns, and
orphaned tool results.
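
Decision 11 can be sketched as a discriminated event union with a text-only
convenience wrapper on top. The event names here are assumptions, not the
shipped protocol:

```typescript
// Illustrative structured event protocol (decision 11). Event names
// and payload shapes are assumptions.
type StreamEvent =
  | { type: "text_delta"; text: string }
  | { type: "thinking_delta"; text: string }
  | { type: "tool_call"; id: string; name: string; args: unknown }
  | { type: "done"; stopReason: "stop" | "tool_use" | "aborted" };

// Text-only chunk callbacks remain as a convenience wrapper over the
// structured stream: everything except text deltas is dropped.
async function onTextChunks(
  events: AsyncIterable<StreamEvent>,
  onChunk: (text: string) => void,
): Promise<void> {
  for await (const ev of events) {
    if (ev.type === "text_delta") onChunk(ev.text);
  }
}
```

Because the wrapper consumes the same event stream, adding richer consumers
(thinking blocks, tool calls) later requires no provider-side changes.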

## Provider Strategy

13. OpenAI-compatible backends should be handled by one generalized adapter path
with compatibility flags, not many first-class provider packages.
14. Embeddings stay out of the primary conversational core and live behind a
separate optional capability interface or companion package.
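
Decision 13 amounts to treating known backends as flag presets over one
adapter. The flag names below are assumptions for illustration:

```typescript
// Illustrative compatibility flags for the generalized OpenAI-compatible
// adapter (decision 13). Flag names are assumptions.
interface OpenAICompatFlags {
  supportsParallelToolCalls: boolean;
  supportsSystemRole: boolean;
}

interface OpenAICompatConfig {
  baseUrl: string;
  apiKey?: string;
  flags: OpenAICompatFlags;
}

// Known backends become presets over the same adapter path,
// not new first-class provider packages.
const PRESETS: Record<string, OpenAICompatFlags> = {
  openai: { supportsParallelToolCalls: true, supportsSystemRole: true },
  ollama: { supportsParallelToolCalls: false, supportsSystemRole: true },
};

function compatConfig(
  backend: keyof typeof PRESETS,
  baseUrl: string,
): OpenAICompatConfig {
  return { baseUrl, flags: PRESETS[backend] };
}
```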

## Migration Strategy

15. `agentic-kit` should ship a backward-compatibility layer for the current
`generate({ model, prompt }, { onChunk })` API for one transition release.
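
The decision-15 shim might look like the sketch below, where the legacy
text-only call is rewritten onto the structured message API. `complete` is
stubbed here as a stand-in for the real adapter call:

```typescript
// Illustrative backward-compatibility shim (decision 15). complete()
// is a local stand-in; the real shim would delegate to the adapter.
interface LegacyOptions {
  onChunk?: (text: string) => void;
}

type Message = { role: "user" | "assistant"; content: string };

async function complete(req: {
  messages: Message[];
}): Promise<{ content: string }> {
  // Stand-in: echo the prompt so the shim is demonstrable offline.
  return { content: `echo: ${req.messages[0].content}` };
}

async function generate(
  req: { model: string; prompt: string },
  opts: LegacyOptions = {},
): Promise<string> {
  const result = await complete({
    messages: [{ role: "user", content: req.prompt }],
  });
  // The legacy chunk callback is preserved by feeding it the final text.
  opts.onChunk?.(result.content);
  return result.content;
}
```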

## Architectural Implications

These decisions imply the following target architecture:

- `agentic-kit`
Low-level portability layer. Owns message/content types, model descriptors,
provider registry, streaming event protocol, compatibility transforms, usage,
and provider adapters.
- `@agentic-kit/agent`
Optional stateful runtime. Owns tool execution, sequential loop semantics,
lifecycle events, context transforms, and abort/continue behavior.
- Separate optional capabilities or companion packages
For non-conversational workloads such as embeddings, and optional schema
helpers such as TypeBox/Zod integration.

## Design Principles Confirmed

- Keep the protocol portable and runtime-agnostic.
- Normalize provider differences in the core instead of leaking them upward.
- Treat OpenAI-compatible APIs as a compatibility class, not a brand-specific
architecture.
- Avoid coupling the low-level layer to any single schema library or vendor SDK.
- Preserve a migration path from the existing text-only API while moving the
real architecture to structured messages and events.
16 changes: 16 additions & 0 deletions apps/tanstack-chat-demo/.cta.json
@@ -0,0 +1,16 @@
{
"projectName": "tanstack-chat-demo",
"mode": "file-router",
"typescript": true,
"tailwind": true,
"packageManager": "pnpm",
"git": false,
"install": false,
"addOnOptions": {},
"includeExamples": false,
"envVarValues": {},
"routerOnly": false,
"version": 1,
"framework": "react",
"chosenAddOns": []
}
13 changes: 13 additions & 0 deletions apps/tanstack-chat-demo/.gitignore
@@ -0,0 +1,13 @@
node_modules
.DS_Store
dist
dist-ssr
*.local
.env
.nitro
.tanstack
.wrangler
.output
.vinxi
__unconfig*
todos.json
11 changes: 11 additions & 0 deletions apps/tanstack-chat-demo/.vscode/settings.json
@@ -0,0 +1,11 @@
{
"files.watcherExclude": {
"**/routeTree.gen.ts": true
},
"search.exclude": {
"**/routeTree.gen.ts": true
},
"files.readonlyInclude": {
"**/routeTree.gen.ts": true
}
}