
# Nyabibara

An Elixir LLM framework inspired by Mastra. Agents, tools, memory, workflows, RAG, and streaming — all built on OTP.

```elixir
# One-line chat
{:ok, result} = Nyabibara.generate("openai/gpt-4o", "What is Elixir?")

# Streaming
{:ok, stream} = Nyabibara.stream("anthropic/claude-sonnet-4-5", "Tell me a story")
Nyabibara.Stream.print(stream)

# Agents with tools
defmodule MyAgent do
  use Nyabibara.Agent

  model {:openai, "gpt-4o"}
  instructions "You are a helpful assistant."
  tool MyApp.Tools.GetWeather
end

{:ok, response} = MyAgent.generate("What's the weather in Seoul?")
```
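A tool like `MyApp.Tools.GetWeather` is an ordinary module implementing the `Nyabibara.Tool` behaviour. The sketch below is illustrative only: the callback and macro names (`name`, `description`, `parameter`, `run/2`) are assumptions, so check the actual behaviour docs for the real contract.

```elixir
defmodule MyApp.Tools.GetWeather do
  # Hypothetical sketch — the exact Nyabibara.Tool callbacks may differ.
  use Nyabibara.Tool

  name "get_weather"
  description "Look up the current weather for a city."

  # JSON-Schema-style parameter declaration (assumed DSL shape).
  parameter :city, :string, required: true, doc: "City name, e.g. \"Seoul\""

  @impl true
  def run(%{city: city}, _context) do
    # Call a real weather API here; return {:ok, result} or {:error, reason}.
    {:ok, %{city: city, temperature_c: 21, conditions: "clear"}}
  end
end
```

The agent's tool loop passes the model's validated arguments to `run/2` and feeds the return value back into the conversation.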

## Why nyabibara?

The Elixir ecosystem has excellent LLM building blocks — LangChain for chains, Jido for agent orchestration, Ash AI for Ash Framework integration, InstructorLite for structured outputs. But there is no unified framework that brings agents, workflows, memory, RAG, and evaluation together in a single package with idiomatic Elixir DX.

nyabibara fills that gap.

### vs. LangChain (Elixir)

LangChain provides a solid chain abstraction over multiple LLM providers, but takes a monolithic approach in which chains, messages, and tools are coupled to the framework's internal types. nyabibara takes a composable approach: each module (providers, agents, memory, tools) works independently and can be adopted incrementally. You can use `Nyabibara.generate/3` for simple calls without touching agents or workflows.

### vs. Ash AI

Ash AI offers powerful declarative AI integration, but requires the Ash Framework. If your project doesn't use Ash, you can't use Ash AI. nyabibara is framework-independent: it works with Phoenix, plain Elixir scripts, Livebook, `iex`, or any OTP application. Phoenix LiveView integration is an optional add-on, not a requirement.

### vs. Jido

Jido is a mature agent orchestration framework (v2.0, 1.6k stars) with sophisticated OTP patterns. However, it requires adopting the full Jido ecosystem — jido, jido_ai, jido_action, jido_signal — and its abstraction level (Actions, Directives, Signals, FSM strategies) can be excessive for straightforward LLM applications. nyabibara absorbs Jido's best ideas (pure functional agents, directive pattern for side-effect separation, supervisor-based lifecycle) in a simpler single-package design.

### vs. InstructorLite

InstructorLite does one thing well: structured LLM outputs via Ecto schemas. nyabibara includes structured output support as part of a larger toolkit. If you only need structured outputs, InstructorLite is a great choice. If you also need agents, memory, tool calling, streaming, and RAG, nyabibara provides all of these under one dependency.
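As a hedged sketch of the InstructorLite-style structured-output flow described above: you declare an Ecto schema and ask for the response in that shape. The `schema:` option name here is an assumption for illustration, not the confirmed nyabibara API.

```elixir
defmodule Recipe do
  use Ecto.Schema

  @primary_key false
  embedded_schema do
    field :title, :string
    field :minutes, :integer
  end
end

# Hypothetical option name — consult the structured-output docs for the real one.
{:ok, %Recipe{} = recipe} =
  Nyabibara.generate("openai/gpt-4o", "Give me a quick pasta recipe",
    schema: Recipe
  )
```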

## What nyabibara brings to Elixir

| Capability | Existing Elixir options | nyabibara |
|---|---|---|
| Provider abstraction | ReqLLM, LangChain | Built-in, `"provider/model"` string routing |
| Streaming | Manual GenServer/Task | `Nyabibara.stream/3` + LiveView components |
| Tool calling | LangChain Functions, Ash AI DSL | `use Nyabibara.Tool` behaviour + schema |
| Agents | Jido (multi-package), LangChain | `use Nyabibara.Agent` with auto tool loop |
| Memory | None (each project rolls its own) | Conversation, Working, Observational memory (ETS or Ecto/PostgreSQL) |
| Workflows | None (Jido FSM is different) | Graph-based workflow DSL |
| RAG | bitcrowd/rag | Integrated pipeline (chunk, embed, search) |
| Evaluation | Nearly none | ExUnit-integrated eval macros (planned) |
| MCP | hermes_mcp, ash_ai | Client + server integration |
| LiveView streaming | Manual PubSub wiring | Declarative streaming components |

## Design principles

- **Framework independent** — Phoenix and Ecto are optional. The core works in any Elixir environment.
- **OTP native** — Agents run as supervised GenServers. Memory uses ETS in development and Ecto/PostgreSQL in production. Fault isolation comes for free.
- **Composable** — Use only what you need. `Nyabibara.generate/3` works without agents. Agents work without workflows. Each layer is opt-in.
- **Mastra-inspired DX** — `"openai/gpt-4o"` model strings, `use Nyabibara.Agent` macros, declarative tool definitions. Familiar to Mastra/Vercel AI SDK users, idiomatic to Elixir developers.

## Installation

Add nyabibara to your list of dependencies in `mix.exs`:

```elixir
def deps do
  [
    {:nyabibara, "~> 0.1.0"}
  ]
end
```

Configure your API keys:

```elixir
# config/runtime.exs
config :nyabibara, :providers, %{
  openai: %{api_key: System.get_env("OPENAI_API_KEY")},
  anthropic: %{api_key: System.get_env("ANTHROPIC_API_KEY")}
}
```

## Providers

| Provider | Module | Model String |
|---|---|---|
| OpenAI | `Nyabibara.Providers.OpenAI` | `"openai/gpt-4o"` |
| Anthropic | `Nyabibara.Providers.Anthropic` | `"anthropic/claude-sonnet-4-5"` |
| Ollama | `Nyabibara.Providers.Ollama` | `"ollama/llama3"` |
| OpenRouter | `Nyabibara.Providers.OpenRouter` | `"openrouter/meta-llama/llama-3-70b"` |

Or pass credentials directly:

```elixir
Nyabibara.generate("openai/gpt-4o", "Hello", api_key: "sk-...")
```

## Running Tests

```shell
# Unit tests (no external APIs)
mix test.unit

# All tests
mix test

# Integration tests (requires API keys)
mix test.integration

# External API tests
mix test.external
```

## Documentation

Generate docs locally:

```shell
mix docs
open doc/index.html
```

Documentation is available at HexDocs.

## License

Dual-licensed under the MIT License or the Apache License 2.0, at your option. See `LICENSE-MIT` and `LICENSE-APACHE`.