# Nyabibara

An Elixir LLM framework inspired by Mastra. Agents, tools, memory, workflows, RAG, and streaming — all built on OTP.
```elixir
# One-line chat
{:ok, result} = Nyabibara.generate("openai/gpt-4o", "What is Elixir?")

# Streaming
{:ok, stream} = Nyabibara.stream("anthropic/claude-sonnet-4-5", "Tell me a story")
Nyabibara.Stream.print(stream)

# Agents with tools
defmodule MyAgent do
  use Nyabibara.Agent

  model {:openai, "gpt-4o"}
  instructions "You are a helpful assistant."
  tool MyApp.Tools.GetWeather
end

{:ok, response} = MyAgent.generate("What's the weather in Seoul?")
```
## Why nyabibara?
The Elixir ecosystem has excellent LLM building blocks — LangChain for chains, Jido for agent orchestration, Ash AI for Ash Framework integration, InstructorLite for structured outputs. But there is no unified framework that brings agents, workflows, memory, RAG, and evaluation together in a single package with idiomatic Elixir DX.
nyabibara fills that gap.
### vs. LangChain (Elixir)

LangChain provides a solid chain abstraction over multiple LLM providers, but takes a monolithic approach where chains, messages, and tools are coupled to the framework's internal types. nyabibara takes a composable approach — each module (providers, agents, memory, tools) works independently and can be adopted incrementally. You can use `Nyabibara.generate/3` for simple calls without touching agents or workflows.
### vs. Ash AI

Ash AI offers powerful declarative AI integration, but requires the Ash Framework. If your project doesn't use Ash, you can't use Ash AI. nyabibara is framework-independent — it works with Phoenix, plain Elixir scripts, Livebook, `iex`, or any OTP application. Phoenix LiveView integration is an optional add-on, not a requirement.
### vs. Jido

Jido is a mature agent orchestration framework (v2.0, 1.6k stars) with sophisticated OTP patterns. However, it requires adopting the full Jido ecosystem — `jido`, `jido_ai`, `jido_action`, `jido_signal` — and its abstraction level (Actions, Directives, Signals, FSM strategies) can be excessive for straightforward LLM applications. nyabibara absorbs Jido's best ideas (pure functional agents, the directive pattern for side-effect separation, supervisor-based lifecycle) in a simpler single-package design.
### vs. InstructorLite

InstructorLite does one thing well: structured LLM outputs via Ecto schemas. nyabibara includes structured output support as part of a larger toolkit. If you only need structured outputs, InstructorLite is a great choice. If you also need agents, memory, tool calling, streaming, and RAG, nyabibara provides all of these under one dependency.
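As a sketch of what Ecto-schema-based structured output might look like here — note the `response_model` option name is an illustrative assumption, not confirmed nyabibara API:

```elixir
defmodule Weather do
  use Ecto.Schema

  @primary_key false
  embedded_schema do
    field :city, :string
    field :temperature_c, :float
  end
end

# Hypothetical option name; check the nyabibara docs for the actual
# structured-output API.
{:ok, weather} =
  Nyabibara.generate("openai/gpt-4o", "What's the weather in Seoul?",
    response_model: Weather
  )
```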
## What nyabibara brings to Elixir
| Capability | Existing Elixir options | nyabibara |
|---|---|---|
| Provider abstraction | ReqLLM, LangChain | Built-in, `"provider/model"` string routing |
| Streaming | Manual GenServer/Task | `Nyabibara.stream/3` + LiveView components |
| Tool calling | LangChain Functions, Ash AI DSL | `use Nyabibara.Tool` behaviour + schema |
| Agents | Jido (multi-package), LangChain | `use Nyabibara.Agent` with automatic tool loop |
| Memory | None (each project rolls its own) | Conversation, working, and observational memory (ETS or Ecto/PostgreSQL) |
| Workflows | None (Jido's FSM is a different model) | Graph-based workflow DSL |
| RAG | bitcrowd/rag | Integrated pipeline (chunk, embed, search) |
| Evaluation | Nearly none | ExUnit-integrated eval macros (planned) |
| MCP | hermes_mcp, ash_ai | Client + server integration |
| LiveView streaming | Manual PubSub wiring | Declarative streaming components |
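As a sketch of how the memory layer might compose with an agent — the `memory` macro and `thread_id` option below are illustrative assumptions, not confirmed nyabibara API:

```elixir
defmodule SupportAgent do
  use Nyabibara.Agent

  model {:anthropic, "claude-sonnet-4-5"}
  instructions "You are a support assistant."
  # Hypothetical: conversation memory backed by ETS in development.
  memory :conversation, backend: :ets
end

# Hypothetical thread_id option to continue an earlier conversation.
{:ok, reply} = SupportAgent.generate("My order is late.", thread_id: "user-42")
```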
## Design principles

- Framework independent — Phoenix and Ecto are optional. The core works in any Elixir environment.
- OTP native — Agents run as supervised GenServers. Memory uses ETS in development and Ecto/PostgreSQL in production. Fault isolation comes for free.
- Composable — Use only what you need. `Nyabibara.generate/3` works without agents. Agents work without workflows. Each layer is opt-in.
- Mastra-inspired DX — `"openai/gpt-4o"` model strings, `use Nyabibara.Agent` macros, declarative tool definitions. Familiar to Mastra/Vercel AI SDK users, idiomatic for Elixir developers.
## Installation

Add `nyabibara` to your list of dependencies in `mix.exs`:
```elixir
def deps do
  [
    {:nyabibara, "~> 0.1.0"}
  ]
end
```
Configure your API keys:
```elixir
# config/runtime.exs
config :nyabibara, :providers, %{
  openai: %{api_key: System.get_env("OPENAI_API_KEY")},
  anthropic: %{api_key: System.get_env("ANTHROPIC_API_KEY")}
}
```
## Providers
| Provider | Module | Model String |
|---|---|---|
| OpenAI | `Nyabibara.Providers.OpenAI` | `"openai/gpt-4o"` |
| Anthropic | `Nyabibara.Providers.Anthropic` | `"anthropic/claude-sonnet-4-5"` |
| Ollama | `Nyabibara.Providers.Ollama` | `"ollama/llama3"` |
| OpenRouter | `Nyabibara.Providers.OpenRouter` | `"openrouter/meta-llama/llama-3-70b"` |
Or pass credentials directly:
```elixir
Nyabibara.generate("openai/gpt-4o", "Hello", api_key: "sk-...")
```
## Running Tests
```shell
# Unit tests (no external APIs)
mix test.unit

# All tests
mix test

# Integration tests (requires API keys)
mix test.integration

# External API tests
mix test.external
```
## Documentation

Generate the docs locally:
```shell
mix docs
open doc/index.html
```
Documentation is available at HexDocs.
## License
MIT