By James Aspinwall — February 2026
Elixir’s AI ecosystem is small enough that you can know everyone who matters and large enough that serious infrastructure is being built. Here’s a map of the people and projects shaping how Elixir connects to LLMs, MCP, and machine learning.
MCP: The Protocol Layer
The Model Context Protocol is how AI models talk to external tools and data. Several Elixir developers have built competing implementations, each with a different philosophy.
Cloudwalk / Zoedsoupe built Hermes MCP, one of the most mature Elixir MCP SDKs available on Hex. It provides full client and server implementations with Streamable HTTP and STDIO transports. Zoedsoupe later forked it as Anubis MCP, currently at version 0.17. If you’re evaluating MCP libraries for production, these are the ones with the most mileage.
Arjan Scherpenisse authored elixir-mcp, a leaner take on MCP servers. His decorator-based approach lets you tag a public function with @decorate tool_call() and it automatically becomes an MCP tool. Less ceremony, faster to wire up. Good for teams that want MCP integration without a heavy framework commitment.
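In practice the decorator style looks something like the sketch below. Only the @decorate tool_call() attribute comes from the library's documented approach; the module name and the `use` line are illustrative assumptions, so check the elixir-mcp docs for the real setup.

```elixir
defmodule MyApp.Tools do
  # Assumed setup line; the actual `use`/import for elixir-mcp may differ.
  use MCPServer.Decorators

  # Tagging a public function registers it as an MCP tool; the function
  # name and @doc string plausibly become the tool's name and description.
  @doc "Look up the current temperature for a city."
  @decorate tool_call()
  def get_temperature(city) when is_binary(city) do
    # Hypothetical body for illustration only.
    {:ok, %{city: city, temperature_c: 21}}
  end
end
```

The appeal is that your tool surface is just your public API, annotated, rather than a parallel schema you maintain by hand.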
Pranavj17 maintains an Elixir MCP SDK that closely follows Anthropic’s official MCP specification. If spec compliance matters — and for interoperability it should — this is the reference implementation to watch.
E-xyza created Codicil, which takes MCP in an interesting direction: giving LLMs semantic understanding of Elixir codebases through vector search and compilation-time analysis. Instead of just exposing tools, it exposes knowledge about your code to the model.
Josiah Dahl built McpDocs, an MCP server that exposes Elixir project and dependency documentation to LLMs via SSE. Practical and immediately useful — your AI assistant can read your project’s Hex docs without you pasting them into the prompt.

LLM Frameworks: The Integration Layer
Mark Ericksen (brainlid) is the person to know here. His LangChain library on Hex is the dominant LLM integration framework in the Elixir ecosystem. It supports OpenAI, Anthropic, Google Gemini, and local models. Ericksen is also a regular on the Elixir Wizards podcast, which makes him one of the more visible advocates for AI-in-Elixir. If you’re building LLM-powered features in an Elixir application, you’re probably using his library.
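A minimal call with Ericksen's library looks roughly like this. The sketch follows the langchain Hex package's chain-and-message pattern, but return shapes and options have shifted between versions, and "gpt-4o" is just a placeholder model name — treat this as the shape, not a verified snippet.

```elixir
alias LangChain.Chains.LLMChain
alias LangChain.ChatModels.ChatOpenAI
alias LangChain.Message

# Build a chain around a chat model, add a user message, and run it.
# Assumes an OpenAI API key is configured in the environment.
{:ok, chain} =
  %{llm: ChatOpenAI.new!(%{model: "gpt-4o"})}
  |> LLMChain.new!()
  |> LLMChain.add_message(Message.new_user!("Summarize OTP in one sentence."))
  |> LLMChain.run()

# The assistant's reply lives on the updated chain.
chain.last_message.content
```

Swapping ChatOpenAI for the Anthropic or Gemini chat model is the point of the abstraction: the chain pipeline stays the same.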
Orthagonal authored LangChainEx, an earlier and more lightweight alternative for LLM chaining. It’s notable for its multi-agent focus — coordinating multiple LLM calls with different roles and tools. Less adoption than Ericksen’s library but a different design philosophy worth understanding.
Machine Learning: The Foundation Layer
Sean Moriarity wrote the book — literally. Machine Learning in Elixir is the definitive text on ML in the BEAM ecosystem. He created Axon (neural networks) and Bumblebee (pre-trained model integration), and is a core contributor to Nx, the numerical computing library that makes all of it possible. His recent talks on “Designing LLM Native Systems” in Elixir signal where the ecosystem is heading: not just calling external APIs, but running inference natively.
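Native inference with Moriarity's libraries is compact. The sketch below follows Bumblebee's documented text-classification flow; the model name is only an example, weights download from Hugging Face on first run, and exact dependency versions are assumptions.

```elixir
# Assumed Mix deps: {:bumblebee, "~> 0.5"}, {:nx, "~> 0.7"}, {:exla, "~> 0.7"}

# Load a pre-trained model and its tokenizer from the Hugging Face hub.
{:ok, model_info} =
  Bumblebee.load_model({:hf, "distilbert-base-uncased-finetuned-sst-2-english"})

{:ok, tokenizer} =
  Bumblebee.load_tokenizer({:hf, "distilbert-base-uncased"})

# Nx.Serving handles batching and can be supervised in an OTP app tree,
# which is the "LLM native" angle: inference as just another process.
serving = Bumblebee.Text.text_classification(model_info, tokenizer)

Nx.Serving.run(serving, "The BEAM makes this feel effortless.")
# Returns a %{predictions: [%{label: ..., score: ...}]} map.
```

Putting the serving under a supervisor is what separates this from the call-an-API pattern: the model lives inside your application, with the BEAM's usual fault tolerance.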
José Valim, the creator of Elixir, is actively experimenting with LLMs through Livebook: interactive notebooks for AI workflows and explorations of OpenAI function calling. He is also shaping how the language itself accommodates AI workloads. He’s not building MCP servers, but when Valim moves in a direction, the ecosystem follows.
George Guimaraes maintains the awesome-ml-gen-ai-elixir curated list on GitHub. It’s the single best resource for tracking new libraries, tools, and players as they appear. Bookmark it.
Where to Watch
The Elixir Forum — specifically the Libraries and AI categories — is where most of these developers announce and discuss their work. The #ai and #llm channels on the Elixir Slack are active and accessible. New projects surface there weeks before they show up anywhere else.
Who to Connect With First
For anyone building MCP-based AI integration on Elixir, the highest-value relationships are:
- The Hermes/Anubis MCP maintainers (Cloudwalk/Zoedsoupe) — they’ve solved the transport and protocol problems you’ll hit. Their battle scars save you months.
- Mark Ericksen — his LangChain library is likely part of your stack whether you planned it or not. Understanding his roadmap tells you where the ecosystem’s LLM integration is going.
- Sean Moriarity — if your use case moves beyond API calls toward on-device inference or custom models, he’s the authority.
The Elixir AI community is at the stage where showing up and contributing gets you noticed. The field is small, the problems are real, and the people building solutions are reachable.