Geminix — Google Gemini Comes to Elixir

By James Aspinwall — February 2026

The Library

Geminix is a new Elixir library providing direct bindings to Google’s Gemini API. It was released in February 2026 by Tiago Barroso, and if that name doesn’t ring a bell, his code is running on your machine right now. He’s the author of Makeup, the syntax highlighting engine used by ExDoc — the tool that generates documentation for every Elixir package on Hex. His libraries have been downloaded over 200 million times.

When someone with that track record publishes a Gemini SDK, it’s worth paying attention.

What Makes It Different

Geminix is auto-generated from Google’s official JSON API specification at compile time. Rather than a developer manually writing wrapper functions that drift out of sync with the API, the library reads Google’s spec and generates Elixir functions and structs directly. No intermediate files — the quoted expressions are compiled in memory.

This approach has a practical consequence: when Google updates the Gemini API, updating Geminix means pulling the new spec and recompiling. The library tracks the API by design, not by effort.
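To make the compile-time generation concrete, here is a minimal sketch of the general technique: reading a JSON spec while the module compiles and defining one function per spec entry via unquote fragments. All names here (`SpecGen`, the spec path, the spec's field names) are invented for illustration and do not reflect Geminix's internals; the sketch also assumes the Jason JSON library is available.

```elixir
defmodule SpecGen do
  # Read the API spec when the module compiles, not at runtime.
  # @external_resource tells the compiler to recompile this module
  # whenever the spec file changes.
  @spec_path "priv/api_spec.json"
  @external_resource @spec_path
  @spec_map @spec_path |> File.read!() |> Jason.decode!()

  # Define one Elixir function per method listed in the spec.
  # `unquote` fragments inside the comprehension splice the method
  # name and path into each generated definition.
  for %{"name" => name, "path" => path} <- @spec_map["methods"] do
    def unquote(String.to_atom(name))(params) do
      request(unquote(path), params)
    end
  end

  # Stand-in for the actual HTTP call.
  defp request(path, params), do: {:ok, {path, params}}
end
```

Because the quoted expressions are evaluated during compilation, nothing is written to disk: the generated functions exist only in the compiled module, which matches the "no intermediate files" property described above.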

Manual code handles the edge cases where the spec alone isn't enough, such as batch job polling and other long-running operations.

How This Helps WorkingAgents

WorkingAgents currently routes LLM calls through OpenRouter, which supports Gemini models among others. That works, but it adds a middleman, an extra network hop, and OpenRouter’s markup on token pricing. A direct Gemini binding opens several doors.

Direct provider integration. WorkingAgents’ ServerChat system already supports multiple providers — Anthropic, OpenRouter, Perplexity. Geminix slots in as a native Gemini provider. No OpenRouter detour, no OpenAI-format translation layer. Gemini’s API directly, in Elixir structs.
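One common way to structure such a multi-provider system in Elixir is a shared behaviour that each provider module implements. The sketch below is hypothetical: `ChatProvider` and `GeminiProvider` are invented names, not WorkingAgents or Geminix APIs, and the body is a stub where a real implementation would call Geminix's generated functions.

```elixir
defmodule ChatProvider do
  # Contract every provider module must satisfy.
  @callback chat(messages :: [map()], opts :: keyword()) ::
              {:ok, String.t()} | {:error, term()}
end

defmodule GeminiProvider do
  @behaviour ChatProvider

  @impl true
  def chat(messages, opts) do
    # A real implementation would invoke the Gemini API here,
    # e.g. through Geminix's spec-generated functions.
    model = Keyword.get(opts, :model, "gemini-2.0-flash")
    {:ok, "stub reply from #{model} to #{length(messages)} messages"}
  end
end
```

With a behaviour in place, swapping OpenRouter for a direct Gemini provider is a configuration change rather than a rewrite, since callers only depend on the `chat/2` contract.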

Large context windows. Gemini offers context windows up to 1 million tokens. For the compliance rule engine architecture — where an AI reads a bank’s full regulatory handbook and generates executable rule code — that context window is the difference between processing rules one at a time and processing an entire regulatory framework in a single call.

Batch processing. Geminix includes support for long-running batch jobs with polling. For workflows where WorkingAgents needs to process a volume of documents, summarize a backlog, or generate multiple rule modules from a compliance manual, batch mode means firing off the work and getting results when they’re ready rather than blocking on each call.
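A polling loop for a long-running job typically looks like the recursive sketch below. It is generic: `check_status` is an injected function standing in for whatever status call the batch API exposes, not a real Geminix function.

```elixir
defmodule BatchPoll do
  # Poll a job until it finishes, sleeping between checks.
  # check_status is a 1-arity function returning
  # {:done, result}, :running, or {:error, reason}.
  def await(job_id, check_status, interval_ms \\ 5_000) do
    case check_status.(job_id) do
      {:done, result} ->
        {:ok, result}

      :running ->
        # Back off between polls rather than hammering the API.
        Process.sleep(interval_ms)
        await(job_id, check_status, interval_ms)

      {:error, reason} ->
        {:error, reason}
    end
  end
end
```

In a real system this loop would live in a supervised process (a `Task` or `GenServer`) so that firing off batch work never blocks the caller, which is exactly the "get results when they're ready" model described above.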

Cost arbitrage. Google prices Gemini aggressively against Anthropic and OpenAI. For high-volume, lower-stakes tasks — summarization, classification, initial draft generation — routing through Gemini directly instead of through OpenRouter shaves cost on both the token price and the routing fee.

Native Elixir fit. Because Geminix generates proper Elixir structs rather than raw JSON maps, the integration with GenServers, pattern matching, and the rest of the OTP toolkit is natural. No JSON key-string gymnastics.
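The difference is easiest to see side by side. The struct and field names below are invented for the example, not Geminix's actual response types.

```elixir
defmodule Candidate do
  # Typed response shape; the compiler rejects misspelled fields.
  defstruct [:text, :finish_reason]
end

defmodule Extract do
  # With structs, pattern matching is direct and the struct name
  # documents what shape is expected.
  def text(%Candidate{text: text, finish_reason: :stop}), do: {:ok, text}
  def text(%Candidate{finish_reason: reason}), do: {:error, reason}

  # The raw-map equivalent matches on string keys and silently
  # trusts that the JSON had the shape you remembered.
  def text_from_json(%{"text" => text, "finishReason" => "STOP"}),
    do: {:ok, text}
end
```

The struct version fails loudly at compile time when a field name drifts; the string-key version fails quietly at runtime with a `FunctionClauseError`, which is the "JSON key-string gymnastics" the paragraph above refers to.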

What to Watch For

The library is early: version 0.2.0, just released, with no production mileage yet. But the spec-driven generation approach means the API surface is complete by construction and tracks Google's own definitions, even if the ergonomics will likely improve over the next few releases.

Given Barroso’s track record of building foundational Elixir tooling that the entire ecosystem depends on, this is a library worth tracking and integrating early rather than waiting for it to become obvious.
