
Code and Stuff

5,030 subscribers

1,248 views ・ 90 likes ・ 2025/10/31

In this video, I explore ReqLLM, a new Elixir package that provides a unified interface for working with LLMs across multiple providers.

We'll use a Livebook notebook to demonstrate streaming responses, structured outputs with multiple schema formats, and building AI agents with tool calls. This eliminates the complexity of switching between different LLM providers and lets you optimize your application for cost and performance without rewriting code.

What's covered:
What is ReqLLM and why use it?
Setting up Livebook with ReqLLM
Exploring supported providers and models from models.dev
Basic text generation with provider options
Streaming responses with token-by-token rendering
Multi-turn conversations with context management
Generating structured outputs with NimbleOptions, JSON Schema, and Zoi
Building AI agents with tool calls and external data sources
Cost tracking and token metering for billing
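The provider-swapping idea from the video can be sketched in a Livebook cell. This is a minimal, hedged example: the `"provider:model"` strings, the `~> 1.0` version, and the exact return shapes are assumptions; check hexdocs.pm/req_llm for the current API and set the relevant API keys before running.

```elixir
# In a Livebook setup cell, install the dependency
# (version requirement is an assumption; see hexdocs.pm/req_llm):
Mix.install([{:req_llm, "~> 1.0"}])

prompt = "Say hello from Elixir in one sentence."

# Generate text against one provider. The "provider:model" spec string
# and helper names here follow the ReqLLM docs but may differ by version.
{:ok, response} = ReqLLM.generate_text("openai:gpt-4o-mini", prompt)
IO.puts(ReqLLM.text(response))

# Switching providers is just a different model string; the calling
# code stays the same, which is the cost/performance flexibility
# the video describes.
{:ok, response} = ReqLLM.generate_text("anthropic:claude-3-5-haiku", prompt)
IO.puts(ReqLLM.text(response))
```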

Links
Demo notebook: gist.github.com/ChristianAlexander/512ae4639c4d682…
ReqLLM documentation: hexdocs.pm/req_llm
Livebook: livebook.dev/
The Swarm Discord: discord.gg/KSjyKkDh

