Providers Overview

Providers are adapters that connect Riffer to LLM services. They implement a common interface for text generation and streaming.

Available Providers

| Provider | Identifier | Gem Required |
| --- | --- | --- |
| OpenAI | openai | openai |
| Azure OpenAI | azure_openai | openai |
| Amazon Bedrock | amazon_bedrock | aws-sdk-bedrockruntime |
| Anthropic | anthropic | anthropic |
| Gemini | gemini | None |
| Mock | mock | None |

Model String Format

Agents specify providers using the provider/model format:

class MyAgent < Riffer::Agent
  model 'openai/gpt-5-mini'                                           # OpenAI
  model 'azure_openai/gpt-5-mini'                                     # Azure OpenAI
  model 'amazon_bedrock/us.anthropic.claude-haiku-4-5-20251001-v1:0'  # Bedrock
  model 'anthropic/claude-haiku-4-5-20251001'                         # Anthropic
  model 'gemini/gemini-2.5-flash-lite'                                # Gemini
  model 'mock/any'                                                    # Mock provider
end
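The split into provider and model can be sketched in plain Ruby. This is an illustrative standalone sketch, not Riffer's actual parsing code; note the split limit of 2, which keeps model names containing colons or dots (like the Bedrock identifier above) intact:

```ruby
# Hypothetical sketch of splitting a "provider/model" string.
# A limit of 2 ensures only the first slash separates provider from model.
def parse_model_string(spec)
  provider, model = spec.split("/", 2)
  raise ArgumentError, "expected provider/model, got #{spec.inspect}" if model.nil?
  [provider.to_sym, model]
end

parse_model_string("openai/gpt-5-mini")
# => [:openai, "gpt-5-mini"]
```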

Provider Interface

All providers inherit from Riffer::Providers::Base and implement:

generate_text

Generates a response synchronously:

provider = Riffer::Providers::OpenAI.new(api_key: "...")

response = provider.generate_text(
  prompt: "Hello!",
  model: "gpt-5-mini"
)
# => Riffer::Messages::Assistant

# Or with messages
response = provider.generate_text(
  messages: [Riffer::Messages::User.new("Hello!")],
  model: "gpt-5-mini"
)

stream_text

Streams a response as an Enumerator:

provider.stream_text(prompt: "Tell me a story", model: "gpt-5-mini").each do |event|
  case event
  when Riffer::StreamEvents::TextDelta
    print event.content
  end
end
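The Enumerator pattern behind stream_text can be illustrated in plain Ruby. This sketch stands in TextDelta with a Struct and uses hypothetical names; it only shows how events are yielded lazily so the caller can consume text before the full response exists:

```ruby
# Plain-Ruby sketch of an Enumerator-based streaming interface.
# Each chunk is wrapped in a stand-in TextDelta event and yielded lazily.
TextDelta = Struct.new(:content)

def fake_stream(chunks)
  Enumerator.new do |yielder|
    chunks.each { |chunk| yielder << TextDelta.new(chunk) }
  end
end

fake_stream(["Once ", "upon ", "a time"]).each { |event| print event.content }
```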

Method Parameters

| Parameter | Description |
| --- | --- |
| prompt | String prompt (required if messages not provided) |
| system | Optional system message string |
| messages | Array of message objects/hashes (alternative to prompt) |
| model | Model name string |
| tools | Array of Tool classes |
| **options | Provider-specific options (including web_search if supported by the provider) |

You must provide either prompt or messages, but not both.
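The exclusive-or check could look like the following sketch. The method name and error messages here are illustrative, not Riffer's actual internals:

```ruby
# Hypothetical sketch of the prompt-XOR-messages validation a provider
# could perform before building a request.
def validate_input!(prompt: nil, messages: nil)
  if prompt && messages
    raise ArgumentError, "provide either prompt or messages, not both"
  elsif prompt.nil? && messages.nil?
    raise ArgumentError, "either prompt or messages is required"
  end
end
```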

Using Providers Directly

While agents abstract provider usage, you can use providers directly:

require 'riffer'

Riffer.configure do |config|
  config.openai.api_key = ENV['OPENAI_API_KEY']
end

provider = Riffer::Providers::OpenAI.new

# Simple prompt
response = provider.generate_text(
  prompt: "What is Ruby?",
  model: "gpt-5-mini"
)
puts response.content

# With system message
response = provider.generate_text(
  prompt: "Explain recursion",
  system: "You are a programming tutor. Use simple language.",
  model: "gpt-5-mini"
)

# With message history
messages = [
  Riffer::Messages::System.new("You are helpful."),
  Riffer::Messages::User.new("Hi!"),
  Riffer::Messages::Assistant.new("Hello!"),
  Riffer::Messages::User.new("How are you?")
]

response = provider.generate_text(
  messages: messages,
  model: "gpt-5-mini"
)

Tool Support

Providers convert Riffer tool classes into each service's native tool-calling format:

class WeatherTool < Riffer::Tool
  description "Gets weather"
  params do
    required :city, String
  end
  def call(context:, city:)
    text("Sunny in #{city}")
  end
end

response = provider.generate_text(
  prompt: "What's the weather in Tokyo?",
  model: "gpt-5-mini",
  tools: [WeatherTool]
)

if response.tool_calls.any?
  # Handle tool calls
end
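The kind of conversion a provider performs can be sketched as a mapping to an OpenAI-style function schema. Everything here is illustrative (Riffer's real conversion and the exact schema shape may differ); it only shows the general idea of turning a name, description, and required parameters into a nested hash:

```ruby
# Hypothetical sketch of converting a tool definition into an
# OpenAI-style function-calling schema hash.
def to_function_schema(name:, description:, required_params:)
  {
    type: "function",
    function: {
      name: name,
      description: description,
      parameters: {
        type: "object",
        properties: required_params.to_h { |param, type| [param, { type: type }] },
        required: required_params.keys.map(&:to_s)
      }
    }
  }
end

to_function_schema(name: "weather", description: "Gets weather",
                   required_params: { city: "string" })
```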

Provider Registry

Riffer uses a registry to find providers by identifier:

Riffer::Providers::Repository.find(:openai)
# => Riffer::Providers::OpenAI

Riffer::Providers::Repository.find(:azure_openai)
# => Riffer::Providers::AzureOpenAI

Riffer::Providers::Repository.find(:amazon_bedrock)
# => Riffer::Providers::AmazonBedrock

Riffer::Providers::Repository.find(:anthropic)
# => Riffer::Providers::Anthropic

Riffer::Providers::Repository.find(:gemini)
# => Riffer::Providers::Gemini

Riffer::Providers::Repository.find(:mock)
# => Riffer::Providers::Mock
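A registry like this is typically a hash from identifier to class. The following is a minimal standalone sketch with illustrative names, not Riffer's actual implementation:

```ruby
# Minimal sketch of an identifier -> class registry.
# Unknown identifiers raise rather than returning nil.
module ProviderRegistry
  @providers = {}

  def self.register(identifier, klass)
    @providers[identifier.to_sym] = klass
  end

  def self.find(identifier)
    @providers.fetch(identifier.to_sym) do
      raise ArgumentError, "unknown provider: #{identifier}"
    end
  end
end

class MockProvider; end
ProviderRegistry.register(:mock, MockProvider)
ProviderRegistry.find(:mock)
# => MockProvider
```

Accepting both symbols and strings in find keeps lookups forgiving, while fetch with a block makes a missing provider fail loudly instead of returning nil.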

Provider-Specific Guides