Class: Spacy::OpenAIClient

Inherits:
Object
Defined in:
lib/ruby-spacy/openai_client.rb

Overview

A lightweight OpenAI API client with function-calling (tools) support for GPT-5 series models. It implements the chat completions and embeddings endpoints with no external dependencies.

Defined Under Namespace

Classes: APIError

Constant Summary

API_ENDPOINT = "https://api.openai.com/v1"
DEFAULT_TIMEOUT = 120
MAX_RETRIES = 3
BASE_RETRY_DELAY = 1

Instance Method Summary

Constructor Details

#initialize(access_token:, timeout: DEFAULT_TIMEOUT) ⇒ OpenAIClient

Returns a new instance of OpenAIClient.



# File 'lib/ruby-spacy/openai_client.rb', line 28

def initialize(access_token:, timeout: DEFAULT_TIMEOUT)
  @access_token = access_token
  @timeout = timeout
end
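The constructor only stores the token and the timeout, with the timeout defaulting to DEFAULT_TIMEOUT (120 seconds). A minimal self-contained sketch (a stand-in class, not the gem itself) illustrates the defaulting behavior:

```ruby
# Stand-in mirroring the constructor above; MiniClient is illustrative,
# not part of ruby-spacy.
DEFAULT_TIMEOUT = 120

class MiniClient
  attr_reader :access_token, :timeout

  def initialize(access_token:, timeout: DEFAULT_TIMEOUT)
    @access_token = access_token
    @timeout = timeout
  end
end

client = MiniClient.new(access_token: "sk-test")
client.timeout # => 120
```

In the real class you would pass your API key, e.g. `Spacy::OpenAIClient.new(access_token: ENV["OPENAI_API_KEY"])`.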

Instance Method Details

#chat(model:, messages:, max_completion_tokens: 1000, temperature: nil, tools: nil, tool_choice: nil, response_format: nil) ⇒ Hash

Sends a chat completion request with optional tools support. Note: GPT-5 series and o-series models do not support the temperature parameter.

Parameters:

  • model (String)

    The model to use (e.g., "gpt-5-mini")

  • messages (Array<Hash>)

    The conversation messages

  • max_completion_tokens (Integer) (defaults to: 1000)

    Maximum tokens in the response

  • temperature (Float, nil) (defaults to: nil)

    Sampling temperature (ignored for models that don’t support it)

  • tools (Array<Hash>, nil) (defaults to: nil)

    Tool definitions for function calling

  • tool_choice (String, Hash, nil) (defaults to: nil)

    Tool selection strategy

  • response_format (Hash, nil) (defaults to: nil)

    Response format specification (e.g., { type: "json_object" })

Returns:

  • (Hash)

    The API response



# File 'lib/ruby-spacy/openai_client.rb', line 44

def chat(model:, messages:, max_completion_tokens: 1000, temperature: nil, tools: nil, tool_choice: nil, response_format: nil)
  body = {
    model: model,
    messages: messages,
    max_completion_tokens: max_completion_tokens
  }

  # GPT-5 series and o-series models do not support temperature parameter
  unless temperature_unsupported?(model)
    body[:temperature] = temperature || 0.7
  end

  if tools && !tools.empty?
    body[:tools] = tools
    body[:tool_choice] = tool_choice || "auto"
  end

  body[:response_format] = response_format if response_format

  post("/chat/completions", body)
end
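The body-building logic above can be exercised without an HTTP call. The sketch below (an illustrative helper, not part of the class) reproduces how the request hash is assembled, including the temperature omission and the "auto" tool_choice default:

```ruby
# Illustrative helper mirroring #chat's body construction; no request is sent.
def build_chat_body(model:, messages:, max_completion_tokens: 1000,
                    temperature: nil, tools: nil, tool_choice: nil)
  body = {
    model: model,
    messages: messages,
    max_completion_tokens: max_completion_tokens
  }
  # GPT-5 series and o-series models reject the temperature parameter
  unless model.to_s.start_with?("gpt-5") || model.to_s.match?(/\Ao\d/)
    body[:temperature] = temperature || 0.7
  end
  if tools && !tools.empty?
    body[:tools] = tools
    body[:tool_choice] = tool_choice || "auto"
  end
  body
end

body = build_chat_body(
  model: "gpt-5-mini",
  messages: [{ role: "user", content: "Hi" }],
  tools: [{ type: "function", function: { name: "lookup", parameters: {} } }]
)
body.key?(:temperature) # => false (gpt-5 models reject temperature)
body[:tool_choice]      # => "auto"
```

Note that for a model such as "gpt-4o", the same helper would include `temperature: 0.7` when none is given.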

#embeddings(model:, input:, dimensions: nil) ⇒ Hash

Sends an embeddings request.

Parameters:

  • model (String)

    The embeddings model (e.g., "text-embedding-3-small")

  • input (String)

    The text to embed

  • dimensions (Integer, nil) (defaults to: nil)

    The number of dimensions for the output embeddings

Returns:

  • (Hash)

    The API response



# File 'lib/ruby-spacy/openai_client.rb', line 81

def embeddings(model:, input:, dimensions: nil)
  body = {
    model: model,
    input: input
  }
  body[:dimensions] = dimensions if dimensions

  post("/embeddings", body)
end
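The optional `dimensions` key follows the same conditional pattern; a small sketch (illustrative helper name, no HTTP call) shows that the key is omitted entirely when no value is passed:

```ruby
# Illustrative helper mirroring #embeddings' body construction.
def build_embeddings_body(model:, input:, dimensions: nil)
  body = { model: model, input: input }
  body[:dimensions] = dimensions if dimensions
  body
end

build_embeddings_body(model: "text-embedding-3-small", input: "hello")
# => { model: "text-embedding-3-small", input: "hello" }
build_embeddings_body(model: "text-embedding-3-small", input: "hello",
                      dimensions: 256)[:dimensions]
# => 256
```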

#temperature_unsupported?(model) ⇒ Boolean

Checks whether the model lacks support for the temperature parameter. This applies to GPT-5 series and o-series (o1, o3, o4-mini, etc.) models.

Parameters:

  • model (String)

    The model name

Returns:

  • (Boolean)


# File 'lib/ruby-spacy/openai_client.rb', line 70

def temperature_unsupported?(model)
  name = model.to_s
  name.start_with?("gpt-5") || name.match?(/\Ao\d/)
end
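The predicate matches either the "gpt-5" prefix or a leading "o" followed by a digit, so "gpt-4o" is not caught by the regex despite containing an "o":

```ruby
# Standalone copy of the predicate above, for demonstration.
def temperature_unsupported?(model)
  name = model.to_s
  name.start_with?("gpt-5") || name.match?(/\Ao\d/)
end

temperature_unsupported?("gpt-5-mini") # => true
temperature_unsupported?("o4-mini")    # => true
temperature_unsupported?("gpt-4o")     # => false
```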