Class: Spacy::OpenAIHelper

Inherits:
Object
Defined in:
lib/ruby-spacy/openai_helper.rb

Overview

A helper class for OpenAI API interactions, designed to work with spaCy’s linguistic analysis via the block-based Language#with_openai API.

Examples:

Basic usage with linguistic_summary

nlp = Spacy::Language.new("en_core_web_sm")
nlp.with_openai(model: "gpt-5-mini") do |ai|
  doc = nlp.read("Apple Inc. was founded by Steve Jobs.")
  ai.chat(system: "Analyze the linguistic data.", user: doc.linguistic_summary)
end

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(access_token: nil, model: "gpt-5-mini", max_completion_tokens: 1000, temperature: 0.7) ⇒ OpenAIHelper

Creates a new OpenAIHelper instance.

Parameters:

  • access_token (String, nil) (defaults to: nil)

    OpenAI API key (defaults to OPENAI_API_KEY env var)

  • model (String) (defaults to: "gpt-5-mini")

    the default model for chat requests

  • max_completion_tokens (Integer) (defaults to: 1000)

    default maximum tokens in responses

  • temperature (Float) (defaults to: 0.7)

    default sampling temperature



# File 'lib/ruby-spacy/openai_helper.rb', line 22

def initialize(access_token: nil, model: "gpt-5-mini",
               max_completion_tokens: 1000, temperature: 0.7)
  @access_token = access_token || ENV["OPENAI_API_KEY"]
  raise "Error: OPENAI_API_KEY is not set" unless @access_token

  @model = model
  @default_max_completion_tokens = max_completion_tokens
  @default_temperature = temperature
  @client = OpenAIClient.new(access_token: @access_token)
end
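
The token fallback above can be exercised on its own; the sketch below mirrors the constructor's resolution logic (the method name `resolve_access_token` is illustrative, not part of the API):

```ruby
# Mirrors the constructor's token resolution: an explicit argument wins,
# then the OPENAI_API_KEY environment variable; anything else raises.
def resolve_access_token(explicit, env = ENV)
  token = explicit || env["OPENAI_API_KEY"]
  raise "Error: OPENAI_API_KEY is not set" unless token
  token
end

resolve_access_token("sk-test", {})
# => "sk-test"
resolve_access_token(nil, { "OPENAI_API_KEY" => "sk-env" })
# => "sk-env"
```

Passing neither an explicit token nor setting the environment variable raises at construction time, so a missing key fails fast rather than on the first API call.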

Instance Attribute Details

#model ⇒ String (readonly)

Returns the default model for chat requests.

Returns:

  • (String)

    the default model for chat requests



# File 'lib/ruby-spacy/openai_helper.rb', line 15

def model
  @model
end

Instance Method Details

#chat(system: nil, user: nil, messages: nil, model: nil, max_completion_tokens: nil, temperature: nil, response_format: nil, raw: false) ⇒ String, ...

Sends a chat completion request to OpenAI.

Provides convenient system: and user: keyword arguments as shortcuts for building simple message arrays. For more complex conversations, pass a full messages: array directly.

Parameters:

  • system (String, nil) (defaults to: nil)

    system message content (shortcut)

  • user (String, nil) (defaults to: nil)

    user message content (shortcut)

  • messages (Array<Hash>, nil) (defaults to: nil)

    full message array (overrides system:/user:)

  • model (String, nil) (defaults to: nil)

    model override (defaults to instance model)

  • max_completion_tokens (Integer, nil) (defaults to: nil)

    token limit override

  • temperature (Float, nil) (defaults to: nil)

    temperature override

  • response_format (Hash, nil) (defaults to: nil)

    response format (e.g., { type: "json_object" })

  • raw (Boolean) (defaults to: false)

    if true, returns the full API response Hash instead of text

Returns:

  • (String, Hash, nil)

    the response text, full response Hash (if raw:), or nil on error



# File 'lib/ruby-spacy/openai_helper.rb', line 48

def chat(system: nil, user: nil, messages: nil,
         model: nil, max_completion_tokens: nil,
         temperature: nil, response_format: nil, raw: false)
  msgs = messages || build_messages(system: system, user: user)
  raise ArgumentError, "No messages provided. Use system:/user: or messages:" if msgs.empty?

  response = @client.chat(
    model: model || @model,
    messages: msgs,
    max_completion_tokens: max_completion_tokens || @default_max_completion_tokens,
    temperature: temperature || @default_temperature,
    response_format: response_format
  )

  raw ? response : response.dig("choices", 0, "message", "content")
rescue OpenAIClient::APIError => e
  puts "Error: OpenAI API call failed - #{e.message}"
  nil
end
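
The private `build_messages` helper called above is not documented on this page; a minimal sketch consistent with the call site (assumed behavior: omit whichever part is nil, keep the system message first) might look like:

```ruby
# Assembles a Chat Completions message array from optional
# system and user strings, skipping whichever is nil.
def build_messages(system: nil, user: nil)
  msgs = []
  msgs << { role: "system", content: system } if system
  msgs << { role: "user", content: user } if user
  msgs
end

build_messages(system: "Be concise.", user: "Hello")
# => [{ role: "system", content: "Be concise." },
#     { role: "user", content: "Hello" }]
```

When both `system:`/`user:` and `messages:` are nil, the resulting empty array triggers the ArgumentError shown in the method body.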

#embeddings(text, model: "text-embedding-3-small", dimensions: nil) ⇒ Array<Float>?

Generates text embeddings using OpenAI’s embeddings API.

Parameters:

  • text (String)

    the text to embed

  • model (String) (defaults to: "text-embedding-3-small")

    the embeddings model

  • dimensions (Integer, nil) (defaults to: nil)

    number of dimensions (nil uses model default)

Returns:

  • (Array<Float>, nil)

    the embedding vector, or nil on error



# File 'lib/ruby-spacy/openai_helper.rb', line 74

def embeddings(text, model: "text-embedding-3-small", dimensions: nil)
  response = @client.embeddings(model: model, input: text, dimensions: dimensions)
  response.dig("data", 0, "embedding")
rescue OpenAIClient::APIError => e
  puts "Error: OpenAI API call failed - #{e.message}"
  nil
end
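
Embedding vectors returned by this method are typically compared with cosine similarity; a small plain-Ruby helper (not part of OpenAIHelper) illustrates the usual comparison:

```ruby
# Cosine similarity between two equal-length embedding vectors.
# Returns a value in [-1.0, 1.0]; 1.0 means identical direction.
def cosine_similarity(a, b)
  dot  = a.zip(b).sum { |x, y| x * y }
  norm = Math.sqrt(a.sum { |x| x * x }) * Math.sqrt(b.sum { |x| x * x })
  norm.zero? ? 0.0 : dot / norm
end

cosine_similarity([1.0, 0.0], [1.0, 0.0]) # => 1.0
cosine_similarity([1.0, 0.0], [0.0, 1.0]) # => 0.0
```

Because `embeddings` returns nil on API error, check the result before passing it to a comparison like this.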