Class: Spacy::OpenAIHelper

Inherits: Object
Defined in: lib/ruby-spacy/openai_helper.rb
Overview
A helper class for OpenAI API interactions, designed to work with spaCy’s linguistic analysis via the block-based Language#with_openai API.
Instance Attribute Summary

- #model ⇒ String (readonly)
  The default model for chat requests.
Instance Method Summary

- #chat(system: nil, user: nil, messages: nil, model: nil, max_completion_tokens: nil, temperature: nil, response_format: nil, raw: false) ⇒ String, ...
  Sends a chat completion request to OpenAI.
- #embeddings(text, model: "text-embedding-3-small", dimensions: nil) ⇒ Array<Float>?
  Generates text embeddings using OpenAI's embeddings API.
- #initialize(access_token: nil, model: "gpt-5-mini", max_completion_tokens: 1000, temperature: 0.7) ⇒ OpenAIHelper (constructor)
  Creates a new OpenAIHelper instance.
Constructor Details
#initialize(access_token: nil, model: "gpt-5-mini", max_completion_tokens: 1000, temperature: 0.7) ⇒ OpenAIHelper
Creates a new OpenAIHelper instance.
# File 'lib/ruby-spacy/openai_helper.rb', line 22

def initialize(access_token: nil, model: "gpt-5-mini", max_completion_tokens: 1000, temperature: 0.7)
  @access_token = access_token || ENV["OPENAI_API_KEY"]
  raise "Error: OPENAI_API_KEY is not set" unless @access_token

  @model = model
  @default_max_completion_tokens = max_completion_tokens
  @default_temperature = temperature
  @client = OpenAIClient.new(access_token: @access_token)
end
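The constructor's token fallback can be sketched in isolation: an explicit access_token: wins, otherwise OPENAI_API_KEY is read from the environment. The helper below is a standalone illustration of that logic, not part of the class.

```ruby
# Standalone sketch of the constructor's access-token resolution:
# an explicit token wins; otherwise fall back to the environment.
def resolve_token(explicit, env = ENV)
  token = explicit || env["OPENAI_API_KEY"]
  raise "Error: OPENAI_API_KEY is not set" unless token
  token
end

resolve_token("sk-explicit")                          # => "sk-explicit"
resolve_token(nil, { "OPENAI_API_KEY" => "sk-env" })  # => "sk-env"
```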
Instance Attribute Details
#model ⇒ String (readonly)
Returns the default model for chat requests.
# File 'lib/ruby-spacy/openai_helper.rb', line 15

def model
  @model
end
Instance Method Details
#chat(system: nil, user: nil, messages: nil, model: nil, max_completion_tokens: nil, temperature: nil, response_format: nil, raw: false) ⇒ String, ...
Sends a chat completion request to OpenAI.
Provides convenient system: and user: keyword arguments as shortcuts for building simple message arrays. For more complex conversations, pass a full messages: array directly.
# File 'lib/ruby-spacy/openai_helper.rb', line 48

def chat(system: nil, user: nil, messages: nil, model: nil, max_completion_tokens: nil, temperature: nil, response_format: nil, raw: false)
  # messages: takes precedence; otherwise build the array from the
  # system:/user: shortcuts
  msgs = messages || [
    ({ role: "system", content: system } if system),
    ({ role: "user", content: user } if user)
  ].compact
  raise ArgumentError, "No messages provided. Use system:/user: or messages:" if msgs.empty?

  response = @client.chat(
    model: model || @model,
    messages: msgs,
    max_completion_tokens: max_completion_tokens || @default_max_completion_tokens,
    temperature: temperature || @default_temperature,
    response_format: response_format
  )

  raw ? response : response.dig("choices", 0, "message", "content")
rescue OpenAIClient::APIError => e
  puts "Error: OpenAI API call failed - #{e.message}"
  nil
end
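The system:/user: shortcuts presumably expand into a standard Chat Completions messages array. A minimal sketch of that expansion follows; the helper name and exact hash shape are assumptions for illustration, not the library's actual internals.

```ruby
# Hypothetical expansion of the system:/user: shortcuts into the
# messages array shape the OpenAI chat API expects.
def build_messages(system: nil, user: nil)
  msgs = []
  msgs << { role: "system", content: system } if system
  msgs << { role: "user", content: user } if user
  msgs
end

build_messages(system: "Answer briefly.", user: "What is a lemma?")
# => [{ role: "system", content: "Answer briefly." },
#     { role: "user", content: "What is a lemma?" }]
```

For multi-turn conversations, pass a full messages: array to #chat instead of the shortcuts.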
#embeddings(text, model: "text-embedding-3-small", dimensions: nil) ⇒ Array<Float>?
Generates text embeddings using OpenAI’s embeddings API.
# File 'lib/ruby-spacy/openai_helper.rb', line 74

def embeddings(text, model: "text-embedding-3-small", dimensions: nil)
  response = @client.embeddings(model: model, input: text, dimensions: dimensions)
  response.dig("data", 0, "embedding")
rescue OpenAIClient::APIError => e
  puts "Error: OpenAI API call failed - #{e.message}"
  nil
end
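Embedding vectors returned by #embeddings are typically compared with cosine similarity. A small helper for that comparison, shown here as an illustration rather than part of OpenAIHelper:

```ruby
# Cosine similarity between two embedding vectors (Array<Float>),
# ranging from -1.0 (opposite) to 1.0 (identical direction).
def cosine_similarity(a, b)
  dot = a.zip(b).sum { |x, y| x * y }
  dot / (Math.sqrt(a.sum { |x| x * x }) * Math.sqrt(b.sum { |y| y * y }))
end

cosine_similarity([1.0, 0.0], [1.0, 0.0])  # => 1.0
```

Because #embeddings returns nil on API errors, check the result before comparing vectors.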