Class: OllamaChat::MessageList

Inherits:
Object
Includes:
MessageType, Term::ANSIColor
Defined in:
lib/ollama_chat/message_list.rb

Instance Attribute Summary

Instance Method Summary

Methods included from MessageType

#message_type

Constructor Details

#initialize(chat) ⇒ MessageList

The initialize method sets up the message list for an OllamaChat session.

Parameters:

  • chat (OllamaChat::Chat)

    the chat session this message list belongs to

# File 'lib/ollama_chat/message_list.rb', line 9

def initialize(chat)
  @chat     = chat
  @messages = []
end
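
A minimal construction sketch; `chat` is a stand-in for the surrounding chat object the list is created for (an assumption for illustration):

# `chat` is assumed to be the OllamaChat chat session this list belongs to.
list = OllamaChat::MessageList.new(chat)
list.size # => 0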

Instance Attribute Details

#messages ⇒ Object (readonly)

Returns the value of attribute messages.



# File 'lib/ollama_chat/message_list.rb', line 16

def messages
  @messages
end

#system ⇒ Object (readonly)

Returns the value of attribute system.



# File 'lib/ollama_chat/message_list.rb', line 14

def system
  @system
end

Instance Method Details

#<<(message) ⇒ OllamaChat::MessageList

The << operator appends a message to the list of messages and returns self.

Parameters:

  • message (Ollama::Message)

    the message to append

Returns:

  • (OllamaChat::MessageList)

    self, so that appends can be chained

# File 'lib/ollama_chat/message_list.rb', line 38

def <<(message)
  @messages << message
  self
end
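
Continuing the sketch above, messages can be appended and the calls chained, since << returns the list itself:

list << Ollama::Message.new(role: 'user', content: 'Hello!')
list << Ollama::Message.new(role: 'assistant', content: 'Hi, how can I help?')
list.size # => 2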

#at_location ⇒ String

The at_location method returns the location/time/units information as a string if location support is enabled, or an empty string otherwise.

Returns:

  • (String)

    the location information



# File 'lib/ollama_chat/message_list.rb', line 208

def at_location
  if @chat.location.on?
    location_name            = config.location.name
    location_decimal_degrees = config.location.decimal_degrees * ', '
    localtime                = Time.now.iso8601
    units                    = config.location.units
    config.prompts.location % {
      location_name:, location_decimal_degrees:, localtime:, units:,
    }
  end.to_s
end
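
A small sketch of the two outcomes (assuming the chat's location switch controls the result as shown in the source above):

list.at_location # => "" while location support is disabled
list.at_location # => the config.prompts.location template filled in with name,
                 #    coordinates, local time and units, once location is enabled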

#clear ⇒ OllamaChat::MessageList

The clear method removes all non-system messages from the message list.

Returns:

  • (OllamaChat::MessageList)

    self, with only the system messages remaining

# File 'lib/ollama_chat/message_list.rb', line 28

def clear
  @messages.delete_if { _1.role != 'system' }
  self
end
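
A sketch showing that only non-system messages are removed:

list.set_system_prompt 'You are a helpful assistant.'
list << Ollama::Message.new(role: 'user', content: 'Hello!')
list.clear
list.size # => 1, only the system message remains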

#drop(n) ⇒ Integer

The drop method removes the last n exchanges from the message list and returns the number of removed exchanges.

Parameters:

  • n (Integer)

    the number of exchanges to remove

Returns:

  • (Integer)

    the number of removed exchanges, or 0 if there are no more exchanges to pop



# File 'lib/ollama_chat/message_list.rb', line 126

def drop(n)
  if @messages.reject { _1.role == 'system' }.size > 1
    n = n.to_i.clamp(1, Float::INFINITY)
    r = @messages.pop(2 * n)
    m = r.size / 2
    STDOUT.puts "Dropped the last #{m} exchanges."
    m
  else
    STDOUT.puts "No more exchanges you can drop."
    0
  end
end
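
A usage sketch; an exchange is a user message followed by an assistant reply:

list.drop(1) # prints "Dropped the last 1 exchanges." and returns 1
list.drop(5) # drops at most the remaining exchanges and reports the actual count;
             # returns 0 once no droppable exchanges are left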

#last ⇒ Ollama::Message

Returns the last message from the conversation.

Returns:

  • (Ollama::Message)

    The last message in the conversation, or nil if there are no messages.



# File 'lib/ollama_chat/message_list.rb', line 47

def last
  @messages.last
end

#list_conversation(last = nil) ⇒ OllamaChat::MessageList

The list_conversation method displays the last n messages from the conversation.

Parameters:

  • last (Integer) (defaults to: nil)

    the number of messages to display (default: nil)

Returns:

  • (OllamaChat::MessageList)

    self

# File 'lib/ollama_chat/message_list.rb', line 100

def list_conversation(last = nil)
  last = (last || @messages.size).clamp(0, @messages.size)
  @messages[-last..-1].to_a.each do |m|
    role_color = case m.role
                 when 'user' then 172
                 when 'assistant' then 111
                 when 'system' then 213
                 else 210
                 end
    content = m.content.full? { @chat.markdown.on? ? Kramdown::ANSI.parse(_1) : _1 }
    message_text = message_type(m.images) + " "
    message_text += bold { color(role_color) { m.role } }
    message_text += ":\n#{content}"
    m.images.full? { |images|
      message_text += "\nImages: " + italic { images.map(&:path) * ', ' }
    }
    STDOUT.puts message_text
  end
  self
end
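
A usage sketch:

list.list_conversation    # print the whole conversation
list.list_conversation(2) # print only the most recent two messages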

#load_conversation(filename) ⇒ OllamaChat::MessageList

The load_conversation method loads a conversation from a file and populates the message list.

Parameters:

  • filename (String)

    the path to the file containing the conversation

Returns:

  • (OllamaChat::MessageList)

    self after loading the messages, or nil if the file does not exist

# File 'lib/ollama_chat/message_list.rb', line 67

def load_conversation(filename)
  unless File.exist?(filename)
    STDOUT.puts "File #{filename} doesn't exist. Choose another filename."
    return
  end
  @messages =
    File.open(filename, 'r') do |output|
      JSON(output.read).map { Ollama::Message.from_hash(_1) }
    end
  self
end

#save_conversation(filename) ⇒ OllamaChat::MessageList

The save_conversation method saves the current conversation to a file.

Parameters:

  • filename (String)

    the path where the conversation will be saved

Returns:

  • (OllamaChat::MessageList)

    self after saving, or nil if the file already exists

# File 'lib/ollama_chat/message_list.rb', line 84

def save_conversation(filename)
  if File.exist?(filename)
    STDOUT.puts "File #{filename} already exists. Choose another filename."
    return
  end
  File.open(filename, 'w') do |output|
    output.puts JSON(@messages)
  end
  self
end
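
A round-trip sketch; the filename is only an example:

list.save_conversation('conversation.json') # refuses to overwrite an existing file
other = OllamaChat::MessageList.new(chat)
other.load_conversation('conversation.json')
other.size == list.size # => true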

#second_last ⇒ Ollama::Message

The second_last method returns the second-to-last message from the conversation if there is more than one non-system message.

Returns:

  • (Ollama::Message)

    the second-to-last message



# File 'lib/ollama_chat/message_list.rb', line 55

def second_last
  if @messages.reject { _1.role == 'system' }.size > 1
    @messages[-2]
  end
end

#set_system_prompt(system) ⇒ OllamaChat::MessageList

The set_system_prompt method sets the system prompt for the chat session. Doing so clears all existing messages, so afterwards the list contains only the system prompt.

Parameters:

  • system (String)

    the new system prompt

Returns:

  • (OllamaChat::MessageList)

    self

# File 'lib/ollama_chat/message_list.rb', line 146

def set_system_prompt(system)
  @system = system.to_s
  @messages.clear
  @messages << Ollama::Message.new(role: 'system', content: self.system)
  self
end
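
A usage sketch:

list.set_system_prompt 'Answer as briefly as possible.'
list.size      # => 1
list.last.role # => "system"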

#show_system_prompt ⇒ self, NilClass

The show_system_prompt method displays the system prompt configured for the chat session.

It retrieves the system prompt from the @system instance variable, parses it using Kramdown::ANSI, and removes any trailing newlines. If the resulting string is empty, the method returns immediately.

Otherwise, it prints a formatted message to the console, including the configured system prompt and its length in characters.

Returns:

  • (self, NilClass)

    nil if the system prompt is empty, otherwise self.



# File 'lib/ollama_chat/message_list.rb', line 164

def show_system_prompt
  system_prompt = Kramdown::ANSI.parse(system.to_s).gsub(/\n+\z/, '').full?
  system_prompt or return
  STDOUT.puts <<~EOT
    Configured system prompt is:
    #{system_prompt}

    System prompt length: #{bold{system_prompt.size}} characters.
  EOT
  self
end

#size ⇒ Integer

Returns the number of messages stored in the message list.

Returns:

  • (Integer)

    The size of the message list.



# File 'lib/ollama_chat/message_list.rb', line 21

def size
  @messages.size
end

#to_ary ⇒ Array

The to_ary method converts the message list into an array of Ollama::Message objects. If location support is enabled and the message list contains a system message, the system message is decorated with the current location, time, and unit preferences.

Returns:

  • (Array)

    An array of Ollama::Message objects representing the messages in the list.



# File 'lib/ollama_chat/message_list.rb', line 183

def to_ary
  location = at_location.full?
  add_system = !!location
  result = @messages.map do |message|
    if message.role == 'system' && location
      add_system = false
      content = message.content + "\n\n#{location}"
      Ollama::Message.new(role: message.role, content:)
    else
      message
    end
  end
  if add_system
    prompt = @chat.config.system_prompts.assistant?
    content = [ prompt, location ].compact * "\n\n"
    message = Ollama::Message.new(role: 'system', content:)
    result.unshift message
  end
  result
end
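
A sketch of the typical use, converting the list to plain Ollama::Message objects before they are handed to the model (how the resulting array is consumed depends on the chat implementation and is assumed here):

messages = list.to_ary
messages.first.role # => "system" whenever a system prompt or location prompt is present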