Class: Chainer::Functions::Activation::LogSoftmax
- Inherits: Chainer::Function
  - Object
  - Chainer::Function
  - Chainer::Functions::Activation::LogSoftmax
- Defined in: lib/chainer/functions/activation/log_softmax.rb
Overview
Log-softmax activation function.
Instance Attribute Summary
Attributes inherited from Chainer::Function
#inputs, #output_data, #outputs, #rank, #retain_after_backward
Class Method Summary
- .log_softmax(x) ⇒ Chainer::Variable
  Channel-wise log-softmax function.
Instance Method Summary
- #backward(x, gy) ⇒ Object
- #forward(xs) ⇒ Object
Methods inherited from Chainer::Function
#call, #forward_cpu, #initialize, #retain_inputs, #retain_outputs
Constructor Details
This class inherits a constructor from Chainer::Function
Class Method Details
.log_softmax(x) ⇒ Chainer::Variable
log(softmax(x)) may cause underflow when x is too small, because softmax(x) may return 0. The log_softmax method is more stable.
Channel-wise log-softmax function.
This function computes the logarithm of the softmax along the second axis. Let $c = (c_1, c_2, \dots, c_D)$ be a slice of x along the second axis. For each slice $c$, it computes the logarithm of the function $f(c)$ defined as
$$ f(c) = \frac{\exp(c)}{\sum_{d} \exp(c_d)}. $$
This method is theoretically equivalent to log(softmax(x)) but is numerically more stable.
# File 'lib/chainer/functions/activation/log_softmax.rb', line 58

def self.log_softmax(x)
  self.new.(x)
end
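A minimal usage sketch (an illustration, not part of the library docs), assuming the red-chainer and numo-narray gems are available and that, as in Chainer, a plain Numo::NArray is accepted as input:

require 'chainer'
require 'numo/narray'

# Two rows, three channels each; log-softmax is taken along the second axis.
x = Numo::DFloat[[1.0, 2.0, 3.0], [0.0, 0.0, 0.0]]
y = Chainer::Functions::Activation::LogSoftmax.log_softmax(x)

# Each row of exp(y.data) sums to 1, because y = log(softmax(x)).
p Numo::NMath.exp(y.data).sum(axis: 1)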
Instance Method Details
#backward(x, gy) ⇒ Object
# File 'lib/chainer/functions/activation/log_softmax.rb', line 71

def backward(x, gy)
  y = @output_data[0]
  gx = gy[0] - Numo::NMath.exp(y) * gy[0].sum(axis: 1, keepdims: true)
  [gx]
end
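The expression above is the standard gradient of log-softmax, sketched here for clarity (a derivation, not library documentation). With $y = \log \mathrm{softmax}(x)$ and upstream gradient $g_y$,
$$ \frac{\partial y_i}{\partial x_j} = \delta_{ij} - \mathrm{softmax}(x)_j = \delta_{ij} - \exp(y_j), \qquad (g_x)_j = (g_y)_j - \exp(y_j) \sum_{d} (g_y)_d, $$
which matches gy[0] - Numo::NMath.exp(y) * gy[0].sum(axis: 1, keepdims: true), with the sum taken channel-wise along axis 1.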
#forward(xs) ⇒ Object
# File 'lib/chainer/functions/activation/log_softmax.rb', line 62

def forward(xs)
  y = Chainer::Functions::Activation._log_softmax(xs[0])
  @x_shape = xs[0].shape
  @x_dtype = xs[0].class
  retain_inputs([])
  retain_outputs([0])
  [y]
end
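The forward pass delegates the actual computation to the module-level helper Chainer::Functions::Activation._log_softmax. The standard way to make log-softmax numerically stable is the log-sum-exp trick: subtract the per-row maximum before exponentiating. Below is a minimal, self-contained Numo sketch of that general technique (an illustration under stated assumptions, not the library's helper), relying on the same axis:/keepdims: reduction keywords used in #backward above:

require 'numo/narray'

# Numerically stable log-softmax along axis 1 (the channel axis):
# log_softmax(x) = (x - m) - log(sum(exp(x - m))), where m is the row-wise maximum.
def stable_log_softmax(x)
  m = x.max(axis: 1, keepdims: true)
  shifted = x - m
  shifted - Numo::NMath.log(Numo::NMath.exp(shifted).sum(axis: 1, keepdims: true))
end

p stable_log_softmax(Numo::DFloat[[-1000.0, 0.0]])
# => [[-1000.0, 0.0]]; a naive log(softmax(x)) would give -Infinity for the first entry,
#    because softmax underflows to 0 there.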