Class: Chainer::Functions::Activation::LogSoftmax
- Inherits: Chainer::FunctionNode
  - Object
  - Chainer::FunctionNode
  - Chainer::Functions::Activation::LogSoftmax
- Defined in: lib/chainer/functions/activation/log_softmax.rb
Overview
Log-softmax activation function.
Instance Attribute Summary
Attributes inherited from Chainer::FunctionNode
Class Method Summary
- .log_softmax(x) ⇒ Chainer::Variable
  Channel-wise log-softmax function.
Instance Method Summary
- #backward(indexes, gy) ⇒ Object
- #forward(xs) ⇒ Object
Methods inherited from Chainer::FunctionNode
#apply, #backward_accumulate, #forward_cpu, #get_retained_inputs, #get_retained_outputs, #initialize, #label, #output_data, #retain_inputs, #retain_outputs, #unchain
Constructor Details
This class inherits a constructor from Chainer::FunctionNode
Class Method Details
.log_softmax(x) ⇒ Chainer::Variable
log(softmax(x)) may cause underflow when x is too small, because softmax(x) may return 0. The log_softmax method is more numerically stable.
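A minimal sketch (not from the source) of the underflow described above, using plain Numo::NArray: for a widely spread input, the naive route rounds the small probability down to 0, so its logarithm becomes -Infinity, while the log-softmax form stays finite.

require 'numo/narray'

x = Numo::DFloat[-1000.0, 0.0]

# Naive route: exponentiate, normalize, then take the log.
# exp(-1000) underflows to 0.0, so its log is -Infinity.
softmax = Numo::NMath.exp(x) / Numo::NMath.exp(x).sum
p Numo::NMath.log(softmax)                     # first entry is -Infinity

# Log-softmax form: subtract the log of the normalizer instead.
p x - Numo::NMath.log(Numo::NMath.exp(x).sum)  # first entry is -1000.0, still finite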
Channel-wise log-softmax function.
This function computes the logarithm of the softmax along the second axis. Let $c = (c_1, c_2, \dots, c_D)$ be a slice of x along the second axis. For each slice $c$, it computes the logarithm of the function $f(c)$ defined as
$$ f(c) = \frac{\exp(c)}{\sum_{d} \exp(c_d)}. $$
This method is theoretically equivalent to log(softmax(x)) but is more numerically stable.
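A usage sketch, assuming a Numo::DFloat batch wrapped in Chainer::Variable (the printed values are not verified output): the function normalizes along the second axis, so exponentiating each row of the result gives probabilities that sum to 1.

require 'chainer'

x = Chainer::Variable.new(Numo::DFloat[[1.0, 2.0, 3.0], [0.0, 0.0, 0.0]])
y = Chainer::Functions::Activation::LogSoftmax.log_softmax(x)

p y.data                                # log-probabilities, shape [2, 3]
p Numo::NMath.exp(y.data).sum(axis: 1)  # each row sums to roughly 1.0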
# File 'lib/chainer/functions/activation/log_softmax.rb', line 59

def self.log_softmax(x)
  self.new.apply([x]).first
end
Instance Method Details
#backward(indexes, gy) ⇒ Object
# File 'lib/chainer/functions/activation/log_softmax.rb', line 71

def backward(indexes, gy)
  y = get_retained_outputs.first
  LogSoftmaxGrad.new(@x_shape, @x_dtype).apply([y, gy[0]])
end
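A hedged sketch of how this backward pass is typically reached: seeding the output gradient on the result Variable and calling backward should route gy through LogSoftmaxGrad as shown above (the exact gradient values are not reproduced here).

x = Chainer::Variable.new(Numo::DFloat[[1.0, 2.0, 3.0]])
y = Chainer::Functions::Activation::LogSoftmax.log_softmax(x)

# Seed dL/dy with ones and backpropagate; x.grad then holds dL/dx.
y.grad = Numo::DFloat.ones(*y.shape)
y.backward
p x.grad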
#forward(xs) ⇒ Object
# File 'lib/chainer/functions/activation/log_softmax.rb', line 63

def forward(xs)
  y = Chainer::Functions::Activation._log_softmax(xs[0])
  @x_shape = xs[0].shape
  @x_dtype = xs[0].class
  retain_outputs([0])
  [y]
end
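For reference, a stand-alone sketch of the numerically stable computation a helper such as Chainer::Functions::Activation._log_softmax is expected to perform (the real helper lives elsewhere in the library; this version is only illustrative): shift each row by its maximum, then subtract the log-sum-exp.

require 'numo/narray'

# Illustrative stand-in, not the library helper itself.
def stable_log_softmax(x)
  shifted = x - x.max(axis: 1, keepdims: true)  # shift by the row maximum so exp cannot overflow
  shifted - Numo::NMath.log(Numo::NMath.exp(shifted).sum(axis: 1, keepdims: true))
end

p stable_log_softmax(Numo::DFloat[[1.0, 2.0, 3.0]])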