Module: Chainer::Functions::Activation

Defined in:
lib/chainer/functions/activation/relu.rb,
lib/chainer/functions/activation/tanh.rb,
lib/chainer/functions/activation/sigmoid.rb,
lib/chainer/functions/activation/leaky_relu.rb,
lib/chainer/functions/activation/relu_grad2.rb,
lib/chainer/functions/activation/log_softmax.rb,
lib/chainer/functions/activation/sigmoid_grad.rb

Defined Under Namespace

Classes: LeakyReLU, LeakyReLUGrad, LogSoftmax, LogSoftmaxGrad, ReLUGrad2, Relu, Sigmoid, SigmoidGrad, Tanh, TanhGrad

Class Method Summary

Class Method Details

._log_softmax(x) ⇒ Object

Computes the log-softmax of x along axis 1, i.e. x - logsumexp(x).



# File 'lib/chainer/functions/activation/log_softmax.rb', line 14

def self._log_softmax(x)
  log_z = logsumexp(x)
  x - log_z
end
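The same computation can be sketched in plain Ruby for a single row of scores (this is an illustration only, not the library code — the real method operates on Numo/Cumo arrays along axis 1):

```ruby
# Log-softmax for one row: subtract logsumexp(row) from each element.
# Uses the max-shift trick so exp never overflows for large inputs.
def log_softmax_row(row)
  m = row.max
  log_z = m + Math.log(row.sum { |v| Math.exp(v - m) })
  row.map { |v| v - log_z }
end
```

Exponentiating the result recovers the softmax probabilities, which sum to 1:

```ruby
probs = log_softmax_row([1.0, 2.0, 3.0]).map { |v| Math.exp(v) }
```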

.logsumexp(x) ⇒ Object

Computes log(sum(exp(x))) along axis 1, shifting by the row maximum for numerical stability.



# File 'lib/chainer/functions/activation/log_softmax.rb', line 4

def self.logsumexp(x)
  xm = Chainer.get_array_module(x)
  m = x.max(axis: 1, keepdims: true)
  y = x - m
  y = xm::NMath.exp(y)
  s = y.sum(axis: 1, keepdims: true)
  s = xm::NMath.log(s)
  m + s
end
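The max-shift trick above can be shown in plain Ruby on a single row (a sketch for illustration, independent of Numo/Cumo): a naive `log(sum(exp(x)))` overflows to Infinity once any element exceeds roughly 709, while subtracting the maximum first keeps every exponent at or below zero.

```ruby
# Numerically stable logsumexp for one row of floats:
# log(sum(exp(x))) == m + log(sum(exp(x - m))), where m = x.max.
def logsumexp_row(row)
  m = row.max
  m + Math.log(row.sum { |v| Math.exp(v - m) })
end
```

For example, `logsumexp_row([1000.0, 1000.0])` returns 1000 + log(2), whereas `Math.exp(1000.0)` alone is already Infinity.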