Class: Chainer::Functions::Activation::Sigmoid

Inherits:
Chainer::FunctionNode
Defined in:
lib/chainer/functions/activation/sigmoid.rb

Overview

Logistic sigmoid function.

Instance Attribute Summary

Attributes inherited from Chainer::FunctionNode

#inputs, #outputs, #rank

Class Method Summary

.sigmoid(x) ⇒ Chainer::Variable
    Element-wise sigmoid logistic function.

Instance Method Summary

#backward(indexes, grad_outputs) ⇒ Object
#forward(inputs) ⇒ Object

Methods inherited from Chainer::FunctionNode

#apply, #backward_accumulate, #forward_cpu, #get_retained_inputs, #get_retained_outputs, #initialize, #label, #output_data, #retain_inputs, #retain_outputs, #unchain

Constructor Details

This class inherits a constructor from Chainer::FunctionNode

Class Method Details

.sigmoid(x) ⇒ Chainer::Variable

Element-wise sigmoid logistic function.

$$ f(x) = (1 + \exp(-x))^{-1}. $$

Examples:

It maps the input values into the range of $[0, 1]$.

> x = Numo::SFloat.new(3).seq(-2, 2)
=> Numo::SFloat#shape=[3]
[-2, 0, 2]
> F = Chainer::Functions::Activation::Sigmoid
> F.sigmoid(x).data
=> Numo::SFloat#shape=[3]
[0.119203, 0.5, 0.880797]

Parameters:

  • x (Chainer::Variable or Numo::NArray or Cumo::NArray)

    Input variable. A $(s_1, s_2, …, s_N)$-shaped float array.

Returns:

  • (Chainer::Variable)

    Output variable. A $(s_1, s_2, …, s_N)$-shaped float array.



# File 'lib/chainer/functions/activation/sigmoid.rb', line 23

def self.sigmoid(x)
  self.new.apply([x]).first
end
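For reference, a minimal usage sketch showing the convenience method together with backpropagation. It assumes the usual red-chainer Variable API (Chainer::Variable.new, Variable#grad=, Variable#backward); treat it as an illustration rather than part of this class's documented interface.

require 'chainer'

F = Chainer::Functions::Activation::Sigmoid

# Wrap the input in a Variable so the call is recorded on the computational graph.
x = Chainer::Variable.new(Numo::SFloat[-2.0, 0.0, 2.0])
y = F.sigmoid(x)
p y.data                        # => Numo::SFloat[0.119203, 0.5, 0.880797]

# Seed the output gradient with ones and backpropagate;
# x.grad then equals the element-wise derivative y * (1 - y).
y.grad = Numo::SFloat.ones(3)
y.backward
p x.grad                        # => roughly [0.104994, 0.25, 0.104994]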

Instance Method Details

#backward(indexes, grad_outputs) ⇒ Object



# File 'lib/chainer/functions/activation/sigmoid.rb', line 36

def backward(indexes, grad_outputs)
  x = nil
  y = get_retained_outputs.first
  gy, = grad_outputs
  Chainer::Functions::Activation::SigmoidGrad.new([x]).apply([y, gy])
end
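Because the forward output $y$ is retained, the gradient is computed from the identity $d\sigma/dx = y(1 - y)$: SigmoidGrad multiplies the upstream gradient gy by y * (1 - y) in the reference Chainer implementation, and this class presumably mirrors that. A quick numerical check of the identity itself, using plain Numo outside Chainer:

require 'numo/narray'

# Check dy/dx = y * (1 - y) against a central finite difference.
sigmoid = ->(v) { 1.0 / (1.0 + Numo::NMath.exp(-v)) }

x   = Numo::DFloat[-2.0, 0.0, 2.0]
y   = sigmoid.call(x)
eps = 1e-6

analytic  = y * (1.0 - y)
numerical = (sigmoid.call(x + eps) - sigmoid.call(x - eps)) / (2.0 * eps)

puts((analytic - numerical).abs.max)  # => close to zero; the identity holds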

#forward(inputs) ⇒ Object



# File 'lib/chainer/functions/activation/sigmoid.rb', line 27

def forward(inputs)
  x, = inputs
  half = 0.5
  xm = Chainer.get_array_module(x)
  y = Utils::Array.force_array((xm::NMath.tanh(x * half) * half) + half)
  retain_outputs([0])
  [y]
end
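Note that #forward evaluates the sigmoid through the identity $\sigma(x) = \tanh(x/2)/2 + 1/2$ rather than $1/(1 + \exp(-x))$; the two are algebraically equal, and the tanh form avoids evaluating $\exp(-x)$, which can overflow for large negative single-precision inputs. A minimal sketch in plain Numo (outside Chainer) showing the two forms agree:

require 'numo/narray'

x = Numo::DFloat[-30.0, -2.0, 0.0, 2.0, 30.0]

# Direct logistic form: 1 / (1 + exp(-x))
direct = 1.0 / (1.0 + Numo::NMath.exp(-x))

# tanh-based form used by #forward: 0.5 * tanh(0.5 * x) + 0.5
via_tanh = Numo::NMath.tanh(x * 0.5) * 0.5 + 0.5

puts((direct - via_tanh).abs.max)  # => a tiny value; the forms agree to floating-point precision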