Class: Chainer::Functions::Activation::LeakyReLU

Inherits:
Chainer::FunctionNode
Defined in:
lib/chainer/functions/activation/leaky_relu.rb

Overview

Leaky rectified linear unit (Leaky ReLU).

Instance Attribute Summary

Attributes inherited from Chainer::FunctionNode

#inputs, #outputs, #rank

Class Method Summary

Instance Method Summary

Methods inherited from Chainer::FunctionNode

#apply, #backward_accumulate, #forward_cpu, #get_retained_inputs, #get_retained_outputs, #label, #output_data, #retain_inputs, #retain_outputs, #unchain

Constructor Details

#initialize(slope: 0.2) ⇒ LeakyReLU

Returns a new instance of LeakyReLU.



# File 'lib/chainer/functions/activation/leaky_relu.rb', line 37

def initialize(slope: 0.2)
  # Slope of the negative-input branch.
  @slope = slope
end

Class Method Details

.leaky_relu(x, slope: 0.2) ⇒ Chainer::Variable

Leaky Rectified Linear Unit function.

This function is expressed as

$$ f(x)=\max(x, ax), $$

where $a$ is a configurable slope value.

Examples:

> x = Numo::SFloat[[-1, 0], [2, -3], [-2, 1]]
> x
=> Numo::SFloat#shape=[3,2]
[[-1, 0], 
 [2, -3], 
 [-2, 1]]
> F = Chainer::Functions::Activation::LeakyReLU
> F.leaky_relu(x, slope: 0.2).data
=> Numo::SFloat#shape=[3,2]
[[-0.2, 0], 
 [2, -0.6], 
 [-0.4, 1]]

Parameters:

  • x (Chainer::Variable or Numo::NArray or Cumo::NArray)

    Input variable. A $(s_1, s_2, …, s_N)$-shaped float array.

  • slope (float) (defaults to: 0.2)

    Slope value $a$.

Returns:

  • (Chainer::Variable)

    Output variable. A $(s_1, s_2, …, s_N)$-shaped float array.



# File 'lib/chainer/functions/activation/leaky_relu.rb', line 33

def self.leaky_relu(x, slope: 0.2)
  # Build the function node and apply it; apply returns an array of outputs.
  self.new(slope: slope).apply([x])[0]
end
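
A minimal end-to-end sketch, assuming the Chainer::Variable #grad= and #backward interface this library provides alongside its function nodes; the gradient values follow the derivative rule described under #backward below:

> x = Chainer::Variable.new(Numo::SFloat[[-1, 0], [2, -3]])
> y = Chainer::Functions::Activation::LeakyReLU.leaky_relu(x, slope: 0.2)
> y.grad = Numo::SFloat.ones(2, 2)
> y.backward
> x.grad
=> Numo::SFloat#shape=[2,2]
[[0.2, 1],
 [1, 0.2]]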

Instance Method Details

#backward(indexes, grad_outputs) ⇒ Object



# File 'lib/chainer/functions/activation/leaky_relu.rb', line 53

def backward(indexes, grad_outputs)
  if @slope >= 0
    # The negative mask can be recovered from the retained output alone.
    x = nil
    y = get_retained_outputs.first.data
  else
    # A negative slope flips the sign of negative inputs, so the
    # retained input is needed to recover the mask.
    x = get_retained_inputs.first.data
    y = nil
  end
  LeakyReLUGrad.new(x, y, @slope).apply(grad_outputs)
end
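
The derivative of Leaky ReLU is 1 for nonnegative inputs and $a$ for negative ones. For a positive slope the output keeps the sign of the input, so the negative mask can be recovered from the retained output; a negative slope flips the sign of negative inputs, so the original input must be retained instead. LeakyReLUGrad (defined in the same file) scales the incoming gradient accordingly. A Numo-only sketch of that rule, using standalone arrays for illustration:

> x  = Numo::SFloat[[-1, 0], [2, -3]]
> gy = Numo::SFloat.ones(2, 2)
> gx = gy.dup
> gx[x < 0] *= 0.2
> gx
=> Numo::SFloat#shape=[2,2]
[[0.2, 1],
 [1, 0.2]]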

#forward(inputs) ⇒ Object



# File 'lib/chainer/functions/activation/leaky_relu.rb', line 41

def forward(inputs)
  x, = inputs
  y = x.dup
  # Scale only the negative entries in place: y = max(x, slope * x).
  y[x < 0] *= @slope
  if @slope >= 0
    # The output alone suffices to recover the negative mask in backward.
    retain_outputs([0])
  else
    # A negative slope flips signs, so keep the input for backward.
    retain_inputs([0])
  end
  [y]
end
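
The masked update y[x < 0] *= @slope is ordinary Numo indexing with a Numo::Bit mask; the same values as the example above, computed with Numo alone:

> x = Numo::SFloat[[-1, 0], [2, -3], [-2, 1]]
> y = x.dup
> y[x < 0] *= 0.2
> y
=> Numo::SFloat#shape=[3,2]
[[-0.2, 0],
 [2, -0.6],
 [-0.4, 1]]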