Class: Chainer::Functions::Activation::LeakyReLU
- Inherits: Chainer::FunctionNode
  - Object < Chainer::FunctionNode < Chainer::Functions::Activation::LeakyReLU
- Defined in: lib/chainer/functions/activation/leaky_relu.rb
Overview
Leaky rectifier unit.
Instance Attribute Summary
Attributes inherited from Chainer::FunctionNode
Class Method Summary
- .leaky_relu(x, slope: 0.2) ⇒ Chainer::Variable
  Leaky Rectified Linear Unit function.
Instance Method Summary
- #backward(indexes, grad_outputs) ⇒ Object
- #forward(inputs) ⇒ Object
- #initialize(slope: 0.2) ⇒ LeakyReLU (constructor)
  A new instance of LeakyReLU.
Methods inherited from Chainer::FunctionNode
#apply, #backward_accumulate, #forward_cpu, #get_retained_inputs, #get_retained_outputs, #label, #output_data, #retain_inputs, #retain_outputs, #unchain
Constructor Details
#initialize(slope: 0.2) ⇒ LeakyReLU
Returns a new instance of LeakyReLU.
# File 'lib/chainer/functions/activation/leaky_relu.rb', line 37

def initialize(slope: 0.2)
  @slope = slope
end
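The constructor only stores the slope; the node is normally built indirectly through the .leaky_relu class method below, but it can also be applied directly. A minimal sketch, assuming the inherited FunctionNode#apply protocol and the usual Chainer::Variable / Numo::SFloat setup:

require 'chainer'

x = Chainer::Variable.new(Numo::SFloat[-1.0, 0.0, 2.0])
y = Chainer::Functions::Activation::LeakyReLU.new(slope: 0.1).apply([x])[0]
# y.data => Numo::SFloat[-0.1, 0.0, 2.0]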
Class Method Details
.leaky_relu(x, slope: 0.2) ⇒ Chainer::Variable
Leaky Rectified Linear Unit function.
This function is expressed as
$$ f(x) = \max(x, ax), $$
where $a$ is a configurable slope value. For $0 \le a \le 1$ this reduces to the usual piecewise form: $f(x) = x$ for $x \ge 0$ and $f(x) = ax$ for $x < 0$.
# File 'lib/chainer/functions/activation/leaky_relu.rb', line 33

def self.leaky_relu(x, slope: 0.2)
  self.new(slope: slope).apply([x])[0]
end
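A typical call, as a sketch (assumes the standard red-chainer Variable and Numo::SFloat conventions; the expected values follow directly from $f(x) = \max(x, ax)$):

require 'chainer'

x = Chainer::Variable.new(Numo::SFloat[[-1.0, 2.0], [3.0, -4.0]])
y = Chainer::Functions::Activation::LeakyReLU.leaky_relu(x, slope: 0.2)
# y.data => Numo::SFloat[[-0.2, 2.0], [3.0, -0.8]]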
Instance Method Details
#backward(indexes, grad_outputs) ⇒ Object
# File 'lib/chainer/functions/activation/leaky_relu.rb', line 53

def backward(indexes, grad_outputs)
  if @slope >= 0
    x = nil
    y = get_retained_outputs.first.data
  else
    x = get_retained_inputs.first.data
    y = nil
  end
  LeakyReLUGrad.new(x, y, @slope).apply(grad_outputs)
end
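For context: when the slope is non-negative, the mask of negative inputs can be recovered from the retained output (for a positive slope, y is negative exactly where x was; for a zero slope the gradient there is zero anyway), so only the output is kept and the input can be freed. A negative slope flips the sign of negative entries, so the input itself must be retained. A minimal gradient sketch, assuming Variable#grad= and Variable#backward behave as in upstream Chainer:

require 'chainer'

x = Chainer::Variable.new(Numo::SFloat[-1.0, 2.0])
y = Chainer::Functions::Activation::LeakyReLU.leaky_relu(x, slope: 0.2)
y.grad = Numo::SFloat.ones(2)   # seed the output gradient
y.backward
# x.grad => Numo::SFloat[0.2, 1.0]  (slope where x < 0, 1 elsewhere)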
#forward(inputs) ⇒ Object
# File 'lib/chainer/functions/activation/leaky_relu.rb', line 41

def forward(inputs)
  x, = inputs
  y = x.dup
  y[x < 0] *= @slope
  if @slope >= 0
    retain_outputs([0])
  else
    retain_inputs([0])
  end
  [y]
end
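The forward pass is a masked in-place multiply on a copy of the input. The same computation in isolation with plain Numo::NArray (a sketch; the boolean mask from x < 0 selects the elements to scale, exactly as the method above does):

require 'numo/narray'

x = Numo::SFloat[-1.0, 0.0, 2.0, -3.0]
y = x.dup            # copy so the input stays intact
y[x < 0] *= 0.2      # scale only the negative entries by the slope
# y => Numo::SFloat[-0.2, 0.0, 2.0, -0.6]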