Class: Chainer::Functions::Activation::ReLUGrad2

Inherits:
Chainer::FunctionNode
Defined in:
lib/chainer/functions/activation/relu_grad2.rb

Overview

Computes the gradient of the ReLU function.

This function takes two variables b and c, and computes f(b, c) = sign(b) * c with backpropagation, where operations are done in an elementwise manner and sign(x) = 1 when x > 0 and 0 otherwise. As the gradient of f with respect to b is 0, we do not backpropagate errors toward b for computational efficiency.
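
For example, applying the rule to concrete values (a plain-Ruby sketch of the elementwise computation for illustration only; the actual implementation below operates on Numo::NArray data):

b = [-2.0, 0.0, 3.0]
c = [10.0, 20.0, 30.0]
f = b.zip(c).map { |bi, ci| bi > 0 ? ci : 0.0 }
# f == [0.0, 0.0, 30.0]: c passes through where b is positive, 0 elsewhere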

Instance Attribute Summary

Attributes inherited from Chainer::FunctionNode

#inputs, #outputs, #rank

Instance Method Summary

Methods inherited from Chainer::FunctionNode

#apply, #backward_accumulate, #forward_cpu, #get_retained_inputs, #get_retained_outputs, #label, #output_data, #retain_inputs, #retain_outputs, #unchain

Constructor Details

#initialize(b) ⇒ ReLUGrad2

Returns a new instance of ReLUGrad2.



# File 'lib/chainer/functions/activation/relu_grad2.rb', line 13

def initialize(b)
  # Store only the raw array of b; the Variable itself is not retained,
  # so no gradient is propagated back toward b.
  @b = b.data
end

Instance Method Details

#backward(indexes, gy) ⇒ Object



# File 'lib/chainer/functions/activation/relu_grad2.rb', line 22

def backward(indexes, gy)
  # df(b, c)/dc = sign(b), so scale the incoming gradient by the mask of b.
  [gy[0] * heaviside(@b)]
end
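
Since f(b, c) = sign(b) * c is linear in c, the gradient passed back to c is simply the incoming gradient gy scaled by the same mask, consistent with sign as defined in the overview (1 where b > 0, 0 otherwise). A plain-Ruby illustration of that scaling, kept independent of the Numo-backed heaviside helper used in the real code:

b  = [-2.0, 0.0, 3.0]
gy = [0.5, 0.5, 0.5]
heaviside = ->(x) { x > 0 ? 1.0 : 0.0 }
gc = b.zip(gy).map { |bi, gyi| gyi * heaviside.call(bi) }
# gc == [0.0, 0.0, 0.5]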

#forward(inputs) ⇒ Object



# File 'lib/chainer/functions/activation/relu_grad2.rb', line 17

def forward(inputs)
  # Keep inputs[0] where b > 0 and zero it elsewhere.
  y = inputs[0] * (@b > 0)
  [Utils::Array.force_array(y, y.class)]
end