Class: Newral::Networks::Sigmoid
- Inherits: Perceptron
  - Object
  - Perceptron
  - Newral::Networks::Sigmoid
- Defined in: lib/newral/networks/sigmoid.rb
Instance Attribute Summary
Attributes inherited from Perceptron
#bias, #inputs, #last_output, #weights
Instance Method Summary
-
#adjust_weights(expected: nil, learning_rate: 0.5, layer: :output, weights_at_output_nodes: nil, output: nil) ⇒ Object
Depending on where the neuron is placed, different information is needed to adjust the weights: for output neurons only the expected results are required, while hidden neurons also need the weights at the output nodes and the actual output of the network.
-
#delta_rule(expected: nil) ⇒ Object
For a step-by-step explanation of how this works, see mattmazur.com/2015/03/17/a-step-by-step-backpropagation-example/.
-
#delta_rule_hidden(output: nil, expected: nil, weights_at_output_nodes: nil) ⇒ Object
This only works for a single hidden layer (see stats.stackexchange.com/questions/70168/back-propagation-in-neural-nets-with-2-hidden-layers). The bias is not adjusted, as biases are treated as constants.
- #output ⇒ Object
Methods inherited from Perceptron
#add_input, #calculate_input, #calculate_value, #initialize, #move, #number_of_directions, #set_weights_and_bias, #update_with_vector
Constructor Details
This class inherits a constructor from Newral::Networks::Perceptron
Instance Method Details
#adjust_weights(expected: nil, learning_rate: 0.5, layer: :output, weights_at_output_nodes: nil, output: nil) ⇒ Object
Depending on where the neuron is placed, different information is needed to adjust the weights: for output neurons only the expected results are required, while hidden neurons also need the weights at the output nodes and the actual output of the network.
# File 'lib/newral/networks/sigmoid.rb', line 45

def adjust_weights( expected: nil, learning_rate: 0.5, layer: :output, weights_at_output_nodes: nil, output: nil )
  # output neurons only need the expected values; hidden neurons also need
  # the network output and the weights at the output nodes
  error_delta = layer.to_sym == :output ? delta_rule( expected: expected ) : delta_rule_hidden( output: output, expected: expected, weights_at_output_nodes: weights_at_output_nodes )
  @weights.each_with_index do |weight, idx|
    @weights[idx] = @weights[idx] - error_delta[idx] * learning_rate
  end
end
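The weight update itself is plain gradient descent: each gradient component, scaled by the learning rate, is subtracted from the matching weight. A minimal standalone sketch (not the library class, which mutates `@weights` in place) using the output-layer gradients from the Mazur example the docs link to:

```ruby
# Standalone sketch of the update performed by #adjust_weights:
# new_weight = weight - gradient * learning_rate
def adjust_weights(weights, error_delta, learning_rate = 0.5)
  weights.zip(error_delta).map { |w, d| w - d * learning_rate }
end

# Mazur's example: output weights w5 = 0.40, w6 = 0.45 and their gradients
new_weights = adjust_weights([0.40, 0.45], [0.082167041, 0.082667628])
# new_weights[0] ≈ 0.35891648, matching Mazur's updated w5
```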
#delta_rule(expected: nil) ⇒ Object
For a step-by-step explanation of how this works, see mattmazur.com/2015/03/17/a-step-by-step-backpropagation-example/.
# File 'lib/newral/networks/sigmoid.rb', line 12

def delta_rule( expected: nil )
  error_delta = []
  @weights.each_with_index do |weight, idx|
    input = calculate_input( @inputs[idx] )
    # gradient of the squared error with respect to this weight
    error_delta << -( expected - output ) * output * ( 1 - output ) * input
  end
  error_delta
end
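For an output neuron with squared error E = ½(expected − out)² and logistic activation, the gradient for each incoming weight is −(expected − out) · out · (1 − out) · input. A plain-Ruby sketch (not the library method, which pulls inputs from neuron state via `calculate_input`; here the inputs are assumed to be raw numbers), with values from Mazur's worked example:

```ruby
# Delta rule for one output neuron: one gradient component per input weight.
def delta_rule(expected, out, inputs)
  inputs.map { |input| -(expected - out) * out * (1 - out) * input }
end

# Output neuron o1 in Mazur's example: target 0.01, actual output 0.75136507,
# fed by the hidden-layer outputs [0.593269992, 0.596884378].
deltas = delta_rule(0.01, 0.75136507, [0.593269992, 0.596884378])
# deltas[0] ≈ 0.082167, so w5 = 0.40 - 0.5 * 0.082167 ≈ 0.35892
```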
#delta_rule_hidden(output: nil, expected: nil, weights_at_output_nodes: nil) ⇒ Object
This only works for a single hidden layer (see stats.stackexchange.com/questions/70168/back-propagation-in-neural-nets-with-2-hidden-layers). The bias is not adjusted, as biases are treated as constants.
# File 'lib/newral/networks/sigmoid.rb', line 25

def delta_rule_hidden( output: nil, expected: nil, weights_at_output_nodes: nil )
  error_delta = []
  @inputs.each do |input|
    d_e_total_d_out = 0
    output.each_with_index do |result, idx|
      d_e_total_d_out_idx = -( expected[idx] - result )
      d_e_out_d_net = ( 1 - result ) * result
      d_e_total_d_out_idx = d_e_total_d_out_idx * d_e_out_d_net * weights_at_output_nodes[idx]
      d_e_total_d_out = d_e_total_d_out + d_e_total_d_out_idx
    end
    d_out_d_net = ( 1 - @last_output ) * @last_output
    error_delta << d_e_total_d_out * d_out_d_net * calculate_input( input )
  end
  error_delta
end
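For a hidden neuron h, each output neuron's error is propagated back through its weight and summed, then multiplied by the hidden neuron's own activation derivative and the incoming input. A plain-Ruby sketch of the same chain rule (not the library method; it takes the hidden neuron's state as explicit arguments and assumes the inputs are raw numbers rather than going through `calculate_input`):

```ruby
# Hidden-layer delta rule (single hidden layer):
#   dE/dw = [ sum_k -(expected_k - out_k) * out_k * (1 - out_k) * w_hk ]
#           * out_h * (1 - out_h) * input
def delta_rule_hidden(outputs:, expected:, weights_at_output_nodes:, last_output:, inputs:)
  inputs.map do |input|
    d_e_total_d_out = outputs.each_with_index.sum do |result, idx|
      -(expected[idx] - result) * result * (1 - result) * weights_at_output_nodes[idx]
    end
    d_e_total_d_out * (1 - last_output) * last_output * input
  end
end

# Hidden neuron h1 in Mazur's example (output weights w5 = 0.40, w7 = 0.50):
deltas = delta_rule_hidden(
  outputs: [0.75136507, 0.772928465],
  expected: [0.01, 0.99],
  weights_at_output_nodes: [0.40, 0.50],
  last_output: 0.593269992,
  inputs: [0.05, 0.10]
)
# deltas[0] ≈ 0.000438568, so w1 = 0.15 - 0.5 * 0.000438568 ≈ 0.14978
```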
#output ⇒ Object
# File 'lib/newral/networks/sigmoid.rb', line 5

def output
  value = calculate_value
  @last_output = 1/(1+Math.exp(value*-1))
end
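The method applies the logistic function σ(x) = 1 / (1 + e⁻ˣ) to the neuron's weighted input (`calculate_value`) and caches the result in `@last_output` for later use by the delta rules. A standalone sketch of the same activation, checked against the first forward-pass value in Mazur's example:

```ruby
# Logistic (sigmoid) activation as used by #output; value is the
# weighted sum of the inputs plus the bias.
def sigmoid(value)
  1.0 / (1 + Math.exp(-value))
end

# Mazur's h1: weights [0.15, 0.20], inputs [0.05, 0.10], bias 0.35
weighted_sum = 0.15 * 0.05 + 0.20 * 0.10 + 0.35  # => 0.3775
puts sigmoid(weighted_sum)  # ≈ 0.59327
```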