Multi-Layer Perceptron Neural Network
This is a Multi-Layer Perceptron neural network that uses early stopping to prevent overfitting.
It also saves its state, so you can train the network once and reuse it whenever you want.
Install
gem sources -a http://gemcutter.org
sudo gem install db_mlp
How To Use
require 'rubygems'
require 'db_mlp'
a = DBMLP.new(path_to_db, :hidden_layers => [2], :output_nodes => 1, :inputs => 2)
training = [[[0,0], [0]], [[0,1], [1]], [[1,0], [1]], [[1,1], [0]]]
testing = [[[0,0], [0]], [[0,1], [1]], [[1,0], [1]], [[1,1], [0]]]
validation = [[[0,0], [0]], [[0,1], [1]], [[1,0], [1]], [[1,1], [0]]]
a.train(training, testing, validation, number_of_training_iterations)
puts "Test data"
puts "[0,0] = > #{a.feed_forward([0,0]).inspect}"
puts "[0,1] = > #{a.feed_forward([0,1]).inspect}"
puts "[1,0] = > #{a.feed_forward([1,0]).inspect}"
puts "[1,1] = > #{a.feed_forward([1,1]).inspect}"
After training has finished, the network is saved to the file path you specified. When you want to reuse the network, just call:
a = DBMLP.load(path_to_db)
a.feed_forward([0,1])
You can also tell the network how often (in training iterations) it should perform validation:
DBMLP.new(path_to_db, :hidden_layers => [2],
:output_nodes => 1,
:inputs => 2,
:validate_every => 100)
Test Reports
If you want it to, the MLP can produce a test report. The basic idea is that at the end of training the MLP feeds forward, one more time, all the entries you passed in as the validation set. The file contains, for each entry, the index, the input data, the target, the result and the error. Here's an example:
ID: 0 Attributes: [0, 0] Target: 0 Results: 0.387170168937349 Error: 0.0749503698574878
ID: 1 Attributes: [0, 1] Target: 1 Results: 0.365112645315455 Error: 0.20154097656917
ID: 2 Attributes: [1, 0] Target: 1 Results: 0.40477576498281 Error: 0.1771459449759
ID: 3 Attributes: [1, 1] Target: 0 Results: 0.382819699838249 Error: 0.0732754612921235
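If you want to process the report programmatically, lines in the format above can be pulled apart with a regular expression. This is only a sketch; the report's file name and exact spacing are assumptions.

# Sketch: parse report lines such as
# "ID: 0 Attributes: [0, 0] Target: 0 Results: 0.387 Error: 0.074"
pattern = /ID: (\d+) Attributes: \[(.+?)\] Target: (\S+) Results: (\S+) Error: (\S+)/
File.readlines('test_report.txt').each do |line|  # file name is a placeholder
  next unless line =~ pattern
  id         = $1.to_i
  attributes = $2.split(',').map(&:to_f)
  target, result, error = $3.to_f, $4.to_f, $5.to_f
  puts "entry #{id}: target #{target}, result #{result}, error #{error}"
end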
Benchmarks
The above example produces these times (3000 iterations)
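If you would like to reproduce the timing on your own machine, here is a minimal sketch using Ruby's standard Benchmark module. The database path is a placeholder, and the training, testing and validation sets all reuse the XOR examples from above.

require 'benchmark'
require 'rubygems'
require 'db_mlp'

xor = [[[0,0], [0]], [[0,1], [1]], [[1,0], [1]], [[1,1], [0]]]
mlp = DBMLP.new('xor_network.db', :hidden_layers => [2], :output_nodes => 1, :inputs => 2)

# Time 3000 training iterations, matching the figure quoted above.
puts Benchmark.measure { mlp.train(xor, xor, xor, 3000) }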
Copyright
Copyright © 2009 Red Davis. See LICENSE for details.