class SimpleNeuralNetwork::Network
Attributes

layers
  An array of the network's layers, ordered from the input layer to the output layer.
Public Class Methods
Deserializes a JSON string (as produced by serialize) back into a Network object. Note that the normalization functions are not serialized and will need to be reset after deserialization; serializing them is a possible future improvement.
# File lib/network.rb, line 120
def self.deserialize(string)
  hash = JSON.parse(string)
  network = Network.new

  hash["layers"].each do |layer|
    neurons_array = layer["neurons"]
    layer = Layer.new(neurons_array.length, network)
    network.layers << layer

    layer.neurons.each_with_index do |neuron, index|
      neuron_hash = neurons_array[index]
      neuron.bias = neuron_hash["bias"].to_f
      neuron.edges = neuron_hash["edges"].map(&:to_f)
    end
  end

  network.layers.each_with_index do |layer, index|
    layer.prev_layer = network.layers[index - 1] unless index == 0
    layer.next_layer = network.layers[index + 1]
  end

  network
end
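The JSON schema that deserialize walks can be sketched without the library itself: a "layers" array, where each layer holds a "neurons" array of hashes with "bias" and "edges" keys. The literal values below are made up for illustration; note the same `.to_f` coercion deserialize applies.

```ruby
require 'json'

# Hypothetical serialized network: 1 input neuron, 2 output neurons.
json = <<~JSON
  {"layers": [
    {"neurons": [{"bias": "0.0", "edges": ["1.5", "-2.0"]}]},
    {"neurons": [{"bias": "0.25", "edges": []},
                 {"bias": "-0.5", "edges": []}]}
  ]}
JSON

hash = JSON.parse(json)

# Coerce every bias to a Float, as deserialize does for biases and edges.
biases = hash["layers"].flat_map do |layer|
  layer["neurons"].map { |n| n["bias"].to_f }
end

puts biases.inspect  # => [0.0, 0.25, -0.5]
```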
# File lib/network.rb, line 32
def initialize
  @layers = []
  @inputs = []

  @hidden_layer_normalization_function = method(:default_hidden_layer_normalization_function)
  @output_normalization_function = method(:default_output_normalization_function)
  @edge_initialization_function = method(:default_edge_initialization_function)
  @neuron_bias_initialization_function = method(:default_neuron_bias_initialization_function)
end
Public Instance Methods
# File lib/network.rb, line 93
def clear_edge_caches
  @layers.each do |layer|
    layer.clear_edge_cache
  end
end
# File lib/network.rb, line 72
def create_layer(neurons:)
  if @layers.empty?
    @layers << Layer.new(neurons, self)
  else
    new_layer = Layer.new(neurons, self)
    prev_layer = @layers.last
    @layers << new_layer

    new_layer.prev_layer = prev_layer
    prev_layer.next_layer = new_layer

    prev_layer.initialize_neuron_edges
  end
end
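The layer wiring above amounts to a doubly-linked list: each new layer points back at the previous one, and the previous one points forward at it. A minimal sketch with a plain Struct (not the library's Layer class):

```ruby
# Illustrative stand-in for the library's Layer: just a size plus the
# two links create_layer maintains.
SketchLayer = Struct.new(:size, :prev_layer, :next_layer)

layers = []
[2, 3, 1].each do |size|
  new_layer = SketchLayer.new(size)
  unless layers.empty?
    new_layer.prev_layer = layers.last
    layers.last.next_layer = new_layer
  end
  layers << new_layer
end

puts layers[1].prev_layer.size  # => 2
puts layers[1].next_layer.size  # => 1
```

The first layer's prev_layer and the last layer's next_layer stay nil, which is what lets the recursive output computation know where to stop.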
Returns the number of input nodes
# File lib/network.rb, line 63
def input_size
  @layers[0].size
end
Returns the number of output nodes
# File lib/network.rb, line 68
def output_size
  @layers[-1].size
end
# File lib/network.rb, line 88
def reset_normalization_functions
  @output_normalization_function = method(:default_output_normalization_function)
  @hidden_layer_normalization_function = method(:default_hidden_layer_normalization_function)
end
Runs an input set against the neural network. Accepts an array of numeric inputs, each between 0 and 1. The input array length must equal the size of the first layer. Returns an array of outputs.
skip_validation: Skips validations that may be expensive for large sets
# File lib/network.rb, line 49
def run(inputs, skip_validation: false)
  unless skip_validation
    unless inputs.size == input_size && inputs.all? { |input| input >= 0 && input <= 1 }
      raise InvalidInputError.new("Invalid input passed to Network#run")
    end
  end

  @inputs = inputs

  # Get output from the last layer. It recursively depends on the layers before it.
  @layers[-1].get_output(normalize: output_normalization_function)
end
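Conceptually, run validates the inputs and then propagates them forward: each neuron takes a weighted sum of its inputs, adds its bias, and passes the result through the normalization function. A self-contained sketch under those assumptions (names and shapes here are illustrative, not the library's internals):

```ruby
# Mirrors run's validation: correct length, every value in 0..1.
def valid_inputs?(inputs, expected_size)
  inputs.size == expected_size && inputs.all? { |x| x >= 0 && x <= 1 }
end

# One forward step: weights is an array of per-neuron weight arrays,
# biases one bias per neuron, normalize any callable.
def forward(inputs, weights, biases, normalize)
  weights.zip(biases).map do |neuron_weights, bias|
    sum = neuron_weights.each_with_index.sum { |w, i| w * inputs[i] }
    normalize.call(sum + bias)
  end
end

identity = ->(x) { x }
puts valid_inputs?([1.0, 0.0], 2)                                  # => true
puts forward([1.0, 0.0], [[2.0, 3.0], [1.0, 1.0]], [0.5, 0.0], identity).inspect
# => [2.5, 1.0]
```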
Serializes the neural network into a JSON string, which can later be deserialized back into a Network object. Useful for storing partially trained neural networks. Note: currently does not serialize the bias initialization function, edge initialization function, or normalization functions.
# File lib/network.rb, line 102
def serialize
  {
    layers: layers.map do |layer|
      {
        neurons: layer.neurons.map do |neuron|
          {
            bias: neuron.bias.to_f,
            edges: neuron.edges.map(&:to_f)
          }
        end
      }
    end
  }.to_json
end
Private Instance Methods
# File lib/network.rb, line 168
def default_edge_initialization_function
  rand(-5..5)
end
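Worth noting: when Kernel#rand is given an Integer Range, it returns a random Integer from that range inclusive, so default edge weights start out as whole numbers in -5..5 (they become floats as training adjusts them):

```ruby
# rand with an Integer Range yields Integers, endpoints included.
samples = Array.new(1_000) { rand(-5..5) }

puts samples.all? { |s| s.is_a?(Integer) }       # => true
puts samples.all? { |s| s.between?(-5, 5) }      # => true
```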
# File lib/network.rb, line 172
def default_neuron_bias_initialization_function
  0
end
The default normalization function for the network output: the standard logistic sigmoid function, f(x) = 1 / (1 + e^(-x)).
# File lib/network.rb, line 154
def default_output_normalization_function(output)
  1 / (1 + (Math::E ** (-1 * output)))
end
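The sigmoid squashes any real input into the open interval (0, 1), which is why raw neuron sums of any magnitude come out as valid outputs. A quick standalone check of the formula above:

```ruby
# Standalone copy of the logistic sigmoid for illustration.
def sigmoid(x)
  1.0 / (1 + Math::E ** -x)
end

puts sigmoid(0)    # => 0.5 (the midpoint)
puts sigmoid(10)   # close to 1
puts sigmoid(-10)  # close to 0
```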