Class: FastForward::NN
- Inherits: Object
- Defined in: lib/fast_forward/nn.rb
Instance Attribute Summary

- #biases ⇒ Object (readonly)
  Returns the value of attribute biases.
- #input_dim ⇒ Object (readonly)
  Returns the value of attribute input_dim.
- #layer_activations ⇒ Object (readonly)
  Returns the value of attribute layer_activations.
- #layer_sizes ⇒ Object (readonly)
  Returns the value of attribute layer_sizes.
- #output_dim ⇒ Object (readonly)
  Returns the value of attribute output_dim.
- #weights ⇒ Object (readonly)
  Returns the value of attribute weights.
Class Method Summary

- .activate(x, activation_fct) ⇒ Object
  Applies the given activation function.
- .relu(x) ⇒ Object
  ReLU activation function.
- .sigmoid(x) ⇒ Object
  Sigmoid / logistic activation function.
- .softmax(x, subtract_max = true) ⇒ Object
  Softmax activation function.
- .tanh(x) ⇒ Object
  Tanh activation function.
Instance Method Summary

- #check_model_integrity(sample_data, tol: FastForward::DEFAULT_TOL, exception_if_fail: false, verbose: false) ⇒ true, false
  Returns the check verdict.
- #forward_pass(inputs, array_output = true) ⇒ Object
  Neural network forward pass.
- #initialize(layer_sizes, layer_activations, weights, biases) ⇒ NN (constructor)
  A new instance of NN.
- #predict(inputs, array_output = true) ⇒ Object
  Predict outputs with a forward pass.
- #predict_class(inputs) ⇒ Object
  Predict class labels with a forward pass (supports multi-class and multi-task).
- #predict_proba(inputs, array_output = true) ⇒ Object
  Predict class probabilities with a forward pass.
Constructor Details
#initialize(layer_sizes, layer_activations, weights, biases) ⇒ NN
Returns a new instance of NN
# File 'lib/fast_forward/nn.rb', line 7

def initialize(layer_sizes, layer_activations, weights, biases)
  # layer_sizes must include input_dim and output_dim
  @input_dim = layer_sizes.first
  @output_dim = layer_sizes.last
  @layer_sizes = layer_sizes
  @layer_activations = layer_activations
  @n_layers = @layer_sizes.count - 1

  @weights = []
  @biases = []
  @n_layers.times.map do |idx|
    shape = [@layer_sizes[idx], @layer_sizes[idx + 1]]
    @weights << NMatrix.new(shape, weights[idx].flatten, dtype: :float64)
    @biases << NMatrix.new([1, shape[1]], biases[idx], dtype: :float64)
  end
end
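As a usage sketch, a small network could be assembled like this (the layer sizes, weights, and biases are made-up values; the NMatrix gem must be available, and the require path is assumed from the file layout):

require "fast_forward"

# Hypothetical 2 -> 3 -> 1 network: one ReLU hidden layer, sigmoid output.
layer_sizes       = [2, 3, 1]
layer_activations = ["relu", "sigmoid"]

# weights[idx] is flattened into an NMatrix of shape
# [layer_sizes[idx], layer_sizes[idx + 1]]; biases[idx] holds one value
# per unit of layer idx + 1.
weights = [
  [[0.1, -0.2, 0.3], [0.4, 0.5, -0.6]],  # 2 x 3
  [[0.7], [-0.8], [0.9]]                 # 3 x 1
]
biases = [
  [0.01, 0.02, 0.03],
  [0.0]
]

nn = FastForward::NN.new(layer_sizes, layer_activations, weights, biases)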
Instance Attribute Details
#biases ⇒ Object (readonly)
Returns the value of attribute biases
# File 'lib/fast_forward/nn.rb', line 5

def biases
  @biases
end
#input_dim ⇒ Object (readonly)
Returns the value of attribute input_dim
# File 'lib/fast_forward/nn.rb', line 5

def input_dim
  @input_dim
end
#layer_activations ⇒ Object (readonly)
Returns the value of attribute layer_activations
# File 'lib/fast_forward/nn.rb', line 5

def layer_activations
  @layer_activations
end
#layer_sizes ⇒ Object (readonly)
Returns the value of attribute layer_sizes
# File 'lib/fast_forward/nn.rb', line 5

def layer_sizes
  @layer_sizes
end
#output_dim ⇒ Object (readonly)
Returns the value of attribute output_dim
# File 'lib/fast_forward/nn.rb', line 5

def output_dim
  @output_dim
end
#weights ⇒ Object (readonly)
Returns the value of attribute weights
# File 'lib/fast_forward/nn.rb', line 5

def weights
  @weights
end
Class Method Details
.activate(x, activation_fct) ⇒ Object
Applies the given activation function to x. Supported names: identity/linear, relu/rectifier, sigmoid/logistic, tanh, and softmax; anything else raises ArgumentError.
# File 'lib/fast_forward/nn.rb', line 144

def self.activate(x, activation_fct)
  if activation_fct == "identity" || activation_fct == "linear"
    return x
  elsif activation_fct == "relu" || activation_fct == "rectifier"
    return NN.relu(x)
  elsif activation_fct == "sigmoid" || activation_fct == "logistic"
    return NN.sigmoid(x)
  elsif activation_fct == "tanh"
    return NN.tanh(x)
  elsif activation_fct == "softmax"
    return NN.softmax(x)
  else
    raise ArgumentError, "Activation function not supported: #{activation_fct}"
  end
end
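For example (a sketch; x must be an NMatrix, and note that the relu, sigmoid, and tanh helpers modify x in place):

x = NMatrix.new([1, 3], [-1.0, 0.0, 2.0], dtype: :float64)
FastForward::NN.activate(x, "rectifier").to_a  # alias for "relu" => [[0.0, 0.0, 2.0]]
FastForward::NN.activate(x, "gelu")            # unsupported name => ArgumentError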
.relu(x) ⇒ Object
ReLU activation function
# File 'lib/fast_forward/nn.rb', line 162

def self.relu(x)
  x.map!{ |e| [0, e].max }
  return x
end
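Because the implementation uses map!, the input matrix itself is modified; pass a copy if the original values are still needed. A small sketch:

x = NMatrix.new([1, 4], [-2.0, -0.5, 0.5, 2.0], dtype: :float64)
y = FastForward::NN.relu(x)
y.to_a  # => [[0.0, 0.0, 0.5, 2.0]]
x.to_a  # => [[0.0, 0.0, 0.5, 2.0]], x was mutated in place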
.sigmoid(x) ⇒ Object
Sigmoid / logistic activation function
# File 'lib/fast_forward/nn.rb', line 168

def self.sigmoid(x)
  x.map!{ |e| 1.0 / (1.0 + Math::exp(-e)) }
  return x
end
.softmax(x, subtract_max = true) ⇒ Object
Softmax activation function
# File 'lib/fast_forward/nn.rb', line 180

def self.softmax(x, subtract_max = true)
  if subtract_max # better for numerical stability
    # matrix with same shape as x, with max entry per row
    row_max = NN.extend_vec(x.max(1), x.cols)
    x = x - row_max
  end
  e_x = x.exp
  row_sum = NN.extend_vec(e_x.sum(1), e_x.cols)
  return e_x / row_sum
end
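Subtracting the per-row maximum leaves the result mathematically unchanged (softmax is invariant to adding a constant to every entry of a row) but keeps the exponentials from overflowing on large inputs. A sketch:

x = NMatrix.new([1, 3], [1000.0, 1001.0, 1002.0], dtype: :float64)
FastForward::NN.softmax(x).to_a
# => approximately [[0.090, 0.245, 0.665]]; each row sums to 1.
# With subtract_max = false, exp(1000.0) would overflow to Infinity and
# the result would degenerate to NaNs.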
.tanh(x) ⇒ Object
Tanh activation function
# File 'lib/fast_forward/nn.rb', line 174

def self.tanh(x)
  x.map!{ |e| Math::tanh(e) }
  return x
end
Instance Method Details
#check_model_integrity(sample_data, tol: FastForward::DEFAULT_TOL, exception_if_fail: false, verbose: false) ⇒ true, false
sample_data is expected to have the following structure: sample_data = { "samples": { "X": sample_inputs_array, "y": sample_outputs_array } }.
Returns the check verdict.
# File 'lib/fast_forward/nn.rb', line 39

def check_model_integrity(sample_data, tol: FastForward::DEFAULT_TOL, exception_if_fail: false, verbose: false)
  return FastForward.check_model_integrity(self, sample_data, tol: tol, exception_if_fail: exception_if_fail, verbose: verbose)
end
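A sketch of the expected layout (the sample values are invented; presumably FastForward.check_model_integrity re-runs the model on X and compares the results against y within tol):

sample_data = {
  "samples": {
    "X": [[0.1, 0.9], [0.8, 0.2]],  # invented sample inputs
    "y": [1, 0]                     # invented expected outputs
  }
}
nn.check_model_integrity(sample_data, verbose: true)  # => true or false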
#forward_pass(inputs, array_output = true) ⇒ Object
Neural network forward pass
# File 'lib/fast_forward/nn.rb', line 51

def forward_pass(inputs, array_output = true)
  # fix input shape if only one element is provided
  inputs = [inputs] if inputs.first.is_a?(Numeric)

  ones = NVector.ones(inputs.count)
  x = NMatrix.new([inputs.count, @input_dim], inputs.flatten, dtype: :float64)

  @n_layers.times.map do |idx|
    h = x.dot(@weights[idx]) + ones.dot(@biases[idx])
    x = NN.activate(h, @layer_activations[idx])
  end

  if !array_output
    return x
  # elsif x.shape.first == 1
  #   return x.first
  elsif x.shape == [1, 1]
    return x.first
  elsif x.shape.last == 1
    return x.to_a.map(&:first)
  else
    return x.to_a
  end
end
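The shape handling at the end means the return type depends on the batch size and output dimension. A sketch with the hypothetical 2 -> 3 -> 1 network from the constructor example:

nn.forward_pass([0.5, -0.5])                # one sample, one output => a single Float
nn.forward_pass([[0.5, -0.5], [1.0, 2.0]])  # batch, one output => [Float, Float]
nn.forward_pass([0.5, -0.5], false)         # raw NMatrix of shape [1, 1]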
#predict(inputs, array_output = true) ⇒ Object
Predict outputs with a forward pass.
# File 'lib/fast_forward/nn.rb', line 133

def predict(inputs, array_output = true)
  last_act = FastForward.rename_activation(@layer_activations.last)
  if last_act == "softmax" || last_act == "sigmoid"
    return predict_class(inputs)
  else
    return forward_pass(inputs, array_output)
  end
end
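In other words, predict returns class labels when the output layer looks like a classifier (softmax or sigmoid) and raw forward-pass outputs otherwise. With the hypothetical sigmoid-output network from the constructor example:

nn.predict([0.5, -0.5])        # sigmoid output => delegates to predict_class, 0 or 1
nn.predict_proba([0.5, -0.5])  # the underlying probability, a single Float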
#predict_class(inputs) ⇒ Object
Predict class labels with a forward pass (supports multi-class and multi-task).
# File 'lib/fast_forward/nn.rb', line 94

def predict_class(inputs)
  n_inputs = inputs.first.is_a?(Numeric) ? 1 : inputs.count
  probas = forward_pass(inputs, array_output = true)
  probas = [probas] if n_inputs == 1

  last_act = FastForward.rename_activation(@layer_activations.last)
  if last_act == "softmax"
    # multiclass
    classes = probas.map do |class_p|
      _, idx = class_p.each_with_index.max
      idx
    end
  elsif probas.first.is_a?(Numeric)
    # 2-class
    classes = probas.map(&:round)
  else
    # multi-task
    classes = probas.map do |task_p|
      task_p.map(&:round)
    end
  end

  classes = classes.first if n_inputs == 1
  return classes
end
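A sketch of the three branches (the probabilities shown are illustrative, not computed):

# Softmax output layer (multiclass): the argmax index per sample,
#   e.g. probas [[0.1, 0.7, 0.2]] => 1
# Single sigmoid output (2-class): the probability is rounded,
#   e.g. probas [0.83] => 1
nn.predict_class([0.5, -0.5])  # => 0 or 1 with the hypothetical network above
# Several sigmoid outputs (multi-task): each task is rounded independently,
#   e.g. probas [[0.83, 0.12]] => [1, 0]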
#predict_proba(inputs, array_output = true) ⇒ Object
Predict class probabilities with a forward pass. Same as #forward_pass.
# File 'lib/fast_forward/nn.rb', line 85

def predict_proba(inputs, array_output = true)
  return forward_pass(inputs, array_output)
end