A Single Layer Perceptron for Regression: Part 1

The next item on the agenda: writing the code for a single layer perceptron.

What is a single layer perceptron?

A neural network with one layer consisting of a single neuron.
This means:
1) Input/Predictor values enter this neuron.
2) Each input has some weight connected to it.
3) Inputs are multiplied by their respective weights, and the products are summed.
4) This sum is passed through an activation function.
5) The activated value of the sum/net is the predicted/output value of the neuron.
6) The error is calculated and the weights are changed with the help of a backward pass.
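The steps above can be sketched numerically for one neuron. This is an illustrative toy example (the values and the identity activation are assumptions, not the post's actual code):

```python
import numpy as np

inputs = np.array([0.5, 1.5, 2.0])    # step 1: predictor values enter the neuron
weights = np.array([0.1, -0.2, 0.3])  # step 2: one weight per input
net = np.dot(inputs, weights)         # step 3: multiply and sum (0.05 - 0.3 + 0.6)
output = net                          # steps 4-5: identity activation, common for regression
```

Here `output` is the neuron's prediction; step 6 (error and backward pass) would follow.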



image created by Hridaya Annuncio on Kleki.com



Functions created so far:

  • initialize_weights: Assigns the initial random weights.
    Parameters: (1) row (numpy array): a single row of predictor values. This is used to find the number of weights that have to be initialized.
              
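A minimal sketch of what `initialize_weights` might look like, assuming small uniform random weights (the range and distribution are assumptions, not the post's actual choices):

```python
import numpy as np

def initialize_weights(row):
    # One random weight per predictor value in the row;
    # small values in [-0.5, 0.5) are a common starting point.
    return np.random.uniform(-0.5, 0.5, size=len(row))
```

For example, `initialize_weights(np.array([1.0, 2.0, 3.0]))` returns three random weights.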

  • activate: Based on the desired activation function, calculates the activated value of the net (the sum of all the inputs times their weights).
    Parameters: (1) activation_function (string)
                (2) input_sum (float)

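A sketch of `activate` with two common choices; the exact set of activations the post supports (and their string names) is an assumption:

```python
import math

def activate(activation_function, input_sum):
    # Identity is the usual choice for a regression output neuron.
    if activation_function == "identity":
        return input_sum
    # Sigmoid squashes the net into (0, 1).
    if activation_function == "sigmoid":
        return 1.0 / (1.0 + math.exp(-input_sum))
    raise ValueError(f"unknown activation: {activation_function}")
```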

  • forward_pass: This function handles one entire forward pass: multiplying weights by their corresponding inputs, summing these products, and activating the net.
    Parameters: (1) row (numpy array): 1 row of predictor values
                (2) weights (numpy array): current weights
                (3) activation_function (string)

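A self-contained sketch of `forward_pass` (the activation handling is inlined here for illustration; the post's version presumably delegates to its `activate` function):

```python
import numpy as np

def forward_pass(row, weights, activation_function):
    # Multiply each input by its weight and sum: the "net" value.
    net = np.dot(row, weights)
    if activation_function == "identity":
        return net
    if activation_function == "sigmoid":
        return 1.0 / (1.0 + np.exp(-net))
    raise ValueError(f"unknown activation: {activation_function}")
```

For example, `forward_pass(np.array([1.0, 2.0]), np.array([0.5, 0.25]), "identity")` computes 1.0 × 0.5 + 2.0 × 0.25.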
  • calculate_error: Calculates the error in the predicted values based on the kind of error the user desires (e.g. Sum of Squared Errors, or SSE).
    Parameters: (1) predicted_output (numpy array)
                (2) actual_output (numpy array)
                (3) type_of_error (string)

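A sketch of `calculate_error`, assuming SSE is selected by the string "SSE" (the string names and the extra MSE option are assumptions):

```python
import numpy as np

def calculate_error(predicted_output, actual_output, type_of_error):
    diff = np.asarray(predicted_output) - np.asarray(actual_output)
    if type_of_error == "SSE":   # Sum of Squared Errors
        return float(np.sum(diff ** 2))
    if type_of_error == "MSE":   # Mean Squared Error
        return float(np.mean(diff ** 2))
    raise ValueError(f"unknown error type: {type_of_error}")
```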
Functions in development:

- normalize
- backward_pass

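Since these two are still in development, here is only a hypothetical sketch of how they might work: min-max normalization for `normalize`, and a gradient-descent weight update for `backward_pass` assuming SSE with an identity activation. None of this is the post's actual implementation.

```python
import numpy as np

def normalize(column):
    # Hypothetical min-max scaling of one predictor column to [0, 1].
    col = np.asarray(column, dtype=float)
    return (col - col.min()) / (col.max() - col.min())

def backward_pass(row, weights, predicted, actual, learning_rate=0.1):
    # Hypothetical update for SSE with identity activation:
    # dSSE/dw_i = -2 * (actual - predicted) * x_i,
    # so gradient descent moves each weight in the opposite direction.
    return weights + learning_rate * 2 * (actual - predicted) * np.asarray(row)
```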

Code
