
Regressor to Predict California House Prices

See report for detailed analysis of hyperparameter tuning

Completed as part of a group coursework (Grade: 100%)

Required Packages

For both Part 1 and Part 2 some Python packages are required. To install these packages using pip, run the command `pip install -r requirements.txt`.

Part 1

Linear Layer

A linear layer is initialised with xavier_init for its weights and a bias. It has a simple linear forward pass method, a backward pass method that takes the gradient of the loss with respect to the layer output, and a parameter update method that performs one step of gradient descent.
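
A minimal sketch of such a layer is shown below, assuming Xavier initialisation and plain gradient descent; class and method names are illustrative, not the module's exact API.

```python
import numpy as np

class LinearLayer:
    """Illustrative fully connected layer (sketch, not the repo's exact API)."""

    def __init__(self, n_in, n_out):
        # Xavier initialisation: bound scales with layer fan-in and fan-out
        limit = np.sqrt(6.0 / (n_in + n_out))
        self.W = np.random.uniform(-limit, limit, (n_in, n_out))
        self.b = np.zeros(n_out)

    def forward(self, x):
        self._x = x                      # cache input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        # grad_out is dLoss/dOutput; accumulate parameter gradients
        self.grad_W = self._x.T @ grad_out
        self.grad_b = grad_out.sum(axis=0)
        return grad_out @ self.W.T       # dLoss/dInput for the previous layer

    def update_params(self, learning_rate):
        # one gradient-descent step
        self.W -= learning_rate * self.grad_W
        self.b -= learning_rate * self.grad_b
```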

Activation Function Classes

SigmoidLayer and ReluLayer both have forward and backward pass methods that return outputs according to their function definitions and derivatives respectively.
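
As a rough sketch of what these classes compute (the repo's implementations may differ in detail):

```python
import numpy as np

class SigmoidLayer:
    def forward(self, x):
        self._out = 1.0 / (1.0 + np.exp(-x))
        return self._out

    def backward(self, grad_out):
        # derivative of the sigmoid: s * (1 - s)
        return grad_out * self._out * (1.0 - self._out)

class ReluLayer:
    def forward(self, x):
        self._x = x
        return np.maximum(0.0, x)

    def backward(self, grad_out):
        # derivative of ReLU: 1 where the input was positive, else 0
        return grad_out * (self._x > 0)
```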

Multi-Layer Network

Layers are initialised according to the number of input features, the number of neurons in each layer, and the activation functions used. Forward and backward pass methods are implemented by chaining the corresponding layer methods. Parameters are updated with one gradient descent step.
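
A hypothetical construction and training step might look like the following; the argument names and the `"identity"` activation are assumptions for illustration only.

```python
# Build a network: layer widths and the activation applied after each layer
net = MultiLayerNetwork(input_dim=8,
                        neurons=[16, 8, 1],
                        activations=["relu", "relu", "identity"])

out = net.forward(x_batch)              # forward pass through every layer
grad = net.backward(loss_gradient)      # backpropagate dLoss/dOutput
net.update_params(learning_rate=0.01)   # one gradient-descent step per layer
```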

Trainer

The trainer shuffles the input data, splits it into batches, trains the network on those batches using gradient descent, and calculates the loss over the dataset.
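
The loop below is a hedged sketch of one such epoch (shuffle, split, step, report loss); it assumes an MSE loss and is not the trainer's actual code.

```python
import numpy as np

def train_one_epoch(network, x, y, batch_size, learning_rate):
    order = np.random.permutation(len(x))           # shuffle the dataset
    x, y = x[order], y[order]
    for start in range(0, len(x), batch_size):      # split into mini-batches
        xb = x[start:start + batch_size]
        yb = y[start:start + batch_size]
        preds = network.forward(xb)
        grad = 2 * (preds - yb) / len(xb)            # gradient of the MSE loss
        network.backward(grad)
        network.update_params(learning_rate)         # gradient-descent step
    return np.mean((network.forward(x) - y) ** 2)    # loss over the dataset
```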

Preprocessor

The preprocessor applies normalisation to the data and reverts it using the corresponding methods.
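
A sketch of this idea, assuming min-max normalisation and `apply`/`revert` method names (both assumptions):

```python
import numpy as np

class Preprocessor:
    def __init__(self, data):
        self._min = data.min(axis=0)
        self._max = data.max(axis=0)

    def apply(self, data):
        # scale each feature to the [0, 1] range
        return (data - self._min) / (self._max - self._min)

    def revert(self, data):
        # undo the normalisation to recover the original units
        return data * (self._max - self._min) + self._min
```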

Part 2

Regressor

The Regressor model can be constructed with the arguments:

  • nb_epoch: The number of epochs to train the model
  • learning_rate: The initial learning rate to pass to the Adam optimiser
  • hidden_layer_sizes: An array defining the shape of the hidden layers
  • batch_size: The batch size to use for mini-batch GD (default: -1 uses a single batch equal to the size of the dataset)
  • output_file: If True, the per-epoch RMSE will be written to loss.csv during training of the model

To train the model, pass training data when calling the fit method. Pass test data to the predict method to obtain the model's predicted values. The score method returns the root mean squared error between the model's predictions and the actual data.
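
A hedged usage sketch is given below; the hyperparameter values are illustrative and the exact constructor signature may differ (for example, it may also take the training data).

```python
regressor = Regressor(nb_epoch=100,
                      learning_rate=0.001,
                      hidden_layer_sizes=[16, 8],
                      batch_size=64,
                      output_file=False)

regressor.fit(x_train, y_train)           # train on the training data
predictions = regressor.predict(x_test)   # predicted house prices
rmse = regressor.score(x_test, y_test)    # root mean squared error
```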

Hyperparameter Tuning

The param_grid in the RegressorHyperParameterSearch function defines which hyperparameter values are sampled in the grid search to find the optimal parameters.
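
For illustration, such a grid might look like the following; the keys and candidate values here are assumptions, not the repo's actual search space.

```python
param_grid = {
    "nb_epoch": [50, 100],
    "learning_rate": [0.01, 0.001],
    "hidden_layer_sizes": [[8], [16, 8]],
    "batch_size": [32, 64],
}
```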

RegressorHyperParameterSearch will return the best model's parameters and will also write the scores for every combination of hyperparameters to hyperparams_scores/all_hyperparams_scores.csv. This is useful for testing and validating the results.

Running the example_main() function will perform a hyperparameter search and then train a model based on the optimal parameters found, saving a pickle of it to part2_model.pickle.
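
The saved model can later be reloaded with the standard pickle module, for example:

```python
import pickle

with open("part2_model.pickle", "rb") as f:
    model = pickle.load(f)

predictions = model.predict(x_test)  # x_test assumed preprocessed as above
```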

Contributors

robwakefield, stevenchenwaiho, marklaw

