ASU-CSE569

A. Bayesian Decision Theory & The Curse of Dimensionality

In the first part of this project, we implement a minimum-error-rate classifier for a 2-class problem. For ease of implementation and learning, we use a modified subset of the MNIST data containing only images of the digits “0” and “1”. The dataset consists of 5923 training images of digit “0” and 6742 training images of digit “1”; for testing, we have 980 and 1135 images of digits “0” and “1” respectively. Each image is stored as a 28 x 28 matrix, i.e., a 784-dimensional feature vector.

Our goal is to perform minimum-error classification using Bayesian Decision Theory. However, applying it directly in the 784-dimensional space is impractical, so we first perform dimensionality reduction. Minimum-error classification also requires knowing the parameters of the underlying distribution for each class, so after visualizing the reduced data we use maximum likelihood estimation to estimate those parameters. Finally, we apply Bayesian Decision Theory for optimal classification.
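
A minimal sketch of this pipeline, assuming the 28 x 28 images are flattened to 784-dimensional vectors, reduced with PCA, and modeled with Gaussian class-conditional densities; the use of PCA, the Gaussian assumption, and all names below are illustrative choices, not the project's prescribed solution.

```python
import numpy as np

def pca_reduce(X, k=2):
    """Project samples onto the top-k principal components of the training data."""
    mean = X.mean(axis=0)
    Xc = X - mean
    # right singular vectors of the centered data are the principal directions
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, mean, Vt[:k]

def fit_gaussian_mle(X):
    """Maximum-likelihood estimates of a Gaussian's mean and covariance.

    X: (n_samples, n_features) array of reduced-dimension training vectors.
    """
    mu = X.mean(axis=0)
    # the MLE covariance divides by n, not n - 1
    sigma = (X - mu).T @ (X - mu) / X.shape[0]
    return mu, sigma

def log_posterior(x, mu, sigma, prior):
    """log prior + log class-conditional Gaussian density (up to a shared constant)."""
    d = x - mu
    _, logdet = np.linalg.slogdet(sigma)
    return np.log(prior) - 0.5 * (d @ np.linalg.inv(sigma) @ d) - 0.5 * logdet

# --- illustrative usage (X0, X1: reduced-dimension training sets for digits 0 and 1) ---
# mu0, sigma0 = fit_gaussian_mle(X0)
# mu1, sigma1 = fit_gaussian_mle(X1)
# p0 = len(X0) / (len(X0) + len(X1))   # priors taken from the training counts
# p1 = 1.0 - p0
# predict = lambda x: 0 if log_posterior(x, mu0, sigma0, p0) >= log_posterior(x, mu1, sigma1, p1) else 1
```

With the estimated parameters in hand, the minimum-error decision rule simply assigns each reduced test vector to the class with the larger log-posterior, which is exactly the Bayesian minimum-error criterion for two classes.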

B. Shallow & Deep Learning in Neural Networks

The field of artificial neural networks has enjoyed great popularity for a long time now. Neural networks such as multi-layer perceptrons (MLPs) and deep neural networks such as convolutional neural networks (CNNs) are called universal learners because they can learn arbitrary non-linear functions underlying the data. In this project we explore the workings of a multi-layer perceptron by implementing one from scratch, using a custom dataset to train and test it. In the later sections, we once again explore the MNIST dataset, this time using CNNs. This part of the project again comprises two components.

The first is the implementation, training, and testing of a “shallow” 3-layer MLP. We perform a 2-class classification task on a dataset of 2000 training samples per class: the first 1500 samples of each class are used for training and the remaining 500 are set aside for validation. We train the network until the error on the validation set stops decreasing or 1000 epochs are reached. Once the loss is minimized, we test the neural network on a test set containing 1000 samples of each class.

The second component is image classification using deep learning with convolutional neural networks. For this task, we use the full MNIST dataset of 70,000 handwritten-digit images, split into 60,000 training images and 10,000 testing images.
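
A minimal NumPy sketch of a “shallow” 3-layer MLP of the kind described above, assuming one hidden layer with sigmoid activations, a mean-squared-error loss, batch gradient descent, and a patience-based early-stopping check; the layer sizes, learning rate, and patience value are illustrative assumptions rather than the project's actual settings.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class MLP3:
    """Input -> hidden -> output network trained with batch gradient descent."""

    def __init__(self, n_in, n_hidden, n_out, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.1, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)      # hidden-layer activations
        self.y = sigmoid(self.h @ self.W2 + self.b2)  # output activations
        return self.y

    def backward(self, X, t):
        # backpropagation of the mean-squared-error gradient
        n = X.shape[0]
        d_out = (self.y - t) * self.y * (1 - self.y)
        d_hid = (d_out @ self.W2.T) * self.h * (1 - self.h)
        self.W2 -= self.lr * self.h.T @ d_out / n
        self.b2 -= self.lr * d_out.mean(axis=0)
        self.W1 -= self.lr * X.T @ d_hid / n
        self.b1 -= self.lr * d_hid.mean(axis=0)

def train(net, X_tr, t_tr, X_val, t_val, max_epochs=1000, patience=20):
    """Stop when validation error has not improved for `patience` epochs or max_epochs is hit."""
    best, stale = np.inf, 0
    for epoch in range(max_epochs):
        net.forward(X_tr)
        net.backward(X_tr, t_tr)
        val_err = np.mean((net.forward(X_val) - t_val) ** 2)
        if val_err < best:
            best, stale = val_err, 0
        else:
            stale += 1
            if stale >= patience:
                break
    return net
```

The early-stopping loop mirrors the training rule described above: training halts once the validation error stops improving or the 1000-epoch budget is exhausted. The CNN component for MNIST is typically built with a deep-learning framework rather than from scratch, so it is not sketched here.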
