
Hi, I'm Mike!




💼 My Recent Projects Include:
Machine Learning Projects:
1. Regression with Multilayer Perceptron (MLP) Modeling of the Saddle and Ackley Functions
  • Google Colab Notebook: Jupyter Notebook
  • GitHub Repository: Repository
  • Paper: Regression with Multilayer Perceptron (MLP) Modeling of the Saddle and Ackley Functions
  • MLP Machine Learning Algorithm:
    1. Generate a data set with the simple Saddle Point or the Ackley Function
      • Saddle Point: z = x² − y²
      • Ackley: f(x, y) = −20·exp(−0.2·√(0.5·(x² + y²))) − exp(0.5·(cos(2πx) + cos(2πy))) + e + 20
    2. Add uniform random noise and visualize the 3D meshgrid
    3. Reshape the generated data into an input tensor (shape: sample rows × feature columns)
    4. Regression MLP Model Parameters:
    5. Create a predicted Saddle point and Ackley Function from the Regression MLP trained Neural Network
    6. Plot the Results
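Steps 1-3 above can be sketched in NumPy. The domain, grid size, and noise amplitude below are illustrative assumptions, not values taken from the paper:

```python
import numpy as np

def saddle(x, y):
    """Simple saddle point surface: z = x^2 - y^2."""
    return x**2 - y**2

def ackley(x, y):
    """Standard 2-D Ackley function (global minimum f(0, 0) = 0)."""
    return (-20.0 * np.exp(-0.2 * np.sqrt(0.5 * (x**2 + y**2)))
            - np.exp(0.5 * (np.cos(2 * np.pi * x) + np.cos(2 * np.pi * y)))
            + np.e + 20.0)

# 1. Generate a meshgrid over an (assumed) domain
x = np.linspace(-2, 2, 50)
y = np.linspace(-2, 2, 50)
X, Y = np.meshgrid(x, y)
Z = saddle(X, Y)  # or ackley(X, Y)

# 2. Add uniform random noise
rng = np.random.default_rng(0)
Z_noisy = Z + rng.uniform(-0.1, 0.1, size=Z.shape)

# 3. Reshape into sample-rows-by-feature-columns tensors for the MLP
features = np.column_stack([X.ravel(), Y.ravel()])  # shape (2500, 2)
targets = Z_noisy.ravel().reshape(-1, 1)            # shape (2500, 1)
```

The `features`/`targets` pair is what would be fed to the regression MLP in step 4.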


    Multilayer Perceptron (MLP) Model


    Results of Saddle Function Predictions


      Results are shown in the image above, left-to-right, top-to-bottom:
    1. Real vs. Predicted Saddle
    2. z-x cross section @ y = 2
    3. z-x cross section @ y = 0
    4. Model Loss Vs. Epochs
    5. Topological Heat Map


    Results of Ackley Function Predictions


      Results are shown in the image above, left-to-right, top-to-bottom:
    1. Real vs. Predicted Ackley
    2. z-x cross section @ y = 2
    3. z-x cross section @ y = 0
    4. Model Loss Vs. Epochs
    5. Topological Heat Map

    2. Real Estate Valuation of Housing Prices in Taipei, Taiwan


    Multilayer Perceptron (MLP) Model

    We examine real estate valuation, which helps reveal where people tend to live in a city: the higher the price, the greater the demand to live in the property. Predicting real estate valuation can inform urban design and policy by identifying which factors have the most impact on property prices. Our aim is to predict real estate value from several features.



    • Regression MLP Machine Learning Algorithm for the Taipei, Taiwan data set:
      1. Load the Real estate valuation data set
      2. Independent feature vector containing:
        1. X2 house age
        2. X3 distance to the nearest MRT station
        3. X4 number of convenience stores
        4. X5 latitude
        5. X6 longitude
      3. Train/Test split the data at a ratio of 80:20, respectively
      4. Min/Max Scale the dataset with a range of 0 to 1
      5. Normalise the scaled features
      6. Regression MLP Model Parameters:
      7. Predict the house price of unit area from the trained Regression MLP Neural Network
      8. Plot the Results
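Steps 3-4 of the pipeline above can be sketched in NumPy. The random matrix below is a synthetic stand-in for the real data set (the UCI Real Estate Valuation data has 414 samples with features X2-X6); the 80:20 split and [0, 1] min/max scaling follow the steps listed:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for the Real Estate Valuation data set:
# 5 feature columns (house age, distance to MRT, stores, latitude, longitude)
X = rng.uniform(size=(414, 5))
y = rng.uniform(10, 80, size=(414, 1))  # price per unit area (NT$)

# 3. Train/Test split the data at an 80:20 ratio
idx = rng.permutation(len(X))
split = int(0.8 * len(X))
train_idx, test_idx = idx[:split], idx[split:]
X_train, X_test = X[train_idx], X[test_idx]
y_train, y_test = y[train_idx], y[test_idx]

# 4. Min/Max scale to the range [0, 1], using training statistics only
#    so that no information leaks from the test split
x_min, x_max = X_train.min(axis=0), X_train.max(axis=0)
X_train_scaled = (X_train - x_min) / (x_max - x_min)
X_test_scaled = (X_test - x_min) / (x_max - x_min)
```

Fitting the scaler on the training split only (rather than the full data set) is the conventional choice; the original notebook may differ.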


    Regression MLP Model Loss


    Regression MLP Predictions in New Taiwan Dollars (NT$)

    3. Classification of MNIST 70,000 Handwritten Digits 0-9 Image Data Set

    • Google Colab Notebook: Jupyter Notebook
    • GitHub Repository: Repository
    • Paper: Classification of MNIST 70,000 Handwritten Digits 0-9 Image Data Set
    • Categorical Cross Entropy Algorithm:
      1. Load the Modified National Institute of Standards and Technology (MNIST) Handwritten digits 0-9 data set
      2. Train/Test split the data at a ratio of 6:1, respectively
      3. Reshape the images from 28x28 pixels to 784x1 pixels
      4. Normalise the image pixels into [0, 1] by dividing by the maximum gray-scale intensity level (255)
      5. Create 10 Categories for the 10 digits 0-9 to be classified



      6. Create 10 classes, one for each of the handwritten digits 0-9



      7. Categorical Cross Entropy (CE) Model Parameters:
        • Categorical Cross Entropy (CE) Loss Function: CE = −Σᵢ tᵢ·log(sᵢ), summed over the C classes

        • Where: tᵢ is the i-th element of the target vector, sᵢ is the i-th element of the model's output vector, and C is the number of classes.


          Visualization of Log Loss (Cross Entropy)


          Cross Entropy between probability distributions for each Class

        • Model Accuracy: Accuracy = (1/M)·Σₖ 1[argmax(sₖ) = argmax(tₖ)]

        • Where: M is the number of samples in the dataset, tₖ is the target vector for the k-th sample, and sₖ is the model's output vector for the k-th sample.

        • Neural Network Architecture:
          • Input Layer = 16 hyperbolic tangent activation (tanh) neurons with an input shape of 784x1
          • Hidden Layer = 16 hyperbolic tangent activation (tanh) neurons with an input shape of 16x1
          • Output Layer = 10 softmax neurons

          • Classification Neural Network Architecture

        • Stochastic Gradient Descent optimizer
          • Learning Rate = 0.4
          • Exponential Decay Factor = 0
          • Momentum = 0.5
        • Train Duration: 10 Epochs
        • Batch Size = 128
        • training samples = 60,000
        • testing samples = 10,000
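The 784 → 16 tanh → 16 tanh → 10 softmax architecture and the two metrics above can be illustrated with a minimal NumPy forward pass. The weights and the batch here are random placeholders (an untrained network), so the numbers only show how the CE loss and accuracy formulas are computed, not real results:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    """Row-wise softmax, shifted by the row max for numerical stability."""
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def forward(x, params):
    """784 -> 16 tanh -> 16 tanh -> 10 softmax forward pass."""
    W1, b1, W2, b2, W3, b3 = params
    h1 = np.tanh(x @ W1 + b1)
    h2 = np.tanh(h1 @ W2 + b2)
    return softmax(h2 @ W3 + b3)

# Random (untrained) weights, just to make the layer shapes concrete
params = (rng.normal(0, 0.1, (784, 16)), np.zeros(16),
          rng.normal(0, 0.1, (16, 16)), np.zeros(16),
          rng.normal(0, 0.1, (16, 10)), np.zeros(10))

x = rng.uniform(size=(128, 784))          # one batch of flattened "images"
t = np.eye(10)[rng.integers(0, 10, 128)]  # one-hot target vectors

s = forward(x, params)

# Categorical cross entropy, averaged over the batch:
# CE = -(1/M) * sum_k sum_i t_ki * log(s_ki)
ce = -np.mean(np.sum(t * np.log(s + 1e-12), axis=1))

# Accuracy: fraction of samples where argmax(s_k) == argmax(t_k)
accuracy = np.mean(s.argmax(axis=1) == t.argmax(axis=1))
```

With random weights the accuracy hovers around chance (≈0.1) and CE near log(10) ≈ 2.3; training with SGD is what drives these toward the reported 0.1532 loss and 95.49% accuracy.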


      8. Show Results:


    Visualization of Model Loss and Accuracy (0.1532 and 95.49% Respectively)


    Visualization of First Layer Weights W1 from Neural Network Architecture


    Visualization of Second Layer Weights W2 from Neural Network Architecture


    Visualization of Third Layer Weights W3 from Neural Network Architecture

    4. Classification of Fashion MNIST Image Data Set

    • Google Colab Notebook: Jupyter Notebook
    • GitHub Repository: Repository
    • Paper: Classification of Fashion MNIST Image Data Set
    • Categorical Cross Entropy Algorithm:
      1. Load the Fashion-MNIST data set (Zalando's drop-in replacement for the original MNIST digits)
      2. Train/Test split the data at a ratio of 6:1, respectively
      3. Reshape the images from 28x28 pixels to 784x1 pixels
      4. Normalise the image pixels into [0, 1] by dividing by the maximum gray-scale intensity level (255)
      5. Create 10 Categories for class_names = ['T-shirt/top', 'Trouser', 'Pullover', 'Dress', 'Coat', 'Sandal', 'Shirt', 'Sneaker', 'Bag', 'Ankle boot']



      6. Create 10 classes for the 10 types of clothing in the image data set



      7. Categorical Cross Entropy (CE) Model Parameters:
        • Categorical Cross Entropy (CE) Loss Function: CE = −Σᵢ tᵢ·log(sᵢ), summed over the C classes

        • Where: tᵢ is the i-th element of the target vector, sᵢ is the i-th element of the model's output vector, and C is the number of classes.


          Visualization of Log Loss (Cross Entropy)


          Cross Entropy between probability distributions for each Class

        • Model Accuracy: Accuracy = (1/M)·Σₖ 1[argmax(sₖ) = argmax(tₖ)]

        • Where: M is the number of samples in the dataset, tₖ is the target vector for the k-th sample, and sₖ is the model's output vector for the k-th sample.

        • Neural Network Architecture:
          • Input Layer = 64 ReLU activation neurons with an input shape of 784x1
          • Hidden Layer = 64 ReLU activation neurons with an input shape of 64x1
          • Output Layer = 10 softmax neurons

          • Classification Neural Network Architecture

        • Stochastic Gradient Descent optimizer
          • Learning Rate = 0.1
          • Exponential Decay Factor = 0
          • Momentum = 0
        • Train Duration: 10 Epochs
        • Batch Size = 128
        • training samples = 60,000
        • testing samples = 10,000
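Preprocessing steps 3-6 above can be sketched in NumPy. The random batch below stands in for the real images, which would normally come from `tf.keras.datasets.fashion_mnist.load_data()`:

```python
import numpy as np

class_names = ['T-shirt/top', 'Trouser', 'Pullover', 'Dress', 'Coat',
               'Sandal', 'Shirt', 'Sneaker', 'Bag', 'Ankle boot']

rng = np.random.default_rng(0)

# Stand-in batch of 28x28 gray-scale images with integer labels 0-9
images = rng.integers(0, 256, size=(128, 28, 28))
labels = rng.integers(0, 10, size=128)

# 3. Reshape each 28x28 image into a flat 784-element vector
flat = images.reshape(len(images), 784)

# 4. Normalise pixel intensities into [0, 1] by dividing by 255
normalised = flat / 255.0

# 5-6. One-hot encode the integer labels into the 10 clothing classes
one_hot = np.eye(10)[labels]
```

The one-hot targets are what the 10-neuron softmax output layer is compared against by the categorical cross entropy loss.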


      8. Show Results:


    Visualization of Model Loss and Accuracy (0.3090 and 88.66% Respectively)


    Visualization of First Layer Weights W1 from Neural Network Architecture


    Visualization of Second Layer Weights W2 from Neural Network Architecture


    Visualization of Third Layer Weights W3 from Neural Network Architecture
