This neural network model powers a prediction application that estimates an applicant's success rate if funded by Alphabet Soup. The prediction application compares four alternative models as a way to explore neural network techniques that could potentially increase the model accuracy and decrease the model loss.
This project leverages python version 3.8.5 with the following packages and modules:
- pandas - Used to easily manipulate DataFrames.
- Jupyter Lab - Used to create and share documents that contain live code, equations, visualizations, and narrative text.
- Scikit-learn - version 0.24.2 - Provides a wide range of tools and models that can be used to create a machine learning model.
  - StandardScaler - Standardizes features by removing the mean and scaling to unit variance.
  - OneHotEncoder - Encodes categorical features as a one-hot numeric array.
- TensorFlow - version 2.6.0 - An end-to-end open-source platform for machine learning. It has a comprehensive, flexible ecosystem of tools, libraries, and community resources that lets researchers push the state of the art in ML and developers easily build and deploy ML-powered applications.
- Keras - version 2.6.0 - An API designed for human beings, not machines. Keras follows best practices for reducing cognitive load: it offers consistent and simple APIs, minimizes the number of user actions required for common use cases, and provides clear, actionable error messages. It is also a free, open-source Python library for developing and evaluating deep learning models.
  - layers Dense - Implements the operation output = activation(dot(input, kernel) + bias), where activation is the element-wise activation function passed as the activation argument, kernel is a weights matrix created by the layer, and bias is a bias vector created by the layer (only applicable if use_bias is True).
  - models Sequential - Appropriate for a plain stack of layers where each layer has exactly one input tensor and one output tensor.
  - layers LeakyReLU - A "leaky" alternative to the ReLU function: instead of transforming negative input values to 0, it transforms them into much smaller negative values.
  - ReLU - The rectified linear unit (ReLU) function returns a value from 0 to infinity, transforming any negative input to 0. It is the most commonly used activation function in neural networks due to its faster learning and simplified output, although it is not always appropriate for simpler models.
- PyViz - A Python visualization package that provides a single platform for accessing multiple visualization libraries. One of these libraries is:
  - hvplot.pandas - version 0.7.2 - Used for interactive visualization of the crowdfunding data.
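The StandardScaler and OneHotEncoder steps above can be sketched as follows. This is a minimal illustration on a hypothetical applicant table (the column values here are made up); the project applies the same steps to the Alphabet Soup application data.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler, OneHotEncoder

# Hypothetical numeric and categorical columns for illustration only.
amounts = np.array([[5000.0], [10000.0], [15000.0]])
categories = np.array([["education"], ["healthcare"], ["education"]])

# StandardScaler: remove the mean and scale to unit variance.
scaled = StandardScaler().fit_transform(amounts)

# OneHotEncoder: encode the categorical column as a one-hot numeric array.
encoded = OneHotEncoder().fit_transform(categories).toarray()

# Combine into a single feature matrix ready for the neural network.
features = np.hstack([scaled, encoded])
print(features.shape)  # -> (3, 3): one scaled column + one column per category
```

Scaling matters here because neural networks converge faster when all input features are on a comparable scale.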
On the terminal, under the conda dev environment, type the code below:
pip install jupyterlab
If you already have Jupyter Lab installed, you can open your notebook with hidden files visible by typing this while in your conda dev environment:
jupyter lab --ContentsManager.allow_hidden=True
Press ENTER, and Jupyter Lab will open in your default browser.
- To install Scikit-learn, check that your development environment is active, and then run the following command:
pip install -U scikit-learn
- To check whether scikit-learn is already installed, you can run the following code in your dev environment:
conda list scikit-learn
- To install TensorFlow, check that your development environment is active, and then run the following command:
pip install --upgrade tensorflow
- To verify that TensorFlow is installed, you can run the following code in your dev environment:
python -c "import tensorflow as tf;print(tf.__version__)"
- To verify that Keras is installed, you can run the following code in your dev environment:
python -c "import tensorflow as tf;print(tf.keras.__version__)"
Below is an overview of the workflow for this model. To get a meaningful prediction analysis, we model, fit, and predict using the ReLU, LeakyReLU, and Sigmoid activation functions.
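The model/fit/predict workflow can be sketched as below. This is a minimal example on random stand-in data, not the tuned architecture from the project notebooks: the layer sizes, epoch count, and input width are illustrative assumptions.

```python
import numpy as np
import tensorflow as tf

# Stand-in data: scaled features X and binary labels y (the real project
# uses the preprocessed Alphabet Soup dataset; shapes here are illustrative).
X = np.random.rand(100, 8).astype("float32")
y = np.random.randint(0, 2, size=(100,))

model = tf.keras.models.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(16, activation="relu"),    # ReLU hidden layer
    tf.keras.layers.Dense(8),
    tf.keras.layers.LeakyReLU(),                     # leaky alternative to ReLU
    tf.keras.layers.Dense(1, activation="sigmoid"),  # success probability
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, verbose=0)

predictions = model.predict(X, verbose=0)
print(predictions.shape)  # (100, 1): one probability per applicant
```

The sigmoid output layer is what turns the network into a success-rate predictor: each prediction is a value between 0 and 1 that can be read as the probability an applicant is successful if funded.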
Contributed by: Justine Cho
Email: [email protected]
Copyright (c) [2021] [Justine Cho]
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.