
idnns's Introduction

IDNNs

Description

IDNNs is a Python library, implemented in TensorFlow, for training deep neural networks and calculating the information they retain [Shwartz-Ziv & Tishby, 2017]. The library lets you investigate how networks look on the information plane and how this changes during learning.

Prerequisites

  • tensorflow r1.0 or higher
  • numpy 1.11.0
  • matplotlib 2.0.2
  • multiprocessing
  • joblib

Usage

All the code is under the idnns/ directory. To train a network and calculate its mutual information (MI) and gradients, run the example in main.py. Of course, you can also run only specific methods, e.g. just the training procedure or just the MI calculation. This file takes the following command-line arguments:

  • start_samples - The index of the first sample used to calculate the information
  • batch_size - The size of the batch
  • learning_rate - The learning rate of the network
  • num_repeat - The number of times to run the network
  • num_epochs - The maximum number of epochs for training
  • net_arch - The architecture of the networks
  • per_data - The percentage of the training data to use
  • name - The name under which to save the results
  • data_name - The name of the dataset
  • num_samples - The maximum number of indices used to calculate the information
  • save_ws - True if we want to save the outputs of the network
  • calc_information - 1 if we want to calculate the MI of the network
  • save_grads - True if we want to save the gradients of the network
  • run_in_parallel - True if we want to run all the networks in parallel
  • num_of_bins - The number of bins into which the neurons' output is divided
  • activation_function - The activation function of the model: 0 for tanh, 1 for ReLU
  • interval_accuracy_display - The interval at which accuracy is displayed
  • interval_information_display - The interval at which the information calculation is displayed
  • cov_net - True if we want a convolutional network
  • rand_labels - True if we want to set random labels
  • data_dir - The directory in which to find the data

The results are saved under the jobs folder. Each run creates a directory whose name encodes the run's properties. This directory contains a data.pickle file with the run's data and a Python file that is a copy of the file that created the run. The datasets themselves are under the data directory.
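A saved run can be loaded back with Python's pickle module. A minimal sketch — the jobs/ layout is from the description above, but the structure of the pickled object is an assumption, and a temporary directory stands in for a real run directory:

```python
import os
import pickle
import tempfile

# Stand-in for a real run directory under jobs/ (illustrative only).
run_dir = tempfile.mkdtemp()
path = os.path.join(run_dir, 'data.pickle')

# Pretend these are the saved results of a run; the keys here are
# assumptions for illustration, not the library's actual schema.
results = {'information': [[0.1, 0.2]], 'params': {'batch_size': 512}}
with open(path, 'wb') as f:
    pickle.dump(results, f)

# Loading the saved run back, as you would with a real data.pickle:
with open(path, 'rb') as f:
    data = pickle.load(f)

print(data['params']['batch_size'])  # -> 512
```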

For plotting the results, use plot_figures.py. This file contains methods for plotting different aspects of the data (the information plane, the gradients, the norms, etc.).

References

  1. Ravid Shwartz-Ziv and Naftali Tishby, Opening the Black Box of Deep Neural Networks via Information, 2017, arXiv:1703.00810.

idnns's People

Contributors

bdwyer2, martinschiemer, ravidziv


idnns's Issues

Binning for mutual info calculation

Hi, I have been running your code with the option -activation_function=1, i.e. with ReLUs according to the README. However, I noticed that binning for the mutual information calculation is performed between -1 and 1, independent of the activation function.

bins = np.linspace(-1, 1, num_of_bins)

However, this is only coherent with tanh activations.
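One way to make the binning consistent with the activation would be to derive the bin range from the activation's output range. A rough sketch, not the repository's actual code — the function and its signature are made up for illustration:

```python
import numpy as np

def make_bins(layer_output, activation, num_of_bins=30):
    """Pick bin edges that match the activation's output range (sketch only)."""
    if activation == 'tanh':
        # tanh outputs lie in [-1, 1], so fixed edges are fine.
        return np.linspace(-1, 1, num_of_bins)
    # ReLU outputs are non-negative and unbounded above,
    # so use the observed maximum instead of a fixed upper edge of 1.
    return np.linspace(0, layer_output.max(), num_of_bins)

acts = np.array([0.0, 1.5, 3.0])     # toy ReLU outputs, some above 1
bins = make_bins(acts, 'relu')
digitized = np.digitize(acts, bins)  # every activation now falls in range
```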

possible for loop bug

In idnns/information/information_process.py, line 172 has a for loop using the variable i; however, we are already inside another for loop that also uses i, starting on line 166.

Is this a bug, or was it intended?

Lines 166 - 178:

        for i in range(len(ws_iter_index)):
            data = ws_iter_index[i]
            new_data = np.zeros((num_of_samples * data.shape[0], data.shape[1]))
            labels = np.zeros((num_of_samples * label.shape[0], label.shape[1]))
            x = np.zeros((num_of_samples * data.shape[0], 2))
            sigma = 0.5
            for i in range(data.shape[0]):
                print i
                cov_matrix = np.eye(data[i, :].shape[0]) * sigma
                t_i = np.random.multivariate_normal(data[i, :], cov_matrix, num_of_samples)
                new_data[num_of_samples * i:(num_of_samples * (i + 1)) , :] = t_i
                labels[num_of_samples * i:(num_of_samples * (i + 1)) , :] = label[i, :]
                x[num_of_samples * i:(num_of_samples * (i + 1)) , 0] = i
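If the shadowing is unintended, the fix is simply to give each loop its own index. A self-contained sketch of the corrected structure, with toy shapes standing in for the real ws_iter_index and label arrays:

```python
import numpy as np

num_of_samples, sigma = 3, 0.5
label = np.eye(2)                       # toy one-hot labels, shape (2, 2)
ws_iter_index = [np.random.rand(2, 4)]  # one layer of toy activations

for layer_idx in range(len(ws_iter_index)):   # outer index renamed from `i`
    data = ws_iter_index[layer_idx]
    new_data = np.zeros((num_of_samples * data.shape[0], data.shape[1]))
    labels = np.zeros((num_of_samples * label.shape[0], label.shape[1]))
    x = np.zeros((num_of_samples * data.shape[0], 2))
    for sample_idx in range(data.shape[0]):   # inner index no longer shadows it
        cov_matrix = np.eye(data[sample_idx, :].shape[0]) * sigma
        t_i = np.random.multivariate_normal(data[sample_idx, :],
                                            cov_matrix, num_of_samples)
        new_data[num_of_samples * sample_idx:num_of_samples * (sample_idx + 1), :] = t_i
        labels[num_of_samples * sample_idx:num_of_samples * (sample_idx + 1), :] = label[sample_idx, :]
        x[num_of_samples * sample_idx:num_of_samples * (sample_idx + 1), 0] = sample_idx
```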

Running on MNIST

Hi,

I was trying to run this on the MNIST dataset, but setting -d_name MNIST does not seem to work. What's the right parameter I must pass in?

"Calculating the information" takes a long time

Hi,

I am running the main script without any issues: python main.py

The part of the script that trains the network takes about a minute (after the output Start running the network).

The part that calculates the information takes a very long time (after the output Start calculating the information...): something like 5 to 10 minutes for every 30 epochs, so the whole script will take about 5 hours to complete. I have tried on both a MacBook Pro with an i7 and a beefier machine with a Ryzen 7 (both have the speed I reported above). Note that all my CPUs are at 100% during this part, so I suppose parallelisation is already done, or won't help.
My main issue is that it makes experimenting slow, as I have to wait a long time to see the results after each modification.

Is this the expected behavior? Is it the same speed on your machine?
Can I do anything to speed it up?

Thanks in advance

Generating good looking plots

Hi Ravid,

Thank you for releasing the code. Based on the sheer amount of code, it would be almost impossible for anyone else to replicate your work without this public repo!

small_plot

I ran main.py basically as is, and the figure I was able to obtain is attached above.
The job folder name is net_nEpochInds=274_lr=0.0004_layerSizes=10,7,5,4,3_LastEpochsInds=7999_DataName=var_u_sampleLen=1_nDistSmpls=1_batch=256_nRepeats=1_nEpoch=8000

This figure looks a bit weird... (not just that the caption I(X; T) was cut off); you can also see a slight increase of I(X; T) near the end of training. Am I using the code correctly?

Also, the information_network collects self.l1_norms and self.l2_norms.
Are those the norms of the parameter/weight matrices? Is there a way to plot them as well? (I see some keywords in plot_figures that contain "plot_norms", but I am not sure how to use them exactly...)

Update to Python3 and Windows

I've modified the source files for compatibility with Python 3 at this repo. On Windows, TensorFlow only works with Python 3, not Python 2, so it was necessary to update to Python 3. You may wish to create a Python 3 branch of IDNNs and use my mods. Cheers.

To install on Windows, install Anaconda3, then tensorflow and joblib.

conda install tensorflow-gpu joblib

Execute idnns.bat for a quick test of IDNNs.

Some graphs are missing data

Hello,

Thanks for the update and your previous reply.
I'm running the code on mac with python3 and venv.

By running this command:
python main.py -num_repeat=1 -num_epochs=1000 -num_samples=50

The information plane graph looks good. The two other graphs are missing the data. Any pointers to what is the issue?
Thanks a lot.

I have attached the charts below:

graph1

graph2

graph3

Unable to clone with eCryptfs

The eCryptfs file system does not support file names longer than 143 characters such as the files in the jobs/ directory. This makes it impossible for me to clone this repository on my machine.

Would it be possible to shorten some of the directory names in the jobs/ directory? I'm sure other linux users will face this issue as well. Thanks!

git install error

I'd like to use pip3 install with this syntax:
pip3 install git+https://github.com/ravidziv/IDNNs.git

I got this error message:
Collecting git+https://github.com/ravidziv/IDNNs.git
Cloning https://github.com/ravidziv/IDNNs.git to /private/tmp/pip-req-build-59as3hdr
Complete output from command python setup.py egg_info:
Traceback (most recent call last):
File "", line 1, in
File "/anaconda3/lib/python3.6/tokenize.py", line 452, in open
buffer = _builtin_open(filename, 'rb')
FileNotFoundError: [Errno 2] No such file or directory: '/private/tmp/pip-req-build-59as3hdr/setup.py'

----------------------------------------

Command "python setup.py egg_info" failed with error code 1 in /private/tmp/pip-req-build-59as3hdr/

Does anyone know how to solve it? I am using macOS, and I also tried this command on another Mac, which gives me the same error.

Running information network on pre-trained TF models

Hi, thanks for all your work on this. I just wanted to know how much work it would be to make plugging in pre-trained models possible, or to use model architectures other than the ones mentioned, in order to visualize the information plane. Since the information is calculated on a per-epoch basis, there looks to be tight coupling between the models and the information network code. Also, are there additional specific artifacts that should be brought in with the pre-trained models which could help in visualizing the plane? The overall idea is to make this a tool that can be used with various models.
Thanks,
-AT

Running IDNN for mnist dataset

Hi @ravidziv

Can you please guide me on how to run this code on the MNIST dataset? What should the parameters and network layers be if I need to run it with 5 to 6 deep layers?

Calculating the IB curve (blue line) in Figure 6?

Hi @ravidziv,

Thanks for sharing the code publicly.

I was wondering if you could specify which part of your code computes the IB curve (the blue line) in Figure 6 in the paper? Did you use Blahut-Arimoto algorithm to compute it? If possible, would you mind sharing your computed data of the IB curve in Figure 6?

Look forward to your reply.

Regards,
-Thanh

[BUG] Calculate Information for Epoch

Hi, there's a typo in line 143 of your information_process.py script:

parmas = calc_by_sampling_neurons(ws_iter_index=ws_iter_index, num_of_samples=num_of_samples, label=label, sigma=sigma, bins=bins, pxs=pxs)

parmas should be renamed to params, since the latter is what the function returns.
