NPEET_LNC

Mutual Information Estimation with Local Non-uniformity Correction (a branch of NPEET, the Non-parametric Entropy Estimation Toolbox)

This package contains Python code implementing mutual information estimation functions for continuous variables. The estimator adds a local non-uniformity correction term to the traditional kNN estimator and estimates mutual information more accurately than the Kraskov estimator for strongly dependent variables when samples are limited.

This package requires SciPy 0.12 or greater.

Example installation and usage:

git clone https://github.com/BiuBiuBiLL/NPEET_LNC.git

>>> from lnc import MI
>>> import numpy as np
>>> x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0, 11.0]
>>> y = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0, 11.0]
>>> MI.mi_LNC([x, y], k=5, base=np.exp(1), alpha=0.25)  # apply the LNC estimator
Output: 25.29758574548632
>>> MI.mi_Kraskov([x, y], k=5, base=np.exp(1))  # same data with the Kraskov estimator
Output: 0.62745310245310382
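
The 3D tests below estimate mutual information among three variables at once. As a minimal sketch of such a call, assuming mi_LNC accepts a list of three equally long samples (the alpha value here is only illustrative; pick it from alpha.xlsx for your d and k):

>>> z = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0, 11.0]
>>> MI.mi_LNC([x, y, z], k=5, base=np.exp(1), alpha=0.25)  # illustrative alpha; see alpha.xlsx for d=3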

To run the test example in test.py, type the following in a terminal:

$ python test.py
Output:
Testing 2D linear relationship Y=X+Uniform_Noise
noise level=1e-07, Nsamples = 500
True MI(x:y) 16.0472260191
Kraskov MI(x:y) 5.79082342999
LNC MI(x:y) 15.9355050641

Testing 2D quadratic relationship Y=X^2+Uniform_Noise
noise level=1e-07, Nsamples = 1000
True MI(x:y) 15.8206637545
Kraskov MI(x:y) 6.48347086055
LNC MI(x:y) 11.4586276609

Testing 3D linear relationship Y=X+Uniform_Noise, Z=X+Uniform_Noise
noise level=1e-07, Nsamples = 500
True MI(x:y:z) 32.2836569188
Kraskov MI(x:y:z) 11.58164686
LNC MI(x:y:z) 32.1846129957

Testing 3D quadratic relationship Y=X^2+Uniform_Noise, Z=X^2+Uniform_Noise
noise level=1e-07, Nsamples = 500
True MI(x:y:z) 31.5020968975
Kraskov MI(x:y:z) 11.57764686
LNC MI(x:y:z) 25.6686276941
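
These numbers come from test.py. As a rough sketch of the kind of comparison it runs (the data generation below is an assumption based on the printed descriptions, not a copy of test.py):

>>> import numpy as np
>>> from lnc import MI
>>> noise = 1e-7
>>> x = list(np.random.rand(500))                       # X ~ Uniform(0, 1)
>>> y = [xi + noise * np.random.rand() for xi in x]     # Y = X + uniform noise
>>> MI.mi_Kraskov([x, y], k=5, base=np.exp(1))          # Kraskov estimate
>>> MI.mi_LNC([x, y], k=5, base=np.exp(1), alpha=0.25)  # LNC estimate with correction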

You need to specify the thresholding parameter alpha when using the LNC estimator. This parameter depends on the nearest-neighbor parameter k and the dimensionality d; see alpha.xlsx for the detailed alpha values to use.
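
As a minimal sketch of wiring that lookup into your own code, assuming you copy the values you need from alpha.xlsx (the entries below are placeholders, except (d=2, k=5), which matches the example above):

>>> # hypothetical lookup keyed by (dimensionality d, nearest-neighbor parameter k)
>>> ALPHA = {(2, 5): 0.25, (3, 5): 0.25}  # placeholder values; read the real ones from alpha.xlsx
>>> d, k = 2, 5
>>> MI.mi_LNC([x, y], k=k, base=np.exp(1), alpha=ALPHA[(d, k)])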

See also the following references for the implemented estimators.

			A. Kraskov, H. Stögbauer, P. Grassberger.
			Estimating Mutual Information.
			Physical Review E 69, 066138, 2004.
			http://pre.aps.org/abstract/PRE/v69/i6/e066138

			Shuyang Gao, Greg Ver Steeg, Aram Galstyan.
			Efficient Estimation of Mutual Information for Strongly Dependent Variables.
			AISTATS, 2015.
			http://arxiv.org/abs/1411.2003
