
simple-einet's Introduction

An EinsumNetworks Implementation

This repository contains code for my personal EinsumNetworks implementation.

For a speed benchmark against the official EinsumNetworks implementation, see benchmark.md (TL;DR: simple-einet is faster along all dimensions except the input-channel size, in which it scales similarly to the official implementation).

Notebooks

The notebooks directory contains Jupyter notebooks that demonstrate the usage of this library.

PyTorch Lightning Training

The main_pl.py script offers PyTorch Lightning-based training for discriminative and generative Einets.

Classification on MNIST:

python main_pl.py dataset=mnist batch_size=128 epochs=100 dist=normal D=5 I=32 S=32 R=8 lr=0.001 gpu=0 classification=true 

Generative training on MNIST:

python main_pl.py dataset=mnist D=5 I=16 R=10 S=16 lr=0.1 dist=binomial epochs=10 batch_size=128
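
These single-letter flags follow the usual RAT-SPN/EinsumNetworks notation and appear to map onto the EinetConfig fields used in the usage example below: D to depth, I to num_leaves (the number of input distributions per leaf region), S to num_sums, and R to num_repetitions. This mapping is inferred from the configuration fields rather than spelled out by the scripts themselves.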

MNIST Samples

Installation

You can install simple-einet as a dependency in your project as follows:

pip install git+https://github.com/braun-steven/simple-einet

If you additionally want to install the dependencies required to run the provided scripts, such as main.py, main_pl.py, or the notebooks, run

pip install "git+https://github.com/braun-steven/simple-einet#egg=simple-einet[app]"

If you plan to edit the files after installation:

git clone git@github.com:braun-steven/simple-einet.git
cd simple-einet
pip install -e .

Usage Example

The following is a simple usage example of how to create, optimize, and sample from an Einet.

import torch
from simple_einet.layers.distributions.normal import Normal
from simple_einet.einet import Einet
from simple_einet.einet import EinetConfig


if __name__ == "__main__":
    torch.manual_seed(0)

    # Input dimensions
    in_features = 4
    batchsize = 5

    # Create input sample
    x = torch.randn(batchsize, in_features)

    # Construct Einet
    cfg = EinetConfig(
        num_features=in_features,
        depth=2,
        num_sums=2,
        num_channels=1,
        num_leaves=3,
        num_repetitions=3,
        num_classes=1,
        dropout=0.0,
        leaf_type=Normal,
    )
    einet = Einet(cfg)

    # Compute log-likelihoods
    lls = einet(x)
    print(f"lls.shape: {lls.shape}")
    print(f"lls: \n{lls}")

    # Optimize Einet parameters (weights and leaf params)
    optim = torch.optim.Adam(einet.parameters(), lr=0.001)

    for _ in range(1000):
        optim.zero_grad()

        # Forward pass: compute log-likelihoods
        lls = einet(x)

        # Backprop negative log-likelihood loss
        nlls = -1 * lls.sum()
        nlls.backward()

        # Update weights
        optim.step()

    # Construct samples
    samples = einet.sample(2)
    print(f"samples.shape: {samples.shape}")
    print(f"samples: \n{samples}")

Citing EinsumNetworks

If you use this software, please cite it as below.

@software{braun2021simple-einet,
  author = {Braun, Steven},
  title = {{Simple-einet: An EinsumNetworks Implementation}},
  url = {https://github.com/braun-steven/simple-einet},
  version = {0.0.1},
}

If you use EinsumNetworks as a model in your publications, please cite our official EinsumNetworks paper.

@inproceedings{pmlr-v119-peharz20a,
  title = {Einsum Networks: Fast and Scalable Learning of Tractable Probabilistic Circuits},
  author = {Peharz, Robert and Lang, Steven and Vergari, Antonio and Stelzner, Karl and Molina, Alejandro and Trapp, Martin and Van Den Broeck, Guy and Kersting, Kristian and Ghahramani, Zoubin},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages = {7563--7574},
  year = {2020},
  editor = {III, Hal Daumé and Singh, Aarti},
  volume = {119},
  series = {Proceedings of Machine Learning Research},
  month = {13--18 Jul},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v119/peharz20a/peharz20a.pdf},
  url = {http://proceedings.mlr.press/v119/peharz20a.html},
  code = {https://github.com/cambridge-mlg/EinsumNetworks},
}

simple-einet's People

Contributors

braun-steven, dependabot[bot], felixdivo


simple-einet's Issues

How to conduct classification in EinsumNetworks?

I have seen that RAT-SPN can be used for classification, and I planned to do the same with EinsumNetworks, but the training accuracy is very low and I cannot find the reason. Could you add a classification example? I look forward to your reply. Thank you.

Is Differentiable Sampling implemented?

Hi @braun-steven, thanks for the great work.
I wanted to confirm whether this repository contains the updated code for differentiable sampling as in the paper: https://proceedings.mlr.press/v181/lang22a/lang22a.pdf

Along similar lines, I see here (https://github.com/braun-steven/simple-einet/blob/main/examples/test_iris.py) that you are using cross-entropy as the loss; does that mean I can use this as a basis for differentiable sampling?

If not, where can I refer to an end-to-end example where backpropagation through an EiNet happens with the Gumbel-softmax implementation? (See the generic illustration below.)
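
For reference, the relaxation described in the paper replaces the hard categorical choice at a sum node with a Gumbel-softmax sample. Here is a generic PyTorch illustration of that building block (plain torch, not the code from this repository):

import torch
import torch.nn.functional as F

# A sum node mixes K children with normalized weights; differentiable
# sampling relaxes the categorical child choice via Gumbel-softmax.
logits = torch.randn(4, requires_grad=True)  # unnormalized weights over K=4 children

tau = 1.0  # temperature; lower values approach a discrete one-hot sample
soft = F.gumbel_softmax(logits, tau=tau, hard=False)  # fully differentiable relaxation
hard = F.gumbel_softmax(logits, tau=tau, hard=True)   # one-hot forward, soft gradients (straight-through)

print(soft, hard)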

Improving the iris dataset classification

Hi,
Thanks for the library. I made a small change to the iris classification demo: using the full dataset (i.e., num_features=4) and scaling it with sklearn.preprocessing.StandardScaler, the classifier now achieves 100% training accuracy and 98% test accuracy!

Here's the code:

import torch
from sklearn import datasets
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from simple_einet.einet import Einet, EinetConfig
from simple_einet.layers.distributions.normal import Normal


# Load the Iris dataset
iris = datasets.load_iris()
X = iris.data
scaler = StandardScaler()
X = scaler.fit_transform(X)
# Split the dataset into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, iris.target, test_size=0.33, random_state=42)

# Convert the data to PyTorch tensors
X_train = torch.tensor(X_train).float()
y_train = torch.tensor(y_train).long()
X_test = torch.tensor(X_test).float()
y_test = torch.tensor(y_test).long()

# Define model configuration using default values
config = EinetConfig(
    num_features=4,
    depth=1,
    num_sums=5,
    num_leaves=5,
    num_repetitions=5,
    num_classes=3,
    leaf_type=Normal,
    leaf_kwargs={},
    dropout=0.0,
)

# Initialize the model
model = Einet(config)


# Set up the optimizer and loss function
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)
cross_entropy = torch.nn.CrossEntropyLoss()

# Define a function to compute accuracy
def accuracy(model, X, y):
    with torch.no_grad():
        outputs = model(X)
        predictions = outputs.argmax(-1)
        correct = (predictions == y).sum()
        total = y.shape[0]
        return correct / total * 100

# Training loop
for epoch in range(50):
    optimizer.zero_grad()
    log_likelihoods = model(X_train)
    loss = cross_entropy(log_likelihoods, y_train)
    loss.backward()
    optimizer.step()

    acc_train = accuracy(model, X_train, y_train)
    acc_test = accuracy(model, X_test, y_test)
    print(f"Epoch: {epoch + 1}, Loss: {loss.item():.2f}, Accuracy Train: {acc_train:.2f} %, Accuracy Test: {acc_test:.2f} %")

Conditional reasoning examples

Hi Again,

The example applications seem to be geared towards either generative modelling for sampling and image completion, or discriminative learning for inference.

I was wondering if you would be happy to provide an example of generative modelling for conditional inference, similar to conditional inference in Bayesian networks. Perhaps a simple application such as iris classification would suffice (a minimal sketch of one such query follows below).

Thanks,
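
One conditional query that already works out of the box with a discriminative Einet is the class posterior p(y | x): with one root per class, the forward pass returns per-class log-likelihoods, and Bayes' rule reduces to a softmax over them. A minimal sketch, reusing the iris-style configuration above (untrained weights here, so the numbers are purely illustrative; the reading of the outputs as class joints assumes class priors are absorbed into the root weights):

import torch
from simple_einet.einet import Einet, EinetConfig
from simple_einet.layers.distributions.normal import Normal

config = EinetConfig(
    num_features=4,
    depth=1,
    num_sums=5,
    num_leaves=5,
    num_repetitions=5,
    num_classes=3,
    leaf_type=Normal,
    leaf_kwargs={},
    dropout=0.0,
)
model = Einet(config)

x = torch.randn(1, 4)
log_joint = model(x)  # per-class log-likelihoods log p(x, y=c), shape (1, 3)

# Bayes' rule: p(y=c | x) = p(x, y=c) / sum_c' p(x, y=c'),
# which in log-space is simply a softmax over the class outputs.
posterior = torch.softmax(log_joint, dim=-1)
print(posterior)  # rows sum to 1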

Example Code is not working

Hey

I'm trying to run some tests with the implementation of EinsumNetworks, but I can't get the example code running.

  1. The import of the RatNormal class is wrong. It has to be
    from simple_einet.layers.distributions.normal import RatNormal instead of from simple_einet.distributions import RatNormal.
  2. Even after fixing that, a new error occurs: TypeError: simple_einet.layers.distributions.normal.RatNormal() argument after ** must be a mapping, not NoneType. Removing that line of code (line 278 in einet.py) does solve the problem, but that can't be the solution (see the corrected sketch below).

Thanks in advance
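
Putting both fixes together, a corrected configuration might look as follows. This is a sketch only: the corrected import path comes from the issue above, passing an explicit empty mapping for leaf_kwargs is the assumed workaround for the NoneType error (the iris example above does the same with Normal), and whether RatNormal accepts an empty leaf_kwargs without further arguments is not verified here:

from simple_einet.layers.distributions.normal import RatNormal  # corrected import path
from simple_einet.einet import Einet, EinetConfig

cfg = EinetConfig(
    num_features=4,
    depth=1,
    num_sums=2,
    num_leaves=2,
    num_repetitions=1,
    num_classes=1,
    dropout=0.0,
    leaf_type=RatNormal,
    leaf_kwargs={},  # an explicit mapping, to avoid the NoneType ** error
)
einet = Einet(cfg)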
