This project is forked from deebuls/deep_evidential_regression_loss_pytorch


Implementation of the Deep Evidential Regression paper

License: Apache License 2.0




Deep Evidential Regression Loss Function

Description

The paper "Deep Evidential Uncertainty/Regression" was submitted to ICLR, where it was rejected [1]. The idea is in line with the work of Sensoy et al. [2] and Malinin & Gales [3]. It was rejected for a lack of experiments and for overlap with ideas in Malinin's thesis. The goal of this repository is to implement the loss function and validate the reported results.

Installation

Typical Install

pip install git+https://github.com/deebuls/deep_evidential_regression_loss_pytorch

Development

git clone https://github.com/deebuls/deep_evidential_regression_loss_pytorch
cd deep_evidential_regression_loss_pytorch
pip install -e .[dev]

Tests can then be run from the root of the project using:

nosetests

Usage

To use this code, import EvidentialLossSumOfSquares and create a loss-function instance; loss.py implements the evidential loss function.

See the examples for a detailed usage example.
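The exact implementation in loss.py is not reproduced here, but the core quantity behind evidential regression losses is the negative log-likelihood of a target under the Normal-Inverse-Gamma (NIG) evidential distribution from the paper. A minimal, dependency-free sketch (the name `nig_nll` and this parameterization are illustrative, not taken from this repo's `loss.py`):

```python
import math

def nig_nll(y, gamma, nu, alpha, beta):
    """Negative log-likelihood of target y under a Normal-Inverse-Gamma
    evidential distribution with parameters (gamma, nu, alpha, beta).
    Marginalizing the Gaussian likelihood over the NIG prior yields a
    Student-t predictive; this is its negative log-density."""
    omega = 2.0 * beta * (1.0 + nu)
    return (0.5 * math.log(math.pi / nu)
            - alpha * math.log(omega)
            + (alpha + 0.5) * math.log(nu * (y - gamma) ** 2 + omega)
            + math.lgamma(alpha) - math.lgamma(alpha + 0.5))
```

As expected of a likelihood-based loss, the value grows as the prediction gamma moves away from the target y, and it is differentiable in all four evidential parameters.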

ToDo

  1. Different variations of the loss (NLL, with log(alpha, beta, lambda))
  2. Support image outputs, as in the case of a VAE
  3. Examples
  4. Test cases
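The log-parameterized variant in item 1 usually means the network emits unconstrained values that are then mapped to valid NIG parameters. A sketch of one common mapping via softplus constraints (`to_nig_params` is an illustrative name, not code from this repo):

```python
import math

def softplus(x):
    # numerically stable log(1 + e^x)
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)

def to_nig_params(raw_gamma, raw_nu, raw_alpha, raw_beta):
    """Map unconstrained network outputs to valid NIG parameters:
    nu > 0, beta > 0, and alpha > 1 (so the mean of the
    inverse-gamma component exists). gamma is unconstrained."""
    return (raw_gamma,
            softplus(raw_nu),
            softplus(raw_alpha) + 1.0,
            softplus(raw_beta))
```

Parameterizing in an unconstrained space keeps gradient-based training simple while guaranteeing the loss always receives valid evidential parameters.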

Abstract

Deterministic neural networks (NNs) are increasingly being deployed in safety critical domains, where calibrated, robust and efficient measures of uncertainty are crucial. While it is possible to train regression networks to output the parameters of a probability distribution by maximizing a Gaussian likelihood function, the resulting model remains oblivious to the underlying confidence of its predictions. In this paper, we propose a novel method for training deterministic NNs to not only estimate the desired target but also the associated evidence in support of that target. We accomplish this by placing evidential priors over our original Gaussian likelihood function and training our NN to infer the hyperparameters of our evidential distribution. We impose priors during training such that the model is penalized when its predicted evidence is not aligned with the correct output. Thus the model estimates not only the probabilistic mean and variance of our target but also the underlying uncertainty associated with each of those parameters. We observe that our evidential regression method learns well-calibrated measures of uncertainty on various benchmarks, scales to complex computer vision tasks, and is robust to adversarial input perturbations.
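The abstract's claim that the model estimates the target's mean and variance plus the uncertainty of each has closed forms under the NIG distribution. A small sketch using the standard NIG moments (assumes alpha > 1; illustrative, not code from this repository):

```python
def nig_uncertainties(gamma, nu, alpha, beta):
    """Closed-form predictive moments of a Normal-Inverse-Gamma
    evidential distribution (requires alpha > 1):
      prediction      E[mu]      = gamma
      aleatoric unc.  E[sigma^2] = beta / (alpha - 1)
      epistemic unc.  Var[mu]    = beta / (nu * (alpha - 1))
    """
    assert alpha > 1.0, "moments require alpha > 1"
    aleatoric = beta / (alpha - 1.0)
    epistemic = beta / (nu * (alpha - 1.0))
    return gamma, aleatoric, epistemic
```

Note the asymmetry: accumulating more evidence (larger nu) shrinks the epistemic term toward zero while the aleatoric term, the data's inherent noise, remains.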

References
