bertdv / BMLIP
Course 5SSD0 - Bayesian Machine Learning and Information Processing
Home Page: http://bmlip.nl
License: Other
I would like to put links in the lecture notes to a specific page in a PDF file (e.g., the PRML book). It should be possible for a student to click a link in the notebook and have a specific page from the PRML book open. This way we can make notes such as:
for proof, see Bishop, p. 463.
I have looked around and it should be possible to add a named bookmark (say, with name p463) and make it work with a link such as
https://github.com/bertdv/BMLIP/blob/master/lessons/notebooks/files/PRML.pdf#p463
but it does not work for me. Can any of you figure out how to make this work?
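One possible lead (an assumption on my side, not something I have verified against our PDF): per Adobe's "open parameters" and RFC 8118, browser PDF viewers understand a page-based fragment, whereas #p463 only works if a named destination p463 is actually embedded in the PDF. So a link like the following may already do the job:

```
https://github.com/bertdv/BMLIP/blob/master/lessons/notebooks/files/PRML.pdf#page=463
```

Caveat: GitHub's blob page wraps the PDF in its own previewer and may swallow the fragment, so the link probably has to point at a directly served copy of the PDF.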
I think we (read: I) should write an installation guide so that the students can quickly install the right versions of Julia and Jupyter. Lots of things can go wrong in the PP notebooks if their versions are wrong.
Are there alternative methods for generating PDF files from Jupyter notebooks? It is difficult at the moment to get attractive-looking output.
As discussed, please finish the cell at the bottom of
https://nbviewer.jupyter.org/github/bertdv/BMLIP/blob/master/lessons/notebooks/Factor-Graphs.ipynb
which discusses the Bethe free energy and shows how belief propagation corresponds to a stationary point of the BFE. A simple derivation is preferred.
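In case it helps whoever picks this up, a natural starting point is the standard form of the Bethe free energy (as in Yedidia, Freeman & Weiss); its stationary points under the marginalization and normalization constraints correspond to belief propagation fixed points:

```latex
% Bethe free energy for factor beliefs q_a(x_a), variable beliefs q_i(x_i),
% and variable degrees d_i (number of factors that x_i participates in):
F_B[q] = \sum_a \sum_{x_a} q_a(x_a) \ln \frac{q_a(x_a)}{f_a(x_a)}
       \;-\; \sum_i (d_i - 1) \sum_{x_i} q_i(x_i) \ln q_i(x_i)
```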
I want to run the notebooks in mybinder and add a badge to the repo:
This should make it easy (one click) for students to run the IJulia notebooks. Check out https://discourse.julialang.org/t/using-julia-in-binder-interactive-web-environment-for-running-your-code/21802
and please set up the repo so that mybinder works properly.
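A sketch of what the setup likely amounts to (untested assumption: mybinder's repo2docker detects a Julia repo from environment files in the repo root or a .binder/ folder, and running notebooks additionally needs IJulia in that project):

```
# hypothetical repo layout for mybinder's Julia detection
Project.toml     # notebook dependencies, including IJulia
Manifest.toml    # pinned versions, for reproducible builds
lessons/notebooks/*.ipynb
```

The badge itself follows the standard mybinder format:

```
[![Binder](https://mybinder.org/badge_logo.svg)](https://mybinder.org/v2/gh/bertdv/BMLIP/master)
```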
Can you please code the following visualization of a KL divergence in Julia?
I would like to add it to the latent variable models lesson, see
https://github.com/bertdv/BMLIP/blob/master/lessons/notebooks/Latent-Variable-Models-and-VB.ipynb
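Not knowing exactly which visualization you have in mind, here is a minimal hypothetical sketch (the two Gaussians p and q are made-up examples) that overlays the densities and reports a numerically approximated KL(p||q):

```julia
using Distributions, Plots

# Two made-up distributions to compare
p = Normal(0.0, 1.0)   # reference distribution p(x)
q = Normal(1.0, 2.0)   # approximating distribution q(x)

xs = range(-6, 8; length=400)
plot(xs, pdf.(p, xs), label="p(x)", fill=(0, 0.2))
plot!(xs, pdf.(q, xs), label="q(x)", fill=(0, 0.2))

# Riemann-sum approximation of KL(p||q) = ∫ p(x) log(p(x)/q(x)) dx
dx = step(xs)
kl = sum(pdf.(p, xs) .* (logpdf.(p, xs) .- logpdf.(q, xs))) * dx
title!("KL(p||q) ≈ $(round(kl; digits=3))")
```

For the lesson you would probably want to shade the pointwise contribution p(x)·log(p(x)/q(x)) rather than the raw densities; same computation, different fill.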
In the section "Transformations and Sums of Gaussian Variables", the following is written:
... after a linear transformation $z=Ax+b$, no matter how $x$ is distributed,
the mean and variance of $z$ are given by $\mathbb{E}[z] = \mathbb{E}[x] + b$
and $\mathrm{var}[z] = A\mathrm{var}[z] A^T$
I think that the first expression is missing the transformation matrix and the second has a typo in its argument: the correct expressions are $\mathbb{E}[z] = A\,\mathbb{E}[x] + b$ and $\mathrm{var}[z] = A\,\mathrm{var}[x]\,A^T$. Please fix the lesson text.
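For the record, the one-line derivations of the mean and variance of $z = Ax + b$:

```latex
\mathbb{E}[z] = \mathbb{E}[Ax + b] = A\,\mathbb{E}[x] + b
\qquad
\mathrm{var}[z]
  = \mathbb{E}\!\left[(z - \mathbb{E}[z])(z - \mathbb{E}[z])^T\right]
  = A\,\mathbb{E}\!\left[(x - \mathbb{E}[x])(x - \mathbb{E}[x])^T\right] A^T
  = A\,\mathrm{var}[x]\,A^T
```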
The Julia example (cart parking) in the active inference lesson (https://nbviewer.org/github/bertdv/BMLIP/blob/master/lessons/notebooks/Intelligent-Agents-and-Active-Inference.ipynb) is based on ForneyLab. Please move the code to ReactiveMP. @Sepideh-Adamiat can you change the code to the mountain car task? thanks!
Are you good to go with https://biaslab.github.io/teaching/bmlip/ ?
e.g. Thijs's mountain car, Magnus's BATMAN, or Burak's robot. Preferably coded in FL.
Use jmd to write the lectures, then export to various formats.
eg see workflow in https://github.com/mitmath/18S191
Use Pluto.jl rather than Jupyter ipynb notebooks. Static display with GitHub's htmlpreview, e.g.
The last simulation in https://nbviewer.jupyter.org/github/bertdv/BMLIP/blob/master/lessons/notebooks/The-Gaussian-Distribution.ipynb has a DimensionError. Please fix.
@Sepideh-Adamiat B0-B6
@abpolym B7-B12
Add Julia code to generate fig. 10.6 in Latent Variable lesson (https://nbviewer.jupyter.org/github/bertdv/BMLIP/blob/master/lessons/notebooks/Latent-Variable-Models-and-VB.ipynb)
Also there is a deprecation warning in another piece of code. Please fix.
They do not render well in nbviewer (at least not for me).
instructions for generating PDF files: https://github.com/bertdv/BMLIP/blob/master/bundler/instructions.txt
code is in the bundler folder; outputs go in the output folder.
Towards the bottom of the BML lecture there is a "Code Example: Bayesian evolution for the coin toss". The code produces 4 plots that illustrate the evolution of priors to posteriors (plus the Bayes factor) for 2 models (that differ in their prior).
In principle, the code works and the plots are produced. I have two requests:
thanks.
Perhaps I did something wrong, but I don't know what. I updated the landing page content/teaching/bmlip.md with information about the new 2022-23 class, and now the menus for 2021-22 and 2022-23 under the teaching tab on the biaslab homepage have disappeared. What did I do wrong?
the two notebooks:
https://github.com/bertdv/BMLIP/blob/master/lessons/notebooks/ai_agent/robot_car_1d.ipynb
https://github.com/bertdv/BMLIP/blob/master/lessons/notebooks/ai_agent/robot_car_2d.ipynb
give some errors with Julia 1.5.2. Please fix.
In one of the notebooks, I open up the generated algorithm. However, if the cell doesn't encapsulate the output (i.e. give you a scroll bar), the notebook's length increases dramatically.
Can we find a specific line in the algo and print only that?
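One hypothetical workaround, assuming the generated algorithm can be rendered to a string (the variable names below are made up): split it into lines and print only the slice you care about.

```julia
# Stand-in for the generated algorithm's source text (made-up content);
# in the notebook this would be something like repr(algo) or string(algo).
source = join(["line $i of generated algorithm" for i in 1:100], '\n')

# Print only a small window instead of dumping the whole algorithm
lines = split(source, '\n')
println(join(lines[40:42], '\n'))
```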
Please fix the code examples in the PT lesson. There is an error about WebIO, see attached. I installed WebIO but it still doesn't work. Probably remove the animation to make it work?
It seems related to the @animate feature in my Jupyter notebooks. I have no problem with getting rid of the animations if that solves the problem.

Links in the README for read-only versions of the lecture notes seem to be broken. One might think the links just refer to wrong file names, but I noticed commit d13c9e0 (a month ago), which was supposed to be a "fix" for these links?
I can see two options here:
In the first case I can make a simple pull request reverting the broken changes.
In the Gaussian distributions lesson, I see the following piece of code:
This should really be refactored to cleaner code. For instance: don't use the global variable t inside a function; rather, pass t as an argument. In principle the performKalmanStep function should not even need a time index. Please refactor to clean code.
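As a sketch of the kind of refactoring meant here (all names and matrices below are illustrative placeholders, not the lesson's actual code): a pure Kalman step that receives everything through its arguments and touches no globals.

```julia
using LinearAlgebra

# One Kalman filter step as a pure function: no globals, no time index.
# m, V       : current state mean and covariance
# y          : new observation
# A, C, Q, R : transition, observation, process-noise, obs-noise matrices
function perform_kalman_step(m, V, y; A, C, Q, R)
    # prediction
    m_pred = A * m
    V_pred = A * V * A' + Q
    # measurement update
    S = C * V_pred * C' + R        # innovation covariance
    K = V_pred * C' / S            # Kalman gain
    m_new = m_pred + K * (y - C * m_pred)
    V_new = (I - K * C) * V_pred
    return m_new, V_new
end

# usage with made-up 1-D matrices
A = fill(1.0, 1, 1); C = fill(1.0, 1, 1)
Q = fill(0.1, 1, 1); R = fill(1.0, 1, 1)
m0, V0 = [0.0], fill(10.0, 1, 1)
m1, V1 = perform_kalman_step(m0, V0, [1.0]; A=A, C=C, Q=Q, R=R)
```

The calling code can then fold this function over the observation sequence; the step itself never needs to know which time index it is at.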
Since the class notes are Julia notebooks, I'd like to test them automatically every time we commit. Apparently GitHub has made (or will make) this available, see
https://github.com/features/actions
Please set up automatic testing of our notebooks.
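A hypothetical starting point (untested; the Julia version, paths, and execution method are assumptions) would be a workflow that executes every notebook on push:

```yaml
# .github/workflows/test-notebooks.yml  (sketch)
name: Test notebooks
on: [push, pull_request]
jobs:
  run-notebooks:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: julia-actions/setup-julia@v1
        with:
          version: '1.5'
      - run: julia -e 'using Pkg; Pkg.activate("."); Pkg.instantiate(); Pkg.add("IJulia")'
      - run: pip3 install jupyter
      - run: |
          for nb in lessons/notebooks/*.ipynb; do
            jupyter nbconvert --to notebook --execute "$nb" || exit 1
          done
```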
Not all lessons render. Occasionally we'll hit a 404. This is a placeholder issue to track progress towards fixing this issue.
Known culprits:
The final simulation leads to posterior class prediction = 1.0. This is odd, should be < 1.0 I think. Please check out what's going on.
There are some files in folders that are perhaps outdated. For example, the folder lessons/notebooks/ipad-notes
with 1 small pdf. Shall we remove it?
there is a "dangling edge" problem with the code in the Factor graph notebook. I think there has been an update to FL, so this can be fixed.
The notebook
https://nbviewer.jupyter.org/github/bertdv/BMLIP/blob/master/lessons/notebooks/Factor-Graphs.ipynb
does not show it yet (at least not for me).
in the code example in the BML lecture (coin toss), we plot surprise.
Let's check out if the makefile in this repo works for us as well.
The code example in the Bayesian ML lesson (https://nbviewer.org/github/bertdv/BMLIP/blob/master/lessons/notebooks/Intelligent-Agents-and-Active-Inference.ipynb) only computes the posterior for the coin toss.
I have updated the class and derived an expression for both the posterior and evidence.
I would like to extend this example and include model comparison into the code.
So, let's extend the code to:
model 1: prior beta(1,1)
model 2: prior beta(5,1)
plot both posteriors as before but also evidence for both models.
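A minimal sketch of what this extension could look like (the toss counts n1, n0 are made-up placeholder data; for a conjugate Beta-Bernoulli model the posterior and evidence are available in closed form):

```julia
using Distributions, SpecialFunctions

n1, n0 = 7, 3   # made-up data: number of heads and tails

# Beta function via log-gamma, for numerical stability
betafn(a, b) = exp(loggamma(a) + loggamma(b) - loggamma(a + b))

for (name, (a, b)) in [("model 1", (1.0, 1.0)), ("model 2", (5.0, 1.0))]
    posterior = Beta(a + n1, b + n0)                    # conjugate update
    evidence  = betafn(a + n1, b + n0) / betafn(a, b)   # p(D | model)
    println("$name: posterior = Beta($(a + n1), $(b + n0)), evidence = $evidence")
end
```

Plotting both posteriors then works exactly as in the current code, and the ratio of the two evidences gives the Bayes factor directly.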
In this lesson, the code starts with:
using Pkg;
Pkg.activate("/home/mkoudahl/Documents/biaslab/repos/BMLIP/lessons/notebooks/probprog/workspace/");
Pkg.instantiate()
Please fix the path to an accessible folder.
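One hypothetical fix (assuming the environment lives in the repo next to the notebook; @__DIR__ resolves to the directory of the current file):

```julia
using Pkg

# Activate the project environment relative to the notebook's location,
# instead of a hard-coded absolute path on one author's machine.
Pkg.activate(joinpath(@__DIR__, "probprog", "workspace"))
Pkg.instantiate()
```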
Can you try to generate a pdf book from the lecture notes B1-B12 again?
The lecture notes and exercises at the landing page for 2021-22 (https://biaslab.github.io/teaching/archive/bmlip-2021/) point to the most recent versions (Not the versions of 2021-22). Please fix.
Currently, I'm using the following command to inspect the MCMC chain in the HMM notebook.
describe(chain)
However, this produces a lot of output. I actually just want to inspect the chains for some of the latter states. I need to find a way of indexing the chain, something like:
describe(chain[:x_40 : :x_50])
There is a (Julia) code issue in the latent variable lecture notes:
https://github.com/bertdv/BMLIP/blob/master/lessons/notebooks/Latent-Variable-Models-and-VB.ipynb
Has to do with pre-compilation of CSV.jl. Could anybody please fix this ASAP? Thanks!
The label= keyword argument in contour has no effect.
MWE:
using Distributions
using Plots
pyplot()
x1 = -5:.1:5
x2 = -5:.1:5
contour(x1, x2, (x1, x2) -> pdf(MvNormal([0., 0.], [1. 0.;0. 1.]), [x1, x2]), label="a")
I'm getting a warning when I try this on the command line: UserWarning: The following kwargs were not used by contour: 'label'
It seems Julia is ignoring this particular keyword argument. Anyone have any experience with this?
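For what it's worth, in matplotlib (which backs the pyplot() backend) contour sets are apparently not labeled via a plain label kwarg, so Plots seems to just drop it. A hypothetical workaround is to attach the legend entry to an invisible dummy series:

```julia
using Distributions, Plots

x1 = -5:0.1:5
x2 = -5:0.1:5
d = MvNormal([0.0, 0.0], [1.0 0.0; 0.0 1.0])

contour(x1, x2, (a, b) -> pdf(d, [a, b]))   # no label: contour ignores it
# Dummy series whose only job is to carry the legend entry "a"
plot!([NaN], [NaN]; label="a", color=:blue)
```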
In the simulation "Bayesian evolution of p(μ|D) for the coin toss" in the BML lesson, let's get rid of the animation and replace the code by code that generates four small plots showing the evolution of the posterior, e.g. after 0, 1, 5, and 50 tosses.
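A hypothetical sketch of the four-panel version (the toss data are simulated here; in the lesson you would reuse the existing toss sequence, and the Beta(1,1) prior is an assumption):

```julia
using Distributions, Plots

tosses = rand(Bernoulli(0.4), 50)   # placeholder data; reuse the lesson's tosses
μs = 0:0.01:1
panels = map([0, 1, 5, 50]) do n
    a = 1 + sum(tosses[1:n])        # Beta(1,1) prior, conjugate update
    b = 1 + n - sum(tosses[1:n])
    plot(μs, pdf.(Beta(a, b), μs); title="after $n tosses", legend=false)
end
plot(panels...; layout=(2, 2))
```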
Is the Project.toml file up to date? I may not fully understand how Project.toml works (and I'm sorry if I'm talking nonsense here), but I see a dependency on ForneyLab (shouldn't this be removed?) and no dependency on ReactiveMP.
Also, in the Gaussian distributions lesson, I see a dependency on the hCubature package and get an error, but I don't see the hCubature package listed in Project.toml.
In the Discriminative classification lesson, I get an error since the Optim package is not found:
In principle, the notebooks should run for anybody who has installed Julia.
Please update the Project.toml file.
At least, it needs to be updated to version 1.5.2 (there are perhaps other issues).