tik0 / bnt
Automatically exported from code.google.com/p/bnt
Hey, I got the following message. I have already changed the folder's permissions. I'm
running Win7 and MATLAB R2008b.
initializing pmtk3
downloading 39 packages to pmtk3/pmtksupportCopy from
pmtksupport.googlecode.com - this may take a few minutes
downloading GGM-GWishart.............done
downloading GPstuff-2.0..............done
downloading SPAMS-1.02...............done
downloading boostingDemo.............done
downloading bpca.....................done
downloading dpMixWood................done
downloading dpmixturesTeh07..........done
downloading ekfukf1.2................done
downloading export_fig...............done
downloading exportfig................done
downloading fastICA-2.5..............done
downloading fastfit..................done
downloading gaimc1.0-graphAlgo.......done
downloading glmnet-matlab............Warning: Permission denied to create file
"C:\Program
Files\MATLAB\R2008b\toolbox\probabilistic
toolkit\pmtksupportCopy\glmnet-matlab\glmnetMex.mexw64".
> In iofun\private\extractArchive>extractArchiveEntry at 108
In iofun\private\extractArchive at 52
In unzip at 92
In downloadAllSupport at 22
In initPmtk3 at 49
done
downloading gpml-matlab..............done
downloading graphViz4Matlab..........done
downloading l1ls.....................done
downloading lars.....................done
downloading libdai-0.2.6.............done
downloading liblinear-1.51...........done
downloading libsvm-mat-2.9.1.........Warning: Permission denied to create file
"C:\Program
Files\MATLAB\R2008b\toolbox\probabilistic
toolkit\pmtksupportCopy\libsvm-mat-2.9.1\libsvmTrain.mexw64".
> In iofun\private\extractArchive>extractArchiveEntry at 108
In iofun\private\extractArchive at 52
In unzip at 92
In downloadAllSupport at 22
In initPmtk3 at 49
done
downloading lightspeed2.3............done
downloading markSchmidt-9march2011...done
downloading matbugs..................done
downloading matlabRlink..............done
downloading maxBranching.............done
downloading mcmcdiag.................done
downloading mplp-1.0.................done
downloading netlab3.3................done
downloading onlineEM.................done
downloading pfColorTracker...........done
downloading pmtkSupportRoot.m........failed to download
downloading randraw..................done
downloading rbpfSlam.................done
downloading readme.txt...............failed to download
downloading rjmcmcRbf................done
downloading sparseBayes2.0...........done
downloading svmLightWindows..........done
downloading vblinlogreg..............done
??? Undefined function or variable 'pmtkSupportRoot'.
Error in ==> installLightspeedPMTK at 19
directory = fullfile(pmtkSupportRoot, getConfigValue('PMTKlightSpeedDir'));
Error in ==> initPmtk3 at 178
installLightspeedPMTK();
Regards
Original issue reported on code.google.com by [email protected]
on 13 May 2011 at 10:10
What steps will reproduce the problem?
Here is the code:
%% Construct DBN
O = 32; % num observable symbols
ss = 6; % slice size
intra = zeros(ss);
% One observation variable: node 1
intra(2,1) = 1;
intra(3,1) = 1;
intra(4,1) = 1;
intra(5,1) = 1;
intra(6,1) = 1;
inter = zeros(ss);
inter(2, [2 3 4 5]) = 1;
inter(3, [2 3 4 6]) = 1;
inter(4, [2 3 4 5 6]) = 1;
inter(5, [2 4 5 6]) = 1;
inter(6, [3 4 5 6]) = 1;
onodes = 1; % observed
dnodes = 1:ss; % discrete
ns = [O 2*ones(1,ss-1)]; % binary nodes except observed node
eclass1 = [1 2 2 2 2 2];
eclass2 = [1 3:7];
eclass = [eclass1 eclass2];
bnet = mk_dbn(intra, inter, ns, 'discrete', dnodes, 'observed', onodes, ...
'eclass1', eclass1, 'eclass2', eclass2);
%% Set Prior
% Use Uniform Prior :
for e=1:max(eclass)
bnet.CPD{e} = tabular_CPD(bnet, e, 'CPT', 'unif');
end
%% DATA : sequences of 100 observations
k = 300; % Number of samples
T = 100; % Length of samples
data = randi(32,k,T);
%% Learn DBN
engine = smoother_engine(jtree_2TBN_inf_engine(bnet));
ncases = k;%number of examples
cases = cell(1,ncases);
for i=1:ncases
% ev = sample_dbn(bnet, T);
cases{i} = cell(ss,T);
cases{i}(onodes,:) = num2cell(data(onodes,1:T), 1);
end
[bnetH2, LLtrace] = learn_params_dbn_em(engine, cases, 'max_iter', 4);
What is the expected output? What do you see instead?
Here is the error :
??? Error using ==> reshape
To RESHAPE the number of elements must not change.
Error in ==> myreshape at 10
T = reshape(T, sizes(:)');
Error in ==> dpot.dpot at 21
pot.T = myreshape(T, sizes);
Error in ==> discrete_CPD.convert_to_pot at 28
pot = dpot(domain, ns(domain), T);
Error in ==> jtree_2TBN_inf_engine.fwd at 15
CPDpot{n} = convert_to_pot(bnet.CPD{e}, engine.pot_type, fam(:), ev2);
Error in ==> smoother_engine.enter_evidence at 14
[f{t}, ll(t)] = fwd(engine.tbn_engine, f{t-1}, ev(:,t), t);
Error in ==> learn_params_dbn_em>EM_step at 131
[engine, ll] = enter_evidence(engine, evidence);
Error in ==> learn_params_dbn_em at 82
[engine, loglik, logpost] = EM_step(engine, evidence, temperature);
Error in ==> testDBN1 at 63
[bnetH2, LLtrace] = learn_params_dbn_em(engine, cases, 'max_iter', 4);
What version of the product are you using? On what operating system?
I'm using Matlab 2010a on a win 7 x64 OS.
Please provide any additional information below.
The aim is to make a classification over some data that looks like the data I
generated. So I'd like to learn 2 DBNs over two labeled sets of data. But I'm
not even capable of learning one DBN :P
I've been searching for two days for what I have done wrong... If you can
tell me I would be very pleased.
Thank you.
PS: sorry for the mistakes in English ... it's not my mother tongue :)
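One thing that looks off (an assumption on my part, not a confirmed diagnosis): since onodes = 1 and data is k-by-T, cases{i}(onodes,:) always copies row 1 of data, so every case sees the same sequence and the loop index i is never used. A conventional construction would be:

```matlab
% Sketch: one training case per row of data (assumes data is k-by-T,
% with symbol values in 1..O, and ss, T, ncases, onodes as defined above)
cases = cell(1, ncases);
for i = 1:ncases
  cases{i} = cell(ss, T);                      % all nodes hidden by default
  cases{i}(onodes, :) = num2cell(data(i, :));  % observe sequence i
end
```

This may or may not be the cause of the reshape error, but it is worth ruling out first.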
Original issue reported on code.google.com by [email protected]
on 5 May 2011 at 6:45
Attachments:
What steps will reproduce the problem?
I looked at the dirichlet_sample script file under FullBNT-1.0.7\bnt\KPMstats; it is
written like No.2. I wonder whether the code in No.2 is wrong and should be changed
to the code in No.1, or whether both cases are correct.
Could you please clarify this.
No.1)
alpha = [1/3 1/3 1/3]; % gamma distribution shape parameter (vector)
scale = [1 1 1]; % gamma distribution scale parameter
CPT = zeros(N, k); % conditional prob. table: 4 x 3 -> P(F(3) | H(2),I(2))
for j=1:N
CPT(j,:) = gamrnd(alpha, scale, 1, k); % generate N x 1 Gamma random numbers with shape(=alpha)/scale parameters
end
No.2)
alpha = [1/3 1/3 1/3]; % gamma distribution shape parameter (vector)
scale = [1 1 1]; % gamma distribution scale parameter
CPT = zeros(N, k); % conditional prob. table: 4 x 3 -> P(F(3) | H(2),I(2))
for j=1:k
CPT(:,j) = gamrnd(alpha(j), scale(j), N, 1); % generate N x 1 Gamma random numbers with shape(=alpha)/scale parameters
end
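For what it's worth, the two loops should be statistically equivalent: both draw independent Gamma(alpha(j), scale(j)) variates, No.1 one row at a time using gamrnd's vector parameters, No.2 one column at a time. To turn the draws into Dirichlet samples, each row is additionally normalized. A sketch reusing N, k, alpha, and scale from the snippets above:

```matlab
% Sketch: a Dirichlet(alpha) draw = independent Gamma(alpha(j), scale(j))
% draws, normalized to sum to 1 (N, k, alpha, scale assumed defined as above)
G = gamrnd(repmat(alpha, N, 1), repmat(scale, N, 1)); % N-by-k Gamma draws
CPT = G ./ repmat(sum(G, 2), 1, k);                   % each row sums to 1
```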
What is the expected output? What do you see instead?
It seems that the outputs of both code snippets above look correct.
What version of the product are you using? On what operating system?
FullBNT-1.0.7, Matlab2014b, Windows
Please provide any additional information below.
Original issue reported on code.google.com by [email protected]
on 11 Jan 2015 at 1:09
What steps will reproduce the problem?
1. Run the basic examples of Bayes net from
http://bnt.googlecode.com/svn/trunk/docs/usage.html
2.
3.
What is the expected output? What do you see instead?
mk_bnet should work without error.
What version of the product are you using? On what operating system?
1.0.7
Please provide any additional information below.
bnet=mk_bnet(dag, node_sizes,'discrete',[2 2 2 2]);
Undefined function 'mysetdiff' for input arguments of type 'double'.
Then I changed mysetdiff to builtin setdiff()
bnet = mk_bnet(dag, node_sizes, 'discrete', discrete_nodes);
Undefined function 'parents' for input arguments of type 'double'
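Undefined 'mysetdiff' and 'parents' usually mean that BNT's utility subdirectories are not on the MATLAB path, rather than a bug in mk_bnet; replacing mysetdiff with the builtin only pushes the failure to the next missing helper. A sketch (the install location below is a made-up example):

```matlab
% Sketch: put every BNT subdirectory on the path (the path is hypothetical)
addpath(genpath('C:\toolboxes\FullBNT-1.0.7'))
which mysetdiff   % should now resolve inside the BNT folder tree
```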
Original issue reported on code.google.com by [email protected]
on 12 Mar 2013 at 12:20
In my project I need to learn both the structure and the parameters.
After I learned the parameters, I found that some rows contain just zeros. What do
these zeros mean?
I use 1.0.7 with Windows 8.
I have submitted the structure, the data, and the CPT.
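A common explanation (an assumption here, since the data is not shown): with maximum-likelihood counting, a parent configuration that never occurs in the training data leaves its CPT row at zero. If that is the cause, giving each CPT a Dirichlet prior before learning smooths the counts. A sketch, with option names taken from the tabular_CPD source and worth double-checking against your BNT version:

```matlab
% Sketch: uniform Dirichlet prior on every tabular CPD so unseen parent
% configurations do not produce all-zero rows (N = number of nodes, assumed)
for i = 1:N
  bnet.CPD{i} = tabular_CPD(bnet, i, 'prior_type', 'dirichlet', ...
                            'dirichlet_weight', 1);
end
bnet = bayes_update_params(bnet, data);   % MAP update from complete data
```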
Original issue reported on code.google.com by [email protected]
on 20 Nov 2014 at 9:59
Attachments:
1. remove redundant assignment to hidden_bitv
2. replace undefined variable 'evidence' with 'ev' (is that right?)
Original issue reported on code.google.com by [email protected]
on 25 May 2011 at 4:32
Attachments:
I am trying to do inference on a HMM model (coupled) that has loops, and I want
to use the loopy belief propagation method.
I noticed that there are two functions:
pearl_unrolled_dbn_inf_engine
pearl_dbn_inf_engine
It seems that the latter one does not work yet. Is there any difference between
the two? Where can I find more information?
Your response is greatly appreciated.
I am using the FullBNT 1.0.7 version
Original issue reported on code.google.com by [email protected]
on 14 Nov 2011 at 11:53
How can I use the EM algorithm? Can you give me an example with a random data
set? For example, data = randint(4,100).
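A minimal sketch of EM on random data (note that randint comes from the old Communications Toolbox; randi is the stock replacement). The calls below are standard BNT, but the tiny network itself is invented purely for illustration:

```matlab
% Sketch: EM for a 2-node net A -> B, both binary, node B sometimes missing
N = 2; dag = zeros(N); dag(1,2) = 1;
bnet = mk_bnet(dag, [2 2]);
bnet.CPD{1} = tabular_CPD(bnet, 1);     % random initial CPTs
bnet.CPD{2} = tabular_CPD(bnet, 2);
ncases = 100;
cases = cell(N, ncases);                % one column per training case
for m = 1:ncases
  cases{1,m} = randi(2);                % observed state in {1,2}
  if rand < 0.8
    cases{2,m} = randi(2);              % node 2 observed 80% of the time
  end                                   % otherwise left [] = hidden
end
engine = jtree_inf_engine(bnet);
[bnet2, LLtrace] = learn_params_em(engine, cases, 10);  % up to 10 iterations
```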
Original issue reported on code.google.com by [email protected]
on 4 May 2013 at 9:19
Hi,
Email address: [email protected]
Please help me! I wrote code for inference, but it gives the error below:
??? Error using ==> subsindex Function 'subsindex' is not defined for values of
class 'cell'.
Error in ==> discrete_CPD.convert_to_table at 14 T = CPT(index{:});
Error in ==> discrete_CPD.convert_to_pot at 20
T = convert_to_table(CPD, domain, evidence);
Error in ==> jtree_inf_engine.enter_evidence at 57
pot{n} = convert_to_pot(bnet.CPD{e}, pot_type, fam(:), evidence);
Error in ==> Rafe_inference at 116
[engine,loglik] = enter_evidence(engine,evidence);
My code is:
clear all
clc
A = xlsread('E:\DATA MINING\final_cut.xlsx');
N=6;
dag=zeros(N,N);
AS=1;AM=2;CC=3;SC=4;VT=5;DA=6;
dag(3:6,AS)=1;dag(2,3)=1;dag(3,4)=1;
discrete_nodes=1:N;
node_sizes = [3 9 9 21 10];
onodes=2:6;
bnet=mk_bnet(dag,node_sizes,'observed',onodes);
draw_graph(bnet.dag);
bnet.CPD{AS}=tabular_CPD(bnet,AS);
bnet.CPD{AM}=tabular_CPD(bnet,AM);
bnet.CPD{CC}=tabular_CPD(bnet,CC);
bnet.CPD{SC}=tabular_CPD(bnet,SC);
bnet.CPD{VT}=tabular_CPD(bnet,VT);
bnet.CPD{DA}=tabular_CPD(bnet,DA);
TrainingSamples = cell(N,size(A,1));
for i = 1 : size(A,1)
TrainingSamples(1,i)={A(i,1)'};
TrainingSamples(2,i)={A(i,2)'};
TrainingSamples(3,i)={A(i,3)'};
TrainingSamples(4,i)={A(i,4)'};
TrainingSamples(5,i)={A(i,5)'};
TrainingSamples(6,i)={A(i,6)'};
end
bnet=learn_params(bnet,TrainingSamples);
engine = jtree_inf_engine(bnet);
evidence = cell(1,N);
evidence{AM} = {A(i,2)'};
evidence{CC} = {A(i,3)'};
evidence{SC} = {A(i,4)'};
evidence{VT} = {A(i,5)'};
evidence{DA} = {A(i,6)'};
[engine,loglik] = enter_evidence(engine,evidence);
marg = marginal_nodes(engine, AS);
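Judging from the "subsindex ... class 'cell'" error, a likely cause (my assumption) is that the evidence cells hold nested cell arrays: evidence{AM} = {A(i,2)'} stores a cell inside a cell, while enter_evidence expects each cell to hold a plain state index. A sketch of the corrected tail of the script:

```matlab
% Sketch: evidence cells must contain plain numbers, not nested cells
% (A, N, and the node indices AS, AM, CC, SC, VT, DA as defined above)
evidence = cell(1, N);
evidence{AM} = A(i,2);    % a scalar state index, not {A(i,2)'}
evidence{CC} = A(i,3);
evidence{SC} = A(i,4);
evidence{VT} = A(i,5);
evidence{DA} = A(i,6);
[engine, loglik] = enter_evidence(engine, evidence);
marg = marginal_nodes(engine, AS);
```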
Original issue reported on code.google.com by [email protected]
on 19 Sep 2011 at 4:54
Attachments:
I started to work on DBNs recently, after having worked on HMMs before.
I would like to create a speech recognition system with a DBN using BNT.
I understand that each frame will be modeled by a set of hidden and observed
nodes.
The observed nodes' size will be the size of the acoustic frame,
but I can't understand how to fix the hidden nodes' size!
I am thinking about using the number of phonemes, but I am not sure.
Can anyone help me? Thanks in advance.
Original issue reported on code.google.com by [email protected]
on 20 Oct 2012 at 9:12
What steps will reproduce the problem?
Running the following code will reproduce the problem:
intra = [0 1; 0 0];
inter = [1 0; 0 0];
num_nodes = 2;
num_states = [2 2];% num of states
dnodes = [1 2]; % indices of discrete nodes
onodes = 2;% indices of observed nodes
eclass1 = [1 2];
eclass2 = [3 2];
N = max([eclass1 eclass2]);
CPT = cell(1,N);
bnet = mk_dbn(intra, inter, num_states, 'discrete', dnodes, ...
'observed', onodes, 'eclass1', eclass1, 'eclass2', eclass2);
bnet.CPD{1} = tabular_CPD(bnet,1);
bnet.CPD{2} = tabular_CPD(bnet,2);
bnet.CPD{3} = tabular_CPD(bnet,3);
engine = smoother_engine(jtree_2TBN_inf_engine(bnet));
ss = 2;%slice size(ss)
ncases = 10;%number of examples
T=10;
max_iter=2;%iterations for EM
cases = cell(1, ncases);
for i=1:ncases
ev = sample_dbn(bnet, T);
cases{i} = cell(ss,T);
cases{i}(onodes,:) = ev(onodes, :);
cases{i}{2,3} = [];
cases{i}{2,4} = [];
end
[bnet2 LLTrace]= learn_params_dbn_em(engine, cases);
for i=1:N
s=struct(bnet2.CPD{i});
CPT{i}=s.CPT;
end
celldisp(CPT);
What is the expected output? What do you see instead?
I expect to see CPTs printed to the screen for the prior, transition, and
emission probabilities. Instead, I get the following error:
Error using .*
Matrix dimensions must agree.
Error in mult_by_table (line 7)
bigT(:) = bigT(:) .* Ts(:); % must have bigT(:) on LHS to preserve
shape
Error in dpot/multiply_by_pot (line 11)
Tbig.T = mult_by_table(Tbig.T, Tbig.domain, Tbig.sizes,
Tsmall.T, Tsmall.domain, Tsmall.sizes);
Error in jtree_inf_engine/init_pot (line 17)
clpot{c} = multiply_by_pot(clpot{c}, pots{i});
Error in jtree_2TBN_inf_engine/fwd (line 36)
[f.clpot, f.seppot] = init_pot(engine.jtree_engine, clqs, pots,
engine.pot_type, engine.observed);
Error in smoother_engine/enter_evidence (line 14)
[f{t}, ll(t)] = fwd(engine.tbn_engine, f{t-1}, ev(:,t), t);
Error in learn_params_dbn_em>EM_step (line 131)
[engine, ll] = enter_evidence(engine, evidence);
Error in learn_params_dbn_em (line 82)
[engine, loglik, logpost] = EM_step(engine, evidence,
temperature);
Error in mem_no_aux (line 58)
bnet2 = learn_params_dbn_em( engine, cases);
What version of the product are you using? On what operating system?
I am using BNT Full Version 1.07 on 64 bit Windows 7.
Please provide any additional information below.
What has been provided is one of the sample HMM learning codes - it has been
slightly augmented so that there are missing values in the samples for the
observed nodes. Substituting a sample with the missing value symbol, [], causes
element wise matrix multiplication errors in mult_by_table preventing the
program from continuing.
Original issue reported on code.google.com by [email protected]
on 25 Jul 2012 at 1:02
I got the idea of using the mk_bnet function to construct the Bayesian network
with conditional probabilities, but I cannot figure out how to assign values to
those nodes.
The nodes in our problem have both probabilities and values; for example, the
rain is 30% likely to be large, which is 100 mm/day.
And the deterministic nodes, which are calculated from their parent nodes'
values: can they be constructed with deterministic_CPD?
I know this may not be an issue for an expert, but please help. Thank you!!
Original issue reported on code.google.com by [email protected]
on 11 Jul 2012 at 6:54
At line 47: min_fill(j) = l^2 - sum(M(:));
Should we also subtract the self edges, like the following?
min_fill(j) = l^2 - l - sum(M(:));
What version of the product are you using?
Version 1.0.7
Original issue reported on code.google.com by [email protected]
on 2 Jul 2013 at 8:00
What steps will reproduce the problem?
1. Running the code will reproduce the problem
What is the expected output? What do you see instead?
I am confused by the marginal_nodes function. I am not sure what the nodes and
the t arguments are supposed to be. I have a simple Gaussian HMM. I learned the
parameters, and for each Gaussian observation I want to predict the state [3
hidden states].
So I produced the evidence, and then I want to estimate the marginal values,
which unfortunately cannot be done because of an error when calling the
marginal_nodes function. Please let me know what values should be given to the
arguments nodes and t. [I did not understand the documentation's explanation.]
What version of the product are you using? On what operating system?
mac OS and latest version, which I downloaded from bayesnet website just today.
Please provide any additional information below.
Mine is a simple univariate gaussian output and discrete hidden states [3
states]. My problem is with the marginal_nodes function.
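For a DBN smoother engine, nodes indexes the nodes within one slice and t is the time slice. A sketch for a 2-node HMM where node 1 is the hidden state (the node numbering is an assumption based on the usual HMM layout):

```matlab
% Sketch: most probable hidden state at each slice after enter_evidence
for t = 1:T                          % T = length of the observed sequence
  m = marginal_nodes(engine, 1, t);  % marginal of hidden node 1 at slice t
  [p, state] = max(m.T);             % m.T holds the 3 state probabilities
end
```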
Original issue reported on code.google.com by [email protected]
on 11 Jul 2011 at 11:33
Attachments:
How can I compute, for every case in the database, theta_ijk, i.e.
p(xi=k | pa(xi)=j, theta, Dl), where Dl is a case in the database?
Thanking you in advance.
Original issue reported on code.google.com by [email protected]
on 7 Jun 2010 at 10:56
What steps will reproduce the problem?
1. I am trying to run the example which is on this link
"http://bnt.googlecode.com/svn/trunk/docs/usage.html" but I got some
errors. I attached the file please run it and tell me if I did any mistake.
2. I want to Marginalize the "S" node w.r.t "R".
3. And I am getting these errors in MATLAB. I am pasting below:
??? Index exceeds matrix dimensions.
Error in ==> discrete_CPD.convert_to_table at 14
T = CPT(index{:});
Error in ==> discrete_CPD.convert_to_pot at 20
T = convert_to_table(CPD, domain, evidence);
Error in ==> jtree_inf_engine.enter_evidence at 57
pot{n} = convert_to_pot(bnet.CPD{e}, pot_type, fam(:), evidence);
Error in ==> BN_cct1 at 15
[engine, loglik] = enter_evidence(engine, evidence);
What is the expected output? What do you see instead?
Ans: Expected output should be a value of marginal probability.
What version of the product are you using? On what operating system?
MATLAB 7.6.0 (R2008a) at VISTA HOME PREMIUM.
Please provide any additional information below.
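"Index exceeds matrix dimensions" inside convert_to_table often means an evidence value outside the range 1..node_sizes(i), e.g. coding a binary node as 0/1 instead of 1/2 (a guess, since the attachment is not shown here). For comparison, the sprinkler example exactly as given in usage.html:

```matlab
% The classic sprinkler network from the BNT tutorial (CPTs from usage.html)
N = 4;
dag = zeros(N,N);
C = 1; S = 2; R = 3; W = 4;
dag(C,[R S]) = 1; dag(R,W) = 1; dag(S,W) = 1;
node_sizes = 2*ones(1,N);           % all nodes binary; states are 1 and 2
bnet = mk_bnet(dag, node_sizes);
bnet.CPD{C} = tabular_CPD(bnet, C, [0.5 0.5]);
bnet.CPD{R} = tabular_CPD(bnet, R, [0.8 0.2 0.2 0.8]);
bnet.CPD{S} = tabular_CPD(bnet, S, [0.5 0.9 0.5 0.1]);
bnet.CPD{W} = tabular_CPD(bnet, W, [1 0.1 0.1 0.01 0 0.9 0.9 0.99]);
engine = jtree_inf_engine(bnet);
evidence = cell(1,N);
evidence{W} = 2;                    % evidence must lie in 1..node_sizes(i)
[engine, loglik] = enter_evidence(engine, evidence);
m = marginal_nodes(engine, S);      % P(S | W=2), marginal of S given R's peer
```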
Original issue reported on code.google.com by [email protected]
on 9 May 2010 at 1:31
Attachments:
I'm curious whether it is possible to create a replicating binary tree using
BNT.
To be exact: V(n+1) = 2*V(n) or V(n+1) = 0.5*V(n),
where the probability of each event is (say) 50%,
with an arbitrary number of steps n (so the complete state space is unknown).
What version of the product are you using? On what operating system?
the most recent version, on matlab 2011b, windows 7
Thanks in advance.
Original issue reported on code.google.com by [email protected]
on 9 Dec 2011 at 2:12
The general BP inference (belprop_inf_engine) engine for bayesian networks does
not compute marginals properly on even the simplest networks. The Pearl BP
inference engine works flawlessly, but the general BP engine for bayesian
networks does not. The general BP inference engine for factor graphs works
properly also.
Thanks,
Jason
Original issue reported on code.google.com by [email protected]
on 31 Jul 2010 at 10:15
What steps will reproduce the problem?
1. I want to learn the pdag from alarm network data with 200,000 records using
learn_struct_pdag_pc() method.
2.the call is like this :
[pdag, G] = learn_struct_pdag_pc('cond_indep_fisher_z',n , n-1
,CovMatrix,nSamples,alpha)
where
[CovMatrix, nSamples, varfields] = CovMat(data_BNT_path,n)
I got this error :
??? Error using ==> erfc
Input must be real and full.
Error in ==> cond_indep_fisher_z>normcdf at 73
p(k) = 0.5 * erfc( - (x(k) - mu(k)) ./ (sigma(k) * sqrt(2)));
I also got this error for some other networks.
Original issue reported on code.google.com by [email protected]
on 12 Jan 2012 at 10:53
What steps will reproduce the problem?
1. Just copied bnt into my MATLAB folder (in my path).
2. Ran test_bnt
3. Assertion trips.
What is the expected output? What do you see instead?
The test_bnt script executing successfully. What I see is:
??? Error using ==> assert at 9
assertion violated:
Error in ==> mk_rooted_tree at 12
assert(isequal(post, post2));
Error in ==> jtree_inf_engine.jtree_inf_engine at 108
[engine.jtree, engine.preorder, engine.postorder] =
mk_rooted_tree(engine.jtree, engine.root_clq);
Error in ==> cg1 at 17
engines{end+1} = jtree_inf_engine(bnet);
Error in ==> test_BNT at 5
cg1
What version of the product are you using? On what operating system?
I am using BNT 1.0.7 on Mac OS X 10.6 and MATLAB 2010b.
Please provide any additional information below.
Original issue reported on code.google.com by [email protected]
on 14 Jul 2011 at 8:11
When I computed the joint probability using the jtree_inf_engine, I got the
following message:
??? Error using ==>
jtree_inf_engine.marginal_nodes at 13
no clique contains 1 4
I want to know what this means? Thanks.
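The message means that no single clique of the junction tree contains both node 1 and node 4, and a clique-based engine can only read off joint marginals for node sets inside one clique. A hedged workaround (assuming bnet and evidence are defined as in your script):

```matlab
% Sketch: for a joint marginal over nodes that do not share a clique, a
% variable-elimination engine works (slower, but any node set is allowed)
engine2 = var_elim_inf_engine(bnet);
engine2 = enter_evidence(engine2, evidence);
m = marginal_nodes(engine2, [1 4]);   % joint P(X1, X4 | evidence)
```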
Original issue reported on code.google.com by [email protected]
on 18 Jun 2014 at 12:06
Hello!
I'm doing research in natural language processing; recently I found DBNs as a
tool to find relations between word features. I began to study DBNs from
Murphy's 2002 thesis. Then I found that I have to study more applied
probability, so I am currently working on that.
I wanted to ask:
if I have fully observed data and I want to find the relations between the
variables, how can I learn the CPTs? Is it done using the EM algorithm?
How much do I need to study to get all the concepts of Bayes nets?
thanks
Original issue reported on code.google.com by [email protected]
on 24 Dec 2013 at 7:08
jtree_2TBN_inf_engine can't handle observed values for a node that you said was
hidden. This is a problem when training the DBN, as often you /do/ know the
hidden node value and you'd like the algorithm to account for this.
In message #1906 on the mailing list, "Bob" submitted a patch. Please review it
and apply it to the codebase. Patches attached.
See: http://tech.dir.groups.yahoo.com/group/BayesNetToolbox/message/1906
Original issue reported on code.google.com by [email protected]
on 9 Dec 2010 at 10:17
Attachments:
What steps will reproduce the problem?
1. I am using BNT to find marginal probability.
2. But when I use the junction tree function, it only shows "1-by-1" when I
display the engine while using jtree inference.
3.
What is the expected output? What do you see instead?
1-by-1
What version of the product are you using? On what operating system?
Matlab 7.1 Ver. and using BNT 1.0.7.
Please provide any additional information below.
I want to see the junction tree graph using BNT, like we can draw in Hugin and
other probabilistic software; if not, then can I at least see the combined
nodes in the form of cliques, so that I can analyze which nodes in my
simulation are giving me higher or lower values?
Thank You so much in advance.
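BNT has no Hugin-style junction tree viewer, but the engine's internals can be inspected by converting the object to a plain struct. A sketch, with field names assumed from the BNT sources and worth double-checking for your version:

```matlab
% Sketch: peek at the cliques and the tree of a jtree engine
% (field names 'cliques' and 'jtree' are assumptions from the BNT sources)
s = struct(engine);        % engine = jtree_inf_engine(bnet)
s.cliques{:}               % the node sets that make up each clique
draw_graph(s.jtree)        % the junction tree itself, as an adjacency matrix
```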
Original issue reported on code.google.com by [email protected]
on 11 Jan 2011 at 10:55
I wanted to evaluate the marginal probabilities in a network of 5 nodes.
However, whenever I try to run the program it gives an error about the
disagreement of the inner matrices while multiplication.
The inputs were according to the instructions in the tutorial.
Attached is the file with the descriptions.
Original issue reported on code.google.com by [email protected]
on 19 Jul 2011 at 3:55
Attachments:
I tried to add the files to the MATLAB toolbox folder and run installC_BNT.m,
but it does not work.
The following is the first line of installC_BNT.m:
BNT_HOME = '/home/ai2/murphyk/matlab/FullBNT';
Can anyone tell me what this address is and how to edit it?
I also googled how to install a toolbox on a Mac, but those approaches did not
work, so if you could give me detailed instructions for installing the toolbox
on my Mac I would really appreciate it. By the way, my Mac and MATLAB are both
the newest versions. Thank you!
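BNT_HOME is simply the folder where you unpacked FullBNT, so the line needs to point at your own copy. A sketch (the path is a made-up example for a Mac):

```matlab
% Sketch: first line of installC_BNT.m, edited to a hypothetical local path
BNT_HOME = '/Users/yourname/Documents/MATLAB/FullBNT-1.0.7';
```

Note that on recent MATLAB versions the toolbox generally works without compiling the C code at all; running addpath(genpath(...)) on the unpacked folder is usually sufficient.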
Original issue reported on code.google.com by [email protected]
on 15 Nov 2014 at 12:36
What steps will reproduce the problem?
1. I downloaded and extracted FullBNT 1.0.7.
2. Ran addpath(genpath(...)) as described in the install manual.
3. I cannot find the test_bnt function, and MATLAB also crashes after my
addpath() command.
What is the expected output? What do you see instead?
My MATLAB crashes; even normal functions like ls and pwd do not work.
What version of the product are you using? On what operating system?
My laptop is a MacBook Air (2013) with Mac OS X.
Please provide any additional information below.
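Two small things worth checking (assumptions, since the crash itself may have another cause): genpath needs the unpacked folder as its argument, and the test script's name is case-sensitive:

```matlab
% Sketch: the path below is a made-up example; note test_BNT's capitalization
addpath(genpath('/Users/yourname/FullBNT-1.0.7'))
test_BNT    % the function is test_BNT, not test_btn
```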
Original issue reported on code.google.com by [email protected]
on 12 Dec 2013 at 7:45
Attachments:
What steps will reproduce the problem?
Running mk_named_CPT results in an error because the function 'stringmatch'
is missing.
Using Gibbs sampling inference, there is an error because the function
'compute_posterior' is missing.
What is the expected output? What do you see instead?
What version of the product are you using?1.07 On what operating system? Win XP
(SP3)
Please provide any additional information below.
Replacing the reference to 'stringmatch' with 'strmatch' and including the
'exact' option appears to fix the first problem.
The second problem appears to be solved by running 'installC_BNT' after
changing the directory in the file and making pathnames consistent with
windows, i.e. replace all / with \\. (This contradicts the statement that this
method is no longer needed).
Original issue reported on code.google.com by [email protected]
on 8 Jul 2010 at 8:43
In the demo given
(http://www.media.mit.edu/wearables/mithril/BNT/mixtureBNT.txt), there is only
one sequence for training and one sequence for testing. What if we want to
train with multiple sequences? Should we concatenate the intra-class sequences
into one sequence, or is there another procedure? If the action is non-periodic
(like standing up), then merging into the same sequence will definitely cause
problems.
I appreciate your help on this.
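learn_params_dbn_em takes a cell array with one entry per sequence, so separate sequences need not be concatenated. A sketch reusing names from the mixture demo (bnet, ss, onodes assumed defined there; sample_dbn stands in for real data):

```matlab
% Sketch: one cell per training sequence; sequences may differ in length
nseq = 5;
cases = cell(1, nseq);
for i = 1:nseq
  T = 50;                            % length of sequence i
  ev = sample_dbn(bnet, T);          % stand-in for a real observed sequence
  cases{i} = cell(ss, T);
  cases{i}(onodes, :) = ev(onodes, :);
end
engine = smoother_engine(jtree_2TBN_inf_engine(bnet));
[bnet2, LL] = learn_params_dbn_em(engine, cases, 'max_iter', 10);
```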
Original issue reported on code.google.com by [email protected]
on 8 Jan 2012 at 7:51
I have a BN with boolean and continuous (Gaussian) nodes. I am using the
function learn_params_em to learn the parameters.
I would like to know how I could print/see the values in the conditional
probability tables (CPTs) and conditional density tables (CDTs).
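The CPD objects hide their fields, but converting them to plain structs exposes the learned parameters. A sketch, with field names taken from the BNT sources and worth double-checking for your version:

```matlab
% Sketch: peek inside the learned CPDs (N = number of nodes, assumed defined)
for i = 1:N
  s = struct(bnet.CPD{i});          % convert the object to a plain struct
  if isfield(s, 'CPT')
    disp(s.CPT);                    % tabular node: the CPT
  else
    disp(s.mean); disp(s.cov);      % Gaussian node: conditional means/covs
  end
end
```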
Original issue reported on code.google.com by [email protected]
on 5 Jun 2012 at 11:12
Hi,
I have recently started learning the BayesNet toolbox and I faced some problems
with running test experiments. I hope someone could help me.
> What steps will reproduce the problem?
I run the below script to test the inference algorithm on DBNs using the sample
codes from the online tutorial for a simple HMM:
------------------------------------------------------------------
clc
clear
intra = zeros(2);
intra(1,2) = 1; % node 1 in slice t connects to node 2 in slice t
inter = zeros(2);
inter(1,1) = 1; % node 1 in slice t-1 connects to node 1 in slice t
Q = 2; % num hidden states
O = 2; % num observable symbols
ns = [Q O];
dnodes = 1:2;
onodes = [2];
bnet = mk_dbn(intra, inter, ns, 'discrete', dnodes);
% Create CPDs
for i=1:3
bnet.CPD{i} = tabular_CPD(bnet, i);
end
% Create the engine
engine = smoother_engine(jtree_2TBN_inf_engine(bnet));
% Add evidence
T = 10;
ss = 2;
ev = sample_dbn(bnet, T);
evidence = cell(ss,T);
evidence(onodes,:) = ev(onodes, :); % all cells besides onodes are empty
engine = enter_evidence(engine, evidence);
% Compute the marginal
t = 5;
nodes = [1 2];
m = marginal_nodes(engine, nodes, t);
------------------------------------------------------------------
> What is the expected output? What do you see instead?
I get the following error when I run the script:
------------------------------------------------------------------
??? Error using ==> times
Matrix dimensions must agree.
Error in ==> mult_by_table at 7
bigT(:) = bigT(:) .* Ts(:); % must have bigT(:) on LHS to preserve shape
Error in ==> dpot.multiply_by_pot at 11
Tbig.T = mult_by_table(Tbig.T, Tbig.domain, Tbig.sizes, Tsmall.T, Tsmall.domain, Tsmall.sizes);
Error in ==> jtree_inf_engine.init_pot at 17
clpot{c} = multiply_by_pot(clpot{c}, pots{i});
Error in ==> jtree_2TBN_inf_engine.fwd1 at 20
[f.clpot, f.seppot] = init_pot(engine.jtree_engine1, CPDclqs, CPDpot,
engine.pot_type,
engine.observed1);
Error in ==> smoother_engine.enter_evidence at 12
[f{1}, ll(1)] = fwd1(engine.tbn_engine, ev(:,1), 1);
------------------------------------------------------------------
> What version of the product are you using? On what operating system?
I have installed the FullBNT-1.0.7 version.
> Please provide any additional information below.
I am using Matlab 7.7.0
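A plausible cause (an assumption, since I have not run the script): mk_dbn is called without the 'observed' argument, so the 2TBN engine is built as if all nodes were hidden, while evidence is then supplied for node 2, and the resulting potential sizes disagree in mult_by_table. The hedged one-line fix:

```matlab
% Sketch: declare the observed nodes when constructing the DBN
bnet = mk_dbn(intra, inter, ns, 'discrete', dnodes, 'observed', onodes);
```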
Original issue reported on code.google.com by [email protected]
on 27 Jun 2011 at 5:04
Hello, Dear Developers,
I am currently working on a mixed Bayes network with both discrete and
continuous variables. In the structure learning step, I am using
"learn_struct_mwst" to learn the structure from my data. For the node_size in this
function, I know if discrete variable A has 3 alternative values, the node_size
for A is 3. However, if variable B is continuous, how can I set the node_size
for it? For example, if B is ranging from -100 to 100, do I need to put a '200'
for size of B and rescale the ranging from [-100, 100] to [0, 200]?
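For continuous (Gaussian) nodes, node_size in BNT is the dimensionality of the variable, not the number of values it can take, so no rescaling of B's range is needed. A sketch for the two-node case described above:

```matlab
% Sketch: discrete A with 3 states, continuous scalar B (dimension 1)
ns = [3 1];
```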
Thank you!!
Original issue reported on code.google.com by [email protected]
on 16 Jan 2015 at 10:29
First, let me thank you for providing and taking care of this great toolkit!
I've been using it for a while and it opened lots of new possibilities for my
research!
What steps will reproduce the problem?
I have attached an .m file which reproduces the error. Note that the file is
based on the Mixture-of-Experts example provided on this website. I changed the
network to have dimension 2 in the node denoted by X. The error occurs in
enter_evidence when only X is assumed to be observed as evidence. If any other
node is observed, the error does not occur. Also, the model can be learned
without error. I added the comment "fails" to the respective line in the script.
What is the expected output? What do you see instead?
Error using +
Matrix dimensions must agree.
Error in cpot/multiply_by_pot (line 11)
bigpot.h(u) = bigpot.h(u) + smallpot.h;
Error in cgpot/multiply_by_pot (line 18)
bigpot.can{i} = multiply_by_pot(bigpot.can{i}, smallpot.can{src});
Error in jtree_inf_engine/init_pot (line 17)
clpot{c} = multiply_by_pot(clpot{c}, pots{i});
Error in jtree_inf_engine/enter_evidence (line 77)
[clpot, seppot] = init_pot(engine, clqs, pot, pot_type, onodes);
Error in TEST (line 36)
engine = enter_evidence(engine, {[-0.31; 0.1]; []; []});% fails
What version of the product are you using? On what operating system?
The latest comment in the changelog is "7 May 2010 wsun", so I think it's the
latest version of BNT; Matlab 2011b, Windows 7.
Please provide any additional information below.
I am not very educated w.r.t. Bayesian networks, but I still have a suspicion
about what the "bug" could be. In multiply_by_pot, smallpot and bigpot are
combined, where in bigpot the observed continuous node's size has been set to 0
(see also the BNT help on this page), while smallpot (the same observed
continuous node) has size 2.
Original issue reported on code.google.com by [email protected]
on 21 Mar 2012 at 7:59
Attachments:
What steps will reproduce the problem?
1. install bnt on Matlab 2012a with Parallel Computing Toolbox
2. In Matlab, go to Parallel menu, "Select Cluster Profile"
3.
What is the expected output? What do you see instead?
Normally, you will see the profile "local." Instead, you will get a Java
exception.
What version of the product are you using? On what operating system?
Matlab 2012a on Win7 Enterprise 64 bit
Please provide any additional information below.
Workaround: after installing bnt, move bnt files in the matlab path to the end
of the path listing. test_BNT should still work.
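The workaround above can be scripted. A sketch, assuming BNT was unpacked to a folder named `C:\FullBNT-1.0.4` (a hypothetical location; adjust to your install):

```matlab
% Append BNT to the END of the MATLAB path, so that same-named toolbox
% functions (e.g. those the Parallel Computing Toolbox relies on) shadow
% BNT's copies rather than the other way around.
BNT_HOME = 'C:\FullBNT-1.0.4';       % hypothetical install location
rmpath(genpath(BNT_HOME));           % drop any earlier front-of-path entries
addpath(genpath(BNT_HOME), '-end');  % re-add at the end of the path
savepath;                            % persist the ordering across sessions
```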
Original issue reported on code.google.com by [email protected]
on 22 Oct 2012 at 6:13
The FINITE function is obsolete in Matlab R2009a (version 7.8). Please
change to ISFINITE in ffa.m (line 71) and mfa.m (line 116).
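For reference, the rename is a drop-in change; `isfinite` has the same semantics as the removed `finite`:

```matlab
X = [1 Inf NaN];        % sample data for illustration
% Before (removed in R2009a):  good = finite(X);
good = isfinite(X);     % true where X is neither Inf nor NaN
```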
Original issue reported on code.google.com by [email protected]
on 14 Mar 2010 at 5:32
In order to modify learn_params_em, I would like to see the parameters of
each node at each step.
Thanking you in advance.
Fradj.
Original issue reported on code.google.com by [email protected]
on 6 Oct 2010 at 10:13
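One way to inspect the parameters between EM iterations, without modifying learn_params_em itself, is to run it one iteration at a time and dump each node's table. A sketch, assuming tabular CPDs and a junction-tree engine (`CPD_to_CPT` is BNT's accessor for a CPD's table; check the outputs of `learn_params_em` against your version):

```matlab
% Run EM one iteration per call ('max_iter' = 1) so the parameters can be
% printed after every step.
for step = 1:10
    [bnet, LLtrace] = learn_params_em(engine, cases, 1);
    fprintf('--- EM step %d, loglik %g ---\n', step, LLtrace(end));
    for i = 1:numel(bnet.CPD)
        fprintf('node %d:\n', i);
        disp(CPD_to_CPT(bnet.CPD{i}));   % current table of node i
    end
    engine = jtree_inf_engine(bnet);     % rebuild engine with updated params
end
```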
What steps will reproduce the problem?
1. running log_lik_complete on a bnet with very low likelihoods
What is the expected output? What do you see instead?
Expected:
...
node 4 has very low likelihood
node 5 has very low likelihood
node 8 has very low likelihood
...
I see:
node node node node...
What version of the product are you using? On what operating system?
FullBNT-1.0.4 on Ubuntu linux
Please provide any additional information below.
To fix this, change line 27 in log_lik_complete.m from
if approxeq(exp(ll), 0), fprintf('node %d has very low likelihood\n'); end
to
if approxeq(exp(ll), 0), fprintf('node %d has very low likelihood\n', i);
end
(Without the argument, fprintf stops at the %d conversion, which is why only
"node " is printed each time.)
Original issue reported on code.google.com by [email protected]
on 30 Mar 2010 at 4:48
What steps will reproduce the problem?
1. Run draw_dot(G)
2. Error is returned.
3.
What is the expected output? What do you see instead?
A graph. Instead I see an error.
What version of the product are you using? On what operating system?
neato - graphviz version 2.28.0 (20110507.0327)
Please provide any additional information below.
It looks like the output from neato has changed, so the text compare on
line 95 in dot_to_graph.m no longer works. If I change the line to
[node_pos] = sscanf(line(pos_pos:length(line)), 'pos="%f,%f"')';
(from [node_pos] = sscanf(line(pos_pos:length(line)), ' pos = "%d,%d"')';)
then it works. Also, my distribution did not have the range function, which
is called in lines 114 and 115 of dot_to_graph.m. I guessed that it is
simply max(x)-min(x), and that seems to work.
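If your MATLAB lacks `range` (it ships with the Statistics Toolbox), the guess above can be dropped in as a small shim on the path; a sketch:

```matlab
function r = range(x)
% RANGE  Minimal stand-in for the Statistics Toolbox function, as used
% by dot_to_graph.m (lines 114-115): peak-to-peak value of a vector.
r = max(x) - min(x);
```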
Original issue reported on code.google.com by [email protected]
on 5 Jan 2012 at 9:43
Hey there,
I am trying to understand how to use Dirichlet priors on the parameters of a
CPT. In the documentation I can see how to define a Dirichlet prior, but it
seems to me that it is not possible to specify the Dirichlet pseudo-counts
separately for each entry.
In my case, I have a prior CPT that represents my prior knowledge, but if I
define a Dirichlet prior over my parameters, I can only specify the ESS.
The function bayes_update_params will then recompute the pseudo-counts by
summing the counts present in the data for a particular parent-child
combination with the Dirichlet prior counts, while neglecting the prior CPT.
If P(X=x | Pa1 = p) = 0.9 and I use a BDeu Dirichlet prior with ESS = 10,
I would expect my Dirichlet prior to already have a pseudo-count of 9 for that
particular state; instead I get the same number of pseudo-counts spread
over all the possible states.
The result of this is that my posterior ends up relatively flat and driven by
the data, despite my prior CPT being sharply peaked around some entries.
What am I doing wrong?
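As far as I can tell, the stock 'dirichlet_type' options ('unif', 'BDeu') only spread the ESS uniformly or per BDeu, so a peaked prior has to be encoded as an explicit pseudo-count table. A heavily hedged sketch of the arithmetic (the tabular_CPD option names below are taken from its documented constructor, but installing a non-uniform table may require editing the CPD source; verify against your BNT version):

```matlab
% The pseudo-count table matching a prior CPT with equivalent sample size
% ESS is alpha = ESS .* priorCPT, so P = 0.9 yields a pseudo-count of 9:
ESS = 10;
priorCPT = [0.9 0.5; 0.1 0.5];   % hypothetical P(X | Pa1), columns sum to 1
alpha = ESS .* priorCPT;         % alpha(1,1) is 9, as expected above

% The stock constructor only takes a scalar 'dirichlet_weight', which
% spreads the ESS uniformly -- the flat-prior behaviour reported above:
%   bnet.CPD{X} = tabular_CPD(bnet, X, 'CPT', priorCPT, ...
%       'prior_type', 'dirichlet', 'dirichlet_type', 'unif', ...
%       'dirichlet_weight', ESS);
% Installing the non-uniform alpha appears to require setting the CPD's
% internal Dirichlet table directly (e.g. by editing tabular_CPD.m).
```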
Original issue reported on code.google.com by [email protected]
on 25 Oct 2011 at 1:02