Comments (8)
Hello, may I ask what path you used in the decode Diffusion-LM part?
from diffusion-lm.
improved diffusion/diffusion_models/diff_e2e-tgt_block_rand16_transformer_lr0.0001_0.0_2000_sqrt_Lsimple_h128_s2_d0.1_sd102_xstart_e2e
I am using the same path, but the printed output list is empty. Have you run into the same problem?
The path is "diffusion_models/diff_e2e-tgt_block_rand16_transformer_lr0.0001_0.0_2000_sqrt_Lsimple_h128_s2_d0.1_sd102_xstart_e2e"
There is nothing special about batch_decode: it's a helper function that assembles and runs the right command. If it doesn't work, my recommendation is to run the command directly:
python scripts/text_sample.py --model_path diffusion_models/diff_roc_pad_rand128_transformer_lr0.0001_0.0_2000_sqrt_Lsimple_h128_s2_d0.1_sd101_xstart_e2e/ema_0.9999_100000.pt --batch_size 50 --num_samples 50 --top_p 1.0 --out_dir genout
If you want to figure out why it doesn't work: since you get an empty list back, my guess is that the failure is being swallowed by the try/except in that code path.
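To illustrate that guess, here is a minimal, hypothetical sketch (`load_and_decode` is a stand-in, not an actual function in the repo): a broad `except` around the decoding step silently turns any failure into an empty output list, while logging the exception surfaces the real cause.

```python
import os

def load_and_decode(path):
    # Stand-in for the real decoding step; fails if the path is missing.
    if not os.path.exists(path):
        raise FileNotFoundError(path)
    return f"decoded:{path}"

def decode_samples(paths):
    outputs = []
    for path in paths:
        try:
            outputs.append(load_and_decode(path))
        except Exception:
            # Swallowing the exception: a bad path silently yields
            # an empty list with no error message.
            pass
    return outputs

def decode_samples_verbose(paths):
    outputs = []
    for path in paths:
        try:
            outputs.append(load_and_decode(path))
        except Exception as exc:
            # Log instead of ignoring, so the real failure is visible.
            print(f"decoding {path} failed: {exc!r}")
    return outputs
```

If your own run returns an empty list, temporarily replacing the silent `pass` with a print or re-raise should reveal whether the path is wrong or something else is failing.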
To answer another question, about which model path to use for controllable generation: use the Diffusion-LM model path (the first one). To pass in the control-classifier path, you may need to modify the code directly (sorry for the bad coding practice; I will clean this up in the next version of the code release). For example, you can search for the string "from_pretrained" to narrow down where to look; in the case of the Tree task, it should be line 316.
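As a sketch of that search step (assuming a Python checkout of the repo; `find_from_pretrained` is a hypothetical helper, not part of the codebase), you can scan the source tree for every line that mentions "from_pretrained" and edit the hit that loads the classifier:

```python
from pathlib import Path

def find_from_pretrained(root="."):
    """Scan .py files under `root` for lines containing 'from_pretrained',
    to locate where a model/classifier path is hard-coded."""
    hits = []
    for py in sorted(Path(root).rglob("*.py")):
        text = py.read_text(errors="ignore")
        for lineno, line in enumerate(text.splitlines(), 1):
            if "from_pretrained" in line:
                hits.append((str(py), lineno, line.strip()))
    return hits
```

Each hit is a `(file, line number, line)` tuple, so you can jump straight to the call that needs the control-classifier path swapped in.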
Hello! Thank you very much for sharing. May I ask which file you are using in the path
"diffusion_models/diff_e2e-tgt_block_rand16_transformer_lr0.0001_0.0_2000_sqrt_Lsimple_h128_s2_d0.1_sd102_xstart_e2e"?
There are five files here: log, progress, random_emb.torch, training_args.json, vocab.json.
If I only pass the directory path "diffusion_models/diff_e2e-tgt_block_rand16_transformer_lr0.0001_0.0_2000_sqrt_Lsimple_h128_s2_d0.1_sd102_xstart_e2e", the following error occurs:
IsADirectoryError: [Errno 21] Is a directory: 'diffusion_models/diff_e2e-tgt_block_rand16_transformer_lr0.0001_0.0_2000_sqrt_Lsimple_h128_s2_d0.1_sd102_xstart_e2e'
I would appreciate it if you could take some time to answer my questions. Thanks.
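One way to sidestep the IsADirectoryError is to pass a checkpoint file (such as the `ema_*.pt` file used in the command above) rather than the directory, since the loader expects a file. A hedged sketch (`pick_checkpoint` is a hypothetical helper, not part of the repo); note that if the directory really contains only the five files listed above and no `.pt` file, the checkpoint was never saved and this lookup will fail:

```python
from pathlib import Path

def pick_checkpoint(model_dir):
    """Return a .pt checkpoint file inside `model_dir` to pass as
    --model_path. Prefers the latest EMA checkpoint when present;
    raises if the directory holds no checkpoint at all."""
    model_dir = Path(model_dir)
    ema = sorted(model_dir.glob("ema_*.pt"))
    if ema:
        return str(ema[-1])
    ckpts = sorted(model_dir.glob("*.pt"))
    if not ckpts:
        raise FileNotFoundError(f"no .pt checkpoint found in {model_dir}")
    return str(ckpts[-1])
```

If the lookup raises, the training run most likely stopped before its first checkpoint-saving interval, so the fix is to train longer (or download a released checkpoint) rather than to change the path.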
I had the same problem as above.
I would appreciate it if you could take some time to answer my questions. Thanks.