Originally cloned from SE(3)-Transformers: 3D Roto-Translation Equivariant Attention Networks.
Installation (recommended commands; conda where applicable):
- conda install pytorch -c pytorch
- conda install -c dglteam dgl
- pip install packaging
- conda install -c conda-forge pynfft
- pip install lie_learn
- pip install wandb
- pip install -e .
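
After installing, a quick sanity check can confirm the dependencies are importable. This is a minimal sketch; the import names are assumed to match the packages above (e.g. the pynfft package importing as `pynfft`):

```python
import importlib.util

# Top-level import names assumed to correspond to the packages installed above.
required = ["torch", "dgl", "packaging", "pynfft", "lie_learn", "wandb"]

# find_spec returns None for any package that is not importable.
missing = [name for name in required if importlib.util.find_spec(name) is None]

if missing:
    print("Missing dependencies:", ", ".join(missing))
else:
    print("All dependencies found.")
```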
Training example (adjust the script path to your checkout):

python /home/vwslz/github/workspace/se3_transformer/experiments/dunbrack/train_dunbrack.py \
  --model SE3Transformer --num_epochs 50 --num_degrees 4 --num_layers 7 --num_channels 32 \
  --name dunbrack-chi --num_workers 4 --batch_size 16 --task target --num_cat_task 2 \
  --div 2 --pooling max --head 8 --lr 0.00075 --print_interval 50 \
  --data_address dunbrack.pt --use_wandb
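
For reference, the flags in the command above map onto an argparse configuration roughly like the following. This is a hypothetical sketch, not the actual `train_dunbrack.py` parser: the flag names come from the example command, while the types and defaults are assumptions.

```python
import argparse

def build_parser():
    # Hypothetical reconstruction of the CLI used by train_dunbrack.py.
    # Flag names match the example command; types/defaults are assumptions.
    p = argparse.ArgumentParser(description="Train SE3Transformer on the Dunbrack dataset")
    p.add_argument("--model", type=str, default="SE3Transformer")
    p.add_argument("--num_epochs", type=int, default=50)
    p.add_argument("--num_degrees", type=int, default=4)        # number of feature degrees
    p.add_argument("--num_layers", type=int, default=7)
    p.add_argument("--num_channels", type=int, default=32)
    p.add_argument("--name", type=str, default="dunbrack-chi")  # run name (e.g. for wandb)
    p.add_argument("--num_workers", type=int, default=4)
    p.add_argument("--batch_size", type=int, default=16)
    p.add_argument("--task", type=str, default="target")
    p.add_argument("--num_cat_task", type=int, default=2)
    p.add_argument("--div", type=float, default=2.0)            # channel divisor (assumed float)
    p.add_argument("--pooling", type=str, default="max")
    p.add_argument("--head", type=int, default=8)               # number of attention heads
    p.add_argument("--lr", type=float, default=0.00075)
    p.add_argument("--print_interval", type=int, default=50)
    p.add_argument("--data_address", type=str, default="dunbrack.pt")
    p.add_argument("--use_wandb", action="store_true")
    return p

if __name__ == "__main__":
    # Parse the flags from the example command and echo a couple of them.
    args = build_parser().parse_args(["--num_degrees", "4", "--use_wandb"])
    print(args.num_degrees, args.use_wandb)
```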