
koniqplusplus's People

Contributors

ssl92

Stargazers

YingYan0017, Faych Chen, HJH_Chenhe, zhangqian, Ciprian, Weixia Zhang, Lucas, lalala, Yuan, Thomas Thuilot, and several anonymous users


Forkers

david-hown

koniqplusplus's Issues

how to get koniq10k_distributions_sets.csv ?

I downloaded the koniq++database.csv file from the KonIQ++ webpage and put it in the directory where I extracted the KonIQ-10k dataset 1024x768 IMAGES FULL (.ZIP).

But when I run the command:
python3 main.py --dataset KonIQ-10k --resize --lr 1e-4 -bs 8 -e 25 --ft_lr_ratio 0.1 --loss_type norm-in-norm --p 1 --q 2 --koniq_root ./1024x768

it reports:
FileNotFoundError: [Errno 2] No such file or directory: './1024x768/koniq10k_distributions_sets.csv'

So how can I get koniq10k_distributions_sets.csv?
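
As a quick sanity check before launching training, the sketch below verifies the path layout that main.py appears to expect (taken from the FileNotFoundError above). The suggestion that the split CSV comes from the original KonIQ-10k metadata rather than from koniq++database.csv is an assumption, not something the repo documents here.

    import os

    # Path layout assumed from the FileNotFoundError above:
    # main.py looks for the split file directly under --koniq_root.
    koniq_root = "./1024x768"  # value passed via --koniq_root
    split_csv = os.path.join(koniq_root, "koniq10k_distributions_sets.csv")

    if not os.path.isfile(split_csv):
        # koniq++database.csv (the KonIQ++ annotations) is a different file;
        # the train/val/test split CSV is assumed to ship with the original
        # KonIQ-10k metadata and must be copied next to the images.
        raise FileNotFoundError(
            f"Expected KonIQ-10k split file at {split_csv}; "
            "copy koniq10k_distributions_sets.csv from the KonIQ-10k metadata there."
        )
    print("Split file found:", split_csv)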

RuntimeError: pretrained_model is a zip archive (did you mean to use torch.jit.load()?)

Thanks for sharing this wonderful work.
However, there are errors when testing with the provided pretrained_model in both pre-1.6 and post-1.6 PyTorch.

In torch 1.3.1, the error is

$ python test_image.py --root_path imgs --img_name test_img.jpg --resize --p 1 --q 2
        
    Traceback (most recent call last):
      File "/home/ron_lee/miniconda3/envs/torch1.3.1-py36-cuda9.0-tf1.14/lib/python3.6/tarfile.py", line 189, in nti
        n = int(s.strip() or "0", 8)
    ValueError: invalid literal for int() with base 8: 'ils\n_reb'


    Traceback (most recent call last):
      File "test_image.py", line 154, in <module>
        run(args)
      File "test_image.py", line 16, in run
        checkpoint = torch.load(args.trained_model_file)
      File "/home/ron_lee/miniconda3/envs/torch1.3.1-py36-cuda9.0-tf1.14/lib/python3.6/site-packages/torch/serialization.py", line 426, in load
        return _load(f, map_location, pickle_module, **pickle_load_args)
      File "/home/ron_lee/miniconda3/envs/torch1.3.1-py36-cuda9.0-tf1.14/lib/python3.6/site-packages/torch/serialization.py", line 599, in _load
        raise RuntimeError("{} is a zip archive (did you mean to use torch.jit.load()?)".format(f.name))
    RuntimeError: ./checkpoints/pretrained_model is a zip archive (did you mean to use torch.jit.load()?)

This error happens when the checkpoint was saved with PyTorch 1.6 or later (which switched to a zip-based serialization format) but is then loaded with a PyTorch version earlier than 1.6.
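
If loading in an older PyTorch is unavoidable, one workaround is to load the checkpoint once in an environment with PyTorch >= 1.6 and re-save it in the legacy format. A minimal sketch (the script name and output file name are placeholders):

    # convert_checkpoint.py -- run with PyTorch >= 1.6
    import torch

    # Load the zip-format checkpoint saved by PyTorch 1.6+.
    checkpoint = torch.load("./checkpoints/pretrained_model", map_location="cpu")

    # Re-save with the legacy (non-zipfile) serialization so that
    # torch.load in PyTorch < 1.6 can read it.
    torch.save(checkpoint, "./checkpoints/pretrained_model_legacy",
               _use_new_zipfile_serialization=False)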

On the other hand, using PyTorch 1.8 leads to "Missing key(s) in state_dict: "sidenet_q.head0.0.weight", ...".

MSU Video Quality Metrics Benchmark Invitation

Hello! We kindly invite you to participate in our video quality metrics benchmark. You can submit KonIQ++ to the benchmark by following the submission steps described here. The dataset distortions are compression artifacts on professional and user-generated content. The full dataset is used to measure the methods' overall performance, so we do not share it, to avoid overfitting. Nevertheless, we provide the open part of it (around 1,000 videos) with our paper "Video compression dataset and benchmark of learning-based video-quality metrics", accepted to NeurIPS 2022.

About pretrained weights and GPU RAM usage

I downloaded the weights and modified part of test_image.py as follows:

    # args.format_str = 'model-loss={}-p={}-q={}-detach-{}-ft_lr_ratio={}-alpha={}-{}-res={}-{}x{}-aug={}-monotonicity={}-lr={}-bs={}-e={}-opt_level={}' \
    #     .format(args.loss_type, args.p, args.q, args.detach, args.ft_lr_ratio, args.alpha,
    #             args.dataset, args.resize, args.resize_size_h, args.resize_size_w, args.augment,
    #             args.monotonicity_regularization, args.lr, args.batch_size, args.epochs, args.opt_level)
    args.root_path = '.'
    args.img_name = 'a.jpg'
    args.save_heatmap = True
    args.trained_model_file = 'pretrained_model' #'./checkpoints/' + args.format_str

When I run test_image.py, I encounter this error:

RuntimeError: Error(s) in loading state_dict for Model_Joint:
        Missing key(s) in state_dict: "sidenet_q.head0.0.weight", "sidenet_q.head0.0.bias", "sidenet_q.head0.1.weight", "sidenet_q.head0.1.bias", "sidenet_q.head0.1.running_mean", "sidenet_q.head0.1.running_var", "sidenet_q.head1.0.weight", "sidenet_q.head1.0.bias", "sidenet_q.head1.1.weight", "sidenet_q.head1.1.bias", "sidenet_q.head1.1.running_mean", "sidenet_q.head1.1.running_var", "sidenet_q.head2.0.weight", "sidenet_q.head2.0.bias", "sidenet_q.head2.1.weight", "sidenet_q.head2.1.bias", "sidenet_q.head2.1.running_mean", "sidenet_q.head2.1.running_var", "sidenet_q.head3.0.weight", "sidenet_q.head3.0.bias", "sidenet_q.head3.1.weight", "sidenet_q.head3.1.bias", "sidenet_q.head3.1.running_mean", "sidenet_q.head3.1.running_var", "sidenet_q.head4.0.weight", "sidenet_q.head4.0.bias", "sidenet_q.head4.1.weight", "sidenet_q.head4.1.bias", "sidenet_q.head4.1.running_mean", "sidenet_q.head4.1.running_var", "sidenet_q.head5.0.weight", "sidenet_q.head5.0.bias", "sidenet_q.head5.1.weight", "sidenet_q.head5.1.bias", "sidenet_q.head5.1.running_mean", "sidenet_q.head5.1.running_var", "sidenet_q.head6.0.weight", "sidenet_q.head6.0.bias", "sidenet_q.head6.1.weight", "sidenet_q.head6.1.bias", "sidenet_q.head6.1.running_mean", "sidenet_q.head6.1.running_var", "sidenet_q.head7.0.weight", "sidenet_q.head7.0.bias", "sidenet_q.head7.1.weight", "sidenet_q.head7.1.bias", "sidenet_q.head7.1.running_mean", "sidenet_q.head7.1.running_var", "sidenet_q.fusion_block1.conv1.0.weight", "sidenet_q.fusion_block1.conv1.0.bias", "sidenet_q.fusion_block1.conv1.1.weight", "sidenet_q.fusion_block1.conv1.1.bias", "sidenet_q.fusion_block1.conv1.1.running_mean", "sidenet_q.fusion_block1.conv1.1.running_var", "sidenet_q.fusion_block1.attn.conv_ca.0.weight", "sidenet_q.fusion_block1.attn.conv_ca.0.bias", "sidenet_q.fusion_block1.attn.conv_ca.2.weight", "sidenet_q.fusion_block1.attn.conv_ca.2.bias", "sidenet_q.fusion_block1.attn.conv_pa.0.weight", "sidenet_q.fusion_block1.attn.conv_pa.0.bias", "sidenet_q.fusion_block1.attn.conv_pa.2.weight", "sidenet_q.fusion_block1.attn.conv_pa.2.bias", "sidenet_q.fusion_block2.conv1.0.weight", "sidenet_q.fusion_block2.conv1.0.bias", "sidenet_q.fusion_block2.conv1.1.weight", "sidenet_q.fusion_block2.conv1.1.bias", "sidenet_q.fusion_block2.conv1.1.running_mean", "sidenet_q.fusion_block2.conv1.1.running_var", 
"sidenet_q.fusion_block2.attn.conv_ca.0.weight", "sidenet_q.fusion_block2.attn.conv_ca.0.bias", "sidenet_q.fusion_block2.attn.conv_ca.2.weight", "sidenet_q.fusion_block2.attn.conv_ca.2.bias", "sidenet_q.fusion_block2.attn.conv_pa.0.weight", "sidenet_q.fusion_block2.attn.conv_pa.0.bias", "sidenet_q.fusion_block2.attn.conv_pa.2.weight", "sidenet_q.fusion_block2.attn.conv_pa.2.bias", "sidenet_q.fusion_block3.conv1.0.weight", "sidenet_q.fusion_block3.conv1.0.bias", "sidenet_q.fusion_block3.conv1.1.weight", "sidenet_q.fusion_block3.conv1.1.bias", "sidenet_q.fusion_block3.conv1.1.running_mean", "sidenet_q.fusion_block3.conv1.1.running_var", "sidenet_q.fusion_block3.attn.conv_ca.0.weight", "sidenet_q.fusion_block3.attn.conv_ca.0.bias", "sidenet_q.fusion_block3.attn.conv_ca.2.weight", "sidenet_q.fusion_block3.attn.conv_ca.2.bias", "sidenet_q.fusion_block3.attn.conv_pa.0.weight", "sidenet_q.fusion_block3.attn.conv_pa.0.bias", "sidenet_q.fusion_block3.attn.conv_pa.2.weight", "sidenet_q.fusion_block3.attn.conv_pa.2.bias", "sidenet_q.fusion_block4.conv1.0.weight", "sidenet_q.fusion_block4.conv1.0.bias", "sidenet_q.fusion_block4.conv1.1.weight", "sidenet_q.fusion_block4.conv1.1.bias", "sidenet_q.fusion_block4.conv1.1.running_mean", "sidenet_q.fusion_block4.conv1.1.running_var", "sidenet_q.fusion_block4.attn.conv_ca.0.weight", "sidenet_q.fusion_block4.attn.conv_ca.0.bias", "sidenet_q.fusion_block4.attn.conv_ca.2.weight", "sidenet_q.fusion_block4.attn.conv_ca.2.bias", "sidenet_q.fusion_block4.attn.conv_pa.0.weight", "sidenet_q.fusion_block4.attn.conv_pa.0.bias", "sidenet_q.fusion_block4.attn.conv_pa.2.weight", "sidenet_q.fusion_block4.attn.conv_pa.2.bias", "sidenet_q.fc_q.weight", "sidenet_q.fc_q.bias", "sidenet_dist.head0.0.weight", "sidenet_dist.head0.0.bias", "sidenet_dist.head0.1.weight", "sidenet_dist.head0.1.bias", "sidenet_dist.head0.1.running_mean", "sidenet_dist.head0.1.running_var", "sidenet_dist.head1.0.weight", "sidenet_dist.head1.0.bias", "sidenet_dist.head1.1.weight", "sidenet_dist.head1.1.bias", "sidenet_dist.head1.1.running_mean", "sidenet_dist.head1.1.running_var", "sidenet_dist.head2.0.weight", "sidenet_dist.head2.0.bias", "sidenet_dist.head2.1.weight", "sidenet_dist.head2.1.bias", "sidenet_dist.head2.1.running_mean", "sidenet_dist.head2.1.running_var", "sidenet_dist.head3.0.weight", "sidenet_dist.head3.0.bias", "sidenet_dist.head3.1.weight", "sidenet_dist.head3.1.bias", "sidenet_dist.head3.1.running_mean", "sidenet_dist.head3.1.running_var", "sidenet_dist.head4.0.weight", "sidenet_dist.head4.0.bias", "sidenet_dist.head4.1.weight", "sidenet_dist.head4.1.bias", "sidenet_dist.head4.1.running_mean", "sidenet_dist.head4.1.running_var", "sidenet_dist.head5.0.weight", "sidenet_dist.head5.0.bias", "sidenet_dist.head5.1.weight", "sidenet_dist.head5.1.bias", "sidenet_dist.head5.1.running_mean", "sidenet_dist.head5.1.running_var", "sidenet_dist.head6.0.weight", "sidenet_dist.head6.0.bias", "sidenet_dist.head6.1.weight", "sidenet_dist.head6.1.bias", "sidenet_dist.head6.1.running_mean", "sidenet_dist.head6.1.running_var", "sidenet_dist.head7.0.weight", "sidenet_dist.head7.0.bias", "sidenet_dist.head7.1.weight", "sidenet_dist.head7.1.bias", "sidenet_dist.head7.1.running_mean", "sidenet_dist.head7.1.running_var", "sidenet_dist.fusion_block1.conv1.0.weight", "sidenet_dist.fusion_block1.conv1.0.bias", "sidenet_dist.fusion_block1.conv1.1.weight", "sidenet_dist.fusion_block1.conv1.1.bias", "sidenet_dist.fusion_block1.conv1.1.running_mean", "sidenet_dist.fusion_block1.conv1.1.running_var", 
"sidenet_dist.fusion_block1.attn.conv_ca.0.weight", "sidenet_dist.fusion_block1.attn.conv_ca.0.bias", "sidenet_dist.fusion_block1.attn.conv_ca.2.weight", "sidenet_dist.fusion_block1.attn.conv_ca.2.bias", "sidenet_dist.fusion_block1.attn.conv_pa.0.weight", "sidenet_dist.fusion_block1.attn.conv_pa.0.bias", "sidenet_dist.fusion_block1.attn.conv_pa.2.weight", "sidenet_dist.fusion_block1.attn.conv_pa.2.bias", "sidenet_dist.fusion_block2.conv1.0.weight", "sidenet_dist.fusion_block2.conv1.0.bias", "sidenet_dist.fusion_block2.conv1.1.weight", "sidenet_dist.fusion_block2.conv1.1.bias", "sidenet_dist.fusion_block2.conv1.1.running_mean", "sidenet_dist.fusion_block2.conv1.1.running_var", "sidenet_dist.fusion_block2.attn.conv_ca.0.weight", "sidenet_dist.fusion_block2.attn.conv_ca.0.bias", "sidenet_dist.fusion_block2.attn.conv_ca.2.weight", "sidenet_dist.fusion_block2.attn.conv_ca.2.bias", "sidenet_dist.fusion_block2.attn.conv_pa.0.weight", "sidenet_dist.fusion_block2.attn.conv_pa.0.bias", "sidenet_dist.fusion_block2.attn.conv_pa.2.weight", "sidenet_dist.fusion_block2.attn.conv_pa.2.bias", "sidenet_dist.fusion_block3.conv1.0.weight", "sidenet_dist.fusion_block3.conv1.0.bias", "sidenet_dist.fusion_block3.conv1.1.weight", "sidenet_dist.fusion_block3.conv1.1.bias", "sidenet_dist.fusion_block3.conv1.1.running_mean", "sidenet_dist.fusion_block3.conv1.1.running_var", "sidenet_dist.fusion_block3.attn.conv_ca.0.weight", "sidenet_dist.fusion_block3.attn.conv_ca.0.bias", "sidenet_dist.fusion_block3.attn.conv_ca.2.weight", "sidenet_dist.fusion_block3.attn.conv_ca.2.bias", "sidenet_dist.fusion_block3.attn.conv_pa.0.weight", "sidenet_dist.fusion_block3.attn.conv_pa.0.bias", "sidenet_dist.fusion_block3.attn.conv_pa.2.weight", "sidenet_dist.fusion_block3.attn.conv_pa.2.bias", "sidenet_dist.fusion_block4.conv1.0.weight", "sidenet_dist.fusion_block4.conv1.0.bias", "sidenet_dist.fusion_block4.conv1.1.weight", "sidenet_dist.fusion_block4.conv1.1.bias", "sidenet_dist.fusion_block4.conv1.1.running_mean", "sidenet_dist.fusion_block4.conv1.1.running_var", "sidenet_dist.fusion_block4.attn.conv_ca.0.weight", "sidenet_dist.fusion_block4.attn.conv_ca.0.bias", "sidenet_dist.fusion_block4.attn.conv_ca.2.weight", "sidenet_dist.fusion_block4.attn.conv_ca.2.bias", "sidenet_dist.fusion_block4.attn.conv_pa.0.weight", "sidenet_dist.fusion_block4.attn.conv_pa.0.bias", "sidenet_dist.fusion_block4.attn.conv_pa.2.weight", "sidenet_dist.fusion_block4.attn.conv_pa.2.bias", "sidenet_dist.fc_q.weight", "sidenet_dist.fc_q.bias".
        Unexpected key(s) in state_dict: "sub_q.head0.0.weight", "sub_q.head0.0.bias", "sub_q.head0.1.weight", "sub_q.head0.1.bias", "sub_q.head0.1.running_mean", "sub_q.head0.1.running_var", "sub_q.head0.1.num_batches_tracked", "sub_q.head1.0.weight", "sub_q.head1.0.bias", "sub_q.head1.1.weight", "sub_q.head1.1.bias", "sub_q.head1.1.running_mean", "sub_q.head1.1.running_var", "sub_q.head1.1.num_batches_tracked", "sub_q.head2.0.weight", "sub_q.head2.0.bias", "sub_q.head2.1.weight", "sub_q.head2.1.bias", "sub_q.head2.1.running_mean", "sub_q.head2.1.running_var", "sub_q.head2.1.num_batches_tracked", "sub_q.head3.0.weight", "sub_q.head3.0.bias", "sub_q.head3.1.weight", "sub_q.head3.1.bias", "sub_q.head3.1.running_mean", "sub_q.head3.1.running_var", "sub_q.head3.1.num_batches_tracked", "sub_q.head4.0.weight", "sub_q.head4.0.bias", "sub_q.head4.1.weight", "sub_q.head4.1.bias", "sub_q.head4.1.running_mean", "sub_q.head4.1.running_var", "sub_q.head4.1.num_batches_tracked", "sub_q.head5.0.weight", "sub_q.head5.0.bias", "sub_q.head5.1.weight", "sub_q.head5.1.bias", "sub_q.head5.1.running_mean", "sub_q.head5.1.running_var", "sub_q.head5.1.num_batches_tracked", "sub_q.head6.0.weight", "sub_q.head6.0.bias", "sub_q.head6.1.weight", "sub_q.head6.1.bias", "sub_q.head6.1.running_mean", "sub_q.head6.1.running_var", "sub_q.head6.1.num_batches_tracked", "sub_q.head7.0.weight", "sub_q.head7.0.bias", "sub_q.head7.1.weight", "sub_q.head7.1.bias", "sub_q.head7.1.running_mean", "sub_q.head7.1.running_var", "sub_q.head7.1.num_batches_tracked", "sub_q.fusion_block1.conv1.0.weight", "sub_q.fusion_block1.conv1.0.bias", "sub_q.fusion_block1.conv1.1.weight", "sub_q.fusion_block1.conv1.1.bias", "sub_q.fusion_block1.conv1.1.running_mean", "sub_q.fusion_block1.conv1.1.running_var", "sub_q.fusion_block1.conv1.1.num_batches_tracked", "sub_q.fusion_block1.attn.conv_ca.0.weight", "sub_q.fusion_block1.attn.conv_ca.0.bias", "sub_q.fusion_block1.attn.conv_ca.2.weight", "sub_q.fusion_block1.attn.conv_ca.2.bias", "sub_q.fusion_block1.attn.conv_pa.0.weight", "sub_q.fusion_block1.attn.conv_pa.0.bias", "sub_q.fusion_block1.attn.conv_pa.2.weight", "sub_q.fusion_block1.attn.conv_pa.2.bias", "sub_q.fusion_block2.conv1.0.weight", "sub_q.fusion_block2.conv1.0.bias", "sub_q.fusion_block2.conv1.1.weight", "sub_q.fusion_block2.conv1.1.bias", "sub_q.fusion_block2.conv1.1.running_mean", "sub_q.fusion_block2.conv1.1.running_var", "sub_q.fusion_block2.conv1.1.num_batches_tracked", "sub_q.fusion_block2.attn.conv_ca.0.weight", "sub_q.fusion_block2.attn.conv_ca.0.bias", "sub_q.fusion_block2.attn.conv_ca.2.weight", "sub_q.fusion_block2.attn.conv_ca.2.bias", "sub_q.fusion_block2.attn.conv_pa.0.weight", "sub_q.fusion_block2.attn.conv_pa.0.bias", "sub_q.fusion_block2.attn.conv_pa.2.weight", "sub_q.fusion_block2.attn.conv_pa.2.bias", "sub_q.fusion_block3.conv1.0.weight", "sub_q.fusion_block3.conv1.0.bias", "sub_q.fusion_block3.conv1.1.weight", "sub_q.fusion_block3.conv1.1.bias", "sub_q.fusion_block3.conv1.1.running_mean", "sub_q.fusion_block3.conv1.1.running_var", "sub_q.fusion_block3.conv1.1.num_batches_tracked", "sub_q.fusion_block3.attn.conv_ca.0.weight", "sub_q.fusion_block3.attn.conv_ca.0.bias", "sub_q.fusion_block3.attn.conv_ca.2.weight", "sub_q.fusion_block3.attn.conv_ca.2.bias", "sub_q.fusion_block3.attn.conv_pa.0.weight", "sub_q.fusion_block3.attn.conv_pa.0.bias", "sub_q.fusion_block3.attn.conv_pa.2.weight", "sub_q.fusion_block3.attn.conv_pa.2.bias", "sub_q.fusion_block4.conv1.0.weight", "sub_q.fusion_block4.conv1.0.bias", 
"sub_q.fusion_block4.conv1.1.weight", "sub_q.fusion_block4.conv1.1.bias", "sub_q.fusion_block4.conv1.1.running_mean", "sub_q.fusion_block4.conv1.1.running_var", "sub_q.fusion_block4.conv1.1.num_batches_tracked", "sub_q.fusion_block4.attn.conv_ca.0.weight", "sub_q.fusion_block4.attn.conv_ca.0.bias", "sub_q.fusion_block4.attn.conv_ca.2.weight", "sub_q.fusion_block4.attn.conv_ca.2.bias", "sub_q.fusion_block4.attn.conv_pa.0.weight", "sub_q.fusion_block4.attn.conv_pa.0.bias", "sub_q.fusion_block4.attn.conv_pa.2.weight", "sub_q.fusion_block4.attn.conv_pa.2.bias", "sub_q.fc_q.weight", "sub_q.fc_q.bias", "sub_dist.head0.0.weight", "sub_dist.head0.0.bias", "sub_dist.head0.1.weight", "sub_dist.head0.1.bias", "sub_dist.head0.1.running_mean", "sub_dist.head0.1.running_var", "sub_dist.head0.1.num_batches_tracked", "sub_dist.head1.0.weight", "sub_dist.head1.0.bias", "sub_dist.head1.1.weight", "sub_dist.head1.1.bias", "sub_dist.head1.1.running_mean", "sub_dist.head1.1.running_var", "sub_dist.head1.1.num_batches_tracked", "sub_dist.head2.0.weight", "sub_dist.head2.0.bias", "sub_dist.head2.1.weight", "sub_dist.head2.1.bias", "sub_dist.head2.1.running_mean", "sub_dist.head2.1.running_var", "sub_dist.head2.1.num_batches_tracked", "sub_dist.head3.0.weight", "sub_dist.head3.0.bias", "sub_dist.head3.1.weight", "sub_dist.head3.1.bias", "sub_dist.head3.1.running_mean", "sub_dist.head3.1.running_var", "sub_dist.head3.1.num_batches_tracked", "sub_dist.head4.0.weight", "sub_dist.head4.0.bias", "sub_dist.head4.1.weight", "sub_dist.head4.1.bias", "sub_dist.head4.1.running_mean", "sub_dist.head4.1.running_var", "sub_dist.head4.1.num_batches_tracked", "sub_dist.head5.0.weight", "sub_dist.head5.0.bias", "sub_dist.head5.1.weight", "sub_dist.head5.1.bias", "sub_dist.head5.1.running_mean", "sub_dist.head5.1.running_var", "sub_dist.head5.1.num_batches_tracked", "sub_dist.head6.0.weight", "sub_dist.head6.0.bias", "sub_dist.head6.1.weight", "sub_dist.head6.1.bias", "sub_dist.head6.1.running_mean", "sub_dist.head6.1.running_var", "sub_dist.head6.1.num_batches_tracked", "sub_dist.head7.0.weight", "sub_dist.head7.0.bias", "sub_dist.head7.1.weight", "sub_dist.head7.1.bias", "sub_dist.head7.1.running_mean", "sub_dist.head7.1.running_var", "sub_dist.head7.1.num_batches_tracked", "sub_dist.fusion_block1.conv1.0.weight", "sub_dist.fusion_block1.conv1.0.bias", "sub_dist.fusion_block1.conv1.1.weight", "sub_dist.fusion_block1.conv1.1.bias", "sub_dist.fusion_block1.conv1.1.running_mean", "sub_dist.fusion_block1.conv1.1.running_var", "sub_dist.fusion_block1.conv1.1.num_batches_tracked", "sub_dist.fusion_block1.attn.conv_ca.0.weight", "sub_dist.fusion_block1.attn.conv_ca.0.bias", "sub_dist.fusion_block1.attn.conv_ca.2.weight", "sub_dist.fusion_block1.attn.conv_ca.2.bias", "sub_dist.fusion_block1.attn.conv_pa.0.weight", "sub_dist.fusion_block1.attn.conv_pa.0.bias", "sub_dist.fusion_block1.attn.conv_pa.2.weight", "sub_dist.fusion_block1.attn.conv_pa.2.bias", "sub_dist.fusion_block2.conv1.0.weight", "sub_dist.fusion_block2.conv1.0.bias", "sub_dist.fusion_block2.conv1.1.weight", "sub_dist.fusion_block2.conv1.1.bias", "sub_dist.fusion_block2.conv1.1.running_mean", "sub_dist.fusion_block2.conv1.1.running_var", "sub_dist.fusion_block2.conv1.1.num_batches_tracked", "sub_dist.fusion_block2.attn.conv_ca.0.weight", "sub_dist.fusion_block2.attn.conv_ca.0.bias", "sub_dist.fusion_block2.attn.conv_ca.2.weight", "sub_dist.fusion_block2.attn.conv_ca.2.bias", "sub_dist.fusion_block2.attn.convsub_dist.fusion_block3.conv1.0.weight", 
"sub_dist.fusion_block3.conv1.0.bias", "sub_dist.fusion_block3.conv1.1.weight", "sub_dist.fusion_block3.conv1.1.bias", "sub_dist.fusion_block3.conv1.1.running_mean", "sub_dist.fusion_block3.conv1.1.running_var", "sub_dist.fusion_block3.conv1.1.num_batches_tracked", "sub_dist.fusion_block3.attn.conv_ca.0.weight", "sub_dist.fusion_block3.attn.conv_ca.0.bias", "sub_dist.fusion_block3.attn.conv_ca.2.weight", "sub_dist.fusion_block3.attn.conv_ca.2.bias", "sub_dist.fusion_block3.attn.conv_pa.0.weight", "sub_dist.fusion_block3.attn.conv_pa.0.bias", "sub_dist.fusion_block3.attn.conv_pa.2.weight", "sub_dist.fusion_block3.attn.conv_pa.2.bias", "sub_dist.fusion_block4.conv1.0.weight", "sub_dist.fusion_block4.conv1.0.bias", "sub_dist.fusion_block4.conv1.1.weight", "sub_dist.fusion_block4.conv1.1.bias", "sub_dist.fusion_block4.conv1.1.running_mean", "sub_dist.fusion_block4.conv1.1.running_var", "sub_dist.fusion_block4.conv1.1.num_batches_tracked", "sub_dist.fusion_block4.attn.conv_ca.0.weight", "sub_dist.fusion_block4.attn.conv_ca.0.bias", "sub_dist.fusion_block4.attn.conv_ca.2.weight", "sub_dist.fusion_block4.attn.conv_ca.2.bias", "sub_dist.fusion_block4.attn.conv_pa.0.weight", "sub_dist.fusion_block4.attn.conv_pa.0.bias", "sub_dist.fusion_block4.attn.conv_pa.2.weight", "sub_dist.fusion_block4.attn.conv_pa.2.bias", "sub_dist.fc_q.weight", "sub_dist.fc_q.bias".

Then I switched the model-loading strict parameter to False, but then I hit a CUDA out-of-memory error. How much GPU RAM is needed for inference? My current setup is an RTX 3060 with 12 GB.
I tried using fp16 to reduce GPU RAM usage like this:

    if args.save_heatmap is None:
        with torch.cuda.amp.autocast():
            q = model(im.unsqueeze(0))
        print('The image quality score is {}'.format(q[-1].item() * k[-1] + b[-1]))

But then I encounter another error about a dimension mismatch:

File "\koniqplusplus\IQAmodel.py", line 135, in forward
    x2 = self.fusion_block2(torch.cat((x1, x2), dim=1), x3)
RuntimeError: Sizes of tensors must match except in dimension 1. Expected size 170 but got size 171 for tensor number 1 in the list.
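
Two changes may help here; this is a minimal sketch under assumptions: that test_image.py does not already wrap inference in torch.no_grad() (disabling autograd is usually the biggest memory saving at inference time), and that the 170-vs-171 mismatch comes from odd intermediate feature-map sizes when the input height/width is not a multiple of the network's downsampling factor, taken here to be 32. The variables im, model, k, and b are the ones already defined in test_image.py.

    import torch
    import torch.nn.functional as F

    def pad_to_multiple(im, multiple=32):
        # Zero-pad a CHW tensor so H and W are multiples of `multiple`.
        # The factor of 32 is an assumption based on the size mismatch above.
        _, h, w = im.shape
        pad_h = (multiple - h % multiple) % multiple
        pad_w = (multiple - w % multiple) % multiple
        return F.pad(im, (0, pad_w, 0, pad_h))

    im = pad_to_multiple(im)

    # no_grad avoids storing activations for a backward pass; autocast
    # additionally runs most ops in fp16 to cut activation memory further.
    with torch.no_grad():
        with torch.cuda.amp.autocast():
            q = model(im.unsqueeze(0))
    print('The image quality score is {}'.format(q[-1].item() * k[-1] + b[-1]))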

checkpoint

Hi, are you planning to release the checkpoint? Thanks!
