When running the run_inference_newser script, the following error appears:
Namespace(alpha=0.9, attn_debug=False, batch_size=8, beam_size=4, beta=5.0, block_ngram_repeat=3, coverage_penalty='summary', data_type='text', dump_beam='', dynamic_dict=False, fast=False, gpu=0, ignore_when_blocking=['story_separator_special_tag'], image_channel_size=3, length_penalty='wu', log_file='', max_length=300, max_sent_length=None, min_length=200, models=['Feb17__step_20000.pt'], n_best=1, output='drive/MyDrive/summarization/data-1/generated_output.txt', replace_unk=False, report_bleu=False, report_rouge=False, sample_rate=16000, share_vocab=False, src='drive/MyDrive/summarization/data-1/test.txt.src.tokenized.fixed.cleaned.final.truncated.txt', src_dir='', stepwise_penalty=True, tgt='drive/MyDrive/summarization/data-1/test.txt.tgt.tokenized.fixed.cleaned.final.truncated.txt', verbose=True, window='hamming', window_size=0.02, window_stride=0.01)
Namespace(accum_count=5, adagrad_accumulator_init=0.1, adam_beta1=0.9, adam_beta2=0.999, batch_size=2, batch_type='sents', bridge=True, brnn=True, cnn_kernel_width=3, context_gate=None, copy_attn=True, copy_attn_force=False, copy_loss_by_seqlength=True, coverage_attn=False, data='newser_sent_500_300/newser_sents', dec_layers=1, decay_method='', decay_steps=10000, decoder_type='rnn', dropout=0.0, enc_layers=1, encoder_type='brnn', epochs=0, exp='', exp_host='', feat_merge='concat', feat_vec_exponent=0.7, feat_vec_size=-1, fix_word_vecs_dec=False, fix_word_vecs_enc=False, generator_function='log_softmax', global_attention='mlp', global_attention_function='softmax', gpu_backend='nccl', gpu_ranks=[0], gpu_verbose_level=0, gpuid=[], heads=8, image_channel_size=3, input_feed=1, keep_checkpoint=-1, label_smoothing=0.0, lambda_coverage=1, layers=1, learning_rate=0.15, learning_rate_decay=0.5, log_file='', master_ip='localhost', master_port=10000, max_generator_batches=32, max_grad_norm=4.0, model_type='text', normalization='sents', optim='adagrad', param_init=0.1, param_init_glorot=False, position_encoding=False, pre_word_vecs_dec=None, pre_word_vecs_enc=None, report_every=50, reuse_copy_attn=True, rnn_size=512, rnn_type='LSTM', sample_rate=16000, save_checkpoint_steps=1000, save_model='model_newser_atten/Feb17_', seed=777, self_attn_type='scaled-dot', share_decoder_embeddings=False, share_embeddings=False, src_word_vec_size=128, start_decay_steps=50000, tensorboard=False, tensorboard_log_dir='runs/onmt', tgt_word_vec_size=128, train_from='', train_steps=30000, transformer_ff=2048, truncated_decoder=0, valid_batch_size=32, valid_steps=10000, warmup_steps=4000, window_size=0.02, word_vec_size=128, world_size=1)
/usr/local/lib/python3.6/dist-packages/torchtext/data/field.py:323: UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather than torch.tensor(sourceTensor).
var = torch.tensor(arr, dtype=self.dtype, device=device)
/content/Multi-News/code/Hi_MAP/onmt/translate/translator.py:555: UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather than torch.tensor(sourceTensor).
return torch.tensor(a, requires_grad=False)
Traceback (most recent call last):
File "Multi-News/code/Hi_MAP/translate.py", line 37, in <module>
main(opt)
File "Multi-News/code/Hi_MAP/translate.py", line 24, in main
attn_debug=opt.attn_debug)
File "/content/Multi-News/code/Hi_MAP/onmt/translate/translator.py", line 233, in translate
batch_data = self.translate_batch(batch, data, fast=self.fast)
File "/content/Multi-News/code/Hi_MAP/onmt/translate/translator.py", line 342, in translate_batch
return self._translate_batch(batch, data)
File "/content/Multi-News/code/Hi_MAP/onmt/translate/translator.py", line 650, in _translate_batch
beam_attn.data[:, j, :memory_lengths[j]])
File "/content/Multi-News/code/Hi_MAP/onmt/translate/beam.py", line 140, in advance
self.attn.append(attn_out.index_select(0, prev_k))
RuntimeError: expected scalar type Long but found Float
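A likely cause (an assumption from the traceback, not confirmed against the Hi_MAP source): `index_select` requires a Long (integer) index tensor, but on recent PyTorch versions the `/` operator on integer tensors performs true division and returns a Float tensor. If `beam.py`'s `advance()` computes the previous-beam index with something like `prev_k = best_scores_id / num_words`, that `prev_k` becomes Float and `attn_out.index_select(0, prev_k)` raises exactly this error. A minimal sketch reproducing the failure and the floor-division fix (the variable names here are illustrative):

```python
import torch

attn_out = torch.rand(4, 10)                    # dummy attention: beam_size x src_len
best_scores_id = torch.tensor([5, 12, 23, 37])  # flat top-k indices over beams*vocab
num_words = 10                                  # vocabulary size

# True division on an integer tensor yields a Float tensor on PyTorch >= 1.6:
prev_k_float = best_scores_id / num_words
try:
    attn_out.index_select(0, prev_k_float)      # fails: index must be Long
except RuntimeError as e:
    print(e)                                    # "... expected ... Long but found Float"

# Fix: keep the index integral via floor division (or cast with .long()):
prev_k = best_scores_id // num_words            # stays a Long tensor
selected = attn_out.index_select(0, prev_k)
print(selected.shape)                           # torch.Size([4, 10])
```

If this is the cause, changing the division in `beam.py`'s `advance()` to `//` (or appending `.long()` to the index before `index_select`) should resolve the error without changing beam-search behavior, since the original code relied on the old integer-division semantics of `/`.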