puyuanliu / finetune-transformer-summarization
This project provides an implementation of fine-tuning pretrained Transformers (e.g., GPT2) on summarization tasks. Further, we evaluate the performance of different models in an unsupervised setting, where the training targets (summaries) are generated by searching with a heuristic function.
License: GNU General Public License v3.0
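The description mentions generating training targets by heuristic search. The repository's actual search procedure is not shown here, so the following is only an illustrative sketch, assuming a simple greedy extractive search scored by a ROUGE-1-like unigram-recall heuristic; the function names (`unigram_overlap`, `heuristic_summary`) are hypothetical and not from this codebase.

```python
def unigram_overlap(candidate, reference):
    # ROUGE-1-recall-like score: fraction of unique reference tokens
    # that also appear in the candidate (assumed scoring heuristic).
    ref = set(reference)
    if not ref:
        return 0.0
    return len(set(candidate) & ref) / len(ref)

def heuristic_summary(sentences, document_tokens, max_sents=2):
    # Greedily add the sentence that most improves overlap with the
    # full document; the result serves as a pseudo training target.
    chosen, chosen_tokens = [], []
    remaining = list(sentences)
    for _ in range(max_sents):
        best, best_score = None, -1.0
        for s in remaining:
            score = unigram_overlap(chosen_tokens + s.split(), document_tokens)
            if score > best_score:
                best, best_score = s, score
        if best is None:
            break
        chosen.append(best)
        chosen_tokens += best.split()
        remaining.remove(best)
    return " ".join(chosen)

doc = "the cat sat on the mat . the dog barked . cats like mats"
sents = ["the cat sat on the mat .", "the dog barked .", "cats like mats"]
pseudo_target = heuristic_summary(sents, doc.split(), max_sents=1)
```

A model such as GPT2 would then be fine-tuned on (document, pseudo_target) pairs in place of human-written references.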