
Comments (4)

airaria commented on May 27, 2024

Does the attention list you return contain 13 elements? In the original Transformer, the attention list only has length 12. You may need to subtract 1 from the indices:
L4_attention_mse=[{"layer_T":2, "layer_S":0, "feature":"attention", "loss":"attention_mse", "weight":1},
{"layer_T":5, "layer_S":1, "feature":"attention", "loss":"attention_mse", "weight":1},
{"layer_T":8, "layer_S":2, "feature":"attention", "loss":"attention_mse", "weight":1},
{"layer_T":11, "layer_S":3, "feature":"attention", "loss":"attention_mse", "weight":1}]


airaria commented on May 27, 2024

You can check the following key points:

  1. Do the (teacher/student) models return enough hidden states? For example, this code expects the teacher to return a list of length 13 (embeddings + 12 hidden states) and the student a list of length 4 (embeddings + 3 hidden states);
    if you are using HuggingFace Transformers, you need to set config.output_hidden_states=True in the corresponding model's config. See https://huggingface.co/transformers/model_doc/bert.html?highlight=output_hidden_states
  2. Does the adaptor correctly match the model's outputs and return a dict with dict['hidden'] = the model's hidden_states? (A minimal adaptor sketch follows this list.)
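
As a concrete illustration of point 2, here is a minimal adaptor sketch. The key names ("logits", "hidden", "attention") follow TextBrewer's adaptor convention; the model_outputs layout assumes a HuggingFace model called with output_hidden_states=True and output_attentions=True that returns a dict-like output.

def simple_adaptor(batch, model_outputs):
    # Minimal sketch: map the model's outputs onto the feature names
    # that TextBrewer's distillation config refers to.
    return {
        "logits": model_outputs.logits,
        # tuple of length num_layers + 1 (embeddings first)
        "hidden": model_outputs.hidden_states,
        # tuple of length num_layers (no embedding entry)
        "attention": model_outputs.attentions,
    }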


MarvinLong commented on May 27, 2024

You can check the following key points:

  1. Do the (teacher/student) models return enough hidden states? For example, this code expects the teacher to return a list of length 13 (embeddings + 12 hidden states) and the student a list of length 4 (embeddings + 3 hidden states);
    if you are using HuggingFace Transformers, you need to set config.output_hidden_states=True in the corresponding model's config. See https://huggingface.co/transformers/model_doc/bert.html?highlight=output_hidden_states
  2. Does the adaptor correctly match the model's outputs and return a dict with dict['hidden'] = the model's hidden_states?

The attention layers still don't seem to line up. My understanding is that the attention indices should be reduced by 1, but I'm not sure whether that's right. Counting the embeddings, the teacher has 13 hidden states and 12 attention layers; the student has 5 hidden states and 4 attention layers.
L4_attention_mse=[{"layer_T":3, "layer_S":1, "feature":"attention", "loss":"attention_mse", "weight":1},
{"layer_T":6, "layer_S":2, "feature":"attention", "loss":"attention_mse", "weight":1},
{"layer_T":9, "layer_S":3, "feature":"attention", "loss":"attention_mse", "weight":1},
{"layer_T":12, "layer_S":4, "feature":"attention", "loss":"attention_mse", "weight":1}]


MarvinLong commented on May 27, 2024

Does the attention list you return contain 13 elements? In the original Transformer, the attention list only has length 12. You may need to subtract 1 from the indices:
L4_attention_mse=[{"layer_T":2, "layer_S":0, "feature":"attention", "loss":"attention_mse", "weight":1},
{"layer_T":5, "layer_S":1, "feature":"attention", "loss":"attention_mse", "weight":1},
{"layer_T":8, "layer_S":2, "feature":"attention", "loss":"attention_mse", "weight":1},
{"layer_T":11, "layer_S":3, "feature":"attention", "loss":"attention_mse", "weight":1}]

Yes, the indices do indeed need to be reduced by one. After subtracting one from both the teacher and student attention indices in matches.py, distillation with attention works without problems.
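
To make the agreed indexing explicit, here is a small self-contained sanity check; the check_matches helper is hypothetical, not part of TextBrewer. Attention indices are 0-based per layer, while hidden-state indices count the embeddings as index 0.

L4_attention_mse = [{"layer_T": 2,  "layer_S": 0, "feature": "attention", "loss": "attention_mse", "weight": 1},
                    {"layer_T": 5,  "layer_S": 1, "feature": "attention", "loss": "attention_mse", "weight": 1},
                    {"layer_T": 8,  "layer_S": 2, "feature": "attention", "loss": "attention_mse", "weight": 1},
                    {"layer_T": 11, "layer_S": 3, "feature": "attention", "loss": "attention_mse", "weight": 1}]

def check_matches(matches, n_teacher, n_student):
    # Hypothetical helper: every index must be valid for the number of
    # feature tensors each model actually returns.
    for m in matches:
        assert 0 <= m["layer_T"] < n_teacher, f"bad teacher index: {m}"
        assert 0 <= m["layer_S"] < n_student, f"bad student index: {m}"

# The teacher returns 12 attention maps, the student returns 4.
check_matches(L4_attention_mse, n_teacher=12, n_student=4)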

