aalok-sathe / surprisal

A unified interface for computing surprisal (log probabilities) from language models! Supports neural, symbolic, and black-box API models.

Home Page: https://aalok-sathe.github.io/surprisal/

License: MIT License

Python 100.00%
language-modeling large-language-models log-likelihood surprisal gpt next-word-prediction

surprisal's Issues

Error when using Python-based tokenizers

Traceback (most recent call last):
  File "/net/vast-storage/scratch/vast/evlab/asathe/code/composlang/lmsurprisal/notebooks/extract_surprisals.py", line 73, in <module>
    main()
  File "/net/vast-storage/scratch/vast/evlab/asathe/code/composlang/lmsurprisal/notebooks/extract_surprisals.py", line 57, in main
    surprisals = [
  File "/net/vast-storage/scratch/vast/evlab/asathe/code/composlang/lmsurprisal/surprisal/model.py", line 133, in extract_surprisal
    surprisals = self.surprise([*textbatch])
  File "/net/vast-storage/scratch/vast/evlab/asathe/code/composlang/lmsurprisal/surprisal/model.py", line 184, in surprise
    tokens=tokenized[b], surprisals=-logprobs[b, :].numpy()
  File "/home/asathe/om2-home/anaconda3/envs/surprisal/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 240, in __getitem__
    raise KeyError(
KeyError: 'Indexing with integers (to access backend Encoding for a given batch index) is not available when using Python based tokenizers'
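One possible workaround (a sketch; `encoding_for_index` and the `_SlowBatch` stand-in are hypothetical, not part of surprisal or transformers): when integer indexing into the `BatchEncoding` raises this `KeyError` under a slow (Python-based) tokenizer, fall back to indexing each field's underlying list instead.

```python
class _SlowBatch(dict):
    """Hypothetical stand-in mimicking a slow (Python-based) tokenizer's
    BatchEncoding, which rejects integer indexing."""

    def __getitem__(self, key):
        if isinstance(key, int):
            raise KeyError(
                "Indexing with integers (to access backend Encoding for a "
                "given batch index) is not available when using Python based "
                "tokenizers"
            )
        return dict.__getitem__(self, key)


def encoding_for_index(tokenized, b):
    """Return the data for batch index b, for both fast and slow tokenizers."""
    try:
        return tokenized[b]  # fast (Rust-backed) tokenizers support this
    except KeyError:
        # slow tokenizers: index each field's list of sequences instead
        return {key: values[b] for key, values in tokenized.items()}


batch = _SlowBatch(input_ids=[[101, 7592], [101, 2088]])
row = encoding_for_index(batch, 1)
```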

compute surprisal for Chinese characters

Is there any way to compute surprisal for Chinese sentences? Right now, Chinese characters are tokenized in an unexpected way, and the number of entries in the output does not match the number of characters in the input.
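One hedged way to get character-aligned numbers (a sketch; `char_surprisals` is hypothetical and assumes you already have per-token surprisals plus character-offset spans, e.g. from a fast tokenizer's `return_offsets_mapping=True`): redistribute each token's surprisal over the characters it covers, so the output length matches the input.

```python
def char_surprisals(text, token_offsets, token_surprisals):
    """Spread each token's surprisal evenly over the characters it spans.

    token_offsets: list of (start, end) character spans, one per token.
    token_surprisals: one surprisal value per token.
    Returns one value per character of `text`; total mass is preserved.
    """
    per_char = [0.0] * len(text)
    for (start, end), s in zip(token_offsets, token_surprisals):
        width = max(end - start, 1)  # guard against zero-width spans
        for i in range(start, end):
            per_char[i] += s / width
    return per_char


# e.g. a token covering two characters followed by a one-character token
values = char_surprisals("你好吗", [(0, 2), (2, 3)], [4.0, 2.0])
```

Splitting evenly is only one convention; summing a token's surprisal onto its final character is another common choice for reading-time work.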

Slicing in SurprisalArray is not fully Pythonic

Slicing does not work exactly the same way as it does with Python lists or numpy arrays. Need to either: [1] make a note somewhere, [2] add a warning, or [3] add a workaround implementation.

  • [0:None] has undefined behavior
  • [:] has undefined behavior
  • [x:-1] has undefined behavior

What does work: providing actual or overshooting indices to characters or words within the stimulus/input.

  • [1:3, 'char'] works fine and returns surprisal over all tokens overlapping with chars 1:3
  • [0:99, 'char'] works fine and returns surprisal over all tokens that appear within the first 99 chars
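If workaround [3] is pursued, one option (a sketch; `normalize_slice` is hypothetical) is to canonicalize any incoming slice against the sequence length with Python's own `slice.indices`, so that [:], [0:None], and negative stops fall out of the same code path as plain list slicing:

```python
def normalize_slice(sl, n):
    """Resolve a slice against a sequence of length n, list-style.

    slice.indices clamps overshooting bounds and resolves None/negative
    endpoints, so [:], [0:None], [x:-1], and [0:99] all become concrete
    (start, stop) pairs before any token lookup happens.
    """
    start, stop, step = sl.indices(n)
    if step != 1:
        raise ValueError("strided slicing not supported")
    return start, stop


# against a 5-item sequence:
full = normalize_slice(slice(None), 5)         # [:]
open_end = normalize_slice(slice(0, None), 5)  # [0:None]
neg = normalize_slice(slice(2, -1), 5)         # [2:-1]
over = normalize_slice(slice(0, 99), 5)        # [0:99], clamped
```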

Conflating causal LM and "gpt" model class

The current implementation of AutoHuggingFaceModel.from_pretrained takes a model_class argument, where passing gpt as model_class redirects to the CausalHuggingFaceModel constructor. This is confusing, because users may want surprisals from other causal LMs, such as LLaMA or Mistral.

Maybe the from_pretrained function (https://github.com/aalok-sathe/surprisal/blob/main/surprisal/model.py#L470) can take more abstract options for model_class: for example, causal or masked, instead of gpt and bert?
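The dispatch could look something like this (a sketch; `resolve_model_class` is hypothetical, not the library's current code): accept the abstract names while keeping gpt/bert as backward-compatible aliases.

```python
def resolve_model_class(model_class):
    """Map a model_class argument to an abstract architecture family.

    'causal'/'masked' are the abstract options; 'gpt'/'bert' remain as
    backward-compatible aliases, so LLaMA or Mistral users can simply
    pass model_class='causal'.
    """
    aliases = {
        "causal": "causal",
        "gpt": "causal",
        "masked": "masked",
        "bert": "masked",
    }
    try:
        return aliases[model_class.lower()]
    except KeyError:
        raise ValueError(
            f"unknown model_class {model_class!r}; expected 'causal' or 'masked'"
        )
```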

Are surprisal values across different batch sizes slightly different?

Small differences were observed between results for batch_size=1 and larger batch sizes, and a number of fixes were tried to get to the bottom of it; padding/attention masks didn't solve it. For now, batch size is set to 1 so that results are perfectly deterministic across runs (it still runs fast enough), and this issue is deferred to the future.
contributed by @benlipkin
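Tiny batch-size-dependent differences are consistent with floating-point non-associativity: padding changes the shapes, and hence the reduction order, of the underlying matrix multiplies, and float addition is not associative. A minimal, library-free illustration:

```python
# float addition is not associative, so a different reduction order can
# yield a slightly different result for mathematically identical sums
a, b, c = 0.1, 0.2, 0.3
left = (a + b) + c    # one summation order
right = a + (b + c)   # another summation order
difference = abs(left - right)  # tiny but nonzero
```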

Make the CI stuff work

(non-breaking: currently the CI stuff is added just as an in-progress TODO. surprisal is released manually to PyPI at the moment and works fine regardless of CI/CD tests)

  • pylint
  • automatically push tagged releases to PyPI

Dependency issues in Python 3.12

Hello,

I installed your surprisal package with Python 3.12. Upon running a script that was essentially your test examples (the OpenAI variant), I received the message "ModuleNotFoundError: No module named 'torch'". Looking further into the issue, I found that PyTorch had not yet been released for Python 3.12. Could you verify whether the package works on Python 3.12, and if not, which version of Python you recommend for using surprisal?

Support GPU

Currently batch evaluation is implemented, but there is no support for using GPU. This will likely require a modification in surprisal/model.py at initialization of a HuggingFaceModel instance.
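A device-selection sketch (a torch-free stand-in for illustration; in the real change, HuggingFaceModel.__init__ would presumably call `self.model.to(device)` after checking `torch.cuda.is_available()`, and input tensors would be moved with `.to(device)` before the forward pass):

```python
def pick_device(preferred=None, cuda_available=False):
    """Choose a device string: honor an explicit request, else prefer CUDA.

    Stand-in for the real check, which would use torch.cuda.is_available();
    `preferred` mirrors a hypothetical `device=` argument to
    HuggingFaceModel.__init__.
    """
    if preferred is not None:
        return preferred
    return "cuda" if cuda_available else "cpu"
```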
