Add column and entries for context length. #11


Merged
merged 1 commit on May 6, 2023

Conversation

LudwigStumpp
Collaborator

As discussed in #7, I added an additional column, context length, to the table and filled in the data for most of the models.

While the value was unambiguous for most of the recent decoder-only transformers with standard positional embeddings, here are some exceptions:

  • T5 was trained with a sequence length of 512. However, its use of relative attention theoretically allows for longer sequences (see the discussion here). Since 512 was the length used during training, I decided to use 512, as this is the value that effectively makes sense given the capabilities obtained during training.
  • Replit Code uses ALiBi, which allows extrapolation at inference time to sequences longer than those seen during training (see the sketch after this list). As I did not find any other information on the maximum context length, I decided to use infinity as the value. Not 100% sure though.
  • MPT-7B also uses ALiBi. According to their blog post and their GitHub, the model was trained on inputs of up to 65k tokens and can handle up to 84k, so I decided to use 84k.
  • SantaCoder + RedPajama-INCITE: I could not find any context length information, so I marked them with ?.
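
For context on the ALiBi entries above, here is a minimal sketch (not part of this PR) of how the ALiBi attention bias is built. It assumes PyTorch, and the `alibi_bias` helper is hypothetical; it is only meant to illustrate why ALiBi models are not tied to a fixed training context length:

```python
import torch

def alibi_bias(num_heads: int, seq_len: int) -> torch.Tensor:
    """Return the (num_heads, seq_len, seq_len) ALiBi bias added to attention scores.

    Each head h gets a fixed slope m_h; the score of query i attending to key j
    (with j <= i) is penalized by m_h * (i - j). Because the bias is just a linear
    function of token distance, it is defined for any seq_len, which is why ALiBi
    models can be run on sequences longer than those seen during training.
    """
    # Geometric sequence of slopes from the ALiBi paper (assumes num_heads is a power of 2).
    start = 2 ** (-8 / num_heads)
    slopes = torch.tensor([start ** (h + 1) for h in range(num_heads)])

    # distance[i, j] = i - j for past positions, 0 on/above the diagonal.
    pos = torch.arange(seq_len)
    distance = (pos[:, None] - pos[None, :]).clamp(min=0)

    # bias[h, i, j] = -m_h * (i - j), added to the query-key scores before the softmax.
    return -slopes[:, None, None] * distance[None, :, :].float()

# The same function works for a 2k training context or a much longer inference context.
bias_train = alibi_bias(num_heads=8, seq_len=2048)
bias_long = alibi_bias(num_heads=8, seq_len=8192)
```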

TODO: Some entries are missing and some are still unclear to me; marked with ?.

@eugeneyan
Owner

Thanks for adding this, especially the ? where we're uncertain.

@eugeneyan eugeneyan merged commit d2723c8 into eugeneyan:main May 6, 2023