We are interested in building out our offering of seq2seq-capable models.
One higher-priority model here is BART, which is reasonably small, useful for both discriminative and generative tasks, and decently popular.
The first step will be to implement a backbone. Here's a template PR -> #622. A rough sketch of what the encoder-decoder backbone could look like is below.
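For concreteness, here is a minimal sketch of an encoder-decoder backbone built from existing KerasNLP layers. The function name `bart_like_backbone` and the hyperparameter defaults are placeholders for illustration only; padding masks, dropout, and weight-loading are omitted, and the real backbone should follow the conventions in the template PR.

```python
import keras
import keras_nlp


def bart_like_backbone(
    vocab_size=50265,        # placeholder; should match the BART tokenizer
    num_layers=2,            # small for illustration
    hidden_dim=768,
    num_heads=12,
    intermediate_dim=3072,
    max_length=1024,
):
    """Rough functional-API sketch of a seq2seq (encoder-decoder) backbone."""
    encoder_token_ids = keras.Input(
        shape=(None,), dtype="int32", name="encoder_token_ids"
    )
    decoder_token_ids = keras.Input(
        shape=(None,), dtype="int32", name="decoder_token_ids"
    )

    # Shared token + learned position embedding for encoder and decoder.
    embedding = keras_nlp.layers.TokenAndPositionEmbedding(
        vocabulary_size=vocab_size,
        sequence_length=max_length,
        embedding_dim=hidden_dim,
    )

    # Encoder stack: self-attention only.
    x = embedding(encoder_token_ids)
    for _ in range(num_layers):
        x = keras_nlp.layers.TransformerEncoder(
            intermediate_dim=intermediate_dim, num_heads=num_heads
        )(x)
    encoder_output = x

    # Decoder stack: causal self-attention plus cross-attention
    # over the encoder output.
    y = embedding(decoder_token_ids)
    for _ in range(num_layers):
        y = keras_nlp.layers.TransformerDecoder(
            intermediate_dim=intermediate_dim, num_heads=num_heads
        )(y, encoder_output)
    decoder_output = y

    return keras.Model(
        inputs={
            "encoder_token_ids": encoder_token_ids,
            "decoder_token_ids": decoder_token_ids,
        },
        outputs={
            "encoder_sequence_output": encoder_output,
            "decoder_sequence_output": decoder_output,
        },
    )
```

The backbone stops at the sequence outputs; task heads (classification, generation) would be layered on top in separate task models, as with the other backbones in the library.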