Transformers
PyTorch
TensorBoard
marian
text2text-generation
Generated from Trainer
Eval Results (legacy)
Instructions to use autoevaluate/translation with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use autoevaluate/translation with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("autoevaluate/translation")
model = AutoModelForSeq2SeqLM.from_pretrained("autoevaluate/translation")
```
- Notebooks
- Google Colab
- Kaggle
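Once loaded, the checkpoint can be used for translation via the standard seq2seq `generate` API. A minimal sketch, assuming the model is a Marian-style translation checkpoint (the example sentence is illustrative, not from the model card):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("autoevaluate/translation")
model = AutoModelForSeq2SeqLM.from_pretrained("autoevaluate/translation")

# Tokenize an example sentence and generate a translation
inputs = tokenizer("Hello, how are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
translation = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(translation)
```

The `pipeline("translation", model="autoevaluate/translation")` helper wraps the same steps if you prefer a one-liner.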
Add evaluation results on wmt16
#1 opened by lewtun (HF Staff)
Beep boop, I am a bot from Hugging Face's automatic evaluation service! Your model has been evaluated on the wmt16 dataset. Accept this pull request to see the results displayed on the Hub leaderboard. Evaluate your model on more datasets here.