Instructions for using answerdotai/ModernBERT-base with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use answerdotai/ModernBERT-base with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="answerdotai/ModernBERT-base")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("answerdotai/ModernBERT-base")
model = AutoModelForMaskedLM.from_pretrained("answerdotai/ModernBERT-base")
```

- Notebooks
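A minimal end-to-end sketch of calling the fill-mask pipeline, assuming `transformers` and a PyTorch backend are installed (the model weights are downloaded on first use). The example sentence is illustrative; ModernBERT uses the BERT-style `[MASK]` token.

```python
from transformers import pipeline

# Build the fill-mask pipeline for ModernBERT-base.
pipe = pipeline("fill-mask", model="answerdotai/ModernBERT-base")

# Ask the model to fill in the masked token; the input sentence is an
# illustrative assumption, not from the model card.
results = pipe("The capital of France is [MASK].")

# Each result is a dict with the predicted token and its score.
for r in results:
    print(r["token_str"], round(r["score"], 3))
```

The pipeline returns a ranked list of candidate tokens with confidence scores, which is a convenient way to sanity-check the model before wiring up the lower-level `AutoModelForMaskedLM` API.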
- Google Colab
- Kaggle
Interview Request: Thoughts on Model Documentation
#33
by evatang - opened
Hi! We are researchers from Carnegie Mellon University conducting a study on generative AI developers' evaluation and documentation practices. Given the popularity and success of your model, we're particularly interested in learning from your team's experiences.
Our study aims to:
- Understand current practices in Gen AI model evaluation and reporting
- Identify challenges faced by developers in these areas
- Explore potential improvements in evaluation and documentation processes
We're seeking participants with hands-on experience in these aspects of Gen AI development. Would any members of your team be interested in participating in a (compensated) interview to share their insights?
For more details about the study, and to express interest, please see our recruitment page: https://forms.gle/fbn4734YxrRg6mkBA