Instructions to use Synthyra/ESMplusplus_large with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
How to use Synthyra/ESMplusplus_large with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="Synthyra/ESMplusplus_large", trust_remote_code=True)
```

```python
# Load model directly
from transformers import AutoModelForMaskedLM

model = AutoModelForMaskedLM.from_pretrained("Synthyra/ESMplusplus_large", trust_remote_code=True, dtype="auto")
```
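As a usage sketch building on the snippets above, here is how one might mask a single residue in a protein sequence and query the model. The mask token string and the example sequence are assumptions; in practice, read the token from `pipe.tokenizer.mask_token`.

```python
# Build a masked input for the fill-mask pipeline shown above.
sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"  # example sequence (assumption)
mask_token = "<mask>"  # assumption: use pipe.tokenizer.mask_token for the real token
masked = sequence[:10] + mask_token + sequence[11:]  # mask the 11th residue
print(masked)

# With the pipeline from the snippet above (downloads the checkpoint on first use):
# for pred in pipe(masked, top_k=3):
#     print(pred["token_str"], pred["score"])
```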
Have you or your group also done similar work for ESM3?
#9 opened by zhanghao520
Thanks for sharing your work on ESM++; making ESMC fully Hugging Face compatible sounds really convenient, especially with the possibility of using PEFT/LoRA for finetuning.
I was wondering, have you or your group also done similar work for ESM3?
Hi @zhanghao520 ,
Glad you are interested in ESM++! We have not built this compatibility for ESM3; I will need to review their license again to see whether that is possible.
Best,
Logan