Instructions to use stabilityai/StableBeluga2 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use stabilityai/StableBeluga2 with Transformers:
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="stabilityai/StableBeluga2")

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("stabilityai/StableBeluga2")
model = AutoModelForCausalLM.from_pretrained("stabilityai/StableBeluga2")
- Notebooks
- Google Colab
- Kaggle
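The Transformers snippet above only loads the model. Stable Beluga 2's model card documents a "### System / ### User / ### Assistant" prompt format; the sketch below builds such a prompt and generates from it. This is a minimal illustration, not the official example: the `RUN_STABLE_BELUGA` environment flag is an assumption used here to gate the expensive model download, and the sampling parameters are arbitrary.

```python
import os

# Helper based on the prompt format documented on the model card.
def build_prompt(system: str, user: str) -> str:
    return f"### System:\n{system}\n\n### User:\n{user}\n\n### Assistant:\n"

prompt = build_prompt(
    "You are Stable Beluga, a helpful AI assistant.",
    "Write a haiku about the ocean.",
)

# Loading the 70B model needs substantial GPU memory, so the heavy part is
# gated behind an assumed environment flag for illustration.
if os.environ.get("RUN_STABLE_BELUGA") == "1":
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("stabilityai/StableBeluga2")
    model = AutoModelForCausalLM.from_pretrained(
        "stabilityai/StableBeluga2", torch_dtype=torch.float16, device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=256, do_sample=True, top_p=0.95)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```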
- Local Apps
- vLLM
How to use stabilityai/StableBeluga2 with vLLM:
Install from pip and serve the model
# Install vLLM from pip:
pip install vllm

# Start the vLLM server:
vllm serve "stabilityai/StableBeluga2"

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/completions" \
    -H "Content-Type: application/json" \
    --data '{
        "model": "stabilityai/StableBeluga2",
        "prompt": "Once upon a time,",
        "max_tokens": 512,
        "temperature": 0.5
    }'

Use Docker
docker model run hf.co/stabilityai/StableBeluga2
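Since the vLLM server above speaks the OpenAI-compatible API, the same request can be made from Python with the `openai` client package. This is a sketch under assumptions: `VLLM_SERVER_UP` is a made-up guard flag used here so the network call only happens when a server is actually running, and `api_key="EMPTY"` is the usual placeholder for a local server with no auth.

```python
import os

# Request payload matching the curl example above.
def build_request(model: str, prompt: str) -> dict:
    return {
        "model": model,
        "prompt": prompt,
        "max_tokens": 512,
        "temperature": 0.5,
    }

request = build_request("stabilityai/StableBeluga2", "Once upon a time,")

# VLLM_SERVER_UP is an assumed guard flag: only call the server if one is
# actually running (`vllm serve "stabilityai/StableBeluga2"`).
if os.environ.get("VLLM_SERVER_UP") == "1":
    from openai import OpenAI  # pip install openai

    client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")
    completion = client.completions.create(**request)
    print(completion.choices[0].text)
```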
- SGLang
How to use stabilityai/StableBeluga2 with SGLang:
Install from pip and serve the model
# Install SGLang from pip:
pip install sglang

# Start the SGLang server:
python3 -m sglang.launch_server \
    --model-path "stabilityai/StableBeluga2" \
    --host 0.0.0.0 \
    --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
    -H "Content-Type: application/json" \
    --data '{
        "model": "stabilityai/StableBeluga2",
        "prompt": "Once upon a time,",
        "max_tokens": 512,
        "temperature": 0.5
    }'

Use Docker images
docker run --gpus all \
    --shm-size 32g \
    -p 30000:30000 \
    -v ~/.cache/huggingface:/root/.cache/huggingface \
    --env "HF_TOKEN=<secret>" \
    --ipc=host \
    lmsysorg/sglang:latest \
    python3 -m sglang.launch_server \
        --model-path "stabilityai/StableBeluga2" \
        --host 0.0.0.0 \
        --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
    -H "Content-Type: application/json" \
    --data '{
        "model": "stabilityai/StableBeluga2",
        "prompt": "Once upon a time,",
        "max_tokens": 512,
        "temperature": 0.5
    }'

- Docker Model Runner
How to use stabilityai/StableBeluga2 with Docker Model Runner:
docker model run hf.co/stabilityai/StableBeluga2
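The curl calls in the vLLM and SGLang sections above hit the same OpenAI-compatible `/v1/completions` route, so they can also be issued from the Python standard library with no extra dependencies. A minimal sketch, assuming an SGLang server on port 30000; `SGLANG_SERVER_UP` is a made-up guard flag so the request is only sent when a server is actually listening.

```python
import json
import os
import urllib.request

# Same payload as the curl examples above; both vLLM (port 8000) and
# SGLang (port 30000) expose the OpenAI-compatible /v1/completions route.
payload = {
    "model": "stabilityai/StableBeluga2",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5,
}
body = json.dumps(payload).encode("utf-8")

# SGLANG_SERVER_UP is an assumed guard flag: only send the request when a
# server is actually listening on localhost:30000.
if os.environ.get("SGLANG_SERVER_UP") == "1":
    req = urllib.request.Request(
        "http://localhost:30000/v1/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["text"])
```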
Community discussions:

- Adding Evaluation Results — #36 opened almost 2 years ago by leaderboard-pr-bot
- Adding `safetensors` variant of this model — #35 opened about 2 years ago by SFconvertbot
- Training tokens (1 comment) — #34 opened about 2 years ago by adminscholaro
- Adding Evaluation Results — #33 opened over 2 years ago by leaderboard-pr-bot
- Evenly distribute the model across GPUs? — #32 opened over 2 years ago by Shiba
- Error after loading in 4bit — #31 opened over 2 years ago by g1hahn
- Minimum VRAM / GPU specs? (6 comments) — #30 opened almost 3 years ago by sociopathic-hamster
- Upload 16 bit precision weights (4 comments) — #29 opened almost 3 years ago by mallorbc
- Highly Restrictive Licensing — #28 opened almost 3 years ago by cognisant
- Can we use it with the GPT4All program? — #27 opened almost 3 years ago by musabka
- How is the model ~275 GB in size? (2 comments) — #26 opened almost 3 years ago by chakibb
- StableBeluga2 70B is finetuned on which model? Llama2 pretrained model or Llama2 Chat model? (1 reaction, 1 comment) — #25 opened almost 3 years ago by kosec39
- Is there integration with LangChain? (1 comment) — #24 opened almost 3 years ago by DesmondChoy
- Disk space full (1 reaction, 2 comments) — #21 opened almost 3 years ago by DanielTTY
- License Change (4 comments) — #20 opened almost 3 years ago by bspence08
- So the name changed... (6 comments) — #19 opened almost 3 years ago by pnb
- It is very good, a GPT-3.5 competitor; share a long conversation example. (4 reactions, 3 comments) — #18 opened almost 3 years ago by AIReach
- Add missing colon in prompt format — #17 opened almost 3 years ago by parkeraddison
- Update README.md — #14 opened almost 3 years ago by eltociear
- Please provide a GGML version, or the necessary components to create it. (2 reactions, 4 comments) — #13 opened almost 3 years ago by rombodawg
- RuntimeError (3 comments) — #12 opened almost 3 years ago by User1232123
- Does it provide function calling? (6 comments) — #11 opened almost 3 years ago by gileneo
- Hardware Requirements? (4 comments) — #10 opened almost 3 years ago by cameronraygun
- Update README.md — #8 opened almost 3 years ago by GageWeike
- Update README.md — #7 opened almost 3 years ago by GageWeike
- Update README.md — #6 opened almost 3 years ago by natesanders
- Dataset (3 reactions) — #4 opened almost 3 years ago by ehartford
- How to finetune this beauty? (6 reactions, 4 comments) — #3 opened almost 3 years ago by mandeepbagga
- Will the dataset be made available? (11 reactions) — #2 opened almost 3 years ago by bratao