Text Classification
PEFT
Safetensors
Transformers
LoRA
QLoRA
multi-label
decoder-only
trl
bitsandbytes
Instructions to use Amirhossein75/LLM-Decoder-Tuning-Text-Classification with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- PEFT
How to use Amirhossein75/LLM-Decoder-Tuning-Text-Classification with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForSequenceClassification

base_model = AutoModelForSequenceClassification.from_pretrained("meta-llama/Llama-3.2-1B")
model = PeftModel.from_pretrained(base_model, "Amirhossein75/LLM-Decoder-Tuning-Text-Classification")
```

- Transformers
How to use Amirhossein75/LLM-Decoder-Tuning-Text-Classification with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="Amirhossein75/LLM-Decoder-Tuning-Text-Classification")
```

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("Amirhossein75/LLM-Decoder-Tuning-Text-Classification", dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
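Since the model is tagged multi-label, its raw logits are typically mapped to labels with an independent sigmoid per class and a threshold, rather than a softmax over classes. A minimal sketch of that post-processing step, with illustrative label names, logit values, and threshold (none of these come from the model itself):

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def multi_label_predict(logits, labels, threshold=0.5):
    """Return every label whose independent sigmoid probability
    clears the threshold (multi-label: classes are not exclusive)."""
    probs = [sigmoid(z) for z in logits]
    return [lab for lab, p in zip(labels, probs) if p >= threshold]

# Hypothetical three-class example: sigmoid(2.0) ≈ 0.88,
# sigmoid(-1.5) ≈ 0.18, sigmoid(0.8) ≈ 0.69.
labels = ["politics", "sports", "tech"]
logits = [2.0, -1.5, 0.8]
print(multi_label_predict(logits, labels))  # ['politics', 'tech']
```

With a real checkpoint, the same function would be applied to each row of the model's output logits, using the label names stored in the model config's `id2label` mapping.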