Dataset used for fine-tuning: stanfordnlp/imdb
How to use NikkeS/imdb-distilbert with Transformers:
# Use a pipeline as a high-level helper
from transformers import pipeline
pipe = pipeline("text-classification", model="NikkeS/imdb-distilbert")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification
tokenizer = AutoTokenizer.from_pretrained("NikkeS/imdb-distilbert")
model = AutoModelForSequenceClassification.from_pretrained("NikkeS/imdb-distilbert")

This model is a fine-tuned version of distilbert-base-uncased on the IMDB movie reviews dataset for binary sentiment classification (positive vs. negative). The model has been trained to classify movie reviews into either positive (1) or negative (0) sentiments.
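A quick way to see what the pipeline returns: it yields a list of dicts containing a label and a score. The exact label strings depend on the id2label mapping stored in this checkpoint's config, so the LABEL_0/LABEL_1 mapping used in the sketch below is an assumption; verify it with pipe.model.config.id2label.

# Sketch: inspect the raw pipeline output. The label names below assume the default
# LABEL_0 / LABEL_1 mapping; check pipe.model.config.id2label for this checkpoint.
result = pipe("This movie was absolutely fantastic!")
print(result)  # e.g. [{'label': 'LABEL_1', 'score': 0.99}]

label_map = {"LABEL_0": "Negative", "LABEL_1": "Positive"}  # assumed mapping
print(label_map.get(result[0]["label"], result[0]["label"]))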
from transformers import AutoModelForSequenceClassification, AutoTokenizer
import torch
# Load the fine-tuned model from Hugging Face Hub
model = AutoModelForSequenceClassification.from_pretrained("your-hf-username/imdb-distilbert")
tokenizer = AutoTokenizer.from_pretrained("your-hf-username/imdb-distilbert")
def predict_sentiment(review):
    inputs = tokenizer(review, return_tensors="pt", truncation=True, padding=True, max_length=256)
    with torch.no_grad():
        logits = model(**inputs).logits
    prediction = torch.argmax(logits, dim=1).item()
    return "Positive" if prediction == 1 else "Negative"
# Example Usage
print(predict_sentiment("This movie was absolutely fantastic!"))
print(predict_sentiment("The acting was terrible, and the story made no sense."))
The model uses the distilbert-base-uncased tokenizer. Training was run with a learning rate of 5e-5, a batch size of 16, and 2 epochs, with mixed precision (fp16=True) for efficiency; a minimal reproduction sketch follows the citation below.

If you use this model, please cite:
@article{salonen2025imdb-distilbert,
title={Fine-tuned DistilBERT for Sentiment Analysis on IMDB Reviews},
author={Nikke Salonen},
year={2025}
}
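For readers who want to reproduce a comparable run, the sketch below sets up fine-tuning with the Hugging Face Trainer using the settings listed above (learning rate 5e-5, batch size 16, 2 epochs, fp16). The original training script is not part of this card, so the dataset split, sequence length, and other TrainingArguments here are assumptions rather than the exact configuration used.

from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          TrainingArguments, Trainer)

# Minimal reproduction sketch; the hyperparameters follow this card, everything else is assumed.
dataset = load_dataset("stanfordnlp/imdb")
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=256)

tokenized = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased", num_labels=2)

args = TrainingArguments(
    output_dir="imdb-distilbert",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=2,
    fp16=True,  # mixed precision, as noted above
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["test"],
)

trainer.train()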
For questions or issues, contact nikke.salonen@gmail.com.
Base model: distilbert/distilbert-base-uncased