vishnun/NLP-KnowledgeGraph
How to use vishnun/knowledge-graph-nlp with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="vishnun/knowledge-graph-nlp")
```

```python
# Load the model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("vishnun/knowledge-graph-nlp")
model = AutoModelForTokenClassification.from_pretrained("vishnun/knowledge-graph-nlp")
```

This model is a fine-tuned version of distilbert-base-uncased on the vishnun/NLP-KnowledgeGraph dataset. It achieves the results reported in the training table below on the evaluation set.
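The pipeline returns per-token predictions, so building graph triples requires grouping consecutive B-/I- tagged tokens into entity spans first. Below is a minimal sketch of that grouping step; the sample predictions and the label names (`SRC`, `REL`, `TGT`) are hypothetical illustrations, not this model's actual label set.

```python
def group_entities(predictions):
    """Merge consecutive B-/I- tagged tokens into contiguous entity spans.

    Each prediction is assumed to be a dict with at least the keys
    "entity" (e.g. "B-SRC") and "word", as returned by a
    token-classification pipeline without aggregation.
    """
    spans = []
    for pred in predictions:
        prefix, _, label = pred["entity"].partition("-")
        if prefix == "I" and spans and spans[-1]["label"] == label:
            # Continuation token: extend the previous span.
            spans[-1]["word"] += " " + pred["word"]
        else:
            # "B-" tag (or label change): start a new span.
            spans.append({"label": label, "word": pred["word"]})
    return spans


# Hypothetical output for "Paris is the capital of France"
sample = [
    {"entity": "B-SRC", "word": "Paris"},
    {"entity": "B-REL", "word": "capital"},
    {"entity": "B-TGT", "word": "France"},
]
print(group_entities(sample))
```

In practice, `pipeline("token-classification", ..., aggregation_strategy="simple")` performs similar span grouping for you; the explicit version above just makes the logic visible.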
Model description: More information needed

Intended uses & limitations: More information needed

Training and evaluation data: More information needed
Training results per epoch (the hyperparameter list is not included in this card):
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| 0.2908 | 1.0 | 2316 | 0.2461 | 0.8455 | 0.8023 | 0.8234 | 0.9167 |
| 0.1973 | 2.0 | 4632 | 0.2000 | 0.8745 | 0.8446 | 0.8593 | 0.9341 |
| 0.1593 | 3.0 | 6948 | 0.1863 | 0.8973 | 0.8632 | 0.8799 | 0.9427 |
| 0.1336 | 4.0 | 9264 | 0.1830 | 0.8988 | 0.8715 | 0.8849 | 0.9453 |
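As a quick consistency check on the table, the F1 column is the harmonic mean of the precision and recall columns. For the final epoch:

```python
# F1 is the harmonic mean of precision and recall.
def f1(precision, recall):
    return 2 * precision * recall / (precision + recall)

# Final-epoch row of the table above: precision 0.8988, recall 0.8715
print(round(f1(0.8988, 0.8715), 4))  # -> 0.8849
```

Earlier rows may differ in the last decimal place because the tabulated precision and recall values are themselves rounded.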
Base model: distilbert/distilbert-base-uncased