DistilBERT Base Uncased

Downloads (Hugging Face): 14.4M
Likes: 782
Context length: 512 tokens
License: Apache 2.0
Updated: 11/3/2025
Author: distilbert
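
The 512-token context length listed above is a hard limit at inference time: longer inputs must be truncated or split before they reach the model. A minimal sketch of handling this at tokenization time, assuming the Hugging Face transformers library is installed (distilbert-base-uncased is the standard Hub identifier for this checkpoint, and the input text is hypothetical filler):

    # Minimal sketch: enforce the 512-token context window at tokenization time.
    # Assumes `pip install transformers`; the repeated string is just filler
    # long enough to exceed the limit.
    from transformers import AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

    # `truncation=True` with `max_length=512` clips over-long inputs to the
    # model's maximum sequence length.
    enc = tokenizer("a very long document " * 200, truncation=True, max_length=512)
    print(len(enc["input_ids"]))  # 512

For documents that should not lose their tails, the usual alternative is chunking the text into overlapping 512-token windows and running the model on each chunk.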

DistilBERT is a smaller, faster, cheaper, and lighter version of BERT, produced by knowledge distillation during pretraining. This uncased English checkpoint was pretrained on the BookCorpus and English Wikipedia datasets with a masked-language-modeling objective, and is intended as a general-purpose base for fine-tuning on downstream natural language processing tasks. The model is licensed under Apache 2.0 and is tagged with exbert.
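
Because the checkpoint was pretrained with masked language modeling, the quickest way to exercise it is the fill-mask task. A minimal sketch, assuming transformers plus a backend such as PyTorch are installed:

    # Minimal sketch: top predictions for a masked token using
    # distilbert-base-uncased via the transformers pipeline API.
    from transformers import pipeline

    fill_mask = pipeline("fill-mask", model="distilbert-base-uncased")

    # [MASK] is this tokenizer's mask token.
    for pred in fill_mask("Paris is the [MASK] of France."):
        print(f"{pred['token_str']!r}  score={pred['score']:.3f}")

Each prediction is a dict containing the filled-in token and its score; fine-tuning for classification or question answering would instead go through the AutoModelFor* classes.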

Quick Info

Type: Language Model
Released: 3/2/2022
Framework: Other
