HuggingFace's Transformers: State-of-the-art Natural Language Processing
Abstract
Recent progress in natural language processing has been driven by advances in both model architecture and model pretraining. Transformer architectures have facilitated building higher-capacity models, and pretraining has made it possible to effectively utilize this capacity for a wide variety of tasks. \textit{Transformers} is an open-source library with the goal of opening up these advances to the wider machine learning community. The library consists of carefully engineered state-of-the-art Transformer architectures under a unified API. Backing this library is a curated collection of pretrained models made by and available for the community. \textit{Transformers} is designed to be extensible by researchers, simple for practitioners, and fast and robust in industrial deployments. The library is available at \url{https://github.com/huggingface/transformers}.
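The unified API mentioned in the abstract can be illustrated with a minimal sketch. This assumes the `transformers` and `torch` packages are installed; the checkpoint name below is one example from the public model hub, and any other hub checkpoint could be substituted under the same `Auto*` classes:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# One example checkpoint from the community model hub; the same Auto* API
# works for any architecture/checkpoint name it hosts.
name = "distilbert-base-uncased-finetuned-sst-2-english"

# Load the pretrained tokenizer and model behind a single, unified interface.
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)

# Tokenize a sentence and run a forward pass; logits have one entry per class.
inputs = tokenizer("Transformers provides a unified API.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)
```

The `AutoTokenizer`/`AutoModel` pattern is what lets practitioners swap architectures by changing only the checkpoint string, rather than rewriting model-specific code.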
- Publication: arXiv e-prints
- Pub Date: October 2019
- arXiv: arXiv:1910.03771
- Bibcode: 2019arXiv191003771W
- Keywords: Computer Science - Computation and Language
- E-Print: 8 pages, 4 figures, more details at https://github.com/huggingface/transformers