<img align="right" width="150" height="150" src="doc/cat-delft-small.jpg">

# DeLFT
DeLFT (Deep Learning Framework for Text) is a Keras and TensorFlow framework for text processing, focusing on sequence labeling (e.g. named entity tagging, information extraction) and text classification (e.g. comment classification). This library re-implements standard state-of-the-art Deep Learning architectures relevant to text processing tasks.
DeLFT has three main purposes:

- Covering text and rich texts: most existing Deep Learning work in NLP considers only simple text as input. In addition to simple text, we also target rich text, where tokens are associated with layout information (font, style, etc.), positions in structured documents, and possibly other lexical or symbolic contextual information. Text usually comes from large documents like PDF or HTML, not just from segments like sentences or paragraphs, and contextual features prove very useful. Rich text is the most common textual content used by humans to communicate and work.

- Reproducibility and benchmarking: by implementing several reference/state-of-the-art models for both sequence labeling and text classification tasks, we want to offer the capacity to easily validate reported results and to benchmark several methods under the same conditions and criteria.

- Production level: by offering optimized performance, robustness and integration possibilities, we aim to support better engineering decisions/trade-offs and successful production-level applications.
Some contributions include:

- A variety of modern NLP architectures and tasks usable through the same API and input formats, including RNN, ELMo and transformer models (see the usage sketch below).

- Reduction of the size of RNN models, in particular by removing word embeddings from them. For instance, the model for the toxic comment classifier went down from a size of 230 MB with embeddings to 1.8 MB. In practice, all DeLFT models are smaller than 2 MB, except the Ontonotes 5.0 NER model, which is 4.7 MB.

- Generic support for categorical features, available in various architectures.

- Use of a dynamic data generator, so that the training data does not need to fit entirely in memory.

- Efficient loading and management of an unlimited volume of static pre-trained embeddings.

- A comprehensive evaluation framework with the standard metrics for sequence labeling and classification tasks, including n-fold cross-validation.

- Integration of HuggingFace transformers as Keras layers.
A native Java integration of the library has been realized in GROBID via JEP.
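As an illustration of the common API mentioned above, here is a minimal sketch of the sequence labelling interface. The class and method names (`Sequence`, `train`, `tag`) follow the DeLFT documentation, but exact signatures and data formats should be checked there; the toy data below is purely illustrative.

```python
# A minimal, hedged sketch of DeLFT's sequence labelling API.
# Class and method names are assumed from the documentation; check the
# installed version for exact signatures. The data is a toy placeholder.
from delft.sequenceLabelling import Sequence

# toy training data: token sequences and their label sequences
x_train = [["DeLFT", "is", "developed", "by", "Patrice", "Lopez", "."]]
y_train = [["O", "O", "O", "O", "B-PER", "I-PER", "O"]]

# instantiate a model under an arbitrary name, using the default architecture
model = Sequence("toy-ner")

# train on the toy data (a real run would use proper training/validation sets)
model.train(x_train, y_train)

# label new input; depending on the version, input may be raw strings or
# pre-tokenized sequences, and "json" is the assumed output format
print(model.tag(["DeLFT labels sequences."], "json"))
```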
The latest DeLFT release, 0.3.4, has been tested successfully with Python 3.8 and TensorFlow 2.9.3. As always, GPU(s) are required for decent training time. For example, a GeForce GTX 1050 Ti (4 GB) works very well for running RNN models and BERT or RoBERTa base models. BERT large models are no problem with a GeForce GTX 1080 Ti (11 GB), including training with a modest batch size. Using multiple GPUs (for both training and inference) is supported.
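Before training, it can be useful to check that TensorFlow actually sees the GPU(s); this uses the standard TensorFlow 2.x API and is independent of DeLFT:

```python
# List the GPUs visible to TensorFlow 2.x before launching a training run.
import tensorflow as tf

print(tf.config.list_physical_devices("GPU"))
```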
## DeLFT Documentation
Visit the DeLFT documentation for detailed information on installation, usage and models.
## Using DeLFT
PyPI packages are available for stable versions. The latest stable version is 0.3.4:

```sh
python3 -m pip install delft==0.3.4
```
## DeLFT Installation
To install DeLFT and use the current master version, clone the GitHub repository:
```sh
git clone https://github.com/kermitt2/delft
cd delft
```
It is advised to first set up a virtual environment to avoid falling into one of those gloomy Python dependency marshlands:
```sh
virtualenv --system-site-packages -p python3.8 env
source env/bin/activate
```
Install the dependencies:

```sh
python3 -m pip install -r requirements.txt
```
Finally, install the project, preferably in editable mode:

```sh
python3 -m pip install -e .
```
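As an optional sanity check (not part of the official installation steps), you can verify that the package is visible from the environment:

```sh
# confirm the installed package metadata and that the module imports cleanly
python3 -m pip show delft
python3 -c "import delft"
```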
See the DeLFT documentation for usage.
## License and contact
Distributed under the Apache 2.0 license. The dependencies used in the project are either themselves also distributed under the Apache 2.0 license or under a compatible license.
If you contribute to DeLFT, you agree to share your contribution following these licenses.
Contact: Patrice Lopez (patrice.lopez@science-miner.com) and Luca Foppiano (@lfoppiano).
## How to cite
If you want to cite this work, please refer to the present GitHub project, together with the Software Heritage project-level permanent identifier. For example, with BibTeX:
```bibtex
@misc{DeLFT,
    title = {DeLFT},
    howpublished = {\url{https://github.com/kermitt2/delft}},
    publisher = {GitHub},
    year = {2018--2024},
    archivePrefix = {swh},
    eprint = {1:dir:54eb292e1c0af764e27dd179596f64679e44d06e}
}
```