Domain Adaptation with SentenceTransformers (SBERT): the goal is to adapt text embedding models to your specific text domain.
Easy theory and Python code in a Jupyter notebook / Colab. Python. SBERT.
BERT. Transformers. HuggingFace.
Discover how text embedding models can be adapted to your specific domain.
As requested by my viewers (poll in the community tab two days ago).
00:00 Neural Search
03:09 Domain Adaptation
06:52 Adaptive Pre-training
10:26 Python code in Jupyter NB
15:20 Outlook Improvements
#nlproc
#SentenceTransformers
#python
#pythonprogramming
#pytorch
#colab
#ai
#deeplearning
#machinelearningwithpython
#sbert
#bert
#domain
#adaptation
All Credits to:
https://sbert.net/
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = nov,
year = "2019",
publisher = "Association for Computational Linguistics",
url = "http://arxiv.org/abs/1908.10084",
}
@inproceedings{wang-2021-TSDAE,
title = "TSDAE: Using Transformer-based Sequential Denoising Auto-Encoder for Unsupervised Sentence Embedding Learning",
author = "Wang, Kexin and Reimers, Nils and Gurevych, Iryna",
booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2021",
month = nov,
year = "2021",
address = "Punta Cana, Dominican Republic",
publisher = "Association for Computational Linguistics",
pages = "671--688",
url = "https://arxiv.org/abs/2104.06979",
}