Working with transformer-based embeddings for text similarity matching
Corporate Author:
Other Authors:
Format: Video
Language: English
Published: [Place of publication not identified] : Manning Publications, 2021.
Edition: [First edition]
Subjects:
View at Biblioteca Universitat Ramon Llull: https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009823037006719
Summary: Embeddings from transformer models such as BERT can be used as representations of sentences. In this session, Matteus Tanha works with these embeddings to match similar sentences or paragraphs by exploring a few different distance metrics. The focus is on the application of transformer models, but he also goes through the pre-processing steps needed to extract good-quality natural language text. (See the illustrative sketch after the record below.)
Physical Description: 1 online resource (1 video file (22 min.)) : sound, color
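The summary above describes the general technique the session covers: encode sentences with a transformer model and compare the resulting vectors with a distance metric. The sketch below is a minimal illustration of that idea, not code from the video itself; the model name (bert-base-uncased), the mean-pooling step, the cosine-similarity metric, and the example sentences are all assumptions made here for demonstration.

```python
# Minimal sketch: BERT-style sentence embeddings compared with cosine similarity.
# Assumptions: bert-base-uncased as the model, mean pooling over token vectors.
import torch
from transformers import AutoTokenizer, AutoModel

MODEL_NAME = "bert-base-uncased"  # assumed model; the session may use a different one
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

def embed(sentences):
    """Mean-pool the last hidden states into one vector per sentence."""
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state       # (batch, tokens, dim)
    mask = batch["attention_mask"].unsqueeze(-1).float()  # ignore padding tokens
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

sentences = [
    "Transformer embeddings can represent whole sentences.",
    "Sentence vectors from BERT capture meaning.",
    "The weather is nice today.",
]
vectors = embed(sentences)

# Cosine similarity: a higher score means the sentences are more alike.
sims = torch.nn.functional.cosine_similarity(vectors[0], vectors[1:], dim=-1)
print(sims)  # the related pair should score higher than the unrelated third sentence
```

Other distance metrics (for example Euclidean distance or dot product) can be swapped in at the comparison step; which one works best depends on the model and the matching task.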