Natural language processing using transformer architectures


Bibliographic Details
Other Authors: Géron, Aurélien, on-screen presenter
Format: Online video
Language: English
Published: [Place of publication not identified] : O'Reilly Media, 2020.
Subjects:
View at Biblioteca Universitat Ramon Llull: https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009822840206719
Description
Summary: "Whether you need to automatically judge the sentiment of a user review, summarize long documents, translate text, or build a chatbot, you need the best language model available. In 2018, pretty much every NLP benchmark was crushed by novel transformer-based architectures, replacing long-standing architectures based on recurrent neural networks. In short, if you're into NLP, you need transformers. But to use transformers, you need to know what they are, what transformer-based architectures look like, and how you can implement them in your projects. Aurélien Géron (Kiwisoft) dives into recurrent neural networks and their limits, the invention of the transformer, attention mechanisms, the transformer architecture, subword tokenization using SentencePiece, self-supervised pretraining--learning from huge corpora, one-size-fits-all language models, BERT and GPT 2, and how to use these language models in your projects using TensorFlow."--Resource description page.
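The summary names attention mechanisms as the core of the transformer architecture. As a rough illustration (not taken from the video itself), the scaled dot-product self-attention at the heart of these models can be sketched in a few lines of NumPy; the toy dimensions and random input are assumptions for the example:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # numerically stable row-wise softmax over the keys
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output row is a weighted average of value rows

# toy self-attention: 3 tokens, embedding size 4, Q = K = V = X
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (3, 4): one output vector per token
```

Because the softmax weights in each row sum to 1, every output row is a convex combination of the value rows; real transformer layers add learned projections for Q, K, and V and run several such heads in parallel.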
Notes: Title from resource description page (viewed July 22, 2020).
Physical Description: 1 online resource (1 streaming video file (45 min., 24 sec.)) : digital, sound, color