
Natural Language Processing with Transformers, Revised Edition
Read by: Tom Beyer
Release: 07/15/2025
Runtime: 13h 7m
Unabridged
Since their introduction in 2017, transformers have quickly become the dominant architecture for achieving state-of-the-art results on a variety of natural language processing tasks. If you're a data scientist or coder, this practical book shows you how to train and scale these large models using Hugging Face Transformers, a Python-based deep learning library.
Transformers have been used to write realistic news stories, improve Google Search queries, and even create chatbots that tell corny jokes. In this guide, Lewis Tunstall, Leandro von Werra, and Thomas Wolf use a hands-on approach to teach you how transformers work and how to integrate them in your applications. You'll quickly learn a variety of tasks they can help you solve.
● Build, debug, and optimize transformer models for core NLP tasks, such as text classification, named entity recognition, and question answering
● Learn how transformers can be used for cross-lingual transfer learning
● Apply transformers in real-world scenarios where labeled data is scarce
● Make transformer models efficient for deployment
● Train transformers from scratch and learn how to scale to multiple GPUs and distributed environments
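As a taste of the hands-on approach the book takes with the Hugging Face Transformers library, here is a minimal sketch (not taken from the book, and assuming the transformers and torch packages are installed) that runs the first task listed above, text classification, through the library's pipeline API:

# Minimal sketch of the Hugging Face Transformers pipeline API,
# assuming `pip install transformers torch` has been run.
from transformers import pipeline

# Download a default pretrained model for text classification
# and run it on a sample sentence.
classifier = pipeline("text-classification")
result = classifier("Transformers are remarkably versatile for NLP tasks.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]

The same pipeline call accepts other task names (for example "ner" or "question-answering"), which is the pattern the book builds on before moving to training and scaling models from scratch.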
Format: Audio
Language: English
Publisher: Ascent Audio
ISBN: 9781663753106, 9798228511842, 9798228511859
Weight: 0.0 lb, 1.15 lb, 0.55 lb
