---
layout: home
search_exclude: true
image: images/logo.png
---
![Natural Language Processing with Transformers book cover]({{site.baseurl}}/images/book_cover.jpg "https://learning.oreilly.com/library/view/natural-language-processing/9781098103231/")
> "The preeminent book for the preeminent transformers library—a model of clarity!"
—Jeremy Howard, cofounder of fast.ai and professor at University of Queensland
> “A wonderfully clear and incisive guide to modern NLP's most essential library. Recommended!”
—Christopher Manning, Thomas M. Siebel Professor in Machine Learning, Stanford University
<a class="btn btn-secondary" href="https://www.amazon.com/Natural-Language-Processing-Transformers-Revised/dp/1098136799/ref=sr_1_2?keywords=natural+language+processing+with+transformers&qid=1655452281&s=books&sprefix=natural+lan%2Cstripbooks-intl-ship%2C266&sr=1-2">Buy the book on Amazon</a>
<a class="btn btn-secondary" href="https://learning.oreilly.com/library/view/natural-language-processing/9781098136789/">Read the book online at O'Reilly</a>
<a class="btn btn-secondary" href="https://github.com/nlp-with-transformers/notebooks">Download the book's code</a>
Since their introduction in 2017, transformers have quickly become the dominant architecture for achieving state-of-the-art results on a variety of natural language processing tasks. If you're a data scientist or coder, this practical book shows you how to train and scale these large models using Hugging Face Transformers, a Python-based deep learning library.
Transformers have been used to write realistic news stories, improve Google Search queries, and even create chatbots that tell corny jokes. In this guide, authors [Lewis Tunstall](https://twitter.com/_lewtun), [Leandro von Werra](https://twitter.com/lvwerra), and [Thomas Wolf](https://twitter.com/Thom_Wolf) use a hands-on approach to teach you how transformers work and how to integrate them into your applications. You'll quickly learn a variety of tasks they can help you solve.
* Build, debug, and optimize transformer models for core NLP tasks, such as text classification, named entity recognition, and question answering
* Learn how transformers can be used for cross-lingual transfer learning
* Apply transformers in real-world scenarios where labeled data is scarce
* Make transformer models efficient for deployment using techniques such as distillation, pruning, and quantization
* Train transformers from scratch and learn how to scale to multiple GPUs and distributed environments
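
To give a flavor of the library the book teaches, here is a minimal sketch (not an excerpt from the book) of the Hugging Face Transformers `pipeline` API applied to the first task in the list above, text classification; the example text is illustrative, and the default checkpoint for the task is downloaded automatically on first use.

```python
# Minimal sketch: text classification with the Hugging Face Transformers pipeline API.
# The pipeline downloads a default pretrained checkpoint for the task on first call.
from transformers import pipeline

classifier = pipeline("text-classification")
result = classifier("Transformers are remarkably versatile for NLP tasks!")
print(result)
# Expected shape of the output: [{'label': 'POSITIVE', 'score': 0.99...}]
```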
# News 🗞️
### January 31, 2022
Lewis will be joining [Abhishek Thakur](https://twitter.com/abhi1thakur) to talk about the book and various techniques you can use to optimize Transformer models for production environments. We'll also be giving away 5 copies of the book -- join the event [here](https://www.youtube.com/watch?v=5E0nlHWgMMU)!
### June 17, 2022
Due to the popularity of the book, O'Reilly has decided to print it in **full color** from now on in the [revised edition](https://www.amazon.com/Natural-Language-Processing-Transformers-Revised/dp/1098136799/ref=sr_1_2?keywords=natural+language+processing+with+transformers&qid=1655452281&s=books&sprefix=natural+lan%2Cstripbooks-intl-ship%2C266&sr=1-2) 🥳. Thank you to everyone who helped make this happen!
### June 29, 2022
Lewis will be presenting at Munich NLP to talk about the book and various techniques you can use to optimize Transformer models for production environments. We'll also be giving away 3 electronic copies of the book -- join the event [here](https://discord.com/invite/EFZgxYRfTN?event=987395866807648286)!
# Contact
For questions, comments, or requests to interview the authors, please send an email to [[email protected]](mailto:[email protected]).
To report errata in the book, please use the [O'Reilly platform](https://www.oreilly.com/catalog/errata.csp?isbn=0636920512424).