Create 2024-10-19-Understanding-Language-:-Transformers.md
Showing 1 changed file with 3 additions and 1 deletion.
@@ -1,3 +1,5 @@
 ## Introduction

-After some time away, we are back at it! This time, we are going to understand the idea of modelling human language through the framework of Machine Learning. More specifically, the most widely known and the most commercialized method of application for this task is the Large Language Model. This is basically a term for a machine learning model having many parameters that attempts to model human language. The way
+After some time away, we are back! This time, we are going to look at how human language can be modelled within the framework of Machine Learning. More specifically, the most widely known and most heavily commercialized approach to this task is the Large Language Model (LLM): a term for a machine learning model with a very large number of parameters that attempts to model human language. Today, the architecture used to implement this is the "Transformer", which is described and explained in its entirety in the paper titled "Attention Is All You Need" by Vaswani et al.
+
+
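Since the post builds on the Transformer architecture from "Attention Is All You Need", here is a minimal sketch of the scaled dot-product attention at its core, written with NumPy. The function name, toy shapes, and random inputs are illustrative assumptions, not code from the post or the paper.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    # Similarity scores between every query and every key, scaled by sqrt(d_k).
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_k)
    # Numerically stable softmax over the key dimension gives attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output vector is a weighted average of the value vectors.
    return weights @ V

# Toy example: batch of 1, sequence of 4 tokens, model dimension 8.
rng = np.random.default_rng(0)
Q = rng.normal(size=(1, 4, 8))
K = rng.normal(size=(1, 4, 8))
V = rng.normal(size=(1, 4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (1, 4, 8)
```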