From f32049207d92aeb30b96587caf724e7247df5227 Mon Sep 17 00:00:00 2001
From: Tushar Chaturvedi <37372851+tushar-c@users.noreply.github.com>
Date: Fri, 1 Nov 2024 23:08:57 +0530
Subject: [PATCH] Update 2024-10-30-Understanding-Transformers-Part-2-:-Building-Blocks.md

---
 ...-10-30-Understanding-Transformers-Part-2-:-Building-Blocks.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/_posts/2024-10-30-Understanding-Transformers-Part-2-:-Building-Blocks.md b/_posts/2024-10-30-Understanding-Transformers-Part-2-:-Building-Blocks.md
index bcadb269bedec..924f311bbfc82 100644
--- a/_posts/2024-10-30-Understanding-Transformers-Part-2-:-Building-Blocks.md
+++ b/_posts/2024-10-30-Understanding-Transformers-Part-2-:-Building-Blocks.md
@@ -225,6 +225,7 @@ Now, since our progression for understanding and implementing the Transformers i
 
 ## Positional Encoding
 
+As can be seen in the diagram, the next block after the Inputs and the Embedding layer is the `Positional Encoding` layer. Fortunately, it is quite simple to understand.
 
 ## Conclusion
 
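For context on the layer this patch introduces: below is a minimal NumPy sketch of the sinusoidal positional encoding defined in the original Transformer paper ("Attention Is All You Need"), the standard scheme a walkthrough like this builds toward. The function name and the `max_len`/`d_model` parameters are illustrative, not taken from the post itself.

```python
import numpy as np

def sinusoidal_positional_encoding(max_len: int, d_model: int) -> np.ndarray:
    """Return a (max_len, d_model) matrix of sinusoidal position encodings.

    PE(pos, 2i)   = sin(pos / 10000^(2i / d_model))
    PE(pos, 2i+1) = cos(pos / 10000^(2i / d_model))
    """
    assert d_model % 2 == 0, "this sketch assumes an even model dimension"
    positions = np.arange(max_len)[:, np.newaxis]        # shape (max_len, 1)
    even_dims = np.arange(0, d_model, 2)[np.newaxis, :]  # shape (1, d_model/2)
    angles = positions / np.power(10000.0, even_dims / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)  # sine on even dimensions
    pe[:, 1::2] = np.cos(angles)  # cosine on odd dimensions
    return pe

# The encoding has the same shape as the embedded sequence,
# so it is simply added element-wise to the token embeddings.
pe = sinusoidal_positional_encoding(max_len=50, d_model=512)
print(pe.shape)  # (50, 512)
```

The wavelengths form a geometric progression across dimensions, so every position gets a unique, smoothly varying fingerprint that the attention layers can learn to use.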