Transformers 🤗 have become the front-runners of our everyday applications, powering products from Google and Twitter to Zoom, Uber, and many more! They have almost become the default approach to building features, products, workflows, and technology! Transformers were first introduced in the Attention Is All You Need paper as a tool for sequence transduction: converting one sequence of symbols into another. One of the most popular applications at the time was translation! Nowadays, however, they are not only at the frontier of NLP but have also worked their magic in the vision 👀 area! So let's dive in 🏊‍♀️! So much fun ahead! 🥳
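Just to whet your appetite 😋, here's a minimal sketch of how little code translation takes with the 🤗 Transformers library today (assuming it's installed; when no model is specified, the pipeline falls back to a default English-to-French translation model):

```python
from transformers import pipeline

# Build a ready-made translation pipeline. With no model given,
# 🤗 Transformers picks a default English-to-French model for us.
translator = pipeline("translation_en_to_fr")

# Translate a sentence and print the result.
result = translator("Attention is all you need!")
print(result[0]["translation_text"])
```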

Let's get started... 🚀

1) High-Level Architecture

2) Mapping Words to Vectors

3) Inner Product

4) Attention Mechanism

5) Self-Attention Mechanism in Detail

6) Sequence-to-Sequence Models

7) Ways Attention Can Be Manifested 🥸