What it takes to learn transformer models for natural language processing! - Part II

  • joydeepml2020
  • Jun 3, 2021
  • 1 min read

In my last post, we discussed simple encoder-decoder based seq2seq models.

In this article, we will try to understand the shortcomings of the simple encoder-decoder model and the solution brought by attention models.
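As a preview of the attention idea, here is a minimal sketch of dot-product attention in NumPy. The scoring function, function names, and toy data are illustrative assumptions, not the specific model from this series: the decoder state is scored against every encoder state, and a softmax over those scores weights the encoder states into a context vector.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_context(decoder_state, encoder_states):
    """Dot-product attention (illustrative): score each encoder state
    against the current decoder state, then return the softmax-weighted
    average of the encoder states as the context vector."""
    scores = encoder_states @ decoder_state      # shape (T,)
    weights = softmax(scores)                    # attention distribution over time steps
    context = weights @ encoder_states           # shape (hidden,)
    return context, weights

# toy example: 4 encoder time steps, hidden size 3
enc = np.array([[1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0],
                [0.0, 0.0, 1.0],
                [1.0, 1.0, 0.0]])
dec = np.array([1.0, 0.0, 0.0])
context, weights = attention_context(dec, enc)
```

Unlike the fixed context vector of a plain encoder-decoder, this context is recomputed at every decoding step, so the decoder can focus on different input positions as it generates each output token.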
