Abstract
Natural Language Processing (NLP) plays a vital role in everyday life. Deep learning models for NLP simplify many tasks by enabling computers to process, generate, and respond to human language. Applications of NLP models can be seen in many domains, notably machine translation and psychology. This paper briefly reviews different transformer architectures and the advantages of encoder-decoder models for language translation. The article focuses on the need for sequence-to-sequence language-translation models, and discusses transformer-based language models such as BERT, RoBERTa, and XLNet, along with their components.