Abstract

This paper proposes a new, simple network architecture, the Transformer, based solely on attention mechanisms.

Introduction

Recurrent neural networks (RNNs)

have been firmly established as the basis of sequence modeling and transduction problems (e.g., language modeling and machine translation); their computation proceeds step by step over the sequence, as sketched below.
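A minimal sketch of that sequential recurrence, assuming a vanilla tanh RNN cell and toy dimensions (not from the paper): each hidden state depends on the previous one, so time steps cannot be processed in parallel.

```python
import numpy as np

# Minimal vanilla RNN cell: h_t = tanh(W_x x_t + W_h h_{t-1} + b)
# Shows that each step depends on the previous hidden state,
# so the sequence must be processed one position at a time.
def rnn_forward(x_seq, W_x, W_h, b):
    hidden_size = W_h.shape[0]
    h = np.zeros(hidden_size)      # initial hidden state h_0
    hs = []
    for x_t in x_seq:              # sequential loop: no parallelism over time
        h = np.tanh(W_x @ x_t + W_h @ h + b)
        hs.append(h)
    return np.stack(hs)            # (seq_len, hidden_size)

# Toy usage: 5 tokens with 8-dim embeddings, 16-dim hidden state (assumed sizes)
rng = np.random.default_rng(0)
x_seq = rng.normal(size=(5, 8))
W_x = rng.normal(size=(16, 8)) * 0.1
W_h = rng.normal(size=(16, 16)) * 0.1
b = np.zeros(16)
print(rnn_forward(x_seq, W_x, W_h, b).shape)  # (5, 16)
```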

Attention mechanisms