paper · finished · 📌 pinned

Attention Is All You Need

Vaswani et al.

★★★★★#AI
The turning point that solved Seq2Seq with self-attention alone. This single paper opened the LLM era.

📅 2026-02-25

#transformer#nlp

Meta

Authors
Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, Illia Polosukhin
Venue
NeurIPS
Year
2017
arXiv
1706.03762

Quotes

  • We propose a new simple network architecture, the Transformer, based solely on attention mechanisms

    p.1
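The quoted claim — an architecture "based solely on attention mechanisms" — rests on scaled dot-product attention, where each token attends to every other via softmax(QKᵀ/√d_k)V. A minimal NumPy sketch of that core operation (variable names and the toy shapes are illustrative, not from the paper's code):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # (n_q, n_k) similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ V                                # weighted sum of value vectors

# toy self-attention: 3 tokens, model width 4, so Q = K = V = X
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (3, 4)
```

In the full Transformer, Q, K, and V are separate learned projections of the input, and several such attention "heads" run in parallel; the scaling by √d_k keeps the softmax from saturating as the key dimension grows.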
