HAZEL

[NLP Paper Review] ALBERT: A Lite BERT for Self-supervised Learning of Language Representations (Paper Review / ALBERT)



Rmsid01 2021. 5. 9. 11:02

This post contains only the slides (PPT) from a presentation given at an NLP paper study group.

- An explanatory write-up will be added later.

 

arxiv.org/abs/1909.11942

 


 

Paper presentation slides (PPT)