Category: 논문리뷰 / Paper Reviews (4)
HAZEL
ArcFace: Additive Angular Margin Loss for Deep Face Recognition (https://arxiv.org/abs/1801.07698). Paper presentation PPT. Abstract excerpt: "One of the main challenges in feature learning using Deep Convolutional Neural Networks (DCNNs) for large-scale face recognition is the design of appropriate loss functions that enhance discriminative power. Centre loss penalises the distance between the d…"
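Since the post is slides-only, a minimal PyTorch sketch of the paper's additive angular margin idea may help as a quick reference. Everything below is illustrative rather than code from the slides: the class name ArcFaceHead is hypothetical, and scale s=64 and margin m=0.5 are assumed defaults.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ArcFaceHead(nn.Module):
    """Minimal sketch of an additive angular margin (ArcFace-style) head.

    Target-class logits become s * cos(theta + m); all other logits stay
    s * cos(theta). The result is fed into ordinary cross-entropy.
    """
    def __init__(self, embed_dim, num_classes, scale=64.0, margin=0.5):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(num_classes, embed_dim))
        self.scale = scale    # s: radius of the hypersphere
        self.margin = margin  # m: additive angular margin (radians)

    def forward(self, embeddings, labels):
        # Cosine similarity between L2-normalised features and class weights.
        cosine = F.linear(F.normalize(embeddings), F.normalize(self.weight))
        theta = torch.acos(cosine.clamp(-1 + 1e-7, 1 - 1e-7))
        # Add the margin only on the target-class angle.
        target = F.one_hot(labels, cosine.size(1)).bool()
        logits = torch.where(target, torch.cos(theta + self.margin), cosine)
        return F.cross_entropy(self.scale * logits, labels)
```

The key design choice is that the margin m is added to the angle itself rather than to the cosine, which is what makes the penalty additive in angular space and gives the method its geometric interpretation on the hypersphere.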
Presented at the NLP paper study group; this post contains only the PPT slides. (An explanatory write-up will be added later.) ALBERT: A Lite BERT for Self-supervised Learning of Language Representations (arxiv.org/abs/1909.11942). Abstract excerpt: "Increasing model size when pretraining natural language representations often results in improved performance on downstream tasks. However, at some point further model increases become harder due to GPU/TPU memory limitations and longer…"
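For context on what the slides cover: ALBERT's parameter reduction comes from factorized embedding parameterization (a small embedding size E projected up to the hidden size H) and cross-layer parameter sharing (one transformer layer reused at every depth). A minimal sketch of both ideas; the class name AlbertStyleEncoder and the sizes are illustrative assumptions, not ALBERT's actual configuration.

```python
import torch
import torch.nn as nn

class AlbertStyleEncoder(nn.Module):
    """Sketch of ALBERT's two parameter-reduction ideas:
    (1) factorized embeddings: vocab -> small E, then project E -> H;
    (2) cross-layer sharing: one transformer layer applied N times.
    """
    def __init__(self, vocab_size=30000, embed_dim=128, hidden_dim=768,
                 num_layers=12, num_heads=12):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)  # V x E, not V x H
        self.proj = nn.Linear(embed_dim, hidden_dim)      # E -> H projection
        # A single layer whose weights are reused at every depth.
        self.shared_layer = nn.TransformerEncoderLayer(
            d_model=hidden_dim, nhead=num_heads, batch_first=True)
        self.num_layers = num_layers

    def forward(self, token_ids):
        x = self.proj(self.embed(token_ids))
        for _ in range(self.num_layers):  # same weights every iteration
            x = self.shared_layer(x)
        return x

model = AlbertStyleEncoder()
tokens = torch.randint(0, 30000, (2, 16))  # (batch, seq_len)
hidden = model(tokens)                     # -> (2, 16, 768)
```

Because the embedding table is V x E instead of V x H, and the encoder stores one layer's weights instead of N, the parameter count drops sharply while the depth of computation stays the same.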
Presented at the NLP paper study group; this post contains only the PPT slides. (An explanatory write-up will be added later.) Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context (arxiv.org/abs/1901.02860). Abstract excerpt: "Transformers have a potential of learning longer-term dependency, but are limited by a fixed-length context in the setting of language modeling. We propose a novel neural architecture Transformer-XL that enables learning dependency beyond a…"
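The central mechanism here is segment-level recurrence: hidden states computed for the previous segment are cached with gradients stopped, then prepended as extra attention context for the current segment. A simplified sketch of that caching step follows; the class name and shapes are illustrative assumptions, and the paper's relative positional encodings (and its caching of layer inputs rather than outputs) are omitted for brevity.

```python
import torch
import torch.nn as nn

class RecurrentSegmentAttention(nn.Module):
    """Simplified sketch of segment-level recurrence: attend over
    [cached previous segment; current segment], then cache new states."""
    def __init__(self, hidden_dim=512, num_heads=8):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads,
                                          batch_first=True)

    def forward(self, x, memory=None):
        # x: (batch, seg_len, hidden); memory: states from the previous segment.
        context = x if memory is None else torch.cat([memory, x], dim=1)
        out, _ = self.attn(query=x, key=context, value=context)
        # Stop gradients so backprop never crosses segment boundaries.
        new_memory = out.detach()
        return out, new_memory

# Processing a long sequence segment by segment, reusing the cache:
layer = RecurrentSegmentAttention()
memory = None
for segment in torch.randn(4, 2, 16, 512):  # 4 segments of length 16
    out, memory = layer(segment, memory)
```

Each segment can thus attend to states from the one before it, so the effective context grows with depth instead of being capped at the segment length.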
Presented at the NLP paper study group; this post contains only the PPT slides. (An explanatory write-up will be added later.) Attention Is All You Need (arxiv.org/abs/1706.03762). Abstract excerpt: "The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration. The best performing models also connect the encoder and decoder through an attention mechanism. We propose a new…"
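The attention mechanism the abstract refers to reduces to scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A minimal single-head sketch, with masking and the multi-head split omitted for brevity:

```python
import math
import torch

def scaled_dot_product_attention(q, k, v):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)  # (batch, L, L)
    weights = torch.softmax(scores, dim=-1)            # each row sums to 1
    return weights @ v

q = k = v = torch.randn(2, 16, 64)  # (batch, seq_len, d_k)
out = scaled_dot_product_attention(q, k, v)  # -> (2, 16, 64)
```

The sqrt(d_k) scaling keeps the dot products from growing with dimension, which would otherwise push the softmax into regions with vanishing gradients.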