Category: DATA ANALYSIS (78)
HAZEL

Weather Observation Station 7 >> Problem: Query the list of CITY names ending with vowels (a, e, i, o, u) from STATION. Your result cannot contain duplicates. Input Format: The STATION table is described as follows: where LAT_N is the northern latitude and LONG_W is the western longitude. >> Basic approach: SELECT DISTINCT city FROM station WHERE city LIKE '%a' OR city LIKE '%e' OR city LIKE '%i' OR city L..
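The LIKE chain above can be run end-to-end with Python's `sqlite3`; a minimal sketch with invented sample rows (the real STATION data is not shown in the excerpt):

```python
import sqlite3

# In-memory database with a minimal, made-up STATION table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE station (city TEXT, lat_n REAL, long_w REAL)")
conn.executemany(
    "INSERT INTO station (city) VALUES (?)",
    [("Lima",), ("Quito",), ("Oslo",), ("Lima",), ("Madrid",)],
)

# Cities ending with a vowel, without duplicates
rows = conn.execute(
    """
    SELECT DISTINCT city FROM station
    WHERE city LIKE '%a' OR city LIKE '%e' OR city LIKE '%i'
       OR city LIKE '%o' OR city LIKE '%u'
    """
).fetchall()
print(sorted(r[0] for r in rows))  # ['Lima', 'Oslo', 'Quito']
```

DISTINCT removes the duplicate 'Lima' row, and 'Madrid' is filtered out because it ends in a consonant.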

Weather Observation Station 6 >> Problem: Query the list of CITY names starting with vowels (i.e., a, e, i, o, or u) from STATION. Your result cannot contain duplicates. Input Format: The STATION table is described as follows: where LAT_N is the northern latitude and LONG_W is the western longitude. >> Basic approach: SELECT DISTINCT city FROM station WHERE city LIKE 'a%' OR city LIKE 'e%' OR city LIKE 'i%..
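The OR chain can also be replaced by a single membership test on the first character; a sketch using `sqlite3` with made-up rows:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE station (city TEXT)")
conn.executemany(
    "INSERT INTO station VALUES (?)",
    [("Athens",), ("Oslo",), ("Paris",), ("Athens",)],
)

# Test the first character against the vowel set; LOWER() guards against case
rows = conn.execute(
    """
    SELECT DISTINCT city FROM station
    WHERE LOWER(SUBSTR(city, 1, 1)) IN ('a', 'e', 'i', 'o', 'u')
    """
).fetchall()
print(sorted(r[0] for r in rows))  # ['Athens', 'Oslo']
```

The SUBSTR/IN form scales better than five OR branches if the character set ever grows.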
185. Department Top Three Salaries >> Problem: The Employee table holds all employees. Every employee has an Id, and there is also a column for the department Id.
+----+-------+--------+--------------+
| Id | Name  | Salary | DepartmentId |
+----+-------+--------+--------------+
| 1  | Joe   | 85000  | 1            |
| 2  | Henry | 80000  | 2            |
| 3  | Sam   | 60000  | 2            |
| 4  | Max   | 90000  | 1            |
| 5  | Janet | 69000  | 1            |..
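On databases with window-function support (e.g. SQLite ≥ 3.25), one common approach is DENSE_RANK partitioned by department; the post's own solution is truncated, so this is an illustrative sketch over the sample rows above, not necessarily the post's answer:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (id INT, name TEXT, salary INT, department_id INT)")
conn.executemany(
    "INSERT INTO employee VALUES (?, ?, ?, ?)",
    [(1, "Joe", 85000, 1), (2, "Henry", 80000, 2), (3, "Sam", 60000, 2),
     (4, "Max", 90000, 1), (5, "Janet", 69000, 1)],
)

# DENSE_RANK keeps ties on the same rank, so "top three salaries"
# means the top three distinct salary values per department
rows = conn.execute(
    """
    SELECT department_id, name, salary
    FROM (
        SELECT *, DENSE_RANK() OVER (
            PARTITION BY department_id ORDER BY salary DESC
        ) AS rnk
        FROM employee
    )
    WHERE rnk <= 3
    ORDER BY department_id, salary DESC
    """
).fetchall()
for r in rows:
    print(r)
```

Department 1 yields Max, Joe, Janet; department 2 has only two employees, so both appear.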
Window function 01. Window functions vs. GROUP BY: a window function is similar to GROUP BY in that it aggregates over a group, but instead of collapsing the group into one row it computes a new value and attaches it to every row.
mysql> SELECT SUM(profit) AS total_profit FROM sales;
+--------------+
| total_profit |
+--------------+
|         7535 |
+--------------+
mysql> SELECT country, SUM(profit) AS country_profit FROM sales GROUP BY country ORDER BY country;
+---------+----------------+
| country | country_profit |
+..
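The contrast described above — GROUP BY collapsing rows versus a window function keeping them — can be made concrete with `sqlite3`; the sales figures below are invented, not the post's data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (country TEXT, profit INT)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("India", 100), ("India", 150), ("USA", 200)])

# GROUP BY collapses rows: one output row per country
grouped = conn.execute(
    "SELECT country, SUM(profit) FROM sales GROUP BY country ORDER BY country"
).fetchall()
print(grouped)  # [('India', 250), ('USA', 200)]

# A window function keeps every row and attaches the group aggregate to it
windowed = conn.execute(
    """
    SELECT country, profit, SUM(profit) OVER (PARTITION BY country)
    FROM sales ORDER BY country, profit
    """
).fetchall()
print(windowed)  # [('India', 100, 250), ('India', 150, 250), ('USA', 200, 200)]
```

Note the window query returns three rows (one per input row), while GROUP BY returns two.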

Slides presented at an NLP paper study group; for now the post contains only the PPT — an explanatory write-up will be added later. ** arxiv.org/abs/1909.11942 ALBERT: A Lite BERT for Self-supervised Learning of Language Representations Increasing model size when pretraining natural language representations often results in improved performance on downstream tasks. However, at some point further model increases become harder due to GPU/TPU memory limitations and longer..

Slides presented at an NLP paper study group; for now the post contains only the PPT — an explanatory write-up will be added later. ** arxiv.org/abs/1901.02860 Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context Transformers have a potential of learning longer-term dependency, but are limited by a fixed-length context in the setting of language modeling. We propose a novel neural architecture Transformer-XL that enables learning dependency beyond a ..

Slides presented at an NLP paper study group; for now the post contains only the PPT — an explanatory write-up will be added later. ** arxiv.org/abs/1706.03762 Attention Is All You Need The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration. The best performing models also connect the encoder and decoder through an attention mechanism. We propose a new .. arxiv.org — paper presentation PPT

import os from glob import glob import torch from torchvision import datasets, transforms # applies the example transforms to the dataset 1. Loading the Data Loader # the batch size is passed directly to the data loader batch_size = 32 test_batch_size = 32 # train=True since this loader is for training; download=True so the data is fetched if it is not on disk # transforms will be applied, handled as below train_loader = torch.utils.data.DataLoader( datasets.MNIST('dataset/', train = True , download= True, tran..
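The MNIST loader in the excerpt needs a download; the same DataLoader mechanics can be sketched offline with a `TensorDataset` (shapes chosen to mimic MNIST, purely for illustration):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

batch_size = 32

# Fake MNIST-shaped data: 100 grayscale 28x28 images with integer labels
images = torch.randn(100, 1, 28, 28)
labels = torch.randint(0, 10, (100,))

# shuffle=True reorders samples each epoch, as in the training loader above
train_loader = DataLoader(TensorDataset(images, labels),
                          batch_size=batch_size, shuffle=True)

# Each iteration yields one batch of (inputs, targets)
x, y = next(iter(train_loader))
print(x.shape, y.shape)  # torch.Size([32, 1, 28, 28]) torch.Size([32])
```

Swapping `TensorDataset` for `datasets.MNIST(...)` recovers the excerpt's setup; everything downstream of the loader stays the same.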

01. PyTorch Basic: behaves much like numpy; only the type differs. import numpy as np import torch 1. Basic arange print(np.arange(9)) # [0 1 2 3 4 5 6 7 8] print(torch.arange(9)) # tensor([0, 1, 2, 3, 4, 5, 6, 7, 8]) 2. shape # shows the shape nums = torch.arange(6) nums.shape 3. type type(nums) 4. Converting to numpy # can be converted to numpy nums.numpy() 5. reshape # reshape works just like numpy's nums.reshape(2,3) 6. rand # a random 3x3 tensor ..
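The numpy/torch parity the post walks through can be checked directly; a small sketch covering arange, shape, numpy conversion, reshape, and rand:

```python
import numpy as np
import torch

# arange produces the same values; only the container type differs
assert np.arange(9).tolist() == torch.arange(9).tolist()

nums = torch.arange(6)
print(nums.shape)   # torch.Size([6])
print(type(nums))   # <class 'torch.Tensor'>

# Tensor -> numpy array view
print(nums.numpy())  # [0 1 2 3 4 5]

# reshape mirrors numpy's
print(nums.reshape(2, 3))
# tensor([[0, 1, 2],
#         [3, 4, 5]])

# rand draws uniform values in [0, 1); here a 3x3 tensor
r = torch.rand(3, 3)
print(r.shape)  # torch.Size([3, 3])
```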

Chapter 11. Neural Machine Translation 11.1. Multilingual Neural Machine Translation 11.1.1. Zero-shot learning: paper — Google's Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation Melvin Johnson∗, Mike Schuster∗, Quoc V. Le, Maxim Krikun, Yonghui Wu, Zhifeng Chen, Nikhil Thorat, Fernanda Viégas, Martin Wattenberg, Greg Corrado, Macduff Hughes, Jeffrey Dean Google www.aclweb.org/anthology/Q17-1024.pdf * Characteristics: ..