๐Ÿ Python & library/PyTorch

[PyTorch] Initializing nn.Embedding

복만 2022. 10. 27. 13:37

PyTorch์˜ nn.Embedding layer์„ ์ดˆ๊ธฐํ™”ํ•˜๋Š” ๋ฐฉ๋ฒ•์—๋Š” ๋‘ ๊ฐ€์ง€๊ฐ€ ์žˆ๋‹ค.

embedding = nn.Embedding(num_embeddings, embedding_dim)

 

 

 

1. Using torch.Tensor's built-in methods

 

embedding.weight.data.uniform_(-1, 1)

 

torch.tensor์€ uniform_ ๋“ฑ์˜ ๋‚ด์žฅ method๋ฅผ ๊ฐ€์ง€๊ณ  ์žˆ์–ด ์ด๋ฅผ ํ†ตํ•ด ๊ฐ’์„ ์ดˆ๊ธฐํ™”ํ•  ์ˆ˜ ์žˆ๋‹ค.

 

 

 

2. Using torch.nn.init

 

nn.init.uniform_(embedding.weight, -1.0, 1.0)

 

torch.nn.init์˜ method๋“ค์„ ์ด์šฉํ•  ์ˆ˜๋„ ์žˆ๋‹ค.

 

This approach gives access to a wider range of initialization schemes beyond uniform_, such as xavier_uniform_.
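A short sketch of the second approach (sizes again hypothetical), showing both the uniform call from the post and the Xavier variant, which has no corresponding in-place tensor method:

```python
import torch
import torch.nn as nn

# Hypothetical sizes, chosen only for illustration
embedding = nn.Embedding(num_embeddings=10, embedding_dim=4)

# Same uniform initialization as before, but via torch.nn.init
nn.init.uniform_(embedding.weight, -1.0, 1.0)

# Xavier/Glorot uniform initialization; its bound is derived
# from the tensor's fan_in and fan_out
nn.init.xavier_uniform_(embedding.weight)
```

Both nn.init functions modify the weight in place and run under torch.no_grad internally, so they are safe to call before training.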

 

 

 

Source: https://stackoverflow.com/questions/55276504/different-methods-for-initializing-embedding-layer-weights-in-pytorch

 
