The following function can be used to visualize a PyTorch scheduler.
import matplotlib.pyplot as plt

def visualize_scheduler(optimizer, scheduler, epochs):
    lrs = []
    for _ in range(epochs):
        optimizer.step()
        lrs.append(optimizer.param_groups[0]['lr'])  # record the current learning rate
        scheduler.step()
    plt.plot(lrs)
    plt.show()
The learning rate is read from optimizer.param_groups[0]['lr'] rather than scheduler.get_lr() because some schedulers, such as ReduceLROnPlateau, do not have a get_lr() method.
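A minimal sketch of this point (the concrete factor/patience values are illustrative): ReduceLROnPlateau is driven by a metric passed to step(), and reading the learning rate through param_groups works for it just like for any other scheduler.

```python
import torch
import torch.optim as optim

# dummy parameter -- only the learning-rate schedule matters here
optimizer = optim.SGD([torch.tensor(1.0)], lr=0.1)
# halve the lr after `patience` epochs without improvement
scheduler = optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode='min', factor=0.5, patience=2)

for epoch in range(5):
    val_loss = 1.0              # a stagnating validation loss
    scheduler.step(val_loss)    # note: step() takes the metric, unlike other schedulers
    print(optimizer.param_groups[0]['lr'])  # works for any scheduler
```

With a constant loss, the learning rate drops from 0.1 to 0.05 once the patience is exhausted.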
It can be used as follows.
import torch
import torch.optim as optim

epochs = 300
# a dummy float parameter -- we only care about how the lr changes
optimizer = optim.SGD([torch.tensor(1.0)], lr=0.1, momentum=0.9)
scheduler = optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=100, eta_min=0)
visualize_scheduler(optimizer, scheduler, epochs)