Detected call of `lr_scheduler.step()` before `optimizer.step()`.
When using PyTorch's exponential learning-rate decay, the following warning appears: `UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`. Failure to do this will result in PyTorch skipping the first value of the learning rate schedule.`
The cause is exactly what the warning says: `lr_scheduler.step()` was called before `optimizer.step()`, as in the following incorrect code:
```python
for i in range(epoch):
    net.scheduler.step()          # wrong: scheduler steps before any optimizer.step()
    net.set_mode_train(True)
    for j, (x, y) in enumerate(train_loader):
        cost_pred, err = net.fit(x, y)
    net.epoch = i
```

The fix is to call `lr_scheduler.step()` after each epoch of training has completed:
```python
for i in range(epoch):
    net.set_mode_train(True)
    for j, (x, y) in enumerate(train_loader):
        cost_pred, err = net.fit(x, y)
    net.scheduler.step()          # correct: after the optimizer has stepped this epoch
    net.epoch = i
```
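The same ordering can be shown in a self-contained sketch (the model, data, and hyperparameters here are placeholders, not from the original post): `optimizer.step()` runs inside the batch loop, and `scheduler.step()` runs once per epoch afterwards, so no warning is raised and the first scheduled learning rate is not skipped.

```python
import torch

# Hypothetical tiny model and data, just to make the loop runnable.
model = torch.nn.Linear(2, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)
loss_fn = torch.nn.MSELoss()
data = [(torch.randn(4, 2), torch.randn(4, 1)) for _ in range(3)]

lrs = []
for epoch in range(3):
    for x, y in data:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()      # update the weights first
    scheduler.step()          # then decay the learning rate, once per epoch
    lrs.append(optimizer.param_groups[0]["lr"])

print(lrs)  # the lr shrinks by gamma after each epoch
```

With `gamma=0.9` and an initial lr of 0.1, the recorded rates decay as 0.09, 0.081, 0.0729 across the three epochs.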
Summary

Since PyTorch 1.1.0, always call `optimizer.step()` before `lr_scheduler.step()`; otherwise PyTorch skips the first value of the learning-rate schedule.