
Deprecation warning in PyTorch 1.5 (and maybe above?) #5

Closed
Junyoungpark opened this issue Jul 30, 2020 · 4 comments · Fixed by #7
@Junyoungpark

Hi! I encountered the following warning while using AdamP for my project:

..\torch\csrc\utils\python_arg_parser.cpp:756: UserWarning: This overload of add_ is deprecated:
	add_(Number alpha, Tensor other)
Consider using one of the following signatures instead:
	add_(Tensor other, *, Number alpha)

Might this be related to the AdamP update?

@bhheo
Collaborator

bhheo commented Jul 31, 2020

Hi

We implemented AdamP based on PyTorch 1.3.
PyTorch 1.5 changed the default arguments of in-place operations such as add_() and addcmul_().
So the following lines of AdamP trigger the warnings:

AdamP/adamp/adamp.py

Lines 80 to 81 in 64a6310

exp_avg.mul_(beta1).add_(1 - beta1, grad)
exp_avg_sq.mul_(beta2).addcmul_(1 - beta2, grad, grad)

p.data.add_(-step_size, perturb)

If you change them to

 exp_avg.mul_(beta1).add_(grad, alpha=1 - beta1) 
 exp_avg_sq.mul_(beta2).addcmul_(grad, grad, value=1 - beta2) 
 p.data.add_(perturb, alpha=-step_size) 

then the warnings will disappear.
But I'm not sure whether these changes are compatible with older PyTorch versions, so I will not apply them to master or pip for now.

You can also see the difference in the official adamw.py:
Torch 1.4
https://github.com/pytorch/pytorch/blob/74044638f755cd8667bedc73da4dbda4aa64c948/torch/optim/adamw.py#L100-L101
Torch 1.5
https://github.com/pytorch/pytorch/blob/4ff3872a2099993bf7e8c588f7182f3df777205b/torch/optim/adamw.py#L104-L105
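If backward compatibility is the concern, one option (a hedged sketch, not part of AdamP) is to branch on the installed PyTorch version and pick the call style accordingly. The `supports_alpha_kwarg` helper below is hypothetical; only the version-parsing part is shown runnable, and the optimizer lines appear as illustrative comments.

```python
def supports_alpha_kwarg(version: str) -> bool:
    """Return True if the PyTorch version prefers add_(other, alpha=...).

    The keyword-only signature became the recommended form in torch 1.5.
    Parses the leading "major.minor" of a version string such as
    "1.5.0" or "1.6.0+cu101" (the "+cu101" local build tag is ignored).
    """
    major, minor = version.split("+")[0].split(".")[:2]
    return (int(major), int(minor)) >= (1, 5)

# Inside the optimizer step one could then branch (illustrative only):
# if supports_alpha_kwarg(torch.__version__):
#     exp_avg.mul_(beta1).add_(grad, alpha=1 - beta1)
# else:
#     exp_avg.mul_(beta1).add_(1 - beta1, grad)
```

This avoids the warning on 1.5+ while keeping the old positional form on earlier releases; the cost is a small version check per construction of the optimizer.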

@Junyoungpark
Author

Thanks for the comments! I'll use the code snippet for torch 1.5.

@bhheo
Collaborator

bhheo commented Aug 27, 2020

The in-place operations were changed in #7.
Thanks for your contribution.

@SanghyukChun
Collaborator

SanghyukChun commented Aug 27, 2020

@Junyoungpark
We just released adamp==0.3.0 (https://github.com/clovaai/AdamP/releases/tag/v0.3.0).
Please retry pip install adamp --upgrade :)

xinrongl added a commit to xinrongl/naic that referenced this issue Oct 8, 2020