
natten2d function requires kernel size #8

Closed
lzy00codeforfun opened this issue Apr 5, 2023 · 2 comments · Fixed by #9
Labels
bug Something isn't working

Comments

@lzy00codeforfun

Hi there,

Thanks for the great work! When I try to evaluate the FFHQ 256x256 generation, it seems the natten2d functions (both natten2dqkrpb and natten2dav) require kernel_size to be specified. However, this argument is not passed in models/stylenat.py. I installed the natten package with: pip3 install natten -f https://shi-labs.com/natten/wheels/cu118/torch2.0.0/index.html . Is this because the code isn't compatible with this natten version? Do you have any idea why this happens, or what the missing value is here?

Thanks!

@alihassanijr alihassanijr added the bug Something isn't working label Apr 5, 2023
alihassanijr added a commit to alihassanijr/StyleNAT that referenced this issue Apr 5, 2023
NATTEN functions take in kernel size as well as dilation since 0.14.6.
The change in signature breaks Hydra-NA, which calls those functions
directly instead of using the nn.Module.

This fixes SHI-Labs#8 .

Refs:

* https://github.com/SHI-Labs/NATTEN/releases/tag/v0.14.6
* https://github.com/SHI-Labs/NATTEN/blob/main/CHANGELOG.md#0146---2023-03-21

Same change in other repositories using NATTEN:
* huggingface/transformers#22229
* huggingface/transformers#22298
@alihassanijr
Member

Thank you for your interest and bringing this to our attention.

Yes, the issue is that NATTEN's function calls have a slightly different signature since v0.14.5, which is the release that added torch 2.0 support.
#9 should resolve the issue.
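For context, the signature change described above can be worked around with a small compatibility shim. The sketch below is hypothetical (it is not the fix in #9, and `call_natten2dqkrpb` is not part of StyleNAT or NATTEN); it only illustrates the pattern of dispatching on whether the installed NATTEN functional API accepts kernel_size, as the commit message describes for v0.14.6:

```python
import inspect

def call_natten2dqkrpb(fn, query, key, rpb, kernel_size, dilation):
    """Call a NATTEN-style QK+RPB function across versions.

    Hypothetical shim: since NATTEN v0.14.5/0.14.6 the functional API
    takes kernel_size explicitly, while older releases did not. `fn`
    stands in for natten.functional.natten2dqkrpb. We inspect the
    signature and dispatch to whichever form the installed version
    expects.
    """
    params = inspect.signature(fn).parameters
    if "kernel_size" in params:
        # Newer (v0.14.6+) signature includes kernel_size.
        return fn(query, key, rpb, kernel_size, dilation)
    # Older signature omits kernel_size.
    return fn(query, key, rpb, dilation)
```

In practice, pinning a single NATTEN version (as the PR does by updating the call sites) is simpler than carrying a shim like this.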

@stevenwalton
Collaborator
stevenwalton commented Apr 5, 2023

Thanks @alihassanijr

@lzy00codeforfun let us know if you have any additional issues, and please reopen if this did not resolve things. I will also note that you should probably see some speed improvements when using PyTorch 2.0, but we have not tested StyleNAT under those settings; we are still using the same versions as the paper: torch==1.13.1, pytorch-cuda==1.7, torchvision==0.14.1. NATTEN versions (which @alihassanijr is the lead developer of) should, in general, not affect StyleNAT, but for what it is worth, my current environment has natten==0.14.4 (still available through pip).

Thanks for bringing this to our attention!
