https://github.com/yuyanli0831/OmniFusion/blob/aaf52cc953ade3be1f5fc3df446705e4223b8d21/test.py#L97
With `drop_last=True`, the samples in the final incomplete batch are silently skipped, so part of the test data is never evaluated. It should be changed to `drop_last=False`.
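For reference, here is a minimal sketch of the `drop_last` semantics (plain Python mimicking the DataLoader's batching arithmetic, rather than calling torch itself; the sample/batch counts are made up for illustration):

```python
def batch_sizes(n_samples, batch_size, drop_last):
    """Mimic DataLoader batching: return the size of each batch yielded."""
    full, remainder = divmod(n_samples, batch_size)
    sizes = [batch_size] * full
    if remainder and not drop_last:
        sizes.append(remainder)  # keep the final partial batch
    return sizes

# With 100 test samples and batch size 8:
print(sum(batch_sizes(100, 8, drop_last=True)))   # 96 -- 4 samples dropped
print(sum(batch_sizes(100, 8, drop_last=False)))  # 100 -- all samples evaluated
```

Whenever the dataset size is not a multiple of the batch size, `drop_last=True` under-reports the test set, which is why it matters for evaluation even though it is harmless (and common) for training.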
Well, simply changing to `drop_last=False` triggers a different error, due to a bug in PyTorch's `DataParallel`. It's going to be something like:

```
TypeError: Caught TypeError in replica 1 on device 1.
Original Traceback (most recent call last):
  File "/home/localstorage/miniconda3/envs/omnifusion/lib/python3.7/site-packages/torch/nn/parallel/parallel_apply.py", line 61, in _worker
    output = module(*input, **kwargs)
  File "/home/localstorage/miniconda3/envs/omnifusion/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1051, in _call_impl
    return forward_call(*input, **kwargs)
TypeError: forward() missing 1 required positional argument: 'high_res'
```
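One way to see why the final partial batch can trigger this: `DataParallel` splits each batch across replicas, and a batch smaller than the replica count produces chunks for only some of them, leaving the other replicas' `forward()` without that argument. The sketch below is my own crude simplification of that scatter step, not PyTorch's actual implementation:

```python
def scatter(batch, n_replicas):
    """Crude sketch of a DataParallel-style scatter: split a batch into
    per-replica chunks. A batch smaller than n_replicas yields fewer
    chunks than there are replicas."""
    chunk = max(1, -(-len(batch) // n_replicas))  # ceil division
    return [batch[i:i + chunk] for i in range(0, len(batch), chunk)]

# A full batch of 4 on 2 GPUs: every replica gets a chunk.
print(scatter([0, 1, 2, 3], 2))  # [[0, 1], [2, 3]]
# A final partial batch of 1 sample on 2 GPUs: only replica 0 gets data,
# so replica 1 is left without its positional 'high_res' chunk.
print(scatter([0], 2))           # [[0]]
```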
To circumvent this error, we need to pass all arguments as keyword arguments. That means changing https://github.com/yuyanli0831/OmniFusion/blob/aaf52cc953ade3be1f5fc3df446705e4223b8d21/test.py#L198 into:

```python
equi_outputs_list = network(high_res=rgb, iter=iters)
```