Dear author,

When I use python generator.py to generate patches, I got a Segmentation fault (core dumped). I found it was caused by torch.load(). I tried map_location="cpu" and map_location="cuda:0". My GPU is an A5000, with cuda==10.1, pytorch==1.4.0, and python==3.8. The models I used are your pre-trained models.

Could you help me? Thanks in advance!
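For reference, the crashing step reduces to roughly the snippet below (the checkpoint path is a placeholder, not CURE's actual file name; only the torch.load() call and the two map_location values come from the report above):

```python
import torch

# Minimal, standalone sketch of the failing call (placeholder path, not CURE's
# real model file). Both map_location values below were tried; the crash
# reportedly happens inside torch.load() either way.
model_file = "path/to/pretrained_gpt_conut.pt"  # hypothetical checkpoint path

loaded = torch.load(model_file, map_location="cpu")       # deserialize on CPU
# loaded = torch.load(model_file, map_location="cuda:0")  # or directly onto GPU 0

print(type(loaded))
```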
Exception has occurred: ModuleNotFoundError
No module named 'transformers.configuration_openai'
  File "/CURE/src/tester/generator.py", line 61, in generate_gpt_conut
    loaded = torch.load(
  File "/CURE/src/tester/generator.py", line 134, in <module>
    generate_gpt_conut(vocab_file, model_file, input_file, identifier_txt_file, identifier_token_file, output_file, beam_size)
ModuleNotFoundError: No module named 'transformers.configuration_openai'
But I already used pip to install transformers==2.10.0; is there a problem with my installation?
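One thing that may be worth checking (a guess about the environment, not something confirmed in this thread): transformers 2.x exposes the flat module transformers.configuration_openai, but later releases reorganized those files under transformers.models.*, so if the interpreter is actually resolving a newer transformers than the pinned 2.10.0, unpickling the checkpoint would fail with exactly this import error. A quick diagnostic sketch:

```python
# Diagnostic sketch (not part of CURE): confirm which transformers version and
# install location the interpreter actually resolves, and whether the old flat
# module path still exists in that version.
import importlib.util

import transformers

print("transformers version:", transformers.__version__)
print("installed at:", transformers.__file__)

# None here means the installed version no longer ships the module under the
# old flat path that the checkpoint unpickling expects.
print(importlib.util.find_spec("transformers.configuration_openai"))
```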