
M1 - dream.py fails but txt2img runs fine - memory issue? #248

Closed
krummrey opened this issue Aug 31, 2022 · 1 comment

@krummrey (Contributor)

I can get the txt2img.py script (in the originals folder, which is patched for MPS) to run fine.
dream.py starts fine, but when I try to use it, it fails even on a request for a 64x64 image.

```
* Initialization done! Awaiting your command (-h for help, 'q' to quit)
dream> ocean -S 1 -W 64 -H 64
Traceback (most recent call last):
  File "/Users/jan/Documents/ML/stable-diffusion/scripts/dream.py", line 518, in <module>
    main()
  File "/Users/jan/Documents/ML/stable-diffusion/scripts/dream.py", line 99, in main
    main_loop(t2i, opt.outdir, opt.prompt_as_dir, cmd_parser, infile)
  File "/Users/jan/Documents/ML/stable-diffusion/scripts/dream.py", line 208, in main_loop
    image_list  = t2i.prompt2image(image_callback=callback, **vars(opt))
  File "/Users/jan/Documents/ML/stable-diffusion/ldm/simplet2i.py", line 283, in prompt2image
    torch.cuda.torch.cuda.reset_peak_memory_stats()
  File "/opt/homebrew/Caskroom/miniforge/base/envs/ldm/lib/python3.10/site-packages/torch/cuda/memory.py", line 256, in reset_peak_memory_stats
    return torch._C._cuda_resetPeakMemoryStats(device)
AttributeError: module 'torch._C' has no attribute '_cuda_resetPeakMemoryStats'
```
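The traceback shows `simplet2i.py` calling `reset_peak_memory_stats()` unconditionally, which fails on Apple-silicon PyTorch builds where the CUDA backend is compiled out. A minimal sketch of the kind of guard that avoids this (the helper name `reset_peak_memory_stats_safe` is hypothetical, not from the repository; it takes the module as a parameter so the pattern is easy to exercise with a stub):

```python
def reset_peak_memory_stats_safe(torch_module):
    """Reset CUDA peak-memory stats only when a CUDA backend is present.

    Returns True if the stats were reset, False if the backend is
    unavailable (e.g. on MPS/CPU-only builds of PyTorch).
    """
    cuda = getattr(torch_module, "cuda", None)
    if cuda is not None and cuda.is_available():
        cuda.reset_peak_memory_stats()
        return True
    return False
```

In the real code this would be `reset_peak_memory_stats_safe(torch)`; on an M1 machine `torch.cuda.is_available()` is False, so the CUDA-only call is simply skipped instead of raising `AttributeError`.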
@krummrey (Contributor, Author)

Duplicate of #234 - sorry, closing this.
