
support loading stable zero123 in diffuser (fp16 support) #395

Open · wants to merge 1 commit into main

Conversation

@Adamdad commented Jan 2, 2024

This pull request adds support for loading Stable Zero123 into diffusers with FP16 (half-precision floating-point) compatibility.

Key features:

  • Stable Zero123 integration: enables diffusers to load and run the stable version of Zero123.
  • FP16 support: improves performance by enabling half-precision floating-point computation, which is particularly beneficial for memory efficiency and speed on compatible hardware (see the sketch below).

The integration has been tested for stability and performance improvements.
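For reference, here is a minimal sketch of what FP16 loading looks like on the diffusers side. The checkpoint ID and the use of the generic `DiffusionPipeline` loader are assumptions for illustration, not necessarily what this PR wires up:

```python
import torch
from diffusers import DiffusionPipeline

# torch_dtype=torch.float16 loads the weights in half precision,
# roughly halving weight memory and speeding up inference on GPUs
# with fast FP16 support.
pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-zero123",  # hypothetical checkpoint ID, for illustration
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")
```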

@DSaurus (Collaborator) left a comment:

LGTM! Thank you for implementing zero123 in diffusers.

```diff
@@ -638,6 +666,7 @@ def forward(
         loss_sd = 0.5 * F.mse_loss(latents, target, reduction="sum") / batch_size

         guidance_out = {
+            "loss_sds": loss_sd,
```
Collaborator:

Magic123 uses `loss_sd`, so this will raise an error. How about changing the config file from `loss_sds` to `loss_sd`?

@Adamdad (Author):

I originally kept/duplicated both keys, `loss_sd` and `loss_sds`, to make it compatible with different models. Yes, I will consider changing the config. Thanks.
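For clarity, a minimal sketch of the duplicated-key approach described above, using only the names visible in the diff (an illustration, not the PR's final code):

```python
guidance_out = {
    # Expose the same loss under both keys so that configs expecting
    # "loss_sd" (e.g. Magic123) and configs expecting "loss_sds"
    # both resolve without a KeyError.
    "loss_sd": loss_sd,
    "loss_sds": loss_sd,
}
```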
