
v0.31.0 removes data volume from deployment #135

Closed
cterence opened this issue Dec 1, 2024 · 9 comments · Fixed by #137

Comments


cterence commented Dec 1, 2024

Hi!
Updating to this new version removes my existing volume definitions from .spec.volumes in the deployment.
Here are my updated values (I just added the new storage key with an empty existingVolumeClaim key, and moved my existing data and attachments keys under it):

storage:
  ## @param existingVolumeClaim If defined, the values here will be used for the data and
  ## attachments PV's. The custom values for data and attachments will be ignored if
  ## a value is set here
  ##
  existingVolumeClaim:
    {}
    # claimName: "vaultwarden-pvc"
    # dataPath: "/data"
    # attachmentsPath: /data/attachments

  ## @param data Data directory configuration, refer to values.yaml for parameters.
  ##
  data:
    name: "vaultwarden-data"
    size: "5Gi"
    class: ""
    path: "/data"
    accessMode: "ReadWriteOnce"
    keepPvc: true

  ## @param attachments Attachments directory configuration, refer to values.yaml for parameters.
  ## By default, attachments/ is located inside the data directory.
  ##
  attachments:
    name: "vaultwarden-files"
    size: "10Gi"
    class: ""
    path: /files
    accessMode: "ReadWriteOnce"
    keepPvc: true

My diff in ArgoCD (deployment manifest):

[screenshot: ArgoCD diff showing the vaultwarden-data volume removed from .spec.volumes]

@paimonsoror, could you please take a look, since you were the author of this change?

paimonsoror (Contributor) commented:

Hey there! Taking a look now. Super odd. So is it only removing your vaultwarden-data volume?


cterence commented Dec 1, 2024

Yes, this is the only change. If you want to take a look at my Helm chart/values, it's over here.


paimonsoror commented Dec 1, 2024

> Yes this is the only change, if you want to take a look at my helm chart/values, it's over here.

Can you confirm that's the right values file?

https://github.com/cterence/homelab-gitops/blob/70d37df1d88f6c0cdcd10b72ffbc2ae4d2182661/k8s-apps/vaultwarden/values.yaml#L236

This key, along with attachments, should be nested under storage.


cterence commented Dec 1, 2024

Yes, this is the correct one. Sorry, that link is on the previous chart version.

Here's my PR, which targets the new v0.31.0 version with the updated values. This is what I previously merged, and it's what gave me the ArgoCD error.

paimonsoror (Contributor) commented:

OK cool. I'll test your values once I get home. Thanks for the details!


Mawarii commented Dec 1, 2024

I am not sure, but maybe this has to do with the changes made here:
[screenshot: diff of the chart change in question]

I also have a problem with 0.31.0: if I template it, volumes: is defined twice, and therefore the deployment.yaml is not valid.
[screenshot: rendered deployment manifest showing two volumes: keys]

I don't know much about ArgoCD, but I could imagine that the Argo diff removes the first volumes block and therefore removes the data volume.
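For context, the doubled key in the rendered deployment presumably looks roughly like the sketch below (hypothetical, with surrounding fields abbreviated; the volume names are taken from the values above):

```yaml
spec:
  template:
    spec:
      volumes:   # first occurrence, silently discarded by most parsers
        - name: vaultwarden-data
          persistentVolumeClaim:
            claimName: vaultwarden-data
      volumes:   # duplicate key: "last one wins"
        - name: vaultwarden-files
          persistentVolumeClaim:
            claimName: vaultwarden-files
```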

santiagon610 (Contributor) commented:

> I don't know much about ArgoCD, but I could imagine that the Argo diff removes the first volumes block and therefore removes the data volume.

Good point. If memory serves me correctly, YAML is "last one defined wins," meaning the volumes block containing name: vaultwarden-files should win and the block with vaultwarden-data would be ignored.
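To see the "last one wins" behavior concretely: PyYAML keeps the last duplicate mapping key, and Python's stdlib json module (used here so the snippet needs no third-party packages) resolves duplicate object keys the same way. The volume names are taken from the values above:

```python
import json

# YAML 1.2 technically forbids duplicate mapping keys, but most loaders
# do not raise an error; they silently keep the last occurrence. Python's
# stdlib json module behaves identically for duplicate JSON object keys,
# so it makes a convenient stand-in to demonstrate the effect.
doc = '{"volumes": ["vaultwarden-data"], "volumes": ["vaultwarden-files"]}'
parsed = json.loads(doc)
print(parsed["volumes"])  # ['vaultwarden-files'] -- the first block is dropped
```

This is why the rendered manifest "works" but silently loses the vaultwarden-data volume.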

paimonsoror (Contributor) commented:

Yup! I see it. OK, let me work on a fix here.

paimonsoror (Contributor) commented:

Got a fix ready. Rebasing my local branch against main and then staging a PR.

4 participants