Google Cloud Storage - Watch Not Working But Download Is #4855
Actually, upon further investigation into the logging, even if I change the ACL to public for all items in the bucket, I can see that PeerTube is still adding the

More information: I have found that each time I upload a video, the UUID from the PeerTube logs is the name of the folder in the bucket. However, the videos now cannot play; the player just loops like it's waiting. I went to the Network tab and I can see that it is seemingly hitting the right address. I compared the URLs character by character in Notepad++ and each character is exactly the same. (I had to put the bucket name at the end of the base URL for this to happen.) So I see the files being uploaded into the bucket. Any thoughts on how this can be solved? Here is a screenshot of the bucket configuration parameters (except the password). Anything else that could help, please let me know.

Additional info: though the video cannot be watched, the download option DOES work from both my Android device and my computer. So videos can be downloaded, they just cannot be watched :(
Hello, you should not set a
@Chocobozzz https://peertube.gsugambit.com/w/71tCgPvLbFRRhXhtYf64Wp
@Chocobozzz Also, any hope of getting a config flag to disable
Have you enabled CORS on your bucket? https://docs.joinpeertube.org/admin-remote-storage?id=cors-settings I see a CORS error when I try to load your video.
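For reference, GCS applies CORS rules bucket-wide from a JSON file. A minimal sketch follows; the origin and header values here are assumptions, so adjust them to your instance's domain and the headers listed in the PeerTube docs linked above:

```json
[
  {
    "origin": ["https://peertube.example.com"],
    "method": ["GET", "HEAD"],
    "responseHeader": ["Content-Type", "Range"],
    "maxAgeSec": 3600
  }
]
```

This can be applied with `gsutil cors set cors.json gs://your-bucket`, or `gcloud storage buckets update gs://your-bucket --cors-file=cors.json` with newer tooling.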
@Chocobozzz That fixed it. The videos now play instantly. Here is the configuration I used for GCS in case you want to put it in the documentation.

If you would like, I can try removing the base URL parameter and see if videos still upload/play/download without issue. Any thoughts on making the ACL a configurable parameter? If I can get a dev environment set up for this, I'd love to contribute.
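(The configuration block itself did not survive the copy above. As a hedged reconstruction, not the poster's exact values: PeerTube's object storage settings live under `object_storage:` in `production.yaml`, and a GCS setup through its S3-compatible XML API would look roughly like this, using GCS HMAC credentials. All bucket names and keys below are placeholders.)

```yaml
# Hedged sketch, not the poster's actual config: PeerTube object storage
# pointed at GCS via the S3-compatible XML API with HMAC keys.
object_storage:
  enabled: true
  endpoint: 'storage.googleapis.com'
  region: 'us-east1'                        # placeholder
  credentials:
    access_key_id: 'GOOG...'                # GCS HMAC access key (placeholder)
    secret_access_key: 'changeme'           # GCS HMAC secret (placeholder)
  streaming_playlists:
    bucket_name: 'my-peertube-playlists'    # placeholder
  videos:
    bucket_name: 'my-peertube-videos'       # placeholder
```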
@Chocobozzz So I have confirmed that removing the base URL still allows it to work, which is puzzling because it was the only change I made at that point in debugging. I did notice one issue though. When you first upload a video it begins transcoding, but you can watch it beforehand. Once the video finishes transcoding and is available in the cloud, the logs start to error and write every few milliseconds that the video is no longer available locally in PeerTube (since it's now in cloud storage). The video then stops playing and the user has to refresh, though they won't know what is happening. It wrote 3 log files in a very small amount of time, and I had just cleaned all logs before doing the base URL check.
@Chocobozzz Actually, in the HTTP request I can see the URL is indeed incorrect: it's adding the bucket name into the URL, but somehow Google is handling it and it is working now. This definitely isn't how it should be, though. The bucket name should not be in the function I highlighted from the source code above. Is that required for AWS or some other object storage, potentially?
It's expected behaviour; it's the reason why we display a message below the player:
I don't understand. If you disabled
@Chocobozzz Regarding the first point, I understand that once it transcodes it will no longer be available locally, but until the user refreshes the screen the logs write every millisecond, and uninvolved users won't know this is happening. I understand throwing the error, but it would be good to handle it in some way so it doesn't recur over and over. Regarding the second point, no, the URL is not correct. The Google URL is
We're just using the virtual-hosted style for buckets: https://cloud.google.com/storage/docs/request-endpoints#xml-api
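To illustrate why the bucket name can legitimately appear in two different places, here is a small sketch of the two endpoint styles the GCS XML API accepts (per the page linked above). The function names are illustrative, not from the PeerTube code base:

```javascript
// The GCS XML API serves the same object through either endpoint style.
// Function names are illustrative, not from PeerTube.
function virtualHostedUrl (bucket, key) {
  // bucket name is part of the hostname
  return `https://${bucket}.storage.googleapis.com/${key}`
}

function pathStyleUrl (bucket, key) {
  // bucket name is the first path segment
  return `https://storage.googleapis.com/${bucket}/${key}`
}

console.log(virtualHostedUrl('my-bucket', 'videos/abc.mp4'))
// → https://my-bucket.storage.googleapis.com/videos/abc.mp4
console.log(pathStyleUrl('my-bucket', 'videos/abc.mp4'))
// → https://storage.googleapis.com/my-bucket/videos/abc.mp4
```

This explains why a URL with the bucket name "added in" still resolved: both forms are valid for GCS.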
This is solved in #4850.
@Chocobozzz @gsugambit I was trying the steps listed above with the configs directly in production.yaml, and the GCS bucket does have CORS enabled, but I keep getting an Access Denied error. I'm able to use gsutil to upload a file from a GCP VM to the GCS bucket using the service account. Am I missing any other configs? Also, the object storage lib seems to be AWS-specific? @gsugambit were you using the built-in one or something else for the GCS bucket?
Describe the current behavior
I have tried to get PeerTube to publish to a Google GCS bucket. What I found is that the source code automatically sets "public-read" as the ACL. Can this be made configurable via a parameter, something like PEERTUBE_OBJECT_STORAGE_APPLY_ACL? It could default to true to maintain today's behaviour. Because GCS uses uniform bucket-level access by default, this throws an error. If I were certain how to build the entire system locally, I would test deploying a Docker container with that one line of code removed and see if it works.
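A minimal sketch of what the proposed flag could look like. PEERTUBE_OBJECT_STORAGE_APPLY_ACL is the flag proposed in this issue, not an existing PeerTube option, and buildPutObjectParams is a hypothetical helper; the returned object just mirrors the parameter shape the AWS SDK's PutObject expects:

```javascript
// Hypothetical sketch: gate the hard-coded 'public-read' ACL behind an
// env flag so GCS buckets with uniform bucket-level access don't reject
// uploads. Neither the flag nor this helper exists in PeerTube today.
function buildPutObjectParams (bucket, key, body, env = process.env) {
  // default to true so existing deployments keep their current behaviour
  const applyAcl = env.PEERTUBE_OBJECT_STORAGE_APPLY_ACL !== 'false'
  const params = { Bucket: bucket, Key: key, Body: body }
  if (applyAcl) {
    params.ACL = 'public-read' // omitted entirely when the flag is disabled
  }
  return params
}

console.log(buildPutObjectParams('b', 'k', 'data',
  { PEERTUBE_OBJECT_STORAGE_APPLY_ACL: 'false' }))
// → { Bucket: 'b', Key: 'k', Body: 'data' }  (no ACL field)
```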
Steps to reproduce:
Describe the expected behavior
Video should be uploaded to bucket
Additional information
PeerTube instance:
Browser name, version and platforms on which you could reproduce the bug:
Latest version of chrome and firefox
This seems like the same issue that was run into here, but it looks like he changed the code to make it work. If this could be made configurable, it would help others use the GA Docker containers.