Replies: 6 comments 11 replies
-
I'd like to understand your use case a little better. Do you want to load local files into BigQuery, or do you want to load files stored in GCS into BigQuery? What is the initial format of your files?
-
I'm currently on vacation; I'll look into it as soon as possible.
-
Are we interested in covering all capabilities of BQ? If so, we may have to depend on https://github.com/ThouCheese/cloud-storage-rs. I'm close to a working code example.
-
This is the updated version of the broken sample, along with the GCS crate.
-
Although the existing API could be simplified, you can implement what you need by following this example: https://github.com/lquerel/gcp-bigquery-client/blob/main/examples/bq_load_job.rs The issue was not in the client itself but in the initial example. Instead of a download URL, the JobConfigurationLoad object expects a `gs://` URI. Let me know if everything is working for you now.
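To illustrate the distinction above, here is a minimal sketch of a helper that builds the `gs://` form that `JobConfigurationLoad.source_uris` expects, as opposed to an HTTPS download URL. The helper name `gcs_uri` is hypothetical (not part of the crate); the struct and field names come from the example linked above.

```rust
/// Hypothetical helper: build the `gs://bucket/object` URI expected by
/// `JobConfigurationLoad.source_uris`, as opposed to an HTTPS download
/// URL such as `https://storage.googleapis.com/bucket/object`.
fn gcs_uri(bucket: &str, object: &str) -> String {
    // Strip any leading slash on the object name so we don't emit `gs://bucket//object`.
    format!("gs://{}/{}", bucket, object.trim_start_matches('/'))
}

fn main() {
    // The load job wants the gs:// form, not the public download URL.
    let uri = gcs_uri("my-bucket", "data/export.json");
    println!("{}", uri); // gs://my-bucket/data/export.json
}
```

The resulting string would then be pushed into the `source_uris` vector of the load job configuration, as done in the linked `bq_load_job.rs` example.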
-
@nixxholas @lquerel - any idea what's the difference between this method: https://cloud.google.com/bigquery/docs/reference/api-uploads - and the standard method of uploading to GCS and then starting a load job that copies from GCS? Is the first method "directly to BigQuery" and cheaper?
-
I can't seem to find documentation relating to this. Are we missing this feature, or am I unable to spot it in the Job struct?
I was wondering whether we could support batch loading with this client. That would be perfect and extremely economical.
Cool Example Here