feat: [dataproc] add support for new Dataproc features (#9127)
* fix: Add service_yaml_parameters to py_gapic_library BUILD.bazel targets

PiperOrigin-RevId: 510187992

Source-Link: googleapis/googleapis@5edc235

Source-Link: https://github.com/googleapis/googleapis-gen/commit/b0bedb72e4765a3e0b674a28c50ea0f9a9b26a89
Copy-Tag: eyJwIjoiamF2YS1kYXRhcHJvYy8uT3dsQm90LnlhbWwiLCJoIjoiYjBiZWRiNzJlNDc2NWEzZTBiNjc0YTI4YzUwZWEwZjlhOWIyNmE4OSJ9

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

* feat: add support for new Dataproc features

PiperOrigin-RevId: 510421528

Source-Link: googleapis/googleapis@5a395e4

Source-Link: https://github.com/googleapis/googleapis-gen/commit/54e9de1f13dbe0c6bef5bc6e7a2a4edbd588bd1b
Copy-Tag: eyJwIjoiamF2YS1kYXRhcHJvYy8uT3dsQm90LnlhbWwiLCJoIjoiNTRlOWRlMWYxM2RiZTBjNmJlZjViYzZlN2EyYTRlZGJkNTg4YmQxYiJ9

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

* feat: ROLLBACK: add support for new Dataproc features

PiperOrigin-RevId: 511238112

Source-Link: googleapis/googleapis@d126575

Source-Link: https://github.com/googleapis/googleapis-gen/commit/a338d04315b79a899b4d2fa31469e4659ea322d0
Copy-Tag: eyJwIjoiamF2YS1kYXRhcHJvYy8uT3dsQm90LnlhbWwiLCJoIjoiYTMzOGQwNDMxNWI3OWE4OTliNGQyZmEzMTQ2OWU0NjU5ZWEzMjJkMCJ9

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

* feat: add support for new Dataproc features

1. Allow changing shielded config defaults for 2.1+ images
2. Support batch filtering in the list API
3. Support Trino jobs on 2.1+ image clusters
4. Support batch TTL
5. Support a custom staging bucket for batches
6. Expose approximate and current batch resource usage
7. Support Hudi and Trino components
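
Item 2 above adds `filter` and `order_by` fields to the ListBatches request. As a minimal sketch of what a client-side filter expression might look like (the field names `state` and `create_time` and the comparison syntax are assumptions here; the authoritative grammar is in the Dataproc ListBatches documentation):

```java
// Hypothetical helper for composing a ListBatches filter string.
// The filter grammar (field names, operators) is an assumption; consult
// the Dataproc API reference for the supported syntax.
public class BatchListFilter {
    // Selects batches in the given state created after an RFC 3339 timestamp.
    static String filterFor(String state, String createdAfter) {
        return String.format("state = %s AND create_time > \"%s\"", state, createdAfter);
    }

    public static void main(String[] args) {
        System.out.println(filterFor("RUNNING", "2023-01-01T00:00:00Z"));
    }
}
```

The resulting string would be set on the request via `ListBatchesRequest.newBuilder().setFilter(...)`, as the generated samples later in this diff show.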

PiperOrigin-RevId: 511550277

Source-Link: googleapis/googleapis@9111603

Source-Link: https://github.com/googleapis/googleapis-gen/commit/4a0877ffbede2d57c2f3776e2d8df6a2d6f9b99c
Copy-Tag: eyJwIjoiamF2YS1kYXRhcHJvYy8uT3dsQm90LnlhbWwiLCJoIjoiNGEwODc3ZmZiZWRlMmQ1N2MyZjM3NzZlMmQ4ZGY2YTJkNmY5Yjk5YyJ9

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

---------

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
gcf-owl-bot[bot] and gcf-owl-bot[bot] authored Feb 28, 2023
1 parent 0fd61a4 commit 841366b
Showing 97 changed files with 10,757 additions and 2,235 deletions.
6 changes: 3 additions & 3 deletions java-dataproc/README.md
Original file line number Diff line number Diff line change
@@ -20,20 +20,20 @@ If you are using Maven, add this to your pom.xml file:
<dependency>
<groupId>com.google.cloud</groupId>
<artifactId>google-cloud-dataproc</artifactId>
<version>4.6.0</version>
<version>4.8.0</version>
</dependency>
```

If you are using Gradle without BOM, add this to your dependencies:

```Groovy
implementation 'com.google.cloud:google-cloud-dataproc:4.6.0'
implementation 'com.google.cloud:google-cloud-dataproc:4.8.0'
```

If you are using SBT, add this to your dependencies:

```Scala
libraryDependencies += "com.google.cloud" % "google-cloud-dataproc" % "4.6.0"
libraryDependencies += "com.google.cloud" % "google-cloud-dataproc" % "4.8.0"
```
<!--- {x-version-update-end} -->

@@ -394,7 +394,8 @@ public final UnaryCallable<CreateBatchRequest, Operation> createBatchCallable()
* }
* }</pre>
*
* @param name Required. The name of the batch to retrieve.
* @param name Required. The fully qualified name of the batch to retrieve in the format
* "projects/PROJECT_ID/locations/DATAPROC_REGION/batches/BATCH_ID"
* @throws com.google.api.gax.rpc.ApiException if the remote call fails
*/
public final Batch getBatch(BatchName name) {
@@ -421,7 +422,8 @@ public final Batch getBatch(BatchName name) {
* }
* }</pre>
*
* @param name Required. The name of the batch to retrieve.
* @param name Required. The fully qualified name of the batch to retrieve in the format
* "projects/PROJECT_ID/locations/DATAPROC_REGION/batches/BATCH_ID"
* @throws com.google.api.gax.rpc.ApiException if the remote call fails
*/
public final Batch getBatch(String name) {
@@ -561,6 +563,8 @@ public final ListBatchesPagedResponse listBatches(String parent) {
* .setParent(LocationName.of("[PROJECT]", "[LOCATION]").toString())
* .setPageSize(883849137)
* .setPageToken("pageToken873572522")
* .setFilter("filter-1274492040")
* .setOrderBy("orderBy-1207110587")
* .build();
* for (Batch element : batchControllerClient.listBatches(request).iterateAll()) {
* // doThingsWith(element);
@@ -593,6 +597,8 @@ public final ListBatchesPagedResponse listBatches(ListBatchesRequest request) {
* .setParent(LocationName.of("[PROJECT]", "[LOCATION]").toString())
* .setPageSize(883849137)
* .setPageToken("pageToken873572522")
* .setFilter("filter-1274492040")
* .setOrderBy("orderBy-1207110587")
* .build();
* ApiFuture<Batch> future =
* batchControllerClient.listBatchesPagedCallable().futureCall(request);
@@ -626,6 +632,8 @@ public final ListBatchesPagedResponse listBatches(ListBatchesRequest request) {
* .setParent(LocationName.of("[PROJECT]", "[LOCATION]").toString())
* .setPageSize(883849137)
* .setPageToken("pageToken873572522")
* .setFilter("filter-1274492040")
* .setOrderBy("orderBy-1207110587")
* .build();
* while (true) {
* ListBatchesResponse response = batchControllerClient.listBatchesCallable().call(request);
@@ -665,7 +673,8 @@ public final UnaryCallable<ListBatchesRequest, ListBatchesResponse> listBatchesC
* }
* }</pre>
*
* @param name Required. The name of the batch resource to delete.
* @param name Required. The fully qualified name of the batch to delete in the format
* "projects/PROJECT_ID/locations/DATAPROC_REGION/batches/BATCH_ID"
* @throws com.google.api.gax.rpc.ApiException if the remote call fails
*/
public final void deleteBatch(BatchName name) {
@@ -693,7 +702,8 @@ public final void deleteBatch(BatchName name) {
* }
* }</pre>
*
* @param name Required. The name of the batch resource to delete.
* @param name Required. The fully qualified name of the batch to delete in the format
* "projects/PROJECT_ID/locations/DATAPROC_REGION/batches/BATCH_ID"
* @throws com.google.api.gax.rpc.ApiException if the remote call fails
*/
public final void deleteBatch(String name) {
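
The updated `@param` docs above spell out the fully qualified batch name format that the `String` overloads of `getBatch` and `deleteBatch` expect. The generated `BatchName` class builds this for you; the helper below is purely illustrative of the format (the project, region, and batch IDs are made-up values):

```java
// Illustrative sketch of the fully qualified batch name format
// "projects/PROJECT_ID/locations/DATAPROC_REGION/batches/BATCH_ID".
// In real code, prefer the generated BatchName.of(...) resource-name class.
public class BatchNames {
    static String of(String project, String region, String batchId) {
        return String.format("projects/%s/locations/%s/batches/%s", project, region, batchId);
    }

    public static void main(String[] args) {
        // prints: projects/my-project/locations/us-central1/batches/my-batch
        System.out.println(of("my-project", "us-central1", "my-batch"));
    }
}
```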
@@ -113,6 +113,24 @@
}
}
},
"NodeGroupController": {
"clients": {
"grpc": {
"libraryClient": "NodeGroupControllerClient",
"rpcs": {
"CreateNodeGroup": {
"methods": ["createNodeGroupAsync", "createNodeGroupAsync", "createNodeGroupAsync", "createNodeGroupOperationCallable", "createNodeGroupCallable"]
},
"GetNodeGroup": {
"methods": ["getNodeGroup", "getNodeGroup", "getNodeGroup", "getNodeGroupCallable"]
},
"ResizeNodeGroup": {
"methods": ["resizeNodeGroupAsync", "resizeNodeGroupAsync", "resizeNodeGroupOperationCallable", "resizeNodeGroupCallable"]
}
}
}
}
},
"WorkflowTemplateService": {
"clients": {
"grpc": {
@@ -142,24 +160,6 @@
}
}
}
},
"NodeGroupController": {
"clients": {
"grpc": {
"libraryClient": "NodeGroupControllerClient",
"rpcs": {
"CreateNodeGroup": {
"methods": ["createNodeGroupAsync", "createNodeGroupAsync", "createNodeGroupAsync", "createNodeGroupOperationCallable", "createNodeGroupCallable"]
},
"GetNodeGroup": {
"methods": ["getNodeGroup", "getNodeGroup", "getNodeGroup", "getNodeGroupCallable"]
},
"ResizeNodeGroup": {
"methods": ["resizeNodeGroupAsync", "resizeNodeGroupAsync", "resizeNodeGroupOperationCallable", "resizeNodeGroupCallable"]
}
}
}
}
}
}
}
@@ -97,43 +97,43 @@
* }
* }</pre>
*
* <p>======================= WorkflowTemplateServiceClient =======================
* <p>======================= NodeGroupControllerClient =======================
*
* <p>Service Description: The API interface for managing Workflow Templates in the Dataproc API.
* <p>Service Description: The `NodeGroupControllerService` provides methods to manage node groups
* of Compute Engine managed instances.
*
* <p>Sample for WorkflowTemplateServiceClient:
* <p>Sample for NodeGroupControllerClient:
*
* <pre>{@code
* // This snippet has been automatically generated and should be regarded as a code template only.
* // It will require modifications to work:
* // - It may require correct/in-range values for request initialization.
* // - It may require specifying regional endpoints when creating the service client as shown in
* // https://cloud.google.com/java/docs/setup#configure_endpoints_for_the_client_library
* try (WorkflowTemplateServiceClient workflowTemplateServiceClient =
* WorkflowTemplateServiceClient.create()) {
* LocationName parent = LocationName.of("[PROJECT]", "[LOCATION]");
* WorkflowTemplate template = WorkflowTemplate.newBuilder().build();
* WorkflowTemplate response =
* workflowTemplateServiceClient.createWorkflowTemplate(parent, template);
* try (NodeGroupControllerClient nodeGroupControllerClient = NodeGroupControllerClient.create()) {
* NodeGroupName name = NodeGroupName.of("[PROJECT]", "[REGION]", "[CLUSTER]", "[NODE_GROUP]");
* NodeGroup response = nodeGroupControllerClient.getNodeGroup(name);
* }
* }</pre>
*
* <p>======================= NodeGroupControllerClient =======================
* <p>======================= WorkflowTemplateServiceClient =======================
*
* <p>Service Description: The `NodeGroupControllerService` provides methods to manage node groups
* of Compute Engine managed instances.
* <p>Service Description: The API interface for managing Workflow Templates in the Dataproc API.
*
* <p>Sample for NodeGroupControllerClient:
* <p>Sample for WorkflowTemplateServiceClient:
*
* <pre>{@code
* // This snippet has been automatically generated and should be regarded as a code template only.
* // It will require modifications to work:
* // - It may require correct/in-range values for request initialization.
* // - It may require specifying regional endpoints when creating the service client as shown in
* // https://cloud.google.com/java/docs/setup#configure_endpoints_for_the_client_library
* try (NodeGroupControllerClient nodeGroupControllerClient = NodeGroupControllerClient.create()) {
* NodeGroupName name = NodeGroupName.of("[PROJECT]", "[REGION]", "[CLUSTER]", "[NODE_GROUP]");
* NodeGroup response = nodeGroupControllerClient.getNodeGroup(name);
* try (WorkflowTemplateServiceClient workflowTemplateServiceClient =
* WorkflowTemplateServiceClient.create()) {
* LocationName parent = LocationName.of("[PROJECT]", "[LOCATION]");
* WorkflowTemplate template = WorkflowTemplate.newBuilder().build();
* WorkflowTemplate response =
* workflowTemplateServiceClient.createWorkflowTemplate(parent, template);
* }
* }</pre>
*/
@@ -163,6 +163,8 @@ public class HttpJsonBatchControllerStub extends BatchControllerStub {
Map<String, List<String>> fields = new HashMap<>();
ProtoRestSerializer<ListBatchesRequest> serializer =
ProtoRestSerializer.create();
serializer.putQueryParam(fields, "filter", request.getFilter());
serializer.putQueryParam(fields, "orderBy", request.getOrderBy());
serializer.putQueryParam(fields, "pageSize", request.getPageSize());
serializer.putQueryParam(fields, "pageToken", request.getPageToken());
serializer.putQueryParam(fields, "$alt", "json;enum-encoding=int");
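
The stub change above wires the new `filter` and `orderBy` request fields into the HTTP/JSON query string. A toy, self-contained approximation of that serialization step (the real gax `ProtoRestSerializer` also skips proto default values, which this sketch mimics; names here are illustrative, not the gax API):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Toy sketch of request-field-to-query-parameter serialization, mimicking
// how the generated HTTP/JSON stub maps ListBatches fields. Empty strings
// and zero page sizes (proto defaults) are omitted, as gax does.
public class QueryParams {
    static Map<String, List<String>> forListBatches(
            String filter, String orderBy, int pageSize, String pageToken) {
        Map<String, List<String>> fields = new HashMap<>();
        put(fields, "filter", filter);
        put(fields, "orderBy", orderBy);
        if (pageSize > 0) {
            put(fields, "pageSize", Integer.toString(pageSize));
        }
        put(fields, "pageToken", pageToken);
        return fields;
    }

    private static void put(Map<String, List<String>> fields, String key, String value) {
        if (value == null || value.isEmpty()) {
            return; // default value: leave the parameter out entirely
        }
        List<String> values = new ArrayList<>();
        values.add(value);
        fields.put(key, values);
    }
}
```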
@@ -515,7 +515,8 @@ public void instantiateWorkflowTemplate(
* <pre>
* Instantiates a template and begins execution.
* This method is equivalent to executing the sequence
* [CreateWorkflowTemplate][google.cloud.dataproc.v1.WorkflowTemplateService.CreateWorkflowTemplate], [InstantiateWorkflowTemplate][google.cloud.dataproc.v1.WorkflowTemplateService.InstantiateWorkflowTemplate],
* [CreateWorkflowTemplate][google.cloud.dataproc.v1.WorkflowTemplateService.CreateWorkflowTemplate],
* [InstantiateWorkflowTemplate][google.cloud.dataproc.v1.WorkflowTemplateService.InstantiateWorkflowTemplate],
* [DeleteWorkflowTemplate][google.cloud.dataproc.v1.WorkflowTemplateService.DeleteWorkflowTemplate].
* The returned Operation can be used to track execution of
* workflow by polling
@@ -734,7 +735,8 @@ public void instantiateWorkflowTemplate(
* <pre>
* Instantiates a template and begins execution.
* This method is equivalent to executing the sequence
* [CreateWorkflowTemplate][google.cloud.dataproc.v1.WorkflowTemplateService.CreateWorkflowTemplate], [InstantiateWorkflowTemplate][google.cloud.dataproc.v1.WorkflowTemplateService.InstantiateWorkflowTemplate],
* [CreateWorkflowTemplate][google.cloud.dataproc.v1.WorkflowTemplateService.CreateWorkflowTemplate],
* [InstantiateWorkflowTemplate][google.cloud.dataproc.v1.WorkflowTemplateService.InstantiateWorkflowTemplate],
* [DeleteWorkflowTemplate][google.cloud.dataproc.v1.WorkflowTemplateService.DeleteWorkflowTemplate].
* The returned Operation can be used to track execution of
* workflow by polling
@@ -897,7 +899,8 @@ public com.google.longrunning.Operation instantiateWorkflowTemplate(
* <pre>
* Instantiates a template and begins execution.
* This method is equivalent to executing the sequence
* [CreateWorkflowTemplate][google.cloud.dataproc.v1.WorkflowTemplateService.CreateWorkflowTemplate], [InstantiateWorkflowTemplate][google.cloud.dataproc.v1.WorkflowTemplateService.InstantiateWorkflowTemplate],
* [CreateWorkflowTemplate][google.cloud.dataproc.v1.WorkflowTemplateService.CreateWorkflowTemplate],
* [InstantiateWorkflowTemplate][google.cloud.dataproc.v1.WorkflowTemplateService.InstantiateWorkflowTemplate],
* [DeleteWorkflowTemplate][google.cloud.dataproc.v1.WorkflowTemplateService.DeleteWorkflowTemplate].
* The returned Operation can be used to track execution of
* workflow by polling
@@ -1049,7 +1052,8 @@ protected WorkflowTemplateServiceFutureStub build(
* <pre>
* Instantiates a template and begins execution.
* This method is equivalent to executing the sequence
* [CreateWorkflowTemplate][google.cloud.dataproc.v1.WorkflowTemplateService.CreateWorkflowTemplate], [InstantiateWorkflowTemplate][google.cloud.dataproc.v1.WorkflowTemplateService.InstantiateWorkflowTemplate],
* [CreateWorkflowTemplate][google.cloud.dataproc.v1.WorkflowTemplateService.CreateWorkflowTemplate],
* [InstantiateWorkflowTemplate][google.cloud.dataproc.v1.WorkflowTemplateService.InstantiateWorkflowTemplate],
* [DeleteWorkflowTemplate][google.cloud.dataproc.v1.WorkflowTemplateService.DeleteWorkflowTemplate].
* The returned Operation can be used to track execution of
* workflow by polling
@@ -80,10 +80,10 @@ public static final com.google.protobuf.Descriptors.Descriptor getDescriptor() {
* Full URL, partial URI, or short name of the accelerator type resource to
* expose to this instance. See
* [Compute Engine
* AcceleratorTypes](https://cloud.google.com/compute/docs/reference/beta/acceleratorTypes).
* AcceleratorTypes](https://cloud.google.com/compute/docs/reference/v1/acceleratorTypes).
* Examples:
* * `https://www.googleapis.com/compute/beta/projects/[project_id]/zones/us-east1-a/acceleratorTypes/nvidia-tesla-k80`
* * `projects/[project_id]/zones/us-east1-a/acceleratorTypes/nvidia-tesla-k80`
* * `https://www.googleapis.com/compute/v1/projects/[project_id]/zones/[zone]/acceleratorTypes/nvidia-tesla-k80`
* * `projects/[project_id]/zones/[zone]/acceleratorTypes/nvidia-tesla-k80`
* * `nvidia-tesla-k80`
* **Auto Zone Exception**: If you are using the Dataproc
* [Auto Zone
@@ -115,10 +115,10 @@ public java.lang.String getAcceleratorTypeUri() {
* Full URL, partial URI, or short name of the accelerator type resource to
* expose to this instance. See
* [Compute Engine
* AcceleratorTypes](https://cloud.google.com/compute/docs/reference/beta/acceleratorTypes).
* AcceleratorTypes](https://cloud.google.com/compute/docs/reference/v1/acceleratorTypes).
* Examples:
* * `https://www.googleapis.com/compute/beta/projects/[project_id]/zones/us-east1-a/acceleratorTypes/nvidia-tesla-k80`
* * `projects/[project_id]/zones/us-east1-a/acceleratorTypes/nvidia-tesla-k80`
* * `https://www.googleapis.com/compute/v1/projects/[project_id]/zones/[zone]/acceleratorTypes/nvidia-tesla-k80`
* * `projects/[project_id]/zones/[zone]/acceleratorTypes/nvidia-tesla-k80`
* * `nvidia-tesla-k80`
* **Auto Zone Exception**: If you are using the Dataproc
* [Auto Zone
@@ -535,10 +535,10 @@ public Builder mergeFrom(
* Full URL, partial URI, or short name of the accelerator type resource to
* expose to this instance. See
* [Compute Engine
* AcceleratorTypes](https://cloud.google.com/compute/docs/reference/beta/acceleratorTypes).
* AcceleratorTypes](https://cloud.google.com/compute/docs/reference/v1/acceleratorTypes).
* Examples:
* * `https://www.googleapis.com/compute/beta/projects/[project_id]/zones/us-east1-a/acceleratorTypes/nvidia-tesla-k80`
* * `projects/[project_id]/zones/us-east1-a/acceleratorTypes/nvidia-tesla-k80`
* * `https://www.googleapis.com/compute/v1/projects/[project_id]/zones/[zone]/acceleratorTypes/nvidia-tesla-k80`
* * `projects/[project_id]/zones/[zone]/acceleratorTypes/nvidia-tesla-k80`
* * `nvidia-tesla-k80`
* **Auto Zone Exception**: If you are using the Dataproc
* [Auto Zone
@@ -569,10 +569,10 @@ public java.lang.String getAcceleratorTypeUri() {
* Full URL, partial URI, or short name of the accelerator type resource to
* expose to this instance. See
* [Compute Engine
* AcceleratorTypes](https://cloud.google.com/compute/docs/reference/beta/acceleratorTypes).
* AcceleratorTypes](https://cloud.google.com/compute/docs/reference/v1/acceleratorTypes).
* Examples:
* * `https://www.googleapis.com/compute/beta/projects/[project_id]/zones/us-east1-a/acceleratorTypes/nvidia-tesla-k80`
* * `projects/[project_id]/zones/us-east1-a/acceleratorTypes/nvidia-tesla-k80`
* * `https://www.googleapis.com/compute/v1/projects/[project_id]/zones/[zone]/acceleratorTypes/nvidia-tesla-k80`
* * `projects/[project_id]/zones/[zone]/acceleratorTypes/nvidia-tesla-k80`
* * `nvidia-tesla-k80`
* **Auto Zone Exception**: If you are using the Dataproc
* [Auto Zone
@@ -603,10 +603,10 @@ public com.google.protobuf.ByteString getAcceleratorTypeUriBytes() {
* Full URL, partial URI, or short name of the accelerator type resource to
* expose to this instance. See
* [Compute Engine
* AcceleratorTypes](https://cloud.google.com/compute/docs/reference/beta/acceleratorTypes).
* AcceleratorTypes](https://cloud.google.com/compute/docs/reference/v1/acceleratorTypes).
* Examples:
* * `https://www.googleapis.com/compute/beta/projects/[project_id]/zones/us-east1-a/acceleratorTypes/nvidia-tesla-k80`
* * `projects/[project_id]/zones/us-east1-a/acceleratorTypes/nvidia-tesla-k80`
* * `https://www.googleapis.com/compute/v1/projects/[project_id]/zones/[zone]/acceleratorTypes/nvidia-tesla-k80`
* * `projects/[project_id]/zones/[zone]/acceleratorTypes/nvidia-tesla-k80`
* * `nvidia-tesla-k80`
* **Auto Zone Exception**: If you are using the Dataproc
* [Auto Zone
@@ -636,10 +636,10 @@ public Builder setAcceleratorTypeUri(java.lang.String value) {
* Full URL, partial URI, or short name of the accelerator type resource to
* expose to this instance. See
* [Compute Engine
* AcceleratorTypes](https://cloud.google.com/compute/docs/reference/beta/acceleratorTypes).
* AcceleratorTypes](https://cloud.google.com/compute/docs/reference/v1/acceleratorTypes).
* Examples:
* * `https://www.googleapis.com/compute/beta/projects/[project_id]/zones/us-east1-a/acceleratorTypes/nvidia-tesla-k80`
* * `projects/[project_id]/zones/us-east1-a/acceleratorTypes/nvidia-tesla-k80`
* * `https://www.googleapis.com/compute/v1/projects/[project_id]/zones/[zone]/acceleratorTypes/nvidia-tesla-k80`
* * `projects/[project_id]/zones/[zone]/acceleratorTypes/nvidia-tesla-k80`
* * `nvidia-tesla-k80`
* **Auto Zone Exception**: If you are using the Dataproc
* [Auto Zone
@@ -665,10 +665,10 @@ public Builder clearAcceleratorTypeUri() {
* Full URL, partial URI, or short name of the accelerator type resource to
* expose to this instance. See
* [Compute Engine
* AcceleratorTypes](https://cloud.google.com/compute/docs/reference/beta/acceleratorTypes).
* AcceleratorTypes](https://cloud.google.com/compute/docs/reference/v1/acceleratorTypes).
* Examples:
* * `https://www.googleapis.com/compute/beta/projects/[project_id]/zones/us-east1-a/acceleratorTypes/nvidia-tesla-k80`
* * `projects/[project_id]/zones/us-east1-a/acceleratorTypes/nvidia-tesla-k80`
* * `https://www.googleapis.com/compute/v1/projects/[project_id]/zones/[zone]/acceleratorTypes/nvidia-tesla-k80`
* * `projects/[project_id]/zones/[zone]/acceleratorTypes/nvidia-tesla-k80`
* * `nvidia-tesla-k80`
* **Auto Zone Exception**: If you are using the Dataproc
* [Auto Zone
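
The updated Javadoc above lists three accepted forms for `acceleratorTypeUri`: a full `https://www.googleapis.com/compute/v1/...` URL, a partial `projects/.../acceleratorTypes/...` URI, or the bare short name. All three end in the short accelerator type name, so normalizing them is just taking the last path segment. A small illustrative helper (not part of the generated client):

```java
// Illustrative helper: normalize any of the three accepted acceleratorTypeUri
// forms (full URL, partial URI, short name) down to the short type name by
// taking the substring after the last '/'. Not part of the generated client.
public class AcceleratorTypes {
    static String shortName(String uri) {
        int slash = uri.lastIndexOf('/');
        return slash < 0 ? uri : uri.substring(slash + 1);
    }

    public static void main(String[] args) {
        // All three forms normalize to "nvidia-tesla-k80".
        System.out.println(shortName(
                "https://www.googleapis.com/compute/v1/projects/p/zones/us-east1-a/acceleratorTypes/nvidia-tesla-k80"));
        System.out.println(shortName("projects/p/zones/us-east1-a/acceleratorTypes/nvidia-tesla-k80"));
        System.out.println(shortName("nvidia-tesla-k80"));
    }
}
```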