refactor(hadoop): Update Hadoop Data Sources #414

Merged · 3 commits · Mar 20, 2024
66 changes: 40 additions & 26 deletions docs/data-sources/hadoop.md
@@ -7,44 +7,58 @@ subcategory: "Hadoop"

This data source can be useful for getting the details of an existing Hadoop instance.

~> **NOTE:** This only supports VPC environment.

## Example Usage

The following example shows how to look up a Hadoop instance by ID or by cluster name.

```terraform
data "ncloud_hadoop" "by_id" {
  id = 1234567
}

data "ncloud_hadoop" "by_name" {
  cluster_name = "example"
}
```

## Argument Reference

The following arguments are required:

* `id` - (Required) Hadoop instance number. Either `id` or `cluster_name` must be provided.
* `cluster_name` - (Required) Hadoop service name. Either `id` or `cluster_name` must be provided.

## Attributes Reference

This data source exports the following attributes in addition to the arguments above (a usage sketch follows the list):

* `region_code` - Region code.
* `vpc_no` - The ID of the associated VPC.
* `image_product_code` - The image product code of the Hadoop instance.
* `cluster_type_code` - The cluster type code.
* `version` - The version of Hadoop.
* `ambari_server_host` - The name of the Ambari host.
* `cluster_direct_access_account` - Account name with direct access to the cluster.
* `login_key` - The login key name.
* `object_storage_bucket` - The name of the Object Storage bucket.
* `kdc_realm` - Realm information for Kerberos authentication.
* `domain` - Domain.
* `is_ha` - Whether high availability is enabled for the specific Hadoop.
* `add_on_code_list` - The list of Hadoop add-ons.
* `access_control_group_no_list` - The list of access control group numbers.
* `hadoop_server_list` - The list of Hadoop server instances.
  * `server_instance_no` - Server instance number.
  * `server_name` - Name of the server.
  * `server_role` - Server role code (e.g., M for Master, H for Standby Master).
  * `zone_code` - Zone code.
  * `subnet_no` - The ID of the associated subnet.
  * `product_code` - Product code.
  * `is_public_subnet` - Public subnet status.
  * `cpu_count` - The number of virtual CPUs.
  * `memory_size` - Memory size.
  * `data_storage_type` - The type of data storage.
  * `data_storage_size` - Data storage size.
  * `uptime` - Running start time.
  * `create_date` - Server creation date.
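
For illustration only (not part of this diff), here is a minimal sketch of how the exported attributes might be consumed. It reuses the `by_id` data source from the example above; the output names are arbitrary:

```terraform
# Hypothetical outputs exposing a few attributes of the looked-up cluster.
output "hadoop_vpc_no" {
  value = data.ncloud_hadoop.by_id.vpc_no
}

output "hadoop_version" {
  value = data.ncloud_hadoop.by_id.version
}
```

Any attribute listed above can be referenced the same way.
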
19 changes: 11 additions & 8 deletions docs/data-sources/hadoop_add_on.md
@@ -5,22 +5,22 @@ subcategory: "Hadoop"

# Data Source: ncloud_hadoop_add_on

This data source can be useful for getting the add-on list of a Hadoop cluster.

~> **NOTE:** This only supports VPC environment.

## Example Usage

The following example shows how to retrieve the add-ons of a Hadoop cluster.

```terraform
data "ncloud_hadoop_add_on" "addon" {
  image_product_code = "SW.VCHDP.LNX64.CNTOS.0708.HDP.21.B050"
  cluster_type_code  = "CORE_HADOOP_WITH_SPARK"
}

data "ncloud_hadoop_add_on" "addon_output_file" {
  image_product_code = "SW.VCHDP.LNX64.CNTOS.0708.HDP.21.B050"
  cluster_type_code  = "CORE_HADOOP_WITH_SPARK"

  output_file = "hadoop_add_on.json"
}
```

## Argument Reference

The following arguments are supported:

* `image_product_code` - (Required) The image product code of the specific Hadoop add-on to retrieve.
* `cluster_type_code` - (Required) The cluster type code of the specific Hadoop add-on to retrieve.
* `output_file` - (Optional) The name of the file to which the data source result is saved after running `terraform plan`.

## Attributes Reference

This data source exports the following attributes in addition to the arguments above (a usage sketch follows the list):

* `add_on_list` - The add-on list of Hadoop.
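
As a hedged illustration (not part of this diff), the returned list can be exposed like any other Terraform value. The data source name `addon` comes from the example above; the output name is arbitrary:

```terraform
# Hypothetical output exposing the add-on names returned by the data source.
output "hadoop_add_ons" {
  value = data.ncloud_hadoop_add_on.addon.add_on_list
}
```
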
15 changes: 9 additions & 6 deletions docs/data-sources/hadoop_bucket.md
@@ -5,15 +5,15 @@ subcategory: "Hadoop"

# Data Source: ncloud_hadoop_bucket

This data source can be useful for getting the Object Storage buckets needed to create a Hadoop instance.

~> **NOTE:** This only supports VPC environment.

## Example Usage

The following example shows how to retrieve the Object Storage bucket list.

```terraform
data "ncloud_hadoop_bucket" "bucket" {

}
```

@@ -25,9 +25,12 @@ data "ncloud_hadoop_add_on" "bucket_output_file" {

## Argument Reference

The following arguments are supported:

* `output_file` - (Optional) The name of the file to which the data source result is saved after running `terraform plan`.

## Attributes Reference

This data source exports the following attributes in addition to the arguments above (a usage sketch follows the list):

* `bucket_list` - The Object Storage bucket list of Hadoop.
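
For illustration (not part of this diff), a minimal sketch that exposes and counts the returned buckets. The data source name `bucket` comes from the example above; the output names are arbitrary:

```terraform
# Hypothetical outputs built from the bucket list returned by the data source.
output "hadoop_buckets" {
  value = data.ncloud_hadoop_bucket.bucket.bucket_list
}

output "hadoop_bucket_count" {
  value = length(data.ncloud_hadoop_bucket.bucket.bucket_list)
}
```
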
11 changes: 0 additions & 11 deletions examples/hadoop_add_on/main.tf

This file was deleted.

19 changes: 0 additions & 19 deletions examples/hadoop_add_on/variables.tf

This file was deleted.

8 changes: 0 additions & 8 deletions examples/hadoop_add_on/versions.tf

This file was deleted.

9 changes: 0 additions & 9 deletions examples/hadoop_buckets/main.tf

This file was deleted.

11 changes: 0 additions & 11 deletions examples/hadoop_buckets/variables.tf

This file was deleted.

8 changes: 0 additions & 8 deletions examples/hadoop_buckets/versions.tf

This file was deleted.

17 changes: 17 additions & 0 deletions internal/common/data_source.go
@@ -36,6 +36,23 @@ func WriteToFile(filePath string, data interface{}) error {
	return os.WriteFile(filePath, []byte(str), 0777)
}

// WriteStringListToFile unwraps a framework types.List of string values and
// writes the collected values to the given file using WriteToFile.
func WriteStringListToFile(path string, list types.List) error {
	var dataList []string

	for _, v := range list.Elements() {
		// Each element renders as a JSON-quoted string, so unmarshal it back
		// into a plain Go string before collecting it.
		var data string
		if err := json.Unmarshal([]byte(v.String()), &data); err != nil {
			return err
		}
		dataList = append(dataList, data)
	}

	if err := WriteToFile(path, dataList); err != nil {
		return err
	}
	return nil
}

func WriteImageProductToFile(path string, images types.List) error {
	var imagesToJson []imageProductToJson
