Fix: Added "is_public" to tabular_datasets table #501

Merged (7 commits, Oct 7, 2022)
Changes from 1 commit
Adding a boolean column to show if a dataset is public or not
happyhuman committed Oct 5, 2022
commit 88176721a97122f2a6545892f295bac4696a7441
@@ -148,6 +148,7 @@ class DatasetInfo:
    dataset_id: str = None
    description: str = None
    num_tables: int = None
    is_public: bool = None

    def __init__(
        self,
@@ -161,6 +162,8 @@ def __init__(
        self.description = np.nan
        self.created_at = dataset_reference.created
        self.modified_at = dataset_reference.modified
        entries = list(dataset_reference.access_entries)
        self.is_public = any(map(lambda e: e.entity_id in {'allAuthenticatedUsers', 'allUsers'}, entries))

    def __repr__(self) -> str:
        return f"{self.project_id}.{self.dataset_id}"
@@ -54,7 +54,12 @@ resources:
"name": "num_tables",
"description": "Number of tables contained in this dataset",
"type": "INTEGER"
}
},
{
"name": "is_public",
"description": "Whether or not the dataset is public to all users",
"type": "BOOLEAN"
}
]
- type: bigquery_table
dataset_id: _cloud_datasets
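With the schema change above, consumers of the table this column is added to (`tabular_datasets`, per the PR title) can filter on `is_public`. A hedged usage sketch, assuming the google-cloud-bigquery client; `PROJECT` is a placeholder, since the project hosting `_cloud_datasets` is not named in this diff, and the selected columns are taken from the DatasetInfo fields shown above:

from google.cloud import bigquery

client = bigquery.Client()
# PROJECT stands in for the project that hosts the _cloud_datasets dataset.
query = """
    SELECT dataset_id, description
    FROM `PROJECT._cloud_datasets.tabular_datasets`
    WHERE is_public = TRUE
"""
for row in client.query(query).result():
    print(row.dataset_id, row.description)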
26 changes: 26 additions & 0 deletions datasets/noaa_puget/pipelines/dataset.yaml
@@ -0,0 +1,26 @@
# Copyright 2021 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

dataset:
  name: noaa_puget
  friendly_name: NOAA Puget Sound Nearshore Fish 2017-2018
  description: ~
  update_frequency: ~
  dataset_sources: ~
  terms_of_use: ~

resources:
  - type: bigquery_dataset
    dataset_id: noaa_puget
    description: ~78k images extracted from nearshore marine video, on which ~68k bounding boxes have been added on fish and crustaceans.
58 changes: 58 additions & 0 deletions datasets/noaa_puget/pipelines/noaa_puget/pipeline.yaml
@@ -0,0 +1,58 @@
# Copyright 2021 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

---
resources: ~

dag:
  airflow_version: 2
  initialize:
    dag_id: "noaa_puget"
    default_args:
      owner: "Google"
      depends_on_past: False
      start_date: '2022-08-30'
    max_active_runs: 1
    schedule_interval: "@yearly"  # TODO: Is this okay for "NEVER"??
    catchup: False
    default_view: graph

  tasks:
    - operator: "BashOperator"
      description: "Task to copy `noaa_estuary_fish-images.zip` to GCS."
      args:
        task_id: "download_and_process_image_zip_file"
        bash_command: |
          mkdir -p $data_dir/{{ ds }}
          curl -o $data_dir/{{ ds }}/noaa_estuary_fish_images.zip -L $zip_source_url
          unzip $data_dir/{{ ds }}/noaa_estuary_fish_images.zip -d $data_dir/{{ ds }}/images
        env:
          zip_source_url: "http://ecologize.ddns.net/lila-files/noaa_estuary_fish-images.zip"
          data_dir: "/home/airflow/gcs/data/noaa_puget/noaa_puget"

    - operator: "BashOperator"
      description: "Task to copy `noaa_estuary_fish-annotations.zip` to GCS."
      args:
        task_id: "download_and_process_annotation_zip_file"
        bash_command: |
          mkdir -p $data_dir/{{ ds }}
          curl -o $data_dir/{{ ds }}/noaa_estuary_fish_annotations.zip -L $zip_source_url
          unzip $data_dir/{{ ds }}/noaa_estuary_fish_annotations.zip -d $data_dir/{{ ds }}/annotations
        env:
          zip_source_url: "http://ecologize.ddns.net/lila-files/noaa_estuary_fish-annotations.zip"
          data_dir: "/home/airflow/gcs/data/noaa_puget/noaa_puget"

  graph_paths:
    - "download_and_process_image_zip_file >> download_and_process_annotation_zip_file"