[WIP] Define best practices for defining collections when produced for specific stories #58
@abarciauskas-bgse From the STAC perspective I would recommend following the consensus decision you referenced of using only generalizable collections; this avoids conflating event-specific concerns with the general collection. This seems like a best practice but presents another issue: where do we store story-specific filters to access only the items in the general collection which are relevant to the story? I'd suggest using some type of
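One option for the story-specific filter question (a sketch, not a settled convention): store a saved STAC API search payload alongside the story content and resolve it at render time. The bounding box, datetime range, and field layout below are illustrative assumptions, not values from the actual stories.

```python
import json

# Hypothetical: a story stores a saved STAC API item-search body instead of
# having its own event-scoped collection. All values here are illustrative.
ida_nightlights_search = {
    "collections": ["nightlights-hd-monthly"],   # the general collection
    "bbox": [-91.0, 29.0, -89.5, 30.5],          # assumed AOI near New Orleans
    "datetime": "2021-08-01T00:00:00Z/2021-09-30T23:59:59Z",
}

def search_body(saved_filter: dict) -> str:
    """Serialize the saved filter as a POST /search request body."""
    return json.dumps(saved_filter)

body = search_body(ida_nightlights_search)
```

The story then only needs the general collection plus this payload; re-ingesting into a per-event collection becomes unnecessary.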
@abarciauskas-bgse @sharkinsspatial @xhagrg the temporal cadence of the hurricane event nightlights data is a unique case:
Nightlights events test metadata with start/end
I sent Ranjay an email about the temporal nature of the BMHD monthly files - I think if he can verify that the start_ and end_datetime of the monthly data files can be used for the
I believe for Ida there is just a day before the hurricane (2021-08-09) and a day after the hurricane (2021-08-31), so I think a single datetime does make sense for those items.
@abarciauskas-bgse @anayeaye after reading the response from Ranjay, it looks like we can just use the start and end datetime of the corresponding month? Do we move ahead with ingestion of these files in the same collection?
Yes, thanks @xhagrg for checking. I think we should consolidate in the nightlights-hd-monthly dataset and also add
Also, Ranjay shared these links as the product pages for the dataset: https://ladsweb.modaps.eosdis.nasa.gov/missions-and-measurements/products/VNP46A3/ and https://ladsweb.modaps.eosdis.nasa.gov/missions-and-measurements/products/VNP46A4/. @anayeaye @sharkinsspatial is there a good place for these types of references in the STAC collection metadata? I'm checking with Ranjay, but my guess is that those are the product pages for the source HDF5 used to generate the COGs we have.
@abarciauskas-bgse @sharkinsspatial @xhagrg RE: datetimes, I also don't want to block ingest of the nightlights data. I think it will be fine either way because it is small enough to easily refactor or re-ingest if needed.
@abarciauskas-bgse Just refreshed and saw the Collection level metadata question above. I think these references would be good links to add to the document. This HLS delta collection has external links to metadata, maybe we could follow this pattern: https://dev-stac.delta-backend.xyz/collections/HLSS30.002
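Following that HLS pattern, the product pages could be sketched as entries in the collection's `links` array. The `rel` value below is an assumption ("describedby" is another candidate), and the collection record is a minimal stand-in, not the real VEDA record.

```python
# Sketch, assuming VEDA mirrors the HLS delta collection's external-metadata
# links: add the LAADS product pages to the collection's "links" array.
# The rel value "about" is an assumption; "describedby" is another candidate.
product_links = [
    {
        "href": "https://ladsweb.modaps.eosdis.nasa.gov/missions-and-measurements/products/VNP46A3/",
        "rel": "about",
        "type": "text/html",
        "title": "VNP46A3 product page",
    },
    {
        "href": "https://ladsweb.modaps.eosdis.nasa.gov/missions-and-measurements/products/VNP46A4/",
        "rel": "about",
        "type": "text/html",
        "title": "VNP46A4 product page",
    },
]

# Minimal stand-in for the collection record being updated.
collection = {"id": "nightlights-hd-monthly", "links": []}
collection["links"].extend(product_links)
```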
@xhagrg per @anayeaye's comment about start_ and end_datetime, I think we will want to include start_ and end_datetime for the Ida files. Sorry for the re-work. I see those files are already published to https://dev-stac.delta-backend.xyz/collections/BMHD/items
@abarciauskas-bgse I will be using the "nightlights-hd-monthly" collection, which already exists, and will be adding start_ and end_datetime in the properties. Do we retain the datetime field, or set it to null as done previously?
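For reference, the STAC common metadata spec allows `datetime` to be null for items that span a range, as long as both `start_datetime` and `end_datetime` are set. A minimal sketch of the monthly item properties (the timestamps are illustrative):

```python
# Per STAC common metadata, "datetime" may be null when the item covers a
# range, provided both "start_datetime" and "end_datetime" are present.
monthly_item_properties = {
    "datetime": None,  # serializes to JSON null
    "start_datetime": "2021-08-01T00:00:00Z",
    "end_datetime": "2021-08-31T23:59:59Z",
}

def valid_range_datetimes(props: dict) -> bool:
    """Minimal check: a null datetime requires both range endpoints."""
    if props.get("datetime") is None:
        return bool(props.get("start_datetime")) and bool(props.get("end_datetime"))
    return True

ok = valid_range_datetimes(monthly_item_properties)
```

So retaining `datetime` as null alongside the range fields would be spec-compliant for the monthly files.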
Moving this to veda-data as it's a good/useful conversation, and we're sunsetting this repo: https://github.com/NASA-IMPACT/veda-architecture/issues/322. |
The question has arisen for datasets we are curating specifically to tell the EJ story of Hurricanes Ida and Maria: how do we want to organize collections that are produced for specific events (e.g., hurricanes)? The consensus so far is that we want to publish to generalizable collections as much as possible, rather than collections scoped to a specific event.
What this means is that for the EJ story we will be creating and using the following collections:
Interested in any additional thoughts or considerations from the team @anayeaye @slesaad @xhagrg @leothomas @danielfdsilva