Fixes related to sample data updates
Cadair committed Feb 8, 2025
1 parent d1710dd commit 34d9ab2
Showing 3 changed files with 6 additions and 4 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/main.yml
@@ -92,7 +92,7 @@ jobs:
    with:
      python-version: '3.13'
      test_extras: tests
-     test_command: pytest --pyargs dkist -k "not test_fail"
+     test_command: pytest --pyargs dkist -k "not test_fail" --remote-data=none
      # We have to work around a github runner bug here: https://github.com/actions/runner/issues/2788#issuecomment-2145922705
      upload_to_pypi: ${{ startsWith(github.ref || format('{0}{1}', 'refs/tags/', github.event.release.tag_name), 'refs/tags/v') && !endsWith(github.ref || format('{0}{1}', 'refs/tags/', github.event.release.tag_name), '.dev') }}
    secrets:
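The new `--remote-data=none` flag comes from the `pytest-remotedata` plugin: it skips any test carrying the `remote_data` marker, so this CI job no longer attempts downloads. A minimal sketch of how such a test is marked (the test name and body here are hypothetical, not from the dkist test suite):

```python
import pytest

# Tests that need network access carry the `remote_data` marker from the
# pytest-remotedata plugin. Running `pytest --remote-data=none` skips them;
# `--remote-data=any` runs them.
@pytest.mark.remote_data
def test_fetch_sample_asdf():
    # hypothetical body: would download a sample dataset here
    pass
```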
2 changes: 1 addition & 1 deletion docs/examples/reproject_vbi_mosaic.md
@@ -44,7 +44,7 @@ If you want to replace this dataset with your own dataset, see {ref}`dkist:howto
Let's load the data with {obj}`dkist.load_dataset`:

```{code-cell} ipython3
-ds = dkist.load_dataset(VBI_AJQWW / "VBI_L1_20231016T184519_AJQWW.asdf")
+ds = dkist.load_dataset(VBI_AJQWW)
ds
```
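The docs change relies on `dkist.load_dataset` accepting the `VBI_AJQWW` directory itself and finding the single ASDF file inside it, so the example no longer hard-codes the filename. A rough, hypothetical sketch of that kind of directory resolution (`resolve_asdf` is an illustration, not part of the dkist API):

```python
from pathlib import Path

def resolve_asdf(path):
    """Illustrative helper: resolve a directory to the single *.asdf file inside it."""
    path = Path(path)
    if path.is_file():
        return path
    candidates = sorted(path.glob("*.asdf"))
    if len(candidates) != 1:
        raise ValueError(
            f"Expected exactly one ASDF file in {path}, found {len(candidates)}"
        )
    return candidates[0]
```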
6 changes: 4 additions & 2 deletions tools/update_sample_data.py
@@ -45,7 +45,7 @@ def main(datasets, working_directory, destination_path="/user_tools_tutorial_dat

    for did, props in datasets.items():
        res = Fido.search(a.dkist.Dataset(did))
-        asdf_file = Fido.fetch(res, path=working_directory / "{dataset_id}", progress=False, overwrite=False)
+        asdf_file = Fido.fetch(res, path=working_directory / "{dataset_id}", progress=False, overwrite=True)

        ds = dkist.load_dataset(asdf_file)
        if "slice" in props:
@@ -64,10 +64,12 @@ def main(datasets, working_directory, destination_path="/user_tools_tutorial_dat
        [f.unlink() for f in dataset_path.glob("*.mp4")]
        [f.unlink() for f in dataset_path.glob("*.pdf")]
        assert len(list(dataset_path.glob("*.asdf"))) == 1
+        dataset_files = tuple(dataset_path.glob("*"))

        sample_filename = working_directory / props["filename"]
        with tarfile.open(sample_filename, mode="w") as tfile:
-            tfile.add(dataset_path, recursive=True)
+            for dfile in dataset_files:
+                tfile.add(dfile, arcname=dfile.name, recursive=False)

        sample_files_for_upload.append(sample_filename)

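The switch from `tfile.add(dataset_path, recursive=True)` to adding each file with `arcname=dfile.name` changes the archive layout: members now sit at the top level of the tarball rather than nested under the dataset directory's full path. A small self-contained demonstration (the helper and file names are made up for illustration):

```python
import tarfile
from pathlib import Path

def build_flat_tar(dataset_path, tar_path) -> list:
    """Add each file individually so the archive has no leading directory."""
    with tarfile.open(tar_path, mode="w") as tfile:
        for dfile in sorted(Path(dataset_path).glob("*")):
            tfile.add(dfile, arcname=dfile.name, recursive=False)
    # Re-open and list member names to show the flattened layout.
    with tarfile.open(tar_path) as tfile:
        return tfile.getnames()
```

With the old `recursive=True` call on the directory itself, every member name would instead carry the dataset directory path as a prefix.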
