Finding the data

Where: Globus

All ACES data is hosted at the University of Florida and is accessible via Globus. If you don't know how to navigate Globus, or don't yet have access, you can find a brief overview here.

But where on Globus?

Feathered cubes (individual regions)

To find the feathered 12m+7m+TP data products:

  • Go to the file manager tab
  • Select Collection = ACES - HPG
  • Go to /upload/Feather_12m_7m_TP/

Here you will see sub-directories for each of the SPWs. Within these are directories for cubes and moments. The cubes all have the following naming convention:

Sgr_A_st_XYZ.TP_7M_12M_feather_all.SPW123.image.statcont.contsub.fits

where XYZ should be replaced with the ACES region, and SPW123 with the spectral window.
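
If you already know which region and spectral window you want, you can also browse these directories from the command line. Below is a minimal sketch using the Globus CLI, assuming it is installed and authenticated; the collection UUID is a placeholder, and the exact sub-directory names (e.g. the cubes directory within each SPW) should be confirmed with a plain listing first.

```bash
# Placeholder -- replace with the "ACES - HPG" collection UUID from the Globus web app
ACES_EP="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"

# List the SPW sub-directories, then the cubes within one of them
globus ls "${ACES_EP}:/upload/Feather_12m_7m_TP/"
globus ls "${ACES_EP}:/upload/Feather_12m_7m_TP/<SPW>/cubes/"   # replace <SPW> with one of the names listed above
```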

Non-feathered products (individual regions)

If you want to download images for specific fields (non-feathered), you need to:

  • Go to the file manager tab
  • Select Collection = ACES - HPG
  • Select Path = /rawdata/2021.1.00172.L/science_goal.uid___A001_X1590_X30a8/group.uid___A001_X1590_X30a9/

Here you will find a lot of different directories starting with member.uid___A001_X... -- these are the MOUS IDs for all of our data. But how do you know which is which? For reference, you can consult either:

  • a mosaic of the ACES coverage and corresponding field names ('a', 'b', etc.), which can be found here, or
  • a list of the different lines covered and their corresponding spectral windows, which can be found here.

Example: let's say you wanted to get the 12m HCO+ cube for the Brick (field ao/41). Using the above tables:

  • MOUS ID = X15a0_X190 & SPW = 29
  • Go to the above base path + member.uid___A001_X15a0_X190
  • Go to the calibrated/working/ directory -- this is where the data products live
  • There are a lot of files here, but you can filter/sort to find exactly what you're looking for. In this case:
  • Cube = uid___A001_X15a0_X190.s38_0.Sgr_A_star_sci.spw29.cube.I.iter1.image.pbcor.statcont.contsub.fits
    • This is the primary beam corrected, continuum-subtracted cube for SPW29 of our desired MOUS
    • uid___A001_X15a0_X190 -> MOUS ID
    • s38_0 -> ALMA pipeline stage 38
    • Sgr_A_star_ -> field name (not the most useful in hindsight)
    • sci.spw29 -> science spectral window 29 (the 12m SPW which contains HCO+)
    • iter1 -> indicates that this has been cleaned (iter0 indicates dirty image)
    • pbcor -> primary beam corrected
    • statcont.contsub -> continuum subtraction has been performed using STATCONT
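
If you would rather fetch this cube from the command line (see the CLI section below), the same path can be handed to a single transfer request. This is a sketch only: both collection UUIDs below are placeholders you would look up in the Globus web app, and the destination path is just an example.

```bash
ACES_EP="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"   # "ACES - HPG" collection UUID (placeholder)
DEST_EP="yyyyyyyy-yyyy-yyyy-yyyy-yyyyyyyyyyyy"   # your personal endpoint UUID (placeholder)

BASE="/rawdata/2021.1.00172.L/science_goal.uid___A001_X1590_X30a8/group.uid___A001_X1590_X30a9"
MOUS="member.uid___A001_X15a0_X190"
CUBE="uid___A001_X15a0_X190.s38_0.Sgr_A_star_sci.spw29.cube.I.iter1.image.pbcor.statcont.contsub.fits"

# Single-file transfer of the Brick 12m HCO+ cube to an example destination path
globus transfer \
    "${ACES_EP}:${BASE}/${MOUS}/calibrated/working/${CUBE}" \
    "${DEST_EP}:/~/${CUBE}" \
    --label "ACES Brick spw29 cube"
```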

Mosaics

If instead you want the full ACES mosaics, simply go to /mosaics/ (again on ACES - HPG), and then go to either /continuum/ or /cubes/. The filenames here are more intuitive, though some of the cubes have 'downsampled' in their names to indicate that they have been downsampled spatially and/or spectrally. This has been done to keep the file sizes manageable; we are looking into approaches that keep the cubes manageable without degrading the resolution too much.
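
For a quick look at what is currently in the mosaics area without opening the web interface, a CLI listing works as well (the collection UUID is again a placeholder):

```bash
ACES_EP="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"   # "ACES - HPG" collection UUID (placeholder)

globus ls "${ACES_EP}:/mosaics/continuum/"
globus ls "${ACES_EP}:/mosaics/cubes/"
```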

How: it depends ...

Web interface

Please see our guide here for an overview of how to transfer data via the Globus website. In brief, you need to have a personal endpoint set up, to which you can then initiate file transfers.

Note that you can also perform direct downloads without an endpoint, though with limitations. You can only download files, not directories, and only one at a time.

Command Line Interface (CLI)

If you want to do batch transfers, or otherwise don't want to manually navigate through the many subdirectories to find the files you need each time, you can script the process using the Globus CLI. This takes a little more time to set up and does require some bash scripting, but once you get the hang of it, automating bulk transfers is straightforward.

You can find an example shell script for doing this here. This example script transfers all 12m HNCO cubes to your personal endpoint by looping through the CSV file mentioned above, searching for all cubes matching the filter, and then initiating the transfers.
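
To give a flavour of what that scripted approach looks like, here is a minimal sketch (not the linked script itself). It assumes placeholder endpoint UUIDs, walks the non-feathered rawdata path from earlier, and leaves the HNCO 12m spectral window as a value you look up in the line/SPW table linked above.

```bash
#!/usr/bin/env bash
# Sketch of a bulk-transfer loop -- the linked example script is the reference.
ACES_EP="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"   # "ACES - HPG" collection UUID (placeholder)
DEST_EP="yyyyyyyy-yyyy-yyyy-yyyy-yyyyyyyyyyyy"   # your personal endpoint UUID (placeholder)
SPW="spwNN"                                      # HNCO 12m spectral window -- see the line/SPW table
BASE="/rawdata/2021.1.00172.L/science_goal.uid___A001_X1590_X30a8/group.uid___A001_X1590_X30a9"

# Loop over every MOUS directory and grab the cleaned, pbcor, contsub cube for that SPW
for mous in $(globus ls "${ACES_EP}:${BASE}/" | grep 'member.uid___A001_X'); do
    mous="${mous%/}"                             # strip the trailing slash from the listing
    workdir="${BASE}/${mous}/calibrated/working"
    globus ls "${ACES_EP}:${workdir}/" \
        | grep "${SPW}.cube.I.iter1.image.pbcor.statcont.contsub.fits" \
        | while read -r cube; do
            globus transfer "${ACES_EP}:${workdir}/${cube}" \
                            "${DEST_EP}:/~/${cube}" \
                            --label "ACES bulk transfer"
          done
done
```

Each call to globus transfer here submits a separate task, which is where the transfer limit described below comes in.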

Note that whether you are using the website or the CLI, there is a limit of 100 active transfers. If you attempt to submit more than 100 transfers in quick succession, any requests beyond this limit will be rejected.
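
One way to stay well under that limit is to bundle many files into a single task with the CLI's --batch option: every line of the batch file becomes one file within a single transfer request, so the whole batch counts as one task. Below is a sketch with the same placeholder UUIDs and hypothetical filenames; note that older versions of the CLI read the batch list from stdin rather than taking a filename argument.

```bash
ACES_EP="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"   # "ACES - HPG" collection UUID (placeholder)
DEST_EP="yyyyyyyy-yyyy-yyyy-yyyy-yyyyyyyyyyyy"   # your personal endpoint UUID (placeholder)

# One "source_path destination_path" pair per line (filenames here are hypothetical)
cat > batch.txt <<'EOF'
/mosaics/cubes/example_cube_1.fits /~/example_cube_1.fits
/mosaics/cubes/example_cube_2.fits /~/example_cube_2.fits
EOF

# The entire batch is submitted as a single transfer task
globus transfer "${ACES_EP}" "${DEST_EP}" --batch batch.txt --label "ACES batch"
```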