Merge pull request #4 from nasa/dev
data files are published
thomasastanley authored Apr 22, 2022
2 parents af8ab7d + b5ed0df commit df2848b
Showing 1 changed file with 45 additions and 2 deletions.
47 changes: 45 additions & 2 deletions README.md
@@ -11,9 +11,50 @@ The latest predictions can be downloaded from https://maps.nccs.nasa.gov/download

### Data files

**We are currently obtaining a long-term URL for sharing the data files necessary to run LHASA 2.0. In the meantime, these files are available upon request.**
LHASA requires several large data files, but not all data may be needed by all users. The contents of [static.zip](https://gpm.nasa.gov/sites/default/files/data/landslides/static.zip) are required for the global landslide forecast. The contents of [exposure.zip](https://gpm.nasa.gov/sites/default/files/data/landslides/exposure.zip) are only used for the exposure analysis. The contents of [ref_data.zip](https://gpm.nasa.gov/sites/default/files/data/landslides/ref_data.zip) are only used for the global post-fire debris flow analysis.

LHASA requires several large data files, but not all data may be needed by all users. The contents of static.zip are required for the global landslide forecast. The contents of exposure.zip are only used for the exposure analysis. The contents of pfdf.zip are only used for the global post-fire debris flow analysis.
### Installation

After cloning this repository, some setup is required before running LHASA. The following commands have been tested in a Linux environment. Users of Windows or other systems may need to modify some of these steps.

```shell
# Set up python environment
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
sh Miniconda3-latest-Linux-x86_64.sh
conda env create -f lhasa.yml

# Set up directory structure
mkdir -p nrt/hazard/tif
mkdir -p nrt/exposure/csv
mkdir -p fcast/hazard/tif
mkdir -p fcast/exposure/csv
mkdir -p pfdf/firms

# Obtain required data files
wget https://gpm.nasa.gov/sites/default/files/data/landslides/static.zip
unzip static.zip
rm static.zip

wget https://gpm.nasa.gov/sites/default/files/data/landslides/exposure.zip
unzip exposure.zip
rm exposure.zip

wget https://gpm.nasa.gov/sites/default/files/data/landslides/ref_data.zip
unzip ref_data.zip -d pfdf
rm ref_data.zip

# Configure post-fire debris flow model
python pfdf/setup.py
```

The post-fire debris flow module uses Google Earth Engine to access Landsat imagery. Please see the [README](https://github.com/nasa/LHASA/blob/master/pfdf/README.md) for more information.
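Before the module can pull Landsat imagery, Google Earth Engine credentials must be configured. A minimal sketch of that one-time setup, assuming the `earthengine-api` package is provided by the `lhasa` conda environment (not confirmed by this README; see the pfdf README linked above for the authoritative steps):

```shell
# One-time Google Earth Engine configuration (a sketch; assumes the
# earthengine-api package is installed in the lhasa environment).
conda activate lhasa
earthengine authenticate   # opens a browser sign-in and stores credentials locally
```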

### Routine operation

Once a month, run the following commands to build the fires database needed to run the post-fire debris flow module:

```shell
conda activate lhasa
python scripts/gee_export_all.py --filepath pfdf --gee_username username
```

Then run [lhasa.sh](https://github.com/nasa/LHASA/blob/master/lhasa.sh) at the desired cadence, e.g., once per day.
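For a daily cadence, a cron job is one option. A sketch of a crontab entry, assuming the repository was cloned to `/home/user/LHASA` (a hypothetical path) and the log location is the user's choice:

```shell
# Hypothetical crontab entry (add with `crontab -e`): run LHASA daily at 06:00.
# Adjust the clone path and log file to match your installation.
0 6 * * * cd /home/user/LHASA && bash lhasa.sh >> lhasa_cron.log 2>&1
```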

### Citation

@@ -31,6 +72,8 @@ The software released here enables the user to run the global landslide forecast

No long-term archive for predictions from LHASA 2.0 has been established.

---

## LHASA 1.1

Although version 2 surpasses version 1 in accuracy and features, some users may prefer the simplicity of a single heuristic decision tree. Therefore, LHASA 1.1 is still running and its output can be seen at https://pmm.nasa.gov/precip-apps.
