Code for the zero-shot semantic parser described in our EMNLP 2018 paper.
The structure mapper implementation is an extension of this code.
- Install Miniconda2
- Install Stanford CoreNLP:
$ wget http://nlp.stanford.edu/software/stanford-corenlp-full-2016-10-31.zip
$ unzip stanford-corenlp-full-2016-10-31.zip
- Install python dependencies:
$ conda install --file reqs_conda.txt
$ pip install -r reqs_pip.txt
To delexicalize the data for all domains and prepare the cross-domain splits, use:
$ python src/py/zero_shot/preprocess.py
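As a rough illustration of what delexicalization means here (the actual logic lives in src/py/zero_shot/preprocess.py and is more involved; the lexicon and placeholder names below are hypothetical), domain-specific content words are replaced with abstract type placeholders so that structure learned in one domain can transfer to another:

```python
# Illustrative sketch of delexicalization (hypothetical example; see
# src/py/zero_shot/preprocess.py for the real implementation).
# Domain-specific words are mapped to abstract type placeholders.
LEXICON = {
    "meeting": "EVENT",
    "monday": "DATE",
    "alice": "PERSON",
}

def delexicalize(utterance):
    """Replace lexicon words with their abstract type placeholder."""
    tokens = utterance.lower().split()
    return " ".join(LEXICON.get(tok, tok) for tok in tokens)

print(delexicalize("meeting with Alice on Monday"))
# -> EVENT with PERSON on DATE
```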
To run one of the models described in the paper, use:
$ sh scripts/MODEL.sh SPLIT
Where:
- MODEL is one of: zero_shot, cross_lex, cross_lex_rep, in_abstract, in_lex.
- SPLIT is either test (the original train/test split of the OVERNIGHT dataset) or dev (the original train set is split into 80%/20% train/test sets).
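The dev protocol above can be sketched as a deterministic 80%/20% cut of the original train set (a minimal sketch; the repository's actual split code, seed, and shuffling may differ):

```python
import random

def dev_split(examples, train_frac=0.8, seed=0):
    """Shuffle with a fixed seed and cut into train/test subsets.

    Hypothetical sketch of the 80%/20% dev split described above;
    the repo's own preprocessing may use a different procedure.
    """
    rng = random.Random(seed)
    shuffled = list(examples)
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_frac)
    return shuffled[:cut], shuffled[cut:]

train, test = dev_split(range(100))
print(len(train), len(test))
# -> 80 20
```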
To run all models, use:
$ sh scripts/run_all.sh
Results are saved to the res folder. To print all results, use:
$ python src/py/zero_shot/print_res.py
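For a sense of what result aggregation might look like, here is a hypothetical sketch (the file names, JSON format, and keys below are assumptions, not the actual format written by the scripts or read by print_res.py):

```python
import json
import os
import tempfile

def print_results(res_dir):
    """Print one line per result file found in res_dir.

    Hypothetical sketch: assumes each result file is a JSON object
    with "model", "split", and "accuracy" keys; the repo's real
    result format may differ.
    """
    for name in sorted(os.listdir(res_dir)):
        if not name.endswith(".json"):
            continue
        with open(os.path.join(res_dir, name)) as f:
            r = json.load(f)
        print("{model:<15} {split:<5} {accuracy:.3f}".format(**r))

# Usage with a synthetic result file:
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "zero_shot.json"), "w") as f:
    json.dump({"model": "zero_shot", "split": "dev", "accuracy": 0.751}, f)
print_results(tmp)
```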