
Commit 4be5ac2
Fix job configuration option for aztk spark job submit command (#435)
The `--job-conf` option mentioned in the docs wasn't working.

The CLI help showed the option as `--configuration-c`, which was the result of a missing comma in the option definition: Python implicitly concatenates the adjacent string literals into a single flag name.
shtratos authored and jafreck committed Mar 13, 2018
1 parent 1c31335 commit 4be5ac2
Showing 2 changed files with 2 additions and 2 deletions.
aztk_cli/spark/endpoints/job/submit.py (1 addition, 1 deletion)
```diff
@@ -11,7 +11,7 @@ def setup_parser(parser: argparse.ArgumentParser):
                         dest='job_id',
                         required=False,
                         help='The unique id of your Spark Job. Defaults to the id value in .aztk/job.yaml')
-    parser.add_argument('--configuration' '-c',
+    parser.add_argument('--configuration', '-c',
                         dest='job_conf',
                         required=False,
                         help='Path to the job.yaml configuration file. Defaults to .aztk/job.yaml')
```
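For context, here is a minimal standalone sketch (plain argparse, not the aztk source itself) of why the missing comma matters: Python joins the adjacent string literals `'--configuration' '-c'` into the single flag name `--configuration-c`, whereas the fixed call registers `--configuration` and `-c` separately.

```python
import argparse

# Broken definition: the missing comma makes Python concatenate the two
# adjacent string literals, so argparse registers one long flag,
# --configuration-c, and the short option -c is not recognized at all.
broken = argparse.ArgumentParser()
broken.add_argument('--configuration' '-c', dest='job_conf')

# Fixed definition: two separate strings register both spellings of the flag.
fixed = argparse.ArgumentParser()
fixed.add_argument('--configuration', '-c', dest='job_conf')

print(broken.parse_args(['--configuration-c', 'job.yaml']).job_conf)  # job.yaml
print(fixed.parse_args(['--configuration', 'job.yaml']).job_conf)     # job.yaml
print(fixed.parse_args(['-c', 'job.yaml']).job_conf)                  # job.yaml
```

Passing `-c` to the broken parser makes argparse report an error instead of accepting the configuration path, which matches the behavior the commit message describes.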
docs/70-jobs.md (1 addition, 1 deletion)
````diff
@@ -92,7 +92,7 @@ Once submitted, this Job will run two applications, pipy100 and pipy200, on an a
 Submit a Spark Job:
 ```sh
-aztk spark job submit --id <your_job_id> --job-conf </path/to/job.yaml>
+aztk spark job submit --id <your_job_id> --configuration </path/to/job.yaml>
 ```

 NOTE: The Job id (`--id`) can only contain alphanumeric characters including hyphens and underscores, and cannot contain more than 64 characters. Each Job **must** have a unique id.
````
