1.10.2 patch #713

Merged: 8 commits, Jul 31, 2020
9 changes: 8 additions & 1 deletion CHANGELOG.md
@@ -1,8 +1,15 @@
# nf-core/tools: Changelog

## [v1.10.2 - Copper Camel _(brought back from the dead)_](https://github.com/nf-core/tools/releases/tag/1.10.2) - [2020-07-31]

Second patch release to address some small errors discovered in the pipeline template.
Apologies for the inconvenience.

* Fix syntax error in `/push_dockerhub.yml` GitHub Actions workflow (see the structure sketch below this list)
* Change `params.readPaths` -> `params.input_paths` in `test_full.config`
* Check results when posting the lint results as a GitHub comment
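
For reference, the syntax error mentioned in the first bullet is a missing top-level `jobs:` key in the workflow file, as the `push_dockerhub.yml` diff further down shows. A minimal sketch of the corrected structure (triggers and most steps abridged) might look like this:

```yaml
on:
  release:
    types: [published]
  # ... other trigger events elided ...

jobs:                # this top-level key was missing, which made the workflow invalid
  push_dockerhub:
    name: Push new Docker image to Docker Hub
    runs-on: ubuntu-latest
    steps:
      - name: Check out pipeline code
        uses: actions/checkout@v2
      # ... Docker build and push steps as in the template ...
```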

## [v1.10.1](https://github.com/nf-core/tools/releases/tag/1.10.1) - [2020-07-30]
## [v1.10.1 - Copper Camel _(patch)_](https://github.com/nf-core/tools/releases/tag/1.10.1) - [2020-07-30]

Patch release to fix the automatic template synchronisation, which failed in the v1.10 release.

2 changes: 1 addition & 1 deletion docs/lint_errors.md
@@ -177,7 +177,7 @@ This test will fail if the following requirements are not met in these files:

2. `linting.yml`: Specifies the commands to lint the pipeline repository using `nf-core lint` and `markdownlint` (a minimal workflow sketch follows this list)
* Must be turned on for `push` and `pull_request`.
* Must have the command `nf-core lint ${GITHUB_WORKSPACE}`.
* Must have the command `nf-core -l lint_log.txt lint ${GITHUB_WORKSPACE}`.
* Must have the command `markdownlint ${GITHUB_WORKSPACE} -c ${GITHUB_WORKSPACE}/.github/markdownlint.yml`.

3. `branch.yml`: Ensures that pull requests to the protected `master` branch are coming from the correct branch when a PR is opened against the _nf-core_ repository.
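The sketch below shows, under stated assumptions, a minimal `linting.yml` workflow that satisfies the requirements above: the workflow name and setup steps are illustrative, while the `nf-core` job name matches what `lint.py` inspects and the two `run` commands are the ones listed in the requirements.

```yaml
name: nf-core linting          # workflow name is illustrative
# Must be turned on for push and pull_request events
on: [push, pull_request]

jobs:
  # lint.py checks the steps of the job named `nf-core` for the lint command
  nf-core:
    runs-on: ubuntu-latest
    steps:
      # ... checkout and nf-core/tools installation steps elided ...
      - name: Run nf-core lint
        run: nf-core -l lint_log.txt lint ${GITHUB_WORKSPACE}

  markdownlint:                # job name is illustrative
    runs-on: ubuntu-latest
    steps:
      # ... checkout and markdownlint installation steps elided ...
      - name: Run markdownlint
        run: markdownlint ${GITHUB_WORKSPACE} -c ${GITHUB_WORKSPACE}/.github/markdownlint.yml
```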
78 changes: 47 additions & 31 deletions nf_core/lint.py
@@ -771,7 +771,7 @@ def check_actions_lint(self):
self.passed.append((5, "Continuous integration runs Markdown lint Tests: `{}`".format(fn)))

# Check that the nf-core linting runs
nfcore_lint_cmd = "nf-core lint ${GITHUB_WORKSPACE}"
nfcore_lint_cmd = "nf-core -l lint_log.txt lint ${GITHUB_WORKSPACE}"
try:
steps = lintwf["jobs"]["nf-core"]["steps"]
assert any([nfcore_lint_cmd in step["run"] for step in steps if "run" in step.keys()])
@@ -1440,39 +1440,55 @@ def github_comment(self):
"""
If we are running in a GitHub PR, try to post results as a comment
"""
if os.environ.get("GITHUB_TOKEN", "") != "" and os.environ.get("GITHUB_COMMENTS_URL", "") != "":
try:
headers = {"Authorization": "token {}".format(os.environ["GITHUB_TOKEN"])}
# Get existing comments - GET
get_r = requests.get(url=os.environ["GITHUB_COMMENTS_URL"], headers=headers)
if get_r.status_code == 200:

# Look for an existing comment to update
update_url = False
for comment in get_r.json():
if comment["user"]["login"] == "github-actions[bot]" and comment["body"].startswith(
"\n#### `nf-core lint` overall result"
):
# Update existing comment - PATCH
log.info("Updating GitHub comment")
update_r = requests.patch(
url=comment["url"],
data=json.dumps({"body": self.get_results_md().replace("Posted", "**Updated**")}),
headers=headers,
)
return

# Create new comment - POST
if len(self.warned) > 0 or len(self.failed) > 0:
log.info("Posting GitHub comment")
post_r = requests.post(
url=os.environ["GITHUB_COMMENTS_URL"],
data=json.dumps({"body": self.get_results_md()}),
if os.environ.get("GITHUB_TOKEN", "") == "":
log.debug("Environment variable GITHUB_TOKEN not found")
return
if os.environ.get("GITHUB_COMMENTS_URL", "") == "":
log.debug("Environment variable GITHUB_COMMENTS_URL not found")
return
try:
headers = {"Authorization": "token {}".format(os.environ["GITHUB_TOKEN"])}
# Get existing comments - GET
get_r = requests.get(url=os.environ["GITHUB_COMMENTS_URL"], headers=headers)
if get_r.status_code == 200:

# Look for an existing comment to update
update_url = False
for comment in get_r.json():
if comment["user"]["login"] == "github-actions[bot]" and comment["body"].startswith(
"\n#### `nf-core lint` overall result"
):
# Update existing comment - PATCH
log.info("Updating GitHub comment")
update_r = requests.patch(
url=comment["url"],
data=json.dumps({"body": self.get_results_md().replace("Posted", "**Updated**")}),
headers=headers,
)
return

# Create new comment - POST
if len(self.warned) > 0 or len(self.failed) > 0:
r = requests.post(
url=os.environ["GITHUB_COMMENTS_URL"],
data=json.dumps({"body": self.get_results_md()}),
headers=headers,
)
try:
r_json = json.loads(r.content)
response_pp = json.dumps(r_json, indent=4)
except:
r_json = r.content
response_pp = r.content

if r.status_code == 201:
log.info("Posted GitHub comment: {}".format(r_json["html_url"]))
log.debug(response_pp)
else:
log.warn("Could not post GitHub comment: '{}'\n{}".format(r.status_code, response_pp))

except Exception as e:
log.warning("Could not post GitHub comment: {}\n{}".format(os.environ["GITHUB_COMMENTS_URL"], e))
except Exception as e:
log.warning("Could not post GitHub comment: {}\n{}".format(os.environ["GITHUB_COMMENTS_URL"], e))

def _wrap_quotes(self, files):
if not isinstance(files, list):
@@ -21,7 +21,7 @@ jobs:
run: conda install -c conda-forge awscli
- name: Start AWS batch job
# TODO nf-core: You can customise AWS full pipeline tests as required
# Add full size test data (but still relatively small datasets for few samples)
# Add full size test data (but still relatively small datasets for few samples)
# on the `test_full.config` test runs with only one set of parameters
# Then specify `-profile test_full` instead of `-profile test` on the AWS batch command
{% raw %}env:
@@ -57,5 +57,12 @@ jobs:
GITHUB_COMMENTS_URL: ${{ github.event.pull_request.comments_url }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_PR_COMMIT: ${{ github.event.pull_request.head.sha }}
run: nf-core lint ${GITHUB_WORKSPACE}
run: nf-core -l lint_log.txt lint ${GITHUB_WORKSPACE}

- name: Upload linting log file artifact
if: ${{ always() }}
uses: actions/upload-artifact@v2
with:
name: linting-log-file
path: lint_log.txt
{% endraw %}
@@ -8,32 +8,33 @@ on:
release:
types: [published]

push_dockerhub:
name: Push new Docker image to Docker Hub
runs-on: ubuntu-latest
# Only run for the nf-core repo, for releases and merged PRs
if: {% raw %}${{{% endraw %} github.repository == '{{ cookiecutter.name }}' {% raw %}}}{% endraw %}
env:
DOCKERHUB_USERNAME: {% raw %}${{ secrets.DOCKERHUB_USERNAME }}{% endraw %}
DOCKERHUB_PASS: {% raw %}${{ secrets.DOCKERHUB_PASS }}{% endraw %}
steps:
- name: Check out pipeline code
uses: actions/checkout@v2
jobs:
push_dockerhub:
name: Push new Docker image to Docker Hub
runs-on: ubuntu-latest
# Only run for the nf-core repo, for releases and merged PRs
if: {% raw %}${{{% endraw %} github.repository == '{{ cookiecutter.name }}' {% raw %}}}{% endraw %}
env:
DOCKERHUB_USERNAME: {% raw %}${{ secrets.DOCKERHUB_USERNAME }}{% endraw %}
DOCKERHUB_PASS: {% raw %}${{ secrets.DOCKERHUB_PASS }}{% endraw %}
steps:
- name: Check out pipeline code
uses: actions/checkout@v2

- name: Build new docker image
run: docker build --no-cache . -t {{ cookiecutter.name_docker }}:latest
- name: Build new docker image
run: docker build --no-cache . -t {{ cookiecutter.name_docker }}:latest

- name: Push Docker image to DockerHub (dev)
if: {% raw %}${{ github.event_name == 'push' }}{% endraw %}
run: |
echo "$DOCKERHUB_PASS" | docker login -u "$DOCKERHUB_USERNAME" --password-stdin
docker tag {{ cookiecutter.name_docker }}:latest {{ cookiecutter.name_docker }}:dev
docker push {{ cookiecutter.name_docker }}:dev
- name: Push Docker image to DockerHub (dev)
if: {% raw %}${{ github.event_name == 'push' }}{% endraw %}
run: |
echo "$DOCKERHUB_PASS" | docker login -u "$DOCKERHUB_USERNAME" --password-stdin
docker tag {{ cookiecutter.name_docker }}:latest {{ cookiecutter.name_docker }}:dev
docker push {{ cookiecutter.name_docker }}:dev

- name: Push Docker image to DockerHub (release)
if: {% raw %}${{ github.event_name == 'release' }}{% endraw %}
run: |
echo "$DOCKERHUB_PASS" | docker login -u "$DOCKERHUB_USERNAME" --password-stdin
docker push {{ cookiecutter.name_docker }}:latest
docker tag {{ cookiecutter.name_docker }}:latest {{ cookiecutter.name_docker }}:{% raw %}${{ github.event.release.tag_name }}{% endraw %}
docker push {{ cookiecutter.name_docker }}:{% raw %}${{ github.event.release.tag_name }}{% endraw %}
- name: Push Docker image to DockerHub (release)
if: {% raw %}${{ github.event_name == 'release' }}{% endraw %}
run: |
echo "$DOCKERHUB_PASS" | docker login -u "$DOCKERHUB_USERNAME" --password-stdin
docker push {{ cookiecutter.name_docker }}:latest
docker tag {{ cookiecutter.name_docker }}:latest {{ cookiecutter.name_docker }}:{% raw %}${{ github.event.release.tag_name }}{% endraw %}
docker push {{ cookiecutter.name_docker }}:{% raw %}${{ github.event.release.tag_name }}{% endraw %}
13 changes: 4 additions & 9 deletions nf_core/sync.py
@@ -79,12 +79,11 @@ def sync(self):
""" Find workflow attributes, create a new template pipeline on TEMPLATE
"""

config_log_msg = "Pipeline directory: {}".format(self.pipeline_dir)
log.debug("Pipeline directory: {}".format(self.pipeline_dir))
if self.from_branch:
config_log_msg += "\n Using branch `{}` to fetch workflow variables".format(self.from_branch)
log.debug("Using branch `{}` to fetch workflow variables".format(self.from_branch))
if self.make_pr:
config_log_msg += "\n Will attempt to automatically create a pull request on GitHub.com"
log.info(config_log_msg)
log.debug("Will attempt to automatically create a pull request")

self.inspect_sync_dir()
self.get_wf_config()
@@ -388,11 +387,7 @@ def sync_all_pipelines(gh_username=None, gh_auth_token=None):
log.error("Something went wrong when syncing {}:\n{}".format(wf.full_name, e))
failed_syncs.append(wf.name)
else:
log.info(
"[green]Sync successful for {0}:[/] [blue][link={1}]{1}[/link]".format(
wf.full_name, sync_obj.gh_pr_returned_data.get("html_url")
)
)
log.info("[green]Sync successful for {}".format(wf.full_name))
successful_syncs.append(wf.name)

# Clean up
2 changes: 1 addition & 1 deletion setup.py
@@ -3,7 +3,7 @@
from setuptools import setup, find_packages
import sys

version = "1.10.1"
version = "1.10.2"

with open("README.md") as f:
readme = f.read()
@@ -41,5 +41,4 @@ jobs:
run: |
pip install nf-core
- name: Run nf-core lint
run: |
nf-core lint ${GITHUB_WORKSPACE}
run: nf-core -l lint_log.txt lint ${GITHUB_WORKSPACE}