add nf-test tests and snapshots #323

Closed
wants to merge 14 commits
121 changes: 54 additions & 67 deletions .github/workflows/ci.yml
@@ -1,102 +1,89 @@
-name: nf-core CI
 # This workflow runs the pipeline with the minimal test dataset to check that it completes without any syntax errors
+name: nf-core CI
 on:
   push:
     branches:
-      - dev
+      - "dev"
   pull_request:
     branches:
       - dev
       - master
   release:
-    types: [published]
+    types:
+      - "published"
 
 env:
   NXF_ANSI_LOG: false
+  NFTEST_VER: "0.8.3"
 
 concurrency:
-  group: "${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}"
+  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
   cancel-in-progress: true
 
 jobs:
-  test:
-    name: Run pipeline with test data (AMP and ARG workflows)
-    # Only run on push if this is the nf-core dev branch (merged PRs)
-    if: "${{ github.event_name != 'push' || (github.event_name == 'push' && github.repository == 'nf-core/funcscan') }}"
+  define_nxf_versions:
+    name: Choose nextflow versions to test against depending on target branch
     runs-on: ubuntu-latest
-    strategy:
-      matrix:
-        NXF_VER:
-          - "23.04.0"
-          - "latest-everything"
-        parameters:
-          - "--annotation_tool prodigal"
-          - "--annotation_tool prokka"
-          - "--annotation_tool bakta --annotation_bakta_db_downloadtype light"
+    outputs:
+      matrix: ${{ steps.nxf_versions.outputs.matrix }}
     steps:
-      - name: Check out pipeline code
-        uses: actions/checkout@v3
-
-      - name: Install Nextflow
-        uses: nf-core/setup-nextflow@v1
-        with:
-          version: "${{ matrix.NXF_VER }}"
-
-      - name: Run pipeline with test data (AMP and ARG workflows)
-        run: |
-          nextflow run ${GITHUB_WORKSPACE} -profile test,docker --outdir ./results ${{ matrix.parameters }}
+      - id: nxf_versions
+        run: |
+          if [[ "${{ github.event_name }}" == "pull_request" && "${{ github.base_ref }}" == "dev" && "${{ matrix.NXF_VER }}" != "latest-everything" ]]; then
+            echo matrix='["latest-everything"]' | tee -a $GITHUB_OUTPUT
+          else
+            echo matrix='["latest-everything", "23.04.0"]' | tee -a $GITHUB_OUTPUT
+          fi
 
-  test_bgc:
-    name: Run pipeline with test data (BGC workflow)
-    # Only run on push if this is the nf-core dev branch (merged PRs)
-    if: "${{ github.event_name != 'push' || (github.event_name == 'push' && github.repository == 'nf-core/funcscan') }}"
+  test:
+    name: nf-test
+    needs: define_nxf_versions
     runs-on: ubuntu-latest
     strategy:
+      fail-fast: false
       matrix:
-        NXF_VER:
-          - "23.04.0"
-          - "latest-everything"
-        parameters:
-          - "--annotation_tool prodigal"
-          - "--annotation_tool prokka"
-          - "--annotation_tool bakta --annotation_bakta_db_downloadtype light"
+        NXF_VER: ${{ fromJson(needs.define_nxf_versions.outputs.matrix) }}
+        #components:
+        #  - "tests/pipeline/test.nf.test --profile test"
+        #parameters:
+        #  - "--annotation_tool prodigal"
+        #  - "--annotation_tool prokka"
+        #  - "--annotation_tool bakta --annotation_bakta_db_downloadtype light"
+        tags:
+          - "test"
+          # - "test_bgc"
+          # - "test_deeparg"
+          # - "test_nothing"
+          # - "test_full"
+        profile:
+          - "docker"
 
     steps:
       - name: Check out pipeline code
-        uses: actions/checkout@v2
+        uses: actions/checkout@v3
 
       - name: Install Nextflow
         uses: nf-core/setup-nextflow@v1
         with:
           version: "${{ matrix.NXF_VER }}"
 
-      - name: Run pipeline with test data (BGC workflow)
+      - name: Install nf-test
         run: |
-          nextflow run ${GITHUB_WORKSPACE} -profile test_bgc,docker --outdir ./results ${{ matrix.parameters }} --bgc_skip_deepbgc
+          wget -qO- https://code.askimed.com/install/nf-test | bash -s $NFTEST_VER
+          sudo mv nf-test /usr/local/bin/
 
-  test_deeparg:
-    name: Run pipeline with test data (DeepARG only workflow)
-    # Only run on push if this is the nf-core dev branch (merged PRs)
-    if: "${{ github.event_name != 'push' || (github.event_name == 'push' && github.repository == 'nf-core/funcscan') }}"
-    runs-on: ubuntu-latest
-    strategy:
-      matrix:
-        NXF_VER:
-          - "23.04.0"
-          - "latest-everything"
-        parameters:
-          - "--annotation_tool bakta --annotation_bakta_db_downloadtype light"
-          - "--annotation_tool pyrodigal"
-
-    steps:
-      - name: Check out pipeline code
-        uses: actions/checkout@v2
-
-      - name: Install Nextflow
-        uses: nf-core/setup-nextflow@v1
-        with:
-          version: "${{ matrix.NXF_VER }}"
-
-      - name: Run pipeline with test data (DeepARG workflow)
-        run: |
-          wget https://zenodo.org/record/8280582/files/deeparg.zip ## download from zenodo due to instability of deepARG server
-          unzip deeparg.zip
-          nextflow run ${GITHUB_WORKSPACE} -profile test_deeparg,docker --outdir ./results ${{ matrix.parameters }} --arg_deeparg_data 'deeparg/'
+      - name: Run nf-test
+        run: |
+          nf-test test --tag ${{ matrix.tags }} --profile ${{ matrix.tags }},${{ matrix.profile }} --junitxml=test.xml
+
+      - name: Output log on failure
+        if: failure()
+        run: |
+          sudo apt install bat > /dev/null
+          batcat --decorations=always --color=always ${{ github.workspace }}/.nf-test/tests/*/output/pipeline_info/software_versions.yml
+
+      - name: Publish Test Report
+        uses: mikepenz/action-junit-report@v3
+        if: always() # always run even if the previous step fails
+        with:
+          report_paths: "*.xml"
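The branch logic in `define_nxf_versions` can be exercised locally. A minimal sketch, assuming a POSIX shell; `pick_matrix` and its two arguments are hypothetical stand-ins for the `github.event_name` and `github.base_ref` contexts, not part of the PR:

```shell
# Stand-in for the workflow's version-matrix selection step.
pick_matrix() {
  event="$1"
  base_ref="$2"
  # PRs into dev are tested against the newest Nextflow only;
  # everything else (pushes, PRs into master, releases) gets the full matrix.
  if [ "$event" = "pull_request" ] && [ "$base_ref" = "dev" ]; then
    echo '["latest-everything"]'
  else
    echo '["latest-everything", "23.04.0"]'
  fi
}

pick_matrix pull_request dev     # -> ["latest-everything"]
pick_matrix push dev             # -> ["latest-everything", "23.04.0"]
```

The JSON string emitted to `$GITHUB_OUTPUT` is what `fromJson()` later expands into the `NXF_VER` matrix of the `test` job.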
3 changes: 2 additions & 1 deletion .gitignore
@@ -6,4 +6,5 @@ results/
 testing/
 testing*
 *.pyc
-tests/
+nf-test/
+.nf-test*
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -7,6 +7,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
 ### `Added`
 
+- [#323](https://github.com/nf-core/funcscan/pull/323) Add nf-test files and snapshots (by @louperelo)
 - [#322](https://github.com/nf-core/funcscan/pull/322) Updated all modules: introduce environment.yml files. (by @jasmezz)
 
 ### `Fixed`
1 change: 1 addition & 0 deletions conf/test_deeparg.config
@@ -32,6 +32,7 @@ params {
     arg_skip_amrfinderplus = true
     arg_skip_abricate      = true
     arg_skip_deeparg       = false
+    arg_deeparg_data       = 'https://zenodo.org/records/8280582/files/deeparg.zip?download=1'
 
     run_amp_screening = false
     run_bgc_screening = false
18 changes: 18 additions & 0 deletions nf-test.config
@@ -0,0 +1,18 @@
config {

    // location for all nf-tests
    testsDir "tests"

    // nf-test directory including temporary files for each test
    workDir ".nf-test"

    // location of library folder that is added automatically to the classpath
    libDir "tests/lib/"

    // location of an optional nextflow.config file specific for executing tests
    configFile "nextflow.config"

    // profile to apply to all tests (left empty here so the profile can be set per run)
    profile ""

}
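With this config in the repository root, nf-test discovers tests under `tests/` and loads `UTILS.groovy` from the library folder automatically. A typical local invocation (assuming nf-test is on `PATH`, e.g. via the install step in the CI workflow) might look like:

```
nf-test test tests/pipeline/test.nf.test --profile docker
```

Because `profile ""` bakes in no default, the Nextflow profile is supplied per run on the command line, which is what lets CI combine the test-data profile with `docker`.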
38 changes: 38 additions & 0 deletions tests/lib/UTILS.groovy
@@ -0,0 +1,38 @@
// Helper functions for pipeline tests

class UTILS {

    // Remove the Nextflow version entry from software_versions.yml so
    // snapshots stay stable across Nextflow releases
    public static String removeNextflowVersion(outputDir) {
        def softwareVersions = path("$outputDir/pipeline_info/software_versions.yml").yaml
        if (softwareVersions.containsKey("Workflow")) {
            softwareVersions.Workflow.remove("Nextflow")
        }
        return softwareVersions
    }

    // Filter lines from a file and return a new file: a positive linesToSkip
    // drops that many lines from the top, a negative one from the bottom
    public static File filterLines(String inFilePath, int linesToSkip) {
        if (linesToSkip >= 0) {
            File inputFile = new File(inFilePath)
            File outputFile = new File(inFilePath + ".filtered")
            def lineCount = 0
            inputFile.eachLine { line ->
                lineCount++
                if (lineCount > linesToSkip) {
                    outputFile.append(line + '\n')
                }
            }
            return outputFile
        } else {
            File inputFile = new File(inFilePath)
            File outputFile = new File(inFilePath + ".filtered")
            def lines = inputFile.readLines()
            def totalLines = lines.size()
            lines.take(totalLines + linesToSkip).each { line ->
                outputFile.append(line + '\n')
            }
            return outputFile
        }
    }
}
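For intuition, `filterLines` behaves like the `tail`/`head` idioms below. This is a rough shell analogy for illustration only, not code from the PR; note `head -n -K` requires GNU coreutils:

```shell
# Illustrative coreutils equivalent of UTILS.filterLines on a 4-line file.
printf 'a\nb\nc\nd\n' > sample.txt
tail -n +3 sample.txt   # linesToSkip =  2: drop the first two lines (prints c, d)
head -n -2 sample.txt   # linesToSkip = -2: drop the last two lines (prints a, b)
rm sample.txt
```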
62 changes: 62 additions & 0 deletions tests/pipeline/test.nf.test
@@ -0,0 +1,62 @@
nextflow_pipeline {

    name "Test Workflow main.nf"
    script "main.nf"
    tag "test"
    tag "pipeline"

    test("AMP and ARG workflow") {

        when {
            params {
                outdir = "$outputDir"
            }
        }

        then {
            assertAll(
                { assert workflow.success },
                { assert snapshot(UTILS.removeNextflowVersion("$outputDir")).match("software_versions") },
                { assert snapshot(path("$outputDir/amp/ampir/").list()).match("amp_ampir") },
                { assert snapshot(path("$outputDir/amp/amplify/").list()).match("amp_amplify") },
                { assert new File("$outputDir/amp/hmmer_hmmsearch/sample_1/sample_1_mybacteriocin.txt.gz").exists() },
                { assert new File("$outputDir/amp/hmmer_hmmsearch/sample_2/sample_2_mybacteriocin.txt.gz").exists() },
                { assert snapshot(path("$outputDir/amp/macrel/sample_1/sample_1_log.txt"),
                                  path("$outputDir/amp/macrel/sample_1/README.md")).match("amp_macrel_sample_1") },
                { assert new File("$outputDir/amp/macrel/sample_1/sample_1.all_orfs.faa.gz").exists() },
                { assert new File("$outputDir/amp/macrel/sample_1/sample_1.prediction.gz").exists() },
                { assert snapshot(path("$outputDir/amp/macrel/sample_2/sample_2_log.txt"),
                                  path("$outputDir/amp/macrel/sample_2/README.md")).match("amp_macrel_sample_2") },
                { assert new File("$outputDir/amp/macrel/sample_2/sample_2.all_orfs.faa.gz").exists() },
                { assert new File("$outputDir/amp/macrel/sample_2/sample_2.prediction.gz").exists() },
                { assert snapshot(path("$outputDir/arg/abricate/").list()).match("arg_abricate") },
                { assert snapshot(path("$outputDir/arg/amrfinderplus/").list()).match("arg_amrfinderplus") },
                { assert snapshot(path("$outputDir/arg/fargene/sample_1/class_a/predictedGenes/wastewater_metagenome_contigs_1-class_A-filtered-peptides.fasta"),
                                  path("$outputDir/arg/fargene/sample_1/class_a/predictedGenes/wastewater_metagenome_contigs_1-class_A-filtered.fasta"),
                                  path("$outputDir/arg/fargene/sample_1/class_a/results_summary.txt"),
                                  path("$outputDir/arg/fargene/sample_1/class_b_1_2/predictedGenes/wastewater_metagenome_contigs_1-class_B_1_2-filtered-peptides.fasta"),
                                  path("$outputDir/arg/fargene/sample_1/class_b_1_2/predictedGenes/wastewater_metagenome_contigs_1-class_B_1_2-filtered.fasta"),
                                  path("$outputDir/arg/fargene/sample_1/class_b_1_2/results_summary.txt"),
                                  path("$outputDir/arg/fargene/sample_2/class_a/predictedGenes/wastewater_metagenome_contigs_2-class_A-filtered-peptides.fasta"),
                                  path("$outputDir/arg/fargene/sample_2/class_a/predictedGenes/wastewater_metagenome_contigs_2-class_A-filtered.fasta"),
                                  path("$outputDir/arg/fargene/sample_2/class_a/results_summary.txt"),
                                  path("$outputDir/arg/fargene/sample_2/class_b_1_2/predictedGenes/wastewater_metagenome_contigs_2-class_B_1_2-filtered-peptides.fasta"),
                                  path("$outputDir/arg/fargene/sample_2/class_b_1_2/predictedGenes/wastewater_metagenome_contigs_2-class_B_1_2-filtered.fasta"),
                                  path("$outputDir/arg/fargene/sample_2/class_b_1_2/results_summary.txt")).match("arg_fargene") },
                { assert snapshot(path("$outputDir/arg/hamronization/").list()).match("arg_hamronization") },
                { assert snapshot(path("$outputDir/multiqc/multiqc_data/multiqc_citations.txt"),
                                  path("$outputDir/multiqc/multiqc_data/multiqc_sources.txt")).match("multiqc") },
                { assert new File("$outputDir/reports/ampcombi/ampcombi.log").exists() },
                { assert new File("$outputDir/reports/ampcombi/sample_1/sample_1_amp.faa").exists() },
                { assert new File("$outputDir/reports/ampcombi/sample_1/sample_1_ampcombi.csv").exists() },
                { assert new File("$outputDir/reports/ampcombi/sample_1/sample_1_diamond_matches.txt").exists() },
                { assert new File("$outputDir/reports/ampcombi/sample_2/sample_2_amp.faa").exists() },
                { assert new File("$outputDir/reports/ampcombi/sample_2/sample_2_ampcombi.csv").exists() },
                { assert new File("$outputDir/reports/ampcombi/sample_2/sample_2_diamond_matches.txt").exists() },
                { assert snapshot(path("$outputDir/reports/hamronization_summarize/hamronization_combined_report.tsv")).match("summary_hamronization") }
            )
        }

    }

}