Best practices for workflows in GitHub repositories

Author(s): Simone Leo, Eli Chadwick
Reviewers: Helena Rasche, Eli Chadwick, Saskia Hiltemann, Yvan Le Bras
Overview
Questions:
  • What are workflow best practices?

  • How does RO-Crate help?

Objectives:
  • Generate a workflow test using Planemo

  • Understand how testing can be automated with GitHub Actions

Time estimation: 30 minutes
Published: May 11, 2023
Last modification: Sep 30, 2024
License: Tutorial Content is licensed under Apache-2.0. The GTN Framework is licensed under MIT
PURL: https://gxy.io/GTN:T00339
Revision: 5
Best viewed in a Jupyter Notebook

This tutorial is best viewed in a Jupyter notebook! You can load this notebook in one of the following ways.

Launching the notebook in Jupyter in Galaxy

  1. Instructions to Launch JupyterLab
  2. Open a Terminal in JupyterLab with File -> New -> Terminal
  3. Run wget https://training.galaxyproject.org/training-material/topics/fair/tutorials/ro-crate-galaxy-best-practices/fair-ro-crate-galaxy-best-practices.ipynb
  4. Select the notebook that appears in the list of files on the left.

Downloading the notebook

  1. Right click one of these links: Jupyter Notebook (With Solutions), Jupyter Notebook (Without Solutions)
  2. Select “Save Link As…”

A workflow, just like any other piece of software, can be formally correct and runnable but still lack a number of additional features that might help its reusability, interoperability, understandability, etc.

One of the most useful additions to a workflow is a suite of tests, which help check that the workflow is operating as intended. A test case consists of a set of inputs and corresponding expected outputs, together with a procedure for comparing the workflow’s actual outputs with the expected ones. A test may, in fact, be considered successful even if the actual outputs do not match the expected ones exactly, for instance because the computation involves a certain degree of randomness, or because the output includes timestamps or randomly generated identifiers.
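
As a sketch of what such a comparison procedure can look like, a Planemo-style workflow test (a format covered later in this tutorial) can assert properties of an output instead of requiring an exact match. The input label, output name and expected text below are placeholders, not taken from a real workflow:

- doc: Test that tolerates non-deterministic parts of an output
  job:
    'Input dataset':
      class: File
      path: test-data/input.bed
  outputs:
    report:
      asserts:
        has_text:
          text: "Analysis complete"
        has_n_lines:
          min: 10

Here the test passes as long as the report contains the given text and has at least 10 lines, regardless of timestamps or other run-specific content.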

Providing documentation is also important to help understand the workflow’s purpose and mode of operation, its requirements, the effect of its parameters, etc. Even a single, well structured README file can go a long way towards getting users started with your workflow, especially if complemented by examples that include sample inputs and running instructions.
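
As an illustration, even a short README along the following lines covers most of these points (the section names are just a suggestion, not a required layout):

    # My Workflow
    Short description of what the workflow does and what inputs it expects.

    ## Requirements
    Galaxy version, required tools and any reference data.

    ## Usage
    How to import and run the workflow, with a small sample input and a
    description of the expected outputs.

    ## License and citation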

Agenda

In this tutorial, you will learn about the best practices that the Galaxy community has created for workflows.

  1. Community best practices
  2. Best practice repositories and RO-Crate
  3. Generating tests for your workflow
  4. Adding a GitHub workflow for running tests automatically

This tutorial assumes that you already have a Galaxy workflow that you want to apply best practices to. You can follow along using any workflow you have created or imported during a previous tutorial (such as A short introduction to Galaxy).

Community best practices

Though the practices listed in the introduction can be considered general enough to be applicable to any kind of software, individual communities usually add their own specific sets of rules and conventions that help users quickly find their way around software projects, understand them more easily and reuse them more effectively. The Galaxy community, for instance, has a guide on best practices for maintaining workflows and a built-in Best Practices panel in the workflow editor (see the tip below).

When you are editing a workflow, there are a number of additional steps you can take to ensure that it is a Best Practice workflow and will be more reusable.

  1. Open a workflow for editing
  2. In the workflow menu bar, you’ll find the Workflow Options dropdown menu.
  3. Click on it and select Best Practices from the dropdown menu.

    screenshot showing the best practices menu item in the gear dropdown.

  4. This will take you to a new side panel, which allows you to investigate and correct any issues with your workflow.

    screenshot showing the best practices side panel. several issues are raised like a missing annotation with a link to add that, and non-optional inputs that are unconnected. Additionally several items already have green checks like the workflow defining creator information and a license.

The Galaxy community also has a guide on best practices for maintaining workflows. This guide includes the best practices from the Galaxy workflow panel, plus:

  • adding tests to the workflow
  • publishing the workflow on GitHub, a public GitLab server, or another public version-controlled repository
  • registering the workflow with a workflow registry such as WorkflowHub or Dockstore

Hands-on: Apply best practices for workflow structure
  1. Open your workflow for editing and find the Best Practices panel (see the tip above).
  2. Resolve the warnings that appear until every item has a green tick.

The Intergalactic Workflow Commission (IWC) is a collection of highly curated Galaxy workflows that follow best practices and conform to a specific GitHub directory layout, as specified in the guide on adding workflows. In particular, the workflow file must be accompanied by a Planemo test file with the same name but a -test.yml extension, and a test-data directory that contains the datasets used by the tests described in the test file. The guide also specifies how to fulfill other requirements such as setting a license, a creator and a version tag. A new workflow can be proposed for inclusion in the collection by opening a pull request to the IWC repository: if it passes the review and is merged, it will be published to iwc-workflows. The publication process also generates a metadata file that turns the repository into a Workflow Testing RO-Crate, which can be registered with WorkflowHub and LifeMonitor.
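
To make the expected layout concrete, a repository following these guidelines looks roughly like the sketch below. The file and directory names are illustrative; the fixed conventions are the -test.yml suffix and the test-data directory:

my-workflow/
├── my-workflow.ga
├── my-workflow-test.yml
├── test-data/
│   ├── input_1.fastq.gz
│   └── expected_output.tabular
├── README.md
└── CHANGELOG.md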

Best practice repositories and RO-Crate

The repo2rocrate software package generates a Workflow Testing RO-Crate for a workflow repository that follows community best practices. It currently supports Galaxy (based on the IWC guidelines), Nextflow and Snakemake. The tool assumes that the workflow repository is structured according to the community guidelines and generates the appropriate RO-Crate metadata for the various entities. Several command line options allow you to specify additional information that cannot be automatically detected or needs to be overridden.

To try the software, we’ll clone one of the iwc-workflows repositories, whose layout is known to respect the IWC guidelines. Since it already contains an RO-Crate metadata file, we’ll delete it before running the tool.

pip install repo2rocrate
git clone https://github.com/iwc-workflows/parallel-accession-download
cd parallel-accession-download/
rm -fv ro-crate-metadata.json
repo2rocrate --repo-url https://github.com/iwc-workflows/parallel-accession-download

This adds an ro-crate-metadata.json file at the top level with metadata generated based on the tool’s knowledge of the expected repository layout. By specifying a zip file as an output with the -o option, we can directly generate an RO-Crate in the format accepted by WorkflowHub and LifeMonitor:

repo2rocrate --repo-url https://github.com/iwc-workflows/parallel-accession-download -o ../parallel-accession-download.crate.zip
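
To check what ended up in the crate, you can list the archive’s contents with any standard zip tool, for example:

unzip -l ../parallel-accession-download.crate.zip

The listing should include ro-crate-metadata.json alongside the workflow, test and documentation files from the repository.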

Generating tests for your workflow

What if you only have a workflow, but you don’t have the test layout yet? You can use Planemo to generate it.

pip install planemo

As an example we will use this simple workflow, which has only two steps: it sorts the input lines and changes them to upper case. Follow these steps to generate a test layout for it:

Hands-on: Generate Workflow Tests With Planemo
  1. Download the workflow to a sort-and-change-case.ga file.
  2. Download this input dataset to an input.bed file.
  3. Upload the workflow to Galaxy (e.g., Galaxy Europe): from the upper menu, click on “Workflow” > “Import” > “Browse”, choose sort-and-change-case.ga and then click “Import workflow”.
  4. Rename the uploaded workflow from sort-and-change-case (imported from uploaded file) to sort-and-change-case by clicking the pencil icon next to the workflow name.
  5. Start a new history: click on the “+” button on the History panel to the right.
  6. Upload the input dataset to the new history: on the left panel, go to “Upload Data” > “Choose local files” and select input.bed, then click “Start” > “Close”.
  7. Wait for the file to finish uploading (i.e., for the loading circle on the dataset’s line in the history to disappear).
  8. Run the workflow on the input dataset: click on “Workflow” in the upper menu, locate sort-and-change-case, and click on the play button to the right.

Workflow Entry.

  9. This should take you to the workflow running page. The input slot should already be filled with input.bed since there is nothing else in the history. Click on “Run Workflow” on the upper right of the center panel.

    Workflow Run Page.

  10. Wait for the workflow execution to finish.
  11. On the upper menu, go to “Data” > “Workflow Invocations”, expand the invocation corresponding to the workflow just run and copy the invocation’s ID. In my case it says “Invocation ID: 86ecc02a9dd77649” on the right, where 86ecc02a9dd77649 is the ID.

    Workflow Invocation.

  12. On the upper menu, go to “User” > “Preferences” > “Manage API Key”. If you don’t have an API key yet, click the button to create a new one. Under “Current API key”, click the button on the right to copy the API key.

    API key.

  13. Run planemo workflow_test_init --galaxy_url https://usegalaxy.eu --from_invocation INVOCATION_ID --galaxy_user_key API_KEY, replacing INVOCATION_ID with the actual invocation ID and API_KEY with the actual API key. If you’re not using the Galaxy Europe instance, also replace https://usegalaxy.eu with the URL of the instance you’re using.
  14. Browse the files that have been created: sort-and-change-case-tests.yml and test-data/

Optionally see this tip for more details:

Ensuring a Tutorial has a Workflow

  1. Find a tutorial that you’re interested in, that doesn’t currently have tests.

    This tutorial has a workflow (.ga) and a test; notice the -test.yml file that has the same name as the workflow .ga file:

    machinelearning/workflows/machine_learning.ga
    machinelearning/workflows/machine_learning-test.yml

    You want to find tutorials without the -test.yml file. The workflow file might also be missing.

  2. Check if it has a workflow (if it does, skip to step 5).
  3. Follow the tutorial
  4. Extract a workflow from the history
  5. Run that workflow in a new history to test

Extract Tests (Online Version)

If you are on UseGalaxy.org or another server running Galaxy 24.2 or later, you can use PWDK, a version of Planemo running online, to generate the workflow tests.

However if you are on an older version of Galaxy, or a private Galaxy server, then you’ll need to do the following:

Extract Tests (Manual Version)

  1. Obtain the workflow invocation ID, and your API key (User → Preferences → Manage API Key)

    screenshot of the workflow invocation page. The user drop down shows where to find this page, and a red box circles a field named "Invocation ID"

  2. Install the latest version of planemo

    # In a virtualenv
    pip install planemo
  3. Run the command to initialise a workflow test from the workflows/ subdirectory - if it doesn’t exist, you might need to create it first.

    planemo workflow_test_init --from_invocation <INVOCATION ID> --galaxy_url <GALAXY SERVER URL> --galaxy_user_key <GALAXY API KEY>

    This will produce a folder of files, for example from a testing workflow:

    $ tree
    .
    ├── test-data
    │   ├── input dataset(s).shapefile.shp
    │   └── shapefile.shp
    ├── testing-openlayer.ga
    └── testing-openlayer-tests.yml

Adding Your Tests to the GTN

  1. You will need to check the -tests.yml file: it contains some automatically generated comparisons. These test that the output data matches the files in test-data exactly; however, you might want to replace them with assertions that check for, e.g., a correct file size or specific text content you expect to see.

  2. If the files in test-data are already uploaded to Zenodo, you should delete them from the test-data directory and point to their URLs in the -tests.yml file instead, to save disk space, as in this example:

    - doc: Test the M. Tuberculosis Variant Analysis workflow
      job:
        'Read 1':
          location: https://zenodo.org/record/3960260/files/004-2_1.fastq.gz
          class: File
          filetype: fastqsanger.gz
  3. Add tests on the outputs! Check the planemo reference if you need more detail.

    - doc: Test the M. Tuberculosis Variant Analysis workflow
      job:
        # Simple explicit Inputs
        'Read 1':
          location: https://zenodo.org/record/3960260/files/004-2_1.fastq.gz
          class: File
          filetype: fastqsanger.gz
      outputs:
        jbrowse_html:
          asserts:
            has_text:
              text: "JBrowseDefaultMainPage"
        snippy_fasta:
          asserts:
            has_line:
              line: '>Wildtype Staphylococcus aureus strain WT.'
        snippy_tabular:
          asserts:
            has_n_columns:
              n: 2
  4. Contribute all of those files to the GTN in a PR, adding them to the workflows/ folder of your tutorial.

Question
  1. How do the files in test-data/ relate to your Galaxy history?
  2. Look at the contents of sort-and-change-case-tests.yml. What are the expected outputs of the test?
Solution
  1. The files in test-data/ correspond to the datasets in the history, though some of the names are different:
     • bed_input.bed has the same name in the history - this is the input file we uploaded
     • sorted_bed.bed corresponds to the output of the Sort on data 1 step (you can confirm this by viewing the file contents)
     • uppercase_bed.tabular corresponds to the output of the Change case on data 2 step (you can confirm this by viewing the file contents)
  2. The expected outputs are test-data/sorted_bed.bed and test-data/uppercase_bed.tabular. This means that when the workflow is run on the input (test-data/bed_input.bed), it is expected to produce two files that look exactly like those outputs.

To build up the test suite further, you can invoke the workflow multiple times with different inputs, and use each invocation to generate a test, using the same command as before:

planemo workflow_test_init --galaxy_url https://usegalaxy.eu --from_invocation INVOCATION_ID --galaxy_user_key API_KEY

Each invocation should test a different behavior of the workflow. This could mean using different datatypes for inputs, or changing the workflow settings to produce different results.
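
Each generated test is another entry in the -tests.yml list, so a test file covering two behaviors could look roughly like the sketch below. The input label, output names and the assertion are placeholders for whatever your own workflow and chosen variation produce:

- doc: Sort and change case of a BED file
  job:
    'Input dataset':
      class: File
      path: test-data/bed_input.bed
  outputs:
    uppercase_output:
      file: test-data/uppercase_bed.tabular
- doc: Sort and change case of a different input
  job:
    'Input dataset':
      class: File
      path: test-data/other_input.bed
  outputs:
    uppercase_output:
      asserts:
        has_text:
          text: "CHR1"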

Hands-on: Generate tests for your own workflow
  1. Create a new folder on your computer to store the workflow.
  2. Download the Galaxy workflow you updated to follow best practices earlier in this tutorial. You can do this by going to the Workflow page and clicking Download workflow in .ga format.
  3. Create a new Galaxy history, and run the workflow on some appropriate input data.
  4. Use planemo to turn that workflow invocation into a test case.

Adding a GitHub workflow for running tests automatically

In the previous section, you learned how to generate a test layout for an example Galaxy workflow. This procedure also gives you the file structure you need to populate the GitHub repository in line with community best practices. One thing is still missing though: a GitHub workflow to test the Galaxy workflow automatically. Let’s create this now.

At the top level of the repository, create a .github/workflows directory and place a wftest.yml file inside it with the following content:

name: Periodic workflow test
on:
  schedule:
    - cron: '0 3 * * *'
  workflow_dispatch:
jobs:
  test:
    name: Test workflow
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v4
      with:
        fetch-depth: 1
    - uses: actions/setup-python@v5
      with:
        python-version: '3.11'
    - name: install Planemo
      run: |
        pip install --upgrade pip
        pip install planemo
    - name: run planemo test
      run: |
        planemo test --biocontainers sort-and-change-case.ga

Replace sort-and-change-case.ga with the name of your actual Galaxy workflow file. You can find extensive documentation on GitHub workflows on the GitHub website. Here we’ll give some highlights:

  • the on field sets the GitHub workflow to run:
    • automatically every day at 3 AM
    • when manually dispatched (see the example after this list)
  • the steps do the following:
    • check out the GitHub repository
    • set up a Python environment
    • install Planemo
    • run planemo test on the Galaxy workflow
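
For example, with the GitHub CLI (gh) installed and authenticated, you can trigger a manual run of the workflow above from the command line once the repository is on GitHub:

gh workflow run wftest.yml

You can also start it from the Actions tab of the repository on the GitHub website.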

An example of a repository built according to the guidelines given here is simleo/ccs-bam-to-fastq-qc-crate, which realizes the Workflow Testing RO-Crate setup for BAM-to-FASTQ-QC.

Your workflow is now ready to add to GitHub! If you’re not familiar with GitHub, follow these instructions to create a repository and upload your workflow files: Uploading a project to GitHub.
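
If you prefer the command line over the web interface, a minimal sequence for publishing the files looks like this. It assumes you have already created an empty repository on GitHub; replace YOUR_USERNAME and YOUR_REPOSITORY with your own values:

git init
git add sort-and-change-case.ga sort-and-change-case-tests.yml test-data/ .github/
git commit -m "Add workflow, tests and CI configuration"
git branch -M main
git remote add origin https://github.com/YOUR_USERNAME/YOUR_REPOSITORY.git
git push -u origin main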