From 98ec55e2a0e47c3c88fb2dbaea5769aa14345bdf Mon Sep 17 00:00:00 2001 From: KasukabeDefenceForce Date: Tue, 8 Oct 2024 13:00:28 +0530 Subject: [PATCH 01/20] Update comment with git regression data --- docs/multiindex_isotope_decay_data.ipynb | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/multiindex_isotope_decay_data.ipynb b/docs/multiindex_isotope_decay_data.ipynb index 9cf75e41ff0..f633bf4f320 100644 --- a/docs/multiindex_isotope_decay_data.ipynb +++ b/docs/multiindex_isotope_decay_data.ipynb @@ -67,7 +67,7 @@ } ], "source": [ - "# Download the atom data file from tardis-refdata repo to run this cell.\n", + "# Download the atom data file from tardis-regression-data repo to run this cell.\n", "download_atom_data('kurucz_cd23_chianti_H_He')\n", "atom_data_file = 'kurucz_cd23_chianti_H_He.h5'\n", "atom_data = AtomData.from_hdf(atom_data_file)" From ad4e4627c0e795e312198ba29cbc80040e723980 Mon Sep 17 00:00:00 2001 From: KasukabeDefenceForce Date: Tue, 8 Oct 2024 14:07:02 +0530 Subject: [PATCH 02/20] update hyperlink --- docs/quickstart.ipynb | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/quickstart.ipynb b/docs/quickstart.ipynb index 87c9ad0d23f..7a9625d3a23 100644 --- a/docs/quickstart.ipynb +++ b/docs/quickstart.ipynb @@ -43,7 +43,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "You can also obtain a copy of the atomic data from the [tardis-refdata](https://github.com/tardis-sn/tardis-refdata/tree/master/atom_data) repository." + "You can also obtain a copy of the atomic data from the [tardis-regression-data](https://github.com/tardis-sn/tardis-regression-data/tree/main/atom_data) repository." ] }, { From d45290a525d4e3b5ea3aaf7b65738d67e0c6615a Mon Sep 17 00:00:00 2001 From: KasukabeDefenceForce Date: Tue, 8 Oct 2024 14:15:32 +0530 Subject: [PATCH 03/20] Update file name --- docs/contributing/development/index.rst | 2 +- .../development/update_refdata.rst | 56 ------------------- .../development/update_regression_data.rst | 48 ++++++++++++++++ 3 files changed, 49 insertions(+), 57 deletions(-) delete mode 100644 docs/contributing/development/update_refdata.rst create mode 100644 docs/contributing/development/update_regression_data.rst diff --git a/docs/contributing/development/index.rst b/docs/contributing/development/index.rst index 53fde2cc824..4b72e603677 100644 --- a/docs/contributing/development/index.rst +++ b/docs/contributing/development/index.rst @@ -30,6 +30,6 @@ the core team (active maintainers) of TARDIS. :maxdepth: 2 continuous_integration - update_refdata + update_regression_data matterbridge debug_numba diff --git a/docs/contributing/development/update_refdata.rst b/docs/contributing/development/update_refdata.rst deleted file mode 100644 index a3d53c3e0fe..00000000000 --- a/docs/contributing/development/update_refdata.rst +++ /dev/null @@ -1,56 +0,0 @@ -.. _update refdata: - -************************* -Update the Reference Data -************************* - -A special kind of tests are executed only when ``pytest`` is called alongside the ``--refdata`` flag. -These tests compares the output of the TARDIS code (mostly arrays) against the information stored -in the reference data files. - -TARDIS stores reference data in the `tardis-refdata `_ -repository. This repository also has a mirror hosted in Azure Pipelines (synchronized automatically by a -GitHub workflow) since this Microsoft service does not have limitations in bandwidth nor storage. - -Sometimes, this data needs to be updated. 
The procedure to update these files manually is not trivial
-and has been automated recently thanks to the `NumFOCUS `_ support.
-
-
-=================
-Default Procedure
-=================
-
-Imagine you are working on a new feature (or fix) for TARDIS, you have opened a pull request and the
-reference data tests are failing in the testing pipeline. This could happen for many reasons:
-
-A. There's a problem in your code.
-B. Your code is OK, but the reference data is outdated.
-C. The pipeline is broken.
-
-If you think your could be dealing with scenario B, then:
-
-#. Write ``/azp run compare-refdata`` in a comment on your PR.
-#. Analyze the results and discuss if the reference data effectively requires an update.
-#. Update the reference data by writing ``/azp run update-refdata`` on a new comment.
-
-.. note::
-
-    - If you don't have enough privileges to run the pipelines, tag a TARDIS developer capable of doing so.
-    - If any of these two pipelines fail, please tag a `TARDIS team member `_ responsible for CI/CD.
-
-If everything went well, the reference data will have been updated by the TARDIS bot and the commit
-message should include the pull request number that triggered the update.
-
-================
-Manual Procedure
-================
-
-The manual procedure is documented for debugging purposes and should not be used in general.
-
-#. Activate the ``tardis`` environment.
-#. Fork and clone the ``tardis-refdata`` repository.
-#. Follow the instructions at the top of the notebook ``tardis-refdata/notebooks/ref_data_compare.ipynb``.
-#. Go to your local ``tardis`` repository and make sure you are working on the branch you want to generate new reference data from.
-#. Generate new reference data with ``pytest tardis --refdata=/path/to/tardis-refdata --generate-reference``.
-#. Run the ``ref_data_compare.ipynb`` notebook and check the results.
-#. Make a new branch in ``tardis-refdata``, push your new reference data and open a pull request.
diff --git a/docs/contributing/development/update_regression_data.rst b/docs/contributing/development/update_regression_data.rst
new file mode 100644
index 00000000000..b34cef0ca30
--- /dev/null
+++ b/docs/contributing/development/update_regression_data.rst
@@ -0,0 +1,48 @@
+.. _update regression-data:
+
+**************************
+Update the Regression Data
+**************************
+
+A special kind of test is executed only when ``pytest`` is called alongside the ``--regression-data`` flag. These tests compare the output of the TARDIS code (mostly arrays) against the information stored in the regression data files.
+
+TARDIS stores regression data in the `tardis-regression-data `_ repository. Sometimes, this data needs to be updated. The procedure to update these files has been simplified, allowing for a more straightforward process.
+
+=================
+Default Procedure
+=================
+
+Imagine you are working on a new feature (or fix) for TARDIS, and you have opened a pull request. If the regression data tests are failing in the testing pipeline, this could happen for various reasons:
+
+A. There's a problem in your code.
+B. Your code is OK, but the regression data is outdated.
+C. The pipeline is broken.
+
+If you suspect scenario B, then:
+
+#. Analyze the results to determine if the regression data requires an update.
+#. Update your fork of the ``tardis-regression-data`` repository by pulling the latest changes and merging them into your local branch.
+#. Push your updated fork to GitHub.
+
+.. 
note:: + + - If you do not have enough privileges to update the repository, tag a TARDIS developer capable of doing so. + - If any issues arise during this process, please tag a `TARDIS team member `_ responsible for CI/CD. + +If everything goes smoothly, your regression data will be updated in your fork, and you can proceed with your development. + +================ +Manual Procedure +================ + +The manual procedure is documented for debugging purposes and should not be used generally. + +#. Activate the ``tardis`` environment. +#. Fork and clone the ``tardis-regression-data`` repository. +#. Follow any necessary instructions within your local copy. +#. Go to your local ``tardis`` repository and ensure you are working on the branch from which you want to generate new regression data. +#. Generate new regression data with ``pytest tardis --regression-data=/path/to/tardis-regression-data --generate-reference``. +#. Check your results and ensure everything is correct. +#. Make a new branch in ``tardis-regression-data``, push your new regression data, and open a pull request. + +By following these updated procedures, you can efficiently manage and update regression data within your TARDIS project setup. From 482a29d1aa7661dbd43926583cc620dcfbcb4fd4 Mon Sep 17 00:00:00 2001 From: KasukabeDefenceForce Date: Tue, 8 Oct 2024 16:35:12 +0530 Subject: [PATCH 04/20] Update running tests page --- .../development/running_tests.rst | 34 +++++++------------ 1 file changed, 12 insertions(+), 22 deletions(-) diff --git a/docs/contributing/development/running_tests.rst b/docs/contributing/development/running_tests.rst index 8143ac6c36f..73d97b3faab 100644 --- a/docs/contributing/development/running_tests.rst +++ b/docs/contributing/development/running_tests.rst @@ -26,16 +26,16 @@ tests, you can run this with: > pytest tardis -Running the more advanced unit tests requires TARDIS Reference data that can be +Running the more advanced unit tests requires TARDIS Regression data that can be downloaded -(`tardis-refdata `_). +(`tardis-regression-data `_). `Git LFS `_ is used -to download the large refdata files in the tardis-refdata repository. +to download the large files in the tardis-regression-data repository. However, it is not required to download the entire repository. Firstly it is -important to identify the refdata files that are needed. Sometimes, it is possible +important to identify the regression-data files that are needed. Sometimes, it is possible that a preused fixture that is also being used in the current tests is using some -refdata. So, it is advised to check for such cases beforehand. +regression-data. So, it is advised to check for such cases beforehand. After identifying the refdata files to be used in the unit tests, those particular files can be downloaded using ``git lfs`` @@ -44,21 +44,21 @@ files can be downloaded using ``git lfs`` > git lfs pull --include=filename -It is important to maintain the same directory structure as the tardis-refdata repo -i.e. the lfs files should be in the same directory tree exactly as in tardis-refdata +It is important to maintain the same directory structure as the tardis-regression-data repo +i.e. the lfs files should be in the same directory tree exactly as in tardis-regression-data repository. Finally, the tests can be run using the following command .. 
code-block:: shell - > pytest tardis --tardis-refdata=/path/to/tardis-refdata/ + > pytest tardis --tardis-regression-data=/path/to/tardis-regression-data/ Or, to run tests for a particular file or directory .. code-block:: shell - > pytest tardis/path/to/test_file_or_directory --tardis-refdata=/path/to/tardis-refdata/ + > pytest tardis/path/to/test_file_or_directory --tardis-regression-data=/path/to/tardis-regression-data/ .. warning:: The `tests workflow `_ runs on @@ -70,22 +70,12 @@ Or, to run tests for a particular file or directory You can check if cache was generated by looking in the ``Restore LFS Cache`` step of the workflow run. Cache can also be found under the "Management" Section under "Actions" tab. -Generating Plasma Reference -=========================== - -You can generate Plasma Reference by the following command: - -.. code-block:: shell - - > pytest -rs tardis/plasma/tests/test_complete_plasmas.py - --tardis-refdata="/path/to/tardis-refdata/" --generate-reference - Running the Integration Tests ============================= -These tests require reference files against which the results of the various +These tests require regression data files against which the results of the various tardis runs are tested. So you first need to either download the current -reference files (`here `_) +regression data files (`here `_) or generate new ones. Both of these require a configuration file for the integration tests: @@ -97,7 +87,7 @@ Inside the atomic data directory there needs to be atomic data for each of the setups that are provided in the ``test_integration`` folder. If no references are given, the first step is to generate them. The ``--less-packets`` option is useful for debugging purposes and will just -use very few packets to generate the references and thus make the process much +use very few packets to generate the regression data and thus make the process much faster --- THIS IS ONLY FOR DEBUGGING PURPOSES. The ``-s`` option ensures that TARDIS prints out the progress: From f46a5a32edc668f538ae0773295affcdcf278ebd Mon Sep 17 00:00:00 2001 From: KasukabeDefenceForce Date: Tue, 8 Oct 2024 16:46:16 +0530 Subject: [PATCH 05/20] Removing outdated updating test data procedures --- .../development/update_regression_data.rst | 22 ++----------------- 1 file changed, 2 insertions(+), 20 deletions(-) diff --git a/docs/contributing/development/update_regression_data.rst b/docs/contributing/development/update_regression_data.rst index b34cef0ca30..352ba9e8fe0 100644 --- a/docs/contributing/development/update_regression_data.rst +++ b/docs/contributing/development/update_regression_data.rst @@ -8,35 +8,17 @@ A special kind of tests are executed only when ``pytest`` is called alongside th TARDIS stores regression data in the `tardis-regression-data `_ repository. Sometimes, this data needs to be updated. The procedure to update these files has been simplified, allowing for a more straightforward process. -================= -Default Procedure -================= - -Imagine you are working on a new feature (or fix) for TARDIS, and you have opened a pull request. If the regression data tests are failing in the testing pipeline, this could happen for various reasons: +Imagine you are working on a new feature (or fix) for TARDIS, and you have opened a pull request. If the regression data tests are failing, this could happen for various reasons: A. There's a problem in your code. B. Your code is OK, but the regression data is outdated. -C. The pipeline is broken. 
- -If you suspect scenario B, then: - -#. Analyze the results to determine if the regression data requires an update. -#. Update your fork of the ``tardis-regression-data`` repository by pulling the latest changes and merging them into your local branch. -#. Push your updated fork to GitHub. -.. note:: - - - If you do not have enough privileges to update the repository, tag a TARDIS developer capable of doing so. - - If any issues arise during this process, please tag a `TARDIS team member `_ responsible for CI/CD. - -If everything goes smoothly, your regression data will be updated in your fork, and you can proceed with your development. +If you suspect scenario B, please follow the `Manual Procedure <#manual-procedure>`_ to make a pull request to update the regression data. If any issues arise during this process, please tag a `TARDIS team member `_ responsible for CI/CD. ================ Manual Procedure ================ -The manual procedure is documented for debugging purposes and should not be used generally. - #. Activate the ``tardis`` environment. #. Fork and clone the ``tardis-regression-data`` repository. #. Follow any necessary instructions within your local copy. From 9ab6890d66233c8ece06e28a3261b605a1f3aef9 Mon Sep 17 00:00:00 2001 From: KasukabeDefenceForce Date: Tue, 8 Oct 2024 18:55:11 +0530 Subject: [PATCH 06/20] Remove azure pipelines from tardis docs --- docs/contributing/development/azure_links.inc | 15 - .../development/continuous_integration.rst | 321 +----------------- docs/contributing/development/links.inc | 1 - 3 files changed, 8 insertions(+), 329 deletions(-) delete mode 100644 docs/contributing/development/azure_links.inc diff --git a/docs/contributing/development/azure_links.inc b/docs/contributing/development/azure_links.inc deleted file mode 100644 index af352a44a42..00000000000 --- a/docs/contributing/development/azure_links.inc +++ /dev/null @@ -1,15 +0,0 @@ - -.. azure stuff -.. _azure DevOps: http://azure.microsoft.com/en-us/services/devops/?nav=mi -.. _azure documentation section on triggers: https://docs.microsoft.com/en-us/azure/devops/pipelines/repos/github?view=azure-devops&tabs=yaml#ci-triggers -.. _azure documentation section on variables: https://docs.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch -.. _azure documentation section on jobs: https://docs.microsoft.com/en-us/azure/devops/pipelines/process/phases?view=azure-devops&tabs=yaml -.. _azure documentation section on templates: https://docs.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops -.. _checking out multiple repositories: https://github.com/microsoft/azure-pipelines-yaml/blob/master/design/multi-checkout.md#behavior-changes-in-multi-checkout-mode -.. general stuff -.. _continuous integration: https://en.wikipedia.org/wiki/Continuous_integration -.. _update it manually: https://docs.microsoft.com/en-us/azure/devops/repos/git/import-git-repository?view=azure-devops#what-if-my-source-repository-contains-git-lfs-objects -.. _error 413: https://developercommunity.visualstudio.com/content/problem/867488/git-lfs-push-got-413-error.html - -.. 
vim: ft=rstS - diff --git a/docs/contributing/development/continuous_integration.rst b/docs/contributing/development/continuous_integration.rst index b8b181b8c7d..7a32739d95d 100644 --- a/docs/contributing/development/continuous_integration.rst +++ b/docs/contributing/development/continuous_integration.rst @@ -10,270 +10,22 @@ or a change is merged into the *master* branch, a service will clone the repository, checkout to the current commit and execute all the TARDIS tests. This helps us to detect bugs immediately. - -Azure Repos ------------ - -Azure Repos is just another service to store Git repositories. -Currently, we use Azure Repos to mirror ``tardis-refdata`` -repository since Azure does not impose limits on LFS bandwidth -nor storage. - -**To clone this repository:** - -.. code-block:: bash - - git clone https://tardis-sn@dev.azure.com/tardis-sn/TARDIS/_git/tardis-refdata - -**To download a LFS file through HTTPS:** - -.. code-block:: none - - https://dev.azure.com/tardis-sn/TARDIS/_apis/git/repositories/tardis-refdata/items?path=atom_data/kurucz_cd23_chianti_H_He.h5&resolveLfs=true - -This mirror is automatically synced by `a GitHub workflow`. If you want -to `update it manually`_, remember to set ``git config http.version HTTP/1.1`` -to avoid `error 413`_ while pushing large files. - - -Azure Pipelines & GitHub Actions --------------------------------- - -Currently, we use the `Azure DevOps`_ service to run most of our -pipelines and GitHub Actions for some others (called "workflows"). The -following sections explains briefly the different components of a -pipeline/workflow, mostly focused on the Azure service. +GitHub Actions +-------------- A pipeline (or a workflow) is essentially a :term:`YAML` configuration file with different sections such as variables, jobs and steps. These files run commands or tasks when they are triggered by some event, like a commit being pushed to a certain branch. -Pipelines on Azure must be created through the web UI for the first time. -Then, making changes to an existing pipeline is as easy as making a pull -request. To create a new workflow on GitHub, just create a new YAML file -in ``.github/workflows``. - - -Triggers --------- - -First thing to do is telling the pipeline when it should run. In -Azure, *trigger* (also known as the CI trigger) sets up the pipeline -to run every time changes are pushed to a branch. - -.. code-block:: yaml - - trigger: - - master - -If some trigger is not specified then the default configuration -is assumed. - -.. code-block:: yaml - - trigger: - branches: - include: - - '*' - - pr: - branches: - include: - - '*' - -This means the pipeline will start running every time changes are -merged to any branch of the repository, or someone pushes new -commits to a pull request. - -If you want to run a pipeline only manually set both triggers to -*none*. - -.. code-block:: yaml - - trigger: none - - pr: none - -Notice that you can test changes in a pipeline by activating the PR -trigger on a new pull request, even if that trigger is disabled on -the YAML file present in the *master* branch. - -On GitHub Actions these triggers are named ``push`` and ``pull_request``, -and works mostly in the same way. - -.. warning:: Triggers also can be set on the Azure's web interface - too, but this action is discouraged, since it overrides - any trigger specified in the YAML file and could lead to - confusing situations. 
- -There are more useful triggers such as the *cron* trigger, see the -`Azure documentation section on triggers`_ for more information. - - -Variables ---------- - -Variable syntax -=============== - -Azure Pipelines supports three different ways to reference variables: -*macro*, *template expression*, and *runtime expression*. Each syntax -can be used for a different purpose and has some limitations. - -.. image:: images/variables.png - :align: center - -**What syntax should I use?** Use *macro syntax* if you are providing -input for a task. Choose a *runtime expression* if you are working with -conditions and expressions. If you are defining a variable in a template, -use a *template expression*. - - -Define variables -================ - -Usually, we define variables at the top of the YAML file. - -.. code-block:: yaml - - variables: - my.var: 'foo' - - steps: - - bash: | - echo $(my.var) - -When a variable is defined at the top of a YAML, it will be available -to all jobs and stages in the pipeline as a *global variable*. -Variables at the *stage* level override variables at the *root* level, -while variables at the *job* level override variables at the *root* -and *stage* level. - -Also, variables are available to scripts through environment variables. -The name is upper-cased and ``.`` is replaced with ``_``. For example - -.. code-block:: yaml - - variables: - my.var: 'foo' - - steps: - - bash: | - echo $MY_VAR - -To set a variable from a script task, use the ``task.setvariable`` logging -command. - -.. code-block:: yaml - - steps: - - - bash: | - echo "##vso[task.setvariable variable=my.var]foo" - - - bash: | - echo $(my.var) - -See the `Azure documentation section on variables`_ for more information. - - -Predefined variables --------------------- - -The most important (and confusing) predefined variables are the ones related -to paths in Azure: - -* All folders for a given pipeline are created under ``Agent.BuildDirectory`` - variable, alias ``Pipeline.Workspace``. This includes subdirectories like - ``/s`` for sources or ``/a`` for artifacts. - -* Path to source code varies depending on how many repositories we fetch. - For example, source code is located under the ``Build.Repository.LocalPath`` - variable (alias ``Build.SourcesDirectory``) when fetching a single repository, - but after fetching a second repository code is moved automatically to - ``Build.Repository.LocalPath/repository-name``. - -See the Azure documentation to learn more about `checking out multiple repositories`_. - - -Jobs ----- - -You can organize your pipeline into jobs. Every pipeline has at least one job. -A job is a series of steps that run sequentially as a unit. In other words, -a job is the smallest unit of work that can be scheduled to run. - - -.. code-block:: yaml - - jobs: - - job: myJob - - pool: - vmImage: 'ubuntu-latest' - - steps: - - bash: echo "Hello world" - -Jobs can run in parallel (for example: run the same job on multiple OSes) or -depend on a previous job. - -See the `Azure documentation section on jobs`_ for more information. - +Currently, we use GitHub Actions to run all of our pipelines. Making changes to an existing +pipeline is as easy as making a pull request. To create a new workflow on GitHub, +just create a new YAML file in ``.github/workflows``. TARDIS Pipelines ---------------- -Brief description of pipelines already implemented on Azure or GitHub Actions. - - -The default template -==================== - -Templates let you define reusable content, logic, and parameters. 
It functions -like an include directive in many programming languages (content from one file -is inserted into another file). - -The common set of steps used across most TARDIS pipelines resides in the -"default" template: - -- Force ``set -e`` on all Bash steps. -- Set TARDIS custom variables. -- Fetch TARDIS main repository. -- Fetch TARDIS reference data repository from mirror (optional). -- Configure Anaconda for Linux and macOS agents. -- Install Mamba package manager (optional). -- Install TARDIS environment (optional). -- Build and install TARDIS (optional). - -It was written to make pipelines easier to create and maintain. For example, -to start a new pipeline use:: - - steps: - - template: templates/default.yml - parameters: - useMamba: true - -**List of template parameters:** - -- ``fetchDepth`` (*int*): the depth of commits to fetch from ``tardis`` repository, - default is ``0`` (no limit). -- ``fetchRefdata`` (*bool*): fetch the ``tardis-refdata`` repository from Azure Repos, - default is ``false``. -- ``refdataRepo`` (*option*): source of the ``tardis-refdata`` repository, - options are ``azure`` (default) or ``github``. -- ``useMamba`` (*bool*): use the ``mamba`` package manager instead of ``conda``, - default is ``false``. -- ``tardisEnv`` (*bool*): setup the TARDIS environment, default is ``true``. - -**List of predefined custom variables:** - -- ``tardis.dir`` is equivalent to ``$(Build.SourcesDirectory)/tardis``. -- ``refdata.dir`` is equivalent to ``$(Build.SourcesDirectory)/tardis-refdata``. - -See the `Azure documentation section on templates`_ for more information. - +Brief description of pipelines already implemented on Tardis Documentation build pipeline ============================ @@ -292,8 +44,7 @@ Testing pipeline ================ The `testing pipeline`_ (CI) consists basically in the same job running twice -in parallel (one for each OS) with the steps from the default template, plus -extra steps to run the tests and upload the coverage results. +in parallel (one for each OS), plus extra steps to run the tests and upload the coverage results. Authors pipeline @@ -315,60 +66,4 @@ In the near future we want to auto-update the citation guidelines in the Release pipeline ================ -Publishes a new release of TARDIS every sunday at 00:00 UTC. - - -Compare reference data pipeline -=============================== - -This pipeline compares two versions of the reference data. It's triggered manually via -the Azure Pipelines web UI, or when a TARDIS contributor leaves the following comment -on a pull request: -:: - - /AzurePipelines run compare-refdata - -For brevity, you can comment using ``/azp`` instead of ``/AzurePipelines``. - -By default, generates new reference data for the ``HEAD`` of the pull request. Then, -compares against latest reference data stored in ``tardis-refdata`` repository. If -you want to compare two different labels (SHAs, branches, tags, etc.) uncomment and -set the ``ref1.hash`` and ``ref2.hash`` variables in -``.github/workflows/compare-refdata.yml`` on your pull request. For example: -.. code-block:: yaml - - ref1.hash: 'upstream/pr/11' - ref2.hash: 'upstream/master' - -The web UI also allows to compare any version of the reference data by providing those -variables at runtime, but the access to the dashboard is restricted to a small group -of developers. - -.. warning:: If using the Azure dashboard, do not define ``ref1.hash`` and ``ref2.hash`` - between quotation marks or **the pipeline will fail**. 
This does not apply for - the YAML file. - -Finally, the report is uploaded to the -`OpenSupernova.org server `_ -following the ``/`` folder structure. If the pipeline fails, also a report is -generated, but not necessarily gives useful debug information (depends on which step the -pipeline has failed). - - -TARDIS Carsus Compatibility Check -================================= -The TARDIS Carsus Compatibility Check or the "Bridge" compares reference data -generated with different versions of Carsus. It consists of two jobs- a "carsus-build" job to -generate an atomic file with the latest version of Carsus and a "tardis-build" job -to generate a new reference data with it. These two reference data files are compared using the -`this notebook `_. -The workflow has a ``workflow_dispatch`` event so that it can be triggered manually, but is also -triggered every week due to the "save-atomic-files" workflow. - - -The Save Atomic Files Workflow -============================== -The Save Atomic Files workflow runs every week but can also be triggered manually. -It runs the "Bridge" and sends an artifact containing the generated atomic data file -and the comparison notebook to Moria. This workflow has a separate job to indicate if the -bridge has failed. +Publishes a new release of TARDIS every sunday at 00:00 UTC. \ No newline at end of file diff --git a/docs/contributing/development/links.inc b/docs/contributing/development/links.inc index 517b1d39948..f8b6bfd0061 100644 --- a/docs/contributing/development/links.inc +++ b/docs/contributing/development/links.inc @@ -2,5 +2,4 @@ .. include:: known_projects.inc .. include:: this_project.inc .. include:: git_links.inc -.. include:: azure_links.inc .. include:: matterbridge.inc \ No newline at end of file From cd75fa561a2d544c1c39425065a52a90faa6b607 Mon Sep 17 00:00:00 2001 From: KasukabeDefenceForce Date: Tue, 8 Oct 2024 18:55:23 +0530 Subject: [PATCH 07/20] remove integration tests section --- .../development/running_tests.rst | 34 +------------------ 1 file changed, 1 insertion(+), 33 deletions(-) diff --git a/docs/contributing/development/running_tests.rst b/docs/contributing/development/running_tests.rst index 73d97b3faab..a94b21e019a 100644 --- a/docs/contributing/development/running_tests.rst +++ b/docs/contributing/development/running_tests.rst @@ -68,36 +68,4 @@ Or, to run tests for a particular file or directory If, by any chance, you need to run tests on your fork, make sure to run the tests workflow on master branch first. The LFS cache generated in the master branch should be available in all child branches. You can check if cache was generated by looking in the ``Restore LFS Cache`` step of the workflow run. - Cache can also be found under the "Management" Section under "Actions" tab. - -Running the Integration Tests -============================= - -These tests require regression data files against which the results of the various -tardis runs are tested. So you first need to either download the current -regression data files (`here `_) -or generate new ones. - -Both of these require a configuration file for the integration tests: - -.. literalinclude:: integration.yml - :language: yaml - -Inside the atomic data directory there needs to be atomic data for each of -the setups that are provided in the ``test_integration`` folder. -If no references are given, the first step is to generate them. 
-The ``--less-packets`` option is useful for debugging purposes and will just -use very few packets to generate the regression data and thus make the process much -faster --- THIS IS ONLY FOR DEBUGGING PURPOSES. The ``-s`` option ensures that -TARDIS prints out the progress: - -.. code-block:: shell - - > pytest --integration=integration.yml -m integration --generate-reference --less-packets - -To run the test after having run the ``--generate-references``, all that is -needed is: - -.. code-block:: shell - - > pytest --integration=integration.yml -m integration --less-packets --remote-data + Cache can also be found under the "Management" Section under "Actions" tab. \ No newline at end of file From 2d9f34f655fe63a345cb20bb691be990fc207fa1 Mon Sep 17 00:00:00 2001 From: KasukabeDefenceForce Date: Tue, 8 Oct 2024 19:11:06 +0530 Subject: [PATCH 08/20] Remove integration tests mention from docs --- .../development/running_tests.rst | 70 ++++++++++++++----- 1 file changed, 54 insertions(+), 16 deletions(-) diff --git a/docs/contributing/development/running_tests.rst b/docs/contributing/development/running_tests.rst index a94b21e019a..f9438d1dbca 100644 --- a/docs/contributing/development/running_tests.rst +++ b/docs/contributing/development/running_tests.rst @@ -4,13 +4,9 @@ Running tests ************* -There are two basic categories of tests in TARDIS: 1) the unit -tests, and 2) the integration tests. Unit tests check the outputs of individual functions, -while the integration tests check entire runs for different setups of TARDIS. +In TARDIS, we focus primarily on unit tests. These tests check the outputs of individual functions, ensuring that each component behaves as expected. -The unit tests run very quickly and thus are executed after every suggested change -to TARDIS. The integration tests are much more costly and thus are only executed -every few days on a dedicated server. +Unit tests run quickly and are executed after every suggested change to TARDIS, allowing for immediate feedback and maintaining code quality. All of them are based on the excellent ``astropy-setup-helpers`` package and `pytest `_. @@ -26,16 +22,16 @@ tests, you can run this with: > pytest tardis -Running the more advanced unit tests requires TARDIS Regression data that can be +Running the more advanced unit tests requires TARDIS Reference data that can be downloaded -(`tardis-regression-data `_). +(`tardis-refdata `_). `Git LFS `_ is used -to download the large files in the tardis-regression-data repository. +to download the large refdata files in the tardis-refdata repository. However, it is not required to download the entire repository. Firstly it is -important to identify the regression-data files that are needed. Sometimes, it is possible +important to identify the refdata files that are needed. Sometimes, it is possible that a preused fixture that is also being used in the current tests is using some -regression-data. So, it is advised to check for such cases beforehand. +refdata. So, it is advised to check for such cases beforehand. After identifying the refdata files to be used in the unit tests, those particular files can be downloaded using ``git lfs`` @@ -44,21 +40,21 @@ files can be downloaded using ``git lfs`` > git lfs pull --include=filename -It is important to maintain the same directory structure as the tardis-regression-data repo -i.e. 
the lfs files should be in the same directory tree exactly as in tardis-regression-data +It is important to maintain the same directory structure as the tardis-refdata repo +i.e. the lfs files should be in the same directory tree exactly as in tardis-refdata repository. Finally, the tests can be run using the following command .. code-block:: shell - > pytest tardis --tardis-regression-data=/path/to/tardis-regression-data/ + > pytest tardis --tardis-refdata=/path/to/tardis-refdata/ Or, to run tests for a particular file or directory .. code-block:: shell - > pytest tardis/path/to/test_file_or_directory --tardis-regression-data=/path/to/tardis-regression-data/ + > pytest tardis/path/to/test_file_or_directory --tardis-refdata=/path/to/tardis-refdata/ .. warning:: The `tests workflow `_ runs on @@ -68,4 +64,46 @@ Or, to run tests for a particular file or directory If, by any chance, you need to run tests on your fork, make sure to run the tests workflow on master branch first. The LFS cache generated in the master branch should be available in all child branches. You can check if cache was generated by looking in the ``Restore LFS Cache`` step of the workflow run. - Cache can also be found under the "Management" Section under "Actions" tab. \ No newline at end of file + Cache can also be found under the "Management" Section under "Actions" tab. + +Generating Plasma Reference +=========================== + +You can generate Plasma Reference by the following command: + +.. code-block:: shell + + > pytest -rs tardis/plasma/tests/test_complete_plasmas.py + --tardis-refdata="/path/to/tardis-refdata/" --generate-reference + +Running the Integration Tests +============================= + +These tests require reference files against which the results of the various +tardis runs are tested. So you first need to either download the current +reference files (`here `_) +or generate new ones. + +Both of these require a configuration file for the integration tests: + +.. literalinclude:: integration.yml + :language: yaml + +Inside the atomic data directory there needs to be atomic data for each of +the setups that are provided in the ``test_integration`` folder. +If no references are given, the first step is to generate them. +The ``--less-packets`` option is useful for debugging purposes and will just +use very few packets to generate the references and thus make the process much +faster --- THIS IS ONLY FOR DEBUGGING PURPOSES. The ``-s`` option ensures that +TARDIS prints out the progress: + +.. code-block:: shell + + > pytest --integration=integration.yml -m integration --generate-reference --less-packets + +To run the test after having run the ``--generate-references``, all that is +needed is: + +.. 
code-block:: shell + + > pytest --integration=integration.yml -m integration --less-packets --remote-data From f7545190da18d8937c0c3654aad6fad944cecca5 Mon Sep 17 00:00:00 2001 From: KasukabeDefenceForce Date: Wed, 9 Oct 2024 17:55:17 +0530 Subject: [PATCH 09/20] Update docs as per review --- .../development/continuous_integration.rst | 23 +++++++++++++++++-- 1 file changed, 21 insertions(+), 2 deletions(-) diff --git a/docs/contributing/development/continuous_integration.rst b/docs/contributing/development/continuous_integration.rst index 7a32739d95d..3827587189e 100644 --- a/docs/contributing/development/continuous_integration.rst +++ b/docs/contributing/development/continuous_integration.rst @@ -19,7 +19,7 @@ run commands or tasks when they are triggered by some event, like a commit being pushed to a certain branch. Currently, we use GitHub Actions to run all of our pipelines. Making changes to an existing -pipeline is as easy as making a pull request. To create a new workflow on GitHub, +pipeline is as easy as making a pull request. To create a new GitHub Action workflow, just create a new YAML file in ``.github/workflows``. TARDIS Pipelines @@ -66,4 +66,23 @@ In the near future we want to auto-update the citation guidelines in the Release pipeline ================ -Publishes a new release of TARDIS every sunday at 00:00 UTC. \ No newline at end of file +Publishes a new release of TARDIS every sunday at 00:00 UTC. + + +TARDIS Carsus Compatibility Check +================================= +The TARDIS Carsus Compatibility Check or the "Bridge" compares reference data +generated with different versions of Carsus. It consists of two jobs- a "carsus-build" job to +generate an atomic file with the latest version of Carsus and a "tardis-build" job +to generate a new reference data with it. These two reference data files are compared using the +`this notebook `_. +The workflow has a ``workflow_dispatch`` event so that it can be triggered manually, but is also +triggered every week due to the "save-atomic-files" workflow. + + +The Save Atomic Files Workflow +============================== +The Save Atomic Files workflow runs every week but can also be triggered manually. +It runs the "Bridge" and sends an artifact containing the generated atomic data file +and the comparison notebook to Moria. This workflow has a separate job to indicate if the +bridge has failed. \ No newline at end of file From 2886eb14c0e325ca9e9e19441094f4e74510a458 Mon Sep 17 00:00:00 2001 From: KasukabeDefenceForce Date: Wed, 9 Oct 2024 17:55:44 +0530 Subject: [PATCH 10/20] Update docs as per review --- docs/contributing/development/update_regression_data.rst | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/docs/contributing/development/update_regression_data.rst b/docs/contributing/development/update_regression_data.rst index 352ba9e8fe0..2f13d5b65c1 100644 --- a/docs/contributing/development/update_regression_data.rst +++ b/docs/contributing/development/update_regression_data.rst @@ -10,8 +10,9 @@ TARDIS stores regression data in the `tardis-regression-data `_ to make a pull request to update the regression data. If any issues arise during this process, please tag a `TARDIS team member `_ responsible for CI/CD. 
From b7ca7417c51a53b97dc5b8272ae72bca20ceaf14 Mon Sep 17 00:00:00 2001 From: KasukabeDefenceForce Date: Thu, 10 Oct 2024 15:11:55 +0530 Subject: [PATCH 11/20] Fix hyperlink on homepage --- docs/index.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/index.rst b/docs/index.rst index ce71ae3b974..42f4940aec6 100644 --- a/docs/index.rst +++ b/docs/index.rst @@ -19,7 +19,7 @@ TARDIS Core Package Documentation TARDIS is an open-source Monte Carlo radiative-transfer spectral synthesis code for 1D models of supernova ejecta. It is designed for rapid spectral modelling of supernovae. It is developed and maintained by a -`multi-disciplinary team `_ +`multi-disciplinary team `_ including software engineers, computer scientists, statisticians, and astrophysicists. From 399c79238afa29d98c586cb4150b763a8732341f Mon Sep 17 00:00:00 2001 From: KasukabeDefenceForce Date: Thu, 10 Oct 2024 15:16:31 +0530 Subject: [PATCH 12/20] update team page hyperlink --- CODE_OF_CONDUCT.md | 2 +- GOVERNANCE.md | 2 +- docs/contributing/development/update_regression_data.rst | 2 +- 3 files changed, 3 insertions(+), 3 deletions(-) diff --git a/CODE_OF_CONDUCT.md b/CODE_OF_CONDUCT.md index e904a91ac74..bfe0ec25ba6 100644 --- a/CODE_OF_CONDUCT.md +++ b/CODE_OF_CONDUCT.md @@ -15,4 +15,4 @@ As members of the community, This code of conduct has been adapted from the Astropy Code of Conduct, which in turn uses parts of the PSF code of conduct. -**To report any violations of the code of conduct, please contact a member of the [TARDIS core team](https://tardis-sn.github.io/team/community_roles/index.html) (the email tardis.supernova.code@gmail.com is monitored by the core team) or the Ombudsperson (see the [team page](https://tardis-sn.github.io/team/community_roles/index.html); who is outside of the TARDIS collaboration and will treat reports confidentially).** +**To report any violations of the code of conduct, please contact a member of the [TARDIS core team](https://tardis-sn.github.io/people/core/) (the email tardis.supernova.code@gmail.com is monitored by the core team) or the Ombudsperson (see the [team page](https://tardis-sn.github.io/people/core/); who is outside of the TARDIS collaboration and will treat reports confidentially).** diff --git a/GOVERNANCE.md b/GOVERNANCE.md index d0563d3d05e..7b9f508473c 100644 --- a/GOVERNANCE.md +++ b/GOVERNANCE.md @@ -1,3 +1,3 @@ # TARDIS Collaboration Governance -Please visit our website to learn more about the [TARDIS Governance](https://tardis-sn.github.io/team/governance/). +Please visit our website to learn more about the [TARDIS Governance](https://tardis-sn.github.io/people/governance/). diff --git a/docs/contributing/development/update_regression_data.rst b/docs/contributing/development/update_regression_data.rst index 2f13d5b65c1..29f8fcaeb11 100644 --- a/docs/contributing/development/update_regression_data.rst +++ b/docs/contributing/development/update_regression_data.rst @@ -14,7 +14,7 @@ A. There's a problem in your code. B. Your code is OK, but the regression data is outdated. C. The pipeline is broken. -If you suspect scenario B, please follow the `Manual Procedure <#manual-procedure>`_ to make a pull request to update the regression data. If any issues arise during this process, please tag a `TARDIS team member `_ responsible for CI/CD. +If you suspect scenario B, please follow the `Manual Procedure <#manual-procedure>`_ to make a pull request to update the regression data. 
If any issues arise during this process, please tag a `TARDIS team member `_ responsible for CI/CD. ================ Manual Procedure From 6bfcb86aaed8484eff77a34fc60ea201ca9d1f6f Mon Sep 17 00:00:00 2001 From: KasukabeDefenceForce Date: Mon, 14 Oct 2024 12:37:34 +0530 Subject: [PATCH 13/20] update testing pipeline description --- docs/contributing/development/continuous_integration.rst | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/docs/contributing/development/continuous_integration.rst b/docs/contributing/development/continuous_integration.rst index 3827587189e..61d5c6d88ce 100644 --- a/docs/contributing/development/continuous_integration.rst +++ b/docs/contributing/development/continuous_integration.rst @@ -43,8 +43,7 @@ See the :ref:`Documentation Preview ` section for more information. Testing pipeline ================ -The `testing pipeline`_ (CI) consists basically in the same job running twice -in parallel (one for each OS), plus extra steps to run the tests and upload the coverage results. +The `testing pipeline`_ (CI) comprises of six concurrent jobs. Each of these jobs runs three types of tests across two distinct categories—continuum and non-continuum—and supports three different operating system platforms. Additionally, there are extra steps involved in executing the tests and uploading the coverage results Authors pipeline From 1b1212b96c5dafaaebfad2e73be2d2fcb4f4ccef Mon Sep 17 00:00:00 2001 From: KasukabeDefenceForce Date: Mon, 14 Oct 2024 19:09:21 +0530 Subject: [PATCH 14/20] Remove manual procedure heading --- docs/contributing/development/update_regression_data.rst | 8 ++------ 1 file changed, 2 insertions(+), 6 deletions(-) diff --git a/docs/contributing/development/update_regression_data.rst b/docs/contributing/development/update_regression_data.rst index 29f8fcaeb11..078726e71c0 100644 --- a/docs/contributing/development/update_regression_data.rst +++ b/docs/contributing/development/update_regression_data.rst @@ -14,11 +14,7 @@ A. There's a problem in your code. B. Your code is OK, but the regression data is outdated. C. The pipeline is broken. -If you suspect scenario B, please follow the `Manual Procedure <#manual-procedure>`_ to make a pull request to update the regression data. If any issues arise during this process, please tag a `TARDIS team member `_ responsible for CI/CD. - -================ -Manual Procedure -================ +If you suspect scenario B, please follow these instructions: #. Activate the ``tardis`` environment. #. Fork and clone the ``tardis-regression-data`` repository. @@ -28,4 +24,4 @@ Manual Procedure #. Check your results and ensure everything is correct. #. Make a new branch in ``tardis-regression-data``, push your new regression data, and open a pull request. -By following these updated procedures, you can efficiently manage and update regression data within your TARDIS project setup. +If any issues arise during this process, please tag a `TARDIS team member `_ responsible for CI/CD. 
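+
+As a rough sketch, the Git side of these steps could look like the following (the fork URL and branch name are illustrative placeholders):
+
+.. code-block:: shell
+
+   > conda activate tardis
+   > git clone https://github.com/<your-username>/tardis-regression-data.git
+   > cd tardis-regression-data
+   > git checkout -b update-regression-data
+   # regenerate the new regression data (see the numbered steps above)
+   > git add .
+   > git commit -m "Update regression data"
+   > git push origin update-regression-data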
\ No newline at end of file

From 41d78af8b39f2786381b022cae1749a5463e79ad Mon Sep 17 00:00:00 2001
From: KasukabeDefenceForce 
Date: Mon, 14 Oct 2024 19:11:16 +0530
Subject: [PATCH 15/20] Change tardis into upper case letters

---
 docs/contributing/development/continuous_integration.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/contributing/development/continuous_integration.rst b/docs/contributing/development/continuous_integration.rst
index 61d5c6d88ce..d538ca2fea8 100644
--- a/docs/contributing/development/continuous_integration.rst
+++ b/docs/contributing/development/continuous_integration.rst
@@ -25,7 +25,7 @@ just create a new YAML file in ``.github/workflows``.
 TARDIS Pipelines
 ----------------
 
-Brief description of pipelines already implemented on Tardis
+Brief description of pipelines already implemented on TARDIS
 
 Documentation build pipeline
 ============================
From 68f4530530528017c33561d6404c125bb39db306 Mon Sep 17 00:00:00 2001
From: KasukabeDefenceForce 
Date: Mon, 14 Oct 2024 19:23:18 +0530
Subject: [PATCH 16/20] Add commonly followed steps in current tardis
 pipelines

---
 .../development/continuous_integration.rst    | 22 +++++++++++++++++++
 1 file changed, 22 insertions(+)

diff --git a/docs/contributing/development/continuous_integration.rst b/docs/contributing/development/continuous_integration.rst
index d538ca2fea8..78059d07db1 100644
--- a/docs/contributing/development/continuous_integration.rst
+++ b/docs/contributing/development/continuous_integration.rst
@@ -27,6 +27,28 @@ TARDIS Pipelines
 
 Brief description of pipelines already implemented on TARDIS
 
+# Streamlined Steps for TARDIS Pipelines
+
+We have a common set of steps which are utilized in TARDIS pipelines to streamline the process:
+
+### Common Steps
+
+1. **Use `setup_lfs` Action**
+   - If you need access to regression or atomic data, incorporate the `setup_lfs` action to ensure proper handling of large file storage.
+
+2. **Use `setup_env` Action**
+   - To configure your environment effectively, utilize the `setup_env` action. This will help establish the necessary variables and settings for your pipeline.
+
+3. **Run Configuration**
+   - Ensure that your pipeline runs with the appropriate shell settings. You can define this in your YAML configuration as follows:
+
+   .. code-block:: yaml
+
+      defaults:
+        run:
+          shell: bash -l {0}
+
+
 Documentation build pipeline
 ============================
From d1906777293f0b4dcd9a70d79d3b6d1cd75ba628 Mon Sep 17 00:00:00 2001
From: KasukabeDefenceForce 
Date: Mon, 14 Oct 2024 20:16:49 +0530
Subject: [PATCH 17/20] Fix headings

---
 .../development/continuous_integration.rst    | 10 ++++++----
 1 file changed, 6 insertions(+), 4 deletions(-)

diff --git a/docs/contributing/development/continuous_integration.rst b/docs/contributing/development/continuous_integration.rst
index 78059d07db1..91a9318ebde 100644
--- a/docs/contributing/development/continuous_integration.rst
+++ b/docs/contributing/development/continuous_integration.rst
@@ -23,15 +23,17 @@ pipeline is as easy as making a pull request. To create a new GitHub Action work
 just create a new YAML file in ``.github/workflows``. 
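+
+As a sketch, a minimal workflow file could look like the following (the file name, trigger, and steps here are illustrative only, not an existing TARDIS workflow; real TARDIS pipelines typically also use the ``setup_lfs`` and ``setup_env`` actions described below):
+
+.. code-block:: yaml
+
+   name: example-pipeline
+
+   on:
+     push:
+       branches:
+         - master
+
+   jobs:
+     example:
+       runs-on: ubuntu-latest
+       steps:
+         - uses: actions/checkout@v4
+         # placeholder step; real pipelines run tests, build docs, etc.
+         - run: echo "Hello from a TARDIS workflow"
+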
TARDIS Pipelines ----------------- +================ Brief description of pipelines already implemented on TARDIS -# Streamlined Steps for TARDIS Pipelines +Streamlined Steps for TARDIS Pipelines +======================================== We have a common set of steps which are utilized in TARDIS pipelines to streamline the process: -### Common Steps +Common Steps +------------ 1. **Use `setup_lfs` Action** - If you need access to regression or atomic data, incorporate the `setup_lfs` action to ensure proper handling of large file storage. @@ -87,7 +89,7 @@ In the near future we want to auto-update the citation guidelines in the Release pipeline ================ -Publishes a new release of TARDIS every sunday at 00:00 UTC. +Publishes a new release of TARDIS every SUNDAY at 00:00 UTC. TARDIS Carsus Compatibility Check From 5fcb36ba5d1cf4f1dc42465d848e4ca8b9b869b4 Mon Sep 17 00:00:00 2001 From: KasukabeDefenceForce Date: Mon, 14 Oct 2024 20:26:07 +0530 Subject: [PATCH 18/20] Change sunday from all caps to only initial letter as capital --- docs/contributing/development/continuous_integration.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/contributing/development/continuous_integration.rst b/docs/contributing/development/continuous_integration.rst index 91a9318ebde..05e8d7becc4 100644 --- a/docs/contributing/development/continuous_integration.rst +++ b/docs/contributing/development/continuous_integration.rst @@ -89,7 +89,7 @@ In the near future we want to auto-update the citation guidelines in the Release pipeline ================ -Publishes a new release of TARDIS every SUNDAY at 00:00 UTC. +Publishes a new release of TARDIS every Sunday at 00:00 UTC. TARDIS Carsus Compatibility Check From 819165b2f4e3566f9a3e5ce696e2b3f6ba416a93 Mon Sep 17 00:00:00 2001 From: KasukabeDefenceForce Date: Mon, 14 Oct 2024 20:28:40 +0530 Subject: [PATCH 19/20] Update testing pipeline description --- docs/contributing/development/continuous_integration.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/contributing/development/continuous_integration.rst b/docs/contributing/development/continuous_integration.rst index 05e8d7becc4..45db365ccf1 100644 --- a/docs/contributing/development/continuous_integration.rst +++ b/docs/contributing/development/continuous_integration.rst @@ -67,7 +67,7 @@ See the :ref:`Documentation Preview ` section for more information. Testing pipeline ================ -The `testing pipeline`_ (CI) comprises of six concurrent jobs. Each of these jobs runs three types of tests across two distinct categories—continuum and non-continuum—and supports three different operating system platforms. Additionally, there are extra steps involved in executing the tests and uploading the coverage results +The `testing pipeline`_ (CI) comprises multiple concurrent jobs. Each of these jobs runs tests across two distinct categories—continuum and rpacket tracking—and supports two different operating systems. 
Additionally, there are extra steps involved in executing the tests and uploading the coverage results.

From 34b63072ed777f1018be8a922abf65032e14f457 Mon Sep 17 00:00:00 2001
From: KasukabeDefenceForce 
Date: Wed, 16 Oct 2024 09:46:07 +0530
Subject: [PATCH 20/20] update group email

---
 CODE_OF_CONDUCT.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/CODE_OF_CONDUCT.md b/CODE_OF_CONDUCT.md
index bfe0ec25ba6..8562360134f 100644
--- a/CODE_OF_CONDUCT.md
+++ b/CODE_OF_CONDUCT.md
@@ -15,4 +15,4 @@ As members of the community,
 
 This code of conduct has been adapted from the Astropy Code of Conduct, which in turn uses parts of the PSF code of conduct.
 
-**To report any violations of the code of conduct, please contact a member of the [TARDIS core team](https://tardis-sn.github.io/people/core/) (the email tardis.supernova.code@gmail.com is monitored by the core team) or the Ombudsperson (see the [team page](https://tardis-sn.github.io/people/core/); who is outside of the TARDIS collaboration and will treat reports confidentially).**
+**To report any violations of the code of conduct, please contact a member of the [TARDIS core team](https://tardis-sn.github.io/people/core/) (the email tardiscollaboration@gmail.com is monitored by the core team) or the Ombudsperson (see the [team page](https://tardis-sn.github.io/people/core/); who is outside of the TARDIS collaboration and will treat reports confidentially).**