
[REVIEW]: matbench-genmetrics: A Python library for benchmarking crystal structure generative models using time-based splits of Materials Project structures #5618

Closed
editorialbot opened this issue Jul 4, 2023 · 109 comments
Labels
accepted, Dockerfile, Jupyter Notebook, Mathematica, published (Papers published in JOSS), recommend-accept (Papers recommended for acceptance in JOSS), review, Track: 2 (BCM) Biomedical Engineering, Biosciences, Chemistry, and Materials

Comments

@editorialbot
Collaborator

editorialbot commented Jul 4, 2023

Submitting author: @sgbaird (Sterling Baird)
Repository: https://github.com/sparks-baird/matbench-genmetrics
Branch with paper.md (empty if default branch):
Version: v0.6.5
Editor: @phibeck
Reviewers: @ml-evs, @mkhorton, @jamesrhester
Archive: 10.5281/zenodo.10840604
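
For context on the submission itself: the package scores crystal structure generative models against chronological splits of Materials Project entries, i.e., models are "trained" on structures known up to some date and judged on how well their generated structures anticipate those added later. The sketch below is a rough conceptual illustration only (plain Python); the class and function names are assumptions for illustration, not part of the matbench-genmetrics API.

from dataclasses import dataclass

@dataclass
class Entry:
    material_id: str   # e.g., a Materials Project ID such as "mp-149"
    year: int          # year the structure entered the database

def time_based_split(entries, cutoff_year):
    """Train on structures known before the cutoff; hold out newer ones."""
    train = [e for e in entries if e.year < cutoff_year]
    held_out = [e for e in entries if e.year >= cutoff_year]
    return train, held_out

# Toy usage: two "known" structures for training, one later structure held out.
entries = [Entry("mp-1", 2016), Entry("mp-2", 2019), Entry("mp-3", 2021)]
train, held_out = time_based_split(entries, cutoff_year=2020)
print(len(train), len(held_out))  # -> 2 1

A generative model is then scored on what it produces relative to the held-out, later-dated structures (the paper describes metrics such as validity, coverage, novelty, and uniqueness); see the repository documentation for the actual metrics and API.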

Status


Status badge code:

HTML: <a href="https://joss.theoj.org/papers/7ba5a67474e60ef6927a8635354b8546"><img src="https://joss.theoj.org/papers/7ba5a67474e60ef6927a8635354b8546/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/7ba5a67474e60ef6927a8635354b8546/status.svg)](https://joss.theoj.org/papers/7ba5a67474e60ef6927a8635354b8546)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@ml-evs & @mkhorton & @jamesrhester, your review will be checklist based. Each of you will have a separate checklist that you should update when carrying out your review.
First of all you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @phibeck know.

Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest.

Checklists

📝 Checklist for @ml-evs

📝 Checklist for @jamesrhester

📝 Checklist for @mkhorton

@editorialbot editorialbot added the Dockerfile, Jupyter Notebook, Mathematica, review, and Track: 2 (BCM) Biomedical Engineering, Biosciences, Chemistry, and Materials labels Jul 4, 2023
@editorialbot
Collaborator Author

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf

@editorialbot
Collaborator Author

Software report:

github.com/AlDanial/cloc v 1.88  T=0.08 s (755.2 files/s, 365581.4 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Python                          26            704           1254           1847
Jupyter Notebook                 5              0          21769           1194
Markdown                        12            208              0            679
TeX                              6             28              0            378
YAML                             5             21             72            209
INI                              1             11              0             83
Dockerfile                       1             15             23             26
make                             1              6              8             15
TOML                             1              1              3              5
JSON                             1              0              0              1
-------------------------------------------------------------------------------
SUM:                            59            994          23129           4437
-------------------------------------------------------------------------------


gitinspector failed to run statistical information for the repository

@editorialbot
Collaborator Author

Wordcount for paper.md is 1122

@editorialbot
Collaborator Author

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.26434/chemrxiv-2022-6l4pm is OK
- 10.1038/s41467-019-10030-5 is OK
- 10.21105/joss.04528 is OK
- 10.1021/acs.jcim.8b00839 is OK
- 10.1038/s43588-022-00349-3 is OK
- 10.1038/s41524-020-00406-3 is OK
- 10.1063/1.4812323 is OK
- 10.1016/j.commatsci.2012.10.028 is OK
- 10.1038/s41598-022-08413-8 is OK
- 10.3389/fphar.2020.565644 is OK
- 10.1016/j.matt.2021.11.032 is OK
- 10.1107/S2056989019016244 is OK
- 10.1038/s41586-019-1335-8 is OK
- 10.1002/advs.202100566 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@jamesrhester

jamesrhester commented Jul 4, 2023

Review checklist for @jamesrhester

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

  • I confirm that I read and will adhere to the JOSS code of conduct.

General checks

  • Repository: Is the source code for this software available at https://github.com/sparks-baird/matbench-genmetrics?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@sgbaird) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@ml-evs

ml-evs commented Jul 4, 2023

Review checklist for @ml-evs

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

  • I confirm that I read and will adhere to the JOSS code of conduct.

General checks

  • Repository: Is the source code for this software available at https://github.com/sparks-baird/matbench-genmetrics?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@sgbaird) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@phibeck

phibeck commented Jul 14, 2023

@jamesrhester, @ml-evs, @mkhorton, thanks again for reviewing! Just checking in on your review. Please let me know if you have any questions about the process. Feel free to create issues in the project repository directly or write them down as comments here, but please do link the issues in this review so it's easy to follow for everyone. @mkhorton, please go ahead and create your checklist first using the command @editorialbot generate my checklist.

@mkhorton

mkhorton commented Jul 15, 2023

Review checklist for @mkhorton

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

  • I confirm that I read and will adhere to the JOSS code of conduct.

General checks

  • Repository: Is the source code for this software available at https://github.com/sparks-baird/matbench-genmetrics?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@sgbaird) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@phibeck

phibeck commented Jul 24, 2023

👋 @jamesrhester, @mkhorton, please update us on how it's going with your reviews when you find the time.

@mkhorton

Appreciate the reminder @phibeck, thank you -- on my radar, can't believe it's almost been a month already since agreeing to review!

@jamesrhester

Likewise!

@phibeck

phibeck commented Aug 10, 2023

Thank you all for getting the review started! As you work through your checklists, please feel free to comment and ask questions in this thread. You are encouraged to create issues in the repository directly. When you do, please mention openjournals/joss-reviews#5618 so that it creates a link in this thread and we can keep track of it.

Please let me know if you have any questions or if either of you requires some more time.

@mkhorton

Hi @phibeck, running through the checklist. Unfortunately I may have a conflict of interest: I have previously been on publications with the second author, Joseph Montoya. We worked together in the same research group until 2018 (i.e. outside the four year window in the COI policy), but the most recent paper we were on together actually only came out in 2021.

@danielskatz

If the work was done more than 4 years ago but the paper appeared later, this is not a conflict for JOSS. The four-year window is about the collaborative relationship itself.

@phibeck

phibeck commented Aug 18, 2023

Thank you, @danielskatz, for clarifying this question. Sounds like you don't have a COI here, @mkhorton.

@ml-evs

ml-evs commented Aug 22, 2023

Just a heads-up (mostly for @phibeck) that I will restart my review on this, and will continue collecting small things in my old issue at sparks-baird/matbench-genmetrics#80 which wasn't previously linked here.

@phibeck

phibeck commented Sep 4, 2023

@jamesrhester, @mkhorton & @ml-evs - could you provide an update on the progress of your review? Thank you!

@jamesrhester

Getting onto this now. We've just had a big crystallography meeting that took up a lot of my cycles...

@jamesrhester

As a crystallographer but non-ML specialist, I found the "Statement of Need" lacked context. Line 22 in the paper made no sense to me, which is not good for the first sentence. I would therefore like one or two further sentences added at the beginning explaining how ML uses benchmarks, e.g., "In ML, the result of a prediction is evaluated using benchmarks, which are then used to adjust the ML weights. Typically, a crystal structure ML model has used benchmarks from..." (which might show I have no idea what I'm talking about). I think this will better help readers quickly determine whether or not the paper is relevant to them.

Once this is done I'm ready to sign off on my review.

@phibeck

phibeck commented Sep 5, 2023

Great, thanks for the update, @jamesrhester. @sgbaird, feel free to get started working on the comments and issues linked here by @jamesrhester and @ml-evs. Please update us here in this issue about the progress so we can keep track of the changes.

@phibeck

phibeck commented Sep 20, 2023

👋 @sgbaird could you let us know where you stand with responding to the comments of the reviewers?

@mkhorton and @ml-evs, it would be great to get an update from your side as well regarding the remaining points on your checklists. Thank you!

@phibeck

phibeck commented May 10, 2024

@editorialbot generate pdf

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@phibeck

phibeck commented May 10, 2024

> Sorry about that. Not sure what happened there. I added it in just now. Does it look ok?

No problem. Looks good now, thanks!

@phibeck

phibeck commented May 10, 2024

@editorialbot check references

@editorialbot
Collaborator Author

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1038/s41524-023-00987-9 is OK
- 10.26434/chemrxiv-2022-6l4pm is OK
- 10.1038/s41467-019-10030-5 is OK
- 10.21105/joss.04528 is OK
- 10.1021/acs.jcim.8b00839 is OK
- 10.1038/s43588-022-00349-3 is OK
- 10.1038/s41524-020-00406-3 is OK
- 10.1063/1.4812323 is OK
- 10.1016/j.commatsci.2012.10.028 is OK
- 10.1038/s41598-022-08413-8 is OK
- 10.3389/fphar.2020.565644 is OK
- 10.1016/j.matt.2021.11.032 is OK
- 10.1107/S2056989019016244 is OK
- 10.1038/s41586-019-1335-8 is OK
- 10.1002/advs.202100566 is OK
- 10.48550/arXiv.2306.11688 is OK
- 10.48550/arXiv.2308.14920 is OK

MISSING DOIs

- No DOI given, and none found for title: Scikit-Learn: Machine Learning in Python
- No DOI given, and none found for title: Crystal Diffusion Variational Autoencoder for Peri...
- No DOI given, and none found for title: Physics Guided Generative Adversarial Networks for...

INVALID DOIs

- None

@phibeck

phibeck commented May 10, 2024

@editorialbot recommend-accept

@editorialbot
Collaborator Author

Attempting dry run of processing paper acceptance...

@editorialbot
Collaborator Author

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1038/s41524-023-00987-9 is OK
- 10.26434/chemrxiv-2022-6l4pm is OK
- 10.1038/s41467-019-10030-5 is OK
- 10.21105/joss.04528 is OK
- 10.1021/acs.jcim.8b00839 is OK
- 10.1038/s43588-022-00349-3 is OK
- 10.1038/s41524-020-00406-3 is OK
- 10.1063/1.4812323 is OK
- 10.1016/j.commatsci.2012.10.028 is OK
- 10.1038/s41598-022-08413-8 is OK
- 10.3389/fphar.2020.565644 is OK
- 10.1016/j.matt.2021.11.032 is OK
- 10.1107/S2056989019016244 is OK
- 10.1038/s41586-019-1335-8 is OK
- 10.1002/advs.202100566 is OK
- 10.48550/arXiv.2306.11688 is OK
- 10.48550/arXiv.2308.14920 is OK

MISSING DOIs

- No DOI given, and none found for title: Scikit-Learn: Machine Learning in Python
- No DOI given, and none found for title: Crystal Diffusion Variational Autoencoder for Peri...
- No DOI given, and none found for title: Physics Guided Generative Adversarial Networks for...

INVALID DOIs

- None

@editorialbot
Collaborator Author

👋 @openjournals/bcm-eics, this paper is ready to be accepted and published.

Check final proof 👉📄 Download article

If the paper PDF and the deposit XML files look good in openjournals/joss-papers#5342, then you can now move forward with accepting the submission by compiling again with the command @editorialbot accept

@editorialbot editorialbot added the recommend-accept (Papers recommended for acceptance in JOSS) label May 10, 2024
@sgbaird

sgbaird commented May 15, 2024

I guess the Alverson reference needs to be updated to the published version. Will try to address shortly - no worries if too late.

EDIT: added in sparks-baird/matbench-genmetrics@8f7102d

@sgbaird

sgbaird commented May 17, 2024

@editorialbot generate pdf

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@phibeck

phibeck commented May 23, 2024

Okay, thanks!

👋 @Kevin-Mattheus-Moerman this paper is ready for acceptance!

@Kevin-Mattheus-Moerman
Member

Kevin-Mattheus-Moerman commented May 25, 2024

@sgbaird as AEiC for JOSS I will now help to process this submission for acceptance in JOSS. I have checked this review, your repository, the archive link, and the paper. Most seems in order; I only have the points below that require your attention:

  • Check the spelling of practioners; this should perhaps be practitioners.
  • In your affiliations, please spell out USA as United States of America.
  • In the 3rd affiliation, please add the country (and you may remove the zip code if you like; this is not needed).
  • For the reference "JARVIS-Leaderboard: A Large Scale Benchmark of Materials Design Methods", you cite an arXiv link; if you think it is appropriate, you could instead refer to the version which, if I'm not mistaken, now appears published here: https://doi.org/10.1038/s41524-024-01259-w
  • On the use of ChatGPT, can you please clarify how it was used here in more detail?

@sgbaird

sgbaird commented May 25, 2024

> @sgbaird as AEiC for JOSS I will now help to process this submission for acceptance in JOSS. I have checked this review, your repository, the archive link, and the paper. Most seems in order; I only have the points below that require your attention:
>
>   • Check the spelling of practioners; this should perhaps be practitioners.
>   • In your affiliations, please spell out USA as United States of America.
>   • In the 3rd affiliation, please add the country (and you may remove the zip code if you like; this is not needed).
>   • For the reference "JARVIS-Leaderboard: A Large Scale Benchmark of Materials Design Methods", you cite an arXiv link; if you think it is appropriate, you could instead refer to the version which, if I'm not mistaken, now appears published here: https://doi.org/10.1038/s41524-024-01259-w
>   • On the use of ChatGPT, can you please clarify how it was used here in more detail?

Hi, I think all of these are addressed now. Can you take a look?

@Kevin-Mattheus-Moerman
Member

@editorialbot generate pdf

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@Kevin-Mattheus-Moerman
Member

@editorialbot accept

@editorialbot
Collaborator Author

Doing it live! Attempting automated processing of paper acceptance...

@editorialbot
Collaborator Author

Ensure proper citation by uploading a plain text CITATION.cff file to the default branch of your repository.

If using GitHub, a Cite this repository menu will appear in the About section, containing both APA and BibTeX formats. When exported to Zotero using a browser plugin, Zotero will automatically create an entry using the information contained in the .cff file.

You can copy the contents for your CITATION.cff file here:

CITATION.cff

cff-version: "1.2.0"
authors:
- family-names: Baird
  given-names: Sterling G.
  orcid: "https://orcid.org/0000-0002-4491-6876"
- family-names: Sayeed
  given-names: Hasan M.
  orcid: "https://orcid.org/0000-0002-6583-7755"
- family-names: Montoya
  given-names: Joseph
  orcid: "https://orcid.org/0000-0001-5760-2860"
- family-names: Sparks
  given-names: Taylor D.
  orcid: "https://orcid.org/0000-0001-8020-7711"
contact:
- family-names: Baird
  given-names: Sterling G.
  orcid: "https://orcid.org/0000-0002-4491-6876"
doi: 10.5281/zenodo.10840604
message: If you use this software, please cite our article in the
  Journal of Open Source Software.
preferred-citation:
  authors:
  - family-names: Baird
    given-names: Sterling G.
    orcid: "https://orcid.org/0000-0002-4491-6876"
  - family-names: Sayeed
    given-names: Hasan M.
    orcid: "https://orcid.org/0000-0002-6583-7755"
  - family-names: Montoya
    given-names: Joseph
    orcid: "https://orcid.org/0000-0001-5760-2860"
  - family-names: Sparks
    given-names: Taylor D.
    orcid: "https://orcid.org/0000-0001-8020-7711"
  date-published: 2024-05-27
  doi: 10.21105/joss.05618
  issn: 2475-9066
  issue: 97
  journal: Journal of Open Source Software
  publisher:
    name: Open Journals
  start: 5618
  title: "matbench-genmetrics: A Python library for benchmarking crystal
    structure generative models using time-based splits of Materials
    Project structures"
  type: article
  url: "https://joss.theoj.org/papers/10.21105/joss.05618"
  volume: 9
title: "matbench-genmetrics: A Python library for benchmarking crystal
  structure generative models using time-based splits of Materials
  Project structures"

If the repository is not hosted on GitHub, a .cff file can still be uploaded to set your preferred citation. Users will be able to manually copy and paste the citation.

Find more information on .cff files here and here.

@editorialbot
Collaborator Author

🐘🐘🐘 👉 Toot for this paper 👈 🐘🐘🐘

@editorialbot
Collaborator Author

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 Creating pull request for 10.21105.joss.05618 joss-papers#5392
  2. Wait five minutes, then verify that the paper DOI resolves https://doi.org/10.21105/joss.05618
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...

@editorialbot editorialbot added the accepted and published (Papers published in JOSS) labels May 27, 2024
@Kevin-Mattheus-Moerman
Member

@sgbaird congratulations on this JOSS publication!

Thanks for editing, @phibeck!

And a special thank you to the reviewers: @ml-evs, @mkhorton, @jamesrhester!!!!

@editorialbot
Collaborator Author

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.05618/status.svg)](https://doi.org/10.21105/joss.05618)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.05618">
  <img src="https://joss.theoj.org/papers/10.21105/joss.05618/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.05618/status.svg
   :target: https://doi.org/10.21105/joss.05618

This is how it will look in your documentation:

[DOI badge image]

We need your help!

The Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider doing either one (or both) of the following:

@sgbaird

sgbaird commented May 27, 2024

> @sgbaird congratulations on this JOSS publication!
>
> Thanks for editing, @phibeck!
>
> And a special thank you to the reviewers: @ml-evs, @mkhorton, @jamesrhester!!!!

Amazing! Thank you all so much for the support and patience 🤗 I really appreciate your efforts!
