
[REVIEW]: An Introduction to prismAId: Open-Source and Open Science AI for Advancing Information Extraction in Systematic Reviews #7616

Open
editorialbot opened this issue Dec 24, 2024 · 29 comments
Assignees
Labels
Go R review TeX Track: 5 (DSAIS) Data Science, Artificial Intelligence, and Machine Learning

Comments

@editorialbot
Collaborator

editorialbot commented Dec 24, 2024

Submitting author: @ricboer0 (Riccardo Boero)
Repository: https://github.com/open-and-sustainable/prismaid
Branch with paper.md (empty if default branch): joss-submission
Version: v0.6.4
Editor: @crvernon
Reviewers: @philip928lin, @jhculb
Archive: Pending

Status


Status badge code:

HTML: <a href="https://joss.theoj.org/papers/b304d9af36e75fd2b10ebcdb55add064"><img src="https://joss.theoj.org/papers/b304d9af36e75fd2b10ebcdb55add064/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/b304d9af36e75fd2b10ebcdb55add064/status.svg)](https://joss.theoj.org/papers/b304d9af36e75fd2b10ebcdb55add064)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@philip928lin & @jhculb, your review will be checklist-based. Each of you will have a separate checklist that you should update when carrying out your review.
First of all, you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. For any questions or concerns, please let @crvernon know.

Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest.

Checklists

📝 Checklist for @philip928lin

📝 Checklist for @jhculb

@editorialbot
Collaborator Author

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf

@editorialbot
Collaborator Author

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

✅ OK DOIs

- 10.1186/s13643-021-01626-4 is OK
- 10.1080/02763869.2019.1588072 is OK
- 10.3390/cli12080116 is OK
- 10.31222/osf.io/wh8qn is OK

🟡 SKIP DOIs

- No DOI given, and none found for title: prismAId - Documentation website

❌ MISSING DOIs

- None

❌ INVALID DOIs

- None

@editorialbot
Collaborator Author

Software report:

github.com/AlDanial/cloc v 1.98  T=0.04 s (2249.0 files/s, 197666.1 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Go                              64            694            950           4075
Markdown                        12            330              2           1203
YAML                             6             72             31            423
JavaScript                       1             47             25            208
TOML                             5             29             19            188
Text                             3              0              0            112
TeX                              1              4              0             72
CSS                              1              9              1             42
R                                1              9             78             34
Julia                            2             10              7             33
Python                           1              6              3             24
JSON                             1              0              0             19
C                                1              4              2             15
C/C++ Header                     1              3              2              4
-------------------------------------------------------------------------------
SUM:                           100           1217           1120           6452
-------------------------------------------------------------------------------

Commit count by author:

   701	Riccardo Boero
    12	GitHub Action

@editorialbot
Collaborator Author

Paper file info:

📄 Wordcount for paper.md is 1823

✅ The paper includes a Statement of need section

@editorialbot
Collaborator Author

License info:

🟡 License found: GNU Affero General Public License v3.0 (Check here for OSI approval)

@crvernon

👋 @ricboer0, @philip928lin and @jhculb - This is the review thread for the paper. All of our communications will happen here from now on.

Please read the "Reviewer instructions & questions" in the first comment above.

Both reviewers have checklists at the top of this thread (in that first comment) with the JOSS requirements. As you go over the submission, please check any items that you feel have been satisfied. There are also links to the JOSS reviewer guidelines.

The JOSS review is different from that of most other journals. Our goal is to work with the authors to help them meet our criteria instead of merely passing judgment on the submission. As such, the reviewers are encouraged to submit issues and pull requests on the software repository. When doing so, please mention #7616 so that a link is created to this thread (and I can keep an eye on what is happening). Please also feel free to comment and ask questions on this thread. In my experience, it is better to post comments/questions/suggestions as you come across them instead of waiting until you've reviewed the entire package.

We aim for the review process to be completed within about 4-6 weeks, but please make a start well ahead of this: JOSS reviews are by their nature iterative, and any early feedback you may be able to provide to the author will be very helpful in meeting this schedule.

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@philip928lin

philip928lin commented Dec 26, 2024

Review checklist for @philip928lin

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/open-and-sustainable/prismaid?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@ricboer0) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data or research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1. Contribute to the software 2. Report issues or problems with the software 3. Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@jhculb

jhculb commented Dec 28, 2024

Review checklist for @jhculb

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/open-and-sustainable/prismaid?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@ricboer0) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data or research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1. Contribute to the software 2. Report issues or problems with the software 3. Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@jhculb

jhculb commented Dec 28, 2024

Hi @ricboer0,

I'm getting a 404 on the github.io documentation link in the readme: https://open-and-sustainable.github.io/prismAId/

Is this just me? @philip928lin

@jhculb

jhculb commented Dec 28, 2024

Hi @ricboer0,

Your paper may be improved with a brief survey of past and current academic information extraction tools, as at the moment the state of the field section of the paper feels lacking.

Perhaps LISC https://github.com/lisc-tools/lisc ?
Or AutoIE: https://doi.org/10.1007/978-981-97-5495-3_32 ?

@philip928lin

Hi @ricboer0,

I'm getting a 404 on the github.io documentation link in the readme: https://open-and-sustainable.github.io/prismAId/

Is this just me? @philip928lin

@jhculb I encountered the same issue as you. Multiple hyperlinks in the readme have the same issue. @ricboer0, please double-check the links.

@philip928lin

philip928lin commented Dec 28, 2024

Hi @ricboer0,
It is nice to see the online Review Configurator. However, the outputted configuration file has an incorrect value format that leads to errors when running the code. Please see below.

The online configurator did not output the correct format

Error loading project configuration: toml: line 27 (last key "project.llm.1.temperature"): incompatible types: TOML value has type string; destination has type float
Error: toml: line 27 (last key "project.llm.1.temperature"): incompatible types: TOML value has type string; destination has type float

Namely, the following lines in the TOML file

temperature = "0.01"
tpm_limit = "0"
rpm_limit = "0"

should be

temperature = 0.01
tpm_limit = 0
rpm_limit = 0

Also, if the online Review Configurator could automatically format the path string for users, it would eliminate potential errors like the one shown below.

Error loading project configuration: toml: line 9 (last key "project.configuration.results_file_name"): expected eight hexadecimal digits after '\U', but got "C:\\Us" instead
Error: toml: line 9 (last key "project.configuration.results_file_name"): expected eight hexadecimal digits after '\U', but got "C:\\Us" instead
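
In the meantime, a workaround on the user side is to rely on TOML literal strings or forward slashes for Windows paths. The lines below are a generic TOML sketch (not specific to prismAId), and the path shown is just an illustrative placeholder:

# Basic (double-quoted) TOML strings treat backslashes as escape sequences,
# which is what triggers the '\U' error above.
# Literal (single-quoted) strings keep backslashes as-is:
results_file_name = 'C:\Users\example\review\output_test'
# Escaped backslashes or forward slashes also parse without errors:
# results_file_name = "C:\\Users\\example\\review\\output_test"
# results_file_name = "C:/Users/example/review/output_test"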

@philip928lin

Hi @ricboer0,

I also tried to play with the Zotero Integration. I appreciate this feature. However, more detailed steps and expected outputs can be provided in the documentation to ease usage.

  1. A minor correction in the documentation: "Privacy Tab" should be "Security Tab"
    Pasted image 20241227115411

  2. Adding the example Zotero configuration block to the documentation would be helpful. E.g.,

[project.zotero]
user = ""
api_key = ""
group = ""
  3. I could not see the review results when running with Zotero integration.
    I tried to follow the instructions to run the literature review using Zotero integration. After running, I can see that the "zotero" folder is created with the extracted texts as described in the documentation, but I had a hard time finding the review output, such as "output_test.csv" or "lit_test_summary.txt", at the given output path. No error messages were shown. I was using the Python wrapper and called the Python script through the command line. @jhculb, do you have the same issue here?

@ricboer0, I would appreciate more detailed instructions and examples on the Zotero integration feature.

Also, as another side note, when I executed the Python script in an IDE like Spyder, it restarted the kernel in the middle of the process every time. However, this may be due to an internal issue with the IDE. I can successfully run the Python script when it is called directly from the command line.

@ricboer0

Hi @ricboer0,
I'm getting a 404 on the github.io documentation link in the readme: https://open-and-sustainable.github.io/prismAId/
Is this just me? @philip928lin

@jhculb I encountered the same issue as you. Multiple hyperlinks in the readme have the same issue. @ricboer0, please double-check the links.

Hi @jhculb and @philip928lin,
thank you so much for all the useful comments and suggestions.

I am working to double-check and fix the links to the documentation website:
open-and-sustainable/prismaid#107
I will let you know when the fixes are ready and merged.

@ricboer0

Hi @ricboer0,

Your paper may be improved with a brief survey of past and current academic information extraction tools, as at the moment the state of the field section of the paper feels lacking.

Perhaps LISC https://github.com/lisc-tools/lisc ? Or AutoIE: https://doi.org/10.1007/978-981-97-5495-3_32 ?

Hi @jhculb,
I did include a review of AI tools and approaches to SLRs in the first submission of the paper to JOSS. It was actually a review of reviews on the topic because it is already a vast field in the literature.

However, following guidelines from the editor @crvernon I significantly reduced the length of the paper and removed that section (among others). You can read that 'introduction' section in the preprint that corresponds to the first submission of this paper at https://doi.org/10.31222/osf.io/wh8qn.

I am open to any suggestion. In this shorter version of the paper, I already reference the extended introduction available in the preprint for a more complete description of alternative approaches; if this is not enough, I would be happy to bring that removed section back into the paper. It is just a matter of space constraints, if they matter.

@ricboer0

Hi @ricboer0, It is nice to see the online Review Configurator. However, the outputted configuration file has an incorrect value format that leads to errors when running the code. Please see below.

The online configurator did not output the correct format

Error loading project configuration: toml: line 27 (last key "project.llm.1.temperature"): incompatible types: TOML value has type string; destination has type float Error: toml: line 27 (last key "project.llm.1.temperature"): incompatible types: TOML value has type string; destination has type float

Namely, the following lines in the TOML file

temperature = "0.01"
tpm_limit = "0"
rpm_limit = "0"

should be

temperature = 0.01
tpm_limit = 0
rpm_limit = 0

Also, if the online Review Configurator can automatically format the path string for users, it can eliminate the potential errors as shown below.

Error loading project configuration: toml: line 9 (last key "project.configuration.results_file_name"): expected eight hexadecimal digits after '\U', but got "C:\\Us" instead
Error: toml: line 9 (last key "project.configuration.results_file_name"): expected eight hexadecimal digits after '\U', but got "C:\\Us" instead

Hi @philip928lin,
thanks for finding the bug and the suggestion on path formatting. I will work on these issues at open-and-sustainable/prismaid#108 and let you know once finished.

@ricboer0

Hi @ricboer0,

I also tried to play with the Zotero Integration. I appreciate this feature. However, more detailed steps and expected outputs can be provided in the documentation to ease usage.

1. A minor correction in the documentation: "Privacy Tab" should be "Security Tab"
   Pasted image 20241227115411 (screenshot of the Zotero Security tab; image link omitted)

2. Adding the example Zotero configuration block to the documentation would be helpful. E.g.,
[project.zotero]
user = ""
api_key = ""
group = ""
3. I could not see the review results when running with Zotero integration.
   I tried to follow the instructions to run the literature review using Zotero integration. After running, I can see that the "zotero" folder is created with the extracted texts as described in the documentation, but I had a hard time finding the review output, such as "output_test.csv" or "lit_test_summary.txt", at the given output path. No error messages were shown. I was using the Python wrapper and called the Python script through the command line. @jhculb, do you have the same issue here?

@ricboer0, I would appreciate more detailed instructions and examples on the Zotero integration feature.

Also, as another side note, when I executed the Python script in an IDE like Spyder, it restarted the kernel in the middle of the process every time. However, this may be due to an internal issue with the IDE. I can successfully run the Python script when it is called directly from the command line.

Hi @philip928lin,
On issues 1 and 2, I will improve the documentation in open-and-sustainable/prismaid#109.

On issue 3: since you can find the Zotero manuscripts on your machine, the Zotero integration is properly configured and actually working: it connects to the API, finds the collection, downloads the PDFs, and attempts the conversion to .txt.
Though I don't know anything about Spyder, it seems like you are encountering an error later on, either in the execution of the review logic or in the results-saving phase.
Please set the logging level to medium or high: through log inspection you should be able to spot the problem quite easily.
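
For reference, a minimal sketch of what the relevant configuration block might look like is given below; the key names (log_level in particular) are assumptions based on the documentation and the error messages earlier in this thread, so please check them against the docs rather than treating this as the definitive schema:

[project.configuration]
# Assumed key name for the logging level; the documentation describes low/medium/high levels.
log_level = "high"
# Key name taken from the TOML error reported earlier in this thread;
# the review results (e.g., output_test.csv) should be written based on this name.
results_file_name = "output_test"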

@ricboer0

Hi @jhculb and @philip928lin,
I have completed and merged the changes for all the issues I opened to incorporate your suggestions, and I have also rebased the JOSS submission branch.

Please verify if the changes address the comments you raised.

Additionally, could you let me know if I can extend the paper to include a review of the state of the field? This would help to clearly highlight the novelty of this tool, but I believe it cannot be added at the expense of other sections in the paper, as they are all at least as essential.

@jhculb

jhculb commented Dec 30, 2024

Hi @ricboer0,

Your paper may be improved with a brief survey of past and current academic information extraction tools, as at the moment the state of the field section of the paper feels lacking.

Perhaps LISC https://github.com/lisc-tools/lisc ? Or AutoIE: https://doi.org/10.1007/978-981-97-5495-3_32 ?

Hi @jhculb,
I did include a review of AI tools and approaches to SLRs in the first submission of the paper to JOSS. It was actually a review of reviews on the topic because it is already a vast field in the literature.

However, following guidelines from the editor @crvernon I significantly reduced the length of the paper and removed that section (among others). You can read that 'introduction' section in the preprint that corresponds to the first submission of this paper at https://doi.org/10.31222/osf.io/wh8qn.

I am open to any suggestion. In this shorter version of the paper, I already reference the extended introduction available in the preprint for a more complete description of alternative approaches; if this is not enough, I would be happy to bring that removed section back into the paper. It is just a matter of space constraints, if they matter.

Hi @ricboer0,

I will defer to the editor on this (@crvernon - your thoughts?), but I understand that the focus of JOSS's 'State of the field' section is the availability of similar software and how your tool is distinct from it.

From the review checklist: "State of the field: Do the authors describe how this software compares to other commonly-used packages?"

Having a quick look at your preprint, I see an interesting review of the field but no explicit software cited, which may be the reason @crvernon asked for it to be condensed.

I personally would recommend that a short review of popular or distinguished software packages/applications providing similar functionality be included. Otherwise, a fine paper, I believe :)

@crvernon

@editorialbot generate pdf

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@crvernon

Concerning the length of the paper and the statement of need:

I personally would recommend a short review of popular or distinguished software packages / applications providing similar functionality be included.

I think this is a reasonable request. In JOSS we want to keep the paper as succinct as possible and encourage the procedural and in-depth descriptive content to be placed in the software's documentation, for lasting benefit that will hopefully be improved as the software evolves. This makes the JOSS paper more powerful too, since it will not be filled with content that may quickly become outdated as versions of the software change; instead, it focuses on why this software is needed and why it is being introduced as open-source community research software.

On this note, it is good to cite what is comparable and the value that you believe that your software is adding. JOSS does not require uniqueness, but the hope is that this transparency in the statement of need section will give users context to quickly decide if the proposed software is one that may benefit their own research.

Thanks!

@ricboer0

ricboer0 commented Jan 2, 2025

@editorialbot generate pdf

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@ricboer0

ricboer0 commented Jan 2, 2025

Concerning the length of the paper and the statement of need:

I personally would recommend a short review of popular or distinguished software packages / applications providing similar functionality be included.

I think this is a reasonable request. In JOSS we want to keep the paper as succinct as possible and encourage the procedural and in-depth descriptive content to be placed in the software's documentation, for lasting benefit that will hopefully be improved as the software evolves. This makes the JOSS paper more powerful too, since it will not be filled with content that may quickly become outdated as versions of the software change; instead, it focuses on why this software is needed and why it is being introduced as open-source community research software.

On this note, it is good to cite what is comparable and the value that you believe that your software is adding. JOSS does not require uniqueness, but the hope is that this transparency in the statement of need section will give users context to quickly decide if the proposed software is one that may benefit their own research.

Thanks!

@crvernon, @jhculb, and @philip928lin,
following your comments and guidelines, I added a concise state of the field section to the paper. See the pdf generated above.

Please let me know if there are any other points where the paper can be improved.
Thanks a lot

@crvernon

crvernon commented Jan 3, 2025

@editorialbot generate pdf

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈
