
Running inference in batch on all videos via GUI #497

Closed · calebvogt opened this issue Mar 1, 2021 · 2 comments
Labels: enhancement (New feature or request)


calebvogt commented Mar 1, 2021

OS: Windows, running locally with GPU
SLEAP version: stable v1.0.10

Issue: I have trained a model and successfully used it to run inference on a single video entirely through the GUI. However, I cannot find a way to run inference over all videos in the project via the GUI. My sense from looking over the docs and notebooks is that most users do this via the command line. If it doesn't already exist, it would be awesome for a future SLEAP release to add a "Predict on: All Project Videos" option to the following drop-down menu:

[screenshot: the Predict drop-down menu in the SLEAP GUI]

For anyone else having this issue, here is an ugly but functional "batch.py" script you can use to run your model over all of your videos in batch. Note that batch.py should be placed in your project folder alongside the videos you want it to run on.

Copy and paste the following into a text editor and save as "batch.py" in your working directory. Make sure to edit the script to point to the folder containing your model before running it.

##########
import subprocess

# Loop over every .mp4 in the current folder and run sleap-track on each.
# CHANGE the model path below for your own single- or multi-animal model.
subprocess.call(
    'for %i in (*.mp4) do sleap-track "%i" -m "models/210226_161753.single_instance.2819"',
    shell=True,
)
##########


In the shell (Windows cmd):

cd /d C:\your\working\directory
conda activate sleap
python batch.py

Hopefully someone finds this temp fix useful. If there is another (more elegant) way to batch run inference please let me know!
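
A slightly tidier, cross-platform variant of the same idea is sketched below. My assumptions: sleap-track is on the PATH of the active sleap environment, the script is run from the folder containing the videos, and the model path from the script above is swapped for your own. By default sleap-track should save a .predictions.slp file next to each input video.

import glob
import subprocess

MODEL = "models/210226_161753.single_instance.2819"  # change to your model folder

for video in sorted(glob.glob("*.mp4")):
    # Run inference on each video in turn; check=True stops the loop
    # if sleap-track exits with an error.
    subprocess.run(["sleap-track", video, "-m", MODEL], check=True)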

arie-matsliah (Contributor) commented

Hi @calebvogt,
Thanks for the feedback and for writing up this workaround that could also help other SLEAP users.
We have a list of GUI improvements planned in our queue, and we'll also keep track of this enhancement request.

Cheers

arie-matsliah added the enhancement label on Mar 1, 2021
talmo (Collaborator) commented May 12, 2022

Closing this one, but follow #733 for progress on this issue.

talmo closed this as completed May 12, 2022