[Feature] Add BaseInferencer to MMEngine (#874, merged)
gaotongxiao previously approved these changes on Jan 16, 2023
Commit messages in this pull request:

* Update BaseInferencer
* Fix ci
* Fix CI and rename iferencer to infer
* Fix CI
* Add renamed file
* Add test file
* Adjust interface sequence
* refine preprocess
* Update unit test
* Update unit test
* Update unit test
* Fix unit test
* Fix as comment
* Minor refine
* Fix docstring and support load image from different backend
* Support load collate_fn from downstream repos, refine dispatch
* Minor refine
* Fix lint
* refine grammar
* Remove FileClient
* Refine docstring
* add rich
* Add list_models
* Add list_models
* Remove backend args
* Minor refine
* Add preprocess inputs
* Add type hint
* update api/infer in index.rst
* rename preprocess_inputs to _inputs_to_list
* Fix doc format
* Update infer.py (Co-authored-by: Zaida Zhou <[email protected]>)
* first commit
* [Enhance] Support build model from weight
* minor refine
* Fix type hint
* refine comments
* Update docstring
* refine as comment
* Add method
* Refine docstring
* Fix as comment
* refine comments
* Refine warning message
* Fix unit test and refine comments
zhouzaida reviewed on Jan 16, 2023
zhouzaida reviewed on Jan 16, 2023
zhouzaida reviewed on Jan 16, 2023
zhouzaida approved these changes on Jan 16, 2023
zhouzaida added a commit that referenced this pull request on Jan 16, 2023
This reverts commit 2d8f2be.
Thanks for your contribution; we appreciate it a lot. The following instructions will make your pull request healthier and help it get feedback more easily. If you do not understand some items, don't worry: just open the pull request and ask the maintainers for help.
Motivation

Add BaseInferencer to provide an easy and clean interface for inference on single or multiple images.

How to use
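Supporting "single or multiple" inputs usually comes down to normalizing both cases to a list before running the pipeline. A minimal sketch of that idea (the helper name mirrors the `_inputs_to_list` rename mentioned in the commit log, but the body here is illustrative only, not MMEngine's actual implementation):

```python
def inputs_to_list(inputs):
    """Normalize a single item or a sequence of items to a list.

    Illustrative only: a real inferencer would also accept other
    input kinds (e.g. a directory of images or a numpy array).
    """
    if isinstance(inputs, (list, tuple)):
        return list(inputs)
    return [inputs]
```

With this in place, the rest of the workflow can always assume it is iterating over a list, whether the caller passed one image path or many.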
How to build an inferencer

Based on BaseInferencer, the __init__ of subclasses should accept at least 3 arguments. Besides a config, an Alias can be passed, in which case the config is inferred from the Alias; the config can also be given directly as a Config, ConfigDict, or dict instance.

The standard workflow of the inferencer
BaseInferencer implements the standard inference workflow in __call__. If downstream repos want to customize the workflow, they can override the __call__ method.

1. Preprocess data

   - Prepare the pipeline (abstract method). Subclasses should override _init_pipeline to customize the pipeline; the returned pipeline is used to process each single piece of data.
   - Prepare the collate_fn. BaseInferencer provides a common way to get collate_fn from the cfg. For more custom usage, this method can be overridden to get the target collate_fn.
   - Prepare the chunked data. Subclasses could override preprocess to build custom chunked data; BaseInferencer provides a common way to build the chunked data in _get_chunk_data. preprocess uses the prepared pipeline and collate_fn to return target chunked data, each item of which can be passed directly to model.test_step.
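The chunking step can be sketched as follows (the function and argument names here are assumptions for illustration, not MMEngine's actual `_get_chunk_data` signature):

```python
def get_chunk_data(inputs, pipeline, collate_fn, chunk_size):
    """Apply ``pipeline`` to each item and yield collated chunks.

    Illustrative sketch: each yielded chunk stands for one batch
    that could be fed to something like ``model.test_step``.
    """
    chunk = []
    for item in inputs:
        chunk.append(pipeline(item))
        if len(chunk) == chunk_size:
            yield collate_fn(chunk)
            chunk = []
    if chunk:  # flush the trailing partial chunk
        yield collate_fn(chunk)
```

Because it is a generator, data is processed lazily: a chunk is only built when the forward step asks for the next batch.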
2. Forward

   Inference with the chunked data. BaseInferencer calls model.test_step in forward by default.

3. Visualize (abstract method)

   Subclasses should implement visualize to visualize the results and return the visualization results.

4. Postprocess (abstract method)

   Subclasses should implement postprocess to get the results in the target format (DataSample or dict) together with the visualization results.
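Putting the steps together, here is a minimal self-contained mock of the preprocess -> forward -> visualize -> postprocess flow described above (all class, method, and argument names are assumptions for illustration; this is not MMEngine's actual implementation):

```python
class MockBaseInferencer:
    """Toy stand-in showing the shape of the __call__ workflow."""

    def __call__(self, inputs, batch_size=1):
        # preprocess -> forward -> visualize -> postprocess
        inputs = self._inputs_to_list(inputs)
        preds = []
        for batch in self.preprocess(inputs, batch_size):
            preds.extend(self.forward(batch))
        vis = self.visualize(inputs, preds)
        return self.postprocess(preds, vis)

    def _inputs_to_list(self, inputs):
        # Accept a single item or a sequence of items.
        return list(inputs) if isinstance(inputs, (list, tuple)) else [inputs]

    def preprocess(self, inputs, batch_size):
        # Apply the pipeline to each item and yield chunked batches.
        for i in range(0, len(inputs), batch_size):
            yield [self._pipeline(x) for x in inputs[i:i + batch_size]]

    def _pipeline(self, x):
        # Stands in for the pipeline returned by _init_pipeline.
        return x

    def forward(self, batch):
        # Stands in for model.test_step(batch).
        return [{"pred": x} for x in batch]

    def visualize(self, inputs, preds):
        # Abstract in the real class; a no-op here.
        return None

    def postprocess(self, preds, vis):
        # Abstract in the real class; pack the results here.
        return {"predictions": preds, "visualization": vis}
```

A downstream subclass would keep __call__ as-is and only fill in the pipeline, visualization, and postprocessing pieces for its own task.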
The coverage rate of the unit tests:
Modification
Please briefly describe what modification is made in this PR.
BC-breaking (Optional)
Does the modification introduce changes that break the backward-compatibility of the downstream repos?
If so, please describe how it breaks the compatibility and how the downstream projects should modify their code to keep compatibility with this PR.
Use cases (Optional)
If this PR introduces a new feature, it is better to list some use cases here, and update the documentation.
Checklist