
add per-image overlap (pimo) #1220

Conversation

@jpcbertoldo (Contributor) commented Jul 27, 2023

Description

Per-Image Overlap (PImO)

Adding the metric Per-Image Overlap (PImO) and its area under the curve (AUC).

At each given threshold:

  • X-axis: in-image FPR averaged over the normal images (equivalent to the set FPR of the normal images).
  • Y-axis: overlap between the ground-truth 'anomalous' class and the predicted mask (in-image TPR).

I chose the term 'overlap' instead of TPR because it echoes the Per-Region Overlap (PRO). A rough sketch of how the curves could be computed is shown below.
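To make this concrete, here is a minimal sketch of how the curves could be computed (illustrative only, not the implementation in this PR; the function name pimo_curves and the tensor shapes are assumptions):

```python
# Minimal sketch of the PImO curves (illustrative, not the code in this PR).
# Assumed shapes: anomaly_maps (N, H, W) float scores, masks (N, H, W) binary
# ground truth, thresholds (T,) shared by all images.
import torch


def pimo_curves(anomaly_maps: torch.Tensor, masks: torch.Tensor, thresholds: torch.Tensor):
    """Return (shared_fpr, per_image_tpr): one shared x-axis, one TPR curve per image."""
    masks = masks.bool()
    is_normal = masks.flatten(1).sum(dim=1) == 0  # images without anomalous pixels

    shared_fpr = torch.empty(len(thresholds))
    per_image_tpr = torch.full((len(anomaly_maps), len(thresholds)), torch.nan)

    for t_idx, thresh in enumerate(thresholds):
        preds = anomaly_maps >= thresh  # binary predictions at this threshold

        # X-axis: in-image FPR averaged over the normal images only.
        fp = (preds[is_normal] & ~masks[is_normal]).flatten(1).sum(dim=1)
        neg = (~masks[is_normal]).flatten(1).sum(dim=1)
        shared_fpr[t_idx] = (fp / neg).mean()

        # Y-axis: per-image overlap (in-image TPR) for the anomalous images.
        tp = (preds[~is_normal] & masks[~is_normal]).flatten(1).sum(dim=1)
        pos = masks[~is_normal].flatten(1).sum(dim=1)
        per_image_tpr[~is_normal, t_idx] = tp / pos

    return shared_fpr, per_image_tpr
```

The per-image AUC (AUPImO) can then be taken with something like torch.trapz(per_image_tpr[i], shared_fpr), flipping the arrays first if the FPR axis comes out in descending order.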


binclf curve

The per-image overlap (TPR) needs the binary classification matrix curve (binclf curve), which has to be computed per image, while all images must share the same thresholds (since the x-axis (FPR) is shared).

You'll find a weird-looking function using itertools to compute those binclf curves, which I found to be much faster on CPU than the two alternatives in torchmetrics (the latter is mentioned in the former's docs). The image below shows some WIP comparisons; a proper benchmark is on the way.

[Image: WIP speed comparison of the binclf curve implementations]

However, there is a tensor.cpu().numpy() call in there, and I only tested with tensors that were already on the CPU. With tensors on the GPU, that transfer might change which algorithm is fastest.
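For reference, a rough, fully vectorized alternative for the per-image binclf curve with shared thresholds looks like the sketch below (this is not the itertools version from this PR; scores and gts are assumed to be flattened per-image tensors):

```python
# Rough sketch of a per-image binclf curve with shared thresholds (illustrative only).
# scores: (N, P) flattened anomaly scores, gts: (N, P) binary ground truth,
# thresholds: (T,). Returns confusion-matrix counts of shape (N, T, 2, 2).
import torch


def per_image_binclf_curve(scores: torch.Tensor, gts: torch.Tensor, thresholds: torch.Tensor):
    gts = gts.bool()
    # Broadcast (N, 1, P) >= (1, T, 1) -> (N, T, P): one prediction slice per threshold.
    preds = scores.unsqueeze(1) >= thresholds.view(1, -1, 1)

    tp = (preds & gts.unsqueeze(1)).sum(dim=-1)
    fp = (preds & ~gts.unsqueeze(1)).sum(dim=-1)
    fn = (~preds & gts.unsqueeze(1)).sum(dim=-1)
    tn = (~preds & ~gts.unsqueeze(1)).sum(dim=-1)

    # binclf[i, t] = [[tn, fp], [fn, tp]] for image i at threshold t.
    return torch.stack([torch.stack([tn, fp], dim=-1), torch.stack([fn, tp], dim=-1)], dim=-2)
```

The obvious drawback is the (N, T, P) intermediate, which gets large for full-resolution masks and many thresholds; that is presumably where a loop-based (itertools) version can win on CPU.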

  • Fixes # (issue)

Changes

  • Bug fix (non-breaking change which fixes an issue)
  • Refactor (non-breaking change which refactors the code base)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

Checklist

  • My code follows the pre-commit style and check guidelines of this project.
  • I have performed a self-review of my code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing tests pass locally with my changes
  • I have added a summary of my changes to the CHANGELOG (not for minor changes, docs and tests).

@jpcbertoldo jpcbertoldo requested a review from samet-akcay as a code owner July 27, 2023 21:52
@jpcbertoldo (Contributor Author):

Other features (like choosing and plotting representative cases in the distribution of per-image metrics) are coming up, as well as unit tests for this.

@jpcbertoldo (Contributor Author):

About the binclf speed, I will also test with numba.

It could speed things up further, but it adds a new dependency. Is that acceptable, a priori?
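For the record, this is roughly the kind of numba kernel I have in mind, just as an idea (hypothetical sketch, not part of this PR):

```python
# Hypothetical numba variant to benchmark against the itertools/torchmetrics versions.
# Counts the confusion-matrix entries per threshold for a single flattened image.
import numba
import numpy as np


@numba.njit(parallel=True)
def binclf_one_image(scores: np.ndarray, gt: np.ndarray, thresholds: np.ndarray) -> np.ndarray:
    counts = np.zeros((thresholds.shape[0], 2, 2), dtype=np.int64)
    for t in numba.prange(thresholds.shape[0]):
        tp = 0
        fp = 0
        fn = 0
        tn = 0
        for p in range(scores.shape[0]):
            pred = scores[p] >= thresholds[t]
            if gt[p]:
                if pred:
                    tp += 1
                else:
                    fn += 1
            else:
                if pred:
                    fp += 1
                else:
                    tn += 1
        counts[t, 0, 0] = tn
        counts[t, 0, 1] = fp
        counts[t, 1, 0] = fn
        counts[t, 1, 1] = tp
    return counts
```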

@jpcbertoldo (Contributor Author):

Please ignore this one; it will disappear at some point as I clean up my code.

@jpcbertoldo jpcbertoldo requested a review from paularamo as a code owner July 28, 2023 16:30

@samet-akcay samet-akcay changed the base branch from main to feature/pimo August 1, 2023 11:15
@jpcbertoldo (Contributor Author):

@samet-akcay could you launch the tests here, please?

@jpcbertoldo (Contributor Author):

Looks like there is an issue with the example notebook I added because it is saved with my local kernel name.

What's an easy solution for this? (One possible fix is sketched after the log below.)

=================================== FAILURES ===================================
_ /home/user/actions-runner/_work/anomalib/anomalib/notebooks/500_use_cases/502_perimg_metrics.ipynb _
Error - No such kernel: 'anomalib-dev-gsoc'
------------------------------ Captured log call -------------------------------
WARNING  traitlets:kernelspec.py:286 Kernelspec name anomalib-dev-gsoc cannot be found!
ERROR    traitlets:manager.py:92 No such kernel named anomalib-dev-gsoc
Traceback (most recent call last):
  File "/home/user/actions-runner/_work/anomalib/anomalib/.tox/pre-merge-py310/lib/python3.10/site-packages/jupyter_client/manager.py", line 85, in wrapper
    out = await method(self, *args, **kwargs)
  File "/home/user/actions-runner/_work/anomalib/anomalib/.tox/pre-merge-py310/lib/python3.10/site-packages/jupyter_client/manager.py", line 397, in _async_start_kernel
    kernel_cmd, kw = await self._async_pre_start_kernel(**kw)
  File "/home/user/actions-runner/_work/anomalib/anomalib/.tox/pre-merge-py310/lib/python3.10/site-packages/jupyter_client/manager.py", line 359, in _async_pre_start_kernel
    self.kernel_spec,
  File "/home/user/actions-runner/_work/anomalib/anomalib/.tox/pre-merge-py310/lib/python3.10/site-packages/jupyter_client/manager.py", line 182, in kernel_spec
    self._kernel_spec = self.kernel_spec_manager.get_kernel_spec(self.kernel_name)
  File "/home/user/actions-runner/_work/anomalib/anomalib/.tox/pre-merge-py310/lib/python3.10/site-packages/jupyter_client/kernelspec.py", line 287, in get_kernel_spec
    raise NoSuchKernel(kernel_name)
jupyter_client.kernelspec.NoSuchKernel: No such kernel named anomalib-dev-gsoc
=========================== short test summary info ============================
FAILED notebooks/500_use_cases/502_perimg_metrics.ipynb:: - Error - No such k...
=================== 1 failed, 6 passed in 172.50s (0:02:52) ====================
pre-merge-py310: exit 1 (173.58 seconds) /home/user/actions-runner/_work/anomalib/anomalib> pytest --nbmake notebooks --ignore=notebooks/300_benchmarking --ignore=notebooks/400_openvino --ignore=notebooks/500_use_cases/501_dobot pid=38855
.pkg: _exit> python /home/user/actions-runner/_work/_tool/Python/3.10.12/x64/lib/python3.10/site-packages/pyproject_api/_backend.py True setuptools.build_meta
  pre-merge-py310: FAIL code 1 (949.71=setup[116.88]+cmd[659.25,173.58] seconds)
  evaluation failed :( (949.94 seconds)
Error: Process completed with exit code 1.
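One option (a sketch, not something already in this PR) is to rewrite the notebook's kernelspec metadata so it points at the default python3 kernel instead of a machine-specific one like anomalib-dev-gsoc:

```python
# Point the notebook at the default "python3" kernel so CI can find it.
# The path is taken from the failure log above.
import nbformat

path = "notebooks/500_use_cases/502_perimg_metrics.ipynb"
nb = nbformat.read(path, as_version=4)
nb.metadata["kernelspec"] = {
    "name": "python3",
    "display_name": "Python 3",
    "language": "python",
}
nbformat.write(nb, path)
```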

Contributor:

Maybe you could show some plots of the PImO curves in this notebook. You could use these plots to explain some of the limitations of the set metrics and why it is important to also look at the metrics on the image level. This would help others to see the added value of your metric.

@jpcbertoldo (Contributor Author):

I forgot to mention, but I'm adding more things in other branches to make smaller PRs, and I'm extending this notebook (including the things you mentioned).

samet-akcay and others added 4 commits August 3, 2023 20:36
* New printing stuff

* Remove dead code + address codacy issues

* Refactor try/except + log to comet/wandb during runs

* pre-commit error

* third-party configuration

---------

Co-authored-by: Ashwin Vaidya <[email protected]>
@jpcbertoldo jpcbertoldo requested a review from djdameln August 7, 2023 08:26
* ignore mask check when dataset has only normal samples

* update changelog
@jpcbertoldo (Contributor Author):

I did a refactor and moved it to another branch.

New PR: #1247

@jpcbertoldo jpcbertoldo closed this Aug 9, 2023
@jpcbertoldo jpcbertoldo deleted the jpcbertoldo/gsoc23-segementation-metrics branch February 9, 2024 12:21