Remote sensing examples #306

Merged · 58 commits · Jun 21, 2024

Conversation

AmmarMian (Contributor)

A basic example of using pyRiemann to cluster both hyperspectral images (SPD matrices) and PolSAR images (HPD matrices).
Any input on the coding style is welcome.

I had to rebase to get all the changes from the pyRiemann master branch, so I have lots of commits. Please let me know if you'd prefer a cleaner version starting from the current pyRiemann.
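
For readers skimming this PR, here is a minimal sketch of the kind of workflow the example describes: estimate one SPD covariance matrix per pixel neighbourhood and cluster them with a Riemannian k-means inside a sklearn Pipeline. Only `Covariances` and `Kmeans` are actual pyRiemann classes; the `sliding_windows` helper, the window size and the toy cube are illustrative assumptions, not code from the PR.

```python
import numpy as np
from sklearn.pipeline import Pipeline
from pyriemann.estimation import Covariances
from pyriemann.clustering import Kmeans

def sliding_windows(image, window=5):
    """Hypothetical helper: stack each (window x window) neighbourhood as a
    (n_bands, window**2) sample, one sample per valid pixel."""
    h, w, n_bands = image.shape
    half = window // 2
    samples = []
    for i in range(half, h - half):
        for j in range(half, w - half):
            patch = image[i - half:i + half + 1, j - half:j + half + 1]
            samples.append(patch.reshape(-1, n_bands).T)
    return np.array(samples)  # shape (n_pixels, n_bands, window**2)

image = np.random.rand(32, 32, 6)            # toy hyperspectral cube
X = sliding_windows(image, window=5)

pipeline = Pipeline([
    ("cov", Covariances(estimator="scm")),    # one SPD matrix per pixel
    ("kmeans", Kmeans(n_clusters=3, metric="riemann")),
])
pipeline.fit(X)
labels = pipeline.predict(X)                  # cluster label per pixel
```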

AmmarMian and others added 30 commits February 6, 2024 17:44
* complete test for time delay cov

* update whatsnew

* speedup TimeDelayCovariances removing a for loop

* Update pyriemann/estimation.py

Co-authored-by: Alexandre Gramfort <[email protected]>

---------

Co-authored-by: Alexandre Gramfort <[email protected]>
* remove warnings from examples and tests

* correct flake8
* improve checking of metric arguments

* improve checking of metric arguments bis

* improve error messages

* last modifs

* apply code review remarks
* Enhance ajd module

* correct whatsnew

* improve code

* complete refs

* mean ale supporting hpd matrices

* remove support of hpd matrices for ajd_pham

* last modifs

* update version number

* modify n_iter_max of ajd_pham

* apply suggestions from code review

Co-authored-by: Vasco Schiavo <[email protected]>

* correct

---------

Co-authored-by: Vasco Schiavo <[email protected]>
* use a unique function check_function

* complete doc
* change range of window lengths

* improve imports in examples

* remove useless imports

* add blank line
* improve viz module

* move plot_cov_ellipse into viz module

* add plot_bihist in viz

* add reference

* guarantee to have 0.5 value in bin edges

* add matplotlib in requirements

* complete whatsnew

* correct indentation

* add plot_scatter

Co-authored-by: gcattan <[email protected]>

* correct flake8

* ensure to draw line y=x

---------

Co-authored-by: gcattan <[email protected]>
* add class cross spectra

* complete whatsnew

* correct typo

* improve doc
* Add kernel option and tests

* Update whatsnew.rst

* Update whatsnew.rst

* Apply suggestions from code review

Co-authored-by: Quentin Barthélemy <[email protected]>

* lint

* improve whatsnew

* Apply suggestions from code review

Co-authored-by: Quentin Barthélemy <[email protected]>

* linting

* add none kernel test

* might be needed for flake8

* Update embedding.py

* Delete .idea/workspace.xml

* Update test_embedding.py

* Update test_embedding.py

* Apply suggestions from code review

Co-authored-by: Quentin Barthélemy <[email protected]>

* change error

* linting

---------

Co-authored-by: Quentin Barthélemy <[email protected]>
@AmmarMian (Contributor, Author)

@qbarthelemy Can you check now?
I believe I resolved the flake8 errors and added your suggestions and references.

I also split SAR and hyperspectral into two files because they are two types of images. Next I'll add an example of change detection rather than clustering, so that will also be a new file.

Ammar

@qbarthelemy mentioned this pull request on Jun 17, 2024
@qbarthelemy (Member)

Thank you @AmmarMian for all this work!
I made some small modifications, so don't forget to pull before committing and pushing.

My concern is the build time of the documentation: it went from 20-30 min to almost 1 hour
(see https://github.com/pyRiemann/pyRiemann/actions/workflows/deploy_ghpages.yml).
Is it possible to process only a sub-image of the SAR image?

Moreover, you currently apply a raw downsampling on the images, data = data[::7, ::7], but it generates spectral aliasing.
It is better to apply a decimation, i.e. an anti-aliasing low-pass filter followed by the downsampling.
We could use the resize function from Pillow, but it adds a dependency to the docs requirements. OK for you @agramfort?
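
For illustration, a hedged sketch of the decimation suggested here, assuming a real-valued band and Pillow's LANCZOS resampling as the low-pass step (the function name, the toy data and the factor 7 are placeholders, not code from the PR):

```python
import numpy as np
from PIL import Image

def decimate_band(band, factor=7):
    """Anti-aliased downsampling of one 2D band via Pillow's LANCZOS resample,
    as opposed to raw slicing band[::factor, ::factor]."""
    h, w = band.shape
    img = Image.fromarray(band.astype(np.float32))            # mode "F"
    small = img.resize((w // factor, h // factor),
                       resample=Image.Resampling.LANCZOS)     # low-pass + subsample
    return np.asarray(small)

data = np.random.rand(700, 700)
aliased = data[::7, ::7]               # raw downsampling, spectral aliasing
decimated = decimate_band(data, 7)     # filtered, then downsampled
```

For complex-valued SAR data this would have to be applied separately to the real and imaginary parts, which is part of the concern raised below.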

@agramfort (Member)

agramfort commented Jun 19, 2024 via email

@AmmarMian (Contributor, Author)

AmmarMian commented Jun 19, 2024

I had tuned the downsampling so that it doesn't take too long on my PC, but I guess the resources provided by GitHub Actions are quite limited. I reduced the sampling and the max_iter a bit more and removed some methods (we don't need to compare all estimation strategies; instead I left a variable estimator that the user can change to play with that aspect).

Finally, I tweaked the scripts a bit to visualise the images from the data without downsampling and the results with downsampling. I do not really want to use filters because the data is complex-valued for SAR, so it is not obvious how to approach it. More importantly, the method relies on statistical independence of the pixels, which filtering could compromise.

To visualise the results, I propose to use pcolormesh over a spatial meshgrid, since we know the pixel resolution thanks to the documentation of the datasets. We can then see the results over the same area and also see the edges lost because of the sliding-window approach.
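
A rough sketch of the proposed visualisation, assuming placeholder pixel spacings, downsampling step and border width (not the values used in the example):

```python
import numpy as np
import matplotlib.pyplot as plt

dx, dy = 1.6, 0.6            # placeholder pixel spacing in metres (range, azimuth)
step, border = 7, 8          # placeholder downsampling step and lost border (pixels)
labels = np.random.randint(0, 3, (54, 69))   # toy clustering result
ny, nx = labels.shape

# cell edges of the spatial meshgrid actually covered by the sliding-window result
x = (border + step * np.arange(nx + 1)) * dx
y = (border + step * np.arange(ny + 1)) * dy
X, Y = np.meshgrid(x, y)

fig, ax = plt.subplots()
ax.pcolormesh(X, Y, labels, shading="flat", cmap="tab10")
ax.set_xlabel("Range (m)")
ax.set_ylabel("Azimuth (m)")
ax.set_aspect("equal")
plt.show()
```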

@qbarthelemy (Member)

Thanks for all the great advice about radar image processing!
The documentation build time is now OK.

@antoinecollas left a comment

Well done @AmmarMian 👍🏻! I really like that you can do everything with a sklearn Pipeline. It opens many perspectives: better evaluation, comparison of estimators, ...

Overall, you should replace pca_image with sklearn.decomposition.PCA and put it directly in PCAImage.
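
A rough sketch of what this suggestion amounts to, assuming PCAImage operates on an (h, w, n_bands) array; the class body below is illustrative only, not the code merged in this PR:

```python
import numpy as np
from sklearn.base import BaseEstimator, TransformerMixin
from sklearn.decomposition import PCA

class PCAImage(BaseEstimator, TransformerMixin):
    """Reduce the spectral dimension of an (h, w, n_bands) image with PCA."""

    def __init__(self, n_components=4):
        self.n_components = n_components

    def fit(self, X, y=None):
        h, w, n_bands = X.shape
        self.pca_ = PCA(n_components=self.n_components)
        self.pca_.fit(X.reshape(h * w, n_bands))   # flatten spatial dims
        return self

    def transform(self, X):
        h, w, n_bands = X.shape
        reduced = self.pca_.transform(X.reshape(h * w, n_bands))
        return reduced.reshape(h, w, self.n_components)

image = np.random.rand(64, 64, 10)
reduced = PCAImage(n_components=4).fit_transform(image)   # (64, 64, 4)
```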

Review threads on examples/remote-sensing/helpers/processing_helpers.py (3 comments, outdated, resolved)
@AmmarMian (Contributor, Author)

AmmarMian commented Jun 20, 2024

OK, so following the input from @antoinecollas, I removed pca_image and used sklearn.decomposition.PCA with a reshape. I also added the docstring for the transform of LabelsToImage.

@AmmarMian (Contributor, Author)

Thanks for the changes @qbarthelemy. Let me know if anything more is needed before merging!

@qbarthelemy (Member)

I made some simplifications on the PCA.

My last question is about RemoveMeanImage: this mean removal seems redundant with the centering applied in PCAImage.
If so, we could delete RemoveMeanImage.
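
As a quick check of this point: sklearn's PCA stores and subtracts the per-feature mean itself, so a separate mean-removal step before it is indeed redundant (illustrative snippet, not code from the PR):

```python
import numpy as np
from sklearn.decomposition import PCA

X = np.random.rand(200, 6) + 3.0              # deliberately non-centred data
pca = PCA(n_components=2).fit(X)

# transform(X) is (X - pca.mean_) @ pca.components_.T, i.e. centering is built in
manual = (X - pca.mean_) @ pca.components_.T
assert np.allclose(pca.transform(X), manual)
```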

@AmmarMian (Contributor, Author)

Indeed, you are quite right: I reused code I had lying around, but in this case there is no need for it. We can remove it from the pipeline. Let me do that now and commit again.

@qbarthelemy (Member) left a comment

Once again, a huge thank you for this major contribution to pyRiemann!

@qbarthelemy merged commit 205662a into pyRiemann:master on Jun 21, 2024
10 checks passed