Problem of implementation of InstanceNorm1d in PyTorch Geometric #705
Comments
Hi, `x_norm = (x - x_mu) / x_sigma` yields the correct results. Regarding normalization, it can be applied by normalizing the aggregation (as in GCN) or applied afterwards on the node features (as in GIN). For normalization on node features, I have yet to see anything other than BatchNorm.
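The second option mentioned above, normalizing the node features afterwards, can be sketched in plain Python. This is a minimal illustration with a hypothetical helper name, not PyTorch Geometric's actual implementation; it shifts each feature column to mean 0 and scales it to (biased) standard deviation 1, BatchNorm-style over all nodes:

```python
import math

def normalize_features(x, eps=1e-5):
    """BatchNorm-style normalization over node features.

    x: num_nodes x num_features matrix (list of lists).
    Each feature column is shifted to mean 0 and scaled to
    (biased) standard deviation 1, with eps for numerical safety.
    """
    n = len(x)
    num_features = len(x[0])
    out = [[0.0] * num_features for _ in range(n)]
    for j in range(num_features):
        col = [row[j] for row in x]
        mu = sum(col) / n
        var = sum((v - mu) ** 2 for v in col) / n  # biased estimator, as in norm layers
        sigma = math.sqrt(var + eps)
        for i in range(n):
            out[i][j] = (x[i][j] - mu) / sigma
    return out
```

A learnable norm layer would additionally apply a per-feature affine transform (gamma, beta) after this step; the sketch omits that.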
Hello @rusty1s. First, I apologize. But... the
Well, first of all, you index the wrong features IMO. In addition, for
Oops, I got it! `batch = torch.repeat_interleave(torch.tensor([12, 6, 12]))`, and after I fixed it, all the... so this is alright. I should be more careful next time.
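For context on the fix above: a batch vector should hold one graph index per node, so for graphs of 12, 6, and 12 nodes it should be twelve 0s, six 1s, and twelve 2s. A pure-Python analogue of `torch.repeat_interleave(torch.arange(len(sizes)), torch.tensor(sizes))`, with a hypothetical helper name:

```python
def make_batch(graph_sizes):
    """Build the batch vector mapping each node to its graph index.

    make_batch([12, 6, 12]) -> [0]*12 + [1]*6 + [2]*12
    """
    batch = []
    for g, size in enumerate(graph_sizes):
        batch.extend([g] * size)  # repeat the graph index once per node
    return batch
```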
❓ Questions & Help
hello @rusty1s @Magnetar99:
I managed to implement the `InstanceNorm1d` shown in #687, and I wrote a demo program... Here are my code, my implementation, and some errors...
my code
This is my code, and I think it is like the code provided in #687.
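The actual code was attached as a screenshot and is not recoverable here. As a stand-in, here is a minimal pure-Python sketch (hypothetical helper name) of what instance normalization over a batched node-feature matrix does: each feature of each graph is normalized independently, using the biased standard deviation as the usual norm layers do:

```python
import math

def instance_norm_1d(x, batch, eps=1e-5):
    """Per-graph, per-feature normalization (instance-norm style).

    x:     num_nodes x num_features matrix (list of lists)
    batch: graph index for each node, e.g. [0, 0, 1, 1, 1]
    """
    num_features = len(x[0])
    # group node indices by graph
    groups = {}
    for i, g in enumerate(batch):
        groups.setdefault(g, []).append(i)
    out = [row[:] for row in x]
    for nodes in groups.values():
        n = len(nodes)
        for j in range(num_features):
            col = [x[i][j] for i in nodes]
            mu = sum(col) / n
            var = sum((v - mu) ** 2 for v in col) / n  # biased estimator
            sigma = math.sqrt(var + eps)
            for i in nodes:
                out[i][j] = (x[i][j] - mu) / sigma
    return out
```

After this, each graph's features have mean approximately 0 and standard deviation approximately 1 (exactly so only up to the eps term).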
error?
When I wanted to check what was happening, i.e. what the `mean` and `std` are, I found a problem. 👇 ScreenShots

what it should be
As we all know, after normalization the output's mean should be 0 and the output's standard deviation should be 1.
And why does such a thing happen? (I know this "can be regarded as" a small problem, since there is only a slight difference from what it should be, but I just wonder why.)
Are there any papers or ideas about normalization on graph data?
I know that in Thomas Kipf's GCN paper, `A + I` is called the "renormalization trick", but perhaps that trick is not the normalization in this issue. So I just want to ask if there are some papers about normalization of the "features" of the graph (not normalization of the "structure", i.e. methods like the renormalization trick)?

Appendix: `mean` and `std` in numpy and pytorch
While looking for the reasons, I tried `mean` and `std` in numpy and in pytorch.
Take the "simple" 1-d array/tensor `[1., 2., 3., 4.]` as an example.
numpy: (ScreenShot 👇)
pytorch: (ScreenShot 👇)
pytorch_scatter: (ScreenShot 👇; sorry, this is too long, and I did not capture the code part)
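The discrepancy the screenshots show most likely comes from the default divisor: `np.std` uses the population (biased, ddof=0) estimator by default, while `torch.Tensor.std` defaults to the unbiased (sample, ddof=1) one. The stdlib `statistics` module exposes both, so the difference can be reproduced without either library:

```python
import statistics

data = [1.0, 2.0, 3.0, 4.0]

mu = statistics.fmean(data)        # 2.5 in both libraries
pop_std = statistics.pstdev(data)  # ddof=0, numpy's np.std default: sqrt(5/4) ~ 1.1180
smp_std = statistics.stdev(data)   # ddof=1, torch's .std() default:  sqrt(5/3) ~ 1.2910
```

This also explains why a normalized output's std comes out as almost-but-not-exactly 1: normalizing with one estimator and measuring with the other (or including an eps term) leaves a small residual.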
What I want to express here is: in image normalization (Batch Normalization or Instance Normalization), we sum up all the pixels' values and then compute the statistics. But here, when I run `scatter_std`, the `feature_dimension` is not reduced (only the `dim=` dimension is reduced)? And I think that /4 vs. /(4-1) is not the key point, since that is a matter of statistics, or numerical analysis (as n → ∞, the two estimators can be viewed as the same...).
final note:
#684 may be meaningless, and some of the thoughts may have no use (perhaps some are too ahead of their time, i.e. few people may pay attention to them), so I closed that issue. Many apologies...
yours sincerely,
@WMF1997