rpn_probs and rpn_class_logits foreground vs background #870
In rpn_class_loss_graph, the BG/FG labels arrive as -1/+1 (in rpn_match) and are then converted to 0/1, so index 0 is background and index 1 is foreground. The FG/BG comment in rpn_class_loss_graph should therefore be a typo.
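The conversion described above can be sketched as follows. This is a minimal NumPy sketch of the -1/+1 to 0/1 mapping, not the library's exact TensorFlow code:

```python
import numpy as np

# rpn_match labels per anchor: 1 = positive (FG), -1 = negative (BG), 0 = neutral
rpn_match = np.array([1, -1, 0, -1, 1])

# Convert -1/+1 labels to 0/1 class ids: only exact matches with 1 become FG.
# Class id 0 is background and class id 1 is foreground, i.e. BG/FG order,
# which matches the BG/FG ordering documented for rpn_probs in rpn_graph.
anchor_class = (rpn_match == 1).astype(np.int32)
print(anchor_class.tolist())  # [1, 0, 0, 0, 1]
```

Note that neutral anchors (label 0) also map to class 0 here; in the loss function they are excluded separately before the cross-entropy is computed.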
keineahnung2345 added a commit to keineahnung2345/Mask_RCNN that referenced this issue on Jan 10, 2019.
waleedka pushed a commit that referenced this issue on Jan 28, 2019.
luccauchon added a commit to luccauchon/Mask_RCNN that referenced this issue on Feb 4, 2019.
* Fix visualize activations (matterport#1211) (squashed 4 commits by @keineahnung2345). The snippet

      # Get activations of a few sample layers
      activations = model.run_graph([image], [
          ("input_image", model.keras_model.get_layer("input_image").output)
      ])

  led to the error `InvalidArgumentError: input_image:0 is both fed and fetched`. Revised the code according to https://stackoverflow.com/questions/39307108/placeholder-20-is-both-fed-and-fetched
* Fix typo in config.py
* Fix typo in rpn_class_loss_graph (fixes matterport#870)
* Reversed order of (1) and (2) in installation instructions (reference: matterport#1255)
* Remove unused code (matterport#1227)
Closed
aneeshchauhan pushed a commit to aneeshchauhan/Mask_RCNN that referenced this issue on Jul 9, 2019.
smsegal pushed a commit to smsegal/mask_rcnn that referenced this issue on Nov 18, 2020.
withyou53 pushed a commit to withyou53/mask_r-cnn_for_object_detection that referenced this issue on Sep 27, 2023.
A quick question about rpn_probs and rpn_class_logits. In the rpn_graph function, where rpn_probs is first defined, the docstring says the last dimension is separated into BG/FG. However, in the rpn_class_loss_graph function, the last dimension of rpn_class_logits is documented as FG/BG, the reverse of rpn_probs. Is there a reason why it is swapped?
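To see why the ordering matters, here is a minimal NumPy sketch (not the library's code) of a two-class softmax head. If the last dimension is ordered BG/FG, the foreground confidence lives at index 1, which is the column downstream code (e.g. the proposal scoring step) would slice; a docstring claiming FG/BG would mislead anyone reading that slice:

```python
import numpy as np

# Fake RPN logits for 3 anchors, last dimension ordered [BG, FG]
rpn_class_logits = np.array([[2.0, -1.0],   # likely background
                             [-1.0, 3.0],   # likely foreground
                             [0.0, 0.0]])   # undecided

# Softmax over the last dimension gives rpn_probs in the same BG/FG order
e = np.exp(rpn_class_logits - rpn_class_logits.max(axis=-1, keepdims=True))
rpn_probs = e / e.sum(axis=-1, keepdims=True)

# With BG/FG ordering, foreground scores are column 1
fg_scores = rpn_probs[:, 1]
print(np.round(fg_scores, 3).tolist())  # [0.047, 0.982, 0.5]
```

Since the labels in the loss map background to class 0 and foreground to class 1, the BG/FG wording in rpn_graph is the consistent one, and the FG/BG wording in rpn_class_loss_graph is the typo.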