Hi 😅
In `image_attention.py` there are three classes, but they are not being used at all; there is no place in the code where they are called from another file or function. (I think we don't need them because we use the MFH model.)

We only use the `top_down_attention` class, in the `build_image_attention_module` function.

My questions are: are they redundant? And if we wanted to use the ordinary `concatenate_attention` or `project_attention`, should I modify the `build_image_attention_module` function to do so?
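For example, should it become something like the sketch below? (This is only my rough idea, not the repository's actual code: the `att_type` config key and the constructor arguments are assumptions I made for illustration; the real signatures in `image_attention.py` may differ.)

```python
# Minimal sketch, NOT the repository's actual code: the "att_type" key and
# the constructor arguments below are illustrative assumptions only.
from image_attention import (  # assumed import path within this repo
    concatenate_attention,
    project_attention,
    top_down_attention,
)


def build_image_attention_module(config, image_feat_dim, ques_emb_dim):
    # Hypothetical config key selecting which attention class to build;
    # defaults to the class that is currently the only one ever used.
    att_type = config.get("att_type", "top_down_attention")

    if att_type == "top_down_attention":
        # Current behaviour: only this class is constructed.
        return top_down_attention(config, image_feat_dim, ques_emb_dim)
    elif att_type == "concatenate_attention":
        return concatenate_attention(image_feat_dim, ques_emb_dim,
                                     config["hidden_size"])
    elif att_type == "project_attention":
        return project_attention(image_feat_dim, ques_emb_dim,
                                 config["hidden_size"])
    else:
        raise NotImplementedError("unknown attention type: %s" % att_type)
```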