
The reason for setting _lambda ascending ? #7

Open · huster-wgm opened this issue Dec 15, 2017 · 1 comment

huster-wgm commented Dec 15, 2017

I just ran the code with the default setting for the `_lambda` weight, and I observed only a tiny difference between the test accuracies on the source and target domains. On the other hand, a fixed value of 1 gives a better result.

Result of ascending `_lambda` (`_lambda = (e + 1) / EPOCHS`):

[plot: ascending lambda]

Result of descending `_lambda` (`_lambda = (EPOCHS - e) / EPOCHS`):

[plot: descending lambda]

Result of fixed `_lambda` (`_lambda = 1`):

[plot: fixed lambda = 1]
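
For clarity, here is a minimal sketch of the three schedules compared above. The helper function `get_lambda` is hypothetical (not code from this repo); `e` and `EPOCHS` follow the expressions quoted above.

```python
def get_lambda(e, EPOCHS, mode="ascending"):
    # Hypothetical helper reproducing the three schedules compared above;
    # e is the current epoch index (0-based), EPOCHS the total epoch count.
    if mode == "ascending":       # grows from 1/EPOCHS up to 1
        return (e + 1) / EPOCHS
    if mode == "descending":      # shrinks from 1 down to 1/EPOCHS
        return (EPOCHS - e) / EPOCHS
    return 1.0                    # fixed weight
```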

SSARCandy (Owner) commented
I set lambda ascending because I want the model to focus on the classification task at the beginning of training. In the original paper, lambda is set so that at the end of training the classification loss and the CORAL loss are roughly the same.

I think my linear lambda may be too naive; using a fixed lambda, as you suggest, could work too.
Thanks!
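
For reference, a sketch of how `_lambda` enters the Deep CORAL objective (variable names are illustrative, not verbatim code from this repo):

```python
def total_loss(classification_loss, coral_loss, _lambda):
    # Deep CORAL objective: _lambda trades off the classification
    # (cross-entropy) term against the CORAL (covariance alignment) term.
    # With an ascending _lambda, early epochs optimize mostly the
    # classification term; the CORAL term gains weight as training goes on.
    return classification_loss + _lambda * coral_loss
```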
