Subtract merge layer #7573

Merged
merged 8 commits into keras-team:master, Aug 10, 2017
Conversation

@ericwu09 (Contributor) commented Aug 9, 2017

With the merge method being deprecated by the end of August 2017, there needs to be a way to define the subtraction of two layers. This Subtract layer takes exactly two tensors and returns their difference.
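For context, here is a minimal usage sketch of the proposed layer in the Keras 2 functional API; the shapes and variable names below are illustrative, not taken from this PR.

from keras.layers import Input, Dense, Subtract
from keras.models import Model

# Two inputs of the same shape; names and sizes are illustrative only.
input_1 = Input(shape=(16,))
input_2 = Input(shape=(16,))
x1 = Dense(8)(input_1)
x2 = Dense(8)(input_2)

# Subtract takes a list of exactly two tensors and returns x1 - x2.
diff = Subtract()([x1, x2])

out = Dense(1)(diff)
model = Model(inputs=[input_1, input_2], outputs=out)

Using a dedicated merge layer (rather than a raw backend subtraction) keeps the operation inside the model graph, so it is tracked, serialized, and reloaded with the rest of the model.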

Adding Subtract unit test
@ericwu09 changed the title from "Add Subtract merge layer" to "Subtract merge layer" on Aug 9, 2017
"""Layer that subtracts two inputs.

It takes as input a list of tensors of size 2,
both of the same shape, and returns
Contributor:

Not necessarily the same shape?

from keras import backend as K

x = K.ones([10, 10])
y = K.ones([1, 10])
x - y  # Does not raise.

Contributor Author:

Thanks, added support and unit test for shape mismatch.
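A rough sketch of what such a shape-mismatch (broadcasting) test might look like; this is a hypothetical illustration, not the actual test added in this PR.

import numpy as np
from keras.layers import Input, Subtract
from keras.models import Model

# Hypothetical broadcasting check (not the PR's test): the second input
# has a singleton dimension that should broadcast against the first.
i1 = Input(shape=(4, 5))
i2 = Input(shape=(1, 5))
out = Subtract()([i1, i2])
model = Model([i1, i2], out)

x1 = np.random.random((2, 4, 5))
x2 = np.random.random((2, 1, 5))
result = model.predict([x1, x2])
assert result.shape == (2, 4, 5)
assert np.allclose(result, x1 - x2, atol=1e-5)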


It takes as input a list of tensors of size 2,
both of the same shape, and returns
a single tensor (also of the same shape).
Collaborator:

Explicitly state that it returns inputs[0] - inputs[1].

Contributor Author:

Thanks, updated the documentation.
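The updated docstring would presumably read along these lines (a sketch reflecting the reviewer's suggestion, not necessarily the exact final wording):

"""Layer that subtracts two inputs.

It takes as input a list of tensors of size 2,
both of the same shape, and returns a single tensor,
(inputs[0] - inputs[1]), also of the same shape.
"""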

@ericwu09 (Author) commented Aug 9, 2017

All the tests passed, but the pull request failed. Is this related to this request, or possibly another issue?

@fchollet (Collaborator) commented Aug 9, 2017

It's a temporary file system-related issue with Travis. Cause unclear; maybe Travis itself, or a recent update to pytest.

fchollet previously approved these changes Aug 10, 2017
@fchollet merged commit 552727e into keras-team:master on Aug 10, 2017
ahundt added a commit to ahundt/keras that referenced this pull request Aug 11, 2017
* master:
  model.fit(steps_per_epoch), 	mnist_tfrecord.py, progbar np.mean (keras-team#7113)
  Change default size to allow different MobileNet sizes (keras-team#7586)
  cntk backend: fix the reversed rnn bug (keras-team#7593)
  Fix mask for multi output --> multi inputs (keras-team#7591)
  [RELNOTES] Subtract merge layer (keras-team#7573)
  update docker with cntk 2.1 (keras-team#7565)
  [RELNOTES] Move constraint management to be based on variable attributes (like TF). (keras-team#7564)
  Add handling for `dtype` arg in initializer config.
  Fix keras-team#7550. (keras-team#7552)
  remove unnecessary function definition (keras-team#7542)
  refactor the function - _convert_string_dtype (keras-team#7540)
  Add batchnorm tests for CNTK (keras-team#7534)
  Clean up RNN tests (keras-team#7529)
  Add merge tests (keras-team#7533)

# Conflicts:
#	examples/mnist_tfrecord.py
#	keras/engine/training.py
#	keras/utils/generic_utils.py
#	tests/keras/engine/test_training.py