
Elixir style Piping #214

Closed
cgarciae opened this issue Jan 30, 2017 · 3 comments

Comments

@cgarciae

Libraries such as NumPy and TensorFlow feature a function-based API where most functions take an array or a tensor, respectively, as their first parameter. You end up writing code like this

import tensorflow as tf
from tensorflow.contrib import layers

x = tf.placeholder(tf.float32, shape=[None, 4])

net = layers.fully_connected(x, 32, activation_fn=tf.nn.elu)
net = layers.fully_connected(net, 32, activation_fn=tf.nn.elu)
net = layers.fully_connected(net, 16, activation_fn=tf.nn.elu)
net = layers.fully_connected(net, 8, activation_fn=tf.nn.elu)

h = layers.fully_connected(net, 4, activation_fn=tf.nn.softmax)

It would be nice if there were a way to do Elixir-style pipes (where the value on the left of the pipe is passed as the first parameter of the function on the right) in Coconut, to ease the use of these libraries. Let's imagine that ||> were an operator such that

a ||> f(b, c) == f(a, b, c)

then you could rewrite the example above as

import tensorflow as tf
from tensorflow.contrib import layers

x = tf.placeholder(tf.float32, shape=[None, 4])

h = (
  x 
  ||> layers.fully_connected(32, activation_fn=tf.nn.elu)
  ||> layers.fully_connected(16, activation_fn=tf.nn.elu)
  ||> layers.fully_connected(8, activation_fn=tf.nn.elu)

  ||> layers.fully_connected(4, activation_fn=tf.nn.softmax) #h
)
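Since ||> doesn't exist in Coconut, here is a minimal plain-Python sketch of the proposed semantics, using a hypothetical pipe_first helper (the helper name and the toy step functions are illustrative, not part of any library):

```python
def pipe_first(value, *steps):
    """Thread `value` through each (func, args, kwargs) step as the FIRST
    positional argument, mimicking the proposal a ||> f(b, c) == f(a, b, c)."""
    for func, args, kwargs in steps:
        value = func(value, *args, **kwargs)
    return value

# Toy stand-ins for first-argument-style library functions:
def scale(x, factor):
    return x * factor

def shift(x, offset=0):
    return x + offset

result = pipe_first(
    3,
    (scale, (2,), {}),           # 3 ||> scale(2)        -> 6
    (shift, (), {"offset": 4}),  # 6 ||> shift(offset=4) -> 10
)
# result == 10
```

Each step receives the running value in the first slot, exactly the threading the proposed operator would perform.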

If possible, I'd also like to suggest an operator for forward function composition, like Elm's >> operator. Let

f |>> g == g..f
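For illustration, the proposed forward composition (Coconut's .. operator with the arguments flipped) can be sketched in plain Python with a hypothetical forward_compose helper:

```python
def forward_compose(f, g):
    """Forward composition: (f |>> g)(x) == g(f(x)), i.e. g..f in Coconut."""
    return lambda x: g(f(x))

double = lambda x: x * 2
increment = lambda x: x + 1

double_then_increment = forward_compose(double, increment)
# double_then_increment(5) == increment(double(5)) == 11
```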
@evhub
Owner

evhub commented Jan 30, 2017

@cgarciae You should be able to use partial application for this. Translating your example above,

h = (
  x 
  ||> layers.fully_connected(32, activation_fn=tf.nn.elu)
  ||> layers.fully_connected(16, activation_fn=tf.nn.elu)
  ||> layers.fully_connected(8, activation_fn=tf.nn.elu)
  ||> layers.fully_connected(4, activation_fn=tf.nn.softmax)
)

becomes

h = (
  x 
  |> layers.fully_connected$(?, 32, activation_fn=tf.nn.elu)
  |> layers.fully_connected$(?, 16, activation_fn=tf.nn.elu)
  |> layers.fully_connected$(?, 8, activation_fn=tf.nn.elu)
  |> layers.fully_connected$(?, 4, activation_fn=tf.nn.softmax)
)

using the new ? version of partial application in develop.
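A rough plain-Python analogue of Coconut's f$(?, b, c) placeholder partial, for readers following along without Coconut (the partial_q helper and the toy fully_connected stand-in are illustrative assumptions, not real APIs):

```python
class _Placeholder:
    pass

Q = _Placeholder()  # stands in for Coconut's ?

def partial_q(func, *args, **kwargs):
    """Return a one-argument function whose argument fills the Q slot,
    mimicking f$(?, b, c): the piped value lands in the marked position."""
    def wrapper(value):
        filled = [value if a is Q else a for a in args]
        return func(*filled, **kwargs)
    return wrapper

def fully_connected(inputs, num_outputs, activation_fn=None):
    # Toy stand-in for layers.fully_connected, just to show the call shape.
    out = inputs * num_outputs
    return activation_fn(out) if activation_fn else out

step = partial_q(fully_connected, Q, 32)
# step(2) == fully_connected(2, 32) == 64
```

Coconut's |> then simply calls each such one-argument partial with the piped value, which is why the $(?, ...) form recovers the first-argument threading the original ||> proposal asked for.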

@evhub evhub added the feature label Jan 30, 2017
@cgarciae
Author

@evhub Excellent! You've made my day!

Another question: I couldn't get the @ operator to work, even on Python 3. I was trying to multiply two NumPy matrices.

@evhub
Owner

evhub commented Jan 31, 2017

@cgarciae Thanks, glad to hear it! As for the @ operator, make sure you're passing --target 3.5 to specify that you want to allow 3.5-specific syntax (the compiled output will no longer be compatible with lower Python versions). You should be getting an error telling you to do that, which should look like this:

Coconut Interpreter:
(type "exit()" or press Ctrl-D to end)
>>> a @ b

CoconutTargetError: found Python 3.5 matrix multiplication (enable --target 35 to dismiss) (line 1)
  a @ b
    ^
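For background, @ (PEP 465, Python 3.5+) dispatches to the __matmul__ special method, which is why compiled output using it can't run on older Pythons. A minimal pure-Python sketch, with no NumPy dependency (the Mat class is a toy illustration):

```python
class Mat:
    """Tiny 2x2 matrix supporting @ via __matmul__ (PEP 465, Python 3.5+)."""
    def __init__(self, rows):
        self.rows = rows

    def __matmul__(self, other):
        a, b = self.rows, other.rows
        return Mat([[sum(a[i][k] * b[k][j] for k in range(2))
                     for j in range(2)]
                    for i in range(2)])

identity = Mat([[1, 0], [0, 1]])
m = Mat([[1, 2], [3, 4]])
# (m @ identity).rows == [[1, 2], [3, 4]]
```

NumPy arrays implement __matmul__ the same way, so once the target is set, a @ b compiles straight through to the Python 3.5 operator.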

@evhub evhub closed this as completed Jan 31, 2017
@evhub evhub mentioned this issue May 12, 2017