Conversation
@@ -1,4 +1,159 @@
FROM aztk/spark:v0.1.0-spark1.6.3-base
why not build on top of this base image any more?
There is now a difference between the base and GPU images in the Spark build process (the added -Pnetlib-lgpl profile). We could build all images with that profile -- that might be the best option here.
#!/bin/bash
apt-get update
apt-get install -y libopenblas-base
update-alternatives --config libblas.so.3
Does it make sense to just bake this into the dockerfile?
target_role=PluginTargetRole.All,
execute="openblas.sh",
files=[
    PluginFile("openblas.sh", os.path.join(dir_path, "openblas.sh")),
Would it make more sense to have a generic apt-get install plugin?
The way I did this for the conda package installer was to let users specify an array of packages as a parameter. The main issue in this case, though, is the update-alternatives command, which is non-standard... not sure how to get around that.
update-alternatives is borderline unnecessary since we are running in a container with a known environment, and I already know that there aren't any alternatives present. It's mostly there as a precaution. I think that a general apt-get install plugin would be great, but I don't know if it should be in place of this.
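For reference, a rough sketch of what a generic apt-get install plugin could look like, following the same PluginConfiguration pattern used in this PR. The apt_get.sh wrapper, the name and args parameters, and the import paths are assumptions here, not something taken from this diff:

import os

# Assumed import paths and parameters (name, args); only target_role,
# execute, and files are visible in this diff.
from aztk.models.plugins.plugin_configuration import PluginConfiguration, PluginTargetRole
from aztk.models.plugins.plugin_file import PluginFile

dir_path = os.path.dirname(os.path.realpath(__file__))


def AptGetPlugin(packages=None):
    # Hypothetical generic installer: the package list is forwarded as
    # arguments to a small apt_get.sh wrapper that would run
    # `apt-get update && apt-get install -y "$@"`.
    return PluginConfiguration(
        name="apt_get",
        target_role=PluginTargetRole.All,
        execute="apt_get.sh",
        args=packages or [],
        files=[
            PluginFile("apt_get.sh", os.path.join(dir_path, "apt_get.sh")),
        ],
    )

Something like AptGetPlugin(["libopenblas-base"]) would cover the install step, but as noted above it would not cover the update-alternatives call.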
…tk into feature/spark-openblas-plugin
…tk into feature/spark-openblas-plugin
dir_path = os.path.dirname(os.path.realpath(__file__))


class NvBLASPlugin(PluginConfiguration):
    def __init__(self):
use a function here instead of a class, as we talked about:
def NvBLASPlugin():
    return PluginConfiguration(...)
dir_path = os.path.dirname(os.path.realpath(__file__))


class OpenBLASPlugin(PluginConfiguration):
Same
#498
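Applied to the OpenBLAS plugin with the parameters already visible in this diff, the function form could look roughly like this (the name parameter and the import paths are assumptions):

import os

# Assumed import paths; name= is also an assumption -- only target_role,
# execute, and files appear in this diff.
from aztk.models.plugins.plugin_configuration import PluginConfiguration, PluginTargetRole
from aztk.models.plugins.plugin_file import PluginFile

dir_path = os.path.dirname(os.path.realpath(__file__))


def OpenBLASPlugin():
    # Same settings as the class-based version in this PR, just returned
    # from a factory function instead of a subclass.
    return PluginConfiguration(
        name="openblas",
        target_role=PluginTargetRole.All,
        execute="openblas.sh",
        files=[
            PluginFile("openblas.sh", os.path.join(dir_path, "openblas.sh")),
        ],
    )

Callers would then construct the plugin via OpenBLASPlugin() rather than subclassing PluginConfiguration.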