This repository has been archived by the owner on Jan 9, 2020. It is now read-only.

Create RBAC role YAMLs and documentation #500

Closed
foxish opened this issue Sep 20, 2017 · 6 comments
@foxish

foxish commented Sep 20, 2017

We need RBAC roles associated with each component - shuffle service, RSS.
We also need instructions for setting up service accounts for the driver and executor pods.
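As a starting point, a service account for driver pods with a namespaced Role and RoleBinding might look like the sketch below. This is only an illustration, not from this issue: the names (`spark`, `spark-role`, the `default` namespace) and the exact verb list are placeholder assumptions, and the RBAC API version may differ on older clusters.

```yaml
# Hypothetical service account for Spark driver pods; all names are placeholders.
apiVersion: v1
kind: ServiceAccount
metadata:
  name: spark
  namespace: default
---
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: spark-role
  namespace: default
rules:
# The driver creates and monitors executor pods, so it needs write access to pods.
- apiGroups: [""]
  resources: ["pods"]
  verbs: ["get", "list", "watch", "create", "delete"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: spark-role-binding
  namespace: default
subjects:
- kind: ServiceAccount
  name: spark
  namespace: default
roleRef:
  kind: Role
  name: spark-role
  apiGroup: rbac.authorization.k8s.io
```

The driver pod would then set `serviceAccountName: spark` in its pod spec; executors that never talk to the API server could stay on the default service account.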

@liyinan926

@foxish @kimoonkim Regarding RBAC roles for the RSS and shuffle service, is any customization needed, or are they adequately covered by the default role/service account used by their pods? AFAIK, neither of them needs write access to the API server. Correct me if I'm wrong.

@kimoonkim

One thing jumps out at me. The shuffle service relies on HostPath volumes, which are not necessarily available to all pods. PodSecurityPolicy can be used together with RBAC to allow that access; for details, see this doc. So I think we should also address PSP RBAC rules. I'll be happy to dig more into this, as it also applies to kubernetes-HDFS.
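For concreteness, a PSP that permits hostPath volumes, plus a ClusterRole granting `use` on it, might be sketched as below. This is an assumption-laden illustration: the policy name, volume list, and API version (`extensions/v1beta1` was current for PSP around this time; it later moved to `policy/v1beta1`) are all placeholders.

```yaml
# Hypothetical PodSecurityPolicy allowing hostPath volumes; names are placeholders.
apiVersion: extensions/v1beta1
kind: PodSecurityPolicy
metadata:
  name: shuffle-hostpath
spec:
  privileged: false
  # Allow hostPath in addition to the usual benign volume types.
  volumes:
  - hostPath
  - emptyDir
  - secret
  seLinux:
    rule: RunAsAny
  runAsUser:
    rule: RunAsAny
  supplementalGroups:
    rule: RunAsAny
  fsGroup:
    rule: RunAsAny
---
# RBAC rule granting the shuffle service's service account "use" of that policy.
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole
metadata:
  name: shuffle-psp-user
rules:
- apiGroups: ["extensions"]
  resources: ["podsecuritypolicies"]
  resourceNames: ["shuffle-hostpath"]
  verbs: ["use"]
```

A RoleBinding would then bind `shuffle-psp-user` to the shuffle service's service account in its namespace.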

@liyinan926

@kimoonkim I also found this doc, which seems related.

@kimoonkim

Ah. That doc seems very relevant. Thanks for sharing it!

@kimoonkim

This is probably beyond the scope of this issue, but I was wondering if we should also think about human accounts and the role bindings they need to run Spark jobs and these other services.

I am personally using the cluster admin account for myself, but not every user will have access to that in a large org.

@liyinan926

I agree that we should think about non-admin user accounts, which are likely much more common in production environments on large clusters.
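One common pattern for such non-admin users is to bind them to Kubernetes' built-in `edit` ClusterRole within a single namespace, rather than granting cluster-admin. A hedged sketch (the user name and namespace here are placeholders, not from this issue):

```yaml
# Hypothetical binding giving a non-admin user the built-in "edit" role
# in one namespace only; user name and namespace are placeholders.
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: spark-user-edit
  namespace: spark-jobs
subjects:
- kind: User
  name: alice@example.com
  apiGroup: rbac.authorization.k8s.io
roleRef:
  kind: ClusterRole
  name: edit
  apiGroup: rbac.authorization.k8s.io
```

Because the `roleRef` points at a ClusterRole but the binding is a namespaced RoleBinding, the user's permissions stay confined to the `spark-jobs` namespace.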

ifilonenko pushed a commit to bloomberg/apache-spark-on-k8s that referenced this issue Mar 13, 2019
This PR reverts back to using Scala 2.11

* Revert "Fix distribution publish to scala 2.12 apache-spark-on-k8s#478"
* Revert "[SPARK-25956] Make Scala 2.12 as default Scala version in Spark 3.0"