[SPARKR][SPARK-8452] expose jobGroup API in SparkR #6889
Conversation
#' @param interruptOnCancel flag to indicate if the job is interrupted on job cancellation

setJobGroup <- function(groupId, description, interruptOnCancel) {
  if (exists(".sparkRjsc", envir = env)) {
While it's technically fine to just look up the SparkContext, I think all our methods take in a SQLContext / SparkContext explicitly. Will that work for your use case as well?
Yes, that is perfectly fine. Just wondering why sparkR.stop() doesn't follow that convention?
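For illustration, here is a minimal sketch contrasting the two styles under discussion. The implicit version mirrors the snippet above; the explicit version follows the convention the reviewer describes. The `.sparkREnv` environment name and the `callJMethod` helper are assumptions based on SparkR internals, not taken from this diff.

```R
# Sketch only -- assumes SparkR internals (.sparkREnv, callJMethod); not the merged code.

# Implicit style: look up the SparkContext stored in the package environment.
setJobGroup <- function(groupId, description, interruptOnCancel) {
  if (exists(".sparkRjsc", envir = .sparkREnv)) {
    sc <- get(".sparkRjsc", envir = .sparkREnv)
    callJMethod(sc, "setJobGroup", groupId, description, interruptOnCancel)
  }
}

# Explicit style: the caller passes the SparkContext as the first argument,
# matching other SparkR methods that take a SparkContext / SQLContext.
setJobGroup <- function(sc, groupId, description, interruptOnCancel) {
  callJMethod(sc, "setJobGroup", groupId, description, interruptOnCancel)
}
```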
Test build #35181 has finished for PR 6889 at commit
Test build #35193 has finished for PR 6889 at commit
LGTM. It would be better if you could add some tests for it, and add examples in the doc. Right now it's hard to tell what the types of these parameters are.
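A basic check along the lines being requested might look like the following testthat sketch. The test name and the no-error assertions are illustrative, assuming a live context from `sparkR.init()`:

```R
library(testthat)

# Illustrative only: verifies the calls complete without error, since the
# job-group effect itself is hard to observe from R.
test_that("job group functions can be called without error", {
  sc <- sparkR.init()
  expect_error(setJobGroup(sc, "groupId", "job group description", TRUE), NA)
  expect_error(cancelJobGroup(sc, "groupId"), NA)
  expect_error(clearJobGroup(sc), NA)
})
```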
#' @param sc existing spark context
#' @param groupid the ID to be assigned to job groups
#' @param description description for the job group ID
#' @param interruptOnCancel flag to indicate if the job is interrupted on job cancellation
Yeah, @davies' point is a good one. We can add an example usage here with something like:
#' @examples
#'\dontrun{
#' sc <- sparkR.init()
#' setJobGroup(sc, "group", "some group", TRUE)
#'}
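Note that the `\dontrun{}` wrapper keeps the example from being executed during `R CMD check`, since it needs a live Spark context.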
LGTM, waiting for tests.
Test build #35316 has finished for PR 6889 at commit
Thanks @falaki -- Merging this
This pull request adds the following methods to SparkR:

```R
setJobGroup()
cancelJobGroup()
clearJobGroup()
```

For each method, the Spark context is passed as the first argument. There does not seem to be a good way to test these in R.

cc shivaram and davies

Author: Hossein <[email protected]>

Closes #6889 from falaki/SPARK-8452 and squashes the following commits:

9ce9f1e [Hossein] Added basic tests to verify methods can be called and won't throw errors
c706af9 [Hossein] Added examples
a2c19af [Hossein] taking spark context as first argument
343ca77 [Hossein] Added setJobGroup, cancelJobGroup and clearJobGroup to SparkR

(cherry picked from commit 1fa29c2)
Signed-off-by: Shivaram Venkataraman <[email protected]>
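As a rough end-to-end sketch of how these methods fit together (the group name and workflow here are made up for illustration):

```R
library(SparkR)

# Start a SparkContext; assumes a local Spark installation.
sc <- sparkR.init()

# Tag subsequent jobs with a group ID so they can be cancelled together.
setJobGroup(sc, "analysisGroup", "exploratory analysis jobs", interruptOnCancel = TRUE)

# ... trigger Spark jobs here ...

# Cancel everything running under the group, then clear the group setting.
cancelJobGroup(sc, "analysisGroup")
clearJobGroup(sc)

sparkR.stop()
```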