
[SPARKR][SPARK-8452] expose jobGroup API in SparkR #6889

Closed · wants to merge 4 commits

Conversation

@falaki (Contributor) commented Jun 18, 2015

This pull request adds the following methods to SparkR:

setJobGroup()
cancelJobGroup()
clearJobGroup()

For each method, the Spark context is passed as the first argument. There does not seem to be a good way to test these in R.
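The three methods could be exercised roughly as follows. This is a sketch, assuming a local Spark installation and the final form of the API in which the Spark context is passed as the first argument; the group ID and description strings are illustrative only:

```R
library(SparkR)

# Initialize a local Spark context
sc <- sparkR.init(master = "local")

# Tag subsequent jobs with a group ID so they can be cancelled together;
# TRUE asks that running tasks be interrupted on cancellation
setJobGroup(sc, "etl-group", "nightly ETL jobs", TRUE)

# ... trigger some Spark jobs here ...

# Cancel every active job in the group, then clear the group tag
cancelJobGroup(sc, "etl-group")
clearJobGroup(sc)

sparkR.stop()
```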

cc @shivaram and @davies

#' @param interruptOnCancel flag to indicate if the job is interrupted on job cancellation

setJobGroup <- function(groupId, description, interruptOnCancel) {
if (exists(".sparkRjsc", envir = env)) {
Contributor
While it's technically fine to just look up the SparkContext, I think all our methods take in a SQLContext / SparkContext explicitly. Will that work for your use case as well?
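The two styles under discussion might be sketched as follows. This is a rough illustration only, assuming SparkR's internal `callJMethod` helper and its package environment `env`; it is not the code actually merged:

```R
# Implicit style: look up the cached context from SparkR's package environment
setJobGroup <- function(groupId, description, interruptOnCancel) {
  if (exists(".sparkRjsc", envir = env)) {
    sc <- get(".sparkRjsc", envir = env)
    callJMethod(sc, "setJobGroup", groupId, description, interruptOnCancel)
  }
}

# Explicit style (the convention suggested here): the caller passes the context,
# matching how other SparkR methods take a SQLContext / SparkContext
setJobGroup <- function(sc, groupId, description, interruptOnCancel) {
  callJMethod(sc, "setJobGroup", groupId, description, interruptOnCancel)
}
```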

Contributor Author

Yes, that is perfectly fine. Just wondering why sparkR.stop() doesn't follow that convention?

@SparkQA

SparkQA commented Jun 19, 2015

Test build #35181 has finished for PR 6889 at commit 343ca77.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Jun 19, 2015

Test build #35193 has finished for PR 6889 at commit a2c19af.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@davies
Contributor

davies commented Jun 19, 2015

LGTM. It would be better if you could add some tests for it and add examples in the doc; right now it's hard to tell what the types of these parameters are.
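SparkR's test suites use testthat, so a minimal smoke test might look like the following. This is a sketch assuming a local Spark installation; like the tests eventually added in commit 9ce9f1e, it only verifies that the calls can be made without raising an error, since there is no easy way to observe job-group state from R:

```R
library(testthat)
library(SparkR)

# A local context is enough for a smoke test
sc <- sparkR.init(master = "local")

test_that("job group functions can be called without error", {
  setJobGroup(sc, "groupId", "job description", TRUE)
  cancelJobGroup(sc, "groupId")
  clearJobGroup(sc)
})

sparkR.stop()
```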

#' @param sc existing spark context
#' @param groupid the ID to be assigned to job groups
#' @param description description for the job group ID
#' @param interruptOnCancel flag to indicate if the job is interrupted on job cancellation
Contributor

Yeah @davies point is a good one. We can add an example usage here with something like

 #' @examples
 #'\dontrun{
 #' sc <- sparkR.init()
 #' setJobGroup(sc, "group", "some group", TRUE)
 #'}

@davies
Contributor

davies commented Jun 19, 2015

LGTM, waiting for tests.

@SparkQA

SparkQA commented Jun 19, 2015

Test build #35316 has finished for PR 6889 at commit 9ce9f1e.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@shivaram
Contributor

Thanks @falaki -- Merging this

asfgit pushed a commit that referenced this pull request Jun 19, 2015
This pull request adds following methods to SparkR:

```R
setJobGroup()
cancelJobGroup()
clearJobGroup()
```
For each method, the spark context is passed as the first argument. There does not seem to be a good way to test these in R.

cc shivaram and davies

Author: Hossein <[email protected]>

Closes #6889 from falaki/SPARK-8452 and squashes the following commits:

9ce9f1e [Hossein] Added basic tests to verify methods can be called and won't throw errors
c706af9 [Hossein] Added examples
a2c19af [Hossein] taking spark context as first argument
343ca77 [Hossein] Added setJobGroup, cancelJobGroup and clearJobGroup to SparkR

(cherry picked from commit 1fa29c2)
Signed-off-by: Shivaram Venkataraman <[email protected]>
@asfgit asfgit closed this in 1fa29c2 Jun 19, 2015
nemccarthy pushed a commit to nemccarthy/spark that referenced this pull request Jun 22, 2015, with the same commit message as above.