remove unnecessary ::: inside SparkR #3

Merged
merged 3 commits into amplab-extras:R on Apr 6, 2015

Conversation

davies commented Mar 25, 2015

I'm not sure about this change, but all the tests passed.


shivaram commented Apr 6, 2015

@davies So we need the ::: to access package-private functions in R. We no longer need it inside the package itself (i.e., on the driver side). However, on the executors we just load the SparkR package like any user would and then run the closures, so it's not very clear to me what happens if the closures refer to package-private functions.

@hlin09 Any thoughts on this?
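
For reference, a rough sketch of the distinction (internalHelper below is a hypothetical name standing in for a SparkR package-private function, not an actual one):

```r
# Sketch of the :: vs ::: distinction; `internalHelper` is hypothetical.

# `::` can only reach names exported in the package NAMESPACE:
#   SparkR::sparkR.init          # exported, works
#   SparkR::internalHelper       # error: not an exported object

# `:::` reaches into the namespace itself, exported or not:
#   SparkR:::internalHelper      # works, but reaches into package internals

# Code inside the package (the driver side) evaluates in the package
# namespace, so its own private functions are visible without any qualifier.
# Closures shipped to executors run after a plain library(SparkR), which
# attaches only the exported names:
head(getNamespaceExports("stats"))   # e.g. the exported surface of a package
```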


hlin09 commented Apr 6, 2015

@shivaram Access to private functions inside a closure should work fine with the current implementation of cleanClosure. They will be serialized properly, so I think it should be safe to remove the :::.
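
To illustrate the behaviour being relied on (this is only a sketch of the capture-and-serialize idea, not the actual cleanClosure implementation; privateHelper is a made-up stand-in):

```r
# Sketch only: mimics what a closure cleaner is relied on to do before a
# closure is shipped to executors. Not the real SparkR cleanClosure code.

privateHelper <- function(x) x * 2            # stands in for a package-private function

f <- local({
  function(v) privateHelper(v) + 1            # user closure that refers to the helper
})

# A closure cleaner walks the function body, finds free names such as
# privateHelper, and copies their definitions into the closure's environment
# so the serialized closure is self-contained:
assign("privateHelper", privateHelper, envir = environment(f))

bytes <- serialize(f, connection = NULL)      # what would be sent to an executor
g <- unserialize(bytes)
g(10)                                         # 21, without needing SparkR::: at all
```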


shivaram commented Apr 6, 2015

Thanks @hlin09 for the clarification.

@davies Is the log message cleanup meant to make the test output better? The main reason we added the log messages is to tell the user that calling sparkR.init twice does not recreate the SparkContext, since we cache the Java reference. Isn't that still useful to have?
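
For context, the caching behaviour in question looks roughly like this (the environment, field names, and message text below are illustrative, not the exact SparkR source):

```r
# Rough sketch of caching the JVM-backed context so that a second init call
# returns the existing object instead of creating a new one.
.cacheEnv <- new.env()

initSketch <- function(master = "local") {
  if (exists("cachedJSC", envir = .cacheEnv)) {
    message("Re-using existing Spark Context; stop it first to create a new one.")
    return(get("cachedJSC", envir = .cacheEnv))
  }
  jsc <- list(master = master)      # placeholder for the cached Java reference
  assign("cachedJSC", jsc, envir = .cacheEnv)
  jsc
}

sc1 <- initSketch("local[2]")
sc2 <- initSketch("local[4]")       # warns; returns the same object, master unchanged
identical(sc1, sc2)                 # TRUE
```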


davies commented Apr 6, 2015

@shivaram It makes sense to keep the logging, because the returned SparkContext object is not in the initial state that sparkR.init is expected to produce. I will roll back that change.

Do we still need a log message saying "Stopping SparkR"?


shivaram commented Apr 6, 2015

Thanks @davies. We probably don't need the "Stopping SparkR" log message, since I think Spark already prints a bunch of log messages on SparkContext.stop.


shivaram commented Apr 6, 2015

LGTM. Merging this

shivaram added a commit that referenced this pull request Apr 6, 2015
remove unnecessary ::: inside SparkR
shivaram merged commit eb5da53 into amplab-extras:R on Apr 6, 2015
shivaram pushed a commit that referenced this pull request Apr 7, 2015
…her duplicate in HiveQl

Author: DoingDone9 <[email protected]>

Closes apache#4973 from DoingDone9/sort_token and squashes the following commits:

855fa10 [DoingDone9] Update HiveQl.scala
c7080b3 [DoingDone9] Sort these tokens in alphabetic order to avoid further duplicate in HiveQl
c87e8b6 [DoingDone9] Merge pull request #3 from apache/master
cb1852d [DoingDone9] Merge pull request #2 from apache/master
c3f046f [DoingDone9] Merge pull request #1 from apache/master
shivaram pushed a commit that referenced this pull request Apr 7, 2015
… failed!!

Wrong code: val tmpDir = Files.createTempDir()
It should be Utils, not Files.

Author: DoingDone9 <[email protected]>

Closes apache#5198 from DoingDone9/FilesBug and squashes the following commits:

6e0140d [DoingDone9] Update InsertIntoHiveTableSuite.scala
e57d23f [DoingDone9] Update InsertIntoHiveTableSuite.scala
802261c [DoingDone9] Merge pull request #7 from apache/master
d00303b [DoingDone9] Merge pull request #6 from apache/master
98b134f [DoingDone9] Merge pull request #5 from apache/master
161cae3 [DoingDone9] Merge pull request #4 from apache/master
c87e8b6 [DoingDone9] Merge pull request #3 from apache/master
cb1852d [DoingDone9] Merge pull request #2 from apache/master
c3f046f [DoingDone9] Merge pull request #1 from apache/master