
[SPARK-14013][SQL] Proper temp function support in catalog #11972

Closed
wants to merge 6 commits

Conversation

andrewor14
Contributor

What changes were proposed in this pull request?

Session catalog was added in #11750. However, it does not support temporary functions properly: right now we only store the metadata in the form of CatalogFunction, which does not make sense for temporary functions because they have no class name.

This patch moves the FunctionRegistry into the SessionCatalog. With this, the user can call catalog.createTempFunction and catalog.lookupFunction to use a previously registered function. This is currently still dead code, however.
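A minimal, self-contained sketch of the idea described above: the session catalog owns the function registry, so temporary functions are registered and resolved by name rather than stored as CatalogFunction metadata. The names mirror the PR (createTempFunction, lookupFunction, FunctionRegistry), but the signatures and bodies here are illustrative assumptions, not Spark's actual implementation.

```scala
object TempFunctionSketch {
  // Stand-in for Spark's FunctionBuilder (roughly Seq[Expression] => Expression).
  type FunctionBuilder = Seq[Int] => Int

  class FunctionRegistry {
    private val builders = scala.collection.mutable.Map.empty[String, FunctionBuilder]
    def registerFunction(name: String, builder: FunctionBuilder): Unit =
      builders(name) = builder
    def lookupFunctionBuilder(name: String): Option[FunctionBuilder] =
      builders.get(name)
  }

  // The registry lives inside the catalog, as the PR proposes.
  class SessionCatalog {
    private val functionRegistry = new FunctionRegistry
    def createTempFunction(name: String, builder: FunctionBuilder): Unit =
      functionRegistry.registerFunction(name, builder)
    def lookupFunction(name: String, args: Seq[Int]): Int =
      functionRegistry.lookupFunctionBuilder(name)
        .getOrElse(sys.error(s"undefined function: $name"))
        .apply(args)
  }

  def main(args: Array[String]): Unit = {
    val catalog = new SessionCatalog
    catalog.createTempFunction("sum", xs => xs.sum)
    println(catalog.lookupFunction("sum", Seq(1, 2, 3)))  // prints 6
  }
}
```

Because the registry is keyed by name only, a temporary function needs no class name or database, which is exactly what CatalogFunction could not express.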

How was this patch tested?

SessionCatalogSuite.

@rxin
Contributor

rxin commented Mar 26, 2016

cc @yhuai

I took a quick look and it looks good -- but I didn't look at the details.

@SparkQA

SparkQA commented Mar 26, 2016

Test build #54238 has finished for PR 11972 at commit 238d392.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@@ -476,33 +497,29 @@ class SessionCatalog(externalCatalog: ExternalCatalog, conf: CatalystConf) {
throw new AnalysisException("rename does not support moving functions across databases")
}
val db = oldName.database.getOrElse(currentDb)
if (oldName.database.isDefined || !tempFunctions.containsKey(oldName.funcName)) {
lazy val oldBuilder = functionRegistry.lookupFunctionBuilder(oldName.funcName)
Contributor


why use lazy val?

Contributor Author


If the database is defined we don't need to do the lookup at all, so deferring it can save us a call to Hive.
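The point above can be seen in an illustrative sketch (not Spark's actual code): the builder lookup stands in for a round trip to the external catalog (Hive), and `lazy val` defers it until first use, so the branch that never reads `oldBuilder` never pays for the call. The names and counter here are hypothetical.

```scala
object LazyLookupSketch {
  var externalCalls = 0  // counts simulated Hive round trips

  def lookupFunctionBuilder(name: String): Option[String] = {
    externalCalls += 1
    Some(name.toUpperCase)
  }

  def rename(oldName: String, databaseDefined: Boolean): Unit = {
    lazy val oldBuilder = lookupFunctionBuilder(oldName)  // not evaluated yet
    if (!databaseDefined) {
      // Only this branch forces the lazy val, triggering the lookup.
      require(oldBuilder.isDefined)
    }
  }

  def main(args: Array[String]): Unit = {
    rename("f", databaseDefined = true)
    println(externalCalls)  // 0: the lazy val was never forced
    rename("f", databaseDefined = false)
    println(externalCalls)  // 1: forced exactly once
  }
}
```

A plain `val` would perform the lookup unconditionally on every call to `rename`, which is the cost the review comment is asking about.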

@yhuai
Contributor

yhuai commented Mar 26, 2016

Overall looks good.

@SparkQA

SparkQA commented Mar 28, 2016

Test build #54341 has finished for PR 11972 at commit 2f53330.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@andrewor14
Contributor Author

retest this please

@SparkQA

SparkQA commented Mar 28, 2016

Test build #54362 has finished for PR 11972 at commit 2f53330.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@andrewor14
Contributor Author

OK thanks, I'm merging this to unblock progress on other issues.

@asfgit asfgit closed this in 27aab80 Mar 28, 2016
@andrewor14 andrewor14 deleted the temp-functions branch March 28, 2016 23:57
@SparkQA

SparkQA commented Mar 29, 2016

Test build #54383 has finished for PR 11972 at commit 271227c.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.
