[SPARK-31999][SQL] Add REFRESH FUNCTION command #28840

Closed
wants to merge 46 commits
Changes from 20 commits
69a47a1
init
ulysses-you Jun 16, 2020
a95dcb6
update doc
ulysses-you Jun 17, 2020
3fc807e
fix typo
ulysses-you Jun 17, 2020
b282348
update doc
ulysses-you Jun 17, 2020
f677a4a
update doc again
ulysses-you Jun 17, 2020
a6c5d8b
use v2 command
ulysses-you Jun 17, 2020
de54470
fix
ulysses-you Jun 17, 2020
9e09875
fix mistake
ulysses-you Jun 17, 2020
63695c0
use v2 commnd analyze
ulysses-you Jun 17, 2020
9e9d5ce
add line
ulysses-you Jun 17, 2020
c434821
update doc
ulysses-you Jun 18, 2020
35fd44b
update doc
ulysses-you Jun 18, 2020
f83fd8b
fix child
ulysses-you Jun 18, 2020
e444943
fix children
ulysses-you Jun 18, 2020
afd510b
add comment
ulysses-you Jun 18, 2020
1241bde
fix copy error
ulysses-you Jun 18, 2020
93f5d71
update doc
ulysses-you Jun 18, 2020
dc684b5
update comment
ulysses-you Jun 18, 2020
0ea7dd6
fix LookupCatalog
ulysses-you Jun 18, 2020
643969c
merge to ResolveFunctions
ulysses-you Jun 18, 2020
6cb2edd
remove ignoreIfNotExists
ulysses-you Jun 19, 2020
cffc207
fix ut
ulysses-you Jun 22, 2020
4b6408d
fix resolve
ulysses-you Jun 22, 2020
5d5fe71
brush functions
ulysses-you Jun 22, 2020
4ba345b
fix
ulysses-you Jun 22, 2020
6765395
use catalogfunction
ulysses-you Jun 22, 2020
dc86b82
fix
ulysses-you Jun 23, 2020
a38d656
fix comment
ulysses-you Jun 23, 2020
cdea55b
ut nit
ulysses-you Jun 24, 2020
5e227d7
fix nit
ulysses-you Jun 24, 2020
703ad47
nit
ulysses-you Jun 24, 2020
a79f72b
update ResolvedFunc
ulysses-you Jun 24, 2020
a4d144a
Merge branch 'master' of https://github.com/apache/spark into SPARK-3…
ulysses-you Jul 6, 2020
3bd8d23
update doc
ulysses-you Jul 6, 2020
60ac2a0
fix doc
ulysses-you Jul 6, 2020
b36b760
update comment
ulysses-you Jul 6, 2020
c5937a2
rewrite RefreshFunctionCommand
ulysses-you Jul 6, 2020
56ec5ea
update doc
ulysses-you Jul 13, 2020
c129a54
fix functions
ulysses-you Jul 14, 2020
a956144
fix
ulysses-you Jul 14, 2020
711656d
remove unnecessary param
ulysses-you Jul 14, 2020
5d4c152
simplify
ulysses-you Jul 16, 2020
94fa132
fix
ulysses-you Jul 17, 2020
fc4789f
simplify
ulysses-you Jul 17, 2020
e83194f
address comment
ulysses-you Jul 21, 2020
b18437c
fix
ulysses-you Jul 21, 2020
2 changes: 2 additions & 0 deletions docs/_data/menu-sql.yaml
Original file line number Diff line number Diff line change
Expand Up @@ -208,6 +208,8 @@
url: sql-ref-syntax-aux-cache-clear-cache.html
- text: REFRESH TABLE
url: sql-ref-syntax-aux-refresh-table.html
- text: REFRESH FUNCTION
url: sql-ref-syntax-aux-refresh-function.html
- text: REFRESH
url: sql-ref-syntax-aux-cache-refresh.html
- text: DESCRIBE
Expand Down
1 change: 1 addition & 0 deletions docs/sql-ref-syntax-aux-cache-cache-table.md
Expand Up @@ -80,3 +80,4 @@ CACHE TABLE testCache OPTIONS ('storageLevel' 'DISK_ONLY') SELECT * FROM testDat
* [UNCACHE TABLE](sql-ref-syntax-aux-cache-uncache-table.html)
* [REFRESH TABLE](sql-ref-syntax-aux-refresh-table.html)
* [REFRESH](sql-ref-syntax-aux-cache-refresh.html)
* [REFRESH FUNCTION](sql-ref-syntax-aux-refresh-function.html)
1 change: 1 addition & 0 deletions docs/sql-ref-syntax-aux-cache-clear-cache.md
Expand Up @@ -41,3 +41,4 @@ CLEAR CACHE;
* [UNCACHE TABLE](sql-ref-syntax-aux-cache-uncache-table.html)
* [REFRESH TABLE](sql-ref-syntax-aux-refresh-table.html)
* [REFRESH](sql-ref-syntax-aux-cache-refresh.html)
* [REFRESH FUNCTION](sql-ref-syntax-aux-refresh-function.html)
1 change: 1 addition & 0 deletions docs/sql-ref-syntax-aux-cache-refresh.md
Expand Up @@ -54,3 +54,4 @@ REFRESH "hdfs://path/to/table";
* [CLEAR CACHE](sql-ref-syntax-aux-cache-clear-cache.html)
* [UNCACHE TABLE](sql-ref-syntax-aux-cache-uncache-table.html)
* [REFRESH TABLE](sql-ref-syntax-aux-refresh-table.html)
* [REFRESH FUNCTION](sql-ref-syntax-aux-refresh-function.html)
1 change: 1 addition & 0 deletions docs/sql-ref-syntax-aux-cache-uncache-table.md
Expand Up @@ -50,3 +50,4 @@ UNCACHE TABLE t1;
* [CLEAR CACHE](sql-ref-syntax-aux-cache-clear-cache.html)
* [REFRESH TABLE](sql-ref-syntax-aux-refresh-table.html)
* [REFRESH](sql-ref-syntax-aux-cache-refresh.html)
* [REFRESH FUNCTION](sql-ref-syntax-aux-refresh-function.html)
3 changes: 2 additions & 1 deletion docs/sql-ref-syntax-aux-cache.md
Expand Up @@ -23,4 +23,5 @@ license: |
* [UNCACHE TABLE statement](sql-ref-syntax-aux-cache-uncache-table.html)
* [CLEAR CACHE statement](sql-ref-syntax-aux-cache-clear-cache.html)
* [REFRESH TABLE statement](sql-ref-syntax-aux-refresh-table.html)
* [REFRESH statement](sql-ref-syntax-aux-cache-refresh.html)
* [REFRESH statement](sql-ref-syntax-aux-cache-refresh.html)
* [REFRESH FUNCTION statement](sql-ref-syntax-aux-refresh-function.html)
60 changes: 60 additions & 0 deletions docs/sql-ref-syntax-aux-refresh-function.md
@@ -0,0 +1,60 @@
---
layout: global
title: REFRESH FUNCTION
displayTitle: REFRESH FUNCTION
license: |
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
---

### Description
`REFRESH FUNCTION` invalidates the cached entry for the given function, which includes its class name and resource location. The invalidated entry is repopulated right away.
Note that `REFRESH FUNCTION` only works for permanent functions; refreshing a native (built-in) or temporary function throws an exception.

### Syntax

```sql
REFRESH FUNCTION function_identifier
```

### Parameters

* **function_identifier**

Specifies a function name, which may be qualified or unqualified. If no database identifier is provided, the function is resolved from the current database.

**Syntax:** `[ database_name. ] function_name`

### Examples

```sql
-- The cached entry of the function will be refreshed
-- The function is resolved from the current database as the function name is unqualified.
REFRESH FUNCTION func1;

-- The cached entry of the function will be refreshed
-- The function is resolved from tempDB database as the function name is qualified.
REFRESH FUNCTION tempDB.func1;
```
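The restriction to permanent functions can be illustrated as follows. This is a hedged sketch: the UDF class name is a placeholder and the error text is paraphrased, not verbatim Spark output.

```sql
-- Temporary functions cannot be refreshed.
CREATE TEMPORARY FUNCTION tempFunc AS 'com.example.SomeUDF';  -- hypothetical class
REFRESH FUNCTION tempFunc;
-- AnalysisException: cannot refresh a temporary function
```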

### Related Statements

* [CACHE TABLE](sql-ref-syntax-aux-cache-cache-table.html)
* [CLEAR CACHE](sql-ref-syntax-aux-cache-clear-cache.html)
* [UNCACHE TABLE](sql-ref-syntax-aux-cache-uncache-table.html)
Member:

Do we need to mention the three cache-related statements above? The REFRESH statement below seems sufficient for REFRESH FUNCTION.

Contributor Author:

I just feel they are all part of the cache docs, so they should cross-reference each other.
* [REFRESH TABLE](sql-ref-syntax-aux-refresh-table.html)
* [REFRESH](sql-ref-syntax-aux-cache-refresh.html)
1 change: 1 addition & 0 deletions docs/sql-ref-syntax-aux-refresh-table.md
Expand Up @@ -57,3 +57,4 @@ REFRESH TABLE tempDB.view1;
* [CLEAR CACHE](sql-ref-syntax-aux-cache-clear-cache.html)
* [UNCACHE TABLE](sql-ref-syntax-aux-cache-uncache-table.html)
* [REFRESH](sql-ref-syntax-aux-cache-refresh.html)
* [REFRESH FUNCTION](sql-ref-syntax-aux-refresh-function.html)
1 change: 1 addition & 0 deletions docs/sql-ref-syntax.md
Expand Up @@ -83,6 +83,7 @@ Spark SQL is Apache Spark's module for working with structured data. The SQL Syn
* [LIST JAR](sql-ref-syntax-aux-resource-mgmt-list-jar.html)
* [REFRESH](sql-ref-syntax-aux-cache-refresh.html)
* [REFRESH TABLE](sql-ref-syntax-aux-refresh-table.html)
* [REFRESH FUNCTION](sql-ref-syntax-aux-refresh-function.html)
* [RESET](sql-ref-syntax-aux-conf-mgmt-reset.html)
* [SET](sql-ref-syntax-aux-conf-mgmt-set.html)
* [SHOW COLUMNS](sql-ref-syntax-aux-show-columns.html)
Expand Down
Expand Up @@ -229,6 +229,7 @@ statement
comment=(STRING | NULL) #commentNamespace
| COMMENT ON TABLE multipartIdentifier IS comment=(STRING | NULL) #commentTable
| REFRESH TABLE multipartIdentifier #refreshTable
| REFRESH FUNCTION multipartIdentifier #refreshFunction
| REFRESH (STRING | .*?) #refreshResource
| CACHE LAZY? TABLE multipartIdentifier
(OPTIONS options=tablePropertyList)? (AS? query)? #cacheTable
Expand Down
Expand Up @@ -1890,6 +1890,9 @@ class Analyzer(
object ResolveFunctions extends Rule[LogicalPlan] {
val trimWarningEnabled = new AtomicBoolean(true)
def apply(plan: LogicalPlan): LogicalPlan = plan.resolveOperatorsUp {
case UnresolvedFunc(CatalogAndFunctionIdentifier(catalog, identifier)) =>
ResolvedFunc(catalog, identifier)

case q: LogicalPlan =>
q transformExpressions {
case u if !u.childrenResolved => u // Skip until children are resolved.
Expand Down
Expand Up @@ -17,6 +17,7 @@

package org.apache.spark.sql.catalyst.analysis

import org.apache.spark.sql.catalyst.FunctionIdentifier
import org.apache.spark.sql.catalyst.expressions.Attribute
import org.apache.spark.sql.catalyst.plans.logical.{LeafNode, LogicalPlan}
import org.apache.spark.sql.connector.catalog.{CatalogPlugin, Identifier, SupportsNamespaces, Table, TableCatalog}
Expand Down Expand Up @@ -50,6 +51,15 @@ case class UnresolvedTableOrView(multipartIdentifier: Seq[String]) extends LeafN
override def output: Seq[Attribute] = Nil
}

/**
* Holds the name of a function that has yet to be looked up in a catalog. It will be resolved to
* [[ResolvedFunc]] during analysis.
*/
case class UnresolvedFunc(multipartIdentifier: Seq[String]) extends LeafNode {
override lazy val resolved: Boolean = false
override def output: Seq[Attribute] = Nil
}

/**
* A plan containing resolved namespace.
*/
Expand All @@ -74,3 +84,8 @@ case class ResolvedTable(catalog: TableCatalog, identifier: Identifier, table: T
case class ResolvedView(identifier: Identifier) extends LeafNode {
override def output: Seq[Attribute] = Nil
}

case class ResolvedFunc(catalog: CatalogPlugin, functionIdentifier: FunctionIdentifier)
Contributor:

Shall we put CatalogFunction instead of FunctionIdentifier in the parameter? Otherwise the lookup is still done at runtime.

Contributor Author:

Added it.

BTW, do we need a FunctionCatalog like TableCatalog in v2?

extends LeafNode {
override def output: Seq[Attribute] = Nil
}
Expand Up @@ -1341,6 +1341,16 @@ class SessionCatalog(
functionRegistry.registerFunction(func, info, builder)
}

/**
* Unregister a temporary or permanent function from a session-specific [[FunctionRegistry]]
*/
def unregisterFunction(name: FunctionIdentifier, ignoreIfNotExists: Boolean): Unit = {
Member:

ignoreIfNotExists is not used now?

Contributor Author:

It is not used. If the function does not exist, REFRESH FUNCTION throws an exception.

Member:

Could we remove it now?
if (!functionRegistry.dropFunction(name) && !ignoreIfNotExists) {
throw new NoSuchFunctionException(
Member:

Actually, it does not throw this exception, because this check is already done in https://github.com/apache/spark/pull/28840/files#diff-d2a203f08c862bd762e6740c16e972f7R267-R268?
formatDatabaseName(name.database.getOrElse(currentDb)), name.funcName)
}
}

/**
* Drop a temporary function.
*/
Expand Down
Expand Up @@ -3650,6 +3650,11 @@ class AstBuilder(conf: SQLConf) extends SqlBaseBaseVisitor[AnyRef] with Logging
ctx.REPLACE != null)
}

override def visitRefreshFunction(ctx: RefreshFunctionContext): LogicalPlan = withOrigin(ctx) {
val functionIdentifier = visitMultipartIdentifier(ctx.multipartIdentifier)
RefreshFunction(UnresolvedFunc(functionIdentifier))
}

override def visitCommentNamespace(ctx: CommentNamespaceContext): LogicalPlan = withOrigin(ctx) {
val comment = ctx.comment.getType match {
case SqlBaseParser.NULL => ""
Expand Down
Expand Up @@ -516,3 +516,10 @@ case class CommentOnNamespace(child: LogicalPlan, comment: String) extends Comma
case class CommentOnTable(child: LogicalPlan, comment: String) extends Command {
override def children: Seq[LogicalPlan] = child :: Nil
}

/**
* The logical plan of the REFRESH FUNCTION command that works for v2 catalogs.
*/
case class RefreshFunction(child: LogicalPlan) extends Command {
override def children: Seq[LogicalPlan] = child :: Nil
}
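The `RefreshFunction` node above is a unary command: it wraps one child that is unresolved at parse time and swapped out during analysis. A minimal, self-contained sketch of that shape, using simplified stand-in types rather than Spark's actual `LogicalPlan`/`Command` classes:

```scala
// Simplified stand-ins for LogicalPlan and Command; illustrative only.
trait Plan { def children: Seq[Plan] }

// Stand-in for UnresolvedFunc: a leaf holding only the multi-part name.
case class UnresolvedFn(nameParts: Seq[String]) extends Plan {
  def children: Seq[Plan] = Nil
}

// Mirrors RefreshFunction(child): exposes its single child so analyzer
// rules can rewrite it bottom-up.
case class RefreshFn(child: Plan) extends Plan {
  def children: Seq[Plan] = child :: Nil
}

object Demo extends App {
  val plan = RefreshFn(UnresolvedFn(Seq("tempDB", "func1")))
  assert(plan.children.size == 1)
  println(plan)
}
```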
Expand Up @@ -19,7 +19,7 @@ package org.apache.spark.sql.connector.catalog

import org.apache.spark.internal.Logging
import org.apache.spark.sql.AnalysisException
import org.apache.spark.sql.catalyst.TableIdentifier
import org.apache.spark.sql.catalyst.{FunctionIdentifier, TableIdentifier}
import org.apache.spark.sql.internal.{SQLConf, StaticSQLConf}

/**
Expand Down Expand Up @@ -155,4 +155,37 @@ private[sql] trait LookupCatalog extends Logging {
None
}
}

/**
* Extract catalog and function identifier from a multi-part name with the current catalog if
* needed.
*
* Note that: function is only supported in v1 catalog.
Member:

Still not sure what this means... Could you describe it in more detail?

Contributor Author:

The comment just aims to make the exception clear: throw new AnalysisException("Function command is only supported in v1 catalog").

*/
object CatalogAndFunctionIdentifier {
Member:

Have you checked my comment? https://github.com/apache/spark/pull/28840/files#r442021563 I personally think you don't need this refactoring. Could you just use parseSessionCatalogFunctionIdentifier in this PR?

Contributor Author:

Thanks for the reminder, I missed this.

Because ResolveSessionCatalog lives in the sql-core package, it seems I have to do the refactor.

Member (@maropu, Jun 19, 2020):

Yea, moving parseSessionCatalogFunctionIdentifier into the catalyst package looks okay, but I meant: do you need to refactor parseSessionCatalogFunctionIdentifier into an extractor? Actually, I think you don't need to update the existing code, e.g., https://github.com/apache/spark/pull/28840/files#diff-2e07be4d73605cb1941153441a0c0c14R568-R569

Contributor Author:

I will revert the related change later.

Contributor Author:

After some thought, we have to modify parseSessionCatalogFunctionIdentifier anyway:

  • since it is used by v2, it should return both CatalogPlugin and FunctionIdentifier
  • during resolution of UnresolvedFunc, it is hard to decide the sql param; actually the sql param is not important

And after this, we also have to update the existing code in ResolveSessionCatalog. Maybe it's better to do the refactor?

Member (@maropu, Jun 21, 2020):

> Actually the sql param is not important.

Sorry, but what does that mean? I made a PR against your branch. Please check my suggestion: ulysses-you#6

def unapply(nameParts: Seq[String]): Option[(CatalogPlugin, FunctionIdentifier)] = {

if (nameParts.length == 1 && catalogManager.v1SessionCatalog.isTempFunction(nameParts.head)) {
return Some(currentCatalog, FunctionIdentifier(nameParts.head))
}

nameParts match {
case SessionCatalogAndIdentifier(catalog, ident) =>
if (nameParts.length == 1) {
// If there is only one name part, it means the current catalog is the session catalog.
// Here we don't fill the default database, to keep the error message unchanged for
// v1 commands.
Some(catalog, FunctionIdentifier(nameParts.head, None))
} else {
ident.namespace match {
case Array(db) => Some(catalog, FunctionIdentifier(ident.name, Some(db)))
case _ =>
throw new AnalysisException(s"Unsupported function name '$ident'")
}
}

case _ => throw new AnalysisException("Function command is only supported in v1 catalog")
Member:

What is a "Function command"?

Contributor Author:

"Function command" means CREATE FUNCTION, DROP FUNCTION, DESC FUNCTION, and so on.

It does seem confusing; would it be better to list all the function commands here?

}
}
}
}
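The extractor-object shape used by CatalogAndFunctionIdentifier above can be sketched in isolation. This is an illustration of Scala's `unapply` pattern with simplified names and a simplified return type, not the actual Spark implementation:

```scala
// A name-splitting extractor in the spirit of CatalogAndFunctionIdentifier:
// turns a multi-part name into an optional database plus a function name.
object DbAndFunction {
  def unapply(parts: Seq[String]): Option[(Option[String], String)] = parts match {
    case Seq(fn)     => Some((None, fn))     // unqualified: resolved in current db
    case Seq(db, fn) => Some((Some(db), fn)) // qualified: db.fn
    case _           => None                 // deeper namespaces unsupported (v1)
  }
}

object Demo extends App {
  Seq("tempDB", "func1") match {
    case DbAndFunction(db, fn) => println(s"db=$db, function=$fn")
  }
}
```

Matching against the extractor in a `case` clause is what lets analyzer rules destructure a raw name into catalog-aware parts in a single pattern.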
Expand Up @@ -20,7 +20,7 @@ package org.apache.spark.sql.catalyst.parser
import java.util.Locale

import org.apache.spark.sql.AnalysisException
import org.apache.spark.sql.catalyst.analysis.{AnalysisTest, GlobalTempView, LocalTempView, PersistedView, UnresolvedAttribute, UnresolvedNamespace, UnresolvedRelation, UnresolvedStar, UnresolvedTable, UnresolvedTableOrView}
import org.apache.spark.sql.catalyst.analysis.{AnalysisTest, GlobalTempView, LocalTempView, PersistedView, UnresolvedAttribute, UnresolvedFunc, UnresolvedNamespace, UnresolvedRelation, UnresolvedStar, UnresolvedTable, UnresolvedTableOrView}
import org.apache.spark.sql.catalyst.catalog.{ArchiveResource, BucketSpec, FileResource, FunctionResource, FunctionResourceType, JarResource}
import org.apache.spark.sql.catalyst.expressions.{EqualTo, Literal}
import org.apache.spark.sql.catalyst.plans.logical._
Expand Down Expand Up @@ -2113,6 +2113,15 @@ class DDLParserSuite extends AnalysisTest {
"Operation not allowed: CREATE FUNCTION with resource type 'other'")
}

test("REFRESH FUNCTION") {
parseCompare("REFRESH FUNCTION c",
RefreshFunction(UnresolvedFunc(Seq("c"))))
parseCompare("REFRESH FUNCTION b.c",
RefreshFunction(UnresolvedFunc(Seq("b", "c"))))
parseCompare("REFRESH FUNCTION a.b.c",
RefreshFunction(UnresolvedFunc(Seq("a", "b", "c"))))
}

private case class TableSpec(
name: Seq[String],
schema: Option[StructType],
Expand Down
Expand Up @@ -566,24 +566,19 @@ class ResolveSessionCatalog(
case ShowTableProperties(r: ResolvedView, propertyKey) =>
ShowTablePropertiesCommand(r.identifier.asTableIdentifier, propertyKey)

case DescribeFunctionStatement(nameParts, extended) =>
val functionIdent =
parseSessionCatalogFunctionIdentifier(nameParts, "DESCRIBE FUNCTION")
case DescribeFunctionStatement(CatalogAndFunctionIdentifier(_, functionIdent), extended) =>
DescribeFunctionCommand(functionIdent, extended)

case ShowFunctionsStatement(userScope, systemScope, pattern, fun) =>
val (database, function) = fun match {
case Some(nameParts) =>
val FunctionIdentifier(fn, db) =
parseSessionCatalogFunctionIdentifier(nameParts, "SHOW FUNCTIONS")
case Some(CatalogAndFunctionIdentifier(_, FunctionIdentifier(fn, db))) =>
(db, Some(fn))
case None => (None, pattern)
}
ShowFunctionsCommand(database, function, userScope, systemScope)

case DropFunctionStatement(nameParts, ifExists, isTemp) =>
val FunctionIdentifier(function, database) =
parseSessionCatalogFunctionIdentifier(nameParts, "DROP FUNCTION")
case DropFunctionStatement(
CatalogAndFunctionIdentifier(_, FunctionIdentifier(function, database)), ifExists, isTemp) =>
DropFunctionCommand(database, function, ifExists, isTemp)

case CreateFunctionStatement(nameParts,
Expand All @@ -606,38 +601,16 @@ class ResolveSessionCatalog(
ignoreIfExists,
replace)
} else {
val FunctionIdentifier(function, database) =
parseSessionCatalogFunctionIdentifier(nameParts, "CREATE FUNCTION")
CreateFunctionCommand(database, function, className, resources, isTemp, ignoreIfExists,
replace)
}
}

// TODO: move function related v2 statements to the new framework.
private def parseSessionCatalogFunctionIdentifier(
Contributor Author:

This method is moved to LookupCatalog.CatalogAndFunctionIdentifier, and the sql param is dropped.

Member:

Does this PR need the change?

nameParts: Seq[String],
sql: String): FunctionIdentifier = {
if (nameParts.length == 1 && isTempFunction(nameParts.head)) {
return FunctionIdentifier(nameParts.head)
}

nameParts match {
case SessionCatalogAndIdentifier(_, ident) =>
if (nameParts.length == 1) {
// If there is only one name part, it means the current catalog is the session catalog.
// Here we don't fill the default database, to keep the error message unchanged for
// v1 commands.
FunctionIdentifier(nameParts.head, None)
} else {
ident.namespace match {
case Array(db) => FunctionIdentifier(ident.name, Some(db))
case _ =>
throw new AnalysisException(s"Unsupported function name '$ident'")
}
nameParts match {
case CatalogAndFunctionIdentifier(_, FunctionIdentifier(function, database)) =>
CreateFunctionCommand(database, function, className, resources, isTemp, ignoreIfExists,
replace)
}
}

case _ => throw new AnalysisException(s"$sql is only supported in v1 catalog")
}
case RefreshFunction(ResolvedFunc(_, func)) =>
// Fallback to v1 command
RefreshFunctionCommand(func.database, func.funcName)
}

private def parseV1Table(tableName: Seq[String], sql: String): Seq[String] = tableName match {
Expand Down