Commit

docs-action committed Dec 25, 2023
1 parent 9ec0b08 commit 656627d
Showing 2 changed files with 5 additions and 1 deletion.
2 changes: 1 addition & 1 deletion assets/js/search-data.json
@@ -3312,7 +3312,7 @@
},"473": {
"doc": "Unity Catalog",
"title": "Prerequisites",
"content": "Before starting, ensure you have the following: . | Access to Unity Catalog | An active lakeFS installation with S3 as the backing storage, and a repository in this installation. | A Databricks SQL warehouse. | AWS Credentials with S3 access. | lakeFS credentials with access to your Delta Tables. | . Databricks authentication . Given that the hook will ultimately register a table in Unity Catalog, authentication with Databricks is imperative. Make sure that: . | You have a Databricks Service Principal. | The Service principal has token usage permissions, and an associated token configured. | The service principal has the Service principal: Manager privilege over itself (Workspace: Admin console -> Service principals -> <service principal> -> Permissions -> Grant access (<service principal>: Service principal: Manager), with Workspace access and Databricks SQL access checked (Admin console -> Service principals -> <service principal> -> Configurations). | Your SQL warehouse allows the service principal to use it (SQL Warehouses -> <SQL warehouse> -> Permissions -> <service principal>: Can use). | The catalog grants the USE CATALOG, USE SCHEMA, CREATE SCHEMA permissions to the service principal(Catalog -> <catalog name> -> Permissions -> Grant -> <service principal>: USE CATALOG, USE SCHEMA, CREATE SCHEMA). | You have an External Location configured, and the service principal has the CREATE EXTERNAL TABLE permission over it (Catalog -> External Data -> External Locations -> Create location). | . ",
"content": "Before starting, ensure you have the following: . | Access to Unity Catalog | An active lakeFS installation with S3 as the backing storage, and a repository in this installation. | A Databricks SQL warehouse. | AWS Credentials with S3 access. | lakeFS credentials with access to your Delta Tables. | . Supported from lakeFS v1.4.0 . Databricks authentication . Given that the hook will ultimately register a table in Unity Catalog, authentication with Databricks is imperative. Make sure that: . | You have a Databricks Service Principal. | The Service principal has token usage permissions, and an associated token configured. | The service principal has the Service principal: Manager privilege over itself (Workspace: Admin console -> Service principals -> <service principal> -> Permissions -> Grant access (<service principal>: Service principal: Manager), with Workspace access and Databricks SQL access checked (Admin console -> Service principals -> <service principal> -> Configurations). | Your SQL warehouse allows the service principal to use it (SQL Warehouses -> <SQL warehouse> -> Permissions -> <service principal>: Can use). | The catalog grants the USE CATALOG, USE SCHEMA, CREATE SCHEMA permissions to the service principal(Catalog -> <catalog name> -> Permissions -> Grant -> <service principal>: USE CATALOG, USE SCHEMA, CREATE SCHEMA). | You have an External Location configured, and the service principal has the CREATE EXTERNAL TABLE permission over it (Catalog -> External Data -> External Locations -> Create location). | . ",
"url": "/integrations/unity_catalog.html#prerequisites",

"relUrl": "/integrations/unity_catalog.html#prerequisites"
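As an aside, the catalog-level and external-location permissions described in the prerequisites content above can also be granted with SQL statements run against the Databricks SQL warehouse, rather than through the UI screens the docs walk through. Below is a minimal sketch using the databricks-sql-connector Python package; the hostname, HTTP path, token, catalog name, external location name, and service principal identifier are all placeholder assumptions, not values from this commit.

# Minimal sketch (assumptions only): grant the Unity Catalog privileges the
# prerequisites list calls for, via the Databricks SQL connector. Every
# identifier below is a placeholder -- substitute your own workspace values.
from databricks import sql

connection = sql.connect(
    server_hostname="dbc-example.cloud.databricks.com",  # assumed workspace hostname
    http_path="/sql/1.0/warehouses/abc1234567890def",    # assumed SQL warehouse HTTP path
    access_token="<service-principal-token>",            # token associated with the service principal
)
cursor = connection.cursor()

principal = "`<service-principal-application-id>`"       # assumed principal identifier

# Catalog privileges required by the hook (USE CATALOG, USE SCHEMA, CREATE SCHEMA).
for privilege in ("USE CATALOG", "USE SCHEMA", "CREATE SCHEMA"):
    cursor.execute(f"GRANT {privilege} ON CATALOG my_catalog TO {principal}")

# Permission to create external tables under the configured external location.
cursor.execute(
    f"GRANT CREATE EXTERNAL TABLE ON EXTERNAL LOCATION my_external_location TO {principal}"
)

cursor.close()
connection.close()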
4 changes: 4 additions & 0 deletions integrations/unity_catalog.html
@@ -594,6 +594,10 @@ <h2 id="prerequisites">
<li>AWS Credentials with S3 access.</li>
<li>lakeFS credentials with access to your Delta Tables.</li>
</ol>

+<blockquote class="note">
+  <p>Supported from lakeFS v1.4.0</p>
+</blockquote>
<h3 id="databricks-authentication">



0 comments on commit 656627d
