Describe the bug
I'm trying to set a query tag while using the Snowflake connector. As a reference, I am using the snippet from the Hive connector example:
```yaml
# For more details on authentication, see the PyHive docs:
# https://github.com/dropbox/PyHive#passing-session-configuration.
# LDAP, Kerberos, etc. are supported using connect_args, which can be
# added under the `options` config parameter.
# options:
#   connect_args:
#     auth: KERBEROS
#     kerberos_service_name: hive
```
I can use other SQLAlchemy parameters such as `convert_unicode` without issues, but if I add `connect_args` (no matter which parameters I put inside it), I get:
```
[2022-04-11 11:42:38,221] {taskinstance.py:1462} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1164, in _run_raw_task
    self._prepare_and_execute_task_with_callbacks(context, task)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1282, in _prepare_and_execute_task_with_callbacks
    result = self._execute_task(context, task_copy)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1307, in _execute_task
    result = task_copy.execute(context=context)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/operators/python.py", line 150, in execute
    return_value = self.execute_callable()
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/operators/python.py", line 161, in execute_callable
    return self.python_callable(*self.op_args, **self.op_kwargs)
  File "/opt/airflow/dags/.../../...py", line 33, in datahub_recipe
    pipeline.run()
  File "/home/airflow/.local/lib/python3.9/site-packages/datahub/ingestion/run/pipeline.py", line 156, in run
    for wu in itertools.islice(
  File "/home/airflow/.local/lib/python3.9/site-packages/datahub/ingestion/source/sql/snowflake.py", line 380, in get_workunits
    for wu in super().get_workunits():
  File "/home/airflow/.local/lib/python3.9/site-packages/datahub/ingestion/source/sql/sql_common.py", line 519, in get_workunits
    for inspector in self.get_inspectors():
  File "/home/airflow/.local/lib/python3.9/site-packages/datahub/ingestion/source/sql/snowflake.py", line 189, in get_inspectors
    db_listing_engine = create_engine(
TypeError: sqlalchemy.engine.create_engine() got multiple values for keyword argument 'connect_args'
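For context, this `TypeError` is plain Python behavior whenever a keyword is passed both explicitly and via an expanded options dict, which appears to be what happens here when the Snowflake source forwards the recipe's `options` to `create_engine`. A minimal sketch (the `create_engine` stub and `opts` dict below are illustrative, not DataHub's actual code):

```python
# Stand-in for sqlalchemy.create_engine: just collects keyword arguments.
def create_engine(url, **kwargs):
    return kwargs

# User-supplied `options` from the recipe, later expanded with **.
opts = {"connect_args": {"session_parameters": {"QUERY_TAG": "demo"}}}

try:
    # If the caller also passes connect_args explicitly, the keyword
    # arrives twice and Python raises before create_engine even runs.
    create_engine("snowflake://...", connect_args={}, **opts)
except TypeError as exc:
    print(exc)  # ...got multiple values for keyword argument 'connect_args'
```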
To Reproduce
Steps to reproduce the behavior:
Create a YAML file to ingest Snowflake metadata and add the following snippet:
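A recipe along these lines reproduces it for me (the `QUERY_TAG` value and exact keys under `connect_args` are illustrative; any content inside `connect_args` triggers the error):

```yaml
source:
  type: snowflake
  config:
    # ...connection details...
    options:
      connect_args:
        session_parameters:
          QUERY_TAG: datahub-ingestion
```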
Expected behavior
The pipeline runs normally and the specified tag is shown in the Query History.