JDBC operator not logging errors #16295
Thanks for opening your first issue here! Be sure to follow the issue template!
Hi, what is the status of this issue, @eladkal? Can you assign it to me?
Assigned you
Need some advice.
Example log after change:
Please let me know if it's okay.
Is it not possible to do this instead?

```python
def _run_command(self, cur, sql_statement, parameters):
    try:
        return super()._run_command(cur, sql_statement, parameters)
    except (jaydebeapi.DatabaseError, jaydebeapi.InterfaceError):
        self.log.exception("Failed to execute statement in JDBC")
        raise
```
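For context, the log-and-re-raise pattern suggested above can be illustrated in plain Python (hypothetical names, not the Airflow hook itself): `logging.exception` records the message plus the full traceback, and the bare `raise` re-raises the original exception unchanged so callers still see the failure.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("jdbc_example")


def run_command(cur, sql_statement):
    # Stand-in for the real cursor execution; may raise like a DB driver would.
    return cur(sql_statement)


def run_command_logged(cur, sql_statement):
    try:
        return run_command(cur, sql_statement)
    except Exception:
        # Logs the message and the full traceback at ERROR level.
        log.exception("Failed to execute statement in JDBC")
        raise  # re-raise unchanged so the caller still sees the failure
```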
Also, please investigate how far up the stack the exception bubbles. At what point is the exception lost, such that "somehow the task can't see that"? It'd be awesome if we could get a better reason than "somehow" 🙂
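As a general illustration of how a traceback can get "lost" (not Airflow-specific): when work runs in a child process, the parent only observes the exit code; the traceback exists only in the child's own stderr/log stream, so if that stream isn't captured the parent log shows just something like "Task exited with return code 1".

```python
import subprocess
import sys

# The child raises an unhandled exception; the parent sees only a
# nonzero return code unless it explicitly captures stderr.
result = subprocess.run(
    [sys.executable, "-c", "raise RuntimeError('query failed')"],
    capture_output=True,
    text=True,
)
print(result.returncode)                # 1: an unhandled Python exception exits with code 1
print("RuntimeError" in result.stderr)  # the traceback lives only in the child's stderr
```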
Thanks for the response. Of course I will explain in the PR why and when the exception is lost. I have a theory about it, but right now I'm having trouble configuring the debugger in PyCharm and can't verify it properly.
Hi,
Since Airflow 2.0, we have been having issues with logging for the JDBC operator. When such a task fails, we only see:

```
INFO - Task exited with return code 1
```

The actual error and stack trace are not present.
It also does not seem to retry: the task runs only once even though my max_tries is 3.
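To make the expected retry behavior concrete, here is a minimal sketch of the semantics that a setting like `max_tries` implies (generic Python, not Airflow's actual scheduler logic): each failed attempt is caught and retried until the attempt budget is exhausted, after which the last error is re-raised.

```python
def run_with_retries(task, max_tries=3):
    """Attempt `task` up to `max_tries` times; re-raise the last error."""
    last_error = None
    for attempt in range(1, max_tries + 1):
        try:
            return task()
        except Exception as exc:
            last_error = exc  # remember the failure and try again
    raise last_error
```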
I am using a Local Executor, and logs are also stored locally.
This issue occurs for both local installations and Docker.
full log: