wait_for errors when otel extras installed #16708

Closed
jakekaplan opened this issue Jan 13, 2025 · 0 comments · Fixed by #16709

jakekaplan commented Jan 13, 2025

Bug summary

  1. Install prefect with the otel extras: uv pip install "prefect[otel]"

  2. Run the following:

from prefect import task, flow

@task
def task_1():
    print("Task 1")

@task
def task_2():
    print("Task 2")

@flow
def the_flow():
    task1_future = task_1.submit()
    task2_future = task_2.submit(wait_for=[task1_future])
    task2_future.wait()


if __name__ == "__main__":
    the_flow()
  3. Observe the error:
Traceback (most recent call last):
  File "/Users/jake/PycharmProjects/demo-flows/what.py", line 19, in <module>
    the_flow()
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/site-packages/prefect/flows.py", line 1354, in __call__
    return run_flow(
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/site-packages/prefect/flow_engine.py", line 1462, in run_flow
    ret_val = run_flow_sync(**kwargs)
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/site-packages/prefect/flow_engine.py", line 1338, in run_flow_sync
    return engine.state if return_type == "state" else engine.result()
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/site-packages/prefect/flow_engine.py", line 327, in result
    raise self._raised
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/site-packages/prefect/flow_engine.py", line 737, in run_context
    yield self
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/site-packages/prefect/flow_engine.py", line 1336, in run_flow_sync
    engine.call_flow_fn()
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/site-packages/prefect/flow_engine.py", line 757, in call_flow_fn
    result = call_with_parameters(self.flow.fn, self.parameters)
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/site-packages/prefect/utilities/callables.py", line 208, in call_with_parameters
    return fn(*args, **kwargs)
  File "/Users/jake/PycharmProjects/demo-flows/what.py", line 15, in the_flow
    task2_future.wait()
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/site-packages/prefect/futures.py", line 142, in wait
    result = self._wrapped_future.result(timeout=timeout)
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/concurrent/futures/_base.py", line 458, in result
    return self.__get_result()
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/site-packages/prefect/task_engine.py", line 1373, in run_task_sync
    with engine.start(task_run_id=task_run_id, dependencies=dependencies):
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/contextlib.py", line 135, in __enter__
    return next(self.gen)
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/site-packages/prefect/task_engine.py", line 750, in start
    self.begin_run()
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/site-packages/prefect/task_engine.py", line 378, in begin_run
    self._wait_for_dependencies()
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/site-packages/prefect/task_engine.py", line 217, in _wait_for_dependencies
    visit_collection(
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/site-packages/prefect/utilities/collections.py", line 480, in visit_collection
    items = [visit_nested(o) for o in seq]
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/site-packages/prefect/utilities/collections.py", line 480, in <listcomp>
    items = [visit_nested(o) for o in seq]
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/site-packages/prefect/utilities/collections.py", line 389, in visit_nested
    return visit_collection(
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/site-packages/prefect/utilities/collections.py", line 421, in visit_collection
    result = visit_expression(expr)
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/site-packages/prefect/utilities/collections.py", line 401, in visit_expression
    return _callback(expr, context)
  File "/opt/miniconda3/envs/demo-flows310/lib/python3.10/site-packages/prefect/utilities/engine.py", line 763, in resolve_to_final_result
    "prefect.input.name": context["parameter_name"],
KeyError: 'parameter_name'
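
The final frame points at the OpenTelemetry instrumentation callback in prefect/utilities/engine.py, which unconditionally reads context["parameter_name"] when recording a resolved future as a task input. A minimal, self-contained sketch of the failure mode (a hypothetical simplification, not the actual Prefect source):

from typing import Any

def resolve_to_final_result(expr: Any, context: dict) -> Any:
    # Hypothetical simplification of the instrumentation callback: it records the
    # resolved future as a task input, assuming a parameter name is always present.
    attributes = {"prefect.input.name": context["parameter_name"]}  # KeyError if missing
    print(attributes)
    return expr

# A future resolved from a task parameter carries its parameter name in context:
resolve_to_final_result("task-1-result", {"parameter_name": "x"})

# A future passed via wait_for carries no parameter name, so the lookup fails:
resolve_to_final_result("task-1-result", {})  # raises KeyError: 'parameter_name'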

Version info

Version:             3.1.12
API version:         0.8.4
Python version:      3.10.14
Git commit:          e299e5a7
Built:               Thu, Jan 9, 2025 10:09 AM
OS/Arch:             darwin/arm64
Profile:             prd-test
Server type:         cloud
Pydantic version:    2.10.5
Integrations:
  prefect-docker:    0.6.2
  prefect-aws:       0.5.0

Additional context

No response

@jakekaplan jakekaplan added the bug Something isn't working label Jan 13, 2025
@chrisguidry chrisguidry self-assigned this Jan 13, 2025
chrisguidry added a commit that referenced this issue Jan 13, 2025
In our OpenTelemetry instrumentation, we were assuming that if we were
resolving a future before running a task, it must be a parameter.  This
isn't true in the case where a future is used in `wait_for`, where the
context's `parameter_name` won't be set.

Fixes #16708
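
One plausible shape for such a fix (a sketch under the assumption that the callback simply needs to tolerate a missing key; see #16709 for the actual change) is to attach the input-name attribute only when the context provides one:

from typing import Any

def resolve_to_final_result(expr: Any, context: dict) -> Any:
    # Defensive variant: label the input only when the future was resolved from a
    # parameter; futures arriving via wait_for set no parameter_name in the context.
    attributes: dict[str, Any] = {}
    parameter_name = context.get("parameter_name")
    if parameter_name is not None:
        attributes["prefect.input.name"] = parameter_name
    print(attributes)
    return expr

resolve_to_final_result("task-1-result", {"parameter_name": "x"})  # input labeled
resolve_to_final_result("task-1-result", {})                       # no label, no error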