
"Detecting Kernels" spins forever #167875

Closed
roblourens opened this issue Nov 30, 2022 · 5 comments · Fixed by #167878
Closed

"Detecting Kernels" spins forever #167875

roblourens opened this issue Nov 30, 2022 · 5 comments · Fixed by #167878
Labels
bug (Issue identified by VS Code Team member as probable bug), insiders-released (Patch has been released in VS Code Insiders), verified (Verification succeeded)
Milestone
November 2022

Comments

@roblourens
Member

Testing microsoft/vscode-jupyter#11963

  • Remote-SSH to a remote linux machine, from mac
  • Open a notebook
  • "Detecting Kernels" has been there for minutes

jupyter.log

@roblourens
Member Author

roblourens commented Nov 30, 2022

Just as I filed this, it updated to show a kernel name, but it still has the spinner. You can see the 10-minute gap in the updated log.
jupyter.log

@DonJayamanne
Contributor

Based on the logs:

  • The extension loads at 15:44:46
  • Kernel discovery finishes at 15:44:53
  • At 15:53:32 the kernel selection changed (either you selected this manually, or VS Code or something else selected it)

I.e. I don't see any issues (at least not from the logs).

@roblourens
Member Author

I updated to get your logs. I see:

debug 10:44:45.352: Start refreshing Kernel Picker (1669920285352)
[snip]
debug 10:44:48.634: End refreshing Kernel Picker (1669920285352)

Oddly, when I debugged the extension on the same remote and same folder, I didn't see this issue. But in normal Insiders with the latest extension, I see it 100% of the time.

@roblourens
Member Author

jupyter3.log

@roblourens roblourens added the bug (Issue identified by VS Code Team member as probable bug) and notebook-kernel labels Dec 1, 2022
@DonJayamanne
Contributor

@rebornix Based on the logs, we start the spinner and then dispose the task.
We only do this once, meaning there aren't multiple tasks created by us; the only one created gets disposed.
I.e. I think the cause of the infinite spinner must be in core.
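For context, a minimal sketch of the task lifecycle being described, assuming the vscode.notebooks.createNotebookControllerDetectionTask API (proposed at the time of this issue); the Jupyter extension's actual code may differ:

```typescript
import * as vscode from 'vscode';

// Hypothetical sketch: show the "Detecting Kernels" spinner while kernel
// discovery runs, and dispose the task exactly once when discovery finishes.
async function discoverKernels(): Promise<void> {
    // Starting a detection task puts the kernel picker into its "detecting"
    // state for notebooks of the given type ('jupyter-notebook' is assumed).
    const task = vscode.notebooks.createNotebookControllerDetectionTask('jupyter-notebook');
    try {
        await findAndRegisterKernels(); // placeholder for the extension's real discovery logic
    } finally {
        // A single dispose() should stop the spinner; if it keeps spinning,
        // some other (undisposed) task must exist, e.g. one created in core.
        task.dispose();
    }
}

// Placeholder so the sketch is self-contained.
async function findAndRegisterKernels(): Promise<void> {
    // ...
}
```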

@roblourens roblourens transferred this issue from microsoft/vscode-jupyter Dec 1, 2022
@roblourens roblourens added this to the November 2022 milestone Dec 1, 2022
roblourens added a commit that referenced this issue Dec 1, 2022
Don't create more than one notebook detection task per-type.
This code creates detection tasks to cover the period between opening a notebook and the notebook extension activating and starting its own detection task. It is a bit more complex than necessary, and it would be simpler to manage the lifecycle of this task from the `activateByEvent` promise.
Fix #167875
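A rough sketch of the approach the commit message describes: dispose the core-side placeholder task when the contributing extension's activation promise settles, and never create more than one task per notebook type. The names here (onNotebookOpened, startDetectionTask, the task map) are illustrative, not the actual vscode source:

```typescript
// Illustrative only; names and structure are not the actual vscode core code.
interface IDisposable { dispose(): void; }

const detectionTasksPerType = new Map<string, IDisposable>();

function onNotebookOpened(
    notebookType: string,
    startDetectionTask: (type: string) => IDisposable,  // creates the placeholder "Detecting Kernels" task
    activateByEvent: (event: string) => Promise<void>   // resolves once the contributing extension has activated
): void {
    // Don't create more than one detection task per notebook type.
    if (detectionTasksPerType.has(notebookType)) {
        return;
    }

    const task = startDetectionTask(notebookType);
    detectionTasksPerType.set(notebookType, task);

    // Tie the task's lifecycle to the activation promise: once the notebook
    // extension has activated (and can run its own detection task), dispose
    // the placeholder so the spinner does not spin forever.
    activateByEvent(`onNotebook:${notebookType}`)
        .catch(() => { /* dispose the placeholder even if activation fails */ })
        .finally(() => {
            task.dispose();
            detectionTasksPerType.delete(notebookType);
        });
}
```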
@vscodenpa vscodenpa added the unreleased (Patch has not yet been released in VS Code Insiders) and insiders-released (Patch has been released in VS Code Insiders) labels, and removed the unreleased label Dec 1, 2022
@connor4312 connor4312 added the verified (Verification succeeded) label Dec 2, 2022
@github-actions github-actions bot locked and limited conversation to collaborators Jan 15, 2023