🐛 BUG: Only one instance of Wrangler can be started #4912
Comments
This is not reproducible in Node 16. I think it is a recurrence of the old problem of Node 18 switching to IPv6 by default.
OK, so looking into this, I think the problem is this line: `workers-sdk/packages/wrangler/src/dev.tsx`, line 715 (at 48f9085).
I believe that is what is happening here. If you change the line above to use 127.0.0.1 as the host explicitly, then the second call (Worker b) gets a completely new port, and that seems to solve the problem. I think this does not fail in Node 16 because it does not default to IPv6, so calling … cc @Lekensteyn
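For illustration, here is a minimal port-probe sketch (a hypothetical helper, not Wrangler's actual implementation) showing why the host passed to the probe matters on Node 18, where `localhost` can resolve to `::1` first:

```ts
import net from "node:net";

// Hypothetical helper, not from the workers-sdk codebase: report whether
// `port` can be bound on `host`.
function isPortFree(port: number, host: string): Promise<boolean> {
  return new Promise((resolve) => {
    const server = net.createServer();
    server.once("error", () => resolve(false));
    server.once("listening", () => server.close(() => resolve(true)));
    server.listen(port, host);
  });
}

// With Worker a already bound to 127.0.0.1:9229:
await isPortFree(9229, "127.0.0.1"); // false — the conflict is detected
await isPortFree(9229, "localhost"); // may be true on Node 18: the probe binds ::1,
                                     // never touches 127.0.0.1, so 9229 is handed out
                                     // again and the later explicit bind fails
```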
An alternative solution is to ask workerd to listen on the inspector hostname, e.g. `unsafeDirectHost: this.latestConfig.dev?.inspector?.hostname ?? "localhost"`, but I am worried that we then have two different servers listening on …
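To make that worry concrete, here is a self-contained sketch (not code from the repo) showing how two listeners can end up on the same numeric port across the IPv4 and IPv6 loopback addresses, so which server a client on `localhost:9229` reaches depends on how it resolves `localhost`:

```ts
import net from "node:net";

// Illustration only: the same port can be claimed twice when the two
// listeners bind different loopback addresses.
const ipv4 = net.createServer().listen(9229, "127.0.0.1", () => {
  console.log("listening on 127.0.0.1:9229");
});

const ipv6 = net.createServer().listen(9229, "::1", () => {
  // This second bind succeeds, even though 9229 "looks taken".
  console.log("also listening on [::1]:9229");
});

// Close both so the sketch exits.
setTimeout(() => {
  ipv4.close();
  ipv6.close();
}, 1000);
```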
Fixed in #4907
Which Cloudflare product(s) does this pertain to?
Wrangler core
What version(s) of the tool(s) are you using?
main
What version of Node are you using?
18
What operating system and version are you using?
macOS
Describe the Bug
Observed behavior
Check out and build Wrangler locally, then run
`npx wrangler dev`
in `workers-sdk/fixtures/service-bindings-app/a`
and then in `workers-sdk/fixtures/service-bindings-app/b`.
The second command will fail with `Address already in use (127.0.0.1:9229)`.
Expected behavior
Multiple instances of Wrangler should be startable without error.
This regression was introduced in #4830 and hasn't yet been published in a Wrangler release.
Please provide a link to a minimal reproduction
N/A
Please provide any relevant error logs
N/A