
chore: Remove binding of ports to IPv4 only #1363

Merged: 1 commit into testcontainers:develop on Feb 7, 2025

Conversation

daviian (Contributor) commented Feb 7, 2025

What does this PR do?

Removes the HostIP from the port binding so that published ports are no longer restricted to IPv4.
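For illustration only, and not the actual diff: in the Docker.DotNet models that Testcontainers for .NET uses to talk to the Docker API, leaving HostIP unset lets the engine publish the port on all host addresses (IPv4 and IPv6) instead of pinning it to the IPv4 wildcard. A minimal sketch, assuming the Docker.DotNet.Models.PortBinding type:

using Docker.DotNet.Models;

// Before: the binding is explicitly pinned to the IPv4 wildcard address.
var ipv4Only = new PortBinding { HostIP = "0.0.0.0", HostPort = "8080" };

// After: HostIP is omitted, so the engine may publish on both IPv4 and IPv6.
var anyAddressFamily = new PortBinding { HostPort = "8080" };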

Why is it important?

Not needed anymore.

Related issues


netlify bot commented Feb 7, 2025

Deploy Preview for testcontainers-dotnet ready!

🔨 Latest commit: a58e52e
🔍 Latest deploy log: https://app.netlify.com/sites/testcontainers-dotnet/deploys/67a5ee8de69aa50008379350
😎 Deploy Preview: https://deploy-preview-1363--testcontainers-dotnet.netlify.app

@HofmeisterAn added the labels bug (Something isn't working) and chore (A change that doesn't impact the existing functionality, e.g. internal refactorings or cleanups) on Feb 7, 2025
@HofmeisterAn changed the title from "chore: remove IPv4 Host IP from Port Binding" to "chore: Remove binding of ports to IPv4 only" on Feb 7, 2025
HofmeisterAn (Collaborator) left a comment


Thanks

@HofmeisterAn merged commit 231e814 into testcontainers:develop on Feb 7, 2025
58 checks passed

BrunoJuchli commented Feb 26, 2025

Could this possibly have caused the problem described here to re-occur?
Since updating to Testcontainers 4.2.0, we experience randomly failing tests.
From the error symptoms, it looks like we're connecting to the wrong port of the container.
To give some context:

We run a fake SMTP Server in a docker container. The system-under-test connects to the SMTP port to send emails.
The test connects to the HTTP Port to retrieve all emails that have been sent.

The ports are configured using ContainerBuilder.WithPortBinding(int port, assignRandomHostPort: true)

We've not modified this code in the last few months; it used to work reliably.
Now we get SMTP responses on our HTTP Requests.

Of course, this being a timing/concurrency issue, it's entirely possible that the root cause has existed from the beginning and only the execution timing has shifted recently.
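To make the setup concrete, here is a minimal sketch of such a configuration with the Testcontainers for .NET ContainerBuilder API; the image name and the container ports 25/80 are assumptions for illustration, and only WithPortBinding(port, assignRandomHostPort: true) is taken from the comment above.

using DotNet.Testcontainers.Builders;

// Hypothetical fake SMTP server container; image and ports are placeholders.
var smtpFake = new ContainerBuilder()
    .WithImage("rnwood/smtp4dev:latest")
    .WithPortBinding(25, assignRandomHostPort: true)  // SMTP port the system-under-test connects to
    .WithPortBinding(80, assignRandomHostPort: true)  // HTTP port the test queries for sent mails
    .Build();

await smtpFake.StartAsync();

// Each container port should resolve to its own random host port;
// the reported symptom is that a request to the HTTP mapping gets an SMTP response.
var smtpHostPort = smtpFake.GetMappedPublicPort(25);
var httpHostPort = smtpFake.GetMappedPublicPort(80);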

HofmeisterAn (Collaborator) commented:

@BrunoJuchli Which container runtime do you use?


BrunoJuchli commented Feb 26, 2025

@HofmeisterAn
Thank you for the super quick reply 🥇
Unfortunately I'm not too knowledgeable in this area; is it by chance what I get from the logs here?

[testcontainers.org 00:00:00.17] Connected to Docker:
Host: unix:///var/run/docker.sock
Server Version: 26.1.3
Kernel Version: 6.8.0-1021-azure
API Version: 1.45
Operating System: Ubuntu 24.04.1 LTS

We're running this on GitHub Actions (ubuntu-latest). If the info above doesn't make it obvious to you at first sight which container runtime is used, I'm happy to dig deeper.
In that case I guess I will have to look into the runner image and find out whether containerd, wasmtime, gVisor, Kata Containers, ... is used, and which version thereof, correct?

HofmeisterAn (Collaborator) commented:


I assume you're encountering a different issue. I think the issue you're referring to is related to environments running Docker Desktop. I haven't noticed any flakiness in our CI pipeline (we're using GH-hosted runners too, though we use Ubuntu 22.04).

Have you been using 24.04 for a while? I believe there were some issues with the runners in general, if I'm not mistaken.


BrunoJuchli commented Feb 27, 2025


@HofmeisterAn
Thanks for the tip! We've not been using Ubuntu 24 for long.

I've executed some test runs with Ubuntu 24 and Ubuntu 22. Since our tests are quite long-running, the sample size isn't large and the conclusion isn't 100% reliable, but:

  • Ubuntu 24: 2 out of 5 test runs fail
  • Ubuntu 22: 0 out of 6 test runs fail.

From previous runs I see that close to half of the test runs on Ubuntu 24 in recent days failed. So having 6 test runs on Ubuntu 22 pass looks like a reliable enough indicator to me.
So we're switching to Ubuntu 22 for now.

HofmeisterAn (Collaborator) commented:

For reference, we have encountered some issues with the VM images in the past (affecting anyone who depends on Docker) due to incompatible kernel versions. I do not know the exact details, but you can find more information about those issues here:

It might be a good idea to start running our tests against version 24.04 as well, to see if there are any issues we can address or information we need to provide upstream to GitHub Actions Runner Images.

Successfully merging this pull request may close these issues:

  • [Bug]: ryuk container are not shutdown
  • [Enhancement]: Resolve port bindings according to IPv4 and IPv6