
Reduce LauncherUtil check-interval from 500 to 50 ms #33066

Merged 1 commit into quarkusio:main on Jun 6, 2023

Conversation

snazy
Contributor

@snazy snazy commented May 2, 2023

500ms feels a bit long and can cause some wasted time during test runs, especially on beefier machines. It feels safe to reduce this to 50ms, which should work even on tiny CI runner instances, as it's mostly sleep time + a cheap-ish "has more data" check against the log file.
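To illustrate the pattern being tuned, here is a minimal, hypothetical sketch of the kind of polling loop the PR describes: sleep for the check interval, then do a cheap "has more data" check against the log file before paying for a full read. The class and method names are illustrative, not Quarkus's actual `LauncherUtil` API; only the 500 ms → 50 ms interval change comes from the PR.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.time.Duration;

// Hypothetical sketch of a LauncherUtil-style wait loop: mostly sleep time
// plus a cheap size check, so a short interval costs very little CPU.
public class LogWatcher {
    // Check interval reduced from 500 ms to 50 ms, as in the PR.
    private static final Duration CHECK_INTERVAL = Duration.ofMillis(50);

    /** Polls logFile until it contains marker, or timeout elapses. */
    public static boolean waitForLine(Path logFile, String marker, Duration timeout)
            throws IOException, InterruptedException {
        long deadline = System.nanoTime() + timeout.toNanos();
        long lastSize = -1;
        while (System.nanoTime() < deadline) {
            if (Files.exists(logFile)) {
                long size = Files.size(logFile);
                if (size != lastSize) { // cheap "has more data" check
                    lastSize = size;
                    if (Files.readString(logFile).contains(marker)) {
                        return true;
                    }
                }
            }
            Thread.sleep(CHECK_INTERVAL.toMillis()); // mostly sleep time
        }
        return false;
    }
}
```

With a 500 ms interval, a launcher that becomes ready right after a check wastes up to half a second per wait; at 50 ms the worst-case overshoot drops tenfold while the per-iteration cost stays negligible.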

@snazy snazy force-pushed the launcher-check-interval-50 branch from d1817ec to e587bf3 on May 3, 2023 at 12:29
@snazy snazy force-pushed the launcher-check-interval-50 branch from e587bf3 to 06bd854 on May 4, 2023 at 15:03
@gastaldi gastaldi force-pushed the launcher-check-interval-50 branch from 06bd854 to 9a8774d on May 5, 2023 at 14:37
@snazy snazy force-pushed the launcher-check-interval-50 branch from 9a8774d to 6f6063a on May 12, 2023 at 13:00
@gsmet gsmet force-pushed the launcher-check-interval-50 branch from 6f6063a to 935cc3a on June 6, 2023 at 14:53
Member

@gsmet gsmet left a comment


I rebased it because it hit CI at a bad time.

Let's see how CI goes.

@gsmet gsmet added the triage/waiting-for-ci Ready to merge when CI successfully finishes label Jun 6, 2023
@quarkus-bot

quarkus-bot bot commented Jun 6, 2023

Failing Jobs - Building 935cc3a

Native Tests - Amazon — failed at step: Build

Full information is available in the Build summary check run.

Failures

⚙️ Native Tests - Amazon

- Failing: integration-tests/amazon-lambda integration-tests/amazon-lambda-http 

📦 integration-tests/amazon-lambda

io.quarkus.it.amazon.lambda.AmazonLambdaSimpleIT.testSimpleLambdaSuccess - More details - Source on GitHub

java.lang.RuntimeException: 
java.lang.RuntimeException: io.quarkus.builder.BuildException: Build failure: Build failed due to errors
	[error]: Build step io.quarkus.amazon.lambda.deployment.DevServicesLambdaProcessor#startEventServer threw an exception: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.net.BindException: Address already in use

📦 integration-tests/amazon-lambda-http

io.quarkus.it.amazon.lambda.AmazonLambdaSimpleIT.testJaxrsCognitoJWTSecurityContext - More details - Source on GitHub

java.lang.RuntimeException: 
java.lang.RuntimeException: io.quarkus.builder.BuildException: Build failure: Build failed due to errors
	[error]: Build step io.quarkus.amazon.lambda.deployment.DevServicesLambdaProcessor#startEventServer threw an exception: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.net.BindException: Address already in use
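Both failures above are the same flake: a second listener tried to bind a port already held by another process, which surfaces as `java.net.BindException: Address already in use`. A minimal, self-contained reproduction of that failure mode (not Quarkus code) is sketched below; binding port 0 to get an OS-assigned ephemeral port is a common way test fixtures avoid it.

```java
import java.io.IOException;
import java.net.BindException;
import java.net.ServerSocket;

// Hypothetical demo of the CI failure mode: a second bind on a port that is
// already in use throws java.net.BindException while the first socket is open.
public class BindDemo {
    /** Returns true if binding the same port a second time fails. */
    public static boolean secondBindFails() throws IOException {
        try (ServerSocket first = new ServerSocket(0)) { // port 0 = free ephemeral port
            int port = first.getLocalPort();
            try (ServerSocket second = new ServerSocket(port)) {
                return false; // unexpectedly succeeded
            } catch (BindException e) {
                return true; // "Address already in use"
            }
        }
    }
}
```

Since the exception comes from port contention on the CI runner rather than from the interval change, the failure is unrelated to this PR, which is consistent with the maintainers merging it anyway.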

@gastaldi gastaldi merged commit b7f29a2 into quarkusio:main Jun 6, 2023
@quarkus-bot quarkus-bot bot removed the triage/waiting-for-ci Ready to merge when CI successfully finishes label Jun 6, 2023
@quarkus-bot quarkus-bot bot added this to the 3.2 - main milestone Jun 6, 2023