Optionally trigger batched items sequentially to preserve order #1536
Conversation
🦋 Changeset detected
Latest commit: 2ecf510. The changes in this PR will be included in the next version bump. This PR includes changesets to release 11 packages.
Caution: Review failed. The pull request is closed.

Warning: There were issues while running some tools. Please review the errors and either fix the tool's configuration or disable the tool if it's a critical failure.

🔧 eslint
apps/webapp/app/v3/services/batchTriggerV2.server.ts: ESLint 8.45.0 couldn't find the config "custom" to extend from. The config "custom" was referenced from the config file in "/.eslintrc.js".

Walkthrough
This pull request introduces enhancements to the batch processing functionality within the application. It adds an option for sequential triggering of batched items, allowing users to choose between sequential and parallel execution. Modifications are made across various files, including updates to API routes, service classes, and type definitions, ensuring that the new triggering mechanism is integrated into existing workflows without disrupting current functionality.
Actionable comments posted: 0
🧹 Outside diff range and nitpick comments (8)
apps/webapp/app/services/routeBuilders/apiBuilder.server.ts (3)
36-36: Add JSDoc documentation for the new option

The `shouldRetryNotFound` option would benefit from documentation explaining:
- Its purpose in sequential processing
- Expected client behavior
- Any limitations or considerations

Add this documentation:
```diff
+ /**
+  * When true, adds x-should-retry header to 404 responses,
+  * indicating that clients should retry the request.
+  * Used for implementing sequential processing where
+  * resources may not be immediately available.
+  */
  shouldRetryNotFound?: boolean;
```
167-170: Consider using standard retry-related headers

Instead of a custom `x-should-retry` header, consider using standard HTTP headers:
- `Retry-After` for suggesting delay duration
- `X-RateLimit-*` for communicating rate limits
- `Link` with `rel="retry-after"` for the next polling URL

Example implementation:
```diff
-{ status: 404, headers: { "x-should-retry": shouldRetryNotFound ? "true" : "false" } }
+{
+  status: 404,
+  headers: {
+    ...(shouldRetryNotFound
+      ? { "Retry-After": "5", "x-should-retry": "true" }
+      : { "x-should-retry": "false" }),
+  },
+}
```
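Not part of the PR, but to make the suggestion concrete: a minimal client-side sketch of how a caller could combine the `x-should-retry` and `Retry-After` headers discussed above. The helper name `shouldRetry404` and the attempt cap are hypothetical.

```typescript
// Hypothetical client-side helper: decide whether to retry a 404 based on the
// response headers, honoring Retry-After (seconds) and capping total attempts.
function shouldRetry404(
  headers: Record<string, string>,
  attempt: number,
  maxAttempts = 5
): { retry: boolean; delayMs: number } {
  if (attempt >= maxAttempts) return { retry: false, delayMs: 0 };
  if (headers["x-should-retry"] !== "true") return { retry: false, delayMs: 0 };
  // Prefer the standard Retry-After header when present; fall back to 1s.
  const retryAfter = Number(headers["Retry-After"] ?? "1");
  const delayMs = (Number.isFinite(retryAfter) ? retryAfter : 1) * 1000;
  return { retry: true, delayMs };
}
```

The cap matters here: a run that is never created would otherwise keep a polling client retrying forever.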
85-85: Ensure type safety for the new option

The `shouldRetryNotFound` option is correctly typed, but its usage in the response headers could be more type-safe. Consider creating a type for the response headers:
```ts
type ApiResponseHeaders = {
  "x-should-retry": string;
  "Retry-After"?: string;
  // ... other headers
};

// Then use it in the response creation
const headers: ApiResponseHeaders = {
  "x-should-retry": shouldRetryNotFound ? "true" : "false",
  ...(shouldRetryNotFound && { "Retry-After": "5" }),
};
```

apps/webapp/app/v3/services/batchTriggerV2.server.ts (2)
54-61: Set the default value for `batchProcessingStrategy` directly in the constructor parameter

Consider setting the default value for `batchProcessingStrategy` directly in the constructor parameter to improve code readability and consistency. Apply this diff to implement the suggestion:
```diff
 constructor(
-  batchProcessingStrategy?: BatchProcessingStrategy,
+  batchProcessingStrategy: BatchProcessingStrategy = "parallel",
   protected readonly _prisma: PrismaClientOrTransaction = prisma
 ) {
   super(_prisma);
-  this._batchProcessingStrategy = batchProcessingStrategy ?? "parallel";
+  this._batchProcessingStrategy = batchProcessingStrategy;
 }
```
463-470: Ensure consistency in `processingId` assignment

In the sequential processing strategy, `processingId` is set to `batchId`, whereas in the parallel strategy it is set to the index (`${index}`). For clarity and consistency, consider using a consistent approach to `processingId` in both cases, or document the reasoning for this difference.

.changeset/sweet-suits-kick.md (1)
6-6: Refine the changeset message for clarity

Consider rewording the message to improve readability:
- "Add option to trigger batched items sequentially; default is parallel triggering, which is faster."
apps/webapp/app/routes/api.v1.tasks.batch.ts (1)
90-90: Simplify `batchProcessingStrategy` parameter usage

The `?? undefined` in `batchProcessingStrategy ?? undefined` is redundant. Pass `batchProcessingStrategy` directly to the constructor. Apply this diff to simplify:

```diff
-const service = new BatchTriggerV2Service(batchProcessingStrategy ?? undefined);
+const service = new BatchTriggerV2Service(batchProcessingStrategy);
```

packages/trigger-sdk/src/v3/shared.ts (1)
625-626: LGTM! Consistent implementation of sequential processing across all batch functions.

The implementation:
- Correctly maps `triggerSequentially` to `processingStrategy` across all batch operations
- Maintains consistency in the API contract

Consider adding telemetry for sequential processing. To better understand usage patterns and performance implications, consider adding telemetry attributes for sequential processing:

```diff
 {
   spanParentAsLink: true,
   idempotencyKey: await makeIdempotencyKey(options?.idempotencyKey),
   idempotencyKeyTTL: options?.idempotencyKeyTTL,
   processingStrategy: options?.triggerSequentially ? "sequential" : undefined,
+  // Add telemetry attributes
+  attributes: {
+    ...span.attributes,
+    "batch.processing.strategy": options?.triggerSequentially ? "sequential" : "parallel",
+    "batch.items.count": items.length,
+  },
 }
```

Also applies to: 795-797, 1271-1272, 1437-1439
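The suggested attributes can be sketched as a standalone helper. The function name is hypothetical; the attribute keys are the ones proposed in the diff above.

```typescript
// Hypothetical helper building the span attributes suggested by the review;
// "parallel" is reported when the sequential option is not set.
function batchTelemetryAttributes(
  itemCount: number,
  triggerSequentially?: boolean
): Record<string, string | number> {
  return {
    "batch.processing.strategy": triggerSequentially ? "sequential" : "parallel",
    "batch.items.count": itemCount,
  };
}
```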
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (10)
- .changeset/sweet-suits-kick.md (1 hunks)
- apps/webapp/app/routes/api.v1.tasks.batch.ts (4 hunks)
- apps/webapp/app/routes/api.v3.runs.$runId.ts (1 hunks)
- apps/webapp/app/services/routeBuilders/apiBuilder.server.ts (3 hunks)
- apps/webapp/app/services/worker.server.ts (1 hunks)
- apps/webapp/app/v3/services/batchTriggerV2.server.ts (5 hunks)
- packages/core/src/v3/apiClient/index.ts (2 hunks)
- packages/core/src/v3/types/tasks.ts (2 hunks)
- packages/trigger-sdk/src/v3/shared.ts (15 hunks)
- references/v3-catalog/src/trigger/batch.ts (17 hunks)
🧰 Additional context used
🪛 Gitleaks (8.21.2)
references/v3-catalog/src/trigger/batch.ts
507-507: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.
(generic-api-key)
508-508: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.
(generic-api-key)
536-536: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.
(generic-api-key)
537-537: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.
(generic-api-key)
🔇 Additional comments (24)
apps/webapp/app/routes/api.v3.runs.$runId.ts (1)
18-18
: Consider the implications of enabling retries by default
While enabling retries helps with sequential processing, consider:
- Adding a maximum retry count or timeout in the client to prevent infinite retries
- Implementing rate limiting to prevent excessive load during retry storms
- Adding retry delay recommendations via response headers (e.g., `Retry-After`)
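A sketch of the capped retry schedule the first bullet suggests. This is a hypothetical helper, not code from the PR: exponential backoff with a delay ceiling, returning `undefined` once the attempt budget is spent so callers stop retrying.

```typescript
// Hypothetical client-side backoff: exponential delays capped at maxDelayMs;
// returns undefined when maxAttempts is exhausted, signalling "give up".
function nextRetryDelayMs(
  attempt: number,
  { maxAttempts = 5, baseMs = 500, maxDelayMs = 10_000 } = {}
): number | undefined {
  if (attempt >= maxAttempts) return undefined;
  return Math.min(baseMs * 2 ** attempt, maxDelayMs);
}
```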
Let's check if rate limiting is implemented:
apps/webapp/app/v3/services/batchTriggerV2.server.ts (1)
29-30
: Definition and export of 'BatchProcessingStrategy' enum
The creation and export of the `BatchProcessingStrategy` enum enhances code clarity and allows for consistent usage of processing strategies across modules.
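The enum's two values and the nullable header handling implied by `BatchProcessingStrategy.nullish()` later in this review can be sketched standalone. Names and the validation approach here are assumptions; the real code presumably uses a schema library rather than a hand-rolled check.

```typescript
// Standalone sketch of the strategy "enum" and a permissive parser for the
// optional header value: unknown or absent values fall back to undefined,
// which callers treat as the parallel default.
const BATCH_PROCESSING_STRATEGIES = ["sequential", "parallel"] as const;
type BatchProcessingStrategy = (typeof BATCH_PROCESSING_STRATEGIES)[number];

function parseStrategy(
  value: string | null | undefined
): BatchProcessingStrategy | undefined {
  return (BATCH_PROCESSING_STRATEGIES as readonly string[]).includes(value ?? "")
    ? (value as BatchProcessingStrategy)
    : undefined;
}
```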
apps/webapp/app/routes/api.v1.tasks.batch.ts (3)
12-15
: Import and utilize 'BatchProcessingStrategy'
The import of `BatchProcessingStrategy` and its integration into the headers schema effectively allows clients to specify the processing strategy.
24-26
: Extend headers schema with 'batch-processing-strategy'
The extension of the headers schema to include `"batch-processing-strategy"` using `BatchProcessingStrategy.nullish()` correctly enables optional client specification of the processing strategy.
77-77
: Include 'batchProcessingStrategy' in debug logs
Logging the `batchProcessingStrategy` enhances traceability and aids in debugging.
packages/core/src/v3/apiClient/index.ts (2)
77-77
: Add 'processingStrategy' to 'ClientBatchTriggerOptions'
The addition of `processingStrategy` to `ClientBatchTriggerOptions` enhances configurability, allowing clients to specify the desired batch processing strategy.
243-243
: Include 'processingStrategy' in request headers
Passing `processingStrategy` in the headers ensures the server can determine the batch processing strategy based on client preference.
references/v3-catalog/src/trigger/batch.ts (14)
127-137
: Update 'run' method to accept 'triggerSequentially' parameter
The addition of the optional `triggerSequentially` parameter allows for more flexible execution strategies, enabling sequential or parallel triggering based on the provided option.
165-173
: Use 'triggerSequentially' in 'batch.triggerByTask'
Incorporating `triggerSequentially` in `batch.triggerByTask` aligns with the new processing strategy options, enhancing control over task execution.
192-201
: Apply 'triggerSequentially' in 'batch.triggerAndWait'
Utilizing `triggerSequentially` in `batch.triggerAndWait` allows the batch to process items sequentially when required.
243-252
: Integrate 'triggerSequentially' in 'batch.triggerByTaskAndWait'
Passing the `triggerSequentially` option to `batch.triggerByTaskAndWait` provides consistency with other batch triggering methods and enhances execution flexibility.
Line range hint 299-330
: Test batch triggering with 'triggerSequentially'
Including `triggerSequentially` in tests ensures that the sequential processing option is thoroughly verified for correctness and performance.
388-391
: Apply 'triggerSequentially' in batch trigger with multiple items
Using `triggerSequentially` when triggering a batch with multiple items allows testing of sequential processing in larger batches.
Line range hint 416-462
: Test batch-level idempotency with 'triggerSequentially'
Incorporating `triggerSequentially` in idempotency key tests ensures that sequential processing does not affect the idempotency behavior of batches.
479-493
: Pass 'triggerSequentially' when testing individual run idempotency
Including `triggerSequentially` in tests for individual run idempotency keys validates that sequential processing is compatible with idempotent runs.
505-513
: Verify cached runs with 'triggerSequentially' option
Testing cached runs with `triggerSequentially` ensures that the sequential strategy correctly recognizes and handles cached runs.
🧰 Tools
🪛 Gitleaks (8.21.2)
507-507: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.
(generic-api-key)
508-508: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.
(generic-api-key)
534-542
: Assess idempotency key expiry with 'triggerSequentially'
Testing the expiry of idempotency keys while using `triggerSequentially` confirms that sequential processing adheres to idempotency key time-to-live constraints.
🧰 Tools
🪛 Gitleaks (8.21.2)
536-536: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.
(generic-api-key)
537-537: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.
(generic-api-key)
561-566
: Evaluate 'batchTriggerAndWait' with 'triggerSequentially'
Using `triggerSequentially` in `batchTriggerAndWait` tests ensures that waiting for batch completion functions correctly in sequential processing mode.
599-602
: Test large batch with 'triggerSequentially' in 'batchTriggerAndWait'
Processing a larger batch sequentially in `batchTriggerAndWait` helps assess performance and correctness with more significant workloads.
611-617
: Use 'triggerSequentially' in 'tasks.batchTrigger' tests
Including `triggerSequentially` in `tasks.batchTrigger` tests ensures consistent behavior across different batch triggering methods.
641-644
: Assess performance with 'triggerSequentially' and many items
Testing `tasks.batchTrigger` with `triggerSequentially` and many items evaluates the impact on performance and helps identify potential bottlenecks.
packages/core/src/v3/types/tasks.ts (2)
785-797
: LGTM! Well-documented type definitions for batch processing options.
The new `triggerSequentially` option is:
- Well-documented with clear JSDoc comments
- Properly typed as an optional boolean
- Includes clear explanation of performance implications
Also applies to: 799-810
595-597
: LGTM! Type-safe method signature update.
The `batchTriggerAndWait` method signature has been correctly updated to include the optional `BatchTriggerAndWaitOptions` parameter.
apps/webapp/app/services/worker.server.ts (1)
736-738
: Verify error handling behavior in sequential processing mode.
While the implementation correctly passes the processing strategy, we should verify how errors are handled when processing items sequentially:
- If an item fails, does it stop processing subsequent items?
- How are partial failures reported back to the caller?
✅ Verification successful
Error handling in sequential mode is properly implemented
Based on the code analysis, the error handling behavior in sequential processing mode is well-implemented:
- When an item fails during processing:
- The error is logged with detailed context
- Processing stops at the failed item (returns with ERROR status)
- The batch is requeued starting from the failed item with the same batch size
- The attempt count is tracked to prevent infinite retries (maxAttempts: 5)
- Partial failures are handled by:
- Maintaining a workingIndex to track progress
- Returning detailed error information including the failed item's position
- Preserving the sequential order by requeueing from the exact failure point
- Logging comprehensive error details for monitoring
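The verified behavior can be condensed into a standalone sketch. The name `processSequentially` and the item/processor types are hypothetical, modeled on the findings above rather than copied from the service.

```typescript
type SeqResult =
  | { status: "COMPLETE" }
  | { status: "ERROR"; workingIndex: number };

// Process items one at a time; on the first failure, stop and report the
// failing index so the batch can be requeued from that exact position,
// preserving the sequential order.
function processSequentially(
  items: string[],
  processItem: (item: string) => boolean,
  startIndex = 0
): SeqResult {
  let workingIndex = startIndex;
  for (; workingIndex < items.length; workingIndex++) {
    if (!processItem(items[workingIndex])) {
      return { status: "ERROR", workingIndex };
    }
  }
  return { status: "COMPLETE" };
}
```

A requeue then simply calls the function again with `startIndex` set to the reported `workingIndex`, with an external attempt counter providing the maxAttempts cap described above.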
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for error handling patterns in BatchTriggerV2Service
ast-grep --pattern 'class BatchTriggerV2Service {
$$$
processBatchTaskRun($$$) {
$$$
try {
$$$
} catch {
$$$
}
$$$
}
$$$
}'
Length of output: 165
Script:
#!/bin/bash
# Let's try a broader search for BatchTriggerV2Service implementation
rg -A 20 "class BatchTriggerV2Service"
Length of output: 1828
Script:
#!/bin/bash
# Search for error handling in processBatchTaskRun method
rg -A 20 "processBatchTaskRun"
Length of output: 24213
Script:
#!/bin/bash
# Look for any error handling patterns in files containing BatchTriggerV2Service
rg "BatchTriggerV2Service" --files-with-matches | xargs rg -A 5 "catch|throw|error"
Length of output: 13288
- @trigger.dev/build
- trigger.dev
- @trigger.dev/react-hooks
- @trigger.dev/rsc
- @trigger.dev/sdk
- @trigger.dev/core
This PR adds an option to the batch trigger endpoints and SDK methods to make sure batched items are triggered sequentially, which is the way the deprecated batch trigger works:
This option is available to all the batch trigger SDK methods.
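As a rough sketch of the client-side plumbing this adds: per the review, the SDK option maps onto the `batch-processing-strategy` request header read by the batch endpoint. The helper below is hypothetical; the real SDK builds these headers inside its API client.

```typescript
type BatchTriggerOptions = { triggerSequentially?: boolean };

// Hypothetical sketch: the SDK-facing boolean becomes the
// "batch-processing-strategy" header; omitting the option keeps the
// server's faster parallel default.
function batchTriggerHeaders(
  options?: BatchTriggerOptions
): Record<string, string> {
  return options?.triggerSequentially
    ? { "batch-processing-strategy": "sequential" }
    : {};
}
```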
This PR also fixes an issue where 404 responses from `runs.retrieve` were not retried. This affected the new batch endpoint, which creates runs eventually rather than transactionally.

Summary by CodeRabbit
Release Notes

New Features
- Option to trigger batched items sequentially.
- `shouldRetryNotFound` property for improved error handling in resource retrieval.

Improvements

Bug Fixes

Documentation