Update burst compute workflow to prepare it for node18 backend #1
base: master
Conversation
…y to combine or combineerror
…isfy the input size constraint of 256K
…, lastBatchId, datasetStartIndex and datasetEndIndex
@@ -0,0 +1,2 @@
MAX_PARALLELISM=5000
MAX_BATCHED_JOBS_ITERATIONS=2
At this point both values are the same as the defaults, so this could be removed. I used it to test multiple values and find the best settings.
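As a rough illustration of how these two settings could be consumed in a Node 18 handler, here is a minimal sketch that falls back to the defaults mentioned above; the variable names mirror the .env file, everything else is an assumption rather than the PR's actual code.

    // Minimal sketch (not the PR's code): read the tuning knobs from the
    // environment, falling back to the defaults referenced in the comment above.
    const MAX_PARALLELISM = Number(process.env.MAX_PARALLELISM || 5000);
    const MAX_BATCHED_JOBS_ITERATIONS = Number(process.env.MAX_BATCHED_JOBS_ITERATIONS || 2);

    // One plausible (assumed) way the two limits could combine to cap fan-out:
    const batchCount = (totalJobs, batchSize) =>
      Math.min(Math.ceil(totalJobs / batchSize), MAX_PARALLELISM * MAX_BATCHED_JOBS_ITERATIONS);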
handler: src/main/nodejs/monitor.monitorHandler
memorySize: 128
timeout: ${self:custom.defaultJobTimeoutSecs}
environment:
  DEBUG: ${self:custom.debug}
  JOB_TIMEOUT_SECS: ${self:custom.defaultJobTimeoutSecs}
  TASKS_TABLE_NAME: ${self:custom.tasksTable}
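For context, a monitor handler wired to the environment above might look roughly like the following Node 18 / AWS SDK v3 sketch; the table key schema, attribute names, and the completion/timeout logic are assumptions for illustration, not the actual monitor.js.

    import { DynamoDBClient, GetItemCommand } from '@aws-sdk/client-dynamodb';

    const dynamo = new DynamoDBClient({});

    // Sketch only: check whether a job has finished or exceeded JOB_TIMEOUT_SECS.
    export const monitorHandler = async (event) => {
      const { TASKS_TABLE_NAME, JOB_TIMEOUT_SECS } = process.env;
      const res = await dynamo.send(new GetItemCommand({
        TableName: TASKS_TABLE_NAME,
        Key: { jobId: { S: event.jobId } },                        // assumed key schema
      }));
      const task = res.Item ?? {};
      const startTime = Number(task.startTime?.N ?? Date.now());  // assumed attribute
      const completed = Number(task.completedBatches?.N ?? 0)     // assumed attributes
        >= Number(task.totalBatches?.N ?? Number.MAX_SAFE_INTEGER);
      return {
        ...event,
        completed,
        timedOut: (Date.now() - startTime) / 1000 > Number(JOB_TIMEOUT_SECS),
      };
    };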
stepFunctions:
This is the biggest and most important change.
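To make the shape of that change concrete, the new stepFunctions section presumably defines a state machine that fans the work out with a Map state. The sketch below shows that pattern as an ASL definition written as a JavaScript object; the actual PR defines it in serverless.yml, and every state name, field, and ARN here is an assumed placeholder.

    // Sketch of a Map-based distribution workflow, not the PR's definition.
    const burstComputeDefinition = {
      StartAt: 'PrepareBatches',
      States: {
        PrepareBatches: {                     // splits the dataset into batches
          Type: 'Task',
          Resource: 'arn:aws:lambda:...:prepare-batches',   // placeholder ARN
          Next: 'DistributeBatches',
        },
        DistributeBatches: {                  // runs one worker invocation per batch
          Type: 'Map',
          ItemsPath: '$.batches',
          MaxConcurrency: 5000,               // e.g. driven by MAX_PARALLELISM
          Iterator: {
            StartAt: 'RunBatch',
            States: {
              RunBatch: { Type: 'Task', Resource: 'arn:aws:lambda:...:worker', End: true },
            },
          },
          Next: 'Monitor',
        },
        Monitor: {                            // waits for results and combines them
          Type: 'Task',
          Resource: 'arn:aws:lambda:...:monitor',
          End: true,
        },
      },
    };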
environment:
  DEBUG: ${self:custom.debug}

cleanupBatch:
I tried an automatic time-to-live for the batch inputs, but those files were never cleaned up, so I added an explicit cleanup task.
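Assuming the batch input files live under an S3 prefix keyed by job, an explicit cleanup handler could look roughly like this sketch; the bucket name, prefix layout, and event shape are assumptions, not the PR's code.

    import { S3Client, ListObjectsV2Command, DeleteObjectsCommand } from '@aws-sdk/client-s3';

    const s3 = new S3Client({});

    // Sketch only: delete every batch-input object written for a given job.
    export const cleanupBatchHandler = async (event) => {
      const { bucket, jobId } = event;                     // assumed input shape
      const listed = await s3.send(new ListObjectsV2Command({
        Bucket: bucket,
        Prefix: `batch-inputs/${jobId}/`,                  // assumed prefix
      }));
      const objects = (listed.Contents ?? []).map(o => ({ Key: o.Key }));
      if (objects.length) {
        await s3.send(new DeleteObjectsCommand({
          Bucket: bucket,
          Delete: { Objects: objects },
        }));
      }
      return { ...event, deletedInputs: objects.length };
    };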
This PR includes the changes needed to migrate burst compute to Node 18 and newer AWS backends.
Node 18 requires the AWS SDK for JavaScript v3, which no longer supports async Lambda invocation, so the old framework was replaced with a more complex Step Function that distributes the work using a Map state.
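On the calling side, the old fire-and-forget Lambda invocation can be replaced by starting the new state machine. A minimal Node 18 / SDK v3 sketch follows; the environment variable name and input shape are assumptions.

    import { SFNClient, StartExecutionCommand } from '@aws-sdk/client-sfn';

    const sfn = new SFNClient({});

    // Sketch only: kick off the burst-compute workflow instead of invoking a Lambda asynchronously.
    export const startBurstCompute = async (input) => {
      const res = await sfn.send(new StartExecutionCommand({
        stateMachineArn: process.env.STATE_MACHINE_ARN,    // assumed env var
        input: JSON.stringify(input),
      }));
      return res.executionArn;
    };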