add clean_file_from_url #165
base: main
Conversation
```rust
.await?;

loop {
    let bytes = reader.read(&mut read_buf).await?;
```
But I'm not sure how the RetryMiddleware works with stream reading. If an error occurs within this read(), does the RetryMiddleware retry just this read, or the entire GET? If it retries the entire GET, what does the next successful read return? If it returns bytes from the beginning of the file, that's problematic.
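For context, here's a minimal sketch of how a reqwest-retry-style middleware stack is usually wired up (an assumption; our RetryMiddleware may be implemented differently). In that model the retry policy wraps the request dispatch, so once a response has been handed back, errors from polling the body stream pass through without being retried:

```rust
// Sketch only: assumes a reqwest-retry-style stack, which may differ
// from this crate's actual RetryMiddleware.
use reqwest_middleware::ClientBuilder;
use reqwest_retry::{policies::ExponentialBackoff, RetryTransientMiddleware};

fn build_client() -> reqwest_middleware::ClientWithMiddleware {
    let retry_policy = ExponentialBackoff::builder().build_with_max_retries(3);
    ClientBuilder::new(reqwest::Client::new())
        // Retries apply while sending the request; once a response is
        // returned, body-stream errors are not retried by this layer.
        .with(RetryTransientMiddleware::new_with_policy(retry_policy))
        .build()
}
```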
I believe at this point it will not retry. I also think we should not retry naively at that moment, since we may have already submitted some content to the cleaner. An error from the stream (e.g. on the highlighted line) should behave the same as an error from a file read, which I think we just bubble up in this case.
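A minimal sketch of that bubbling behavior (the `pump` function and `on_chunk` callback are placeholder names, not the actual API): the `?` on each read propagates a mid-stream failure up the call stack, exactly as a failed file read would:

```rust
// Sketch: `reader` is any AsyncRead over the response body; `on_chunk`
// stands in for handing data to the cleaner (hypothetical names).
use tokio::io::{AsyncRead, AsyncReadExt};

async fn pump<R: AsyncRead + Unpin>(
    reader: &mut R,
    mut on_chunk: impl FnMut(&[u8]) -> anyhow::Result<()>,
) -> anyhow::Result<()> {
    let mut read_buf = vec![0u8; 64 * 1024];
    loop {
        // A mid-stream error here is bubbled up via `?`, not retried,
        // because earlier chunks may already have been consumed.
        let bytes = reader.read(&mut read_buf).await?;
        if bytes == 0 {
            break; // EOF
        }
        on_chunk(&read_buf[..bytes])?;
    }
    Ok(())
}
```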
Let's test to confirm how RetryMiddleware behaves when an error occurs while reading from the stream (the error may be handled internally by the retry layer and not propagated to this level).
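One way to pin down the expected propagation at the read-loop level is a mock reader that fails mid-stream. This sketch uses tokio-test's mock I/O as a stand-in for the HTTP byte stream (an assumption about the test harness; it does not exercise the middleware itself):

```rust
// Hedged test sketch: confirms a mid-stream error surfaces to the caller
// rather than being silently swallowed.
use std::io;
use tokio::io::AsyncReadExt;

#[tokio::test]
async fn read_error_is_bubbled_up() {
    let mut reader = tokio_test::io::Builder::new()
        .read(b"first chunk")
        .read_error(io::Error::new(
            io::ErrorKind::ConnectionReset,
            "mid-stream failure",
        ))
        .build();

    let mut buf = vec![0u8; 1024];
    // First read succeeds and yields the initial chunk.
    let n = reader.read(&mut buf).await.unwrap();
    assert_eq!(&buf[..n], b"first chunk");
    // Second read must surface the error instead of being retried away.
    let err = reader.read(&mut buf).await.unwrap_err();
    assert_eq!(err.kind(), io::ErrorKind::ConnectionReset);
}
```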
Adds a utility function to clean a file given a URL used to GET the file, e.g. an S3 presigned URL.
The contents are then read incrementally and passed into the cleaner handle to chunk, dedup, and upload.
Intended for potential use in streaming files to migrate them.
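Roughly, the shape is as follows (sketch only; `CleanerHandle`, `add_data`, and `finish` are placeholder names, not the crate's real interface):

```rust
// Hypothetical sketch of the utility's shape under assumed names.
use anyhow::Result;

trait CleanerHandle {
    async fn add_data(&mut self, data: &[u8]) -> Result<()>;
    async fn finish(&mut self) -> Result<()>;
}

async fn clean_file_from_url<C: CleanerHandle>(
    client: &reqwest::Client,
    url: &str, // e.g. an S3 presigned URL
    cleaner: &mut C,
) -> Result<()> {
    let mut response = client.get(url).send().await?.error_for_status()?;
    // Stream the body incrementally rather than buffering the whole file;
    // each chunk goes straight into the cleaner for chunking/dedup/upload.
    while let Some(chunk) = response.chunk().await? {
        cleaner.add_data(&chunk).await?;
    }
    cleaner.finish().await
}
```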