Uploading file in chunks using range header #3
Comments
Hi @triplef, I'm afraid this might not be in the […]. Also, I'll move this ticket under the new […].
Thanks for your feedback! I'm not sure there's a requirement regarding reusing the connection; I'm guessing it should work either way, but I'd probably try to reuse it first. Do you happen to know of any other project that might be able to do this?
I mostly work with AWS and use their SDK when I need something plug-and-play. So I made a quick search and found that Azure offers something similar in Ruby: https://github.com/azure/azure-storage-ruby. It would make sense to use that instead of re-implementing it using Faraday. What do you think?
Yeah, that makes a lot of sense and I looked into this; unfortunately that SDK doesn't seem to have the necessary initializers to work with an externally provided URL for our use case (see Azure/azure-storage-ruby#211). I'll see if I can get more info there or find some other way. Thanks again!
I may be wrong, but I suspect they're simply lacking documentation around it. Looking at the […].
I forgot the link to the reference, obviously 🤦: https://github.com/Azure/azure-storage-ruby/blob/master/blob/lib/azure/storage/blob/blob_service.rb#L56
I still don't see a way to give it a full URI (the URI seems to always be built from individual components, see […]). So I ended up with the following:

```ruby
filePart = Faraday::FilePart.new(filePath, 'application/octet-stream')

Faraday.new(headers: {
  'x-ms-blob-type': 'BlockBlob',
  'x-ms-version': '2021-02-12'
}) { |c|
  c.response :raise_error
  c.adapter Faraday.default_adapter
}.put(fileUploadUrl, filePart, { 'Content-Length': filePart.size.to_s })
```

Thanks again for your help!
Thanks for coming back and sharing your solution! I'm sure that will help others in future ❤️
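A note for anyone who lands here needing the chunked variant of the snippet above: for block blobs, Azure's REST API splits large uploads into Put Block requests (the `comp=block&blockid=…` query parameters) followed by a Put Block List commit (`comp=blocklist`). The sketch below shows how that could look with the same Faraday setup; it is untested, and `file_path`, `file_upload_url`, the 4 MiB chunk size, and the zero-padded block IDs are assumptions rather than anything from this thread.

```ruby
require 'faraday'
require 'base64'

CHUNK_SIZE = 4 * 1024 * 1024 # 4 MiB per block, well under Azure's per-block limit

conn = Faraday.new(headers: { 'x-ms-version': '2021-02-12' }) { |c|
  c.response :raise_error
  c.adapter Faraday.default_adapter
}

block_ids = []
File.open(file_path, 'rb') do |file|
  index = 0
  while (chunk = file.read(CHUNK_SIZE))
    # Block IDs must be Base64-encoded and of equal length within a blob.
    block_id = Base64.strict_encode64(format('%06d', index))
    # Put Block: comp/blockid are merged into the SAS URL's existing query string.
    conn.put(file_upload_url, chunk, { 'Content-Length': chunk.bytesize.to_s }) do |req|
      req.params['comp']    = 'block'
      req.params['blockid'] = block_id
    end
    block_ids << block_id
    index += 1
  end
end

# Put Block List: commit the uploaded blocks so they become the blob's content.
block_list_xml =
  '<?xml version="1.0" encoding="utf-8"?><BlockList>' +
  block_ids.map { |id| "<Latest>#{id}</Latest>" }.join +
  '</BlockList>'

conn.put(file_upload_url, block_list_xml) do |req|
  req.params['comp'] = 'blocklist'
end
```

Whether this is worth the extra requests over a single PUT depends on the file sizes involved and on the SAS URL granting the needed write permissions.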
Basic Info
Issue description
I'm trying to use Faraday to upload a file to Azure Blob Storage, which has a limit (of 268435456 bytes) on the body of each request. Larger requests must be sent in chunks using a `Range` or `x-ms-range` header. Is there a way to do that with Faraday using multipart / `FilePart` or otherwise?

I looked at the implementation of `FilePart`, which seems to support passing an `IO` object instead of a file path, but I'm not sure if there's a way to limit such an object to just a subsection of a file?

Here is the code I'm currently using, which works fine for files below that limit:

[…]
Any help would be appreciated!
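On the question above about limiting an `IO` to a subsection of a file: one straightforward option (a sketch, not something from this thread) is to read only the desired byte range from disk and wrap it in a `StringIO`, which can then be passed wherever an `IO` is accepted. `path`, `offset`, and `length` are placeholder names.

```ruby
require 'stringio'

# Return an in-memory IO containing only bytes [offset, offset + length)
# of the file at `path`; File.binread reads just that slice from disk.
def file_section(path, offset, length)
  StringIO.new(File.binread(path, length, offset))
end

# Example: wrap the second 4 MiB of a file.
# chunk_io = file_section('large.bin', 4 * 1024 * 1024, 4 * 1024 * 1024)
```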