We are having issues uploading a large file to Azure using HTTPAPIR4 (I am going to test with GETURI as well to see if it makes any difference).
A suggestion from Azure was to split the large file into multiple smaller files using tar or similar. The sample command they provided was:
tar -cvf - backup.file | split -b 1G - backup.part.tar.
They also suggested taking an MD5 hash of the file to verify its integrity after the upload.
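For the hash, I am assuming something along these lines would do (md5sum is my assumption here; Azure may instead want the MD5 supplied as a base64 Content-MD5 header on the request):

md5sum backup.file > backup.file.md5   # record the hash before upload
md5sum -c backup.file.md5              # re-check the file against it later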
I am sure there is an equivalent command to join the files back up, but that is not the reason for this email.
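Presumably it would be something like the command below, although I have not tested it. split names the parts backup.part.tar.aa, backup.part.tar.ab and so on, so a shell glob picks them up in order:

cat backup.part.tar.* | tar -xvf -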
Just curious if anyone has used this approach, how you achieved it, and how successful it has been?
Thanks in advance for any suggestions.
Cheers
Don
Don Brown
Senior Consultant
P: 1300 088 400
Brisbane - Sydney - Melbourne