This approach (ab)uses the fact that trashed items can have naming
conflicts and that their parents can be changed, even though directly
replacing ("moving") them is forbidden.
This means the feature can be disabled by setting the time to 0.
This also logs the HTTP status for analysis purposes.
Thanks to Felix Bünemann for extensive testing and data collection.
When rclone is busy doing lots of very long uploads, it doesn't refresh
the token. Amazon will fail uploads that finish when the token is
more than 1 hour past expiry.
Fix this by keeping track of the number of uploads and refreshing the
token when the token expires if there is an upload in progress.
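A minimal sketch of that bookkeeping, assuming hypothetical names (tokenRenewer, StartUpload, StopUpload and the refresh callback are illustrative, not rclone's actual oauthutil code):

```go
package acdtoken

import (
	"log"
	"sync"
	"time"
)

// tokenRenewer keeps an OAuth token fresh while uploads are in flight.
// The names and structure are illustrative, not rclone's actual code.
type tokenRenewer struct {
	mu      sync.Mutex
	uploads int                       // uploads currently in progress
	expiry  time.Time                 // expiry of the current token
	refresh func() (time.Time, error) // assumed callback: fetches a new token, returns its expiry
}

// StartUpload records that an upload has begun.
func (r *tokenRenewer) StartUpload() {
	r.mu.Lock()
	r.uploads++
	r.mu.Unlock()
}

// StopUpload records that an upload has finished.
func (r *tokenRenewer) StopUpload() {
	r.mu.Lock()
	r.uploads--
	r.mu.Unlock()
}

// Run polls at the given interval and refreshes the token once it has
// expired, but only while at least one upload is still in progress.
func (r *tokenRenewer) Run(interval time.Duration) {
	for range time.Tick(interval) {
		r.mu.Lock()
		stale := r.uploads > 0 && time.Now().After(r.expiry)
		r.mu.Unlock()
		if !stale {
			continue
		}
		newExpiry, err := r.refresh()
		if err != nil {
			log.Println("token refresh failed:", err)
			continue
		}
		r.mu.Lock()
		r.expiry = newExpiry
		r.mu.Unlock()
	}
}
```

Keeping the upload counter and expiry under one mutex means a refresh is only attempted while an upload is actually running, which is the behaviour described above.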
Amazon Drive sometimes returns errors at the end of large uploads:
* 408 REQUEST_TIMEOUT
* 504 GATEWAY_TIMEOUT
* 500 Internal error
The file may nevertheless have been uploaded correctly, so on error wait
for up to 2 minutes for it to appear if it was fully
uploaded (configure the timeout with --acd-upload-wait-time).
Issues: #601 #605 #606
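The workaround above could look roughly like this; findObject, errNotFound and the 5-second poll interval are assumptions for illustration, not rclone's actual code:

```go
package acdupload

import (
	"errors"
	"fmt"
	"time"
)

// errNotFound is what the assumed findObject helper returns while the file
// has not yet appeared on the remote.
var errNotFound = errors.New("object not found")

// waitForUpload polls the remote for up to uploadWaitTime (the value of
// --acd-upload-wait-time in the description above) after a 408/500/504
// error, in case the upload actually completed server side.
func waitForUpload(findObject func(name string) error, name string, uploadWaitTime time.Duration) error {
	deadline := time.Now().Add(uploadWaitTime)
	for time.Now().Before(deadline) {
		err := findObject(name)
		if err == nil {
			return nil // the file appeared, so the upload succeeded after all
		}
		if !errors.Is(err, errNotFound) {
			return err // a real error, give up
		}
		time.Sleep(5 * time.Second) // not visible yet, poll again
	}
	return fmt.Errorf("%q did not appear within %v after upload error", name, uploadWaitTime)
}
```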
If remote:path points to a file, make NewFs return a sentinel error
fs.ErrorIsFile and an Fs which points to the parent.
Use this to remove the LimitedFs and just add this file to the
--files-from list.
This means that server-side operations can also be used.
Fixes #518 Fixes #545
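A hedged sketch of the caller-side pattern; NewFs, fs.ErrorIsFile and --files-from are the names from the description above, but the types and helpers below are illustrative stand-ins rather than rclone's real API:

```go
package fssketch

import (
	"errors"
	"path"
)

// ErrorIsFile mirrors the sentinel described above: the remote path named a
// file, and the returned Fs points at its parent directory.
var ErrorIsFile = errors.New("is a file not a directory")

// Fs is a minimal stand-in for rclone's Fs interface.
type Fs interface {
	Root() string
}

// resolve shows the caller-side pattern: if newFs reports ErrorIsFile, keep
// the parent Fs it returned and remember the single file as a filter entry
// (the equivalent of adding it to the --files-from list).
func resolve(newFs func(remote string) (Fs, error), remote string) (f Fs, filesFrom []string, err error) {
	f, err = newFs(remote)
	if errors.Is(err, ErrorIsFile) {
		// remote named a file: f is its parent, so limit operations to that one file
		filesFrom = append(filesFrom, path.Base(remote))
		return f, filesFrom, nil
	}
	return f, nil, err
}
```

Because the returned Fs is a normal Fs for the parent directory, server-side copy and move keep working without a special LimitedFs wrapper.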
This should fix duplicate files on drive and 409 errors on
amazonclouddrive; however, it will slow down the upload slightly as
another roundtrip will be needed.
None of the other Fses needed adjusting.
Fixes #483
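Roughly, the extra roundtrip amounts to looking the object up before creating it; lookup, update and create below are hypothetical helpers used only to illustrate the pattern:

```go
package dedup

import "errors"

// errNotFound is what the assumed lookup returns when no object exists yet.
var errNotFound = errors.New("object not found")

// putWithoutDuplicates is the extra-roundtrip pattern: check whether the
// object already exists and overwrite it, instead of blindly creating a new
// one (which makes a duplicate on drive or a 409 on amazonclouddrive).
func putWithoutDuplicates(
	lookup func(name string) (id string, err error), // assumed: find an existing object
	update func(id string, data []byte) error,       // assumed: overwrite an existing object
	create func(name string, data []byte) error,     // assumed: create a new object
	name string, data []byte,
) error {
	id, err := lookup(name) // the extra roundtrip
	if err == nil {
		return update(id, data)
	}
	if errors.Is(err, errNotFound) {
		return create(name, data)
	}
	return err
}
```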
Sometimes ACD gives this error on reauthentication
HTTP code 403: "403 Forbidden", response body: {"message":"Authorization header requires 'Credential' parameter. Authorization header requires 'Signature' parameter. Authorization header requires 'SignedHeaders' parameter. Authorization header requires existence of either a 'X-Amz-Date' or a 'Date' header. Authorization=Bearer"}
This code retries this error if it is received.
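A sketch of that retry check; shouldRetry, retryErrorCodes and the plain string match are illustrative assumptions, not the exact rclone code:

```go
package acdretry

import (
	"net/http"
	"strings"
)

// retryErrorCodes are statuses that are always worth retrying.
var retryErrorCodes = []int{408, 429, 500, 502, 503, 504}

// shouldRetry reports whether a response should be retried. The special case
// is the spurious 403 whose body contains "Authorization header requires ..."
// that ACD sometimes returns on reauthentication.
func shouldRetry(resp *http.Response, body string) bool {
	if resp == nil {
		return false
	}
	if resp.StatusCode == http.StatusForbidden &&
		strings.Contains(body, "Authorization header requires") {
		return true // bogus 403 seen during reauthentication, retry it
	}
	for _, code := range retryErrorCodes {
		if resp.StatusCode == code {
			return true
		}
	}
	return false
}
```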
Before this change, rclone would retry only the page that was missing
from the directory listing. However, it turns out that, on 429 errors
at least, that page is gone from the directory listing, which results
in missing files in the list. The workaround for this is to restart
the directory listing on any retryable errors.
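Sketched with assumed helpers (listPage and isRetryable are illustrative), the workaround restarts the whole listing rather than retrying a single page:

```go
package listsketch

import "fmt"

// listAll restarts the whole directory listing whenever a retryable error
// (such as a 429) occurs, because the failed page may simply be missing on
// retry, which would silently drop files from the result.
func listAll(
	listPage func(pageToken string) (entries []string, nextToken string, err error), // assumed paging helper
	isRetryable func(error) bool, // assumed error classifier
	maxTries int,
) ([]string, error) {
	var lastErr error
	for try := 0; try < maxTries; try++ {
		var all []string // partial results are discarded on every restart
		token := ""
		for {
			entries, next, err := listPage(token)
			if err != nil {
				if isRetryable(err) {
					lastErr = err
					break // restart the listing from the beginning
				}
				return nil, err
			}
			all = append(all, entries...)
			if next == "" {
				return all, nil // listing finished successfully
			}
			token = next
		}
	}
	return nil, fmt.Errorf("listing failed after %d tries: %w", maxTries, lastErr)
}
```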
Gives more accurate error propagation, control of the depth of recursion,
and short-circuits recursion where possible.
Most of the heavy lifting is done in the "fs" package, making file
system implementations a bit simpler.
This commit contains some code originally by Klaus Post.
Fixes #316
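A simplified sketch of the shape of that shared walking code; Dir, walk and the callback signature are illustrative, not the actual "fs" package API:

```go
package walksketch

import "path"

// Dir is a minimal stand-in for a remote directory listing.
type Dir struct {
	Files []string
	Dirs  map[string]*Dir
}

// walk does the recursion once, in one place, so that backends only need to
// provide a flat listing: errors propagate immediately, maxDepth limits the
// recursion, and returning false from fn short-circuits descent into a branch.
func walk(dir *Dir, name string, maxDepth int, fn func(dir string, files []string) (descend bool, err error)) error {
	descend, err := fn(name, dir.Files)
	if err != nil {
		return err // propagate the first error to the caller
	}
	if !descend || maxDepth == 0 {
		return nil // short-circuit: the caller doesn't need this subtree
	}
	for sub, d := range dir.Dirs {
		if err := walk(d, path.Join(name, sub), maxDepth-1, fn); err != nil {
			return err
		}
	}
	return nil
}
```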
This will allow setting up a remote on a headless machine by copy and pasting values, including copy and pasting a token into the configuration.
Token generation requires rclone to be on a machine with a proper browser. Custom client IDs and secrets are supported.
To test token generation, use `rclone auth "fs type"`.