This gives more accurate error propagation and control of the depth of
recursion, and short circuits recursion where possible.
Most of the heavy lifting is done in the "fs" package, making file
system implementations a bit simpler.
This commit contains some code originally by Klaus Post.
Fixes #316
* Now removes identical copies without asking
* Now obeys `--dry-run`
* Implement `--dedupe-mode` for non-interactive running (usage example below)
* `--dedupe-mode interactive` - interactive, the default.
* `--dedupe-mode skip` - removes identical files then skips anything left.
* `--dedupe-mode first` - removes identical files then keeps the first one.
* `--dedupe-mode newest` - removes identical files then keeps the newest one.
* `--dedupe-mode oldest` - removes identical files then keeps the oldest one.
* `--dedupe-mode rename` - removes identical files then renames the rest to be different.
* Add tests which will only run on Google Drive.
See #317 for details.
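A usage sketch, assuming the subcommand is `rclone dedupe`; `drive:dupes` is a placeholder remote path and the modes are the ones listed above:

```sh
# Interactive dedupe (the default mode)
rclone dedupe drive:dupes

# Non-interactive: remove identical copies, then keep the newest of each set
rclone dedupe --dedupe-mode newest drive:dupes

# Preview what would happen without changing anything
rclone --dry-run dedupe --dedupe-mode rename drive:dupes
```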
Use `rclone config` to add/change/remove password.
Tests that load the default configuration will now fail with a better error message. Also add a switch that makes it possible to disable password prompts and fail instead.
Make it possible to use the `RCLONE_CONFIG_PASS` environment variable as the configuration password.
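A sketch of both ways of supplying the password; the password value and remote name are placeholders:

```sh
# Set, change or remove the config password interactively
rclone config

# Supply the password via the environment so no prompt is needed
export RCLONE_CONFIG_PASS=mysecretpassword
rclone lsd remote:
```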
* Make all integration tests start with an empty remote
* Add an `-individual` flag so this can be a different bucket/container/directory (see the sketch after this list)
* Fix up tests after changing the hashers
* Add sha1sum test
* Make directory checking in tests sleep more to fix acd inconsistencies
* Factor integration tests to make more maintainable
* Ensure remote writes have a fstest.CheckItems() before use
  * this works around eventual consistency in the later directory listings
* Call fs.Stats.ResetCounters() before every fs.Sync()
Note that the tests shouldn't be run concurrently as fs.Config is global state.
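A hypothetical invocation - the package path is an assumption, only the `-individual` flag comes from this change:

```sh
# Run one backend's integration tests in its own bucket/container/directory
go test -v ./drive -individual
```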
* Implement include/exclude
* Implement rsync compatible file globbing
* Implement command line filtering flags (usage example after this list)
* --delete-excluded - Delete files on dest excluded from sync
* --filter - Add a file-filtering rule
* --filter-from - Read filtering patterns from a file
* --exclude - Exclude files matching pattern
* --exclude-from - Read exclude patterns from file
* --include - Include files matching pattern
* --include-from - Read include patterns from file
* --files-from - Read list of source-file names
* --min-size - Don't transfer any file smaller than this; size is in k by default or with suffix k|M|G
* --max-size - Don't transfer any file larger than this; size is in k by default or with suffix k|M|G
* Document
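A usage sketch combining some of the flags above; the paths, patterns and `filter-rules.txt` file name are placeholders:

```sh
# Sync, skipping backup files and anything smaller than 100k, and delete
# excluded files that already exist on the destination
rclone sync --exclude "*.bak" --min-size 100k --delete-excluded \
    /home/me/photos remote:photos

# Or read the filtering rules from a file
rclone sync --filter-from filter-rules.txt /home/me/photos remote:photos
```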
* Define Mover interface to move a single object
* Define DirMover interface to move a directory
* Implement DirMove operation
* Add `rclone move` command (usage sketch below)
* Tests for Dir Move
To Do
* Implement Move, DirMover in local, drive, dropbox
* unit test for Mover
* unit test for DirMover
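A usage sketch of the new command; the paths are placeholders, and `--dry-run` can be used to preview the result:

```sh
# Move the contents of the source into the destination, using server side
# Move/DirMove where the remote implements them
rclone move /home/me/inbox remote:archive/inbox

# Preview first
rclone --dry-run move /home/me/inbox remote:archive/inbox
```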
This means that dropbox no longer stores MD5SUMs and modified times.
Fix up the tests so that blank MD5SUMs are ignored, and so that if
Precision is set to fs.ModTimeNotSupported, ModTimes can be ignored too.
This opens the door for other FSs which don't support metadata easily.