From e9a3ecc761e23fbbb0ec0f059776496ff833a881 Mon Sep 17 00:00:00 2001
From: lc63 <123966558+coffreo-lcabello@users.noreply.github.com>
Date: Fri, 21 Jul 2023 17:19:55 +0200
Subject: [PATCH] Updated Big syncs with millions of files (markdown)

---
 Big-syncs-with-millions-of-files.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/Big-syncs-with-millions-of-files.md b/Big-syncs-with-millions-of-files.md
index 19bc4c8..3f5f1be 100644
--- a/Big-syncs-with-millions-of-files.md
+++ b/Big-syncs-with-millions-of-files.md
@@ -2,7 +2,7 @@
 Rclone syncs on a directory by directory basis. If you have 10,000,000 directories with 1,000 files in and it will sync fine, but if you have a directory with 100,000,000 files in you will a lot of RAM to process it.
 
-The log is then filled by:
+Until the OOM killer kills the process, the log is filled with:
 ```
 2023/07/06 15:30:35 INFO : Transferred: 0 B / 0 B, -, 0 B/s, ETA -