Chunk size for the number of rows to query per SELECT was far too low for huge tables like egw_history_log, which can contain a couple of million rows; the backup took far too long.

This commit is contained in:
Ralf Becker 2012-07-03 07:06:29 +00:00
parent 77d39f184e
commit 65a8e891b3


@@ -747,7 +747,7 @@ class db_backup
 	/**
 	 * Number of rows to select per chunk, to not run into memory limit on huge tables
 	 */
-	const ROW_CHUNK = 100;
+	const ROW_CHUNK = 10000;
 	/**
 	 * Backup all data in the form of a (compressed) csv file
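The trade-off behind this constant: each chunk is one round-trip to the database, so a chunk of 100 rows means tens of thousands of queries for a multi-million-row table, while a chunk that is too large risks hitting the PHP memory limit. A minimal sketch of the chunked-fetch pattern in Python with sqlite3 (not the actual PHP db_backup code; table and column names here are illustrative):

```python
import sqlite3

ROW_CHUNK = 10000  # rows per SELECT; larger chunks mean fewer round-trips


def backup_rows(conn, table):
    """Yield every row of `table` in chunks of ROW_CHUNK, so the whole
    result set never has to sit in memory at once."""
    offset = 0
    while True:
        rows = conn.execute(
            # table name is interpolated here only because this demo controls it
            f"SELECT * FROM {table} LIMIT ? OFFSET ?",
            (ROW_CHUNK, offset),
        ).fetchall()
        if not rows:
            break
        yield from rows
        offset += len(rows)


# demo: a table larger than two chunks, streamed without loading it whole
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE egw_history_log (id INTEGER)")
conn.executemany("INSERT INTO egw_history_log VALUES (?)",
                 [(i,) for i in range(25000)])
conn.commit()
total = sum(1 for _ in backup_rows(conn, "egw_history_log"))
```

With ROW_CHUNK at 10000 the 25,000-row demo table needs only three SELECTs instead of the 250 that a chunk of 100 would require.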