forked from extern/egroupware
Update resumable file library, fix bug breaking drag & drop from filesystem in Firefox
This commit is contained in:
parent
0ec4708ba9
commit
236d8e8f06
@@ -4,7 +4,7 @@ Resumable.js is a JavaScript library providing multiple simultaneous, stable and

The library is designed to introduce fault-tolerance into the upload of large files through HTTP. This is done by splitting each file into small chunks. Then, whenever the upload of a chunk fails, uploading is retried until the procedure completes. This allows uploads to resume automatically after a network connection is lost, either locally or to the server. Additionally, it allows users to pause, resume and even recover uploads without losing state, because only the currently uploading chunks are aborted, not the entire upload.
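The chunk-and-retry flow described above can be sketched in plain JavaScript. This is only an illustration of the idea, not the library's internals; `uploadChunk` and the flaky transport are hypothetical stand-ins:

```javascript
// Minimal sketch of the retry idea behind Resumable.js:
// split data into chunks, retry each failed chunk until it succeeds.
function splitIntoChunks(data, chunkSize) {
  const chunks = [];
  for (let i = 0; i < data.length; i += chunkSize) {
    chunks.push(data.slice(i, i + chunkSize));
  }
  return chunks;
}

function uploadAllChunks(chunks, uploadChunk, maxRetries) {
  const uploaded = [];
  for (const chunk of chunks) {
    let attempts = 0;
    // retry this chunk only; already-uploaded chunks keep their state
    while (!uploadChunk(chunk)) {
      if (++attempts >= maxRetries) throw new Error('chunk failed permanently');
    }
    uploaded.push(chunk);
  }
  return uploaded.join('');
}

// simulated flaky transport: fails every other call
let calls = 0;
const flaky = () => (calls++ % 2) === 1;
const result = uploadAllChunks(splitIntoChunks('abcdefgh', 3), flaky, 10);
```

Because failures only abort the chunk in flight, the upload as a whole makes forward progress even on an unreliable connection.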

Resumable.js does not have any external dependencies other than the `HTML5 File API`. This is relied on for the ability to chunk files into smaller pieces. Currently, this means that support is limited to Firefox 4+, Chrome 11+ and Safari 6+.
Resumable.js does not have any external dependencies other than the `HTML5 File API`. This is relied on for the ability to chunk files into smaller pieces. Currently, this means that support is widely available in Firefox 4+, Chrome 11+, Safari 6+ and Internet Explorer 10+.

Samples and examples are available in the `samples/` folder. Please push your own as Markdown to help document the project.
@@ -13,29 +13,37 @@ Samples and examples are available in the `samples/` folder. Please push your ow

A new `Resumable` object is created with information of what and where to post:

```js
var r = new Resumable({
  target:'/api/photo/redeem-upload-token',
  query:{upload_token:'my_token'}
});
// Resumable.js isn't supported, fall back on a different method
if(!r.support) location.href = '/some-old-crappy-uploader';
```

To allow files to be selected and drag-dropped, you need to assign a drop target and a DOM item to be clicked for browsing:

```js
r.assignBrowse(document.getElementById('browseButton'));
r.assignDrop(document.getElementById('dropTarget'));
```

It is recommended to use an HTML span for the browse button. Using an actual button does not work reliably across all browsers, because Resumable.js creates the file input as a child of this control, and this may be invalid in the case of an HTML button.

After this, interaction with Resumable.js is done by listening to events:

```js
r.on('fileAdded', function(file, event){
    ...
});
r.on('fileSuccess', function(file, message){
    ...
});
r.on('fileError', function(file, message){
    ...
});
```

## How do I set it up with my server?

@@ -53,7 +61,7 @@ To handle the state of upload chunks, a number of extra parameters are sent alon

You should allow for the same chunk to be uploaded more than once; this isn't standard behaviour, but in an unstable network environment it could happen, and this case is exactly what Resumable.js is designed for.
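Idempotent chunk handling on the server side can be sketched like this (a hypothetical in-memory store, not part of any shipped server implementation):

```javascript
// Hypothetical server-side sketch: an in-memory store that tolerates
// receiving the same chunk more than once (idempotent writes).
const receivedChunks = new Map(); // chunkNumber -> data

function handleChunkUpload(chunkNumber, data) {
  // Storing an already received chunk again simply overwrites it with
  // identical data, so duplicate deliveries are harmless.
  receivedChunks.set(chunkNumber, data);
  return 200; // accepted
}

handleChunkUpload(1, 'aaa');
handleChunkUpload(1, 'aaa'); // duplicate delivery: harmless
handleChunkUpload(2, 'bbb');
```

The key property is that processing a duplicate leaves the store in the same state as processing the chunk once.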

For every request, you can confirm reception in HTTP status codes (can be change through the `permanentErrors` option):
For every request, you can confirm reception in HTTP status codes (can be changed through the `permanentErrors` option):

* `200`: The chunk was accepted and correct. No need to re-upload.
* `404`, `415`, `500`, `501`: The file for which the chunk was uploaded is not supported, cancel the entire upload.
@@ -64,7 +72,7 @@ For every request, you can confirm reception in HTTP status codes (can be change
Enabling the `testChunks` option will allow uploads to be resumed after browser restarts and even across browsers (in theory you could even run the same file upload across multiple tabs or different browsers). The `POST` data requests listed are required to use Resumable.js to receive data, but you can extend support by implementing a corresponding `GET` request with the same parameters:

* If this request returns a `200` HTTP code, the chunk is assumed to have been completed.
* If the request returns anything else, the chunk will be uploaded in the standard fashion. (It is recommended to return *204 No Content* in these cases if possible to [avoid unwarrented notices in brower consoles](https://github.com/23/resumable.js/issues/160).)
* If the request returns anything else, the chunk will be uploaded in the standard fashion. (It is recommended to return *204 No Content* in these cases if possible to [avoid unwarranted notices in browser consoles](https://github.com/23/resumable.js/issues/160).)

After this is done and `testChunks` enabled, an upload can quickly catch up even after a browser restart by simply verifying already uploaded chunks that do not need to be uploaded again.
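The decision logic for the test request can be sketched as follows (`handleTestRequest` and `completedChunks` are illustrative names, not part of any shipped server):

```javascript
// Hypothetical sketch of the GET "test" handler: report 200 for chunks
// the server already has, and 204 No Content for everything else, as
// the recommendation above suggests.
const completedChunks = new Set([1, 2, 3]); // chunk numbers already stored

function handleTestRequest(chunkNumber) {
  return completedChunks.has(chunkNumber) ? 200 : 204;
}
```

With this in place, a restarted browser only re-uploads the chunks that get a non-200 answer.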
@@ -73,24 +81,43 @@ After this is done and `testChunks` enabled, an upload can quickly catch up even
### Resumable
#### Configuration

The object is loaded with a configuation hash:
The object is loaded with a configuration hash:

```js
var r = new Resumable({opt1:'val', ...});
```

All POST parameters can be omitted by setting them to a falsy value
(e.g. `null`, `false` or empty string).
Available configuration options are:

* `target` The target URL for the multipart POST request (Default: `/`)
* `target` The target URL for the multipart POST request. This can be a `string` or a `function` that allows you to construct and return a value, based on supplied `params`. (Default: `/`)
* `testTarget` The target URL for the GET request to the server for each chunk to see if it already exists. This can be a `string` or a `function` that allows you to construct and return a value, based on supplied `params`. (Default: `null`)
* `chunkSize` The size in bytes of each uploaded chunk of data. The last uploaded chunk will be at least this size and up to twice this size; see [Issue #51](https://github.com/23/resumable.js/issues/51) for details and reasons. (Default: `1*1024*1024`)
* `forceChunkSize` Force all chunks to be less than or equal to `chunkSize`. Otherwise, the last chunk will be greater than or equal to `chunkSize`. (Default: `false`)
* `simultaneousUploads` Number of simultaneous uploads (Default: `3`)
* `fileParameterName` The name of the multipart POST parameter to use for the file chunk (Default: `file`)
* `query` Extra parameters to include in the multipart POST with data. This can be an object or a function. If a function, it will be passed a ResumableFile and a ResumableChunk object (Default: `{}`)
* `headers` Extra headers to include in the multipart POST with data (Default: `{}`)
* `method` Method to use when POSTing chunks to the server (`multipart` or `octet`) (Default: `multipart`)
* `fileParameterName` The name of the multipart request parameter to use for the file chunk (Default: `file`)
* `chunkNumberParameterName` The name of the chunk index (base-1) in the current upload POST parameter to use for the file chunk (Default: `resumableChunkNumber`)
* `totalChunksParameterName` The name of the total number of chunks POST parameter to use for the file chunk (Default: `resumableTotalChunks`)
* `chunkSizeParameterName` The name of the general chunk size POST parameter to use for the file chunk (Default: `resumableChunkSize`)
* `totalSizeParameterName` The name of the total file size number POST parameter to use for the file chunk (Default: `resumableTotalSize`)
* `identifierParameterName` The name of the unique identifier POST parameter to use for the file chunk (Default: `resumableIdentifier`)
* `fileNameParameterName` The name of the original file name POST parameter to use for the file chunk (Default: `resumableFilename`)
* `relativePathParameterName` The name of the file's relative path POST parameter to use for the file chunk (Default: `resumableRelativePath`)
* `currentChunkSizeParameterName` The name of the current chunk size POST parameter to use for the file chunk (Default: `resumableCurrentChunkSize`)
* `typeParameterName` The name of the file type POST parameter to use for the file chunk (Default: `resumableType`)
* `query` Extra parameters to include in the multipart request with data. This can be an object or a function. If a function, it will be passed a ResumableFile and a ResumableChunk object (Default: `{}`)
* `testMethod` Method for chunk test request. (Default: `'GET'`)
* `uploadMethod` HTTP method to use when sending chunks to the server (`POST`, `PUT`, `PATCH`) (Default: `POST`)
* `parameterNamespace` Extra prefix added before the name of each parameter included in the multipart POST or in the test GET. (Default: `''`)
* `headers` Extra headers to include in the multipart POST with data. This can be an `object` or a `function` that allows you to construct and return a value, based on supplied `file` (Default: `{}`)
* `method` Method to use when sending chunks to the server (`multipart` or `octet`) (Default: `multipart`)
* `prioritizeFirstAndLastChunk` Prioritize first and last chunks of all files. This can be handy if you can determine if a file is valid for your service from only the first or last chunk. For example, photo or video meta data is usually located in the first part of a file, making it easy to test support from only the first chunk. (Default: `false`)
* `testChunks` Make a GET request to the server for each chunk to see if it already exists. If implemented on the server side, this will allow for upload resumes even after a browser crash or even a computer restart. (Default: `true`)
* `preprocess` Optional function to process each chunk before testing & sending. Function is passed the chunk as parameter, and should call the `preprocessFinished` method on the chunk when finished. (Default: `null`)
* `generateUniqueIdentifier` Override the function that generates unique identifiers for each file. (Default: `null`)
* `preprocessFile` Optional function to process each file before testing & sending the corresponding chunks. Function is passed the file as parameter, and should call the `preprocessFinished` method on the file when finished. (Default: `null`)
* `generateUniqueIdentifier(file, event)` Override the function that generates unique identifiers for each file. May return a [Promise](https://developer.mozilla.org/en/docs/Web/JavaScript/Reference/Global_Objects/Promise)-like object with a `then()` method for asynchronous id generation. Parameters are the ES `File` object and the event that led to adding the file. (Default: `null`)
* `maxFiles` Indicates how many files can be uploaded in a single session. Valid values are any positive integer and `undefined` for no limit. (Default: `undefined`)
* `maxFilesErrorCallback(files, errorCount)` A function which displays the *please upload n file(s) at a time* message. (Default: displays an alert box with the message *Please upload no more than n file(s) at a time.*)
* `minFileSize` The minimum allowed file size. (Default: `undefined`)
@@ -102,7 +129,8 @@ Available configuration options are:
* `maxChunkRetries` The maximum number of retries for a chunk before the upload is failed. Valid values are any positive integer and `undefined` for no limit. (Default: `undefined`)
* `chunkRetryInterval` The number of milliseconds to wait before retrying a chunk on a non-permanent error. Valid values are any positive integer and `undefined` for immediate retry. (Default: `undefined`)
* `withCredentials` Standard CORS requests do not send or set any cookies by default. In order to include cookies as part of the request, you need to set the `withCredentials` property to true. (Default: `false`)
* `xhrTimeout` The timeout in milliseconds for each request (Default: `0`)
* `setChunkTypeFromFile` Set chunk content-type from the original `file.type`. (Default: `false`; if `false`, the default Content-Type is `application/octet-stream`)
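The interplay of `chunkSize` and `forceChunkSize` described above can be illustrated with a small helper. This is a sketch; `chunkBoundaries` is not part of the library's API:

```javascript
// Compute chunk boundaries the way the options above describe:
// with forceChunkSize, every chunk is at most chunkSize bytes;
// without it, the final chunk absorbs the remainder and may be
// anywhere between chunkSize and twice chunkSize.
function chunkBoundaries(fileSize, chunkSize, forceChunkSize) {
  const count = forceChunkSize
    ? Math.ceil(fileSize / chunkSize)
    : Math.max(1, Math.floor(fileSize / chunkSize));
  const chunks = [];
  for (let i = 0; i < count; i++) {
    const start = i * chunkSize;
    const end = (i === count - 1) ? fileSize : start + chunkSize;
    chunks.push([start, end]);
  }
  return chunks;
}

chunkBoundaries(10, 4, false); // [[0,4],[4,10]] – last chunk holds 6 bytes
chunkBoundaries(10, 4, true);  // [[0,4],[4,8],[8,10]]
```

Oversizing the last chunk avoids sending a tiny trailing request; forcing uniform chunks can simplify server-side bookkeeping.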

#### Properties

* `.support` A boolean value indicating whether or not Resumable.js is supported by the current browser.
@@ -111,7 +139,7 @@ Available configuration options are:

#### Methods

* `.assignBrowse(domNodes, isDirectory)` Assign a browse action to one or more DOM nodes. Pass in `true` to allow directories to be selected (Chrome only).
* `.assignBrowse(domNodes, isDirectory)` Assign a browse action to one or more DOM nodes. Pass in `true` to allow directories to be selected (Chrome only). See the note above about using an HTML span instead of an actual button.
* `.assignDrop(domNodes)` Assign one or more DOM nodes as a drop target.
* `.on(event, callback)` Listen for an event from Resumable.js (see below)
* `.upload()` Start or resume uploading.
@@ -120,27 +148,29 @@ Available configuration options are:
* `.progress()` Returns a float between 0 and 1 indicating the current upload progress of all files.
* `.isUploading()` Returns a boolean indicating whether or not the instance is currently uploading anything.
* `.addFile(file)` Add an HTML5 File object to the list of files.
* `.addFiles(files)` Add an Array of HTML5 File objects to the list of files.
* `.removeFile(file)` Cancel upload of a specific `ResumableFile` object and remove it from the list.
* `.getFromUniqueIdentifier(uniqueIdentifier)` Look up a `ResumableFile` object by its unique identifier.
* `.getSize()` Returns the total size of the upload in bytes.
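The `.progress()` float can be thought of as a size-weighted average over all files. A sketch of that aggregation (`overallProgress` and the plain file objects are illustrative, not the library's internals):

```javascript
// Aggregate per-file progress (each 0..1) into one overall float,
// weighting each file by its size, as a .progress()-style report.
function overallProgress(files) {
  const totalSize = files.reduce((sum, f) => sum + f.size, 0);
  if (totalSize === 0) return 0;
  const done = files.reduce((sum, f) => sum + f.size * f.progress, 0);
  return done / totalSize;
}

overallProgress([
  { size: 100, progress: 1.0 },  // finished file
  { size: 300, progress: 0.5 },  // halfway
]); // → 0.625
```

Weighting by size keeps a large half-finished file from being drowned out by many small completed ones.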

#### Events

* `.fileSuccess(file)` A specific file was completed.
* `.fileSuccess(file, message)` A specific file was completed. `message` is the response body from the server.
* `.fileProgress(file)` Uploading progressed for a specific file.
* `.fileAdded(file, event)` A new file was added. Optionally, you can use the browser `event` object from when the file was added.
* `.filesAdded(array)` New files were added.
* `.filesAdded(arrayAdded, arraySkipped)` New files were added (and maybe some have been skipped).
* `.fileRetry(file)` Something went wrong during upload of a specific file, uploading is being retried.
* `.fileError(file, message)` An error occured during upload of a specific file.
* `.fileError(file, message)` An error occurred during upload of a specific file.
* `.uploadStart()` Upload has been started on the Resumable object.
* `.complete()` Uploading completed.
* `.progress()` Uploading progress.
* `.error(message, file)` An error, including fileError, occured.
* `.error(message, file)` An error, including fileError, occurred.
* `.pause()` Uploading was paused.
* `.beforeCancel()` Triggers before the items are cancelled, allowing you to do any processing on uploading files.
* `.cancel()` Uploading was canceled.
* `.chunkingStart(file)` Started preparing file for upload
* `.chunkingProgress(file, ratio)` Show progress in file preparation
* `.chunkingComplete(file) File is ready for upload
* `.chunkingComplete(file)` File is ready for upload
* `.catchAll(event, ...)` Listen to all the events listed above with the same callback function.
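A minimal sketch of this event model (a stand-in emitter, not the library's internals): the source registers handlers with lowercased names, so event matching is effectively case-insensitive, and `catchAll` handlers see every fired event:

```javascript
// Tiny stand-in for the Resumable event bus: handlers are matched
// case-insensitively, and 'catchAll' handlers receive every event
// with the event name as the first argument.
const handlers = [];
function on(event, callback) {
  handlers.push([event.toLowerCase(), callback]);
}
function fire(event, ...args) {
  for (const [name, cb] of handlers) {
    if (name === event.toLowerCase()) cb(...args);
    if (name === 'catchall') cb(event, ...args);
  }
}

const log = [];
on('fileSuccess', (file, message) => log.push('done:' + file));
on('catchAll', (event) => log.push('saw:' + event));
fire('filesuccess', 'photo.jpg', 'OK');
```

`catchAll` is handy for debugging: one listener logs the entire event stream without wiring up each event individually.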

### ResumableFile
@@ -168,5 +198,4 @@ Available configuration options are:

This library is explicitly designed for modern browsers supporting advanced HTML5 file features, and the motivation has been to provide stable and resumable support for large files (allowing uploads of several GB files through HTTP in a predictable fashion).

If your aim is just to support progress indications during upload/uploading multiple files at once, Resumable.js isn't for you. In those cases, [SWFUpload](http://swfupload.org/) and [Plupload](http://plupload.com/) provide the same features with wider browser support.

If your aim is just to support progress indications during upload/uploading multiple files at once, Resumable.js isn't for you. In those cases, something like [Plupload](http://plupload.com/) provides the same features with wider browser support.

@@ -1,7 +1,7 @@
{
  "name": "resumable.js",
  "repo": "23/resumable.js",
  "version": "1.0.0",
  "version": "1.0.3",
  "main": "resumable.js",
  "scripts": ["resumable.js"]
}
@@ -39,24 +39,42 @@
forceChunkSize:false,
simultaneousUploads:3,
fileParameterName:'file',
throttleProgressCallbacks:0.5,
chunkNumberParameterName: 'resumableChunkNumber',
chunkSizeParameterName: 'resumableChunkSize',
currentChunkSizeParameterName: 'resumableCurrentChunkSize',
totalSizeParameterName: 'resumableTotalSize',
typeParameterName: 'resumableType',
identifierParameterName: 'resumableIdentifier',
fileNameParameterName: 'resumableFilename',
relativePathParameterName: 'resumableRelativePath',
totalChunksParameterName: 'resumableTotalChunks',
throttleProgressCallbacks: 0.5,
query:{},
headers:{},
preprocess:null,
preprocessFile:null,
method:'multipart',
uploadMethod: 'POST',
testMethod: 'GET',
prioritizeFirstAndLastChunk:false,
target:'/',
testTarget: null,
parameterNamespace:'',
testChunks:true,
generateUniqueIdentifier:null,
maxChunkRetries:undefined,
getTarget:null,
maxChunkRetries:100,
chunkRetryInterval:undefined,
permanentErrors:[404, 415, 500, 501],
permanentErrors:[400, 404, 409, 415, 500, 501],
maxFiles:undefined,
withCredentials:false,
xhrTimeout:0,
clearInput:true,
chunkFormat:'blob',
setChunkTypeFromFile:false,
maxFilesErrorCallback:function (files, errorCount) {
var maxFiles = $.getOpt('maxFiles');
alert('Please upload ' + maxFiles + ' file' + (maxFiles === 1 ? '' : 's') + ' at a time.');
alert('Please upload no more than ' + maxFiles + ' file' + (maxFiles === 1 ? '' : 's') + ' at a time.');
},
minFileSize:1,
minFileSizeErrorCallback:function(file, errorCount) {
@@ -99,8 +117,8 @@

// EVENTS
// catchAll(event, ...)
// fileSuccess(file), fileProgress(file), fileAdded(file, event), fileRetry(file), fileError(file, message),
// complete(), progress(), error(message, file), pause()
// fileSuccess(file), fileProgress(file), fileAdded(file, event), filesAdded(files, filesSkipped), fileRetry(file),
// fileError(file, message), complete(), progress(), error(message, file), pause()
$.events = [];
$.on = function(event,callback){
$.events.push(event.toLowerCase(), callback);
@@ -139,10 +157,10 @@
}
}
},
generateUniqueIdentifier:function(file){
generateUniqueIdentifier:function(file, event){
var custom = $.getOpt('generateUniqueIdentifier');
if(typeof custom === 'function') {
return custom(file);
return custom(file, event);
}
var relativePath = file.webkitRelativePath||file.fileName||file.name; // Some confusion in different versions of Firefox
var size = file.size;
@@ -172,26 +190,150 @@
return (size/1024.0/1024.0/1024.0).toFixed(1) + ' GB';
}
},
getTarget:function(params){
getTarget:function(request, params){
var target = $.getOpt('target');
if(target.indexOf('?') < 0) {
target += '?';
} else {
target += '&';

if (request === 'test' && $.getOpt('testTarget')) {
target = $.getOpt('testTarget') === '/' ? $.getOpt('target') : $.getOpt('testTarget');
}
return target + params.join('&');

if (typeof target === 'function') {
return target(params);
}

var separator = target.indexOf('?') < 0 ? '?' : '&';
var joinedParams = params.join('&');

return target + separator + joinedParams;
}
};

var onDrop = function(event){
$h.stopEvent(event);
appendFilesFromFileList(event.dataTransfer.files, event);

// handle dropped things as items if we can (this lets us deal with folders nicer in some cases)
if (event.dataTransfer && event.dataTransfer.items) {
loadFiles(event.dataTransfer.items, event);
}
// else handle them as files
else if (event.dataTransfer && event.dataTransfer.files) {
loadFiles(event.dataTransfer.files, event);
}
};
var onDragOver = function(e) {
var preventDefault = function(e) {
e.preventDefault();
};

// INTERNAL METHODS (both handy and responsible for the heavy load)
/**
 * processes a single upload item (file or directory)
 * @param {Object} item item to upload, may be file or directory entry
 * @param {string} path current file path
 * @param {File[]} items list of files to append new items to
 * @param {Function} cb callback invoked when item is processed
 */
function processItem(item, path, items, cb) {
var entry;
if(item.isFile){
// file provided
return item.file(function(file){
file.relativePath = path + file.name;
items.push(file);
cb();
});
}else if(item.isDirectory){
// item is already a directory entry, just assign
entry = item;
}else if(item instanceof File) {
items.push(item);
}
if('function' === typeof item.webkitGetAsEntry){
// get entry from file object
entry = item.webkitGetAsEntry();
}
if(entry && entry.isDirectory){
// directory provided, process it
return processDirectory(entry, path + entry.name + '/', items, cb);
}
if('function' === typeof item.getAsFile){
// item represents a File object, convert it
item = item.getAsFile();
if(item instanceof File) {
item.relativePath = path + item.name;
items.push(item);
}
}
cb(); // indicate processing is done
}

/**
 * cps-style list iteration.
 * invokes all functions in list and waits for their callback to be
 * triggered.
 * @param {Function[]} items list of functions expecting callback parameter
 * @param {Function} cb callback to trigger after the last callback has been invoked
 */
function processCallbacks(items, cb){
if(!items || items.length === 0){
// empty or no list, invoke callback
return cb();
}
// invoke current function, pass the next part as continuation
items[0](function(){
processCallbacks(items.slice(1), cb);
});
}

/**
 * recursively traverse directory and collect files to upload
 * @param {Object} directory directory to process
 * @param {string} path current path
 * @param {File[]} items target list of items
 * @param {Function} cb callback invoked after traversing directory
 */
function processDirectory (directory, path, items, cb) {
var dirReader = directory.createReader();
dirReader.readEntries(function(entries){
if(!entries.length){
// empty directory, skip
return cb();
}
// process all conversion callbacks, finally invoke own one
processCallbacks(
entries.map(function(entry){
// bind all properties except for callback
return processItem.bind(null, entry, path, items);
}),
cb
);
});
}

/**
 * process items to extract files to be uploaded
 * @param {File[]} items items to process
 * @param {Event} event event that led to upload
 */
function loadFiles(items, event) {
if(!items.length){
return; // nothing to do
}
$.fire('beforeAdd');
var files = [];
processCallbacks(
Array.prototype.map.call(items, function(item){
// bind all properties except for callback
return processItem.bind(null, item, "", files);
}),
function(){
if(files.length){
// at least one file found
appendFilesFromFileList(files, event);
}
}
);
};

var appendFilesFromFileList = function(fileList, event){
// check for uploading too many files
var errorCount = 0;
@@ -205,15 +347,47 @@
return false;
}
}
var files = [];
var files = [], filesSkipped = [], remaining = fileList.length;
var decreaseReamining = function(){
if(!--remaining){
// all files processed, trigger event
if(!files.length && !filesSkipped.length){
// no succeeded files, just skip
return;
}
window.setTimeout(function(){
$.fire('filesAdded', files, filesSkipped);
},0);
}
};
$h.each(fileList, function(file){
var fileName = file.name.split('.');
var fileType = fileName[fileName.length-1].toLowerCase();
var fileName = file.name;
var fileType = file.type; // e.g. video/mp4
if(o.fileType.length > 0){
var fileTypeFound = false;
for(var index in o.fileType){
// For good behaviour we do some initial sanitizing. Remove spaces and lowercase all
o.fileType[index] = o.fileType[index].replace(/\s/g, '').toLowerCase();

if (o.fileType.length > 0 && !$h.contains(o.fileType, fileType)) {
// Allowing for both [extension, .extension, mime/type, mime/*]
var extension = ((o.fileType[index].match(/^[^.][^/]+$/)) ? '.' : '') + o.fileType[index];

if ((fileName.substr(-1 * extension.length) === extension) ||
// If MIME type, check for wildcard or if extension matches the file's filetype
(extension.indexOf('/') !== -1 && (
(extension.indexOf('*') !== -1 && fileType.substr(0, extension.indexOf('*')) === extension.substr(0, extension.indexOf('*'))) ||
fileType === extension
))
){
fileTypeFound = true;
break;
}
}
if (!fileTypeFound) {
o.fileTypeErrorCallback(file, errorCount++);
return false;
}
}

if (typeof(o.minFileSize)!=='undefined' && file.size<o.minFileSize) {
o.minFileSizeErrorCallback(file, errorCount++);
@@ -224,24 +398,46 @@
return false;
}

// directories have size == 0
if (!$.getFromUniqueIdentifier($h.generateUniqueIdentifier(file))) {(function(){
var f = new ResumableFile($, file);
window.setTimeout(function(){
function addFile(uniqueIdentifier){
if (!$.getFromUniqueIdentifier(uniqueIdentifier)) {(function(){
file.uniqueIdentifier = uniqueIdentifier;
var f = new ResumableFile($, file, uniqueIdentifier);
$.files.push(f);
files.push(f);
f.container = (typeof event != 'undefined' ? event.srcElement : null);
window.setTimeout(function(){
$.fire('fileAdded', f, event)
},0);
})()};
})()} else {
filesSkipped.push(file);
};
decreaseReamining();
}
// directories have size == 0
var uniqueIdentifier = $h.generateUniqueIdentifier(file, event);
if(uniqueIdentifier && typeof uniqueIdentifier.then === 'function'){
// Promise or Promise-like object provided as unique identifier
uniqueIdentifier
.then(
function(uniqueIdentifier){
// unique identifier generation succeeded
addFile(uniqueIdentifier);
},
function(){
// unique identifier generation failed
// skip further processing, only decrease file count
decreaseReamining();
}
);
}else{
// non-Promise provided as unique identifier, process synchronously
addFile(uniqueIdentifier);
}
});
window.setTimeout(function(){
$.fire('filesAdded', files)
},0);
};

// INTERNAL OBJECT TYPES
function ResumableFile(resumableObj, file){
function ResumableFile(resumableObj, file, uniqueIdentifier){
var $ = this;
$.opts = {};
$.getOpt = resumableObj.getOpt;
@@ -250,18 +446,19 @@
$.file = file;
$.fileName = file.fileName||file.name; // Some confusion in different versions of Firefox
$.size = file.size;
$.relativePath = file.webkitRelativePath || $.fileName;
$.uniqueIdentifier = $h.generateUniqueIdentifier(file);
$.relativePath = file.relativePath || file.webkitRelativePath || $.fileName;
$.uniqueIdentifier = uniqueIdentifier;
$._pause = false;
$.container = '';
var _error = false;
$.preprocessState = 0; // 0 = unprocessed, 1 = processing, 2 = finished
var _error = uniqueIdentifier !== undefined;

// Callback when something happens within the chunk
var chunkEvent = function(event, message){
// event can be 'progress', 'success', 'error' or 'retry'
switch(event){
case 'progress':
$.resumableObj.fire('fileProgress', $);
$.resumableObj.fire('fileProgress', $, message);
break;
case 'error':
$.abort();
@@ -345,7 +542,7 @@
if(c.status()=='error') error = true;
ret += c.progress(true); // get chunk progress relative to entire file
});
ret = (error ? 1 : (ret>0.999 ? 1 : ret));
ret = (error ? 1 : (ret>0.99999 ? 1 : ret));
ret = Math.max($._prevProgress, ret); // We don't want to lose percentages when an upload is paused
$._prevProgress = ret;
return(ret);
@@ -362,6 +559,9 @@
};
$.isComplete = function(){
var outstanding = false;
if ($.preprocessState === 1) {
return(false);
}
$h.each($.chunks, function(chunk){
var status = chunk.status();
if(status=='pending' || status=='uploading' || chunk.preprocessState === 1) {
@@ -381,7 +581,31 @@
$.isPaused = function() {
return $._pause;
};

$.preprocessFinished = function(){
$.preprocessState = 2;
$.upload();
};
$.upload = function () {
var found = false;
if ($.isPaused() === false) {
var preprocess = $.getOpt('preprocessFile');
if(typeof preprocess === 'function') {
switch($.preprocessState) {
case 0: $.preprocessState = 1; preprocess($); return(true);
case 1: return(true);
case 2: break;
}
}
$h.each($.chunks, function (chunk) {
if (chunk.status() == 'pending' && chunk.preprocessState !== 1) {
chunk.send();
found = true;
return(false);
}
});
}
return(found);
}

// Bootstrap and return
$.resumableObj.fire('chunkingStart', $);
@@ -389,6 +613,7 @@
return(this);
}

function ResumableChunk(resumableObj, fileObj, offset, callback){
|
||||
var $ = this;
|
||||
$.opts = {};
|
||||
@ -433,30 +658,52 @@
|
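The per-file preprocessing added above is a three-state machine: 0 = unprocessed, 1 = processing, 2 = finished. `$.upload()` marks the file as processing, hands it to the `preprocessFile` callback, and sends no chunks until `preprocessFinished()` advances the state to 2. A standalone sketch of that flow, using illustrative stand-in objects rather than the library's real ones:

```js
// 0 = unprocessed, 1 = processing, 2 = finished (same encoding as the diff)
function FakeFile(preprocessFile) {
  var $ = this;
  $.preprocessState = 0;
  $.sentChunks = 0;                 // stand-in counter for chunk.send()
  $.preprocessFinished = function () {
    $.preprocessState = 2;
    $.upload();
  };
  $.upload = function () {
    switch ($.preprocessState) {
      case 0: $.preprocessState = 1; preprocessFile($); return true;
      case 1: return true;          // still preprocessing: send nothing yet
      case 2: break;                // preprocessing done: fall through
    }
    $.sentChunks++;
    return true;
  };
}

// A callback that finishes immediately; a real one might hash the file first.
var file = new FakeFile(function (f) { f.preprocessFinished(); });
file.upload();
console.log(file.preprocessState, file.sentChunks); // 2 1
```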
};
$.xhr.addEventListener('load', testHandler, false);
$.xhr.addEventListener('error', testHandler, false);
$.xhr.addEventListener('timeout', testHandler, false);

// Add data from the query options
var params = [];
var parameterNamespace = $.getOpt('parameterNamespace');
var customQuery = $.getOpt('query');
if(typeof customQuery == 'function') customQuery = customQuery($.fileObj, $);
$h.each(customQuery, function(k,v){
params.push([encodeURIComponent(k), encodeURIComponent(v)].join('='));
params.push([encodeURIComponent(parameterNamespace+k), encodeURIComponent(v)].join('='));
});
// Add extra data to identify chunk
params.push(['resumableChunkNumber', encodeURIComponent($.offset+1)].join('='));
params.push(['resumableChunkSize', encodeURIComponent($.getOpt('chunkSize'))].join('='));
params.push(['resumableCurrentChunkSize', encodeURIComponent($.endByte - $.startByte)].join('='));
params.push(['resumableTotalSize', encodeURIComponent($.fileObjSize)].join('='));
params.push(['resumableType', encodeURIComponent($.fileObjType)].join('='));
params.push(['resumableIdentifier', encodeURIComponent($.fileObj.uniqueIdentifier)].join('='));
params.push(['resumableFilename', encodeURIComponent($.fileObj.fileName)].join('='));
params.push(['resumableRelativePath', encodeURIComponent($.fileObj.relativePath)].join('='));
params.push(['resumableTotalChunks', encodeURIComponent($.fileObj.chunks.length)].join('='));
params = params.concat(
[
// define key/value pairs for additional parameters
['chunkNumberParameterName', $.offset + 1],
['chunkSizeParameterName', $.getOpt('chunkSize')],
['currentChunkSizeParameterName', $.endByte - $.startByte],
['totalSizeParameterName', $.fileObjSize],
['typeParameterName', $.fileObjType],
['identifierParameterName', $.fileObj.uniqueIdentifier],
['fileNameParameterName', $.fileObj.fileName],
['relativePathParameterName', $.fileObj.relativePath],
['totalChunksParameterName', $.fileObj.chunks.length]
].filter(function(pair){
// include items that resolve to truthy values
// i.e. exclude false, null, undefined and empty strings
return $.getOpt(pair[0]);
})
.map(function(pair){
// map each key/value pair to its final form
return [
parameterNamespace + $.getOpt(pair[0]),
encodeURIComponent(pair[1])
].join('=');
})
);
// Append the relevant chunk and send it
$.xhr.open('GET', $h.getTarget(params));
$.xhr.open($.getOpt('testMethod'), $h.getTarget('test', params));
$.xhr.timeout = $.getOpt('xhrTimeout');
$.xhr.withCredentials = $.getOpt('withCredentials');
// Add data from header options
$h.each($.getOpt('headers'), function(k,v) {
var customHeaders = $.getOpt('headers');
if(typeof customHeaders === 'function') {
customHeaders = customHeaders($.fileObj, $);
}
$h.each(customHeaders, function(k,v) {
$.xhr.setRequestHeader(k, v);
});
$.xhr.send(null);
@@ -472,7 +719,7 @@
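The test request above no longer hard-codes `resumable*` query keys; each `[optionName, value]` pair survives the `.filter()` only if the corresponding option resolves to a truthy parameter name, and `.map()` then produces namespaced `key=value` fragments. A self-contained sketch of that pipeline, with hypothetical option values standing in for `$.getOpt()`:

```js
var opts = {                          // hypothetical configuration
  parameterNamespace: 'rsb_',
  chunkNumberParameterName: 'chunkIndex',
  totalSizeParameterName: '',         // falsy name: parameter is dropped
  identifierParameterName: 'fileId'
};
function getOpt(name) { return opts[name]; }

var params = [
  ['chunkNumberParameterName', 3],
  ['totalSizeParameterName', 1048576],
  ['identifierParameterName', 'photo-1048576']
].filter(function (pair) {
  // keep only pairs whose configured name is truthy
  return getOpt(pair[0]);
}).map(function (pair) {
  // namespace the configured name and URL-encode the value
  return [opts.parameterNamespace + getOpt(pair[0]),
          encodeURIComponent(pair[1])].join('=');
});

console.log(params.join('&')); // rsb_chunkIndex=3&rsb_fileId=photo-1048576
```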
var preprocess = $.getOpt('preprocess');
if(typeof preprocess === 'function') {
switch($.preprocessState) {
case 0: preprocess($); $.preprocessState = 1; return;
case 0: $.preprocessState = 1; preprocess($); return;
case 1: return;
case 2: break;
}
@@ -518,19 +765,29 @@
};
$.xhr.addEventListener('load', doneHandler, false);
$.xhr.addEventListener('error', doneHandler, false);
$.xhr.addEventListener('timeout', doneHandler, false);

// Set up the basic query data from Resumable
var query = {
resumableChunkNumber: $.offset+1,
resumableChunkSize: $.getOpt('chunkSize'),
resumableCurrentChunkSize: $.endByte - $.startByte,
resumableTotalSize: $.fileObjSize,
resumableType: $.fileObjType,
resumableIdentifier: $.fileObj.uniqueIdentifier,
resumableFilename: $.fileObj.fileName,
resumableRelativePath: $.fileObj.relativePath,
resumableTotalChunks: $.fileObj.chunks.length
};
var query = [
['chunkNumberParameterName', $.offset + 1],
['chunkSizeParameterName', $.getOpt('chunkSize')],
['currentChunkSizeParameterName', $.endByte - $.startByte],
['totalSizeParameterName', $.fileObjSize],
['typeParameterName', $.fileObjType],
['identifierParameterName', $.fileObj.uniqueIdentifier],
['fileNameParameterName', $.fileObj.fileName],
['relativePathParameterName', $.fileObj.relativePath],
['totalChunksParameterName', $.fileObj.chunks.length],
].filter(function(pair){
// include items that resolve to truthy values
// i.e. exclude false, null, undefined and empty strings
return $.getOpt(pair[0]);
})
.reduce(function(query, pair){
// assign query key/value
query[$.getOpt(pair[0])] = pair[1];
return query;
}, {});
// Mix in custom data
var customQuery = $.getOpt('query');
if(typeof customQuery == 'function') customQuery = customQuery($.fileObj, $);
@@ -538,36 +795,60 @@
query[k] = v;
});

var func = ($.fileObj.file.slice ? 'slice' : ($.fileObj.file.mozSlice ? 'mozSlice' : ($.fileObj.file.webkitSlice ? 'webkitSlice' : 'slice'))),
bytes = $.fileObj.file[func]($.startByte,$.endByte),
data = null,
target = $.getOpt('target');
var func = ($.fileObj.file.slice ? 'slice' : ($.fileObj.file.mozSlice ? 'mozSlice' : ($.fileObj.file.webkitSlice ? 'webkitSlice' : 'slice')));
var bytes = $.fileObj.file[func]($.startByte, $.endByte, $.getOpt('setChunkTypeFromFile') ? $.fileObj.file.type : "");
var data = null;
var params = [];

var parameterNamespace = $.getOpt('parameterNamespace');
if ($.getOpt('method') === 'octet') {
// Add data from the query options
data = bytes;
var params = [];
$h.each(query, function(k,v){
params.push([encodeURIComponent(k), encodeURIComponent(v)].join('='));
$h.each(query, function (k, v) {
params.push([encodeURIComponent(parameterNamespace + k), encodeURIComponent(v)].join('='));
});
target = $h.getTarget(params);
} else {
// Add data from the query options
data = new FormData();
$h.each(query, function(k,v){
data.append(k,v);
$h.each(query, function (k, v) {
data.append(parameterNamespace + k, v);
params.push([encodeURIComponent(parameterNamespace + k), encodeURIComponent(v)].join('='));
});
data.append($.getOpt('fileParameterName'), bytes);
if ($.getOpt('chunkFormat') == 'blob') {
data.append(parameterNamespace + $.getOpt('fileParameterName'), bytes, $.fileObj.fileName);
}
else if ($.getOpt('chunkFormat') == 'base64') {
var fr = new FileReader();
fr.onload = function (e) {
data.append(parameterNamespace + $.getOpt('fileParameterName'), fr.result);
$.xhr.send(data);
}
fr.readAsDataURL(bytes);
}
}

$.xhr.open('POST', target);
var target = $h.getTarget('upload', params);
var method = $.getOpt('uploadMethod');

$.xhr.open(method, target);
if ($.getOpt('method') === 'octet') {
$.xhr.setRequestHeader('Content-Type', 'application/octet-stream');
}
$.xhr.timeout = $.getOpt('xhrTimeout');
$.xhr.withCredentials = $.getOpt('withCredentials');
// Add data from header options
$h.each($.getOpt('headers'), function(k,v) {
var customHeaders = $.getOpt('headers');
if(typeof customHeaders === 'function') {
customHeaders = customHeaders($.fileObj, $);
}

$h.each(customHeaders, function(k,v) {
$.xhr.setRequestHeader(k, v);
});

if ($.getOpt('chunkFormat') == 'blob') {
$.xhr.send(data);
}
};
$.abort = function(){
// Abort and reset
@@ -586,8 +867,8 @@
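With the new `chunkFormat: 'base64'` path above, the chunk is not appended as a Blob; instead `FileReader.readAsDataURL` encodes it asynchronously and the XHR is sent from the reader's `onload` callback. The reader produces a data URL of the shape `data:<mime>;base64,<payload>`; a Node-flavoured sketch of that same encoding (the MIME type and sample bytes are illustrative, not from the library):

```js
// Stand-in for the sliced Blob; readAsDataURL would wrap the identical
// base64 payload in a data: URL like the one assembled here.
var bytes = Buffer.from('hello chunk');
var dataUrl = 'data:application/octet-stream;base64,' + bytes.toString('base64');
console.log(dataUrl); // data:application/octet-stream;base64,aGVsbG8gY2h1bms=
```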
// Status is really 'OPENED', 'HEADERS_RECEIVED' or 'LOADING' - meaning that stuff is happening
return('uploading');
} else {
if($.xhr.status==200) {
// HTTP 200, perfect
if($.xhr.status == 200 || $.xhr.status == 201) {
// HTTP 200, 201 (created)
return('success');
} else if($h.contains($.getOpt('permanentErrors'), $.xhr.status) || $.retries >= $.getOpt('maxChunkRetries')) {
// HTTP 415/500/501, permanent error
@@ -607,6 +888,7 @@
if(typeof(relative)==='undefined') relative = false;
var factor = (relative ? ($.endByte-$.startByte)/$.fileObjSize : 1);
if($.pendingRetry) return(0);
if(!$.xhr || !$.xhr.status) factor*=.95;
var s = $.status();
switch(s){
case 'success':
@@ -646,15 +928,7 @@

// Now, simply look for the next, best thing to upload
$h.each($.files, function(file){
if(file.isPaused()===false){
$h.each(file.chunks, function(chunk){
if(chunk.status()=='pending' && chunk.preprocessState === 0) {
chunk.send();
found = true;
return(false);
}
});
}
found = file.upload();
if(found) return(false);
});
if(found) return(true);
@@ -678,7 +952,6 @@
// PUBLIC METHODS FOR RESUMABLE.JS
$.assignBrowse = function(domNodes, isDirectory){
if(typeof(domNodes.length)=='undefined') domNodes = [domNodes];

$h.each(domNodes, function(domNode) {
var input;
if(domNode.tagName==='INPUT' && domNode.type==='file'){
@@ -707,10 +980,26 @@
} else {
input.removeAttribute('webkitdirectory');
}
var fileTypes = $.getOpt('fileType');
if (typeof (fileTypes) !== 'undefined' && fileTypes.length >= 1) {
input.setAttribute('accept', fileTypes.map(function (e) {
e = e.replace(/\s/g, '').toLowerCase();
if(e.match(/^[^.][^/]+$/)){
e = '.' + e;
}
return e;
}).join(','));
}
else {
input.removeAttribute('accept');
}
// When new files are added, simply append them to the overall list
input.addEventListener('change', function(e){
appendFilesFromFileList(e.target.files,e);
var clearInput = $.getOpt('clearInput');
if (clearInput) {
e.target.value = '';
}
}, false);
});
};
@@ -718,7 +1007,8 @@
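The `fileType` handling above normalizes each configured type before joining the list into the input's `accept` attribute: whitespace is stripped, the entry is lowercased, and bare extensions gain a leading dot, while MIME patterns such as `image/*` pass through untouched. The mapping can be exercised on its own (the sample list is hypothetical):

```js
var fileTypes = ['JPG', ' png', 'image/*'];   // hypothetical option value
var accept = fileTypes.map(function (e) {
  e = e.replace(/\s/g, '').toLowerCase();     // strip whitespace, lowercase
  if (e.match(/^[^.][^/]+$/)) {               // bare extension: prepend a dot
    e = '.' + e;
  }
  return e;                                   // MIME patterns pass through
}).join(',');
console.log(accept); // .jpg,.png,image/*
```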
if(typeof(domNodes.length)=='undefined') domNodes = [domNodes];

$h.each(domNodes, function(domNode) {
domNode.addEventListener('dragover', onDragOver, false);
domNode.addEventListener('dragover', preventDefault, false);
domNode.addEventListener('dragenter', preventDefault, false);
domNode.addEventListener('drop', onDrop, false);
});
};
@@ -726,7 +1016,8 @@
if (typeof(domNodes.length) == 'undefined') domNodes = [domNodes];

$h.each(domNodes, function(domNode) {
domNode.removeEventListener('dragover', onDragOver);
domNode.removeEventListener('dragover', preventDefault);
domNode.removeEventListener('dragenter', preventDefault);
domNode.removeEventListener('drop', onDrop);
});
};
@@ -757,6 +1048,7 @@
$.fire('pause');
};
$.cancel = function(){
$.fire('beforeCancel');
for(var i = $.files.length - 1; i >= 0; i--) {
$.files[i].cancel();
}
@@ -775,6 +1067,9 @@
$.addFile = function(file, event){
appendFilesFromFileList([file], event);
};
$.addFiles = function(files, event){
appendFilesFromFileList(files, event);
};
$.removeFile = function(file){
for(var i = $.files.length - 1; i >= 0; i--) {
if($.files[i] === file) {
@@ -796,9 +1091,21 @@
});
return(totalSize);
};
$.handleDropEvent = function (e) {
onDrop(e);
};
$.handleChangeEvent = function (e) {
appendFilesFromFileList(e.target.files, e);
e.target.value = '';
};
$.updateQuery = function(query){
$.opts.query = query;
};

return(this);
};


// Node.js-style export for Node and Component
if (typeof module != 'undefined') {
module.exports = Resumable;

@@ -306,17 +306,22 @@ var et2_file = (function(){ "use strict"; return et2_inputWidget.extend(
return;
}

var addFile = jQuery.proxy(function(i, file) {
this.resumable.addFile(file,event);
}, this);
if(typeof value == 'object' && value.length && typeof value[0] == 'object' && value[0].name)
{
try
{
this.input[0].files = value;

jQuery.each(value, addFile);
}
catch (e)
{
var self = this;
var args = arguments;
jQuery.each(value, function(i,file) {self.resumable.addFile(this,event);});
jQuery.each(value, addFile);
}
}
},