Chrome Memory Issue - File Api + Angularjs
Solution 1:
I can't see any obvious memory leaks or things I can change to help garbage collection. I store the block IDs in an array, so obviously there will be some memory creep, but this shouldn't be massive. It's almost as if the File API is holding the whole file it slices into memory.
You are correct. The new Blobs created by .slice() are being held in memory.
The solution is to call Blob.prototype.close() on the Blob reference when processing of the Blob or File object is complete.
Note also that the javascript at the Question creates a new instance of FileReader each time the upload function is called.
The slice() method returns a new Blob object with bytes ranging from the optional start parameter up to but not including the optional end parameter, and with a type attribute that is the value of the optional contentType parameter.
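For example, when uploading a file in blocks with slice(), each block can be closed once its upload has completed. The sketch below assumes the draft close() method described further down is available (hence the feature check); uploadBlock() is a hypothetical function that sends a single slice to the server:

const CHUNK_SIZE = 1024 * 1024;

function uploadInBlocks(file, uploadBlock) {
  let offset = 0;
  let queue = Promise.resolve();
  while (offset < file.size) {
    // slice() returns a new Blob referencing bytes [offset, offset + CHUNK_SIZE)
    const block = file.slice(offset, offset + CHUNK_SIZE, file.type);
    queue = queue
      .then(() => uploadBlock(block))
      .then(() => {
        // close() is a File API draft method; feature-detect before calling
        if (typeof block.close === "function") {
          block.close();
        }
      });
    offset += CHUNK_SIZE;
  }
  return queue;
}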
Blob instances exist for the life of the document, though a Blob should be garbage collected once removed from the Blob URL Store:
Note: User agents are free to garbage collect resources removed from the Blob URL Store.
Each Blob must have an internal snapshot state, which must be initially set to the state of the underlying storage, if any such underlying storage exists, and must be preserved through StructuredClone. Further normative definition of snapshot state can be found for Files.
The close() method is said to close a Blob, and must act as follows:
- If the readability state of the context object is CLOSED, terminate this algorithm.
- Otherwise, set the readability state of the context object to CLOSED.
- If the context object has an entry in the Blob URL Store, remove the entry that corresponds to the context object.
If the Blob object has been passed to URL.createObjectURL(), call URL.revokeObjectURL() with the Blob URL first, then call .close() on the Blob or File object.
The revokeObjectURL(url) static method revokes the Blob URL provided in the string url by removing the corresponding entry from the Blob URL Store. This method must act as follows:
1. If the url refers to a Blob that has a readability state of CLOSED, OR if the value provided for the url argument is not a Blob URL, OR if the value provided for the url argument does not have an entry in the Blob URL Store, this method call does nothing. User agents may display a message on the error console.
2. Otherwise, user agents must remove the entry from the Blob URL Store for url.
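A minimal sketch of that order of calls, again guarding the draft close() method with a feature check:

const blob = new Blob(["abc"], {type: "text/plain"});
const url = URL.createObjectURL(blob); // adds an entry to the Blob URL Store
// ... use url, e.g. as an img src or a download href ...
URL.revokeObjectURL(url); // removes the entry from the Blob URL Store
if (typeof blob.close === "function") {
  blob.close(); // close the Blob itself, where the draft method is implemented
}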
You can view the result of these calls by opening chrome://blob-internals and reviewing the details before and after the calls which create and close a Blob.
For example, from

xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
Refcount: 1
Content Type: text/plain
Type: data
Length: 3

to

xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
Refcount: 1
Content Type: text/plain

following the call to .close(). Similarly from

blob:http://example.com/c2823f75-de26-46f9-a4e5-95f57b8230bd
Uuid: 29e430a6-f093-40c2-bc70-2b6838a713bc

An alternative approach could be to send the file as an ArrayBuffer, or as chunks of ArrayBuffers, then re-assemble the file at the server.
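A rough client-side sketch of that alternative; the "/upload" endpoint and its query parameters are hypothetical, and the server is expected to concatenate the received chunks in order:

function uploadAsArrayBufferChunks(file, chunkSize = 1024 * 1024) {
  return new Promise((resolve, reject) => {
    const fr = new FileReader();
    fr.onload = () => resolve(fr.result);
    fr.onerror = reject;
    fr.readAsArrayBuffer(file);
  })
  .then(buffer => {
    let queue = Promise.resolve();
    for (let offset = 0; offset < buffer.byteLength; offset += chunkSize) {
      // each chunk is a plain ArrayBuffer, not a Blob
      const chunk = buffer.slice(offset, offset + chunkSize);
      queue = queue.then(() =>
        fetch("/upload?name=" + encodeURIComponent(file.name) + "&offset=" + offset, {
          method: "PUT",
          body: chunk
        })
      );
    }
    return queue;
  });
}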
Or you can call the FileReader constructor, FileReader.prototype.readAsArrayBuffer(), and the load event of FileReader each only once.
At the load event of FileReader, pass the ArrayBuffer to a Uint8Array, then use ReadableStream, TypedArray.prototype.subarray(), .getReader(), and .read() to get N chunks of the ArrayBuffer as TypedArrays at each pull from the Uint8Array. When chunks equaling the .byteLength of the ArrayBuffer have been processed, pass the array of Uint8Arrays to the Blob constructor to recombine the file parts into a single file in the browser, then send the Blob to the server.
<!DOCTYPE html>
<html>
<head>
</head>
<body>
<input id="file" type="file">
<br>
<progress value="0"></progress>
<br>
<output for="file"><img alt="preview"></output>
<script type="text/javascript">
const [input, output, img, progress, fr, handleError, CHUNK] = [
document.querySelector("input[type='file']")
, document.querySelector("output[for='file']")
, document.querySelector("output img")
, document.querySelector("progress")
, new FileReader
, (err) => console.log(err)
, 1024 * 1024
];
progress.addEventListener("progress", e => {
progress.value = e.detail.value;
e.detail.promise();
});
let [chunks, NEXT, CURR, url, blob] = [Array(), 0, 0];
input.onchange = () => {
NEXT = CURR = progress.value = progress.max = chunks.length = 0;
if (url) {
URL.revokeObjectURL(url);
if (blob.hasOwnProperty("close")) {
blob.close();
}
}
if (input.files.length) {
console.log(input.files[0]);
progress.max = input.files[0].size;
progress.step = progress.max / CHUNK;
fr.readAsArrayBuffer(input.files[0]);
}
}
fr.onload = () => {
const VIEW = new Uint8Array(fr.result);
const LEN = VIEW.byteLength;
const {type, name:filename} = input.files[0];
const stream = new ReadableStream({
pull(controller) {
if (NEXT < LEN) {
controller
.enqueue(VIEW.subarray(NEXT, !NEXT ? CHUNK : CHUNK + NEXT));
NEXT += CHUNK;
} else {
controller.close();
}
},
cancel(reason) {
console.log(reason);
throw new Error(reason);
}
});
const [reader, processData] = [
stream.getReader()
, ({value, done}) => {
if (done) {
return reader.closed.then(() => chunks);
}
chunks.push(value);
return new Promise(resolve => {
progress.dispatchEvent(
newCustomEvent("progress", {
detail:{
value:CURR += value.byteLength,
promise:resolve
}
})
);
})
.then(() => reader.read().then(data => processData(data)))
.catch(e => reader.cancel(e))
}
];
reader.read()
.then(data => processData(data))
.then(data => {
blob = new Blob(data, {type});
console.log("complete", data, blob);
if (/image/.test(type)) {
url = URL.createObjectURL(blob);
img.onload = () => {
img.title = filename;
input.value = "";
}
img.src = url;
} else {
input.value = "";
}
})
.catch(e => handleError(e))
}
</script>
</body>
</html>

plnkr http://plnkr.co/edit/AEZ7iQce4QaJOKut71jk?p=preview
You can also utilize fetch() to send the Blob to the server:
fetch(new Request("/path/to/server/", {method:"PUT", body:blob}))
To transmit body for a request request, run these steps:
- Let body be request’s body. If body is null, then queue a fetch task on request to process request end-of-body for request and abort these steps.
- Let read be the result of reading a chunk from body’s stream.
- When read is fulfilled with an object whose done property is false and whose value property is a Uint8Array object, run these substeps: let bytes be the byte sequence represented by the Uint8Array object; transmit bytes; increase body’s transmitted bytes by bytes’s length; run the above step again.
- When read is fulfilled with an object whose done property is true, queue a fetch task on request to process request end-of-body for request.
- When read is fulfilled with a value that matches with neither of the above patterns, or read is rejected, terminate the ongoing fetch with reason fatal.
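As a usage sketch, with "/path/to/server/" standing in for a real endpoint and blob and handleError taken from the snippet above, the Promise returned by fetch() can be used to check the server response:

fetch("/path/to/server/", {method: "PUT", body: blob})
  .then(response => {
    if (!response.ok) {
      throw new Error("upload failed: " + response.status);
    }
    return response.text();
  })
  .then(text => console.log("server response:", text))
  .catch(e => handleError(e));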
See also