How to Encrypt Large Files Without Crashing the Browser
When you build your first client-side encryption tool, you naturally test it with a 2MB PDF file. It works perfectly. You deploy it, a user attempts to secure a 2GB 4K video, and their browser tab instantly crashes with an Out of Memory (OOM) error.
The problem is fundamental to how browsers manage RAM. If you call await file.arrayBuffer() on a 2GB file, the browser allocates 2GB of RAM just to read it. When you pass that buffer to the Web Crypto API, the browser allocates another 2GB to hold the encrypted ciphertext. A single upload just ate 4GB of RAM, killing the main thread.
The Solution: Chunking and the Streams API
To handle massive files, we cannot load the entire file into memory at once. We must process the file in bite-sized chunks (e.g., 10MB at a time). We read a chunk, encrypt it, flush it to a storage container (like a Blob or an ongoing upload stream), clear it from RAM, and grab the next chunk.
Step 1: Slicing the File
The native JavaScript File object (which you get from an HTML input) inherits from Blob. This means it has a deceptively powerful method called .slice(). Crucially, slicing does not read anything: it returns a lightweight reference to a byte range, and no data leaves the user's hard drive until we actually read that slice.
const CHUNK_SIZE = 10 * 1024 * 1024; // 10MB chunks

// Function to read a specific slice of the file
function readChunk(file, startByte, endByte) {
  return new Promise((resolve, reject) => {
    const fileReader = new FileReader();
    fileReader.onload = (event) => {
      resolve(event.target.result); // Returns an ArrayBuffer
    };
    fileReader.onerror = (error) => reject(error);
    // Slice the file and read only that part into memory
    const blobSlice = file.slice(startByte, endByte);
    fileReader.readAsArrayBuffer(blobSlice);
  });
}
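As an aside, if you only target modern browsers (or Node 18+), Blob.prototype.arrayBuffer() gives you the same result without a FileReader. A minimal equivalent:

```javascript
// Modern equivalent of readChunk(): slice() is a zero-copy reference,
// and arrayBuffer() reads only that byte range into memory.
function readChunkModern(file, startByte, endByte) {
  return file.slice(startByte, endByte).arrayBuffer();
}
```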
Step 2: Sequential Encryption (The Loop)
Now we create a loop that iterates through the entire file size, grabbing 10MB at a time, encrypting it using the Web Crypto API, and storing the encrypted chunks into an array of Blobs.
Note: For AES-GCM, each chunk must use a unique Initialization Vector (IV). A common pattern is to generate a base IV and mathematically increment it for each chunk index to maintain cryptographic security.
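The loop below assumes a generateChunkIV helper. One way to sketch it (the names here are illustrative, not a standard API) is to fix a random 12-byte base IV per file and fold the chunk index into its last four bytes:

```javascript
// Hypothetical sketch of the deterministic per-chunk IV scheme above.
// baseIv: a random 12-byte IV generated once per file, e.g. with
// crypto.getRandomValues(new Uint8Array(12)).
function makeChunkIvFactory(baseIv) {
  if (baseIv.length !== 12) throw new Error("AES-GCM IVs should be 12 bytes");
  return function generateChunkIV(chunkIndex) {
    const iv = new Uint8Array(baseIv); // copy; never mutate the base
    const view = new DataView(iv.buffer);
    // XOR the chunk index into the last 32 bits (big-endian). Each index
    // yields a distinct IV as long as the file has fewer than 2^32 chunks.
    view.setUint32(8, view.getUint32(8) ^ chunkIndex);
    return iv;
  };
}
```

The base IV is not secret; store it alongside the ciphertext so the decryptor can re-derive every chunk's IV from its index.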
async function encryptLargeFile(file, cryptoKey) {
  const encryptedChunks = [];
  let offset = 0;
  let chunkIndex = 0;

  console.log(`Starting encryption of ${file.name}...`);

  while (offset < file.size) {
    // Calculate the end byte for this chunk
    const end = Math.min(offset + CHUNK_SIZE, file.size);

    // 1. Read just this slice (at most 10MB) from disk into RAM
    const chunkBuffer = await readChunk(file, offset, end);

    // 2. Generate a unique IV for this specific chunk
    const chunkIv = generateChunkIV(chunkIndex); // Custom deterministic function

    // 3. Encrypt the chunk. Note: AES-GCM appends a 16-byte auth tag,
    //    so each encrypted chunk is 16 bytes larger than its plaintext.
    const ciphertext = await window.crypto.subtle.encrypt(
      { name: "AES-GCM", iv: chunkIv },
      cryptoKey,
      chunkBuffer
    );

    // 4. Push the ciphertext into our Blob array; the raw buffer can
    //    now be garbage-collected before the next iteration
    encryptedChunks.push(new Blob([ciphertext]));

    offset = end; // unlike offset += CHUNK_SIZE, this never overshoots file.size
    chunkIndex++;

    // Optional: Update UI progress bar here!
    console.log(`Encrypted ${Math.round((offset / file.size) * 100)}%`);
  }

  // Combine all encrypted Blob chunks into one massive file reference
  // (Blobs are references to disk/memory, highly optimized by the browser)
  return new Blob(encryptedChunks, { type: "application/octet-stream" });
}
Step 3: Streaming to the Cloud
Once the file is combined into a single Blob, you can use the native fetch() API to PUT the data directly to a presigned URL (as we discussed in our earlier Vercel architecture post). The browser's networking engine is incredibly smart: it streams the Blob to the server without ever pulling the entire payload back into the JavaScript heap.
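A hedged sketch of that upload step, assuming you already hold a presigned PUT URL (the URL, headers, and error handling here are illustrative, not a specific provider's API):

```javascript
// Illustrative upload of the combined encrypted Blob. fetch() streams
// the Blob body; the full payload never re-enters JS-visible memory.
async function uploadEncryptedBlob(presignedUrl, encryptedBlob) {
  const response = await fetch(presignedUrl, {
    method: "PUT",
    headers: { "Content-Type": "application/octet-stream" },
    body: encryptedBlob,
  });
  if (!response.ok) {
    throw new Error(`Upload failed with status ${response.status}`);
  }
  return response;
}
```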
Handle Unlimited File Sizes
By utilizing Blob.slice() and sequential array chunking, ZeroKey bypasses traditional browser memory limits.
Whether you are securing a lightweight PDF document or a massive system database backup, our client-side architecture keeps your browser's RAM footprint small and your UI butter-smooth.
Conclusion
JavaScript is incredibly powerful, but you have to respect its memory constraints. Treating a massive file as one giant in-memory buffer will crash your users' browsers. By mastering file slicing and chunked asynchronous processing, you graduate from building basic web pages to architecting robust, enterprise-grade data pipelines.