Browser Memory Constraints

Breaking the 2MB Limit: Streaming E2E Encryption in the Browser

May 15, 2026
10 Min Read

If you've ever built a client-side encryption tool, you know the dreaded "Aw, Snap!" crash screen. When users try to encrypt a 500MB video file using the native Web Crypto API, the browser runs out of memory and violently kills the tab.

The Problem with `readAsDataURL`

Most developers (myself included, in early versions of ZeroKey) process files by reading the entire payload into a Base64 string or an ArrayBuffer, and then passing it to window.crypto.subtle.encrypt().

This is fine for text notes or small images. But if a user uploads a 1GB file, the browser has to allocate roughly 1GB for the raw file's ArrayBuffer, another ~1.33GB if you Base64-encode it (Base64 inflates data by about a third), and another ~1GB for the encrypted output. A standard mobile browser, which often caps a tab well below that, will run out of RAM and crash.

The Solution: Web Streams API + Chunking

To fix this, we must abandon reading the whole file at once. Instead, we read the file in small 1MB "chunks" using `File.slice()` (or the Web Streams API), encrypt each chunk individually, and stream the encrypted pieces directly to our database or storage bucket.

Implementation in JavaScript

Here is how to slice a file into chunks and encrypt them sequentially.

🚨 Cryptography Warning: You cannot reuse the same Initialization Vector (IV) across chunks if you are using AES-GCM. Reusing an IV with the same key is catastrophic: it leaks the XOR of the two plaintexts and can let an attacker forge authentication tags. You must generate a unique IV per chunk, or deterministically derive it from the chunk index.

// 1. Setup our chunk size (e.g., 1MB)
const CHUNK_SIZE = 1024 * 1024; 

async function encryptFileInChunks(file, cryptoKey) {
    const fileSize = file.size;
    let offset = 0;
    let chunkIndex = 0;
    
    // We will store the resulting chunks here (or stream to DB)
    const encryptedChunks = []; 

    while (offset < fileSize) {
        // Read just a 1MB slice of the massive file
        const slice = file.slice(offset, offset + CHUNK_SIZE);
        const chunkBuffer = await slice.arrayBuffer();
        
        // Generate a UNIQUE IV for this specific chunk
        const chunkIv = window.crypto.getRandomValues(new Uint8Array(12));
        
        // Encrypt the small chunk
        const encryptedBuffer = await window.crypto.subtle.encrypt(
            { name: "AES-GCM", iv: chunkIv },
            cryptoKey,
            chunkBuffer
        );
        
        // Save the chunk data alongside its unique IV
        encryptedChunks.push({
            index: chunkIndex,
            iv: bufferToBase64(chunkIv),
            data: bufferToBase64(encryptedBuffer)
        });
        
        offset += CHUNK_SIZE;
        chunkIndex++;
    }
    
    return encryptedChunks;
}

function bufferToBase64(buffer) {
    let binary = '';
    const bytes = new Uint8Array(buffer);
    for (let i = 0; i < bytes.byteLength; i++) { binary += String.fromCharCode(bytes[i]); }
    return window.btoa(binary);
}

Decrypting and Reassembling

On the receiver's end, you simply reverse the process. You fetch the encrypted chunks, decrypt them one by one using their unique IVs, and append the raw ArrayBuffers into a new Blob.

Because you hand the decrypted pieces straight to the Blob constructor instead of concatenating them yourself, the browser is free to back the Blob with disk storage rather than RAM, sidestepping the memory limits that cause mobile crashes.

Secure Your Payloads with ZeroKey

We are constantly evolving ZeroKey's architecture to handle larger payloads securely. Encrypt sensitive documents without relying on third-party servers.