Large file save freezes app for seconds #364


Open
3 tasks done
Rigwarl opened this issue Nov 26, 2019 · 38 comments

@Rigwarl

Rigwarl commented Nov 26, 2019

Bug Report

Problem

What is expected to happen?

No freeze or lag in rendering on file save.

What does actually happen?

On saving a large file (> 5 MB), the app freezes for several seconds.
This pauses rendering in my app, and animations start to lag and jump.

Information

Command or Code

const save = (data, filename) => {
  const onErrorHandler = e => console.error('file save error ', filename, e);

  window.resolveLocalFileSystemURL(cordova.file.externalDataDirectory, dirEntry => {
    dirEntry.getFile(filename, { create: true, exclusive: false }, fileEntry => {
      fileEntry.createWriter(fileWriter => {
        fileWriter.onwriteend = () =>
          console.log('Successful file write ', filename);
        fileWriter.onerror = onErrorHandler;
        fileWriter.write(data);
      }, onErrorHandler);
    }, onErrorHandler);
  }, onErrorHandler);
};

Environment, Platform, Device

  • Galaxy S8 / android 9
  • Mi A2 Lite / android 9

Version information

  • cordova: 9.0.0
  • cordova-android: 8.1.0
  • cordova-plugin-file: 6.0.2

Checklist

  • I searched for existing GitHub issues
  • I updated all Cordova tooling to most recent version
  • I included all the necessary information above
@breautek
Contributor

Where is data coming from and how are you serialising it?

I have an app where I write a rather large amount of data to a file, upwards of 50 MB, and I don't see a huge freeze. FileWriter is asynchronous, so it should not be pausing the webview. In my app, the most expensive operation was JSON.stringify/JSON.parse.

@Rigwarl
Author

Rigwarl commented Nov 26, 2019

I download PNG files with XHR and save them without any processing.
Yes, I know it's asynchronous, so it's very strange.

@breautek
Contributor

When you download your png files, are you using blobs or are you referencing that data via strings?

@Rigwarl
Author

Rigwarl commented Nov 26, 2019

I download them as blobs.

@breautek
Contributor

Very strange indeed. If you can create a sample app that reproduces the issue, it might help move this ticket along, though this may very well be device dependent.

@Rigwarl
Author

Rigwarl commented Jan 24, 2020

Hello again!
I made a sample reproduction app: it downloads a 9 MB PNG file and saves it to disk, which causes around a 4-second freeze on every device I tested.

Click the load button to load the file, then click the save button to save it. =)

@breautek
Contributor

Thanks, I'll try to take a quick look at this this weekend and report what I find on my devices.

@Rigwarl
Author

Rigwarl commented Feb 21, 2020

Hi, any update on this?

@breautek
Contributor

breautek commented Feb 21, 2020

My apologies, I've been really tied up with work trying to meet a deadline and I completely forgot about this.

I'll take a look now just to at least confirm that I can reproduce the issue, but I'm not sure when I'll be able to take a deep look.

Edit: Definitely can see the ~4 second pause when I click the "Save Image" button. I'll have to investigate further to see exactly where the bottleneck is, but I'm not sure when I'll have the time to do so since I've been doing some overtime at my workplace.

@breautek breautek added bug and removed support labels Feb 22, 2020
@breautek
Contributor

The line I found to be blocking the UI thread for a significant amount of time is:

https://github.com/apache/cordova-android/blob/c35a990c09f1ab278873754eb7f81e52250893a3/cordova-js-src/exec.js#L87

args[i] = base64.fromArrayBuffer(args[i]);

When blobs are used, it reads the blob as an ArrayBuffer, then converts the array buffer to a base64-encoded string. I assume it does this because the JavascriptInterface, I believe, can only accept strings. Unfortunately, I don't think there is any way around that, based on some brief googling.

A weak workaround, since the plugin is deprecated: you can try the file transfer plugin, which lets you download a file and save it to the filesystem entirely on the native side. I still use the file transfer plugin in my own apps and it still works, but Cordova no longer provides updates or support for it.

Possible ideas for solutions:

Cordova doesn't support web workers, but I think this would be a good case where they would be beneficial, to at least move the bridge execution calls off the UI thread. It won't improve performance, but it would at least make exec non-blocking.

Another idea is to bring the concept of file transfers into the file plugin, so that downloading and saving to the filesystem can be handled completely on the native side, avoiding binary data in the JavaScript environment altogether.

@Rigwarl
Author

Rigwarl commented Mar 2, 2020

My knowledge of web workers is very limited.

I think I can't make Cordova calls from a worker, so I need to pass my file to the worker, convert it to base64 there, pass it back to the main thread, and call save()?

@breautek
Contributor

breautek commented Mar 2, 2020

I think I can't make Cordova calls from a worker

You are correct, this is something that cordova doesn't currently support.

so I need to pass my file to the worker, convert it to base64 there, pass it back to the main thread, and call save()

You can try this, but I'm not certain if that will help as there is a bottleneck of copying the data to and from the worker.
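For what it's worth, the copy can be avoided for ArrayBuffers by transferring them instead of cloning. A minimal demonstration of transfer semantics using structuredClone, which uses the same structured-clone machinery as worker.postMessage(data, [data]):

```javascript
// Transferring (rather than copying) an ArrayBuffer: ownership of the
// memory moves to the receiver, and the source buffer is detached.
const src = new ArrayBuffer(8 * 1024 * 1024); // stand-in for the file data

const moved = structuredClone(src, { transfer: [src] });

console.log(src.byteLength);   // 0 — source is detached, nothing was copied
console.log(moved.byteLength); // 8388608 — the receiver now owns the memory
```

The base64 conversion cost itself still has to be paid somewhere; transferring only removes the copy across the thread boundary.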

If you're downloading a file and saving it to disk, your best/quickest workaround is probably the File Transfer Plugin, despite it being deprecated, because it downloads the file and saves it to the filesystem entirely on the native side.

To my understanding, the primary reason the transfer plugin was deprecated is that it is possible to download binary files using XMLHttpRequest (though then you can run into performance/efficiency issues like you are currently experiencing). Despite the plugin being deprecated, I still use it to upload large files to my server, and that part at least still works for me.

@digaus

digaus commented Mar 10, 2020

@breautek

We can convert the blob into chunks and write the file in parts:

   private async convertToBase64Chunks(blob: Blob, size: 3 | 6 | 9 | 12 | 15 | 18, writeChunk: (value: string, first?: boolean) => Promise<void>): Promise<void> {
       const chunkSize: number = 1024 * 1024 * size;
       const blobSize: number = blob.size;
       while (blob.size > chunkSize) {
           const value: string = await this.convertToBase64(blob.slice(0, chunkSize));
           await writeChunk(blobSize === blob.size ? value : value.split(',')[1], blobSize === blob.size);
           blob = blob.slice(chunkSize);
       }
       const lastValue: string = await this.convertToBase64(blob.slice(0, blob.size));
       await writeChunk(lastValue.split(',')[1], blobSize === blob.size);
       blob = blob.slice(blob.size);
   }


   private convertToBase64(blob: Blob): Promise<string> {
       return new Promise((resolve: (base64: string) => void, reject: () => void): void =>  {
           let reader: FileReader = new FileReader();
           const realFileReader: FileReader = reader['_realReader'];
           if (realFileReader) {
               reader = realFileReader;
           }
           reader.onerror = (err: any): void => {
               console.log(err);
               reject();
           };
           reader.onload = (): void => {
               resolve(reader.result as string);
           };
           reader.readAsDataURL(blob);
       });
   }

writeChunk is the asynchronous callback where we can call the write method of the plugin. first indicates the first part, which also has the encoding prefix (though I think this is dropped anyway).

I implemented this in a Capacitor project, where it works very nicely:

            await this.convertToBase64Chunks(blob, chunkSize, async (value: string, first: boolean): Promise<void> => {
                if (first) {
                    await Filesystem.writeFile({
                        path: this.folder + '/' + fileName,
                        directory: FilesystemDirectory.Data,
                        data: value,
                    });
                } else {
                    await Filesystem.appendFile({
                        path: this.folder + '/' + fileName,
                        directory: FilesystemDirectory.Data,
                        data: value,
                    });
                }
            });

Edit: 20 MB took around 2 seconds to write

@maksymov

I have <input type='file'> in my form. Now I want to save the selected file to LocalFileSystem.PERSISTENT.

My code for that:

self.requestFileSystem(LocalFileSystem.PERSISTENT, 0, function (fs) {
  fs.root.getFile(file_name, { create: true, exclusive: false }, function (fileEntry) {
    var reader = new FileReader();
    reader.onloadend = function () {
      fileEntry.createWriter(function (fileWriter) {
        fileWriter.onwriteend = function () {};
        fileWriter.onerror = function (e) {};
        dataObj = new Blob([reader.result], { type: 'text/plain' });
        // freeze here
        fileWriter.write(dataObj);
        postMessage('ok');
      });
    };
    reader.readAsDataURL($(file_id)[0].files[0]);
  }, onErrorCreateFile);
}, onErrorLoadFs);

But this code freezes my UI on device (iOS, Android) for up to 10 seconds when the file is 30 MB. I discovered the world of Web Workers, but there is no access to LocalFileSystem from a worker.

How can I save the selected file (<input type='file'>) to LocalFileSystem without UI freezes?

@HarelM

HarelM commented Jul 9, 2020

I have a similar issue when downloading a large file (100 MB) on iOS, but in my case the app completely freezes and goes to the white screen of death.
Unfortunately this doesn't happen on all devices, just some, which makes it very hard to narrow down the root cause.
I'll try the File Transfer Plugin and see if it helps.
If anyone has found a workaround besides the unsupported plugin, let me know...

@ssmithereens

ssmithereens commented Nov 16, 2020

We are also hitting this issue when downloading large video files (>25 MB) on iOS. Even worse, WKWebView decides that the page is unresponsive and reloads it, terminating our download task before it completes successfully.

@ssmithereens

ssmithereens commented Nov 16, 2020

We tried the chunking method described above, but unfortunately WKWebView still eventually decides that the page is unresponsive. We experimented with chunk sizes from 512 KB to 15 MB.

@HarelM

HarelM commented Nov 16, 2020

The file transfer plugin was somewhat "undeprecated": it was upgraded to work with the latest Cordova and shows good performance and results. It solved most of the issues my app was experiencing, and it can also download while the app is not in the foreground, which is a nice plus. I recommend using it.

@ssmithereens

The file transfer plugin was somewhat "undeprecated": it was upgraded to work with the latest Cordova and shows good performance and results. It solved most of the issues my app was experiencing, and it can also download while the app is not in the foreground, which is a nice plus. I recommend using it.

We just tested the file transfer plugin and it works perfectly for us. Thank you.

@LightMind

LightMind commented Feb 5, 2021

We were running into some of the same issues. We download files via XMLHttpRequest and write the contents via a FileWriter. Writing files larger than a few megabytes would consistently crash the whole application. We switched to writing the files in chunks, which alleviated the problem, but I was still confused as to why writing files under 50 MiB would result in crashes or white screens.

I profiled with Chrome's debugging tools and found that the garbage collector is running like crazy, and the function uses a lot of memory. You can see here that, while invoking .writeFile, memory usage goes from 11 MiB to about 40 MiB, even though we are writing 1 MiB chunks of binary data.
[Screenshot: Chrome DevTools memory profile while invoking .writeFile]

As already mentioned by @breautek, this part in cordova is really slow:
args[i] = base64.fromArrayBuffer(args[i]);

I wanted to understand what is happening here:
Taking a look at the base64 module in cordova-common:

function uint8ToBase64 (rawData) {
    var numBytes = rawData.byteLength;
    var output = '';
    var segment;
    var table = b64_12bitTable();
    for (var i = 0; i < numBytes - 2; i += 3) {
        segment = (rawData[i] << 16) + (rawData[i + 1] << 8) + rawData[i + 2];
        output += table[segment >> 12];
        output += table[segment & 0xfff];
    }
    if (numBytes - i === 2) {
        segment = (rawData[i] << 16) + (rawData[i + 1] << 8);
        output += table[segment >> 12];
        output += b64_6bit[(segment & 0xfff) >> 6];
        output += '=';
    } else if (numBytes - i === 1) {
        segment = (rawData[i] << 16);
        output += table[segment >> 12];
        output += '==';
    }
    return output;
}

Ouch. JavaScript has immutable strings. Using += to concatenate the output creates new strings in memory all the time. This means that converting n bytes to base64 creates at least 2 * n / 3 new strings, each bigger than the last ("AA", "AABB", "AABBCC", ... all the way to the full string), for O(n^2) bytes of cumulative allocation.

Note that to write just 1 MiB, you would cumulatively allocate gigabytes of memory!

This explains the memory usage, and why writing binary files is slow. The garbage collector needs to clean up millions of strings, and if it can't keep up, we run out of memory.

This not only affects writing files, but ANY cordova plugin that wants to pass binary data from JS to the native layer.
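For illustration, here is the same loop rewritten with linear cumulative allocation: pieces are collected in an array and joined once at the end, so no quadratic chain of intermediate strings is created. This is a sketch to demonstrate the fix, not the actual cordova-js code:

```javascript
// Base64 alphabet (same output as the cordova-common encoder).
const B64 = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/';

// Linear-memory base64: O(n) pieces pushed into an array, one final join.
function uint8ToBase64Linear(bytes) {
  const out = [];
  let i;
  for (i = 0; i + 2 < bytes.length; i += 3) {
    const n = (bytes[i] << 16) | (bytes[i + 1] << 8) | bytes[i + 2];
    out.push(B64[n >> 18], B64[(n >> 12) & 63], B64[(n >> 6) & 63], B64[n & 63]);
  }
  const rem = bytes.length - i;
  if (rem === 1) {
    const n = bytes[i] << 16;
    out.push(B64[n >> 18], B64[(n >> 12) & 63], '==');
  } else if (rem === 2) {
    const n = (bytes[i] << 16) | (bytes[i + 1] << 8);
    out.push(B64[n >> 18], B64[(n >> 12) & 63], B64[(n >> 6) & 63], '=');
  }
  return out.join('');
}
```

The point is that the quadratic blow-up comes from the string concatenation pattern, not from base64 encoding as such.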

The conversion via FileReader.readAsDataURL that @digaus used is infinitely better than the cordova-common algorithm.

I thought it would be reasonable to support FileWriter.write(buffer: ArrayBuffer), and let it do the chunking and conversion to base64, to bypass the cordova algorithm. Then users of this plugin would not need to implement custom chunking code.

So I started to change the FileWriter.js code. The files are written to disk, but I must have made a mistake somewhere, because the files seem corrupted ( I have removed the prefix from the readAsDataURL result ).

I am also not sure why @digaus's solution works. When I pass a string to FileWriter.write(), shouldn't it write the string as-is to disk, instead of interpreting it as base64-encoded binary data?
In www/FileWriter.js, we need to pass isBinary = true here:

}, 'File', 'write', [this.localURL, data, this.position, isBinary]);

In order to get LocalFileSystem.java writeToFileAtURL() to decode it to binary data.

Does FileSystem.writeFile behave differently?

I will spend some more time on this. I would really like to solve this issue and make a PR, once I get this to work.

@LightMind

I have managed to change FileWriter so that it can convert ArrayBuffers to base64-encoded strings itself.

To get the correct base64-encoded string, it is important to call FileReader.readAsDataURL the right way:

   fileReader.readAsDataURL(
        new Blob([arrayBuffer], {
            type: 'application/octet-binary'
        })
    );

Otherwise you would encode the string representation of the ArrayBuffer, which is not at all what you want.
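A promise-wrapped sketch of that conversion, including stripping the data-URL prefix before the string is handed to the native side (the helper names here are mine, not from the plugin):

```javascript
// Everything up to and including the first comma is the data-URL prefix,
// e.g. "data:application/octet-binary;base64,".
function stripDataUrlPrefix(dataUrl) {
  return dataUrl.substring(dataUrl.indexOf(',') + 1);
}

// Convert an ArrayBuffer to a base64 string using the browser's native
// (non-quadratic) encoder via FileReader.readAsDataURL.
function arrayBufferToBase64(arrayBuffer) {
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onerror = reject;
    reader.onload = () => resolve(stripDataUrlPrefix(reader.result));
    reader.readAsDataURL(new Blob([arrayBuffer], { type: 'application/octet-binary' }));
  });
}
```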
Link to the changed file

I have used this to write 1MiB and smaller files, and the performance looks better for now. Especially the memory usage for each file or chunk is only about twice the filesize/chunksize. There is less garbage collection happening.

I will test this with bigger files and measure the timing in comparison to the original version, to see if it actually performs better, and implement chunking, if that seems necessary.

@timbru31
Member

timbru31 commented Feb 9, 2021

Thanks for the detailed analysis and potential fix(es) @LightMind - looking forward to hearing more from you.

@LightMind

I have implemented the chunking behaviour in FileWriter.js.
It seems pretty necessary to do chunking, because even though we skip the args[i] = base64.fromArrayBuffer(args[i]); part, cordova.js still runs all the params we pass to the native layer through JSON.stringify.

This does not go well for several megabytes of data.

I tried with chunks of 128KiB, 1MiB, and 5MiB.

Larger chunks are faster to write, with less downtime between writes, but the memory spikes are worse. Memory usage is mostly due to the JSON serialization, as far as I can tell.
When writing one chunk of 5 MiB, my Android phone would use about 30 MiB.
Speeds were around 4.5 to 4.8 MiB/second (measured from calling FileWriter.write to the successCallback).
Writing 5 MiB chunks looks like this:
[Screenshot: memory profile while writing 5 MiB chunks]

When writing 1 MiB chunks, there is more downtime, but much less memory pressure:
spikes of less than 10 MiB of memory usage, and a write speed between 4.2 and 4.5 MiB/second.
So, slightly slower, but more stable.

Writing smaller chunks does not make sense, as the downtime between writes is too great, halving the write speed at 128 KiB chunks.

I have some questions, but I'll take that when I make a PR, which I will do soon.

@kputh

kputh commented Oct 7, 2021

If this is an issue with cordova-common, did you report it? I'd like to see this issue's status in the repo that owns it.

@erisu
Member

erisu commented Oct 8, 2021

@kputh This can't be an issue in cordova-common; cordova-common is not bundled with any app.

From the content of this thread, it sounds like you might mean cordova-js. This file specifically: https://github.com/apache/cordova-js/blob/master/src/common/base64.js

@raphinesse

I filed a PR in cordova-js that tries to improve the performance for big files as much as is possible with the current architecture. However, I think the only solid solution to this problem is to not process big files in the browser's main thread.

It seems the file-transfer-plugin caters well to the common use case of simply downloading and storing files. Chunked reading and writing seems to be a reasonable alternative if you really need to transfer a lot of bytes over the web/native bridge. However, I'm not quite sure if we should promote that by including it into this plugin.

One thing that we should definitely consider for this plugin is the move from the synchronous base64.fromArrayBuffer to using FileReader.readAsDataURL to convert files to Base64 strings.

@LightMind Thanks so much for your detailed analysis of the problem. You really pushed this issue forward!

@LightMind

@raphinesse Thank you so much, I really appreciate it :)

I am also a little bit worried about changing the behaviour of FileWriter to always use chunks, because it can potentially give problems with partial writes. Writing to a file feels to me like something that should be atomic, if possible.

This makes me unsure about if or how to continue #461

Would it be reasonable to provide a ChunkedFileWriter ( And maybe a ChunkedFileReader )? It could implement the chunked writing behaviour on top of FileWriter.

Something like

fileEntry.createWriter(fileWriter => {
    const chunkedWriter = new ChunkedFileWriter(fileWriter);
    chunkedWriter.setChunkSize(2 ** 15); // 32 KiB per chunk

    chunkedWriter.onwriteend = () => console.log('all chunks written');
    chunkedWriter.write(blob);
});

@raphinesse

@LightMind First off: sorry for the late reply.

Writing to a file feels to me like something that should be atomic, if possible.

Agreed. Maybe a ChunkedFileWriter would delete the whole file if a chunk write fails?

Would it be reasonable to provide a ChunkedFileWriter ( And maybe a ChunkedFileReader )? It could implement the chunked writing behaviour on top of FileWriter.

If we want a chunked writer in this plugin, I think that would be the way to go.

Another interesting approach can be seen in capacitor-blob-writer. It streams the data out of the WebView by means of network requests. Seems more reasonable than pushing everything over the bridge in base64 form. Moreover, this approach seems to grant great performance. A pity that we actually need a local server running on the device for that.

@ZouhirBensm

ZouhirBensm commented Apr 19, 2025

Hi All!

I read this thread, and some advanced concepts are being discussed here I see!

(Sorry for being a lay man developer, please go easy on me)

I am using this implementation, and I can download up to ~30 MB of data on my A53 Samsung Android device with it:

  // File download
  function downloadResult(taskId) {
    cordova.plugin.http.sendRequest(
      `http://XXX/poc4/download/${taskId}`,
      // `https://XXX.com/poc4/download/${taskId}`, // Production URL
      {
        method: 'get',
        responseType: 'arraybuffer',
        headers: { 'Accept': 'audio/mpeg' }
      },
      function (response) {
        console.log('Download successful, saving file');
        const blob = new Blob([response.data], { type: 'audio/mpeg' });
        saveMP3(
          blob,
          function (savedPath) {
            statusDiv.innerHTML = `Download complete! File saved to:<br>${savedPath}`;
            convertBtn.disabled = false;
          },
          function (error) {
            console.error('Save file failed:', error);
            statusDiv.textContent = `Failed to save file: ${error.message || 'Unknown error'}`;
            convertBtn.disabled = false;
          }
        );
      },
      function (error) {
        console.error('Download failed:', error);
        statusDiv.textContent = `Download failed: ${error.error || error.message || 'Unknown error'}`;
        convertBtn.disabled = false;
      }
    );
  }

  // Save MP3 file to device
  function saveMP3(blob, successCallback, errorCallback) {
    window.resolveLocalFileSystemURL(cordova.file.externalRootDirectory, function (rootDir) {
      rootDir.getDirectory('Download', { create: true }, function (downloadDir) {
        const fileName = `converted_${Date.now()}.mp3`;
        downloadDir.getFile(fileName, { create: true }, function (fileEntry) {
          fileEntry.createWriter(function (writer) {
            writer.onwriteend = function () {
              successCallback(fileEntry.toURL());
            };
            writer.onerror = function (error) {
              errorCallback(new Error('Failed to write file'));
            };
            writer.write(blob);
          }, function (error) {
            errorCallback(new Error('Failed to create file writer'));
          });
        }, function (error) {
          errorCallback(new Error('Failed to create file'));
        });
      }, function (error) {
        errorCallback(new Error('Failed to access Download directory'));
      });
    }, function (error) {
      errorCallback(new Error('Failed to access external storage'));
    });
  }

The app works and downloads the file when downloading ~30 MB of data.

The app crashes when trying to download ~60 MB of data.

I was looking for a bulletproof solution capable of downloading files (.mp3) of up to ~300 MB or more.

I'm using this for the request: cordova-plugin-advanced-http

I'm using this for saving the file: cordova-plugin-file (uses cordova.file).

Could I just use: https://github.com/sgrebnov/cordova-plugin-background-download ?

All help or recommendations would be appreciated.

Extra information:

Cordova Packages:

    cli: 12.0.0
        common: 5.0.1
        create: 5.0.0
        lib: 12.0.2
            common: 5.0.1
            fetch: 4.0.0
            serve: 4.0.1

Project Installed Platforms:

    android: 14.0.0

Project Installed Plugins:

    cordova-plugin-advanced-http: 3.3.1
    cordova-plugin-file: 8.1.3
    cordova-plugin-file-transfer: 2.0.0

Environment:

    OS: Ventura 13.1 (22C65) (darwin 22.2.0) x64
    Node: v18.19.0
    npm: 10.2.3

android Environment:

    android:
[=======================================] 100% Fetch remote repository...       
Available Android targets:
----------
id: 1 or "android-35"
     Name: Android API 35, extension level 13
     Type: Platform
     API level: 35
     Revision: 2

@HarelM

HarelM commented Apr 19, 2025

As mentioned in other posts, you should use the file transfer plugin to download and save the file without holding it in memory and crashing the app. I'm able to download and save 100 MB+ files using this method.
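For reference, a minimal sketch of that path with cordova-plugin-file-transfer (assumes the plugin plus cordova-plugin-file are installed; the function name is a placeholder). The bytes go from the network to the filesystem on the native side, so nothing large is held in the WebView:

```javascript
// Sketch: download straight to disk with cordova-plugin-file-transfer.
// `url` and `fileName` are placeholders; errors are passed through as-is.
function downloadToDisk(url, fileName, onDone, onError) {
  // Target path on the device, e.g. the app's external data directory.
  const target = cordova.file.externalDataDirectory + fileName;
  const ft = new FileTransfer();
  // The plugin streams network -> filesystem natively; the file contents
  // never pass through the WebView's JavaScript heap.
  ft.download(encodeURI(url), target, entry => onDone(entry.toURL()), onError);
}
```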

@GitToTheHub
Contributor

Another approach would be to download files with the Fetch API. This is a native browser feature and most modern browsers support it.

@HarelM

HarelM commented Apr 19, 2025

This is what I got from ChatGPT; I haven't tested it, though...
To download a large file in the browser using the Fetch API and File API (and optionally Streams API) while keeping memory usage low, the key is to stream the response and write it incrementally to a file rather than loading the entire file into memory.

Here's how to do it step-by-step:


✅ Use Case: Efficiently Download a Large File via Streaming

async function downloadLargeFile(url, filename) {
  const response = await fetch(url);

  if (!response.ok) throw new Error(`HTTP error! Status: ${response.status}`);

  const contentLength = response.headers.get('Content-Length');
  const total = contentLength ? parseInt(contentLength, 10) : 0;

  const reader = response.body.getReader();
  const stream = new ReadableStream({
    async start(controller) {
      let received = 0;

      while (true) {
        const { done, value } = await reader.read();
        if (done) break;

        controller.enqueue(value);
        received += value.length;
        console.log(`Received ${received} of ${total} bytes`);
      }

      controller.close();
      reader.releaseLock();
    }
  });

  const blob = await new Response(stream).blob();

  // Create a temporary download link and trigger download
  const blobUrl = URL.createObjectURL(blob);
  const a = document.createElement('a');
  a.href = blobUrl;
  a.download = filename;
  document.body.appendChild(a);
  a.click();
  a.remove();
  URL.revokeObjectURL(blobUrl);
}

🔍 Why this keeps memory usage low:

  • Streaming with ReadableStream avoids loading the full response into memory.
  • controller.enqueue() feeds small chunks into the ReadableStream.
  • new Response(stream).blob() buffers the stream efficiently compared to manually collecting chunks in an array.
  • The browser handles the blob efficiently once created, especially for download.

🧠 Bonus Tips

  • Progress Bar: You can update a progress bar using the received / total info.
  • Web Streams API: This solution takes full advantage of it, supported in most modern browsers.
  • Avoid collecting chunks in an array, as that will accumulate all data in memory — the very thing we're trying to avoid.

Would you like a version that saves directly to the user's disk using the File System Access API (Chrome/Edge support)? That allows real streaming to disk, ideal for very large files.

@ZouhirBensm

ZouhirBensm commented Apr 19, 2025

I'll do more research and update here.
After reading the new comments from @HarelM and @GitToTheHub, I remain skeptical and convinced it will not work. (I thank both of you.)
If anyone has a technical, efficient, simple workaround for the memory problem, I would appreciate it.
I'm convinced it has something to do with memory management in the Android app, which is why the app silently crashes.
I implemented and tried https://github.com/sgrebnov/cordova-plugin-background-download, and the Cordova build breaks.
(PS: I'm working on an audio converter product if anyone wants to help me develop it and get it on the Google Play Store.)
Will keep working, and may try the recommendations.

@breautek
Contributor

If anyone has a technical, efficient, simple work around the memory problem, I would appreciate it.
I'm convinced it has something to do with memory management from the android app, therefor the app silently crashes.

Keep in mind that on Android there is a rather conservative heap size. It does vary by device (devices with more total memory will have a larger heap), but in my experience most heap sizes are between 128 and 256 MB, or up to 512 MB on very high-end devices. This means the Android app can only allocate approximately 128 MB in the JVM before memory pressure gets too high, risking an OOM error.

If you can't keep your chunk sizes low and need to allocate larger amounts of memory, Android does expose a largeHeap flag which puts the JVM heap size closer to hardware limitations. But if you actually allocate large amounts of memory, it will hurt the performance of the application and be very inefficient with energy consumption: more objects to be tracked, longer GC sweeps, etc. Cordova has no option to enable this flag, but you can use the edit-config directive to apply it.
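For example, a config.xml fragment along these lines (a sketch — verify the manifest path against your generated project, and note the widget root element needs the xmlns:android namespace declared):

```xml
<!-- config.xml: enable android:largeHeap via edit-config.
     Requires xmlns:android="http://schemas.android.com/apk/res/android"
     on the <widget> root element. -->
<edit-config file="app/src/main/AndroidManifest.xml"
             mode="merge"
             target="/manifest/application">
    <application android:largeHeap="true" />
</edit-config>
```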

You can also bypass the JVM limitations by passing the data chunks through to the Android NDK, so that the processing is done in pure native land (using C), as memory allocated in the NDK is outside the JVM's heap and thus not counted against that 128-256 MB limit. If you're doing audio conversion, you would probably benefit greatly from the NDK, since audio codecs typically aren't lightweight, and you get raw hardware performance as well, but there is a cost to maintaining NDK code.

Speaking from my own experience maintaining GIS-like systems.

@ZouhirBensm

ZouhirBensm commented Apr 22, 2025

I found the solution.

I still need to decipher what exactly is going on.

These are my first impressions.

Cordova is able to download large (~300+ MB) files from a server.

To do this, what worked for me was editing the API calls in the method (tool/lib/package/...).

I randomly came across this article, dated 18 October 2017, that shows how to make the API calls the right way in order to retrieve a file, IMO:

https://cordova.apache.org/blog/2017/10/18/from-filetransfer-to-xhr2.html

You will find an implementation example on the webpage.

DM me on Github if you want the exact implementation I used to do so. I essentially was able to implement a version to download an MP3 file from a server in Blob format.

I identified that the response had a valid blob format of the MP3 file by inspecting it through code and logs.

After this add-on, the Cordova app was still crashing.

Get this!

I realized the crash was not from the API call, but from the save file process that saves "large" files onto the Android file system.

The saving process kept making the Cordova Android app crash.

So I am still not sure whether it was my previous API call method alone, or both it and the save process, that made the Cordova Android app crash.

(I had to make a bunch of detailed logs provided in real time from the Android OS level to identify that the save operation was making the app crash. Again, you can DM me to know the exact error if you want.)

The "hack" to fix this is to essentially save the file in chunks. (DM for more code details)

The diagnosis and solution are too long to explain here in detail, but the gist of the solution is here.

I find it curious that Cordova doesn't have an easy-to-find, simple built-in library/tool/API for saving large files.

I also find it curious that Cordova doesn't have easy-to-find documentation showing the best standard way to download and save a large file.

I'll be posting a detailed solution for downloading and saving large files in Cordova Android apps on Stack Overflow.

Anyhow, I thought I would share my solution for others that are struggling and wondering why their app is silently crashing when downloading a large file on an Android Cordova app.

Thanks.

@HarelM

HarelM commented Apr 22, 2025

You keep ignoring an existing solution to this problem with the file transfer plugin, I wonder why...

@ZouhirBensm

ZouhirBensm commented Apr 23, 2025

You keep ignoring an existing solution to this problem with the file transfer plugin, I wonder why...

My intuition told me it would not work tbh

@HarelM

HarelM commented Apr 23, 2025

I have it in production for a few years now. Working great.
