How can I write a large file from my application's assets folder to the SD card without getting an out-of-memory error?

Date: 2021-12-12 22:19:05

I am trying to copy a file of about 80 megabytes from the assets folder of an Android application to the SD card.

The file is another apk. For various reasons I have to do it this way and can't simply link to an online apk or put it on the Android market.

The application works fine with smaller apks but for this large one I am getting an out of memory error.

I'm not sure exactly how this works but I am assuming that here I am trying to write the full 80 megabytes to memory.

try {
    int length = 0;
    newFile.createNewFile();

    InputStream inputStream = ctx.getAssets().open(
            "myBigFile.apk");
    FileOutputStream fOutputStream = new FileOutputStream(
            newFile);
    // available() reports the full asset size here, so this allocates
    // a single ~80 MB buffer - the likely cause of the OutOfMemoryError
    byte[] buffer = new byte[inputStream.available()];
    while ((length = inputStream.read(buffer)) > 0) {
        fOutputStream.write(buffer, 0, length);
    }
    fOutputStream.flush();
    fOutputStream.close();
    inputStream.close();
} catch (Exception ex) {
    if (ODP_App.getInstance().isInDebugMode())
        Log.e(TAG, ex.toString());
}

I found this interesting - A question about an out of memory issue with Bitmaps

Unless I've misunderstood, in the case of Bitmaps, there appears to be some way to split the stream to reduce memory usage using BitmapFactory.Options.

Is this do-able in my scenario or is there any other possible solution?

2 Solutions

#1


7  

The trick is not to read the whole file in one go, but to read it in small chunks, writing each chunk out before reading the next one into the same buffer. The following version reads it in 1 KB chunks. It's an example only - you need to determine the right chunk size for your case.

try {
    int length = 0;
    newFile.createNewFile();

    InputStream inputStream = ctx.getAssets().open(
            "myBigFile.apk");
    FileOutputStream fOutputStream = new FileOutputStream(
            newFile);
    //note the following line
    byte[] buffer = new byte[1024];
    while ((length = inputStream.read(buffer)) > 0) {
        fOutputStream.write(buffer, 0, length);
    }
    fOutputStream.flush();
    fOutputStream.close();
    inputStream.close();
} catch (Exception ex) {
    if (ODP_App.getInstance().isInDebugMode())
        Log.e(TAG, ex.toString());
} 
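One caveat with the snippet above: if read() or write() throws, the two close() calls are skipped and the streams leak. On Java 7+ (Android API 19+), try-with-resources closes them automatically. Here is a minimal sketch of the same loop; the `ChunkedCopy`/`copy` names and the plain-java.io setup are my own illustration, not the original answer's code:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class ChunkedCopy {
    // Same 1 KB chunk loop as above, but try-with-resources guarantees
    // both streams are closed even when read() or write() throws.
    public static long copy(InputStream in, OutputStream out) throws IOException {
        try (InputStream is = in; OutputStream os = out) {
            byte[] buffer = new byte[1024];
            long total = 0;
            int length;
            while ((length = is.read(buffer)) > 0) {
                os.write(buffer, 0, length);
                total += length;
            }
            os.flush();
            return total;
        }
    }
}
```

In the asker's scenario this would be called as `ChunkedCopy.copy(ctx.getAssets().open("myBigFile.apk"), new FileOutputStream(newFile))`.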

#2


2  

Do not read the whole file into memory; read 64 KB at a time, write it out, and repeat until you reach the end of the file. Or use IOUtils from Apache Commons IO.

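For what it's worth, IOUtils.copy from Commons IO is essentially this loop packaged up, and since Java 9 (Android API 33+) the standard library ships the same thing as InputStream.transferTo, which copies through a small fixed internal buffer so heap use does not grow with the file size. A sketch, using an in-memory stream as a stand-in for the large asset:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class TransferDemo {
    public static void main(String[] args) throws IOException {
        byte[] data = new byte[150_000]; // stand-in for the 80 MB asset
        new java.util.Random(42).nextBytes(data);

        InputStream in = new ByteArrayInputStream(data);
        ByteArrayOutputStream out = new ByteArrayOutputStream();

        // transferTo copies in fixed-size internal chunks, so memory
        // stays bounded regardless of how large the input is.
        long copied = in.transferTo(out);
        System.out.println(copied); // 150000
    }
}
```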
