How can I insert a large file into a MySQL database using PHP?

Date: 2022-09-27 16:42:54

I want to upload a large file of maximum size 10MB to my MySQL database. Using .htaccess I changed PHP's own file upload limit to "10485760" = 10MB. I am able to upload files up to 10MB without any problem.

But I cannot insert the file into the database if it is more than 1MB in size.

I am using file_get_contents to read all file data and pass it to the insert query as a string to be inserted into a LONGBLOB field.

But files bigger than 1 MB are not added to the database, although I can use print_r($_FILES) to make sure that the file is uploaded correctly. Any help will be appreciated and I will need it within the next 6 hours. So, please help!

7 solutions

#1


7  

As far as I know it's generally quicker and better practice not to store the file in the db as it will get massive very quickly and slow it down. It's best to make a way of storing the file in a directory and then just store the location of the file in the db.

We do it for images/pdfs/mpegs etc in the CMS we have at work by creating a folder for the file named from the url-safe filename and storing the folder name in the db. It's easy just to write out the url of it in the presentation layer then.
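
A minimal sketch of this approach, assuming a hypothetical "userfile" form field, a web-server-writable uploads/ directory, and a made-up "files" table:

```php
<?php
// Keep the bytes on disk; store only the path (and original name) in MySQL.
// All names here (uploads/, the files table, credentials) are illustrative.
$dir      = __DIR__ . '/uploads/';
$safeName = preg_replace('/[^A-Za-z0-9._-]/', '_', basename($_FILES['userfile']['name']));
$target   = $dir . uniqid('', true) . '_' . $safeName;

if (move_uploaded_file($_FILES['userfile']['tmp_name'], $target)) {
    $mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');
    $stmt   = $mysqli->prepare('INSERT INTO files (name, path) VALUES (?, ?)');
    $stmt->bind_param('ss', $safeName, $target);
    $stmt->execute();
}
```

The uniqid() prefix avoids collisions between two uploads that sanitize to the same name.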

#2


13  

You will want to check the MySQL configuration value "max_allowed_packet", which might be set too small, preventing the INSERT (which is large itself) from happening.

Run the following from a mysql command prompt:

mysql> show variables like 'max_allowed_packet';

Make sure it's large enough. For more information on this config option, see

MySQL max_allowed_packet

This also impacts mysql_escape_string() and mysql_real_escape_string() in PHP, limiting the size of the string they can build.
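
A quick way to check and raise the limit (the 16M figure is only an example; size it above your largest expected INSERT):

```sql
-- At a mysql prompt (requires SUPER privilege; applies to new connections):
SHOW VARIABLES LIKE 'max_allowed_packet';
SET GLOBAL max_allowed_packet = 16 * 1024 * 1024;
```

To make the change survive a server restart, set it in the server config instead:

```ini
# my.cnf / my.ini
[mysqld]
max_allowed_packet = 16M
```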

#3


4  

The best answer is to use an implementation that works around that issue entirely. You can read an article here. Store 10MB or 1000MB, it doesn't matter: the implementation chunks the file into many smaller pieces and stores them in multiple rows. This also helps with loading and fetching, so memory doesn't become an issue either.

#4


3  

Some PHP extensions for MySQL have issues with LONGBLOB and LONGTEXT data types. The extensions may not support blob streaming (posting the blob one segment at a time), so they have to post the entire object in one go.

So if PHP's memory limit or MySQL's packet size limit restricts the size of an object you can post to the database, you may need to change some configuration on either PHP or MySQL to allow this.

You didn't say which PHP extension you're using (there are at least three for MySQL), and you didn't show any of the code you're using to post the blob to the database.
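
With the mysqli extension, for example, a blob parameter can be streamed in segments via mysqli_stmt::send_long_data(), so PHP never has to hold the whole file as one string. The "files" table and credentials below are assumptions:

```php
<?php
// Stream a BLOB parameter in 64KB segments instead of one giant string.
// Note: the server may still cap the accumulated size at max_allowed_packet.
$mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');
$stmt   = $mysqli->prepare('INSERT INTO files (name, data) VALUES (?, ?)');

$name = $_FILES['userfile']['name'];
$null = null;                          // placeholder; real data is sent below
$stmt->bind_param('sb', $name, $null); // 'b' marks the parameter as a BLOB

$fp = fopen($_FILES['userfile']['tmp_name'], 'rb');
while (!feof($fp)) {
    $stmt->send_long_data(1, fread($fp, 65536)); // parameter index is 0-based
}
fclose($fp);
$stmt->execute();
```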

#5


2  

You could use MySQL's LOAD_FILE function to store the file, but you still have to obey the max_allowed_packet value and the fact that the file must be on the same server as the MySQL instance.
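
For example (path and table name are illustrative; note that LOAD_FILE() returns NULL rather than failing when its preconditions aren't met):

```sql
-- The file must be readable by the mysqld process, smaller than
-- max_allowed_packet, and, if secure_file_priv is set, under that directory.
INSERT INTO files (name, data)
VALUES ('report.pdf', LOAD_FILE('/var/lib/mysql-files/report.pdf'));
```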

#6


0  

You don't say what error you're getting (use mysql_error() to find out), but I suspect you may be hitting the maximum packet size.

If this is the case, you'd need to change max_allowed_packet in your MySQL configuration.

#7


0  

You don't say what error you're getting (use mysql_error() to find out), but I suspect you may be hitting the maximum packet size.

If this is the case, you'd need to change max_allowed_packet in your MySQL configuration.

Well, I have the same problem: data cannot be written to the MySQL database chunk by chunk in an I/O-style loop like this:

loop:
    read $data from file
    write $data to blob
end loop
close file
close blob

A solution seems to be to create a table with multi-part blobs, e.g. CREATE TABLE data_details ( id INT PRIMARY KEY AUTO_INCREMENT, chunk_number INT NOT NULL, dataPart BLOB );
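
That idea can be sketched in PHP roughly as follows (table layout, chunk size, and credentials are all illustrative):

```php
<?php
// Assumed table, extending the sketch above with a file_id column:
//   CREATE TABLE data_details (
//     id INT PRIMARY KEY AUTO_INCREMENT,
//     file_id INT NOT NULL,
//     chunk_number INT NOT NULL,
//     dataPart BLOB NOT NULL
//   );
const CHUNK_SIZE = 65536; // 64KB per row, well below BLOB and packet limits

$mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');
$stmt   = $mysqli->prepare(
    'INSERT INTO data_details (file_id, chunk_number, dataPart) VALUES (?, ?, ?)'
);

$fileId = 1; // illustrative key identifying the logical file
$in     = fopen($_FILES['userfile']['tmp_name'], 'rb');
for ($n = 0; ($chunk = fread($in, CHUNK_SIZE)) !== '' && $chunk !== false; $n++) {
    $stmt->bind_param('iis', $fileId, $n, $chunk);
    $stmt->execute();
}
fclose($in);
// Reassemble later with:
//   SELECT dataPart FROM data_details WHERE file_id = ? ORDER BY chunk_number;
```

A 10MB upload would become 160 rows of 64KB each, so no single INSERT approaches max_allowed_packet.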
