Inserting large CSV/XLS data files into MySQL with CodeIgniter

Time: 2022-10-06 19:40:12

This code works fine with CSV files containing around 1K records, but it hangs if I try to insert files with, say, 100K. I searched online and found many suggestions, such as raising the PHP execution timeout and increasing the memory allocated to PHP (this helped a little), but I need to import large data files on the order of 1 million records.
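For reference, the timeout and memory adjustments mentioned above usually amount to something like the following (a generic sketch, not part of the original post; the values are illustrative only):

    // Illustrative PHP tuning only; pick limits that suit your server.
    ini_set('memory_limit', '512M'); // raise the memory available to the script
    set_time_limit(0);               // lift the execution time limit for this request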

I'm using CodeIgniter. Is there a way to speed up this process? A description of the function:

- The first "INSERT INTO" inserts the logged-in client (admin_id) and the creation time (create_time) into table="client".

- The second "INSERT INTO" inserts the attribute_id, client_id, and value into table="client_attribute_value".

    function Add_multiple_users($values)
    {
        $err = '';
        foreach ($values as $rows)
        {
            // Insert one client row per CSV record: the logged-in admin and the creation time
            $clientQuery = 'INSERT INTO
                                client
                                (
                                    admin_id,
                                    create_time
                                )
                                VALUES
                                (
                                    "'.$this->session->userdata('user_id').'",
                                    "'.date('Y-m-d H:i:s').'"
                                )';
            $clientResult = @$this->db->query($clientQuery);
            if ($clientResult)
            {
                $client_id = $this->db->insert_id();
                // Insert one attribute row per field of the record
                foreach ($rows as $row)
                {
                    $attrQuery = 'INSERT INTO
                                        client_attribute_value
                                        (
                                            attribute_id,
                                            client_id,
                                            value
                                        )
                                        VALUES
                                        (
                                            "'.$row['attribute_id'].'",
                                            "'.$client_id.'",
                                            "'.addslashes(trim($row['value'])).'"
                                        )';
                    $attrResult = @$this->db->query($attrQuery);
                    if (!$attrResult)
                    {
                        $err .= '<p class="box error">Could not add attribute for<br>
                                Attribute ID: '.$row['attribute_id'].'<br>
                                Client ID: '.$client_id.'<br>
                                Attribute Value: '.trim($row['value']).'</p>';
                    }
                }
            }
        }
        return $err;
    }
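Before moving the work out of the web request entirely, one common way to cut the per-row overhead is to wrap each file in a single transaction and batch the attribute inserts through CodeIgniter's query builder. The sketch below is not from the question; the method name is made up, and only the table and column names follow the schema described above:

    // Hypothetical variant of the method above: same data flow, but one
    // transaction per call and one multi-row INSERT per client instead of
    // one query per attribute value.
    function Add_multiple_users_batched($values)
    {
        $admin_id = $this->session->userdata('user_id');
        $now      = date('Y-m-d H:i:s');

        $this->db->trans_start();   // stop committing after every single INSERT
        foreach ($values as $rows)
        {
            // The query builder escapes values, so addslashes() is no longer needed
            $this->db->insert('client', array('admin_id' => $admin_id, 'create_time' => $now));
            $client_id = $this->db->insert_id();

            $batch = array();
            foreach ($rows as $row)
            {
                $batch[] = array(
                    'attribute_id' => $row['attribute_id'],
                    'client_id'    => $client_id,
                    'value'        => trim($row['value']),
                );
            }
            if (!empty($batch))
            {
                // One multi-row INSERT per client instead of one query per attribute
                $this->db->insert_batch('client_attribute_value', $batch);
            }
        }
        $this->db->trans_complete();

        return $this->db->trans_status() ? '' : '<p class="box error">Import failed; transaction rolled back.</p>';
    }

Even with batching, a single web request may still hit the request timeout at the million-record scale, which is what the answer below addresses.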

1 Answer

#1



I've dealt with this before: How to parse Large CSV file without timing out?

Upload the file itself, or record where it lives, and use a cron job to handle it, so nothing times out. The way I did it personally was to set up a queue system: a database table that pointed to the file and carried a status flag. Once the cron job picked up that item, it would begin inserting the CSV file (by the end it was ~300 MB in size). Once the cron job was done inserting into the db, it would delete the CSV file, update the status for that queued item, and email me to say it was completed.
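A rough sketch of that setup follows; the import_queue table, its columns, the CSV column order, and the notification address are all assumptions for illustration, not part of the original answer:

    // Hypothetical cron-driven worker. Assumed queue table:
    //   import_queue(id, file_path, status)
    // Example cron entry: * * * * * php /var/www/index.php import_worker run
    class Import_worker extends CI_Controller
    {
        public function run()
        {
            if (!$this->input->is_cli_request()) { show_error('CLI only'); return; }

            // Claim one pending file from the queue
            $job = $this->db->where('status', 'pending')->order_by('id')->limit(1)
                            ->get('import_queue')->row();
            if (!$job) { return; }
            $this->db->where('id', $job->id)->update('import_queue', array('status' => 'processing'));

            // Stream the CSV so memory stays flat, inserting in chunks
            $handle = fopen($job->file_path, 'r');
            $batch  = array();
            while (($line = fgetcsv($handle)) !== FALSE)
            {
                // Column order is assumed: attribute_id, client_id, value
                $batch[] = array('attribute_id' => $line[0], 'client_id' => $line[1], 'value' => $line[2]);
                if (count($batch) >= 1000)
                {
                    $this->db->insert_batch('client_attribute_value', $batch);
                    $batch = array();
                }
            }
            if (!empty($batch)) { $this->db->insert_batch('client_attribute_value', $batch); }
            fclose($handle);

            // Mark the job done, remove the file, and notify
            unlink($job->file_path);
            $this->db->where('id', $job->id)->update('import_queue', array('status' => 'done'));
            mail('admin@example.com', 'CSV import finished', 'Queue item '.$job->id.' imported.');
        }
    }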
