Is it normal for MySQL to take a very long time to import a large SQL file?

Time: 2023-01-19 20:21:19

Several days ago, I downloaded the Wikipedia "External Links" SQL file. The external links table has three columns: the first is an int(8), and the other two are blobs, each containing a URL. I started importing the 9.3 GB SQL file with phpMyAdmin; it is now over 50 hours later and the import still hasn't finished. So far, over 33 million rows have been inserted.

Is it normal for it to take THAT long, or is there something wrong with my setup?

Note: my server is an E3-1230 with 16 GB of RAM.

TIA!

1 solution

#1


That seems far too long. Here are my own import times for reference.

My computer's specifications:
- Windows 10 Home
- Intel Core i5-3317U CPU @ 1.7 GHz
- 6 GB of RAM
- 750 GB 5400 RPM hard drive
- MySQL 5.7 and MySQL Workbench 6.3 CE

I imported two files downloaded from the NYC open data portal:
- fare.csv (NYC taxi data): 1.67 GB, 11 columns, 15,749,228 rows
- nyc311calls.csv: 5.83 GB, 53 columns, 11,145,686 rows

The SQL script is below:

LOAD DATA LOCAL INFILE 'C:/test_data/FILE_NAME'
INTO TABLE DB_NAME
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 ROWS;
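For a dump as large as the one in the question, the per-row overhead of index and constraint checking can dominate; a common approach is to relax those checks for the duration of the bulk load and restore them afterwards. A minimal sketch, reusing the same placeholder file and table names (FILE_NAME, DB_NAME) as the statement above:

```sql
-- Sketch: relax per-row checks around a bulk load
-- (FILE_NAME and DB_NAME are placeholders, as above).
SET autocommit = 0;
SET unique_checks = 0;
SET foreign_key_checks = 0;

LOAD DATA LOCAL INFILE 'C:/test_data/FILE_NAME'
INTO TABLE DB_NAME
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 ROWS;

COMMIT;
SET unique_checks = 1;
SET foreign_key_checks = 1;
```

These session variables come from MySQL's InnoDB bulk-loading guidance; remember to re-enable the checks once the load finishes.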

Running time to import fare.csv: 320.706 sec (less than 6 minutes)
Running time to import nyc311calls.csv: 968 sec (about 16 minutes)
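Put side by side, the implied throughput makes the gap concrete. The arithmetic below uses only the figures quoted in this thread (33 million rows in 50 hours for the question's phpMyAdmin import, versus fare.csv loaded with LOAD DATA above):

```sql
-- Rows per second implied by the figures in this thread.
SELECT 33000000 / (50 * 3600) AS phpmyadmin_rows_per_sec,  -- ~183
       15749228 / 320.706     AS load_data_rows_per_sec;   -- ~49100
```

That is roughly a 270x difference, which suggests the bottleneck is the import method (statement-by-statement inserts through phpMyAdmin) rather than the hardware.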
