Executing a batch of commands as a queue in a large script (cron)

Time: 2022-06-05 01:16:33

I am building a remote file-uploading script that uploads one or more chosen files to multiple filehosts. It runs as a CLI script, so all the main commands are executed with the exec function. One PHP file builds the commands from the user's input and saves them in a .json file.
A separate file, run manually or via cron, then executes that batch of commands from the json file. However, if I input 50 files at once across 3 filehosts, there are roughly 100-150+ commands to execute, and the CLI script often stops or hangs midway due to nginx/PHP timeouts or similar reasons. I then have to restart the whole batch and re-upload every file, rather than resuming from the point where it stopped.
Is there a better way to manage this kind of long command queue, ideally one that can resume from where it was last suspended or aborted?
One idea I had: instead of putting all commands in a single json file, create one file per command and save them in a new folder created for that queue. The cron script then picks a command file, executes it, deletes the file on success, and moves on to the next one (in a loop).
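The file-per-command idea above can be sketched as a small runner script that cron would invoke. This is a minimal sketch, not the poster's actual code: the queue directory, file names, and the seeded demo commands are all illustrative. The key property is that a file is deleted only after its command succeeds, so rerunning the script after a crash resumes at the first unfinished command.

```shell
# Illustrative queue directory; in practice this would be a per-batch folder
# created by the PHP front end, e.g. queue/<batch-id>/.
QUEUE_DIR="$(mktemp -d)"

# The PHP side would write one command per file; seed two demo entries here.
printf 'echo uploading file A\n' > "$QUEUE_DIR/001.cmd"
printf 'echo uploading file B\n' > "$QUEUE_DIR/002.cmd"

# Cron entry point: process command files in order.
for cmd_file in "$QUEUE_DIR"/*.cmd; do
    [ -e "$cmd_file" ] || break          # glob matched nothing: queue is empty
    if sh "$cmd_file"; then
        rm -f "$cmd_file"                # success: remove so a rerun skips it
    else
        echo "failed, will retry next run: $cmd_file" >&2
        break                            # leave the file in place and stop
    fi
done
```

Because each command lives in its own file, a crashed or killed run loses at most the command that was in flight; the next cron invocation simply continues with whatever files remain.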


Is that the best option I have?


1 solution

#1


Check your php.ini to increase max_execution_time, or use the set_time_limit function.
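For reference, the setting the answer refers to looks like this (a sketch; whether it helps depends on how the script is invoked, since the PHP CLI SAPI already defaults max_execution_time to 0, so timeouts on a true CLI run usually come from elsewhere, such as the web server or php-fpm):

```ini
; php.ini
max_execution_time = 0   ; 0 means no time limit
```

The equivalent at runtime is calling set_time_limit(0) at the top of the script.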
