Periodically transfer data from an S3 bucket to Redshift

Time: 2022-12-25 23:06:17

I have some data stored in S3. I need to copy this data periodically from S3 to a Redshift cluster. For a one-off bulk load, I can use the COPY command to copy from S3 to Redshift.
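For context, a single bulk load like the one mentioned above boils down to issuing a Redshift COPY statement. The sketch below just builds that statement as a string; the table name, S3 path, and IAM role ARN are placeholders, and the actual statement would be executed against the cluster with any SQL client.

```python
def build_copy_statement(table: str, s3_path: str, iam_role: str) -> str:
    """Build a Redshift COPY statement for loading CSV data from S3.

    All arguments here are hypothetical placeholders; substitute your
    own table name, bucket path, and IAM role ARN.
    """
    return (
        f"COPY {table} "
        f"FROM '{s3_path}' "
        f"IAM_ROLE '{iam_role}' "
        "FORMAT AS CSV;"
    )


stmt = build_copy_statement(
    "events",
    "s3://my-bucket/events/",
    "arn:aws:iam::123456789012:role/RedshiftCopyRole",
)
print(stmt)
```

Scheduling then reduces to running this statement on whatever cadence you need.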


Is there a similarly simple way to copy data from S3 to Redshift on a recurring schedule?


Thanks


1 solution

#1



Try AWS Data Pipeline, which has various templates for moving data from one AWS service to another. The "Load data from S3 into Redshift" template copies data from an Amazon S3 folder into a Redshift table. You can load the data into an existing table or provide a SQL query to create the table. The Redshift table must have the same schema as the data in Amazon S3.


Data Pipeline supports running pipelines on a schedule, and it provides a cron-style editor for configuring that schedule.
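If you would rather skip Data Pipeline, a plain cron job that runs the COPY statement through `psql` is another common approach (Redshift speaks the PostgreSQL wire protocol). The entry below is a hypothetical sketch; the cluster endpoint, database name, and script path are placeholders.

```shell
# Hypothetical crontab entry: run the S3-to-Redshift COPY every day at 02:00.
# Endpoint, database, and SQL script path are placeholders.
0 2 * * * psql -h my-cluster.abc123.us-east-1.redshift.amazonaws.com -p 5439 -d mydb -U loader -f /opt/etl/copy_from_s3.sql
```

Here `/opt/etl/copy_from_s3.sql` would contain the COPY statement, and credentials would come from `~/.pgpass` or an equivalent mechanism rather than the command line.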

