Can Spark Streaming create a thread on an executor?

Date: 2022-06-18 02:12:10

I have a question on Spark Streaming. In my Spark Streaming application, I have code that runs on a worker/executor as a task (inside foreachPartition() while processing an RDD). As part of this code I want to create a thread that will run continuously on the executor/worker from the time it is launched for as long as the executor is alive, listening for some external events and taking some action based on them.


Is this possible to do in Spark Streaming?

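For reference, a minimal sketch of the pattern described above: a per-JVM singleton that lazily starts one background thread the first time any task on that executor touches it. The object name, the listening loop, and ensureStarted() are all illustrative, not an established Spark API.

```scala
import java.util.concurrent.atomic.AtomicBoolean

// Hypothetical per-executor singleton: the thread is started at most once
// per executor JVM, the first time a task running there calls ensureStarted().
object ExecutorEventListener {
  private val started = new AtomicBoolean(false)

  def ensureStarted(): Unit = {
    if (started.compareAndSet(false, true)) {
      val t = new Thread(new Runnable {
        override def run(): Unit = {
          while (true) {
            // Placeholder: listen to some external event source and react to it.
            Thread.sleep(1000)
          }
        }
      })
      t.setDaemon(true) // do not keep the executor JVM alive during shutdown
      t.setName("external-event-listener")
      t.start()
    }
  }
}

// Inside the streaming job, e.g. in foreachRDD / foreachPartition:
// dstream.foreachRDD { rdd =>
//   rdd.foreachPartition { partition =>
//     ExecutorEventListener.ensureStarted() // runs in the executor JVM
//     partition.foreach { record =>
//       // normal per-record processing
//     }
//   }
// }
```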

1 solution

#1



You could try to fit this into a custom receiver. You can find some details in Implementing a Custom Receiver. Otherwise it doesn't fit very well into Spark Streaming.

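A minimal sketch of what such a custom receiver could look like, following the Receiver API from the Spark Streaming documentation; the class name, the event source, and the polling interval are placeholders for whatever you actually need to listen to:

```scala
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

// Hypothetical receiver: the long-running listener lives inside onStart()
// and runs on an executor for as long as the receiver is scheduled there.
class ExternalEventReceiver extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) {

  override def onStart(): Unit = {
    // onStart() must return quickly, so do the actual listening on a thread.
    new Thread("external-event-receiver") {
      override def run(): Unit = listen()
    }.start()
  }

  override def onStop(): Unit = {
    // Nothing to clean up here; the listen() loop checks isStopped().
  }

  private def listen(): Unit = {
    while (!isStopped()) {
      // Placeholder: wait for an external event, then push it into Spark.
      val event = "some external event" // replace with your real event source
      store(event)
      Thread.sleep(1000)
    }
  }
}

// Driver side:
// val events = ssc.receiverStream(new ExternalEventReceiver())
```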

It is possible to start a thread on the driver, but I understand that is not what you want.

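For completeness, a driver-side thread is just a plain JVM thread started from the application's main code, e.g. before ssc.start(); it runs in the driver process only, not on the executors. A small sketch (names are illustrative):

```scala
// Hypothetical driver-side background listener, started before ssc.start().
val driverSideThread = new Thread(new Runnable {
  override def run(): Unit = {
    while (!Thread.currentThread().isInterrupted) {
      // listen for external events here (driver JVM only)
      Thread.sleep(1000)
    }
  }
})
driverSideThread.setDaemon(true)
driverSideThread.start()

// ssc.start()
// ssc.awaitTermination()
```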
