Spark job submitted in standalone mode fails because spark.rpc.message.maxSize is too small

Time: 2022-01-22 05:06:02
1. Error message
org.apache.spark.SparkException: Job aborted due to stage failure: Serialized task 32:5 was 1728746673 bytes,
which exceeds max allowed: spark.rpc.message.maxSize (134217728 bytes).
Consider increasing spark.rpc.message.maxSize or using broadcast variables for large values.
at org.apache.spark.scheduler.DAGScheduler……
2. Cause
  The data Spark ships between nodes along with a serialized task is larger than the default limit of 128 MB, so you must either raise spark.rpc.message.maxSize or distribute the large data with a broadcast variable.
In some situations broadcasting does not fit the requirement; in that case you can simply configure spark.rpc.message.maxSize to a higher value when submitting the job.
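For the broadcast route mentioned in the error message, the idea is to ship a large read-only value to the executors once through the block manager instead of serializing it into every task. A minimal Scala sketch (the names BroadcastExample and bigLookup are illustrative, not from the original post):

import org.apache.spark.sql.SparkSession

object BroadcastExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("broadcast-demo").getOrCreate()
    val sc = spark.sparkContext

    // A large read-only map that would otherwise be captured in every task closure
    val bigLookup: Map[Int, String] = (1 to 1000000).map(i => i -> s"value-$i").toMap

    // Broadcast it once; tasks then reference the handle instead of carrying the data
    val bcLookup = sc.broadcast(bigLookup)

    val hits = sc.parallelize(1 to 100)
      .map(i => bcLookup.value.getOrElse(i, "missing"))
      .count()

    println(hits)
    spark.stop()
  }
}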
3. Solution
./bin/spark-submit \
  --class <main-class> \
  --master <master-url> \
  --deploy-mode <deploy-mode> \
  --conf spark.rpc.message.maxSize=256 \
  ... # other options
  <application-jar> \
  [application-arguments]
Adjust spark.rpc.message.maxSize in the --conf option above as needed (highlighted in red in the original post); the value is in MB, so 256 raises the limit to 256 MB, which was verified to fix the error.
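If you cannot change the spark-submit command line, the same setting can also be placed in conf/spark-defaults.conf or set on the SparkConf before the SparkContext is created. A short sketch under that assumption (the object name and app name are only examples; spark.rpc.message.maxSize is read at context startup, so it must be set before the session exists):

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object SubmitWithLargerRpcMessage {
  def main(args: Array[String]): Unit = {
    // Value is in MB; 256 here mirrors the command-line example above
    val conf = new SparkConf()
      .setAppName("maxsize-demo")
      .set("spark.rpc.message.maxSize", "256")

    val spark = SparkSession.builder().config(conf).getOrCreate()
    // ... job code ...
    spark.stop()
  }
}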