Configuring spark.yarn.jars for Spark on YARN

Date: 2025-05-15 07:46:56

1. Why this is needed:

  • Official documentation
  • /docs/latest/#preparations

To make Spark runtime jars accessible from YARN side, you can specify spark.yarn.archive or spark.yarn.jars. For details please refer to Spark Properties. If neither spark.yarn.archive nor spark.yarn.jars is specified, Spark will create a zip file with all jars under $SPARK_HOME/jars and upload it to the distributed cache.
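The two properties named in the docs cover two packaging styles: spark.yarn.jars points at individual jars (a glob is allowed), while spark.yarn.archive points at a single archive of the jars directory, and per the Spark docs it replaces spark.yarn.jars when both are set. A minimal spark-defaults.conf sketch of either choice (the HDFS paths are assumptions matching this cluster's layout):

```
# Option 1: reference individual jars on HDFS (glob allowed)
spark.yarn.jars        hdfs://hadoop001:9000/spark-yarn/jars/*.jar

# Option 2 (alternative): reference one archive of the jars directory
# spark.yarn.archive   hdfs://hadoop001:9000/spark-yarn/spark-libs.zip
```

Only one of the two needs to be set; without either, Spark re-zips and uploads the local jars on every submission, which is the slow behavior this setup avoids.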

2. Setup:

  • Upload the jars under spark/jars to HDFS
# Create the target directory on HDFS
[hadoop@hadoop001 jars]$ hadoop fs -mkdir -p /spark-yarn/jars/
# Upload the jars
[hadoop@hadoop001 jars]$ hadoop fs -put *.jar /spark-yarn/jars/
# Add the property to Spark's configuration file (spark-defaults.conf)
[hadoop@hadoop001 conf]$ vi spark-defaults.conf

spark.yarn.jars    hdfs://hadoop001:9000/spark-yarn/jars/*.jar
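After the upload and the config change, a quick way to confirm the setting is picked up is to run a sample job on YARN and check that the client log no longer prints the "Neither spark.yarn.jars nor spark.yarn.archive is set" warning about uploading the local libraries. A sketch, assuming a stock Spark layout on this cluster (the example class and jar path are from the standard Spark distribution):

```shell
# Sanity check: confirm the jars actually landed on HDFS
hadoop fs -ls /spark-yarn/jars/ | head

# Run the bundled SparkPi example on YARN; with spark.yarn.jars set,
# the launch log should reference hdfs://hadoop001:9000/spark-yarn/jars/
# instead of uploading a freshly built zip of $SPARK_HOME/jars
spark-submit \
  --master yarn \
  --deploy-mode client \
  --class org.apache.spark.examples.SparkPi \
  $SPARK_HOME/examples/jars/spark-examples_*.jar 100
```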