Debugging a Spark Program Locally in IDEA: Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()

Date: 2024-04-14 14:55:13

1. A very simple test program:

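The original snippet was not preserved in this post, but judging from the stack trace below (com.spark.sample.WordCount, with new SparkContext at WordCount.scala:11) it was a minimal word count along these lines. This is a hedged reconstruction, not the author's exact code; the input path is purely hypothetical:

package com.spark.sample

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    // local[*] runs the driver and executors in this one JVM, one thread per core
    val conf = new SparkConf().setAppName("WordCount").setMaster("local[*]")
    val sc = new SparkContext(conf) // the line the stack trace points at
    sc.textFile("data/input.txt")   // hypothetical input file
      .flatMap(_.split("\\s+"))     // split each line into words
      .map((_, 1))                  // pair every word with a count of 1
      .reduceByKey(_ + _)           // sum the counts per word
      .collect()
      .foreach(println)
    sc.stop()
  }
}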

2. Running it produces an ERROR, and the reported failing line is the new SparkContext call:

val sc = new SparkContext(conf)

Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
    at akka.actor.ActorCell$.<init>(ActorCell.scala:336)
    at akka.actor.ActorCell$.<clinit>(ActorCell.scala)
    at akka.actor.RootActorPath.$div(ActorPath.scala:159)
    at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:464)
    at akka.remote.RemoteActorRefProvider.<init>(RemoteActorRefProvider.scala:124)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
    at scala.util.Try$.apply(Try.scala:192)
    at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
    at scala.util.Success.flatMap(Try.scala:231)
    at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
    at akka.actor.ActorSystemImpl.liftedTree1$1(ActorSystem.scala:584)
    at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:577)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:141)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:118)
    at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:122)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1991)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1982)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
    at org.apache.spark.rpc.akka.AkkaRpcEnvFactory.create(AkkaRpcEnv.scala:245)
    at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:52)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:247)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:188)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
    at com.spark.sample.WordCount$.main(WordCount.scala:11)
    at com.spark.sample.WordCount.main(WordCount.scala)
20/03/11 21:53:22 INFO Utils: Shutdown hook called
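A NoSuchMethodError here means the JVM looked up scala.collection.immutable.HashSet$.empty() with one binary signature while the scala-library on the classpath provides another; in other words, Akka (which this Spark 1.x uses for RPC, as the trace shows) was compiled against a different Scala than the one actually loaded. A quick way to see which scala-library jar the program really runs with is plain JDK reflection; a minimal sketch, nothing Spark-specific:

object ClasspathCheck {
  def main(args: Array[String]): Unit = {
    // Ask the class loader where HashSet came from. getCodeSource can be
    // null for bootstrap classes, but a jar-provided scala-library reports its path.
    val src = classOf[scala.collection.immutable.HashSet[_]]
      .getProtectionDomain.getCodeSource
    println(s"scala-library loaded from: ${Option(src).map(_.getLocation)}")
  }
}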
3. A Baidu search turns up fellow coders saying this is caused by incompatible Scala and Spark versions.

The locally installed Scala version is 2.11.12.

The Global Libraries entry in IDEA is also 2.11.12.
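For a Maven project, though, neither the installed Scala nor the IDEA library entry is what the program actually runs with: Maven puts its own scala-library on the classpath, pulled in transitively by spark-core. A one-liner sketch to print the version that is really executing (scala.util.Properties is part of the standard library):

object VersionCheck {
  def main(args: Array[String]): Unit = {
    // prints e.g. "version 2.11.12" -- the scala-library Maven resolved, not the local install
    println(scala.util.Properties.versionString)
  }
}

If this prints a 2.10.x version while everything else says 2.11, the mismatch is confirmed.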

Since the Spark program is a Maven project, check the pom (mvn dependency:tree will also show which scala-library is pulled in transitively):

<dependency>
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-core_2.10</artifactId>
   <version>1.4.0</version>
</dependency>

4. On the Maven Repository site, look up a Spark build that matches the local Scala version. The suffix in the artifactId (_2.10, _2.11) is the Scala binary version the library was compiled against, and Scala 2.10 and 2.11 are not binary compatible, which is exactly what the NoSuchMethodError above is complaining about. Change the dependency to:

<dependency>
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-core_2.11</artifactId>
   <version>2.1.0</version>
</dependency>

5. Retried the program: OK, it passes.

Many thanks to the fellow coders!!!