How can I use Spark 1.2.0 in a Play 2.2.3 project, when it fails with NoSuchMethodError: akka.util.Helpers?

Date: 2022-06-27 23:09:40

Have you ever had a problem with the Play framework? In my case, first of all I built everything into one jar, spark-assembly-1.2.0-hadoop2.4.0.jar, and Spark works perfectly from a shell. But there are two questions:


  1. Should I use this assembled Spark jar in the Play project, and how? I tried moving it into the lib directory, but that didn't make any Spark imports available.


  2. If I'm defining the Spark library like: "org.apache.spark" %% "spark-core" % "1.2.0"


PLAY FRAMEWORK CODE:


Build.scala:


val appDependencies = Seq(
        jdbc
        ,"org.apache.spark" %% "spark-streaming" % "1.2.0"
        ,"org.apache.spark" %% "spark-core" % "1.2.0"
        ,"org.apache.spark" %% "spark-sql" % "1.2.0"
)

TestEntity.scala:


package models

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.rdd.RDD
import models.SparkMain

object TestEntity {
  val TestEntityPath = "/home/t/PROD/dict/TestEntity .txt"
  // Load the dictionary file into a 4-partition RDD and cache it.
  val TestEntitySpark: RDD[String] = SparkMain.sc.textFile(TestEntityPath, 4).cache
  val TestEntityData: RDD[String] = TestEntitySpark.flatMap(_.split(","))

  def getFive(): Seq[String] = {
    println("TestEntity.getFive")
    TestEntityData.take(5)
  }
}

SparkMain.scala:


package models

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object SparkMain {
  val driverPort = 8080
  val driverHost = "localhost"
  val conf = new SparkConf(false) // skip loading external settings
    .setMaster("local[4]")        // run locally with enough threads
    .setAppName("firstSparkApp")
    .set("spark.logConf", "true")
    .set("spark.driver.port", s"$driverPort")
    .set("spark.driver.host", s"$driverHost")
    .set("spark.akka.logLifecycleEvents", "true")
  val sc = new SparkContext(conf)
}

and the controller code, which uses the Spark objects:


def test = Action { implicit req =>
  val chk = TestEntity.getFive
  Ok("it works")
}

..at runtime I have these errors:


[info] o.a.s.SparkContext - Spark configuration:
spark.akka.logLifecycleEvents=true
spark.app.name=firstSparkApp
spark.driver.host=localhost
spark.driver.port=8080
spark.logConf=true
spark.master=local[4]
[warn] o.a.s.u.Utils - Your hostname, uisprk resolves to a loopback address: 127.0.1.1; using 10.0.2.15 instead (on interface eth0)
[warn] o.a.s.u.Utils - Set SPARK_LOCAL_IP if you need to bind to another address
[info] o.a.s.SecurityManager - Changing view acls to: t
[info] o.a.s.SecurityManager - Changing modify acls to: t
[info] o.a.s.SecurityManager - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(t); users with modify permissions: Set(t)
[error] application -

! @6l039e8d5 - Internal server error, for (GET) [/ui] ->

play.api.Application$$anon$1: Execution exception[[RuntimeException: java.lang.NoSuchMethodError: akka.util.Helpers$.ConfigOps(Lcom/typesafe/config/Config;)Lcom/typesafe/config/Config;]]
        at play.api.Application$class.handleError(Application.scala:293) ~[play_2.10-2.2.3.jar:2.2.3]
        at play.api.DefaultApplication.handleError(Application.scala:399) [play_2.10-2.2.3.jar:2.2.3]
        at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$13$$anonfun$apply$1.applyOrElse(PlayDefaultUpstreamHandler.scala:166) [play_2.10-2.2.3.jar:2.2.3]
        at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$13$$anonfun$apply$1.applyOrElse(PlayDefaultUpstreamHandler.scala:163) [play_2.10-2.2.3.jar:2.2.3]
        at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33) [scala-library-2.10.4.jar:na]
        at scala.util.Failure$$anonfun$recover$1.apply(Try.scala:185) [scala-library-2.10.4.jar:na]
Caused by: java.lang.RuntimeException: java.lang.NoSuchMethodError: akka.util.Helpers$.ConfigOps(Lcom/typesafe/config/Config;)Lcom/typesafe/config/Config;
        at play.api.mvc.ActionBuilder$$anon$1.apply(Action.scala:314) ~[play_2.10-2.2.3.jar:2.2.3]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4$$anonfun$apply$5.apply(Action.scala:109) ~[play_2.10-2.2.3.jar:2.2.3]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4$$anonfun$apply$5.apply(Action.scala:109) ~[play_2.10-2.2.3.jar:2.2.3]
        at play.utils.Threads$.withContextClassLoader(Threads.scala:18) ~[play_2.10-2.2.3.jar:2.2.3]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4.apply(Action.scala:108) ~[play_2.10-2.2.3.jar:2.2.3]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4.apply(Action.scala:107) ~[play_2.10-2.2.3.jar:2.2.3]
Caused by: java.lang.NoSuchMethodError: akka.util.Helpers$.ConfigOps(Lcom/typesafe/config/Config;)Lcom/typesafe/config/Config;
        at akka.remote.RemoteSettings.<init>(RemoteSettings.scala:48) ~[akka-remote_2.10-2.3.4-spark.jar:na]
        at akka.remote.RemoteActorRefProvider.<init>(RemoteActorRefProvider.scala:114) ~[akka-remote_2.10-2.3.4-spark.jar:na]
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.7.0_72]
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) ~[na:1.7.0_72]
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.7.0_72]
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526) ~[na:1.7.0_72]

How should I tie in the library: through the dependency or through the assembled jar? Any advice, please.


2 Answers

#1



A NoSuchMethodError exception is 100% due to a mismatch between the jar versions at compile time and at runtime.


Check the versions of the jars. I also have some questions about the architecture of your app.

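As a quick check, you can print which jar the class from your stack trace was actually loaded from at runtime. A minimal sketch (the class name comes from your error; the rest is standard Java reflection):

// Sketch: show which jar akka.util.Helpers was loaded from at runtime.
// If it points at an Akka 2.2.x jar while akka-remote_2.10-2.3.4-spark
// expects Akka 2.3.x, the compile-time/runtime mismatch is confirmed.
val location = Class.forName("akka.util.Helpers")
  .getProtectionDomain.getCodeSource.getLocation
println(location)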

Instead of calling Spark code from the Play framework, you can also call spark-submit from a shell script, which looks like a better fit in your case. You can even do it from your Play application, as sketched below; there is no need to include the jar on the Play app's classpath.

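As a rough illustration of that approach, here is a minimal sketch that shells out to spark-submit with scala.sys.process; the spark-submit path, class name, and jar name are hypothetical placeholders for your own setup:

import scala.sys.process._

// Sketch: run a Spark job as an external process instead of embedding
// Spark in the Play JVM. All paths and names here are hypothetical.
def submitJob(): Int =
  Seq(
    "/opt/spark/bin/spark-submit",   // path to your Spark installation
    "--master", "local[4]",
    "--class", "jobs.TestEntityJob", // the job's main class
    "/opt/jobs/test-entity-job.jar"  // the assembled job jar
  ).!                                // blocks and returns the exit code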

#2



The problem with the configuration is the Akka dependency of Apache Spark and Play Framework: they both depend on Akka and, as you've seen, the different and incompatible versions have to be resolved at build time; sbt's evicted command shows you the conflicts.


You may also want to use the update command; the reports it writes under target/resolution-cache/reports are quite useful.

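If the reports confirm the clash, one thing to try is forcing a single Akka version from the build. The following is only a sketch for a Play 2.2 Build.scala under sbt 0.13, and the version number is an assumption: pick the one the report shows akka-remote was compiled against. Note that Play 2.2.x was built against Akka 2.2.x, so forcing Akka 2.3.x may simply move the breakage into Play; also, your stack trace shows akka-remote_2.10-2.3.4-spark.jar, a Spark-specific fork of Akka that may live under a different organization and thus escape eviction, in which case isolating Spark from the Play classpath (answer #1) is the cleaner fix.

// Sketch for Build.scala (sbt 0.13): force one Akka version on the classpath.
// "2.3.4" is an assumption -- use the version your dependency report shows.
val main = play.Project(appName, appVersion, appDependencies).settings(
  dependencyOverrides += "com.typesafe.akka" %% "akka-actor" % "2.3.4"
)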
