Why does building the Spark source code give "object sbt is not a member of package com.typesafe"?

Date: 2022-08-29 22:18:52

I tried to compile the https://github.com/apache/spark project using IntelliJ IDEA with the sbt plugin on Windows.

I'm facing an error about sbt. Since I'm not familiar with sbt, I don't know how to fix it.

The error messages are as follows:

[info] Loading project definition from F:\codeReading\sbtt\spark-master\project
[info] Compiling 3 Scala sources to F:\codeReading\sbtt\spark-master\project\target\scala-2.10\sbt-0.13\classes...
[error] F:\codeReading\sbtt\spark-master\project\SparkBuild.scala:26: object sbt is not a member of package com.typesafe
[error] import com.typesafe.sbt.pom.{PomBuild, SbtPomKeys}
[error]                     ^
[error] F:\codeReading\sbtt\spark-master\project\SparkBuild.scala:51: not found: type PomBuild
[error] object SparkBuild extends PomBuild {
[error]                           ^
[error] F:\codeReading\sbtt\spark-master\project\SparkBuild.scala:118: not found: value SbtPomKeys
[error]     otherResolvers <<= SbtPomKeys.mvnLocalRepository(dotM2 => Seq(Resolver.file("dotM2", dotM2))),
[error]                        ^
[error] F:\codeReading\sbtt\spark-master\project\SparkBuild.scala:178: value projectDefinitions is not a member of AnyRef
[error]     super.projectDefinitions(baseDirectory).map { x =>
[error]           ^
[error] four errors found
[error] (plugins/compile:compile) Compilation failed

2 Answers

#1

Spark is built with Maven. The SBT build is only a convenience. You will have far better results importing it as a Maven project.
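
For reference, a typical Maven build of Spark from the repository root looks like the sketch below. build/mvn is the Maven wrapper script shipped with the Spark sources (on Windows you would invoke your own mvn installation instead), and -DskipTests keeps the first build fast:

# from the Spark source root
./build/mvn -DskipTests clean package

In IntelliJ IDEA, opening the top-level pom.xml (File > Open) imports the Maven module structure directly and sidesteps the sbt plugin entirely.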

#2

It looks like IDEA doesn't handle the project reference to a git repository that the Spark build definition uses for sbt-pom-reader.

It showed up when I ran sbt within the cloned project:

➜  spark git:(master) ✗ xsbt
[info] Loading global plugins from /Users/jacek/.sbt/0.13/plugins
[info] Loading project definition from /Users/jacek/oss/spark/project/project
[info] Loading project definition from /Users/jacek/.sbt/0.13/staging/ec3aa8f39111944cc5f2/sbt-pom-reader/project
[warn] Multiple resolvers having different access mechanism configured with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
[info] Loading project definition from /Users/jacek/oss/spark/project
[info] Set current project to spark-parent (in build file:/Users/jacek/oss/spark/)
>

You can see the reference to the sbt-pom-reader git project when you access the plugins project where the reference is defined:

> reload plugins
[info] Loading global plugins from /Users/jacek/.sbt/0.13/plugins
[info] Loading project definition from /Users/jacek/oss/spark/project/project
[info] Updating {file:/Users/jacek/oss/spark/project/project/}project-build...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Loading project definition from /Users/jacek/.sbt/0.13/staging/ec3aa8f39111944cc5f2/sbt-pom-reader/project
[warn] Multiple resolvers having different access mechanism configured with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
[info] Updating {file:/Users/jacek/.sbt/0.13/staging/ec3aa8f39111944cc5f2/sbt-pom-reader/project/}sbt-pom-reader-build...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Loading project definition from /Users/jacek/oss/spark/project
> projects
[info] In file:/Users/jacek/oss/spark/project/
[info]   * plugins
[info]     spark-style
[info] In https://github.com/ScrapCodes/sbt-pom-reader.git
[info]     sbt-pom-reader
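
For context, sbt 0.13 can pull a plugin build straight from a git URL: depending on a URI makes sbt clone the repository into ~/.sbt/0.13/staging (as in the log above) and build it from source. A minimal sketch of such a reference, illustrative rather than Spark's exact plugin definition, looks like this:

import sbt._

object PluginBuild extends Build {
  // depending on a URI makes sbt clone and build the referenced repository
  lazy val sbtPomReader = uri("https://github.com/ScrapCodes/sbt-pom-reader.git")
  lazy val plugins = Project("plugins", file(".")).dependsOn(sbtPomReader)
}

IDEA's sbt import resolves ordinary library dependencies fine; this kind of source-level git reference is what it appears to stumble over.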

A solution could be executing sbt gen-idea to generate the project files for IDEA. It's just a guess, though.
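
Note that gen-idea is not built into sbt; it comes from the sbt-idea plugin. A minimal global setup for sbt 0.13 might look like the following (the version shown is one published release of the plugin; adjust as needed):

// ~/.sbt/0.13/plugins/plugins.sbt
addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")

With the plugin installed globally, running sbt gen-idea inside the Spark checkout generates the IDEA project files, which IDEA can then open directly instead of importing the sbt build.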
