Hadoop 2.4: java.lang.NoClassDefFoundError: org/apache/hcatalog/mapreduce/InputJobInfo

Date: 2022-12-16 15:36:30

I have upgraded to a recent Hadoop from Hortonworks:

Hadoop 2.4.0.2.1.2.1-471
Subversion git@github.com:hortonworks/hadoop.git -r 9e5db004df1a751e93aa89b42956c5325f3a4482
Compiled by jenkins on 2014-05-27T18:57Z
Compiled with protoc 2.5.0
From source with checksum 9e788148daa5dd7934eb468e57e037b5
This command was run using /usr/lib/hadoop/hadoop-common-2.4.0.2.1.2.1-471.jar

Before upgrading, I wrote a Java MapReduce program that uses Hive tables for both input and output. It worked in the previous version of Hadoop, although I got deprecation warnings at compile time for this code:

    Job job = new Job(conf, "Foo");
    HCatInputFormat.setInput(job,InputJobInfo.create(dbName, inputTableName, null));

Now, after updating my dependencies to the new jars in Hadoop 2.4.0.2.1.2.1-471 and running the same code, I get the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hcatalog/mapreduce/InputJobInfo
    at com.bigdata.hadoop.Foo.run(Foo.java:240)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at com.bigdata.hadoop.Foo.main(Foo.java:272)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.lang.ClassNotFoundException: org.apache.hcatalog.mapreduce.InputJobInfo
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 9 more

To run my code I use the following settings:

export LIBJARS=/usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core.jar,/usr/lib/hive/lib/hive-exec.jar,/usr/lib/hive/lib/hive-metastore.jar,/usr/lib/hive/lib/libfb303-0.9.0.jar,/usr/lib/hive/lib/jdo-api-3.0.1.jar,/usr/lib/hive/lib/antlr-runtime-3.4.jar,/usr/lib/hive/lib/datanucleus-api-jdo-3.2.6.jar,/usr/lib/hive/lib/datanucleus-core-3.2.10.jar

出口LIBJARS = / usr / lib / hive-hcatalog /分享/ hcatalog / hive-hcatalog-core.jar,/ usr / lib /蜂巢/ lib / hive-exec.jar,/ usr / lib /蜂巢/ lib / hive-metastore.jar,/ usr / lib /蜂巢/ lib / libfb303-0.9.0.jar,/ usr / lib /蜂巢/ lib / jdo-api-3.0.1.jar,/ usr / lib /蜂巢/ lib / antlr-runtime-3.4.jar,/ usr / lib /蜂巢/ lib / datanucleus-api-jdo-3.2.6.jar,/ usr / lib /蜂巢/ lib / datanucleus-core-3.2.10.jar

export HADOOP_CLASSPATH=/usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core.jar,/usr/lib/hive/lib/hive-exec.jar,/usr/lib/hive/lib/hive-metastore.jar,/usr/lib/hive/lib/libfb303-0.9.0.jar,/usr/lib/hive/lib/jdo-api-3.0.1.jar,/usr/lib/hive/lib/antlr-runtime-3.4.jar,/usr/lib/hive/lib/datanucleus-api-jdo-3.2.6.jar,/usr/lib/hive/lib/datanucleus-core-3.2.10.jar
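One thing worth checking in these settings: `-libjars` takes a comma-separated list, but `HADOOP_CLASSPATH` is an ordinary Java classpath, so its entries should be separated by colons, not commas. A sketch of keeping a single jar list and deriving both from it (only the first two jar paths from the question are shown, to keep the example short):

```shell
# -libjars wants commas; HADOOP_CLASSPATH wants colons. Maintain one
# list and derive the other instead of keeping two copies by hand.
LIBJARS=/usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core.jar,/usr/lib/hive/lib/hive-exec.jar
export LIBJARS
export HADOOP_CLASSPATH=${LIBJARS//,/:}   # bash substitution: commas -> colons
echo "$HADOOP_CLASSPATH"
```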

Any ideas why I get java.lang.NoClassDefFoundError: org/apache/hcatalog/mapreduce/InputJobInfo?

2 Answers

#1



I think you should add the following dependency in pom.xml.

<dependency>
     <groupId>org.apache.hcatalog</groupId>
     <artifactId>hcatalog-core</artifactId>
     <version>0.11.0</version>
</dependency>
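One caveat with the coordinates above: starting with Hive 0.12, HCatalog was folded into the Hive project, and its classes moved from the org.apache.hcatalog packages to org.apache.hive.hcatalog. If the cluster ships the newer Hive jars (HDP 2.1 bundles Hive 0.13), the dependency would instead look something like the following, together with updating the imports in the code; the version shown is an assumption matching the question's HDP release:

```xml
<dependency>
     <groupId>org.apache.hive.hcatalog</groupId>
     <artifactId>hive-hcatalog-core</artifactId>
     <version>0.13.0</version>
</dependency>
```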

#2



I faced exactly the same issue. In your case you will need to add the following jars to your classpath:

1. jdo2-api-2.3-eb.jar
2. libthrift-0.9.0.jar
3. datanucleus-rdbms-3.2.6.jar
4. hive-ant-0.13.0.jar
