Teradata export from Hive to Teradata: class not found

Date: 2022-01-13 02:01:58

I am trying to export from a Hive table to Teradata using the TDCH connector, and I am getting the error below:

15/05/07 08:01:03 INFO tool.ConnectorExportTool: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/metastore/api/MetaException
            at java.lang.Class.forName0(Native Method)
            at java.lang.Class.forName(Class.java:190)
            at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:81)
            at com.teradata.connector.common.tool.ConnectorExportTool.run(ConnectorExportTool.java:61)
            at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
            at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
            at com.teradata.hadoop.tool.TeradataExportTool.main(TeradataExportTool.java:24)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:606)
            at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
            at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
    Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.metastore.api.MetaException
            at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
            at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
            at java.security.AccessController.doPrivileged(Native Method)
            at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
            ... 13 more

I understand from the error that the hive-metastore jar is missing, but it is already there in the hive/lib folder:

 hive-metastore.jar -> hive-metastore-0.9.0.jar

It is already present in the path /usr/hdp/2.2.4.2-2/hive/lib.

3 Answers

#1


You have to define the environment variables LIB_JARS and HADOOP_CLASSPATH. You can then pass the LIB_JARS value to the job with the -libjars parameter; a minimal invocation is sketched after the jar list below.

You can find an example under https://developer.teradata.com/sites/all/files/Teradata%20Connector%20for%20Hadoop%20Tutorial%20v1%200%20final.pdf.

According to the README, the following jars are necessary:

    Hive Job(version 0.11.0 as example):
         a) hive-metastore-0.11.0.jar
         b) hive-exec-0.11.0.jar
         c) hive-cli-0.11.0.jar
         d) libthrift-0.9.0.jar
         e) libfb303-0.9.0.jar
         f) jdo2-api-2.3-ec.jar
         g) slf4j-api-1.6.1.jar
         h) datanucleus-core-3.0.9.jar
         i) datanucleus-rdbms-3.0.8.jar
         j) commons-dbcp-1.4.jar
         k) commons-pool-1.5.4.jar
         l) antlr-runtime-3.4.jar
         m) datanucleus-api-jdo-3.0.7.jar

    HCatalog Job:
         a) above Hive required jar files
         b) hcatalog-core-0.11.0.jar

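A minimal sketch of how the two variables might be wired together before launching the export, assuming the HDP 2.2 layout from the question. The jar versions, the connector jar path, and the remaining TDCH options are placeholders to adapt to your cluster; only the first few jars from the list above are shown.

    # Sketch only: jar names/versions and the connector jar path are assumptions,
    # adjust them to whatever is actually installed on your cluster.
    export HIVE_HOME=/usr/hdp/2.2.4.2-2/hive
    export LIB_JARS=$HIVE_HOME/lib/hive-metastore.jar,$HIVE_HOME/lib/hive-exec.jar,$HIVE_HOME/lib/hive-cli.jar,$HIVE_HOME/lib/libthrift-0.9.0.jar,$HIVE_HOME/lib/libfb303-0.9.0.jar

    # -libjars expects a comma-separated list, HADOOP_CLASSPATH a colon-separated one
    export HADOOP_CLASSPATH=$(echo $LIB_JARS | tr ',' ':')

    hadoop jar <path-to>/teradata-connector-<version>.jar \
        com.teradata.hadoop.tool.TeradataExportTool \
        -libjars $LIB_JARS \
        <your existing TDCH export options>
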
#2


The problem is that the org/apache/hadoop/hive/metastore/api/MetaException class is not available to the Java runtime. Make sure the jar containing it is on the runtime classpath.

A few things to try: pass the classpath explicitly to the JVM via -cp, or put the jar in the launch directory to make sure it is present; a quick check is sketched below.

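One quick way to confirm whether the class is actually reachable, using the path from the question (adjust for your install):

    # Sketch: confirm the jar really contains the missing class ...
    jar tf /usr/hdp/2.2.4.2-2/hive/lib/hive-metastore.jar | grep MetaException

    # ... and put it on the classpath the tool is launched with
    export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/hdp/2.2.4.2-2/hive/lib/hive-metastore.jar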

#3


Sometimes you may have all the jar files in your "lib" folder, but Oozie may not be reading them simply because you have a typo in your job.properties or coordinator.properties. Double-check your properties files.

The "job.properties" file should read like the following -

oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/apps/myapp/workflow/

The "coordinator.properties" file should read like the following -

oozie.use.system.libpath=true
oozie.coord.application.path=${nameNode}/apps/myapp/workflow/

and your "lib" folder with jars should be at -

/apps/myapp/workflow/lib

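As a sanity check, you can list the workflow directory on HDFS and confirm the jars really sit under lib; the paths below simply mirror the example above:

    # Sketch: ${nameNode} resolves to whatever your properties files define
    hdfs dfs -ls /apps/myapp/workflow/
    hdfs dfs -ls /apps/myapp/workflow/lib/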