Friday, 15 March 2013

Hadoop 2.4: java.lang.NoClassDefFoundError: org/apache/hcatalog/mapreduce/InputJobInfo

I have recently upgraded to the Hortonworks Hadoop distribution:

Hadoop 2.4.0.2.1.2.1-471
Subversion git@github.com:hortonworks/hadoop.git -r 9e5db004df1a751e93aa89b42956c5325f3a4482
Compiled by jenkins on 2014-05-27T18:57Z
Compiled with protoc 2.5.0
From source with checksum 9e788148daa5dd7934eb468e57e037b5
This command was run using /usr/lib/hadoop/hadoop-common-2.4.0.2.1.2.1-471.jar

Before upgrading, I wrote a Java MapReduce program that uses Hive tables for both input and output. It worked on the previous version of Hadoop, although I got deprecation warnings at compile time for this code:

Job job = new Job(conf, "foo");
HCatInputFormat.setInput(job, InputJobInfo.create(dbName, inputTableName, null));

Now, after updating the dependencies to the new JARs in Hadoop 2.4.0.2.1.2.1-471 and running the same code, I get the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hcatalog/mapreduce/InputJobInfo
    at com.bigdata.hadoop.Foo.run(Foo.java:240)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at com.bigdata.hadoop.Foo.main(Foo.java:272)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.lang.ClassNotFoundException: org.apache.hcatalog.mapreduce.InputJobInfo
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 9 more
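One way to narrow an error like this down is to check which package path the jar on the cluster actually ships InputJobInfo under, since the class may have been moved between releases. A small sketch of that check (the find_class helper and the canned listing in the demo are made up for illustration):

```shell
# Hypothetical helper: given a class name, filter a `jar tf` listing
# (one entry per line) down to the matching .class entry.
find_class() {
  grep "/$1\.class$"
}

# Against the real jar from the question, the check would be:
#   jar tf /usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core.jar | find_class InputJobInfo
# Demo with a canned listing standing in for the jar contents:
printf 'org/apache/hive/hcatalog/mapreduce/InputJobInfo.class\n' | find_class InputJobInfo
# prints org/apache/hive/hcatalog/mapreduce/InputJobInfo.class
```

If the jar only lists the class under org/apache/hive/hcatalog/... while the compiled code imports org.apache.hcatalog.mapreduce.InputJobInfo, that mismatch alone would produce exactly this NoClassDefFoundError.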

To run the code I use the following settings:

export LIBJARS=/usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core.jar,/usr/lib/hive/lib/hive-exec.jar,/usr/lib/hive/lib/hive-metastore.jar,/usr/lib/hive/lib/libfb303-0.9.0.jar,/usr/lib/hive/lib/jdo-api-3.0.1.jar,/usr/lib/hive/lib/antlr-runtime-3.4.jar,/usr/lib/hive/lib/datanucleus-api-jdo-3.2.6.jar,/usr/lib/hive/lib/datanucleus-core-3.2.10.jar

export HADOOP_CLASSPATH=/usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core.jar,/usr/lib/hive/lib/hive-exec.jar,/usr/lib/hive/lib/hive-metastore.jar,/usr/lib/hive/lib/libfb303-0.9.0.jar,/usr/lib/hive/lib/jdo-api-3.0.1.jar,/usr/lib/hive/lib/antlr-runtime-3.4.jar,/usr/lib/hive/lib/datanucleus-api-jdo-3.2.6.jar,/usr/lib/hive/lib/datanucleus-core-3.2.10.jar
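One detail worth double-checking in those settings: `-libjars` takes a comma-separated list, but HADOOP_CLASSPATH is a plain Java classpath, so on Linux its entries must be separated by `:` rather than `,`. A minimal sketch of reusing the same jar list for both (shortened to two jars for illustration):

```shell
# -libjars wants commas; HADOOP_CLASSPATH wants colons on Linux.
export LIBJARS=/usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core.jar,/usr/lib/hive/lib/hive-metastore.jar

# Derive the colon-separated form from the same list instead of
# maintaining two copies by hand.
export HADOOP_CLASSPATH=$(echo "$LIBJARS" | tr ',' ':')

echo "$HADOOP_CLASSPATH"
# prints /usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core.jar:/usr/lib/hive/lib/hive-metastore.jar
```

With a comma-separated HADOOP_CLASSPATH, the JVM treats the whole string as a single (nonexistent) path entry, so none of the listed jars are actually on the driver's classpath.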

Any ideas why I get java.lang.NoClassDefFoundError: org/apache/hcatalog/mapreduce/InputJobInfo?

I think I should add the following dependency to my pom.xml:

<dependency>
  <groupId>org.apache.hcatalog</groupId>
  <artifactId>hcatalog-core</artifactId>
  <version>0.11.0</version>
</dependency>
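For what it's worth, starting with Hive 0.12 the HCatalog project was merged into Hive and its classes were repackaged from org.apache.hcatalog to org.apache.hive.hcatalog, with the Maven artifact renamed accordingly. If the cluster's hive-hcatalog-core.jar is from that line (as the /usr/lib/hive-hcatalog path suggests), the old hcatalog-core 0.11.0 artifact would no longer match what runs on the cluster. A sketch of the renamed coordinates, with the version as an assumption to be matched against the Hive version the cluster actually ships:

```xml
<dependency>
  <groupId>org.apache.hive.hcatalog</groupId>
  <artifactId>hive-hcatalog-core</artifactId>
  <!-- example version; match the Hive release bundled with the cluster -->
  <version>0.13.0</version>
</dependency>
```

Switching to this artifact also means updating the imports in the code from org.apache.hcatalog.mapreduce.* to org.apache.hive.hcatalog.mapreduce.*.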

Labels: java, hadoop, hive, hcatalog
