jdbc - Method not supported in Spark


First, I start the Thrift server in Spark with ./sbin/start-thriftserver.sh; the daemon starts:

hadoop   13015     1 99 13:52 pts/1    00:00:09 /usr/lib/jvm/jre-1.7.0-openjdk.x86_64/bin/java -cp /home/hadoop/spark/lib/hive-jdbc-0.13.0.jar:/home/hadoop/spark-1.4.1-bin-hadoop2.6/sbin/../conf/:/home/hadoop/spark-1.4.1-bin-hadoop2.6/lib/spark-assembly-1.4.1-hadoop2.6.0.jar:/home/hadoop/spark-1.4.1-bin-hadoop2.6/lib/datanucleus-core-3.2.10.jar:/home/hadoop/spark-1.4.1-bin-hadoop2.6/lib/datanucleus-api-jdo-3.2.6.jar:/home/hadoop/spark-1.4.1-bin-hadoop2.6/lib/datanucleus-rdbms-3.2.9.jar -Xms512m -Xmx512m -XX:MaxPermSize=256m org.apache.spark.deploy.SparkSubmit --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 spark-internal

After that, I start ./bin/pyspark.

My Hive version is 0.13.1,

my Spark version is 1.4.1,

my Hadoop version is 2.7.

My Spark classpath is below.

SPARK_CLASSPATH=/home/account/spark/lib/hive-jdbc-0.13.0.jar:/home/account/spark/lib/hive-exec-0.13.0.jar:/home/account/spark/lib/hive-metastore-0.13.0.jar:/home/account/spark/lib/hive-service-0.13.0.jar:/home/account/spark/lib/libfb303-0.9.0.jar:/home/account/spark/lib/log4j-1.2.16.jar

In pyspark (the Python shell), I wrote this code:

>>> df = sqlContext.load(source="jdbc", driver="org.apache.hive.jdbc.HiveDriver", url="jdbc:hive2://ip:10000/default", dbtable="default.test")

But it's not working; I get an error. How can I resolve it?

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/dev/user/ja/spark/python/pyspark/sql/context.py", line 458, in load
    return self.read.load(path, source, schema, **options)
  File "/home/dev/user/ja/spark/python/pyspark/sql/readwriter.py", line 112, in load
    return self._df(self._jreader.load())
  File "/home/dev/user/ja/spark/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 538, in __call__
  File "/home/dev/user/ja/spark/python/lib/py4j-0.8.2.1-src.zip/py4j/protocol.py", line 300, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o29.load.
: java.sql.SQLException: Method not supported
        at org.apache.hive.jdbc.HiveResultSetMetaData.isSigned(HiveResultSetMetaData.java:141)
        at org.apache.spark.sql.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:132)
        at org.apache.spark.sql.jdbc.JDBCRelation.<init>(JDBCRelation.scala:128)
        at org.apache.spark.sql.jdbc.DefaultSource.createRelation(JDBCRelation.scala:113)
        at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:269)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
        at py4j.Gateway.invoke(Gateway.java:259)
        at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
        at py4j.commands.CallCommand.execute(CallCommand.java:79)
        at py4j.GatewayConnection.run(GatewayConnection.java:207)
        at java.lang.Thread.run(Thread.java:722)

I think the HiveResultSetMetaData.isSigned method is not supported by Hive's JDBC driver. I don't know how to resolve this error. Please help.
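The stack trace points at the mechanism: Spark's JDBCRDD.resolveTable inspects every column's JDBC ResultSetMetaData before reading any rows, and the Hive 0.13 driver throws from isSigned. A toy Python sketch of that interaction (hypothetical names, not the real driver or Spark code):

```python
class ToyHiveResultSetMetaData:
    """Toy stand-in for org.apache.hive.jdbc.HiveResultSetMetaData (0.13),
    which leaves isSigned unimplemented."""

    def is_signed(self, column):
        # The real Java method throws java.sql.SQLException("Method not supported").
        raise RuntimeError("Method not supported")


def toy_resolve_table(metadata, num_columns):
    """Toy stand-in for Spark's JDBCRDD.resolveTable: it queries metadata
    (including signedness) for every column to build the table schema."""
    return [metadata.is_signed(i) for i in range(1, num_columns + 1)]


try:
    toy_resolve_table(ToyHiveResultSetMetaData(), num_columns=3)
except RuntimeError as e:
    print("resolveTable failed:", e)
```

So the failure happens during schema resolution, before the query itself ever runs, which is why no amount of changing the SQL helps.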

Thank you.

I'm not certain about this, but I'll answer my own question.

I think it is caused by the version. When I execute the command below, I get the "Method not supported" error.

But when I run the same command on Spark 1.3.1, it works:

>>> df = sqlContext.load(source="jdbc", driver="org.apache.hive.jdbc.HiveDriver", url="jdbc:hive2://ip:10000/default", dbtable="default.test")

So I think the problem is the Spark version, but that's just a guess.

This page may help you: http://docs.hortonworks.com/hdpdocuments/hdp1/hdp-1.2.4/ds_hive/jdbc-hs2.html
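As a side note, in Spark 1.4 sqlContext.load(...) was deprecated in favor of the DataFrameReader API. A hedged sketch of the equivalent call ("ip", port 10000, and "default.test" are the placeholders from the question; against the Hive 0.13 driver this still hits the same isSigned error, so it is a syntax note, not a fix):

```python
# DataFrameReader form of the deprecated sqlContext.load call (Spark 1.4).
# Connection values below are placeholders copied from the question.
options = {
    "driver": "org.apache.hive.jdbc.HiveDriver",
    "url": "jdbc:hive2://ip:10000/default",
    "dbtable": "default.test",
}

# Requires a live SparkContext and a running Thrift server, so commented out:
# df = sqlContext.read.format("jdbc").options(**options).load()
```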

