hive - The type org.apache.spark.sql.SchemaRDD cannot be resolved
I am getting an error when I try the code below in Eclipse.
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.hive.HiveContext;

SparkConf sparkConf = new SparkConf().setAppName("simple hive app").setMaster("local");
JavaSparkContext javaSparkCtx = new JavaSparkContext(sparkConf);
HiveContext hiveContext = new HiveContext(javaSparkCtx.sc());
// hiveContext.sql("show tables").collect();
I am using the dependencies below.
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.4.0</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.10</artifactId>
  <version>1.4.0</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-hive_2.10</artifactId>
  <version>1.2.1</version>
</dependency>
It seems the SchemaRDD class was removed in version 1.3. I am not sure what the problem in my code is. Can anyone help with this?
The problem is the old version of spark-hive (1.2.1), which still requires SchemaRDD. Bump it to 1.4.0 so it matches your spark-core and spark-sql versions.
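For example, the spark-hive dependency from the question, updated to match the other two artifacts (a minimal sketch; the rest of the POM stays unchanged):

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-hive_2.10</artifactId>
  <version>1.4.0</version>
</dependency>

With all three artifacts on 1.4.0, the original code should compile. Note that since Spark 1.3, sql(...) returns a DataFrame, which replaced SchemaRDD:

import org.apache.spark.sql.DataFrame;

// Assumes all Spark artifacts are at 1.4.0, where DataFrame has replaced SchemaRDD
DataFrame tables = hiveContext.sql("show tables");
tables.show();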