Friday, January 1, 2021

Connecting to Hive from Spark/Scala

I have installed Hadoop 3.3.0 and Hive 3.1.2 on Ubuntu under WSL (Windows Subsystem for Linux).
All the Hadoop, YARN, and HiveServer2 daemons are running in Ubuntu WSL.

On my Windows host OS, I open the Scala IDE. From Spark/Scala, I would like to connect to the Hive tables that live in Ubuntu WSL.

On Windows, I have nothing related to Hadoop/Hive installed; everything is available only in Ubuntu WSL.

Can someone please explain how to do this from the Scala IDE?

Code I use:

 val spark = SparkSession
   .builder
   .master("local[*]")
   .appName("My APP")
   .config("spark.sql.uris", "thrift://localhost:9083")
   .enableHiveSupport()
   .getOrCreate

 spark.sql("show tables").show()
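Side note: the metastore address is normally supplied through hive.metastore.uris rather than spark.sql.uris. Below is a minimal sketch of that setup, assuming the metastore thrift service running in WSL is reachable from Windows on localhost:9083 (WSL forwards localhost ports):

 import org.apache.spark.sql.SparkSession

 // Sketch only: hive.metastore.uris is the key read for a remote Hive metastore;
 // thrift://localhost:9083 is an assumption about the WSL metastore port.
 val spark = SparkSession
   .builder()
   .master("local[*]")
   .appName("My APP")
   .config("hive.metastore.uris", "thrift://localhost:9083")
   .enableHiveSupport()
   .getOrCreate()

 spark.sql("show tables").show()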

Error I get:

 Exception in thread "main" java.lang.IllegalArgumentException: Unable to instantiate SparkSession with Hive support because Hive classes are not found.
     at org.apache.spark.sql.SparkSession$Builder.enableHiveSupport
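Going by the message, enableHiveSupport() throws this when the spark-hive module is missing from the application classpath. A hedged build.sbt sketch is below; the version numbers are placeholders and should match the Spark version actually on the classpath:

 // build.sbt sketch: spark-hive provides the Hive classes that enableHiveSupport() looks for.
 // "3.0.1" is a placeholder version; keep it identical to the spark-sql/spark-core version in use.
 libraryDependencies ++= Seq(
   "org.apache.spark" %% "spark-sql"  % "3.0.1",
   "org.apache.spark" %% "spark-hive" % "3.0.1"
 )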

Thanks!

https://stackoverflow.com/questions/65531972/connecting-hive-from-spark-scala January 02, 2021 at 01:26AM
