Hive on Spark error

chnhbhndchngn 2022-02-13 08:18:11


FAILED: SemanticException Failed to get a spark session: org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create Spark client for Spark session e7509679-36bf-4bf9-973f-a091ddf0cb79


Because this problem is intermittent, the exception was not thrown at first, so it took a whole night of debugging. It finally turned out that once the queue's resources reach 100% utilization, if no running task releases resources in the short term for the SparkSession being created, the job fails and throws this exception.
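One way to confirm this situation is to check the queue's utilization at the moment the error occurs. Below is a minimal sketch using the YARN command line; the queue name default is an assumption, not something stated in the original post.

# Show the queue's configured and current capacity (assumed queue name "default").
# A current capacity at or near 100% means a new Spark application master
# has no room to start, which matches the failure described above.
yarn queue -status default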

Solution: increase the client connection timeout.

Modify Hive's configuration file hive-site.xml:

<!-- Hive-to-Spark connection timeout -->
<property>
  <name>hive.spark.client.connect.timeout</name>
  <value>100000ms</value>
</property>
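Note that hive.spark.client.connect.timeout (set above) and hive.spark.client.server.connect.timeout (set on the command line below) are two different, related timeouts for the Hive-to-Spark client handshake; the original post mixes the two, so if raising one does not help, it may be worth raising both.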

Or set the parameter from the command line (effective only for the current session):

set hive.spark.client.server.connect.timeout=100000; 
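For completeness, here is a minimal sketch of applying the session-level setting through Beeline and re-running the failing query; the JDBC URL and the table name example_table are placeholders, not taken from the original post.

# Connect to HiveServer2, raise the timeout for this session only,
# then re-run the query that previously failed with the Spark client error.
beeline -u jdbc:hive2://localhost:10000 \
  -e "set hive.spark.client.server.connect.timeout=100000;" \
  -e "select count(*) from example_table;"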
