[hadoop | hive] Hive on Spark troubleshooting notes

Hive and Spark versions

apache-hive-3.1.2-bin.tar.gz
spark-3.1.2-bin-without-hadoop.tgz

Setup

The Spark-on-YARN cluster is already set up and running normally.

The Hive cluster is already set up and runs normally on the MR engine.

After switching Hive's configuration over to the Spark engine, a string of errors appeared, which look like version-compatibility problems.
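
For a quick smoke test after the switch, it helps to force an aggregation, since a bare SELECT can be answered by Hive's local fetch task without ever launching Spark. A minimal sketch (some_table is a placeholder for any existing table):

hive -e "set hive.execution.engine=spark; select count(*) from some_table;"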

Hive on Spark errors

spark java.lang.NoClassDefFoundError: org/apache/spark/SparkConf

Fix: copy the missing Spark and Scala jars from $SPARK_HOME/jars into $HIVE_HOME/lib:

cp $SPARK_HOME/jars/scala-library-2.12.10.jar $HIVE_HOME/lib
cp $SPARK_HOME/jars/spark-core_2.12-3.1.2.jar $HIVE_HOME/lib
cp $SPARK_HOME/jars/spark-network-common_2.12-3.1.2.jar $HIVE_HOME/lib
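
A quick sanity check that the jars actually landed on Hive's classpath:

ls $HIVE_HOME/lib | grep -E 'spark-|scala-library'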

2022-07-28T14:57:41,864 ERROR [44953928-d51a-4ef1-8172-26e1958d2d75 main]: spark.SparkTask (:()) - Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create Spark client for Spark session 3d5f23a6-d9f1-47c0-ada3-5fa9a82bb425)'
org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create Spark client for Spark session 3d5f23a6-d9f1-47c0-ada3-5fa9a82bb425
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.getHiveException(SparkSessionImpl.java:221)
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:92)
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115)
	at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136)
	at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:115)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703)
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157)
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218)
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239)
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402)
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821)
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/SparkConf
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.generateSparkConf(HiveSparkClientFactory.java:263)
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:98)
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:76)
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:87)
	... 24 more
Caused by: java.lang.ClassNotFoundException: org.apache.spark.SparkConf
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	... 28 more

2022-07-28T14:57:41,869 INFO  [44953928-d51a-4ef1-8172-26e1958d2d75 main]: conf.HiveConf (HiveConf.java:getLogIdVar(5040)) - Using the default value passed in for log id: 44953928-d51a-4ef1-8172-26e1958d2d75
2022-07-28T14:57:41,869 INFO  [44953928-d51a-4ef1-8172-26e1958d2d75 main]: session.SessionState (SessionState.java:resetThreadName(452)) - Resetting thread name to  main
2022-07-28T14:57:41,865 INFO  [44953928-d51a-4ef1-8172-26e1958d2d75 main]: reexec.ReOptimizePlugin (:()) - ReOptimization: retryPossible: false
2022-07-28T14:57:41,865 ERROR [44953928-d51a-4ef1-8172-26e1958d2d75 main]: ql.Driver (:()) - FAILED: Execution Error, return code 30041 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Failed to create Spark client for Spark session 3d5f23a6-d9f1-47c0-ada3-5fa9a82bb425
2022-07-28T14:57:41,865 INFO  [44953928-d51a-4ef1-8172-26e1958d2d75 main]: ql.Driver (:()) - Completed executing command(queryId=root_20220728145741_8b8d40ca-5d19-4daa-afdb-2ed63dd3e302); Time taken: 0.233 seconds
2022-07-28T14:57:41,865 INFO  [44953928-d51a-4ef1-8172-26e1958d2d75 main]: ql.Driver (:()) - Concurrency mode is disabled, not creating a lock manager
2022-07-28T15:01:52,480 INFO  [shutdown-hook-0]: session.SparkSessionManagerImpl (:()) - Closing the session manager.

spark java.lang.ClassNotFoundException: scala.Cloneable

Fix: copy scala-library-2.12.10.jar from $SPARK_HOME/jars into $HIVE_HOME/lib (scala.Cloneable lives in the Scala standard library).
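
Rather than guessing which jar ships a missing class, the Spark jars can be scanned directly; a small helper sketch, assuming unzip and grep are available:

# Print every jar under $SPARK_HOME/jars that contains the given class-file path.
find_class() {
  for f in "$SPARK_HOME"/jars/*.jar; do
    unzip -l "$f" 2>/dev/null | grep -q "$1" && echo "$f"
  done
}
find_class 'scala/Cloneable.class'   # points at scala-library-2.12.10.jar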

2022-07-30T15:45:27,052 ERROR [04093a64-9ed6-451d-8afa-fe7ae43c6136 main]: spark.SparkTask (:()) - Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create Spark client for Spark session e0a6188f-3bcd-4484-ac97-01bb6189edcd)'
org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create Spark client for Spark session e0a6188f-3bcd-4484-ac97-01bb6189edcd
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.getHiveException(SparkSessionImpl.java:221) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:92) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:115) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683) ~[hive-cli-3.1.2.jar:3.1.2]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_232]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_232]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_232]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_232]
	at org.apache.hadoop.util.RunJar.run(RunJar.java:323) ~[hadoop-common-3.2.1.jar:?]
	at org.apache.hadoop.util.RunJar.main(RunJar.java:236) ~[hadoop-common-3.2.1.jar:?]
Caused by: java.lang.NoClassDefFoundError: scala/Cloneable
	at java.lang.ClassLoader.defineClass1(Native Method) ~[?:1.8.0_232]
	at java.lang.ClassLoader.defineClass(ClassLoader.java:756) ~[?:1.8.0_232]
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142) ~[?:1.8.0_232]
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:468) ~[?:1.8.0_232]
	at java.net.URLClassLoader.access$100(URLClassLoader.java:74) ~[?:1.8.0_232]
	at java.net.URLClassLoader$1.run(URLClassLoader.java:369) ~[?:1.8.0_232]
	at java.net.URLClassLoader$1.run(URLClassLoader.java:363) ~[?:1.8.0_232]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_232]
	at java.net.URLClassLoader.findClass(URLClassLoader.java:362) ~[?:1.8.0_232]
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418) ~[?:1.8.0_232]
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352) ~[?:1.8.0_232]
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351) ~[?:1.8.0_232]
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.generateSparkConf(HiveSparkClientFactory.java:263) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:98) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:76) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:87) ~[hive-exec-3.1.2.jar:3.1.2]
	... 24 more
Caused by: java.lang.ClassNotFoundException: scala.Cloneable
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382) ~[?:1.8.0_232]
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418) ~[?:1.8.0_232]
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352) ~[?:1.8.0_232]
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351) ~[?:1.8.0_232]
	at java.lang.ClassLoader.defineClass1(Native Method) ~[?:1.8.0_232]
	at java.lang.ClassLoader.defineClass(ClassLoader.java:756) ~[?:1.8.0_232]
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142) ~[?:1.8.0_232]
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:468) ~[?:1.8.0_232]
	at java.net.URLClassLoader.access$100(URLClassLoader.java:74) ~[?:1.8.0_232]
	at java.net.URLClassLoader$1.run(URLClassLoader.java:369) ~[?:1.8.0_232]
	at java.net.URLClassLoader$1.run(URLClassLoader.java:363) ~[?:1.8.0_232]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_232]
	at java.net.URLClassLoader.findClass(URLClassLoader.java:362) ~[?:1.8.0_232]
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418) ~[?:1.8.0_232]
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352) ~[?:1.8.0_232]
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351) ~[?:1.8.0_232]
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.generateSparkConf(HiveSparkClientFactory.java:263) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:98) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:76) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:87) ~[hive-exec-3.1.2.jar:3.1.2]
	... 24 more

spark java.lang.ClassNotFoundException: org.sparkproject.guava.cache.CacheLoader

Fix: copy spark-launcher_2.12-3.1.2.jar into $HIVE_HOME/lib.
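
The same find_class helper from above can confirm which jar carries the shaded Guava classes:

find_class 'org/sparkproject/guava/cache/CacheLoader.class'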

2022-07-30T15:47:23,379 ERROR [66fff150-4600-4bab-b72b-27a909ea721d main]: spark.SparkTask (:()) - Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create Spark client for Spark session fb5c9b20-25eb-43a7-a7e4-dae14cd5735b)'
org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create Spark client for Spark session fb5c9b20-25eb-43a7-a7e4-dae14cd5735b
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.getHiveException(SparkSessionImpl.java:221)
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:92)
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115)
	at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136)
	at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:115)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703)
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157)
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218)
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239)
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402)
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821)
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
Caused by: java.lang.NoClassDefFoundError: org/sparkproject/guava/cache/CacheLoader
	at org.apache.spark.internal.config.ConfigHelpers$.stringToSeq(ConfigBuilder.scala:49)
	at org.apache.spark.internal.config.TypedConfigBuilder.$anonfun$toSequence$1(ConfigBuilder.scala:125)
	at org.apache.spark.internal.config.TypedConfigBuilder.createWithDefault(ConfigBuilder.scala:144)
	at org.apache.spark.internal.config.package$.<init>(package.scala:52)
	at org.apache.spark.internal.config.package$.<clinit>(package.scala)
	at org.apache.spark.SparkConf$.<init>(SparkConf.scala:654)
	at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
	at org.apache.spark.SparkConf.set(SparkConf.scala:94)
	at org.apache.spark.SparkConf.set(SparkConf.scala:83)
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.generateSparkConf(HiveSparkClientFactory.java:265)
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:98)
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:76)
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:87)
	... 24 more
Caused by: java.lang.ClassNotFoundException: org.sparkproject.guava.cache.CacheLoader
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	... 37 more

spark java.lang.ClassNotFoundException: org.apache.spark.unsafe.array.ByteArrayMethods

Fix: copy spark-unsafe_2.12-3.1.2.jar into $HIVE_HOME/lib.
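
Again, find_class verifies where the unsafe classes live:

find_class 'org/apache/spark/unsafe/array/ByteArrayMethods.class'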

2022-07-30T06:38:47,283 ERROR [3a260c38-1551-4866-ad28-7a05485eb466 main]: spark.SparkTask (:()) - Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create Spark client for Spark session 070c0799-0602-4b2d-b0ab-ad1ce6c85061)'
org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create Spark client for Spark session 070c0799-0602-4b2d-b0ab-ad1ce6c85061
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.getHiveException(SparkSessionImpl.java:221) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:92) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:115) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683) ~[hive-cli-3.1.2.jar:3.1.2]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_232]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_232]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_232]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_232]
	at org.apache.hadoop.util.RunJar.run(RunJar.java:323) ~[hadoop-common-3.2.1.jar:?]
	at org.apache.hadoop.util.RunJar.main(RunJar.java:236) ~[hadoop-common-3.2.1.jar:?]
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/unsafe/array/ByteArrayMethods
	at org.apache.spark.internal.config.package$.<init>(package.scala:1095) ~[spark-core_2.12-3.1.2.jar:3.1.2]
	at org.apache.spark.internal.config.package$.<clinit>(package.scala) ~[spark-core_2.12-3.1.2.jar:3.1.2]
	at org.apache.spark.SparkConf$.<init>(SparkConf.scala:654) ~[spark-core_2.12-3.1.2.jar:3.1.2]
	at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala) ~[spark-core_2.12-3.1.2.jar:3.1.2]
	at org.apache.spark.SparkConf.set(SparkConf.scala:94) ~[spark-core_2.12-3.1.2.jar:3.1.2]
	at org.apache.spark.SparkConf.set(SparkConf.scala:83) ~[spark-core_2.12-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.generateSparkConf(HiveSparkClientFactory.java:265) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:98) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:76) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:87) ~[hive-exec-3.1.2.jar:3.1.2]
	... 24 more
Caused by: java.lang.ClassNotFoundException: org.apache.spark.unsafe.array.ByteArrayMethods
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382) ~[?:1.8.0_232]
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418) ~[?:1.8.0_232]
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352) ~[?:1.8.0_232]
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351) ~[?:1.8.0_232]
	at org.apache.spark.internal.config.package$.<init>(package.scala:1095) ~[spark-core_2.12-3.1.2.jar:3.1.2]
	at org.apache.spark.internal.config.package$.<clinit>(package.scala) ~[spark-core_2.12-3.1.2.jar:3.1.2]
	at org.apache.spark.SparkConf$.<init>(SparkConf.scala:654) ~[spark-core_2.12-3.1.2.jar:3.1.2]
	at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala) ~[spark-core_2.12-3.1.2.jar:3.1.2]
	at org.apache.spark.SparkConf.set(SparkConf.scala:94) ~[spark-core_2.12-3.1.2.jar:3.1.2]
	at org.apache.spark.SparkConf.set(SparkConf.scala:83) ~[spark-core_2.12-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.generateSparkConf(HiveSparkClientFactory.java:265) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:98) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:76) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:87) ~[hive-exec-3.1.2.jar:3.1.2]
	... 24 more
2022-07-30T06:38:47,283 INFO  [3a260c38-1551-4866-ad28-7a05485eb466 main]: reexec.ReOptimizePlugin (:()) - ReOptimization: retryPossible: false
2022-07-30T06:38:47,284 ERROR [3a260c38-1551-4866-ad28-7a05485eb466 main]: ql.Driver (:()) - FAILED: Execution Error, return code 30041 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Failed to create Spark client for Spark session 070c0799-0602-4b2d-b0ab-ad1ce6c85061
2022-07-30T06:38:47,284 INFO  [3a260c38-1551-4866-ad28-7a05485eb466 main]: ql.Driver (:()) - Completed executing command(queryId=root_20220730063843_f0996fc6-8530-47dc-bc20-a3f756c2a208); Time taken: 0.663 seconds
2022-07-30T06:38:47,284 INFO  [3a260c38-1551-4866-ad28-7a05485eb466 main]: ql.Driver (:()) - Concurrency mode is disabled, not creating a lock manager
2022-07-30T06:38:47,296 INFO  [3a260c38-1551-4866-ad28-7a05485eb466 main]: conf.HiveConf (HiveConf.java:getLogIdVar(5040)) - Using the default value passed in for log id: 3a260c38-1551-4866-ad28-7a05485eb466
2022-07-30T06:38:47,296 INFO  [3a260c38-1551-4866-ad28-7a05485eb466 main]: session.SessionState (SessionState.java:resetThreadName(452)) - Resetting thread name to  main
2022-07-30T06:38:56,498 INFO  [shutdown-hook-0]: session.SparkSessionManagerImpl (:()) - Closing the session manager.

spark java.lang.NoClassDefFoundError: org/apache/spark/launcher/SparkSubmitOptionParser

Fix: copy spark-launcher_2.12-3.1.2.jar into $HIVE_HOME/lib.
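
Putting the individual fixes together, the five jars this post ended up copying can be handled in one pass (filenames match the spark-3.1.2-bin-without-hadoop distribution used here):

for jar in scala-library-2.12.10.jar \
           spark-core_2.12-3.1.2.jar \
           spark-network-common_2.12-3.1.2.jar \
           spark-launcher_2.12-3.1.2.jar \
           spark-unsafe_2.12-3.1.2.jar; do
  cp "$SPARK_HOME/jars/$jar" "$HIVE_HOME/lib/"
done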

2022-07-30T06:51:20,220 WARN  [Driver]: client.SparkClientImpl (:()) - Child process exited with code 1
2022-07-30T06:51:20,223 ERROR [01d47f89-6aff-4942-9fce-a4494ba637b8 main]: client.SparkClientImpl (:()) - Error while waiting for client to connect.
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client '45cfbecf-2f90-4195-be32-d37263af4bf1'. Error: Child process (spark-submit) exited before connecting back with error log Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/launcher/SparkSubmitOptionParser
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	at java.lang.Class.getDeclaredMethods0(Native Method)
	at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
	at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
	at java.lang.Class.getMethod0(Class.java:3018)
	at java.lang.Class.getMethod(Class.java:1784)
	at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
	at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.launcher.SparkSubmitOptionParser
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	... 31 more

	at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:41) ~[netty-all-4.1.17.Final.jar:4.1.17.Final]
	at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:106) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:88) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:105) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:101) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:76) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:87) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:115) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683) ~[hive-cli-3.1.2.jar:3.1.2]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_232]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_232]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_232]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_232]
	at org.apache.hadoop.util.RunJar.run(RunJar.java:323) ~[hadoop-common-3.2.1.jar:?]
	at org.apache.hadoop.util.RunJar.main(RunJar.java:236) ~[hadoop-common-3.2.1.jar:?]
Caused by: java.lang.RuntimeException: Cancel client '45cfbecf-2f90-4195-be32-d37263af4bf1'. Error: Child process (spark-submit) exited before connecting back with error log Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/launcher/SparkSubmitOptionParser
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	at java.lang.Class.getDeclaredMethods0(Native Method)
	at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
	at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
	at java.lang.Class.getMethod0(Class.java:3018)
	at java.lang.Class.getMethod(Class.java:1784)
	at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
	at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.launcher.SparkSubmitOptionParser
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	... 31 more

	at org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:211) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hive.spark.client.SparkClientImpl$2.run(SparkClientImpl.java:491) ~[hive-exec-3.1.2.jar:3.1.2]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_232]
2022-07-30T06:51:20,260 ERROR [01d47f89-6aff-4942-9fce-a4494ba637b8 main]: spark.SparkTask (:()) - Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create Spark client for Spark session 45cfbecf-2f90-4195-be32-d37263af4bf1)'
org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create Spark client for Spark session 45cfbecf-2f90-4195-be32-d37263af4bf1
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.getHiveException(SparkSessionImpl.java:215)
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:92)
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115)
	at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136)
	at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:115)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703)
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157)
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218)
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239)
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402)
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821)
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client '45cfbecf-2f90-4195-be32-d37263af4bf1'. Error: Child process (spark-submit) exited before connecting back with error log Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/launcher/SparkSubmitOptionParser
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	at java.lang.Class.getDeclaredMethods0(Native Method)
	at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
	at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
	at java.lang.Class.getMethod0(Class.java:3018)
	at java.lang.Class.getMethod(Class.java:1784)
	at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
	at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.launcher.SparkSubmitOptionParser
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	... 31 more

	at com.google.common.base.Throwables.propagate(Throwables.java:241)
	at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:128)
	at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:88)
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:105)
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:101)
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:76)
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:87)
	... 24 more
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client '45cfbecf-2f90-4195-be32-d37263af4bf1'. Error: Child process (spark-submit) exited before connecting back with error log Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/launcher/SparkSubmitOptionParser
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	at java.lang.Class.getDeclaredMethods0(Native Method)
	at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
	at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
	at java.lang.Class.getMethod0(Class.java:3018)
	at java.lang.Class.getMethod(Class.java:1784)
	at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
	at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.launcher.SparkSubmitOptionParser
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	... 31 more

	at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:41)
	at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:106)
	... 29 more
Caused by: java.lang.RuntimeException: Cancel client '45cfbecf-2f90-4195-be32-d37263af4bf1'. Error: Child process (spark-submit) exited before connecting back with error log Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/launcher/SparkSubmitOptionParser
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	at java.lang.Class.getDeclaredMethods0(Native Method)
	at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
	at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
	at java.lang.Class.getMethod0(Class.java:3018)
	at java.lang.Class.getMethod(Class.java:1784)
	at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
	at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.launcher.SparkSubmitOptionParser
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	... 31 more

	at org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:211)
	at org.apache.hive.spark.client.SparkClientImpl$2.run(SparkClientImpl.java:491)
	at java.lang.Thread.run(Thread.java:748)

	at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:88) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:105) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:101) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:76) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:87) ~[hive-exec-3.1.2.jar:3.1.2]
	... 24 more
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client '45cfbecf-2f90-4195-be32-d37263af4bf1'. Error: Child process (spark-submit) exited before connecting back with error log Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/launcher/SparkSubmitOptionParser
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	at java.lang.Class.getDeclaredMethods0(Native Method)
	at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
	at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
	at java.lang.Class.getMethod0(Class.java:3018)
	at java.lang.Class.getMethod(Class.java:1784)
	at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
	at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.launcher.SparkSubmitOptionParser
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	... 31 more

	at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:41) ~[netty-all-4.1.17.Final.jar:4.1.17.Final]
	at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:106) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:88) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:105) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:101) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:76) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:87) ~[hive-exec-3.1.2.jar:3.1.2]
	... 24 more
Caused by: java.lang.RuntimeException: Cancel client '45cfbecf-2f90-4195-be32-d37263af4bf1'. Error: Child process (spark-submit) exited before connecting back with error log Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/launcher/SparkSubmitOptionParser
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	at java.lang.Class.getDeclaredMethods0(Native Method)
	at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
	at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
	at java.lang.Class.getMethod0(Class.java:3018)
	at java.lang.Class.getMethod(Class.java:1784)
	at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
	at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.launcher.SparkSubmitOptionParser
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	... 31 more

	at org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:211) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hive.spark.client.SparkClientImpl$2.run(SparkClientImpl.java:491) ~[hive-exec-3.1.2.jar:3.1.2]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_232]
2022-07-30T06:51:20,260 INFO  [01d47f89-6aff-4942-9fce-a4494ba637b8 main]: reexec.ReOptimizePlugin (:()) - ReOptimization: retryPossible: false
2022-07-30T06:51:20,261 ERROR [01d47f89-6aff-4942-9fce-a4494ba637b8 main]: ql.Driver (:()) - FAILED: Execution Error, return code 30041 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Failed to create Spark client for Spark session 45cfbecf-2f90-4195-be32-d37263af4bf1
2022-07-30T06:51:20,261 INFO  [01d47f89-6aff-4942-9fce-a4494ba637b8 main]: ql.Driver (:()) - Completed executing command(queryId=root_20220730065116_747aea1e-e48e-48f4-b5c0-85c528e33593); Time taken: 1.209 seconds
2022-07-30T06:51:20,261 INFO  [01d47f89-6aff-4942-9fce-a4494ba637b8 main]: ql.Driver (:()) - Concurrency mode is disabled, not creating a lock manager
2022-07-30T06:51:20,273 INFO  [01d47f89-6aff-4942-9fce-a4494ba637b8 main]: conf.HiveConf (HiveConf.java:getLogIdVar(5040)) - Using the default value passed in for log id: 01d47f89-6aff-4942-9fce-a4494ba637b8
2022-07-30T06:51:20,273 INFO  [01d47f89-6aff-4942-9fce-a4494ba637b8 main]: session.SessionState (SessionState.java:resetThreadName(452)) - Resetting thread name to  main
2022-07-30T06:51:26,015 INFO  [shutdown-hook-0]: session.SparkSessionManagerImpl (:()) - Closing the session manager.
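A note on the likely root cause of the SparkSubmitOptionParser failure above: with no spark.home configured, Hive launches org.apache.spark.deploy.SparkSubmit in a bare child JVM (the 2022-08-05 log below shows the client printing "No spark.home provided, calling SparkSubmit directly"), so that JVM only sees the jars under $HIVE_HOME/lib. org.apache.spark.launcher.SparkSubmitOptionParser lives in spark-launcher_2.12-3.1.2.jar, so a plausible fix in the same spirit as the other jar copies (an assumption, not a verified step) is:

cp $SPARK_HOME/jars/spark-launcher_2.12-3.1.2.jar $HIVE_HOME/lib

Alternatively, point Hive at a full Spark installation so submission goes through bin/spark-submit instead, e.g. in hive-site.xml (the value is a placeholder for the actual install path):

<property>
  <name>spark.home</name>
  <value>/path/to/spark</value>
</property>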

spark java.lang.NoClassDefFoundError: org/json4s/Formats

Fix: copy the following json4s jars into $HIVE_HOME/lib (a concrete copy sketch follows the list):
json4s-core_2.12-3.7.0-M5.jar
json4s-jackson_2.12-3.7.0-M5.jar
json4s-ast_2.12-3.7.0-M5.jar
json4s-scalap_2.12-3.7.0-M5.jar
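
Assuming these jars are taken from $SPARK_HOME/jars of spark-3.1.2-bin-without-hadoop (that distribution ships json4s 3.7.0-M5, matching the versions above), the copy step mirrors the earlier jar fixes:

cp $SPARK_HOME/jars/json4s-core_2.12-3.7.0-M5.jar $HIVE_HOME/lib

cp $SPARK_HOME/jars/json4s-jackson_2.12-3.7.0-M5.jar $HIVE_HOME/lib

cp $SPARK_HOME/jars/json4s-ast_2.12-3.7.0-M5.jar $HIVE_HOME/lib

cp $SPARK_HOME/jars/json4s-scalap_2.12-3.7.0-M5.jar $HIVE_HOME/lib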

2022-08-05T16:40:23,366 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:main(8799)) - Starting hive metastore on port 9083
2022-08-05T16:40:23,807 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:newRawStoreForConf(717)) - 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2022-08-05T16:40:23,876 WARN  [main]: metastore.ObjectStore (ObjectStore.java:correctAutoStartMechanism(638)) - datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2022-08-05T16:40:23,893 INFO  [main]: metastore.ObjectStore (ObjectStore.java:initializeHelper(481)) - ObjectStore, initialize called
2022-08-05T16:40:23,894 INFO  [main]: conf.MetastoreConf (MetastoreConf.java:findConfigFile(1233)) - Unable to find config file hivemetastore-site.xml
2022-08-05T16:40:23,894 INFO  [main]: conf.MetastoreConf (MetastoreConf.java:findConfigFile(1240)) - Found configuration file null
2022-08-05T16:40:23,895 INFO  [main]: conf.MetastoreConf (MetastoreConf.java:findConfigFile(1233)) - Unable to find config file metastore-site.xml
2022-08-05T16:40:23,895 INFO  [main]: conf.MetastoreConf (MetastoreConf.java:findConfigFile(1240)) - Found configuration file null
2022-08-05T16:40:25,566 INFO  [main]: hikari.HikariDataSource (HikariDataSource.java:<init>(71)) - HikariPool-1 - Starting...
2022-08-05T16:40:25,888 INFO  [main]: pool.PoolBase (PoolBase.java:getAndSetNetworkTimeout(503)) - HikariPool-1 - Driver does not support get/set network timeout for connections. (Method org.postgresql.jdbc.PgConnection.getNetworkTimeout() is not yet implemented.)
2022-08-05T16:40:25,903 INFO  [main]: hikari.HikariDataSource (HikariDataSource.java:<init>(73)) - HikariPool-1 - Start completed.
2022-08-05T16:40:26,205 INFO  [main]: hikari.HikariDataSource (HikariDataSource.java:<init>(71)) - HikariPool-2 - Starting...
2022-08-05T16:40:26,212 INFO  [main]: pool.PoolBase (PoolBase.java:getAndSetNetworkTimeout(503)) - HikariPool-2 - Driver does not support get/set network timeout for connections. (Method org.postgresql.jdbc.PgConnection.getNetworkTimeout() is not yet implemented.)
2022-08-05T16:40:26,214 INFO  [main]: hikari.HikariDataSource (HikariDataSource.java:<init>(73)) - HikariPool-2 - Start completed.
2022-08-05T16:40:26,642 INFO  [main]: metastore.ObjectStore (ObjectStore.java:getPMF(670)) - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2022-08-05T16:40:26,863 INFO  [main]: metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:<init>(186)) - Using direct SQL, underlying DB is POSTGRES
2022-08-05T16:40:26,864 INFO  [main]: metastore.ObjectStore (ObjectStore.java:setConf(396)) - Initialized ObjectStore
2022-08-05T16:40:28,626 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:createDefaultRoles_core(812)) - Added admin role in metastore
2022-08-05T16:40:28,628 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:createDefaultRoles_core(821)) - Added public role in metastore
2022-08-05T16:40:28,653 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:addAdminUsers_core(861)) - No user is added in admin role, since config is empty
2022-08-05T16:40:28,806 INFO  [main]: conf.HiveConf (HiveConf.java:findConfigFile(187)) - Found configuration file file:/opt/hive/conf/hive-site.xml
2022-08-05T16:40:29,006 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:startMetaStore(8958)) - Starting DB backed MetaStore Server with SetUGI enabled
2022-08-05T16:40:29,030 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:startMetaStore(9030)) - Started the new metaserver on port [9083]...
2022-08-05T16:40:29,030 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:startMetaStore(9032)) - Options.minWorkerThreads = 200
2022-08-05T16:40:29,030 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:startMetaStore(9034)) - Options.maxWorkerThreads = 1000
2022-08-05T16:40:29,030 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:startMetaStore(9036)) - TCP keepalive = true
2022-08-05T16:40:29,030 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:startMetaStore(9037)) - Enable SSL = false
2022-08-05T16:41:15,267 INFO  [main]: conf.HiveConf (HiveConf.java:findConfigFile(187)) - Found configuration file file:/opt/hive/conf/hive-site.xml
2022-08-05T16:41:15,847 INFO  [main]: SessionState (:()) - Hive Session ID = 1f78ca2b-55f2-47b0-aec3-c2b81b429118
2022-08-05T16:41:15,987 INFO  [main]: SessionState (:()) -
Logging initialized using configuration in file:/opt/hive/conf/hive-log4j2.properties Async: true
2022-08-05T16:41:17,591 INFO  [main]: session.SessionState (SessionState.java:createPath(790)) - Created HDFS directory: /tmp/hive/root/1f78ca2b-55f2-47b0-aec3-c2b81b429118
2022-08-05T16:41:17,642 INFO  [main]: session.SessionState (SessionState.java:createPath(790)) - Created local directory: /tmp/root/1f78ca2b-55f2-47b0-aec3-c2b81b429118
2022-08-05T16:41:17,645 INFO  [main]: session.SessionState (SessionState.java:createPath(790)) - Created HDFS directory: /tmp/hive/root/1f78ca2b-55f2-47b0-aec3-c2b81b429118/_tmp_space.db
2022-08-05T16:41:17,656 INFO  [main]: conf.HiveConf (HiveConf.java:getLogIdVar(5040)) - Using the default value passed in for log id: 1f78ca2b-55f2-47b0-aec3-c2b81b429118
2022-08-05T16:41:17,656 INFO  [main]: session.SessionState (SessionState.java:updateThreadName(441)) - Updating thread name to 1f78ca2b-55f2-47b0-aec3-c2b81b429118 main
2022-08-05T16:41:18,599 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2022-08-05T16:41:18,620 WARN  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.ObjectStore (:()) - datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2022-08-05T16:41:18,624 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.ObjectStore (:()) - ObjectStore, initialize called
2022-08-05T16:41:18,625 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: conf.MetastoreConf (:()) - Found configuration file file:/opt/hive/conf/hive-site.xml
2022-08-05T16:41:18,626 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: conf.MetastoreConf (:()) - Unable to find config file hivemetastore-site.xml
2022-08-05T16:41:18,626 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: conf.MetastoreConf (:()) - Found configuration file null
2022-08-05T16:41:18,627 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: conf.MetastoreConf (:()) - Unable to find config file metastore-site.xml
2022-08-05T16:41:18,627 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: conf.MetastoreConf (:()) - Found configuration file null
2022-08-05T16:41:18,853 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: hikari.HikariDataSource (:()) - HikariPool-1 - Starting...
2022-08-05T16:41:18,912 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: pool.PoolBase (:()) - HikariPool-1 - Driver does not support get/set network timeout for connections. (Method org.postgresql.jdbc.PgConnection.getNetworkTimeout() is not yet implemented.)
2022-08-05T16:41:18,925 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: hikari.HikariDataSource (:()) - HikariPool-1 - Start completed.
2022-08-05T16:41:18,949 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: hikari.HikariDataSource (:()) - HikariPool-2 - Starting...
2022-08-05T16:41:18,955 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: pool.PoolBase (:()) - HikariPool-2 - Driver does not support get/set network timeout for connections. (Method org.postgresql.jdbc.PgConnection.getNetworkTimeout() is not yet implemented.)
2022-08-05T16:41:18,956 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: hikari.HikariDataSource (:()) - HikariPool-2 - Start completed.
2022-08-05T16:41:19,116 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.ObjectStore (:()) - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2022-08-05T16:41:19,171 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.MetaStoreDirectSql (:()) - Using direct SQL, underlying DB is POSTGRES
2022-08-05T16:41:19,172 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.ObjectStore (:()) - Initialized ObjectStore
2022-08-05T16:41:20,093 INFO  [pool-6-thread-1]: metastore.HiveMetaStore (HiveMetaStore.java:newRawStoreForConf(717)) - 1: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2022-08-05T16:41:20,094 WARN  [pool-6-thread-1]: metastore.ObjectStore (ObjectStore.java:correctAutoStartMechanism(638)) - datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2022-08-05T16:41:20,094 INFO  [pool-6-thread-1]: metastore.ObjectStore (ObjectStore.java:initializeHelper(481)) - ObjectStore, initialize called
2022-08-05T16:41:20,106 INFO  [pool-6-thread-1]: metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:<init>(186)) - Using direct SQL, underlying DB is POSTGRES
2022-08-05T16:41:20,107 INFO  [pool-6-thread-1]: metastore.ObjectStore (ObjectStore.java:setConf(396)) - Initialized ObjectStore
2022-08-05T16:41:20,182 INFO  [pool-6-thread-1]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 1: source:172.25.0.13 get_config_value: name=metastore.batch.retrieve.max defaultValue=50
2022-08-05T16:41:20,183 INFO  [pool-6-thread-1]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=root	ip=172.25.0.13	cmd=source:172.25.0.13 get_config_value: name=metastore.batch.retrieve.max defaultValue=50
2022-08-05T16:41:21,478 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - Added admin role in metastore
2022-08-05T16:41:21,479 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - Added public role in metastore
2022-08-05T16:41:21,491 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - No user is added in admin role, since config is empty
2022-08-05T16:41:21,590 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=root (auth:SIMPLE) retries=1 delay=1 lifetime=0
2022-08-05T16:41:21,606 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_all_functions
2022-08-05T16:41:21,606 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_all_functions
2022-08-05T16:41:21,631 INFO  [pool-10-thread-1]: SessionState (:()) - Hive Session ID = f808072f-0c73-44c2-bbdb-966f15c45739
2022-08-05T16:41:21,644 INFO  [pool-10-thread-1]: session.SessionState (SessionState.java:createPath(790)) - Created HDFS directory: /tmp/hive/root/f808072f-0c73-44c2-bbdb-966f15c45739
2022-08-05T16:41:21,647 INFO  [pool-10-thread-1]: session.SessionState (SessionState.java:createPath(790)) - Created local directory: /tmp/root/f808072f-0c73-44c2-bbdb-966f15c45739
2022-08-05T16:41:21,651 INFO  [pool-10-thread-1]: session.SessionState (SessionState.java:createPath(790)) - Created HDFS directory: /tmp/hive/root/f808072f-0c73-44c2-bbdb-966f15c45739/_tmp_space.db
2022-08-05T16:41:21,653 INFO  [pool-10-thread-1]: metastore.HiveMetaStore (:()) - 1: get_databases: @hive#
2022-08-05T16:41:21,653 INFO  [pool-10-thread-1]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_databases: @hive#
2022-08-05T16:41:21,657 INFO  [pool-10-thread-1]: metastore.HiveMetaStore (:()) - 1: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2022-08-05T16:41:21,658 INFO  [pool-10-thread-1]: metastore.ObjectStore (:()) - ObjectStore, initialize called
2022-08-05T16:41:21,668 INFO  [pool-10-thread-1]: metastore.MetaStoreDirectSql (:()) - Using direct SQL, underlying DB is POSTGRES
2022-08-05T16:41:21,668 INFO  [pool-10-thread-1]: metastore.ObjectStore (:()) - Initialized ObjectStore
2022-08-05T16:41:21,675 INFO  [pool-10-thread-1]: metastore.HiveMetaStore (:()) - 1: get_tables_by_type: db=@hive#default pat=.*,type=MATERIALIZED_VIEW
2022-08-05T16:41:21,675 INFO  [pool-10-thread-1]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_tables_by_type: db=@hive#default pat=.*,type=MATERIALIZED_VIEW
2022-08-05T16:41:21,679 INFO  [pool-10-thread-1]: metastore.HiveMetaStore (:()) - 1: get_multi_table : db=default tbls=
2022-08-05T16:41:21,679 INFO  [pool-10-thread-1]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_multi_table : db=default tbls=
2022-08-05T16:41:21,681 INFO  [pool-10-thread-1]: metastore.HiveMetaStore (:()) - 1: get_tables_by_type: db=@hive#test pat=.*,type=MATERIALIZED_VIEW
2022-08-05T16:41:21,681 INFO  [pool-10-thread-1]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_tables_by_type: db=@hive#test pat=.*,type=MATERIALIZED_VIEW
2022-08-05T16:41:21,682 INFO  [pool-10-thread-1]: metastore.HiveMetaStore (:()) - 1: get_multi_table : db=test tbls=
2022-08-05T16:41:21,682 INFO  [pool-10-thread-1]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_multi_table : db=test tbls=
2022-08-05T16:41:21,682 INFO  [pool-10-thread-1]: metadata.HiveMaterializedViewsRegistry (:()) - Materialized views registry has been initialized
2022-08-05T16:41:24,861 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: conf.HiveConf (HiveConf.java:getLogIdVar(5040)) - Using the default value passed in for log id: 1f78ca2b-55f2-47b0-aec3-c2b81b429118
2022-08-05T16:41:24,962 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Compiling command(queryId=root_20220805164124_0fb768cd-c062-47f7-bdcc-fa3f3c864686): show tables
2022-08-05T16:41:25,333 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Concurrency mode is disabled, not creating a lock manager
2022-08-05T16:41:25,336 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_database: @hive#default
2022-08-05T16:41:25,336 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_database: @hive#default
2022-08-05T16:41:25,356 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Semantic Analysis Completed (retrial = false)
2022-08-05T16:41:25,381 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:tab_name, type:string, comment:from deserializer)], properties:null)
2022-08-05T16:41:25,449 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: exec.ListSinkOperator (:()) - Initializing operator LIST_SINK[0]
2022-08-05T16:41:25,455 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Completed compiling command(queryId=root_20220805164124_0fb768cd-c062-47f7-bdcc-fa3f3c864686); Time taken: 0.512 seconds
2022-08-05T16:41:25,455 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: reexec.ReExecDriver (:()) - Execution #1 of query
2022-08-05T16:41:25,455 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Concurrency mode is disabled, not creating a lock manager
2022-08-05T16:41:25,455 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Executing command(queryId=root_20220805164124_0fb768cd-c062-47f7-bdcc-fa3f3c864686): show tables
2022-08-05T16:41:25,461 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Starting task [Stage-0:DDL] in serial mode
2022-08-05T16:41:25,462 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_database: @hive#default
2022-08-05T16:41:25,462 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_database: @hive#default
2022-08-05T16:41:25,464 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_tables: db=@hive#default pat=.*
2022-08-05T16:41:25,464 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_tables: db=@hive#default pat=.*
2022-08-05T16:41:25,469 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Completed executing command(queryId=root_20220805164124_0fb768cd-c062-47f7-bdcc-fa3f3c864686); Time taken: 0.014 seconds
2022-08-05T16:41:25,469 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - OK
2022-08-05T16:41:25,469 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Concurrency mode is disabled, not creating a lock manager
2022-08-05T16:41:25,473 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: Configuration.deprecation (Configuration.java:logDeprecation(1395)) - mapred.input.dir is deprecated. Instead, use mapreduce.input.fileinputformat.inputdir
2022-08-05T16:41:26,055 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: mapred.FileInputFormat (:()) - Total input files to process : 1
2022-08-05T16:41:26,095 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: exec.ListSinkOperator (:()) - RECORDS_OUT_INTERMEDIATE:0, RECORDS_OUT_OPERATOR_LIST_SINK_0:1,
2022-08-05T16:41:26,100 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: CliDriver (SessionState.java:printInfo(1227)) - Time taken: 0.529 seconds, Fetched: 1 row(s)
2022-08-05T16:41:26,100 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: conf.HiveConf (HiveConf.java:getLogIdVar(5040)) - Using the default value passed in for log id: 1f78ca2b-55f2-47b0-aec3-c2b81b429118
2022-08-05T16:41:26,100 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: session.SessionState (SessionState.java:resetThreadName(452)) - Resetting thread name to  main
2022-08-05T16:42:00,520 INFO  [main]: conf.HiveConf (HiveConf.java:getLogIdVar(5040)) - Using the default value passed in for log id: 1f78ca2b-55f2-47b0-aec3-c2b81b429118
2022-08-05T16:42:00,521 INFO  [main]: session.SessionState (SessionState.java:updateThreadName(441)) - Updating thread name to 1f78ca2b-55f2-47b0-aec3-c2b81b429118 main
2022-08-05T16:42:00,525 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Compiling command(queryId=root_20220805164200_7f498c34-66ef-48e3-a168-19db0e8094de): insert into table student values(1,'abc')
2022-08-05T16:42:00,575 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Concurrency mode is disabled, not creating a lock manager
2022-08-05T16:42:00,578 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: parse.CalcitePlanner (:()) - Starting Semantic Analysis
2022-08-05T16:42:00,598 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: sqlstd.SQLStdHiveAccessController (:()) - Created SQLStdHiveAccessController for session context : HiveAuthzSessionContext [sessionString=1f78ca2b-55f2-47b0-aec3-c2b81b429118, clientType=HIVECLI]
2022-08-05T16:42:00,599 WARN  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: session.SessionState (SessionState.java:setAuthorizerV2Config(950)) - METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
2022-08-05T16:42:00,600 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStoreClient (:()) - Mestastore configuration metastore.filter.hook changed from org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook
2022-08-05T16:42:00,601 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: Cleaning up thread local RawStore...
2022-08-05T16:42:00,601 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=Cleaning up thread local RawStore...
2022-08-05T16:42:00,601 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: Done cleaning up thread local RawStore
2022-08-05T16:42:00,601 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=Done cleaning up thread local RawStore
2022-08-05T16:42:00,603 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2022-08-05T16:42:00,603 WARN  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.ObjectStore (:()) - datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2022-08-05T16:42:00,604 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.ObjectStore (:()) - ObjectStore, initialize called
2022-08-05T16:42:00,608 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.MetaStoreDirectSql (:()) - Using direct SQL, underlying DB is POSTGRES
2022-08-05T16:42:00,608 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.ObjectStore (:()) - Initialized ObjectStore
2022-08-05T16:42:00,608 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=root (auth:SIMPLE) retries=1 delay=1 lifetime=0
2022-08-05T16:42:00,614 WARN  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.ObjectStore (:()) - datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2022-08-05T16:42:00,614 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.ObjectStore (:()) - ObjectStore, initialize called
2022-08-05T16:42:00,616 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.MetaStoreDirectSql (:()) - Using direct SQL, underlying DB is POSTGRES
2022-08-05T16:42:00,616 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.ObjectStore (:()) - Initialized ObjectStore
2022-08-05T16:42:00,616 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=root (auth:SIMPLE) retries=1 delay=1 lifetime=0
2022-08-05T16:42:00,617 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive.default.student
2022-08-05T16:42:00,618 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive.default.student
2022-08-05T16:42:01,064 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: parse.CalcitePlanner (:()) - Completed phase 1 of Semantic Analysis
2022-08-05T16:42:01,064 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: parse.CalcitePlanner (:()) - Get metadata for source tables
2022-08-05T16:42:01,064 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: parse.CalcitePlanner (:()) - Get metadata for subqueries
2022-08-05T16:42:01,068 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: parse.CalcitePlanner (:()) - Get metadata for destination tables
2022-08-05T16:42:01,069 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive.default.student
2022-08-05T16:42:01,069 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive.default.student
2022-08-05T16:42:01,083 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: parse.CalcitePlanner (:()) - Completed getting MetaData in Semantic Analysis
2022-08-05T16:42:02,294 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Context (:()) - New scratch dir is hdfs://namenode:9000/tmp/hive/root/1f78ca2b-55f2-47b0-aec3-c2b81b429118/hive_2022-08-05_16-42-00_544_8151355816140355569-1
2022-08-05T16:42:02,411 INFO  [Thread-10]: sasl.SaslDataTransferClient (:()) - SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2022-08-05T16:42:02,540 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_not_null_constraints : tbl=hive._dummy_database._dummy_table
2022-08-05T16:42:02,540 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_not_null_constraints : tbl=hive._dummy_database._dummy_table
2022-08-05T16:42:02,562 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_primary_keys : tbl=hive._dummy_database._dummy_table
2022-08-05T16:42:02,562 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_primary_keys : tbl=hive._dummy_database._dummy_table
2022-08-05T16:42:02,570 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_primary_keys : tbl=hive._dummy_database._dummy_table
2022-08-05T16:42:02,570 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_primary_keys : tbl=hive._dummy_database._dummy_table
2022-08-05T16:42:02,573 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_unique_constraints : tbl=hive._dummy_database._dummy_table
2022-08-05T16:42:02,573 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_unique_constraints : tbl=hive._dummy_database._dummy_table
2022-08-05T16:42:02,577 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_foreign_keys : parentdb=null parenttbl=null foreigndb=_dummy_database foreigntbl=_dummy_table
2022-08-05T16:42:02,577 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_foreign_keys : parentdb=null parenttbl=null foreigndb=_dummy_database foreigntbl=_dummy_table
2022-08-05T16:42:03,236 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_databases: @hive#
2022-08-05T16:42:03,236 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_databases: @hive#
2022-08-05T16:42:03,240 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_materialized_views_for_rewriting: db=@hive#default
2022-08-05T16:42:03,240 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_materialized_views_for_rewriting: db=@hive#default
2022-08-05T16:42:03,244 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_materialized_views_for_rewriting: db=@hive#test
2022-08-05T16:42:03,245 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_materialized_views_for_rewriting: db=@hive#test
2022-08-05T16:42:03,267 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: parse.CalcitePlanner (:()) - Get metadata for source tables
2022-08-05T16:42:03,267 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: parse.CalcitePlanner (:()) - Get metadata for subqueries
2022-08-05T16:42:03,268 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: parse.CalcitePlanner (:()) - Get metadata for source tables
2022-08-05T16:42:03,268 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive._dummy_database._dummy_table
2022-08-05T16:42:03,268 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive._dummy_database._dummy_table
2022-08-05T16:42:03,270 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: parse.CalcitePlanner (:()) - Get metadata for subqueries
2022-08-05T16:42:03,270 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: parse.CalcitePlanner (:()) - Get metadata for destination tables
2022-08-05T16:42:03,273 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: parse.CalcitePlanner (:()) - Get metadata for destination tables
2022-08-05T16:42:03,273 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive.default.student
2022-08-05T16:42:03,273 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive.default.student
2022-08-05T16:42:03,287 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Context (:()) - New scratch dir is hdfs://namenode:9000/tmp/hive/root/1f78ca2b-55f2-47b0-aec3-c2b81b429118/hive_2022-08-05_16-42-00_544_8151355816140355569-1
2022-08-05T16:42:03,294 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: common.FileUtils (FileUtils.java:mkdir(580)) - Creating directory if it doesn't exist: hdfs://namenode:9000/user/hive/warehouse/student/.hive-staging_hive_2022-08-05_16-42-00_544_8151355816140355569-1
2022-08-05T16:42:03,298 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_not_null_constraints : tbl=hive.default.student
2022-08-05T16:42:03,298 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_not_null_constraints : tbl=hive.default.student
2022-08-05T16:42:03,301 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_check_constraints : tbl=hive.default.student
2022-08-05T16:42:03,301 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_check_constraints : tbl=hive.default.student
2022-08-05T16:42:03,312 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive.default.student
2022-08-05T16:42:03,312 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive.default.student
2022-08-05T16:42:03,326 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: parse.CalcitePlanner (:()) - Generate an operator pipeline to autogather column stats for table default.student in query insert into table student values(1,'abc')
2022-08-05T16:42:03,329 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStoreClient (:()) - Mestastore configuration metastore.filter.hook changed from org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook to org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl
2022-08-05T16:42:03,329 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: Cleaning up thread local RawStore...
2022-08-05T16:42:03,329 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=Cleaning up thread local RawStore...
2022-08-05T16:42:03,329 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: Done cleaning up thread local RawStore
2022-08-05T16:42:03,329 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=Done cleaning up thread local RawStore
2022-08-05T16:42:03,331 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2022-08-05T16:42:03,331 WARN  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.ObjectStore (:()) - datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2022-08-05T16:42:03,331 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.ObjectStore (:()) - ObjectStore, initialize called
2022-08-05T16:42:03,333 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.MetaStoreDirectSql (:()) - Using direct SQL, underlying DB is POSTGRES
2022-08-05T16:42:03,333 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.ObjectStore (:()) - Initialized ObjectStore
2022-08-05T16:42:03,334 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=root (auth:SIMPLE) retries=1 delay=1 lifetime=0
2022-08-05T16:42:03,334 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive.default.student
2022-08-05T16:42:03,334 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive.default.student
2022-08-05T16:42:03,352 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: parse.CalcitePlanner (:()) - Get metadata for source tables
2022-08-05T16:42:03,352 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive.default.student
2022-08-05T16:42:03,352 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive.default.student
2022-08-05T16:42:03,362 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: parse.CalcitePlanner (:()) - Get metadata for subqueries
2022-08-05T16:42:03,362 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: parse.CalcitePlanner (:()) - Get metadata for destination tables
2022-08-05T16:42:03,390 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Context (:()) - New scratch dir is hdfs://namenode:9000/tmp/hive/root/1f78ca2b-55f2-47b0-aec3-c2b81b429118/hive_2022-08-05_16-42-03_326_6456612087930169162-1
2022-08-05T16:42:03,402 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: common.FileUtils (FileUtils.java:mkdir(580)) - Creating directory if it doesn't exist: hdfs://namenode:9000/tmp/hive/root/1f78ca2b-55f2-47b0-aec3-c2b81b429118/hive_2022-08-05_16-42-03_326_6456612087930169162-1/-mr-10000/.hive-staging_hive_2022-08-05_16-42-03_326_6456612087930169162-1
2022-08-05T16:42:03,407 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: parse.CalcitePlanner (:()) - CBO Succeeded; optimized logical plan.
2022-08-05T16:42:03,425 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ppd.OpProcFactory (:()) - Processing for FS(4)
2022-08-05T16:42:03,425 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ppd.OpProcFactory (:()) - Processing for FS(11)
2022-08-05T16:42:03,425 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ppd.OpProcFactory (:()) - Processing for SEL(10)
2022-08-05T16:42:03,425 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ppd.OpProcFactory (:()) - Processing for GBY(9)
2022-08-05T16:42:03,425 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ppd.OpProcFactory (:()) - Processing for RS(8)
2022-08-05T16:42:03,425 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ppd.OpProcFactory (:()) - Processing for GBY(7)
2022-08-05T16:42:03,425 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ppd.OpProcFactory (:()) - Processing for SEL(6)
2022-08-05T16:42:03,425 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ppd.OpProcFactory (:()) - Processing for SEL(3)
2022-08-05T16:42:03,425 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ppd.OpProcFactory (:()) - Processing for UDTF(2)
2022-08-05T16:42:03,425 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ppd.OpProcFactory (:()) - Processing for SEL(1)
2022-08-05T16:42:03,425 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ppd.OpProcFactory (:()) - Processing for TS(0)
2022-08-05T16:42:03,440 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: optimizer.ColumnPrunerProcFactory (:()) - RS 8 oldColExprMap: {VALUE._col0=Column[_col0], VALUE._col1=Column[_col1]}
2022-08-05T16:42:03,440 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: optimizer.ColumnPrunerProcFactory (:()) - RS 8 newColExprMap: {VALUE._col0=Column[_col0], VALUE._col1=Column[_col1]}
2022-08-05T16:42:03,493 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: spark.SetSparkReducerParallelism (:()) - Number of reducers for sink RS[8] was already determined to be: 1
2022-08-05T16:42:03,514 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive.default.student
2022-08-05T16:42:03,514 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive.default.student
2022-08-05T16:42:03,573 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: physical.Vectorizer (:()) - Examining input format to see if vectorization is enabled.
2022-08-05T16:42:03,575 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: physical.Vectorizer (:()) - Map vectorization enabled: false
2022-08-05T16:42:03,575 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: physical.Vectorizer (:()) - Map vectorized: false
2022-08-05T16:42:03,575 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: physical.Vectorizer (:()) - Map vectorizedVertexNum: 0
2022-08-05T16:42:03,575 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: physical.Vectorizer (:()) - Map enabledConditionsMet: [hive.vectorized.use.vectorized.input.format IS true]
2022-08-05T16:42:03,575 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: physical.Vectorizer (:()) - Map enabledConditionsNotMet: [Could not enable vectorization due to partition column names size 1 is greater than the number of table column names size 0 IS false]
2022-08-05T16:42:03,575 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: physical.Vectorizer (:()) - Map inputFileFormatClassNameSet: [org.apache.hadoop.hive.ql.io.NullRowsInputFormat]
2022-08-05T16:42:03,599 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: physical.Vectorizer (:()) - Validating and vectorizing ReduceWork... (vectorizedVertexNum 1)
2022-08-05T16:42:03,610 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: physical.Vectorizer (:()) - Reduce vectorization enabled: true
2022-08-05T16:42:03,610 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: physical.Vectorizer (:()) - Reduce vectorized: false
2022-08-05T16:42:03,610 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: physical.Vectorizer (:()) - Reduce notVectorizedReason: Aggregation Function expression for GROUPBY operator: UDF compute_stats not supported
2022-08-05T16:42:03,610 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: physical.Vectorizer (:()) - Reduce vectorizedVertexNum: 1
2022-08-05T16:42:03,610 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: physical.Vectorizer (:()) - Reducer hive.vectorized.execution.reduce.enabled: true
2022-08-05T16:42:03,610 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: physical.Vectorizer (:()) - Reducer engine: spark
2022-08-05T16:42:03,612 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: common.HiveStatsUtils (:()) - Error requested is 20.0%
2022-08-05T16:42:03,612 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: common.HiveStatsUtils (:()) - Choosing 16 bit vectors..
2022-08-05T16:42:03,615 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: parse.CalcitePlanner (:()) - Completed plan generation
2022-08-05T16:42:03,615 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Semantic Analysis Completed (retrial = false)
2022-08-05T16:42:03,615 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:col1, type:int, comment:null), FieldSchema(name:col2, type:string, comment:null)], properties:null)
2022-08-05T16:42:03,615 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Completed compiling command(queryId=root_20220805164200_7f498c34-66ef-48e3-a168-19db0e8094de); Time taken: 3.09 seconds
2022-08-05T16:42:03,615 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: reexec.ReExecDriver (:()) - Execution #1 of query
2022-08-05T16:42:03,615 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Concurrency mode is disabled, not creating a lock manager
2022-08-05T16:42:03,616 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Executing command(queryId=root_20220805164200_7f498c34-66ef-48e3-a168-19db0e8094de): insert into table student values(1,'abc')
2022-08-05T16:42:03,616 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Query ID = root_20220805164200_7f498c34-66ef-48e3-a168-19db0e8094de
2022-08-05T16:42:03,616 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Total jobs = 1
2022-08-05T16:42:03,623 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Launching Job 1 out of 1
2022-08-05T16:42:03,624 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Starting task [Stage-1:MAPRED] in serial mode
2022-08-05T16:42:03,624 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: spark.SparkTask (:()) - In order to change the average load for a reducer (in bytes):
2022-08-05T16:42:03,624 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: spark.SparkTask (:()) -   set hive.exec.reducers.bytes.per.reducer=<number>
2022-08-05T16:42:03,624 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: spark.SparkTask (:()) - In order to limit the maximum number of reducers:
2022-08-05T16:42:03,624 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: spark.SparkTask (:()) -   set hive.exec.reducers.max=<number>
2022-08-05T16:42:03,624 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: spark.SparkTask (:()) - In order to set a constant number of reducers:
2022-08-05T16:42:03,624 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: spark.SparkTask (:()) -   set mapreduce.job.reduces=<number>
2022-08-05T16:42:03,633 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: session.SparkSessionManagerImpl (:()) - Setting up the session manager.
2022-08-05T16:42:03,847 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: session.SparkSession (:()) - Trying to open Spark session 7523b336-d47a-41ca-8899-5dd7b83717fa
2022-08-05T16:42:05,136 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - No spark.home provided, calling SparkSubmit directly.
2022-08-05T16:42:05,262 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - Running client driver with argv: /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java org.apache.spark.deploy.SparkSubmit --properties-file /tmp/spark-submit.6659189783927959801.properties --class org.apache.hive.spark.client.RemoteDriver /opt/hive/lib/hive-exec-3.1.2.jar --remote-host hive-metastore --remote-port 45813 --conf hive.spark.client.connect.timeout=10000 --conf hive.spark.client.server.connect.timeout=90000 --conf hive.spark.client.channel.log.level=null --conf hive.spark.client.rpc.max.size=52428800 --conf hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256 --conf hive.spark.client.rpc.server.address=null
2022-08-05T16:42:06,614 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
2022-08-05T16:42:06,614 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
2022-08-05T16:42:06,614 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
2022-08-05T16:42:06,614 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
2022-08-05T16:42:06,614 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
2022-08-05T16:42:07,178 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 2022-08-05T16:42:07,176 WARN [main] org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2022-08-05T16:42:07,236 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
2022-08-05T16:42:07,236 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 2022-08-05T16:42:07,236 INFO [main] org.apache.spark.SecurityManager - Changing view acls to: root
2022-08-05T16:42:07,237 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 2022-08-05T16:42:07,237 INFO [main] org.apache.spark.SecurityManager - Changing modify acls to: root
2022-08-05T16:42:07,237 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 2022-08-05T16:42:07,237 INFO [main] org.apache.spark.SecurityManager - Changing view acls groups to:
2022-08-05T16:42:07,237 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 2022-08-05T16:42:07,237 INFO [main] org.apache.spark.SecurityManager - Changing modify acls groups to:
2022-08-05T16:42:07,238 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 2022-08-05T16:42:07,238 INFO [main] org.apache.spark.SecurityManager - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
2022-08-05T16:42:08,081 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 2022-08-05T16:42:08,081 INFO [main] org.apache.spark.util.Utils - Successfully started service 'driverClient' on port 38067.
2022-08-05T16:42:08,442 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 2022-08-05T16:42:08,442 INFO [netty-rpc-connection-0] org.apache.spark.network.client.TransportClientFactory - Successfully created connection to spark-master-hp/172.25.0.3:7077 after 105 ms (0 ms spent in bootstraps)
2022-08-05T16:42:08,694 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 2022-08-05T16:42:08,684 ERROR [dispatcher-event-loop-2] org.apache.spark.rpc.netty.Inbox - An error happened while processing message in the inbox for client
2022-08-05T16:42:08,695 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - java.lang.NoClassDefFoundError: org/json4s/Formats
2022-08-05T16:42:08,695 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at org.apache.spark.deploy.ClientEndpoint.onStart(Client.scala:104) ~[spark-core_2.12-3.1.2.jar:3.1.2]
2022-08-05T16:42:08,695 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:120) [spark-core_2.12-3.1.2.jar:3.1.2]
2022-08-05T16:42:08,695 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.12-3.1.2.jar:3.1.2]
2022-08-05T16:42:08,695 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.12-3.1.2.jar:3.1.2]
2022-08-05T16:42:08,695 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.12-3.1.2.jar:3.1.2]
2022-08-05T16:42:08,695 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.12-3.1.2.jar:3.1.2]
2022-08-05T16:42:08,695 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_232]
2022-08-05T16:42:08,695 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_232]
2022-08-05T16:42:08,695 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_232]
2022-08-05T16:42:08,695 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - Caused by: java.lang.ClassNotFoundException: org.json4s.Formats
2022-08-05T16:42:08,695 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at java.net.URLClassLoader.findClass(URLClassLoader.java:382) ~[?:1.8.0_232]
2022-08-05T16:42:08,695 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at java.lang.ClassLoader.loadClass(ClassLoader.java:418) ~[?:1.8.0_232]
2022-08-05T16:42:08,695 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352) ~[?:1.8.0_232]
2022-08-05T16:42:08,695 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at java.lang.ClassLoader.loadClass(ClassLoader.java:351) ~[?:1.8.0_232]
2022-08-05T16:42:08,695 INFO  [RemoteDriver-stdout-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	... 9 more
2022-08-05T16:42:08,696 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - Exception in thread "dispatcher-event-loop-2" java.lang.NoClassDefFoundError: org/json4s/Formats
2022-08-05T16:42:08,696 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at org.apache.spark.deploy.ClientEndpoint.onStart(Client.scala:104)
2022-08-05T16:42:08,696 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:120)
2022-08-05T16:42:08,696 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213)
2022-08-05T16:42:08,696 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100)
2022-08-05T16:42:08,696 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75)
2022-08-05T16:42:08,696 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41)
2022-08-05T16:42:08,696 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
2022-08-05T16:42:08,696 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
2022-08-05T16:42:08,696 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at java.lang.Thread.run(Thread.java:748)
2022-08-05T16:42:08,696 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - Caused by: java.lang.ClassNotFoundException: org.json4s.Formats
2022-08-05T16:42:08,696 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
2022-08-05T16:42:08,696 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
2022-08-05T16:42:08,696 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
2022-08-05T16:42:08,696 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
2022-08-05T16:42:08,696 INFO  [RemoteDriver-stderr-redir-1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - 	... 9 more
2022-08-05T16:42:19,050 INFO  [pool-6-thread-1]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 1: source:172.25.0.13 get_config_value: name=metastore.batch.retrieve.max defaultValue=50
2022-08-05T16:42:19,051 INFO  [pool-6-thread-1]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=root	ip=172.25.0.13	cmd=source:172.25.0.13 get_config_value: name=metastore.batch.retrieve.max defaultValue=50
2022-08-05T16:43:19,048 INFO  [pool-6-thread-1]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 1: source:172.25.0.13 get_config_value: name=metastore.batch.retrieve.max defaultValue=50
2022-08-05T16:43:19,049 INFO  [pool-6-thread-1]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=root	ip=172.25.0.13	cmd=source:172.25.0.13 get_config_value: name=metastore.batch.retrieve.max defaultValue=50
2022-08-05T16:43:35,400 ERROR [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: client.SparkClientImpl (:()) - Timed out waiting for client to connect.
Possible reasons include network issues, errors in remote driver or the cluster has no available resources, etc.
Please check YARN or Spark driver's logs for further information.
java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: Timed out waiting for client connection.
	at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:41) ~[netty-all-4.1.17.Final.jar:4.1.17.Final]
	at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:106) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:88) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:105) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:101) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:76) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:87) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:115) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683) ~[hive-cli-3.1.2.jar:3.1.2]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_232]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_232]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_232]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_232]
	at org.apache.hadoop.util.RunJar.run(RunJar.java:323) ~[hadoop-common-3.2.1.jar:?]
	at org.apache.hadoop.util.RunJar.main(RunJar.java:236) ~[hadoop-common-3.2.1.jar:?]
Caused by: java.util.concurrent.TimeoutException: Timed out waiting for client connection.
	at org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:172) ~[hive-exec-3.1.2.jar:3.1.2]
	at io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38) ~[netty-all-4.1.17.Final.jar:4.1.17.Final]
	at io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:120) ~[netty-all-4.1.17.Final.jar:4.1.17.Final]
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163) ~[netty-all-4.1.17.Final.jar:4.1.17.Final]
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403) ~[netty-all-4.1.17.Final.jar:4.1.17.Final]
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463) ~[netty-all-4.1.17.Final.jar:4.1.17.Final]
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858) ~[netty-all-4.1.17.Final.jar:4.1.17.Final]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_232]
2022-08-05T16:43:35,401 WARN  [Driver]: client.SparkClientImpl (:()) - Thread waiting on the child process (spark-submit) is interrupted, killing the child process.
2022-08-05T16:43:35,442 ERROR [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: spark.SparkTask (:()) - Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create Spark client for Spark session 7523b336-d47a-41ca-8899-5dd7b83717fa)'
org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create Spark client for Spark session 7523b336-d47a-41ca-8899-5dd7b83717fa
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.getHiveException(SparkSessionImpl.java:215)
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:92)
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115)
	at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136)
	at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:115)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703)
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157)
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218)
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239)
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402)
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821)
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: Timed out waiting for client connection.
	at com.google.common.base.Throwables.propagate(Throwables.java:241)
	at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:128)
	at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:88)
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:105)
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:101)
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:76)
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:87)
	... 24 more
Caused by: java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: Timed out waiting for client connection.
	at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:41)
	at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:106)
	... 29 more
Caused by: java.util.concurrent.TimeoutException: Timed out waiting for client connection.
	at org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:172)
	at io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
	at io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:120)
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
	at java.lang.Thread.run(Thread.java:748)

2022-08-05T16:43:35,442 ERROR [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: spark.SparkTask (:()) - Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create Spark client for Spark session 7523b336-d47a-41ca-8899-5dd7b83717fa)'
org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create Spark client for Spark session 7523b336-d47a-41ca-8899-5dd7b83717fa
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.getHiveException(SparkSessionImpl.java:215) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:92) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:115) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683) ~[hive-cli-3.1.2.jar:3.1.2]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_232]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_232]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_232]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_232]
	at org.apache.hadoop.util.RunJar.run(RunJar.java:323) ~[hadoop-common-3.2.1.jar:?]
	at org.apache.hadoop.util.RunJar.main(RunJar.java:236) ~[hadoop-common-3.2.1.jar:?]
Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: Timed out waiting for client connection.
	at com.google.common.base.Throwables.propagate(Throwables.java:241) ~[guava-27.0-jre.jar:?]
	at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:128) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:88) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:105) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:101) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:76) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:87) ~[hive-exec-3.1.2.jar:3.1.2]
	... 24 more
Caused by: java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: Timed out waiting for client connection.
	at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:41) ~[netty-all-4.1.17.Final.jar:4.1.17.Final]
	at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:106) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:88) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:105) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:101) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:76) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:87) ~[hive-exec-3.1.2.jar:3.1.2]
	... 24 more
Caused by: java.util.concurrent.TimeoutException: Timed out waiting for client connection.
	at org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:172) ~[hive-exec-3.1.2.jar:3.1.2]
	at io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38) ~[netty-all-4.1.17.Final.jar:4.1.17.Final]
	at io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:120) ~[netty-all-4.1.17.Final.jar:4.1.17.Final]
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163) ~[netty-all-4.1.17.Final.jar:4.1.17.Final]
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403) ~[netty-all-4.1.17.Final.jar:4.1.17.Final]
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463) ~[netty-all-4.1.17.Final.jar:4.1.17.Final]
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858) ~[netty-all-4.1.17.Final.jar:4.1.17.Final]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_232]
2022-08-05T16:43:35,442 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: reexec.ReOptimizePlugin (:()) - ReOptimization: retryPossible: false
2022-08-05T16:43:35,442 ERROR [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - FAILED: Execution Error, return code 30041 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Failed to create Spark client for Spark session 7523b336-d47a-41ca-8899-5dd7b83717fa
2022-08-05T16:43:35,442 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Completed executing command(queryId=root_20220805164200_7f498c34-66ef-48e3-a168-19db0e8094de); Time taken: 91.827 seconds
2022-08-05T16:43:35,442 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: ql.Driver (:()) - Concurrency mode is disabled, not creating a lock manager
2022-08-05T16:43:35,454 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: conf.HiveConf (HiveConf.java:getLogIdVar(5040)) - Using the default value passed in for log id: 1f78ca2b-55f2-47b0-aec3-c2b81b429118
2022-08-05T16:43:35,455 INFO  [1f78ca2b-55f2-47b0-aec3-c2b81b429118 main]: session.SessionState (SessionState.java:resetThreadName(452)) - Resetting thread name to  main
2022-08-05T16:43:42,788 INFO  [shutdown-hook-0]: session.SparkSessionManagerImpl (:()) - Closing the session manager.
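
This run fails for a different reason than the trace suggests at first glance: java.lang.NoClassDefFoundError: org/json4s/Formats is thrown in ClientEndpoint.onStart right after the driver connects to spark-master-hp:7077, so the RemoteDriver JVM dies during startup and never calls back to Hive's RPC server. The "Timed out waiting for client to connect" error 90 seconds later (matching the hive.spark.client.server.connect.timeout=90000 visible in the spark-submit arguments) is only the downstream symptom of that crash, not a network problem. Presumably the fix follows the same jar-copy pattern as the earlier SparkConf error — a sketch, assuming the json4s jars sit under $SPARK_HOME/jars (exact versions vary by Spark release, hence the wildcard):

cp $SPARK_HOME/jars/json4s-*.jar $HIVE_HOME/lib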

spark java.nio.file.NoSuchFileException: /opt/hive/lib/hive-exec-3.1.2.jar

Fix: place /opt/hive/lib/hive-exec-3.1.2.jar on the Spark machines as well. The client driver is launched with this jar as its application (--class org.apache.hive.spark.client.RemoteDriver /opt/hive/lib/hive-exec-3.1.2.jar, see the spark-submit command above), so the file must exist at the same absolute path on whichever node runs the driver.
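
A minimal sketch for distributing the jar; only spark-master-hp appears in the logs above, so the worker hostnames here are assumptions — substitute your own Spark nodes:

for host in spark-master-hp spark-worker-1 spark-worker-2; do
  ssh "$host" "mkdir -p /opt/hive/lib"
  scp /opt/hive/lib/hive-exec-3.1.2.jar "$host":/opt/hive/lib/
done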

2022-08-05T17:39:58,842 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:main(8799)) - Starting hive metastore on port 9083
2022-08-05T17:39:59,024 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:newRawStoreForConf(717)) - 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2022-08-05T17:39:59,052 WARN  [main]: metastore.ObjectStore (ObjectStore.java:correctAutoStartMechanism(638)) - datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2022-08-05T17:39:59,057 INFO  [main]: metastore.ObjectStore (ObjectStore.java:initializeHelper(481)) - ObjectStore, initialize called
2022-08-05T17:39:59,058 INFO  [main]: conf.MetastoreConf (MetastoreConf.java:findConfigFile(1233)) - Unable to find config file hivemetastore-site.xml
2022-08-05T17:39:59,058 INFO  [main]: conf.MetastoreConf (MetastoreConf.java:findConfigFile(1240)) - Found configuration file null
2022-08-05T17:39:59,058 INFO  [main]: conf.MetastoreConf (MetastoreConf.java:findConfigFile(1233)) - Unable to find config file metastore-site.xml
2022-08-05T17:39:59,059 INFO  [main]: conf.MetastoreConf (MetastoreConf.java:findConfigFile(1240)) - Found configuration file null
2022-08-05T17:39:59,352 INFO  [main]: hikari.HikariDataSource (HikariDataSource.java:<init>(71)) - HikariPool-1 - Starting...
2022-08-05T17:39:59,405 INFO  [main]: pool.PoolBase (PoolBase.java:getAndSetNetworkTimeout(503)) - HikariPool-1 - Driver does not support get/set network timeout for connections. (Method org.postgresql.jdbc.PgConnection.getNetworkTimeout() is not yet implemented.)
2022-08-05T17:39:59,418 INFO  [main]: hikari.HikariDataSource (HikariDataSource.java:<init>(73)) - HikariPool-1 - Start completed.
2022-08-05T17:39:59,437 INFO  [main]: hikari.HikariDataSource (HikariDataSource.java:<init>(71)) - HikariPool-2 - Starting...
2022-08-05T17:39:59,443 INFO  [main]: pool.PoolBase (PoolBase.java:getAndSetNetworkTimeout(503)) - HikariPool-2 - Driver does not support get/set network timeout for connections. (Method org.postgresql.jdbc.PgConnection.getNetworkTimeout() is not yet implemented.)
2022-08-05T17:39:59,444 INFO  [main]: hikari.HikariDataSource (HikariDataSource.java:<init>(73)) - HikariPool-2 - Start completed.
2022-08-05T17:39:59,616 INFO  [main]: metastore.ObjectStore (ObjectStore.java:getPMF(670)) - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2022-08-05T17:39:59,693 INFO  [main]: metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:<init>(186)) - Using direct SQL, underlying DB is POSTGRES
2022-08-05T17:39:59,694 INFO  [main]: metastore.ObjectStore (ObjectStore.java:setConf(396)) - Initialized ObjectStore
2022-08-05T17:40:01,260 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:createDefaultRoles_core(812)) - Added admin role in metastore
2022-08-05T17:40:01,261 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:createDefaultRoles_core(821)) - Added public role in metastore
2022-08-05T17:40:01,272 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:addAdminUsers_core(861)) - No user is added in admin role, since config is empty
2022-08-05T17:40:01,345 INFO  [main]: conf.HiveConf (HiveConf.java:findConfigFile(187)) - Found configuration file file:/opt/hive/conf/hive-site.xml
2022-08-05T17:40:01,507 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:startMetaStore(8958)) - Starting DB backed MetaStore Server with SetUGI enabled
2022-08-05T17:40:01,511 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:startMetaStore(9030)) - Started the new metaserver on port [9083]...
2022-08-05T17:40:01,511 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:startMetaStore(9032)) - Options.minWorkerThreads = 200
2022-08-05T17:40:01,511 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:startMetaStore(9034)) - Options.maxWorkerThreads = 1000
2022-08-05T17:40:01,511 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:startMetaStore(9036)) - TCP keepalive = true
2022-08-05T17:40:01,511 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:startMetaStore(9037)) - Enable SSL = false
2022-08-05T17:40:20,079 INFO  [pool-6-thread-1]: metastore.HiveMetaStore (HiveMetaStore.java:newRawStoreForConf(717)) - 1: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2022-08-05T17:40:20,080 WARN  [pool-6-thread-1]: metastore.ObjectStore (ObjectStore.java:correctAutoStartMechanism(638)) - datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2022-08-05T17:40:20,080 INFO  [pool-6-thread-1]: metastore.ObjectStore (ObjectStore.java:initializeHelper(481)) - ObjectStore, initialize called
2022-08-05T17:40:20,086 INFO  [pool-6-thread-1]: metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:<init>(186)) - Using direct SQL, underlying DB is POSTGRES
2022-08-05T17:40:20,087 INFO  [pool-6-thread-1]: metastore.ObjectStore (ObjectStore.java:setConf(396)) - Initialized ObjectStore
2022-08-05T17:40:20,169 INFO  [pool-6-thread-1]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 1: source:172.25.0.13 get_config_value: name=metastore.batch.retrieve.max defaultValue=50
2022-08-05T17:40:20,170 INFO  [pool-6-thread-1]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=root	ip=172.25.0.13	cmd=source:172.25.0.13 get_config_value: name=metastore.batch.retrieve.max defaultValue=50
2022-08-05T17:40:22,740 INFO  [main]: conf.HiveConf (HiveConf.java:findConfigFile(187)) - Found configuration file file:/opt/hive/conf/hive-site.xml
2022-08-05T17:40:23,066 INFO  [main]: SessionState (:()) - Hive Session ID = d1c66613-70b0-48d8-8d91-88078c32eb9b
2022-08-05T17:40:23,104 INFO  [main]: SessionState (:()) -
Logging initialized using configuration in file:/opt/hive/conf/hive-log4j2.properties Async: true
2022-08-05T17:40:24,168 INFO  [main]: session.SessionState (SessionState.java:createPath(790)) - Created HDFS directory: /tmp/hive/root/d1c66613-70b0-48d8-8d91-88078c32eb9b
2022-08-05T17:40:24,193 INFO  [main]: session.SessionState (SessionState.java:createPath(790)) - Created local directory: /tmp/root/d1c66613-70b0-48d8-8d91-88078c32eb9b
2022-08-05T17:40:24,196 INFO  [main]: session.SessionState (SessionState.java:createPath(790)) - Created HDFS directory: /tmp/hive/root/d1c66613-70b0-48d8-8d91-88078c32eb9b/_tmp_space.db
2022-08-05T17:40:24,206 INFO  [main]: conf.HiveConf (HiveConf.java:getLogIdVar(5040)) - Using the default value passed in for log id: d1c66613-70b0-48d8-8d91-88078c32eb9b
2022-08-05T17:40:24,206 INFO  [main]: session.SessionState (SessionState.java:updateThreadName(441)) - Updating thread name to d1c66613-70b0-48d8-8d91-88078c32eb9b main
2022-08-05T17:40:25,145 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2022-08-05T17:40:25,164 WARN  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.ObjectStore (:()) - datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2022-08-05T17:40:25,169 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.ObjectStore (:()) - ObjectStore, initialize called
2022-08-05T17:40:25,170 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: conf.MetastoreConf (:()) - Found configuration file file:/opt/hive/conf/hive-site.xml
2022-08-05T17:40:25,171 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: conf.MetastoreConf (:()) - Unable to find config file hivemetastore-site.xml
2022-08-05T17:40:25,171 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: conf.MetastoreConf (:()) - Found configuration file null
2022-08-05T17:40:25,171 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: conf.MetastoreConf (:()) - Unable to find config file metastore-site.xml
2022-08-05T17:40:25,171 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: conf.MetastoreConf (:()) - Found configuration file null
2022-08-05T17:40:25,409 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: hikari.HikariDataSource (:()) - HikariPool-1 - Starting...
2022-08-05T17:40:25,485 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: pool.PoolBase (:()) - HikariPool-1 - Driver does not support get/set network timeout for connections. (Method org.postgresql.jdbc.PgConnection.getNetworkTimeout() is not yet implemented.)
2022-08-05T17:40:25,497 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: hikari.HikariDataSource (:()) - HikariPool-1 - Start completed.
2022-08-05T17:40:25,522 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: hikari.HikariDataSource (:()) - HikariPool-2 - Starting...
2022-08-05T17:40:25,529 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: pool.PoolBase (:()) - HikariPool-2 - Driver does not support get/set network timeout for connections. (Method org.postgresql.jdbc.PgConnection.getNetworkTimeout() is not yet implemented.)
2022-08-05T17:40:25,530 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: hikari.HikariDataSource (:()) - HikariPool-2 - Start completed.
2022-08-05T17:40:25,714 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.ObjectStore (:()) - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2022-08-05T17:40:25,786 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.MetaStoreDirectSql (:()) - Using direct SQL, underlying DB is POSTGRES
2022-08-05T17:40:25,786 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.ObjectStore (:()) - Initialized ObjectStore
2022-08-05T17:40:28,058 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - Added admin role in metastore
2022-08-05T17:40:28,059 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - Added public role in metastore
2022-08-05T17:40:28,072 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - No user is added in admin role, since config is empty
2022-08-05T17:40:28,175 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=root (auth:SIMPLE) retries=1 delay=1 lifetime=0
2022-08-05T17:40:28,192 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: get_all_functions
2022-08-05T17:40:28,193 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_all_functions
2022-08-05T17:40:28,220 INFO  [pool-10-thread-1]: SessionState (:()) - Hive Session ID = d78f4f97-f713-44fb-a989-0bd0df2d98c7
2022-08-05T17:40:28,261 INFO  [pool-10-thread-1]: session.SessionState (SessionState.java:createPath(790)) - Created HDFS directory: /tmp/hive/root/d78f4f97-f713-44fb-a989-0bd0df2d98c7
2022-08-05T17:40:28,264 INFO  [pool-10-thread-1]: session.SessionState (SessionState.java:createPath(790)) - Created local directory: /tmp/root/d78f4f97-f713-44fb-a989-0bd0df2d98c7
2022-08-05T17:40:28,269 INFO  [pool-10-thread-1]: session.SessionState (SessionState.java:createPath(790)) - Created HDFS directory: /tmp/hive/root/d78f4f97-f713-44fb-a989-0bd0df2d98c7/_tmp_space.db
2022-08-05T17:40:28,271 INFO  [pool-10-thread-1]: metastore.HiveMetaStore (:()) - 1: get_databases: @hive#
2022-08-05T17:40:28,271 INFO  [pool-10-thread-1]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_databases: @hive#
2022-08-05T17:40:28,275 INFO  [pool-10-thread-1]: metastore.HiveMetaStore (:()) - 1: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2022-08-05T17:40:28,275 INFO  [pool-10-thread-1]: metastore.ObjectStore (:()) - ObjectStore, initialize called
2022-08-05T17:40:28,292 INFO  [pool-10-thread-1]: metastore.MetaStoreDirectSql (:()) - Using direct SQL, underlying DB is POSTGRES
2022-08-05T17:40:28,293 INFO  [pool-10-thread-1]: metastore.ObjectStore (:()) - Initialized ObjectStore
2022-08-05T17:40:28,302 INFO  [pool-10-thread-1]: metastore.HiveMetaStore (:()) - 1: get_tables_by_type: db=@hive#default pat=.*,type=MATERIALIZED_VIEW
2022-08-05T17:40:28,302 INFO  [pool-10-thread-1]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_tables_by_type: db=@hive#default pat=.*,type=MATERIALIZED_VIEW
2022-08-05T17:40:28,307 INFO  [pool-10-thread-1]: metastore.HiveMetaStore (:()) - 1: get_multi_table : db=default tbls=
2022-08-05T17:40:28,307 INFO  [pool-10-thread-1]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_multi_table : db=default tbls=
2022-08-05T17:40:28,315 INFO  [pool-10-thread-1]: metastore.HiveMetaStore (:()) - 1: get_tables_by_type: db=@hive#test pat=.*,type=MATERIALIZED_VIEW
2022-08-05T17:40:28,315 INFO  [pool-10-thread-1]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_tables_by_type: db=@hive#test pat=.*,type=MATERIALIZED_VIEW
2022-08-05T17:40:28,317 INFO  [pool-10-thread-1]: metastore.HiveMetaStore (:()) - 1: get_multi_table : db=test tbls=
2022-08-05T17:40:28,317 INFO  [pool-10-thread-1]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_multi_table : db=test tbls=
2022-08-05T17:40:28,317 INFO  [pool-10-thread-1]: metadata.HiveMaterializedViewsRegistry (:()) - Materialized views registry has been initialized
2022-08-05T17:40:42,107 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: conf.HiveConf (HiveConf.java:getLogIdVar(5040)) - Using the default value passed in for log id: d1c66613-70b0-48d8-8d91-88078c32eb9b
2022-08-05T17:40:42,170 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ql.Driver (:()) - Compiling command(queryId=root_20220805174042_428d3cd2-b9a8-40ac-9535-cd696c0bc43d): insert into table student values(1,'abc')
2022-08-05T17:40:42,529 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ql.Driver (:()) - Concurrency mode is disabled, not creating a lock manager
2022-08-05T17:40:42,531 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: parse.CalcitePlanner (:()) - Starting Semantic Analysis
2022-08-05T17:40:42,543 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: sqlstd.SQLStdHiveAccessController (:()) - Created SQLStdHiveAccessController for session context : HiveAuthzSessionContext [sessionString=d1c66613-70b0-48d8-8d91-88078c32eb9b, clientType=HIVECLI]
2022-08-05T17:40:42,544 WARN  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: session.SessionState (SessionState.java:setAuthorizerV2Config(950)) - METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
2022-08-05T17:40:42,545 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStoreClient (:()) - Mestastore configuration metastore.filter.hook changed from org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook
2022-08-05T17:40:42,546 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: Cleaning up thread local RawStore...
2022-08-05T17:40:42,546 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=Cleaning up thread local RawStore...
2022-08-05T17:40:42,546 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: Done cleaning up thread local RawStore
2022-08-05T17:40:42,546 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=Done cleaning up thread local RawStore
2022-08-05T17:40:42,548 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2022-08-05T17:40:42,548 WARN  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.ObjectStore (:()) - datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2022-08-05T17:40:42,549 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.ObjectStore (:()) - ObjectStore, initialize called
2022-08-05T17:40:42,555 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.MetaStoreDirectSql (:()) - Using direct SQL, underlying DB is POSTGRES
2022-08-05T17:40:42,555 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.ObjectStore (:()) - Initialized ObjectStore
2022-08-05T17:40:42,556 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=root (auth:SIMPLE) retries=1 delay=1 lifetime=0
2022-08-05T17:40:42,562 WARN  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.ObjectStore (:()) - datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2022-08-05T17:40:42,562 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.ObjectStore (:()) - ObjectStore, initialize called
2022-08-05T17:40:42,564 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.MetaStoreDirectSql (:()) - Using direct SQL, underlying DB is POSTGRES
2022-08-05T17:40:42,564 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.ObjectStore (:()) - Initialized ObjectStore
2022-08-05T17:40:42,565 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=root (auth:SIMPLE) retries=1 delay=1 lifetime=0
2022-08-05T17:40:42,566 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive.default.student
2022-08-05T17:40:42,566 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive.default.student
2022-08-05T17:40:42,669 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: parse.CalcitePlanner (:()) - Completed phase 1 of Semantic Analysis
2022-08-05T17:40:42,669 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: parse.CalcitePlanner (:()) - Get metadata for source tables
2022-08-05T17:40:42,669 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: parse.CalcitePlanner (:()) - Get metadata for subqueries
2022-08-05T17:40:42,675 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: parse.CalcitePlanner (:()) - Get metadata for destination tables
2022-08-05T17:40:42,676 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive.default.student
2022-08-05T17:40:42,676 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive.default.student
2022-08-05T17:40:42,694 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: parse.CalcitePlanner (:()) - Completed getting MetaData in Semantic Analysis
2022-08-05T17:40:43,594 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ql.Context (:()) - New scratch dir is hdfs://namenode:9000/tmp/hive/root/d1c66613-70b0-48d8-8d91-88078c32eb9b/hive_2022-08-05_17-40-42_185_3531394714000146511-1
2022-08-05T17:40:43,678 INFO  [Thread-9]: sasl.SaslDataTransferClient (:()) - SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2022-08-05T17:40:43,784 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: get_not_null_constraints : tbl=hive._dummy_database._dummy_table
2022-08-05T17:40:43,784 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_not_null_constraints : tbl=hive._dummy_database._dummy_table
2022-08-05T17:40:43,790 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: get_primary_keys : tbl=hive._dummy_database._dummy_table
2022-08-05T17:40:43,790 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_primary_keys : tbl=hive._dummy_database._dummy_table
2022-08-05T17:40:43,796 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: get_primary_keys : tbl=hive._dummy_database._dummy_table
2022-08-05T17:40:43,796 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_primary_keys : tbl=hive._dummy_database._dummy_table
2022-08-05T17:40:43,798 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: get_unique_constraints : tbl=hive._dummy_database._dummy_table
2022-08-05T17:40:43,798 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_unique_constraints : tbl=hive._dummy_database._dummy_table
2022-08-05T17:40:43,802 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: get_foreign_keys : parentdb=null parenttbl=null foreigndb=_dummy_database foreigntbl=_dummy_table
2022-08-05T17:40:43,802 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_foreign_keys : parentdb=null parenttbl=null foreigndb=_dummy_database foreigntbl=_dummy_table
2022-08-05T17:40:44,271 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: get_databases: @hive#
2022-08-05T17:40:44,271 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_databases: @hive#
2022-08-05T17:40:44,275 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: get_materialized_views_for_rewriting: db=@hive#default
2022-08-05T17:40:44,275 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_materialized_views_for_rewriting: db=@hive#default
2022-08-05T17:40:44,279 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: get_materialized_views_for_rewriting: db=@hive#test
2022-08-05T17:40:44,279 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_materialized_views_for_rewriting: db=@hive#test
2022-08-05T17:40:44,305 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: parse.CalcitePlanner (:()) - Get metadata for source tables
2022-08-05T17:40:44,306 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: parse.CalcitePlanner (:()) - Get metadata for subqueries
2022-08-05T17:40:44,306 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: parse.CalcitePlanner (:()) - Get metadata for source tables
2022-08-05T17:40:44,306 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive._dummy_database._dummy_table
2022-08-05T17:40:44,306 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive._dummy_database._dummy_table
2022-08-05T17:40:44,308 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: parse.CalcitePlanner (:()) - Get metadata for subqueries
2022-08-05T17:40:44,308 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: parse.CalcitePlanner (:()) - Get metadata for destination tables
2022-08-05T17:40:44,312 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: parse.CalcitePlanner (:()) - Get metadata for destination tables
2022-08-05T17:40:44,312 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive.default.student
2022-08-05T17:40:44,312 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive.default.student
2022-08-05T17:40:44,325 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ql.Context (:()) - New scratch dir is hdfs://namenode:9000/tmp/hive/root/d1c66613-70b0-48d8-8d91-88078c32eb9b/hive_2022-08-05_17-40-42_185_3531394714000146511-1
2022-08-05T17:40:44,355 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: common.FileUtils (FileUtils.java:mkdir(580)) - Creating directory if it doesn't exist: hdfs://namenode:9000/user/hive/warehouse/student/.hive-staging_hive_2022-08-05_17-40-42_185_3531394714000146511-1
2022-08-05T17:40:44,390 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: get_not_null_constraints : tbl=hive.default.student
2022-08-05T17:40:44,390 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_not_null_constraints : tbl=hive.default.student
2022-08-05T17:40:44,393 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: get_check_constraints : tbl=hive.default.student
2022-08-05T17:40:44,393 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_check_constraints : tbl=hive.default.student
2022-08-05T17:40:44,405 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive.default.student
2022-08-05T17:40:44,405 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive.default.student
2022-08-05T17:40:44,418 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: parse.CalcitePlanner (:()) - Generate an operator pipeline to autogather column stats for table default.student in query insert into table student values(1,'abc')
2022-08-05T17:40:44,422 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStoreClient (:()) - Mestastore configuration metastore.filter.hook changed from org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook to org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl
2022-08-05T17:40:44,422 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: Cleaning up thread local RawStore...
2022-08-05T17:40:44,422 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=Cleaning up thread local RawStore...
2022-08-05T17:40:44,422 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: Done cleaning up thread local RawStore
2022-08-05T17:40:44,422 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=Done cleaning up thread local RawStore
2022-08-05T17:40:44,424 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2022-08-05T17:40:44,424 WARN  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.ObjectStore (:()) - datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2022-08-05T17:40:44,424 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.ObjectStore (:()) - ObjectStore, initialize called
2022-08-05T17:40:44,426 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.MetaStoreDirectSql (:()) - Using direct SQL, underlying DB is POSTGRES
2022-08-05T17:40:44,426 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.ObjectStore (:()) - Initialized ObjectStore
2022-08-05T17:40:44,427 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=root (auth:SIMPLE) retries=1 delay=1 lifetime=0
2022-08-05T17:40:44,427 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive.default.student
2022-08-05T17:40:44,427 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive.default.student
2022-08-05T17:40:44,444 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: parse.CalcitePlanner (:()) - Get metadata for source tables
2022-08-05T17:40:44,444 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive.default.student
2022-08-05T17:40:44,444 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive.default.student
2022-08-05T17:40:44,455 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: parse.CalcitePlanner (:()) - Get metadata for subqueries
2022-08-05T17:40:44,455 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: parse.CalcitePlanner (:()) - Get metadata for destination tables
2022-08-05T17:40:44,468 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ql.Context (:()) - New scratch dir is hdfs://namenode:9000/tmp/hive/root/d1c66613-70b0-48d8-8d91-88078c32eb9b/hive_2022-08-05_17-40-44_418_5005256304731580716-1
2022-08-05T17:40:44,479 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: common.FileUtils (FileUtils.java:mkdir(580)) - Creating directory if it doesn't exist: hdfs://namenode:9000/tmp/hive/root/d1c66613-70b0-48d8-8d91-88078c32eb9b/hive_2022-08-05_17-40-44_418_5005256304731580716-1/-mr-10000/.hive-staging_hive_2022-08-05_17-40-44_418_5005256304731580716-1
2022-08-05T17:40:44,482 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: parse.CalcitePlanner (:()) - CBO Succeeded; optimized logical plan.
2022-08-05T17:40:44,499 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ppd.OpProcFactory (:()) - Processing for FS(4)
2022-08-05T17:40:44,500 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ppd.OpProcFactory (:()) - Processing for FS(11)
2022-08-05T17:40:44,500 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ppd.OpProcFactory (:()) - Processing for SEL(10)
2022-08-05T17:40:44,500 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ppd.OpProcFactory (:()) - Processing for GBY(9)
2022-08-05T17:40:44,500 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ppd.OpProcFactory (:()) - Processing for RS(8)
2022-08-05T17:40:44,500 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ppd.OpProcFactory (:()) - Processing for GBY(7)
2022-08-05T17:40:44,500 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ppd.OpProcFactory (:()) - Processing for SEL(6)
2022-08-05T17:40:44,500 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ppd.OpProcFactory (:()) - Processing for SEL(3)
2022-08-05T17:40:44,500 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ppd.OpProcFactory (:()) - Processing for UDTF(2)
2022-08-05T17:40:44,500 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ppd.OpProcFactory (:()) - Processing for SEL(1)
2022-08-05T17:40:44,500 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ppd.OpProcFactory (:()) - Processing for TS(0)
2022-08-05T17:40:44,513 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: optimizer.ColumnPrunerProcFactory (:()) - RS 8 oldColExprMap: {VALUE._col0=Column[_col0], VALUE._col1=Column[_col1]}
2022-08-05T17:40:44,513 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: optimizer.ColumnPrunerProcFactory (:()) - RS 8 newColExprMap: {VALUE._col0=Column[_col0], VALUE._col1=Column[_col1]}
2022-08-05T17:40:44,564 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: spark.SetSparkReducerParallelism (:()) - Number of reducers for sink RS[8] was already determined to be: 1
2022-08-05T17:40:44,588 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive.default.student
2022-08-05T17:40:44,589 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive.default.student
2022-08-05T17:40:44,624 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: physical.Vectorizer (:()) - Examining input format to see if vectorization is enabled.
2022-08-05T17:40:44,629 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: physical.Vectorizer (:()) - Map vectorization enabled: false
2022-08-05T17:40:44,629 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: physical.Vectorizer (:()) - Map vectorized: false
2022-08-05T17:40:44,629 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: physical.Vectorizer (:()) - Map vectorizedVertexNum: 0
2022-08-05T17:40:44,629 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: physical.Vectorizer (:()) - Map enabledConditionsMet: [hive.vectorized.use.vectorized.input.format IS true]
2022-08-05T17:40:44,629 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: physical.Vectorizer (:()) - Map enabledConditionsNotMet: [Could not enable vectorization due to partition column names size 1 is greater than the number of table column names size 0 IS false]
2022-08-05T17:40:44,629 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: physical.Vectorizer (:()) - Map inputFileFormatClassNameSet: [org.apache.hadoop.hive.ql.io.NullRowsInputFormat]
2022-08-05T17:40:44,638 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: physical.Vectorizer (:()) - Validating and vectorizing ReduceWork... (vectorizedVertexNum 1)
2022-08-05T17:40:44,649 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: physical.Vectorizer (:()) - Reduce vectorization enabled: true
2022-08-05T17:40:44,649 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: physical.Vectorizer (:()) - Reduce vectorized: false
2022-08-05T17:40:44,649 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: physical.Vectorizer (:()) - Reduce notVectorizedReason: Aggregation Function expression for GROUPBY operator: UDF compute_stats not supported
2022-08-05T17:40:44,649 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: physical.Vectorizer (:()) - Reduce vectorizedVertexNum: 1
2022-08-05T17:40:44,650 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: physical.Vectorizer (:()) - Reducer hive.vectorized.execution.reduce.enabled: true
2022-08-05T17:40:44,650 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: physical.Vectorizer (:()) - Reducer engine: spark
2022-08-05T17:40:44,651 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: common.HiveStatsUtils (:()) - Error requested is 20.0%
2022-08-05T17:40:44,651 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: common.HiveStatsUtils (:()) - Choosing 16 bit vectors..
2022-08-05T17:40:44,655 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: parse.CalcitePlanner (:()) - Completed plan generation
2022-08-05T17:40:44,655 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ql.Driver (:()) - Semantic Analysis Completed (retrial = false)
2022-08-05T17:40:44,656 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ql.Driver (:()) - Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:col1, type:int, comment:null), FieldSchema(name:col2, type:string, comment:null)], properties:null)
2022-08-05T17:40:44,660 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ql.Driver (:()) - Completed compiling command(queryId=root_20220805174042_428d3cd2-b9a8-40ac-9535-cd696c0bc43d); Time taken: 2.511 seconds
2022-08-05T17:40:44,660 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: reexec.ReExecDriver (:()) - Execution #1 of query
2022-08-05T17:40:44,660 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ql.Driver (:()) - Concurrency mode is disabled, not creating a lock manager
2022-08-05T17:40:44,661 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ql.Driver (:()) - Executing command(queryId=root_20220805174042_428d3cd2-b9a8-40ac-9535-cd696c0bc43d): insert into table student values(1,'abc')
2022-08-05T17:40:44,662 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ql.Driver (:()) - Query ID = root_20220805174042_428d3cd2-b9a8-40ac-9535-cd696c0bc43d
2022-08-05T17:40:44,662 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ql.Driver (:()) - Total jobs = 1
2022-08-05T17:40:44,670 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ql.Driver (:()) - Launching Job 1 out of 1
2022-08-05T17:40:44,671 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ql.Driver (:()) - Starting task [Stage-1:MAPRED] in serial mode
2022-08-05T17:40:44,671 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: spark.SparkTask (:()) - In order to change the average load for a reducer (in bytes):
2022-08-05T17:40:44,671 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: spark.SparkTask (:()) -   set hive.exec.reducers.bytes.per.reducer=<number>
2022-08-05T17:40:44,671 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: spark.SparkTask (:()) - In order to limit the maximum number of reducers:
2022-08-05T17:40:44,671 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: spark.SparkTask (:()) -   set hive.exec.reducers.max=<number>
2022-08-05T17:40:44,671 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: spark.SparkTask (:()) - In order to set a constant number of reducers:
2022-08-05T17:40:44,671 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: spark.SparkTask (:()) -   set mapreduce.job.reduces=<number>
2022-08-05T17:40:44,681 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: session.SparkSessionManagerImpl (:()) - Setting up the session manager.
2022-08-05T17:40:44,865 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: session.SparkSession (:()) - Trying to open Spark session cbdaa3a9-9b79-4ce5-a4ca-5401ce4ddef7
2022-08-05T17:40:45,332 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - No spark.home provided, calling SparkSubmit directly.
2022-08-05T17:40:45,351 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - Running client driver with argv: /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java org.apache.spark.deploy.SparkSubmit --properties-file /tmp/spark-submit.3829110648136000580.properties --class org.apache.hive.spark.client.RemoteDriver /opt/hive/lib/hive-exec-3.1.2.jar --remote-host hive-metastore --remote-port 35955 --conf hive.spark.client.connect.timeout=10000 --conf hive.spark.client.server.connect.timeout=90000 --conf hive.spark.client.channel.log.level=null --conf hive.spark.client.rpc.max.size=52428800 --conf hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256 --conf hive.spark.client.rpc.server.address=null
2022-08-05T17:40:46,525 INFO  [RemoteDriver-stderr-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
2022-08-05T17:40:46,526 INFO  [RemoteDriver-stderr-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
2022-08-05T17:40:46,526 INFO  [RemoteDriver-stderr-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
2022-08-05T17:40:46,526 INFO  [RemoteDriver-stderr-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
2022-08-05T17:40:46,526 INFO  [RemoteDriver-stderr-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
2022-08-05T17:40:46,958 INFO  [RemoteDriver-stdout-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 2022-08-05T17:40:46,956 WARN [main] org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2022-08-05T17:40:47,013 INFO  [RemoteDriver-stderr-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
2022-08-05T17:40:47,013 INFO  [RemoteDriver-stdout-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 2022-08-05T17:40:47,013 INFO [main] org.apache.spark.SecurityManager - Changing view acls to: root
2022-08-05T17:40:47,013 INFO  [RemoteDriver-stdout-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 2022-08-05T17:40:47,013 INFO [main] org.apache.spark.SecurityManager - Changing modify acls to: root
2022-08-05T17:40:47,014 INFO  [RemoteDriver-stdout-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 2022-08-05T17:40:47,014 INFO [main] org.apache.spark.SecurityManager - Changing view acls groups to:
2022-08-05T17:40:47,014 INFO  [RemoteDriver-stdout-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 2022-08-05T17:40:47,014 INFO [main] org.apache.spark.SecurityManager - Changing modify acls groups to:
2022-08-05T17:40:47,014 INFO  [RemoteDriver-stdout-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 2022-08-05T17:40:47,014 INFO [main] org.apache.spark.SecurityManager - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
2022-08-05T17:40:47,291 INFO  [RemoteDriver-stdout-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 2022-08-05T17:40:47,290 INFO [main] org.apache.spark.util.Utils - Successfully started service 'driverClient' on port 37987.
2022-08-05T17:40:47,357 INFO  [RemoteDriver-stdout-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 2022-08-05T17:40:47,357 INFO [netty-rpc-connection-0] org.apache.spark.network.client.TransportClientFactory - Successfully created connection to spark-master-hp/172.25.0.3:7077 after 40 ms (0 ms spent in bootstraps)
2022-08-05T17:40:47,507 INFO  [RemoteDriver-stdout-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 2022-08-05T17:40:47,507 INFO [dispatcher-event-loop-2] org.apache.spark.deploy.ClientEndpoint - ... waiting before polling master for driver state
2022-08-05T17:40:47,846 INFO  [RemoteDriver-stdout-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 2022-08-05T17:40:47,845 INFO [dispatcher-event-loop-3] org.apache.spark.deploy.ClientEndpoint - Driver successfully submitted as driver-20220805174047-0000
2022-08-05T17:40:52,588 INFO  [RemoteDriver-stdout-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 2022-08-05T17:40:52,588 INFO [dispatcher-event-loop-4] org.apache.spark.deploy.ClientEndpoint - State of driver-20220805174047-0000 is ERROR
2022-08-05T17:40:52,589 INFO  [RemoteDriver-stdout-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 2022-08-05T17:40:52,589 ERROR [dispatcher-event-loop-4] org.apache.spark.deploy.ClientEndpoint - Exception from cluster was: java.nio.file.NoSuchFileException: /opt/hive/lib/hive-exec-3.1.2.jar
2022-08-05T17:40:52,590 INFO  [RemoteDriver-stderr-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - java.nio.file.NoSuchFileException: /opt/hive/lib/hive-exec-3.1.2.jar
2022-08-05T17:40:52,590 INFO  [RemoteDriver-stderr-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
2022-08-05T17:40:52,590 INFO  [RemoteDriver-stderr-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
2022-08-05T17:40:52,590 INFO  [RemoteDriver-stderr-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
2022-08-05T17:40:52,590 INFO  [RemoteDriver-stderr-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 	at sun.nio.fs.UnixCopyFile.copy(UnixCopyFile.java:526)
2022-08-05T17:40:52,590 INFO  [RemoteDriver-stderr-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 	at sun.nio.fs.UnixFileSystemProvider.copy(UnixFileSystemProvider.java:253)
2022-08-05T17:40:52,590 INFO  [RemoteDriver-stderr-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 	at java.nio.file.Files.copy(Files.java:1274)
2022-08-05T17:40:52,590 INFO  [RemoteDriver-stderr-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 	at org.apache.spark.util.Utils$.copyRecursive(Utils.scala:726)
2022-08-05T17:40:52,590 INFO  [RemoteDriver-stderr-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 	at org.apache.spark.util.Utils$.copyFile(Utils.scala:697)
2022-08-05T17:40:52,590 INFO  [RemoteDriver-stderr-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 	at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:771)
2022-08-05T17:40:52,590 INFO  [RemoteDriver-stderr-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 	at org.apache.spark.util.Utils$.fetchFile(Utils.scala:541)
2022-08-05T17:40:52,590 INFO  [RemoteDriver-stderr-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 	at org.apache.spark.deploy.worker.DriverRunner.downloadUserJar(DriverRunner.scala:162)
2022-08-05T17:40:52,590 INFO  [RemoteDriver-stderr-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 	at org.apache.spark.deploy.worker.DriverRunner.prepareAndRunDriver(DriverRunner.scala:180)
2022-08-05T17:40:52,590 INFO  [RemoteDriver-stderr-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 	at org.apache.spark.deploy.worker.DriverRunner$$anon$2.run(DriverRunner.scala:99)
2022-08-05T17:40:52,619 INFO  [RemoteDriver-stdout-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 2022-08-05T17:40:52,618 INFO [shutdown-hook-0] org.apache.spark.util.ShutdownHookManager - Shutdown hook called
2022-08-05T17:40:52,620 INFO  [RemoteDriver-stdout-redir-d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - 2022-08-05T17:40:52,620 INFO [shutdown-hook-0] org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-24474cf1-b43a-422b-970c-6742c4cac3b8
2022-08-05T17:40:52,994 WARN  [Driver]: client.SparkClientImpl (:()) - Child process exited with code 255
2022-08-05T17:40:52,997 ERROR [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: client.SparkClientImpl (:()) - Error while waiting for client to connect.
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client 'cbdaa3a9-9b79-4ce5-a4ca-5401ce4ddef7'. Error: Child process (spark-submit) exited before connecting back with error log Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
java.nio.file.NoSuchFileException: /opt/hive/lib/hive-exec-3.1.2.jar
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
	at sun.nio.fs.UnixCopyFile.copy(UnixCopyFile.java:526)
	at sun.nio.fs.UnixFileSystemProvider.copy(UnixFileSystemProvider.java:253)
	at java.nio.file.Files.copy(Files.java:1274)
	at org.apache.spark.util.Utils$.copyRecursive(Utils.scala:726)
	at org.apache.spark.util.Utils$.copyFile(Utils.scala:697)
	at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:771)
	at org.apache.spark.util.Utils$.fetchFile(Utils.scala:541)
	at org.apache.spark.deploy.worker.DriverRunner.downloadUserJar(DriverRunner.scala:162)
	at org.apache.spark.deploy.worker.DriverRunner.prepareAndRunDriver(DriverRunner.scala:180)
	at org.apache.spark.deploy.worker.DriverRunner$$anon$2.run(DriverRunner.scala:99)

	at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:41) ~[netty-all-4.1.17.Final.jar:4.1.17.Final]
	at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:106) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:88) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:105) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:101) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:76) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:87) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:115) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683) ~[hive-cli-3.1.2.jar:3.1.2]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_232]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_232]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_232]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_232]
	at org.apache.hadoop.util.RunJar.run(RunJar.java:323) ~[hadoop-common-3.2.1.jar:?]
	at org.apache.hadoop.util.RunJar.main(RunJar.java:236) ~[hadoop-common-3.2.1.jar:?]
Caused by: java.lang.RuntimeException: Cancel client 'cbdaa3a9-9b79-4ce5-a4ca-5401ce4ddef7'. Error: Child process (spark-submit) exited before connecting back with error log Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
java.nio.file.NoSuchFileException: /opt/hive/lib/hive-exec-3.1.2.jar
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
	at sun.nio.fs.UnixCopyFile.copy(UnixCopyFile.java:526)
	at sun.nio.fs.UnixFileSystemProvider.copy(UnixFileSystemProvider.java:253)
	at java.nio.file.Files.copy(Files.java:1274)
	at org.apache.spark.util.Utils$.copyRecursive(Utils.scala:726)
	at org.apache.spark.util.Utils$.copyFile(Utils.scala:697)
	at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:771)
	at org.apache.spark.util.Utils$.fetchFile(Utils.scala:541)
	at org.apache.spark.deploy.worker.DriverRunner.downloadUserJar(DriverRunner.scala:162)
	at org.apache.spark.deploy.worker.DriverRunner.prepareAndRunDriver(DriverRunner.scala:180)
	at org.apache.spark.deploy.worker.DriverRunner$$anon$2.run(DriverRunner.scala:99)

	at org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:211) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hive.spark.client.SparkClientImpl$2.run(SparkClientImpl.java:491) ~[hive-exec-3.1.2.jar:3.1.2]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_232]
2022-08-05T17:40:53,028 ERROR [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: spark.SparkTask (:()) - Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create Spark client for Spark session cbdaa3a9-9b79-4ce5-a4ca-5401ce4ddef7)'
org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create Spark client for Spark session cbdaa3a9-9b79-4ce5-a4ca-5401ce4ddef7
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.getHiveException(SparkSessionImpl.java:215)
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:92)
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115)
	at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136)
	at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:115)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703)
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157)
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218)
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239)
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402)
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821)
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client 'cbdaa3a9-9b79-4ce5-a4ca-5401ce4ddef7'. Error: Child process (spark-submit) exited before connecting back with error log Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
java.nio.file.NoSuchFileException: /opt/hive/lib/hive-exec-3.1.2.jar
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
	at sun.nio.fs.UnixCopyFile.copy(UnixCopyFile.java:526)
	at sun.nio.fs.UnixFileSystemProvider.copy(UnixFileSystemProvider.java:253)
	at java.nio.file.Files.copy(Files.java:1274)
	at org.apache.spark.util.Utils$.copyRecursive(Utils.scala:726)
	at org.apache.spark.util.Utils$.copyFile(Utils.scala:697)
	at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:771)
	at org.apache.spark.util.Utils$.fetchFile(Utils.scala:541)
	at org.apache.spark.deploy.worker.DriverRunner.downloadUserJar(DriverRunner.scala:162)
	at org.apache.spark.deploy.worker.DriverRunner.prepareAndRunDriver(DriverRunner.scala:180)
	at org.apache.spark.deploy.worker.DriverRunner$$anon$2.run(DriverRunner.scala:99)

	at com.google.common.base.Throwables.propagate(Throwables.java:241)
	at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:128)
	at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:88)
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:105)
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:101)
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:76)
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:87)
	... 24 more
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client 'cbdaa3a9-9b79-4ce5-a4ca-5401ce4ddef7'. Error: Child process (spark-submit) exited before connecting back with error log Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
java.nio.file.NoSuchFileException: /opt/hive/lib/hive-exec-3.1.2.jar
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
	at sun.nio.fs.UnixCopyFile.copy(UnixCopyFile.java:526)
	at sun.nio.fs.UnixFileSystemProvider.copy(UnixFileSystemProvider.java:253)
	at java.nio.file.Files.copy(Files.java:1274)
	at org.apache.spark.util.Utils$.copyRecursive(Utils.scala:726)
	at org.apache.spark.util.Utils$.copyFile(Utils.scala:697)
	at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:771)
	at org.apache.spark.util.Utils$.fetchFile(Utils.scala:541)
	at org.apache.spark.deploy.worker.DriverRunner.downloadUserJar(DriverRunner.scala:162)
	at org.apache.spark.deploy.worker.DriverRunner.prepareAndRunDriver(DriverRunner.scala:180)
	at org.apache.spark.deploy.worker.DriverRunner$$anon$2.run(DriverRunner.scala:99)

	at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:41)
	at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:106)
	... 29 more
Caused by: java.lang.RuntimeException: Cancel client 'cbdaa3a9-9b79-4ce5-a4ca-5401ce4ddef7'. Error: Child process (spark-submit) exited before connecting back with error log Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
java.nio.file.NoSuchFileException: /opt/hive/lib/hive-exec-3.1.2.jar
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
	at sun.nio.fs.UnixCopyFile.copy(UnixCopyFile.java:526)
	at sun.nio.fs.UnixFileSystemProvider.copy(UnixFileSystemProvider.java:253)
	at java.nio.file.Files.copy(Files.java:1274)
	at org.apache.spark.util.Utils$.copyRecursive(Utils.scala:726)
	at org.apache.spark.util.Utils$.copyFile(Utils.scala:697)
	at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:771)
	at org.apache.spark.util.Utils$.fetchFile(Utils.scala:541)
	at org.apache.spark.deploy.worker.DriverRunner.downloadUserJar(DriverRunner.scala:162)
	at org.apache.spark.deploy.worker.DriverRunner.prepareAndRunDriver(DriverRunner.scala:180)
	at org.apache.spark.deploy.worker.DriverRunner$$anon$2.run(DriverRunner.scala:99)

	at org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:211)
	at org.apache.hive.spark.client.SparkClientImpl$2.run(SparkClientImpl.java:491)
	at java.lang.Thread.run(Thread.java:748)

2022-08-05T17:40:53,028 ERROR [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: spark.SparkTask (:()) - Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create Spark client for Spark session cbdaa3a9-9b79-4ce5-a4ca-5401ce4ddef7)'
org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create Spark client for Spark session cbdaa3a9-9b79-4ce5-a4ca-5401ce4ddef7
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.getHiveException(SparkSessionImpl.java:215) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:92) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:115) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) ~[hive-cli-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683) ~[hive-cli-3.1.2.jar:3.1.2]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_232]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_232]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_232]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_232]
	at org.apache.hadoop.util.RunJar.run(RunJar.java:323) ~[hadoop-common-3.2.1.jar:?]
	at org.apache.hadoop.util.RunJar.main(RunJar.java:236) ~[hadoop-common-3.2.1.jar:?]
Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client 'cbdaa3a9-9b79-4ce5-a4ca-5401ce4ddef7'. Error: Child process (spark-submit) exited before connecting back with error log Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
java.nio.file.NoSuchFileException: /opt/hive/lib/hive-exec-3.1.2.jar
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
	at sun.nio.fs.UnixCopyFile.copy(UnixCopyFile.java:526)
	at sun.nio.fs.UnixFileSystemProvider.copy(UnixFileSystemProvider.java:253)
	at java.nio.file.Files.copy(Files.java:1274)
	at org.apache.spark.util.Utils$.copyRecursive(Utils.scala:726)
	at org.apache.spark.util.Utils$.copyFile(Utils.scala:697)
	at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:771)
	at org.apache.spark.util.Utils$.fetchFile(Utils.scala:541)
	at org.apache.spark.deploy.worker.DriverRunner.downloadUserJar(DriverRunner.scala:162)
	at org.apache.spark.deploy.worker.DriverRunner.prepareAndRunDriver(DriverRunner.scala:180)
	at org.apache.spark.deploy.worker.DriverRunner$$anon$2.run(DriverRunner.scala:99)

	at com.google.common.base.Throwables.propagate(Throwables.java:241) ~[guava-27.0-jre.jar:?]
	at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:128) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:88) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:105) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:101) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:76) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:87) ~[hive-exec-3.1.2.jar:3.1.2]
	... 24 more
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client 'cbdaa3a9-9b79-4ce5-a4ca-5401ce4ddef7'. Error: Child process (spark-submit) exited before connecting back with error log Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
java.nio.file.NoSuchFileException: /opt/hive/lib/hive-exec-3.1.2.jar
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
	at sun.nio.fs.UnixCopyFile.copy(UnixCopyFile.java:526)
	at sun.nio.fs.UnixFileSystemProvider.copy(UnixFileSystemProvider.java:253)
	at java.nio.file.Files.copy(Files.java:1274)
	at org.apache.spark.util.Utils$.copyRecursive(Utils.scala:726)
	at org.apache.spark.util.Utils$.copyFile(Utils.scala:697)
	at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:771)
	at org.apache.spark.util.Utils$.fetchFile(Utils.scala:541)
	at org.apache.spark.deploy.worker.DriverRunner.downloadUserJar(DriverRunner.scala:162)
	at org.apache.spark.deploy.worker.DriverRunner.prepareAndRunDriver(DriverRunner.scala:180)
	at org.apache.spark.deploy.worker.DriverRunner$$anon$2.run(DriverRunner.scala:99)

	at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:41) ~[netty-all-4.1.17.Final.jar:4.1.17.Final]
	at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:106) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:88) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:105) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:101) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:76) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:87) ~[hive-exec-3.1.2.jar:3.1.2]
	... 24 more
Caused by: java.lang.RuntimeException: Cancel client 'cbdaa3a9-9b79-4ce5-a4ca-5401ce4ddef7'. Error: Child process (spark-submit) exited before connecting back with error log Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
java.nio.file.NoSuchFileException: /opt/hive/lib/hive-exec-3.1.2.jar
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
	at sun.nio.fs.UnixCopyFile.copy(UnixCopyFile.java:526)
	at sun.nio.fs.UnixFileSystemProvider.copy(UnixFileSystemProvider.java:253)
	at java.nio.file.Files.copy(Files.java:1274)
	at org.apache.spark.util.Utils$.copyRecursive(Utils.scala:726)
	at org.apache.spark.util.Utils$.copyFile(Utils.scala:697)
	at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:771)
	at org.apache.spark.util.Utils$.fetchFile(Utils.scala:541)
	at org.apache.spark.deploy.worker.DriverRunner.downloadUserJar(DriverRunner.scala:162)
	at org.apache.spark.deploy.worker.DriverRunner.prepareAndRunDriver(DriverRunner.scala:180)
	at org.apache.spark.deploy.worker.DriverRunner$$anon$2.run(DriverRunner.scala:99)

	at org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:211) ~[hive-exec-3.1.2.jar:3.1.2]
	at org.apache.hive.spark.client.SparkClientImpl$2.run(SparkClientImpl.java:491) ~[hive-exec-3.1.2.jar:3.1.2]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_232]
2022-08-05T17:40:53,029 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: reexec.ReOptimizePlugin (:()) - ReOptimization: retryPossible: false
2022-08-05T17:40:53,029 ERROR [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ql.Driver (:()) - FAILED: Execution Error, return code 30041 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Failed to create Spark client for Spark session cbdaa3a9-9b79-4ce5-a4ca-5401ce4ddef7
2022-08-05T17:40:53,029 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ql.Driver (:()) - Completed executing command(queryId=root_20220805174042_428d3cd2-b9a8-40ac-9535-cd696c0bc43d); Time taken: 8.369 seconds
2022-08-05T17:40:53,029 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: ql.Driver (:()) - Concurrency mode is disabled, not creating a lock manager
2022-08-05T17:40:53,056 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: conf.HiveConf (HiveConf.java:getLogIdVar(5040)) - Using the default value passed in for log id: d1c66613-70b0-48d8-8d91-88078c32eb9b
2022-08-05T17:40:53,056 INFO  [d1c66613-70b0-48d8-8d91-88078c32eb9b main]: session.SessionState (SessionState.java:resetThreadName(452)) - Resetting thread name to  main
2022-08-05T17:41:19,050 INFO  [pool-6-thread-1]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 1: source:172.25.0.13 get_config_value: name=metastore.batch.retrieve.max defaultValue=50
2022-08-05T17:41:19,051 INFO  [pool-6-thread-1]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=root	ip=172.25.0.13	cmd=source:172.25.0.13 get_config_value: name=metastore.batch.retrieve.max defaultValue=50
2022-08-05T17:42:06,330 INFO  [shutdown-hook-0]: session.SparkSessionManagerImpl (:()) - Closing the session manager.
Submission parameters

client.SparkClientImpl (:()) - Running client driver with argv:
/usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java
org.apache.spark.deploy.SparkSubmit
--properties-file /tmp/spark-submit.3829110648136000580.properties
--class org.apache.hive.spark.client.RemoteDriver /opt/hive/lib/hive-exec-3.1.2.jar
--remote-host hive-metastore
--remote-port 35955
--conf hive.spark.client.connect.timeout=10000
--conf hive.spark.client.server.connect.timeout=90000
--conf hive.spark.client.channel.log.level=null
--conf hive.spark.client.rpc.max.size=52428800
--conf hive.spark.client.rpc.threads=8
--conf hive.spark.client.secret.bits=256
--conf hive.spark.client.rpc.server.address=null
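
Reading these parameters against the stack trace above: Hive submits org.apache.hive.spark.client.RemoteDriver to the standalone master in cluster deploy mode, and the DriverRunner on whichever worker is chosen then tries to copy /opt/hive/lib/hive-exec-3.1.2.jar as a local file on that worker (see downloadUserJar in the trace). That path exists only on the Hive host, hence the NoSuchFileException. One possible workaround, untested in this setup: place the jar at the same absolute path on every Spark worker. The worker hostnames below are placeholders.

# placeholder hostnames; substitute the real Spark worker nodes
for w in spark-worker-1 spark-worker-2; do
  ssh "$w" mkdir -p /opt/hive/lib
  scp /opt/hive/lib/hive-exec-3.1.2.jar "$w":/opt/hive/lib/
done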

TODO: not resolved yet; still suspected to be a version-incompatibility problem.

spark java.lang.NoClassDefFoundError: org/apache/spark/AccumulatorParam
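
This error is consistent with the version-incompatibility suspicion: org.apache.spark.AccumulatorParam belongs to the old accumulator API, which was deprecated in Spark 2.0 and removed in Spark 3.0, while Hive 3.1.2's Spark client code was built against Spark 2.x and still references it. The class simply is not present in the Spark 3.1.2 jars copied into $HIVE_HOME/lib. A quick check (paths assume the jar copies done earlier):

unzip -l $SPARK_HOME/jars/spark-core_2.12-3.1.2.jar | grep AccumulatorParam
# no output: the class was removed in Spark 3.0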

2022-08-06T03:41:19,048 INFO  [pool-6-thread-1]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(895)) - 1: source:172.25.0.13 get_config_value: name=metastore.batch.retrieve.max defaultValue=50
2022-08-06T03:41:19,049 INFO  [pool-6-thread-1]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(347)) - ugi=root	ip=172.25.0.13	cmd=source:172.25.0.13 get_config_value: name=metastore.batch.retrieve.max defaultValue=50
[... the same pair of metastore heartbeat lines repeats once per minute until 03:55:19 and is omitted here ...]
2022-08-06T03:55:26,284 INFO  [main]: conf.HiveConf (HiveConf.java:getLogIdVar(5040)) - Using the default value passed in for log id: 37c29c4b-6e5f-4069-ba1e-92a89299277d
2022-08-06T03:55:26,285 INFO  [main]: session.SessionState (SessionState.java:updateThreadName(441)) - Updating thread name to 37c29c4b-6e5f-4069-ba1e-92a89299277d main
2022-08-06T03:55:26,292 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ql.Driver (:()) - Compiling command(queryId=root_20220806035526_cfc53157-3a0a-4515-94d1-633ca3664491): insert into table student values(1,'abc')
2022-08-06T03:55:26,310 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStoreClient (:()) - Mestastore configuration metastore.filter.hook changed from org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook
2022-08-06T03:55:26,310 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStore (:()) - 0: Cleaning up thread local RawStore...
2022-08-06T03:55:26,310 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=Cleaning up thread local RawStore...
2022-08-06T03:55:26,310 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStore (:()) - 0: Done cleaning up thread local RawStore
2022-08-06T03:55:26,310 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=Done cleaning up thread local RawStore
2022-08-06T03:55:26,310 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ql.Driver (:()) - Concurrency mode is disabled, not creating a lock manager
2022-08-06T03:55:26,310 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: parse.CalcitePlanner (:()) - Starting Semantic Analysis
2022-08-06T03:55:26,312 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStore (:()) - 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2022-08-06T03:55:26,312 WARN  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.ObjectStore (:()) - datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2022-08-06T03:55:26,312 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.ObjectStore (:()) - ObjectStore, initialize called
2022-08-06T03:55:26,315 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.MetaStoreDirectSql (:()) - Using direct SQL, underlying DB is POSTGRES
2022-08-06T03:55:26,315 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.ObjectStore (:()) - Initialized ObjectStore
2022-08-06T03:55:26,316 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=root (auth:SIMPLE) retries=1 delay=1 lifetime=0
2022-08-06T03:55:26,316 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive.default.student
2022-08-06T03:55:26,316 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive.default.student
2022-08-06T03:55:26,328 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: parse.CalcitePlanner (:()) - Completed phase 1 of Semantic Analysis
2022-08-06T03:55:26,328 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: parse.CalcitePlanner (:()) - Get metadata for source tables
2022-08-06T03:55:26,328 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: parse.CalcitePlanner (:()) - Get metadata for subqueries
2022-08-06T03:55:26,328 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: parse.CalcitePlanner (:()) - Get metadata for destination tables
2022-08-06T03:55:26,329 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive.default.student
2022-08-06T03:55:26,329 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive.default.student
2022-08-06T03:55:26,340 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: parse.CalcitePlanner (:()) - Completed getting MetaData in Semantic Analysis
2022-08-06T03:55:26,384 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ql.Context (:()) - New scratch dir is hdfs://namenode:9000/tmp/hive/root/37c29c4b-6e5f-4069-ba1e-92a89299277d/hive_2022-08-06_03-55-26_308_294949150141486344-1
2022-08-06T03:55:26,394 INFO  [Thread-17]: sasl.SaslDataTransferClient (:()) - SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2022-08-06T03:55:26,405 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStore (:()) - 0: get_not_null_constraints : tbl=hive._dummy_database._dummy_table
2022-08-06T03:55:26,406 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_not_null_constraints : tbl=hive._dummy_database._dummy_table
2022-08-06T03:55:26,410 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStore (:()) - 0: get_primary_keys : tbl=hive._dummy_database._dummy_table
2022-08-06T03:55:26,410 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_primary_keys : tbl=hive._dummy_database._dummy_table
2022-08-06T03:55:26,413 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStore (:()) - 0: get_primary_keys : tbl=hive._dummy_database._dummy_table
2022-08-06T03:55:26,414 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_primary_keys : tbl=hive._dummy_database._dummy_table
2022-08-06T03:55:26,417 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStore (:()) - 0: get_unique_constraints : tbl=hive._dummy_database._dummy_table
2022-08-06T03:55:26,417 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_unique_constraints : tbl=hive._dummy_database._dummy_table
2022-08-06T03:55:26,420 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStore (:()) - 0: get_foreign_keys : parentdb=null parenttbl=null foreigndb=_dummy_database foreigntbl=_dummy_table
2022-08-06T03:55:26,420 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_foreign_keys : parentdb=null parenttbl=null foreigndb=_dummy_database foreigntbl=_dummy_table
2022-08-06T03:55:26,451 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStore (:()) - 0: get_databases: @hive#
2022-08-06T03:55:26,451 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_databases: @hive#
2022-08-06T03:55:26,452 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStore (:()) - 0: get_materialized_views_for_rewriting: db=@hive#default
2022-08-06T03:55:26,452 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_materialized_views_for_rewriting: db=@hive#default
2022-08-06T03:55:26,454 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStore (:()) - 0: get_materialized_views_for_rewriting: db=@hive#test
2022-08-06T03:55:26,454 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_materialized_views_for_rewriting: db=@hive#test
2022-08-06T03:55:26,458 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: parse.CalcitePlanner (:()) - Get metadata for source tables
2022-08-06T03:55:26,458 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: parse.CalcitePlanner (:()) - Get metadata for subqueries
2022-08-06T03:55:26,458 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: parse.CalcitePlanner (:()) - Get metadata for source tables
2022-08-06T03:55:26,458 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive._dummy_database._dummy_table
2022-08-06T03:55:26,458 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive._dummy_database._dummy_table
2022-08-06T03:55:26,459 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: parse.CalcitePlanner (:()) - Get metadata for subqueries
2022-08-06T03:55:26,459 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: parse.CalcitePlanner (:()) - Get metadata for destination tables
2022-08-06T03:55:26,459 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: parse.CalcitePlanner (:()) - Get metadata for destination tables
2022-08-06T03:55:26,459 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive.default.student
2022-08-06T03:55:26,459 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive.default.student
2022-08-06T03:55:26,469 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ql.Context (:()) - New scratch dir is hdfs://namenode:9000/tmp/hive/root/37c29c4b-6e5f-4069-ba1e-92a89299277d/hive_2022-08-06_03-55-26_308_294949150141486344-1
2022-08-06T03:55:26,472 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: common.FileUtils (FileUtils.java:mkdir(580)) - Creating directory if it doesn't exist: hdfs://namenode:9000/user/hive/warehouse/student/.hive-staging_hive_2022-08-06_03-55-26_308_294949150141486344-1
2022-08-06T03:55:26,475 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStore (:()) - 0: get_not_null_constraints : tbl=hive.default.student
2022-08-06T03:55:26,475 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_not_null_constraints : tbl=hive.default.student
2022-08-06T03:55:26,477 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStore (:()) - 0: get_check_constraints : tbl=hive.default.student
2022-08-06T03:55:26,477 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_check_constraints : tbl=hive.default.student
2022-08-06T03:55:26,480 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive.default.student
2022-08-06T03:55:26,480 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive.default.student
2022-08-06T03:55:26,488 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: parse.CalcitePlanner (:()) - Generate an operator pipeline to autogather column stats for table default.student in query insert into table student values(1,'abc')
2022-08-06T03:55:26,490 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive.default.student
2022-08-06T03:55:26,490 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive.default.student
2022-08-06T03:55:26,498 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: parse.CalcitePlanner (:()) - Get metadata for source tables
2022-08-06T03:55:26,498 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive.default.student
2022-08-06T03:55:26,498 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive.default.student
2022-08-06T03:55:26,505 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: parse.CalcitePlanner (:()) - Get metadata for subqueries
2022-08-06T03:55:26,505 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: parse.CalcitePlanner (:()) - Get metadata for destination tables
2022-08-06T03:55:26,509 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ql.Context (:()) - New scratch dir is hdfs://namenode:9000/tmp/hive/root/37c29c4b-6e5f-4069-ba1e-92a89299277d/hive_2022-08-06_03-55-26_488_3430707210940845133-1
2022-08-06T03:55:26,511 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: common.FileUtils (FileUtils.java:mkdir(580)) - Creating directory if it doesn't exist: hdfs://namenode:9000/tmp/hive/root/37c29c4b-6e5f-4069-ba1e-92a89299277d/hive_2022-08-06_03-55-26_488_3430707210940845133-1/-mr-10000/.hive-staging_hive_2022-08-06_03-55-26_488_3430707210940845133-1
2022-08-06T03:55:26,513 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: parse.CalcitePlanner (:()) - CBO Succeeded; optimized logical plan.
2022-08-06T03:55:26,514 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ppd.OpProcFactory (:()) - Processing for FS(4)
2022-08-06T03:55:26,514 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ppd.OpProcFactory (:()) - Processing for FS(11)
2022-08-06T03:55:26,514 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ppd.OpProcFactory (:()) - Processing for SEL(10)
2022-08-06T03:55:26,514 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ppd.OpProcFactory (:()) - Processing for GBY(9)
2022-08-06T03:55:26,514 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ppd.OpProcFactory (:()) - Processing for RS(8)
2022-08-06T03:55:26,514 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ppd.OpProcFactory (:()) - Processing for GBY(7)
2022-08-06T03:55:26,514 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ppd.OpProcFactory (:()) - Processing for SEL(6)
2022-08-06T03:55:26,514 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ppd.OpProcFactory (:()) - Processing for SEL(3)
2022-08-06T03:55:26,514 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ppd.OpProcFactory (:()) - Processing for UDTF(2)
2022-08-06T03:55:26,514 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ppd.OpProcFactory (:()) - Processing for SEL(1)
2022-08-06T03:55:26,514 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ppd.OpProcFactory (:()) - Processing for TS(0)
2022-08-06T03:55:26,514 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: optimizer.ColumnPrunerProcFactory (:()) - RS 8 oldColExprMap: {VALUE._col0=Column[_col0], VALUE._col1=Column[_col1]}
2022-08-06T03:55:26,514 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: optimizer.ColumnPrunerProcFactory (:()) - RS 8 newColExprMap: {VALUE._col0=Column[_col0], VALUE._col1=Column[_col1]}
2022-08-06T03:55:26,518 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: spark.SetSparkReducerParallelism (:()) - Number of reducers for sink RS[8] was already determined to be: 1
2022-08-06T03:55:26,519 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: metastore.HiveMetaStore (:()) - 0: get_table : tbl=hive.default.student
2022-08-06T03:55:26,519 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: HiveMetaStore.audit (:()) - ugi=root	ip=unknown-ip-addr	cmd=get_table : tbl=hive.default.student
2022-08-06T03:55:26,526 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: physical.Vectorizer (:()) - Examining input format to see if vectorization is enabled.
2022-08-06T03:55:26,526 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: physical.Vectorizer (:()) - Map vectorization enabled: false
2022-08-06T03:55:26,526 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: physical.Vectorizer (:()) - Map vectorized: false
2022-08-06T03:55:26,526 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: physical.Vectorizer (:()) - Map vectorizedVertexNum: 0
2022-08-06T03:55:26,526 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: physical.Vectorizer (:()) - Map enabledConditionsMet: [hive.vectorized.use.vectorized.input.format IS true]
2022-08-06T03:55:26,526 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: physical.Vectorizer (:()) - Map enabledConditionsNotMet: [Could not enable vectorization due to partition column names size 1 is greater than the number of table column names size 0 IS false]
2022-08-06T03:55:26,526 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: physical.Vectorizer (:()) - Map inputFileFormatClassNameSet: [org.apache.hadoop.hive.ql.io.NullRowsInputFormat]
2022-08-06T03:55:26,527 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: physical.Vectorizer (:()) - Validating and vectorizing ReduceWork... (vectorizedVertexNum 1)
2022-08-06T03:55:26,527 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: physical.Vectorizer (:()) - Reduce vectorization enabled: true
2022-08-06T03:55:26,527 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: physical.Vectorizer (:()) - Reduce vectorized: false
2022-08-06T03:55:26,527 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: physical.Vectorizer (:()) - Reduce notVectorizedReason: Aggregation Function expression for GROUPBY operator: UDF compute_stats not supported
2022-08-06T03:55:26,527 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: physical.Vectorizer (:()) - Reduce vectorizedVertexNum: 1
2022-08-06T03:55:26,527 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: physical.Vectorizer (:()) - Reducer hive.vectorized.execution.reduce.enabled: true
2022-08-06T03:55:26,527 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: physical.Vectorizer (:()) - Reducer engine: spark
2022-08-06T03:55:26,527 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: common.HiveStatsUtils (:()) - Error requested is 20.0%
2022-08-06T03:55:26,527 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: common.HiveStatsUtils (:()) - Choosing 16 bit vectors..
2022-08-06T03:55:26,527 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: parse.CalcitePlanner (:()) - Completed plan generation
2022-08-06T03:55:26,527 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ql.Driver (:()) - Semantic Analysis Completed (retrial = false)
2022-08-06T03:55:26,527 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ql.Driver (:()) - Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:col1, type:int, comment:null), FieldSchema(name:col2, type:string, comment:null)], properties:null)
2022-08-06T03:55:26,527 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ql.Driver (:()) - Completed compiling command(queryId=root_20220806035526_cfc53157-3a0a-4515-94d1-633ca3664491); Time taken: 0.235 seconds
2022-08-06T03:55:26,527 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: reexec.ReExecDriver (:()) - Execution #1 of query
2022-08-06T03:55:26,527 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ql.Driver (:()) - Concurrency mode is disabled, not creating a lock manager
2022-08-06T03:55:26,527 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ql.Driver (:()) - Executing command(queryId=root_20220806035526_cfc53157-3a0a-4515-94d1-633ca3664491): insert into table student values(1,'abc')
2022-08-06T03:55:26,528 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ql.Driver (:()) - Query ID = root_20220806035526_cfc53157-3a0a-4515-94d1-633ca3664491
2022-08-06T03:55:26,528 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ql.Driver (:()) - Total jobs = 1
2022-08-06T03:55:26,528 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ql.Driver (:()) - Launching Job 1 out of 1
2022-08-06T03:55:26,528 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ql.Driver (:()) - Starting task [Stage-1:MAPRED] in serial mode
2022-08-06T03:55:26,528 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: spark.SparkTask (:()) - In order to change the average load for a reducer (in bytes):
2022-08-06T03:55:26,528 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: spark.SparkTask (:()) -   set hive.exec.reducers.bytes.per.reducer=<number>
2022-08-06T03:55:26,528 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: spark.SparkTask (:()) - In order to limit the maximum number of reducers:
2022-08-06T03:55:26,528 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: spark.SparkTask (:()) -   set hive.exec.reducers.max=<number>
2022-08-06T03:55:26,528 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: spark.SparkTask (:()) - In order to set a constant number of reducers:
2022-08-06T03:55:26,528 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: spark.SparkTask (:()) -   set mapreduce.job.reduces=<number>
2022-08-06T03:55:26,528 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: session.SparkSession (:()) - Trying to open Spark session a39481cc-e0dd-4472-b17a-e2517f695b59
2022-08-06T03:55:26,546 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - No spark.home provided, calling SparkSubmit directly.
2022-08-06T03:55:26,546 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - Running client driver with argv: /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java org.apache.spark.deploy.SparkSubmit --properties-file /tmp/spark-submit.7771556194894583932.properties --class org.apache.hive.spark.client.RemoteDriver /opt/hive/lib/hive-exec-3.1.2.jar --remote-host hive-metastore --remote-port 39601 --conf hive.spark.client.connect.timeout=10000 --conf hive.spark.client.server.connect.timeout=90000 --conf hive.spark.client.channel.log.level=null --conf hive.spark.client.rpc.max.size=52428800 --conf hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256 --conf hive.spark.client.rpc.server.address=null
2022-08-06T03:55:27,774 INFO  [RemoteDriver-stderr-redir-37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
2022-08-06T03:55:27,774 INFO  [RemoteDriver-stderr-redir-37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
2022-08-06T03:55:27,774 INFO  [RemoteDriver-stderr-redir-37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
2022-08-06T03:55:27,774 INFO  [RemoteDriver-stderr-redir-37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
2022-08-06T03:55:27,774 INFO  [RemoteDriver-stderr-redir-37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
2022-08-06T03:55:28,223 INFO  [RemoteDriver-stdout-redir-37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - 2022-08-06T03:55:28,221 WARN [main] org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2022-08-06T03:55:28,280 INFO  [RemoteDriver-stderr-redir-37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
2022-08-06T03:55:28,280 INFO  [RemoteDriver-stdout-redir-37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - 2022-08-06T03:55:28,280 INFO [main] org.apache.spark.SecurityManager - Changing view acls to: root
2022-08-06T03:55:28,281 INFO  [RemoteDriver-stdout-redir-37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - 2022-08-06T03:55:28,280 INFO [main] org.apache.spark.SecurityManager - Changing modify acls to: root
2022-08-06T03:55:28,281 INFO  [RemoteDriver-stdout-redir-37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - 2022-08-06T03:55:28,281 INFO [main] org.apache.spark.SecurityManager - Changing view acls groups to:
2022-08-06T03:55:28,281 INFO  [RemoteDriver-stdout-redir-37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - 2022-08-06T03:55:28,281 INFO [main] org.apache.spark.SecurityManager - Changing modify acls groups to:
2022-08-06T03:55:28,282 INFO  [RemoteDriver-stdout-redir-37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - 2022-08-06T03:55:28,282 INFO [main] org.apache.spark.SecurityManager - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
2022-08-06T03:55:28,574 INFO  [RemoteDriver-stdout-redir-37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - 2022-08-06T03:55:28,574 INFO [main] org.apache.spark.util.Utils - Successfully started service 'driverClient' on port 38121.
2022-08-06T03:55:28,640 INFO  [RemoteDriver-stdout-redir-37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - 2022-08-06T03:55:28,640 INFO [netty-rpc-connection-0] org.apache.spark.network.client.TransportClientFactory - Successfully created connection to spark-master-hp/172.25.0.3:7077 after 40 ms (0 ms spent in bootstraps)
2022-08-06T03:55:28,732 INFO  [RemoteDriver-stdout-redir-37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - 2022-08-06T03:55:28,732 INFO [dispatcher-event-loop-2] org.apache.spark.deploy.ClientEndpoint - ... waiting before polling master for driver state
2022-08-06T03:55:28,758 INFO  [RemoteDriver-stdout-redir-37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - 2022-08-06T03:55:28,757 INFO [dispatcher-event-loop-3] org.apache.spark.deploy.ClientEndpoint - Driver successfully submitted as driver-20220806035528-0002
2022-08-06T03:55:33,563 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: session.SparkSession (:()) - Spark session a39481cc-e0dd-4472-b17a-e2517f695b59 is successfully opened
2022-08-06T03:55:33,603 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ql.Context (:()) - New scratch dir is hdfs://namenode:9000/tmp/hive/root/37c29c4b-6e5f-4069-ba1e-92a89299277d/hive_2022-08-06_03-55-26_308_294949150141486344-1
2022-08-06T03:55:33,768 INFO  [RemoteDriver-stdout-redir-37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - 2022-08-06T03:55:33,768 INFO [dispatcher-event-loop-4] org.apache.spark.deploy.ClientEndpoint - State of driver-20220806035528-0002 is RUNNING
2022-08-06T03:55:33,769 INFO  [RemoteDriver-stdout-redir-37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - 2022-08-06T03:55:33,769 INFO [dispatcher-event-loop-4] org.apache.spark.deploy.ClientEndpoint - Driver running on 172.25.0.9:43419 (worker-20220802005046-172.25.0.9-43419)
2022-08-06T03:55:33,770 INFO  [RemoteDriver-stdout-redir-37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - 2022-08-06T03:55:33,770 INFO [dispatcher-event-loop-4] org.apache.spark.deploy.ClientEndpoint - spark-submit not configured to wait for completion, exiting spark-submit JVM.
2022-08-06T03:55:33,775 INFO  [RemoteDriver-stdout-redir-37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - 2022-08-06T03:55:33,775 INFO [shutdown-hook-0] org.apache.spark.util.ShutdownHookManager - Shutdown hook called
2022-08-06T03:55:33,777 INFO  [RemoteDriver-stdout-redir-37c29c4b-6e5f-4069-ba1e-92a89299277d main]: client.SparkClientImpl (:()) - 2022-08-06T03:55:33,777 INFO [shutdown-hook-0] org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-97a6d965-c06e-4e44-b6bf-4f6c843c0a01
2022-08-06T03:55:36,167 INFO  [RPC-Handler-4]: client.SparkClientImpl (:()) - Received result for dbef0160-ac9b-4b78-ba33-6636ff7dca4c
2022-08-06T03:55:36,775 ERROR [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: status.SparkJobMonitor (:()) - Job failed with java.lang.ClassNotFoundException: org.apache.spark.AccumulatorParam
java.lang.NoClassDefFoundError: org/apache/spark/AccumulatorParam
	at org.apache.hive.spark.counter.SparkCounterGroup.createCounter(SparkCounterGroup.java:52)
	at org.apache.hive.spark.counter.SparkCounters.createCounter(SparkCounters.java:71)
	at org.apache.hive.spark.counter.SparkCounters.createCounter(SparkCounters.java:67)
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient$JobStatusJob.call(RemoteHiveSparkClient.java:350)
	at org.apache.hive.spark.client.RemoteDriver$JobWrapper.call(RemoteDriver.java:378)
	at org.apache.hive.spark.client.RemoteDriver$JobWrapper.call(RemoteDriver.java:343)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.AccumulatorParam
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	... 10 more

2022-08-06T03:55:36,783 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: reexec.ReOptimizePlugin (:()) - ReOptimization: retryPossible: false
2022-08-06T03:55:36,783 ERROR [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ql.Driver (:()) - FAILED: Execution Error, return code 3 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Spark job failed during runtime. Please check stacktrace for the root cause.
2022-08-06T03:55:36,784 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ql.Driver (:()) - Completed executing command(queryId=root_20220806035526_cfc53157-3a0a-4515-94d1-633ca3664491); Time taken: 10.256 seconds
2022-08-06T03:55:36,784 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: ql.Driver (:()) - Concurrency mode is disabled, not creating a lock manager
2022-08-06T03:55:36,787 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: conf.HiveConf (HiveConf.java:getLogIdVar(5040)) - Using the default value passed in for log id: 37c29c4b-6e5f-4069-ba1e-92a89299277d
2022-08-06T03:55:36,787 INFO  [37c29c4b-6e5f-4069-ba1e-92a89299277d main]: session.SessionState (SessionState.java:resetThreadName(452)) - Resetting thread name to  main
2022-08-06T03:55:59,955 INFO  [shutdown-hook-0]: session.SparkSessionManagerImpl (:()) - Closing the session manager.
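
The root cause of this last failure is, once again, version incompatibility rather than misconfiguration: org.apache.spark.AccumulatorParam was deprecated in Spark 2.0 and removed entirely in Spark 3.0, while the counter code shipped in hive-exec-3.1.2 (org.apache.hive.spark.counter.SparkCounter / SparkCounterGroup, visible in the stack trace above) is still compiled against that old API. The practical options are to run a Spark 2.x build that still contains AccumulatorParam, or to rebuild Hive's spark-client module against Spark 3 so the counters go through the AccumulatorV2 API instead. As a minimal sketch of the latter approach (the class below is hypothetical, for illustration only, and is not the actual Hive patch):

// A minimal sketch, assuming the counter logic is ported to Spark 3's
// AccumulatorV2 API; class and method names here are hypothetical.
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.util.LongAccumulator;

public class Spark3CompatibleCounter {

    private final LongAccumulator accumulator;

    public Spark3CompatibleCounter(JavaSparkContext sc, String name, long initValue) {
        // SparkContext#longAccumulator registers a named AccumulatorV2 with
        // the driver, replacing the removed Accumulator/AccumulatorParam pair.
        this.accumulator = sc.sc().longAccumulator(name);
        this.accumulator.add(initValue);
    }

    public void increment(long incr) {
        accumulator.add(incr);
    }

    public long getValue() {
        // LongAccumulator#sum returns the primitive total accumulated so far.
        return accumulator.sum();
    }
}

With this style of counter there is no separate AccumulatorParam class to load at runtime, which is exactly the class the remote driver failed to find above.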