The key error from the logs is as follows:
+ exec /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/spark/bin/spark-class org.apache.spark.deploy.history.HistoryServer --properties-file /var/run/cloudera-scm-agent/process/145-spark_on_yarn-SPARK_YARN_HISTORY_SERVER/spark-conf/spark-history-server.conf
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.deploy.history.HistoryServer$.main(HistoryServer.scala:280)
at org.apache.spark.deploy.history.HistoryServer.main(HistoryServer.scala)
Caused by: java.io.FileNotFoundException: Log directory specified does not exist: hdfs://hadoop105:8020/user/spark/applicationHistory
at org.apache.spark.deploy.history.FsHistoryProvider.org$apache$spark$deploy$history$FsHistoryProvider$$startPolling(FsHistoryProvider.scala:268)
at org.apache.spark.deploy.history.FsHistoryProvider.initialize(FsHistoryProvider.scala:212)
at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:208)
at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:87)
... 6 more
Caused by: java.io.FileNotFoundException: File does not exist: hdfs://hadoop105:8020/user/spark/applicationHistory
at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1500)
at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1493)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1508)
at org.apache.spark.deploy.history.FsHistoryProvider.org$apache$spark$deploy$history$FsHistoryProvider$$startPolling(FsHistoryProvider.scala:258)
... 9 more
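The path in the exception comes from the `spark.history.fs.logDirectory` property in the `spark-history-server.conf` file named on the `exec` line above. Before creating anything, it is worth confirming which directory the History Server is actually configured to poll. A minimal sketch (the file contents below are a hypothetical stand-in; the real file lives under `/var/run/cloudera-scm-agent/process/.../spark-conf/`):

```shell
# Hypothetical copy of the properties file, for illustration only;
# on the cluster, read the real file from the CM process directory.
cat > /tmp/spark-history-server.conf <<'EOF'
spark.history.fs.logDirectory=hdfs://hadoop105:8020/user/spark/applicationHistory
spark.eventLog.enabled=true
EOF

# Extract the directory the History Server will poll on startup.
log_dir=$(sed -n 's/^spark\.history\.fs\.logDirectory=//p' /tmp/spark-history-server.conf)
echo "$log_dir"
```

The printed path is the one that must exist in HDFS, which matches the `FileNotFoundException` above.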
This shows that the directory /user/spark/applicationHistory does not exist on HDFS. Switch to the hdfs user and create it:
[hdfs@hadoop105 ~]$ hadoop fs -ls /user
Found 2 items
drwxr-xr-x - mapred hadoop 0 2023-05-24 09:59 /user/history
drwxrwxr-x - hue hue 0 2023-05-23 10:44 /user/hue
[hdfs@hadoop105 ~]$ hdfs dfs -mkdir -p /user/spark/applicationHistory
[hdfs@hadoop105 ~]$
After creating the directory, restart the Spark History Server service.
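One caveat: a directory created by the hdfs superuser is owned by `hdfs`, while the History Server and Spark applications typically run as user `spark` and need to read and write event logs there. A hedged sketch of the ownership/permission step, assuming the CDH-default `spark` user (adapt to your cluster's service accounts):

```shell
# Assumption: the Spark services run as user "spark" (CDH default).
# Hand the new directory tree to that user, and make the event-log
# directory world-writable with the sticky bit so every job can log to it.
sudo -u hdfs hdfs dfs -chown -R spark:spark /user/spark
sudo -u hdfs hdfs dfs -chmod 1777 /user/spark/applicationHistory
```

Without this, the restart may succeed but applications can still fail to write their event logs.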