Notes on a weird problem: java.lang.ClassCastException when starting Hive
After installing Hive on a Mac, launching it from the command line threw a class cast exception, as shown in the screenshot below.
The JDK environment variables all checked out, and even editing JAVA_HOME in the hive launcher script didn't help.
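Before editing any launcher scripts, it can save time to sanity-check what the shell actually sees. A minimal sketch along these lines (the helper name and messages are my own, not from the original notes):

```shell
# Hypothetical helper: verify that JAVA_HOME is set and actually
# contains a java binary, before blaming Hive or Hadoop.
check_java_home() {
  if [ -z "${JAVA_HOME:-}" ]; then
    echo "JAVA_HOME is not set"
    return 1
  fi
  if [ -x "${JAVA_HOME}/bin/java" ]; then
    echo "JAVA_HOME looks valid: ${JAVA_HOME}"
  else
    echo "no java binary under JAVA_HOME: ${JAVA_HOME}"
    return 1
  fi
}

check_java_home || echo "fix JAVA_HOME before starting Hadoop/Hive"
```

On macOS, `/usr/libexec/java_home -v 1.8` prints the home of a matching installed JDK, which is handy for filling in the path.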
vim $(which hive)
#!/bin/bash
JAVA_HOME="/Library/Java/JavaVirtualMachines/jdk1.8.0_251.jdk/Contents/Home" HIVE_HOME="/usr/local/Cellar/hive/3.1.2_1/libexec" exec "/usr/local/Cellar/hive/3.1.2_1/libexec/bin/hive" "$@"
The investigation was pointed in the right direction, though: it really was a problem with which JDK path was being used.
After fiddling with it for a whole evening, the culprit turned out to be hadoop-env.sh, where JAVA_HOME had never been configured — so the Hadoop side was not necessarily picking up the JDK that the hive wrapper exported. I set it to an absolute path; setting it to ${JAVA_HOME} works as well.
cd /usr/local/Cellar/hadoop/3.2.1_1/libexec/etc/hadoop
vim hadoop-env.sh
###
# Technically, the only required environment variable is JAVA_HOME.
# All others are optional. However, the defaults are probably not
# preferred. Many sites configure these options outside of Hadoop,
# such as in /etc/profile.d
# The java implementation to use. By default, this environment
# variable is REQUIRED on ALL platforms except OS X!
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_251.jdk/Contents/Home
Last thing — remember to restart the Hadoop services so the change takes effect: start-all.sh (run stop-all.sh first if Hadoop is already running).
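After the restart, a quick way to confirm the daemons actually came back is to grep the `jps` listing. A sketch, assuming a pseudo-distributed setup — the helper and the daemon list are illustrative, not from the original notes:

```shell
# Hypothetical check: given the output of `jps`, report which of the
# expected Hadoop daemons are present.
check_daemons() {
  out="$1"; shift
  for d in "$@"; do
    case "$out" in
      *"$d"*) echo "$d: running" ;;
      *)      echo "$d: MISSING" ;;
    esac
  done
}

# Typical usage on the Hadoop machine:
#   check_daemons "$(jps)" NameNode DataNode ResourceManager NodeManager
```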
Hive now starts successfully! If you've hit the same problem, I hope these quick notes help you out!