----Spark
When running the packaged wordcount.jar on the Spark cluster, a "compression class not found" error may occur.
1) Cause
Spark on Yarn picks up the Hadoop cluster's configuration files by default, including their compression codec settings, but Spark's own spark-yarn/jars directory does not contain a jar that supports LZO compression, so the class cannot be loaded and the error is thrown.
2) Solution 1: copy the LZO jar into the /opt/module/spark-yarn/jars directory
[abc@hadoop102 common]$ cp /opt/module/hadoop-3.1.3/share/hadoop/common/hadoop-lzo-0.4.20.jar /opt/module/spark-yarn/jars
3) Solution 2: specify the LZO jar's location when submitting the job
[abc@hadoop102 spark-yarn]$
bin/spark-submit \
--class com.atguigu.spark.WordCount \
--master yarn \
--driver-class-path /opt/module/hadoop-3.1.3/share/hadoop/common/hadoop-lzo-0.4.20.jar \
WordCount.jar \
/input \
/output
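If you submit jobs often, an equivalent of passing --driver-class-path on every run is to set the corresponding property once in Spark's defaults file. This is a minimal sketch assuming the same jar path used above; spark.driver.extraClassPath is Spark's configuration-file counterpart of the --driver-class-path flag:

```
# conf/spark-defaults.conf  (jar path assumed from this document's layout)
spark.driver.extraClassPath  /opt/module/hadoop-3.1.3/share/hadoop/common/hadoop-lzo-0.4.20.jar
```

With this in place, bin/spark-submit picks up the LZO jar for the driver automatically, so the flag can be dropped from the command line.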