spark
Tagline: "只因太菜" (roughly, "just too much of a rookie")
Supervisor-configured Presto error: Exited too quickly (process log may have details)

1. Add the Java environment to the $PRESTO_HOME/bin/launcher file:

```shell
# JAVA_HOME
export JAVA_HOME=/usr/local/jdk
export PATH=$PATH:$JAVA_HOME/bin
```

2. Restart, and the problem is resolved.

Original · 2021-11-05 08:46:03 · 397 views · 0 comments
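The usual cause of "Exited too quickly" is that supervisord does not inherit the login shell's environment, so the launcher cannot find Java; the environment can also be set in the supervisor program section itself. A minimal sketch, in which the program name, paths, and log locations are assumptions, not taken from the post:

```ini
[program:presto]
; "run" (foreground) rather than "start", so supervisord can track the process
command=/usr/local/presto/bin/launcher run
; give the launcher the same Java environment the post adds to the script
environment=JAVA_HOME="/usr/local/jdk",PATH="/usr/local/jdk/bin:/usr/bin:/bin"
autostart=true
autorestart=true
; the process must stay up this many seconds before it counts as started;
; exiting earlier produces the "Exited too quickly" message
startsecs=10
stdout_logfile=/var/log/presto/stdout.log
stderr_logfile=/var/log/presto/stderr.log
```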
Hive JSON: Class org.openx.data.jsonserde.JsonSerDe not found exception
My own fix; it may not be entirely correct and may contain redundant steps, so treat it as a reference only. Place json-serde-1.3.8-jar-with-dependencies.jar on HDFS, then add this to hive-site.xml:

```xml
<property>
  <name>hive.aux.jars.path</name>
  <value>hdfs://hadoop01:8020/common/lib</value>
</property>
```

Original · 2021-11-02 23:11:24 · 1060 views · 1 comment
SparkCore: custom partitioner
To define a custom partitioner, create a class that extends Partitioner and implement its methods. The code below puts even numbers into partition 0 and odd numbers into partition 1 (the excerpt is cut off mid-listing):

```scala
import org.apache.spark.rdd.RDD
import org.apache.spark.{Partitioner, SparkConf, SparkContext, TaskContext}

object TestPartitioner {
  def main(args: Array[String]): Unit = {
```

Original · 2021-10-14 22:22:18 · 126 views · 0 comments
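Since the excerpt above is truncated, here is a minimal sketch of the technique the post describes: a Partitioner whose getPartition sends even keys to partition 0 and odd keys to partition 1. The class and object names are illustrative, not the author's original code:

```scala
import org.apache.spark.{Partitioner, SparkConf, SparkContext}

// Even keys -> partition 0, odd keys -> partition 1
class EvenOddPartitioner extends Partitioner {
  override def numPartitions: Int = 2

  // getPartition must return a value in [0, numPartitions);
  // Math.floorMod keeps the result non-negative for negative keys too
  override def getPartition(key: Any): Int =
    Math.floorMod(key.asInstanceOf[Int], 2)
}

object EvenOddPartitionerDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("EvenOddPartitionerDemo").setMaster("local[*]"))

    // partitionBy works on key-value RDDs, so pair each number with itself
    val pairs = sc.parallelize(1 to 8).map(n => (n, n))
    val partitioned = pairs.partitionBy(new EvenOddPartitioner)

    // Print (partitionIndex, key) to confirm where each key landed
    partitioned
      .mapPartitionsWithIndex((idx, it) => it.map { case (k, _) => (idx, k) })
      .collect()
      .foreach(println)

    sc.stop()
  }
}
```

partitionBy shuffles the data once; any later key-based operation that uses the same partitioner can then avoid a second shuffle.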
Caused by: org.apache.spark.SparkException: This RDD lacks a SparkContext. It could happen in the
This error is raised when a transformation or action is nested inside another RDD transformation, which makes the computation fail. Start from the line the stack trace points to, find the nested transformation or action, and move that operation out so it runs on its own. The error reads:

```
Caused by: org.apache.spark.SparkException: This RDD lacks a SparkContext. It could happen in the following cases:
(1) RDD transformations...
```

Original · 2021-10-12 22:21:18 · 1489 views · 0 comments
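The pattern the post describes can be sketched as follows; the RDD names and data are illustrative assumptions, not the author's code. The commented-out line shows the nesting that triggers the error, and the fix moves the inner RDD's data to the driver first:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object NestedRddFixDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("NestedRddFixDemo").setMaster("local[*]"))

    val bigRdd   = sc.parallelize(1 to 10)
    val smallRdd = sc.parallelize(Seq(2, 4, 6))

    // Fails: smallRdd is referenced inside bigRdd's transformation, so the
    // closure shipped to executors holds an RDD with no SparkContext:
    // bigRdd.filter(x => smallRdd.filter(_ == x).count() > 0)

    // Works: run the inner operation first, pulling the (small) result to
    // the driver, then close over the plain collection instead of the RDD
    val lookup = smallRdd.collect().toSet
    val result = bigRdd.filter(x => lookup.contains(x)).collect()
    println(result.mkString(","))  // 2,4,6

    sc.stop()
  }
}
```

If the inner data set is large, a broadcast variable (sc.broadcast(lookup)) is the usual refinement, but the key point is the same: only plain data, never an RDD, may appear inside another RDD's transformation.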