Big Data: posts by 有风微冷
CDH removal script
```bash
#!/bin/bash
# Stop the Cloudera Manager services
systemctl stop cloudera-scm-server
systemctl stop cloudera-scm-agent
# Remove the CM packages
yum -y remove cloudera-manager-daemons cloudera-manager-agent cloudera-manager-server
# Unmount the cm_processes mount point
umount cm_processes
```
Original post, 2021-12-28 10:44:16
The CAP theorem explained in detail with ZooKeeper
C: Consistency. Every update is propagated so that all nodes see the same data.
A: Availability. The system stays responsive.
P: Partition tolerance. The system keeps working despite network partitions.
At most two of these three can be satisfied at the same time; since P is a must, the choice usually falls between CP and AP.
C: ZooKeeper guarantees eventual consistency; changes sync to all nodes within ten-odd seconds.
A: ZooKeeper...
Original post, 2019-07-24 11:11:17
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category WR
The formerly active namenode has switched to standby. Check whether any namenode process has died and restart it; if both namenodes are standby, check whether ZKFC is running, and if it is, restart ZKFC.
Original post, 2019-07-31 16:44:14
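The diagnosis steps above can be sketched as shell functions. This is a minimal sketch, assuming namenode service IDs `nn1`/`nn2` (check `dfs.ha.namenodes.*` in hdfs-site.xml for the real IDs) and the Hadoop 2.x daemon script:

```shell
#!/bin/bash
# Query each namenode's HA state; both reporting "standby" points at ZKFC trouble.
# nn1/nn2 are assumed service IDs, not taken from the original post.
check_ha_state() {
  hdfs haadmin -getServiceState nn1
  hdfs haadmin -getServiceState nn2
}

# Restart the ZKFailoverController so an active namenode can be re-elected.
restart_zkfc() {
  hadoop-daemon.sh stop zkfc
  hadoop-daemon.sh start zkfc
}
```

Defining these as functions keeps the checks reusable from an ops shell without running anything on source.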
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.rdd.RDD.mapPartitionsWithIn
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.rdd.RDD.mapPartitionsWithIndexInternal(Lscala/Function2;ZLscala/reflect/ClassTag;)Lorg/apache/spark/rdd/RDD;
Cause: the pom.xml file...
Original post, 2019-08-01 10:18:20
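A `NoSuchMethodError` on an internal RDD method typically means mismatched Spark artifact versions on the classpath. A sketch of a consistent pom.xml dependency block, with a shared version property (the version number and Scala suffix below are illustrative assumptions, not from the post):

```xml
<properties>
  <spark.version>2.4.3</spark.version>
</properties>
<dependencies>
  <!-- All Spark modules must share the same version and Scala suffix -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>${spark.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>${spark.version}</version>
  </dependency>
</dependencies>
```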
azkaban---Missing required property 'azkaban.native.lib'
Fix: go to /azkaban-web-server/plugins/jobtypes, edit commonprivate.properties (vim commonprivate.properties) and add azkaban.native.lib=false, then copy commonprivate.properties to azkaban-exec-server/plugins/jobtypes and restart. (Note: make sure the azkaban-we...
Original post, 2019-08-06 12:05:35
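The copy steps above can be sketched as a small script. `AZKABAN_HOME` defaults to the current directory here so the sketch is self-contained; on a real install it would be the directory containing azkaban-web-server and azkaban-exec-server:

```shell
#!/bin/bash
# AZKABAN_HOME is an assumption for this sketch; set it to your install root.
AZKABAN_HOME=${AZKABAN_HOME:-.}
WEB_JOBTYPES="$AZKABAN_HOME/azkaban-web-server/plugins/jobtypes"
EXEC_JOBTYPES="$AZKABAN_HOME/azkaban-exec-server/plugins/jobtypes"
mkdir -p "$WEB_JOBTYPES" "$EXEC_JOBTYPES"

# Append the property once (idempotent), then mirror the file to the exec server.
grep -q '^azkaban.native.lib=' "$WEB_JOBTYPES/commonprivate.properties" 2>/dev/null ||
  echo 'azkaban.native.lib=false' >> "$WEB_JOBTYPES/commonprivate.properties"
cp "$WEB_JOBTYPES/commonprivate.properties" "$EXEC_JOBTYPES/"
```

After running it, restart the web server and the exec server so the property is picked up.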
Python in worker has different version 3.7 than that in driver 3.6, PySpark cannot run with differe
The error: with Anaconda the default is Python 3.7; after installing Python 3.6 and switching IDEA to Python 3.6, the following error appeared. Exception: Python in worker has different version 3.7 than that in driver 3.6, PySpark cannot run with different minor versions. Pleas...
Original post, 2019-08-06 19:09:24
A script to start the whole Hadoop cluster: start-hadoop-all.sh
```bash
#!/bin/bash
# Start ZooKeeper on each node
for sxt in node002 node003 node004
do
    ssh $sxt "source ~/.bash_profile; zkServer.sh start"
done
sleep 2
# Start HDFS
start-dfs.sh
sleep 2
# Start YARN
ssh node001 "source ~/.bash_profile; ...
```
Original post, 2019-08-07 17:31:36