spark
awangyuk
Java programmer
jupyterhub+k8s+spark/yarn
1. If the production cluster runs Spark on YARN, integration is straightforward (the Docker container connects to the existing Spark on YARN cluster). 2. Custom image: 2.1 install Python 3.7 on the worker machines and link it to /opt/conda/bin/python. FROM jupyter/all-spark-notebook:2ce7c06a61a1 ENV HADOOP_HOME /usr/local/hadoop... Original · 2019-11-21 15:45:18
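The custom-image approach in the excerpt can be sketched as a Dockerfile. Only the FROM and HADOOP_HOME lines come from the post itself; the conf-dir variables and the user switch are assumptions about how such an image is typically wired up:

```dockerfile
# Sketch of the custom image described above. Only FROM and
# HADOOP_HOME are from the post; the remaining lines are assumptions.
FROM jupyter/all-spark-notebook:2ce7c06a61a1

USER root
ENV HADOOP_HOME /usr/local/hadoop
# Assumed: point Spark at the existing cluster's Hadoop/YARN config
ENV HADOOP_CONF_DIR $HADOOP_HOME/etc/hadoop
ENV YARN_CONF_DIR $HADOOP_HOME/etc/hadoop
# Drop back to the unprivileged notebook user defined by the base image
USER $NB_UID
```

With HADOOP_CONF_DIR visible inside the container, a Spark session started from the notebook can target the existing YARN cluster rather than a local master.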
Spark virtual-machine test
docker run --name hadoop1 --net mynetwork --ip 172.18.0.3 \
  --add-host hadoop1:172.18.0.3 \
  --add-host hadoop2:172.18.0.4 \
  --add-host hadoop3:172.18.0.5 \
  -d \
  -p 5002:22 \
  -p 9870:9870 \
  -p 8088... Original · 2019-06-27 11:50:46
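The docker run command above assumes a user-defined bridge network named mynetwork already exists. A sketch of the prerequisite step, where the subnet value is an assumption chosen to be consistent with the 172.18.0.x addresses used above:

```
# Assumed prerequisite: create the fixed-IP bridge network referenced
# by --net mynetwork; the /16 subnet is an assumption.
docker network create --subnet=172.18.0.0/16 mynetwork
```

A user-defined network is required here because the default bridge network does not support assigning static IPs with --ip.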
Spark cluster
http://blog.51yip.com/hadoop/2022.html Reposted · 2019-07-11 21:20:43