Hadoop & Spark & Hive & HBase

Format the HDFS NameNode and start HDFS:

  1. bin/hdfs namenode -format
  2. sbin/start-dfs.sh

Create the HDFS user directories:

  1. bin/hdfs dfs -mkdir /user
  2. bin/hdfs dfs -mkdir /user/<username>
These commands are for testing:
  1. bin/hdfs dfs -put etc/hadoop input
  2. bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.4.jar grep input output 'dfs[a-z.]+'
  3. bin/hdfs dfs -cat output/*
Test results:
  6 dfs.audit.logger
  4 dfs.class
  3 dfs.server.namenode.
  2 dfs.period
  2 dfs.audit.log.maxfilesize
  2 dfs.audit.log.maxbackupindex
  1 dfsmetrics.log
  1 dfsadmin
  1 dfs.servers
  1 dfs.replication
  1 dfs.file
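The grep job above counts occurrences of each string matching the regex dfs[a-z.]+. The same counting logic can be sketched locally with plain grep (the sample file and its contents below are made up for illustration, not the real Hadoop configs):

```shell
# Local sketch of what the MapReduce grep example computes:
# extract every match of dfs[a-z.]+ and count occurrences per distinct match.
# /tmp/dfs-sample.txt is a made-up sample file.
printf 'dfs.replication\ndfs.replication\ndfsadmin\n' > /tmp/dfs-sample.txt
grep -oE 'dfs[a-z.]+' /tmp/dfs-sample.txt | sort | uniq -c | sort -rn
```

This prints `2 dfs.replication` followed by `1 dfsadmin`, the same count-per-pattern shape as the job output above.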


YARN:

Start the ResourceManager:

  1. ./sbin/start-yarn.sh

Start the HistoryServer:

  1. ./sbin/mr-jobhistory-daemon.sh start historyserver



Spark:

Start the master:

  1. ./sbin/start-master.sh

Start the workers:

  1. ./sbin/start-slaves.sh spark://<your-computer-name>:7077
In the master web UI you will see:

  • Alive Workers: 1

This is for testing:

  1. ./bin/spark-shell --master spark://<your-computer-name>:7077

You will see the Scala shell. Use :quit (or :q) to exit.

To see the history:

  1. ./sbin/start-history-server.sh



Hive:

Bug:
With MySQL 5.7 you should use:

  1. jdbc:mysql://localhost:3306/hivedb?useSSL=false&createDatabaseIfNotExist=true
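Note that inside hive-site.xml the `&` in this URL must be XML-escaped as `&amp;`. A sketch of the corresponding property (the hivedb database name follows the URL above; adjust host and database to your setup):

```xml
<!-- hive-site.xml: metastore connection URL (sketch; adjust host/db name) -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hivedb?useSSL=false&amp;createDatabaseIfNotExist=true</value>
</property>
```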

Start hiveserver2:

  1. nohup hiveserver2 &

Bug (HWI web UI):

  1. HWI WAR file not found at
Pack the WAR file yourself, copy it to the right place, then add the needed settings to hive-site.xml.
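The hive-site.xml settings in question are Hive's hive.hwi.* properties; a sketch follows (the WAR path and version number are examples only, and HWI exists only in older Hive releases):

```xml
<!-- hive-site.xml: HWI settings (sketch; WAR path and version are examples) -->
<property>
  <name>hive.hwi.war.file</name>
  <!-- path is relative to HIVE_HOME -->
  <value>lib/hive-hwi-1.2.1.war</value>
</property>
<property>
  <name>hive.hwi.listen.host</name>
  <value>0.0.0.0</value>
</property>
<property>
  <name>hive.hwi.listen.port</name>
  <value>9999</value>
</property>
```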

Another problem when building the WAR with Ant:

  1. Problem: failed to create task or type componentdef
  2. Or: Could not create task or type of type: componentdef
Fix:
  1. sudo apt-get install libjasperreports-java
  2. sudo apt-get install ant
(not finished)


Custom configuration:

Database connection software:
By default, the username is the login account and the password is empty.


Syntax:

More info:


HBase

Start HBase:

  1. ./bin/start-hbase.sh

HBase & Hive

SparkSQL

The Spark SQL architecture is shown in the figure below (figure not preserved):

Phoenix

Start the query server and connect with the thin-client JDBC URL:

  1. queryserver.py start
  2. jdbc:phoenix:thin:url=http://localhost:8765;serialization=PROTOBUF

Or connect directly through ZooKeeper:

  1. phoenix-sqlline.py localhost:2181

Reposted from: https://www.cnblogs.com/jins-note/p/9513419.html
