bin/hdfs namenode -format
sbin/start-dfs.sh
bin/hdfs dfs -mkdir /user
bin/hdfs dfs -mkdir /user/<username>
The following commands are for testing:
bin/hdfs dfs -put etc/hadoop input
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.4.jar grep input output 'dfs[a-z.]+'
bin/hdfs dfs -cat output/*
Expected test output:
6 dfs.audit.logger
4 dfs.class
3 dfs.server.namenode.
2 dfs.period
2 dfs.audit.log.maxfilesize
2 dfs.audit.log.maxbackupindex
1 dfsmetrics.log
1 dfsadmin
1 dfs.servers
1 dfs.replication
1 dfs.file
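The example job is essentially a distributed grep-and-count: it extracts every match of the regex 'dfs[a-z.]+' from the input, counts duplicates, and sorts by frequency. A rough local sketch of the same computation with plain shell tools (the sample lines here are made up, not the real input):

```shell
# Local sketch of what the hadoop-mapreduce-examples "grep" job computes:
# pull out each regex match, count duplicates, sort by count descending.
printf 'dfs.replication\ndfs.replication\ndfs.audit.logger\n' \
  | grep -oE 'dfs[a-z.]+' \
  | sort | uniq -c | sort -rn
```

The real job does the same over every file in the input directory, writing (count, match) pairs to the output directory.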
HistoryServer
./sbin/mr-jobhistory-daemon.sh start historyserver
![](https://i-blog.csdnimg.cn/blog_migrate/11f3028a165e186a043b63e6cb7897ec.jpeg)
Spark:
Start (the master must be running first):
./sbin/start-master.sh
./sbin/start-slaves.sh spark://<your-computer-name>:7077
On the master's web UI (http://localhost:8080 by default) you will see:
- Alive Workers: 1
To test the cluster, attach a shell to it:
./bin/spark-shell --master spark://<your-computer-name>:7077
You will see the Scala shell. Type :quit (or :q) to exit.
To see the history:
./sbin/start-history-server.sh
Hive:
Bug:
With MySQL 5.7 you should use the following JDBC URL (useSSL=false suppresses the SSL warning):
jdbc:mysql://localhost:3306/hivedb?useSSL=false&createDatabaseIfNotExist=true
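Note that hive-site.xml is XML, so the & in that URL must be escaped as &amp; when the URL goes into the config file (the property below is the standard Hive metastore connection key); forgetting this is a common source of parse errors:

```xml
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hivedb?useSSL=false&amp;createDatabaseIfNotExist=true</value>
</property>
```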
start hiveserver2:
nohup hiveserver2 &
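An aside on the nohup pattern itself: redirecting output to a log file keeps it out of nohup.out and makes the server easier to debug later. A generic, runnable sketch (sleep stands in for hiveserver2 here):

```shell
# Run a command immune to hangups, stdout/stderr sent to a log file.
# The equivalent for HiveServer2 would be:
#   nohup hiveserver2 > hs2.log 2>&1 &
nohup sleep 1 > demo.log 2>&1 &
echo "background pid: $!"
wait
```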
Bug:
User: is not allowed to impersonate anonymous (state=,code=0)
See more:
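This error usually means Hadoop does not allow the HiveServer2 user to impersonate the connecting user. A common fix is to add proxy-user settings to Hadoop's core-site.xml and restart HDFS; here <username> is a placeholder for the account HiveServer2 runs as:

```xml
<!-- Allow <username> to impersonate users connecting from any host/group. -->
<property>
  <name>hadoop.proxyuser.<username>.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.<username>.groups</name>
  <value>*</value>
</property>
```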
HWI web UI bug:
HWI WAR file not found at
Build the WAR file yourself, copy it to the right place, then add the needed settings to hive-site.xml.
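The hive-site.xml settings in question are the HWI ones. The values below are examples, not defaults you can rely on: the WAR path and version must match the file you built, and the port can be whatever is free:

```xml
<property>
  <name>hive.hwi.war.file</name>
  <value>lib/hive-hwi-1.2.1.war</value>
</property>
<property>
  <name>hive.hwi.listen.host</name>
  <value>0.0.0.0</value>
</property>
<property>
  <name>hive.hwi.listen.port</name>
  <value>9999</value>
</property>
```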
Problem while building the WAR with Ant:
- failed to create task or type componentdef
- Or: Could not create task or type of type: componentdef
Install the missing build dependencies:
sudo apt-get install libjasperreports-java
sudo apt-get install ant
_________________________________________________________________________________not finished
Custom configuration:
Database connection client:
By default, the username is your login account and the password is empty.
![](https://i-blog.csdnimg.cn/blog_migrate/1ac788984823c1e722690b3fda2ac3b7.png)
Syntax
more info:
HBase
./bin/start-hbase.sh
HBase & Hive
![](https://i-blog.csdnimg.cn/blog_migrate/411ec4f73813a3b16b0c771d2a24ff88.jpeg)
![](https://i-blog.csdnimg.cn/blog_migrate/ba45ea6bc94c04455942e7152715b485.jpeg)
![](https://i-blog.csdnimg.cn/blog_migrate/b3ad43439b9baebc89bbc84a19f6926c.jpeg)
SparkSQL
The Spark SQL architecture is shown in the figure below:
Phoenix
- queryserver.py start
- jdbc:phoenix:thin:url=http://localhost:8765;serialization=PROTOBUF
Or:
phoenix-sqlline.py localhost:2181