Installing and configuring Flume on a Hadoop cluster
vi /etc/profile
#set flume
export FLUME_HOME=/opt/hadoop/flume-bin
export FLUME_CONF_DIR=$FLUME_HOME/conf
export PATH=$PATH:$FLUME_HOME/bin
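After saving /etc/profile, the variables only take effect in new login shells; running `source /etc/profile` (or re-logging in) applies them to the current session. A quick check that they resolve as intended (the exports below simply mirror the profile entries above, so the snippet is self-contained):

```shell
# Same exports as added to /etc/profile, applied to the current shell
export FLUME_HOME=/opt/hadoop/flume-bin
export FLUME_CONF_DIR=$FLUME_HOME/conf
export PATH=$PATH:$FLUME_HOME/bin
# Confirm the variables resolve to the expected paths
echo "$FLUME_HOME"
echo "$FLUME_CONF_DIR"
```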
sftp> put apache-flume-1.6.0-bin.tar.gz
sftp> put apache-flume-1.6.0-src.tar.gz
[hadoop@slavenode1 hadoop]$ pwd
/opt/hadoop
tar -zxvf apache-flume-1.6.0-src.tar.gz ; tar -zxvf apache-flume-1.6.0-bin.tar.gz
[hadoop@slavenode1 hadoop]$ mv apache-flume-1.6.0-bin flume-bin
[hadoop@slavenode1 flume-bin]$ cd conf/
[hadoop@slavenode1 conf]$ cp flume-env.sh.template flume-env.sh
[hadoop@slavenode1 conf]$ vi flume-env.sh
export JAVA_HOME=/usr/java/jdk1.7.0_79
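Besides JAVA_HOME, flume-env.sh is also the place for JVM options (the template ships with a commented JAVA_OPTS line). A hedged sketch — the heap sizes below are illustrative, not from the original setup, and should be tuned to the expected event volume:

```shell
# Illustrative additions to flume-env.sh; heap sizes are an assumption, adjust to your load
export JAVA_HOME=/usr/java/jdk1.7.0_79
export JAVA_OPTS="-Xms512m -Xmx1024m -Dcom.sun.management.jmxremote"
```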
[hadoop@slavenode8 conf]$ /opt/hadoop/flume-bin/bin/flume-ng version
Flume 1.6.0
Source code repository: https://git-wip-us.apache.org/repos/asf/flume.git
Revision: 2561a23240a71ba20bf288c7c2cda88f443c2080
Compiled by hshreedharan on Mon May 11 11:15:44 PDT 2015
From source with checksum b29e416802ce9ece3269d34233baf43f
6. Distribute to the other nodes (slavenode1-slavenode7)
[hadoop@slavenode8 hadoop]$ for i in {32,33,34,35,36,37,38};do scp -r flume-bin 192.168.237.2$i:/opt/hadoop/ ; done
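The brace list in the loop above enumerates the last octets 32-38 of 192.168.237.2xx; building the host list once makes it reusable for the copy and for later health checks. A small sketch (IP range taken from the command above):

```shell
# Build the slavenode1-slavenode7 target list (192.168.237.232-238)
nodes=""
for i in $(seq 32 38); do
  nodes="$nodes 192.168.237.2$i"
done
echo "$nodes"
# On the real cluster the same list can drive scp or ssh checks, e.g.:
# for n in $nodes; do ssh "$n" '/opt/hadoop/flume-bin/bin/flume-ng version'; done
```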
Create a new directory to hold the configuration files used for routine log collection.
[hadoop@slavenode4 flume-bin]$ mkdir /opt/hadoop/flume-bin/example
1) Single-node Flume writing directly to HDFS, monitoring a log file
[hadoop@slavenode4 example]$ cat flume_directHDFS.conf
# Define a memory channel called ch1 on agent1