Open the "susu" project in IDEA and modify part of "ThingViewController.java" as follows:
@Autowired
private IThingViewService thingViewService;
private Logger log4j = Logger.getLogger(ThingViewController.class);

@RequestMapping("findThingByType.do")
public ModelAndView findThing() {
    ModelAndView mv = new ModelAndView();
    // Query each category of goods by its type code (0-5)
    List<ThingView> fresh = thingViewService.findThingByType(0);
    List<ThingView> snacks = thingViewService.findThingByType(1);
    List<ThingView> liquor = thingViewService.findThingByType(2);
    List<ThingView> dairy = thingViewService.findThingByType(3);
    List<ThingView> water = thingViewService.findThingByType(4);
    List<ThingView> cereals = thingViewService.findThingByType(5);
    mv.addObject("fresh", fresh);
    mv.addObject("snacks", snacks);
    mv.addObject("liquor", liquor);
    mv.addObject("dairy", dairy);
    mv.addObject("water", water);
    mv.addObject("cereals", cereals);
    mv.setViewName("../main");
    return mv;
}

@RequestMapping("findThingById.do")
public ModelAndView findThingById(int thingId) {
    // This log line is what the Flume/Kafka/Storm pipeline parses later
    log4j.info("thing be searched:" + thingId);
    ThingView thing = thingViewService.findThingById(thingId);
    ModelAndView mv = new ModelAndView();
    mv.addObject("thingInfo", thing);
    mv.setViewName("details");
    return mv;
}
Run "susu", log in, and browse some goods at random; after stopping the run, a "log.log" file is generated in the "logs" folder on drive D.
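That file location comes from the project's log4j configuration, which is not shown above. A minimal log4j.properties sketch that would write the controller's INFO messages to that path (the appender name and conversion pattern here are assumptions; only the output path matters for the later steps):

```properties
# Assumed log4j 1.x configuration; the File path must match the D:\logs\log.log
# location that the following steps read from
log4j.rootLogger=INFO, file
log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.file.File=D:/logs/log.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d %p %c - %m%n
```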
Press "alt+p" to open an SFTP session and drag "log.log" into the "flumedata/" directory.
Start ZooKeeper:
zjgm01,zjgm02,zjgm03
zkServer.sh start
Start Kafka:
zjgm01,zjgm02,zjgm03
kafka-server-start.sh /home/hadoop/app/kafka_2.11-0.11.0.2/config/server.properties
Start Storm:
zjgm01
storm nimbus (if it fails to start, delete the files under hadoop/app/apache-storm-0.9.2-incubating/bin/storm-local)
zjgm02,zjgm03
storm supervisor (if it fails to start, delete the files under hadoop/app/apache-storm-0.9.2-incubating/bin/storm-local)
Start Flume:
zjgm01
flume-ng agent --conf conf --conf-file /home/hadoop/app/apache-flume-1.9.0-bin/conf/ks.conf --name a1
(ks.conf is your own conf file name)
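The contents of ks.conf are not shown here. A sketch of such an agent config, assuming a spooling-directory source that watches flumedata/ and a Kafka sink (the topic name and spool path are assumptions to adapt to your setup):

```properties
# Assumed Flume 1.9 agent: watch flumedata/ and publish each line to Kafka
a1.sources = r1
a1.channels = c1
a1.sinks = k1

a1.sources.r1.type = spooldir
a1.sources.r1.spoolDir = /home/hadoop/flumedata
a1.sources.r1.channels = c1

a1.channels.c1.type = memory

a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.kafka.bootstrap.servers = zjgm01:9092,zjgm02:9092,zjgm03:9092
a1.sinks.k1.kafka.topic = thinglog
a1.sinks.k1.channel = c1
```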
Open the "stormkafka" project in IDEA and modify part of "ReadBolt.java" as follows:
int index = s.indexOf("thing be searched:");
// "thing be searched:" is 18 characters long; the 5 characters after it
// are taken as the thing id, so check the line is long enough first
if (index != -1 && s.length() >= index + 23) {
    String s1 = s.substring(index + 18, index + 23);
    basicOutputCollector.emit(new Values(s1));
}
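The index arithmetic above assumes thing ids are exactly 5 characters wide. A self-contained sketch of the same extraction, runnable outside Storm (the sample log line is made up for illustration):

```java
public class ParseDemo {
    // Extracts the 5-character thing id that follows "thing be searched:",
    // or returns null if the marker is absent or the line is too short.
    static String extractId(String s) {
        int index = s.indexOf("thing be searched:");
        // "thing be searched:" is 18 characters long, so the id occupies
        // positions index+18 .. index+22 (5 characters)
        if (index != -1 && s.length() >= index + 23) {
            return s.substring(index + 18, index + 23);
        }
        return null;
    }

    public static void main(String[] args) {
        System.out.println(extractId("2024-05-01 10:00:00 INFO thing be searched:10001")); // prints 10001
    }
}
```

If your thing ids can have fewer or more digits, this fixed-width substring will mis-parse them; splitting on the marker and reading to the end of the token would be more robust.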
Run "CountToMain.java"; a generated file can be seen under "storm" on drive D.
Its contents are shown in the figure below.
Install "RedisDesktopManager":
Double-click the installer and click "Next" through to completion.
Open it and click the lower-left corner.
Create a new connection as shown below.
Open the "stormkafka" project in IDEA and modify "WriteCountBolt.java" as follows:
Jedis jedis = null;

@Override
public void prepare(Map stormConf, TopologyContext context) {
    // Connect to the local Redis server once, when the bolt starts
    jedis = new Jedis("127.0.0.1", 6379);
}

@Override
public void execute(Tuple tuple, BasicOutputCollector basicOutputCollector) {
    String s = tuple.getString(0);
    // Increment this thing id's counter in the "thingIds" hash
    jedis.hincrBy("thingIds", s, 1);
}

@Override
public void declareOutputFields(OutputFieldsDeclarer outputFieldsDeclarer) {
    // Last bolt in the topology; nothing is emitted downstream
}
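jedis.hincrBy maps to the Redis HINCRBY command: it atomically increments one field of a hash, treating a missing field as 0, and returns the new value. A local Map-based imitation of that behavior (an illustration only, not the Jedis client):

```java
import java.util.HashMap;
import java.util.Map;

public class HincrBySketch {
    // In-memory imitation of Redis HINCRBY: a missing field starts at 0,
    // and the field's new value is returned after the increment.
    static long hincrBy(Map<String, Long> hash, String field, long by) {
        return hash.merge(field, by, Long::sum);
    }

    public static void main(String[] args) {
        Map<String, Long> thingIds = new HashMap<>();
        hincrBy(thingIds, "10001", 1);
        hincrBy(thingIds, "10001", 1);
        hincrBy(thingIds, "10002", 1);
        // counts are now: 10001 -> 2, 10002 -> 1
        System.out.println(thingIds.get("10001"));
    }
}
```

Because the increment happens inside Redis, many bolt instances can count into the same "thingIds" hash concurrently without losing updates.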
After running, the counts can be seen in Redis.