Hands-on Project: Flink E-commerce Real-time Analysis (Part 3)

Contents

Redis Service

Testing the Local Application on the Server

Error Log

Summary


Redis Service

Set up a Redis service on the server. This project uses redis-5.0.8: download the .tar.gz package from the official Redis website, extract it with tar, and run make. If make succeeds (the redis-server and redis-cli binaries appear in the src directory), enter src and run make install. Then go back to the directory above src and edit redis.conf so that Redis allows remote access and runs as a background daemon; this makes it possible to inspect the instance from a local GUI client such as RedisDesktopManager.
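For reference, a minimal command sketch of those build steps is shown below. The download URL follows the standard redis.io release path; adjust the URL or version if your environment differs.

wget http://download.redis.io/releases/redis-5.0.8.tar.gz
tar -zxvf redis-5.0.8.tar.gz
cd redis-5.0.8
make                # redis-server and redis-cli should appear under src/ on success
cd src
make install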

Edit the configuration file (vim redis.conf):

# Run Redis as a background daemon
# daemonize no
daemonize yes

# Allow remote access: change the original settings
#   bind 127.0.0.1
#   protected-mode yes
# to
bind 0.0.0.0
protected-mode no

The test result is as follows.

Write a RedisUtil class and test it:

import redis.clients.jedis.Jedis;

public class RedisUtil {
    // Shared connection to the remote Redis instance
    public static final Jedis jedis = new Jedis("your-redis-server-ip", 6379);

    public static String getBykey(String key) {
        return jedis.get(key);
    }

    public static void main(String[] args) {
        // Round-trip test: write a key, then read it back
        jedis.set("test3", "test33");
        String value = jedis.get("test3");
        System.out.println(value);
    }
}

 

Integrate Flink with Redis and test:

RedisUtil.jedis.lpush("pingdaord:"+pindaoid,count+"");
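For context, here is a minimal sketch of where that call sits in the job: a custom Flink sink that pushes each channel's aggregated count onto a per-channel Redis list. The class name PindaoRedisSink and the (channel id, count) tuple layout are assumptions for illustration; only the lpush call itself comes from the project.

import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;

// Hypothetical sink: assumes the upstream operator emits (channel id, count) pairs.
public class PindaoRedisSink extends RichSinkFunction<Tuple2<String, Long>> {
    @Override
    public void invoke(Tuple2<String, Long> value, Context context) {
        String pindaoid = value.f0;   // channel id
        Long count = value.f1;        // aggregated count for that channel
        // Push the latest count onto a per-channel Redis list
        RedisUtil.jedis.lpush("pingdaord:" + pindaoid, count + "");
    }
}

In a real deployment the Redis connection would be better created per task in open() rather than shared through a static field, since a single Jedis instance is neither serializable nor thread-safe across parallel sink instances.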

 

Testing the Local Application on the Server

First download flink-1.7.2-bin-hadoop28-scala_2.11.tgz, extract it on the server, and start the standalone cluster with ./start-cluster.sh; this is the environment in which the Flink jobs run.

Once it is up, open http://<server-address>:8081/ to view the Flink web dashboard.

Because the dashboard on the server kept showing no available Task Slots, I could only get this far. The next step is to call the reporting service locally to generate data, have Flink on the server analyze it, and store the results in Redis.
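For anyone hitting the same symptom: in standalone mode the slot count comes from the TaskManager processes, so zero available slots usually means no TaskManager started or none managed to register with the JobManager. Below is a general troubleshooting sketch using the standard Flink 1.7 configuration keys and log locations; it is not a confirmed fix for this particular server.

# conf/flink-conf.yaml (Flink 1.7, standalone)
jobmanager.rpc.address: <server-address>   # must be reachable from every TaskManager
jobmanager.rpc.port: 6123
taskmanager.numberOfTaskSlots: 2           # slots offered by each TaskManager
# conf/slaves lists the TaskManager host(s), one per line

# Checks after ./start-cluster.sh:
#   jps                                    -> a TaskManagerRunner process should be running
#   log/flink-*-taskexecutor-*.log         -> look for errors registering with the JobManager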

Error Log

org.apache.flink.runtime.jobmanager.scheduler.NoResourceAvailableException: Could not allocate all requires slots within timeout of 300000 ms. Slots required: 2, slots allocated: 0
  at org.apache.flink.runtime.executiongraph.ExecutionGraph.lambda$scheduleEager$3(ExecutionGraph.java:991)
  at java.util.concurrent.CompletableFuture.uniExceptionally(CompletableFuture.java:870)
  at java.util.concurrent.CompletableFuture$UniExceptionally.tryFire(CompletableFuture.java:852)
  at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
  at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1977)
  at org.apache.flink.runtime.concurrent.FutureUtils$ResultConjunctFuture.handleCompletedFuture(FutureUtils.java:535)
  at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:760)
  at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:736)
  at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
  at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1977)
  at org.apache.flink.runtime.concurrent.FutureUtils$1.onComplete(FutureUtils.java:772)
  at akka.dispatch.OnComplete.internal(Future.scala:258)
  at akka.dispatch.OnComplete.internal(Future.scala:256)
  at akka.dispatch.japi$CallbackBridge.apply(Future.scala:186)
  at akka.dispatch.japi$CallbackBridge.apply(Future.scala:183)
  at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:36)
  at org.apache.flink.runtime.concurrent.Executors$DirectExecutionContext.execute(Executors.java:83)
  at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:44)
  at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:252)
  at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:603)
  at akka.actor.Scheduler$$anon$4.run(Scheduler.scala:126)
  at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:601)
  at scala.concurrent.BatchingExecutor$class.execute(BatchingExecutor.scala:109)
  at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:599)
  at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(LightArrayRevolverScheduler.scala:329)
  at akka.actor.LightArrayRevolverScheduler$$anon$4.executeBucket$1(LightArrayRevolverScheduler.scala:280)
  at akka.actor.LightArrayRevolverScheduler$$anon$4.nextTick(LightArrayRevolverScheduler.scala:284)
  at akka.actor.LightArrayRevolverScheduler$$anon$4.run(LightArrayRevolverScheduler.scala:236)
  at java.lang.Thread.run(Thread.java:748)
2020-05-03 22:51:40,205 INFO  org.apache.flink.runtime.executiongraph.ExecutionGraph        - Source: Custom Source -> Map (1/1) (f29f5bc1259a09aa314eb46231a295b5) switched from SCHEDULED to CANCELED.
2020-05-03 22:51:40,205 INFO  org.apache.flink.runtime.executiongraph.ExecutionGraph        - Window(GlobalWindows(), CountTrigger, CountEvictor, PindaoReduce, PassThroughWindowFunction) -> Sink: pdrdreduce (1/1) (f0258b8006ced3f5cdd787d454e8ddc8) switched from SCHEDULED to CANCELED.
2020-05-03 22:51:40,217 INFO  org.apache.flink.runtime.executiongraph.ExecutionGraph        - Try to restart or fail the job pindaoredian (a189c7f3d756b8745fb07c7124502dec) if no longer possible.
2020-05-03 22:51:40,217 INFO  org.apache.flink.runtime.executiongraph.ExecutionGraph        - Job pindaoredian (a189c7f3d756b8745fb07c7124502dec) switched from state FAILING to RESTARTING.
2020-05-03 22:51:40,217 INFO  org.apache.flink.runtime.executiongraph.ExecutionGraph        - Restarting the job pindaoredian (a189c7f3d756b8745fb07c7124502dec).
2020-05-03 22:51:50,218 INFO  org.apache.flink.runtime.executiongraph.ExecutionGraph        - Job pindaoredian (a189c7f3d756b8745fb07c7124502dec) switched from state RESTARTING to CREATED.
2020-05-03 22:51:50,219 INFO  org.apache.flink.runtime.executiongraph.ExecutionGraph        - Job pindaoredian (a189c7f3d756b8745fb07c7124502dec) switched from state CREATED to RUNNING.
2020-05-03 22:51:50,219 INFO  org.apache.flink.runtime.executiongraph.ExecutionGraph        - Source: Custom Source -> Map (1/1) (f1d8e3f1df3c25820190ba00bd1cbcbf) switched from CREATED to SCHEDULED.
2020-05-03 22:51:50,219 INFO  org.apache.flink.runtime.executiongraph.ExecutionGraph        - Window(GlobalWindows(), CountTrigger, CountEvictor, PindaoReduce, PassThroughWindowFunction) -> Sink: pdrdreduce (1/1) (1dd55c67f10b3cb2c0962f3bd2f20958) switched from CREATED to SCHEDULED.
2020-05-03 22:51:50,220 INFO  org.apache.flink.runtime.jobmaster.slotpool.SlotPool          - Requesting new slot [SlotRequestId{1f915c06c6ef004f5d2504065d517bc7}] and profile ResourceProfile{cpuCores=-1.0, heapMemoryInMB=-1, directMemoryInMB=0, nativeMemoryInMB=0, networkMemoryInMB=0} from resource manager.
2020-05-03 22:51:50,220 INFO  org.apache.flink.runtime.resourcemanager.StandaloneResourceManager  - Request slot with profile ResourceProfile{cpuCores=-1.0, heapMemoryInMB=-1, directMemoryInMB=0, nativeMemoryInMB=0, networkMemoryInMB=0} for job a189c7f3d756b8745fb07c7124502dec with allocation id AllocationID{da4e0ae01707e2cc6211e6f968ee163d}.
2020-05-03 22:51:53,036 INFO  org.apache.flink.runtime.checkpoint.CheckpointCoordinator     - Checkpoint triggering task Source: Custom Source -> Map (1/1) of job a189c7f3d756b8745fb07c7124502dec is not in state RUNNING but SCHEDULED instead. Aborting checkpoint.
2020-05-03 22:51:58,036 INFO  org.apache.flink.runtime.checkpoint.CheckpointCoordinator     - Checkpoint triggering task Source: Custom Source -> Map (1/1) of job a189c7f3d756b8745fb07c7124502dec is not in state RUNNING but SCHEDULED instead. Aborting checkpoint.
2020-05-03 22:52:03,036 INFO  org.apache.flink.runtime.checkpoint.CheckpointCoordinator     - Checkpoint triggering task Source: Custom Source -> Map (1/1) of job a189c7f3d756b8745fb07c7124502dec is not in state RUNNING but SCHEDULED instead. Aborting checkpoint.
2020-05-03 22:52:08,036 INFO  org.apache.flink.runtime.checkpoint.CheckpointCoordinator     - Checkpoint triggering task Source: Custom Source -> Map (1/1) of job a189c7f3d756b8745fb07c7124502dec is not in state RUNNING but SCHEDULED instead. Aborting checkpoint.
2020-05-03 22:52:13,036 INFO  org.apache.flink.runtime.checkpoint.CheckpointCoordinator     - Checkpoint triggering task Source: Custom Source -> Map (1/1) of job a189c7f3d756b8745fb07c7124502dec is not in state RUNNING but SCHEDULED instead. Aborting checkpoint.
2020-05-03 22:52:18,036 INFO  org.apache.flink.runtime.checkpoint.CheckpointCoordinator     - Checkpoint triggering task Source: Custom Source -> Map (1/1) of job a189c7f3d756b8745fb07c7124502dec is not in state RUNNING but SCHEDULED instead. Aborting checkpoint.
2020-05-03 22:52:23,036 INFO  org.apache.flink.runtime.checkpoint.CheckpointCoordinator     - Checkpoint triggering task Source: Custom Source -> Map (1/1) of job a189c7f3d756b8745fb07c7124502dec is not in state RUNNING but SCHEDULED instead. Aborting checkpoint.
2020-05-03 22:52:28,036 INFO  org.apache.flink.runtime.checkpoint.CheckpointCoordinator     - Checkpoint triggering task Source: Custom Source -> Map (1/1) of job a189c7f3d756b8745fb07c7124502dec is not in state RUNNING but SCHEDULED instead. Aborting checkpoint.
2020-05-03 22:52:33,036 INFO  org.apache.flink.runtime.checkpoint.CheckpointCoordinator     - Checkpoint triggering task Source: Custom Source -> Map (1/1) of job a189c7f3d756b8745fb07c7124502dec is not in state RUNNING but SCHEDULED instead. Aborting checkpoint.
2020-05-03 22:52:38,036 INFO  org.apache.flink.runtime.checkpoint.CheckpointCoordinator     - Checkpoint triggering task Source: Custom Source -> Map (1/1) of job a189c7f3d756b8745fb07c7124502dec is not in state RUNNING but SCHEDULED instead. Aborting checkpoint.
2020-05-03 22:52:43,036 INFO  org.apache.flink.runtime.checkpoint.CheckpointCoordinator     - Checkpoint triggering task Source: Custom Source -> Map (1/1) of job a189c7f3d756b8745fb07c7124502dec is not in state RUNNING but SCHEDULED instead. Aborting checkpoint.
2020-05-03 22:52:48,036 INFO  org.apache.flink.runtime.checkpoint.CheckpointCoordinator     - Checkpoint triggering task Source: Custom Source -> Map (1/1) of job a189c7f3d756b8745fb07c7124502dec is not in state RUNNING but SCHEDULED instead. Aborting checkpoint.
2020-05-03 22:52:53,036 INFO  org.apache.flink.runtime.checkpoint.CheckpointCoordinator     - Checkpoint triggering task Source: Custom Source -> Map (1/1) of job a189c7f3d756b8745fb07c7124502dec is not in state RUNNING but SCHEDULED instead. Aborting checkpoint.
2020-05-03 22:52:58,036 INFO  org.apache.flink.runtime.checkpoint.CheckpointCoordinator     - Checkpoint triggering task Source: Custom Source -> Map (1/1) of job a189c7f3d756b8745fb07c7124502dec is not in state RUNNING but SCHEDULED instead. Aborting checkpoint.
2020-05-03 22:53:03,036 INFO  org.apache.flink.runtime.checkpoint.CheckpointCoordinator     - Checkpoint triggering task Source: Custom Source -> Map (1/1) of job a189c7f3d756b8745fb07c7124502dec is not in state RUNNING but SCHEDULED instead. Aborting checkpoint.

Summary

Most of today went into getting the server environment working; next up is writing the front-end service. If anyone knows why the Task Slots count stays at 0, please message me privately or leave a comment.

The full code is in my Git repository; everything committed so far has been tested and works. I will keep updating it until the project is finished. For any unclear details, follow my official account and leave a message, and I will answer in detail.

Git repository: https://github.com/jyqjyq/filnkDS.git
