【Section 2】- Debugging and Submitting Flink Programs Locally in IDEA

1. Download the Flink tar package:
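A quick sketch of fetching and unpacking the distribution is shown below. The version (1.17.0) matches the paths used later in this article; the exact download URL is an assumption and may differ depending on the mirror you use.

# Download and extract the Flink binary distribution (URL/version assumed, not taken from the original text)
wget https://archive.apache.org/dist/flink/flink-1.17.0/flink-1.17.0-bin-scala_2.12.tgz
tar -xzf flink-1.17.0-bin-scala_2.12.tgz
cd flink-1.17.0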
After extracting the package, take a look at the bin/flink script:

#!/usr/bin/env bash
################################################################################
#  Licensed to the Apache Software Foundation (ASF) under one
#  or more contributor license agreements.  See the NOTICE file
#  distributed with this work for additional information
#  regarding copyright ownership.  The ASF licenses this file
#  to you under the Apache License, Version 2.0 (the
#  "License"); you may not use this file except in compliance
#  with the License.  You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
#  Unless required by applicable law or agreed to in writing, software
#  distributed under the License is distributed on an "AS IS" BASIS,
#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#  See the License for the specific language governing permissions and
# limitations under the License.
################################################################################
# $0 is the path used to invoke this script, i.e., the flink script itself
target="$0"
# For the case, the executable has been directly symlinked, figure out
# the correct bin path by following its symlink up to an upper bound.
# Note: we can't use the readlink utility here if we want to be POSIX
# compatible.
iteration=0

# If target is a symlink, list it with `ls -ld` and use a regex to extract the path the
# symlink points to. If that path is itself a symlink, keep resolving it one level at a time.
# If resolution takes more than 100 iterations (a cyclic symlink), break out of the loop;
# in that case sourcing config.sh below may later fail with a file-not-found error.
while [ -L "$target" ]; do
    if [ "$iteration" -gt 100 ]; then
        echo "Cannot resolve path: You have a cyclic symlink in $target."
        break
    fi
    ls=`ls -ld -- "$target"`
    target=`expr "$ls" : '.* -> \(.*\)$'`
    iteration=$((iteration + 1))
done
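
# Example with hypothetical paths: if /usr/local/bin/flink is a symlink to
# /opt/flink-1.17.0/bin/flink, then `ls -ld -- /usr/local/bin/flink` ends with
# "/usr/local/bin/flink -> /opt/flink-1.17.0/bin/flink", and the expr regex
# '.* -> \(.*\)$' extracts "/opt/flink-1.17.0/bin/flink" as the new $target.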

# Get the path of the bin directory
# Convert relative path to absolute path
bin=`dirname "$target"`

# Source config.sh; it sets many variables (e.g. FLINK_CONF_DIR, FLINK_LOG_DIR, JAVA_RUN, JVM_ARGS,
# INTERNAL_HADOOP_CLASSPATHS) and defines functions (e.g. constructFlinkClassPath, manglePathList)
# that are referenced later in this script
# get flink config
. "$bin"/config.sh

# If FLINK_IDENT_STRING is not set, default it to the Linux user running this script
if [ "$FLINK_IDENT_STRING" = "" ]; then
        FLINK_IDENT_STRING="$USER"
fi

# Call constructFlinkClassPath (defined in config.sh) to build the Flink classpath
CC_CLASSPATH=`constructFlinkClassPath`
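# Note (added here, not from the original script): constructFlinkClassPath typically joins the
# jars under the distribution's lib directory with ':' (placing the flink-dist jar last), so
# CC_CLASSPATH ends up covering everything shipped in lib/.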

# Define the client log file and the logging-related JVM system properties
log=$FLINK_LOG_DIR/flink-$FLINK_IDENT_STRING-client-$HOSTNAME.log
log_setting=(-Dlog.file="$log" -Dlog4j.configuration=file:"$FLINK_CONF_DIR"/log4j-cli.properties -Dlog4j.configurationFile=file:"$FLINK_CONF_DIR"/log4j-cli.properties -Dlogback.configurationFile=file:"$FLINK_CONF_DIR"/logback.xml)

# Client-specific JVM options; empty by default in this example
# Add Client-specific JVM options
FLINK_ENV_JAVA_OPTS="${FLINK_ENV_JAVA_OPTS} ${FLINK_ENV_JAVA_OPTS_CLI}"

# -classpath is set to the Flink classpath plus the Hadoop classpath
# The Java class being executed is org.apache.flink.client.cli.CliFrontend
# "$@" expands to all arguments passed to this script (everything after the script name itself)
# Add HADOOP_CLASSPATH to allow the usage of Hadoop file systems
exec "${JAVA_RUN}" $JVM_ARGS $FLINK_ENV_JAVA_OPTS "${log_setting[@]}" -classpath "`manglePathList "$CC_CLASSPATH:$INTERNAL_HADOOP_CLASSPATHS"`" org.apache.flink.client.cli.CliFrontend "$@"

So the entry point of the flink command is: org.apache.flink.client.cli.CliFrontend
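
To see what the exec line boils down to, here is a minimal sketch of launching the entry point by hand from the extracted distribution. It assumes the jars under lib/ (including flink-dist) provide the client classes and that FLINK_CONF_DIR is exported; the logging properties set by the script are omitted, so you may only get warnings about missing log configuration.

# Manual launch of the CLI entry point (a sketch; paths assume the 1.17.0 layout used in this article)
cd /Users/xxx/work_tools/flink-1.17.0
export FLINK_CONF_DIR="$PWD/conf"
java -classpath "lib/*" \
    org.apache.flink.client.cli.CliFrontend \
    run -t local examples/streaming/WordCount.jar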

2. Run CliFrontend locally in IDEA
Create a Run configuration for the CliFrontend class and set the program arguments:

run -t local /Users/xxx/work_tools/flink-1.17.0/examples/streaming/WordCount.jar

Configure the conf directory via an environment variable:

FLINK_CONF_DIR=/Users/xxx/work_tools/flink-1.17.0/conf
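
For comparison, this IDEA setup mirrors what the distribution's own script would do from a terminal; a sketch, assuming the same paths as above:

export FLINK_CONF_DIR=/Users/xxx/work_tools/flink-1.17.0/conf
/Users/xxx/work_tools/flink-1.17.0/bin/flink run -t local \
    /Users/xxx/work_tools/flink-1.17.0/examples/streaming/WordCount.jar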

Running it at this point fails with an error.
The corresponding project dependencies need to be added to the flink-clients module: select the required jars, right-click, and choose "Add as Library".
Run it again and check the execution result.
