Exception when uploading a file with hadoop fs -put test.txt /data/

This post covers a Hadoop cluster problem where every DataNode reports 0 disk usage. The cause is that the cluster was formatted more than once, leaving the DataNodes' metadata out of sync with the NameNode. The fix is to delete the data and logs folders, reformat the cluster and start it again, verify with hadoop dfsadmin -report, and then upload the file successfully with hadoop fs -put.


Check the cluster status:

hadoop dfsadmin -report

The report shows 0 disk usage on every DataNode.
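On Hadoop 2.x and 3.x the same report is also available as hdfs dfsadmin -report (hadoop dfsadmin is kept as a deprecated alias). As a quick check, the capacity and live-DataNode lines are enough to spot this failure; the grep filter here is just a convenience:

hdfs dfsadmin -report | grep -E "Configured Capacity|Live datanodes"
# A healthy cluster shows a non-zero Configured Capacity and the expected number of live DataNodes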

Cause of the error: the NameNode has most likely been formatted more than once, so the metadata held by the DataNodes no longer matches the NameNode and they fail to register with the cluster.
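One way to confirm the mismatch is to compare the clusterID recorded in the VERSION files: each format of the NameNode generates a new clusterID, while the DataNodes keep the old one and are then rejected when they try to register. The paths below are only examples; use the directories configured as dfs.namenode.name.dir and dfs.datanode.data.dir in hdfs-site.xml:

grep clusterID /opt/module/hadoop/data/dfs/name/current/VERSION   # on the NameNode (example path)
grep clusterID /opt/module/hadoop/data/dfs/data/current/VERSION   # on each DataNode (example path)
# If the two clusterID values differ, the DataNode cannot join the cluster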

Fix (a command sketch follows below):
1. On all three nodes of the Hadoop cluster, delete the data and logs folders.
2. Reformat the cluster (format the NameNode).
3. Start the cluster.
4. Run hadoop dfsadmin -report to check again.
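A minimal command sketch of these steps, assuming the data and logs folders sit directly under $HADOOP_HOME as in this post (note that deleting them wipes all existing HDFS data):

stop-dfs.sh                                    # stop HDFS first
rm -rf $HADOOP_HOME/data $HADOOP_HOME/logs     # run on every node
hdfs namenode -format                          # reformat, on the NameNode only
start-dfs.sh                                   # start HDFS again
hadoop dfsadmin -report                        # disk usage should no longer be 0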

Running the upload command again now succeeds:

hadoop fs -put test.txt /data/
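To confirm that the file actually landed in HDFS, list the target directory and print the file back:

hadoop fs -ls /data/
hadoop fs -cat /data/test.txt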

