Compiling Flink 1.13.5: Filling All the Pits

Preface

Problem: with the official Flink 1.13.5 release and Hadoop 2.7.2, JARs could not be submitted to the cluster for either standalone or YARN deployment.
Solution: compile Flink from source yourself.


I. Preparing the source

Download the source from the official website or from GitHub.
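
For example, via GitHub (a sketch; release-1.13.5 is the upstream release tag for this version):

git clone https://github.com/apache/flink.git
cd flink
git checkout release-1.13.5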

II. Modifying the pom

In the pom files, change the Hadoop and Hive versions to match the versions in your own environment.
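
For example (a sketch — the property names and their locations are from memory, so verify them in your checkout; the Hadoop version usually lives in the root pom.xml and the Hive version in the Hive connector's pom.xml):

<hadoop.version>2.7.2</hadoop.version>   <!-- match the cluster's Hadoop -->
<hive.version>x.y.z</hive.version>       <!-- placeholder: match your Hive -->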

III. Building

1. Build failure

Build command:
mvn clean install -DskipTests

[INFO] > fsevents@1.2.7 install /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/fsevents
[INFO] > node install
[INFO] 
[INFO] 
[INFO] > node-sass@4.11.0 install /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass
[INFO] > node scripts/install.js
[INFO] 
[ERROR] Unable to save binary /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass/vendor/linux-x64-64 : { Error: EACCES: permission denied, mkdir '/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass/vendor'
[ERROR]     at Object.mkdirSync (fs.js:729:3)
[ERROR]     at sync (/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/mkdirp/index.js:71:13)
[ERROR]     at Function.sync (/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/mkdirp/index.js:77:24)
[ERROR]     at checkAndDownloadBinary (/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass/scripts/install.js:114:11)
[ERROR]     at Object.<anonymous> (/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass/scripts/install.js:157:1)
[ERROR]     at Module._compile (internal/modules/cjs/loader.js:689:30)
[ERROR]     at Object.Module._extensions..js (internal/modules/cjs/loader.js:700:10)
[ERROR]     at Module.load (internal/modules/cjs/loader.js:599:32)
[ERROR]     at tryModuleLoad (internal/modules/cjs/loader.js:538:12)
[ERROR]     at Function.Module._load (internal/modules/cjs/loader.js:530:3)
[ERROR]   errno: -13,
[ERROR]   syscall: 'mkdir',
[ERROR]   code: 'EACCES',
[ERROR]   path:
[ERROR]    '/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass/vendor' }
[INFO] 
[INFO] > node-sass@4.11.0 postinstall /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass
[INFO] > node scripts/build.js
[INFO] 
[INFO] Building: /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node/node /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-gyp/bin/node-gyp.js rebuild --verbose --libsass_ext= --libsass_cflags= --libsass_ldflags= --libsass_library=
[ERROR] gyp info it worked if it ends with ok
[ERROR] gyp verb cli [ '/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node/node',
[ERROR] gyp verb cli   '/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-gyp/bin/node-gyp.js',
[ERROR] gyp verb cli   'rebuild',
[ERROR] gyp verb cli   '--verbose',
[ERROR] gyp verb cli   '--libsass_ext=',
[ERROR] gyp verb cli   '--libsass_cflags=',
[ERROR] gyp verb cli   '--libsass_ldflags=',
[ERROR] gyp verb cli   '--libsass_library=' ]
[ERROR] gyp info using node-gyp@3.8.0
[ERROR] gyp info using node@10.9.0 | linux | x64
[ERROR] gyp verb command rebuild []
[ERROR] gyp verb command clean []
[ERROR] gyp verb clean removing "build" directory
[ERROR] gyp verb command configure []
[ERROR] gyp verb check python checking for Python executable "python2" in the PATH
[ERROR] gyp verb `which` succeeded python2 /usr/bin/python2
[ERROR] gyp verb check python version `/usr/bin/python2 -c "import sys; print "2.7.5
[ERROR] gyp verb check python version .%s.%s" % sys.version_info[:3];"` returned: %j
[ERROR] gyp verb get node dir no --target version specified, falling back to host node version: 10.9.0
[ERROR] gyp verb command install [ '10.9.0' ]
[ERROR] gyp verb install input version string "10.9.0"
[ERROR] gyp verb install installing version: 10.9.0
[ERROR] gyp verb install --ensure was passed, so won't reinstall if already installed
[ERROR] gyp verb install version not already installed, continuing with install 10.9.0
[ERROR] gyp verb ensuring nodedir is created /root/.node-gyp/10.9.0
[ERROR] gyp WARN EACCES user "root" does not have permission to access the dev dir "/root/.node-gyp/10.9.0"
[ERROR] gyp WARN EACCES attempting to reinstall using temporary dev dir "/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass/.node-gyp"
[ERROR] gyp verb tmpdir == cwd automatically will remove dev files after to save disk space
[ERROR] gyp verb command install [ '--node_gyp_internal_noretry', '10.9.0' ]
[ERROR] gyp verb install input version string "10.9.0"
[ERROR] gyp verb install installing version: 10.9.0
[ERROR] gyp verb install --ensure was passed, so won't reinstall if already installed
[ERROR] gyp verb install version not already installed, continuing with install 10.9.0
[ERROR] gyp verb ensuring nodedir is created /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass/.node-gyp/10.9.0
[ERROR] gyp WARN install got an error, rolling back install
[ERROR] gyp verb command remove [ '10.9.0' ]
[ERROR] gyp verb remove using node-gyp dir: /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass/.node-gyp
[ERROR] gyp verb remove removing target version: 10.9.0
[ERROR] gyp verb remove removing development files for version: 10.9.0
[ERROR] gyp WARN install got an error, rolling back install
[ERROR] gyp verb command remove [ '10.9.0' ]
[ERROR] gyp verb remove using node-gyp dir: /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass/.node-gyp
[ERROR] gyp verb remove removing target version: 10.9.0
[ERROR] gyp verb remove removing development files for version: 10.9.0
[ERROR] gyp ERR! configure error 
[ERROR] gyp ERR! stack Error: EACCES: permission denied, mkdir '/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass/.node-gyp'
[ERROR] gyp ERR! System Linux 3.10.0-1160.49.1.el7.x86_64
[ERROR] gyp ERR! command "/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node/node" "/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-gyp/bin/node-gyp.js" "rebuild" "--verbose" "--libsass_ext=" "--libsass_cflags=" "--libsass_ldflags=" "--libsass_library="
[ERROR] gyp ERR! cwd /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass
[ERROR] gyp ERR! node -v v10.9.0
[ERROR] gyp ERR! node-gyp -v v3.8.0
[ERROR] gyp ERR! not ok 
[ERROR] Build failed with error code: 1
[INFO] 
[INFO] > husky@1.3.1 install /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/husky
[INFO] > node husky install
[INFO] 
[INFO] husky > setting up git hooks
[INFO] HUSKY_SKIP_INSTALL environment variable is set to 'true', skipping Git hooks installation.
[ERROR] added 1250 packages in 16.206s
[INFO] 
[INFO] --- frontend-maven-plugin:1.6:npm (npm run build) @ flink-runtime-web_2.11 ---
[INFO] Running 'npm run build' in /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard
[INFO] 
[INFO] > flink-dashboard@2.0.0 build /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard
[INFO] > ng build --prod --base-href ./
[INFO] 
[ERROR] Browserslist: caniuse-lite is outdated. Please run next command `npm update`
Killed
Updating caniuse-lite and browserslist by hand with the system npm (node v6.14.2, npm v3.10.10) did not help either:

[root@hadoop103 flink-1.13.5]# npm update caniuse-lite browserslist
[root@hadoop103 flink-1.13.5]# npm i caniuse-lite browserslist -S
npm ERR! fetch failed https://registry.npmjs.org/browserslist/-/browserslist-4.20.2.tgz
npm WARN retry will retry, error on last attempt: Error: Parse Error
npm ERR! fetch failed https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001325.tgz
npm WARN retry will retry, error on last attempt: Error: Parse Error
npm ERR! fetch failed https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001325.tgz
npm WARN retry will retry, error on last attempt: Error: Parse Error
npm ERR! fetch failed https://registry.npmjs.org/browserslist/-/browserslist-4.20.2.tgz
npm WARN retry will retry, error on last attempt: Error: Parse Error
npm ERR! fetch failed https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001325.tgz
npm ERR! fetch failed https://registry.npmjs.org/browserslist/-/browserslist-4.20.2.tgz
npm ERR! Linux 3.10.0-1160.49.1.el7.x86_64
npm ERR! argv "/usr/bin/node" "/usr/bin/npm" "i" "caniuse-lite" "browserslist" "-S"
npm ERR! node v6.14.2
npm ERR! npm  v3.10.10
npm ERR! code HPE_UNEXPECTED_CONTENT_LENGTH

npm ERR! Parse Error
npm ERR! 
npm ERR! If you need help, you may report this error at:
npm ERR!     <https://github.com/npm/npm/issues>

npm ERR! Please include the following file with any support request:
npm ERR!     /root/opensource/flink-1.13.5/npm-debug.log

2. Filling the pits

2.1 Pit 1: flink-runtime-web cannot reach the npm registry

cd xxxx/flink
mvn clean install -DskipTests -Dfast -T 4 -Dmaven.compile.fork=true  -Dscala-2.11

Breakdown:
mvn clean install \
  -DskipTests \               # skip the tests
  -Dfast \                    # skip doc checks and other QA steps
  -T 4 \                      # use 4 build threads on multi-core machines; Maven 3.3+ recommended
  -Dmaven.compile.fork=true   # compile in forked processes for multi-threaded compilation; Maven 3.3+ recommended

(1) The flink-runtime-web module has trouble downloading npm packages from the public registry, so edit that module's pom file before running the build command: search for "ci --cache-max=0 --no-save" and replace it with
install -registry=https://registry.npm.taobao.org --cache-max=0 --no-save
i.e., run npm install against the taobao mirror instead of npm ci.

Author: FishMAN__
Link: https://www.jianshu.com/p/66a3cf379042
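
After the change, the relevant frontend-maven-plugin execution in flink-runtime-web/pom.xml looks roughly like this (a sketch from memory — the execution id and surrounding XML are assumptions, so check your checkout):

<execution>
  <id>npm install</id>
  <goals>
    <goal>npm</goal>
  </goals>
  <configuration>
    <!-- was: <arguments>ci --cache-max=0 --no-save</arguments> -->
    <arguments>install -registry=https://registry.npm.taobao.org --cache-max=0 --no-save</arguments>
  </configuration>
</execution>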

Recompile:

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Flink : Table : Planner 1.13.5:
[INFO] 
[INFO] Flink : Table : Planner ............................ SUCCESS [02:49 min]
[INFO] Flink : Formats : .................................. SUCCESS [  4.462 s]
[INFO] Flink : Format : Common ............................ SUCCESS [  0.279 s]
[INFO] Flink : Table : SQL Parser Hive .................... SUCCESS [ 16.250 s]
[INFO] Flink : Table : Runtime Blink ...................... SUCCESS [ 21.968 s]
[INFO] Flink : Table : Planner Blink ...................... FAILURE [ 58.828 s]
[INFO] Flink : Formats : Json ............................. SKIPPED
[INFO] Flink : Connectors : Elasticsearch base ............ SKIPPED
[INFO] Flink : Connectors : Elasticsearch 5 ............... SKIPPED
[INFO] Flink : Connectors : Elasticsearch 6 ............... SKIPPED
[INFO] Flink : Connectors : Elasticsearch 7 ............... SKIPPED
[INFO] Flink : Connectors : HBase base .................... SUCCESS [  7.488 s]
[INFO] Flink : Connectors : HBase 1.4 ..................... SKIPPED
[INFO] Flink : Connectors : HBase 2.2 ..................... SKIPPED
[INFO] Flink : Formats : Hadoop bulk ...................... SUCCESS [  1.868 s]
[INFO] Flink : Formats : Orc .............................. SKIPPED
[INFO] Flink : Formats : Orc nohive ....................... SKIPPED
[INFO] Flink : Formats : Avro ............................. SKIPPED
[INFO] Flink : Formats : Parquet .......................... SKIPPED
[INFO] Flink : Formats : Csv .............................. SKIPPED
[INFO] Flink : Connectors : Hive .......................... SKIPPED
[INFO] Flink : Connectors : JDBC .......................... SKIPPED
[INFO] Flink : Connectors : RabbitMQ ...................... SUCCESS [  2.197 s]
[INFO] Flink : Connectors : Twitter ....................... SUCCESS [  6.866 s]
[INFO] Flink : Connectors : Nifi .......................... SUCCESS [  2.562 s]
[INFO] Flink : Connectors : Cassandra ..................... SKIPPED
[INFO] Flink : Metrics : JMX .............................. SUCCESS [  1.353 s]
[INFO] Flink : Formats : Avro confluent registry .......... SKIPPED
[INFO] Flink : Connectors : Kafka ......................... SKIPPED
[INFO] Flink : Connectors : Google PubSub ................. SUCCESS [  4.299 s]
[INFO] Flink : Connectors : Kinesis ....................... SKIPPED
[INFO] Flink : Connectors : SQL : Elasticsearch 6 ......... SKIPPED
[INFO] Flink : Connectors : SQL : Elasticsearch 7 ......... SKIPPED
[INFO] Flink : Connectors : SQL : HBase 1.4 ............... SKIPPED
[INFO] Flink : Connectors : SQL : HBase 2.2 ............... SKIPPED
[INFO] Flink : Connectors : SQL : Hive 1.2.2 .............. SKIPPED
[INFO] Flink : Connectors : SQL : Hive 2.2.0 .............. SKIPPED
[INFO] Flink : Connectors : SQL : Hive 2.3.6 .............. SKIPPED
[INFO] Flink : Connectors : SQL : Hive 3.1.2 .............. SKIPPED
[INFO] Flink : Connectors : SQL : Kafka ................... SKIPPED
[INFO] Flink : Connectors : SQL : Kinesis ................. SKIPPED
[INFO] Flink : Formats : Sequence file .................... SUCCESS [  1.608 s]
[INFO] Flink : Formats : Compress ......................... SUCCESS [  1.434 s]
[INFO] Flink : Formats : Avro AWS Glue Schema Registry .... SKIPPED
[INFO] Flink : Formats : SQL Orc .......................... SKIPPED
[INFO] Flink : Formats : SQL Parquet ...................... SKIPPED
[INFO] Flink : Formats : SQL Avro ......................... SKIPPED
[INFO] Flink : Formats : SQL Avro Confluent Registry ...... SKIPPED
[INFO] Flink : Examples : Streaming ....................... SKIPPED
[INFO] Flink : Examples : Table ........................... SKIPPED
[INFO] Flink : Examples : Build Helper : .................. SUCCESS [  0.401 s]
[INFO] Flink : Examples : Build Helper : Streaming Twitter  SKIPPED
[INFO] Flink : Examples : Build Helper : Streaming State machine SKIPPED
[INFO] Flink : Examples : Build Helper : Streaming Google PubSub SKIPPED
[INFO] Flink : Container .................................. SUCCESS [  1.244 s]
[INFO] Flink : Queryable state : Runtime .................. SUCCESS [  1.906 s]
[INFO] Flink : Mesos ...................................... SUCCESS [ 57.883 s]
[INFO] Flink : Kubernetes ................................. SUCCESS [ 19.016 s]
[INFO] Flink : Yarn ....................................... SUCCESS [  6.343 s]
[INFO] Flink : Libraries : Gelly .......................... SUCCESS [  8.045 s]
[INFO] Flink : Libraries : Gelly scala .................... SUCCESS [ 42.796 s]
[INFO] Flink : Libraries : Gelly Examples ................. SKIPPED
[INFO] Flink : External resources : ....................... SUCCESS [  0.458 s]
[INFO] Flink : External resources : GPU ................... SUCCESS [  0.462 s]
[INFO] Flink : Metrics : Dropwizard ....................... SUCCESS [  1.061 s]
[INFO] Flink : Metrics : Graphite ......................... SUCCESS [  0.773 s]
[INFO] Flink : Metrics : InfluxDB ......................... SUCCESS [  3.366 s]
[INFO] Flink : Metrics : Prometheus ....................... SUCCESS [  1.748 s]
[INFO] Flink : Metrics : StatsD ........................... SUCCESS [  0.825 s]
[INFO] Flink : Metrics : Datadog .......................... SUCCESS [  1.147 s]
[INFO] Flink : Metrics : Slf4j ............................ SUCCESS [  0.882 s]
[INFO] Flink : Libraries : CEP Scala ...................... SUCCESS [ 34.466 s]
[INFO] Flink : Table : Uber ............................... SKIPPED
[INFO] Flink : Table : Uber Blink ......................... SKIPPED
[INFO] Flink : Python ..................................... SKIPPED
[INFO] Flink : Table : SQL Client ......................... SKIPPED
[INFO] Flink : Libraries : State processor API ............ SUCCESS [  3.106 s]
[INFO] Flink : Dist ....................................... SKIPPED
[INFO] Flink : Yarn Tests ................................. SKIPPED
[INFO] Flink : E2E Tests : ................................ SKIPPED
[INFO] Flink : E2E Tests : CLI ............................ SKIPPED
[INFO] Flink : E2E Tests : Parent Child classloading program SKIPPED
[INFO] Flink : E2E Tests : Parent Child classloading lib-package SKIPPED
[INFO] Flink : E2E Tests : Dataset allround ............... SKIPPED
[INFO] Flink : E2E Tests : Dataset Fine-grained recovery .. SKIPPED
[INFO] Flink : E2E Tests : Datastream allround ............ SKIPPED
[INFO] Flink : E2E Tests : Batch SQL ...................... SKIPPED
[INFO] Flink : E2E Tests : Stream SQL ..................... SKIPPED
[INFO] Flink : E2E Tests : Distributed cache via blob ..... SKIPPED
[INFO] Flink : E2E Tests : High parallelism iterations .... SKIPPED
[INFO] Flink : E2E Tests : Stream stateful job upgrade .... SKIPPED
[INFO] Flink : E2E Tests : Queryable state ................ SKIPPED
[INFO] Flink : E2E Tests : Local recovery and allocation .. SKIPPED
[INFO] Flink : E2E Tests : Elasticsearch 5 ................ SKIPPED
[INFO] Flink : E2E Tests : Elasticsearch 6 ................ SKIPPED
[INFO] Flink : Quickstart : ............................... SUCCESS [  1.541 s]
[INFO] Flink : Quickstart : Java .......................... SUCCESS [  2.873 s]
[INFO] Flink : Quickstart : Scala ......................... SUCCESS [  0.258 s]
[INFO] Flink : E2E Tests : Quickstart ..................... SKIPPED
[INFO] Flink : E2E Tests : Confluent schema registry ...... SKIPPED
[INFO] Flink : E2E Tests : Stream state TTL ............... SKIPPED
[INFO] Flink : E2E Tests : SQL client ..................... SKIPPED
[INFO] Flink : E2E Tests : File sink ...................... SKIPPED
[INFO] Flink : E2E Tests : State evolution ................ SKIPPED
[INFO] Flink : E2E Tests : RocksDB state memory control ... SKIPPED
[INFO] Flink : E2E Tests : Common ......................... SKIPPED
[INFO] Flink : E2E Tests : Metrics availability ........... SKIPPED
[INFO] Flink : E2E Tests : Metrics reporter prometheus .... SKIPPED
[INFO] Flink : E2E Tests : Heavy deployment ............... SKIPPED
[INFO] Flink : E2E Tests : Connectors : Google PubSub ..... SKIPPED
[INFO] Flink : E2E Tests : Streaming Kafka base ........... SKIPPED
[INFO] Flink : E2E Tests : Streaming Kafka ................ SKIPPED
[INFO] Flink : E2E Tests : Plugins : ...................... SKIPPED
[INFO] Flink : E2E Tests : Plugins : Dummy fs ............. SKIPPED
[INFO] Flink : E2E Tests : Plugins : Another dummy fs ..... SKIPPED
[INFO] Flink : E2E Tests : TPCH ........................... SKIPPED
[INFO] Flink : E2E Tests : Streaming Kinesis .............. SKIPPED
[INFO] Flink : E2E Tests : Elasticsearch 7 ................ SKIPPED
[INFO] Flink : E2E Tests : Common Kafka ................... SKIPPED
[INFO] Flink : E2E Tests : TPCDS .......................... SKIPPED
[INFO] Flink : E2E Tests : Netty shuffle memory control ... SKIPPED
[INFO] Flink : E2E Tests : Python ......................... SKIPPED
[INFO] Flink : E2E Tests : HBase .......................... SKIPPED
[INFO] Flink : E2E Tests : AWS Glue Schema Registry ....... SKIPPED
[INFO] Flink : State backends : Heap spillable ............ SUCCESS [  1.198 s]
[INFO] Flink : Contrib : .................................. SUCCESS [  0.504 s]
[INFO] Flink : Contrib : Connectors : Wikiedits ........... SUCCESS [  1.566 s]
[INFO] Flink : FileSystems : Tests ........................ SKIPPED
[INFO] Flink : Docs ....................................... SKIPPED
[INFO] Flink : Walkthrough : .............................. SUCCESS [  0.476 s]
[INFO] Flink : Walkthrough : Common ....................... SUCCESS [  1.539 s]
[INFO] Flink : Walkthrough : Datastream Java .............. SUCCESS [  0.261 s]
[INFO] Flink : Walkthrough : Datastream Scala ............. SUCCESS [  0.295 s]
[INFO] Flink : Tools : CI : Java .......................... SUCCESS [  1.627 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  02:52 min (Wall Clock)
[INFO] Finished at: 2022-04-07T11:19:11+08:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project flink-table-planner-blink_2.12: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 137 (Exit value: 137) -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project flink-table-planner-blink_2.12: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 137 (Exit value: 137)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:215)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:196)
    at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:186)
    at java.util.concurrent.FutureTask.run (FutureTask.java:266)
    at java.util.concurrent.Executors$RunnableAdapter.call (Executors.java:511)
    at java.util.concurrent.FutureTask.run (FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:624)
    at java.lang.Thread.run (Thread.java:748)
Caused by: org.apache.maven.plugin.MojoExecutionException: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 137 (Exit value: 137)
    at scala_maven.ScalaMojoSupport.execute (ScalaMojoSupport.java:490)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:137)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:210)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:196)
    at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:186)
    at java.util.concurrent.FutureTask.run (FutureTask.java:266)
    at java.util.concurrent.Executors$RunnableAdapter.call (Executors.java:511)
    at java.util.concurrent.FutureTask.run (FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:624)
    at java.lang.Thread.run (Thread.java:748)
Caused by: org.apache.commons.exec.ExecuteException: Process exited with an error: 137 (Exit value: 137)
    at org.apache.commons.exec.DefaultExecutor.executeInternal (DefaultExecutor.java:377)
    at org.apache.commons.exec.DefaultExecutor.execute (DefaultExecutor.java:160)
    at org.apache.commons.exec.DefaultExecutor.execute (DefaultExecutor.java:147)
    at scala_maven_executions.JavaMainCallerByFork.run (JavaMainCallerByFork.java:100)
    at scala_maven.ScalaCompilerSupport.compile (ScalaCompilerSupport.java:161)
    at scala_maven.ScalaCompilerSupport.doExecute (ScalaCompilerSupport.java:99)
    at scala_maven.ScalaMojoSupport.execute (ScalaMojoSupport.java:482)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:137)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:210)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:196)
    at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:186)
    at java.util.concurrent.FutureTask.run (FutureTask.java:266)
    at java.util.concurrent.Executors$RunnableAdapter.call (Executors.java:511)
    at java.util.concurrent.FutureTask.run (FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:624)
    at java.lang.Thread.run (Thread.java:748)
[ERROR] 
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <args> -rf :flink-table-planner-blink_2.12
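
A side note on this failure mode: exit code 137 is 128 + 9, i.e. the forked compiler was killed with SIGKILL — on a memory-starved machine that is almost always the Linux OOM killer, and the bare "Killed" during the earlier ng build is the same symptom. The log does not show how it was cleared here; the usual mitigations, as a sketch for a bash shell (heap sizes are illustrative, not tuned values):

export MAVEN_OPTS="-Xmx2g -XX:MaxMetaspaceSize=512m"                  # more headroom for the Maven JVM
mvn clean install -DskipTests -Dfast -T 1 -Dmaven.compile.fork=true   # fewer parallel module builds, so fewer compilers compete for memory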

2.2 Pit 2: the repository has no confluent.version newer than 5.3.0

[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  28.374 s (Wall Clock)
[INFO] Finished at: 2022-04-07T11:34:06+08:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project flink-avro-confluent-registry: Could not resolve dependencies for project org.apache.flink:flink-avro-confluent-registry:jar:1.13.5: io.confluent:kafka-schema-registry-client:jar:5.5.2 was not found in http://maven.aliyun.com/nexus/content/groups/public during a previous attempt. This failure was cached in the local repository and resolution is not reattempted until the update interval of nexus-aliyun has elapsed or updates are forced -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:

Cause: searching the Maven repository shows that only version 5.3.0 is available there.
Fix: change the version in the pom:

[root@hadoop103 flink-formats]# vim flink-avro-confluent-registry/pom.xml
<confluent.version>5.3.0</confluent.version>
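
If you would rather keep the original confluent.version, an alternative is to stop the aliyun mirror from intercepting the Confluent repository, so Maven can fetch the artifact from https://packages.confluent.io/maven/ instead. A sketch for settings.xml, assuming the repository declared in the module's pom has the id "confluent":

<mirror>
  <id>nexus-aliyun</id>
  <mirrorOf>*,!confluent</mirrorOf>   <!-- mirror everything except the confluent repo -->
  <url>http://maven.aliyun.com/nexus/content/groups/public</url>
</mirror>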

2.3 Pit 3: resuming the build from the module that failed

-rf: name the failed module, and the reactor resumes from that point instead of rebuilding everything from scratch.

Build command:

[root@hadoop103 flink-1.13.5]# mvn clean install -DskipTests -Dfast -T 4 -Dmaven.compile.fork=true -Dscala-2.12 -rf :flink-avro-confluent-registry

[INFO] Flink : E2E Tests : Connectors : Google PubSub ..... SUCCESS [ 21.321 s]
[INFO] Flink : E2E Tests : Streaming Kafka base ........... SUCCESS [  0.210 s]
[INFO] Flink : E2E Tests : Streaming Kafka ................ SUCCESS [  7.944 s]
[INFO] Flink : E2E Tests : Plugins : ...................... SUCCESS [  0.113 s]
[INFO] Flink : E2E Tests : Plugins : Dummy fs ............. SUCCESS [  0.095 s]
[INFO] Flink : E2E Tests : Plugins : Another dummy fs ..... SUCCESS [  0.148 s]
[INFO] Flink : E2E Tests : TPCH ........................... SUCCESS [  1.719 s]
[INFO] Flink : E2E Tests : Streaming Kinesis .............. SUCCESS [ 19.961 s]
[INFO] Flink : E2E Tests : Elasticsearch 7 ................ SUCCESS [  4.154 s]
[INFO] Flink : E2E Tests : Common Kafka ................... SUCCESS [ 56.202 s]
[INFO] Flink : E2E Tests : TPCDS .......................... SUCCESS [  1.079 s]
[INFO] Flink : E2E Tests : Netty shuffle memory control ... SUCCESS [  0.152 s]
[INFO] Flink : E2E Tests : Python ......................... SUCCESS [  8.451 s]
[INFO] Flink : E2E Tests : HBase .......................... SUCCESS [  2.208 s]
[INFO] Flink : E2E Tests : AWS Glue Schema Registry ....... SUCCESS [ 23.854 s]
[INFO] Flink : State backends : Heap spillable ............ SUCCESS [  0.517 s]
[INFO] Flink : Contrib : .................................. SUCCESS [  0.108 s]
[INFO] Flink : Contrib : Connectors : Wikiedits ........... SUCCESS [  0.858 s]
[INFO] Flink : FileSystems : Tests ........................ SUCCESS [  2.662 s]
[INFO] Flink : Docs ....................................... SUCCESS [  3.249 s]
[INFO] Flink : Walkthrough : .............................. SUCCESS [  0.423 s]
[INFO] Flink : Walkthrough : Common ....................... SUCCESS [  0.433 s]
[INFO] Flink : Walkthrough : Datastream Java .............. SUCCESS [  0.133 s]
[INFO] Flink : Walkthrough : Datastream Scala ............. SUCCESS [  0.222 s]
[INFO] Flink : Tools : CI : Java .......................... SUCCESS [  2.507 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  07:46 min (Wall Clock)
[INFO] Finished at: 2022-04-07T11:59:03+08:00
[INFO] ------------------------------------------------------------------------

The finished distribution ends up in the build-target directory (a symlink into flink-dist/target):

/root/opensource/flink-1.13.5/build-target

Summary

The build succeeded, and after testing, the cluster now accepts jobs submitted from the client.
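
A quick smoke test of the rebuilt distribution (a sketch, run from build-target; the YARN line assumes HADOOP_CLASSPATH has been exported, e.g. export HADOOP_CLASSPATH=$(hadoop classpath)):

./bin/start-cluster.sh                                             # standalone cluster
./bin/flink run examples/streaming/WordCount.jar                   # submit the bundled example job
./bin/flink run -t yarn-per-job examples/streaming/WordCount.jar   # per-job YARN deployment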
