Ways to run Flink on Linux

Plain jar: the entry point is an ordinary main class that monitors Kafka but internally creates a Flink batch environment; run it as a regular JVM process:

# JVM flags conventionally go before -jar; -Xms is the initial heap, -Xmx the maximum
nohup java -Xms2048m -Xmx8192m -jar k4.jar &

Flink jar: the main class is itself the Flink entry point for the Kafka-monitoring job; submit it through the Flink client:

# -m yarn-cluster  submit as a per-job YARN application
# -p 3             default parallelism
# -c               fully qualified main class
# -yjm / -ytm      JobManager / TaskManager memory
bin/flink run -m yarn-cluster -p 3 -c flinkonkafka -yjm 2048m -ytm 8192m /test/flink_test.jar
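For the second mode, a minimal sketch of what such a Flink entry class might look like (the topic, broker address, and group id below are placeholders, not values from this article; the object name only mirrors the `mainClass` configured in the pom further down):

```scala
// Hypothetical sketch of a Kafka-monitoring Flink entry point.
// bootstrap.servers, group.id, and the topic name are placeholders.
import java.util.Properties
import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011

object KafkaConsumer {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    val props = new Properties()
    props.setProperty("bootstrap.servers", "localhost:9092") // placeholder
    props.setProperty("group.id", "flink-test")              // placeholder

    // flink-connector-kafka-0.11 (declared in the pom) provides FlinkKafkaConsumer011
    val source = new FlinkKafkaConsumer011[String](
      "test-topic", new SimpleStringSchema(), props)

    env.addSource(source).print()
    env.execute("flink on kafka")
  }
}
```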

 

Error: Flink Could not resolve substitution to a value: ${akka.stream.materializer}

Symptom:

Exception in thread "main" com.typesafe.config.ConfigException$UnresolvedSubstitution: reference.conf @ jar:file:/bigdata/app/flink-1.0-SNAPSHOT-jar-with-dependencies.jar!/reference.conf: 804: Could not resolve substitution to a value: ${akka.stream.materializer}

Cause and fix: several bundled dependencies (Akka in particular) each ship their own reference.conf; by default the shade plugin keeps only one copy, so Akka's substitution definitions are lost. Merge the files instead by adding an AppendingTransformer to the shade plugin configuration in the pom:

<transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
     <resource>reference.conf</resource>
</transformer>
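For placement, the transformer sits inside the shade plugin's <transformers> block, e.g. (version numbers and the rest of the plugin configuration elided here):

```xml
<!-- Sketch: where the AppendingTransformer belongs in the shade plugin -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <transformers>
          <!-- concatenates every reference.conf instead of keeping just one -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>reference.conf</resource>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
```

The project's complete pom.xml follows for reference.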
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.yss.datamiddle</groupId>
  <artifactId>test-scala</artifactId>
  <version>1.0-SNAPSHOT</version>
  <inceptionYear>2008</inceptionYear>

  <properties>
    <scala.version>2.12.6</scala.version>
    <scala.artifactId>2.12</scala.artifactId>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <hadoop.version>2.7.0</hadoop.version>
    <spark.version>2.1.0</spark.version>
    <flink.version>1.10.0</flink.version>
    <hbase.version>2.2.1</hbase.version>
    <flink.jdbc.version>1.8.0</flink.jdbc.version>
    <dynamic.datasource.version>3.0.0</dynamic.datasource.version>
    <flink.streaming.scala.version>1.10.0</flink.streaming.scala.version>
    <breeze.version>1.0</breeze.version>
    <commons.math.version>2.2</commons.math.version>
    <oracle.version>12.1.0.2</oracle.version>
    <scala.binary.version>2.12</scala.binary.version>
    <druid.version>1.1.22</druid.version>
    <json.smart.version>2.2.1</json.smart.version>
    <scala.logging.version>3.5.0</scala.logging.version>
    <logback.classic.version>1.2.3</logback.classic.version>
    <gson.version>2.8.6</gson.version>
    <http.scala.version>2.4.1</http.scala.version>
  </properties>

  <dependencies>
    <!-- https://mvnrepository.com/artifact/com.alibaba/druid -->
    <dependency>
      <groupId>com.alibaba</groupId>
      <artifactId>druid</artifactId>
      <version>${druid.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-table-api-scala-bridge_2.12</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-table-api-java-bridge_2.12</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-table-planner-blink_${scala.binary.version}</artifactId>
      <version>${flink.version}</version>
      <!--<scope>provided</scope>-->
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-table-planner -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-table-planner_${scala.binary.version}</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-table-common</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-connector-hive_2.12</artifactId>
      <version>${flink.version}</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>com.google.code.gson</groupId>
      <artifactId>gson</artifactId>
      <version>${gson.version}</version>
    </dependency>





    <!--flink on kafka start-->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-connector-kafka-0.11_2.12</artifactId>
      <version>${flink.version}</version>
    </dependency>

    <!-- JSON string parsing -->
    <dependency>
      <groupId>net.minidev</groupId>
      <artifactId>json-smart</artifactId>
      <version>${json.smart.version}</version>
    </dependency>
    <!--log + logback-->
    <dependency>
      <groupId>com.typesafe.scala-logging</groupId>
      <artifactId>scala-logging_2.12</artifactId>
      <version>${scala.logging.version}</version>
    </dependency>
    <dependency>
      <groupId>ch.qos.logback</groupId>
      <artifactId>logback-classic</artifactId>
      <version>${logback.classic.version}</version>
    </dependency>

    <!--flink on kafka end-->


    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-scala_2.12</artifactId>
      <version>${flink.version}</version>
    </dependency>

    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-hbase_2.12</artifactId>
      <version>${flink.version}</version>
    </dependency>

    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-clients_2.12</artifactId>
      <version>${flink.version}</version>
    </dependency>

    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-java</artifactId>
      <version>${flink.version}</version>
    </dependency>

    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase-client</artifactId>
      <version>${hbase.version}</version>
    </dependency>

    <!-- added to work around errors when pulling in the HBase packages -->
    <!--<dependency>-->
      <!--<groupId>jdk.tools</groupId>-->
      <!--<artifactId>jdk.tools</artifactId>-->
      <!--<version>1.8</version>-->
      <!--<scope>system</scope>-->
      <!--<systemPath>${JAVA_HOME}/lib/tools.jar</systemPath>-->
    <!--</dependency>-->
    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase-server</artifactId>
      <version>${hbase.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase-common</artifactId>
      <version>${hbase.version}</version>
      <exclusions>
        <exclusion>
          <artifactId>jdk.tools</artifactId>
          <groupId>jdk.tools</groupId>
          <!--                <version>1.8</version>-->
        </exclusion>
      </exclusions>
    </dependency>

    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-jdbc_2.12</artifactId>
      <version>${flink.jdbc.version}</version>
    </dependency>
    <!-- Scala-related jars: begin -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-streaming-scala_2.12</artifactId>
      <version>${flink.streaming.scala.version}</version>
    </dependency>

    <dependency>
      <groupId>org.scalanlp</groupId>
      <artifactId>breeze_2.12</artifactId>
      <version>${breeze.version}</version>
    </dependency>

    <dependency>
      <groupId>org.apache.commons</groupId>
      <artifactId>commons-math</artifactId>
      <version>${commons.math.version}</version>
    </dependency>

    <dependency>
      <groupId>com.github.noraui</groupId>
      <artifactId>ojdbc7</artifactId>
      <version>${oracle.version}</version>
    </dependency>
    <dependency>
      <groupId>org.scalaj</groupId>
      <artifactId>scalaj-http_2.12</artifactId>
      <version>${http.scala.version}</version>
    </dependency>
  </dependencies>


  <build>
    <!-- directory of Scala sources to compile -->
    <sourceDirectory>src/main/scala</sourceDirectory>
    <plugins>

      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>4.4.0</version>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
              <goal>testCompile</goal>
            </goals>
          </execution>
        </executions>
      </plugin>

      <plugin>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.6.0</version>
        <configuration>
          <source>1.8</source>
          <target>1.8</target>
        </configuration>
      </plugin>
      <!-- module packaging: use either this (commented-out) plugin or the one below, not both -->
<!--      <plugin>-->
<!--        <groupId>org.apache.maven.plugins</groupId>-->
<!--        <artifactId>maven-shade-plugin</artifactId>-->
<!--        <version>3.1.1</version>-->
<!--        <configuration>-->

<!--          <filters>-->
<!--            <filter>-->
<!--              <artifact>*:*</artifact>-->
<!--              <excludes>-->

<!--                &lt;!&ndash; filter these files out to avoid signature conflicts &ndash;&gt;-->
<!--                <exclude>META-INF/*.SF</exclude>-->
<!--                <exclude>META-INF/*.DSA</exclude>-->
<!--                <exclude>META-INF/*.RSA</exclude>-->
<!--              &lt;!&ndash; packages excluded for the index calculation layer &ndash;&gt;-->
<!--              <exclude>**/index/bondsIndx/**</exclude>-->
<!--              &lt;!&ndash; packages excluded for the service layer &ndash;&gt;-->
<!--              <exclude>**/service/bondsIndx/**</exclude>-->
<!--              </excludes>-->
<!--            </filter>-->
<!--          </filters>-->

<!--          <transformers>-->
<!--            <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">-->
<!--              <resource>reference.conf</resource>-->
<!--            </transformer>-->
<!--          </transformers>-->
<!--        </configuration>-->
<!--        <executions>-->
<!--          <execution>-->
<!--            <phase>package</phase>-->
<!--            <goals>-->
<!--              <goal>shade</goal>-->
<!--            </goals>-->
<!--          </execution>-->
<!--        </executions>-->
<!--      </plugin>-->
      <!-- use the plugin below when a specific main class must be set -->
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>2.4.3</version>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
            <configuration>
              <filters>
                <filter>
                  <artifact>*:*</artifact>
                  <excludes>
                    <exclude>META-INF/*.SF</exclude>
                    <exclude>META-INF/*.DSA</exclude>
                    <exclude>META-INF/*.RSA</exclude>
                  </excludes>
                </filter>
              </filters>
              <!-- specify the main class -->
              <transformers>
                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                  <mainClass>flinkonkafka.KafkaConsumer</mainClass>
                </transformer>
                <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                  <resource>reference.conf</resource>
                </transformer>
              </transformers>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>

    <resources>
      <resource>
        <directory>src/main/resources</directory>
        <includes>
          <include>**/*.xml</include>
          <include>**/*.yml</include>
        </includes>
        <filtering>false</filtering>
      </resource>
    </resources>
  </build>

</project>

 
