1. Configuring the Flink job properties
public class FlinkKafkaStreaming {
    /**
     * Kafka configuration loaded from the Spring Boot context
     */
    @Autowired
    private KafkaProperties kafkaProperties;

    public void main(String[] args) throws Exception {
        // Set up the stream processing environment
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        Properties properties = new Properties();
        properties.putAll(kafkaProperties.buildConsumerProperties());
        FlinkKafkaConsumer<String> consumer =
                new FlinkKafkaConsumer<>(KafkaTopicName, new SimpleStringSchema(), properties);
        DataStreamSource<String> source = env.addSource(consumer);
        // ... operator transformations, producing a stream named recordData
        recordData.addSink(new MySink());
        env.execute();
    }
}
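For reference, kafkaProperties.buildConsumerProperties() collects the standard Kafka consumer settings declared in application.yml. A minimal sketch of what the resulting Properties object might contain; the broker address, group id, and deserializers are illustrative assumptions, not output of the real KafkaProperties class:

```java
import java.util.Properties;

// Hedged sketch of the Properties passed to FlinkKafkaConsumer after
// properties.putAll(kafkaProperties.buildConsumerProperties()).
// All values below are assumptions for illustration.
public class ConsumerPropsDemo {
    static Properties consumerProperties() {
        Properties properties = new Properties();
        properties.put("bootstrap.servers", "localhost:9092"); // assumed broker
        properties.put("group.id", "flink-demo");              // assumed consumer group
        properties.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        properties.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        return properties;
    }

    public static void main(String[] args) {
        System.out.println(consumerProperties().getProperty("group.id")); // prints "flink-demo"
    }
}
```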
2. Using @Autowired in a custom Sink to inject a Spring-managed class
public class MySink extends RichSinkFunction<Bean> {
    @Autowired
    private MyService myService;

    /**
     * Called by Flink for every record sent to this Sink
     */
    @Override
    public void invoke(Bean bean, Context context) throws Exception {
        // Using myService here throws a NullPointerException
    }
}
Analysis: when the Spring Boot application starts, it loads the Spring container, so code running inside that container can obtain beans anywhere via @Autowired. A Flink job, however, executes its operators on Flink's own task threads: the Sink instance is created by Flink rather than by Spring, so Spring never processes its @Autowired fields and myService remains null. Only code managed by the Spring container can rely on @Autowired. The fix is to look the bean up from the Spring container inside the custom Sink's open() method, which Flink calls once per task before any records arrive. Sample code:
public class MySink extends RichSinkFunction<Bean> {
    private MyService myService;

    /**
     * Fetch the Spring-managed bean when the Sink is opened
     */
    @Override
    public void open(Configuration parameters) throws Exception {
        super.open(parameters);
        myService = ApplicationContextUtil.getBean(MyService.class);
    }

    /**
     * Called by Flink for every record sent to this Sink
     */
    @Override
    public void invoke(Bean bean, Context context) throws Exception {
        // myService is now usable
    }
}
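Why the original field injection failed can be reproduced in plain Java, without Flink or Spring at all: an @Autowired-style annotation only has an effect when a container processes it, and Flink constructs (or deserializes) operator instances itself. A minimal sketch, using a hypothetical @Inject annotation as a stand-in for @Autowired:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

// Minimal sketch (no Spring/Flink dependency): an annotation on a field does
// nothing by itself. When an object is created with plain `new` -- as Flink
// effectively does for its operators -- no container ever runs, so the
// annotated field keeps its default value of null.
public class InjectionDemo {
    @Retention(RetentionPolicy.RUNTIME)
    @interface Inject {}          // hypothetical stand-in for Spring's @Autowired

    static class MySink {
        @Inject
        Object myService;         // never populated outside a container
    }

    public static void main(String[] args) {
        MySink sink = new MySink();                 // how Flink creates operator instances
        System.out.println(sink.myService == null); // prints "true"
    }
}
```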
3. Looking up Spring-managed classes dynamically in Spring Boot
@Component
public class ApplicationContextUtil implements ApplicationContextAware, Serializable {
    /**
     * The Spring application context, captured at startup
     */
    private static ApplicationContext context;

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
        // Store the context in a static field so it is reachable from any thread
        ApplicationContextUtil.context = applicationContext;
    }

    public static ApplicationContext getApplicationContext() {
        return context;
    }

    public static <T> T getBean(Class<T> beanClass) {
        return context.getBean(beanClass);
    }
}
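The utility works because static state belongs to the class, not to any particular thread or instance: Spring populates the field once at startup, and the Sink's open() method can read it later from a Flink task thread. A plain-Java sketch of the same static-holder pattern (hypothetical names, no Spring dependency):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Minimal sketch of the static-holder pattern behind ApplicationContextUtil.
// BeanRegistry and MyService are hypothetical names for illustration.
public class ContextHolderDemo {
    interface MyService { String greet(); }

    static class BeanRegistry {
        // Shared across all threads, like ApplicationContextUtil's static context
        private static final Map<Class<?>, Object> BEANS = new ConcurrentHashMap<>();

        // Called once at startup, analogous to setApplicationContext()
        static <T> void register(Class<T> type, T bean) {
            BEANS.put(type, bean);
        }

        // Analogous to ApplicationContextUtil.getBean()
        static <T> T getBean(Class<T> type) {
            return type.cast(BEANS.get(type));
        }
    }

    public static void main(String[] args) {
        // "Startup": the container registers its beans
        BeanRegistry.register(MyService.class, () -> "hello");
        // "open()": any thread can now fetch the bean through the static holder
        MyService service = BeanRegistry.getBean(MyService.class);
        System.out.println(service.greet()); // prints "hello"
    }
}
```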