Environment: Windows 7, CDH 5.0.0, Spark 1.4.0
Problem 1: after building, running the program fails with: Error: Could not find or load main class main.test
Checking Run - Edit Configurations... shows the matching warning: Warning: Class 'main.test' not found in module 'projectName'
Fix: right-click the source directory - Mark Directory as - Sources Root
Problem 2: calling new SparkContext(conf) fails with:
Exception in thread "main" java.lang.UnsupportedClassVersionError: com/typesafe/config/ConfigValue : Unsupported major.minor version 52.0
At first I assumed this was a JDK version problem, since the code ran on JDK 8 but not on JDK 7 (it had run on JDK 7 during earlier development);
it later turned out to be caused by this dependency, which is compiled for Java 8:
<!-- https://mvnrepository.com/artifact/io.github.openconnectors/connector-kafka010 -->
<dependency>
    <groupId>io.github.openconnectors</groupId>
    <artifactId>connector-kafka010</artifactId>
    <version>0.0.3</version>
</dependency>
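One way to confirm which JDK a dependency's classes were compiled against is to read the class-file header directly: the major-version field is 52 for Java 8 and 51 for Java 7, matching the "major.minor version 52.0" in the error above. A minimal sketch (the helper name and file path are hypothetical; extract the suspect .class file from the jar first):

```scala
import java.io.{DataInputStream, FileInputStream}

// Read the major version from a .class file header:
// bytes 0-3 are the magic number 0xCAFEBABE, bytes 4-5 the minor
// version, bytes 6-7 the major version (52 = Java 8, 51 = Java 7).
def classMajorVersion(path: String): Int = {
  val in = new DataInputStream(new FileInputStream(path))
  try {
    require(in.readInt() == 0xCAFEBABE, s"$path is not a class file")
    in.readUnsignedShort() // skip minor version
    in.readUnsignedShort() // major version
  } finally in.close()
}

// Hypothetical usage, after extracting the class from the jar:
// println(classMajorVersion("ConfigValue.class"))
```

If this prints 52 while you are running JDK 7, the jar (not your own code) is what forces the JDK 8 requirement.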
Problem 3:
Spark Streaming fails with: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/Time
Specifically at new StreamingContext(sc, Seconds(20)),
or when referencing org.apache.spark.streaming.Time("43758943".toLong).
I checked the pom and the dependency is declared there, and compilation succeeds;
Fix: add the dependency in Project Structure. I added a lib folder containing the required jars there, and the error went away.
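For reference, the spark-streaming dependency that provides org.apache.spark.streaming.Time typically looks like this in the pom (version assumed to match the Spark 1.4.0 environment above). Note that a compile-succeeds-but-runtime-fails NoClassDefFoundError like this is also a common symptom of the dependency being marked provided, since provided jars are excluded from the IDE's runtime classpath:

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.4.0</version>
    <!-- If this carries <scope>provided</scope>, the run
         configuration will not see it at runtime even though
         compilation succeeds. -->
</dependency>
```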