##spark.sql.SqlContext and related packages cannot be found##
The error message was as follows:
I searched Baidu extensively but could not find the cause; a friend then pointed me to the fix.
Solution:
Modify the pom.xml
and comment out the `<scope>` element of the dependency,
then reimport the Maven dependencies
and re-run the main example. Problem solved.
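For reference, the pom.xml change looks roughly like this — a sketch assuming the Spark dependency had been declared with `provided` scope; the artifact name and version number here are illustrative, not from the original post:

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.4.0</version>
    <!-- commented out so the dependency stays on the runtime classpath
         when the main class is run directly from the IDE -->
    <!-- <scope>provided</scope> -->
</dependency>
```

After editing, reimport the Maven project (in IntelliJ IDEA: Maven tool window → Reload All Maven Projects) so the IDE picks up the new classpath.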
A few additional notes on Maven dependency scopes:
scope:
This element refers to the classpath of the task at hand (compiling and runtime, testing, etc.) as well as how to limit the transitivity of a dependency. There are five scopes available:
compile - this is the default scope, used if none is specified. Compile dependencies are available in all classpaths. Furthermore, those dependencies are propagated to dependent projects.
provided - this is much like compile, but indicates you expect the JDK or a container to provide it at runtime. It is only available on the compilation and test classpath, and is not transitive.
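This is exactly why the error above occurred: with `provided` scope, the Spark jars are absent from the runtime classpath when the main class is launched from the IDE, so classes such as `SQLContext` cannot be found. `provided` is nonetheless the usual choice when the application is submitted to a cluster with `spark-submit`, because the cluster already supplies the Spark jars and bundling them again bloats the artifact. A hedged sketch (artifact name and version are illustrative):

```xml
<!-- For cluster deployment: Spark jars are provided by the cluster
     at runtime, so they are excluded from the packaged artifact. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.4.0</version>
    <scope>provided</scope>
</dependency>
```

For local development and running `main` from the IDE, either drop the `<scope>` element (defaulting to `compile`) as done above, or keep `provided` and use a Maven profile or the IDE option "Include dependencies with 'Provided' scope" to switch between the two setups.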