Option 1: bundle all the third-party jars together with your own classes into one big jar. This obviously works but is clumsy, and updating or upgrading it is tedious.
Option 2:
Create a lib folder inside your project and copy all the third-party jars into it; Hadoop will automatically load the jars in that lib folder.
http://www.blowide.com/2010/04/including-external-jars-in-a-hadoop-job/
Note the last paragraph:
Luckily, I bumped into a solution suggested by Doug Cutting as an answer to someone who had a similar predicament. The solution was to create a "lib" folder in your project and copy all the external jars into this folder. According to Doug, Hadoop will look for third-party jars in this folder. It works great!
This is the solution the original author gives.
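The steps above can be sketched as a small packaging recipe. All paths and file names below are illustrative (not from the post): it assumes compiled classes land in build/classes and every third-party jar has been copied into lib/. The key point is that Hadoop unpacks the submitted job jar and adds each lib/*.jar found inside it to the task classpath.

```shell
# Minimal sketch of Option 2 (hypothetical layout, not the author's exact setup)
mkdir -p build/classes lib
touch lib/example-dep.jar            # placeholder; your real dependency jars go here
mkdir -p build/jar
cp -R build/classes/. build/jar/     # your compiled .class files
cp -R lib build/jar/                 # the lib/ folder Doug Cutting describes
# Pack classes plus lib/ into one job jar (guarded: needs the JDK's jar tool on PATH):
if command -v jar >/dev/null 2>&1; then
  jar cf myjob.jar -C build/jar .
fi
```

You would then submit the result as usual, e.g. `hadoop jar myjob.jar YourMainClass ...`, and the jars under lib/ travel with the job.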