CDH: Changing the Oozie database type to MySQL
1. Copy the MySQL JDBC driver into /var/lib/oozie
2. Create the oozie database in MySQL
3. Initialize the database schema
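The three steps can be sketched as shell commands; the jar filename, MySQL credentials, and the ooziedb.sh path below are illustrative and vary by CDH version:

```shell
# 1. Make the MySQL JDBC driver visible to Oozie
cp mysql-connector-java-5.1.46.jar /var/lib/oozie/

# 2. Create the oozie database and grant access (credentials are examples)
mysql -u root -p -e "CREATE DATABASE oozie DEFAULT CHARACTER SET utf8;
GRANT ALL PRIVILEGES ON oozie.* TO 'oozie'@'%' IDENTIFIED BY 'oozie';
FLUSH PRIVILEGES;"

# 3. Initialize the Oozie schema
sudo -u oozie /usr/lib/oozie/bin/ooziedb.sh create -run
```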
Problems encountered when configuring a MapReduce action in workflow.xml:
Problem 1:
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class com.rytong.mdap.analytics.compute.basic.UserCounterJob.CounterMapper not found
Cause: the mapper was configured with the dotted name of a nested class:
<name>mapreduce.map.class</name>
<value>com.rytong.mdap.analytics.compute.basic.UserCounterJob.CounterMapper</value>
Fix: CounterMapper is a nested class of UserCounterJob, so its binary name uses $ rather than . as the separator: com.rytong.mdap.analytics.compute.basic.UserCounterJob$CounterMapper
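The corrected property in workflow.xml:

```xml
<property>
  <name>mapreduce.map.class</name>
  <!-- nested class: the binary name uses '$' -->
  <value>com.rytong.mdap.analytics.compute.basic.UserCounterJob$CounterMapper</value>
</property>
```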
Problem 2:
java.lang.NoClassDefFoundError: com/maxmind/geoip2/exception/AddressNotFoundException
Cause: a third-party jar is missing from the action's classpath.
Fix: put the jar into the lib directory next to workflow.xml on HDFS; Oozie adds everything in that directory to the classpath automatically.
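For example, assuming the application lives at /user/root/examples/apps/usercounter on HDFS (the path and the jar version are illustrative):

```shell
hdfs dfs -mkdir -p /user/root/examples/apps/usercounter/lib
hdfs dfs -put geoip2-2.12.0.jar /user/root/examples/apps/usercounter/lib/
```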
Problem 3:
ClassNotFoundException thrown by an Oozie action
Fix: add oozie.use.system.libpath=true to job.properties so the action picks up the Oozie sharelib.
Reference: http://jyd.me/nosql/oozie-classnotfoundexception-solution/
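The relevant line in job.properties:

```properties
# add the Oozie sharelib to the action classpath
oozie.use.system.libpath=true
```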
Problem 4:
When executing a shell action: [org.apache.oozie.action.hadoop.ShellMain], main() threw exception, Cannot run program
Cause: the workflow runs on the cluster, so the script must be shipped to every node. Add a <file> element pointing at the script; the part after # sets the symlink name the container sees:
/user/root/examples/apps/hive/create_mysql_table.sh
/user/root/examples/apps/hive/create_mysql_table.sh#create_mysql_table.sh
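A minimal sketch of the shell action with the <file> element (only the relevant parts of the action are shown; the #create_mysql_table.sh suffix names the symlink in the container's working directory):

```xml
<shell xmlns="uri:oozie:shell-action:0.1">
  <job-tracker>${jobTracker}</job-tracker>
  <name-node>${nameNode}</name-node>
  <exec>create_mysql_table.sh</exec>
  <!-- ships the script to every node that may run this action -->
  <file>/user/root/examples/apps/hive/create_mysql_table.sh#create_mysql_table.sh</file>
</shell>
```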
**workflow.xml for the MapReduce job**
```xml
<workflow-app xmlns="uri:oozie:workflow:0.2" name="usercounter-job-wf">
    <start to="mr-node"/>
    <action name="mr-node">
        <map-reduce>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <prepare>
                <delete path="${nameNode}/zqc/264/tmp/${outputDir}"/>
            </prepare>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
                <property>
                    <name>mapred.mapper.new-api</name>
                    <value>true</value>
                </property>
                <property>
                    <name>mapred.reducer.new-api</name>
                    <value>true</value>
                </property>
                <property>
                    <name>mapred.job.name</name>
                    <value>UserCounterJob</value>
                </property>
                <property>
                    <name>mapreduce.inputformat.class</name>
                    <value>org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat</value>
                </property>
                <property>
                    <name>mapreduce.map.class</name>
                    <value>com.rytong.mdap.analytics.compute.basic.UserCounterJob$CounterMapper</value>
                </property>
                <property>
                    <name>mapreduce.reduce.class</name>
                    <value>com.rytong.mdap.analytics.compute.basic.UserCounterJob$CounterReducer</value>
                </property>
                <property>
                    <name>mapred.mapoutput.key.class</name>
                    <value>org.apache.hadoop.io.Text</value>
                </property>
                <property>
                    <name>mapred.mapoutput.value.class</name>
                    <value>com.rytong.mdap.analytics.source.Message</value>
                </property>
                <property>
                    <name>mapred.output.key.class</name>
                    <value>org.apache.hadoop.hbase.io.ImmutableBytesWritable</value>
                </property>
                <property>
                    <name>mapred.output.value.class</name>
                    <value>org.apache.hadoop.io.LongWritable</value>
                </property>
                <property>
                    <name>mapreduce.outputformat.class</name>
                    <value>org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat</value>
                </property>
                <property>
                    <name>mapred.reduce.tasks</name>
                    <value>2</value>
                </property>
                <property>
                    <name>mapred.map.tasks</name>
                    <value>2</value>
                </property>
                <property>
                    <name>mapred.input.dir</name>
                    <value>/zqc/264/clean/session/20180415</value>
                </property>
                <property>
                    <name>mapred.output.dir</name>
                    <value>/zqc/264/tmp/${outputDir}</value>
                </property>
            </configuration>
            <file>/user/root/examples/apps/java-main/config</file>
        </map-reduce>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Map/Reduce failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
```