Sqoop

Server setup:
1. Unzip the distribution.
2. Edit /sqoop_home/server/conf/catalina.properties and add the Hadoop lib jars to common.loader.
3. Add tomcat-juli.jar to server/startup.sh.
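Step 2 can be scripted. A minimal sketch, run against a locally created stand-in file; in practice point CONF at the real /sqoop_home/server/conf/catalina.properties, and note that the Hadoop jar paths below are assumptions to adjust for your install:

```shell
# Demo: append Hadoop jar directories to common.loader.
mkdir -p server/conf
CONF=server/conf/catalina.properties
# Stand-in for the stock file (single-quoted so ${catalina.base} stays literal).
printf 'common.loader=${catalina.base}/lib,${catalina.base}/lib/*.jar\n' > "$CONF"
# Assumed Hadoop install layout; adjust to your cluster.
HADOOP_JARS='/usr/lib/hadoop/share/hadoop/common/*.jar,/usr/lib/hadoop/share/hadoop/common/lib/*.jar'
sed -i "s|^common.loader=.*|&,${HADOOP_JARS}|" "$CONF"
grep '^common.loader=' "$CONF"
```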








The Sqoop server cannot be started
Possible reasons:
1. Port conflict: remove the CATALINA environment variable so Tomcat falls back to its default settings.
2. Jar files are configured in catalina.properties; check that the HADOOP_HOME environment variable does not interfere.
3. common.loader in catalina.properties is missing guava.jar.
4. Check server/webapps; a sqoop directory should have been created under it.
5. In sqoop.properties, replace the @LOG_DIR@ placeholder (and similar ones) with real paths.
6. Make sure Hadoop is running.
7. To transfer data between Oracle and HDFS, you must add ojdbc6.jar to a Hadoop lib directory (e.g. common/lib) and add that directory to common.loader in the Sqoop server's catalina.properties.
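Point 5 above refers to the placeholder values the stock sqoop.properties ships with; the server will not start until they are replaced with real paths. A sketch against a locally recreated file (the property key shown is illustrative, and the target path is an assumption):

```shell
# Stand-in for the shipped sqoop.properties with its @LOG_DIR@ placeholder.
mkdir -p conf-demo
cat > conf-demo/sqoop.properties <<'EOF'
org.apache.sqoop.log4j.appender.file.File=@LOG_DIR@/sqoop.log
EOF
# Replace the placeholder with a real directory (assumed value).
sed -i 's|@LOG_DIR@|/var/log/sqoop2|g' conf-demo/sqoop.properties
cat conf-demo/sqoop.properties
```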






Run Sqoop from the command line (Sqoop 2 client shell):
1. set server --host vm-9ac7-806d.apac.nsroot.net --port 12000 --webapp sqoop
2. create connection --cid 1
3. create job --xid 1 --type import    // or just: create job --xid 1
4. start job --jid 1


more:
5. update job --jid 1
6. delete job --jid 1
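Put together, a session looks roughly like the transcript below: start the client with bin/sqoop.sh client, then issue the commands above (the host, IDs, and prompt text are illustrative):

```
$ bin/sqoop.sh client
sqoop:000> set server --host vm-9ac7-806d.apac.nsroot.net --port 12000 --webapp sqoop
sqoop:000> create connection --cid 1
sqoop:000> create job --xid 1 --type import
sqoop:000> start job --jid 1
```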






Tip: in vi, Shift+G jumps to the last line.
Change directory ownership in HDFS:
bin/hdfs dfs -chown -R zg67978 /SqoopTable/


Kill a Hadoop job:
bin/hadoop job -kill job_1394627419025_0073




java.lang.VerifyError: class org.apache.hadoop.security.proto.SecurityProtos$GetDelegationTokenRequestProto overrides final method getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;


Solution:
This error is mainly caused by a protobuf.jar version conflict: Hadoop uses protobuf 2.5.0 while Hive uses 2.4.1.
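A quick way to confirm such a conflict is to list the protobuf jars each product bundles. The sketch below uses a throwaway directory layout standing in for the two installs; against a real cluster you would run the same find over $HADOOP_HOME and $HIVE_HOME:

```shell
# Throwaway stand-ins for the Hadoop and Hive install trees.
mkdir -p demo/hadoop/lib demo/hive/lib
touch demo/hadoop/lib/protobuf-java-2.5.0.jar
touch demo/hive/lib/protobuf-java-2.4.1.jar
# Listing both sides makes the version mismatch obvious.
find demo -name 'protobuf-java-*.jar' | sort
```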




Question: Hive throws
java.net.SocketTimeoutException: Read timed out


Solution:
Edit hive-site.xml and set hive.metastore.client.socket.timeout to a larger value.
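A sketch of the hive-site.xml entry; 600 is an arbitrary example value (this property takes seconds in Hive releases of this era):

```xml
<!-- hive-site.xml: raise the metastore client read timeout (seconds). -->
<property>
  <name>hive.metastore.client.socket.timeout</name>
  <value>600</value>
</property>
```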




org.apache.sqoop.common.SqoopException: MAPRED_EXEC_0012:The type is not supported - java.math.BigDecimal
Solution:
Remove the extractor/loader throttling setting, i.e. comment out the line:
//frameworkForm.getIntegerInput("throttling.loaders").setValue(1);




java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.


Solution:
Add hadoop-mapreduce-client-jobclient-2.2.0.jar, hadoop-mapreduce-client-common-2.2.0.jar, hadoop-yarn-api-2.2.0.jar, hadoop-yarn-client-2.2.0.jar, and hadoop-yarn-common-2.2.0.jar to the classpath.
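A sketch of wiring those jars onto the classpath. The install root and directory layout are assumptions; a stock Hadoop 2.2 tarball keeps these jars under share/hadoop/mapreduce and share/hadoop/yarn:

```shell
HADOOP_HOME="${HADOOP_HOME:-/usr/lib/hadoop}"   # assumed install root
MR="$HADOOP_HOME/share/hadoop/mapreduce"
YARN="$HADOOP_HOME/share/hadoop/yarn"
# Append each required client jar to the classpath.
CLASSPATH="$CLASSPATH:$MR/hadoop-mapreduce-client-jobclient-2.2.0.jar"
CLASSPATH="$CLASSPATH:$MR/hadoop-mapreduce-client-common-2.2.0.jar"
CLASSPATH="$CLASSPATH:$YARN/hadoop-yarn-api-2.2.0.jar"
CLASSPATH="$CLASSPATH:$YARN/hadoop-yarn-client-2.2.0.jar"
CLASSPATH="$CLASSPATH:$YARN/hadoop-yarn-common-2.2.0.jar"
export CLASSPATH
echo "$CLASSPATH"
```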




ERROR org.apache.hadoop.security.UserGroupInformation  - PriviledgedActionException


Solution:
The user is not in the required Linux group.
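To diagnose, check the account's group membership; fixing it requires root. The group name and username in the commented command are assumptions for illustration:

```shell
# Show which groups the current account belongs to.
id -Gn "$(id -un)"
# Fix (run as root; 'hadoop' and 'zg67978' are assumed names):
#   usermod -aG hadoop zg67978
```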


Using DistributedCache fails with a "file not found" error


solution:
Use DistributedCache.getLocalCacheFiles, not getCacheFiles.