Shell Script Study Notes

1. Quoting problem

Symptom: running the Hive statements stored in the shell script produced the following error:

ls: cannot access /usr/local/spark-2.2.0-bin-hadoop2.6/lib/spark-assembly-*.jar: No such file or directory

Logging initialized using configuration in jar:file:/apps/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar!/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/apps/hadoop-2.7.1/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/apps/hbase-1.2.1/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
OK
Time taken: 1.688 seconds
NoViableAltException(26@[])
        at org.apache.hadoop.hive.ql.parse.HiveParser.tablePropertiesList(HiveParser.java:34375)
        at org.apache.hadoop.hive.ql.parse.HiveParser.tableProperties(HiveParser.java:34243)
        at org.apache.hadoop.hive.ql.parse.HiveParser.rowFormatSerde(HiveParser.java:33596)
        at org.apache.hadoop.hive.ql.parse.HiveParser.tableRowFormat(HiveParser.java:34077)
        at org.apache.hadoop.hive.ql.parse.HiveParser.createTableStatement(HiveParser.java:5353)
        at org.apache.hadoop.hive.ql.parse.HiveParser.ddlStatement(HiveParser.java:2640)
        at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:1650)
        at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1109)
        at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:202)
        at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:166)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:396)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:308)
        at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1122)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1170)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:213)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:311)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:708)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
FAILED: ParseException line 11:28 cannot recognize input near 'input' '.' 'regex' in table properties list

Cause: the whole statement is stored in a double-quoted shell variable, so the shell strips the inner double quotes around "input.regex" and its value before Hive ever sees them. Hive then receives an unquoted property name and fails to parse the property list, which is the ParseException above. The outer quotes have to stay double so that ${yesterday} still expands, so the inner quotes must be changed to single quotes:

hql3="
      drop table ratings;
      create external table if not exists ratings
      (userid bigint,
      movieid bigint,
      rating double,
      timestamped string)
      partitioned by (dt string,city string)
      row format serde 'org.apache.hadoop.hive.serde2.RegexSerDe'
      with serdeProperties ("input.regex"="([0-9]*)::([0-9]*)::([0-9]*)::([0-9]*)")    <-- these double quotes must be changed to single quotes
      location '/zgm/ratings'
      ;

      load data local inpath '/zgm/ratings.dat' overwrite into table ratings partition(dt='${yesterday}',city='shandong');
     "

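For reference, here is a minimal sketch of the corrected assignment (how the original script actually invokes Hive is not shown above, so the hive -e call is only an assumption). The inner single quotes reach Hive intact, while ${yesterday} still expands because the outer quotes are double:

#!/bin/bash
yesterday=$(date -d '1 day ago' +%Y%m%d)

hql3="
      create external table if not exists ratings
      (userid bigint, movieid bigint, rating double, timestamped string)
      partitioned by (dt string,city string)
      row format serde 'org.apache.hadoop.hive.serde2.RegexSerDe'
      with serdeProperties ('input.regex'='([0-9]*)::([0-9]*)::([0-9]*)::([0-9]*)')
      location '/zgm/ratings';

      load data local inpath '/zgm/ratings.dat' overwrite into table ratings partition(dt='${yesterday}',city='shandong');
     "

echo "$hql3"        # the single quotes survive shell expansion and reach Hive
# hive -e "$hql3"   # assumed invocation; adjust to however the original script calls Hive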
 


 

2. Incorrect regex

hive (myhive)>
             >  select * from users limit 10;
OK
users.userid    users.gender    users.age       users.occupation        users.zipcode   users.dt        users.city
Failed with exception java.io.IOException:org.apache.hadoop.hive.serde2.SerDeException: Number of matching groups doesn't match the number of columns
Time taken: 0.097 seconds

Cause (the corrected DDL is shown below):

     create external table if not exists users
      (userid bigint,
      gender string,
      age int,
      occupation string,
      zipcode string
      )
      partitioned by (dt string,city string)
      row format serde 'org.apache.hadoop.hive.serde2.RegexSerDe'
      with serdeProperties ('input.regex'='([0-9]*)::([A-Z])::([0-9]*)::([0-9]*)::([0-9]*)')
      location '/zgm/users'
      ;

In the regex above (the parts highlighted in green in the original post), every field has to be wrapped in its own capturing group with parentheses. If any part is left ungrouped, the number of matching groups no longer equals the number of table columns and RegexSerDe throws the error above. Clearly I still need to learn regex properly.
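As an illustration (this helper is my addition, not part of the original notes), a tiny pre-check can catch the mismatch before Hive does. It simply counts the '(' characters in the pattern, which is good enough here because the pattern contains no escaped or non-capturing parentheses:

#!/bin/bash
# Hypothetical pre-check: one capturing group per non-partition column.
regex='([0-9]*)::([A-Z])::([0-9]*)::([0-9]*)::([0-9]*)'
expected_columns=5    # userid, gender, age, occupation, zipcode

# Each '(' is printed on its own line by grep -o, so wc -l counts them.
groups=$(grep -o '(' <<< "$regex" | wc -l)

if [ "$groups" -ne "$expected_columns" ]; then
    echo "regex defines $groups groups but the table has $expected_columns columns"
    exit 1
fi
echo "group count matches column count"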

 

 

 

3. Check whether a directory exists, and write a log

#!/bin/bash

# yesterday's date, e.g. 20240101
yesterday=$(date -d '1 day ago' +%Y%m%d)
echo "$yesterday"

runtime=$(date +%Y%m%d-%H:%M:%S)

logdir=/Log/movieLog/$yesterday/top10
if [ ! -d "$logdir" ];then
    mkdir -p "$logdir"
    echo "${logdir} did not exist, created it"
else
    echo "${logdir} already exists"
fi

echo "hello" > "$logdir/${runtime}.log"

 

Pay attention to these spaces!!!!!
if [   !   -d   "$logdir"   ];then     # '[' needs spaces around each of its arguments
mkdir -p "$logdir"                     # -p is needed to create multi-level directories
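A quick demonstration of why the spaces matter (my addition): [ is an ordinary command, so gluing it to its arguments makes bash look for a command that does not exist, and ] has to arrive as a separate final argument.

#!/bin/bash
logdir=/tmp/does-not-exist
if [ ! -d "$logdir" ]; then echo "missing"; fi      # OK
# if [! -d "$logdir" ]; then echo "missing"; fi     # fails: bash tries to run a command named '[!'
# if [ ! -d "$logdir"]; then echo "missing"; fi     # fails: the test command cannot find its closing ']'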
 

 

4. A simple exercise

1. In the /test directory and its subdirectories, replace every occurrence of the string girl with boy in all files whose names end with the .txt extension;

sed -i 's/girl/boy/g' /test/*.txt      # only matches .txt files directly under /test, not in subdirectories

find /test/ -name '*.txt' | xargs sed -i 's/girl/boy/g'      # also reaches files in subdirectories
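One caveat worth noting (my addition, assuming GNU find and sed): piping find into xargs splits on whitespace, so file names containing spaces would break the command above. Either of the following handles them safely:

find /test -type f -name '*.txt' -print0 | xargs -0 sed -i 's/girl/boy/g'
# or let find invoke sed itself:
find /test -type f -name '*.txt' -exec sed -i 's/girl/boy/g' {} +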
 


 
