Hive SQL statement error: "Display all 469 possibilities? (y or n)"

When executing a SQL statement in the Hive CLI, a prompt like "Display all 469 possibilities? (y or n)" may appear. The root cause is that the SQL statement contains literal tab characters: in a Linux terminal the Tab key has a special meaning (it triggers completion), so pasted tabs are interpreted as completion requests rather than whitespace.
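The interactive Hive CLI is backed by a readline-style line editor, so a literal tab in pasted text behaves exactly like pressing Tab in bash. Before pasting a statement, you can locate any offending tabs with grep; a minimal sketch (the file name create_table.hql is just an example):

```shell
# Write a sample HiveQL script whose second line is tab-indented,
# then print every line containing a literal tab, with line numbers.
printf 'CREATE TABLE demo(\n\t`name` string\n);\n' > create_table.hql
TAB=$(printf '\t')
grep -n "$TAB" create_table.hql   # prints the tab-indented line(s)
```

Any line reported here would trigger the completion prompt when pasted into the CLI.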

Here is a small demo. In the following CREATE TABLE statement, every column definition is indented with a tab:

CREATE TABLE `test_Display_all_xxx_possibilities`(
    `name` string,
    `age` string,
    `sex` string
)
ROW FORMAT SERDE 
  'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe' 
WITH SERDEPROPERTIES ( 
  'field.delim'='\u0001', 
  'serialization.format'='\u0001') 
STORED AS INPUTFORMAT 
  'org.apache.hadoop.mapred.TextInputFormat' 
OUTPUTFORMAT 
  'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION
  'hdfs://nameservice1/user/hive/warehouse/ti_z_wh_inf_20160928'
TBLPROPERTIES (
  'COLUMN_STATS_ACCURATE'='true', 
  'numFiles'='1', 
  'numRows'='0', 
  'rawDataSize'='0', 
  'serialization.null.format'='', 
  'totalSize'='2326', 
  'transient_lastDdlTime'='1475101756');

Executing it in the hive CLI produces:


hive> CREATE TABLE `test_Display_all_xxx_possibilities`(
    > Display all 469 possibilities? (y or n)
    > ame` string,
    > Display all 469 possibilities? (y or n)
    > g,
    > Display all 469 possibilities? (y or n)
    > g
    > )
    > ROW FORMAT SERDE 
    >   'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe' 
    > WITH SERDEPROPERTIES ( 
    >   'field.delim'='\u0001', 
    >   'serialization.format'='\u0001') 
    > STORED AS INPUTFORMAT 
    >   'org.apache.hadoop.mapred.TextInputFormat' 
    > OUTPUTFORMAT 
    >   'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
    > LOCATION
    >   'hdfs://nameservice1/user/hive/warehouse/ti_z_wh_inf_20160928'
    > TBLPROPERTIES (
    >   'COLUMN_STATS_ACCURATE'='true', 
    >   'numFiles'='1', 
    >   'numRows'='0', 
    >   'rawDataSize'='0', 
    >   'serialization.null.format'='', 
    >   'totalSize'='2326', 
    >   'transient_lastDdlTime'='1475101756');
NoViableAltException(-1@[])
        at org.apache.hadoop.hive.ql.parse.HiveParser.type(HiveParser.java:38756)
        at org.apache.hadoop.hive.ql.parse.HiveParser.colType(HiveParser.java:38521)
        at org.apache.hadoop.hive.ql.parse.HiveParser.columnNameType(HiveParser.java:38221)
        at org.apache.hadoop.hive.ql.parse.HiveParser.columnNameTypeList(HiveParser.java:36416)
        at org.apache.hadoop.hive.ql.parse.HiveParser.createTableStatement(HiveParser.java:4863)
        at org.apache.hadoop.hive.ql.parse.HiveParser.ddlStatement(HiveParser.java:2363)
        at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:1585)
        at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1061)
        at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:201)
        at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:166)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:417)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:311)
        at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1189)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1284)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1115)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1103)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:220)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:172)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:383)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:775)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:693)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:628)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
FAILED: ParseException line 24:39 cannot recognize input near '<EOF>' '<EOF>' '<EOF>' in column type


Modify the CREATE TABLE statement, replacing the leading tabs with spaces:

CREATE TABLE `test_Display_all_xxx_possibilities`(
  `name` string,
  `age` string,
  `sex` string
)
ROW FORMAT SERDE 
  'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe' 
WITH SERDEPROPERTIES ( 
  'field.delim'='\u0001', 
  'serialization.format'='\u0001') 
STORED AS INPUTFORMAT 
  'org.apache.hadoop.mapred.TextInputFormat' 
OUTPUTFORMAT 
  'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION
  'hdfs://nameservice1/user/hive/warehouse/ti_z_wh_inf_20160928'
TBLPROPERTIES (
  'COLUMN_STATS_ACCURATE'='true', 
  'numFiles'='1', 
  'numRows'='0', 
  'rawDataSize'='0', 
  'serialization.null.format'='', 
  'totalSize'='2326', 
  'transient_lastDdlTime'='1475101756');


Re-running it in the hive CLI now succeeds:


hive> CREATE TABLE `test_Display_all_xxx_possibilities`(
    >   `name` string,
    >   `age` string,
    >   `sex` string
    > )
    > ROW FORMAT SERDE 
    >   'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe' 
    > WITH SERDEPROPERTIES ( 
    >   'field.delim'='\u0001', 
    >   'serialization.format'='\u0001') 
    > STORED AS INPUTFORMAT 
    >   'org.apache.hadoop.mapred.TextInputFormat' 
    > OUTPUTFORMAT 
    >   'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
    > LOCATION
    >   'hdfs://nameservice1/user/hive/warehouse/ti_z_wh_inf_20160928'
    > TBLPROPERTIES (
    >   'COLUMN_STATS_ACCURATE'='true', 
    >   'numFiles'='1', 
    >   'numRows'='0', 
    >   'rawDataSize'='0', 
    >   'serialization.null.format'='', 
    >   'totalSize'='2326', 
    >   'transient_lastDdlTime'='1475101756');
OK
Time taken: 0.087 seconds

Conclusion: avoid tab characters in SQL statements executed in the interactive Hive CLI.
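To clean a whole script at once, replace each tab with spaces before pasting; alternatively, running the file non-interactively with `hive -f script.hql` bypasses the CLI's line editor (and its completion) entirely. A sketch using GNU sed (the file name is an assumption):

```shell
# Create a tab-indented sample script, then replace every tab with
# two spaces in place (GNU sed supports \t and -i).
printf 'CREATE TABLE demo(\n\t`name` string\n);\n' > create_table.hql
sed -i 's/\t/  /g' create_table.hql

# Verify: no tab characters remain.
TAB=$(printf '\t')
grep -q "$TAB" create_table.hql && echo "tabs remain" || echo "no tabs"
```

Note that `sed -i` without a suffix argument is GNU-specific; on BSD/macOS sed use `sed -i '' 's/\t/  /g' …` instead.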
Reference: https://blog.csdn.net/stefan_xiepj/article/details/53197212
